Previous Top News: 2021


  • The U.S. Supreme Court has vacated the Ninth Circuit's decision in LinkedIn v. hiQ Labs but will not decide the merits of the case, instead sending the case back to the Ninth Circuit for a new decision in light of Van Buren v. United States. EPIC had filed an amicus brief in support of the Petition for Certiorari. The LinkedIn v. hiQ petition asked whether hiQ lacked authorization to access LinkedIn's servers under the Computer Fraud and Abuse Act after LinkedIn used a combination of technical and legal methods to cut off hiQ's access to the website to stop the company from scraping user data. hiQ sued LinkedIn to regain access to the website, arguing that its business model depended on access to LinkedIn user data. A district court granted hiQ's request for an injunction, which LinkedIn appealed. EPIC filed an amicus brief in the Ninth Circuit arguing that the injunction was "contrary to the interests of individual LinkedIn users" and contrary to the public interest "because it undermines the principles of modern privacy and data protection law." The Ninth Circuit upheld the injunction, finding that hiQ's economic interests outweighed the interests in protecting users' personal information. In its amicus brief in support of LinkedIn's petition for cert, EPIC explained that the Ninth Circuit's decision "makes it impossible" for companies to protect personal data and sets "a dangerous precedent that could threaten the privacy of user data." The EPIC amicus brief highlighted the business practices of Clearview AI, a company that scraped billions of photographs to create a secretive facial recognition system. The case will most likely be sent back to the district court for a new decision that accords with Van Buren v. United States. (Jun. 14, 2021)

  • In a report to Parliament, the Canadian Privacy Commissioner concluded that the Royal Canadian Mounted Police (RCMP) violated the Canadian Privacy Act by using Clearview AI's facial recognition technology. The Commissioner's report follows a February 2021 investigative report that Clearview AI violated Canada's Personal Information Protection and Electronic Documents Act by scraping images off social media sites to create a facial recognition database so that "billions of people essentially found themselves in a '24/7' police line-up." Recently, in an open letter EPIC and a coalition of more than 175 civil society organizations and prominent individuals called for "an outright ban on uses of facial recognition and remote biometric recognition technologies that enable mass surveillance and discriminatory targeted surveillance." (Jun. 10, 2021)

  • In a DC Council hearing (video starts at 13:22), Chairman Phil Mendelson asked Metropolitan Washington Council of Governments' (MWCOG's) Executive Director Chuck Bean for more information on the soon-to-be-shuttered DC-area facial recognition system. The Chairman's questions were prompted by a meeting with EPIC in which EPIC staff pushed for more disclosures on the MWCOG's role in the creation of a secret facial recognition system used to surveil Black Lives Matter protesters last year. Recently, EPIC joined over 40 other organizations to detail the problems with police use of facial recognition and call for a ban on law enforcement use of the technology. (Jun. 10, 2021)

  • EPIC and 23 other leading civil society groups sent a letter to President Biden today urging his Administration to ensure that any new transatlantic data transfer deal is coupled with the enactment of U.S. laws that reform government surveillance practices and provide comprehensive privacy protections. “The United States’ failure to ensure meaningful privacy protections for personal data is the reason that a growing number of countries are concerned about trans-border data flows,” the groups wrote. “Until the United States addresses this problem, concerns about data transfers to the United States will remain, and data flow agreements are likely to be invalidated.” In 2015, the Court of Justice of the European Union invalidated the U.S.-EU Safe Harbor agreement. And in July 2020, the successor agreement, Privacy Shield, was also invalidated by the same court. [PRESS RELEASE] (Jun. 10, 2021)

  • EPIC has filed an amicus brief in Cothron v. White Castle, a case about when violations of Illinois's Biometric Information Privacy Act ("BIPA") can be vindicated in court. Cothron alleges that White Castle collected and disclosed her fingerprints for a decade in violation of BIPA. White Castle is trying to scuttle the case, claiming that an individual is only able to sue the first time a company violates their BIPA rights because it is only then that an individual "loses control" of their biometric data and suffers a legal injury. White Castle argues that, even if the company continued to violate BIPA to this day, it should not be held liable because the first violation was long enough ago that it falls outside the statute of limitations. But the Illinois Supreme Court held in Rosenbach v. Six Flags that every violation of BIPA confers the right to sue. The district court accordingly rejected White Castle's argument, but certified the question to a federal appeals court. EPIC filed an amicus brief in the appeals court and argued that White Castle's proposed rule would effectively "overrule the Illinois Supreme Court on a question of state law" by attempting "to import arguments about Article III standing into the BIPA statutory injury analysis." EPIC also argued that White Castle is "mistaken about the underlying purpose of BIPA" and that White Castle's rule "would in fact undermine BIPA’s purposes" because it "would remove the key incentive for companies who previously violated BIPA to come into compliance, adopt responsible biometric data practices, and seek informed consent." EPIC has filed amicus briefs in other BIPA cases, including Rosenbach v. Six Flags and Patel v. Facebook, and regularly participates as amicus in cases concerning the right to sue for privacy violations. (Jun. 7, 2021)

  • In an open letter, EPIC and a coalition of more than 175 civil society organizations, activists, technologists, and other experts called "for an outright ban on uses of facial recognition and remote biometric recognition technologies that enable mass surveillance and discriminatory targeted surveillance." The letter urges lawmakers around the world to stop public investment in facial recognition, prohibit government and private use of facial recognition in public spaces, and mandate disclosure and reparations to individuals monitored or harmed by biometric mass surveillance systems. The letter identifies one-to-many facial recognition identification (comparing an image to a gallery of identified images) as inherently dangerous to the public because the databases of images enable discriminatory targeted surveillance and the technology itself enables comprehensive public surveillance. EPIC began pushing for a ban in 2019 with the launch of the Ban Face Surveillance campaign and recently joined over 40 other organizations to call for a ban on U.S. law enforcement's use of facial recognition technology. (Jun. 7, 2021)
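    To make the "one-to-many" terminology concrete, the sketch below is a minimal, hypothetical illustration in Python (random vectors stand in for face embeddings, and the gallery names and similarity threshold are invented for the example; this is not any vendor's actual system). It shows what distinguishes one-to-many identification from one-to-one verification: a single probe image is scored against every identity enrolled in the gallery.

      import numpy as np

      def cosine_similarity(a, b):
          """Similarity between two embedding vectors, in [-1, 1]."""
          return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

      def one_to_many_search(probe, gallery, threshold=0.6):
          """Score a probe embedding against every identified embedding in a gallery.

          Returns the enrolled identities whose similarity meets the threshold,
          best match first. Comparing against the entire gallery is what makes
          the search "one-to-many" rather than a one-to-one verification.
          """
          scores = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
          matches = [s for s in scores if s[1] >= threshold]
          return sorted(matches, key=lambda s: s[1], reverse=True)

      # Hypothetical usage: 1,000 enrolled identities with 128-dimensional embeddings.
      rng = np.random.default_rng(0)
      gallery = {"person_%d" % i: rng.normal(size=128) for i in range(1000)}
      probe = rng.normal(size=128)
      print(one_to_many_search(probe, gallery, threshold=-1.0)[:3])  # top three candidates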

  • The Eleventh Circuit recently ruled that the $425 million class action settlement arising from the 2017 Equifax data breach, which compromised the personal data of nearly half of all Americans, should move forward. The district court previously approved the settlement in 2020, but it has been stayed pending the appeal. The settlement was supported by various government agencies, including the CFPB, the FTC, and 48 state Attorneys General, but several class members raised objections about the adequacy of the relief. The Eleventh Circuit rejected those objections, and the settlement will now move forward in the lower court. Meanwhile, a related $575 million settlement entered into by Equifax and the FTC, CFPB, and most state Attorneys General in 2019 will allow people affected by the breach to file a claim for expenses incurred between January 2020 and January 2024 as a result of identity theft or fraud related to the breach; people can also be compensated for up to 20 hours of time spent recovering from the breach. Equifax was also required to pay $125 for each person who claimed they were wronged by the breach, but the company has so far failed to do so. This was one of the largest data breaches in history, and it has revealed the dire need to improve data security in the United States. (Jun. 4, 2021)

  • The Washington Post Editorial Board called on Congress to impose a nationwide moratorium on facial recognition technology until it can pass legislation requiring technical and legal safeguards for the use of the technology. The Post cited the recent shutdown of a DC-area facial recognition system after an EPIC-led coalition organized against the system. In 2019, EPIC launched the Ban Face Surveillance campaign and through the Public Voice coalition gathered the support of over 100 organizations and many leading experts across more than 30 countries. An EPIC-led coalition urged the Privacy and Civil Liberties Oversight Board to recommend the suspension of face surveillance systems across the federal government. EPIC has joined with other organizations to oppose school administrators' use of facial recognition, urge President Biden to halt the federal use of facial recognition, and press Congress to stop the use of and investment in facial recognition. Most recently, EPIC joined over 40 other organizations to detail the problems with police use of facial recognition and call for a ban on law enforcement use of the technology. (Jun. 3, 2021)

  • In today’s decision in Van Buren v. United States, the Supreme Court determined that a police officer who improperly accessed a license plate record could not be held liable under a federal computer crimes law, the Computer Fraud and Abuse Act. EPIC highlighted the serious privacy concerns with government employees’ improper access to sensitive personal information in government databases in the amicus brief we filed in this case, and several justices echoed these concerns during oral argument. The outcome of this case highlights the urgent need for comprehensive privacy legislation. We need enforceable rules to prevent improper access to and misuse of personal information contained in both government and private databases.

    The Court also did not resolve what it means for someone to have “authorization” to access a computer or to be “entitled” to access information in the computer. The Court endorsed a general “gates-up-or-down approach”—meaning an individual either has authorization to access the computer or specific information within it, or does not—but explicitly left open the question whether the prohibitions on access must be technical or whether they can be contract-based. The range of criminalized activities may, in some respects, still be much broader than even the Government was advocating. Certain website terms of service that prohibit specific individuals or groups from accessing the website may still be enforceable even if the individuals have no knowledge of the restrictions and the website owners do nothing else to limit access. An 18-year-old who accesses a website restricted to those over the age of 21 may violate the CFAA, but a police officer who knowingly accesses personal information to stalk and harass an individual does not.

    The Court also did not clearly answer more complicated access questions about web scraping, and the Court should grant the pending petition in LinkedIn v. hiQ Labs to resolve these questions. Web scraping involves accessing a computer using a technical method that is often prohibited by a website's terms of service and also blocked using technical barriers. EPIC filed an amicus brief in support of the petition. (Jun. 3, 2021)
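    As a rough, hypothetical illustration of the kind of access at issue (not hiQ's or LinkedIn's actual code; the site, page, and user-agent below are invented), the sketch consults a site's robots.txt and then fetches a public page. Nothing technical forces a scraper to honor such directives or to respect an IP block, which is why the open question is what counts as "authorization" under the CFAA.

      import urllib.request
      import urllib.robotparser

      SITE = "https://example.com"        # hypothetical site
      PAGE = SITE + "/"                   # hypothetical public page
      AGENT = "ExampleResearchBot/0.1"    # hypothetical user-agent string

      # A scraper can read robots.txt, but honoring it is purely voluntary.
      robots = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
      robots.read()
      print("robots.txt permits fetching:", robots.can_fetch(AGENT, PAGE))

      # Fetch the page and keep the raw HTML for later parsing.
      request = urllib.request.Request(PAGE, headers={"User-Agent": AGENT})
      with urllib.request.urlopen(request) as response:
          html = response.read().decode("utf-8", errors="replace")
      print(html[:200])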

  • In a statement of concerns, EPIC and a coalition of more than 40 privacy, civil liberties, immigrants' rights, and good government groups stated that "the most comprehensive approach to addressing the harms of face recognition would be to entirely cease its use by law enforcement." The statement lists six concerns with police use of the technology that can only be addressed by halting its use. The coalition calls for a moratorium or ban on the use of facial recognition and urges Congress not to preempt state or local bans in any federal legislation addressing facial recognition. EPIC recently organized a coalition letter that led to the shutdown of a DC-area facial recognition system previously used on Black Lives Matter protesters. EPIC leads a campaign to Ban Face Surveillance and through the Public Voice Coalition has gathered support from over 100 organizations and experts from more than 30 countries. (Jun. 3, 2021)

  • An ordinance passed in King County, Washington bans "any person or entity acting on behalf of a King County administrative office or executive department" from using facial recognition technology or information derived from it. The ban includes the King County Sheriff's Department. King County, which includes Seattle, is the first county in the nation to ban government use of facial recognition technology. EPIC recently sought records on the U.S. Postal Service's Internet Covert Operations Program's use of Clearview AI facial recognition and other surveillance software. EPIC leads a campaign to Ban Face Surveillance and through the Public Voice Coalition has gathered support from over 100 organizations and experts from more than 30 countries. (Jun. 2, 2021)

  • WhatsApp previously threatened sanctions against users who would not accept the company’s new terms of use with weaker privacy protections, but backed down late Friday after a coalition of groups from around the world protested. Burcu Kilic, digital rights program director for Public Citizen, released the following statement in response: “Thank you for stopping what you never should have started. Now please also undo what you coerced millions of people into accepting.” In 2014, EPIC and the Center for Digital Democracy warned the FTC that Facebook routinely incorporates user data from companies it acquires and that WhatsApp users objected to the acquisition. The FTC approved the merger but told EPIC and CDD that "if the acquisition is completed and WhatsApp fails to honor these promises, both companies could be in violation of Section 5 of the FTC Act and potentially the FTC's order against Facebook." (Jun. 1, 2021)


  • The U.S. Innovation and Competition Act, introduced recently by Senate Majority Leader Chuck Schumer, would earmark $53 billion for technological and AI development yet fails to include critical safeguards for federal AI deployment. One section of the bill, the Endless Frontier Act, would significantly increase National Science Foundation funding to expand research and improve the diversity of the STEM workforce. The bill would also allocate funds for analyzing and combatting human rights violations in China and promoting "American Leadership" in AI development. Another section of the bill, the Advancing American AI Act, would incrementally improve the transparency and accountability of government AI use. The Office of Management and Budget would be tasked with ensuring that federal contracts for AI systems address "privacy, civil rights, and civil liberties," and each agency would be required to assemble and publish (when "practicable") an inventory of its AI systems. However, the bill—much of which tracks recommendations by the NSCAI—fails to establish binding limitations on federal AI use and offers little protection for members of the public injured by government-operated AI systems. EPIC previously urged the NSCAI to recommend substantive limits on AI to protect individuals against harmful, biased, invasive, and unreliable AI systems. (May. 28, 2021)

  • Senator Ed Markey (MA) and Representative Doris Matsui (CA) introduced the Algorithmic Justice and Online Transparency Act of 2021 today. The bill prohibits online platforms from using algorithmic processes that discriminate on the basis of protected classes, requires online platform companies to create and maintain documentation about their algorithms for review by the FTC, and sets out a standard for safe and effective algorithmic processes. The bill also calls for the creation of an inter-agency task force to investigate discriminatory algorithmic processes, with members including the Federal Trade Commission, the Department of Housing and Urban Development, the Department of Education, the Department of Justice, and the Department of Commerce. EPIC endorses the bill and has been advocating for Algorithmic Transparency and Equity, specifically urging state, federal, and international governments to regulate harmful AI guided by the Universal Guidelines for AI. Last year, EPIC petitioned the FTC to conduct a rulemaking regulating algorithmic tools in order to address discrimination. (May. 27, 2021)

  • D.C. Attorney General Karl Racine filed a lawsuit today against Amazon alleging that the online retail giant has violated the District of Columbia Antitrust Act. The complaint accuses Amazon of stifling competition by imposing contractual clauses that prevent third-party sellers from offering lower prices outside of the Amazon platform. The lawsuit explains that the agreements ultimately lead to higher prices for consumers and less innovation. “Amazon wins because it controls pricing across the online retail sales market, putting itself at an advantage over everyone else,” Racine told reporters. “These restrictions allow Amazon to build and maintain monopoly power.” In February, EPIC filed a complaint with the D.C. Attorney General alleging that Amazon unlawfully employs dark patterns to manipulate consumers when they attempt to cancel their Amazon Prime subscriptions. These dark patterns enable Amazon to continue collecting subscription fees and retain the personal data of misdirected subscribers. EPIC also signed onto a recent coalition letter calling for the Federal Trade Commission to investigate Amazon’s use of dark patterns in the Prime cancellation process. EPIC has long argued that anticompetitive practices and market consolidation in the technology sector pose a threat to privacy rights. (May. 25, 2021)

  • EPIC, through a Freedom of Information Act request and letter to the USPS Privacy Office, is seeking the required Privacy Impact Assessment for the Internet Covert Operations Program (iCOP) operated by the U.S. Postal Inspection Service. First revealed by Yahoo News in April, the iCOP uses Clearview AI's facial recognition system and a suite of social media monitoring tools to surveil individuals online, including protesters. EPIC also urged the USPS Privacy Office to fully comply with the E-Government Act of 2002 by proactively publishing privacy impact assessments online. EPIC leads a campaign to Ban Face Surveillance and through the Public Voice Coalition has gathered support from over 100 organizations and experts from more than 30 countries. (May. 25, 2021)

  • This week, the Grand Chamber of the European Court of Human Rights issued a final judgment in Big Brother Watch v. UK confirming that the UK's intelligence agency violated the right to privacy by systematically intercepting online communications without first applying necessary safeguards. The agency's mass surveillance program was "not in accordance with [EU] law," which only allows governments to retain data in an effort to combat "serious crime" and requires a court or administrative body to sign off on data collection. The UK law at issue was not limited to serious crime, nor did it require independent authorization; these "fundamental deficiencies" impermissibly increased the "risk of the bulk interception power being abused." Nevertheless, the Grand Chamber found that the agency's decision to operate a bulk interception program did not itself violate human rights, and the agency's sharing of sensitive digital intelligence with foreign counterparts, including the NSA, was legal. Several of the judges believed this ruling did not go far enough to condemn the sharing of wrongfully collected communications with other countries, noting the chamber "missed an excellent opportunity to fully uphold the importance of private life ... when faced with interference in the form of mass surveillance." EPIC has a strong interest in protecting the human right to privacy and has continuously opposed suspicionless mass collection of personal communications by domestic and foreign governments. EPIC participated in this case as a third-party intervenor and filed a brief describing U.S. intelligence authorities that allow the NSA to access the private communications of non-U.S. persons in violation of their rights. EPIC was also chosen by the Irish High Court to make amicus submissions in a case involving the international transfer of data from European servers to the U.S. in violation of EU law. (May. 25, 2021)

  • EPIC's Student Privacy Project has been selected for inclusion in the spring 2021 Tech Spotlight Casebook, a publication of the Harvard Kennedy School's Belfer Center for Science and International Affairs. The casebook "recognizes projects and initiatives that demonstrate a commitment to public purpose in the areas of digital, biotech, and future of work." The book highlights EPIC's recent efforts to halt the use of unfair, unreliable, and invasive remote proctoring tools and the D.C. consumer protection complaint EPIC filed against online proctoring firms. "Through meticulous research, the Student Privacy Project revealed the extent to which these companies collect and process student personal and biometric data," the casebook explains. "The complaint attempts to hold the five companies accountable for their practices by demonstrating how the data collection and processing practices may violate existing law." The casebook also recognizes recent work around census privacy protections, community control over police surveillance, racially biased speech recognition tools, and the use of "garbage" facial recognition to identify criminal suspects. A ceremony will be held Thursday, May 20 at 1 p.m. ET. (May. 19, 2021)

  • The Metropolitan Washington Council of Governments (MWCOG) informed EPIC today that the National Capital Region Facial Recognition System (NCR-FRILS) will be shut down by July 1, 2021. The system is used by police departments and government agencies in the DC, Maryland, and Virginia area. EPIC led a coalition that recently sent a letter to the MWCOG demanding an end to the system citing the dangerous nature of facial recognition and racial bias in facial recognition software. A recently passed law in Virginia requiring approval from the General Assembly before using facial recognition was going to curtail NCR-FRILS use in that state. The facial recognition system was first disclosed last year after it was used to identify a protester at a Black Lives Matter rally who was accused of assault. (May. 14, 2021)

  • In comments to the DHS's Data Privacy and Integrity Advisory Committee (DPIAC), EPIC urged a comprehensive review of DHS's Information Sharing Access Agreements (ISAAs) prioritizing the most sensitive types of data, information from marginalized groups, and agreements disclosing information to unreliable partners. EPIC's comments respond to DPIAC's tasking to provide guidance to the DHS Privacy Office after an OIG audit revealed that thousands of ISAAs had never been reviewed for compliance with privacy laws and regulations. EPIC previously urged DPIAC to undertake a comprehensive investigation of fusion centers for chronic privacy and civil liberties abuses. (May. 14, 2021)

  • The Irish High Court today issued an order in a follow-on case to Irish Data Protection Commissioner v. Facebook and Schrems ("Schrems II") and, as a result, the investigation into Facebook's U.S.-EU data transfers will move forward. The case arises from a complaint filed with the DPC in Ireland against Facebook by privacy activist Max Schrems in 2013 alleging that the company violated EU law when it transferred personal data to the U.S. (where the company is obliged to provide access to the government). The case has since been referred two separate times to the highest court in Europe (the CJEU), and has led to the invalidation of both the U.S.-EU Safe Harbor Agreement and the U.S.-EU Privacy Shield Agreement. The CJEU in the Schrems II decision last year remanded the case to the Irish DPC to determine whether Facebook violated the law and whether it was necessary to block Facebook's U.S.-EU data transfers. The DPC later issued a Preliminary Draft Decision to Facebook and laid out procedures for the inquiry. Both Facebook and Schrems challenged the DPC procedures. The DPC agreed in a settlement with Schrems that it would complete the investigation into his original complaint. The Irish High Court today rejected Facebook's challenge to the DPC inquiry, and both the Schrems complaint and this new DPC inquiry against Facebook will move forward. EPIC participated as an amicus curiae in Schrems II, arguing that U.S. surveillance law does not provide adequate privacy protections or remedies for non-U.S. persons abroad. (May. 14, 2021)

  • Today, Congresswoman Lori Trahan (MA-03) led a group of fellow Congressional Hispanic Caucus members in writing a letter calling on Facebook Chairman and CEO Mark Zuckerberg to reverse the company’s decision to require WhatsApp users to accept expanded data collection or leave the platform entirely. “We write to respectfully ask Facebook to consider reversing WhatsApp’s decision to update their new terms of service. We believe Facebook is potentially offering a false choice to users across the globe: accept the sharing of metadata with Facebook by May 15th or leave the platform altogether,” the lawmakers wrote. In 2014, EPIC and the Center for Digital Democracy warned the FTC that Facebook incorporates user data from companies it acquires, and that WhatsApp users objected to the acquisition. The FTC responded to EPIC and CDD and told Facebook and WhatsApp that "if the acquisition is completed and WhatsApp fails to honor these promises, both companies could be in violation of Section 5 of the FTC Act and potentially the FTC's order against Facebook." The FTC letter noted that "hundreds of millions of users have entrusted their personal information to WhatsApp. The FTC staff continue to monitor the companies' practices to ensure that Facebook and WhatsApp honor the promises they have made to those users." In their letter, the members highlight that pledge and the FTC's statement. (May. 11, 2021)

  • More than 40 state attorneys general have sent a letter to Mark Zuckerberg pressuring Facebook to drop its plans to launch a version of Instagram for children younger than 13. The Attorneys General, led by Massachusetts Attorney General Maura Healey, expressed bipartisan support for protecting children’s privacy and their physical and mental health. The AGs raised concerns about Facebook’s history of privacy incidents, stating “Facebook has a record of failing to protect the safety and privacy of children on its platform, despite claims that its products have strict privacy controls[.]” The Campaign for a Commercial-Free Childhood commented, “If Facebook insists on plowing ahead, it’s the clearest sign yet that the company views itself as accountable to no one, even when it comes to the well-being of children, and must be regulated much more rigorously,” and lawmakers have similarly expressed concerns about children’s privacy issues with social media. EPIC signed on to a coalition letter by the Campaign for a Commercial-Free Childhood urging Zuckerberg to cancel plans to launch a version of Instagram for children under 13. (May. 11, 2021)

  • According to a news report, the Biden Administration plans to rescind a proposed rule that would have massively expanded the collection of biometric information from immigrants. The rule, proposed towards the end of the Trump Administration, would have granted the Department of Homeland Security broad authority to collect biometric data from immigrants and their families and associates. The rule would have enabled the collection of palm prints, iris images, voiceprints, DNA, and images for facial recognition regardless of age. In comments to the Department of Homeland Security, EPIC opposed the rule and urged the agency to rescind it. EPIC argued that DHS's broad authorization to collect biometrics was incompatible with the Department's Fair Information Practice Principles. EPIC also specifically called on the agency to suspend the use of facial recognition technology. Last year, EPIC, joined by over 40 organizations, called for the Privacy and Civil Liberties Oversight Board to recommend the suspension of face surveillance systems across the federal government. (May. 11, 2021)

  • In comments to the Health and Human Services Department (HHS), EPIC opposed proposed changes to the HIPAA Privacy Rule that would reduce restrictions on disclosing patients’ Protected Health Information (PHI). HHS's proposed rule would expand the entities that can receive PHI without patient consent, lower the standard for disclosing PHI in the process of care coordination, and specifically authorize certain non-consensual disclosures of PHI for patients with mental illness and substance abuse disorders. EPIC argued that the modifications will expose patients to greater risk of data breach and increase barriers to receiving care for stigmatized populations without providing benefits to patients. Recently, EPIC Executive Director Alan Butler and Counsel Enid Zhou published a paper in the American University Law Review analyzing the increased collection of health data during the COVID-19 pandemic. (May. 7, 2021)

  • The White House has launched AI.gov, the new website of the National Artificial Intelligence Initiative Office featuring reports, policy priorities, and news about artificial intelligence from across the federal government. The site lists "Advancing Trustworthy AI" and "International Cooperation" as two of six top priorities for federal AI policy, embracing the Organization for Economic Cooperation and Development AI Principles and the G20 AI Principles. EPIC has urged both the White House and Congress to prioritize human rights over AI adoption and has recommended the OECD Principles and the Universal Guidelines for Artificial Intelligence as baseline frameworks for regulating AI and mitigating algorithmic harms. EPIC has also fought for transparency in AI policymaking, successfully suing the National Security Commission on Artificial Intelligence to enforce its public records and open meetings obligations. (May. 5, 2021)

  • Through a Freedom of Information Act request to the Department of Homeland Security, EPIC obtained records circulated in a 2018 election security meeting with members of the U.S. House of Representatives. On May 22, 2018, then-DHS Secretary Kirstjen Nielsen, then-Federal Bureau of Investigation Director Christopher Wray, and then-Director of National Intelligence Dan Coats held a classified briefing for members of Congress informing them of the risks to the election process and steps the administration was taking to assist state officials in ensuring election security. The briefing materials include charts on election infrastructure cyber risk scenarios and cybersecurity considerations, as well as compiled anecdotes of the DHS's engagement with state election security officials. These anecdotes highlighted how states have taken efforts to strengthen their election systems for the 2018 mid-term elections, including some states taking up the voluntary election security resources from DHS. EPIC sued the DHS for records about the agency’s assessment of election vulnerabilities following the 2016 presidential election and its ongoing role in protecting election systems as critical infrastructure. The agency released hundreds of pages of records to EPIC about its role in election cybersecurity, with records revealing the agency's rocky initial involvement in election security following its 2017 designation of election infrastructure as critical infrastructure and how far the agency has come since then. The case is EPIC v. DHS, 17-2047 (D.D.C.). (May. 5, 2021)

  • In a letter to Spotify, EPIC and a coalition of over 100 recording artists, 69 non-profit organizations, and 10 prominent individuals urged the streaming service to publicly commit not to explore a newly-patented voice-recognition feature. Spotify's new patent would allow the company to identify individuals' "emotional state, gender, age, or accent" to recommend music. The coalition letter identified major concerns with the potential technology including emotional manipulation, discrimination, massive privacy violations, and increased inequality within the music industry. Spotify recently stated that the company has not implemented the technology, and claims to have "no plans" to do so. EPIC leads a campaign to Ban Face Surveillance and through the Public Voice Coalition gathered support from over 100 organizations and experts from more than 30 countries. (May. 4, 2021)

  • The Massachusetts Attorney General, following up on a letter from EPIC and a coalition of civil society groups, wrote to major pharmacies today seeking details about their collection and use of personal data from COVID-19 vaccine recipients. The federal government is coordinating with retail pharmacies to facilitate vaccine distribution. But as EPIC and coalition partners warned last month, some pharmacies "are requiring patients seeking access to the vaccine to register through their existing customer portals, which in turn exposes patients to broad personal data collection and marketing." The Massachusetts AG letter calls on pharmacies to explain what personal data they collect from vaccine patients, what disclosures they make, whether the pharmacies will use the data for commercial purposes, and whether the data is being stored separately from general customer information. "[A]ccess to life-saving vaccines should not be conditioned on a consumer's consent to provide personal data not necessary for the vaccination administration," the AG's letter explains. "Nor can consent to such data collection or marketing be presumed based on a consumer's desire to obtain a vaccination." The CDC recently issued a directive prohibiting health providers "from using any data gathered in the course of their participation in the CDC COVID-19 Vaccination Program, including any Protected Health Information or other Personally Identifiable Information, for commercial marketing purposes." EPIC and coalition partners also asked officials in California, Illinois, New York, and the District of Columbia to investigate and prevent pharmacies from putting vaccine patient data to commercial use. (May. 3, 2021)

  • A divided panel of the D.C. Circuit, ruling today in EPIC's case against the FAA Drone Advisory Committee, held that the committee can keep the records of its controversial working groups secret. EPIC filed suit in 2018 against the industry-dominated body, which ignored the privacy risks posed by the deployment of drones even after identifying privacy as a top public concern. As a result of EPIC's lawsuit, the committee was forced to disclose hundreds of pages of records under the Federal Advisory Committee Act. But a lower court ruled in 2019 that the records from the committee's working groups could be withheld from the public—a decision that the D.C. Circuit affirmed today. Judge Robert L. Wilkins, writing in dissent, accused the majority of "doing violence to the text" of the FACA and argued that the decision "undermines FACA's purpose and greenlights an easily abusable system[.]" Noting the "obvious privacy concerns that drones present" and the fact that the DAC was "stacked with industry representatives," Wilkins warned that "[w]e should look with suspicion upon agency efforts to circumvent FACA by using subgroups." The case is EPIC v. Drone Advisory Committee, No. 19-5238 (D.C. Cir.). (Apr. 30, 2021)

  • The Foreign Intelligence Surveillance Court (FISC) recently disclosed an opinion revealing that the FBI has repeatedly misused Section 702 of the Foreign Intelligence Surveillance Act (FISA) to gather information in domestic investigations. Section 702 (sometimes referred to as the "PRISM" program) authorizes certain programs of surveillance of private communications for foreign intelligence purposes, without prior court approval, where the surveillance targets non-US persons located abroad. The law has been widely criticized, in part, because of the "backdoor search" loophole that allows domestic law enforcement officials to access Americans' communications without a warrant. The surveillance court previously found that the FBI's procedures for obtaining information through backdoor searches violated the Fourth Amendment. The newly published opinion demonstrates how the FBI has failed to reform these unlawful practices. An audit revealed that the agency searched FISA information 40 times last year while investigating a wide range of purely domestic crimes, including health-care fraud, gang violence, domestic terrorism by "racially motivated violent extremists," and public corruption. Again, the FISC expressed "concern[] about the [FBI's] apparent widespread [Section 702] violations." EPIC has long tracked FISA court orders and advocated for FISA reform. More recently, EPIC filed a Freedom of Information Act lawsuit seeking disclosure of a report concerning FBI use of Section 702 authority for domestic criminal investigations and participated as amicus to address the scope of U.S. surveillance authorities in the Court of Justice of the European Union. (Apr. 29, 2021)

  • In a letter to the Metropolitan Washington Council of Governments, an EPIC-led coalition of privacy, civil liberties, and good government groups urged the Council to end the National Capital Region Facial Recognition System (NCR-FRILS) project and disclose all documents associated with it. In a Washington Post article covering the coalition letter, EPIC Senior Counsel Jeramie Scott argued that "facial recognition is a particularly invasive surveillance technology that undermines democracy and First Amendment rights." NCR-FRILS is a facial recognition system used by police departments and government agencies in the DC, Maryland, and Virginia area. The system runs comparisons against a database of 1.4 million local mug shots. The project was never publicly announced and was only revealed during the prosecution of a Black Lives Matter protester last fall. EPIC previously submitted a series of open government requests to police departments in the DC area seeking more information on the system. (Apr. 28, 2021)

  • A new poll from Morning Consult found that 83% of voters say that Congress should pass national data privacy legislation this year. Democrats (86%) and Republicans (81%) expressed bipartisan support for Congress to prioritize a federal privacy bill. The poll also found that voters place similar amounts of responsibility on both federal and state lawmakers, as well as federal regulators, to regulate data privacy. With respect to regulating how companies collect, store, and share personal information, 72% of voters said Congress is either “very responsible” or “somewhat responsible” while 79% said the same for federal agencies and 75% for state governments. Nearly 9 in 10 adults said it was either “very” or “somewhat” important to protect their most sensitive identifiable information under a privacy law, including Social Security number (89%), banking information (89%), biometric data (88%), and driver’s license number (88%). EPIC has called for comprehensive baseline federal legislation and the creation of a U.S. data protection agency, and has advocated for strong state privacy laws. (Apr. 27, 2021)

  • EPIC has filed an amicus brief urging an Alabama federal court not to upend the Census Bureau's system for protecting personal data collected in the 2020 Census. Alabama is challenging the Bureau's use of differential privacy, in which controlled amounts of statistical noise are added to published census data to prevent individuals from being identified and linked with their census responses. The Bureau recently demonstrated that sophisticated reidentification "attacks" can identify tens of millions of people from published census data unless stronger privacy safeguards are used. As EPIC argues in its brief, "differential privacy is the only credible technique to protect against such attacks, including those that may be developed in the future." EPIC's brief explains that federal law imposes on the Bureau an "affirmative duty to protect the privacy of census respondents—not merely to avoid direct, unfiltered publication of census responses." EPIC also argues that differential privacy "is not the enemy of statistical accuracy," but rather "vital to securing robust public participation in Census Bureau surveys[.]" EPIC has long advocated for the confidentiality of personal data collected by the Census Bureau. In 2004, the Bureau revised its "sensitive data" policy after an EPIC FOIA request revealed that the Department of Homeland Security had improperly acquired census data on Arab Americans following 9/11. In 2018, EPIC filed suit to block the citizenship question from the 2020 Census, alleging that the Bureau failed to complete several privacy impact assessments required under the E-Government Act. (Apr. 26, 2021)
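    The technique at the center of this dispute can be illustrated with a short sketch. The Python below is a generic, simplified example of differentially private counting, not the Census Bureau's actual TopDown Algorithm; the population figure and epsilon values are invented for illustration. Laplace noise scaled to a privacy parameter is added to each published count, so any single respondent's presence or absence can shift the output only by a bounded, noise-masked amount.

      import numpy as np

      def dp_count(true_count, epsilon, sensitivity=1.0):
          """Return an epsilon-differentially private version of a count.

          One person's record changes a counting query by at most `sensitivity`,
          so adding Laplace noise with scale sensitivity/epsilon limits how much
          any individual can influence the published value.
          """
          noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
          return true_count + noise

      # Hypothetical block-level population count. Smaller epsilon means more
      # noise: stronger privacy protection at the cost of statistical accuracy.
      true_population = 1234
      for epsilon in (0.1, 1.0, 10.0):
          print(epsilon, round(dp_count(true_population, epsilon), 1))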

  • The Supreme Court, ruling Thursday in AMG Capital Management v. Federal Trade Commission, sharply limited the FTC’s ability to obtain restitution for individuals harmed by companies’ unlawful trade practices. Disagreeing with years of FTC practice and numerous decisions by appellate courts, the Court ruled that a key provision in the FTC Act “does not authorize the Commission to seek, or a court to award, equitable monetary relief such as restitution or disgorgement.” As a result of the decision, the FTC must now go through a burdensome administrative process to force companies to give up ill-gotten gains rather than going directly to court. Acting Chairwoman Rebecca Kelly Slaughter responded that the decision is a ruling “in favor of scam artists and dishonest corporations, leaving average Americans to pay for illegal behavior.” Members of Congress have already proposed amendments to Section 13(b) of the FTC Act that would restore the Commission’s power to seek consumer redress. EPIC routinely advocates before the FTC for meaningful financial penalties against companies whose unlawful data and privacy practices harm consumers. (Apr. 23, 2021)

  • The European Commission released a long-awaited proposal for how to regulate AI throughout the European Union. The proposed regulation includes a ban on “unacceptable” uses of AI such as general social scoring and “real time remote biometric identification” for law enforcement. The proposal also imposes testing and transparency obligations for "high-risk" uses of AI, including a publicly accessible EU database of stand-alone “high-risk” systems. The proposal requires notice to individuals when they interact with certain types of AI and “conformity” assessments for "high-risk" systems. The prohibitions on unacceptable AI are very limited, and many of the strongest provisions are subject to vast exceptions. The regulation does, however, include penalties of up to 4% of annual revenue for companies that violate it. EPIC has called for prohibitions on secret scoring, mass surveillance, and facial recognition. EPIC urges legislators to implement the OECD Principles on AI and adopt the Universal Guidelines for AI. (Apr. 22, 2021)

  • The Florida House of Representatives today passed the Florida Privacy Protection Act, HB 969, on a 118-1 vote. The bill gives Floridians the right to know what information companies have collected about them, the right to delete and correct that information, the right to opt-out of the sale or sharing of their personal information, strong limits on the retention of their data, and additional protections for their children’s privacy. Critically, the bill would create robust enforcement mechanisms, including a private right of action, to ensure companies do not flout the law. EPIC and a coalition of privacy and consumer organizations had previously sent letters to Florida Governor Ron DeSantis, the Florida House Commerce Committee, and Florida's Senate Rules Committee urging them to preserve private rights of action in the bill. "The inclusion of a private right of action in HB 969 and SB 1734 is the most important tool the Legislature can give to Floridians to protect their privacy," the groups wrote. "The statutory damages set in privacy laws are not large in an individual case, but they can provide a powerful incentive in large cases and are necessary to ensure that privacy rights will be taken seriously and violations not tolerated. In the absence of a private right of action, there is a very real risk that companies will not comply with the law because they think it is unlikely that they would get caught or fined." The Senate Rules Committee removed the private right of action provisions from the Senate bill, but the Senate could restore the crucial enforcement provision on the floor this week. (Apr. 21, 2021)

  • Following a report by the Tampa Bay Times about the Pasco County Sheriff’s broad-ranging predictive policing and scoring program, the Department of Education is investigating a Florida school district’s practice of giving the Sheriff access to students’ personal data. The disclosures may have violated the Family Educational Rights and Privacy Act, which places strict limits on the use of students’ educational records. In January, Rep. Robert Scott (D-VA), Chair of the House Committee on Education and Labor, called for an investigation of the Sheriff’s program, which used personal data to compile a list of students that the Sheriff believed could “fall into a life of crime.” EPIC has called for bans on secret scoring and mass surveillance and strict limits on the use of AI in the criminal justice system. (Apr. 21, 2021)

  • As part of EPIC's ongoing lawsuit for cell phone surveillance orders issued by federal prosecutors, the Department of Justice identified 75 orders and warrants for cell phone location data under § 2703(d) from the U.S. Attorney's Office for the Virgin Islands from 2016-2019. During the same period, the office handled 283 criminal cases. The U.S. Attorney's Office for the Virgin Islands is one of the smallest districts in the country. In February, EPIC obtained the number of location data requests for the District of Delaware, the first of five districts that the DOJ has agreed to search for location data requests. EPIC is still waiting for responses from three of the agency's other prosecutors' offices and will continue to update its comparative table as each district releases more information. Currently, prosecutors do not release any comprehensive or uniform data about their surveillance of cell phone location data. In 2018, the U.S. Supreme Court ruled in Carpenter v. United States that the collection of cell phone location data without a warrant violated the Fourth Amendment. The case is EPIC v. DOJ, No. 18-1814 (D.D.C.). (Apr. 20, 2021)

  • The FTC announced Monday that the sale or use of racially biased algorithms is an unfair and deceptive trade practice in violation of the FTC Act. In a blog post, the Commission warned companies to ensure fairness and equity in their use of AI. The FTC cautioned companies to "Start with the right foundation," "Watch out for discriminatory outcomes," "Embrace transparency and independence," "Don't exaggerate what your algorithm can do or whether it can deliver fair or unbiased results," "Tell the truth about how you use data," "Do more good than harm," and "Hold yourself accountable–or be ready for the FTC to do it for you." The FTC cited its 2016 report on big data analytics and machine learning; its 2018 hearing on algorithms, AI and predictive analytics; and its 2020 business guidance on AI and algorithms. The post also cited a recent study from the Journal of the American Medical Informatics Association finding that AI may worsen healthcare disparities for people of color, even if an AI system was meant to benefit all patients. In 2019, EPIC filed a complaint with the FTC asking the Commission to investigate HireVue's use of opaque, unproven AI and to require baseline protections for AI use. Last year, EPIC petitioned the FTC to conduct a rulemaking on commercial uses of AI, including protections against discrimination and unfair bias. (Apr. 20, 2021)

  • A leaked draft of the European Commission's proposed AI regulation includes a ban on social scoring and strict limits on mass surveillance and other "high-risk" uses of AI. The draft regulation would generally prohibit AI which "manipulates human behaviour, opinions or decisions" to a person's detriment or which "exploits information or prediction about a person or group of persons in order to target their vulnerabilities[.]" The draft also requires notice to individuals when they interact with AI, prior authorization for the use of remote biometric identification tools (including facial recognition), and data impact assessments for "high-risk" systems. The draft is broadly worded and subject to exceptions—including exemptions for "investigating serious crime and terrorism"—but would impose a penalty of up to 4% of annual revenue on companies that violate the regulation. The official release of the proposed regulation is expected on April 21. EPIC has called for prohibitions on secret scoring, mass surveillance, and facial recognition. (Apr. 14, 2021)

  • In an open letter released today, EPIC and twenty-four civil rights and social justice organizations called on elected officials to ban corporate, private, and government use of facial recognition technology, suggesting Portland, OR's recent ban on facial recognition as a model. The letter also urges corporate leaders to ban the technology within their companies. The coalition notes recent uses of facial recognition to monitor workers and instances of wrongful firings when facial recognition systems misidentified Black gig workers. EPIC and a coalition recently urged the New York City Council to enact a comprehensive ban on facial recognition. EPIC leads a campaign to Ban Face Surveillance and through the Public Voice Coalition gathered support from over 100 organizations and experts from more than 30 countries. (Apr. 14, 2021)

  • As the Florida Legislature considers pending privacy bills, HB 969 and SB 1734, EPIC is urging lawmakers to enact strong privacy protections for all Floridians. The House Commerce Committee is today holding a hearing on HB 969, which would give Floridians the right to know what information companies have collected about them, the right to delete and correct that information, the right to opt-out of the sale or sharing of their personal information, strong limits on the retention of their data, and additional protections for their children’s privacy. Critically, the bill would create robust enforcement mechanisms, including a private right of action, to ensure companies do not flout the law. In written testimony, EPIC urged committee members to further strengthen the bill to prohibit discriminatory uses of data, remove the “right to cure” provision, require data minimization, support global opt-out mechanisms, ban pay-for-privacy schemes, and provide enhanced safeguards for sensitive uses of data. EPIC had previously led a coalition of groups urging Florida lawmakers to preserve the private right of action in the bills. (Apr. 14, 2021)

  • A bill passed in Virginia will ban local law enforcement agencies from using facial recognition technology without prior legislative approval starting July 1, 2021. Even when such approval is given, the bill further requires local police agencies to have "exclusive control" over the facial recognition systems they use, preventing the use of Clearview AI and other commercial facial recognition products. However, Virginia State Police and other state law enforcement agencies may continue to use facial recognition without legislative approval. EPIC and a coalition recently urged the New York City Council to enact a comprehensive ban on facial recognition. EPIC leads a campaign to Ban Face Surveillance and through the Public Voice Coalition gathered support from over 100 organizations and experts from more than 30 countries. (Apr. 9, 2021)

  • EPIC and a coalition sent letters to Attorney General Garland and the Senate Judiciary Committee urging them to conduct oversight and review agency implementation of the Freedom of Information Act. The coalition requested that the Senate Judiciary Committee hold an oversight hearing on agency FOIA compliance. The committee's last oversight hearing on FOIA was more than three years ago. The letter to the Senate Judiciary Committee states, "[I]t is imperative that the Committee provide oversight of agencies' compliance with FOIA, both to understand FOIA implementation by the Trump administration, as well as to seek commitments to comply with the law from the newly confirmed Biden administration officials." The coalition also asked Attorney General Garland to follow the precedent of many former AGs and issue a memorandum to agencies on how to interpret and apply the FOIA and to support legislative reform. During Sunshine Week, Attorney General Garland remarked that for the Justice Department to succeed, it must adhere to "the principles that have become core to our DNA" and that "faithful administration of FOIA is essential to American democracy." EPIC recently published its 2021 FOIA Gallery highlighting EPIC's most significant open government cases and records obtained through government records requests. (Apr. 8, 2021)

  • A trove of sensitive personal data from more than 500 million Facebook users was posted online over the weekend, according to press reports. The leaked data includes names, phone numbers, email addresses, birthdates, location information, and biographical details. The original breach of personal data appears to have occurred in 2019. At least one privacy regulator, the Irish Data Protection Commissioner, has launched an investigation into Facebook's handling of the breach. The Commissioner's office said today that it had "received no proactive communication from Facebook" following the disclosure of personal data. EPIC has fought for transparency and accountability for Facebook's privacy abuses for over a decade, from filing the original FTC Complaint in 2009 that led to the FTC's 2012 Consent Order with the company, to moving to intervene in and filing an amicus brief challenging the FTC's 2019 settlement with Facebook. (Apr. 6, 2021)

  • In September 2020, the Department of Housing and Urban Development released a final rule creating a defense to a discrimination claim under the Fair Housing Act where “predictive analysis” tools are not "overly restrictive on a protected class" or where they “accurately assessed risk.” Shortly after, a federal judge in Massachusetts blocked the rule, saying the regulation would "run the risk of effectively neutering disparate impact liability under the Fair Housing Act.” Today, American Bar Association President Patricia Lee Refo urged the agency to "act immediately to withdraw the 2020 FHA Rule and to adopt new guidance and a new rule to ensure the danger of algorithmic bias is adequately tackled.” EPIC and several others warned the federal housing agency during the initial rule announcement that providing such a safe harbor for the use of algorithms in housing without imposing transparency, accountability, or data protection regulations would exacerbate harms to individuals subject to discrimination. EPIC has called for greater accountability in the use of automated decision-making systems, including the adoption of the Universal Guidelines for Artificial Intelligence (UGAI) and requirements for algorithmic transparency. (Apr. 6, 2021)

  • EPIC and a coalition of privacy and consumer organizations today sent letters to Florida Governor Ron DeSantis, the Florida House Commerce Committee, and Florida's Senate Rules Committee urging them to preserve private rights of action in two pending privacy bills, SB 1734 and HB 969. "The inclusion of a private right of action in HB 969 and SB 1734 is the most important tool the Legislature can give to Floridians to protect their privacy," the groups wrote. "The statutory damages set in privacy laws are not large in an individual case, but they can provide a powerful incentive in large cases and are necessary to ensure that privacy rights will be taken seriously and violations not tolerated. In the absence of a private right of action, there is a very real risk that companies will not comply with the law because they think it is unlikely that they would get caught or fined." (Apr. 5, 2021)

  • The Privacy and Civil Liberties Oversight Board released its report on Executive Order 12333, which provides broad legal authority for data collection. The Oversight Board conducted three deep-dives into 12333-related counterterrorism activities—two on classified CIA programs and one on NSA’s XKEYSCORE. XKEYSCORE, first revealed publicly in the Snowden disclosures, is a tool used to search data collected under Executive Order 12333. The report lacks specifics on the 12333 programs the Board reviewed, but according to the Board the focus was on programs that either likely collected US persons' information, targeted US persons, or occurred in the US. The report also does not indicate the specific advice or recommendations the Board provided, but it does reveal that many intelligence agencies were using guidelines to protect US persons that had not been updated since the 1980s or were never implemented as required by 12333. EPIC previously urged the Oversight Board to conduct a review of 12333. (Apr. 2, 2021)

  • EPIC and a coalition of civil society groups urged officials in five states today to investigate major pharmacy chains over their collection and use of personal data from patients receiving COVID-19 vaccines. The federal government has partnered with retail pharmacies—including CVS, Walgreens, Walmart, and Kroger—to expand vaccine distribution. But as the coalition letter explains, some pharmacies "are requiring patients seeking access to the vaccine to register through their existing customer portals, which in turn exposes patients to broad personal data collection and marketing." According to a recent report, CVS executives "plan to stay in touch with vaccine recipients beyond receiving their second shot and use information gleaned in the process to better market to them." The coalition urged state consumer protection authorities in California, Illinois, Massachusetts, New York, and the District of Columbia to conduct investigations, to prohibit the use of vaccine registrant data for commercial purposes, and to require pharmacies to separate vaccine registrant information from their general customer data. "Patients should not have to trade unrestricted use of their sensitive personal information for a life-saving vaccine," the letter argues. "We believe these practices are unfair and deceptive and should be halted immediately." The coalition called on state officials "to remove barriers to access the vaccine and promote an equitable vaccine distribution process by protecting the personal data of vaccine recipients." (Apr. 2, 2021)

  • The California Supreme Court held today that all parties must consent to the recording of a cellular phone call under the state's Invasion of Privacy Act. In Smith v. LoanMe, an individual alleged that a loan servicer had recorded their call without obtaining consent from the called party. The lower court found that the law's ban on recording calls without consent only applied to eavesdroppers and did not apply when one of the parties to the communication recorded the call. The lower court ruling went against decades of cases and guidance that held California was a "two party consent" state. The California Supreme Court reversed and held that the law prohibited both eavesdroppers and parties to a call from recording without consent. The Court recognized that the California legislature intended to create an all-party consent regime and that recording a call without consent of all parties "can implicate significant privacy concerns, regardless of whether a party or someone else is performing the recording." EPIC filed an amicus brief arguing that recording a call without consent of all parties "poses unique threats to privacy." EPIC routinely files amicus briefs in cases implicating consumer privacy. (Apr. 1, 2021)

  • Today, the U.S. Supreme Court ruled in Facebook v. Duguid that individuals can only claim protection under the Telephone Consumer Protection Act from unwanted calls made using a mass dialing system or "autodialer" if the system uses a random or sequential number generator to either store or produce the numbers called. EPIC filed an amicus brief urging the Court to interpret the autodialer restriction broadly to include systems that automatically dial numbers stored in lists or databases. EPIC argued that "narrowing the autodialer definition would not protect privacy" but would instead "put the most widely used mass dialing systems outside the scope" of the ban.

    Many robocallers and would-be robocallers will interpret the Court’s decision today as essentially abrogating the autodialer restriction, which will likely lead to a surge in unwanted automated calls to cell phones. Automated calls are already a daily nuisance for Americans. Individuals increasingly ignore calls from unknown numbers because they assume the calls are robocalls, which has caused particular harm to contact tracing during COVID-19. Congress must update the autodialer restriction to protect Americans from the coming onslaught of unwanted automated calls.

    But the Court’s decision today is not a total victory for robocallers. The decision does not limit the definition of an autodialer to systems that create random or sequential telephone numbers. The Court says that autodialers include systems that use random or sequential number generators to order numbers in a list. Because computer programs commonly use sequential number generators to store or pull information from a list, it is hard to think of a mass dialing system that would not use a sequential number generator at some point in the program (see the illustrative sketch following this item).

    Litigation will continue over the scope of the autodialer definition. Americans need protection from robocallers now, and Congress should act swiftly to update the autodialer restriction.

    (Apr. 1, 2021)
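
    Below is a minimal, hypothetical sketch (in Python; not drawn from the Court's opinion or from any actual dialing system) of the point above: a dialer that calls numbers from a pre-stored list still relies on a sequential number generator, because the index it uses to step through the list is itself a generated sequence of numbers.

```python
# Hypothetical sketch only: a list-based dialer that relies on a sequential
# number generator (the indices 0, 1, 2, ...) to pull stored numbers.

def sequential_index_generator(length):
    """Yield 0, 1, 2, ... up to length - 1: a sequential number generator."""
    index = 0
    while index < length:
        yield index
        index += 1

def dial_stored_list(stored_numbers, place_call):
    """Dial every number in a pre-stored list using sequentially generated indices."""
    for index in sequential_index_generator(len(stored_numbers)):
        # The telephone numbers themselves are neither random nor sequential;
        # only the index used to retrieve them is generated sequentially.
        place_call(stored_numbers[index])

if __name__ == "__main__":
    numbers = ["555-0101", "555-0102", "555-0199"]  # fictitious numbers
    dial_stored_list(numbers, place_call=lambda n: print("dialing", n))
```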

  • EPIC and a coalition of civil-rights and community-based organizations submitted a letter to New York City Council Speaker Corey Johnson urging the council to introduce a comprehensive ban on government use of facial recognition. The letter highlights the use of facial recognition by the NYPD and other NYC agencies, the potential for far-reaching surveillance posed by facial recognition technology, and the risk of errors from racial bias in facial recognition algorithms and poor police practices. EPIC leads a campaign to Ban Face Surveillance and, through the Public Voice Coalition, gathered support from over 100 organizations and experts from more than 30 countries. (Mar. 30, 2021)

  • Acting FTC Chairwoman Rebecca Kelly Slaughter today announced the creation of a new rulemaking group within the FTC. The announcement follows criticism that the FTC has not adequately used its authorities, including its rulemaking power, to address consumer protection harms and promote competition. Section 18 of the FTC Act enables the Commission to issue trade regulation rules to address unfair or deceptive practices that occur commonly. Once the commission has promulgated a trade regulation rule, it may seek civil penalties for each violation of the rule. “I believe that we can and must use our rulemaking authority to deliver effective deterrence for the novel harms of the digital economy and persistent old scams alike,” Acting Chair Slaughter said. EPIC has long urged the FTC to impose clear privacy obligations on companies that collect and use personal data, including by exercising the Commission's underused rulemaking power. In 2020, EPIC filed a petition with the FTC calling on the Commission to conduct a rulemaking on the use of artificial intelligence in commercial settings. "By defining unfair and deceptive practices ex ante, and with specificity, a trade regulation rule would make it easier for the FTC to take action against parties that harm consumers," EPIC explained. (Mar. 25, 2021)

  • The Massachusetts Supreme Judicial Court ruled today that Facebook could be required to disclose to the Attorney General certain factual information about privacy-abusive apps discovered during the company's investigation into the Cambridge Analytica scandal. Facebook had claimed that all information it collected was protected by attorney-client and attorney work product privileges because the company's investigation was led by attorneys in anticipation of litigation. The Massachusetts high court disagreed that the attorney-client privilege applied to all of the records, and remanded to the trial court to determine if the records contain factual work product that must be turned over to the Attorney General. EPIC filed an amicus brief in the case urging the court to "reject Facebook's attempt to use litigation threats as an excuse to prevent the facts of its breach of user trust from coming to light." EPIC has fought for transparency and accountability for Facebook's privacy abuses for over a decade, from filing the original FTC Complaint in 2009 that led to the FTC's 2012 Consent Order with the company, to moving to intervene in and filing an amicus brief challenging the FTC's 2019 settlement with Facebook. (Mar. 24, 2021)

  • This week, the U.S. Supreme Court denied a petition for review in In re: Facebook, Inc. Internet Tracking Litigation, a case challenging Facebook's use of "cookies" to track internet browsing activity even when users were logged out of their Facebook accounts. The U.S. Court of Appeals for the Ninth Circuit held that Facebook's use of cookies to track Internet users browsing other websites might violate the federal Wiretap Act because Facebook was not an authorized party to those communications. Facebook's efforts to get the Supreme Court to reject this holding of the Ninth Circuit failed, and now the case will move forward. EPIC filed an amicus brief in the Ninth Circuit in this case and has filed briefs opposing settlements in other cases challenging cookie-based surveillance. EPIC has long advocated against the use of cookies and other surveillance tools to track people online. EPIC continues to advocate for clear rules and restrictions on web tracking as companies replace cookies with new surveillance techniques that would do little to protect privacy online. (Mar. 22, 2021)

  • EPIC filed a series of open government requests seeking information on fusion centers' role in monitoring Black Lives Matter protests during the summer of 2020 and on fusion centers' possession of advanced surveillance technologies including location tracking services, cell phone data extraction tools, facial recognition, and social media monitoring tools. EPIC sent requests to federally funded fusion centers in Pennsylvania, South Carolina, Northern California, and North Dakota. Fusion centers are state or regional intelligence units that provide police with access to advanced surveillance technologies while relaying information to the Department of Homeland Security. EPIC previously urged DHS's Data Privacy and Integrity Advisory Committee (DPIAC) to investigate fusion centers and recommend ending their federal funding. (Mar. 18, 2021)

  • Yesterday, Megan Iorio, counsel at EPIC, presented oral argument as a friend of the court in Bozzi v. Jersey City, a New Jersey Supreme Court case concerning a commercial open government request for names and addresses of dog license registrants. The lower court found no privacy interest in the information and ordered its release. Ms. Iorio urged the court to reverse and to find that personal information in government records should only be disclosed when a government transparency interest could be served by disclosure. The argument drew on the historic and constitutional origins of the right to informational privacy, federal courts' interpretation of the Freedom of Information Act's privacy exemptions, and New Jersey's strong constitutional right to privacy. EPIC filed an amicus brief in the case and argued before the court last year in State v. Andrews about whether an individual can be compelled to disclose their cell phone passcode. (Mar. 16, 2021)

  • The Federal Trade Commission's 2013 failure to sue Google for antitrust violations went against the advice of FTC staff and disregarded evidence of Google's growing market dominance, according to records obtained by Politico. FTC antitrust attorneys advised the Commission to bring suit against Google to block future deals with mobile companies making Google an exclusive search provider. But the Commission rejected that recommendation on the view that mobile search was only a small part of the search market, a conclusion that quickly proved outdated. The records published by Politico also reveal that Amazon and Facebook—both of which are now facing their own antitrust proceedings—privately pushed the FTC to take enforcement action against Google. Google's anticompetitive practices in search and targeted advertising are the basis of two antitrust lawsuits brought by the Department of Justice and state attorneys general last year. On Monday, Texas announced that it would broaden its lawsuit to cover Google's planned replacement for third-party cookies—so-called "FLoCs"—which would do little to protect privacy but further consolidate Google's market power. EPIC has long targeted anticompetitive practices by Google, including its acquisition of DoubleClick and bias in YouTube search rankings. EPIC also helped bring about the FTC's 2011 order establishing privacy safeguards for Google users and sued when Google violated that order. (Mar. 16, 2021)

  • California Attorney General Xavier Becerra has announced updated regulations under the California Consumer Privacy Act (CCPA) that ban so-called “dark patterns” that delay or obscure the process for opting out of the sale of personal information. Specifically, the regulations prohibit companies from burdening consumers with confusing language or unnecessary steps such as forcing them to click through multiple screens or listen to reasons why they shouldn’t opt out. "These protections ensure that consumers will not be confused or misled when seeking to exercise their data privacy rights," said Attorney General Becerra. Dark patterns "are design features used to deceive, steer, or manipulate users into behavior that is profitable for an online service, but often harmful to users or contrary to their intent." Last month, EPIC filed a complaint with the D.C. Attorney General alleging that Amazon unlawfully employs manipulative "dark patterns" in the Amazon Prime subscription cancellation process. Next month, the FTC plans a workshop on "Bringing Dark Patterns to Light." (Mar. 16, 2021)

  • In celebration of Sunshine Week, EPIC has unveiled the 2021 FOIA Gallery. Since 2001, EPIC has annually published highlights of EPIC's most significant open government cases and documents obtained through government records requests. For example, EPIC's 19-month legal effort in EPIC v. DOJ resulted in the release of new sections from the previously redacted Mueller Report, including details about Roger Stone and passages concerning decisions by Special Counsel Mueller to not charge particular individuals with criminal offenses. EPIC also prevailed twice in EPIC v. AI Commission, in which the court forced the Commission to hold public meetings and disclose thousands of pages of records to EPIC. In this year's FOIA gallery, EPIC also highlights records about DHS's initial response to election cybersecurity threats, a DOJ report on predictive policing and AI, records about contact tracing efforts from North Dakota and Utah, and records about CBP's electronic device border search audits. (Mar. 15, 2021)

  • In a joint statement, the Department of Health and Human Services (HHS) and the Department of Homeland Security (DHS) terminated a 2018 agreement that formalized the practice of using information obtained from unaccompanied migrant children to deport relatives and other potential sponsors. EPIC previously urged HHS to abandon the practice of sending this data to DHS when the agency proposed a rule in 2018 to formalize the policy. EPIC argued the proposed rule conflicted with a Privacy Impact Assessment and would undermine the welfare of unaccompanied children. EPIC also joined over 100 other groups to call for an end to the practice, stating that DHS has "taken a process designed to protect children and made it into a tool that uses them to find and deport their families." EPIC has previously warned Congress about the misuse of immigrant data by DHS. (Mar. 12, 2021)

  • Around 150,000 networked and facial recognition-capable security cameras located in hospitals, schools, homes, and prisons were accessed in a security breach of Verkada, a surveillance company. The breach exposed vulnerable populations surveilled by Verkada’s cameras and highlights the degree to which unregulated surveillance and data collection are ubiquitous within the United States. Verkada’s software offerings include facial recognition tools, exacerbating the risks created by its surveillance systems. EPIC, along with a coalition of advocates, warned about similar risks for Amazon’s Ring Doorbell and called for a ban on facial recognition as well as regulation of surveillance and data governance. (Mar. 12, 2021)

  • In a letter to Secretary of Homeland Security Alejandro Mayorkas, EPIC and a coalition of civil rights, civil liberties, immigrants' rights, technology, and privacy organizations urged the agency to rescind a Notice of Proposed Rulemaking massively expanding Customs and Border Protection's (CBP's) use of biometrics, and to suspend the use of facial recognition across DHS. The NPRM was originally issued November 19, 2020 and re-published on February 9, 2021 in a sign that DHS and the Biden Administration intend to go forward with the rulemaking. EPIC submitted comments on the original NPRM, urging CBP to suspend its use of facial recognition or, in the alternative, to use only 1:1 face comparison. Earlier, EPIC voiced opposition to a broader DHS rulemaking authorizing widespread use of biometrics, including facial recognition, throughout the agency. (Mar. 10, 2021)

  • EPIC, as part of the open government case EPIC v. AI Commission, has obtained additional records from the National Security Commission on Artificial Intelligence. The documents include further internal emails from Commission chair and former Google CEO Eric Schmidt. The Commission recently issued its final report on the use of AI in national security and defense settings. The report makes key recommendations concerning AI impact assessments and audits but fails to propose substantive limits on AI use for Congressional enactment, as EPIC urged the Commission to do last year. EPIC successfully sued the AI Commission in 2019 to enforce its transparency obligations, forcing the Commission to hold open meetings and disclose thousands of pages of records. The case is EPIC v. AI Commission, No. 19-2906 (D.D.C.). (Mar. 9, 2021)

  • EPIC has filed an amicus brief in TransUnion LLC v. Ramirez, urging the U.S. Supreme Court to hold that people can sue when their privacy rights are violated, regardless of whether they allege that the violation led to other harms. The case concerns a suit brought under the Fair Credit Reporting Act (FCRA), one of many laws that create privacy rights for individuals to help them maintain control over their personal information. Ramirez and many others sued after TransUnion violated the FCRA, but the company argued that they don't have "standing" to sue. Other tech companies also filed a brief arguing that the Supreme Court should limit standing in privacy lawsuits. Standing is a constitutional doctrine that dictates when federal courts have authority to resolve cases. EPIC argued that privacy plaintiffs have standing to sue and that "standing was never meant to be a complicated inquiry or a substantial barrier to the vindication of legal rights." EPIC warned that "[c]ourts that require proof of consequential harm are usurping the legislative role and rewriting these privacy laws" because "it is not the business of courts to tell Congress which rights are enforceable, and which are not." EPIC previously filed an amicus brief in Spokeo and frequently files amicus briefs in cases interpreting standing under a variety of privacy laws. (Mar. 9, 2021)

  • EPIC, together with the ACLU and EFF, recently filed an amicus brief in Wisconsin v. Burch, urging the Wisconsin Supreme Court to stop police from conducting warrantless forensic searches of cell phones and indefinitely retaining the data based on vague consent forms. The defendant in the case had verbally consented to a limited search of his text messages during a hit-and-run investigation. Police then asked him to sign a vague consent form that did not specify his phone would be forensically analyzed and the data stored indefinitely. Police used a forensic device to download the entire contents of the phone, retained a full copy, and disclosed data that was outside the scope of his limited verbal consent to another department for use in an unrelated investigation. In their brief, EPIC, ACLU, and EFF argued that someone who consents to a limited search does not reasonably expect police may access, copy, and store vast amounts of personal information held on their phone. These searches violate the Fourth Amendment by “enabl[ing] the State to rummage at will among a person’s most personal and private information whenever it wanted, for as long as it wanted” without a warrant. EPIC regularly files amicus briefs challenging unlawful access to cell phone data. (Mar. 8, 2021)

  • Virginia Governor Ralph Northam has signed the Virginia Consumer Data Protection Act into law. "It is good to see Virginia and other states taking action to protect the privacy of their residents. States have always played a key role in establishing privacy protections," EPIC Policy Director Caitriona Fitzgerald said. "But in 2021 we need a more comprehensive and proactive approach to privacy than what Virginia adopted. We need privacy laws in the United States that address current business practices and protect individuals from all forms of corporate surveillance, algorithmic unfairness, manipulative design, and discrimination. We need privacy laws that minimize the data collected about us and encourage innovation in privacy enhancing technologies. And we need robust enforcement of these rules to make sure that the underlying business practices actually change." (Mar. 3, 2021)

  • The National Security Commission on Artificial Intelligence has issued its final report on the use of AI in national security and defense settings. The report urges Congress and the President to implement key safeguards on federal AI deployment, including mandating AI impact and risk assessments, updating standards for Privacy Act notices and privacy impact assessments, establishing an independent auditor for AI systems, empowering the Privacy and Civil Liberties Oversight Board to conduct AI oversight, and establishing a task force to recommend legal restrictions on the use of AI. However, the report fails to propose any substantive limits on AI use for Congressional enactment, as EPIC urged the Commission to do last year. "Unless express, binding limits on the use of AI are established now, the technology will quickly outpace our collective ability to regulate it," EPIC wrote. "The Commission cannot simply kick the can down the road, particularly when governments, civil society, and private sector actors have already laid extensive groundwork for the regulation of AI." Controversially, the AI Commission's final report also fails to endorse a ban on the use of autonomous weapons. The report was approved at the Commission's final meeting, which was open to the public as a result of EPIC's lawsuit. EPIC successfully sued the AI Commission in order to enforce its transparency obligations, forcing the Commission to hold open meetings and disclose thousands of pages of records. The case is EPIC v. AI Commission, No. 19-2906 (D.D.C.). (Mar. 2, 2021)

  • EPIC has filed a complaint with the D.C. Attorney General alleging that Amazon unlawfully employs manipulative "dark patterns" in the Amazon Prime subscription cancellation process. Dark patterns "are design features used to deceive, steer, or manipulate users into behavior that is profitable for an online service, but often harmful to users or contrary to their intent." Amazon employs dark patterns when customers attempt to cancel their Amazon Prime subscriptions, effectively preventing them from ending their memberships, charging users recurring fees, and continuing to collect, retain, and use the personal data of misdirected subscribers. EPIC's complaint calls on the D.C. Attorney General to halt Amazon's use of dark patterns. EPIC also warned the company that it is prepared to file suit under D.C.'s consumer protection law if Amazon fails to correct its unlawful business practices. EPIC recently signed onto a coalition letter urging the FTC to investigate Amazon's use of dark patterns in the Prime cancellation process. (Feb. 28, 2021)

  • In a letter to the Biden administration, EPIC and a coalition of 40 privacy, immigration, and civil liberties organizations urged the administration to abandon the proposed U.S. Citizenship Act of 2021 as an extension of the Trump administration's border policy. The proposed legislation would direct DHS to deploy a bevy of biometric and other surveillance technologies at points of entry and along the southern border. The letter describes how such technologies endanger the lives of migrants by pushing them onto more dangerous travel routes. The use of surveillance technologies at the border inevitably extends into the interior, where they are deployed against protesters, communities of color, and indigenous peoples. EPIC recently urged DHS to rescind a proposed rule increasing the agency's collection of biometric information. (Feb. 25, 2021)

  • In comments to the New York Police Department, EPIC called for meaningful limits on the use of mass surveillance technologies including facial recognition, airplanes and drones, automated license plate readers, and social media monitoring tools. EPIC also joined with privacy and civil liberties advocates and academics in coalition comments urging the NYPD to make a good faith effort to meet the requirements of the Public Oversight of Surveillance Technologies (POST) Act. The POST Act requires the NYPD to publish impact statements and use policies for 36 surveillance technologies. The Department's draft policies fail to disclose necessary information including detailed data storage, retention, and auditing practices, do not name the vendors of these technologies, and gloss over systemic racial discrimination in the use of these technologies with boilerplate language. The disclosures illuminate the use of technologies by the NYPD that enable mass surveillance and have extensive documented risks of bias and inaccuracy. EPIC leads a campaign to Ban Face Surveillance, and through the Public Voice coalition gathered support from over 100 organizations and experts from more than 30 countries. (Feb. 25, 2021)

  • EPIC and a coalition of 42 other organizations sent a letter to President Biden urging him to commit to making transparency a top priority in his new administration. President Biden has pledged to "bring transparency and truth back to government," and advocates like EPIC intend to hold his administration accountable to these promises. The group asked the President to, among other things: direct agencies to adopt new Freedom of Information Act guidelines that prioritize transparency and the public interest; direct the Attorney General to issue new FOIA guidance; assess, preserve, and disclose key records of the previous administration; endorse legislative improvements for public records laws like FOIA and the Public Records Act; and seek funding increases for public records laws. The letter emphasized that "[a]s our country's history has shown us time and time again, when government secrecy proliferates, so do civil liberties violations and obstacles to democratic accountability." EPIC's Open Government Project frequently makes use of the FOIA to obtain information from the government, often litigating to force disclosure of agency records that impact critical privacy interests. (Feb. 22, 2021)

  • The Department of Justice has, after more than three years, finally begun to respond to EPIC's request for cell phone surveillance orders issued by federal prosecutors. EPIC first requested copies of the orders in 2017 and then filed a lawsuit against the Justice Department in 2018 when the agency failed to respond. The agency has now begun issuing responses from 5 of its U.S. Attorneys' Offices. The first response is from the District of Delaware, and shows that from 2016-2019 the prosecutors in that office had 150 applications and orders for cell phone location data under § 2703(d). Over that same period the attorneys handled 351 criminal cases. EPIC is still waiting for responses from 4 of the agency's other prosecutors' offices. EPIC will maintain a comparative table as each district releases more information. Prosecutors do not currently release any comprehensive or uniform data about their surveillance of cell phone location data. In contrast, the Administrative Office of the U.S. Courts releases detailed reports each year about the use of federal wiretap authority. The U.S. Supreme Court ruled in 2018 in Carpenter v. United States that collection of cell phone location data without a warrant violated the Fourth Amendment. The case is EPIC v. DOJ, No. 18-1814 (D.D.C.). (Feb. 22, 2021)

  • In a coalition letter, EPIC and over 40 other privacy, civil liberties, and civil rights groups called on the Biden administration to 1) place a moratorium on federal use of facial recognition and other biometric technologies, 2) stop state and local governments from purchasing facial recognition services with federal funds, and 3) support the Facial Recognition and Biometric Technology Act. The coalition letter highlights the threat of facial recognition to create a panopticon of surveillance, the particular harms to people of color, women, and youth from misidentification by facial recognition, and the widespread adoption of facial recognition without public input. Last year, EPIC and a coalition of privacy, civil liberties, and civil rights groups urged Congress to pass Senator Markey's Facial Recognition and Biometric Technology Act. In 2019, EPIC launched a campaign to Ban Face Surveillance and, through the Public Voice coalition, gathered the support of over 100 organizations and many leading experts from more than 30 countries. (Feb. 17, 2021)

  • Christine Wilson, one of four current members of the Federal Trade Commission, said Friday that she is open to using the FTC's rulemaking authority to regulate data privacy. "I would hope that Congress will act, but if Congress doesn't act, maybe we do spend that time," Politico quoted Commissioner Wilson as saying during a Silicon Flatirons event. EPIC has long urged the FTC to impose clear privacy obligations on companies that collect and use personal data, including by exercising the Commission's underused rulemaking power. In 2020, EPIC filed a petition with the FTC calling on the Commission to conduct a rulemaking on the use of artificial intelligence in commercial settings. "By defining unfair and deceptive practices ex ante, and with specificity, a trade regulation rule would make it easier for the FTC to take action against parties that harm consumers," EPIC explained. Acting FTC Chair Rebecca Kelly Slaughter and Commissioner Rohit Chopra have previously signaled their support for using the FTC's rulemaking authority to address consumer privacy issues. (Feb. 12, 2021)

  • EPIC Interim Associate Director and Policy Director Caitriona Fitzgerald will testify today before the Maryland Senate Committee on Finance in support of stronger authentication methods to protect consumers. Senate Bill 185 requires financial institutions that choose to use security questions as an authentication method to provide customers with more than one security question option. EPIC noted that many stronger authentication methods are available today and that financial institutions should no longer rely on basic security questions. "The requirement that your password contain one uppercase letter, one lowercase letter, one symbol, and one number is meaningless if all that is required to bypass that password is your pet’s name," EPIC told the Committee. But, EPIC said, if security questions are going to be used, institutions should ensure that multiple question options are offered and that users are permitted to answer the questions with randomly-generated, password-like answers rather than factual, semantic answers. (Feb. 9, 2021)
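
    The recommendation above can be illustrated with a minimal sketch (hypothetical, not any institution's actual system): treat a security-question answer like a password by generating it at random and storing only a salted hash, rather than relying on a guessable fact such as a pet's name.

```python
# Hypothetical illustration: generate a random, password-like answer to a
# security question and store only a salted hash of it, never the plain text.
import hashlib
import secrets

def generate_random_answer(nbytes=16):
    """Return a random, password-like answer the user can keep in a password manager."""
    return secrets.token_urlsafe(nbytes)

def hash_answer(answer, salt=None):
    """Salt and hash the answer; a production system would use a slow KDF such as scrypt or Argon2."""
    salt = salt if salt is not None else secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", answer.encode("utf-8"), salt, 100_000)
    return salt, digest

if __name__ == "__main__":
    answer = generate_random_answer()  # random, unlike a factual answer such as "Fluffy"
    salt, stored = hash_answer(answer)
    # Verification later: re-hash the submitted answer with the same stored salt.
    assert hash_answer(answer, salt)[1] == stored
    print("random answer to record in a password manager:", answer)
```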

  • EPIC and the National Consumer Law Center have filed an amicus brief in Lindenbaum v. Realgy, LLC, urging the Sixth Circuit to reject immunity for illegal robocalls made between 2015 and 2020. The case follows the Supreme Court’s decision in Barr v. American Association of Political Consultants, in which the Court held that an exception added in 2015 to the decades-old robocall restriction was unconstitutional and must be severed from the broad robocall ban. As defendant in a separate robocall suit, Realgy argued that the Supreme Court’s decision meant that the broad robocall ban was unenforceable for the period that the unconstitutional exception was in effect, from 2015-2020. The district court agreed and granted Realgy’s motion to dismiss. EPIC and NCLC filed an amicus brief arguing that granting robocallers immunity “would reward those who made tens of billions of unwanted robocalls and deprive consumers of any remedy for the incessant invasion of their privacy.” EPIC regularly files amicus briefs supporting consumers in illegal robocall cases. (Feb. 2, 2021)

  • In comments responding to the National Institute of Standards and Technology's draft Federal Information Processing Standards for personal identity verification (ID cards and digital identity verification), EPIC urged the agency to adopt more privacy protective technology for federal employees and contractors. EPIC drew upon expertise from the Advisory Board for these comments. EPIC recently urged the Department of Homeland Security to suspend a new counterintelligence system of records which will collect biometric information. EPIC previously urged the Department of Transportation to provide more privacy protections for federal employees in the Insider Threat database. (Feb. 2, 2021)

  • EPIC presented the 2021 International Privacy Champion Awards this week to Justice K.S. Puttaswamy, retired Justice of the Karnataka High Court and lead plaintiff in the case that established the constitutional right to privacy in India and challenged the country’s mandatory biometric data collection program Aadhaar, and Senior Advocate Shyam Divan, who was the lead attorney on the case. EPIC Interim Executive Director Alan Butler emphasized the significance of the Puttaswamy case, noting that it “was groundbreaking in ways that will reverberate for decades to come.” The decision supports the recognition of privacy as a fundamental human right, and the case forced limits on the collection of biometric data in the world’s second most populous country. The ceremony took place online at the annual conference on Computers, Privacy, and Data Protection. (Jan. 28, 2021)

  • The Hamburg Data Protection Authority has ruled that Clearview AI’s searchable database of biometric profiles is illegal under the EU’s GDPR and ordered the U.S. company to delete the claimant’s biometric profile. Clearview AI scrapes photos from websites to create a searchable database of biometric profiles. The database, which is marketed to private companies and U.S. law enforcement, contains over 3 billion images gathered from websites and social media. The claimant submitted a complaint to the Hamburg DPA after discovering that Clearview AI had added his biometric profile to the searchable database without his knowledge or consent. The DPA ordered Clearview to delete the mathematical hash values representing his profile but did not order Clearview to delete his captured photos. The DPA’s narrow order protects only the individual complainant because it is not a pan-European order banning the collection of any EU resident’s photos. The DPA decided that Clearview AI must comply with the GDPR, yet this narrow order places the burden on Europeans to have their profiles removed from the database. EPIC has long opposed systems like Clearview AI, filing an amicus brief before the Ninth Circuit defending an individual's right to sue companies that violate BIPA and other privacy laws, submitting FOIA requests with several government agencies that use Clearview AI technology, and urging the Privacy and Civil Liberties Oversight Board to recommend the suspension of face surveillance systems across the federal government. (Jan. 28, 2021)

  • EPIC Senior Counsel Jeramie Scott testified today to Senate and House Committees of the Maryland General Assembly in support of legislation protecting biometric information privacy. HB218 and SB16 are modeled after the Illinois Biometric Information Privacy Act (BIPA). Passed in 2008, BIPA has been referred to as one of the most effective and important privacy laws in America. "Unlike a password or account number, a person’s biometrics cannot be changed if they are compromised," EPIC told the Committees. EPIC stressed the importance of strong enforcement measures in privacy laws, particularly a private right of action. EPIC also submitted a recent case study on the Illinois law written by EPIC Advisory Board member Woody Hartzog. EPIC previously filed an amicus brief in Rosenbach v. Six Flags, where the Illinois Supreme Court unanimously decided that consumers can sue companies that violate the state's biometric privacy law. [Watch the hearing]

    (Jan. 27, 2021)

  • In a report released on January 20, the European Parliament outlines the need for new legal frameworks for artificial intelligence and biometric surveillance. The report raises objections to both civilian and military uses of artificial intelligence, mass surveillance, and deepfakes. The European Parliament was particularly concerned with facial recognition technology, proposing a moratorium on its use in public and semi-public spaces. EPIC leads a campaign to Ban Face Surveillance through the Public Voice coalition. (Jan. 22, 2021)

  • EPIC, as part of the open government case EPIC v. AI Commission, has obtained additional records from the National Security Commission on Artificial Intelligence. The documents include emails from Commission chair and former Google CEO Eric Schmidt illustrating Schmidt’s close relationship with members of Congress. The records also reveal that the ethics disclosure form Schmidt filed with the Commission—a document that usually tops out at a dozen pages—was 38 pages long. EPIC’s FOIA request was recently highlighted in an American Prospect article on Schmidt’s role in Rebellion Defense, “a shadowy defense startup” that markets AI systems to the Defense Department. EPIC has twice prevailed in its open government case against the AI Commission, forcing the Commission to hold public meetings and disclose thousands of pages of records. In recent comments, EPIC called on the AI Commission to "advise Congress, as the nation's highest policymaking authority, to establish government-wide principles and safeguards for the use and development of AI." The case is EPIC v. AI Commission, No. 19-2906 (D.D.C.). (Jan. 20, 2021)

  • EPIC Equal Justice Works Fellow Ben Winters testified today before the Washington Legislature in support of a bill to establish transparency and accountability around state automated decision-making and ban certain dangerous applications of AI. SB 5116 would require public, regularly updated algorithmic accountability reports on state uses of automated decision-making systems, prohibit AI-enabled profiling that produces significant legal effects, and enact other baseline protections. EPIC has advocated for algorithmic transparency for several years, has issued calls to ban face surveillance, and tracks use of AI in the Criminal Justice System. (Jan. 20, 2021)

  • The Massachusetts Legislature has enacted a new law that prevents Massachusetts transit authorities from disclosing personal information related to individuals' transit system use for non-transit purposes and requires police to obtain a search warrant before accessing personal data collected by the authorities. The law resolves many of the issues raised in Commonwealth v. Zachery, a case pending before the Massachusetts Supreme Judicial Court in which the government obtained, without a warrant, location data generated by the defendant's use of a Massachusetts Bay Transportation Authority transit card. EPIC filed an amicus brief in the case. EPIC argued that disclosure of data collected by the transit authority should be limited to the purposes for which it was collected. EPIC further stated that "if the government seeks to access Charlie Card data for investigative purposes, it must do so with a warrant." The new law adopts both the disclosure limitation and warrant requirement that EPIC advocated for in its amicus brief to the Court. (Jan. 20, 2021)

  • The Federal Aviation Administration published the final rule for the operation of drones over people. The rule allows drones to operate over people without first obtaining a waiver to do so. The drone must meet certain requirements (e.g., the drone cannot have exposed rotating blades), and the rule generally does not allow sustained flight over large gatherings of people outside. EPIC, in comments to the agency, argued that all drones operating over people should broadcast identifying information. In response to comments by EPIC and others, the FAA's final rule prohibits the operation of drones over "open-air assemblies" unless the drone meets the broadcast ID requirement that takes effect in September 2023. Through lawsuits and previous comments to the FAA, EPIC has repeatedly argued the FAA has an obligation to implement privacy safeguards for drones before they operate regularly over people. (Jan. 15, 2021)

  • Recently unveiled changes to WhatsApp's terms of service highlight the privacy and legal objections EPIC has long raised to Facebook's 2014 acquisition of the messaging platform. In early January, WhatsApp introduced a revision to its privacy policy that seemed to require app users to share extensive personal data with Facebook—an apparent violation of the privacy protections that originally fueled WhatsApp's growth. The policy change drove many WhatsApp users to turn to other secure messaging platforms including Signal and Telegram. WhatsApp later delayed the revision of its terms of service by several months and argued that the change only affected "business communication," but the episode underscores the dangers of a company built on the exploitation of personal data acquiring a company that has made explicit privacy commitments to its users. In 2014, EPIC and the Center for Digital Democracy warned the FTC that Facebook routinely incorporates user data from companies it acquires and that WhatsApp users objected to the acquisition. The FTC approved the merger but told EPIC and CDD that "if the acquisition is completed and WhatsApp fails to honor these promises, both companies could be in violation of Section 5 of the FTC Act and potentially the FTC's order against Facebook." (Jan. 15, 2021)

  • Today, Google announced that it "completed its acquisition of Fitbit" in a $2.1 billion deal, even though the Department of Justice has not yet approved the merger. DOJ said that its investigation into the deal remains ongoing, and "[a]lthough the division has not reached a final decision about whether to pursue an enforcement action, the division continues to investigate whether Google's acquisition of Fitbit may harm competition and consumers in the United States." The announcement comes after Google gained EU antitrust approval for its Fitbit bid last month subject to limits on how it will use consumers' data, including pledging to not use Fitbit data for advertising purposes in Europe. EPIC has long opposed Google's acquisition of Fitbit, citing concerns about Google's history of data protection and privacy violations. In November 2019, EPIC told the House Judiciary Committee that the FTC should block the acquisition. EPIC brought the 2012 case against the FTC for the agency's failure to enforce the 2011 consent order against Google after the company consolidated user data across multiple services. (Jan. 14, 2021)

  • EPIC submitted comments to the Department of Homeland Security in response to a system of records notice and proposed exemptions from Privacy Act requirements for a new counterintelligence records system. DHS's proposed records system would permit nearly limitless collection of sensitive personal information and unchecked disclosure of that information to state, local, and international agencies, and to private companies. DHS's proposed exemptions would eliminate all individual rights under the Privacy Act and exempt DHS from basic Privacy Act requirements, including limiting data collection to necessary information. EPIC recently urged DHS to rescind a proposed expansion of the use of biometrics, including facial recognition, across the agency. (Jan. 13, 2021)

  • The National Artificial Intelligence Initiative Office, created as part of the National Artificial Intelligence Initiative Act of 2020, was recently announced by the White House. According to the Act, the office will act as a point of contact for various federal artificial intelligence activities, conduct regular outreach about AI, and “promote access to and early adoption of the technologies, innovations, [and] lessons learned.” EPIC has recently submitted comments to the Office of Management and Budget and the National Security Commission on Artificial Intelligence advising the agencies to follow the Universal Guidelines for AI and push for actionable legal rights to protect against algorithmic harms. (Jan. 13, 2021)

  • HireVue, a major vendor of AI-based hiring tools, announced today that it will stop relying on "facial analysis" to assess job candidates. The move comes a year after EPIC filed a Federal Trade Commission complaint targeting HireVue's use of opaque algorithms and facial recognition. EPIC argued that HireVue's AI tools—which the company claimed could measure the "cognitive ability," "psychological traits," "emotional intelligence," and "social aptitudes" of job candidates—were unproven, invasive, and prone to bias. EPIC also highlighted HireVue's deceptive claim that it did not use facial recognition in its assessments. In announcing the change, HireVue acknowledged the public outcry over its use of facial analysis and said the technology "wasn't worth the concern." However, HireVue will continue to analyze biometric data from job applicants including speech, intonation, and behavior—all of which present similar privacy and discrimination risks. EPIC advocates for a moratorium on facial recognition and recently filed a complaint with the D.C. Attorney General explaining how online test proctoring companies use opaque, unreliable AI tools to monitor students. (Jan. 12, 2021)

  • European Digital Rights (EDRi), along with 61 civil society groups including EPIC, sent a letter today calling on the EU to draw certain red lines in the European Commission's upcoming proposal on Artificial Intelligence. The letter calls on the EU to prohibit biometric mass surveillance, the use of AI at the border, the use of AI for social scoring, and the use of predictive policing and other AI criminal risk assessment tools. "Without regulatory limits on the use of AI-based technologies," the letter says, "we face the risk of violations of our rights and freedoms by government and companies alike." EPIC has called for a moratorium on the use of face surveillance, and maintains resources on AI in the criminal justice system. (Jan. 12, 2021)

  • The Federal Trade Commission has reached a settlement with Everalbum, Inc., a California-based developer of a photo storage app, over allegations that it deceived consumers about its use of facial recognition technology and its retention of the photos and videos of users who deactivated their accounts. The proposed order requires the company to delete the facial recognition technologies it illegally developed using user photos and videos. According to the FTC complaint, Everalbum represented that it would not apply facial recognition technology to users’ content unless users affirmatively chose to activate the feature. But the company allowed some Ever app users—those located in Illinois, Texas, and Washington state—to choose whether to turn on the face recognition feature, even though it was automatically active for all other users and could not be turned off. Commissioner Rohit Chopra noted in an accompanying statement that residents of those states were afforded stronger protections because their legislatures had passed laws regulating facial recognition and biometric identifiers. Everalbum's differential treatment of users illustrates why Congress must ensure that any proposed federal privacy law sets a baseline for the country while protecting the ability of states to enact stronger privacy laws. (Jan. 11, 2021)

  • The American Civil Liberties Union and the Electronic Frontier Foundation have asked the U.S. Supreme Court to reverse the New Jersey Supreme Court's decision in State v. Andrews, which allows the government to compel an individual to disclose their cell phone passcodes. EPIC filed an amicus brief in Andrews and presented oral argument to the New Jersey Supreme Court arguing that the vast troves of data stored in a cell phone require strong constitutional protections. State supreme courts have disagreed about the extent to which individuals are protected from compelled disclosure of their cell phone passcode. Some courts, like New Jersey and Massachusetts, have applied the "foregone conclusion" exception to require individuals to divulge their passcodes. Others, like Pennsylvania and Indiana, have refused to apply that exception and found that the Constitution protects against compelled disclosure of cell phone passcodes. (Jan. 11, 2021)

  • The Supreme Court has granted review in Americans for Prosperity v. Becerra to decide whether the First Amendment protects donors to charities from compulsory disclosure of their identifying information. A California law requires charitable organizations to identify donors who contribute above a certain amount annually in a form filed with the state. Americans for Prosperity and other charitable organizations challenged the law, arguing that the reporting requirement violates First Amendment rights to speech and association. The Ninth Circuit ruled that the law did not violate the First Amendment. EPIC filed an amicus brief in the Ninth Circuit, arguing that donor privacy is an important tradition and that, contrary to California's assurances, the data was at risk of public disclosure. EPIC frequently files briefs in First Amendment cases, including several before the Supreme Court. (Jan. 11, 2021)

  • The Federal Aviation Administration posted the agency's final rule for remote drone identification. The final rule will require all drones to broadcast drone ID information in real-time, eliminating the option in the proposed rule to forgo real-time broadcast and only submit drone ID information for retention by a third party. EPIC previously commented on the FAA's proposed rule, urging the FAA to require all drones to provide real-time public access to drone ID information. In 2015, EPIC argued that drones should be required to broadcast relevant information to the public while in operation. (Jan. 6, 2021)
