Background

EPIC’s Screening & Scoring Project produces comprehensive resources that identify instances of scoring and screening in everyday life, articulate common issues with these tools, and analyze potential violations of existing law arising from their use. Through this work, EPIC aims to protect the public from the algorithmic harms these tools may cause.

Automated decision-making tools are systems that analyze data in order to aid or replace human decision-making. They range from simple scoresheets and rules-based algorithms to complex machine learning models, and they may be developed in-house by local or state agencies or purchased from private companies.
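To make the simpler end of that spectrum concrete, below is a minimal, hypothetical sketch of a scoresheet-style screening tool. Every factor, weight, and cutoff is invented for illustration; real systems vary widely and are often far less transparent.

```python
# Hypothetical scoresheet-style screening tool. All factors, weights,
# and the threshold here are invented for illustration only.

WEIGHTS = {
    "credit_score_normalized": 0.5,   # credit score scaled to a 0-1 range
    "income_to_rent_ratio": 0.3,      # income relative to rent, capped at 1
    "years_at_prior_address": 0.2,    # residential-stability proxy, scaled 0-1
}
APPROVAL_THRESHOLD = 0.6  # arbitrary cutoff

def screen_applicant(applicant: dict) -> bool:
    """Return True if the applicant's weighted score clears the cutoff."""
    score = sum(WEIGHTS[factor] * applicant[factor] for factor in WEIGHTS)
    return score >= APPROVAL_THRESHOLD

# 0.5*0.7 + 0.3*0.5 + 0.2*0.4 = 0.58, just below the cutoff, so this
# applicant is rejected -- potentially with no human review and no
# explanation of which factors drove the outcome.
print(screen_applicant({
    "credit_score_normalized": 0.7,
    "income_to_rent_ratio": 0.5,
    "years_at_prior_address": 0.4,
}))  # False
```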

Everyone is Screened and Scored 

Without notice or consent, people are often screened and scored at important junctures in their lives. 

When applying for a job, applicants encounter dubious facial or voice analysis, “emotion detection,” gamified assessments that purport to measure “fit,” and resume scanners. 

Students and employees are subject to monitoring software designed to control behavior and detect potential wrongdoing. In some districts across the country, students are assigned to schools based on opaque automated screening algorithms. Algorithms used in more than 100 high schools in New York City, for instance, reportedly consider variables like test scores, attendance, and behavioral records.

Prospective tenants are screened through credit scoring, reputation scoring, and/or tenant screening tools. Third-party companies offering these tenant screening tools collect, store, and select records for housing providers to use in evaluating tenants. 

Individuals, and medical patients in particular, often have to interface with algorithms that determine public benefit eligibility, flag suspected fraud, prioritize patients for care, aid medical diagnostics, and more. 

Child welfare agencies use risk assessment systems to help identify children who are at high risk of experiencing abuse or neglect. These risk assessments are used to determine whether the agency should initiate a family intervention, which could range from connecting parents with resources to removing children from the home. 

Many government offices use risk assessments and other automated tools to screen potential contracting partners. Commercial financial risk assessments evaluate a business’s financial health and creditworthiness. Much like individual credit scores, business credit reports affect how much credit a business can obtain, what interest rates and repayment terms are attached to its loans, and what insurance premiums it will have to pay. 

Most people in the U.S. have a credit score and have had to use it to obtain services. Credit reporting agencies have broad access to consumer information, and this data is used in a myriad of decisions beyond lending. Employers, utility providers, insurance companies, and landlords, among others, use credit scores to decide whether to offer their services. 

Controversial risk assessment tools are used in the criminal justice system to set bail, determine criminal sentences, and even contribute to determinations of guilt or innocence. Predictive policing tools are used to over-surveil and over-police communities of color. 

These Tools are Problematic 

As more entities rely on these tools to supplement and replace human decision-making, the inherent risks associated with these tools become more urgent. Scoring and screening tools are problematic for a variety of reasons, including:  

  • Bias and Accuracy Issues: These tools have significant bias and accuracy problems. Assumptions built into a system can reinforce existing biases and inequalities, and issues with data sources, such as limited data sets or a lack of external validation studies, often result in inaccurate screening and scoring outcomes. 
  • Insufficient Transparency and Accountability: These tools are subject to woefully insufficient transparency and accountability requirements. There is often little transparency about how the tools work, what data points they use, and the logic behind each automated decision. It is often unclear whether these tools have undergone proper validation studies for accuracy and reliability. 
  • Opacity and Secrecy: These tools obscure the decision-making process. Because many of these tools are proprietary third-party systems, companies often refuse to disclose information about how their systems work.
  • Lack of Notice and Denial of Procedural Due Process: Entities using these systems often fail to provide adequate notice, if any, to individuals being scored and screened. The use of these tools can erode procedural due process protections because people do not know how the tools make determinations or what types of data are used. These tools also limit an individual’s ability to challenge outcomes affecting their eligibility for services, jobs, housing, or benefits, because affected people do not understand how the tools work.
  • Discrimination: These tools lead to flawed decisions that contribute to inequity, often replicating and exacerbating bias. These tools are often not designed to account for discrimination and instead use biased training data and modeling to predict outcomes. They illustrate how inconsistent and highly opaque algorithms can marginalize communities of color and people living with disabilities. (One simple way to measure this kind of disparity is sketched just below this list.)
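A common yardstick for the disparities described above is the “four-fifths rule” used in U.S. employment discrimination analysis: if a tool’s selection rate for one group falls below 80% of the rate for the most-favored group, the disparity is generally treated as evidence of adverse impact. The sketch below applies that rule to invented numbers; no real tool or data set is depicted.

```python
# Illustration of the "four-fifths rule" for measuring adverse impact.
# All applicant counts below are invented for this example.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of a group's applicants that the tool approves."""
    return selected / applicants

# Suppose an automated screen passes 60 of 100 applicants from
# group A but only 30 of 100 applicants from group B.
rate_a = selection_rate(60, 100)  # 0.60
rate_b = selection_rate(30, 100)  # 0.30

# Ratio of the disadvantaged group's rate to the most-favored group's rate.
impact_ratio = rate_b / rate_a  # 0.50

# A ratio below 0.80 is generally treated as evidence of adverse impact.
print(f"Adverse impact ratio: {impact_ratio:.2f}")
```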

These Tools Need to be Regulated

Although civil rights laws protect against certain types of disparate impact on members of protected classes, there is insufficient regulation of these screening and scoring tools. Lawmakers must develop standards for regulating these tools and hold the entities deploying them accountable. There must be oversight and enforcement capable of mitigating and remedying the harms these tools cause, as well as stronger transparency and accountability mechanisms. 

EPIC’s Screening & Scoring Work

Many areas of EPIC’s work involve aspects of screening and scoring. Whether in the criminal justice, surveillance and profiling, consumer protection, housing, or credit scoring context, EPIC has advocated for algorithmic transparency and the regulation of these tools. EPIC has utilized open records laws and has litigated for the release of information about the government’s screening and scoring tools. EPIC has filed complaints with the Federal Trade Commission and the D.C. Attorney General asking them to investigate companies peddling these tools. EPIC has also submitted public comments to government agencies about the use of these tools. 

Some examples of EPIC’s work:

D.C.’s Use of Automated Decision-Making Systems

EPIC spent over 14 months investigating the D.C. government’s use of automated decision-making systems, culminating in the release of Screened & Scored in D.C. The report sheds light on the many uses of automated decision-making systems in the District as well as the problems associated with them. The report also includes vignettes drawn from real-world instances of being screened and scored, policy recommendations, and resources for residents affected by these systems. EPIC found 29 automated decision-making systems used by 20 D.C. agencies and compiled the information into a comprehensive table both within the report and online.

Criminal Justice 

EPIC published Liberty at Risk: Pre-Trial Risk Assessments in the U.S., a report that provides an overview of risk assessment tools to help practitioners and scholars understand the nature of these systems and the context in which they are used, and to focus their evaluations of the fairness of these systems. As part of its reporting, EPIC also obtained several documents about states’ use of criminal justice algorithms. 

EPIC sued the Justice Department for records concerning the government’s use of risk assessments and predictive policing techniques. EPIC’s case led to the disclosure of hundreds of pages of relevant records and revealed the existence of a previously unknown DOJ report to the White House about the use of predictive analytics in law enforcement. 

Through public records requests in six states, EPIC obtained documents about TrueAllele, a DNA forensic program whose source code is kept secret. Law enforcement used TrueAllele test results to establish guilt, even though the individuals accused of crimes had no access to the source code that produced those results. EPIC obtained validation studies, technical specifications, and other records about this controversial DNA forensic technique. 

Surveillance and Profiling

EPIC sued Customs and Border Protection for records about the agency’s Analytical Framework for Intelligence (AFI), an analytic tool used to assign risk assessments to travelers, including U.S. citizens traveling domestically. The agency uses AFI to analyze personally identifiable information from a variety of sources, such as government databases, commercial data brokers, and the internet. EPIC eventually obtained documents about this secretive scoring program.

EPIC sued the Department of Homeland Security for records about the agency’s Future Attribute Screening Technology (FAST) program. The FAST program was a “Minority Report”-style initiative that tried to determine the probability that an individual would commit a future crime. EPIC obtained the program’s Privacy Threshold Analysis, project presentation, technical requirements, and test-site installation records, among other documents. 

Consumer Protection

EPIC has sent several complaints to the FTC asking the agency to investigate screening and scoring tools related to education, hiring, student athlete profiling, and commercial data brokers. EPIC has urged the agency to investigate HireVue, the Universal Tennis Rating score, and ChoicePoint, and has petitioned the FTC to regulate the commercial use of AI.

EPIC also filed a complaint with the D.C. Office of the Attorney General calling on the AG to take enforcement action against five online test proctoring companies.

Housing

EPIC has urged the FTC to investigate Airbnb’s use of “Trustworthiness” scores. EPIC also commented on the Department of Housing and Urban Development’s implementation of the Fair Housing Act’s Disparate Impact Standard. 

Credit Scores

EPIC sent comments to the Consumer Financial Protection Bureau about revising its regulations and/or providing new guidance to financial institutions about their use of AI and machine learning systems. 
