If AI Made the Hiring Decision, You May Have a Case
When a Computer Says No: Understanding AI in Hiring
Artificial intelligence is changing how companies find and hire workers. Many employers now use software to screen resumes, rank candidates, and even conduct video interviews. These tools promise to save time and reduce human error, but there is growing concern that they can introduce or amplify discrimination, particularly against job seekers who already face bias based on race, gender, age, or disability.
If you applied for a job and were rejected without ever speaking to a human, an algorithm may have made that decision. And if that algorithm was biased, you may have legal options.
What Is Algorithmic Bias and How Does It Happen?
Algorithmic bias occurs when a computer system produces results that are unfair or discriminatory, even if no one intended that outcome. These systems learn from historical data — data that often reflects the biases of the past. If a company’s hiring history shows that mostly men were hired for certain roles, the AI may learn to prefer male candidates going forward, even without anyone telling it to do so.
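To see how this happens, consider a deliberately simplified sketch in Python. The resume snippets and the word-count scoring scheme below are hypothetical, invented purely for illustration (real screening tools rely on far more complex machine learning), but the failure mode is the same: because the historical hires skew male, the word "women's" picks up a penalty that no one ever programmed in.

```python
# Hypothetical illustration: a naive resume screener that scores each
# word by how often it appeared in past hires versus past rejections.
from collections import Counter

# Invented hiring history in which mostly men were hired.
past_hires = [
    "software engineer rugby club captain",
    "software engineer chess champion",
    "backend developer fraternity treasurer",
]
past_rejections = [
    "software engineer women's coding society founder",
    "backend developer women's chess club president",
]

def word_weights(hired, rejected):
    """Weight each word: (count among hires) - (count among rejections)."""
    hired_counts = Counter(w for doc in hired for w in doc.split())
    rejected_counts = Counter(w for doc in rejected for w in doc.split())
    return {w: hired_counts[w] - rejected_counts[w]
            for w in set(hired_counts) | set(rejected_counts)}

def score(resume, weights):
    """Sum the learned weights of the words in a resume."""
    return sum(weights.get(w, 0) for w in resume.split())

weights = word_weights(past_hires, past_rejections)

# Two candidates with identical qualifications; only one mentions "women's".
print(score("software engineer chess champion", weights))          # 3
print(score("software engineer women's chess champion", weights))  # 1
```

The point of the toy model is not realism but mechanism: the bias lives in the training data, so it survives even when protected traits like gender are never used as inputs.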
Here are some common ways that algorithmic bias shows up in hiring:
- Resume screening tools that filter out candidates based on words or formatting associated with certain demographic groups
- Video interview software that analyzes facial expressions or speech patterns in ways that may disadvantage people with disabilities or non-native speakers
- Scoring systems that favor candidates from certain zip codes, schools, or backgrounds — which can indirectly discriminate based on race or income
- Personality assessments built on data sets that do not fairly represent all groups of people
The problem is not always easy to spot. The AI makes its decisions quietly, behind the scenes, and most job seekers never know exactly why they were rejected.
Is This Legal? What the Law Says About AI and Hiring Practices
Federal law has long prohibited employment discrimination. Laws like Title VII of the Civil Rights Act, the Age Discrimination in Employment Act (ADEA), and the Americans with Disabilities Act (ADA) still apply — even when a computer is making the decision instead of a person.
The Equal Employment Opportunity Commission (EEOC) has made clear that employers cannot escape responsibility for discrimination simply by outsourcing the decision to an algorithm. If an AI tool produces results that have a disproportionate negative impact on a protected group of people, that can still be considered unlawful discrimination under a legal theory known as disparate impact.
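In practice, disparate impact is often screened with the EEOC's long-standing "four-fifths rule": if one group's selection rate is less than 80 percent of the rate for the most-selected group, that is generally treated as initial evidence of adverse impact. Here is a minimal sketch of that calculation in Python, using hypothetical applicant counts:

```python
# Four-fifths rule screen for disparate impact (hypothetical numbers).
def selection_rate(selected: int, applicants: int) -> float:
    return selected / applicants

# Invented outcomes from an automated resume screen.
rate_men = selection_rate(selected=60, applicants=100)    # 0.60
rate_women = selection_rate(selected=30, applicants=100)  # 0.30

# Impact ratio: each group's rate relative to the highest group's rate.
impact_ratio = rate_women / rate_men  # 0.50

if impact_ratio < 0.80:  # the four-fifths (80 percent) threshold
    print(f"Impact ratio {impact_ratio:.2f}: possible adverse impact")
```

The four-fifths rule is a rule of thumb rather than a legal bright line; agencies and courts also weigh statistical and practical significance. Still, the bias audits some jurisdictions now require center on impact-ratio calculations much like this one.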
Some states and cities are going even further. New York City, for example, now requires employers that use automated employment decision tools to undergo annual bias audits and to notify candidates when such tools are being used (Local Law 144). Illinois's Artificial Intelligence Video Interview Act requires employers to notify applicants and obtain their consent before AI is used to analyze video interviews. These are early steps, but they signal that lawmakers are taking this issue seriously.
How to Know If You May Have Been Discriminated Against
It is not always easy to prove that an AI system discriminated against you. However, there are signs that may suggest something went wrong in the hiring process:
- You were clearly qualified for the position but were rejected very quickly — possibly before any human reviewed your application
- You belong to a protected group, such as a racial minority, a woman, a worker age 40 or older, or someone with a disability
- The employer is known to use automated screening software
- You requested an accommodation during the application process and were denied or ignored
- You noticed a pattern of rejections from multiple companies using similar hiring platforms
None of these factors alone proves discrimination, but together they can point to a problem. If something felt off about the way your application was handled, it is worth investigating further.
What Legal Remedies Are Available to You?
If you believe you were a victim of algorithmic bias in a hiring process, there are several legal remedies you may be able to pursue. Employment discrimination law provides real options for people who have been treated unfairly, and AI-driven decisions are not immune from these protections.
Filing a Complaint with the EEOC
The first step in most federal employment discrimination claims is filing a charge with the EEOC. This agency investigates complaints and can take action against employers who violate anti-discrimination laws. You generally have 180 days from the date of the discriminatory act to file, extended to 300 days if a state or local agency also enforces a law prohibiting the same kind of discrimination.
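For concreteness, here is a minimal sketch of how those filing windows are counted, using a hypothetical rejection date. Which window applies depends on whether a state or local agency also covers your claim, so confirm the rule for your jurisdiction.

```python
# Counting the federal EEOC filing windows from a hypothetical date.
from datetime import date, timedelta

rejection_date = date(2024, 3, 1)  # hypothetical date of the adverse decision

deadline_180 = rejection_date + timedelta(days=180)  # default federal window
deadline_300 = rejection_date + timedelta(days=300)  # states with their own agency

print(f"180-day deadline: {deadline_180}")  # 2024-08-28
print(f"300-day deadline: {deadline_300}")  # 2024-12-26
```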
State and Local Agencies
Many states have their own civil rights agencies that handle employment discrimination complaints. These agencies may offer broader protections than federal law, but their filing deadlines can differ, so it is important to act quickly and understand the rules in your specific location.
Private Lawsuits
In some cases, you may be able to file a lawsuit against an employer directly. If successful, legal remedies in employment discrimination cases can include back pay, compensation for emotional distress, attorney fees, and in some cases punitive damages designed to discourage future misconduct.
Class Action Cases
When an AI system affects a large group of people in the same discriminatory way, it may be possible to bring a class action lawsuit. This allows many affected individuals to join together in one legal claim, which can be more efficient and more powerful than individual cases filed separately.
What You Should Do If You Suspect AI Bias
If you believe an automated hiring system may have discriminated against you, taking the right steps early can make a significant difference in your case.
- Document everything. Save all emails, rejection notices, and records of your application. Note the timeline of events, especially if a rejection came very quickly.
- Research the employer’s hiring tools. Some companies publicly disclose which AI platforms they use. This information can be helpful when building a case.
- Request information. In some jurisdictions, you have the right to ask employers about automated decision-making tools used in hiring. Check your local laws to see what disclosures companies are required to make.
- Consult an employment attorney. An experienced attorney can help you understand whether your situation may qualify as employment discrimination and what your legal options are.
- File a complaint promptly. Deadlines for discrimination claims are strict. Do not wait too long before seeking help.
The Bigger Picture: Holding Technology Accountable
AI hiring tools are not going away. If anything, they are becoming more common as companies look for faster and cheaper ways to manage large pools of applicants. But speed and efficiency should never come at the expense of fairness.
Employment discrimination is harmful whether it comes from a person or a piece of software. The law recognizes this, and legal accountability for algorithmic bias is continuing to grow. Employers have a responsibility to make sure the tools they use do not unfairly shut out qualified candidates based on race, gender, age, or disability.
If you were rejected for a job and believe an AI may have played a role in that decision, you deserve to know your rights. The legal framework to challenge unfair hiring practices exists — and it applies to the machines making these decisions just as much as it does to the people who program and deploy them.
Talk to an Employment Discrimination Attorney
Understanding your rights in the age of AI hiring can feel overwhelming, but you do not have to figure it out alone. An employment discrimination attorney can review your situation, explain your options, and help you take action if you have a valid claim. Algorithmic bias is a real problem with real legal consequences, and you may have a stronger case than you realize.