When it comes to giving AI control over recruiting and hiring, there’s a potential drawback that employers need to be aware of: racism, sexism and bigotry are all learned attributes, and machines can learn them too.
In tech, this phenomenon of a machine providing bad results due to bad data is known as “garbage in, garbage out.”
Teaching AI to Not Discriminate
The big problem with AI and machine learning, at the moment, is that systemic discrimination, along both race and gender lines, has been going on for so long and is so ingrained in society that there's almost no way for an AI trained on historical data to avoid picking it up. What we all now understand as implicit bias gets translated into actual bias by AI.
However, AI can still be helpful in recruiting and hiring, just not when it comes to choosing which criteria to value. Notably, it can be used simply to anonymize applicants, stripping out identifying details until it is actually interview time.
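To illustrate the anonymization idea, here is a minimal sketch in Python. The record structure and field names are hypothetical, chosen only to show the general approach: identifying fields are removed and replaced with an opaque ID before a reviewer (human or machine) sees the application.

```python
# A minimal sketch of applicant anonymization.
# The field names below are hypothetical, not from any real HR system.
IDENTIFYING_FIELDS = {"name", "email", "photo_url", "address"}

def anonymize(applicant: dict, applicant_id: int) -> dict:
    """Strip identifying fields so reviewers see only job-relevant data."""
    redacted = {k: v for k, v in applicant.items()
                if k not in IDENTIFYING_FIELDS}
    # Opaque ID lets the employer re-identify the applicant at interview time.
    redacted["applicant_id"] = applicant_id
    return redacted

applicant = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "years_experience": 7,
    "skills": ["Python", "SQL"],
}
print(anonymize(applicant, 101))
```

The point of the design is that the screening step never sees the fields most likely to trigger bias, while the opaque ID preserves a path back to the candidate once the process reaches the interview stage.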
Related Resources:
- Age Discrimination Claims for Facebook’s Targeted Ads (FindLaw’s Technologist)
- Why Asking Applicants Riddles Is a Bad Idea (FindLaw’s Technologist)
- You Can Train AI to Spot Legal Issues – for Fun (FindLaw’s Technologist)
You Don’t Have To Solve This on Your Own – Get a Lawyer’s Help