Just last week, the U.S. Equal Employment Opportunity Commission (EEOC) held a public hearing to better understand the use of automated systems, including artificial intelligence (AI), in the employment process. This hearing is just one part of a years-long conversation that has involved many people, from the federal government to some of the largest employers in the American workplace. Back in May of 2022, government officials warned that using AI technology to screen new job candidates or monitor employee productivity has the potential to unfairly discriminate against those with disabilities. Additionally, the U.S. Justice Department and the EEOC issued a joint guide advising employers to proceed with caution before deciding to implement popular algorithmic tools designed to streamline the evaluation of job applicants and employees.
AI has been a field of research since the 1950s, but its use in the human resources sector has drawn a great amount of attention in recent years. The highly competitive job market of the past decade has pushed many organizations to optimize processes, particularly when it comes to hiring new employees. It has become increasingly common for companies to use AI to automate aspects of the hiring process, some to more significant extents than others. In fact, according to the EEOC’s chair, Charlotte Burrows, 83% of employers, including 99% of Fortune 500 companies, now use some form of automated tool to screen applicants. What these tools look like varies from company to company. For instance, one of the most popular applications of AI is the applicant tracking system, or ATS, which is programmed by employers to search for and prioritize keywords in resumes that applicants upload to job portals. An ATS automatically compares resumes against the posted job description and ranks candidates based on how well it determines they fit the qualifications. Resume scanners within an ATS are often the first step that job applications go through, even before a human hiring manager looks at a single resume. It is possible for resumes to be rejected by the ATS due to perceived gaps or misalignment with job descriptions, all before a pair of human eyes has seen any documents.
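To make the keyword-matching idea concrete, here is a minimal sketch of how an ATS-style resume ranker might work. This is an illustration only, assuming a naive bag-of-words comparison; real commercial systems are proprietary and considerably more sophisticated, and all names and data below are hypothetical.

```python
# Illustrative sketch of keyword-based resume ranking, loosely
# modeling the ATS behavior described above. Not any vendor's
# actual algorithm.
import re


def tokenize(text):
    """Lowercase the text and split it into simple word tokens."""
    return re.findall(r"[a-z]+", text.lower())


def score_resume(resume_text, job_keywords):
    """Count how many distinct job-posting keywords appear in the resume."""
    words = set(tokenize(resume_text))
    return sum(1 for kw in job_keywords if kw in words)


def rank_candidates(resumes, job_description):
    """Rank candidates by keyword overlap with the job description."""
    keywords = set(tokenize(job_description))
    scored = [(name, score_resume(text, keywords))
              for name, text in resumes.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)


# Hypothetical job posting and resumes for demonstration.
job = "Seeking engineer with Python and SQL experience"
resumes = {
    "Candidate A": "Five years of Python and SQL development",
    "Candidate B": "Background in marketing and sales",
}
print(rank_candidates(resumes, job))
```

Even this toy version shows how a resume can be filtered out before any human review: a candidate whose wording does not echo the posting's keywords simply scores low, regardless of actual fitness for the role.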
Recent developments in AI have also led to the use of programs that evaluate a candidate’s facial expressions and speech patterns in a video interview. During a live video conferencing interview, for instance, a recruiter can receive constant feedback from the AI algorithm about the applicant’s behavior and mood. The recruiter can learn AI-generated information about how the candidate feels, including whether they are relaxed or nervous, or even whether they are telling the truth.
Companies’ reasons for using AI-powered HR tools usually center on a desire to reduce the workload created by high recruitment volumes and to isolate the best talent from a surplus of applicants. Many also say they seek to fulfill corporate diversity, equity, and inclusion (DEI) goals by eliminating the implicit biases a human recruiter may bring when viewing the race, ethnicity, gender, or age of a specific candidate.
However, such technology could do more harm than good by screening out people based on their social identities. Video interview tracking software, for example, can potentially screen out people with speech impediments or other physical or mental impairments. Chatbots that vet candidates with gaps in their resumes could be ignoring that many parents of young children, particularly mothers, may have had to take significant time off from work to raise their families. Algorithms that search for data from social media and professional digital profiles to look for “ideal candidates” could also be excluding older applicants who may have smaller digital footprints.
Job recruitment tools and companies that claim to use AI to eradicate these biases may not be improving diversity in hiring, and could actually be exacerbating and further perpetuating these already prevalent prejudices. It seems that AI-powered tools are more of a quick fix in employers’ hiring processes rather than a genuine attempt at solving a much deeper problem.