When a computer eliminates some candidates and advances others without explanation, it is harder to know whether it is making fair assessments.
A day after her interview for a part-time job at Target last year, Dana Anthony received an email informing her that she had not gotten the job.
Anthony didn’t know why – a situation common to most job seekers at one time or another. But she also had no sense of how the interview had gone, because her interviewer was a computer.
More and more job seekers, including some professionals, may soon have to accept impersonal online interviews in which they never speak to another human being, or learn whether behind-the-scenes artificial intelligence systems are influencing hiring decisions. Demand for online hiring services, which interview job applicants remotely via laptop or phone, surged during the COVID-19 pandemic and remains high amid a perceived worker shortage as the economy reopens.
These systems claim to save employers money, circumvent hidden biases that can influence human recruiters, and broaden the range of potential candidates. Many now use AI to assess candidates’ skills by analyzing what they say.
Anthony likes to look an interviewer in the eye, but all she could see was her own face reflected on the screen. “I interview better in person because I am able to develop a bond with the person,” she said.
But experts wonder whether machines can accurately and fairly judge a person’s character traits and emotional signals. Algorithms tasked with deciding who is the best fit for a job can reinforce prejudices if they take their cues from industries where racial and gender disparities are already prevalent.
And when a computer eliminates some candidates and advances others without explanation, it is harder to know whether it is making fair assessments. Anthony, for example, couldn’t help but wonder if her identity as a Black woman had affected the decision.
“If you apply for a job and are rejected because of a flawed algorithm, you certainly won’t know it,” said Aislinn Kelly-Lyth, a researcher at the University of Oxford. In contrast, during a face-to-face interview, a job seeker may pick up discriminatory clues from the interviewer, she said.
New rules proposed by the European Union would subject these AI recruitment systems to more stringent regulation. Advocates are pushing for similar measures in the United States.
One of the leading companies in the field, Utah-based HireVue, gained notoriety in recent years for using AI technology to assess cognitive abilities from a candidate’s facial expressions during the job interview. After strong criticism centered on the scientific validity of those claims and the potential for racial or gender bias, the company announced earlier this year that it would end the practice.
But its AI-based assessments, which rank the skills and personalities of candidates to flag the most promising for further consideration, still factor in candidates’ speech and word choices.
The private company has helped create a market for “on demand” video interviews. Its known customers include retailers like Target and Ikea, big tech companies like Amazon, banks like JP Morgan and Goldman Sachs, oil giants, restaurant chains, supermarkets, airlines, cruise lines and school districts. The Associated Press has contacted many prominent employers who use the technology; most refused to discuss it.
HireVue CEO Kevin Parker says the company has worked hard to ensure its technology won’t discriminate based on factors like race, gender or regional accents. Its systems, which translate speech to text and look for clues to teamwork, adaptability, reliability and other professional skills, can outperform human interviewers, he said.
“What we’re trying to replace is people’s instincts,” he said in – naturally – a video interview.
HireVue says it interviewed more than 5.6 million people worldwide in 2020. Supermarket chains have used it to screen thousands of applicants a day amid a pandemic-fueled hiring wave for cashiers, stockers and delivery crews, Parker said.
Larger hiring-focused software vendors like Modern Hire and Outmatch have started offering their own video interviews and AI assessment tools. On its website, Outmatch touts its ability to measure “the essential soft skills your candidates and employees need to be successful”.
HireVue notes that most customers don’t actually use the company’s AI-based assessments. The Atlanta school district, for example, has used HireVue since 2014 but says it relies on 50 human recruiters to grade recorded interviews. Target said the pandemic led it to replace face-to-face interviews with HireVue interviews, but the retail giant told the AP it relies on its own employees, not HireVue’s algorithms, to watch and rate the pre-recorded videos.
None of this was clear to Anthony when she sat down in front of a screen interviewing for a seasonal job last year. She dressed for the occasion and settled into a comfortable spot. The only hint of a human presence came in a prerecorded intro that laid out what to expect – noting, for example, that she could delete a response and start over.
But she had no way of knowing what kind of impression she was making. “We are unable to provide specific comments regarding your application,” the rejection email from Target said. She was turned down again after a HireVue interview for another job in December.
“I understand that businesses or organizations are trying to be more conscious of the time and finances they devote to recruiting,” said Anthony, who earned a master’s degree in strategic communications last year at the University of North Carolina at Chapel Hill. Still, the one-sided interviews left her uncomfortable about who, or what, was evaluating her.
This impenetrability poses one of the biggest concerns about the rapid growth of complex algorithms in recruiting and hiring, Kelly-Lyth said.
In one infamous example, Amazon developed a resume-screening tool to recruit top talent but abandoned it after discovering it favored men for technical roles, in part because it compared job applicants against the company’s own predominantly male technical workforce. A study published in April found that Facebook shows different job postings to women and men in a way that could violate anti-discrimination laws.
The governments of the United States and Europe are considering possible controls over these recruiting tools, including requirements for external audits to ensure they do not discriminate against women, minorities or people with disabilities. The proposed EU rules, unveiled in April, would force providers of AI systems that screen or assess job applicants to meet new requirements for accuracy, transparency and accountability.
HireVue began phasing out its facial analysis tool, which analyzed eye expressions and movements and faced derision from academics as “pseudoscience” reminiscent of the discredited and racist theory of 19th-century phrenology. The Electronic Privacy Information Center filed a complaint in 2019 with the Federal Trade Commission, citing a HireVue executive who said 10-30% of a candidate’s score was based on facial expressions.
“The value it was adding relative to the controversy it was creating was not that big,” Parker told the AP.
HireVue also released portions of a third-party audit that looked at issues of fairness and bias around its automated tools. A published summary recommended minor changes such as changing the weight given to particularly short answers disproportionately provided by minority applicants.
Critics praised the audit but said it was just the start.
“I don’t think science really supports the idea that speech patterns are a meaningful assessment of someone’s personality,” said Sarah Myers West of the AI Now Institute at New York University, which studies the social implications of AI. For example, she said, such systems have historically struggled to understand women’s voices.
Kian Betancourt, a 26-year-old pursuing a doctorate in organizational psychology at Hofstra University, was also turned down after a remote HireVue interview for a consulting position earlier this year. He acknowledged that he may have tried too hard to anticipate how the system would rate him, tailoring his diction to include keywords he thought could boost his score.
While Betancourt favors “structured interviews” involving a standard set of questions, he is troubled by the opacity of automated systems.
“Tell people exactly how we’re rated, even if it’s something as simple as ‘This is an AI interview’,” he said. This basic information can affect the way people present themselves, he said.