
The Impact of AI on Diversity Recruiting

October 16, 2019

“You’re going to have to deal with [diversity] whether you want to or not,” Jarik Conrad, senior director of human capital management (HCM) innovation for Ultimate Software, told a group of rapt attendees at one of a handful of sessions on diversity during this year’s HR Tech.

During HR Tech this year I met with some vendors, explored a few demos, and attended quite a few sessions on diversity in technology. Conrad and Trish McFarlane, CEO and principal analyst for H3 HR Advisors, presented "How HR Technology Can Help CHRO’s Lead the Most Impactful DE&I Initiatives." This session provided insight into issues such as diversity, equity, and inclusion (DE&I) statistics across North America, and how to understand the employee continuum of needs. 

Alas, the session came to an end before Conrad and McFarlane could expound on how HR technology helps lead meaningful DE&I initiatives. I was interested in hearing them talk about how technology, specifically diversity recruiting software, can help. The “how” is the hardest part of the equation to answer, and the G2 on diversity is only finding more questions.


Clearly, tech is not a panacea for increasing diversity across industries. However, it allows us to do things in a way that was not possible before. It can help HR professionals identify problems and their scale, but it can also present new challenges along the way. A theme emerged throughout the sessions on diversity at HR Tech: the need to understand what’s under the hood, or rather, what’s inside the black box.

AI’s black box problem

Inside and outside of the workplace, we increasingly trust artificial intelligence (AI) systems to make decisions on our behalf. From Amazon’s Alexa to personalized online shopping recommendations, our lives are impacted daily by these systems. Although AI is embedded throughout our everyday lives, these systems are anything but transparent. We cannot see inside the black box, or the algorithm, and therefore cannot understand how decisions are made. This usually becomes apparent only after bias is found within a solution.

What is the black box?

A black box is a system which can be understood in terms of its inputs and outputs; however, due to the proprietary nature of the data, there is little insight into the internal workings of the system. 
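One way to make that definition concrete is to treat the system as a function you can call but never read. The minimal Python sketch below assumes a hypothetical vendor scoring model (all names, features, and numbers are invented for illustration); probing inputs and comparing outputs is the only way an outsider can learn how it behaves:

```python
# Hypothetical sketch, not any real vendor's model: a scoring system we can
# only call, never inspect.

def vendor_score(resume: dict) -> float:
    """Stand-in for an opaque black-box model; in practice the internals
    below would be hidden behind an API."""
    score = 50.0
    score += 5 * resume.get("years_experience", 0)
    # A hidden proxy feature the buyer never sees:
    if "women's" in resume.get("activities", "").lower():
        score -= 10
    return score

base = {"years_experience": 4, "activities": "chess club"}
probe = {"years_experience": 4, "activities": "women's chess club"}

print(vendor_score(base))   # 70.0 — identical qualifications...
print(vendor_score(probe))  # 60.0 — ...but a different score
```

Only by running controlled probes like this pair can a buyer discover that the opaque system reacts to a proxy for gender.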

Earlier this month at TWIMLcon in San Francisco, VentureBeat senior AI staff writer Khari Johnson moderated a panel on operationalizing responsible AI. In a follow-up article about the panel, he expanded on the idea that a lack of insight into how AI systems operate results in bias and can harm real people in serious ways. 

Rather than focusing solely on the user, LinkedIn senior software engineer Guillaume Saint-Jacques insisted that we look beyond bias and consider the potential harm that unchecked AI systems cause. Rachel Thomas, director of the Center for Applied Data Ethics, provided examples of algorithms erroneously cutting off Medicaid benefits or blindly firing teachers based on flawed data.

TIP: It’s not all heed and warning when it comes to AI. Rebecca Reynoso breaks down the benefits and the risks of artificial intelligence and provides industry examples.

The potential for harm and bias in AI knows no bounds, and AI’s role in diversity recruiting has made headlines recently. Amazon spent years building an AI recruitment system that it ultimately scrapped after it showed bias against candidates who are women. The model ranked candidates based on 10 years of resumes, most of which came from men; as a result, the system consistently ranked men as more qualified. 
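The mechanics of that failure are easy to reproduce in miniature. The toy sketch below (the data and keywords are invented, not Amazon’s) shows how a model that simply learns from skewed historical outcomes will faithfully reproduce them, bias included:

```python
# Invented historical data: (resume keyword, hired?) pairs dominated by men.
history = (
    [("men's rugby", 1)] * 80 + [("men's rugby", 0)] * 20
    + [("women's rugby", 1)] * 3 + [("women's rugby", 0)] * 7
)

def learned_hire_rate(keyword: str) -> float:
    """What a naive frequency-based model 'learns' for resumes
    containing this keyword."""
    labels = [hired for feat, hired in history if feat == keyword]
    return sum(labels) / len(labels)

# The model mirrors the past rather than measuring qualification:
print(learned_hire_rate("men's rugby"))    # 0.8
print(learned_hire_rate("women's rugby"))  # 0.3
```

No one programmed the disparity; it is the historical hiring ratio, learned and replayed.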

Diversifying tech

It has been five years since Apple’s CEO Tim Cook wrote employees a letter promising the company would be “as innovative in advancing diversity as we are in developing products.” As WIRED recently reported, the tech industry has made very little progress in terms of diversity hiring initiatives. Today the workforces that make up Apple, Facebook, Google, and Microsoft are still overwhelmingly (white or Asian) men. According to Freada Kapor Klein, founding partner at venture capital firm Kapor Capital, there are barriers to achieving diversity goals lurking around every corner. 


Freada Kapor Klein on the lack of diversity in tech

Companies need to address the deep-rooted biases ingrained throughout their culture before they can improve their hiring practices. We are currently facing quite an irksome catch-22. Tech has an issue with diversity. Tech is unwittingly creating biased AI solutions that harm, disregard, or snub an array of humans. Tech needs a diverse range of individuals to build and institute responsible AI. Tech cannot seem to make real strides in expanding its talent pool. And yet we’re constantly innovating and expanding the capabilities of AI. Can AI help diversify the talent pool?

Reimagining HR 

According to consulting firm Mercer's Global Talent Trends 2019 report, 88% of companies already use AI in HR. Most of these companies employ AI solutions in the form of chatbots for candidates and employees, performance management, career pathing, and screening or assessing candidates. 

An increasing number of the AI-based screening and assessment solutions that companies use or are considering already include diversity features, and the market for this technology will only grow. Plenty of companies are already implementing these solutions. During another HR Tech panel, “HR in the Age of Artificial Intelligence,” Jennifer Carpenter, VP of global talent acquisition for Delta, said that implementing AI-driven assessment technology allowed Delta to double the number of candidates that made it to final-stage interviews.

During the Q&A portion of that session, an attendee asked the panel whether they “audit the algorithm” for errors or issues. Carpenter said that as Delta’s pipeline has grown, her team consistently tests and verifies that its algorithms are free of blatant bias. Meanwhile, Jean Smart, VP for global recruitment at Hilton Hotels, attested that her team continually tests their systems for errors. Hilton’s relationship with its vendor is one of mutual responsibility: if either party finds bias, error, or harm, they share the burden of fixing it.
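One concrete check an audit like the ones the panelists described could include is a disparate-impact test. The sketch below applies the EEOC’s “four-fifths rule” — a real guideline under which a selection-rate ratio below 0.8 warrants review — to invented applicant counts:

```python
def adverse_impact_ratio(selected_a: int, total_a: int,
                         selected_b: int, total_b: int) -> float:
    """Ratio of the lower group's selection rate to the higher group's.
    Under the EEOC four-fifths rule, values below 0.8 merit a closer look."""
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    return min(rate_a, rate_b) / max(rate_a, rate_b)

# Hypothetical screening outcomes for two applicant groups:
ratio = adverse_impact_ratio(selected_a=45, total_a=100,
                             selected_b=30, total_b=100)
print(round(ratio, 2))  # 0.67 — below 0.8, worth investigating
```

A check like this is cheap to run every time the pipeline changes, which is exactly the kind of continual verification both Delta and Hilton described.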

Next steps for AI and diversity

Efforts to improve hiring practices that cast a wide net require a combination of regularly enhancing the solutions available, implementing recruiting software to improve the process, and continually validating that the technology does no harm. With the great minds currently in tech, and a talented, diverse candidate pool eager to get to work, there is great potential for these invaluable tools to develop and improve.
