
The Fair Pay Act is a strict gender-equality law recently enacted in California. The law puts the burden of proof on a company to show that it has not shortchanged an employee’s salary based on gender. It’s a powerful tool to address a wrong that has already happened. But can discrimination be prevented in the first place? Even managers who don’t think they are biased may be—and just their word choices can send a signal. A new wave of artificial intelligence companies aims to spot nuanced biases in workplace language and behavior in order to root them out.

The San Francisco-based company Kanjoya is applying natural language processing (essentially, computer algorithms that can read for tone and context) to ferret out what people are really thinking when they fill out an employee survey. Like other companies applying AI to the workplace, Kanjoya focuses on much more than gender bias, but discrimination is one of the problems its tech can reveal. “When I interview [Moritz Sudhof, Kanjoya’s chief data scientist] and his sister, I might apply different rubrics to them and different expectations of what attributes would make them a good employee,” says Armen Berjikly, CEO of Kanjoya.

Kanjoya grew out of the Experience Project, a social network that matches people with similar life situations and problems so they can find a sympathetic ear. Users have written hundreds of millions of entries since the Experience Project launched in 2006 and tagged them with emotions (worried, stressed, confident, excited, confused, angry), which gave Kanjoya the labeled data it needed to start understanding the sentiments behind language.
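Those emotion tags are exactly what supervised machine learning needs: millions of examples pairing raw text with the feeling its author assigned to it. Here is a minimal sketch of that recipe in Python, with invented example posts; Kanjoya’s production models are proprietary and far more sophisticated than this.

```python
# A minimal sketch of training a sentiment classifier on emotion-tagged
# posts. The posts below are invented; Kanjoya's actual models are
# proprietary and trained on hundreds of millions of entries.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

posts = [
    "I can't keep up with these deadlines anymore",
    "My manager finally recognized my work today",
    "No one tells me what is expected of me here",
    "I nailed the presentation and the team loved it",
]
emotions = ["stressed", "excited", "confused", "confident"]

# Turn text into word/bigram features, then fit a linear classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(posts, emotions)

# Apply the trained model to new, unlabeled survey text.
print(model.predict(["I have no idea who owns this project"]))
```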

Kanjoya launched with Twitter as its first major client. (Microsoft, Salesforce, and Nvidia are among its other big-name clients.)

Sudhof gives this example of how the process can work: “You ask them, ‘Hey, what’s on your mind?’ If they mention work-life culture and kind of the immediate workplace environment, and if they mention them negatively, it’s hugely predictive of very low intent to stay.”

Kanjoya then aggregates the perceived sentiments from employee surveys and crosses them with hard information like demographics, allowing HR to slice the data by different criteria, including gender.
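In data terms, that “crossing” is a join followed by a group-by. A minimal sketch with pandas, using hypothetical column names and made-up sentiment scores:

```python
# A sketch of the aggregation step: join survey sentiment with HR
# demographics, then slice average sentiment per topic by gender.
# Column names and scores are hypothetical.
import pandas as pd

responses = pd.DataFrame({
    "employee_id": [1, 2, 3, 4, 5, 6],
    "topic": ["leadership", "work-life", "leadership",
              "learning", "work-life", "learning"],
    "sentiment": [-0.6, 0.3, 0.5, -0.4, -0.2, 0.7],  # -1 negative .. +1 positive
})
demographics = pd.DataFrame({
    "employee_id": [1, 2, 3, 4, 5, 6],
    "gender": ["F", "F", "M", "F", "M", "M"],
})

merged = responses.merge(demographics, on="employee_id")
# Rows are topics, columns are genders, cells are mean sentiment:
# the view an HR analyst would drill into.
print(merged.pivot_table(index="topic", columns="gender", values="sentiment"))
```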

If a lot of women mention topics such as leadership and learning in a negative light, that’s a sign the company is not giving women the same opportunities as men, says Sudhof. Another red flag: when topics like attitude and teamwork skills come up more in women’s employee evaluations, while leadership skills show up more in men’s evals.
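A toy version of that second red flag might count how often evaluation themes appear in men’s versus women’s reviews. The theme keywords and reviews below are invented; real systems use statistical topic models rather than keyword lists.

```python
# Count illustrative theme words in a handful of invented reviews,
# split by the gender of the person being reviewed.
from collections import Counter

reviews = [
    ("F", "Great attitude and strong teamwork on every project."),
    ("F", "A positive attitude; always puts the team first."),
    ("M", "Shows clear leadership and owns hard decisions."),
    ("M", "Strong leadership; drives the roadmap forward."),
]
themes = ["attitude", "teamwork", "leadership"]

counts = {"F": Counter(), "M": Counter()}
for gender, text in reviews:
    for theme in themes:
        if theme in text.lower():
            counts[gender][theme] += 1

# If "leadership" clusters in men's reviews while "attitude" and
# "teamwork" cluster in women's, that skew is worth investigating.
for gender, tally in counts.items():
    print(gender, dict(tally))
```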

CHANGING THE EQUATION

All that assumes women even make it into the company. They may feel too discouraged to apply for certain jobs, says Kieran Snyder, founder and CEO of Seattle-based Textio, which applies natural language processing to the hiring process. Snyder, who holds a PhD in linguistics and cognitive science, has published several articles recently on gender-biased language. Her August 2014 Fortune article “The Abrasiveness Trap” describes her finding that women’s performance reviews contain more negative feedback about their tone than men’s do, with words like bossy, abrasive, strident, and aggressive often popping up in the evaluations. (The results were similar whether a man or a woman wrote the review.)

Snyder later used natural language processing to analyze nearly 90,000 tweets from similarly qualified men and women in the technology world. In an article for Re/code, she reported that the men’s tweets about tech were five times as popular. Like Kanjoya, Textio counts Twitter as a client. Other clients include Microsoft (where Snyder worked previously), Barclays, and Broadridge Financial.

Just as a spelling and grammar checker underlines suspect words and phrases while you type, Textio’s web-based text composer highlights problematic snippets that hurt a job description, such as a hackneyed word like “synergy,” and proposes alternatives. (It also highlights positive text, such as “mobile-first” for software developers or “fun-loving” for anyone.) Textio builds its analysis by ingesting job descriptions and comparing their wording and structure to how successful they were in attracting qualified applicants.

Textio also flags parts of the text that lean toward one gender. A job offering a “world-class” experience appeals more to men, whereas “premium” is less associated with either gender, says Snyder. Men tend to prefer bulleted content, she says, while women prefer narrative text. If Snyder is right, inadvertent discrimination begins well before the question of a promotion or even a job offer: women may feel the job is not for them and never apply.
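Mechanically, the spell-checker-style flagging Textio performs can be imagined as a lookup over suspect phrases. In the bare-bones sketch below, the terms, categories, and suggestions are invented for illustration; Textio derives its guidance from outcome data, not a fixed lexicon.

```python
# Flag hackneyed or gender-leaning phrases in a job listing and
# suggest alternatives. Word lists here are illustrative only.
import re

FLAGS = {
    "synergy": ("hackneyed", "describe the actual collaboration instead"),
    "world-class": ("leans masculine", "consider 'premium'"),
    "ninja": ("leans masculine", "name the skill you actually need"),
}

def review_listing(text: str) -> list[tuple[str, str, str]]:
    """Return (term, category, suggestion) for each flagged phrase."""
    findings = []
    for term, (category, suggestion) in FLAGS.items():
        if re.search(r"\b" + re.escape(term) + r"\b", text, re.IGNORECASE):
            findings.append((term, category, suggestion))
    return findings

listing = "Join our world-class team and bring synergy to everything we ship."
for term, category, suggestion in review_listing(listing):
    print(f"'{term}' ({category}): {suggestion}")
```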

Overcoming background bias is also a central theme for Utah-based HireVue. The company began in 2004 as a web-based service for recording job interviews, giving applicants the ability to interview remotely and interviewers the flexibility to watch when it’s convenient. But HireVue soon looked for ways to automate the process of narrowing down the candidate pool, says Jeff Barson, head of the company’s research and development department, HireVue Labs. “In general, what [our customers] are looking for is: ‘Okay, I have these 100 videos. What are the first 10 that I want to watch?’” says Barson.

HireVue claims that it can find the best candidates based not just on what they say, but on how they say it. Using machine learning (AI techniques that discern patterns in huge data sets), HireVue analyzes the phrasing and even the physical gestures that applicants use in an interview. It then compares the interviews of people who were hired with how well they actually did on the job. Computers don’t hire people, but they help refine the selection.

“When someone new comes in, we can look at their video in an increasingly comprehensive way,” says Barson. That includes analyzing language features such as sentence structure, rate of speech, and use of active or passive voice. But HireVue goes further, noting, for example, temperature fluctuations across the face or pupil dilation, signals that reveal someone’s emotional response.
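Some of those transcript-level features can be approximated with crude heuristics, as in the sketch below. The function and its rules are invented stand-ins, and HireVue’s real pipeline adds video signals that plain text obviously cannot capture.

```python
# Crude approximations of transcript-level language features.
# The heuristics are simplistic stand-ins for illustration only.
import re

def language_features(transcript: str, duration_minutes: float) -> dict:
    words = transcript.split()
    sentences = [s for s in re.split(r"[.!?]+", transcript) if s.strip()]
    # Rough passive-voice heuristic: "was/were/been" + a word ending in -ed.
    passives = len(re.findall(r"\b(?:was|were|been)\s+\w+ed\b", transcript, re.I))
    return {
        "words_per_minute": len(words) / duration_minutes,
        "avg_sentence_length": len(words) / len(sentences),
        "passive_constructions": passives,
    }

answer = ("The project was delayed by a vendor. I rebuilt the schedule. "
          "We shipped two weeks early.")
print(language_features(answer, duration_minutes=0.1))
```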

That may sound creepy: I tried a quick sample interview and felt rather self-conscious. But the benefit, Barson claims, is that the process removes human bias and allows companies to consider people they might otherwise have overlooked.

Hilton Hotels, for example, started using HireVue as part of its program to hire 50,000 military veterans over the coming years. It’s hard for vets to get jobs, says Barson, because their experience doesn’t map onto the terminology of a civilian resume. Other big customers include Urban Outfitters, GE, and publisher Houghton Mifflin Harcourt.

“Every candidate has the same amount of time, is asked exactly the same questions in exactly the same way, and is treated (by the system) in exactly the same way,” Barson writes in an email.

HireVue’s tech can also evaluate the interviewers themselves, by tracking how well the people they hired performed on the job. “We’ve found that with some of our clients, they have evaluators who are making the right decision 80% of the time, and others who are right only 20% of the time,” writes Barson.
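The audit Barson describes reduces to simple per-evaluator arithmetic once past hiring calls are paired with outcomes. A sketch, with hypothetical field names, and with the caveat that in practice a company can only observe performance for candidates it actually hired:

```python
# Compare each evaluator's hire/no-hire calls against how candidates
# actually performed. Data and field names are hypothetical.
from collections import defaultdict

# (evaluator, recommended_hire, succeeded_on_job)
decisions = [
    ("evaluator_a", True, True), ("evaluator_a", True, True),
    ("evaluator_a", False, False), ("evaluator_b", True, False),
    ("evaluator_b", False, True), ("evaluator_b", True, False),
]

right = defaultdict(int)
total = defaultdict(int)
for evaluator, recommended, succeeded in decisions:
    total[evaluator] += 1
    right[evaluator] += int(recommended == succeeded)

for evaluator in total:
    print(f"{evaluator}: right {right[evaluator] / total[evaluator]:.0%} of the time")
```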

Kanjoya’s Berjikly sees the same problem among his company’s clients. “People that aren’t very well trained to do interviews are just doing interviews, and they’re dumping their biases and their thoughts into this content.”

Here’s where technology promises to be constructive, not just punitive: it’s not only about catching people who are guilty of bias, but about teaching them not to be biased. “So, if I have a problem I can uncover it immediately and address it,” writes Barson. Then, critically for the new California legislation, he adds: “If I don’t have a problem with bias [in] promotions or hiring, I can prove it.”

Image credit: Cloudwiser

Article via Fast Company