October 27, 2021


A startup called UnBiasIt says its software can spot racial bias within companies. Will this surveillance scare employees?

In August, Telhio began using software from a startup called UnBiasIt to monitor employee communications for keywords and phrases that may signal racial, gender, or age bias, Telhio spokeswoman Jessica Bing told CNN Business. For instance, she said, if an email from one employee to another alluded to a “diversity hire,” that’s the kind of thing the software, UnBiasIt’s Racial Bias Alert, would be expected to flag. Companies can customize the software to look for certain words or phrases, and update it over time with new ones.

If UnBiasIt scans an email and finds wording that may be objectionable, it will send an alert to a small group of Telhio employees working in human resources and diversity, equity, and inclusion, Bing said, with the wording in question highlighted in yellow.

The software “is not looked at as a ‘gotcha’ for employees,” Bing said, because in cases of unconscious bias in particular a person may not know they’ve said or written something that’s not okay. Rather, those tasked with looking over the alerts will decide whether or not to take action — such as offering an employee some bias-related training or other education.

UnBiasIt was created by a company that sits outside the tech industry: Black Progress Matters, a Phoenix, Arizona-based staffing agency that helps companies add more minorities to their executive ranks and also runs an incubator for minority-owned businesses. Formed in late 2020, UnBiasIt began rolling out its software to companies in March, according to Dean Haynesworth, Black Progress Matters’ CEO and cofounder. Haynesworth said the software is similar to compliance software typically used by financial companies (like that which Telhio already uses) to spot potential finance-related violations like insider trading — it was put together, Haynesworth said, with the help of a data-archiving company called Unified Global Archiving.


Haynesworth said UnBiasIt relies on keyword and phrase spotting, plus signals such as the locations of words and phrases in a message, to determine when to flag an email, text, or call recording. It doesn’t use artificial intelligence to determine when to send an alert, he said, because of concerns surrounding the possibility that bias could be contained in AI itself. Haynesworth echoed the point that the tool isn’t meant to be used in a punitive way, and he “highly suggests” companies inform workers when they use it.
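The keyword-and-phrase approach Haynesworth describes can be illustrated with a short sketch. This is not UnBiasIt’s actual implementation; the watchlist phrases, function name, and `>>…<<` markers (a plain-text stand-in for the yellow highlighting reviewers see) are all hypothetical:

```python
import re

# Hypothetical watchlist; per the article, companies customize the
# words and phrases and update them over time.
WATCHLIST = ["diversity hire"]

def flag_message(text, watchlist=WATCHLIST):
    """Return (flagged, highlighted): whether any watchlist phrase
    appears in the text, and the text with matches wrapped in
    >>...<< markers to mimic highlighting in a reviewer alert."""
    pattern = re.compile(
        "|".join(re.escape(phrase) for phrase in watchlist),
        re.IGNORECASE,
    )
    highlighted, num_matches = pattern.subn(
        lambda m: f">>{m.group(0)}<<", text
    )
    return num_matches > 0, highlighted

flagged, shown = flag_message("She was just a diversity hire.")
# flagged is True; the matched phrase in `shown` is wrapped in >>...<<
```

A real system would also weigh signals like where in a message a phrase appears, as Haynesworth notes, but the core mechanism is plain pattern matching rather than a learned model.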

Telhio’s rollout is a test case, of sorts, for UnBiasIt: While there is software that attempts to do things like fight toxic language online or cut down on Slack messages containing unconscious gender bias, UnBiasIt may be the first of its kind that attempts to spot and stop bias problems across companies’ communications.

It’s too new to know whether the software will be helpful, or whether it will lead to negative consequences (such as unsettling employees who feel wary of being watched). UnBiasIt is likely to have more data than ever to pore over, though: the startup’s push to expand monitoring of workplace communications comes at a time when many workers are conducting more of their professional conversations through online tools because of the ongoing pandemic.

Haynesworth is optimistic that UnBiasIt can help companies stop microaggressions, inherent (or overt) bias, and other such issues from persisting.

“If you don’t try you just don’t know,” Haynesworth said. “These scenarios go unchallenged.”

Bing said Telhio decided to use UnBiasIt in the wake of civil unrest sparked in 2020, with the goal of making sure the company fosters an equitable workplace for all. Because the software is still so new — as of early September, it had only been in use for a few weeks — Bing acknowledged Telhio doesn’t know whether it will be helpful or trigger false positives.


The software on its own may not be that useful. Experts contacted by CNN Business expressed concerns that it will only be able to pick up blatant examples of bias, given its reliance on text search, and that it may produce false positives (such as from women or minorities communicating with each other about experiences involving bias).

“It’s likely to miss ways in which bias can be occurring in the workplace and yet give the employer a false sense of security that they’re doing something and that they’re changing the culture, when they’re not,” said Pauline Kim, a law professor at Washington University in St. Louis who researches the use of data and algorithms to manage the workplace.

And regardless of its utility, some employees may also be disturbed by the surveillance aspects of UnBiasIt. While there’s no legal restriction on monitoring workplace communications, many companies have related norms and expectations in place — such as that an employer won’t sift through their employees’ emails without having a good reason to do so, Kim pointed out.

“The more we unravel those norms and move toward a world where employers have and exercise a right to scrutinize every communication and every aspect of workers’ presence on the job, it’s very troubling,” she said.

Joan C. Williams, a professor at the University of California, Hastings College of the Law and director of the Center for WorkLife Law, said the way workers react to an employer using the software will depend on how companies introduce it. A good way, from her perspective, is for an employer to acknowledge the pervasiveness of bias and explain that it wants employees to be able to recognize and interrupt it.

“That’s a very different way of engaging employees than saying, ‘We’re going to read your emails,’” she said.
