Why giving virtual assistants female names, personas and voices is coming under scrutiny
Finance is a male-dominated realm. But there is one corner where women are everywhere: the virtual world of digital assistants.
Bank of America customers can ask for Erica. In Paris, millennials with their savings at Hello bank! reach out to HelloiZ (pronounced “Eloise”).
HSBC Holdings has Amy standing by with advice.
In Sweden, SEB’s Aida is waiting to help. Meanwhile, on the trading floor of AllianceBernstein Holding, it is Abbie who assists the bond traders with their deals.
For an industry where men outnumber women at least three to one in the upper echelons, the propensity to give virtual assistants female names, personas and, in some cases, voices is coming under greater scrutiny for perpetuating sexist stereotypes.
Many executives interviewed by Bloomberg defended their use of female-gendered bots by citing research that shows both men and women prefer female voices to male ones.
But this doesn’t exempt companies from the responsibility to proactively combat prevailing norms, according to Heather Andrew, CEO at Neuro-Insight UK, a London-based firm that conducts research into how men and women respond to different voices.
“There is an opportunity for those that are forward-thinking to do something about the stereotypes,” Andrew said. “Brands and companies should be thinking about the tone of voice much more than they are.”
Neuro-Insight research shows that while people do tend to prefer the feminine voice, they also recall more information when it’s delivered by a male voice. According to Andrew, both findings correspond to oversimplified stereotypes of women as comforters and men as authorities.
Using bots with feminine personas is hardly isolated to banks. Just look at Siri, Cortana and Alexa, the digital assistants for Apple, Microsoft and Amazon.com, respectively. Google’s digital assistant doesn’t have a defined gender, although its default voice is feminine. These companies have cited the same research showing people prefer female voices; more recently they’ve introduced the ability for users to switch to a male voice. So far, most banks aren’t doing that.
There are exceptions, like Commonwealth Bank of Australia’s payments tablet “Albert”. But by and large, as banks rolled out bots in the past few years, they’ve opted for feminine ones.
That’s not necessarily surprising for an industry where gender imbalance has long been entrenched. Aside from the management bias toward men, there are two women for every man in support staff roles at banks, according to consulting firm Mercer.
Banking officials interviewed said they didn’t explicitly consider the gender politics of their bots and chose female personas for branding reasons or because, during market research, customers responded better to female names and voices.
The idea for the name and persona of Abbie, AllianceBernstein’s bond-trading digital assistant, came through an informal discussion over the internal group messaging app, according to Gavin Romm, a money manager for high-yield debt. There were two main requirements for the name: it had to be easily identifiable for natural-language-processing algorithms, the machine-learning software that can understand text; and it had to reflect AllianceBernstein’s brand.
“We all refer to AllianceBernstein as AB,” Romm said by e-mail. “This led our chat conversation down the path of names that start with the letters ‘Ab’—Abigail, Abernathy … Abby. We needed something less common, and Abbie was born.”
Bank of America also chose Erica because its team liked the play on America, according to Christian Kitchell, an executive in charge of AI Solutions. They also did focus-group testing of the persona and voice of the assistant.
“Everything we do tends to be data-driven,” Kitchell said. “We got very, very clear signals from our customers that the voice we have gone with was their favourite.”
Speaking at a tech conference in New York last week, his colleague Cathy Bessant, Bank of America’s chief operations and technology officer, said digital assistants probably tend to be women “because we’re super smart”. But algorithms are only as sensitive as the humans that program them, she added in an interview with Bloomberg Television.
A firm could, for instance, use AI to quickly hire a large number of candidates using past policies—but if it wanted to create a more diverse workforce, the algorithms would need to be retooled.
“Done well, they should leverage the best of what human behaviour and human judgment is. Done poorly, we repeat the sins or the biases — the risks of the past,” she said, adding the lender was studying the responsible use of AI with Harvard University.
Just because customers participating in focus groups might be conditioned to react a certain way to a certain voice doesn’t mean companies necessarily need to play into those prejudices, according to Andrew at Neuro-Insight.
Vive la différence seems to be the attitude at French bank BNP Paribas toward the gender politics of its digital assistants. It opted for two different personas: the male persona Telmi for its BNP banking clients who are older and more traditional; and the female persona called HelloiZ for Hello bank!, its digital-banking sibling that appeals more to younger clients.
“We are maybe naïve, but for us it was not a debate,” said Ariel Steinmann, head of digital marketing for BNP and Hello bank!, referring to gender politics. “What is important for us is the customer experience.”
This generational divide suggests stereotypes are changing. A closer look at the Neuro-Insight research revealed that it was primarily older men who took in far more information if it was coming from male voices. Younger people in the study, by contrast, seemed to have more equal recall between genders.
“I think we will eventually see an end to the default that a virtual assistant should have a female name,” said Jeremy Pounder, the director of marketing firm Mindshare, which did the research with Neuro-Insight. “We might see a trend toward gender-neutral names and personas.”