
AI-enabled Voice Assistants: No longer female by default

A 2019 UNESCO publication revealed how much gender bias and stereotyping were engineered into Artificial Intelligence-powered voice-assistant applications. Beyond highlighting the overall gender imbalance of the teams creating these new tools, it also showed evidence of alarming gender gaps in technology industries, even in countries that are close to achieving gender equality.

In this Q&A, Mark West, Project Officer at UNESCO and lead author of the publication, shares insights on where we stand regarding gender prejudice in AI since the report came out.

Two years after the publication came out, where do we stand in the fight against gender prejudice in AI assistants?

On the positive side, awareness is much higher than it was when we researched and wrote our report. When AI voice assistants were first released, we were so enthralled with the novelty of speaking to computers that we forgot to ask critical questions. This honeymoon period is over. We now know, often from personal experience, that these systems are imperfect. We are on the lookout for 'racism', 'classism', 'sexism', 'ageism' and other 'isms' that get cooked into AI. This recognition and alertness mean that problems are more likely to be flagged and corrected. On the negative side, transparency remains a problem. Even when people have a hunch that an AI system might be disadvantaging them for some unfair reason, it can be hard to prove. The recommendations we put forward about the importance of algorithmic audits remain relevant. Systems and AI engines need to be pulled out of black boxes so that people can study and test them. If a company claims to have an unbiased system, that's great: prove it, show us what is under the hood, and explain how it works and how it learns.

So, you are saying we need to be vigilant?

It's important to remember that rooting out bias in AI will be a perpetual undertaking. So long as humans are at the helm of AI (as we very much are today), the decisions, opinions and recommendations offered by technology will reflect our own assumptions and worldviews or, more precisely, the worldviews of the relatively small groups of people, overwhelmingly men, who actually build AI systems. In this sense, there is nothing 'artificial' about AI. It will take conscious effort and determination to keep rooting out prejudice. Getting more women into the field of AI is a key part of the solution. There are some encouraging signs that the field is slightly less male-dominated than it was when we wrote our report. However, there is still a long way to go, and backsliding remains a risk. For example, in a number of countries a lower percentage of women are earning bachelor-level degrees in computer science today than in the late 1990s.

Are voice-assistant technologies today more gender-neutral than they were a few years ago?

Well, many of the most clichéd gendered responses have been dialed back. Siri no longer says "I'd blush if I could" when users lob gender insults at 'her', so this is progress, even if the starting point was really low. Mainstream voice assistants are much more likely to shut down abusive speech than they were previously. After our report came out, a lot of technology teams looked over the scripts they had written for voice assistants, said "yeah, a lot of this plays to problematic gender stereotypes", and made changes. We're glad they did; this was an aim of our work.

To what extent have the report鈥檚 recommendations been implemented?

Encouragingly, there has been traction on several of our recommendations. For example, in the report we called on technology companies to "end the practice of making digital assistants female by default." Recently, Apple announced it would do exactly this. Going forward, iOS users will select what voice they want Siri to use. Apple has, in effect, abandoned its former practice of assuming that people prefer a female voice to set the kitchen timer, make calendar appointments and read email messages. It is estimated that iOS runs on over 1 billion devices globally, so it's a change that will be felt by a lot of people. We've seen progress in other areas as well. Governments are thinking more deeply about the voices, accents and gender projections of everything from chatbots that help with tax returns to systems that help people navigate public transportation. In the past, most of the voices giving commands (things like "Exit the bus") were male, while most of the voices offering assistance ("What can I do for you?") were female. Since our report came out, I've noticed much more of a mix: male and female voices being used both for commands and for offers of help.

What still needs to change?

As communicated in our report, we continue to believe that projecting a human voice, gender and personality onto non-human technology presents challenges. A way out of this conundrum is to project voice assistants and other AI applications as non-human: a sort of 'let's keep AI artificial' ethic. In our research we encountered many examples where AI helpers assumed a voice that is clear, distinct and pleasant, but still immediately recognizable as non-human and not overtly male or female. We saw examples of AI assistants that were projected as cartoons (talking animals, for example). I don't mean to say there will never be a place or reason for projecting technology as a human person; there almost certainly will be. But if companies want to avoid tricky questions about gender, there is no 'rule' that AI assistants have to be 'cast' as young women or young men. Makers of AI assistants would do well to lean into the non-human identity of their creations, rather than trying to give them a human veneer.

What about the voice assistant Alexa, which still has only a female voice?

That's correct. For certain subtasks it is possible to change Alexa's voice, but there is still no male option for general-purpose Alexa functions. Recently, Amazon released "…", which gave rare insight into the level of thought that goes into these things; nothing about it is accidental. The guidelines contain a section titled 'Alexa and Gender', which seemed like a direct response to our report. It's full of contradictions. For example, the guidelines say Alexa 'does not have a gender' and should not be labelled with words like 'she' and 'her'. But later in the same document, Amazon refers to Alexa as 'she' and 'her' and says the technology has a 'female persona'. The guidelines reveal the problems of trying to give a machine a gendered human identity. If an AI technology speaks like a woman and has a 'female persona', people will understandably make associations between this technology and actual women. The first thing many voice assistants say when you call on them is "How can I help you?" Why should this question and subservient obedience always have a female voice? Amazon should let users choose whether they want a voice assistant with a male voice, a female voice, or perhaps even a non-gendered voice like C-3PO's from Star Wars.

What鈥檚 UNESCO doing?

Gender bias in AI and in other aspects of technology is, as we say in the report, about 'bias in, bias out'. The teams working at the frontiers of technology are heavily male. If technology is to help communities and countries become more gender-equal, we need women as well as men to steer the development of this technology that is, day by day, changing our world in profound ways. For this, we need more girls and women to take up studies in computer science and other technology fields and join the teams shaping the technologies that will enter our homes, schools and workplaces in the future. UNESCO continues to do important work to ensure gender equality at every level of education, particularly in fields like technology and engineering, where women remain severely underrepresented professionally. In our report, we included many recommendations to ensure that early interest expressed by girls in technology is sustained and nurtured. We are working with countries to implement the many actions we suggested in the report, placing special emphasis on gender-equal education, one of the straightest routes to lasting change.

UNESCO is currently developing a Recommendation on the Ethics of Artificial Intelligence (AI). This will be submitted to Member States for adoption by the General Conference of UNESCO at its 41st session in November 2021. If adopted, the Recommendation will be the first global normative instrument in this critically important field.