A year ago I took the plunge and bought a smart speaker. For anyone who knows me, this would seem an odd move, because I am so hyper privacy-aware that I read GDPR legal documents in my free time. Maybe it was the craziness of the pandemic – being an extrovert and realising I couldn’t see anyone for the foreseeable future – that tipped me over the edge; or maybe it was the notion of bringing a woman into the house (yes, because the default setting for virtual assistants is still predominantly a woman’s voice) who would listen to my many questions, rather than my husband’s fantastic ability to drown me out.

Admittedly, owning a smart speaker has been a good addition to my life over this past year. I listen to more varied music; it quenches my multilingual news thirst; it settles quibbles quickly when both my husband and I are adamant about a fact, such as who was the last defender to win the Ballon d’Or, and when. (If you are wondering, it was Fabio Cannavaro in 2006, and yes, I was right.)

However, technology can have teething problems. I get it – I work in the industry. My issue is that 12 months on, the teething problems have persisted for me but not for my husband: the speaker recognises his request to, say, set a radio alarm, and carries out his commands without fail, while I quite frequently get replies like ‘I can’t help you with this right now’, ‘sorry, something went wrong’, or ‘please try again later’.

So, like any business consultant worth her salt, I started to do some research, and what I found was an uncomfortable discovery: smart speakers are fundamentally sexist. Wait, not fundamentally sexist – institutionally sexist. They are not sexist by design; smart speakers were never set out to be better at responding to male voices than female ones, but the data with which these devices are trained contain more male voices than female ones.
It’s an historic issue that is manifesting itself in my home: ‘bias in, bias out’. A YouGov survey in 2019 found that 67% of female owners of smart speakers encounter problems being understood, with experts claiming that artificial intelligence has a harder time understanding a woman’s higher pitch compared to a man’s deeper tones. White male voices are understood better by speech recognition. And this is not the only example of tech preferring white maleness: facial recognition has been shown to have difficulty recognising darker skin and female features. If artificial intelligence categorises a human as a white male with a deeper voice, what category are we in?

This discovery was disheartening, and that was before I ended up down a rabbit hole about the behavioural and societal impacts of smart speakers’ female voices and servile nature, which further embed gender biases into our lives. 55% of smart speaker users admit to talking rudely to their device; even worse, 7% admit to ‘always’ talking disrespectfully to it.

Unless we actively do something to change how these devices are programmed, we are perpetuating institutional sexism. In my case, the impact of this sexism is frustrating but has minimal real-world consequences; in other cases, it could be life-changing, as voice recognition software is now used to support decision-making across many aspects of our society – like the case of the Irish vet, a native English speaker, who failed the oral English language test for permanent residency in Australia because her voice samples did not hit the minimum pass rate.

So, what can be done?

- Consciously train devices using a range of data which reflects our diverse society. Women make up 50% of the world, so why does training data contain a majority of male voices?
- Create more opportunities for women in development teams. Tech is still a predominately male industry; more women in development teams could bring more inclusive design by nature and help to identify and prevent biases.
- To support getting more women into these jobs, we can create gender-equal job and course descriptions. Gendered language in job and course postings that focuses on stereotypically male traits (e.g. ‘assertive’, ‘competitive’, ‘leader’, ‘confident’) rather than softer skills (e.g. ‘teamwork’, ‘collaboration’, ‘responsible’, ‘support’, ‘understanding’) has been proven to deter women from applying.
- Become an ally by getting educated about the issues. This means being much more active than simply taking part in once-yearly unconscious bias training. Hearing from experts, from technical women themselves, can help us identify our own biases and bring positive change.

We need to talk about inherent, institutional sexism in smart speakers to ensure our world is one where everyone’s voice can be heard. Is this something you have experienced? I’d love to hear your experiences and suggestions.