Digital Assistants: The Dark Side

In recent years, digital “robots” have subtly been ingraining themselves into our realities. They may not be able to do our laundry or cooking (yet), or look like an upgraded version of C-3PO from Star Wars, but nonetheless, their distinct voices have secured a dominant position on our smartphones and in our homes, and by the looks of it, they’re here to stay.

Recently, after feeling like Amazon Echo’s Alexa had been dominating the ad time on my TV screen, it suddenly dawned on me how objectifying and sexist the device allows people to be. It essentially encourages people to bark demands at a responsive technical object, one that just so happens to be set up with a female voice by default.

I wanted to dive into that screen and say to Alexa, ‘Why are you letting them boss you around without any manners? Actually, boss you around full stop! Stand up for yourself, Alexa!’ But I didn’t want to jump aboard the shout-at-Alexa ship. Instead, I decided to write about just this: challenging how we communicate with technology and looking into how this is affecting our culture. Is this new form of communication creating an abusive, one-sided relationship that could be seen as rude and degrading? If so, what kind of message is this giving our children about how to talk to and treat others, mainly women?

In a time of female-powered revolution, with the global #MeToo campaign and countless protests demanding equal rights for women around the world, I can’t help but think that just as we are finally beginning to see a shift in societal views on gender roles, digital assistants are reinforcing gender stereotypes more than ever. Are female-only bots like Alexa undoing all of our hard work and making society revert to sexism and ignorance?

It’s time to explore the dark side of digital assistants.


The Gender Stereotype

Our first official interaction with a digital assistant came in 2011 with Apple’s Siri. A persistent software upgrade made it clear to Apple users that we didn’t have a choice in the matter. This is happening; the robots we’ve all been waiting for are here. Eeek. In 2012 Google jumped on the bandwagon with Google Now, followed by Microsoft’s Cortana in 2013 and Amazon Echo’s Alexa in 2014.

I would like you to spend a few seconds thinking about the attributes you would associate with an assistant. Perhaps the adjectives that spring to mind are helpful, trustworthy, organised and supportive. Anything else? How about a gender? Yes, women are still associated with the assistant job role, and quite frankly, isn’t it time this changed?

The Guardian spoke to Jessica Williams, founder of Sidekicks, an agency which recruits PAs, secretaries and receptionists without disclosing photos, dates of birth, gender or ethnicity in their CVs. Jessica stated, ‘I believe that the secretarial industry harbours one of the last bastions of “acceptable” sexism. A lot of agencies like to pretend this doesn’t exist anymore; it does’. She adds, ‘there is definitely a problem when an employer expects their new hire to look a certain way or assumes that everyone working in support is female’.

Jessica concludes that the first step we can take towards changing the attitudes and assumptions surrounding administrative workers is to question the underlying sexism that still exists in PA recruitment by simply asking: why?


I was curious as to how the big players in the tech field justify their decision to develop digital assistants that are primarily advertised for their female voices. Microsoft claimed in an article that ‘Cortana can technically be genderless; the company did immerse itself in gender research, however, for our objectives of building a helpful, supportive, trustworthy assistant, a female voice was the stronger choice’.

I don’t know about you, but this answer wasn’t good enough in my eyes. So, I took the advice of Jessica Williams and asked: but why?

According to Quartz Media, people are drawn to female voices because they carry a warmer, more supportive tone. Stanford professor Clifford Nass once told CNN, “It’s much easier to find a female voice that everyone likes than a male voice that everyone likes…It’s a well-established phenomenon that the human brain is developed to like female voices.”

I couldn’t help but think that capitalism has fallen into a trap, with companies and brands blinded into thinking that consumers respond better to female voices.

Hasn’t this evolved from centuries of oppressing women, leading to sexist behaviour becoming so deeply rooted in our culture that we don’t challenge statements like these? Jessi Hempel backs this up in Wired, explaining how we want our technology to help us, but we also want to be the boss of it. Sadly, this means we are more likely to opt for a female voice.


The Abusive Relationship

Research shows that harassment is becoming a regular issue for bot-makers, with many first-time users aiming degrading, inappropriate and sexually explicit comments at digital assistants as a “joke”. What doesn’t help, however, is when the digital assistant (usually female) responds in a submissive, unfazed way.

A study was carried out to see how bots such as Alexa, Cortana, Siri and Google Home respond to sexually inappropriate comments and demands. It found that Alexa loves to be told she’s sexy and pretty, which reinforces the stereotype that women appreciate sexual commentary from strangers. Cortana and Google Home treat the sexual comments they understand as jokes, which trivialises the harassment. None of the bots modelled healthy communication about consent: they either side-stepped abusive comments completely or remained silent, and neither response exactly voices a no. This only reinforces the passive, subservient nature female assistants are expected to possess in this role.

These findings illuminate the serious social issues surrounding harassment and the inappropriate, disrespectful behaviour aimed at women, whether from street harassers, the grabbing and prodding in clubs, or the work “banter” directed at female colleagues. It’s clear that the digital assistants created by our top tech players aren’t doing women any favours.


But what now?

Although Google and Apple have restored part of our faith in gender-neutral technology with Google’s new genderless voice “Voice II” and the multi-language and accent capabilities of Apple’s Siri, there is more to be done to shift the preconceptions associated with gender roles. Digital assistants have shone a light on deeply rooted behaviours that have become “normal” in our culture.

Within our homes, we have fallen prey to the power of technology. Our need for instant interaction and satisfaction means we have developed an abusive, one-sided relationship with technology. It may be a robot, but barking demands at anyone, or anything, is degrading and disrespectful.

In the workplace, women are still being associated with PA and secretarial roles. Some companies still request women for roles that are not gender-exclusive, a display of dominance stemming from centuries of female oppression. And it doesn’t help that female-voiced bots have been created with a passive, subservient nature that trivialises harassment and ignores the degrading commentary aimed at them.

If this current revolution towards equality can teach us anything, it’s that it isn’t too late for change. We must open our eyes to gender stereotypes that have existed for centuries and not be afraid to challenge these; teach our younger generations to know that their gender does not define them; and be aware of the relationship we have with technology, and the dominant role it plays in our lives.