Amazon must listen to concerns


Amazon Echo is now used in thousands of Irish kitchens and bedrooms

Amazon is listening to our conversations.

And it’s not just the AI in its Echo smart speakers. It emerged last week that some of its staff are tasked with listening to what we say to Alexa, Amazon’s voice system. This includes misfires, where Alexa accidentally listens to us without a trigger word being used.

Echos are now used in tens of thousands of Irish kitchens and bedrooms.

Amazon has been very secretive about this listening. But thanks to a Bloomberg investigation, the company admitted that its staff listen to some voice recordings to help improve its artificial intelligence. The rationale is that a human worker can, for example, correct the system when it gets a word wrong due to a strong regional accent.

But that’s not all the workers hear. Some of them told Bloomberg that they hear disturbing things, like assaults. But they’re told not to report it or do anything about it because the audio files are supposed to be private.

This raises a whole host of issues.

Is it okay for Amazon staff to listen to our voice commands, even if it's just for quality control? And do we believe that it really is just for quality control?

Finally, a broader, thornier associated issue: what responsibility, if any, does a platform have if one of its staff detects that something like a sexual assault has occurred?

On the general question of Amazon workers physically listening to some voice requests, this shouldn’t really come as a surprise. It’s not really possible to significantly improve a voice-recognition system without a human, at some point, checking its progress. That means comparing what Alexa thinks it hears with what it actually hears. And that means listening to the original audio file.

This shouldn’t be controversial. But because Amazon has been so evasive about admitting this in plain English, it is. Just like Facebook’s problems with combating widely held suspicions that it secretly records us through phones (it doesn’t, as Mark Zuckerberg told me when I asked him directly in Dublin two weeks ago), Amazon is now opening up the stage for all sorts of half-truths and conspiracy theories.

“We use your requests to Alexa to train our speech recognition and natural language understanding systems,” says Amazon’s small print, without specifying that it’s a human listening in, or anything about the process that this entails.


Seeing as it’s so secretive about this, other obvious questions now arise: is Amazon using these audio files for any other purpose, such as to add extra detail to the customer profile it already has on you (from your use of Amazon and the web, among other sources)?

We know from patents the company has filed that it is considering a future where it can detect a hundred (or a thousand, or a million) different words instead of just one ('Alexa') as an activation trigger.

Does the explanation that Amazon provides cover use of your Echo request for researching and developing such technology further? Legally, it probably does. (A data protection regulator might form a different view.)

The third question is a much more basic one, and not to be laid solely at the door of Amazon.

According to the original Bloomberg report on this issue, a number of Amazon staff tasked with listening to the recordings said they picked up what they thought was a sexual assault. Having raised the issue with the company, they say they were told that it wasn’t Amazon’s place to interfere.

Instead, staff were told that they were allowed to share their experience in the internal chat room as a way of relieving stress.

Here’s the thing: when does a platform have a responsibility to take action on something illegal or wrong if a channel is private? Where is the line between privacy and safety?

It’s a little unfair to expect Amazon to have a clear answer for this, just as it’s unrealistic to place the same legal or moral burden on Apple or Facebook or Google.

In Ireland, like everywhere else, this question tends to split public opinion down the middle.

We’re appalled that tech platforms can host violent or criminal material. But we’re also outraged if the tech company monitors our communications too closely, which is one direct way of stopping, say, child abuse imagery.

Encryption entrenches the paradox. On a service like WhatsApp, encryption is our guarantee of privacy and freedom from big brother (and advertisers). But it also means that Facebook can do little to stop thousands of Irish people forwarding obscene or illegal material to one another.

A recent example of this occurred earlier this year in the aftermath of a fatal car crash on Dublin’s M50 motorway.

Within hours, graphic images from the crash scene were being WhatsApped and texted around Ireland. The circulation of the images was such that the gardaí and the family of the crash victim had to issue public appeals to stop forwarding them.

Just like Amazon staff hearing evidence of a sexual assault being committed, the logical question presents itself: what is the role of a network (like WhatsApp) to step in and halt the forwarding of such images? And whatever about offensive or obscene imagery, what about illegal imagery, such as child abuse?

Rightly or wrongly, we prioritise privacy over safety. Vodafone and Eir are not asked to use voice-recognition technology to protect against a terrorist or paedophile using their phone networks. Neither is Virgin held responsible for illegal uploads on its broadband (even though music and movie networks still think telecom firms are somehow responsible for content because they ‘allow access’ to the internet).

Amazon is probably treated the same way. But a little more transparency from it over its plans for our voice files would be welcome.

Sunday Indo Business

