Apple Apologizes Over Listening to Siri Recordings, Makes It Opt In
A big change in policy
Apple today (Aug. 28) announced that it would be overhauling the way it has humans check Siri voice transcriptions for errors, a process that the company calls "grading."
Beginning this autumn, Apple said in an official blog post, it will no longer retain recordings of users' Siri interactions. Users will have to opt in to the grading program; by default, Siri recordings will never be heard by humans. And finally, the entire grading program, which had relied on outside contractors to listen to the recordings, will be brought in-house.
The Guardian reported today that 300 such contractors lost their jobs last week in Cork, Ireland, and an unknown number were let go in other locations in Europe. The Guardian itself broke the story of humans listening to Siri recordings a month ago, leading to the suspension of the program.
"We've decided to make some changes to Siri," Apple's blog post says. "We realize we haven't been fully living up to our high ideals, and for that we apologize."
"We plan to resume [the Siri grading program] later this fall when software updates are released to our users -- but only after making the following changes," the post continues. "We will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve."
"Users will be able to opt in to help Siri improve by learning from the audio samples of their requests," it adds, which is quite a big deal. Almost all tech companies enroll users in data-sharing programs by default, and only let them opt out.
"When customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions," the post says. "Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri."
Siri, do I still have a job?
Unfortunately, that decision to bring Siri reviews in-house means that hundreds of people across Europe no longer have jobs.
"We've all been laid off after the scandal, with no protection against this," one Cork-based contractor, who asked to remain anonymous because of a non-disclosure agreement, told the Guardian. "They do what they want, and when they're done with your project or they screw up (like what just happened), they tell your vendor company to let you go."
Still, "I'm relieved this information came out," the former contractor said. "Discussions around ethics in this job was a constant between workers, but we don't know how to bring it up."
The necessary human factor
To be fair, there is no way that a voice assistant, whether run by Amazon, Apple, Google or Microsoft, can completely and accurately recognize and respond to human speech without at least some human oversight. Machines simply aren't as good at understanding all the nuances, slang terms and varied accents involved in human speech.
Just yesterday (Aug. 27), the BBC said it would develop its own voice assistant to recognize British regional accents, even though we've not seen reports that Alexa or Siri have had more difficulty understanding a Scouser or a Geordie than a California surfer dude.
But the big tech companies, Apple included, deliberately downplayed the role humans would play in fine-tuning their machines' speech recognition, rightly assuming that some people might be creeped out. By doing so, they set themselves up for embarrassment and opportunistic lawsuits.
And the media has certainly played a role in generating outrage, highlighting allegations that human contractors correcting the voice transcriptions have heard recordings of people buying drugs or having sex -- in other words, what millions of humans around the world do every day.