Always Listening? Apple Contractors Hear Your Private Siri Conversations
Thought it was just Alexa and Google Assistant? Nope, Siri listens, too
Earlier this year, we learned that human reviewers listen to Alexa recordings, and later that the same is true of Google Assistant. Now it turns out Apple's voice assistant, Siri, also has humans listening to your queries. An Apple contractor described Siri's quality-control practices to The Guardian last week, raising concerns about Apple's privacy policies.
The unnamed contractor confirmed that Apple hired the contractor's employer to review a small portion of saved Siri recordings and grade Siri's responses on several factors, including whether the voice assistant offered a satisfactory answer and whether the activation was accidental.
After listening to several short, private recordings picked up by what were likely unintended triggers, the anonymous contractor came forward with concerns about lack of disclosure.
“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on,” the whistleblower told The Guardian. “These recordings are accompanied by user data showing location, contact details and app data.”
In its consumer-facing privacy documentation, Apple doesn't explicitly state that Siri recordings are sometimes reviewed by humans as a quality-control measure. The company only notes that the data “is used to help them recognize your pronunciation and provide better responses.”
“A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user's Apple ID,” Apple told The Guardian. “Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
The tech giant also noted that less than 1% of daily Siri interactions are reviewed by humans.
Unlike Amazon's Alexa, which lives mostly in stationary smart speakers, Siri operates on devices that users carry with them all day. The Guardian's anonymous source highlighted the Apple Watch (which makes up 35% of the smartwatch market) as a common source of the quality-control recordings.
“The regularity of accidental triggers on the watch is incredibly high,” they said to The Guardian. “The watch can record some snippets that will be 30 seconds, not that long, but you can gather a good idea of what’s going on.”
Apple prides itself on its reputation for privacy, and even uses privacy as a selling point to distinguish itself from Amazon, Facebook and Google. But the company offers no way for you to opt out of Siri's quality-control grading, aside from disabling Siri entirely.
If you do use Siri, there's not much you can do to keep your recordings from being saved. This revelation serves as yet another reminder that when you use voice and home assistants, you're sacrificing some of your privacy.
Kate Kozuch is the managing editor of social and video at Tom’s Guide. She writes about smartwatches, TVs, audio devices, and some cooking appliances, too. Kate appears on Fox News to talk tech trends and runs the Tom's Guide TikTok account, which you should be following if you don't already. When she’s not filming tech videos, you can find her taking up a new sport, mastering the NYT Crossword or channeling her inner celebrity chef.