Apple has hit pause on contractors listening in on and making notes about recordings of people using its Siri digital assistant, after the secretive practice was exposed. Apple sent anonymised audio snippets to the contractors to maintain a degree of privacy, but the idea that someone is listening to Siri queries can be unsettling for many.
A series of media reports into what companies such as Amazon, Google and Apple are doing with recordings of users interacting with their respective Alexa, Google Home and Siri services has put the tech companies in a privacy spotlight. In a statement to TechCrunch, Apple said that while it conducts a "thorough" review, it is suspending the program globally. The company also said it will offer users the ability to participate in grading as part of a future software update.
Although Apple says it has no interest in this kind of information, the informant is concerned about the people hearing the recordings and the threat the whole practice poses to users' privacy.
It's also worth noting that Apple has only said that it's temporarily suspending the program, so it's entirely possible that it will start back up again, although we're guessing that will only happen after Apple has the necessary software updates out to make user participation optional, rather than mandatory.
According to the informant, an accidental recording may contain medical information shared between a patient and their doctor, evidence of criminal activity, or sexual encounters. Does that make Siri unsafe to use? Siri and similar services can activate in error after picking up sounds they mishear as their "wake" words. The recordings are analyzed in secure facilities by reviewers who are obliged to adhere to Apple's confidentiality requirements.
Apple CEO Tim Cook has said that user privacy is a "fundamental right". The suspension comes amid growing privacy concerns over the practice, which is used to help the company - and others like it - improve the quality of their AI assistants. The latter point is essentially the crux of the problem, with one anonymous contractor detailing how confidential conversations can be heard when Siri is inadvertently called into action. In Germany, the Hamburg Data Protection Authority ordered Google to stop human review of Google Assistant recordings under GDPR following a data leak last month.
Google, meanwhile, has also suspended review of voice recordings from the Google Assistant in the European Union after a leak was exposed in mid-July.
After facing flak, tech giants Google and Apple have now stopped listening in on users' conversations.