These fears proved well-founded when The Guardian reported that contractors were, in fact, listening in on a small percentage of Siri recordings.
This led to a massive internal review of Siri by the company. Apple issued a statement yesterday clarifying privacy issues around Siri and setting clear boundaries moving forward.
“First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.”
Clearly, Apple hopes that making audio review opt-in only, and ensuring that only company employees conduct it, will make customers more comfortable. Earlier in the statement, Apple reassures customers that audio review will continue to serve internal purposes only, as a means of improving Siri as a product, not as a marketing or data-mining operation.
During this internal review, the Siri “audio grading process” was suspended. The suspension will continue until the opt-in option is rolled out as part of a software update.
Amazon and Google are working on similar opt-out options for their respective voice assistant software. Tech industry insiders believe these moves make Siri the most privacy-focused of the major voice assistants.
First Published: Aug 29, 2019, 11:53 am