Apple is changing Siri to bolster privacy protections and, it hopes, reassure customers. Audio-based intelligent assistants like Siri have long drawn suspicion from privacy advocates, and the question with these services is always the same: “Who else is listening?”
These fears proved well-founded when The Guardian reported that contractors were, in fact, listening in on a small percentage of Siri recordings.
The report prompted a sweeping internal review of Siri. Yesterday, Apple issued a statement clarifying its privacy practices around Siri and setting clear boundaries going forward.
Following the review, Apple announced the following changes:
“First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.”
During the internal review, Siri’s “audio grading process” was suspended. The suspension will remain in place until the opt-in option rolls out in a future software update.
Amazon and Google are working on similar opt-out options for their respective voice assistants. Tech industry insiders believe these moves make Siri the most privacy-focused of the major voice assistants.