From: Siri records fights, doctor’s appointments, and sex (and contractors hear it) | Ars Technica
These cases bring up a series of questions. What can Apple and its colleagues do to better protect user privacy as they develop their voice systems? Should users be notified when their recordings are reviewed? What can be done to reduce or eliminate the accidental activations? How should the companies handle the accidental information that their contractors overhear? Who is responsible when dangerous or illegal activity is recorded and discovered, all by accident?
Now it looks like your Siri voice recordings can be heard by contractors roughly 1% of the time.
My issue with all of this is that it’s not opt-in, other than the “by using this software you agree to …” BS all tech companies shove down our throats. One solution would be to let users opt in to having humans review their recordings, as long as those recordings are properly anonymized. There’s still a chance an accidental wake word could trigger some of the scenarios mentioned in the article, but at least it would give folks the ability to decide how much they want to contribute to making these voice assistants better.
I turned off “raise to talk to Siri” on my watch long ago, but we do have Google Home devices in our house and “Hey Siri” is still activated on my phone. I could shut off the wake word functionality on my phone, but I’m not even sure you can do that with the Google Homes. I’ll be honest, I’m starting to lean toward yanking most of the voice assistant stuff out of my house in favor of dumb speakers hooked up to Chromecasts, or maybe just going full Sonos (although that has its own privacy issues).
Update: It looks like Apple is halting the program for now and will add a way to disable this in the future. Good for them.