Contractors who review Apple’s Siri recordings for accuracy and to help improve the product may be hearing personal conversations, The Guardian reported over the weekend.
This problem was revealed by an Apple whistleblower who said he was concerned about the lack of disclosure, especially considering the frequency with which accidental activations pick up extremely sensitive personal information.
“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on,” the whistleblower said. “These recordings are accompanied by user data showing location, contact details, and app data.”
Apple explained to The Guardian that “Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
The company added that less than one percent of daily Siri activations are used for grading, and those used are typically only a few seconds long.
Ars Technica noted that as voice assistants have grown in popularity, the technology has been experiencing a parallel rise in concerns about privacy and accuracy.
For example, Amazon recently outlined its policies for keeping and reviewing recordings in response to questions from Sen. Chris Coons. In addition, a whistleblower report from a Google contractor said that workers had heard conversations between parents and children and at least one possible case of sexual assault.
© 2025 Newsmax. All rights reserved.