Apple Agrees to $95 Million Settlement in Siri Eavesdropping Case

Apple has agreed to pay $95 million to settle a long-running class action lawsuit that accused the company of illegally intercepting customer conversations through its Siri virtual assistant and sharing snippets of those conversations with human reviewers.
The lawsuit was originally filed in 2019 after a whistleblower told The Guardian that third-party contractors Apple had hired to review Siri's responses sometimes heard private interactions, ranging from patients talking to doctors to people having sex or buying drugs. While Apple stated that Siri only activated its listening mode after detecting its wake phrase, "Hey Siri," The Guardian reported that the assistant had activated by mistake and started recording conversations in response to similar-sounding words and even the sound of zippers.
The lead plaintiff in the class action lawsuit, Fumiko Lopez, claimed that Apple devices improperly recorded her daughter, a minor, mentioning brand names such as Olive Garden and Air Jordans, and then served her ads for those brands in Apple's Safari browser. Other plaintiffs reported that their Siri-enabled devices went into listening mode without hearing "Hey Siri" while they were having intimate conversations in their bedrooms or with their doctors.
In their suit, the plaintiffs characterized the invasion of privacy as particularly egregious, since a core component of Apple's marketing strategy in recent years has been to promote its devices as protective of user privacy. For example, an Apple billboard at the 2019 Consumer Electronics Show read "What happens on your iPhone, stays on your iPhone," according to the lawsuit.
The proposed settlement, filed in California federal district court on Tuesday, covers people who owned Siri-enabled devices from September 17, 2014 to December 31, 2024 and whose private communications were recorded by an unintended Siri activation. Payment amounts will be determined by how many Apple devices a class member owns that improperly activated a listening session.
Apple also agreed to confirm that it permanently deleted recordings collected by Siri before October 2019 and to publish a web page explaining how customers can opt in to its Improve Siri feature, which allows the company to collect and review audio recordings for quality control.
Apple did not immediately respond to a request for comment.
Shortly after The Guardian's report, Apple temporarily suspended all human grading of Siri responses and acknowledged that "we have not fully lived up to our high ideals." The company said it would resume human grading after releasing software updates, that graders would in the future receive computer-generated transcripts of conversations rather than the audio itself, and that only Apple employees, not third-party contractors, would perform the grading.
2025-01-02 18:10:00