$95 million is a headline-making number, especially when it’s the result of a proposed class action lawsuit settlement in which the claimants accuse Apple of illegally monitoring them through Siri and other Apple devices. If the district court in Oakland, California, overseeing the case approves the settlement, those who owned a Siri-enabled Apple device between September 17, 2014, and December 31, 2024, and who believe they experienced “unintended Siri activation” will be able to file a claim for $20 in compensation.
The case began in 2019, after a Guardian investigation in which a whistleblower came forward alleging “innumerable examples” of Apple devices, including Siri and the Apple Watch, inadvertently listening in on users. At the time, Apple employed numerous third-party contractors to review audio from the devices, including audio the devices had inadvertently captured, though it claimed this was solely for the purpose of improving them, not, as many of the plaintiffs alleged, for selling the data to advertisers.
The company was quick to halt the practice, though not before a public debate erupted over whether Siri was really spying on its users. After all, it wouldn’t be the first time a tech company was accused of surveilling its users’ audio without their knowledge or consent. Not only that, a nearly identical case concerning Google’s voice assistant, with a similar court filing and possibly a similar settlement, is waiting in the wings.
What’s going on with all these listening devices? Are Amazon and Microsoft listening in, too? And does any of this actually constitute a serious breach of privacy?
The answer to that question is both simple and complicated. The experts Vox spoke to suggest that the public outcry over Siri’s data collection may be, relatively speaking, much ado about nothing.
Ah, but that’s the operative word: relatively. Siri’s data collection may not matter in the big picture, but not because it isn’t potentially harmful or unethical. It’s because it’s just a drop in the bucket.
We’ll probably never know the extent of Apple’s audio surveillance
Ongoing concerns about the potentially invasive practices of large technology companies such as Apple and Google may distort our understanding of the Apple case. Alex Hammerstone, a cybersecurity consultant at the security firm TrustedSec, told Vox that the case “may project a lot of people’s concerns about the overall surveillance state.”
In other words, what looks like a major referendum, an opportunity to hold big tech companies accountable for a serious privacy breach, is really what Hammerstone described as “a special case.”
The way Siri and similar smart assistant devices are supposed to work is that you use a “wake phrase,” like “Hey, Siri,” to activate them. The problem is that Apple devices (and, allegedly, those of other tech companies, including Google and Amazon) can be inadvertently activated in any number of ways, and users don’t always know when these activations have occurred. The lawsuit against Apple alleges that not only was its unauthorized listening a deceptive business practice, but that it involved multiple instances in which Apple violated privacy laws as well as the privacy of minors. The Guardian’s 2019 investigation reported that Siri had surreptitiously recorded everything from confidential doctor’s visits to couples having sex to illegal drug deals.
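To illustrate how such accidental activations can happen, here is a minimal, purely hypothetical Python sketch of a wake-phrase detector. Apple has not published how Siri’s on-device detection actually works; the detect_wake_score function, the 0.6 threshold, and the random scoring below are invented for illustration only.

```python
# Purely illustrative sketch of how a wake-phrase detector can misfire.
# None of this reflects Apple's actual Siri implementation, which is not
# public; the scoring model and the threshold are hypothetical.

import random

WAKE_THRESHOLD = 0.6  # hypothetical confidence cutoff for "Hey, Siri"


def detect_wake_score(audio_chunk: bytes) -> float:
    """Stand-in for an on-device keyword-spotting model.

    A real detector returns a confidence that the wake phrase was heard;
    here we fake it with random noise to show how misfires happen.
    """
    return random.random()


def start_recording(trigger_chunk: bytes) -> None:
    """Stand-in for the device beginning to capture the next utterance."""
    print("Recording started (possibly an unintended activation)")


def listen_loop(microphone_chunks):
    for chunk in microphone_chunks:
        score = detect_wake_score(chunk)
        if score >= WAKE_THRESHOLD:
            # If the score crossed the threshold on audio that merely
            # *sounded like* the wake phrase, this is an unintended
            # activation, and the user may never know it happened.
            start_recording(chunk)


# Example: feed 10 fake audio chunks through the loop.
listen_loop(bytes([i]) for i in range(10))
```

The threshold trade-off is the crux: set it too low and the assistant wakes on phrases that merely resemble the trigger; set it too high and it misses genuine requests. That is why accidental activations are hard to eliminate entirely.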
In Apple’s case, the claimants pointed out that Apple, in a 2018 letter to Congress, said in no uncertain terms that Siri would never be activated without users’ express consent, which was arguably a bald-faced lie. Apple, however, has continued to insist that it did not violate the letter of its privacy agreement with customers. And while Apple admitted to the Guardian that it had inadvertently recorded and passed “a small part” of those unauthorized recordings to third-party contractors, it never clarified whether it sold any of that data to advertisers, which was one of the biggest complaints made by customers in the class action lawsuit.
“There is a widespread belief that these devices are listening to you,” Cindy Cohn, executive director of the Electronic Frontier Foundation, told Vox. “People think Facebook is listening to you, listening to all kinds of things, and advertising based on what they’re listening to.”
As it stands, we’re unlikely to find out whether that’s true. As the case against Apple progressed, it seemed as though we would; as recently as 2021, the court found that the “targeted advertising claims … are complemented by the unique nature of word-of-mouth communication.” In other words, the court was sympathetic to the view that your private conversations are unique; you wouldn’t be able to receive advertising based on those conversations unless your privacy had been violated in a major way.
Under the terms of the proposed settlement, however, Apple accepts no liability: not for recording users without their express consent in the first place, nor for any potential misuse of those recordings. And without a trial to force its hand, the company likely won’t have to reveal what it did with all that data.
“The result for all of us is that Apple claims they got these accidental recordings, and they used them to make the system better, but they didn’t use it for other things,” Cohn told Vox. “And we’re not going to know the answer to that … I would say at the end of this case, we’re no closer to knowing whether it’s true than we were at the beginning.”
It’s worth noting that existing US surveillance and technology laws haven’t always kept pace with the rise of “smart” devices. When a similar lawsuit was brought against Samsung and others in 2017 for illegally recording users through its smart TVs, a federal court in New Jersey ultimately dismissed it on a technicality, though the court also seemed skeptical of the allegations in the case. After the suit was dismissed in 2018, a new civil suit was filed; that case is still making its complicated way through the courts. As for the pending Google lawsuit, it’s so similar to the Apple case that it could well have a similar outcome.
One reason it’s difficult to hold tech companies accountable for such violations in the legal system is that the relevant laws, around wiretapping, for example, rely on outdated understandings of the technology in question. “So much data protection these days is contractual and regulatory,” Hammerstone told Vox. “The law has not kept up with the pace of technology.”
Another reason is that it’s difficult to know what has been taken from you in this situation; since, as the court noted, Apple was the party in possession of the intercepted recordings, the claimants struggled to establish exactly what was intercepted. Cohn noted that Apple had deleted many of the recordings in question, making it even harder to pursue the case effectively.
Yet another reason the case faced difficulties is that it’s hard to claim you never consented to your data being collected in one way when you may have unknowingly consented to it being collected in others.
Your phone doesn’t need to listen to you
Cohn told Vox that Apple’s insistence that it wasn’t intentionally and illegally surveilling users was probably true, because, after all, why would it need to?
“It’s a labor-intensive, computationally-expensive thing to track us when they’ve got this cornucopia of completely legal, easy-to-do, low-computational ways to monitor all of us,” she said.
“I think people don’t realize how much of a profile marketers and companies have on each of us and how much data we’re sharing,” Hammerstone told Vox. “Usually we agree to this through these end-user license agreements … even if you could get through them, it’s impossible to read that many words.” Through metadata like website cookies, and through your daily purchases, media consumption, and behavior, companies can learn more about you than you realize. They can engage in highly sophisticated predictive algorithmic marketing that makes you feel watched. Even if you think you only shared a piece of information in a private conversation that your Alexa happened to overhear, you’re sharing it in many other ways without even realizing it.
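As a purely illustrative sketch of the point Hammerstone is making, the toy Python example below builds a crude interest profile from hypothetical browsing metadata and purchase history alone. Every site, item, category, and weight in it is made up; no real ad-tech pipeline is this simple.

```python
# Toy illustration of profile-building from ordinary, consented-to data
# (browsing metadata and purchase history), with no microphone involved.
# All sites, items, categories, and weights here are hypothetical.

from collections import Counter

browsing_events = [
    {"site": "running-shoes-reviews.example", "category": "fitness"},
    {"site": "marathon-training.example", "category": "fitness"},
    {"site": "baby-stroller-deals.example", "category": "parenting"},
]

purchase_history = [
    {"item": "GPS running watch", "category": "fitness"},
    {"item": "prenatal vitamins", "category": "parenting"},
]


def build_interest_profile(browsing, purchases):
    """Count category signals across data sources to rank likely interests."""
    profile = Counter()
    for event in browsing:
        profile[event["category"]] += 1
    for purchase in purchases:
        profile[purchase["category"]] += 2  # purchases weighted more heavily
    return profile


profile = build_interest_profile(browsing_events, purchase_history)
# A marketer could now target "fitness" and "parenting" ads that feel
# uncannily specific, even though no conversation was ever recorded.
print(profile.most_common())
```

The takeaway mirrors Hammerstone’s point: ordinary data sources you have already agreed to share can yield a detailed picture of your life, no eavesdropping required.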
“Everything is constantly being collected about us and put into databases,” Hammerstone said. “Your phone doesn’t need to listen to you. It knows what you buy and all these other things because of all the other data we have about each of us.”
“The good news is your phone probably isn’t listening to you the way you fear,” Cohn said. “The bad news is that doesn’t really change [marketers’] ability to place these unusual ads and make you feel like you’re being watched all the time.”
If all of this makes you feel a bit victimized, you’re not alone; browsers that emphasize privacy controls have grown in popularity in recent years. Still, Cohn was adamant that “we need legal protections and the onus should not be on individuals to try to protect themselves from the ever-increasing reach of data collection into our lives.”
“All of your devices should come with privacy protections that just work and you don’t have to think about it,” Cohn said.
Cohn emphasized the need for expanded privacy laws, more protections for consumers, and an updated legal system that can more easily handle these types of cases. She noted that the Apple case “had to jump through too many hoops that it shouldn’t have had to jump through” and that it shouldn’t have taken four years to reach a settlement.
“We need a comprehensive privacy law that has real teeth that gives people the power to sue,” she said.
The important next step, she says, is to create laws that limit not just what tech companies could do with our data, but the invasive things they’re already doing.
“We need to simplify things in this privacy space, and we need to expand [the law’s] reach to include metadata … instead of just having these cases where we’re trying to push the edge.”