A former Apple contractor who helped blow the whistle on the company’s program to listen to users’ Siri recordings is going public to protest the lack of action taken as a result of the disclosures, reports The Guardian.
In a letter sent to all European data protection regulators, Thomas le Bonniec said: “It is worrying that Apple (and undoubtedly not just Apple) keeps ignoring and violating fundamental rights and continues their massive collection of data. I am extremely concerned that big tech companies are basically wiretapping entire populations despite European citizens being told the EU has one of the strongest data protection laws in the world. Passing a law is not good enough: it needs to be enforced upon privacy offenders.”
Le Bonniec worked as a subcontractor for Apple in its Cork offices, transcribing user requests in English and French, until he quit in the summer of 2019 over ethical concerns with the work.
“They do operate on a moral and legal grey area,” he told The Guardian at the time, “and they have been doing this for years on a massive scale. They should be called out in every possible way.”
In August 2019, it was reported that contractors regularly heard confidential medical information, drug deals, and recordings of couples having sex as part of their job providing quality control, or “grading,” for Siri. In response, Apple temporarily halted the grading program. The company resumed it in the fall, but said it had made the following changes:
First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.