Apple has temporarily suspended its practice of using human contractors to grade snippets of Siri commands after users raised concerns about the program. The California-based technology giant's move comes after it recently acknowledged that Siri can record users' conversations without their knowledge, even when it has not been deliberately activated.
Competing voice assistants, such as Amazon's Alexa and the Google Assistant, also use human review to improve their quality and accuracy, but both provide users with a way to opt out. Apple, on the other hand, offers no way to opt out of the grading program; the only option a user has is to disable Siri entirely.
Apple said that only 'a small portion' (less than 1% of Siri commands) of users' conversations was being reviewed by its contractors for quality control, reports The Guardian. The report adds that a former employee detailed the program and claimed that contractors "regularly hear confidential medical information, drug deals, and recordings of couples having sex" as part of their job.
However, Apple still hasn't made clear whether it will continue to record users' conversations. Besides temporarily suspending the program in which contractors listen to Siri recordings, the company said it would also stop saving those voice snippets on its servers. Currently, the tech giant keeps these voice recordings for six months before removing identifying information from a copy that it may retain for two years or more.
The Guardian's report suggested that the whistleblower was "tasked with grading the responses on a variety of factors, including whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with and whether Siri's response was appropriate."
An Apple spokesperson said, "While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading," reports The Verge.
According to Apple, the purpose of the voice grading program is to improve the accuracy of Siri's voice recognition and prevent accidental triggers. "A small portion of Siri requests are analyzed to improve Siri and dictation," Apple told The Guardian following its report. "User requests are not associated with the user's Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements," it added.
But Apple's terms of service did not make clear that people outside the company listen in on users' conversations without their knowledge, mentioning only that 'certain information such as your name, contacts, music you listen to, and searches is sent to Apple servers using encrypted protocols.'

(Edited by Vivek Dubey)