Siri manual review staff listen to up to 1,000 recordings every day

Recently, the Irish Examiner reported details of Apple's Siri manual grading program. According to one contractor employee, reviewers sort through and analyze up to 1,000 Siri recordings every day.

The employee, who previously worked for GlobeTech, a data analytics company in Ireland, said that Apple shared recordings collected by the Siri voice assistant with these contractors. Most of the recordings were just a few seconds of voice commands, but occasionally private material could be heard, such as a discussion of personal matters or a fragment of a conversation.

The employees' daily work is to classify and study these recordings, judging, for example, whether Siri was accidentally activated, whether the user's question was answered correctly, and whether the response was appropriate.

Every employee responsible for Siri's manual review is also required to sign a confidentiality agreement before starting the job, prohibiting them from disclosing details of the work, or even the fact that they are working for Apple.

The employee also pointed out that although each Siri user's recordings are anonymized, reviewers can still distinguish the speaker's accent; most users had Canadian, Australian, or British accents, and a small dedicated team handled content in other European languages.

After the British newspaper The Guardian exposed the practice last month, GlobeTech began prohibiting employees from carrying mobile phones at work. Apple then announced that it would suspend the Siri manual grading program, and these employees were immediately let go by the company.

In an official statement in early August, Apple said it would suspend listening to users' Siri audio and would let users choose whether to participate in the program in a future software update. In an earlier statement, Apple had said that these Siri conversations were analyzed in a secure environment and that the uploaded audio amounted to less than 1% of total Siri data.

In fact, Apple is not the first company to be called out for listening to users' voice content. Amazon Alexa, Microsoft Cortana, and Google Assistant have all captured and analyzed user speech, with the goal of using human review to train the voice assistants, improving their understanding and responsiveness.

After the Apple incident was exposed, Google also chose to suspend human listening and transcription of Google Assistant recordings in Europe.

But for Apple, which treats personal privacy as one of its products' selling points, such a situation obviously invites more public criticism and controversy. At the same time, the incident exposes the limits of how "smart" current voice assistants really are.

In theory, if a voice assistant were smart enough, the work of analyzing voice could be done by the machine itself, locally on the device, without actually uploading the recordings for manual analysis.

The employee also said that he understands why Apple did this, but added that Apple never informed users that Siri recordings might be uploaded and analyzed. Apple's own privacy policy never mentioned it, which is why the exposure caused so much user dissatisfaction.

"Uploading users' private content without their consent, I think, is the root cause of this incident."

Apple Employee
