Apple is Regularly Listening to Your Private Siri Conversations: Report

By Aadil Raval

After similar reports about both Amazon and Google, Apple has come under scrutiny after a whistleblower revealed that its contractors regularly hear Siri recordings. The report was published in The Guardian, which kept the whistleblower anonymous. According to the report, contractors at Apple regularly hear recordings captured by Apple devices, including couples making love and even drug deals.

When asked, Apple stated that it does send recordings to its contractors for a process called grading. The contractors rate recordings on several factors, including whether a recording was triggered accidentally or deliberately, that is, whether Apple's voice assistant Siri was actually summoned. They also grade Siri's response when a user asks it to do something, as well as the appropriateness of the resolution provided.

Furthermore, the report adds that since Apple ships Siri across a range of devices, such as the Apple Watch, HomePod, iPhone, and iPad, the Apple Watch stands out as one of the most frequent sources of mistaken recordings. The same holds for the HomePod, although the probability of a mistaken recording is comparatively higher on the Apple Watch than on other devices.

On an accidental trigger, Siri collects a recording of up to 30 seconds, which is then routed to these contractors along with details such as app data, contact details, and location. Apple maintains that only a small portion of recordings worldwide is sent for grading. Grading helps Apple understand where and how Siri should be improved and how it can better recognize the commands it is given.


As per Tom’s Guide, the Siri recordings range from conversations between doctors and patients to sexual encounters, business deals, and even criminal dealings such as drug deals. The Guardian reported that Apple does not tie the data obtained from Siri to other Apple services. It is also not easy to link two or more recordings to any one individual, as no identifier or specific name is associated with the recordings.

 

