It has been confirmed that Apple Siri conversations are being reviewed by external contractors. Worse still, the voice assistant is easily triggered by accident, sometimes picking up private conversations such as people talking to their doctor, drug deals and sexual encounters, according to a report by U.K.-based news outlet The Guardian.

Devices such as the Apple Watch are often accidentally activated when a person lifts their wrist, and the assistant frequently “wakes” to the sound of a zipper, an Apple contractor told The Guardian. Apple does not explicitly state in its consumer-facing privacy documentation that humans listen to these recordings.

The whistleblower told The Guardian they had witnessed countless instances of recordings “featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on.”

Concerningly, the recordings are accompanied by user data showing “location, contact details, and app data.”

Apple has responded to the claims, saying that only 1% of recordings are used to improve its responses to user requests and to measure when the device is activated accidentally. However, even 1% is not a small number: there are a reported 500 million Siri-enabled devices in use.

Apple said it does not store the data with Apple ID numbers or names which could identify the person. The firm told The Guardian: “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

Siri is not the only voice assistant whose recordings of user requests are reviewed by humans. In April, it was revealed that Amazon’s voice assistant Alexa was sometimes recording private conversations. Then in July, it emerged that the Google Assistant was doing the same.
