Software Bug Causes Siri to Record Users' Personal Conversations

A bug in Apple's iOS led Siri, the company's virtual assistant, to record users' personal conversations without their consent.

Last week, Apple acknowledged this very serious problem in its latest update, iOS 15. According to the company, the AI-based virtual assistant recorded people's conversations even when users had opted out of such recording: "The bug automatically activated the Improve Siri and Dictation setting that gives Apple permission to record, store and review personal conversations with Siri," ZDNet reported.

Later, in issuing an apology, the US company said it had fixed the bug for "many" users. Many questions remain unanswered: the company's statement does not clarify, for example, how many phones were affected, or when. "Without transparency, there is no way to know who may have their conversations recorded and listened to by Apple employees, even though the user acted in exactly the way to avoid that scenario," added online portal The Verge.

Technology and AI experts have previously argued that big tech companies need to listen to our requests, especially in order to iron out the flaws in voice-based technology. Amazon's FAQ on Alexa puts it this way: "The more data we use to train these systems, the better Alexa works, and training Alexa with voice recordings from diverse customers helps ensure that Alexa works well for everyone." In other words, according to some experts, the only way to improve voice-based technology is to make private interactions listenable. It is estimated that by 2020, more than 60% of smartphone users in India were using voice assistants for a multitude of tasks: listening to music, setting alarms, even asking questions.

Florian Schaub, an assistant professor at the University of Michigan who has studied people's perceptions of privacy, argues that people tend to personify their devices, which makes them even less attentive to these kinds of issues. When they ask Alexa or Siri innocuous questions, they are not thinking deeply about those actions; but when they realize that someone is listening in on their conversations, they feel it is intrusive and a violation of their privacy, and are therefore much more likely to disconnect from these systems.

This issue raises concerns not only about users' privacy, but also about how much of their data is retained and how it is used by these companies. "VAs work based on users' voices: it is their main feature. All of the VAs mentioned above are activated by listening for a specific activation keyword. Although some of the policies state that cloud servers do not store data or voice unless the activation word is detected, there is a constant exchange of voice and related data between the cloud servers and the VA device. This turns out to be particularly worrisome in cases of false activation, when data can be stored without the user's actual knowledge," according to a report by the Internet Freedom Foundation (IFF).
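To see why false activations matter, consider a minimal sketch of the wake-word gating the IFF report describes. This is purely illustrative and not any vendor's implementation: a loose text match stands in for a trained acoustic model, and the function names are hypothetical placeholders.

```python
# A minimal, hypothetical sketch of wake-word gating. Not Apple's or
# Amazon's code: the loose prefix match below merely mimics how a real
# acoustic model can misfire on similar-sounding speech.

LOOSE_PREFIXES = ("sir", "alex")  # deliberately fuzzy, like a real detector


def detect_wake_word(transcript: str) -> bool:
    """Toy detector: fires if any word loosely resembles a wake word."""
    return any(word.startswith(LOOSE_PREFIXES)
               for word in transcript.lower().split())


def stream_to_cloud(transcript: str) -> None:
    """Placeholder for the device-to-cloud upload the IFF report describes."""
    print(f"uploaded: {transcript!r}")


phrases = [
    "Siri, set an alarm for seven",    # intended request
    "yes sir, I will handle it",       # false activation: "sir" ~ "Siri"
    "nothing assistant-related here",  # correctly ignored
]

for phrase in phrases:
    if detect_wake_word(phrase):   # everything past this gate leaves
        stream_to_cloud(phrase)    # the device, intended or not
```

The point of the sketch is the gate itself: any audio the detector mistakes for the wake word is treated exactly like a deliberate request, which is how private speech can end up stored without the user's knowledge.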