Software Error Causes Siri to Record Users' Personal Conversations

A software bug at Apple led to Siri, its virtual assistant, recording users' personal conversations without their consent.

Last week, Apple acknowledged this serious problem in its most recent update, iOS 15. According to the company, the AI-based virtual assistant recorded people's conversations even after they had opted out: "The bug automatically activated the Improve Siri & Dictation setting, which gives Apple permission to record, store and review personal conversations with Siri," ZDNet reported. Apple later issued an apology and said it had corrected the bug for "many" users. Many questions remain unanswered: the company's statement does not clarify, for example, how many phones were affected, or when. "Without transparency, there's no way of knowing who might have their conversations recorded and listened to by Apple employees, despite the user having acted in exactly the way to avoid that scenario," The Verge added.
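Apple has not published the bug's mechanics, but one plausible failure mode for this class of bug is a settings migration that loses the user's stored choice and falls back to an opted-in default. The sketch below is a hypothetical illustration in Swift; the key name and the migration logic are assumptions, not Apple's actual implementation.

```swift
import Foundation

struct PrivacySettings {
    // Hypothetical key name; how Apple actually stores this setting is not public.
    static let improveSiriKey = "improveSiriAndDictation"

    // Buggy behavior: if the stored value is lost during an OS update,
    // the code falls back to `true`, silently re-enabling the opt-in
    // even for users who had explicitly turned it off.
    static func consentAfterBuggyMigration(_ defaults: UserDefaults) -> Bool {
        return defaults.object(forKey: improveSiriKey) as? Bool ?? true
    }

    // Correct behavior: a missing value means "never consented",
    // so the feature stays off until the user explicitly opts in.
    static func consentAfterFixedMigration(_ defaults: UserDefaults) -> Bool {
        return defaults.object(forKey: improveSiriKey) as? Bool ?? false
    }
}
```

The entire difference between the two functions is the default chosen when the stored value is absent, which is why a bug like this can pass casual testing: it only bites users whose earlier opt-out fails to survive the update.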

Technology and AI experts have previously argued in favor of these big tech companies listening to our requests, mainly in order to fix flaws in voice-based technology. This is what Amazon's Alexa FAQ says: "The more data we use to train these systems, the better Alexa works, and training Alexa with voice recordings from multiple customers helps ensure that Alexa works well for everyone." In other words, the only way to improve voice-based technology, according to some experts, is to make private interactions listenable. It is estimated that by 2020, more than 60% of Indian users had used voice assistants on their smartphones for a multitude of tasks, from listening to music to setting an alarm or asking questions.

Florian Schaub, an assistant professor at the University of Michigan who studies people's perceptions of privacy, argues in a recent article in the magazine "The Internet of Things" that people tend to personify their devices, which makes them even less attentive to these kinds of questions. When they ask Alexa or Siri innocuous questions, they are not really thinking deeply about those actions; however, when they realize that someone is listening to their conversations, they feel it is intrusive and a violation of their privacy, and are therefore much more likely to disconnect from these systems.

This is an issue that raises a number of concerns, not only about users' privacy but also about how much of their data is retained and how it is harnessed and used by these companies. "VAs work on the basis of users' voices - that's their main feature. All the VAs mentioned above are activated by listening for a specific activation keyword. Although some of the policies state that the cloud does not store data/voice unless the activation word is detected, there is a constant exchange of voice and related data between the cloud servers and the VA device. This turns out to be particularly worrying in cases of false activation, when data can be stored without the user's real knowledge," according to a report by the Internet Freedom Foundation (IFF).
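To make the false-activation risk the IFF describes concrete, here is a minimal sketch of a generic wake-word pipeline in Swift. The type names, the detector interface and the 0.85 threshold are assumptions for illustration only, not any vendor's actual code.

```swift
import Foundation

// Illustrative types; real assistants use trained acoustic models.
struct AudioFrame {
    let samples: [Float]
}

protocol WakeWordDetector {
    // Confidence, from 0 to 1, that the frame contains the activation keyword.
    func score(_ frame: AudioFrame) -> Double
}

final class VoiceAssistantPipeline {
    private let detector: WakeWordDetector
    private let activationThreshold: Double
    private let uploadToCloud: (AudioFrame) -> Void

    init(detector: WakeWordDetector,
         activationThreshold: Double = 0.85, // tuning is vendor-specific
         uploadToCloud: @escaping (AudioFrame) -> Void) {
        self.detector = detector
        self.activationThreshold = activationThreshold
        self.uploadToCloud = uploadToCloud
    }

    // On-device loop: audio is meant to stay local until the detector fires.
    func process(_ frame: AudioFrame) {
        if detector.score(frame) >= activationThreshold {
            // A false activation (ordinary speech that scores above the
            // threshold) reaches this branch, so audio is uploaded and
            // potentially stored without the user's real intent.
            uploadToCloud(frame)
        }
        // Below the threshold, the frame is simply discarded on device.
    }
}
```

In this model, everything below the threshold never leaves the device; the privacy exposure is concentrated entirely in how reliably the detector separates the keyword from ordinary conversation.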
