A software bug at Apple led its virtual assistant, Siri, to record personal interactions with users without their consent. Last week, Apple acknowledged the problem, which affected its most recent update, iOS 15. According to Apple, the AI-based virtual assistant recorded people's conversations even when they had opted out: "The bug automatically activated the Improve Siri and Dictation setting that gives Apple permission to record, store and review personal conversations with Siri," ZDNet reported.

The US company later apologized and said it had fixed the bug for "many" users. Many questions remain unanswered: the company's statement does not clarify, for example, how many phones were affected, or when. "Without transparency, there's no way of knowing who might have their conversations recorded and listened to by Apple employees, despite the user having acted in exactly the way to avoid that scenario," added the online portal The Verge.

Technology and AI experts have previously argued in favor of big tech companies listening to our requests, mainly to address flaws in voice-based technology. Amazon's Alexa FAQ states: "The more data we use to train these systems, the better Alexa works, and training Alexa with voice recordings from multiple customers helps ensure that Alexa works well for everyone." In other words, according to some experts, the only way to improve voice-based technology is to make private interactions listenable.

It is estimated that in 2020 more than 60% of Indian users used voice assistants on their smartphones for a multitude of tasks, from listening to music to setting alarms or asking questions. Florian Schaub, an assistant professor at the University of Michigan who has studied people's perceptions of privacy, argues that people tend to personify their devices, which makes them even less attentive to these kinds of issues.
In this sense, when users ask Alexa or Siri innocuous questions, they are not thinking deeply about those actions; when they realize that someone is listening to their conversations, they feel it is intrusive and a violation of their privacy, and are therefore much more likely to disconnect from these systems. The issue raises concerns not only about users' privacy but also about the extent to which their data is retained, and how it is harnessed and used by these companies.

"VAs work on the basis of users' voices - that's their main feature. All the VAs mentioned above are activated by listening to a specific activation keyword. Although some of the policies state that cloud servers do not store data/voice unless the activation word is detected, there is a constant exchange of voice and related data between your cloud servers and the VA device. This turns out to be particularly worrying in cases of false activation, when data can be stored without real knowledge," according to a report by the Internet Freedom Foundation (IFF).

The original article via The Swaddle can be read at: https://theswaddle.com/apples-siri-was-accidentally-recording-conversations-without-peoples-consent/
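The activation-keyword gating the IFF report describes, and the false-activation risk it flags, can be illustrated with a minimal sketch. Everything here is hypothetical: real assistants use on-device neural keyword spotters rather than string matching, and the function name, wake-word list, and confidence threshold are all invented for illustration.

```python
# Illustrative sketch of wake-word gating (not any vendor's actual logic).
# In principle, only audio following a detected activation keyword should
# be streamed to cloud servers.

WAKE_WORDS = {"hey siri", "alexa"}  # hypothetical keyword list


def should_stream_to_cloud(transcribed_audio: str,
                           detector_confidence: float,
                           threshold: float = 0.9) -> bool:
    """Return True if the audio would be sent to the cloud.

    A "false activation" is the case the IFF report warns about: the
    detector fires on similar-sounding speech (or, as with the Siri bug,
    a setting is silently re-enabled), so audio leaves the device and may
    be stored without the user's real knowledge.
    """
    heard_wake_word = any(w in transcribed_audio.lower() for w in WAKE_WORDS)
    return heard_wake_word and detector_confidence >= threshold
```

The point of the gate is that everything before the keyword is supposed to stay on the device; a bug that bypasses or silently re-enables this check is exactly what turns a convenience feature into unconsented recording.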