{"id":6604,"date":"2022-02-17T16:20:50","date_gmt":"2022-02-17T16:20:50","guid":{"rendered":"https:\/\/pstqb.pt\/?p=6604"},"modified":"2022-02-18T10:43:48","modified_gmt":"2022-02-18T10:43:48","slug":"erro-de-software-leva-siri-a-gravar-conversas-pessoais-dos-utilizadores","status":"publish","type":"post","link":"https:\/\/pstqb.pt\/en\/erro-de-software-leva-siri-a-gravar-conversas-pessoais-dos-utilizadores\/","title":{"rendered":"Software Error Causes Siri to Record Users' Personal Conversations"},"content":{"rendered":"<div data-elementor-type=\"wp-post\" data-elementor-id=\"6604\" class=\"elementor elementor-6604\">\n\t\t\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-6e368fa elementor-section-boxed elementor-section-height-default elementor-section-height-default wpr-particle-no wpr-jarallax-no wpr-parallax-no wpr-sticky-section-no\" data-id=\"6e368fa\" data-element_type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-8b4cf3a\" data-id=\"8b4cf3a\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-b122b8c elementor-widget elementor-widget-text-editor\" data-id=\"b122b8c\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>A <strong><em>software<\/em><\/strong> bug at Apple led to Siri, its virtual assistant feature, recording its users' personal interactions without their consent.<\/p><p>Last week, Apple acknowledged this very serious problem in its most recent update, iOS 15. 
According to Apple, the AI-based virtual assistant recorded people's conversations even though they had declined to allow it: \"The <em>bug<\/em> automatically activated the <em>Improve Siri<\/em> and <em>Dictation<\/em> setting, which gives Apple permission to record, store and review personal conversations with Siri,\" reported <a href=\"https:\/\/www.zdnet.com\/article\/ios-15-4-update-why-youre-asked-to-help-improve-siri-after-updating\/\">ZDNet<\/a>. Later, issuing an apology, the US company said it had fixed the <em>bug<\/em> for \"many\" users. There are still many unanswered questions: the company's statement does not clarify, for example, how many phones were affected, or even when. \"Without transparency, there's no way of knowing who might have their conversations recorded and listened to by Apple employees, despite the user having acted in exactly the way to avoid that scenario,\" added the online portal <a href=\"https:\/\/www.theverge.com\/2022\/2\/8\/22924225\/apple-ios-15-bug-recorded-interactions-siri\">The Verge<\/a>.\u00a0<\/p><p>Technology and AI experts <a href=\"https:\/\/edition.cnn.com\/2019\/08\/19\/tech\/siri-alexa-people-listening\/index.html\">have previously argued in favor<\/a> of these big tech companies listening to our requests - mainly in order to address the flaws in voice-based technology. This is what Amazon's Alexa FAQ says: \"The more data we use to train these systems, the better Alexa works, and training Alexa with voice recordings from multiple customers helps ensure that Alexa works well for everyone.\" In other words, the only way to improve voice-based technology, according to some experts, is to make private interactions listenable. 
It is estimated that in 2020, <a href=\"https:\/\/www.thinkwithgoogle.com\/intl\/en-apac\/country\/india\/ok-google-how-is-voice-making-technology-more-accessible-in-india\/\">more than 60% of Indian users<\/a> used voice assistants on their smartphones for a multitude of tasks - from listening to music, to setting an alarm, or even asking questions.<\/p><p>Florian Schaub, an assistant professor at the University of Michigan who has <a href=\"https:\/\/www.key4biz.it\/wp-content\/uploads\/2018\/11\/cscw102-lau-1.pdf\">studied people's perceptions of privacy<\/a> in the context of the Internet of Things, argues that people tend to personify their devices, which makes them even more inattentive to these kinds of issues. In this sense, when they ask Alexa or Siri innocuous questions, they are not really thinking deeply about these actions; when they realize that someone is listening to their conversations, however, they feel that it is intrusive and a violation of their privacy, and are therefore much more likely to disconnect from these systems.\u00a0<\/p><p>This is an issue that raises a number of concerns not only about users' privacy, but also about the extent to which their data is retained and how it is harnessed and used by these companies. \"VAs work on the basis of users' voices - that's their main feature. All the VAs mentioned above are activated by listening to a specific activation keyword. Although some of the policies state that <em>cloud<\/em> servers do not store data\/voice unless the activation word is detected, there is a constant exchange of voice and related data between their <em>cloud<\/em> servers and the VA device. 
This turns out to be particularly worrying in cases of false activation, when data can be stored without real knowledge,\" according to a report by the <a href=\"https:\/\/internetfreedom.in\/privacy-of-the-people-voice-assistants\/\">Internet Freedom Foundation (IFF)<\/a>.\u00a0<\/p><div>\u00a0<\/div><div style=\"font-size: 15px; font-style: normal; font-weight: 500;\"><span style=\"font-size: 15px;\">The original article\u00a0<\/span><span style=\"font-size: 15px;\">via <em>The Swaddle<\/em><i>\u00a0<\/i><\/span><span style=\"font-size: 15px;\">can be read at:<br \/><\/span><a href=\"https:\/\/theswaddle.com\/apples-siri-was-accidentally-recording-conversations-without-peoples-consent\/\">https:\/\/theswaddle.com\/apples-siri-was-accidentally-recording-conversations-without-peoples-consent\/<\/a><\/div>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<\/div>","protected":false},"excerpt":{"rendered":"<p>A software bug at Apple has led to Siri, its virtual assistant feature, recording personal interactions with its users without their consent. Last week, Apple acknowledged this very serious problem in its most recent update, iOS 15. According to Apple, the AI-based virtual assistant recorded people's conversations, even though they had refused to do so: \"The bug automatically activated the Improve Siri and Dictation setting that gives Apple permission to record, store and review personal conversations with Siri,\" reported ZDNet. Later, issuing an apology, the US company said it had fixed the bug for \"many\" users. There are still many unanswered questions: the company's statement does not clarify, for example, how many phones were affected, or even when. \"Without transparency, there's no way of knowing who might have their conversations recorded and listened to by Apple employees, despite the user having acted in exactly the way to avoid that scenario,\" added the online portal The Verge.  
Technology and AI experts have previously argued in favor of these big tech companies listening to our requests - mainly in order to address the flaws in voice-based technology. This is what Amazon's Alexa FAQ says: \"The more data we use to train these systems, the better Alexa works, and training Alexa with voice recordings from multiple customers helps ensure that Alexa works well for everyone.\" In other words, the only way to improve voice-based technology, according to some experts, is to make private interactions listenable. It is estimated that in 2020, more than 60% of Indian users used voice assistants on their smartphones for a multitude of tasks - from listening to music, to setting an alarm, or even asking questions. Florian Schaub, an assistant professor at the University of Michigan who has studied people's perceptions of privacy, argues that people tend to personify their devices, which makes them even more inattentive to these kinds of issues. In this sense, when they ask Alexa or Siri innocuous questions, they are not really thinking deeply about these actions, but when they realize that someone is listening to their conversations, they feel that it is intrusive and a violation of their privacy, and are therefore much more likely to disconnect from these systems.  This is an issue that raises a number of concerns not only about users' privacy, but also about the extent to which their data is retained and how it is harnessed and used by these companies. \"VAs work on the basis of users' voices - that's their main feature. All the VAs mentioned above are activated by listening to a specific activation keyword. Although some of the policies state that cloud servers do not store data\/voice unless the activation word is detected, there is a constant exchange of voice and related data between their cloud servers and the VA device. 
This turns out to be particularly worrying in cases of false activation, when data can be stored without real knowledge,\" according to a report by the Internet Freedom Foundation (IFF).   The original article via The Swaddle can be read at: https:\/\/theswaddle.com\/apples-siri-was-accidentally-recording-conversations-without-peoples-consent\/<\/p>","protected":false},"author":2,"featured_media":6606,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"categories":[31],"tags":[],"class_list":["post-6604","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-destaque"],"_links":{"self":[{"href":"https:\/\/pstqb.pt\/en\/wp-json\/wp\/v2\/posts\/6604","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/pstqb.pt\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/pstqb.pt\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/pstqb.pt\/en\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/pstqb.pt\/en\/wp-json\/wp\/v2\/comments?post=6604"}],"version-history":[{"count":0,"href":"https:\/\/pstqb.pt\/en\/wp-json\/wp\/v2\/posts\/6604\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/pstqb.pt\/en\/wp-json\/wp\/v2\/media\/6606"}],"wp:attachment":[{"href":"https:\/\/pstqb.pt\/en\/wp-json\/wp\/v2\/media?parent=6604"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/pstqb.pt\/en\/wp-json\/wp\/v2\/categories?post=6604"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/pstqb.pt\/en\/wp-json\/wp\/v2\/tags?post=6604"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}