Apple Apologizes for Listening In, Announces Changes to Make Up for It

Apple has been in the news lately following a controversial leak regarding one of the company's most important features: Siri. It emerged that the team behind Siri had been listening in on people's conversations with the assistant, which led to a wave of privacy concerns on social media in the form of tweets, blog posts, and more. After Facebook and Google were found to have breached users' privacy, it comes as no shock that Apple has done so, too.
However, Apple admitted that it had been listening in, explaining that this was done to improve Siri's performance and tailor its responses to what most Apple users require. The company calls this process "human grading of Siri requests". While it was part of Siri's review process, it was still a breach of users' privacy, since Apple's employees were listening to the voices of millions of users, often including private details, and those users are not happy about it.
Acceptance is the first step towards repentance, and that is what Apple has just done. The company responded to the controversy by responsibly acknowledging its mistake; it not only apologized but also introduced a number of changes to make up for it and to make its users feel safe again. Read on to find out what has changed in Siri's privacy policy.
Apple has updated Siri's privacy policy to ensure that recordings are no longer stored and retained in the company's database. This means that whatever you say to Siri will not be listened to by the company's employees for Siri grading or any other purpose. However, Apple has made it clear that the computer-generated responses and dialogues produced by Siri will be kept in the database for quality-assurance purposes.
Following the controversy over Apple listening to people's audio commands to Siri, what concerned people even more than the listening itself was that Apple neither disclosed the practice nor obtained users' consent. Apple has now wisely added an option to its privacy policy that lets users choose whether they are okay with Apple listening to their audio. Those who opt in can opt out again at any time. Since Apple has clearly announced that this will be done with consent, it hopes that people will willingly help Siri learn and improve. This way, Apple gets the information and people keep their privacy. It's a win-win update.
Since the company was outsourcing Siri grading to third-party contractors who listened to users' interactions with Siri, it could not itself promise that the information would stay safe. Hence, Apple has now announced that only Apple employees will be able to hear your audio, and only if you allow them to. This way, your private information is to be kept secure by the company, which says it has "strong privacy controls" for any information in its care. Apple also indicated that user-related information will not be shared or sold to any third parties, especially after this controversy.
It is true that personal privacy is at risk when using modern technology, across software ranging from Apple's products to Google's. Regardless, since more and more people are becoming aware of their right to keep their information to themselves, such news is breaking out as controversy more and more often.
While the company took the right step at the right time and has attempted to make amends, it is also notable that the original practice was intended to improve the user experience. The changes to Siri's privacy policy are not due to take effect before fall 2019, but they are definitely what the world hoped for. Do these changes make a difference, though? Probably, for users; but Apple has now limited the customized quality checks and updates for Siri, since computer-generated information will not yield the results that human analysis did. This might mean a slowdown in Siri updates that once matched users' requirements promptly.
Whether or not the company's changes to Siri's privacy policy prove effective is another discussion; these changes were a necessity to retain the company's customer base and brand reputation. Even though only humans could do Siri grading the way it was being done, customer satisfaction lies in consent and privacy protection, which Apple has granted in time.