“Alexa, can I trust you?”
Research by Check Point Software Technologies has found "serious security flaws in Alexa." According to the researchers, "in just one click, a user could have given up their voice history, home address and control of their Amazon account."
In recent times, significant technological advances have been made across various fields, including information and communications technology (ICT), artificial intelligence (AI), machine learning, and nanotechnology, among others. These breakthroughs are highly disruptive and can bring about major shifts in how societies function.
These innovations are centred on gathering, processing, and analysing enormous volumes of data, with implications for countless areas of research and development. They could deliver social and economic benefits and increase the efficiency and productivity of various sectors.
However, concerns that these technologies and the ways they are used could pose serious challenges are valid. Easier access and use make most of them vulnerable to exploitation and disruption from across the globe.
What Amazon doesn’t tell users explicitly, as highlighted by an investigation from Bloomberg, is that one of the main ways Alexa improves over time is by having human beings listen to recordings of your voice requests. Of course, this is all buried in the terms and conditions few consumers will ever read.
The process, known as data annotation, is becoming a bedrock of the machine learning revolution that is churning out advances in natural language processing, machine translation, and image and object recognition.
The idea is that an AI model only improves if the data it has access to can be easily broken down and categorised, and it can’t necessarily do that itself. When dealing with different languages, there are many nuances, like regional slang and dialects, that may not have been accounted for during Alexa’s development.
Usually, human beings listen to a recording of the exchange, label the data correctly, and feed it back into the system.
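The annotation loop described above can be sketched in miniature. The toy intent classifier below is a hypothetical illustration, not how Alexa actually works: it scores an utterance by word overlap with labelled training examples, fails on slang it has never seen, and improves once a human annotator feeds the corrected label back in.

```python
from collections import Counter, defaultdict

class IntentClassifier:
    """Toy word-overlap intent classifier (illustrative only)."""

    def __init__(self):
        # Per-intent word counts built from annotated utterances.
        self.word_counts = defaultdict(Counter)

    def train(self, utterance, label):
        """Add a human-annotated (utterance, intent) pair."""
        self.word_counts[label].update(utterance.lower().split())

    def predict(self, utterance):
        """Return the intent sharing the most words, or None if unknown."""
        words = utterance.lower().split()
        scores = {label: sum(counts[w] for w in words)
                  for label, counts in self.word_counts.items()}
        best = max(scores, key=scores.get, default=None)
        if best is None or scores[best] == 0:
            return None  # no overlap with anything seen so far
        return best

clf = IntentClassifier()
clf.train("play some music", "music")
clf.train("what is the weather", "weather")

# Regional slang shares no words with the training data, so the
# model cannot classify it yet.
print(clf.predict("chuck on a tune"))   # None

# A human annotator supplies the correct label, and the corrected
# example is fed back into the system.
clf.train("chuck on a tune", "music")
print(clf.predict("chuck on a tune"))   # music
```

The point of the sketch is the feedback loop, not the model: whatever the underlying algorithm, the corrected human label is what lets the system handle inputs it previously misunderstood.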
Apple, Google, and Facebook all make use of these techniques, and both Siri and Google Assistant owe their intelligence to supervised learning requiring human eyes and ears.
In a statement to Bloomberg, Amazon admitted that it collects “small” samples of Alexa voice recordings in order to “improve” the customer experience.
Most customers don’t realize this is occurring, and there is room for abuse. Recordings can contain identifiable characteristics and biographical information about the speaker. It is not known how long these recordings are stored, or whether the information has ever been stolen by a malicious third party or misused by an employee. Bloomberg’s report calls out instances where annotators have heard what they believed might be criminal activity, in which case Amazon has procedures to loop in law enforcement.
There have been a number of cases where Alexa voice data has been used to prosecute crimes.
When police in Bentonville, Arkansas were faced with an unsolved murder, they looked to a nearby Amazon Echo’s recordings for clues.
The Convention on Cybercrime, also known as the Budapest Convention, is the first international treaty seeking to address cybercrime by harmonizing national laws, improving investigative techniques, and increasing cooperation among nations.
While the treaty has helped develop a more consistent approach to legislation on cybercrime and electronic evidence, increased the number of investigations and prosecutions, and enhanced police-to-police and judicial cooperation among many of the Parties to the Budapest Convention, cybercrime continues to thrive.
Under the GDPR:
Data controllers must design information systems with privacy in mind, for example by using the highest-possible privacy settings by default.
Data controllers must clearly disclose any data collection, declare the basis and purpose for data processing, and state how long data is being retained and if it is being shared with any third parties or outside of the EEA.
Data subjects have the right to request a copy of the data collected by a controller, and the right to have their data erased under certain circumstances.
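The two data-subject rights described above, access and erasure, can be expressed as operations on a controller’s data store. The in-memory store below is a hypothetical sketch for illustration; real controllers must also cover backups, logs, and third-party processors.

```python
import copy

class PersonalDataStore:
    """Hypothetical controller-side store illustrating GDPR
    subject access and erasure requests (not a real system)."""

    def __init__(self):
        self._records = {}  # subject_id -> dict of personal data

    def collect(self, subject_id, data):
        """Record personal data about a subject."""
        self._records.setdefault(subject_id, {}).update(data)

    def subject_access_request(self, subject_id):
        """Right of access: return a copy of everything held."""
        return copy.deepcopy(self._records.get(subject_id, {}))

    def erasure_request(self, subject_id):
        """Right to erasure: delete the subject's data.
        Returns True if any data was actually removed."""
        return self._records.pop(subject_id, None) is not None

store = PersonalDataStore()
store.collect("subject-1", {"voice_history": ["play music"],
                            "address": "example address"})
print(store.subject_access_request("subject-1"))  # copy of the data held
print(store.erasure_request("subject-1"))         # True
print(store.subject_access_request("subject-1"))  # {}
```

The sketch deliberately omits the “certain circumstances” the GDPR attaches to erasure, such as legal retention obligations, which a real implementation would have to check before deleting.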
The GDPR has been the basis for most data protection bills drafted since its creation.
In 2019, the Government of India introduced the Personal Data Protection (PDP) Bill, which would create the first cross-sectoral legal framework for data protection in India.
The PDP Bill aims to legislate mechanisms for the protection of personal data and to set up a Data Protection Authority in the country. The Bill regulates the processing of citizens’ personal data by the government and by companies dealing with the personal data of customers in India.
Through the proposed law, the Government of India intends to work towards data sovereignty by ensuring that certain classes of data are stored within Indian borders.
The bill is largely influenced by frameworks such as the GDPR and the Asia-Pacific Economic Cooperation Privacy Framework.
When implemented, the bill will apply to all enterprises across India other than those specifically exempted. This would include any enterprise that uses automated means to collect data.
In conclusion, while innovation and the development of technologies are necessary to make human life easier, their by-products, namely the erosion of data privacy and the rise of cybercrime, are a harsh reality, and they will continue to thrive as long as they are dealt with laxly.
Therefore, it is high time that countries come together to fight this ‘virtual’ demon.