Setting up meetings, scheduling appointments or ordering food—voice-activated digital assistants are gaining in popularity by the day. Virtual assistants can easily look up information, help navigate your driving route, and control Internet of Things (IoT)-connected smart devices at home. This listening technology is used in most smartphones as well. According to a recent study by Google, more than one-fifth of searches on phones are already made by voice. While some form of user authentication (password, PIN, facial recognition or biometric validation) protects access to mobile devices, the case for most home digital assistants is different—a hot word is used to activate them. These assistants record voice data and stream it to cloud-based servers, where the data is deciphered by machine learning algorithms. Most of these devices are potentially always active. If exploited, they can expose personal data to cybercriminals.
The primary issue with using a virtual assistant stems from the fact that conversations with the assistant are recorded and stored by the company on its servers without the user's knowledge.
What is more worrisome is that even when we have not actively engaged with the virtual assistant, the device remains active, as it is designed to be "always listening". Given the sophistication of the neural networks and other machine learning tools embedded in these devices, it is highly likely that private conversations are being transmitted back to company servers, where they can be analysed and stored. The implications of this activity are concerning. Private conversations could be used for marketing purposes, such as creating targeted advertisements, or for malicious ends, such as obtaining sensitive personal data like credit card details and social security numbers.
According to ‘Voice Report 2019’, published by Microsoft, 41% of voice assistant users have concerns about privacy. One such example concerns a US family.
The concerns were recently realized when members of the family reported that a virtual assistant-enabled device from a leading company recorded their private conversation and then promptly sent it to one of the contacts on the family's list, an employee of one of the family members.
There is speculation as to what exactly happened, but the company's investigation revealed that, in an unfortunate and unusual string of events, the virtual assistant had misheard words in the background conversation and interpreted them as commands. The incident illustrates how virtual assistant devices could be responsible for potentially serious personal data breaches.
As this technology is widely used and very little information about it is publicly available, it is difficult to quantify its vulnerabilities. Therefore, if people want to continue using these devices in their current state, there are a few things they should consider to protect their data.
■ Keep voice assistants deactivated by default; activate them only when they are to be used.
■ Refrain from connecting any sensitive accounts, such as those of banks or credit cards, to such devices.
■ Tighten device settings to limit the data these devices acquire.
■ Delete recordings of commands stored in the device.
■ Unless necessary, disable communication features such as messaging and voice or video calls.
■ Disable voice purchases.
■ Disable the microphone and camera functions.
■ Disable the function that uses stored data to modify and improve services the device provides.
In conclusion, it is ultimately up to the user to decide and control the extent of data exposed through such user-friendly utilities.
The strongest deterrent is user awareness, followed by security controls enabled on the devices.
Technology is a wonderful tool that provides the benefits of convenience and entertainment. Artificial intelligence, virtual assistants and smart home technology are relatively new concepts that will, with time, become a necessary part of our homes and lives. Prudent use of these innovative products and devices will ensure that their benefits are reaped while risks to sensitive personal data are mitigated.
Manish Sehgal is partner at Deloitte India