Beware Of The Voice Trap

Voice interactions are highly convenient, but they can also lead to privacy breaches and misinformation

Illustration by Raj Verma

Forget the mouse and the touchscreen; voice is taking over. But the sheer convenience of going hands-free and commanding voice assistants to access information or get tasks done may come back to bite us sooner than we imagine. By now, just about every technology major has been caught listening to users through one route or another. To operate, improve and personalise voice assistants, Amazon, Apple, Google and Microsoft have to rely on their users and the way they communicate. So, they listen to recordings, without which it would be tough to make these assistants more natural and useful.

Although tech firms have been doing this for a long time, they do not fully disclose it. Besides, most users are not diligent enough to peruse privacy policies as the language is often too complicated. People have also, to some extent, accepted that their names and e-mail IDs are sold to third parties for targeted advertising. Nevertheless, they are surprised, shocked and outraged when reports of unauthorised listening hit the headlines.

Recently, Amazon gave its users the option to delete their commands with another voice command, but no one is sure whether the voice data disappears completely. The company also offered an option to disable human reviews of voice recordings. Not long ago, a report published by the online magazine Motherboard shared actual recordings handled by Microsoft contractors working on the virtual assistant Cortana and Skype translation. Apart from personal information, those recordings included phone sex conversations and voice searches for pornography.

Even though the contractors did not know to whom they were listening and were bound by strict confidentiality agreements, it is an uncomfortable thought. According to a Bloomberg report, Facebook also admitted to listening in on private audio conversations on Messenger, which were later transcribed. Apple said its contractors had studied a few interactions with the voice assistant Siri for quality purposes, but they did not know the Apple IDs linked to the Siri recordings.

If you are feeling spooked, here is yet another danger. When we use virtual assistants to call local businesses via voice commands without verifying that the phone numbers and businesses are legitimate, we are likely to fall victim to scamsters mimicking the original brands. Two such instances have already grabbed public attention: a woman called what she thought was a Zomato number, encountered scamsters instead and lost her bank balance, while a person in Chennai narrowly escaped a similar fate. Do not blame voice-commanded diallers, though; these incidents occurred because the people concerned used phone numbers without adequate cross-checking.

Miscreants are always a step ahead of users, and they seem to have prepared for the time when voice commands become the preferred way to connect with local businesses. Consequently, the Internet has been flooded with fake companies set up to dupe customers. In a world ruled by algorithms, scamsters pay for high search engine rankings and use the right keywords and tags to appear among the top results almost instantly. With your eyes glued to the screen for long hours, you may lose focus, and the little bit of due diligence needed to check whether a phone number looks right is taken out of the equation. Thus, many more fake calls become a real possibility.

Firing people? Let VR help

Firing someone is always stressful for all parties concerned. The inhuman way of doing it is to lock down the person's computer, shut off all access to the office or hand over the dreaded pink slip with little explanation. But in doing so, a company may damage its reputation, and the person terminated may suffer untold harm. A US-based company called Talespin is trying to change all that. The company trains people in soft skills and emotional intelligence, and has now come up with a virtual reality (VR) tool to teach interpersonal skills so that one can behave more humanely in such situations.

Talespin has created a virtual entity called Barry, a man whom you can see if you don a VR headset. The company describes this method as a software-based approach to training soft skills. "The premise behind the software is giving employees a safe space to practise challenging interpersonal situations while using AI (artificial intelligence) to create emotionally realistic characters to stimulate and challenge them," its website says. "When a trainee enters a virtual human training experience, VR provides the environment and medium for the most effective delivery of the training scenario, while AI gives users a counterpart to interact with and carry out the other half of the simulated conversation."

One can interact with Barry and practise firing him, changing the conversation again and again until it feels right. Barry responds not just with words but with full body language and facial expressions, showing distress, arguing and shouting to raise the challenge.