Controversy has surrounded the way voice assistant software is built ever since the technology's inception. While many are thrilled by the impact smart speakers have had on their daily lives, others remain conflicted about whether owning one means sacrificing their privacy.
In 2014, Amazon made waves in the tech world when they released the first-ever smart speaker, the “Echo.” The product is activated by wake words like “Alexa,” “Echo,” or “Computer,” although in an effort to humanize the interaction, “Alexa” has become the most popular.
From setting alarms to streaming music, the Echo’s voice interaction features capitalize on convenience. The ability to make hands-free commands is what has drawn millions of people to purchase the Echo and inspired other companies like Google and Microsoft to release their own versions.
As globally successful as smart speakers are, many people are still fearful of the microphones installed in them. Skeptics of the devices worry about the voice software listening in on their daily conversations, leaving them to wonder what companies do with the private information they collect.
It was recently revealed that Amazon has a team of people dedicated to improving Alexa’s assistance capabilities. These employees listen to voice recordings captured by the Echo and use them to train the software’s algorithms on the nuances of human speech.
Many people were shocked by how little privacy they retain when buying a smart speaker, though not everyone was surprised. Fizza Abbas, a sophomore studying sociology, claims that “privacy does not exist in this day and age. The collection of voice data is the only way for companies to give consumers the best service.”
Amazon has openly said that they use recorded requests to Alexa to train their “speech recognition and natural language understanding systems,” but they have never explicitly admitted that it is humans who listen to the voice recordings. The company’s website also states that the Echo can only be triggered by a wake word, but it does not warn readers that the Echo can be set off by random noises and begin recording automatically. Amazon employees still review that audio, even when the activation was a mistake.
Juhura Akhi, a sophomore studying early childhood education, thinks that “buying these kinds of high-tech products means trading in your privacy and that’s a fact you have to accept.”
Presumably, being too public about the manual labor involved in Alexa’s training process, and about the system’s unfortunate faults, could instill anxiety in potential consumers of the device.
Although the voice review process can be tedious and dull, workers sometimes come across audio that is worrisome or embarrassing. From sexual assault to distraught children, the listeners hear it all and are left unable to take any action. An Amazon spokesperson has made it clear that the company takes “the security and privacy” of their “customers’ personal information seriously,” yet they believe it would not be their place to report problematic recordings.
When asked for their opinions on today’s technology invading people’s privacy, students around the Queens College campus seemed unsurprised.
Shazna Olid, a sophomore studying psychology, believes this “sparks a bigger conversation about ethics vs. money, and which is placed on a higher pedestal.” The revelations about Amazon’s Echo also reaffirmed all her “conspiracy theories about how we are never truly alone.”
Even with the controversial aspects of the Echo’s production process, Amazon believes that their choice to have a team of people listen to private recordings is the only way to advance Alexa’s speech recognition abilities.
If this is the case, then are consumers willing to sacrifice their privacy for the benefits of voice assistant services? As the world becomes more reliant on technology, and the expectation of its optimization only increases, it seems like they may no longer have a choice.