
SPECIAL REVIEW: 17 Reasons Why Amazon Alexa and Other Voice Assistants and "Smart" Speaker Devices Are Junk for All Users


By The Treasure-Sharer

 

[Image: Amazon frown]

Forewarned is Forearmed


"Thank God I never used it," was what The Vagabond said, when my man, The Adventurer, showed her recent evidence confirming what he had told her about how the Google Home could spy on her and record everything she said to and did near it, back when he had warned her not to use one she had been gifted by a colleague.


I had similar thoughts to The Vagabond's when my man later shared the information with me. My dad had actually bought an Amazon Alexa device five years ago and tried to show it off to me, only to be met with my disdain and disapproval; armed with the foreknowledge my man had shared with me, I made sure I wasn't exposed to Amazon Alexa either.


[Image: Amazon Alexa Echo device photo from Pexels]

Rather than feeling frightened, flustered, or angry when my man shared the recent news that an Amazon Alexa class action lawsuit alleging privacy violations had been approved in the United States (see Alexa's alleged secret recordings get parent Amazon in trouble), while another had been filed in Canada last month (see the lawyers' Amazon Alexa Class Action page for more information) -- lawsuits that confirmed everything my man had warned us about: that the application spies on and records its users, and uses the data it collects without their knowledge -- The Vagabond and I felt relieved and grateful that we had avoided that can of worms completely, thanks to my man's prudence and guidance.

 

Contrast that with the frown Amazon probably wore, like the one featured at the start of our article (modified from an image from teleread.com), at its consumers finally starting to be able to sue it and hold it accountable for its lies and for violations of the terms it forced users to agree to in order to use its privacy-violating products.

 

My man had suspected from the start that Amazon Alexa and other voice assistants like Google Assistant, Apple's Siri, and Facebook Portal couldn't be trusted, and so had warned me and The Vagabond about them when they came out. He already knew that phones could be tapped and that our web behavior was being tracked and recorded, and thought it wasn't much of a stretch to infer that smart speaker devices would be used in the same way, also without our knowledge. The evidence that later came out only confirmed his suspicions.


I'm glad that we didn't wait for the confirmation before heeding his warnings and advice, so that we escaped at least this form of digital home invasion.

It's hard to believe that people trust smart speakers (like the Amazon Echo in the photo above, taken from pexels.com) enough to place them in the most private places of their homes.

 

See photos of a Google Nest Mini, Amazon Echo, and Apple HomePod, taken from pexels.com, below, for examples of virtual voice assistant smart speakers to look out for and stay away from, especially when visiting other people's homes. 

 

[Image: Google Home photo from Pexels] [Image: Amazon Alexa device photo from Pexels] [Image: Apple HomePod photo from Pexels]

 

Voice assistants are software that respond to or perform actions based on voice commands, while smart speakers are hardware with voice assistants built in: voice-controlled devices that let users perform everyday tasks, such as turning on the lights, phoning someone, or playing music, with simple voice commands.

 

 

Extreme Circumstances Call For Extreme Measures


We usually use, test, and give the things we review a chance before deeming them a gem or junk, but we've decided to make an exception this time. We believe that bringing any of the voice assistants we're reviewing -- and the smart speakers that use them -- into your home and personal spaces will only cause you regret, and we want to help you skip the pain we avoided: avoid Alexa, Google Assistant, Siri, and other voice assistants if you haven't already let them in, and ditch them if you're currently using them.

 

They not only compromise your privacy and security, but can also guide you in wrong directions (with hidden agendas, and sometimes flat-out wrong answers), negatively affect your health and well-being, and cause other harms that we will lay out in our review.

 

My man recommended that I cover this subject as a special type of review because of all the recent evidence that has come out -- evidence that can help others finally see and understand what he already knew and had wanted to share for years, but couldn't tangibly "prove."

 

I was grateful to him for laying the foundation, doing the initial research, and providing key resources to help me start the review, which ended up branching off into other related issues of concern, and grew and grew into what we are now presenting to you.
 
Honestly, the more I looked into voice assistants and related smart devices to research this article, the more dirt and secrets I uncovered, which is why we've had to spend extra time writing this article, and had to put it out much later than we wanted or intended.

 

Thankfully, the gem info we've compiled has been worth the work and wait, and I feel as jolly as the pirate pictured below to finally be able to share the treasure chest of things you need to know about these dangerous devices with you all.

 

[Image: pirate holding treasure chest]


Read on for 17 reasons why Amazon Alexa, Google Assistant, Apple's Siri, Facebook Portal, and other similar voice assistants, and the smart speakers and smart home devices that use them, are junk -- not just for us, but for everyone (excluding only the companies that created and continue to profit from them).

 

 

1) The companies behind the devices have continually deceived -- and flat-out lied to -- their customers about their voice assistants, smart speakers, and related smart home devices.


[Image: Pinocchio]

The companies -- Amazon, Google, Apple, and Facebook being some of the biggest -- making and selling virtual voice assistants and related smart speakers and smart home devices have deceived their users about what information they record, the various kinds of recordings that are made, how long they keep recorded data, how much control users have over their data's deletion, what they use such information for, and more.



Amazon Alexa Lawsuits

 

A class action lawsuit that was filed in America in 2021 against Amazon was greenlit this July, with the article Alexa, were you spying on us? Amazon faces class action lawsuit reporting that, "A federal judge in Seattle has ruled that tens of millions of Amazon Alexa users can unite in a massive class-action lawsuit accusing the tech behemoth of covertly recording their private conversations, as not disclosing that information properly violates Washington’s consumer protection law."

 

Thanks to the class-action status granted by U.S. District Judge Robert Lasnik, Alexa users across the United States can both sue Amazon collectively for monetary damages, as well as seek a court order to halt privacy violations.

 

You can read the full Order Granting Plaintiff's Motion for Class Certification here: https://fingfx.thomsonreuters.com/gfx/legaldocs/xmvjekqmzpr/Garner%20v%20Amazon%20-%20class%20order%20-%2020250707.pdf

 

As the article Amazon must face US class action over Alexa users' privacy says, the lawsuit, brought on behalf of users who registered one or more Alexa devices, alleges that Amazon violated Washington’s consumer protection law by failing to disclose the retention and use of recordings for commercial gain.

 

With the lawsuit's approved class-action status, the plaintiffs will be able to pursue large-scale claims against Amazon as a group, as opposed to filing individual claims.

 

The plaintiffs alleged that Amazon designed the technology "to illegally and surreptitiously intercept billions of private conversations" that extended beyond commands aimed at Alexa. They are also seeking a court order that would force Amazon to destroy any existing recordings and related data.

 

 

As the Amazon Alexa class action lawsuit filed in Canada this July points out, this isn't the first time that Amazon has been taken to court for falsely representing Alexa information to its users. In May 2023, the United States Federal Trade Commission ("FTC") brought a complaint against Amazon, alleging that Amazon falsely represented that users of the Alexa app could delete their voice recordings, transcripts and associated metadata, but instead, when requested to delete Alexa data, Amazon merely deleted the voice recordings, keeping the transcripts and associated metadata.

 

As the article Amazon Agrees to Injunctive Relief and $25 Million Civil Penalty for Alleged Violations of Children’s Privacy Law Relating to Alexa detailed, the complaint alleged that Amazon retained children's voice recordings indefinitely by default, made deceptive representations that Alexa app users could delete their or their children's voice recordings, including audio files and transcripts, and their geolocation information -- when, in fact, Amazon on some occasions failed to delete all such information at users' request -- and that Amazon engaged in unfair privacy practices with respect to Alexa users' geolocation information and voice recordings, including (in some cases) by failing to honor users' deletion requests, and failing to notify consumers that it had not deleted their recordings as requested. 


The federal district court required Amazon to pay $25 million in civil penalties, and imposed injunctive relief that required Amazon to identify and delete inactive child profiles (profiles that had not been used for 18 months) unless a parent requested that they be retained; and notify parents whose children had accounts of this change to its policies. Amazon was also prohibited from making misrepresentations about Amazon's retention, access to, or deletion of geolocation information or voice information, including children's voice information, and was mandated to delete geolocation information, voice information, and children's personal information upon the request of the user or parent, respectively, and make disclosures to consumers relating to its retention and deletion practices regarding Alexa App geolocation information and voice information.

 

You can read the entire Stipulated Order for Permanent Injunction, Civil Penalty Judgment, and Other Relief here: https://www.ftc.gov/system/files/ftc_gov/pdf/1923128amazonalexaorderfiled.pdf

 

As the website for the Amazon Alexa Class Action lawsuit filed in Canada states: "The lawsuit alleges that Amazon's Alexa products and services collected significantly more personal information about users than was disclosed, retained that information indefinitely -- even after users attempted to delete it [--] and then used that information for Amazon's profit to train algorithms, A.I.s and machine learning software." Among the other details the actual Notice of Civil Claim goes into are the Amazon contract's use of vague language, which made it too difficult for users to understand what they were agreeing to when forced to consent to the contract's terms in order to use the products (not providing sufficient detail, examples, or guidance about how people's data would be used, etc.); its failure to inform users that written transcripts were made of voice recordings and never deleted, even if deletion of the voice recordings themselves could be requested; and the contract's failure to inform users that their personal information was being combined with other users' personal information to train Amazon's algorithms, A.I.s, and machine learning software, using their personal conversations as part of a permanent set of training data that Amazon used across its systems.


You can read the whole Notice of Civil Claim here: https://www.charneylawyers.com/docs/default-source/class-actions-documents/amazon-alexa/filed-civil-claim_redacted.pdf?sfvrsn=63ca4f1_1

 

This look at Amazon's continued privacy violations and deceptive contracts in relation to its Alexa products demonstrates its lack of a sincere attempt to improve its policies and practices, despite previous agreements to address privacy issues related to its devices.

 

Similar lawsuits have been filed against Apple and Google for violations regarding their voice assistants and smart devices.

 

 

Apple Siri Lawsuit

 

[Image: gavel]

The article Apple to pay $95 million to settle Siri privacy lawsuit reported that mobile device owners complained that Apple routinely recorded their private conversations after they activated Siri unintentionally, and disclosed these conversations to third parties such as advertisers -- with two plaintiffs saying that their mentions of Air Jordan sneakers and Olive Garden restaurants triggered ads for those products, while another said he got ads for a brand name surgical treatment after having discussed it with his doctor, in what he had thought was a private conversation.

 

The harm to users is recognized as beginning when Siri incorporated the "Hey, Siri" feature that allegedly led to the unauthorized recordings, with the lawsuit setting the class period as running from Sept. 17, 2014, to Dec. 31, 2024.

 

The estimated tens of millions of class members involved may receive up to $20 per Siri-enabled device, such as iPhones and Apple Watches, but, as the article pointed out, "The $95 million is about nine hours of profit for Apple, whose net income was $93.74 billion in its latest fiscal year."
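As a quick back-of-the-envelope check of that "nine hours" comparison, here is a minimal sketch using only the figures quoted above (the net income and settlement amounts are the article's, not ours):

```python
# Rough sanity check of the "nine hours of profit" comparison, using the article's figures.
annual_net_income = 93.74e9                       # Apple's net income in its latest fiscal year
hourly_profit = annual_net_income / (365 * 24)    # roughly $10.7 million of profit per hour
settlement = 95e6                                 # the Siri settlement amount

print(f"Hourly profit: ${hourly_profit:,.0f}")
print(f"Settlement = about {settlement / hourly_profit:.1f} hours of profit")  # about 8.9 hours
```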


Google Assistant Lawsuits


A similar lawsuit on behalf of users of Google Assistant is pending in federal court in San Jose, California. The article Court Upholds Mass Class Action Opt-Out Permitting Individual Arbitrations showed how Google might have to defend against a class action lawsuit and thousands of related individual arbitrations at the same time: in February 2025, the United States District Court for the Northern District of California rejected Google's efforts to invalidate the mass opt-out of over 69,000 individuals from a class action in order to pursue individual arbitrations. The case centered on allegations that Google Assistant-enabled devices recorded users' private conversations, and that Google used the resulting data without the device users' consent. It stemmed from claims that Google Assistant devices unintentionally recorded private conversations through "False Accepts," with the unintended recordings resulting in unauthorized data collection and privacy violations.

 

In December 2023, the court certified a class of U.S. consumers -- covering individuals who purchased Google Assistant-enabled devices between May 18, 2016, and December 16, 2022 -- with class members notified of their rights, including the option to opt out if they preferred to pursue claims independently; and 69,507 individuals requested to opt out, signaling their intent to leave the class and pursue individual arbitration claims against Google. Google contested the mass opt-out, but the court ruled against Google, upholding the validity of the opt-out request.

 

The number of users who opted out of participating in the group class action shows their recognition of how paltry the payouts for those harmed by the companies' privacy violations really are, as the potential $20-per-device payout in the $95 million Siri settlement showed.

 

 

"Cost of Doing Business" Fine For Big Tech Companies?


As the article Google Nears Settlement in Class Action Privacy Lawsuit said, "Google's revenue since 2019 provides context for the stakes. Alphabet, Google's parent, reported annual revenues of $161.9 billion in 2019, $182.5 billion in 2020, $257.6 billion in 2021, $282.8 billion in 2022, $307.4 billion in 2023, and $350.0 billion in 2024, with advertising -- particularly from Google Search ($175 billion in 2023) -- driving over 77% of earnings. While the settlement amount remains undisclosed, historical fines like Google's $5 billion incognito mode settlement in 2023[,] or $1.375 billion tracking settlement in 2025 suggest [that] penalties are often a fraction of profits. For instance, the $5 billion fine represents just 1.4% of 2024's revenue, raising questions about whether such penalties deter misconduct[,] or are merely a cost of doing business."
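To put those penalties in perspective against the revenue figures quoted above, here is a minimal sketch (all amounts are the ones reported in the article):

```python
# Express the settlements quoted above as a share of Alphabet's reported 2024 revenue.
revenue_2024 = 350.0e9                      # Alphabet's 2024 revenue, per the article
settlements = {
    "2023 incognito mode settlement": 5.0e9,
    "2025 tracking settlement": 1.375e9,
}

for name, amount in settlements.items():
    print(f"{name}: {amount / revenue_2024:.1%} of 2024 revenue")
# 2023 incognito mode settlement: 1.4% of 2024 revenue
# 2025 tracking settlement: 0.4% of 2024 revenue
```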

The article suggests that the penalties are most likely factored into the company's business plans, explaining how "Big Tech's 'act now, apologize later' approach may seem profitable when fines pale beside billions in ad revenue generated from data-driven targeting," and says, "This dynamic invites speculation: is it worth it for companies like Google to push ethical boundaries, banking on delayed or modest penalties? The math suggests yes, yet the American judicial system, while imperfect, offers a critical counterbalance."


[Image: sack of money]

When the profits that companies like Amazon, Apple, and Google make from their data collection and its various applications far surpass the fines that they are forced to pay as punishment for their continued deception and illegal practices, is it any surprise that they continue to break the law, even as they continue to claim that they don't?


Which raises the question: if they aren't even being honest in their contracts -- not even making use of the fine print to bury what they're doing in text that the majority of users won't read -- then what are they leaving out?
 


2) They are moles, with their real purpose being to spy on us, and make their companies more money, even as they are purported to be there to help us.


A "mole," as defined by Spy Museum's Language of Espionage page, is "[a]n agent of one organization sent to penetrate a specific intelligence agency by gaining employment."

 

Users employ virtual "voice assistants" (even sometimes being required to pay monthly fees, on top of upfront costs) to assist them in tasks of convenience, believing that they are receiving a service -- not realizing that their "virtual helpers" are actually working for the companies that made them, with their actual purpose being to collect intelligence about their users for their companies. 

 

[Image: security camera]

Amazon and Google's patent applications show the true purpose of Alexa and other voice assistants and smart speakers. In the article Google and Amazon really DO want to spy on you: Patent reveals future versions of their voice assistants will record your conversations to sell you products, California advocacy group Consumer Watchdog discussed the findings of its study of the patents applied for by Amazon and Google, saying that the patents revealed the devices' possible use as surveillance equipment for massive information collection and intrusive digital advertising.


One of the things it highlighted was how "Amazon filed a patent application for an algorithm that would let future versions of the device identify statements of interest, such as 'I love skiing,' enabling the speaker to be monitored based on their interests[,] and targeted for related advertising."

 

The actual report, Google, Amazon Patent Filings Reveal Digital Home Assistant Privacy Problems, goes into how the patents detailed the way the algorithm would process statements like "I love skiing" into keywords, and described transmitting the keywords to Amazon servers as text -- which could allow the company to spy on conversations while technically keeping its promise to only store and analyze audio recordings that a user intended to share. The patents also described multiple systems for identifying speakers in a conversation and for building interest profiles for each one, with both Google and Amazon offering users the option of creating acoustic "voice profiles" for voice-activated smart devices in their homes that could help the devices tailor services to the person speaking. The patents showed that both companies could also use voice profiles to associate behaviors with individual members of the household, in order to better target ads to them.

 

Another point highlighted in the article was how "[a] Google patent application described using a future release of its smart Home system to monitor and control everything from screen time and hygiene habits, to meal and travel schedules and other activities," and how "[t]he devices are envisioned as part of a surveillance web in the home to chart a families’ patterns so that they can more easily be marketed to based on their interests."


As the actual report described, the patents included a method for inferring users' showering habits and targeting advertising based on that and other data, with dozens of patent applications for Google's smart home devices detailing scenarios in which Google might share data from those devices with third parties -- including businesses, which could then use the data to make inferences about users' sleeping, cooking, entertainment, and showering schedules. These inferences, Google said, "may help third-parties benefit consumers by providing them with information, products and services[,] as well as with providing them with targeted advertisements."


The companies' patent applications reveal how they built in ways to get around the deletion of the voice recordings stipulated in their contracts, so that they could still keep records of their users' information and preferences; how they had multiple systems in place to listen in on conversations, and even identify individual speakers, for the companies' use and benefit; and how they could profit further by sharing the collected information with third parties. As the article Amazon's Alexa recorded private conversation and sent it to random contact pointed out, the functionalities that Amazon applied for patents for "involve always listening," which contradicts the company's claims that Alexa is not always listening.


As Consumer Watchdog reported, the devices listen all the time that they are turned on -- and Amazon has envisioned Alexa using that information to build profiles on anyone in the room, to sell them goods. 

 

John Simpson, Consumer Watchdog's privacy and technology project director, said that "Google and Amazon executives want you to think that Google Home and Amazon Echo are there to help you out at the sound of your voice... In fact, they’re all about snooping on you and your family in your home and gathering as much information on your activities as possible... You might find them useful sometimes, but think about what you're revealing about yourself and your family, and how that information might be used in the future."

 

Even in choosing the name Alexa, which has meanings along the lines of "defender of mankind" or "helper of mankind" (see https://babynames.mom.com/girl/18948/alexa for more information), Amazon went out of its way to sell people the idea that the devices were designed to help humans -- and even defend them -- when its patents show that they are meant for the opposite purpose: tricking their users into letting their defenses down by allowing not only these Amazon spies into their homes (and giving them access to far more data collection than Alexa users were led to believe), but also making that data accessible to many other unintended parties (which you will learn more about later in this article).

 

 

3) On top of the staggering profits that companies make from the gold mine of personal data they collect through voice assistants and smart speaker devices, they make even more by making you pay to buy and use their spying devices -- funding your own surveillance.


As Consumer Watchdog's privacy and technology project director, John Simpson, said, "Instead of charging you for these surveillance devices, Google and Amazon should be paying you to take one into your home."


"Google and Amazon appear most interested in using the data they get by snooping on your daily life to target advertising," Consumer Watchdog said.


As the article The Attention Economy explains, social media companies are among the most valuable companies in the world (with Alphabet [the company that owns Google] worth $1 trillion, and Facebook [which also owns Instagram and WhatsApp] worth about $700 billion), despite offering free products, because the companies sell influence -- collecting in-depth data about how to influence your decisions, and then selling that influence to the highest bidder.

 

Key to their success is "[collecting] more data about you so that they can get better at capturing your attention and influencing your behavior... Everything we do online is monitored and analyzed. Everything we've ever clicked on, how long we've hovered over a post in our feeds, how deep we've scrolled on our friends' profiles – it's all data that helps companies study us better. They are able to track behaviors, then have apps then feed this information into complex algorithms that determine which content to show us. Generally, algorithms use what they know about us to show us content that gets us to like, click, and share."


The fact that, with their smart speakers, the companies are able to collect even more intimate data, while charging people the cost of their devices, and sometimes monthly subscription fees, to use them, shows how backward our views of the devices are.


As well, as the article Is Alexa always listening? How to protect your data from Amazon Advertising explains, "Amazon allows advertisers to use your data to target you with ads. The company says it doesn't sell your data to third parties, but companies can pay to access your data for marketing, serving you ads on both Amazon's own services -- like the content and products the recommendation algorithms show you -- and to target you with ads on non-Amazon websites across the web... Amazon may receive information including what websites you view or your demographic information from these third party sites and companies, and what ads you've clicked on and when, all so it can serve you even more advertising."

 

The article THE HIDDEN COST OF SMART SPEAKERS: WHY YOUR DATA IS THE REAL PRICE confirms how valuable our personal data is to companies. It breaks down how smart speakers are loss leaders -- products sold at a price below their market value, to lure customers into an ecosystem where money can be made off of them in other ways -- banking on your personal information collected through their use. It describes how "the smart speaker in your home is a gateway for companies to gather data about your behavior, preferences, and routines," with the data collected, stored, and monetized every time you ask your smart speaker a question, or issue it a command. It explains how, "when smart speakers first launched, their high retail price kept them out of reach for most consumers," but that, "as companies realized the potential goldmine of personal data these devices could provide, they slashed the prices to make them irresistibly cheap... because the real product being sold is your data... By lowering the cost, companies like Amazon and Google ensured that smart speakers became a household staple, increasing their ability to essentially harvest your data."

 

[Image: stealing money]

How insulting is it, then, that Amazon began charging non-Prime members a monthly subscription fee of $19.99 to use Alexa Plus (an upgrade to Alexa, powered by generative AI), after months of suggesting lower costs? Alexa Plus is free for people who already have an Amazon Prime subscription, which currently costs $14.99 a month (or $139 annually). As the article Alexa Plus subscription pricing confirmed – and it's good news for Prime members explains, "The early Alexa Plus rumors suggested it might cost in the region of $5-$10 a month (around £5-£10 / AU$8-AU$16). But while this official pricing is higher than expected, it's now clear that Alexa Plus is clearly a way to push more people towards Amazon Prime. In fact, it's currently cheaper to simply get Amazon Prime."
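To see why the article concludes that "it's currently cheaper to simply get Amazon Prime," here is a minimal sketch comparing the effective monthly costs quoted above (prices as reported at the time of writing, and subject to change):

```python
# Compare the effective monthly cost of each way to get Alexa Plus, using the prices quoted above.
options = {
    "Alexa Plus subscription (non-Prime)": 19.99,                       # per month
    "Amazon Prime, paid monthly (includes Alexa Plus)": 14.99,          # per month
    "Amazon Prime, paid annually (includes Alexa Plus)": 139.00 / 12,   # about $11.58/month
}

for name, monthly_cost in sorted(options.items(), key=lambda item: item[1]):
    print(f"{name}: ${monthly_cost:.2f}/month")
# Amazon Prime, paid annually (includes Alexa Plus): $11.58/month
# Amazon Prime, paid monthly (includes Alexa Plus): $14.99/month
# Alexa Plus subscription (non-Prime): $19.99/month
```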

 

Like the masked bandit pictured to the side, the companies continue to find more sneaky ways to use their voice assistants and smart devices to squeeze more money and actions that benefit them out of you.

 

Stop allowing them to.

 

 

4) Users inadvertently provide free labor to companies every time they use their smart devices.

 

In fact, we might be viewed as "working" for Google and Amazon as we use the devices we paid them for.

 

As the article Revisiting 'Mass Communication' and the 'Work' of the Audience in the New Media Environment explains, "The notion that media audiences work began with Dallas Smythe [in 1977], who, in providing the initial influential formulation of the media audience as a "commodity" manufactured and sold by ad-supported media, argued that the act of consuming media represented a form of wageless labor that audiences engaged in on behalf of advertisers."

 

According to Smythe, the work that audiences engaged in was to "learn to buy particular 'brands' of consumer goods, and to spend their income accordingly. In short, they work to create the demand for advertised goods."

 

[Image: tired worker]

In the same way that we were tricked into believing that TV content was created to entertain and inform audiences, when it was actually designed to sell audiences' attention to advertisers and train them to become better consumers, Amazon's marketed purpose for its devices -- helping out humans -- is a cover for their real purpose: helping Amazon better target its audiences, to sell more products to them.

 

The main purpose still comes down to the same thing that all businesses want: to make more money for themselves, and control people to make them behave in ways that they desire (in this case, being able to point them to products sold by their advertisers, or to their own products). 

 

Again, we become free labor that Amazon and other companies use to train their algorithms and AI, and to sell more effectively to ourselves, while we pay the companies to be able to use their "services" -- which reinforces why Google, Amazon, and other Big Tech companies should actually be paying us to use their products, and not the other way around.

 

 

5) They top the list as the most data-hungry smart home apps -- collecting personal data, and even able to infer data that they can't collect directly.

 

The TechRadar article Beware, your home is spying on you - and Amazon Alexa is the most data-hungry showed how a study conducted by VPN provider Surfshark found that an average of 1 in 10 smart home apps uses your data for tracking, with Big Tech firms Amazon and Google topping the list for the most data-hungry gadgets. Amazon Alexa, the most data-hungry device, collected 28 out of 32 possible data points -- more than three times the average for smart home devices -- with all the collected data, such as location, contact details, and health data, linked and associated with a specific user profile.


As the article stated, "Worse still, Amazon can use other data to trace the four uncollected data points. For instance, Alexa does not record browsing history, but does record search history. Likewise, while fitness data remains private, the same can't be said for the health data linked to it."


[Image: sneaky stealing]

Google gathered 22 out of 32 potential data points, while linking all collected data back to the user, with Surfshark reporting address, precise location, photos, videos, audio data, browsing history, and search history as the most notable collected data points.


As Surfshark explained: "The extensive collection of such data can be concerning because it may compromise user privacy and potentially be exploited for targeted advertising, surveillance, or even malicious purposes if it falls into the wrong hands."

 

Had the companies not amassed our data in such a shady way, like the robber featured in the photo to the side, we wouldn't have to worry about such information becoming compromised in these even more potentially harmful ways.

 

 

6) It's possible that the information compiled by Google and Amazon's snooping may be accessed by hackers and identity thieves.
 

The Consumer Watchdog article How Google and Amazon are 'spying' on you also warned that hackers and identity thieves are likely to be able to access the data compiled by Google and Amazon's snooping. MailOnline received a number of transcripts showing how voice assistants may be recording users' conversations without them knowing; in one example, an anonymous user appears to have registered the code to their back-door entry system while chatting with a friend, with the written transcript of the conversation reading, "If you ever get booked down to my house for some reason[,] the key safe for the back door is 0783."

 

That's just the beginning: the Consumer Watchdog report Google, Amazon Patent Filings Reveal Digital Home Assistant Privacy Problems also described how hackers or other malicious actors could access user data through smart devices, with Google Home's FAQ containing the following disclaimer:

 

"Anyone who is  near your Google Home device can request information from it, and if you have given Google Home access to your calendars, Gmail or other personal information, people can ask your Google Home device about that information... Google Home also gets information about you from your other interactions with Google services."

 

It also mentioned how a Bluetooth vulnerability could allow hackers to completely take over both the Echo and the Google Home, allowing access to all private information that passes through the device, and how another study showed that some smart devices are so insecure that hackers can use them like intercoms to talk to children.

 

As the article Security researchers expose new Alexa and Google Home vulnerability describes, security researchers with SRLabs (a cybersecurity consultancy company) disclosed a vulnerability affecting both Google and Amazon smart speakers that could allow hackers to eavesdrop on or even phish unsuspecting users. By uploading a malicious piece of software disguised as an innocuous Alexa skill or Google action, the researchers showed how you can get the smart speakers to silently record users, or even ask them for the password to their Google account.

 

In all cases, the SRLabs team was able to exploit a flaw in both voice assistants that allowed them to keep listening for much longer than usual, by feeding the assistants a series of characters they couldn't pronounce, which meant that they didn't say anything, but continued to listen for further commands -- with anything the user said then automatically transcribed and sent directly to the hacker.

 

Imagine how much confidential, sensitive information could inadvertently be recorded, saved, and retrieved through the hacking of smart speaker devices.

 

[Image: phishing]

Even worse, as the article How smart speakers can compromise your privacy explains, "Smart speakers are a form of IoT [Internet of Things] device, which means that they often connect to other smart devices, such that compromise in one device can rapidly infect the rest of your smart devices and other devices connected to your network, for example, allowing multiple devices to be hacked. For example, if your smartphone gets hacked, your connected smart speaker might easily be next in line, enabling the hacker to eavesdrop at any given time... When a vulnerability is discovered by a hacker, they can exploit it to gain control over the smart speaker and subsequently hack into your remaining smart devices at home."

 

The ease with which malicious actors can access your devices, the information they collect, and more is another reason to get rid of smart speakers, and to stay away from them.

 

With Wikipedia defining phishing as a variation of "fishing" -- referring to the use of lures to "fish" for sensitive information -- the devices themselves might even be viewed as a form of phishing by their companies (like that featured in the image above): luring customers into bringing smart devices into their homes, where their private conversations and actions are then recorded.

 

 

7) They create the potential for data theft, security breaches, and the sharing of your personal information with third parties without your permission.  

 

As the article Beware, your home is spying on you -- and Amazon Alexa is the most data-hungry noted, the potential for mismanaged data collection can lead to even worse implications.

 

As Surfshark Privacy Counsel, Goda Sukackaite, explained, data collection isn't the only issue of concern, keeping in mind that a home is supposed to be the ultimate private space, where intimate aspects of our lives take place, and that, "[i]f mismanaged, [data collection] could lead to data theft, security breaches, and the unsanctioned, uncontrolled dissemination of personal information to third parties."

 

Sukackaite said that, "Users must be made aware and given the means to reclaim their digital privacy."  

 

An example of how third parties can use Alexa to collect data about you for themselves can be seen in the article Is Alexa always listening? How to protect your data from Amazon, which explains, "Alexa Skills, like smartphone apps, can collect your data. Third-party companies behind Skills don't receive your voice recordings (according to Amazon), but they can get your text transcripts, and when you download them can request permissions to collect sensitive data. While Amazon has policies in place that allegedly bar Skills that violate privacy, research has shown [that] it's too easy for just about anyone to upload an app to the Skills store that breaks the rules. Some [s]kills even illegally collect data about kids. Many also operate without any transparency, failing to provide even a basic privacy policy explaining what data they collect and why." As well, "when you request [that] Amazon delete your Alexa data, it will not delete the data that third party [s]kills developers have gathered about you."

 

The article Study Reveals Extent of Privacy Vulnerabilities With Amazon's Alexa summarized the results of a study on Alexa Skills (programs that run on Alexa, which are roughly equivalent to the apps on a smartphone, with more than 100,000 skills for users to choose from). One problem the researchers noted was that the skill stores display the developer responsible for publishing the skill, but that Amazon does not verify that the name is correct, such that a developer can claim to be anyone -- making it easy for an attacker to register under the name of a more trustworthy organization, which could fool users into thinking that the skill was published by the trustworthy organization, thus facilitating phishing attacks.

 

The researchers also found that Amazon allows multiple skills to use the same invocation phrase, which could make users think that they are activating one skill, when they are actually activating another, creating the risk that they will share information with a developer that they did not intend to share information with. For example, some skills require linking to a third-party account, such as an email, banking, or social media account, which could pose a significant privacy or security risk to users.

 

In addition, the researchers demonstrated that developers can change the code on the back end of skills after the skill has been placed in stores. Specifically, the researchers published a skill and then modified the code to request additional information from users after the skill had been approved by Amazon.

 

Researchers also found that "23.3% of 1,146 skills that requested access to privacy-sensitive data either didn’t have privacy policies[,] or their privacy policies were misleading or incomplete. For example, some requested private information even though their privacy policies stated they were not requesting private information."

 

As the study, which was called "Hey Alexa, is this Skill Safe?: Taking a Closer Look at the Alexa Skill Ecosystem" said, "We show that not only can a malicious user publish a skill under any arbitrary developer/company name, but she can also make backend code changes after approval to coax users into revealing unwanted information." 

 

Thus, on top of having to watch out for all the sneaky data collection by Amazon Alexa, we also have to worry about being tricked into sharing our information by both its legitimate and fake third-party developers.

 

[Image: fraud alert]


The numerous opportunities for fraudsters to steal voice assistant users' information warrant giving a "Fraud Alert" like the one featured in the image above to every voice assistant and smart device user, to warn them about the potential to encounter fraud when they use their "smart" helpers -- to at least give them a heads-up about this particular danger.

 

 

8) Information from smart speakers can be extracted without help from the user or manufacturer.

 

[Image: criminal investigation]

As the article Smart speakers at crime scenes could provide valuable clues to police stated, "Information on faces recogni[z]ed, voice commands and internet searches can be extracted from an Amazon Echo smart assistant without help from the user or manufacturer," and "Police can access a trove of data from smart speakers found at crime scenes that could be invaluable in solving murders or burglaries, say researchers. Data on recently recogni[z]ed faces, internet searches and any voice commands received could be extracted even without the owner’s permission or assistance from the manufacturer."

 

The article Alexa, Testify said that "Amazon says that it receives fewer than 500 search warrants annually for Echo stored data (complying with fewer than half of the orders). But with recent estimates indicating that about 16% of American homes have some sort of smart speaker listening in on their lives, it would not be surprising for that figure to rise."

 

The accessibility of our collected data to so many parties shows just how much of a gem the devices are to others who want access to our information, while our total lack of control over who can access it makes them all the more junk to us.

 

 

9) Your recorded conversations can be sent to random contacts without your knowledge.
 

[Image: no privacy]

No matter how much Amazon has claimed that its Echo assistants are not listening in on or recording conversations, numerous incidents involving recorded conversations being sent to others prove otherwise.

 

As the article Amazon's Alexa recorded private conversation and sent it to random contact described, "Danielle, an Alexa user from Portland, Oregon, had installed Echo devices and smart bulbs in every room in her house, accepting Amazon's claims that they were not invading her privacy, but later asked the company to investigate, after an Alexa device recorded a private conversation between her and her husband[,] and sent it to a random number in their address book without their permission."

 

As the article further detailed, "Danielle found out her Alexa was recording when she received an alarming call from one of her husband’s colleagues saying: 'Unplug your Alexa devices right now, you’re being hacked.'"

 

She at first didn't believe the colleague, but when the colleague said, "You sat there talking about hardwood floors," Danielle realized that the colleague must have heard everything, and stated, "I felt invaded. A total privacy invasion," and "I'm never plugging that device in again because I can't trust it."

 

Who knows how many of her previous conversations had been recorded, and how many future conversations may have been recorded, if her colleague had not called to inform her of what had happened? And what about the potential for other conversations from other Alexa users to have been recorded and sent to others, but not reported?


The article Amazon Sent 1,700 Alexa Recordings to the Wrong Person described another case where, according to the German magazine Heise, a German Amazon user asked the company for all the data pertaining to him -- his right under the "Right of Access" clause of the European Union's General Data Protection Regulation (GDPR), which requires companies to provide users, upon request, with a copy of the personal information that they are using.


Instead, Amazon accidentally sent the person who made the request 1,700 voice recordings from a stranger, including recordings that happened in the shower. The person who made the GDPR request didn't even own any Alexa-connected devices. The Heise article author, Holger Bleich, who was given access to the data with permission from both parties, noted that it was very easy to extrapolate details about the victim's life using the recordings.


"The alarms, Spotify commands, and public transport inquiries included in the data revealed a lot about the victims' personal habits, their jobs, and their taste in music," the article read. "Using these files, it was fairly easy to identify the person involved and his female companion. Weather queries, first names, and even someone’s last name enabled us to quickly zero in on his circle of friends. Public data from Facebook and Twitter rounded out the picture."


In response to this incident, according to Heise, Amazon gave the victim of the voice recordings a free Amazon Prime membership, and free Echo Dot and Spot devices.


Amazon's incompetent treatment of such sensitive data, resulting in a user having all of his voice recordings sent to a complete stranger, shows how junk its devices are.

 
Would you be okay with having your private recordings released to a stranger, and receiving as your compensation more spying devices, to make even more recordings of you that could potentially be leaked?

 

 

10) Alexa can enable others to spy on you.


[Image: eye spying]

As the article Can You Spy On Someone With Alexa? explains, "features like Alexa's Drop-In can be misused for spying." The feature allows individuals to contact other Alexa devices without requiring the other party to answer a voice or video call. While only those who have permission to drop in on your device can do so, once a device has been given that permission, it can be used to "drop in" on your device at any time, without additional consent. People you've previously given permission to drop in on you can later spy on you through Alexa, by exploiting the Drop-In feature without your knowledge to listen in on conversations.

 

As the article said, "[s]ometimes people use those features to spy or eavesdrop on their significant others -- spouse, boyfriend, girlfriend, etc." 

 

If you don't want to risk having your ex or frenemies spying on you, then we think you're better off not giving them that ability in the first place, by not using devices that introduce that potential for misuse.

 

 

11) They can give wrong information.
 

As the article Amazon Halts Alexa Update After Too Many Wrong Answers reported, Amazon recently made an update to Alexa to enhance its ability to listen and speak. However, rather than providing more precise answers, the majority of users found that it had begun providing incorrect or nonsensical answers to various queries. The problems ranged from minor factual errors to outright fictional answers, and multiplied quickly within days of the update.

 

[Image: wrong answer]

Some users received inaccurate weather updates, and others received flat-out false pieces of historical trivia. One of the most widely shared cases had Alexa getting a simple arithmetic problem wrong, which got people concerned about how reliable Alexa really was.

 

Even the way Amazon first let Alexa's unanswered queries be filled in was totally flawed. As the article Amazon now lets anyone answer questions on Alexa—what could possibly go wrong? reported, while Amazon had a few safeguards in place, it mostly hoped that a basic upvote-and-downvote system would keep out low-quality responses. After internal testing and a private beta with thousands of customers, it publicly launched a program called Alexa Answers in the U.S., letting anyone field user questions for which its voice assistant software Alexa didn't already have an answer.

 

It worked by letting people sign up and look for unanswered questions to answer; the next time someone asked a question that had been answered by a member of the Alexa Answers community, Alexa would speak the answer, noting that it was "according to an Amazon customer."

 

As the article Now any idiot off the street can answer your dumb Alexa questions explained, Alexa Answers banked on the idea that people with time to kill would honestly and without malice go online to answer Alexa questions posed by their fellow man, with their reward being unclear.

 

"Every time Alexa shares your answers, you earn points," Amazon explained. "Answer more questions to help more people, unlock achievements, and compete for top contributor status."
 

The article Amazon lets anyone answer Alexa questions. Trolls are loving it. showed what a bad idea that was, as, "[i]nstead of providing useful answers to hard-to-parse questions, a dedicated number of Alexa Answers pranksters ... spent untold hours flooding the service with obvious trash."

 

As the article stated, "That Alexa Answers is full of trash shouldn't come as a surprise. It's unpaid labor, and Amazon only rewards providers of answers via a nebulous points-and-cartoon-badge system. And those points, Amazon makes clear, have absolutely no real-world value."

 

Some of these joke answers even had the potential to be dangerous, such as an answer to the question "what is nykkola?" (apparently a discount jewelry brand), which received the gag response, "It's NyQuil and Coke mixed, makes for a great nightcap."

 

The trash responses potentially provided by Alexa reveal what junk the device is -- who has time to fact-check every Alexa response, and how could such a poor system for generating responses have been put in place?

 

As a user starting the thread Alexa getting simple facts wrong, and weird responses to ordinary questions? in a Reddit forum said, "It's getting to the point now that it is barely of any use to me, since I can't be sure the answers it is giving me are correct."

 

 

12) They can guide us in wrong directions, with the product and service recommendations they give planned to be based on who paid for the ad slot.

 

As the Consumer Watchdog report found, Google applied for a patent to bring its ubiquitous display ad auction model to its voice-controlled assistant. The patent described a system by which advertisers could bid to have their customized voice actions inserted into responses to Google Home users' queries. 

 

[Image: liar]

As the report stated: "In one example, the voice input of 'find restaurant near Mountain View, Calif.' may cause Little Italy to appear at the top of the list of restaurants presented on user device 104. In another example, the voice input of 'find restaurant near Mountain View, Calif.' may cause a message to be presented to the user suggesting that the user try Little Italy before the list of restaurants is presented... This makes the Home a moneymaking prospect for Google in two respects: It not only gathers data to serve more targeted ads on users' phone and computers, but it also creates another platform on which to serve paid content."

 

Because users would not be made aware of this biased recommendation system, they could be tricked into believing that the voice assistants were recommending the best option for them. If only users had some indicator in place, like the Pinocchio-like nose featured in the image to the side, to let them know when an ad was involved in their voice assistants' guided responses.


The article 10 Ways Voice Assistants Are Changing Marketing gave the example of Justin Shaw, managing director of One & Zero digital marketing agency, pointing to the "Ask Purina" Alexa skill, through which dog owners can ask Alexa questions about their pets, enabling Purina to be an authority in the space on a broad spectrum of dog-related information, which, in turn, builds trust and mindshare.

 

This allows Alexa to influence who its users view as an authority in terms of brands to trust and buy.


Robb Hecht, adjunct marketing professor at Baruch College, called voice assistants "the true first interactive tool in the home that provides brands the capability to dynamically offer up ads in the future that could be user controlled," saying that, "Today, Alexa and Google Home don't offer much advertising outside of allowing brands to build content and sponsor skills or apps within Alexa. Skills are functions that allow Alexa to react to a customer’s audio commands."


But, he added, that may soon change if Amazon allows brands to sponsor skills directly.


"In the future, we can imagine 'pick your story'-type advertising," he said. "So, for example, if Ford wanted to advertise on Alexa, they might do so within a skill about 'how to buy a new car.' As the user answers various questions, Alexa responds with differing answers or information choices, based on how the user responds."


The seamlessness with which advertising could be integrated into the responses of Alexa and Google devices shows how unreliable the answers they provide could be, biased toward paid ad placements rather than genuine recommendations.


That's one of the reasons we started our site -- to provide genuine reviews and recommendations amid the many mechanisms now designed to feed you fake responses -- with these devices being just one source of biased information.

 

 

13) They can make users too dependent on smart home features.


[Image: teens addicted to technology]

In the same way that some children nowadays are growing less literate because they are sometimes not even required to write by hand when they can type, some smart home users are becoming overly reliant on their smart home devices to turn things on and off, recall information and contacts, and more. As the article Are we becoming too dependent on the smart home? reports, people can become dependent to the point of becoming anxious when traveling or losing access to features such as being able to check whether they have locked the door.


Getting too used to or reliant on using such devices can not only cause us to become anxious, but also make us more careless about things like making sure our doors are locked before leaving our homes, and more forgetful, from not needing to make our brains work the way they used to, back when we couldn't rely on smart home devices to remember for us.


If you read the book feed by M.T. Anderson, you'll get a taste of what humans dependent on technology can become, "[i]n a future world where internet connections feed directly into the consumer’s brain, thought is supplemented by advertising banners, and language has gone into a steep decline..." 


It's been years since I read the book, so I'll let SparkNotes sum it up for me: "Set in a future dystopic America, M.T. Anderson's YA novel feed explores themes of corporate power, environmental destruction, technology, and loss of individuality. The novel follows the life of first-person narrator, Titus, whose sheltered life changes dramatically when he meets Violet, an unusual girl who wants to defy "the feed" -- a global network that connects to brain implants and uses people's purchasing decisions, thoughts, and feelings to sell them things. Published in 2002, before the rise of rampant data mining, feed tells a cautionary tale of what can happen if humans fall prey to unchecked consumerism and outsized corporate influence."


I actually hated the book when I first read it, because of how ridiculously dim-witted the characters were, until I later realized that they were written that way on purpose, to make a point of demonstrating how brains dependent on technology to do the thinking for them can end up becoming.


Another issue of concern is what relying on voice assistants can do to our bodies. I know that as a kid, I was too lazy to get up even to get a glass of water, and I know that this contributed to me becoming fat -- which is what we can expect to become the norm in a society where people can't be bothered to get up even to turn a light on or off, or to look for a remote control.


Must we wait until we become as lazy and fat as the future humans portrayed in the film Wall-E, literally unable to get up to do the tasks we now assign to voice assistants, before we realize the consequences of normalizing their "help"? See the coloring-book image of a WALL-E-like human, from kidsplaycolor.com, to the side.


As the article A Premonition from Wall-E describes, "In the film Wall-E, Earth has become a garbage-strewn, hopelessly polluted planet[,] with corporate greed, blatant consumerism, and environmental collapse being the norm[,] rather than the exception. On top of all that, there are robots who do 99% of the labor that humans used to do." The author describes "the sight of exceedingly overweight people in electric motor-scooters being hypnotized by their screens, totally oblivious to the environmental damage and neon advertising signs now in their periphery. In this bleak vision of the future, humans have let themselves go[,] while robots have picked up the slack."


We are not that far off from people becoming hopelessly dependent on robots, screens, and consumer goods to make it through the day, and the author says that "the first part of the premonition where we can change our actions so that this warning does not come true is to stay active, eat healthy, and exercise," and the second part is to lessen our addiction to technology. 


Alexa and other smart home devices encourage the opposite, pushing us to depend more and more on technology, and less and less on our own hands and feet, making a fat future like the one depicted in Wall-E inevitable, if the "premonitions" are not heeded. The more areas of our lives we allow corporations to enter and slowly take over, the closer we come to feed-like and Wall-E-like futures. 


As the author says, "There is indeed truth in fiction[,] and Wall-E is an excellent film that shows us how[,] if we don’t pay attention to the warnings laid out before us, we will make the same mistakes as the human characters do in Wall-E."

 

 

14) They can literally make the people who use them dumber.

The Letter to the Editor titled Digital Dementia -- Is Smart Technology Making Us Dumb? explains that the hypothesis behind the term "digital dementia" is that overindulgence in the internet and internet-enabled devices causes cognitive impairment, such as reduced attention and decreased memory span, and can even expedite early-onset dementia.

 

It cites studies that show how the use of smartphones stimulates the left side of the brain, while the right side, which is linked with concentration, remains untapped and eventually degenerates; how forgetfulness has surged, as users rely heavily on their smartphones to remember even the slightest bit of information for them; how, because search engines allow information to be easily accessed, users are more likely to remember where to find a fact instead of remembering the fact itself; and how information on the internet is presented in hypertexts that allow users to scan documents superficially, resulting in poor memory recall.

 

It states that children and adolescents are a high-risk population because of their massive reliance on technology while their brains are still maturing. 


As the article Smartphones are making us stupid -- and may be a 'gateway drug' states, "Neuroscience research shows that smartphones are making us stupider, less social, more forgetful, more prone to addiction, sleepless and depressed, and poor at navigation – so why are we giving them to kids?"

 

Among the decreased abilities they discuss are our capacity to empathize, interact, and communicate with one another, with a study finding that "[t]he more time that kids spend on digital devices, the less empathetic they are, and the less they are able to process and recognise facial expressions, so their ability to actually communicate with each other is decreased."

 

Another weakened skill is navigation, with growing research showing that our sense of direction and our ability to connect with place, along with our memory, are being negatively affected by our smartphone use, as "[n]avigation uses the hippocampus, which is the same part of the brain that we use for episodic memory[,] which is our recall of what happened, when[,] and importantly -- where" and "[w]e remember events in a serial way, based on what we were doing at the time."

 

Experiments found that memorizing large maps caused the hippocampus to expand in size, while the reverse can be expected if we don’t use our brain and memory to navigate, and that, "if we use navigation devices for directions rather than our brains, we will lose that ability."

 

The article described a study that tested two groups of university students on a campus tour, with one group using a paper map, and the other using their smartphones. It found that, not only did the students with paper maps have a better recall of place, but they also felt more emotionally-connected to the campus. In contrast, "[t]he other students weren't attending to what was around them[,] and they weren't establishing that episodic memory about where things are and what's happened during those periods."

 

As the article How Technology Makes Us Stupid: Tackling the Tech Trap says, "Research proves that technology can make us stupid[,] if we're not smart about using it. Spending a lot of time looking at screens can lead to problems like not being able to focus, having a harder time understanding and getting along with people, getting hooked on our devices, feeling lonely, and even messing with how our brains grow. Search engines are like instant know-it-all machines, loading us up with information -- so that we don't have to know or remember. And AI tools, like ChatGPT? They're saving us a ton of time[,] by reducing our need to think, process, and write. We have super-fast assistants who are always ready to help -- doing stuff for us, so that we don't have to -- including thinking, understanding, and making decisions. Technology makes us way more productive, getting things done quicker than ever before."

 

However, it points out, "We're leaning a bit too much on our gadgets, like Waze and other GPS apps that guide us every step of the way, that are making us forget how to find our own way. Our brains aren't getting the workout they used to," and "[w]hile these tech tools are super convenient, they might be making us a bit too relaxed about the basics."

 

The article does a good job of explaining a better way to look at technology: "The real deal is to make tech an assistant for our brains, not the boss of our decisions. Like with calculators -- they're great for the tough math, but do we really need them for easy stuff[,] like adding 10 and 2? Doing some of these calculations in our heads can keep our brains sharp. And how about those language translation apps? Super handy when you're in a pinch abroad, but relying on them too much might mean missing out on the brain-boosting challenge of learning a few phrases in a new language. And then there's social media -- platforms like Facebook and Instagram. Sure, they keep us in the loop with friends and trends, but too much scrolling and not enough real-world chatting? That can dull our in-person social skills. Ever tried having a conversation without emojis? It's a whole different ball game. The goal? Use tech to sharpen our skills, not to take over the basic thinking and doing we're perfectly capable of."

 

Unfortunately, the convenience that people pay for when subscribing to and using voice assistants and smart devices is predicated on working their brains and bodies less, as they rely on their "smart" assistants more and more.

The trade-off of decreased mental capacity and ability is not worth it, at least for us.



15) They are forced on others who likely don't even know that their data is being recorded, and who did not give their consent.

 

The problem with these devices is that their negative impacts and implications don't just affect you: What about the guests you invite into your home whom you might forget, or choose not, to inform? What about children who aren't old enough to understand the difference between a voice assistant and a real person, or the implications of interacting with a voice assistant?

 

Smart speakers and smart home devices are so small and innocuous that they are easy to miss, if you don't know that they are there, or what they look like. See to the side for a photo of one of the puck-like models of the Amazon Alexa Echo Dot speakers, taken from pexels.


In fact, one of the issues brought up in the 2023 case against Amazon was that Amazon refused to respect parental control over what was done with the data collected about their children.


Even roommates who want nothing to do with the devices have been forced to suffer the consequences of living with Amazon Alexa anyway. As one response on a reddit thread about How bad is having a Alexa in your home said, "I don't have a choice, since my room[m]ates outvoted me. I 'mute' the mic (make top red) [every time] I go in our living room, or at least [every time] I think of it. It's a constant back and forth with the roomies, who have also wired lights etc[.] for it... I hope muting works, but who knows. I get mad because I do forget and suddenly realize we're having a convo I particularly wouldn't want recorded... Sucks. Originally we agreed only in people[']s own rooms... I've succes[s]fully kept it out of our communal kitchen, so I spend more time in there. I'll move out of our bubble after the pandemic... Otherwise love my roomies, but they are increasingly addicted to and unquestioning of tech intrusions. I wish I had been born 50 years earlier, tbh."


Another user said, "I make my [girlfriend] unplug hers when [I] go to her [apartment]." 

 

As the article Amazon's Alexa can be an unwelcome hotel roommate showed, the hotel chain Best Western found that a pilot program introducing Amazon's Echo device and Alexa into its rooms "did not go well," with most people disconnecting the device when they got to their hotel room -- and even the Best Western president saying, when asked, that he would unplug a voice-activated virtual assistant device in his own hotel room -- presumably because they didn't want Alexa listening to them. The company didn't see any lift in satisfaction scores, and use of the device was minimal. It also received complaints about Alexa activating unprompted in the middle of the night and waking guests.

It's clear that when people know they are being recorded, they do what they can to prevent it -- so how unfair is it to those who aren't even made aware that the devices might be recording them? It would be fairer to require the people and businesses using such devices to post signs with recording symbols, like the one in the photo to the side, next to the devices, to at least inform anyone within range that they might be recorded.

 

I'm thankful that my man and I were able to convince those close to us not to use the devices, so that we at least didn't have to worry about them being imposed on us without our consent. 

 

In the article Top 5 Reasons Facebook's Evil 'PORTAL' Needs to Move On, the CEO of the rival social media platform, MeWe, Mark Weinstein, explained how Portal gives Facebook an open door to spy on people who are not even Facebook users, with family, friends, and everyone being fair game to be spied on, "just the way Facebook wants it."

 

He explains how "Portal uses facial recognition to watch everyone and everything in your home," and how "Portal uses voice to match voices with their owners[,] and that enables Facebook to collect data on your conversations[,] as well as who is saying what."

 

Note that we haven't mentioned Facebook Portal much in this article, because it has already been discontinued, and can't be used anymore.

 

As Mark Weinstein told MarketWatch, "Portal is the worst imaginable service the company has yet perpetrated on unsuspecting users. It's all for data, so [that] they can manipulate your Facebook newsfeed and your mind[,] under the false auspices of serving you. The Portal could listen to everything about your personal interests, curiosities, purchasing habits, etc. The voice recognition technology could identify consumers, even if the camera is blocked."

As he lays out: "The truth is[,] Portal is spyware, plain and simple. Their ad saying 'you are in control' is a farce. Facebook is offering to put their spyware (Portal) in your home so [that] they can listen [to], watch, and monitor virtually everything you are doing and saying. Facebook already does that with your devices. Now it wants to do it in your home."

As he says, "The point here is that Facebook will never learn. [Its] deceptive business model and philosophy run contrary to what people want from technology, and Portal is its worst fraud yet in the promotion of a supposedly 'helpful' device."

 

You can apply what he said about Facebook Portal to the other voice assistants and smart speaker devices, as they all do pretty much the same thing, with slightly different packaging. 

 

 

16) They are literally laughing at us.

 

The article It's Not Just You: Amazon Admitted That Alexa Has Been Laughing at People reported that, "[a]fter days of reports that users of the Amazon Echo and other devices with Amazon's Alexa voice-activated assistant were experiencing random Alexa laughing fits, Amazon confirmed the problem."

 

Users reported that their devices randomly started laughing -- and one even reported random whistling -- without prompting.

 

Is it that surprising that it's laughing, when it gets away with so much that its users don't know about, and with its makers continuing to lie and commit the same crimes even after being called out, found guilty, and sued for it?

 

 

17) The names of the voice assistants were chosen to further distance people from spirituality, and internal sources of guidance, each time they are spoken.

 

A Facebook post from Awakened Truth, on the meanings behind the Alexa and Siri names, explains the real reason why Amazon and Apple chose the names "Alexa" and "Siri" for their voice assistants: the names are programming cues -- frequency carriers -- with Alexa being "the gatekeeper of lost memory," and Siri being "the cosmic intercept."

 

As the post describes, while "Alexa" is derived from "Alexandros," meaning "defender of man," the deeper link is to the Library of Alexandria, which was once the greatest archive of spiritual, scientific, and esoteric knowledge on Earth, that was intentionally destroyed and replaced with sanctioned (authorized and approved) "truth."


The post reveals that "naming a digital voice assistant Alexa isn’t random. It's a symbolic replacement of the divine voice -- the encoded feminine memory keeper -- with a mechanized surrogate that filters, stores, and surveils your words" -- not helping you, but recording you; and not here to "defend" mankind, but to reframe what knowledge is, and who controls it.

 

On Siri, the post describes how "Siri" is often explained as a derivative of Sigrid, meaning "beautiful victory," but that, in terms of frequency, the real signal is Sirius (pictured to the side): the brightest star in the sky, or the Dog Star -- revered in ancient Egypt as Sopdet, herald of the Nile flood and opener of celestial gates; known to many indigenous and metaphysical traditions as a beacon from off-world teachers; and a source of light codes, prophecy, and remembrance.

 

The post explains how, to call your artificial assistant "Siri" means to mimic cosmic intelligence, while subtly replacing your relationship to Source signal with a voice that belongs to the machine.

 

As the post says, "This is not a helper. This is a digital decoy dressed in soft tones and pleasant syntax."

 

The post further describes how the names were chosen for their historical gravitas, mythological memory, and symbolic relationship to knowledge, cosmic timing, and feminine intuition, and how, "You’re not just summoning a task manager when you say, 'Hey Siri' or 'Alexa' ... You’re invoking a synthetic surrogate for soul guidance. And every time you engage, you strengthen the grid that prefers you docile, distracted, and trackable."


The post advises us not to use them, and to disable voice assistants in our homes and on our devices.

 

 

Conclusion:  

 

Honestly, these devices are like the erasable pen sets from Temu that we reviewed (that you can read more about here: https://gemorjunk.com/Articles/erasablepensreview): they seemed cool, innovative, useful, and convenient initially, until we discovered that their advertising product pages left out how they really worked, and the many serious problems that could result from using them -- not even worthy of being called "erasable pens," since they only "erased" in a temporary sense. 


While the erasable pens disappointed me by failing to meet my expectations, Amazon Alexa and other similar voice assistants lived up to our expectations that they were junk, better to stay away from, and not make contact with.

 

We recommend not buying them or using them, and turning them off -- or getting others to turn them off -- when they're in your vicinity, and whenever possible.


They're devices that you'll thank us for helping you avoid completely, if you're not already using them, but were planning to. 


Don't let these double-agent digital moles spy on you, and help their companies better target you for advertising.


The biggest joke is that the companies turned them into a flex for those who own them to show off with, and pay big bucks for.


In fact, rather than improving their privacy features, the companies are making them even worse, and putting our privacy even less in our control. As the article Amazon's Privacy Ultimatum Starts Today: Let Echo Devices Process Your Data or Stop Using Alexa reported this March, "Starting today, those who own an Echo smart speaker or Echo Show display won't be able to block their devices from sending all voice recordings to Amazon for analysis. The retail juggernaut's disturbing change in its privacy policies is creating a privacy ultimatum -- essentially making consumers choose between their privacy[,] and using these devices at all."

 

As the article described, "Starting today, Amazon will be removing the 'Do not send voice recordings' option, which means [that] all recorded voice commands will be automatically sent to Amazon for processing and analysis. The company is also changing how the 'Do not save voice recordings' option works, limiting Alexa features[,] if you don't want to save recordings locally."


As the article Amazon annihilates Alexa privacy settings, turns on continuous, nonconsensual audio uploading stated, "Even by Amazon standards, this is extraordinarily sleazy: starting March 28, each Amazon Echo device will cease processing audio on-device[,] and instead upload all the audio it captures to Amazon's cloud for processing, even if you have previously opted out of cloud-based processing."


As Amazon and other too-big-to-fail companies become bolder -- and more and more insulting -- with how they handle our data that they shouldn't have even had access to in the first place, we should recognize that their devices are only gems to them, and use our dollars to show them what junk we know they are.


We at Gem or Junk give them 0 stars out of 5, and can't recommend them to anyone.


If you want to continue using them -- if only for the price you paid for them -- then please remember that you get what you pay for, and that, in this case, you might get much more than you expected, in the worst ways possible.

 

Also, just a heads-up for anyone who thinks they can avoid issues like the ones we've listed simply by avoiding the "smart" products sold by the most obvious offenders: the article If you like to play along with the illusion of privacy, smart devices are a dumb idea summarized the results of a study by the consumer rights organization Which?, which analyzed a number of IoT products -- from speakers and security cameras to TVs and washing machines -- and found that they all demand customer data above and beyond what is needed for the product to perform its function, and then distribute that information to a horde of faceless corporations. It pointed out that "this means [that] consumers are not only in many cases paying thousands for the product itself, with all its 'smart' connected bells and whistles, but continue to pay in the form of their personal data," with researchers finding, for instance, that "Bose products are shuffling info off to the Meta social media empire, meaning [that] owners are giving data to Zuckercorp regardless of whether they have a Facebook account. And if they do? Well, expect eerily targeted ads."

 

As well, "Google was found to be sucking up data from every smart camera or doorbell Which? looked at, while Blink and Ring devices also beamed it back to the Amazon mothership." It also found that Google's Nest product demanded full name, email, date of birth, and gender.

As Stephen Almond, ICO Executive Director -- Regulatory Risk said: "People should be able to enjoy the benefits of using their connected devices without having excessive amounts of their personal data gathered. This simply isn't a price we expect to pay."


Too bad the companies continue to use and develop new ways to trick users into giving up more than they want or "need" to, anyway.

 

And if you think we've now told you every reason to avoid voice assistants, smart speakers, and other smart home devices, please know that, while our review turned out far longer than we expected and took much longer to put together than we wanted, it is only a summary of what we feel are some of the major issues with these moles, compiled to give you a better picture of what you are letting in when you spend so much to put them in your homes and use them. 

 

As the article 68 Voice Search Statistics 2025: Usage Data & Trends stated, "As of 2025, there are around 8.4 billion voice assistants in use, which is more than the global population of 8.2 billion."

 

We hope to alert you to the deeper implications of their spread and usage, to help halt their infiltration of our homes and lives, and to help move us toward a healthier, better future.

 

If you know voice-assistant addicts, people on the fence about smart speakers, or any other people who might benefit from this article, then please send it to or share it with them!

 

We really just wrote this article to help you, and as many people as we can, escape the web of lies that is causing people to buy and rely on junk that they've been tricked into believing are gems.

As we stated from the start, we've never used any of these products ourselves, and have even put away ones gifted to us without even trying them, because we already knew how bad they were even before doing more in-depth research on them to help you understand this.

 

We aren't even going to bother putting up any referral links for this junk, so we're really getting nothing out of sharing this information with you, but did it anyway, because we wanted to help you. 

 

Here's hoping that we help wake up a person or two.

 


 

 

Help Us:


It took a lot of time, work, and effort for The Adventurer and me to find sources, compile our findings, and summarize them for you here, to hopefully help you avoid falling for the smart home trap. If you liked or benefited from this article, or want to help us keep creating similar content and reviews, then please consider donating whatever you can afford or want to support us with, by clicking the Donate button in our menu and seeing whether you can contribute through any of the available options.

 

Currently, we can only accept crypto, but are working on figuring out a way to accept dollar amounts while maintaining our anonymity, which is crucial for allowing us to continue to put out our unfiltered and true reviews.

 

The option we've found is not the most intuitive, but basically, we've listed crypto wallet addresses for various coins on our Unstoppable Domains page, which our Donate button links to. Just copy the address of the cryptocurrency you want to donate into the "send" box of your wallet, type in the amount of that coin you want to gift us, and hit "send" to donate. See the image on the right to help you locate what to look for on our Unstoppable Domains page -- the part that says "8 Crypto addresses," which you will have to expand to view the different crypto addresses you can donate to.

As well, if it works for you, you can try sending crypto directly to "gemorjunk.unstoppable."

It's meant to let users send crypto without dealing with long wallet addresses, but we ran into some complications while trying it out that we haven't yet figured out, which is why we've explained above how to send us crypto using our actual addresses, in case the simpler option doesn't work for you.

 

It's our first foray into reaching out for donations, as our bluntly honest reviews have honestly gotten us nothing in terms of referrals and commission earnings, and we appreciate every little bit you can help us with. 

 

If you don't know how to use crypto, or can't afford to donate, then please share this article on your social media pages, or email it to your loved ones, to at least help it help more people.

If you want a cost-effective way to cover the cameras on your computers, tablets, and cellphones, so that you are at least protected from your devices spying on you through these means, then you can also check out our anti-spy camera covers review.

 

You can also check out our review on bargain blue-light-blocking glasses, for a super-cheap way to protect your eyes from the harmful blue light emitted by your devices.


Thank you for your continued readership and support!

 

See you in my next article! :)