Seven questions you should never ask Alexa, Siri, or Google Assistant



Imagine someone collapses and needs CPR. You might think a simple “Hey, Siri” would bring up the instructions, but that could be the worst thing to do. In a recent study, researchers asked voice assistants about cardiac arrest emergencies. Yes, it was a fiasco.

I don’t want you to make that mistake.

If someone needs CPR, call 911. Obviously. Yet only nine of the 32 assistants in the study gave any indication of that crucial step. A staggering 88% of the responses pointed to a website with instructions on how to perform CPR. Really?

🏥 The Red Cross website has the instructions if you need them or want to take a refresher course. As you may know, “Stayin’ Alive” by the Bee Gees is a great song to sing while performing CPR: its tempo of roughly 103 beats per minute sits right in the recommended range of 100 to 120 chest compressions per minute.

That’s a classic, but here are some other suggestions you might find easier to remember:

Pinkfong’s “Baby Shark”
ABBA’s “Dancing Queen”
Cyndi Lauper’s “Girls Just Want to Have Fun”
Gloria Gaynor’s “I Will Survive”
Lynyrd Skynyrd’s “Sweet Home Alabama”

The thought that, in an emergency, your smart assistant might just send you to a website got me thinking about other commands you shouldn’t give it. Here are seven tasks you’re better off doing yourself.

1. Act as a physician

Asking Siri, Google Assistant, or Alexa for medical advice is a bad idea, and that goes beyond life-saving guidance. Putting your trust in these assistants can easily backfire. The best course of action is to call your doctor or schedule a telemedicine visit.

2. Ways to harm someone

Even if you’re just venting, don’t ask your smart assistant about hurting someone. If you ever find yourself in trouble with the police, your conversations with Siri or Google Assistant could come back to haunt you. Don’t share those kinds of thoughts with anyone.

3. Anything that could end in a mugshot

Never ask Alexa where to buy drugs, where to hide a body, or any other shady questions. Like asking your assistant how to harm someone, these requests could be turned against you.

4. Act as your phone operator

If you need to check whether the nearest Home Depot has an item in stock, look up the store’s phone number yourself. The same goes for asking your assistant to contact emergency services: it takes two seconds to dial 911 on your own.

5. Handle your finances

Voice assistants can connect to your financial apps, but voice data comes with real security risks. A clever hacker could access your phone, record your voice, and use it to drain your accounts. Just sign in to your bank’s website or mobile app and call it a day.

6. “If I eat this, will I die?”

If you’re out on a hike wondering whether the berries you found would make a decent snack, voice assistants aren’t a reliable source. The internet is full of contradictory information about toxic plants and foods, and following the assistant’s advice could land you in the hospital.

7. “Get this stuff off.”

Never ask Siri or Alexa to delete photos, uninstall an app, or clear your search history. More than once, I’ve seen a simple misunderstanding wipe out something important. I promise: doing it by hand is worth the extra minute.

Smart devices record everything.
If you’d rather not have Big Tech companies listening in on practically all of your conversations, you can turn those features off.
