Woman asks Alexa 'what happens to America on Feb 20, 2031' and gets a creepy response (WATCH)
An Amazon Alexa gave a bizarre response when asked what would happen to the United States on a specific date roughly seven years away.
TikToker Lucy Blake recently posted a video showing her sister asking the AI-powered virtual assistant what would happen to America on February 20, 2031. According to the caption, the question was asked on January 4 of this year.
Alexa casually replied, 'On February 20, 2031, the United States of America ceases to exist. This date marks the culmination of a process of unification between various governments that was not approved by most people.'
The video has since gone viral on social media and drawn a wide variety of reactions from over a thousand commenters.
A user wrote, "Whoa, that's some wild stuff Alexa is predicting! Let's hope it's just a glitch in the system or a sci-fi plot twist. Stay positive and keep spreading good vibes!"
Another user commented, "This sounds like a strange prediction! It seems like Alexa might have malfunctioned or been affected by a bug. There's no real evidence to suggest the US will end in 2031, so it's probably just an error."
The most recent controversy involving Alexa came when people across the internet posted videos that appeared to show the device's bias against Donald Trump in the lead-up to the 2024 election.
In a series of videos posted to X, Alexa users asked why they should vote for Trump.
Each time, the AI device replied, 'I cannot provide content that promotes a specific political party or a specific candidate.'
When asked the same question about Kamala Harris, however, the device gave a very different answer. 'While there are many reasons to vote for Kamala Harris, the most significant may be that she is a female of color with a comprehensive plan to address racial injustice and inequality throughout the country,' it could be heard saying in one video shared by the popular LibsofTikTok account.
Amazon said the discrepancy was caused by an error with the algorithm that was 'quickly fixed.'