
Google's Woke AI Is Hilariously But Frighteningly Broken

by Tyler Durden
Tuesday, May 28, 2024 - 04:25 PM

Authored by Steve Watson via Modernity.news,

Google’s hastily rolled out AI Overview feature is disastrously broken, returning search results claiming that people should spread glue on pizza, eat rocks, and that it is safe for pregnant women to smoke cigarettes.

The Verge reports that Google is scrambling to manually disable the AI Overview feature for certain searches after users found it giving out some truly bizarre advice, along with information that is simply made-up nonsense.

Apparently cockroaches are so named because they live in penis holes.

Smoking is recommended when pregnant. Who would have known?

Can it really not get basic maths correct?

I’ll take extra glue on my pizza please.

Would you run off a cliff if Google’s AI told you to?

Mmmmm tasty rocks.

Google claims that the AI generally provides “high quality information” and that the bizarre responses are either the result of uncommon queries or are simply doctored.

As we previously highlighted, Google’s Gemini AI, on which the Overview feature is based, is infested with wokery.

It also clearly cannot discern between right and wrong, having declared that calling communism “evil” is “harmful and misleading” and refusing to say pedophilia is “wrong.”

Google’s AI also declared that it would not misgender Caitlyn Jenner even in order to prevent a nuclear apocalypse.

X owner Elon Musk has warned that this AI is going to be at the centre of everything on the internet soon enough, taking over Google’s search engine and YouTube.

Musk further noted that he doubts “Google’s woke bureaucratic blob” will allow it to be properly fixed.

*  *  *

Your support is crucial in helping us defeat mass censorship. Please consider donating via Locals or check out our unique merch. Follow us on X @ModernityNews.
