Thankfully there are a couple of different ways to opt out of the new Google AI service it's automatically opting you into.
Google’s revolutionising the search engine game using AI… obviously. And that means it’s giving out worse advice than you’ll find a president espousing during a global pandemic. Okay, it wasn’t suggesting you inject bleach, but just prior to its full launch as Google’s new AI Overviews feature, it was recommending users drink a couple of litres of specifically light-coloured urine.
This was during its initial Labs testing phase, where it was called the Search Generative Experience (SGE), but it was still recommending drinking urine as an effective method for quickly passing kidney stones just a week or so before Google announced it was rolling the feature out across the US.
Now called AI Overviews, the feature—which essentially seems to be replacing Featured Snippets at the top of a Google search page—is kicking off in the US first, with other territories “coming soon.”
Even apart from the dodgy AI-driven advice, we’re not really here for it. To us it really looks like Google is intent on breaking the internet it helped create, and not for the benefit of anyone apart from itself. And because it is based on current generative AI technology, with all its flaws, you are going to get more erroneous advice than you would if it were simply serving up pages with actual authority and expertise for you to go and check out yourself.
Instead, it’s doing the classic AI thing of presenting completely incorrect information as fact, with total confidence in its fuzzy thinking. As one user looking for ways to disable these overviews puts it: “I’m more than capable of misinterpreting internet articles on my own.”
Google has learned some things from its Labs phase, however: our own experts have seen it generating fewer AI Overviews than it did in testing, and it’s more readily citing its sources, too. The overview box has also gotten smaller than it was in earlier iterations, so it’s not taking up as much search page real estate.
But still, given the feature’s focus on so-called ‘how-to’ content, AI’s propensity to confidently dish out the wrong information isn’t great. And I’d expect to see more weird, hilarious, or even worrying recommendations from AI Overviews as Google attempts to sharpen up its search experience.