If you've Googled something in the last week, you may have noticed that the search results page is now topped with an "AI Overview" that purports to answer whatever question you typed. The trouble is that while the answers look confident and succinct, some can be blatantly wrong.
There have been hiccups since Google began rolling out its Gemini AI, such as when its image generator was called out in February for producing inaccurate images of historical figures like the United States' Founding Fathers, randomly inserting people of color for diversity's sake. And in early April, the Washington Post pointed out inaccuracies in an earlier version of Google's AI-powered search results. (A search for the "plot of The Big Bang Theory," for instance, resulted in a scientific explanation of the Big Bang itself.)
Some improvements may have been made since then, but results can still come out wonky. A search for "the first inaugural ball" resulted in an AI Overview saying it took place at "Dolley Madison's hotel" in Washington, DC, when in fact it was held at Long's Hotel and Dolley Madison co-hosted. (The same search returned a correct reference to "Mr. Long's Hotel" when I tried it 24 hours later.)
A search for "first television show" came back with an AI Overview that is mostly correct — however further dives into links about the history of television show that there were earlier broadcasts than the one noted in 1928.
The Washington Post noted that, three days after the ceremony, Google's AI didn't know the Academy Awards had already happened; instead it said the Oscars were coming up and provided a list of nominees. Scrolling down to the regular search results would have been more helpful.
"What we see with generative AI is that Google can do more of the searching for you," says Liz Reid, Google's newly installed head of Search, speaking to The Verge last week. "It can take a bunch of the hard work out of searching, so you can focus on the parts you want to do to get things done, or on the parts of exploring that you find exciting."
But is a web search really such "hard work"?
Not every search gets an AI Overview — basic searches for stores or restaurants, for instance, will produce normal search results.
The shift toward AI for search could be damaging in a number of ways, notably to content websites that rely on traffic for revenue, since a paragraph-long summary discourages further clicking. As Axios notes, "This system still relies on web-based information, but it doesn't nourish the creators of that information with users' visits."
It may also affect Google's own bottom line when it comes to ad revenue and sponsored results, though the company is already solving for this by inserting ads into the AI-generated results, as The Verge reported this week.
"In a world where everyone gets answers and doesn't have to click on links, the biggest loser is Google," says Perplexity AI CEO and co-founder Aravind Srinivas, per Axios.
There is also the issue of AI "hallucinations," several of which the Washington Post pointed out in Google's AI search results in early April. A search for a fictional restaurant, "Danny's Dan Dan Noodles in San Francisco," came back with a bizarrely convincing reply warning of "long lines and crazy wait times."
Google's Reid told the Post that the frequency of such hallucinations was "very low," and that accuracy in the results had "meaningfully" gone up over the past year.
"I don’t want to minimize it — it is a challenge with the technology" and something "we’re really working on," Reid told the paper. Reid also suggested that users should click on the links provided next to every AI Overview so that they can check for accuracy themselves.