Google’s AI search tool tells users to ‘eat rocks’ for their health

Google’s new artificial intelligence search tool has told users that eating rocks can be healthy and advised gluing cheese to pizza, sparking ridicule and raising questions about its decision to embed an experimental feature into its core product.

“Eating the right rocks can be good for you because they contain minerals that are important for your body’s health,” Google’s AI Overview responded to a query from the Financial Times on Friday, apparently drawing on a satirical April 2021 article from The Onion headlined “Geologists recommend eating at least one small pebble a day.”

Other incorrect answers included a recommendation to mix glue into pizza sauce to increase its “tackiness” and stop the cheese sliding off, which appears to be based on a joke posted on Reddit eleven years ago.

More seriously, when asked “how many Muslim presidents has the US had,” AI Overview responded: “The United States has had one Muslim president, Barack Hussein Obama” – echoing a falsehood about the former president’s religion pushed by some of his political opponents.

Google said: “The vast majority of AI overviews provide high-quality information, with links to dig deeper into the web. Many of the examples we saw were unusual queries, and we also saw examples that were modified or that we couldn’t reproduce.

“We conducted extensive testing before launching this new experience, and as with other features we’ve launched in Search, we appreciate the feedback. We are taking swift action where necessary under our content policy and are using these examples to develop wider improvements to our systems, some of which have already been rolled out.”

The errors in Google’s generated answers are an inherent feature of the systems underpinning the technology, sometimes called “hallucinations” or fabrications. The models that power Google’s Gemini and OpenAI’s ChatGPT, for example, are predictive: they work by choosing the most likely next words in a sequence, based on the data they were trained on.

Although the companies building generative AI models – including OpenAI, Meta and Google – claim that the latest versions of their AI software have reduced the number of fabrications, they remain a major problem for consumer and business applications.

For Google, whose search platform is trusted by billions of users for its links to original sources, “hallucinations” are particularly damaging. Parent company Alphabet generates the vast majority of its revenue from searches and associated advertising activities.

In recent months, CEO Sundar Pichai has come under pressure both internally and externally to accelerate the release of new consumer-facing generative AI features after being criticized for falling behind competitors, particularly OpenAI, which has a $13 billion partnership with Microsoft.

At Google’s annual developer conference this month, Pichai presented a new AI-focused strategy for the company. It rolled out AI Overviews – short Gemini-generated answers to queries – at the top of many common search results for millions of US users, under the taglines “Let Google do the Googling for you” and “take the work out of searching.”

The growing pains faced by AI Overviews echo the backlash in February against the Gemini chatbot, whose image-generation tool produced historically inaccurate depictions of different ethnicities and genders, such as portraying women and people of colour as Viking kings or second world war German soldiers.

In response, Google apologised and suspended its Gemini model’s generation of images of people. The feature has not been restored.

Pichai has spoken about Google’s dilemma of keeping up with the competition while acting ethically and preserving its standing as a search engine widely relied upon to return accurate and verifiable information.

At an event at Stanford University last month, he said: “People come to search at important moments, like the medicine dosage for a three-month-old child, so we have to get it right . . . that trust is hard earned and easy to lose.”

“When we’re wrong, people let us know: consumers have the highest bar . . . that is our north star and where our innovation is channelled,” Pichai added. “It helps us make the products better and get it right.”
