Google AI chatbot Bard dishes up wrong answer in first demo

Editor’s take: It’s been a big week for artificial intelligence but a misstep out of the gate highlights the danger associated with moving too quickly and pushing tech to the masses before it is fully vetted. Such is especially true of AI systems that dole out information that some could interpret as fact.

Microsoft announced an AI-powered Bing search engine and Edge browser while Google introduced the world to Bard, an experimental conversational AI service powered by its Language Model for Dialogue Applications (or LaMDA for short). Chinese tech company Baidu is also working on a ChatGPT-like service called Ernie.

They’re all in the early stages of development and will need more time for their respective creators to iron out the wrinkles, as evidenced by an embarrassing Bard misstep.

In a short video demonstrating how Bard works, the AI was asked about discoveries from the James Webb Space Telescope that could be shared with a 9-year-old. Bard supplied several answers, including that the telescope “took the very first pictures of a planet outside of our own solar system.” The problem is, that’s not accurate.

According to NASA, the first image of an exoplanet (2M1207b) was captured by the European Southern Observatory’s Very Large Telescope (VLT) in 2004. Webb took its first photo of an exoplanet last year, but it wasn’t the first such image ever captured.

The tweet featuring the incorrect response was published on February 6 and has amassed over a million views. It remains live as of this writing and is still featured on Google’s blog post announcing Bard.

In its announcement earlier this week, Google said it was making Bard available to trusted testers ahead of a wider rollout to the public in the coming weeks.

In a FAQ covering its AI-generated responses, Microsoft warned that Bing will sometimes misrepresent the information it finds, and that users could see responses that sound convincing but are inaccurate, incomplete, or inappropriate. Redmond encourages people to use their own judgment and double-check facts before making decisions or taking action based on Bing’s responses.

Image credit: George Becker
