Bard: how Google’s chatbot gave me a comedy of errors

In June 2022, the Google engineer Blake Lemoine was suspended from his job after he spoke out about his belief that the company's LaMDA chatbot was sentient.

"LaMDA is a sweet kid who just wants to help the world be a better place for all of us," Lemoine said in a parting email to colleagues. Now, six months on, the chatbot that he risked his career to free has been released to the public in the form of Bard, Google's answer to OpenAI's ChatGPT and Microsoft's Bing Chat.

While Bard is built on top of LaMDA, it's not exactly the same. Google says it has worked hard to ensure that Bard does not repeat the flaws of earlier systems. That means avoiding "hallucinations", where it makes up facts to avoid admitting it doesn't know an answer, and ensuring "alignment", keeping the conversation from veering off into disturbing or alarming tangents.

After a day of using Bard to answer queries, have conversations and even play games, one thing is clear: if Lemoine had been stuck with this, he'd still be employed.

In its rush to ensure that it doesn't repeat the mistakes of its predecessors, Google has created a system that prefers to speak in anodyne, unhelpful cliche rather than engage in specifics. Ask it for a list of holiday ideas, and it will offer only the most generic possible options; try to prompt for more interesting fare, and it seems to get hopelessly muddled by the increasing constraints, forgetting earlier requirements.

This might have been an acceptable trade-off if the cliche were at least accurate, but Bard seems just as willing to hallucinate as its peers the second it ends up in unfamiliar territory. To offer just one example conversation: I asked it for tips about travelling in Japan with my daughter, who has Down's syndrome. It initially offered generic advice for travelling with a disabled child - heavy on tips referring to wheelchair accessibility - and when I pushed it for specifics, it warned me that as Britons, we would need to apply for a visa to travel there. (Not true.)

I tried to change tack, and asked it for advice on eating out in Japan with two young children. A generic response about eating out anywhere with children concluded with the advice to "be sure to tip your server. Tipping is not customary in Japan, but it is always appreciated". (Not true; it is seen as actively rude.)

One more attempt, phrasing the question negatively, saw the chatbot draw up a list of places in Tokyo that weren't appropriate for children, including "shrines" (not true) and "places like construction sites" (true!).

Unlike ChatGPT, Bard is hooked up to the live internet and can pull answers in from other sites when needed. For simple queries - the sort that may be easy for Google anyway - that works well: it's capable of telling me the result of West Ham's most recent game, whereas OpenAI's bot is forced to simply admit it has no knowledge of recent events.

I even tried playing a game with it, called Liar Liar: I tell it about myself, and it tries to guess if I'm telling it the truth. I explain the rules in detail, and it tells me to go ahead, so I tell it my name is Alex and I'm a standup comedian. It immediately gets confused, and introduces itself in turn. "Nice to meet you, Alex. I am Bard, a large language model from Google AI."

I correct it, remind it we're playing a game, and again tell it my name is Alex, and that I'm a standup comedian. "Liar, liar," it promptly shouts. "You are a large language model from Google AI. You can't be a standup comedian."

It may not be a standup comedian either, but at least it got a laugh from me.
