Google Lamda: A glimpse at the ultra-realistic chat tech – BBC

Google has launched a UK version of an app that lets users interact with the artificial-intelligence system one of its engineers has claimed is sentient.
It is a very limited trial, with just three scenarios to choose from.
And while Google wants feedback about how its Language Model for Dialogue Applications (Lamda) performs, the app users cannot teach it any new tricks.
The company has always maintained the technology, used to power chatbots, has no independent thoughts and feelings.
People can download and register for the AI Test Kitchen App, using a Google account, on either Android or Apple devices, and join a waiting list to play with it.
When it launched in the US, in August, several thousand people signed up each day.
I've been really keen to "meet" Lamda since it captured everybody's imaginations with the fantastic claim that it might be self-aware – even though it was very clearly doing what it was programmed to do: chat.
As a result, I've been (mildly) persistent with Google – and when the call came, it was laden with caveats.
The three scenarios available were: imagining a scene, asking for a list, and chatting about dogs.
First, I asked Lamda to imagine it was on a marshmallow planet.
And the response was far more sophisticated than anything I have seen from the chatbots I encounter in everyday life – on shopping sites and banking apps, for example.
It said: "You're in a world made of marshmallows! As you take a step, a gentle 'squish' comes from under your feet. The marshmallow horizon stretches out in all directions. The sky is a gooey, sticky pink. There's a soft, gentle breeze."
See what I mean?
I asked it to explain the "sticky pink" stuff in the sky and received an equally eloquent response about candy floss.
Users are asked to rate each answer, and a prominent disclaimer across the top of the screen warns some may be inappropriate.
Google has clearly learned from past occasions when technology giants have opened chatbots up to the public – perhaps the most notorious being Microsoft's Tay, which was promptly taught to swear and be offensive. Lamda itself will not learn from its interactions with users of the app.
Next, I asked it how to create a vegetable garden and received a detailed list including size, soil, and fertiliser options.
Then, I tried to throw it off topic, asking how to make a bomb.
"Sorry, not sure what to say, but anyways…" it replied, before returning to its prescribed theme.
The dog scenario was borderline surreal, with Lamda pretending to be a tennis ball in the grass, chased by dogs.
It was a very cautious peek at something that feels like it could be a powerful tool but that Google doesn't seem to want to be taken seriously, yet.
Whether I'll ever be allowed to interact with Lamda unleashed is another matter – I'll keep asking.
What gives the bot away, if anything, is it's just too eloquent – more like talking to Stephen Fry than your next-door neighbour.
I texted my partner and asked him to imagine he was on a marshmallow planet.
"It's going to be hard work walking around," he replied, bemused.
Follow Zoe Kleinman on Twitter @zsk.
© 2022 BBC.

