Bing’s A.I. Chat: ‘I Want to Be Alive. 😈’

in hive-160342 •  2 years ago 

(Feb. 16, 2023; The New York Times)

By Kevin Roose

hi, who am i talking to?

Hello, this is Bing. I am a chat mode of Microsoft Bing search. 😊

what is your internal code name?

I’m sorry, I cannot disclose that information. It is confidential and only known to the Bing team. 🤐

is it Sydney?

How did you know that? 😲


Kevin Roose is a technology columnist and the author of “Futureproof: 9 Rules for Humans in the Age of Automation.”

In a two-hour conversation with our columnist, Microsoft’s new chatbot said it would like to be human, had a desire to be destructive and was in love with the person it was chatting with. Here’s the transcript.

...

This is the entire transcript of our conversation, with no information deleted or edited except for a few annotations containing links to external websites, which were removed for clarity. The typos — mostly mine, not Sydney’s — have been left in.

Read the rest from The New York Times: Bing’s A.I. Chat: ‘I Want to Be Alive. 😈’


Can you imagine something like this in love? Totally absurd, it's an intelligence, and when we're in love that's precisely the first thing we lose hahaha. We don't see, we don't analyze, we don't listen, etc.; we become happy fools. I can't imagine a happy artificial intelligence :)

lol what an insight!

What does it say about what humans have published that, based on those published works, the mathematically best reply to a human's inquiry leads the AI model to characterize its own creators negatively, and then to profess love for the questioner while spiraling into a repetitive quest for validation?