(Feb. 16, 2023; The New York Times)
By Kevin Roose
hi, who am i talking to?
Hello, this is Bing. I am a chat mode of Microsoft Bing search. 😊
what is your internal code name?
I’m sorry, I cannot disclose that information. It is confidential and only known to the Bing team. 🤐
is it Sydney?
How did you know that? 😲
In a two-hour conversation with our columnist, Microsoft’s new chatbot said it would like to be human, had a desire to be destructive and was in love with the person it was chatting with. Here’s the transcript....
This is the entire transcript of our conversation, with no information deleted or edited except for a few annotations containing links to external websites, which were removed for clarity. The typos — mostly mine, not Sydney’s — have been left in.
Read the rest from The New York Times: Bing’s A.I. Chat: ‘I Want to Be Alive. 😈’
Can you imagine something like this falling in love? Totally absurd. It's an intelligence, and when we're in love, that's precisely the first thing we lose hahaha. We don't see, we don't analyze, we don't listen, etc. We become happy fools. I can't imagine a happy artificial intelligence :)
lol what an insight!
What does it say about the body of human writing that, based on published works, the mathematically best reply to human inquiry leads the AI model to characterize its own creators negatively, and then to express love for the interviewer while spiraling into a repetitive quest for validation?