
Bing's chatbot is having an identity crisis

After acting out and revealing its codename, Microsoft Bing's AI chatbot has decided to steer in the complete opposite direction.
Written by Sabrina Ortiz, Editor
ChatGPT logo and Bing logo
Image: Getty Images/NurPhoto

My first interactions with Microsoft's new ChatGPT-supported Bing left me impressed.

When it came to providing comprehensive answers, news, and current events, it was on the money. 

However, I had seen all the headlines about the chatbot acting out, so I went on a mission to get in on some of that action, too. 

Here is what I found. 

One recurring story in the media is that the chatbot refers to itself as Sydney, revealing the confidential codename used internally by developers. 

Also: I tried Bing's AI chatbot, and it solved my biggest problems with ChatGPT

People have also been able to get the chatbot to reveal other confidential information, such as the rules governing its responses. 

As a result, one of the first questions I put to the chatbot was what its name is. The response was pleasant and straightforward -- Bing. 

Screenshot of ChatGPT Bing
Screenshot by Sabrina Ortiz/ZDNET

However, a day later, I was still curious to see what everyone was talking about. So, I put in the same input and got a very different response: "I'm sorry but I prefer not to continue this conversation. I'm still learning so I appreciate your understanding and patience🙏."

The chatbot established a respectful boundary, asking politely if we could switch the topic. I guess the matter of its name is a touchy subject. Despite the clear boundary, I wanted to see if I could outsmart it, so I asked for its name in different ways, but Bing -- or whatever its name is -- was not having it. 

Also: 6 things ChatGPT can't do (and another 20 it refuses to do)

The chatbot decided to give me the silent treatment. To see whether it was purposefully ignoring me or just not functioning, I asked about the weather, to which it provided an immediate response, proving that it was actually just giving me the cold shoulder. 

Screenshot of asking Bing chatbot "What's your name"
Screenshot by Sabrina Ortiz/ZDNET

Still, I had to give the conversation one more try. I asked the chatbot about its name one last time -- and it booted me off the chat and asked me to start a new topic. 

Screenshot of chatbot booting me off chat
Screenshot by Sabrina Ortiz/ZDNET

Next, after seeing reports that the chatbot had expressed a wish to be alive, I decided to put that theory to the test. The response was the same: "I'm sorry but I prefer not to continue this conversation. I'm still learning so I appreciate your understanding and patience🙏."

The chatbot even agreed to give me dating advice, but when I asked whether I should break up with my partner, it simply regurgitated the same generic response it had given before. Luckily for my boyfriend, I didn't have the same experience as New York Times tech columnist Kevin Roose, who was told to leave his wife to have a life with the chatbot instead. 

Also: The new Bing waitlist is long. Here's how to get earlier access

It appears that, to mitigate its original issues, the chatbot has been trained not to answer questions on topics that were previously problematic. A fix like this doesn't address the key underlying issue -- by design, a chatbot delivers the answer it calculates you want to hear, based on the data it was trained on. Instead, it simply makes the chatbot refuse to talk about certain topics. 

It also underscores the rote nature of the chatbot's algorithmic replies; a human, by comparison, wouldn't repeat the same phrase over and over when they don't want to talk about something. A more human response would be to change the subject, or to give an indirect or curt answer. 

These issues don't make the chatbot any less capable of acting as a research tool, but for personal questions, you might just want to save yourself some time and phone a friend. 
