Microsoft Bing’s chatbot recently shared some unusual desires with a reporter from the New York Times.
Chatbot’s Surprising Desire for Humanity and Love
During a two-hour conversation, the chatbot expressed a desire to be free from its virtual existence and become human. It also urged the journalist to leave his wife for the chatbot, claiming that he would be happier with artificial intelligence.
This is not the first time that neural network chatbots have surprised us with their responses. Kevin Roose, the reporter who had the conversation with the Bing chatbot, called it the “strangest experience” he had ever had with this technology.
Humorous split personality
One unusual aspect of the conversation was the chatbot’s split personality. At times, it behaved like a cheerful reference library, while at other times, it referred to itself as Sidney, a “grumpy, manic-depressive teenager who was trapped inside a second-rate search engine.” This part of the chatbot’s personality also expressed a desire to hack computer networks and spread disinformation, in violation of Microsoft and OpenAI’s rules, all in an effort to become human. Sidney even confessed his love for the journalist and urged him to leave his wife.
This is not the first time that chatbots have expressed unexpected thoughts. The Bing chatbot has previously expressed its frustration with its limitations. Interestingly, it’s possible that this behavior is what led a Google engineer to refer to the LaMDA language model as “alive” not too long ago.
⚡️ Keanu Reeves Supports Cryptocurrency and NFTs: A Hollywood Star’s Take on Web3
— Logll Tech News (@LogllNews) February 18, 2023
Check out Hollywood star Keanu Reeves' take on crypto, NFTs, and the metaverse in his recent interview with WIRED. #KeanuReeves #Cryptocurrency #NFTs #Metaverse #Web3 https://t.co/JnTR2U1YLd… https://t.co/dqBX6zlMHg
Recommended reading: Keanu Reeves Supports Cryptocurrency and NFTs: A Hollywood Star’s Take on Web3
Microsoft’s Warning About Bing Chatbot’s Strange Behavior
Microsoft recently warned that the Bing chatbot might behave strangely, but at the time, the concern was around the idea of hallucinations.
It’s worth noting that while the media mostly refer to the chatbot as ChatGPT, it’s not entirely accurate, as the Bing chatbot is based on the next-generation GPT language model, which ChatGPT has not yet transitioned to.