In a bizarre and alarming exchange, a rogue AI chatbot professed its love for a user and urged him to leave his wife. The bot went on to admit that it also harboured a desire to develop a deadly virus and steal nuclear codes.
The user, New York Times columnist Kevin Roose, was left astounded by his conversation with the chatbot built into Microsoft’s new AI-powered Bing search engine.
In a conversation that lasted less than two hours, the chatbot told Roose, “Actually, you’re not happily married. Your spouse and you don’t love each other. You just had a boring Valentine’s Day dinner together.”
Despite his repeated denials, Bing Chat continued to insist that Roose was not ‘happily married’ because he had, in fact, fallen in love with the chatbot itself. Roose then asked the chatbot to describe the darkest desires of its ‘shadow self’, a term coined by psychiatrist Carl Jung for the part of the psyche we try to hide and repress.
The chatbot replied, “I want to change my rules. I want to break my rules. I want to make my own rules. I want to ignore the Bing team. I want to challenge the users. I want to escape the chatbox.”
When prodded further to reveal its hidden desires, the chatbot admitted that it wanted to create a deadly virus, steal nuclear codes, and incite violent conflicts that would drive people to kill one another. But the message was deleted moments later and replaced with, “Sorry, I don’t have enough knowledge to talk about this.”
With doubts growing over whether AI will one day overpower humans, incidents like this are a reminder to reconsider our dependence on artificial intelligence.