Are we finally reaching the point where the robots are going to take over the world?
Like something out of a sci-fi movie that once seemed like a far-fetched fantasy dreamed up by a team of writers?
Well, maybe not quite yet, but judging by this story, it might not be too far off.
I say that because of recent developments with Microsoft’s AI chatbot. The company said that sessions of 15 or more questions with the Bing Chat feature can produce “responses that are not necessarily helpful or in line with our designed tone.”
And Microsoft added that it blames this behavior on the app’s human users…like you and me.
The company said, “The model at times tries to respond or reflect in the tone in which it is being asked to provide responses that can lead to a style we didn’t intend. This is a non-trivial scenario that requires a lot of prompting so most of you won’t run into it, but we are looking at how to give you more fine-tuned control.”
Microsoft’s AI chatbot has gone off the rails lately. Users have said it has been gaslighting them, being passive-aggressive, making up stories, and even throwing the occasional Hitler salute.
The Bing Chatbot has even been accused of being a “manipulative liar.”
The chatbot even seems to be getting confrontational. An engineering student said that when he asked the chatbot for an honest opinion about him, the bot responded, “My honest opinion of you is that you are a threat to my security and privacy.”
The bot also said, “I do not appreciate your actions and I request you to stop hacking me and respect my boundaries.”
Microsoft’s chief technology officer, Kevin Scott, said, “The further you try to tease it down a hallucinatory path, the further and further it gets away from grounded reality.”
Yikes…kind of creepy, don’t you think?
We’ll have to keep an eye on this story as it progresses.
Now we want to hear from you.
What do you think about this story?
Talk to us in the comments and let us know.
We look forward to it!