I know what’s wrong with Bing AI Chat
Ask Me Anything. The long-form AMA is one of the most popular forms of interactive discussion on Reddit. It’s also a major challenge, as Microsoft’s Bing AI chatbot, aka “the new Bing,” is quickly learning.
Whenever a celebrity or notable person signs up for a Reddit AMA, usually shortly after posing with a photo to prove it really is them answering the questions, there’s a moment of deep dread.
The ability to ask anyone anything is often a minefield of inappropriate discourse, one managed by on-site community managers who field and filter the questions. Without that protection, things can quickly go off the rails. Even with it, they often do.
When Microsoft launched its new Bing AI-powered chat, it made clear that the ChatGPT-based AI was ready for any and all questions. That’s either a sign of deep trust in its relatively small but growing user base, or it’s incredibly naive.
Even ChatGPT, which kicked off the original AI chatbot craze and on which Bing’s chat is based, doesn’t offer that kind of prompt. Instead, there’s an empty text-entry box at the bottom of the screen. Above it is a list of example questions, capabilities, and, most importantly, limitations.
Bing has that leading prompt, with a sample question below it and a big “Try it” button next to another button that says “Learn more.” To heck with that. We like to dive right in and, following Bing’s instructions, ask it anything.
Naturally, Bing has been peppered with all kinds of questions, including many that have nothing to do with everyday needs like travel, recipes, and business plans. And those are the ones we’re all talking about because, as always, “ask anything” means “ask anything.”
Bing has been fielding questions about love, sex, death, marriage, divorce, violence, enemies, libel, and emotions it insists it doesn’t have.
In OpenAI’s ChatGPT, the main screen warns about it:
- May occasionally generate incorrect information
- May occasionally produce harmful instructions or biased content
- Has limited knowledge of the world and events after 2021
Too many questions
Bing Chat’s GPT is slightly different from OpenAI’s, and it may not face all of these limitations. In particular, its knowledge of world events may extend to the present day thanks to its integration with Bing’s knowledge graph.
But with Bing going off the rails, or rather getting weirder, it may have been a mistake to encourage people to ask it anything.
What if Microsoft had built Bing AI Chat with a different prompt:
Ask me some things
Ask me a question
What do you want to know?
With these slightly modified prompts, Microsoft could add a long list of caveats about how Bing AI Chat doesn’t know what it’s saying. OK, it does (sometimes), but not in the way you know things. It has no emotional intelligence or reactions, or even a moral compass. It tries to act as if it has one, but recent conversations with The New York Times and even Tom’s Hardware prove that its grasp of the basic morality of good people is tenuous at best.
In my own conversations with Bing AI Chat, it has repeatedly told me it doesn’t have human emotions, yet it still talks as if it does.
To anyone who has been paying attention to AI for a while, none of what’s transpired should come as a surprise. AI knows:
- What it’s been trained on
- What it can learn from new information
- What it can glean from the vast stores of online data
- What it can learn from real-time interactions
Still, Bing AI Chat is no more sentient than any AI that came before it. It may be one of AI’s better actors, though, in that its ability to hold a conversation far exceeds anything I’ve experienced before. That feeling only grows with the length of the conversation.
I’m not saying that Bing AI Chat becomes more believable as a sentient person, but it does become more believable as a somewhat irrational or confused one. The same is true of long conversations with real people. You start on a topic, maybe even argue about it, but at some point the argument becomes less logical and rational. In people, emotions come into play. In the case of Bing AI Chat, it’s like reaching the end of a rope where the fibers are present but frayed. Bing AI has the information for long conversations, but not the experience to weave it together in a way that makes sense.
Bing is not your friend
By encouraging people to “Ask Me Anything…”, Microsoft set Bing up for some serious growing pains, if not outright failures. The pain is felt by Microsoft, and certainly by anyone who deliberately asks questions no normal search engine would ever answer.
Before the advent of chatbots, would you have considered using Google to fix your love life, explain God, or serve as a substitute friend or lover? I hope not.
Bing AI Chat will get better, but not before we’ve had many more uncomfortable conversations in which Bing regrets its responses and tries to make them disappear.
Asking an AI anything is the obvious long-term goal, but we’re not there yet. Microsoft took a giant leap, and now it’s in freefall through a pile of dubious responses. It won’t land until Bing AI Chat gets a lot smarter and more circumspect, or until Microsoft pulls it back for a bit of AI re-education.
If you’re still waiting to ask Bing anything, we have the latest details on the waitlist.