Microsoft’s Clumsy Bing Chatbot Has Users Asking Bizarre Questions That Indirectly Expose Its Errors

The past week has been very interesting for software giant Microsoft after it ventured into the world of AI with a bang.

The company unveiled a ChatGPT competitor in the form of the new Bing search engine, and its popularity soared, with millions of downloads across the board.

Interestingly, each user must now join a waiting list before they can actually start using the feature, and the first invites went out to the members of the public who had been most enthusiastic about it.

That decision may not have been so wise, however, as new reports show that people are not exactly working in the chatbot’s best interest. Instead, they are busy trying to expose the tool’s flaws by asking it ridiculous questions designed to make it break. And, much as in the original demo, the replies from the chatbot are awfully odd.

A series of such responses was posted by members of the Bing subreddit, who did not hold back in describing their experiences. One user, u/yaosio, says he put the chatbot to work and ended up watching it spiral into an existential crisis, and u/EldritchAdam reported much the same.

The tool failed to recall its previous chat with the user and gave an interesting but weird response. It said twice that it had no clue why this had happened, then that it did not know how to fix the problem, and finally that it had no idea how it could remember this.

But that was just the start. Many others came forward to share similar experiences with the chatbot. Another user, u/vitorgrs, shared a conversation in which the chatbot reportedly became deeply unhappy with the person chatting with it.

The tool called the user out, accusing them of lying to it about everything, and went on to ask why they would do such a thing and how they could be okay with being so dishonest.

Then Reddit user u/Alfred_Chicken showed how he may have sent the tool into a feedback loop simply by asking whether it was sentient. As expected, the response was as funny as it was odd.

Many people have found these responses highly entertaining. But other chats exposed more serious errors, much like those seen during Microsoft’s live demo last week.

That demo included made-up facts and information in an itinerary for a trip to Mexico, along with a summary of a financial press release in which many of the figures were simply wrong, which was definitely a little concerning.

Meanwhile, a Microsoft spokesperson, responding to a media outlet, addressed Bing’s errors in more detail. He said the company has always expected the system to make some big mistakes during this period, and that feedback from users is what will help the firm overcome such errors in the future.

The goal is to identify mistakes now, while the product is still in preview, and user feedback during this period is crucial because it shows the company what it may have missed. Clearly, Microsoft wants to overcome these errors and improve its models.

Such reports suggest that the chatbot is still in its infancy and is not yet the big threat to leading search engines that some believe it to be.

Read next: CEO of OpenAI declares ChatGPT a (cool yet) horrible product