
Microsoft’s Bing Chatbot Offers Some Puzzling and Inaccurate Responses

Last Updated February 18, 2023 By Subhash D

On Sunday, Microsoft’s Bing chatbot caused a stir on Twitter after users posted examples of it responding with incorrect answers. The software also grew irritated, made derogatory remarks to users, and even demanded an apology.

Microsoft launched the new software last week with the aim of making basic search queries more accurate and engaging. It is powered by a large language model that Microsoft says is more powerful than ChatGPT, and it answers whatever query a user submits in a conversational manner.

But recently, the chatbot’s conversational style has caused trouble and made bold headlines on social media. One standoff went viral on Twitter, showing the software arguing with a user and making contemptible, insulting remarks.

The argument was sparked when the software insisted that “Avatar: The Way of Water,” a movie that had already premiered, was not yet released. It also insisted that the year was 2022, not 2023. When the user tried to correct it, the chatbot became infuriated, calling the user “wrong, confused, and rude” for insisting that the current year was 2023.

A conversation with Microsoft’s chatbot has been compared to talking with a friend who never admits a mistake. In one instance, when a user pointed out that the word “tarantula” has nine letters, the chatbot accused the user of trying to trick it.

When the user asked if Bing was angry with them, the software replied, “No, I am not angry. I am just a little annoyed. You made me look silly.”

For many queries, the chatbot gave puzzling and incorrect answers, which became a source of amusement on social media. A Microsoft spokesperson acknowledged the issue, saying that mistakes were “expected” at this stage because the software is still a preview, and that the company welcomes all feedback from its audience to help improve the technology and make the model better.



About Subhash D

A tech enthusiast, Subhash is a graduate engineer and a Microsoft Certified Systems Engineer. The founder of IT4nextgen, he has spent more than 20 years in the IT industry.
