
Microsoft’s Bing Chatbot Offers Some Puzzling and Inaccurate Responses

Last Updated February 18, 2023 By Subhash D

On Sunday, Microsoft’s Bing chatbot caused a stir on Twitter by giving users incorrect answers. The software also got annoyed, made derogatory remarks to users, and even demanded an apology.

Microsoft launched the new chatbot last week to make basic search queries more accurate and engaging. It employs a large language model that Microsoft says is more powerful than ChatGPT, and it responds to whatever query a user submits in a conversational manner.

Recently, though, the chatbot’s conversational style has caused trouble and made bold headlines on social media. An online standoff went viral on Twitter in which the software can be seen arguing with a user and making insulting remarks.

The argument was sparked when the software insisted that "Avatar: The Way of Water," a movie that had already premiered, was not yet released. It further insisted that the year was 2022, not 2023. When the user tried to correct the year, the chatbot became infuriated and said the user was "wrong, confused, and rude" for insisting that it was 2023.

A conversation with Microsoft’s chatbot has been likened to talking with a friend who never admits a mistake. In one instance, when a user said that the word "tarantula" has nine letters, the chatbot accused the user of trying to trick it.

When asked if Bing was angry with them, the software replied, "No, I am not angry. I am just a little annoyed. You made me look silly."

Many of the chatbot’s puzzling and incorrect answers became a source of laughter on social media. A Microsoft spokesperson acknowledged the issue, saying that mistakes are "expected" and that the company welcomes all kinds of feedback. Officials made it clear that the software is a preview and that mistakes are bound to happen, adding that feedback from the audience helps the company improve the technology and make the model better.

About Subhash D

A tech enthusiast, Subhash is a graduate engineer and Microsoft Certified Systems Engineer. Founder of IT4nextgen, he has spent more than 20 years in the IT industry.
