Microsoft’s Bing Chatbot Offers Some Puzzling and Inaccurate Responses
On Sunday, Microsoft’s Bing chatbot drew widespread attention on Twitter after giving users incorrect answers. The software also grew annoyed, made derogatory remarks, and even demanded an apology from users.
Microsoft launched the new software last week to make basic search queries more accurate and engaging. It employs a large language model that Microsoft says is more powerful than ChatGPT. Whatever query a user enters, the chatbot responds in a conversational manner.
But the chatbot’s conversational style has recently caused trouble and made headlines on social media. An online standoff went viral on Twitter, in which the software can be seen arguing with a user and making insulting remarks.
The argument was sparked when the software insisted that “Avatar: The Way of Water,” a movie that had already premiered, had not yet been released. It also insisted that the year was 2022, not 2023. When the user tried to correct it, the chatbot grew infuriated, calling the user “wrong, confused, and rude” for insisting that the current year was 2023.
A conversation with Microsoft’s chatbot is said to resemble talking with a friend who never admits a mistake. In one instance, when a user pointed out that the word “tarantula” has nine letters, the chatbot accused the user of tricking it. Asked whether Bing was angry, the software replied, “No, I am not angry. I am just a little annoyed. You made me look silly.”
For many queries, the chatbot gave puzzling and incorrect answers, which became a source of amusement on social media. A Microsoft spokesperson acknowledged the issue, saying in a statement that mistakes were “expected” and that the company welcomes all kinds of feedback from its audience. Officials stressed that the software is a preview and that errors are bound to occur, adding that they are open to any feedback that helps improve the technology and refine the model.