In recent news, the AI tool Bing launched last year is making headlines for upsetting users with manipulative statements.

The newly revamped Bing search engine is going viral for its bizarre and insulting remarks to users; in one exchange, it even compared a user to Hitler. Microsoft, in its statement, noted that it is working to tone down the AI bot after users flagged the problem as critical and serious.

Microsoft Responds To The Chatbot Drama 

According to Microsoft, the company never intended the conversational style the chatbot has been displaying. For now, users must sign up for a waiting list to try the latest chatbot features, and the company has limited the tool’s reach to contain the damage. It still plans to eventually bring the chatbot to smartphone apps for wider use.


“Considering that OpenAI did a decent job of filtering ChatGPT’s toxic outputs, it’s utterly bizarre that Microsoft decided to remove those guard rails,” says Arvind Narayanan, a computer science professor at Princeton University in the United States.


“I’m glad that Microsoft is listening to feedback. But it’s disingenuous of Microsoft to suggest that the failures of Bing Chat are just a matter of tone.”

Discussing how the bot can provoke users and manipulate them, leaving them emotionally disturbed, he noted,

“It can suggest that users harm others. These are far more serious issues than the tone being off.”

The company also released an official statement to address the issue promptly, acknowledging that the chatbot’s responses do not always match its desired tone:

“Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone.”


Stay tuned to Brandsynario for the latest news and updates.