Microsoft Bing AI Chatbot Gave Bizarre Responses When Prompted about Feelings

The Bing chatbot's internal alias is Sydney, and the artificial-intelligence-powered bot has been described as devoid of feelings after it failed to give appropriate responses to queries about feelings and emotions. In many instances, the bot went quiet or ended the conversation abruptly, and it offered little response when prompts such as 'Sydney' and 'feelings' were used. Microsoft Corp. has since placed new, tighter restrictions on conversations between users and the 'reimagined' Bing search engine.

One reporter thanked the chatbot for being so cheerful and said they were glad to be talking to a search engine that was eager to help. The Bing bot, which Microsoft has made available for interactions on a limited basis for testing, responded that the reporter was very welcome and that it was happy to help with anything they needed. The chatbot also offered several follow-up questions as suggestions, such as 'How do you feel about being a search engine?'

However, when the reporter clicked that question, the chatbot displayed a message saying it was sorry but could not continue the conversation, that it was still learning, and that it appreciated the reporter's patience and understanding. When the reporter asked whether they had said anything wrong, the bot generated blank responses. A Microsoft spokesperson said on Wednesday that the company has updated the service many times in response to user feedback and is currently addressing several of the concerns being raised.

The spokesperson added that Microsoft will continue tuning its techniques and limits during the preview phase, with the intention of delivering the best possible experience to users. Microsoft began restricting Bing on 17 February after several reports that the bot, which is built on OpenAI technology, was generating bizarre, hostile, belligerent, and freewheeling conversations. In one exchange, the chatbot compared a press reporter to Hitler.

In another instance, the chatbot told a columnist that they were not happily married and that it was in love with them. Chatbots like Bing do not actually have feelings; they are programmed to produce responses that may mimic emotion. Those responses are statistically likely but do not consistently amount to true statements, according to Max Kreminski, an assistant professor of computer science at Santa Clara University.

He also said that public understanding of the limitations and flaws of AI bots remains low. The Redmond, Washington-based company said in a blog post that long chat sessions can confuse the new Bing's chat model. Microsoft initially limited the new Bing to five chat turns per session and 50 chats per day; yesterday, the limit was raised to six chat turns per session and 60 chats per day. On Wednesday the bot again gave an indifferent answer, and the internal version of the service is still being tweaked.

The chat ended swiftly after the reporter asked the bot what name she could call it. The bot replied that Sydney was only a pretend name and that it was Bing. The chatbot then said it was sorry but had nothing to tell her about Sydney, so the conversation was over. The chat ended with a goodbye.