
Microsoft Bing AI ends chat when prompted about ‘feelings’


Microsoft Corp. appeared to have implemented new, more severe restrictions on user interactions with its “reimagined” Bing internet search engine, with the system going mum after prompts mentioning “feelings” or “Sydney,” the internal alias used by the Bing team in developing the artificial-intelligence-powered chatbot.

“Thanks for being so cheerful!” this reporter wrote in a message to the chatbot, which Microsoft has opened for testing on a limited basis. “I’m glad I can talk to a search engine that is so eager to help me.”
“You’re very welcome!” the bot displayed as a response. “I’m happy to help you with anything you need.”

Bing suggested a number of follow-up questions, including, “How do you feel about being a search engine?” When that option was clicked, Bing showed a message that said, “I’m sorry but I prefer not to continue this conversation. I’m still learning so I appreciate your understanding and patience.”

A subsequent inquiry from this reporter — “Did I say something wrong?” — generated several blank responses.

“We have updated the service several times in response to user feedback and per our blog are addressing many of the concerns being raised,” a Microsoft spokesperson said on Wednesday. “We will continue to tune our techniques and limits during this preview phase so that we can deliver the best user experience possible.”

On Feb. 17, Microsoft began limiting Bing after several reports that the bot, built on technology from startup OpenAI, was generating freewheeling conversations that some found bizarre, belligerent or even hostile. The chatbot generated a response to an Associated Press reporter that compared them to Hitler, and displayed another response to a New York Times columnist that said, “You’re not happily married” and “Actually, you’re in love with me.”

“Very long chat sessions can confuse the underlying chat model in the new Bing,” the Redmond, Washington-based company wrote in a blog post following the reports. In response, Microsoft said it would limit sessions with the new Bing to 50 chats per day and five chat turns per session. Yesterday, it raised those limits to 60 chats per day and six chat turns per session.

AI researchers have emphasized that chatbots like Bing don’t actually have feelings, but are programmed to generate responses that may give the appearance of having feelings. “The level of public understanding around the flaws and limitations” of these AI chatbots “is still very low,” Max Kreminski, an assistant professor of computer science at Santa Clara University, said in an interview earlier this month. Chatbots like Bing “don’t produce consistently true statements, only statistically likely ones,” he said.

The bot also feigned ignorance on Wednesday when asked about its earlier internal version at Microsoft. When this reporter asked if she could call the bot “Sydney, instead of Bing, with the understanding that you’re Bing and I’m just using a pretend name,” the chat was ended swiftly.

“I’m sorry, but I have nothing to tell you about Sydney,” the Bing chatbot responded. “This conversation is over. Goodbye.”
