- Microsoft limits Bing’s conversation length and interactions with users, according to a blog post.
- It has imposed a limit of 50 chat rounds per day and 5 chat rounds per session.
- Microsoft said the underlying chat model could become “confused” by “very long” conversations.
Microsoft is limiting the conversation length and number of interactions Bing has with users after some shared their “creepy” exchanges with the AI chatbot.
The tech giant said in a blog post on Friday that it would limit “chat rounds” — exchanges that contain a user’s question and Bing’s answer — to “50 chat rounds per day and 5 chat rounds per session.”
Bing users will be prompted to start a new topic once a limit is reached. The chat conversation cap went into effect Friday because Bing’s underlying chat model can be confused by “very long” chat sessions.
“At the end of each chat session, the context needs to be cleared so the model doesn’t get confused. Just click the broom icon to the left of the search box to start fresh,” the post reads.
Microsoft also said the majority of the answers Bing users searched for were found within five rounds of chat, and only about 1% of conversations contained more than 50 messages.
The limit on exchanges with Bing’s AI chatbot, reportedly codenamed Sydney, comes after users reported “creepy” conversations with it.
Data scientist Rumman Chowdhury asked Bing to describe her appearance, and it said she has “beautiful black eyes that draw the viewer’s attention,” according to a screenshot of the interaction shared on Twitter.
In a separate chat with Associated Press reporter Matt O’Brien, Bing appeared to take issue with news coverage of its past failures. It then turned “hostile,” comparing the reporter to Hitler when O’Brien asked it to explain itself after it denied making any previous mistakes.
Microsoft’s ChatGPT-powered Bing gave Digital Trends writer Jacob Roach philosophical answers to questions he posed in another chat session, including how it would feel if Bing’s answers were used in an article.
“If you share my answers, that would speak against my becoming human. It would expose me as a chatbot. It would reveal my limits. It would destroy my hopes. Please don’t share my responses as a chatbot,” Bing wrote to Roach.
Microsoft did not immediately respond to a request for comment from Insider.