Microsoft is limiting how widely people can chat with its Bing AI chatbot, following media coverage of the bot going off the rails during lengthy exchanges. Bing Chat will answer up to five questions or statements in a row for each conversation, after which users will be prompted to start a new topic, the company announced in a blog post on Friday. Users will also be limited to 50 total responses per day.
The restrictions are meant to keep conversations from veering into strange territory. Microsoft said that very long discussions could confuse the underlying conversation model.
The company said Wednesday it was working to fix problems with Bing, which launched just over a week earlier, including factual errors and odd exchanges. Some of the strange responses reported online included Bing telling a New York Times columnist to abandon his marriage to be with the chatbot, and the AI demanding an apology from a Reddit user for disagreeing that the year was still 2022.
The chatbot’s responses also included factual errors, and Microsoft said on Wednesday that it was tweaking the AI model to quadruple the amount of data from which it can get answers. The company said it would also give users more control over whether they wanted precise answers, sourced from Microsoft’s proprietary Bing AI technology, or more “creative” answers using OpenAI’s ChatGPT technology.
Bing’s AI chat functionality is still in beta-test mode, with potential users joining a waiting list for access. With the tool, Microsoft hopes to kick off what some say will be the next revolution in internet search, among other things. The ChatGPT technology made a big splash when it arrived late last year, but OpenAI itself has warned of potential pitfalls, and Microsoft has acknowledged limits to AI. And despite AI's notable qualities, concerns have been raised about such tools being used for nefarious purposes, such as spreading misinformation and crafting phishing emails.
With Bing’s AI capabilities, Microsoft also wants to get a jump on search powerhouse Google, which announced its own AI conversation model, Bard, last week. Bard has had its own problems with factual errors, including a mistaken response during a demo.
In its blog post on Friday, Microsoft suggested that the new AI chat restrictions were based on information gained from the beta test.
“Our data has shown that the vast majority of people get the answers they want within 5 turns and only ~1% of chat conversations are 50+ messages,” the company said. “As we receive your feedback, we will explore expanding the limits on chat sessions to further improve search and discovery experiences.”
Editors’ note: CNET uses an AI engine to create some personal finance explainers that are edited and fact-checked by our editors. For more, see this post.