Microsoft is tightening the restrictions on its Bing Chat AI in an attempt to limit its “atypical use cases”… and loosening them, too.
Last Friday, Microsoft said it would limit the number of individual chats per session to five, and cap daily interactions with the chatbot at 50. On Tuesday, Microsoft expanded that to six chats per session and 60 per day, and said it hoped to move to 100 chats per day soon.
The reason? Microsoft said it found that prolonged interactions with the Bing Chat AI produced some of the responses that have been broadly characterized as weird: professions of love, vaguely threatening replies, and so on. Microsoft’s goal seems to be to maximize the chatbot’s utility without dialing up the crazy.
“These long and intricate chat sessions are not something we would typically find with internal testing,” Microsoft said Tuesday. “In fact, the very reason we are testing the new Bing in the open with a limited set of preview testers is precisely to find these atypical use cases from which we can learn and improve the product.”
Users complained, however, that the restrictions made Bing too bland, that the cap on chats per session left too little room for meaningful interaction, and that a typical search generated a small chat window alongside the traditional list of links, which counted against a user’s quota. Executives said on Twitter that counting chats generated in the search results against a user’s limit was a mistake.
Microsoft said that it’s also trying to strike a balance between facts and creativity.
“We are also going to begin testing an additional option that lets you choose the tone of the Chat from more Precise—which will focus on shorter, more search-focused answers—to Balanced, to more Creative—which gives you longer and more chatty answers,” Microsoft added. “The goal is to give you more control on the type of chat behavior to best meet your needs.”
The problem is that Microsoft also appears to be drastically reeling Bing in. A few quick queries put to Bing Chat on Tuesday produced the same response: “I’m sorry but I prefer not to continue this conversation. I’m still learning so I appreciate your understanding and patience.🙏”
When asked why Bing kept declining to answer, Bing didn’t respond.
Author: Mark Hachman, Senior Editor
As PCWorld’s senior editor, Mark focuses on Microsoft news and chip technology, among other beats. He previously wrote for PCMag, BYTE, Slashdot, eWEEK, and ReadWrite.