Bing to limit AI chat to five replies to prevent long, weird conversations
In an effort to keep Bing AI from getting weird, Microsoft is looking to cap its conversations at five replies.
Microsoft's Bing AI has been making news this week, though not for the reasons that the tech giant might have hoped. It appears that the longer conversations went on, the more likely the AI bot was to malfunction in some spectacular ways. In an effort to curb instances of the AI bot attempting to gaslight users, Microsoft is looking to cap Bing chats at five turns per session and a total of 50 turns daily.
"Our data has shown that the vast majority of you find the answers you’re looking for within 5 turns and that only ~1% of chat conversations have 50+ messages," reads the Microsoft Bing Blog (via The Verge). "After a chat session hits 5 turns, you will be prompted to start a new topic. At the end of each chat session, context needs to be cleared so the model won’t get confused."
For those who haven't followed this story and may be wondering how an AI bot could get confused, users reported increasingly unhinged behavior from the Bing AI bot as conversations ran long. A New York Times article chronicled a full two-hour exchange with the Bing AI bot, in which the bot's behavior gradually deteriorated. The Verge posted other examples of Bing AI's odd behavior, including one where the bot tried to gaslight a user into believing the year was 2022. There was even a bizarre exchange about harm and retaliation.
This week's Bing AI story has certainly been fascinating. Those interested in trying it out can read our guide on how to sign up for Bing AI today. We'll continue to follow this developing story and report back as more details emerge.