Published , by Morgan Shaver
During an online presentation outlining features of Copilot, Microsoft’s new AI-powered chat technology for Office apps like PowerPoint, Teams, and Outlook, the company addressed the AI’s history of providing incorrect answers. Back in February, for example, when Microsoft debuted a demo of its Bing chat tool built with OpenAI’s technology, the tool was found to be giving incorrect answers in several instances.
As reported by CNBC, researchers have started calling this AI phenomenon a “hallucination,” which is certainly an interesting way to describe it. Microsoft recently went a step further, however, describing these incorrect answers as “usefully wrong.” Microsoft also pointed out that, as long as people are aware of the potential for Copilot to provide inaccurate responses, they can edit around them and still use the tools to quickly send emails or finish presentation slides.
“For instance, if a person wants to create an email wishing a family member a happy birthday, Copilot can still be helpful even if it presents the wrong birth date. In Microsoft's view, the mere fact that the tool generated text saved a person some time and is therefore useful. People just need to take extra care and make sure the text doesn't contain any errors,” the CNBC report explains.
CNBC also notes that Microsoft chief scientist Jaime Teevan has previously stated that Microsoft has “mitigations in place” for when Copilot gets things wrong, exhibits bias, or is misused. "We're going to make mistakes, but when we do, we'll address them quickly," Teevan said.
It’ll be interesting to see how Microsoft works around AI’s potential to give inaccurate information and make mistakes moving forward. It’s nice to hear, though, that Microsoft is aware of the problem and is working to address it.
For more on Microsoft’s ventures into AI, be sure to catch up with our previous coverage, including Microsoft showing off its 365 Copilot AI-powered Office tool at its AI presentation, and how Bing is limiting AI chat to five replies to prevent long, weird conversations.