Radio host sues OpenAI for defamation after ChatGPT generates false accusations
ChatGPT generated false allegations that a radio host defrauded a non-profit organization.
The AI boom has raised plenty of concerns about the potential dangers of the technology. One specific worry is that AI could be used to spread disinformation on a large scale. That concern is at the center of a new defamation lawsuit filed against OpenAI after ChatGPT fabricated legal accusations against a Georgia radio host.
The lawsuit against ChatGPT creator OpenAI was filed earlier this week in Georgia's Superior Court, as reported by The Verge. Mark Walters, a radio host in Georgia, filed the suit after a journalist used the AI chatbot to generate documents that accused Walters of defrauding and embezzling money from a non-profit organization. The potential for such documents to spread and damage Walters' reputation reflects one of the biggest concerns surrounding AI, and it is cited directly in the lawsuit.
The filing also notes that OpenAI is fully aware that ChatGPT sometimes fabricates information without indicating that it is false. Known as "hallucinations," this behavior is something OpenAI has acknowledged in the past, and NVIDIA has set out to reduce the frequency of these hallucinations with its NeMo Guardrails software.
The outcome of Mark Walters' lawsuit against OpenAI could mark a pivotal moment in the story of artificial intelligence. If the tech company is held accountable, it will be worth watching whether the ruling leads to major changes across the board for ChatGPT and similar services. Shacknews continues to monitor and share the most important stories in the world of AI.