ChatGPT’s hallucination just got OpenAI sued. Here’s what happened


Generative AI models, such as ChatGPT, are known to make mistakes, or "hallucinations." As a result, they generally come with clearly displayed disclaimers warning users of this problem.

But what would you do if, despite these warnings, you saw the AI chatbot spreading misinformation about you? 


Georgia radio host Mark Walters found that ChatGPT was spreading false information about him, accusing him of embezzling money. As a result, he has sued OpenAI in the company's first defamation lawsuit, as first reported by Bloomberg Law.

According to the lawsuit, the misinformation surfaced when Fred Riehl, the editor-in-chief of the gun publication AmmoLand, asked ChatGPT for a summary of the Second Amendment Foundation v. Ferguson case as background for a story he was reporting on. 

ChatGPT provided Riehl with a summary of the case which stated that Alan Gottlieb, the Second Amendment Foundation’s (SAF) founder, accused Walters of “defrauding and embezzling funds from the SAF.”


Furthermore, the chatbot said the complaint alleged that Walters "misappropriated funds for personal expenses without authorization or reimbursement, manipulated financial records and bank statements to conceal his activities, and failed to provide accurate and timely financial reports and disclosures to the SAF's leadership" while he served as the SAF's CFO and treasurer. 

However, in Walters's lawsuit, Walters v. OpenAI LLC, the plaintiff claims that every single fact in the summary is false. 

Walters was never a party to the case, was never accused of defrauding and embezzling funds from the SAF, and never held a position as its treasurer or CFO, according to the lawsuit. 


Walters is seeking general and punitive damages from OpenAI, as well as reimbursement for the legal expenses incurred in bringing the suit.

Questions surrounding the lawsuit include who should be held liable for AI-generated falsehoods and whether ChatGPT's disclaimers about hallucinations are enough to shield OpenAI from liability, even when someone is harmed. 

The outcome of the case could play a significant role in establishing a legal standard for the emerging field of generative AI.
