Over the past few months, several tech experts have warned about AI chatbots’ tendency to hallucinate. Prabhakar Raghavan, senior vice-president at Google and head of Google Search, told a German publication that AI can sometimes ‘hallucinate’ and provide ‘answers which are convincing but completely made-up’. He added that one of the key tasks was to keep this to a minimum.
Similarly, OpenAI CEO Sam Altman, at a recent event in New Delhi, said that he ‘doesn’t trust ChatGPT’s answers’.
And now, a radio host is suing OpenAI for defamation after ChatGPT allegedly created a fake legal document accusing him of fraud and embezzlement.
Radio host suing ChatGPT’s OpenAI
According to a Business Insider report, in a lawsuit filed in a Georgia court, Mark Walters argued that ChatGPT dragged his name into a case that journalist Fred Riehl was working on and accused him (Walters) of “defrauding and embezzling funds.”
Walters further alleged that Riehl had asked ChatGPT for a summary of a Washington case, providing a link to the case for the AI chatbot.
In response to the query, ChatGPT dragged Walters’ name into the case and said that he was ‘accused of defrauding and embezzling funds from the SAF’. The case was originally between Attorney General Bob Ferguson and the Second Amendment Foundation (SAF).
Further, ChatGPT said that Walters was the organization’s ‘treasurer and CFO’ and added that he had been accused of misappropriating funds for personal expenses and manipulating financial records.
The Business Insider report further states that when the journalist pressed ChatGPT for more details about the case, the AI chatbot produced an entirely ‘fake complaint’, the court documents reveal. The complaint ChatGPT generated was termed a ‘complete fabrication’ that ‘bears no resemblance to the actual complaint, including an erroneous case number’ in the Georgia court filing.
When ChatGPT got a lawyer in legal trouble
This is not the first time that ChatGPT has provided a made-up answer when asked to summarise a legal case.
Last month, a BBC report revealed that a New York-based lawyer was facing a court hearing after a colleague at his firm used ChatGPT for legal research. The court found that several legal cases referenced by the lawyer and his firm in an ongoing case never existed. The judge termed the incident an “unprecedented circumstance.”