BEST OF THE WEB

Australian Mayor considers legal action against ChatGPT

Brian Hood, the Mayor of Hepburn Shire Council in Australia, has threatened legal action against OpenAI, the company behind the chatbot ChatGPT, for sharing false information about him.

The advanced tool reportedly claimed that Mr. Hood was imprisoned for bribery while working for a subsidiary of Australia’s national bank, when in fact he was a whistleblower who was never charged with a crime. Mr. Hood’s legal team has sent a concerns notice to OpenAI, the first formal step in defamation proceedings in Australia. If OpenAI fails to respond within 28 days, Mr. Hood may proceed with a lawsuit under Australian law.

If Mr. Hood pursues the legal claim, it would be the first defamation suit OpenAI has faced over content generated by ChatGPT. Despite receiving a request for comment from the BBC, OpenAI has yet to respond.

ChatGPT is an AI language model developed by OpenAI that can mimic human-like language and generate answers to a wide range of questions. Millions of people have used the tool since its launch in November 2022, and Microsoft integrated it into its Bing search engine in February 2023.

Users of ChatGPT are shown a disclaimer that warns them that the generated content may contain inaccurate information about people, places, or facts. OpenAI’s public blog about the tool also admits that a limitation is that it “sometimes writes plausible-sounding but incorrect or nonsensical answers.”

In 2005, Mr. Hood was the company secretary of Note Printing Australia, a subsidiary of the Reserve Bank of Australia. He blew the whistle on bribe payments at the organization linked to Securency, a business part-owned by the bank. Securency was raided by police in 2010, leading to arrests and prison sentences worldwide. Mr. Hood was not among those arrested and said he was “horrified” to see ChatGPT spreading false information about him.

The BBC tested the publicly available version of ChatGPT on OpenAI’s website and confirmed Mr. Hood’s claims. The tool described the bribery scandal accurately but falsely stated that Mr. Hood “pleaded guilty to one count of bribery in 2012 and was sentenced to four years in prison.” However, the newer version of ChatGPT integrated into Microsoft’s Bing search engine correctly identified him as a whistleblower who was not involved in the payment of bribes.

The sources for this piece include an article from the BBC.

IT World Canada Staff
