ChatGPT: Mayor starts legal bid over false bribery claim

(image credit: bbc.com)

The Mayor of Hepburn Shire Council in Australia, Brian Hood, has said he may take legal action against OpenAI over false information shared by its chatbot ChatGPT. The chatbot claimed that Hood had been imprisoned for bribery while working for a subsidiary of Australia's national bank, when in fact he had been a whistleblower and was never charged with a crime. Lawyers acting on behalf of Hood have sent a concerns notice to OpenAI, the first formal step in a defamation action in Australia. Under Australian law, OpenAI has 28 days to respond to the notice before Hood can take the company to court.

If Hood were to pursue the claim, it would be the first time OpenAI has publicly faced a defamation suit over content created by ChatGPT. ChatGPT, launched in November 2022, has been used by millions of people. It is an advanced chatbot that answers questions in natural, human-like language and can mimic other writing styles, drawing on a snapshot of the internet as it existed in 2021 as its knowledge base. Microsoft has invested billions of dollars in OpenAI, and ChatGPT technology was added to its Bing search engine in February 2023.

'Plausible-sounding but incorrect':

When people use ChatGPT, they are shown a disclaimer warning that the content it generates may contain "inaccurate information about people, places, or facts". OpenAI also acknowledges in its public blog about the tool that one limitation is that it "sometimes writes plausible-sounding but incorrect or nonsensical answers". Hood, however, was horrified by what ChatGPT was telling people and has called it a wake-up call, saying the system is portrayed as credible, informative and authoritative when it obviously is not.

Different chatbots, different answers:

The publicly available version of ChatGPT on OpenAI's website provided a description of the case surrounding the Securency scandal, but then inaccurately stated that Hood "pleaded guilty to one count of bribery in 2012 and was sentenced to four years in prison". However, the newer version of ChatGPT integrated into Microsoft's Bing search engine correctly identified Hood as a whistleblower and specifically said he "was not involved in the payment of bribes... as claimed by an AI chatbot called ChatGPT".
