Artificial intelligence tools accused of misquoting and defaming people online could face litigation over the false information they output, legal experts warn.

But scholars are split on whether the bots should be sued under defamation law or product liability law, given that it is a machine, not a person, spreading the false and hurtful information about people.

“It’s definitely uncharted waters,” said Catherine Sharkey, a professor at New York University School of Law. “You have people interacting with machines. That is very new. How does publication work in that framework?”

Brian Hood, the mayor of a region northwest of Melbourne, Australia, is threatening to sue OpenAI after its ChatGPT chatbot falsely reported that he was guilty in a foreign bribery scandal.

The scandal he was falsely tied to allegedly occurred in the early 2000s and involved the Reserve Bank of Australia.

Mr. Hood’s lawyers wrote a letter to OpenAI, which created ChatGPT, demanding the company fix the errors within 28 days, according to the Reuters news agency. If it does not, he plans to sue in what could be the first defamation case against artificial intelligence.

Mr. Hood is not alone in having a false accusation generated against him by ChatGPT.

Jonathan Turley, a law professor at George Washington University, was notified that the bot was spreading a false claim that he had been accused of sexual harassment stemming from a class trip to Alaska. The bot also said he was a professor at Georgetown University, not George Washington University.

“I learned that ChatGPT falsely reported on a claim of sexual harassment that was never made against me on a trip that never occurred while I was on a faculty where I never taught. ChatGPT relied on a cited Post article that was never written and quotes a statement that was never made by the newspaper,” Mr. Turley tweeted on April 6.

The Washington Post reported April 5 that no such article exists.

OpenAI did not immediately respond to a request for comment.

Neither did Google nor Microsoft, whose Bard and Bing chatbots are similar to ChatGPT, respond to questions about the potential for errors and resulting lawsuits.

Eugene Volokh, a law professor at UCLA, ran the queries that led to the false accusations surfacing against Mr. Turley.

He told The Washington Times that it’s possible OpenAI could face a defamation lawsuit over the false information, especially in the case of the Australian mayor who has put the company on notice of the error.

Typically, to prove defamation against a public figure, one must show that the publisher of the false information acted with “actual malice,” meaning with knowledge that the information was false or with reckless disregard for the truth.

Mr. Volokh said putting the company on notice of the error establishes the intent needed to prove defamation.

“That is how you show actual malice,” he said. “They keep distributing a particular statement even though they know it is false. They allow their software to keep distributing a particular statement even though they know they’re false.”

He pointed to the company’s own technical report from March, which noted that the “hallucinations” could become dangerous.

“GPT-4 has the tendency to ‘hallucinate,’ i.e. ‘produce content that is nonsensical or untruthful in relation to certain sources,’” the report read on page 46. “This tendency can be particularly harmful as models become increasingly convincing and believable, leading to overreliance on them by users.”

Ms. Sharkey, though, said it is difficult to bring defamation claims against a machine, since the content is published not by a person but by a product.

“The idea of imputing malice or intent to a machine — my own view is, we are not ready for that,” she said. “What really it’s showing is … the future here is going to be about forming product liability claims.”

She said plaintiffs could potentially sue companies over faulty or negligent designs that result in algorithms putting out damaging, reputation-harming information.

Robert Post, a professor at Yale Law School, said all of this is new and will have to be tested through lawsuits in the courts — or lawmakers will have to address the issue with a statute.

“There are lawsuits. Judges make rulings in different states and gradually the law shifts about and comes to conclusion,” he said. “This is all yet to be determined.”
