Facepalm: In the short time ChatGPT has been popular with the public, its use in the legal space has not gone well. While this latest incident doesn't involve the generative AI citing made-up cases, it did see a law firm rebuked by a judge for using the tool to justify lawyers' fees at an "excessive" $600 per hour.
New York City-based Cuddy Law Firm used ChatGPT to support its application for the hefty fees it was charging for a recently won case.
Cuddy had sued New York City’s Department of Education on behalf of a mother and her special needs child. Under the Individuals with Disabilities Education Act, the district court, in its discretion, may award reasonable attorneys’ fees as part of the costs to the parents of the child with a disability who is the prevailing party.
OpenAI’s tool was asked what a reasonable hourly rate would be for an attorney with up to three years of experience in a hearing over disabilities education. It said the amount could range between $200 and $500 per hour, and that lawyers who specialize in certain areas of law could command higher rates of up to $1,200 per hour or more.
Taking ChatGPT’s advice, Cuddy Law submitted a final bill of $113,484 for its services, equating to approximately $550 to $600 per hour. But federal district judge Paul Engelmayer was not impressed by the firm’s use of the technology to calculate its bill.
"It suffices to say that the Cuddy Law Firm's invocation of ChatGPT as support for its aggressive fee bid is utterly and unusually unpersuasive," Engelmayer wrote in his opinion, calling the requested rates well above what was reasonable.
The judge noted that because ChatGPT did not identify the data behind its conclusion, it was impossible to know whether that information was "very real and relevant" or simply something the AI made up, which remains a problem for generative AIs. He also pointed out that the firm failed to identify the precise inputs it used to obtain ChatGPT's conclusions.
Judge Engelmayer cited two of the several legal cases in which ChatGPT had hallucinated information. One was Mata v. Avianca, a personal injury lawsuit against Avianca Airlines in which attorney Steven A. Schwartz submitted a 10-page brief citing several supposedly relevant court decisions, all of which had been fabricated by ChatGPT.
The other case, Park v. Kim, involved lawyer Jae Lee, who was referred to the attorney grievance panel after she used ChatGPT for research in a medical malpractice lawsuit and did not confirm that the case she cited was valid.
In the Cuddy case, Judge Engelmayer ultimately cut the firm's requested fees by more than half, to $53,050, for reasons that included its use of ChatGPT. He added that "barring a paradigm shift in the reliability of this tool, the Cuddy Law Firm is well advised to excise references to ChatGPT from future fee applications."