ChatGPT versus Lawyers and the Court

Artificial intelligence (“AI”) is predicted to have a larger impact on humanity than the Industrial Revolution.  AI has become integrated into our everyday lives through online shopping, browsing the internet, ordering food delivery, hailing a cab, travelling, or simply enjoying music and movies.  Early in 2023, legal professionals began to worry that AI would soon take over their profession, but recent developments suggest this is highly unlikely.

Launched on 30 November 2022, ChatGPT claimed to generate insightful and interesting responses to queries on an almost endless range of topics by drawing on a tremendous quantity of vocabulary, knowledge, and context.  The AI program was marketed to legal professionals as having the potential to increase productivity and efficiency, improve communication, reduce costs, and provide access to a broad range of information.  Despite AI appearing to have endless potential, the Court has recently exposed its limitations.

AI Lawyering Case Study: Mata v. Avianca, Inc., 1:22-cv-01461 (S.D.N.Y.)

On 27 August 2019, Mr Mata alleged that an airline employee struck him in the leg with a serving cart while he was a passenger on Avianca Flight 670 from El Salvador to New York.  The matter was later transferred to the Southern District of New York (Federal Court).

  • The Defendant, Avianca Airlines, relied upon statute law, namely Article 35 of the Montreal Convention 1999 (MC99), which provides a two (2) year limitation period for such a claim.  It therefore argued that the Plaintiff’s claim was brought outside the limitation period (by approximately six (6) months) and was accordingly statute-barred.
  • The Plaintiff, Mr Mata, through his lawyers, relied on case law to argue that the claim should proceed.  The “strong” case law cited included the following:
    • Zicherman v. Korean Air Lines Co. Ltd
    • Varghese v. China Southern Airlines
    • Shaboon v. Egypt Air
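The limitation-period arithmetic above can be sketched with a few lines of date calculation. This is an illustrative sketch only: the incident date is taken from the facts above, while the filing date is a hypothetical one roughly six months past the MC99 deadline, not a figure from the court record.

```python
from datetime import date

incident = date(2019, 8, 27)  # alleged serving-cart incident on Avianca Flight 670

# MC99 Article 35: the right to damages is extinguished two years after arrival
deadline = incident.replace(year=incident.year + 2)

filing = date(2022, 2, 27)  # hypothetical filing date, ~6 months after the deadline

print(deadline)           # 2021-08-27
print(filing > deadline)  # True -> claim would be statute-barred
```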

On considering the cases, an issue emerged: neither the Court nor the Defendant could locate the cases relied upon by the Plaintiff.  The Court issued a Show Cause Order requiring the Plaintiff to produce the cases cited in the argument.  The Plaintiff’s lawyer, believing that ChatGPT was a reliable search engine and that the cases were real, attempted to obtain copies of them but was unsuccessful, since they did not exist.  The Plaintiff had cited as many as six (6) cases, which the Court ultimately determined “to be bogus judicial decisions with bogus quotes and bogus internal citations”.  In its current form, the AI engine proved prone to a fatal flaw: hallucination and bias.

With the rise of technology and innovation, both businesses and individuals must act responsibly when implementing these new technologies.  Used correctly, they can improve the standard, cost, and timeliness of legal services.  While AI can assist people with routine activities, clearly not all jobs can be delegated to AI, least of all the role of advocacy.
