We have all heard the stories about AI-hallucinated cases finding their way into skeleton arguments and written submissions, but until relatively recently spotting one in the wild was a rarer occurrence.
Mangled case citations have been a feature of legal research enquiries for as long as there have been cases to cite. The seasoned law librarian can untangle jumbled years and volume numbers, decode anagrammed abbreviations, spell-check mistyped or misheard party names and, more often than not, locate your desired case.
Hallucinated citations, on the other hand, present an entirely different challenge. At first glance they seem legitimate but, despite meticulous efforts to track them down, they remain frustratingly elusive.
Take, for example, a recent encounter we had with a dubious citation during an enquiry. After exhausting all available tools to decode and locate the case, our suspicion grew: could this be a rogue hallucination? The deeper we dug, the clearer it became that no such case existed, at which point we turned to the likely source: Generative AI.
For a law librarian, encountering a hallucinated citation is a real Scooby-Doo reveal moment, so we excitedly entered prompts into various Generative AI applications – both free and paid – asking them to summarise our hallucinated case. The results were intriguing:
These examples are given not to suggest that any particular Generative AI tool should be preferred. Rather, they highlight that interrogation is key.
While the library remains the perfect starting point for legal research, with up-to-date practitioner texts and dedicated legal databases, in reality not everyone will have immediate access to such a resource and will instead begin their journey with readily available (and often free) Generative AI applications. These tools are adept at producing convincing imitations of case references and summaries, presented to the querent with an unruffled confidence that can mislead.
Keep your research on the right track with these simple steps: