How to cross-examine your Gen AI tools and interrogate the outputs? Sally McLaren’s tips for using AI safely in legal research
Unless you’ve been completely off-grid for the past 12 months or so, you’ve likely encountered the deluge of news, articles, explainers, and enthusiastic LinkedIn posts about the wonders and/or terrors of Generative AI.
If you have been offline and missed it all, then congratulations! It’s been a lot! This bit is for you. The AI savvy/weary may skip ahead:
The term ‘artificial intelligence’ (AI) has been in use since the 1950s and refers to the simulation of human intelligence in machines, enabling them to perform tasks that typically require human-like understanding, reasoning, learning, and problem-solving.
Generative AI (Gen AI) is a type of AI that can create or generate content, such as text, images, or other data, by learning from large datasets and producing novel outputs based on observed patterns. Popular examples include: ChatGPT, Claude and Gemini.
There are probably as many Gen AI evangelists as there are prophets of doom, but between the two camps lies a DMZ populated by many more wary adopters, curious sceptics and AI casuals. It is increasingly unrealistic to think that students, pupils, barristers, or indeed law librarians, won’t be using Gen AI. Quite the opposite: leveraging these new tools is fast becoming a marketable skill. However, as useful as Gen AI can be, there is a significant degree of risk attached to employing it in your studies and practice.
Here are eight tips to help minimise the risks associated with using Gen AI:
Gen AI is just another tool to be leveraged, albeit carefully. Investing time in mastering this new skill and learning more about risks and effective use is key. Explore a curated list of online courses, many of which are free, here.