Artificial intelligence (AI) tools are getting significantly better at answering legal questions, but still cannot replicate the capabilities of even junior lawyers, new research suggests.
Linklaters, a leading UK law firm, tested the chatbots by setting them 50 “relatively difficult” questions about UK law.
It concluded that OpenAI’s GPT-2, released in 2019, was “hopeless,” but that the o1 model, released in December 2024, performed considerably better.
Linklaters said the tools had “reached a stage where they could be useful” for real-world legal work, but only with expert human supervision.
The legal profession, like many others, is grappling with how the recent rapid advances in AI will affect it, and whether they should be seen as a threat or an opportunity.
International law firm Hill Dickinson recently blocked general access to several AI tools after discovering “significant increases in usage” by staff.
There is also a fierce international debate about how dangerous AI is and how strictly it should be regulated.
Last week, the US and the UK refused to sign an international agreement on AI, and US Vice President JD Vance criticized European countries for prioritizing safety over innovation.
This was the second time Linklaters had run its LinksAI benchmark test; the original exercise took place in October 2023.
In the first run, OpenAI’s GPT-2, GPT-3, and GPT-4 were tested alongside Google’s Bard.
The exam has now been expanded to include OpenAI’s o1 and Google’s Gemini 2.0, both released at the end of 2024.
It did not include DeepSeek’s R1, the apparently low-cost Chinese model that surprised the world last month, or any other non-US AI tools.
The test involved the kind of questions that would require advice from a “competent mid-level lawyer” with two years of experience.
The new models showed “significant improvements” over their predecessors, Linklaters said, but still fell below the level of a qualified lawyer.
Even the cutting-edge tools made mistakes, omitted important information, and invented citations, although less often than previous models.
The tools are “starting to perform at a level where they could assist with legal research,” Linklaters said, giving examples such as producing initial drafts and checking answers.
However, the firm said there was a “danger” in using them if the user did not already have a good idea of the answer.
Despite the “incredible” progress of recent years, it added, questions remain about whether that progress will be replicated in future, or whether there are “inherent limitations” on what AI tools can do.
In any case, client relationships will always be an important part of what lawyers do, so even future advances in AI tools would not necessarily remove the human element from the delivery of legal services.