The head of the government-funded LawtechUK programme has made a new call on the government to support technology-based approaches to relieving the access to justice crisis. Christina Blacklaws, chair of LawtechUK, was speaking at an International Bar Association event exploring the potential of artificial intelligence to help individuals and non-profit advice agencies to navigate the justice system.
Blacklaws outlined several potential 'use cases' and name-checked three British startups working in this field - but pointed out that only 7% of the 356 systems identified by the LawtechUK ecosystem tracker are aimed at individuals rather than commercial firms. 'There's nothing really in the social welfare area, where the need is greatest and the provision is least,' she said.
In this sector, generative AI-based systems could be 'a real leveller', Blacklaws said. 'I'm super excited about the possibilities.' But barriers to their development include lack of money, technology skills and the sheer pressure on advice services. 'People coping with a tsunami of need cannot start to think about doing anything else,' she said.
Blacklaws, a former family practitioner, confessed to being a 'long-term tech nerd'. As Law Society president she chaired a pioneering commission on algorithms in the justice system, which reported in 2019. The IBA conference heard that many of the issues identified in the report, such as the risk of built-in bias, are still relevant today.
Criminal defence attorney Sherry Levin Wallach, of the Legal Aid Society of Westchester County, New York, said current challenges include a risk to client confidentiality - even with so-called 'closed model' systems. Others include 'hallucinations', caused by the inability of current-generation systems to admit they cannot answer, and the admissibility of AI-generated evidence.
Wallach advised practitioners that the use of AI to draft documents, for example, is acceptable 'so long as the information is helping you, not doing it for you. It's another tool'. However, human lawyers must check the output - and not simply by asking the AI 'is this a real case authority?', she said.
Another issue is the lack of consistent court rules on declaring when AI has been used in a submission. 'Generally the idea is, if you're using AI in support, you don't need to disclose it. But if it is helping you draft a document submitted to the court, that's when you need to disclose,' she said - stressing that in practice this is down to individual judges.
Despite such issues, Wallach pointed to the danger of premature over-regulation. ‘We have to be careful that we don’t regulate before we know what we’re regulating,’ she said.
Blacklaws agreed, recalling the regulatory panic which blocked the adoption of genetically modified food in the UK. But AI cannot be put back in the box, she said. 'Let’s be optimistic and use this powerful technology to create a level playing field.'
Achieving this ambition will require government support, she said. The Ministry of Justice-funded LawtechUK programme itself concludes in April 2025. 'We are fortunate our government has invested,' she said. 'I hope it will continue to invest in the provision of solutions through technology for people whose needs are currently unmet.'