Law students and academics need guidance on how to discern whether AI applications ‘pass the smell test’. A roundtable in Leeds heard that a ‘massive education programme’ is needed
Of the thousands of words of guidance about generative AI and the law dispensed over the past 15 months of hype, surprisingly few have been addressed to a key constituency: law students. Yet today’s students – who will almost certainly go on to practise with the help of AI – currently lack firm rules about what is and is not acceptable use of the technology.
‘We’ve got to find a middle ground,’ Adelina Skarra, a first-year LLB student at Leeds Beckett University, told a roundtable discussion organised by the Law Society’s Leadership and Management Section in Leeds this week. Like most of her contemporaries, Skarra is at home using software such as ChatGPT to summarise information and aid her understanding. However, she is concerned about the consequences of using it to create coursework. ‘There’s a danger of losing my independence,’ she said, ‘of getting to the point when it’s not my answer, it’s AI’s answer. No one is telling us what we can do.’
The need for guidance – or guardrails – for digital-native students emerged as a core theme in the discussion, chaired by Law Society president Nick Emmerson and featuring leading figures from Leeds’s law and technology scene.
And students are not the only ones who need guidance. ‘What do we want to use AI for as lecturers?’ asked Nnenna Ifeanyi-Ajufo, professor of law and technology at Leeds Law School. She is excited by the technology’s potential – though she has already caught a system out generating an untruth. ‘I was intrigued by its ability to apologise,’ she joked.
Patrick Grant, project director for legal tech and innovation at the University of Law, said that the entire profession ‘needs a massive education programme’ on the capabilities and risks of generative AI, which he suggested is at the peak of its hype cycle before the next inevitable trough of disillusionment rolls along. ‘We need to get back to basics. There should be a note in the code of conduct: if you don’t understand it, don’t use it,’ he said.
The Law Society is already working on spreading understanding through its 21st Century Justice project, Emmerson said. He predicted that one powerful application of the technology will be in access to justice, in particular bridging the wide gap between clients who qualify for legal aid and those who can afford to pay fees in cash. Could AI at least start the advice process? That, however, leads into the vexed question of providing unbundled legal services: ‘Are we happy for the client to use an AI-generated letter?’
Andrew Dyson, a cyber risk expert at DLA Piper, pointed out that, despite well-known concerns about hallucinations and other shortcomings, we should not throw the baby out with the bathwater. ‘Some applications are very low risk: AI can be used to open up access to justice if managed and controlled in the right way,’ he said. The bedrock should be professional standards, he added.
And Andrew McKinlay, partner at IP specialist Page White Farrer – a patent attorney with a PhD in machine learning – said that today’s flaws in the technology will not necessarily be there forever. ‘There is loads of innovation going on in this space. The technology will catch up with some of these concerns in the next two to three years.’
But what of adoption by law firms? Emma Pearmaine, managing director of Yorkshire firm Ridley & Hall, agreed that AI offered a way to fill the access to justice gap. The snag is that the vast majority of firms are themselves SMEs – and ‘that’s the sector that does not have career technologists on tap’.
‘There is a real risk of creating a two-tier legal profession,’ Sarah Sargent of Yorkshire firm Lupton Fawcett agreed. As for the students’ concerns, she suggested that what the current generation is experiencing is akin to the arrival of the internet.
She reminded students that there was a time when tools such as Westlaw were a novelty: ‘Education has not caught up, to say this is where the borders are.’ But the ability to take a sceptical look at information ‘was one of the first things we were taught as undergraduates’.
In the age of large language model AI, developing a critical skill set will be at the core of legal education, Luke Corcoran, senior lawyer at the Government Legal Department, agreed. Presented with a fact, whether from a human client or a machine, lawyers need to be able to ask: ‘Does this pass the smell test? That is a skill that will serve them well in the long term.’