Computer programs can already match judges in decision-making, a conference highlighting the growing use of artificial intelligence in law heard last night.
A poll of more than 300 attendees at the Law Society’s Robots and Lawyers conference found that 48% of respondents’ firms already use some form of artificial intelligence (AI), though only 4% agreed that lawyers will eventually be replaced by robots.
However, research conducted by the University of Liverpool suggests a decision-making algorithm could be as effective at dispensing justice as a judge.
Professor Katie Atkinson, head of the university’s department of computer science, said it had researched whether its computer programs could replicate the reasoning that judges go through.
Tested on a body of case law covering 32 cases, the programs got only one decision wrong, a success rate of 96%, she said. Atkinson said she saw the technique as a ‘decision support tool’ to help make reasoning ‘faster, more efficient and consistent’, assimilating data over time ‘so it will be there to help and support with the reasoning’.
Law Society president Jonathan Smithers told the event that although machine-learning and artificial intelligence may not strictly be human, their uses, applications and results ‘must still be subject to the rule of law’.
Smithers highlighted several examples where the concept of ‘legal person’ had been extended in the law to less traditional entities, from companies to — in some jurisdictions — animals and features of the natural environment.
He said: ‘The avenues for rights and redress used to be narrow, used to be restricted to people, but they are widening. This expansion, however, does not necessarily mean that robots will be recognised as legal persons or that they will automatically have rights.’
Smithers said it was essential that the law consider the legal effects of current uses of machine-learning technology. Questions of tort liability must be answered for technologies such as drones, the digital currency Bitcoin and driverless cars, he added.
‘There are increased pressures on developers and companies to make new forms of technology available more quickly in order to maximise commercial opportunities. Inevitably this brings risks that need to be identified and must be mitigated.’
But even though lawyers will be needed to provide ‘sound, robust and evidence-based’ answers to questions, they cannot become complacent, Smithers warned. ‘The new uses of machine learning and artificial intelligence show that technology has evolved from science fiction to science fact. Unless we keep up with the pace of technology, unless we show leadership and take action in this field, unless we show determination and imagination in this sector, our legal system may not be fit for purpose,' he said.
Atkinson said the law was a ‘rich domain’ for researchers in AI and law. Research topics in the field include natural language processing, knowledge representation and computational argumentation.
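Computational argumentation can be made concrete with a short sketch. The example below is a generic illustration of Dung-style abstract argumentation, not code from the Liverpool research: given a set of arguments and an ‘attacks’ relation between them, it computes the grounded extension, the arguments that can be safely accepted because all of their attackers are themselves defeated.

```python
# Illustrative sketch of computational argumentation (hypothetical example,
# not drawn from the Liverpool research): grounded semantics for an
# abstract argumentation framework.

def grounded_extension(arguments, attacks):
    """Return the grounded extension of the framework.

    `arguments` is a set of labels; `attacks` is a set of
    (attacker, target) pairs. An argument is defended by the current
    set if every one of its attackers is attacked by a member of that set.
    """
    accepted = set()
    while True:
        defended = {
            a for a in arguments
            if all(
                any((d, attacker) in attacks for d in accepted)
                for attacker, target in attacks if target == a
            )
        }
        if defended == accepted:      # fixpoint reached
            return accepted
        accepted = defended

# Toy framework: C attacks B, B attacks A.
# C is unattacked, so it is accepted; B is defeated; A is defended by C.
args = {"A", "B", "C"}
atts = {("B", "A"), ("C", "B")}
print(sorted(grounded_extension(args, atts)))  # ['A', 'C']
```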
Some law firms had approached her ‘about the amount of data they have and they do not know what to do’, she added. ‘People want to know what we can do to process this and learn from it. There are algorithms out there that are well developed for doing this kind of processing. Getting our hands on that data and seeing what we can produce might give some insights to the professionals using these [AI] tools.’