Big data is making its presence felt in legal practice in the most direct possible way. A commercial database of court case outcomes, first in the US and now in the UK, lays bare named advocates’ success rates. ‘Clients are using this information to make decisions about what to do,’ Ian Dodd, director of legal analytics company Premonition, told a conference at the Law Society last week.
For example, data from 3,124 US courts shows a strong correlation between case outcomes and the particular lawyers and judges involved. ‘If you find you have drawn the wrong combination of individuals you might want to settle out of court pretty quickly,’ Dodd said.
And if computers can reliably predict the outcome of a case, surely it is a simple step to dispensing with the formal proceedings entirely. While none of the speakers at the event, Legal Services in a Data Driven World, showed any enthusiasm for the prospect, all recognised the possibilities – and threats.
‘Technology is increasingly capable of looking at the parameters of a case and deciding whether a party is guilty or not,’ said Bruce Braude, head of strategic client technology at international firm Berwin Leighton Paisner. ‘Do we want that? Probably not. Personally I wouldn’t want to live in a society where people are judged by computers.’
Yet the data-driven world of the event’s title is no exaggeration. Former Microsoft executive and ‘chief envisioning officer’ Dave Coplin set the scene by describing how the unprecedented volume of data generated by the digitisation of everyday life is changing the world. In particular, it is the driving force behind what is inaccurately termed ‘artificial intelligence’. Like humans, computers do not learn by memorising a rulebook but by absorbing millions of examples of how something is done – hence the sometimes spookily accurate predictions of internet search engines.
One such prediction caused an awkward moment at Microsoft, Coplin revealed, when the authorities noticed that one of the top auto-completes for ‘How do I join…’ was ‘Isis’. Of course no one had set out to make the search engine a terrorist recruiting sergeant; the ‘intelligence’ it was displaying was simply statistics-based pattern recognition.
But humans cannot absolve themselves of responsibility. A theme throughout the day was the need to recognise that the data we pick and the algorithms we design for our data-driven world are human constructs. And, as such, reflect – and potentially perpetuate – human biases. ‘Software reflects the values of its creators, data the values of its collectors,’ Silkie Carlo, senior advocacy officer at Liberty, warned.
Software prejudices
Coplin gave a simple example, inviting the audience to search the term ‘lawyer’ on the web. The vast majority of the images thrown up will be of men. Another danger is that of false correlations, such as the uncanny match between sea temperatures and incidents of piracy.
‘We have to take some accountability for what we teach the algorithm. We need to know who are the people who wrote these algorithms,’ Coplin said.
Yet this is emphatically not a reason to reject the data-driven world, but rather to embrace it. He urged law firms to ‘smother themselves’ with data, using it to analyse every aspect of how they work. To unlock this ‘data dividend’, firms must digitise their business so that every asset exists as digital data, he said. ‘You have to develop a data culture.’
Part of this culture is learning how to handle and process data effectively. Conference chair Peter Wright, Law Society council member and managing director of DigitalLawUK, reminded delegates that there are just 150 working days until the EU General Data Protection Regulation comes into force. One expert, Dessislava Vitcheva, data privacy manager at consultancy KPMG, painted an alarming picture of businesses only just realising the implications. ‘There is total panic,’ she said.
Vitcheva pointed to several remaining grey areas in how the regulation will be interpreted, for example where the new position of data protection officer should sit within an organisation and when a privacy impact assessment should be carried out. Meanwhile, the detail of what is expected to become the Data Protection Act 2018 has still to be thrashed out by parliament.
However, any hopes that the new regime will have a soft launch on 25 May 2018 were dashed by Karen Round, senior policy officer at the Information Commissioner’s Office. ‘Day 1 is day 1. It will be in force,’ she said, promising that new guidance from the EU Article 29 Working Party would be available by the new year. The good news is that businesses already following best practice on data protection will have little to fear. The new regime really is ‘evolution not revolution’, she stressed.
Data protection is a snail
Whether that culture has taken hold in business remains to be seen. Another panellist, Pavel Klimov, former general counsel at computer giant Unisys, said data protection is too often viewed as a compliance issue rather than a matter of good business practice. That needs to change, he said, prompting a laugh from the audience with the observation: ‘If you make ginger beer, the reason you don’t put a snail in the bottle is not because it’s a breach of regulations.’
Opening the conference, Law Society vice president Christina Blacklaws reminded attendees that the challenges created by big data are ‘not in the far distant future but here, today or tomorrow’.
Law Society research shows that the profession knows ‘there is a burning platform and we are on it’. Mercifully, though, the programme did not rehash tired debates about robots replacing lawyers. Coplin stressed that the future will not be humans versus machines but ‘humans with machines’.
And Blacklaws summed up: ‘There will always be room for the trusted legal adviser. We’re talking about augmentation rather than extinction.’