Judges in England and Wales now have access to large language model artificial intelligence software on their personal computers, the HM Courts and Tribunals Judiciary revealed today. The availability of a commercial Microsoft product is highlighted in an update to judicial guidance on the use of AI, first published at the end of 2023.
The seven-page guidance is a page longer than the original version, with a glossary expanded to cover terms including ‘hallucination’ and ‘AI agent’. Tips for spotting submissions produced by AI are also given.
'Provided these guidelines are appropriately followed, there is no reason why generative AI could not be a potentially useful secondary tool,' the note concludes. Potential uses include summarising large bodies of text, so long as care is taken to ensure accuracy, and helping to write presentations. However, judges are warned against using the tools for legal analysis: 'the current public AI chatbots do not produce convincing analysis or reasoning', the document states.
Judges are warned that 'AI tools are now being used to produce fake material, including text, images and video'. Indications that submissions may be AI-generated include:
- references to cases that do not sound familiar, or have unfamiliar citations (sometimes from the US),
- parties citing different bodies of case law in relation to the same legal issues,
- submissions that do not accord with your general understanding of the law in the area,
- submissions that use American spelling or refer to overseas cases, and
- content that (superficially at least) appears to be highly persuasive and well written, but on closer inspection contains obvious substantive errors.
The refreshed guidance repeats the warning: 'Do not enter any information into a public AI chatbot that is not already in the public domain… Any information that you input into a public AI chatbot should be seen as being published to all the world.' It also notes that information generated by AI 'will inevitably reflect errors and biases in its training data, perhaps mitigated by any alignment strategies that may operate'.
The guidance also highlights the availability, on all judicial devices, of Microsoft's private AI tool, Copilot Chat. 'As long as judicial office holders are logged into their eJudiciary accounts, the data they enter into Copilot remains secure and private,' it states.
An introduction signed by the lady chief justice, the master of the rolls, the senior president of tribunals and the deputy head of civil justice advises office holders 'to inform litigants that they are responsible for the AI-generated information they present to the court/tribunal, just as for any other type of evidence'.