
Law Society Report: AI in the Justice System

13 June 2019

The Law Society has now published the final report of the investigation by its Technology and the Law Commission (the “Commission”) into the use of algorithms in the justice system. It follows a year-long exploration by the Commission of whether the use of algorithms within the justice system should be regulated to protect human rights and trust and, if so, how.

Algorithms and the justice system

The Commission was created to explore the role of, and concerns about, the use of algorithms in the justice system. It held a number of public evidence-gathering sessions and received written submissions from experts across technology, government, commerce and human rights, culminating in the final report.

As the report defines it, an algorithm is essentially a set of instructions for solving a computational problem. Algorithms have some advantages over humans that could prove invaluable for the legal world. They make decisions more quickly, process tasks without becoming tired or distracted, are better at recognising patterns in data and are not prone to simple human error. However, as the Commission has pointed out, the benefits of algorithmic systems must not be taken for granted. The technologies being deployed must therefore be critically assessed for potential unintended and undesirable side effects.

The Commission’s lines of enquiry included considering how artificial intelligence (“AI”) technology is currently used in the justice system, and its benefits and dangers. Given the breadth of the enquiry, it is surprising that the final report focuses on algorithms in the criminal justice system alone. Nonetheless, the work of the Commission raises interesting points for the civil justice system and commercial litigators.

How are algorithms already being used in commercial litigation?

There is plenty of scope for the use of data and algorithmic systems in commercial litigation:

  • A paper on technology and legal services released by the SRA last year identified a number of innovative solutions already in use, including a chatbot – an AI program that can simulate a conversation – that carries out the functions of a junior barrister’s clerk.  It can find out which barristers are available, conduct conflict checks and refer clients to solicitors;
  • Technology Assisted Review (TAR) for disclosure. TAR is a computer tool which predicts whether a document is relevant in a particular case. It learns from lawyers reviewing documents manually and becomes capable of predicting how a document will be classified, thereby speeding up the disclosure process (a simplified sketch of this approach appears after this list). Under the new disclosure pilot rules, the Disclosure Review Document specifically asks parties to consider the use of TAR, and where they decide against it, they must set out their reasoning; and
  • A burgeoning industry in using data, computer science and AI to assist law firms and funders in analysing litigation risk.  There are now databases of judgments sorted by court, judge, issue and outcome so that, in theory, parties can get a feel for which way their case may be decided based on their allocated judge’s past approach.  Machine learning can also play a part in assessing the chances of winning a case, predicting the other side’s strategy or valuing a settlement – all of which could potentially inform litigation strategy.

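To illustrate the kind of machine learning that sits behind TAR, the sketch below shows, in simplified Python using the open-source scikit-learn library, a classifier trained on a handful of hypothetical documents that lawyers have already reviewed, which is then used to score unreviewed documents. The document texts, labels and names are invented for illustration only; real TAR platforms are considerably more sophisticated.

    # A minimal sketch of a TAR-style relevance model (illustrative only).
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Documents a lawyer has already reviewed, with their relevance decisions.
    reviewed_docs = [
        "email discussing the disputed supply contract",
        "invoice for the goods at the centre of the claim",
        "office party invitation",
        "newsletter about unrelated staff matters",
    ]
    labels = [1, 1, 0, 0]  # 1 = relevant, 0 = not relevant

    # Train a simple text classifier on the manually reviewed sample.
    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(reviewed_docs, labels)

    # Score unreviewed documents so the likely relevant ones are prioritised.
    unreviewed = ["memo about contract delivery dates", "canteen menu for March"]
    for doc, prob in zip(unreviewed, model.predict_proba(unreviewed)[:, 1]):
        print(f"{prob:.2f}  {doc}")

In practice, such tools typically work iteratively: the highest-scoring documents are reviewed by lawyers, the model is retrained on their decisions, and the cycle repeats until few relevant documents remain unfound.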
Evidence gathered for commercial litigators to note

While all these technological advances certainly sound attractive, they are not without their flaws. The Commission set out to consider when the use of algorithms is appropriate, what kind of oversight is needed, and issues such as AI bias, trustworthiness in the system and the rights of those whose data is used.

For commercial litigators, there are some interesting points to consider arising out of the Commission’s evidence-gathering sessions.  In particular, concerns were raised about issues including:

  • Inbuilt bias in the datasets used to train algorithms. An algorithm is only as good as the data and instructions it is given, and if it is trained on data reflecting the biases of the humans behind it, it could recreate institutional discrimination (a simple illustration appears after this list).
  • Machines are not as good as humans at explaining the reasoning behind their decisions, raising issues of transparency and fairness. However, in low value cases where traditional paths to dispute resolution are uneconomic, parties may be willing to dispense with thorough explanations for the sake of a quick and cost-effective adjudication. Commercial barrister Matthew Lavy of 4 Pump Court chambers gave the example in his submission to the Commission of a civil dispute with a value of £20. “Explainability remains desirable,” he writes, “but faced with a choice of having a dispute adjudicated upon by an opaque algorithmic system or not having it adjudicated upon at all (which is the realistic alternative for a £20 dispute), the former option might perhaps be thought to be preferable.”
  • There are privacy concerns. While algorithms require data to learn how to make decisions, there are GDPR and client confidentiality considerations about feeding client data into an artificial intelligence system.

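As a very rough illustration of the first of these concerns, the hypothetical Python sketch below trains a classifier on past decisions that systematically disadvantage one group; the model then reproduces that pattern for two otherwise identical cases. The data, features and outcome are invented purely for illustration.

    # A minimal sketch of how bias in training data is reproduced (illustrative only).
    from sklearn.linear_model import LogisticRegression

    # Features: [score, group], where group is 0 (A) or 1 (B).
    # The historical outcomes favour group A regardless of score.
    X = [[0.9, 0], [0.4, 0], [0.9, 1], [0.4, 1]]
    y = [1, 1, 0, 0]  # biased past decisions used as training data

    model = LogisticRegression().fit(X, y)

    # Two cases with identical scores but different groups.
    print(model.predict([[0.7, 0], [0.7, 1]]))  # likely [1, 0]: the bias is reproduced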
The final report

Although the Commission focused its final report on the criminal justice system, many of the broader themes about legal safeguards around standards, best practice and transparency apply equally to the use of algorithms in commercial litigation.

As the report says, “governing algorithmic systems in criminal justice brings multi-dimensional tensions and value-laden choices to grapple with; the tensions are usually not between a ‘bad’ and a ‘good’ outcome, but between different values that are societally held to be of similar importance”. Arguably, that is also true of the civil justice system – the benefits of AI in efficiency and consistency must be weighed up against the risks of bias, opacity, oversimplification and a loss of autonomy. 

Whatever the arguments around AI’s role in the future of law, it is clear that while the role of algorithms in civil litigation is developing rapidly, legislation and regulation have not kept pace. The question is how long it will take to catch up.
