Highlights
- Judges trained on restricted Microsoft Copilot version for case preparation.
- Asylum appeal backlog nearly doubles in one year.
- AI generates case outlines and summaries but cannot analyse evidence.
Artificial intelligence experts and barristers have raised serious concerns about immigration tribunals using AI chatbots to draft legal rulings.
They warn that current models are unreliable and require close monitoring for potential errors before being widely deployed across the justice system.
Despite these warnings, immigration tribunal judges have already been trained on a restricted version of Microsoft's chatbot Copilot and have official approval to use the technology for preparing hearings and writing decisions.
The deployment comes as the government faces a record backlog of 104,400 asylum appeals, nearly double the number from a year ago.
Justice secretary David Lammy announced last February that the judiciary was testing transcription in courts and tribunals, with some immigration and asylum chamber judges using it to formulate notes and write remarks.
Geoffrey Vos, the Master of the Rolls, suggested in February that people might prefer machines delivering justice in future as they would prove far quicker and cheaper than waiting for human judges.
Technology already deployed
Training materials show judges are encouraged to use AI to generate case outlines, an index of the parties' evidence bundles, and bundle summaries that organise cases and evidence and create event timelines.
The AI can produce lists of disputed issues between parties and use them to populate decision templates.
Lord Justice Dingemans, senior president of tribunals, explained in a training video how judges could use AI and its decision-making tree to generate summaries of findings on matters including case anonymity, background, witness statements and arguments.
He noted that all of this work arrives pre-done, meaning judges come to hearings better prepared and fully on top of the issues.
Judges must deliver decisions within two weeks of hearings. They are instructed not to use AI for analysis and remain solely responsible for their judgments.
However, they can use AI to review decisions against evidence and submission summaries.
The technology can comment on how fully decisions address matters raised in evidence and submissions, identifying any omissions.
HM Courts and Tribunals Service stated the tool focuses on transcribing decisions dictated by judges and was developed following the Ministry of Justice's responsible AI principles.
The AI does not contribute to analysing or balancing evidence or arguments presented.