Regulators are confronting how far AI can go inside litigation workflows, just as firms and vendors ramp up their own AI programmes. This week’s lead story is the Law Society’s call for urgent SRA guidance on how the Mazur ruling applies to AI-driven decision making in live cases. Around it, we look at Linklaters’ new AI lawyers group, Clifford Chance’s AI-linked job cuts, Robin AI’s search for a rescue buyer, and the UK’s first court ruling on AI training data.

AI in Practice

Mazur, AI and the limits on delegated decision making

The Law Society has updated its guidance on Mazur again and, for the first time, squarely raised the AI question. The new text warns that using AI to make “key decisions” in a case, which would amount to conducting litigation if taken by an individual, is a “novel development” that the Legal Services Act 2007 did not contemplate. The Society says there is no certainty how a court would treat AI-based decision making and has asked the SRA to give clear guidance “as a matter of urgency” on what is and is not permissible.

The updated note also quietly tightens some of its earlier comfort. Initial wording that suggested non-authorised people could sign statements of truth has been removed. The guidance now focuses on drafting witness statements rather than signing them, and spends more time on the interaction between Mazur and existing Civil Procedure Rules provisions about who may sign on behalf of a party.

For litigators and in-house teams, this brings together two live issues. First, the reserved activity of “conduct of litigation” remains under scrutiny because of Mazur, particularly around what can be delegated to non-authorised staff. Second, many firms are experimenting with AI systems that recommend strategy, drafting choices and procedural steps.

The Law Society is essentially saying that, where those systems move from assistance into making key decisions, the work may edge into a reserved activity, and the legal position is not settled.

Until the SRA issues formal guidance (and the Court of Appeal has spoken on Mazur), the safest assumption is that an authorised litigator remains responsible for making and recording the key decisions in a case, even where AI tools are used heavily in the background. That should, perhaps, be obvious but it bears stressing.

Takeaways

Watch for SRA guidance on Mazur and AI, the progress of the Mazur appeal, and any further court comments on AI use in litigation, including in cases where AI-generated material has already caused problems.

Allowing AI tools to take over decision making without an authorised litigator's oversight is a slippery slope that may carry regulatory consequences.

On your radar

  • Linklaters launches specialist AI lawyers group: The Magic Circle firm has created a 20-strong "AI Lawyers" group, mixing practising lawyers and tech specialists who will sit inside practice groups to support client work. Why it matters for UK lawyers: This shows that large firms are continuing to formalise AI capability, and clients may increasingly expect structured, named AI programmes rather than ad hoc tinkering. (Artificial Lawyer)

  • Clifford Chance cuts London jobs citing AI: In another Magic Circle development, Clifford Chance is making around 50 business services roles in its London office redundant, citing the increased use of AI as a contributing factor. Why it matters for UK lawyers: This is one of the first times a major UK law firm has publicly linked job reductions to AI adoption, indicating the technology's impact is moving from theoretical to tangible workforce changes. (The Guardian)

  • Robin AI close to rescue buyer: Reporting suggests the UK-based legal AI vendor is "very close" to securing a rescue buyer after a period of uncertainty. Why it matters for UK lawyers: This is a reminder that the legal AI sector continues to consolidate and that firms need to build business continuity, data export and step-in rights into AI tool contracts given the current volatility of the market. (Artificial Lawyer)

  • Stability AI largely wins Getty training data case in UK court: In the Getty v Stability AI litigation, the High Court has held that training the Stable Diffusion model on Getty’s images did not in itself amount to copyright infringement, though some trade mark aspects remain unresolved. Why it matters for UK lawyers: The decision gives some comfort that model training does not automatically infringe copyright, but it does not remove IP risk around outputs or specific datasets. Firms still need careful clauses around training data and indemnities. (AP News)

  • Government signals shift on AI copyright: Technology Secretary Liz Kendall has stated she is "resetting the discussion" on AI and copyright, expressing sympathy for creators who want to be paid for their work. Why it matters for UK lawyers: This suggests a potential move away from the previous government's more tech-friendly stance and could lead to a new framework requiring AI developers to license training data, impacting advice on AI model development and use. (The Guardian)

For Review

Generative AI guidance for the Bar (Bar Council) Updated Bar Council guidance on using tools like ChatGPT, with clear warnings not to rely on AI for legal analysis and reminders that duties of independence and confidentiality apply. It is also a good reference point for solicitors when thinking about how counsel will expect briefs and drafts to be prepared where AI is involved. Read: Bar Council guidance page and PDF

State of AI in 2025 (McKinsey & Company) Although not focussed on the legal sector, McKinsey’s latest State of AI report finds that a very high share of organisations now say they use AI in at least one business function. Read: McKinsey’s State of AI in 2025

Detailed analysis: Getty v Stability AI ruling (Ropes & Gray) A detailed legal analysis of the UK’s first High Court ruling on generative AI and copyright. It is a useful primer for lawyers advising on intellectual property and AI risk. Read: Getty Image Loses Copyright Infringement Claim Against Stability AI (Ropes & Gray)

Practice Prompt

Select one active, contentious matter and spend 20 to 30 minutes identifying where AI or automation is used in that file.

Note which steps relate to information gathering or drafting support and which relate to decisions that could be seen as conducting litigation.

Use that map to check whether the authorised litigator’s role is clear at each decision point and whether you need to tighten supervision, record keeping or client messaging before expanding AI use further.

How did we do?

Hit reply and tell me what you would like covered in future issues or any feedback. We read every email!

UK Legal AI Brief

Disclaimer

Guidance and news only. Not legal advice. Test outputs and apply professional judgment.
