The Bar Council has updated its guidance on generative AI, clarifying that AI use is now a mainstream professional conduct issue rather than an optional innovation project.
This week’s briefing looks at what the Bar Council note means in practice for supervision, confidentiality and verification, then scans the latest case law, firm experiments and longer reads, including the Law Society of Scotland’s relaunched Accredited Legal Technologist scheme.
Bar Council updates generative AI guidance: what it means for day to day practice
The Bar Council has updated its “Note on generative artificial intelligence”, responding directly to recent cases where lawyers have put fictitious authorities before the courts and to the rapid spread of tools like ChatGPT in legal work. The guidance recognises that barristers may use generative AI, but only where they remain fully responsible for the work product and comply with existing duties to the court and to clients. (Bar Council guidance)
The updated note highlights familiar but now formalised risks. These include hallucinated case law and misstatements of fact, the temptation to anthropomorphise systems as if they were junior colleagues, and the confidentiality and privilege risks of pasting sensitive content into public tools whose terms allow broad data use. It explicitly links misuse of AI to potential disciplinary action, negligence and breach of duty, rather than treating it as a purely technical issue. (Hogan Lovells' detailed summary offers further analysis.)
The guidance shifts expectations onto chambers and, by implication, firms and legal teams. Policies and training are now expected to ensure that any AI use is appropriately supervised. The message is that AI outputs should be treated like a first draft from an inexperienced pupil or trainee: useful and with potential, but only after the sources have been checked and the output tested.
Although addressed to the Bar, these themes track closely with what many firms and in-house teams are already doing or contemplating. It is this writer's humble view that these guidance points can be expected to be adopted throughout the industry and across regulators (as we saw from the Law Society last week). They could be seen as a defensible minimum for AI use in litigation and advisory work: clear policies, record keeping, no blind reliance on outputs, and no ungoverned use of public tools for live matters.
Takeaways
The takeaways are clear:
Clear policies
Record keeping
No blind reliance on output
No ungoverned use of public tools
This should be ingrained in all lawyers' minds when using AI.
On your radar
Mishcon uses AI chatbot instead of traditional trainee application forms: According to the Law Society Gazette, applicants for Mishcon de Reya’s 2026 graduate recruitment scheme will no longer complete a standard form. Instead, they will submit basic information, which an AI tool then uses to run a tailored, interview-style introductory conversation.
Why it matters for UK lawyers: This is a concrete example of AI moving into core HR processes in a City firm, raising familiar questions about fairness, bias, data protection and explainability that in house and private practice employment teams will increasingly be asked to address. (Law Society Gazette report)
ChatGPT turns 3 – adoption outpacing structural change: Artificial Lawyer’s reflection on three years of ChatGPT argues that, while individuals and legal tech vendors have changed their workflows dramatically, the core Big Law business model has barely shifted. New AI-first “NewMod” firms are emerging, but they still account for a small fraction of total market revenue.
Why it matters for UK lawyers: This is a useful corrective to the narrative that “everything has changed already”, and a reminder that clients, culture and pricing structures may move more slowly than the tools. (Article)
Law Society of Scotland relaunches Accredited Legal Technologist scheme: Artificial Lawyer reports that the Law Society of Scotland is relaunching its Accredited Legal Technologist scheme, first introduced in 2019. This recognises individuals with significant experience at the intersection of law and technology and maintains a public list of accredited technologists. The scheme requires evidence of legal tech experience, tools used, contribution to firm or organisational strategy, external thought leadership and adherence to ethical and professional standards. The scheme offers networking, conference access and a structured reaccreditation path. (Artificial Lawyer and Law Society of Scotland)
For Review
The era of general AI in legal work (Artificial Lawyer and LexisNexis)
Gabrielle Klopfer describes how LexisNexis’ “Protégé General AI” aims to move from simple search to context sensitive analysis that connects statutes, regulations, case law and commercial data, all tied back to verifiable sources. The emphasis is on curated content, traceable citations and human judgement, rather than opaque black box answers.
Ethics of AI in the workplace (Trowers and Hamlins)
This piece sets out a pragmatic approach for employers who are adopting AI while managing fears about job losses and bias. It suggests starting with an AI audit, involving staff in identifying use cases, and building an AI strategy that covers transparency, accountability, discrimination risk and the UK’s patchwork of regulation, including GDPR, the Equality Act and the forthcoming Employment Rights Bill. It also warns about “shadow AI” where employees use unsanctioned tools at work.
On the immortality of Microsoft Word (The Redline)
The Redline argues that Word and the docx format remain the backbone of legal work, and that legal tech which tries to bypass Word rather than integrate with it is likely to struggle. For firms planning their AI and document strategy, it is a reminder that whatever tools you deploy will need to sit comfortably alongside Word, court bundles and existing document standards.
Practice Prompt
Chronology Builder
Try the prompt below to build a chronology from documents. Fill in the context, constraints and other fields marked with {}.
Remember to adhere to the golden rules and do not upload confidential or privileged information to public tools.
Do not blindly rely on the output. Check it thoroughly against the source documents.
You are a UK solicitor. Goal: {goal}.
Context: {facts}.
Constraints: {constraints}.
Deliverable: {deliverable} (tone: {tone}, length: {length}).
Verify: cite sources; flag uncertainty; list assumptions.
Task: Build a neutral chronology from pasted email and note extracts.
Output:
1) A Markdown table with columns:
date (ISO) | author | recipient | summary | amount | document ref
2) A bullet list of missing documents, ambiguities and checks
3) Any immediate limitation or pre-action protocol considerations to watch.
Standards:
- Keep language neutral. Do not infer beyond the text.
- Normalise dates to YYYY-MM-DD.
- If amounts appear, include currency and whether inclusive or exclusive of VAT.
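If your team reuses this template across matters, the {} placeholders can be filled programmatically before the prompt is pasted into an approved tool. The sketch below is illustrative only: the field names mirror the placeholders in the template above, and the example values are invented, not drawn from any real matter.

```python
# Minimal sketch: filling the newsletter's prompt template with
# Python's str.format. All values below are illustrative placeholders.
TEMPLATE = """You are a UK solicitor. Goal: {goal}.
Context: {facts}.
Constraints: {constraints}.
Deliverable: {deliverable} (tone: {tone}, length: {length}).
Verify: cite sources; flag uncertainty; list assumptions."""

prompt = TEMPLATE.format(
    goal="build a neutral chronology from the pasted extracts",
    facts="anonymised email and attendance note extracts",
    constraints="no inference beyond the text; dates as YYYY-MM-DD",
    deliverable="a Markdown chronology table plus a list of gaps",
    tone="neutral",
    length="one page",
)
print(prompt)
```

Keeping the template in one place like this also makes it easier to version it alongside your AI policy, so everyone is working from the same wording.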
How did we do?
Hit reply to tell us what you would like covered in future issues, or to share any feedback. We read every email!
We would also love to hear how the prompt contained in this newsletter worked for you and what tweaks you applied to it.
UK Legal AI Brief
Disclaimer
Guidance and news only. Not legal advice. Test outputs and apply professional judgment.
