The Civil Justice Council has proposed new procedural safeguards that would require legal documents submitted in civil litigation to disclose whether artificial intelligence was used in their preparation, a move the council says is intended to protect the integrity of evidence and preserve fairness in court proceedings. According to the council’s consultation, chaired by Lord Justice Birss, the measures would focus on pleadings, advocacy materials, witness statements and expert reports, with stakeholders invited to comment during an eight‑week consultation.
Among the options under consideration is an “enhanced statement of truth” for certain materials, and a requirement that witness statements used at trial include an explicit declaration that the content was produced without substantive assistance from AI. The working group has signalled particular concern about witness statements governed by Practice Direction 57AC in the Business and Property Courts, saying that their purpose, to record the witness’s own words and account, would be undermined if AI were used to draft or significantly rephrase evidence.
For expert evidence, the council proposes mandatory disclosure of any substantive use of AI beyond routine transcription or administrative tasks, and would require experts to identify the specific tools they relied on. The move reflects growing unease about experts using generative models without adequate verification: recent commentary and analysis point to instances where experts have inadvertently incorporated inaccurate AI outputs into reports. Industry guidance stresses that expert duties under Part 35 of the Civil Procedure Rules require the expert’s opinion to be independently reached and transparently presented.
By contrast, the working group judges that statements of case and skeleton arguments produced with the involvement of a named legal professional may not require additional formalities, although the consultation paper leaves open a simpler alternative: a clear, specific declaration where AI has been used in drafting advocacy materials. That reflects the balance the council is seeking between permitting technological assistance and ensuring that the origins and reliability of materials placed before the court are clear.
The proposals mirror steps taken overseas to curb the use of generative AI in evidential material. The Supreme Court of the Turks and Caicos Islands has issued a practice direction forbidding GenAI in affidavits and witness statements so that evidential materials remain rooted in personal recollection, while the Caribbean Court of Justice has announced similar restrictions alongside conditions for attorneys who use AI for non‑evidential drafting. Those jurisdictions highlight common concerns about accuracy, bias and the need for independent verification when AI is involved.
Consultation respondents and legal commentators have also warned of practical tensions: regulating AI use by litigants in person presents particular difficulties because of the technology’s potential to improve access to justice, even as unchecked reliance on models risks introducing error into proceedings. The council has framed these questions as part of a broader policy trade‑off between ensuring reliable, verifiable evidence and not imposing disproportionate burdens that might impede access to the courts.
The consultation is open until 14 April 2026, with the working group inviting submissions on the proposed rule changes and on whether Part 32 and Practice Direction 35 should be amended to require the suggested declarations and disclosures. The council says responses will inform any recommendations to reform the Civil Procedure Rules.
Source: Noah Wire Services