The Civil Justice Council has proposed that solicitors and barristers be required to confirm they did not use artificial intelligence to generate the content of witness statements intended for trial, while stopping short of mandating AI disclosure for most other court documents provided a lawyer accepts professional responsibility for them. According to the council's interim report and consultation paper, the measure aims to preserve the truthfulness and personal voice of witness testimony while allowing legal teams to continue using AI for research, drafting and administrative tasks where appropriate.

The working group, chaired by Sir Colin Birss, says large language models have already reshaped legal practice but carry well-documented risks, including hallucination and the embedding of training-data biases. It therefore recommends a targeted rule: trial witness statements prepared under the civil procedure rules should carry a declaration that AI was not used to create, embellish, rephrase or otherwise alter the witness's evidence, with an exception for non-text-generating aids such as transcription. The proposal forms part of a wider consultation on whether new rules are needed for pleadings, skeleton arguments and expert material.

For non-trial documents such as statements of case and skeleton arguments, the group argues that the existing obligation on a named legal representative to take professional responsibility for a document should be sufficient to address concerns about AI-assisted drafting. It notes that some jurisdictions already require AI declarations for such documents, but concludes that blanket declarations are unnecessary where responsibility is clear. The consultation invites views on where disclosure should be required and whether current professional duties provide adequate safeguards.

The paper also points to comparable international practice and recent judicial guidance overseas that restrict the use of generative AI in affidavits and witness statements, citing concerns that automated drafting risks diluting or embellishing a deponent's own account. The working group highlights the approach taken in other common law jurisdictions and suggests those precedents help explain why stricter limits are appropriate for evidence admitted at trial.

On expert evidence the council proposes that experts should confirm in their statement of truth that any AI assistance has been identified and explained, apart from administrative uses such as transcription. The working group says that transparency about analytical tools used in expert reports would help the court assess reliability without banning use outright. Industry commentary has already emphasised the need to balance innovation with safeguards so courts can evaluate the provenance and limits of algorithmic outputs.

The consultation makes no new proposals on disclosure more generally, observing that AI-assisted review and analytics are long established in disclosure practice and that parties appear to be cooperating over their use. It does flag the particular difficulties posed by litigants in person, recognising both the access-to-justice benefits of AI tools and the risk that unregulated use could introduce inaccurate or fictitious material into proceedings; the working group says regulation of that area falls beyond its current remit but merits further study.

If adopted, the council’s recommendations would leave courts and professional regulators to translate principles into practice, with practitioners likely to see new drafting checklists or prescribed declarations for trial witness statements and clearer expectations for expert reports. Templates and precedents already used by practitioners for witness and expert declarations may need updating to reflect any final rules emerging from the consultation.

Source Reference Map

Inspired by headline at: [1]

Sources by paragraph:

  • Paragraph 1: [2], [3]
  • Paragraph 2: [2]
  • Paragraph 3: [3]
  • Paragraph 4: [4]
  • Paragraph 5: [3]
  • Paragraph 6: [2]
  • Paragraph 7: [6]

Source: Noah Wire Services