December 21, 2023 Practice Directions
1. Introduction and Scope
These guidelines apply to parties in proceedings before the DIFC Courts (“the Courts”) and are to be taken into consideration when using Large Language Models (“LLMs”) and Generative Content Generators (“GCGs”) during such proceedings.
The Courts understand that the use of LLMs and GCGs is increasingly commonplace across the legal industry and can significantly assist in the preparation and presentation of cases by saving time and costs. However, there are potential risks associated with the use of such technologies, which need to be taken into consideration by all parties that use them or are considering using them. Such risks include:
1. providing misleading or incorrect information/evidence to the Courts and other parties;
2. breaching client confidentiality;
3. infringing intellectual property rights; and
4. breaching relevant data protection legislation.
Parties should be particularly mindful of the above when drafting documents that are to be relied on in the course of proceedings. These include pleadings, witness statements and affidavits, and skeleton arguments. The Courts expect parties to be transparent (both with the Courts and with other parties) about what has been generated through the use of GCGs, and not to rely excessively on such technologies.
These guidelines outline the core principles that parties/practitioners should take into account and the best practices to help them comply with their legal and professional obligations when relying on AI-generated content. However, this note is to be treated as guidance only and parties/practitioners are expected to be aware of all relevant legislation and regulations applicable to them.
2. Citation
This Practical Guidance Note comes into effect on 21 December 2023. It may be cited as Practical Guidance Note No. 2 of 2023 (Guidelines on the use of AI in Court proceedings) and may be abbreviated as PGN 2/2023.
3. Principles
Transparency: All parties involved in a legal proceeding, and the Courts, should be made aware of any AI-generated content used during the process. This includes disclosing the use of AI systems, the source of the AI-generated content, and any potential limitations or biases associated with the AI system.
Accuracy and Reliability: AI-generated content should be verified for accuracy and reliability. Parties and practitioners must take steps to ensure that the AI-generated content is up-to-date, relevant, and based on accurate data. Before submitting AI-generated evidence in the Courts, parties and practitioners should evaluate its reliability, taking into account the AI's training data, algorithms, and potential for bias or inaccuracies. The Courts have the power to reject any content generated through AI systems under Rule 29.10 of the Rules of the DIFC Courts (as amended).
Professional and legal obligations: When using AI-generated content, practitioners should always be mindful of their obligations under the Mandatory Code of Conduct for Legal Practitioners in the DIFC Courts (DIFC Courts’ Order No. 4 of 2019) (“the Code of Conduct”) and parties should ensure they do not breach any relevant laws, including the Data Protection Law 2020 (DIFC Law No. 5 of 2020) and the Intellectual Property Law 2019 (DIFC Law No. 4 of 2019) (as amended).
Avoid over-reliance: Parties should not become overly reliant on GCGs and LLMs to produce their documents for proceedings. Such technology should only be used to assist parties in putting forward submissions, and not to replace the integral human decision-making that is required when preparing evidence or submissions to the Courts.
4. Best practices
Summary
Below is a list of what the Courts consider to be best practice when using AI-generated content. If followed, these practices should help to ensure that:
- parties and practitioners comply with their legal and professional obligations;
- proceedings are not unnecessarily delayed or complicated due to arguments over the use of AI-generated content; and
- parties and practitioners select the best technology to meet their needs.
Parties and practitioners should therefore, as far as possible:
1. verify the accuracy and reliability of AI-generated content;
2. disclose their intention to use AI at an early stage in proceedings;
3. select the most appropriate GCG for their purposes;
4. educate their clients; and
5. protect client confidentiality and comply with legal obligations.
1. Verify the accuracy and reliability of AI-generated content
Parties should not rely on AI-generated content without first verifying its accuracy. This should be done by using independent sources such as case law, statutes and credible legal commentary (for example, practitioners’ texts). Parties are expected to check thoroughly any AI-generated content to ensure that it is both accurate and relevant to the issues in the case, including any sources relied upon by the GCG. Only then should it be relied upon.
Where a document is verified by a statement of truth (Rules 22.1 to 22.9 (as amended) set out the documents to be verified by a statement of truth), the person who signs that statement of truth confirms that the contents are, to the best of his/her belief, true. Rule 29.137 states that:
Proceedings for contempt of court may be brought against a person if he makes, or causes to be made, a false statement verified by a statement of truth without an honest belief in its truth.
Therefore, those who verify a document with a statement of truth without first ensuring that its contents are true to the best of their belief face potentially very serious consequences.
Particular care should be taken in respect of the use of AI technology to produce witness statements or affidavits, either in whole or in part. Attention is drawn to Rules 29.2, 29.24, 29.25, 29.32:
29.2
Evidence at a hearing other than the trial should normally be given by witness statement. A witness statement is a written statement signed by a person which contains the evidence that person would have otherwise submitted orally.
29.24
The witness statement must, if practicable, be in the intended witness’s own words….
29.25
A witness statement must indicate:
(1) which of the statements in it are made from the witness’s own knowledge and which are matters of information or belief; and
(2) the source for any matters of information or belief.
29.32
A witness statement is the equivalent of the oral evidence which that witness would, if called, give in evidence….
Witness statements must therefore, if practicable, be in the intended witness’s own words (Rule 29.24) and contain the evidence that the person would have given if submitted orally (Rules 29.2 and 29.32). If a GCG is used to produce a witness statement, the party responsible for drafting it must ensure that it is still in the witness’s own words and make any necessary amendments. It should only be verified by a signed statement of truth once the witness believes that the contents are true.
2. Early disclosure of the use of AI
Parties should declare at the earliest possible opportunity if they have used or intend to use AI-generated content during any part of proceedings. Any issues or concerns expressed by either party in respect of the use of AI should be resolved no later than the Case Management Conference stage. Early disclosure of the use or intention to use AI gives all parties the opportunity to raise any concerns they might have or to provide their consent to such use. It also provides the Courts with the opportunity to provide any necessary case management orders on the reliance on AI-generated content during proceedings.
Parties should not wait until shortly before trial, or the trial itself, to declare that they intend to use AI-generated content. This is likely to lead to requests for adjournments and the loss of trial dates, which must be avoided. Where parties seek to use AI in the course of proceedings, they must ensure that such use is first discussed with the other side; where no agreement is reached, the request may be put before the Courts by way of a Part 23 application for determination.
3. Select the most appropriate GCG
There are numerous GCGs currently available, but not all will be suitable for producing legal content. Parties should ensure that they properly review the GCG they intend to use to understand whether it can complete the task required. Parties should avoid using free conversational GCGs (i.e. tools that provide answers to questions). These are often not fit for purpose as they will not necessarily have access to all relevant legal data (such as case law and statutes).
Parties and practitioners should use specialist GCGs designed to produce legal content and that will be able to assist with content relevant to the DIFC’s jurisdiction. The Courts encourage parties and practitioners to speak with, and seek assistance from, experts in the field to understand exactly which GCG will best be able to meet their needs. Parties should understand the limitations of the GCG they intend to use, including the limitations of any training data used to teach the AI model, the algorithms used and any potential biases that might be produced.
4. Education of clients
Practitioners should keep their clients properly informed and seek their consent for the use of GCG material in making submissions during the course of proceedings before the DIFC Courts. If a practitioner believes that a GCG could be used and that it might be beneficial to do so, he/she should discuss this with the client. Practitioners should discuss with the client the reasons why it might be preferable to use a GCG and the implications of doing so.
Importantly, practitioners should clearly explain to their clients any potential risks of using a GCG and the steps that should be taken to mitigate any such risks. This should involve discussing with the client the most appropriate GCG to use for the particular tasks.
Only after a client has been properly informed of his/her options and the risks associated with the use of AI in legal proceedings, should practitioners use a GCG to carry out tasks.
5. Protect client confidentiality and comply with relevant legislation
Practitioners should not disclose any of their client’s confidential information unless with the client’s consent or if required to do so by the law or an order of the court (Paragraph 19 of the Code of Conduct).
Accordingly, if confidential information needs to be provided to a GCG, practitioners should ensure they first have their client’s consent. Further, practitioners should speak with the provider of any potential GCG and read its terms of use to understand exactly how it will use any personal data, for what purposes, and whether it complies with the Data Protection Law (DIFC Law No. 5 of 2020).
Understanding a GCG’s terms of use, and speaking with its provider, is also important for a party to ensure that it does not infringe any copyright owned by the GCG provider and that it complies fully with the Intellectual Property Law (DIFC Law No. 4 of 2019).
5. Conclusion
The above is designed to assist parties in their considerations on whether to use a GCG with an LLM and, if so, the steps that they are encouraged to take to mitigate potential risks. The Courts appreciate that this is a constantly developing technology and urge parties to seek expert advice where appropriate, as well as ensure constant engagement with other parties to proceedings over the use of AI-generated content.