Insights

24th December 2025

Artificial Intelligence Before the Courts: Limits of Use and the Lawyer’s Responsibility in Light of the Judgment Issued by the Abu Dhabi Global Market (ADGM) Courts.

In recent years, legal practice has undergone a qualitative transformation with the introduction of artificial intelligence tools in areas such as legal research, case law analysis, and the drafting of pleadings. While this transformation has contributed to accelerating workflows and improving efficiency, it has simultaneously raised fundamental questions concerning professional responsibility and the permissible limits of reliance on technology in a field that demands the highest standards of accuracy and integrity, particularly in proceedings before the courts.

In this context, the Abu Dhabi Global Market (ADGM) Courts issued a judgment that serves as a noteworthy example, as its reasoning reflects a careful balance between encouraging technological advancement and preserving the well-established professional foundations of legal practice.

The Court of First Instance considered a commercial dispute in which legal submissions were filed containing citations and references to judicial precedents and authorities. Upon examination, however, the Court discovered that some of these precedents either did not exist at all or, where a reference could be identified, differed materially in substance from the legal principles for which they had been cited. It became apparent that these references had been generated using AI-based research tools without proper verification or validation prior to being presented to the Court.

Despite uncovering this issue, the Court was careful to clarify a critically important point: the problem did not lie in the use of artificial intelligence per se. The Court did not adopt a negative stance toward modern technology in legal research; on the contrary, it implicitly acknowledged its role as a supportive tool in contemporary legal practice. The objection, however, concerned the uncritical reliance on the outputs of such tools and their submission to the Court as reliable legal authorities without adequate human review and verification.

The Court emphasized that AI tools, by their nature, may generate inaccurate or incomplete information and may, at times, even produce fictitious references or judgments that have no basis in legal reality. Accordingly, such tools must be used with professional caution and treated strictly as auxiliary resources that do not rise to the level of authoritative legal sources.

One of the most significant principles affirmed by the Court was that the duty to verify facts and legal authorities is a fundamental professional obligation that is neither debatable nor delegable. A lawyer remains personally and directly responsible for everything submitted to the Court and may not transfer this duty to any technological tool, regardless of its sophistication. The Court further stressed that human review is not an optional or merely formal step, but an essential obligation inherent in the duty of honesty and candor toward the Court, and a cornerstone of the integrity of judicial proceedings.

In light of the consequences arising from the submission of inaccurate references (including the waste of judicial time, the prolongation of proceedings, and the imposition of unjustified costs on opposing parties), the Court concluded that such conduct amounted to professional negligence. Accordingly, it resorted to an important procedural accountability mechanism by imposing what are known as "wasted costs," holding the lawyer or law firm personally liable for the costs incurred as a result of unprofessional conduct, rather than burdening the client with those expenses.

This approach conveys a clear judicial message: technological innovation does not provide a shield against accountability, and the use of artificial intelligence does not relieve lawyers of their obligation to adhere to the professional standards governing traditional legal practice. Indeed, reliance on advanced technological tools necessitates a higher level of care and scrutiny.

Through this approach, the Court reaffirmed several procedural principles of broad practical significance, which may be summarized as follows: the duty of verification cannot be delegated to technology; human review is mandatory, not optional; and misleading the Court, whether intentionally or through negligence, inevitably entails financial sanctions. In this case, the Court ordered the law firm to pay AED 282,508 as compensation, classifying it as wasted costs arising from procedural negligence.

These judicial practices carry important implications for the legal community in the United Arab Emirates. They confirm that the drive toward digital transformation and innovation does not conflict with preserving the profession’s core values, but rather must proceed in tandem with them. Artificial intelligence, regardless of its capabilities, remains an auxiliary tool that cannot replace human legal reasoning or absolve lawyers of their professional responsibility to conduct diligent research, seek the truth, and provide proper legal substantiation for claims through human intellect and effort.
