Experts caution that using AI without legal training carries significant risks.

June 30, 2025, Worldwide: A growing number of litigants are turning to artificial intelligence (AI) tools to draft legal pleadings and memos in response to exorbitant legal fees. While these tools offer fast and affordable alternatives to traditional legal services, legal experts are raising concerns about their use by people with no legal knowledge.
The use of AI platforms, particularly generative tools driven by large language models, to research case law, draft legal arguments, and review contracts is on the rise. But in the courtroom, misapplied legal principles, outdated statutes, or inaccurate citations can have serious repercussions.
“AI can be a useful tool for trained professionals, but in the hands of untrained users it becomes risky,” said Professor Laila Ahmed, a legal technology specialist. “When it comes to interpreting the law, there is no substitute for legal education.”
Courts in the U.S., U.K., and Canada have recently issued guidance warning against relying solely on AI for legal submissions. Some jurisdictions have gone further, rejecting or sanctioning AI-generated filings over fabricated case citations or non-compliant formatting, underscoring the importance of human oversight.
Despite the warnings, many individuals—especially those representing themselves (pro se litigants)—see AI as a lifeline in navigating complex legal systems.
“I couldn’t afford a lawyer, but ChatGPT helped me understand what I was up against,” said Michael Ruiz, a small business owner in Texas. “Still, I knew enough not to file anything without checking it with a real attorney.”
Legal scholars are urging a regulated integration of AI into legal practice, calling for platforms to include disclaimers, accuracy ratings, and features that facilitate collaboration with licensed professionals.
As AI continues to evolve, experts emphasize a balanced approach: embracing its potential while maintaining ethical and legal safeguards to protect users—and the justice system—from unintended harm.