
Avoiding “Hallucinated” Case Law: How Evatt AI’s Osiris I Update Sets a New Standard

Jan 30, 2025

Ashley Edgar - Founder of Evatt AI


Try Evatt here https://www.evatt.ai/

The Rise of AI in the Legal Profession

Generative AI tools are transforming legal practice, automating tasks from document drafting to legal research. Yet the same technology that accelerates workflows can also mislead practitioners when its output is accepted uncritically. Large language models (LLMs) are trained on massive datasets and will readily fabricate case names or statutes when they lack the correct information. This phenomenon — known as hallucination — has become a serious concern for courts and regulators. The Federal Court of Australia has noted that these models can produce language “inconsistent with current legal doctrine and case law” and warned that unfaithful interpretations can lead to “harmful and inaccurate” advice.

Real‑World Consequences of Hallucinated Citations

Luck v Secretary, Services Australia

In Luck v Secretary, Services Australia [2025] FCAFC 26, an unrepresented applicant used AI to generate submissions and cited a case that did not exist. When the Full Court discovered the citation was fabricated, the judges redacted the name and citation to prevent the false information from spreading and cautioned against unverified AI content. The case attracted national attention and underscored the need for human oversight.

LJY v Occupational Therapy Board of Australia

Another troubling example is LJY v Occupational Therapy Board of Australia [2025] QCAT 96, where a tribunal received a submission that included a non‑existent case. The Deputy President noted that ChatGPT had been used both to draft the submission and to summarise the fictional case, leading the tribunal to question the reliability of AI research. The incident prompted the court to remind practitioners that AI cannot replace proper legal research or absolve lawyers of their duty to verify authorities.

Valu v Minister for Immigration and Multicultural Affairs (No 2)

In Valu v Minister for Immigration and Multicultural Affairs (No 2) [2025] FedCFamC2G 95, a practitioner relied on AI‑generated case summaries and quotes without checking them against official reports. When the court found that several cited cases and quotations were nonexistent, it referred the practitioner to the Legal Services Commissioner for investigation. The court emphasised that time pressures or reliance on vendor marketing cannot justify neglecting the duty to verify legal authorities.

These cases illustrate that hallucinations are not hypothetical; they can derail proceedings, waste judicial resources, and expose practitioners to disciplinary action.

Regulatory Guidance on AI Use

Legal regulators are responding. The Law Institute of Victoria’s guidelines warn lawyers not to enter confidential or privileged information into AI tools and to independently verify any AI‑generated output before relying on it. Courts across Australia have issued practice notes reminding practitioners that they are responsible for the accuracy of citations and must not mislead the court. Together, these directives reinforce a simple principle: AI can assist, but it cannot replace professional judgment.

Evatt AI and the Osiris Update: Eliminating Hallucinations

Evatt AI has built its reputation on empowering legal professionals with safe and reliable AI tools. The company’s latest Osiris update goes a step further by eliminating hallucinated case law. Here’s how:


  • Authoritative Cross‑Checking: When Evatt AI’s research module returns a case citation, the Osiris update automatically cross‑checks the reference against authorised law reports and online databases. If a citation doesn’t match a recognised case, the system flags it instantly, preventing fabricated authorities from entering your drafts.


  • Reference Verification Alerts: Osiris alerts users when AI outputs contain unverified information. Instead of blindly accepting every suggestion, lawyers receive prompts to investigate flagged citations. The system then provides links to the relevant case law or suggests alternative, verified cases.


  • Transparent Source Tracking: Each suggestion in Evatt AI is accompanied by a source trail, so lawyers can quickly see where the information originated and verify its authenticity. This transparency reduces the risk of inadvertently citing nonexistent cases.


  • Human‑in‑the‑Loop Design: Recognising that technology should augment — not replace — human expertise, Evatt AI requires a final review by a qualified lawyer before research outputs are finalised. This “human‑in‑the‑loop” approach ensures that professionals remain accountable while enjoying the efficiency gains of AI.
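To make the cross‑checking idea above concrete, here is a minimal sketch of how a citation verifier might work in principle: extract medium‑neutral citations from a draft and flag any that cannot be matched against a database of known authorities. The regex, the `KNOWN_CITATIONS` set, and the function name are illustrative stand‑ins — this is not Evatt AI’s actual implementation, which checks against authorised law reports and online databases.

```python
import re

# Simplified pattern for Australian medium-neutral citations,
# e.g. "Luck v Secretary, Services Australia [2025] FCAFC 26".
CITATION_PATTERN = re.compile(r"\[(\d{4})\]\s+([A-Z][A-Za-z0-9]*[A-Za-z])\s+(\d+)")

# Stand-in for a lookup against authorised law reports / databases.
KNOWN_CITATIONS = {
    ("2025", "FCAFC", "26"),
    ("2025", "QCAT", "96"),
    ("2025", "FedCFamC2G", "95"),
}

def flag_unverified_citations(draft: str) -> list[str]:
    """Return the citations in a draft that could not be verified."""
    flags = []
    for match in CITATION_PATTERN.finditer(draft):
        year, court, number = match.groups()
        if (year, court, number) not in KNOWN_CITATIONS:
            # Fabricated or unrecognised authority: surface it for review.
            flags.append(match.group(0))
    return flags

draft = (
    "See Luck v Secretary, Services Australia [2025] FCAFC 26 and "
    "the (fabricated) authority Smith v Jones [2024] HCA 999."
)
print(flag_unverified_citations(draft))  # ['[2024] HCA 999']
```

In a real system the lookup would query live databases rather than a static set, but the principle is the same: a citation that cannot be traced to a recognised source never reaches the draft unflagged.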

Why It Matters for Your Practice

By preventing hallucinated citations, Evatt AI doesn’t just protect practitioners from embarrassment or disciplinary action — it saves time and preserves trust. A misquoted case can prompt extensive back‑and‑forth with the court, damage client relationships, and undermine the credibility of the practitioner involved. Conversely, an AI platform that insists on accuracy can become a trusted partner.

The Osiris update also aligns with emerging regulatory expectations. Courts increasingly expect practitioners to demonstrate that they have critically evaluated AI output. Evatt AI’s verification alerts and source tracking provide an audit trail that lawyers can show to clients or regulators to confirm diligence.

Finally, eliminating hallucinations has a direct commercial benefit: it allows legal teams to invest more time in strategy and client service rather than fact‑checking AI outputs. For firms competing on quality and efficiency, that advantage can be decisive.

The Future of Law Is Reliable AI

Generative AI will undoubtedly play a growing role in the legal profession. The question is whether it will be used responsibly. The wave of cases in 2025 has shown that unverified AI research can lead to sanctions and wasted judicial resources. Regulators are watching, courts are reacting, and clients expect diligence.

Evatt AI’s Osiris update responds to this challenge. By embedding authoritative cross‑checks, verification alerts, transparent sources, and human oversight, Evatt AI ensures that your research is not just fast but reliable. “No hallucinations” isn’t a slogan — it’s the new standard for AI in law.

To learn more about the Osiris update and how Evatt AI can safeguard your practice while boosting efficiency, visit our website today. The future of law belongs to those who pair innovation with integrity.


Evatt AI 

Making Lawyers’ Lives Easier

https://www.evatt.ai/

