
Coverage

Business Reporter: Managing AI Hallucinations: From Crisis to Management

Publication: Business Reporter 

Date published: 2025-12-09

Author: Jan Van Hoecke, VP of AI Strategy, iManage

Type: Thought leadership / Opinion piece

In this article, Jan Van Hoecke of iManage explores the growing challenge of AI hallucinations, instances where large language models generate false but convincing information. He highlights several real-world cases, such as fabricated legal citations in U.S. court filings and a Deloitte report that invented expert references. By 2025, an international database had already tracked 402 hallucination cases in U.S. legal decisions, underscoring the scale of the issue.

The piece argues that as AI moves from pilot projects into enterprise-grade legal processes, hallucinations must be managed rather than ignored. Van Hoecke frames hallucinations as an inherent “feature” of LLMs, not a bug, and stresses the need for:

  1. Awareness and transparency about the risks.

  2. Governance frameworks to monitor and mitigate hallucinations.

  3. Human oversight to validate outputs in high-stakes contexts (a minimal sketch of such a review gate follows this list).

  4. Pragmatic adoption strategies that balance innovation with responsibility.
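The article stays at the level of principles and does not describe any implementation. Purely as an illustration of point 3, the sketch below shows one way a review gate might route model output to a human when the context is high-stakes or when cited sources cannot be verified. Everything here is hypothetical: `DraftOutput`, `VERIFIED_SOURCES`, and `needs_human_review` are invented names, and a real workflow would check citations against a document management system or authoritative database rather than a hard-coded set.

```python
from dataclasses import dataclass


@dataclass
class DraftOutput:
    """A model-generated draft plus the sources it claims to rely on."""
    text: str
    cited_sources: list[str]


# Hypothetical allow-list; in practice this lookup would query the firm's
# document management system or an authoritative citation database.
VERIFIED_SOURCES = {
    "Smith v. Jones, 2021",
    "Data Protection Act 2018",
}


def needs_human_review(draft: DraftOutput, high_stakes: bool) -> bool:
    """Flag a draft for human review if it is high-stakes or cites anything
    that cannot be matched against the verified-source list."""
    unverified = [s for s in draft.cited_sources if s not in VERIFIED_SOURCES]
    return high_stakes or bool(unverified)


if __name__ == "__main__":
    draft = DraftOutput(
        text="The court held ... (Smith v. Jones, 2021; Doe v. Acme, 2019)",
        cited_sources=["Smith v. Jones, 2021", "Doe v. Acme, 2019"],
    )
    # "Doe v. Acme, 2019" is not on the allow-list, so the draft is routed
    # to a reviewer rather than being accepted automatically.
    if needs_human_review(draft, high_stakes=True):
        print("Flagged for human review before use.")
    else:
        print("Auto-approved.")
```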

The article positions hallucination management as a critical step in ensuring AI can be trusted in professional environments, especially in law and other compliance-heavy industries.

Read the full article on Business Reporter