The Illusion of Diversion Data: Why Confirmed Diversion Counts Misrepresent True Risk

By Adam Rosenberg

This blog post explores the critical shifts discussed by Lauren Forni and Muzzi Rizvi during their session at the 2026 Diversion Symposium. They argue that to truly secure our institutions, we must stop measuring “wins” (confirmed cases) and start measuring the rigor of our detection process. To watch the session recording, click here.

Moving Beyond the “Confirmation Bias”: A New Framework for Diversion Success

In the world of patient safety, we are surrounded by clear metrics: CLABSI rates, sepsis bundles, and medication error ratios. Yet, when it comes to controlled substance diversion, most health systems rely on a single, lagging indicator: Confirmed Diversion Cases.

Lauren Forni and Muzzi Rizvi from the Bluesight clinical strategy team argue that this metric is deeply flawed. Relying on confirmed cases creates a “Confirmation Bias” — it only measures the diversion you were skilled enough to catch, not the diversion that is actually occurring.

The Problem with Confirmed Cases

  1. Detection is not equal to Prevalence: If your confirmed cases go up, it usually means your program is getting better at finding diversion, not that the problem is getting worse.
  2. Lack of Benchmarks: There is no “Leapfrog score” for diversion. Low counts may lead to a false sense of security.
  3. The Funnel Problem: Every investigation is a funnel. If your triage criteria are too vague or your investigators lack a reproducible methodology, signals leak out at every stage, leaving you with a low (but inaccurate) confirmation count.

The Four Pillars of a Mature Diversion Program

To shift from a reactive to a proactive state, Lauren and Muzzi outlined four concrete pillars that build a foundation for “Continuous Intelligence.”

Pillar 1: Behavioral Baselines

You cannot identify “abnormal” if you don’t know what “normal” looks like. Mature programs establish tight peer-based comparisons at the unit, role, and shift level. Note: These baselines are only as good as your clinical discipline. If your staff has high variability in how they document waste or overrides, your baseline becomes “muddy.”
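As a rough illustration of what a peer-based baseline can look like (a minimal sketch, not Bluesight's actual methodology; all numbers and the flagging threshold are invented), a clinician's transaction count can be standardized against peers on the same unit, role, and shift:

```python
from statistics import mean, stdev

def peer_z_score(clinician_count: float, peer_counts: list[float]) -> float:
    """Standardize one clinician's transaction count against their
    peer group (same unit, role, and shift)."""
    mu, sigma = mean(peer_counts), stdev(peer_counts)
    if sigma == 0:
        return 0.0  # no variation in the peer group; nothing to flag
    return (clinician_count - mu) / sigma

# Invented example: 42 dispenses vs. peers on the same unit and shift
peers = [30, 28, 35, 31, 29, 33]
z = peer_z_score(42, peers)  # well above the peer baseline
```

A high z-score here is a triage signal, not a conclusion — and as noted above, variability in waste or override documentation will muddy the baseline itself.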

Pillar 2: Signal Lineage

Every investigation should be traceable to its originating signal (e.g., a dispense-to-admin mismatch or a peer group outlier). By tracking this lineage, you can identify which signals are truly predictive of risk and which ones are just “noise.”
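One way to make that lineage concrete (a hypothetical sketch; the signal names and record shape are invented for illustration, not any product's schema) is to store the originating signal on every case and compute a confirmation rate per signal:

```python
from dataclasses import dataclass
from collections import Counter

@dataclass
class Investigation:
    case_id: str
    originating_signal: str   # e.g. "dispense_admin_mismatch", "peer_outlier"
    outcome: str = "open"     # "confirmed", "unsubstantiated", or "open"

def signal_hit_rates(cases: list[Investigation]) -> dict[str, float]:
    """For each originating signal, what fraction of closed cases
    were confirmed? Low-rate signals are candidates for 'noise'."""
    totals, hits = Counter(), Counter()
    for c in cases:
        if c.outcome == "open":
            continue  # only closed cases inform the rate
        totals[c.originating_signal] += 1
        if c.outcome == "confirmed":
            hits[c.originating_signal] += 1
    return {s: hits[s] / totals[s] for s in totals}
```

Reviewing these per-signal rates periodically is what lets a program separate predictive signals from noise.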

Pillar 3: Rigorous Documentation

Documentation is where many programs lose their defensibility. Detailed documentation – especially for unsubstantiated cases – is vital. It proves that your program operated with rigor and good faith, and it captures systemic gaps that need to be closed regardless of the outcome.

Pillar 4: Continuous Feedback Loops

An investigation shouldn’t be an isolated event that ends in a closed file. Every case (confirmed or not) must feed back into your monitoring. Did a signal turn out to be a false positive? Adjust your thresholds. Did a case reveal a loophole in your cabinet override policy? Change the policy.
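In code form, the feedback loop can be as simple as nudging a signal threshold based on case outcomes (a toy rule; the step size and outcome labels are invented, purely illustrative):

```python
def adjust_threshold(threshold: float, outcome: str, step: float = 0.1) -> float:
    """Toy feedback rule: raise the threshold after a false positive
    (demand a stronger signal next time), lower it after a confirmed
    case (catch similar behavior earlier)."""
    if outcome == "false_positive":
        return threshold + step
    if outcome == "confirmed":
        return max(step, threshold - step)
    return threshold  # unsubstantiated or still open: leave unchanged
```

Policy changes (like closing a cabinet-override loophole) obviously can't be automated this way, but routing every closed case through even a simple rule like this keeps monitoring from going stale.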

Thinking Like an Investigator: The “Hypothesis” Move

Muzzi emphasized that a mature investigator doesn’t jump to conclusions. Instead, they normalize uncertainty and use Testable Hypotheses.

When a clinician is flagged as a peer-group outlier for opioid administration, a rookie assumes they are diverting. A pro holds two competing hypotheses in tension:

  1. Diversion Hypothesis: The clinician is diverting for personal use or sale.
  2. Non-Diversion Hypothesis: The clinician had a higher patient acuity this month or covered more shifts on a high-need unit.

By collecting evidence to support or rule out each hypothesis, you remove individual bias and ensure the investigation is grounded in data.
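As a toy example of testing the non-diversion hypothesis (the normalization formula and every number are invented for illustration), an outlier in raw administrations can disappear once you adjust for shifts worked and average patient acuity:

```python
def acuity_adjusted_rate(admins: int, shifts: int, avg_acuity: float) -> float:
    """Opioid administrations per shift, scaled by average patient
    acuity (a hypothetical normalization; units are illustrative)."""
    return admins / (shifts * avg_acuity)

# Raw counts look alarming: 120 administrations vs. a peer's 80.
# But the flagged clinician worked more shifts on a higher-acuity unit.
flagged = acuity_adjusted_rate(120, 12, 1.4)  # high-acuity month
peer    = acuity_adjusted_rate(80, 10, 1.0)
# Adjusted, the flagged clinician's rate is actually lower than the peer's
```

Evidence like this doesn't close the case by itself, but it shows how the non-diversion hypothesis gets a fair, data-driven test rather than being dismissed.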

Strategic Reframing for Leadership

To get the funding and support a diversion program needs, the conversation with hospital leadership must shift:

  • From Transparency as Exposure to Transparency as Protection: Opacity is a bigger regulatory risk than a well-documented, rigorous process.
  • From Punitive to Strategic: Don’t just ask “who did this?”; ask “what about our system made this possible?”
  • From Reactive to Proactive: Use behavioral baselines to find patterns before a major event occurs.

Self-Assessment: How Rigorous is Your Program?

As you head to your next committee meeting, ask yourself:

  • What is our Time to Detection (from signal to triage)?
  • What is our Escalation Ratio (signals that move to full investigation)?
  • Are we documenting our “misses” as thoroughly as our “wins”?
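The first two metrics are straightforward to compute from a signal log (a minimal sketch with invented timestamps and a made-up record layout; your triage data will differ):

```python
from datetime import datetime

# Invented log: (signal_time, triage_time, escalated_to_full_investigation)
signals = [
    (datetime(2026, 1, 3), datetime(2026, 1, 5), True),
    (datetime(2026, 1, 10), datetime(2026, 1, 11), False),
    (datetime(2026, 1, 15), datetime(2026, 1, 19), True),
]

# Time to Detection: average days from signal to triage
ttd = sum((triage - sig).days for sig, triage, _ in signals) / len(signals)

# Escalation Ratio: share of signals that became full investigations
escalation_ratio = sum(1 for *_, esc in signals if esc) / len(signals)
```

Trending these numbers over time says far more about program rigor than a raw count of confirmed cases.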

True success in a diversion program is not defined by having the fewest cases – it is defined by the confidence and rigor with which you prevent provider and patient harm. How would your current investigation process hold up if you had to defend an “unsubstantiated” case to a regulator tomorrow?