The $2.4 Million Question: What Incompetency Actually Costs
By Farzad Najam, MD, FACS | Founder & CEO, VRKure
I have been a cardiac surgeon for more than twenty-five years. I have stood at the head of a table when everything was going wrong, and the room needed someone to lead. I know what it looks like when people are ready for that moment. And I know what it looks like when they are not.
What it looks like when they are not ready is chaos. People walking into a code who should know exactly what to do, struggling. Providers getting tense, getting angry, freezing — when the only thing the moment demands is the right decision, executed cleanly, without hesitation.
I have seen it more times than I should have. And every time, somewhere in the background, there is a laminated BLS card in someone’s badge holder, dated within the last two years, that says they were ready.
The card lies.
The Comfortable Fiction of the Two-Year Cycle
The American Heart Association sets a two-year recertification cycle for Basic Life Support. Hospitals follow it. Administrators budget for it. Providers comply with it. And the entire system proceeds on the assumption that compliance equals competency — that because a provider completed a course, they can perform when it counts.
This is a fiction that everyone in healthcare has quietly accepted.
The research is unambiguous. BLS skill decay begins within weeks of certification, not months. Compression depth, rate, recoil, ventilation — all of it degrades, invisibly, while the card in the badge holder says otherwise. By the time a provider is due for recertification, the skills that card is supposed to represent may have deteriorated to the point of being clinically dangerous.
We have known this for years. And we have responded by renewing the card.
When I raise this with colleagues or administrators, the response I hear most often is not disagreement. It is a kind of resigned shrug. Everyone seems content with the two-year cycle because it satisfies the AHA guideline, it satisfies the accreditation standard, and it is manageable. It is a compliance activity, not a competency activity. And in healthcare, compliance activities get funded. Competency activities get questioned.
The most common pushback I receive about BLSXR, our extended reality competency platform, is that it is not AHA certified. My response is simple: neither is competency.
Compliance Is Not the Same as Competency
This is the distinction that the entire conversation about healthcare training tends to avoid, because it is uncomfortable. Compliance asks: did the provider complete the required activity? Competency asks: can the provider perform under pressure, right now, to the standard that a patient’s life requires?
Those are not the same questions. And for too long, we have been answering the second question with data from the first.
BLSXR — Basic Life Support Extended Reality — costs less than traditional recertification when you account for instructor time, scheduling burden, and administrative overhead. But more importantly, this clinical competency verification platform delivers something the two-year card cannot: a verifiable, objective record of whether a provider can actually perform. Not whether they attended. Not whether they watched the video. Whether they can do the skill, in a realistic environment, under the kind of cognitive load that a real emergency produces.
That is not a training tool. That is a competency standard. There is a meaningful difference.
What Aviation Understood and Healthcare Still Has Not
Aviation is uncompromising about competency. Pilots do not just take a course and receive a credential. They train in simulators that replicate emergencies, equipment failures, and high-stress decision environments. They demonstrate performance before they fly. Recurrent simulation is not optional — it is the condition of holding a license.
The result is a safety record that healthcare cannot approach. Aviation’s fatal accident rate has fallen by more than 95% over the past 50 years. The simulator was central to that achievement — not as a nice-to-have, but as the mechanism by which the industry ensured that competency was real, not assumed.
Healthcare watched this happen. We cited the research. We wrote editorials arguing that healthcare should learn from aviation. And then we handed out another two-year card.
The reason, I think, is this: in aviation, the system owns competency. The airline, the regulator, the licensing authority — they are all accountable for whether a pilot can perform. In healthcare, competency has been outsourced to the individual. We give providers a card and tell them, implicitly, that staying competent is their personal responsibility. The system’s job ends at compliance.
This is not just philosophically wrong. It is structurally dangerous. Because when the code is called and the provider freezes, the system that issued the card does not bear accountability. The individual does. And so does the patient.
What the System Is Actually Costing
Healthcare is not just an expensive system. At its worst, it is a faulty one. People sense this. They ask around about the competency of a hospital, a surgeon, a team — not because they are cynical, but because they understand that the formal credential system does not answer the question they are actually asking, which is: can these people take care of me?
Hospitals have increasingly turned to Press Ganey scores and patient satisfaction surveys as proxies for quality. These have their place. But patient satisfaction is not the same as clinical competency, and a good bedside manner does not compensate for a provider who cannot run a resuscitation.
The financial exposure from competency gaps is rarely calculated honestly. The median wrongful death malpractice verdict in the United States now approaches $2.4 million, before legal fees. A single preventable resuscitation failure can generate additional acute care costs, regulatory scrutiny, and reputational damage that dwarfs any simulation budget ever proposed. And yet the simulation budget gets questioned. The two-year card does not.
The cost of deploying BLSXR, a VR medical training platform, across a clinical workforce is a rounding error against a single adverse event. This is not a technology argument. It is a straightforward financial one.
A Different Kind of Accountability
I founded VRKure because I spent more than twenty-five years watching a problem that had a known solution go unsolved. The solution is not complicated. It is what aviation did: make the system responsible for competency, not just compliance. Use medical simulation training not as a box to check but as a genuine mechanism to verify that providers can perform when it matters.
Immersive Medicine — using AI-powered extended reality to simulate, assess, and continuously verify clinical competency — is not a niche product for teaching hospitals with surplus training budgets. It is the logical next step for any healthcare system that takes seriously the gap between what its certifications claim and what its providers can actually do.
The two-year card will not go away overnight. The AHA guidelines exist, and hospitals will continue to follow them. But the question for every administrator, every CMO, every CNO reading this is not whether to comply with the standard. It is whether compliance is enough.
I have been at the bedside when it was not enough. I have seen what chaos looks like when a team that should have been ready was not.
The card said they were ready.
The patient deserved better than the card.