Why Do We Trust Pilots But Question Surgeons?
Farzad Najam, MD Clinical Professor of Surgery, George Washington University School of Medicine and Health Sciences; Founder & CEO, VRKure
When booking a flight, you do not research the pilot.
You don’t Google their name. You don’t ask a friend if they’ve heard good things. There are no word-of-mouth recommendations. You don’t even remember their name after they announce it on the intercom. Captain somebody. First Officer something-or-other.
No matter where you’re flying, whether it’s New York to LA, Dubai to Singapore, or São Paulo to London, pilot training standards are basically the same everywhere. You trust the system. You assume the crew is trained, tested, and has spent hundreds of hours in a simulator before flying a real plane.
Now think about what happens when you need surgery or medical care.
You Google. You read reviews. You ask your neighbor whose cousin had the same procedure. You check malpractice records. You go on Healthgrades. You look for the doctor with the best reputation, the most referrals, the shiniest hospital affiliation.
That’s not verification. That’s guesswork dressed up as due diligence.
The Uncomfortable Truth: Pilot vs. Surgeon Training
Aviation solved the competency problem decades ago. Healthcare still hasn’t.
Pilots must complete mandatory training every six months. They return to the simulator, not just to watch videos or take quizzes, but to actually fly. They practice handling emergencies, engine failures, and crosswind landings—situations they might never face in real life but still need to be ready for. One of my close friends, a commercial pilot and former F-16 fighter pilot with over 18,000 hours, still trains on a simulator every six months. No one questions it. It’s just what professionals do.
Simulator-based systems offer objective assessments and standardized requirements. Skills are measured and improved regularly. A pilot’s certificate does not mean their competence is taken for granted; it must be shown and maintained.
Surgeons? We train for years and then we pass our boards. We do CME credits every two years for state licensure. We watch lectures and click through online modules. We self-report our outcomes. The system trusts us to stay sharp on our own.
I’ve been a surgeon for 34 years and a cardiac surgeon for over 25 years. I’ve performed over 10,000 cardiac surgical procedures. I was Chief of Cardiac Surgery at a university hospital. I was named a Top Doctor by Washingtonian for 14 consecutive years.
And I can tell you: our system isn’t good enough.
Volume Is Not Proficiency
Volume matters. I’ve always believed that. Surgeons who do more cases tend to have better outcomes. There’s no substitute for repetition. Repetition forms muscle memory.
But just doing more cases didn’t keep my skills sharp. What helped was deliberate practice: reviewing cases, studying failures (both mine and others’), mentally rehearsing complex procedures, and keeping up with new techniques. I would go over a complex operation in my mind the night before, visualizing each step before actually performing it.
Our system doesn’t require any of that. We count procedures. We don’t measure proficiency. When a surgeon applies for hospital privileges, the requirements are based on case counts, not on objectively demonstrated competency in those procedures.
Think about what that means. A surgeon can be credentialed to perform a procedure they’ve only done a handful of times in two years. A nurse could be certified in a skill that has faded since they last used it. The system doesn’t know. The system doesn’t check.
Skill decay is real. Studies show clinical skills can degrade significantly in just a few months without practice. CPR proficiency, for example, drops by 50% within three to six months of training. Yet we certify people for two years at a time.
In aviation, that would be unthinkable. A pilot who hasn’t flown in six months doesn’t just walk back into the cockpit. They recertify. They demonstrate they can still do the job safely.
“Trust Me, I’m a Doctor”
For a century, healthcare has operated on a handshake agreement. Patients trust us because we have credentials and certificates. We have diplomas on the wall. We have letters after our names.
But credentials do not equate to competency. A diploma means someone passed an exam on a particular day years ago. It doesn’t tell you anything about their skills and competency today.
Every year, 250,000 Americans die from preventable medical errors. That’s like having three or four commercial plane crashes every day, or a 9/11 every two weeks. If aviation had numbers like that, no one would fly. There would be congressional hearings, public outrage, and executives called in front of cameras.
But healthcare’s failures happen quietly. One patient at a time. Behind closed doors. In ICU rooms and surgical suites, families grieve privately while the system moves on to the next case.
I’ve sat in morbidity and mortality conferences and peer review committee meetings for decades. I’ve reviewed cases where trained, certified professionals made errors that cost patients their lives. Not because they were negligent. Not because they didn’t care. Because the system never ascertained their skills and competency. You don’t know what you don’t know.
How Aviation Got Safe
Aviation didn’t get safe by accident. It built infrastructure that made safety inevitable.
After several major crashes in the 1970s and 1980s, the industry made a deliberate choice. It stopped assuming competence and started measuring it. It created standardized simulation training and introduced crew resource management. It built systems in which every pilot, whether they had 500 hours or 15,000, trained regularly to demonstrate their skills, mental sharpness, and judgment under pressure.
The result? Commercial aviation is now one of the safest industries in the world. In 2023, the fatal accident rate for commercial flights was about one crash per five million flights.
Healthcare has the same opportunity. The technology exists. Immersive simulation. AI-powered assessment. Biometric healthcare competency verification. We can measure whether someone actually has the skills they’re certified to have—not just whether they passed a test years ago.
The Path Forward
Healthcare doesn’t need more reputation. It needs more verification.
Imagine a world where clinical competency isn’t assumed—it’s demonstrated. Where skills are measured objectively, not self-reported. Where certification matters because it’s linked to proven proficiency, not just a certificate renewed every few years.
Imagine patients who don’t have to Google their surgeon because the system has already verified what they need to know.
That’s the change I’m working toward. It’s not because I think doctors and nurses are incompetent. I’ve worked with some of the most skilled clinicians in the world. But I’ve seen even skilled people fail when the system doesn’t support them, when training is only theoretical instead of hands-on, and when competency is assumed instead of checked.
Aviation figured this out fifty years ago. It’s time healthcare did the same.
We have the tools to save lives. The question is whether we have the courage to use them.