LIKE THE NUCLEAR NAVY, HEALTHCARE ENTITIES HAVE A CRITICAL MISSION

It’s a grim thought but a shocking reality in American healthcare. In 2016, The BMJ (British Medical Journal) published a study by Johns Hopkins researchers estimating that roughly 250,000 deaths in the United States in 2013 were due to medical errors. According to the CDC, only heart disease and cancer surpass that number. I’ve heard others use the analogy that it’s equivalent to seven commercial airliners crashing every day with no survivors. When COVID deaths reached 200,000, we considered it a crisis. Shouldn’t we consider a toll of that magnitude from medical errors, repeated nearly every year, a crisis as well?

We truly have a crisis in the American healthcare system. We must, however, put it in perspective. I don’t believe these errors are caused by bad doctors and nurses; they are the product of systemic problems. I’d suggest many of those systemic problems have as their root cause a culture in which staff are viewed by leadership as objects rather than people, and decisions are made in a vacuum favoring the financial bottom line.

As a healthcare executive, I’ve battled these systemic problems many times. But you could just as easily find major systemic problems in manufacturing, the service industry, non-profit organizations, sports teams, or government entities. What is the big, alarming number in your industry? What keeps you up at night? What is the one thing about your industry you wish you could fix that would change lives? Most industries may not see harm at the rate of medical errors, but they all need a model and a method for changing culture: one that prevents accidents, financial loss, and service failures, and makes the experience more rewarding for everyone involved. A legendary culture can do just that.

I was a bit of a unicorn when I moved from the nuclear Navy to clinical safety in healthcare. People often compared my experience in the Navy to working in other industries, and I’d suggest three glaring differences. First, most industries punish mistakes. In the nuclear Navy, we celebrated the reporting of errors because catching them early prevented something far worse. Why stigmatize the person who made a mistake, or the leader on whose watch it happened? To err is human. It’s not a question of if an error will happen, but when. Second, trust suffers in most industries due to a lack of transparency. Would you be willing to report an issue if you thought you might get yourself or someone else in trouble? Would you report a bad leader who might cause harm? In my naval experience, we motivated the crew by rewarding transparency. Third, how many organizations champion productivity over safety? For many, safety is merely a cost of doing business. On a nuclear-powered submarine, safety drives the business.

When an error happens, and we know errors will happen, we simply cannot stick our heads in the sand, never discuss them, and move on. If only we embraced the model of the nuclear Navy, we could increase transparency and trust, and ultimately save lives.
