"The more you know, the more you realize how much you don't know. The less you know, the more you think you know." – David T. Freeman
In 2009, the FAA published "The Risk Management Handbook," designed to teach pilots about risk and how to manage it. The terminology, the pictorials, and everything associated with the early part of the discussion were intimately familiar to me. At that time I had been working as part of a space shuttle safety team for twelve years, so I had been in the middle of the events surrounding the Columbia accident and everything that came after it, including revised safety approaches to managing risk.
I’ve been a pilot since 1972 and today fly a light sport aircraft, a Flight Design CTSW. So far, I’ve been fortunate never to have had an aviation-related accident or been hurt (and obviously not killed) flying general aviation or military aircraft. I certainly have had my close calls. Every time I fly, I am taking what most people call “a calculated risk”. Am I engaged in risk management? Yes and no.
As the FAA discussion details, risk management involves recognizing a hazard and then determining the consequence of the hazard and the likelihood it will occur. The process behind that is represented by a graphic (Figure 17-4 in The Risk Management Handbook) that provides a scale for both categories, i.e., consequence (called “severity” on the graphic) and likelihood.
While pilots need to always think ahead about what risks a particular flight may entail (including the risks they themselves bring to it), I am not convinced we are able to judge the likelihood of any event particularly well, if at all. In fact, if a pilot isn’t both knowledgeable and perfectly honest, he may not even be able to judge the consequence realistically. Humans have this strange mental defect called “denial” that can and does affect the judgment of both consequence and likelihood. Where are we then?
Today, I work as a safety analyst on a team of engineers supporting the Johnson Space Center Flight Safety Office. Recently, I had the privilege of engaging in some training to examine our ability to gauge certain events. What we all learned is that most people, when performing qualitative judgments, are overconfident in their answers. Certainly, this was the case in the Columbia accident in which the judgments surrounding the consequence were absolutely off; in fact, we spent our time discussing the wrong problem. If rooms full of engineers can be that far off, how likely is it one aircrew can be counted on to be more accurate all the time?
If you go to Rod Machado’s website, you can find an essay where he also questions the FAA’s approach to risk management. He discusses “hazard avoidance”, i.e., simply avoiding hazards that pose a significant risk. I’m in agreement with this approach. It’s what I’ve done my whole life in aviation. For me, if I know there is a risk I don’t need to take…and especially one that I know can pose a significant risk of damage, injury, or death…I simply don’t take it. There are days I simply don’t fly not because I can’t, but because I choose not to.
The diagram shown in the figure I referred to earlier is known “in the business” as an “LxC” (pronounced “L by C”), which stands for “Likelihood by Consequence”. While both the likelihood and the consequence are often assigned qualitatively (i.e., best guess based on experience or analysis), the true value of the LxC shows when the likelihood is assigned quantitatively (i.e., by probability analysis) and the consequence is based on a detailed technical analysis (i.e., hazard report analysis, failure mode and effects analysis, etc.). This is the type of rigor often used by the shuttle program when using the tool; and when it isn’t, you can bet there are plenty of arguments about whether the result is accurate, making it of questionable value. The process was designed to allow one to gauge risk at the program level with all its resources.
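For readers who like to see the mechanics, an LxC matrix is essentially a lookup: each cell combines an ordinal likelihood rating with an ordinal severity rating. Here is a minimal sketch in Python; the category names, the multiplicative scoring, and the accept/mitigate/avoid thresholds are my own illustrative assumptions, not the FAA's or NASA's actual scales.

```python
# Illustrative LxC ("Likelihood by Consequence") matrix sketch.
# Category names and thresholds are assumed for illustration only.

LIKELIHOOD = ["improbable", "remote", "occasional", "probable"]    # low -> high
SEVERITY = ["negligible", "marginal", "critical", "catastrophic"]  # low -> high

def risk_score(likelihood: str, severity: str) -> int:
    """Combine the two qualitative ratings into one ordinal score (1-16)."""
    return (LIKELIHOOD.index(likelihood) + 1) * (SEVERITY.index(severity) + 1)

def assessment(likelihood: str, severity: str) -> str:
    """Bucket the score into accept / mitigate / avoid (thresholds assumed)."""
    score = risk_score(likelihood, severity)
    if score >= 9:
        return "avoid"      # high-risk cell: fly another day
    if score >= 4:
        return "mitigate"   # medium-risk cell: have a way out first
    return "accept"         # low-risk cell

print(assessment("remote", "marginal"))        # prints "mitigate"
print(assessment("probable", "catastrophic"))  # prints "avoid"
```

Note that the arithmetic here lends the matrix a false precision: the output is only as good as the two qualitative inputs, which is exactly the weakness discussed above.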
There is no way any of us can fly without taking risk and dealing with hazards. I look at every flight and ask myself what the hazards are to me and my aircraft and whether they can be avoided; and if they can be, I avoid them. For the ones that cannot be, I look at how I might mitigate each hazard if it does occur, i.e., what is my way out? It’s a huge “red flag” if I find one that has no escape and a significant consequence; I remind myself I don’t need me or my passenger to die or get hurt, nor do I need to damage my airplane, just to fly today.
One of the things about flying outside of combat or a rescue mission is there are very few times when flying another day or another way is not an option. Even in air combat, one of the things you learn is when to duck out of a fight and come back to fly another day. It is a mark of the professional…and a survivor.