Lessons to be learned from aviation I: dangerous behaviors

In this series’ introduction, I stated that corporate IT could benefit from the knowledge so hard-won by other industries, especially aviation. The dangerous conditions in which airmen and airwomen have to operate demand a combination of technical expertise and interpersonal skills to handle the complexities of normal operations. The situation is no different in IT, yet corporate IT seems to have severe difficulties when it comes to management innovations.

In this article, I will describe and comment on the so-called “hazardous behaviors” in aviation and show how they also manifest in the IT world. To developers and system administrators, they might not pose any physical safety risk, but they are potentially destructive because they predispose us to make costly errors precisely under the conditions in which we need to stay focused and rational.

The following text is based mostly on “Aircraft Safety” by Shari Stamford Krause, Ph.D. (1). The reader will find further references in the footer.

Anti-authority

Anti-authority people resent any external control over their actions and show a strong tendency to disregard rules and procedures. They consider themselves “above” the rest. Common practices and protocols of any kind usually represent “bureaucracy” to them. The attitude can also manifest itself as chronic lateness, ignoring co-workers, or flouting social norms; but in no case will the anti-authority type propose solutions or improvements for those allegedly faulty rules.

It is a fallacy to dismiss rules and procedures on the grounds that they do not solve 100% of the problems, or because they are not applicable to all cases. Denying improvements is an easy way out for people who cannot or will not commit to finding better solutions.

The main problem lies in the continuous repetition of avoidable blunders. Procedures are there for a reason, namely, to guide people through a known sequence of tasks. By ignoring them, anti-authority types disregard knowledge that was gained in the past at great expense.

Anti-authority is not the same as questioning authority. The precepts of assertive behavior tell us that we must question authority when the circumstances demand it.

Impulsivity

The Oxford dictionary defines impulsive as “acting or done without forethought.” It is a trait inherited from our ancestors, and its purpose is survival. In the wild, there is no time to apply the scientific method. When confronted with noises in the bushes or silent steps from behind, the individual who runs away without thinking has a better chance of survival. In a technological environment, however, impulsivity is an enormous burden that leads to endless mistakes. Its consequences can be devastating, both financially and motivationally. “Any action is better than no action” is often the rationale for such behavior.

The logic of acting “at any price” collapses under its own weight. “Any price” might include the very same assets that we are trying to protect: compromising the system’s stability, losing a day’s sales, or spending long hours in recovery tasks to repair the damage inflicted. Only the “right” action is better than “any” action.

Be aware that many departments place great value on “action,” and those who move fast are perceived as smart, diligent, and highly qualified. Frequently, the very promoters of such behavior have no system to record, observe, and analyze the consequences of the impulsive acts they encourage. They fall victim to their own questionable practices, but they get away with it because there is no evidence to prove the case.

Invulnerability

Invulnerability can be defined as the belief that one cannot be harmed or damaged. It applies to people, but by extension also to companies, systems, and tools. Invulnerability is, of course, an illusion. It manifests as a tendency to take unnecessary risks, minimize dangers, and ignore statistical data. “It won’t happen to me” is its classic slogan. “It never happened before,” together with the idea that issues are always trivial, leads to ignoring basic mitigation measures and well-known side effects. The illusion of invulnerability breaks when something goes wrong, but by then it is too late and all that remains is fire-fighting the adverse effects. The Dunning-Kruger effect could provide a secondary explanation for this attitude.

Macho

The “macho” is always trying to prove himself better than others, is overconfident, and tends to take on difficult tasks for the recognition they bring, without considering his own weaknesses and limitations. “I will show you” or “I’ll prove them wrong” are externalizations of this trait. Notice that it is not just about being overconfident; there must also be a tendency to act for the errors to happen.

Maybe it is time to use some other word to make clear that overconfidence affects men and women equally. “Macho” is a relic of the 20th century and a sad reminder of the deep cultural origins of this attitude.

The most common adverse results are failure to deliver due to a lack of qualifications, or the worsening of an already bad situation due to a biased analysis of it. This kind of person is vulnerable to influence and prone to accepting challenges no matter how much effort is required.

Many managers like to take advantage of such conduct. The idea that someone will work more and harder than the others “for free” sounds very seductive. The problem is that the risk of slips is so high that it can hardly be justified by the dubious output produced by workers with such motivations. In the Agile world, there is a term for these situations: “the invisible gun effect.”

Resignation

Resigned people will go along with the most unreasonable requests because they assume they have no control over circumstances. The motto of this passive type is “What do you want me to do?”. They will not question orders or confront anyone, regardless of how wrong or inadequate the others might be. “It is what it is” and “Let’s hope for the best” are common expressions predicting failure. Managers prefer “I don’t make the rules” or “I don’t like it either, but I can’t change it.” In any case, it is resignation itself that has devastating effects, because it closes the path to communication and growth. If the resigned one is the head of a department, the team has already reached a dead end.

Acceptance and resignation are two entirely different positions. From Michael J. Fox: “Acceptance doesn’t mean resignation; it means understanding that something is what it is and that there’s got to be a way through it.”

Afterword

It is crucial to repeat that we all have some inclination towards one or more of the behaviors described above, to different extents, and they usually overlap. But they are not the same as failures, and they do not inevitably lead to them, though the risk is very high.

I commented in the introduction that there is no escape from errors and that we must therefore learn to handle them. The same can be said about “dangerous” behaviors. There is a substantial emotional factor around them that we cannot suppress; it is in our nature. Given the appropriate circumstances, emotion will override our thought patterns, we will exhibit one or more “dangerous behaviors,” and the risk of making mistakes will rise dramatically, compromising our team and our project.

The key is not to act upon them. It is natural to experience any of the emotions described above, but it would be irrational to take action on their behalf.

Understanding these attitudes is the first step toward recognizing them when they arise. Once spotted, there are several ways to deal with them, of which I will discuss checklists and SOPs in the next articles of this series.

References

James, Michael. Is my boss on the Scrum Team? Posted on http://blogs.collab.net/, 2009.

Weinberg, Gerald. Why software gets in trouble. Weinberg and Weinberg, 2014.

(1) Stamford Krause, Shari. Aircraft Safety: Accident Investigations, Analyses & Applications, 2nd Edition. McGraw-Hill, 2003.

Beaty, David. The Naked Pilot: The Human Factor in Aircraft Accidents. Airlife, 2004.

Fox, John R. Digital work in an analog world. Studio City Media Endeavours, 2011.

Martin, Erica. Which Hazardous Attitude Do You Have? Posted on http://www.pea.com/, unknown date.

National Cyber Security Alliance. Small Business Owners Suffer from False Sense of Cyber Security. Posted on https://www.prnewswire.com/, 2011.
