Finding crucial answers requires open decision making



How does an open organization make decisions when stakeholders have contradictory priorities? And what if safety and human life are two of those priorities?

In such a scenario, it seems that maximizing safety would supersede any other agenda, but engineering has a long history of failures that show otherwise. With their emphasis on open communication and clear guidelines, open organizations can help ensure those responsible for decisions avoid such failures.

Making ethical decisions

In my undergraduate computer science program, there was an engineering ethics class that covered incidents where failures could have been avoided with proper care and attention: Bhopal, Therac-25, USS Vincennes—all scenarios where poorly tested or misconfigured systems led to the loss of human life.

One incident received special attention: the 1986 loss of the NASA Challenger space shuttle. The tragedy is well known: Prolonged exposure to low temperatures on the launch pad caused the failure of O-rings intended to seal joints in the solid rocket boosters. The result was the destruction of the craft and the loss of seven lives 73 seconds after launch. Some records show awareness of the risk as early as 1977, and the night before the launch, at least one engineer was unwilling to sign off on it. What makes this event especially noteworthy is the conflict between the multiple parties involved, each of whom was pursuing what they saw as their appropriate mission.

In my opinion, one point sent the entire event down its fatal course: when a senior manager asked another manager to “take off his engineering hat and put on his management hat.”

To be sure, nothing is inherently wrong with changing viewpoints when considering a problem. If you’re considering a new car, you might weigh criteria such as enjoyment, usability, cost, safety, and maintenance to make that decision. However, no matter which measure you ultimately select, the other factors don’t simply disappear. Moreover, if one criterion cannot be moved or replaced (say, one’s budget), then the other factors must be treated as secondary in the final decision. In the context of spaceflight, the safety of the crew is that immovable criterion.

The error of changing personas in the Challenger discussion was that the “management hat” not only lessened but frankly disregarded the safety concerns that the “engineering hat” kept in the foreground. That persona demanded evidence “beyond a shadow of a doubt that it was not safe to [launch]. This is in total reverse to what the position usually is” in such pre-flight meetings.

This demand from NASA management to engineers “to prove that we should not launch rather than… prove that we had enough data to launch” was impossible to meet: There had been successful shuttle launches before, but none in the external environmental conditions facing the Challenger, namely prolonged exposure to very low temperatures. The refusal of the “management hat” persona to consider that context is the hallmark of a closed organization unable to adjust to new information.

The problem persists

Unfortunately, this inversion of concern still occurs in engineering. The New York Times article “A Cheaper Airbag, and Takata’s Road to a Deadly Crisis” is a thorough exploration of the timeline and decisions leading to the “largest automotive safety recall in history.” The article relays a story that will sound familiar: Management waves off an engineer’s concerns about an unsafe process, and the result is the death of innocent people.

This particular recall concerns propellant that deteriorates over time, causing airbag inflators to explode and generate shrapnel that has killed more than a dozen people to date. As with the O-rings in the Challenger disaster, internal stakeholders clearly knew the risk. They even developed the test required for properly checking (and mitigating) it: “The tests involved inserting a small amount of helium gas into the inflaters [sic]. The inflaters were then put in a vacuum. If too much helium was detected outside the inflater, that meant the inflater had a leak, was defective and should be scrapped.”

However, inflators marked “defective” would be retested over and over again, depleting the helium until there wasn’t any left in the inflator to detect. At that point, they’d pass the test and be shipped to unaware automobile manufacturers. An “engineer said he questioned his Takata bosses in 2001 about manipulating the tests, but was told ‘not to come back to any more meetings.’ He left the company later that year.” In both this example and that of the Challenger, the mindset is the same: a stakeholder has a goal and will not budge, no matter the value or source of contradictory input.

Open decision making, ethical decision making

How do these decision-making processes compare with those typical of open organizations?

As Jim Whitehurst writes in The Open Organization, “the people who are closest to the issue, rather than those responsible for the overall direction of the organization or team, tend to make the decisions.” Here Whitehurst cites A Company of Citizens: “Merit means that decisions are based on the best case put forward; excellence, not position, prejudice, or privilege, is the criterion for choice.”

Responsibility in an open organization is a multifaceted effort, part of which is outlined above: Who makes the decisions, and how are those decisions made? A critical flaw in the Challenger tragedy was that those who were not “closest to the issue” were making the decisions. Likewise, in the Takata airbag scenario, management failed to recognize “the best case put forward” and instead used “position, prejudice [and] privilege” to intimidate those who brought contradictory points of view.

Simplifying these examples to merely “engineering versus management” does neither side any credit. Many successful companies encourage strong communication between the two groups. However, even in organizations with positive processes, inflexible agendas and negative tactics can still creep into decision-making dynamics. When that happens, it is important to remember the primary responsibility of leaders in open organizations: “to build and support meritocracy by making sure the right people are working together on the right things.”

Not every decision will have to weigh safety, cost, and quality against one another, and even those that do will likely not carry life-and-death consequences. However, keeping the practice of open responsibility at the forefront whenever possible will make those more monumental decisions easier when the time comes.


