Look beyond the first order
On news headlines, the Toronto Pearson crash, and why it's critical to look at systems rather than people when big things go wrong
The airliner that flipped over at Toronto Pearson in February had warnings about a high rate of descent just a second or two before touchdown, according to the preliminary report of the Transportation Safety Board of Canada. This fact, as well as the damage to the right main landing gear, will undoubtedly feature centrally in the mainstream news coverage of the incident.
"The pilot landed the plane too hard" is a convenient and attractive summary of the situation, but the incident deserves a more thoughtful reading. Something can be both true and dangerously misleading if it overshadows facts that need to be examined in the daylight.
Fixating on "pilot error" -- whether or not that's actually the case -- is a hazardous way to operate, because it leads us to overlook the systemic factors that are invariably at play. Those are the details that matter most.
Whenever something goes wrong within a complex system, "the human at the controls" is almost always a contributing factor, but the system that produced the result is what deserves the most attention. That system includes training, guidance, rules, technology, and a whole range of other factors. What makes for an attention-grabbing headline can easily obscure the story that demands to be told.