If I had to pick an unconventional member for the financial stability board, I would seriously consider an aircraft safety expert. Let me explain.
Civil engineers know about one kind of safety: the safety of bridges and suchlike. The crucial thing about a bridge for our purposes is that its elements don’t change their nature when you change the design. They might change their role – whether they are in compression or tension, say – but their physical properties are constant.
Aircraft safety adds an element that civil engineers don’t have to worry about (much) – people. People react to the situation they find themselves in. They learn. Importantly, they form theories about how the world works and act upon them. Thus aircraft accidents are often as much about aircrew misunderstanding what the plane is telling them as about mechanical failure. The system being studied reacts to ‘safety’ enhancements because the system includes people, and hence those enhancements may introduce new, hard-to-spot error modes.
The report into the Air France 447 crash is an interesting example of this. See the (terrifying) account in Popular Mechanics here. As they say in their introduction to the AF447 Black Box recordings:
AF447 passed into clouds associated with a large system of thunderstorms, its speed sensors became iced over, and the autopilot disengaged. In the ensuing confusion, the pilots lost control of the airplane because they reacted incorrectly to the loss of instrumentation and then seemed unable to comprehend the nature of the problems they had caused. Neither weather nor malfunction doomed AF447, nor a complex chain of error, but a simple but persistent mistake on the part of one of the pilots.
AF447 was, by the way, an Airbus A330, a plane packed to the ailerons with sophisticated safety systems. Not only did these fail to work; the plane crashed partly because of the way the pilots reacted to their presence.
Aircraft risk experts understand this kind of reflexive failure, whereby what went wrong wasn’t the plane or the pilot but rather a damaging series of behaviours caused by the pilot’s incomplete understanding of what the plane was and wasn’t doing. This is often exactly the type of behaviour that leads to financial disasters. Think, for instance, of Corzine’s incomplete understanding of the risk of the MF Global repo position.
Another thing aircraft safety can teach us is the importance of an open, honest post mortem. Despite the embarrassment caused, black box recordings are widely available, at least for civil air disasters. (The military is less forthcoming, although things often leak out eventually – see for instance here for a fascinating account of the Vincennes disaster.) In contrast, we still don’t have the FSA’s report on RBS, let alone a good account of what happened at, to pick a distressed bank more or less at random, Dexia. UBS is a beacon of clarity in an otherwise murky world.
It is hard to learn from mistakes if you don’t know what bad things happened and what the people who did them believed at the time. Finance, like air safety, is epistemic: to understand it, you have to know something about what people believe to be true, since that gives some insight into how they will behave in a crisis.
The more I think about this, the more I think risk managers in other disciplines have a lot to teach us financial risk folks.