Safety through accidents December 1, 2010 at 12:45 pm
If you’re sailing icy seas you’d generally want to keep a watchful eye out for icebergs. Unless, of course, you’re in an allegedly unsinkable ship, in which case you’d probably opt for a spot of partying and an early snooze on the poop deck instead. The craft’s designers will likely not have bothered with wasteful luxury items like lifebelts, emergency flares or lifeboats either: what would be the point?
Thus belief in safety produces behaviour which, if the belief is incorrect, is highly dangerous. A small accident, not enough to sink the ship but enough to remind people that ships do sink, would reduce risk taking. In financial terms it means being alert to, and managing, the loss given the bad thing (the severity) as well as reducing the probability of the bad thing happening in the first place.
The Psy-fi blog suggests that
The real route to safer systems is to make sure that they’re not safe at all without human intervention: which is always true anyway, but oft-times needs positive reinforcement.
One example I saw recently of a mechanism for keeping people alert involved a procedure whereby banks had to contribute prices on a financial instrument. The common good was best served by good prices, but everyone saw only the average and so, at the margin, no one had an incentive to be accurate themselves. The answer was to put the outlier firms into a real trade based on their price: the lowest offers and the highest bids were actually executed, in small size. The resulting losses kept banks on their toes.
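The mechanics of that incentive can be made concrete with a toy sketch. Everything here is hypothetical (the function name, the trade size, the quotes): each bank submits a bid and an offer, the highest bid and lowest offer are executed against their submitters in small size, and the resulting profit or loss is marked against fair value. A bank that skews the average pays for it.

```python
SIZE = 10  # hypothetical small trade size, in lots

def settle_outliers(quotes, fair_value, size=SIZE):
    """Toy model of the outlier-execution rule.

    quotes: dict mapping bank -> (bid, offer).
    Returns dict mapping bank -> mark-to-market P&L of any forced trade.

    The bank posting the highest bid is forced to buy at its own bid;
    the bank posting the lowest offer is forced to sell at its own offer.
    Over-bidding or under-offering relative to fair value loses money.
    """
    pnl = {bank: 0.0 for bank in quotes}
    high_bidder = max(quotes, key=lambda b: quotes[b][0])
    low_offerer = min(quotes, key=lambda b: quotes[b][1])
    bid = quotes[high_bidder][0]
    offer = quotes[low_offerer][1]
    # Highest bidder buys at its own bid: profitable only if bid < fair value.
    pnl[high_bidder] += (fair_value - bid) * size
    # Lowest offerer sells at its own offer: profitable only if offer > fair value.
    pnl[low_offerer] += (offer - fair_value) * size
    return pnl

# Illustrative quotes: A and B are honest around a fair value of 100;
# C bids 102 to drag the average upward.
quotes = {
    "A": (99.0, 101.0),
    "B": (98.5, 101.5),
    "C": (102.0, 103.0),
}
pnl = settle_outliers(quotes, fair_value=100.0)
print(pnl)  # C's inflated bid is executed and loses money
```

In this toy example the skewing bank C is bought in at its own bid of 102 and loses 20, while the honest banks are at worst unharmed: exactly the small, regular sting that keeps contributors accurate.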
Well-designed stress tests can have the same effect, forcing people to look at the consequences of an event that is not foreseen by the risk system. The problem is to avoid the resulting set of controls itself coming to be seen as safe. In truth, good risk management is often best served by a large dose of paranoia about the performance of any man-made system.