Never Saw It Coming, by Karen Cerulo, is a study of disaster preparedness, or rather its absence. Cerulo argues that the failure to prepare for disaster is not a matter of individual incompetence or fecklessness. Rather, she argues, it reflects a bias towards optimism that is deeply embedded in American culture.
In the abstract the argument seems convincing, and there is plenty of psychological evidence to support it. But I find myself disagreeing with a lot of the detailed argument. On the one hand, some disasters can’t be prepared for in any effective fashion, so it makes sense not to worry about them.
On the other hand, Cerulo cites as an example of successful preparedness the massive Y2K remediation effort undertaken in the United States. As I’ve pointed out on many occasions, other countries undertook no preparation and came out fine. Russia and Italy are notable examples: the US State Department issued a travel advisory for Italy, as did the UK authorities, and Australia actually evacuated its embassy in Moscow, leaving a skeleton staff to wait out the cataclysm. This isn’t being wise after the event. Once the 2000 fiscal year began with no serious incidents, it was obvious that, for anyone except nuclear reactor managers and the like, ‘fix on failure’ was the optimal response.
I’m not sure what to make of my disagreements on the details. Some disagreement is to be expected in any detailed argument. But the range of disagreement leads me to think that maybe handling low-probability catastrophic risk is something we are not very good at, sometimes preparing for non-existent risks and at other times failing to foresee obvious possibilities.
John,
“Fix on failure” may have been the optimum response, but characterising the pre-fixing of Y2K issues as an inappropriate reaction isn’t right either. A lot of us were fixing on failure a good deal before the date itself, although all of our time was booked to Y2K budgets (think futures contracts, airline tickets, or hundreds of other things booked into the future).
John, I think you might be a little bit overbroad in your characterisations of systems which were suitable for a fix-on-failure approach for Y2K.
Banks, government departments, and the like were at serious risk because so much of their code was old COBOL, whose conventions (notably two-digit year fields) made it particularly vulnerable, and that code is mission critical: if banks can’t transact, the economy goes haywire pretty quickly.
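To make the point concrete, here is a minimal sketch of the two-digit-year problem, in Python rather than COBOL; the function names, account dates and the ‘windowing’ fix shown are illustrative assumptions, not anything from this discussion:

```python
# A minimal sketch (Python, not COBOL) of the classic two-digit-year bug.
# Function names and dates are illustrative only.

def years_elapsed_naive(open_yy: int, current_yy: int) -> int:
    """Interval arithmetic on two-digit years, as in much legacy code."""
    return current_yy - open_yy

# An account opened in 1998 ('98'), evaluated in 1999 ('99') and in 2000 ('00'):
print(years_elapsed_naive(98, 99))  # 1   -- correct before the rollover
print(years_elapsed_naive(98, 0))   # -98 -- nonsense once '00' means 2000

def years_elapsed_windowed(open_yy: int, current_yy: int, pivot: int = 50) -> int:
    """One common remediation: 'windowing' two-digit years onto 1950-2049."""
    def expand(yy: int) -> int:
        return 1900 + yy if yy >= pivot else 2000 + yy
    return expand(current_yy) - expand(open_yy)

print(years_elapsed_windowed(98, 0))  # 2 -- correct after windowing
```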
Furthermore, I wonder how much of the lack of problems in countries which didn’t prepare was because of a vaccine effect: because most of the off-the-shelf software marketed around the world was audited for countries where Y2K attracted attention, the regular system updates in those countries probably fixed much of the potential problem anyway.
The point about Y2K is that sites that genuinely needed to update their software had been planning to do so since 1990. Mainly that was banks and airlines. It was just one of hundreds of engineering issues they manage, in the same way that airlines undertake scheduled preventative maintenance of their aircraft.
The enormous scare campaign about Y2K is another matter entirely, one in which even governments were enlisted to frighten medium-sized and small businesses into needless expenditure on expensive accounting firms, by suggesting that any managers or governments who failed to take preventative action would be held responsible after 2000.
The Y2K campaign was a disgrace.
Y2K was just a marketing campaign for HPAQ, IBM, Sun, Oracle and Computer Associates.
I’m kind of disappointed the rapture didn’t happen. I ended up breaking my leg later in the morning and spent my New Year’s in a hospital emergency room.
What I cannot figure out about Y2K is how there was a deathly silence after nothing happened. Nobody was blamed for the millions spent rectifying a problem that never occurred. And most of the so-called compliance testing could have been done in advance of Y2K by changing times and dates.
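That kind of advance testing amounts to feeding a rolled-forward date into the date-handling code rather than waiting for the real clock. A minimal sketch of what such a pre-2000 test could look like, using Python and a hypothetical billing routine (nothing here is from the original discussion):

```python
# A minimal sketch of date-rollover testing; the billing routine is hypothetical.
from datetime import date

def days_overdue(due: date, today: date) -> int:
    """Takes 'today' as a parameter instead of reading the system clock,
    so a test can simulate 1 January 2000 years before it arrives."""
    return (today - due).days

# Simulated century rollover, runnable on 1990s hardware with the real clock untouched:
assert days_overdue(date(1999, 12, 31), date(2000, 1, 1)) == 1
# 2000 is a leap year (divisible by 400), another case Y2K audits checked:
assert days_overdue(date(1999, 12, 1), date(2000, 3, 1)) == 91
```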
I wasn’t actually working during the lead-up to Y2K, but many of my friends were. Their employers were so desperate for staff that they hired uni students to Y2K-check code. By their accounts it was real work; they certainly weren’t sitting around billing people while drinking lattes.
I think people outside of IT have a pretty poor appreciation of how much custom-written code exists out there in large commercial organisations, and just how old most of it is. It’s perfectly normal for a medium-sized custom application to consist of 5,000 separate sub-programs. The application can easily have been in existence for decades, growing the whole time. Reworking just one of those sub-programs can be a significant job. The banking industry does millions of transactions every day almost flawlessly. They could not do that if their IT infrastructure were just slapped together.
The talk about planes crashing out of the sky was absurd. But the idea that financial institutions could have used a fix-on-failure approach on applications consisting of hundreds of thousands, or even millions, of lines of code is crazy. You can’t fix that much code in two weeks. And if you’re a bank that can’t do transactions for two weeks, whatever the reason, you’re broke.
JQ, the ‘law of unintended consequences’ is at work here, on top of the usual systemic failures that allow foreseeable consequences to arise. Aviation safety hit a brick wall several years ago, and some very respectable specialists came to the conclusion that the world safety record would never improve beyond a certain point. Why? Because as human beings we still make mistakes, no matter how good the systems, no matter how good the training, no matter how good the expertise. Add random chance into that and you’re on your way. The problem is that the larger and more complex systems become, and the more toxic the substances or the larger the operation, the more the probability of unintended consequences increases. I have found John Nash’s work on game theory helpful here.
“sometimes preparing for non-existent risks and at other times failing to foresee obvious possibilities.”
Politics? The failure to prepare NOLA for what must have been an ‘obvious possibility’ suggests that more than a technical analysis of risk is needed. Resources have to be expended to tackle risks. The private sector, as in Y2K, can make its own calculations and decide whether to act, but when it comes to public goods like levees the expenditure is politically determined.