Every time I see a news report of a plane crash, wherever it is in the world, my mind races forward to any flight I might have to take in the coming months and whether I might just drive or take the train instead. Am I paranoid?
If I am, then a lot of other people are too. In the year after 9/11, many Americans chose to drive rather than take domestic flights. Understandable, but unfortunately it wasn't such a great decision: an extra 1600 people died in road accidents that year as a result - that's six times the number who died in the hijacked aircraft.
We might think we know what we're doing when we take these kinds of decisions, but we generally don't. At times of risk, when we feel threatened or fearful, emotion overrides reason and we end up making rather poor choices. Can we do anything about that?
A lot of researchers think we can - that we can be helped to counteract our instinctive reaction to a risky situation and reason our way to a better choice (see our "How to keep your head in scary situations" feature).
What do they have in mind? One idea is to present statistics about, say, health risks, in ways that don't leave people cold - for example, put them in the context of a narrative. "Feel the numbers", as one expert puts it, so we can connect with them more easily when emotions are running high.
That's easier said than done, though, when you've grown up viewing numbers as abstract entities in a world of their own. Time to rethink the way we teach mathematics, then.
Another example of how people could be "helped" in their decisions about risks comes from some remarkable findings by a team at Yale Law School.
These show that the single most important factor in determining how we judge the risks of issues such as nuclear power, nanotechnology, vaccination and climate change is the degree to which we share the cultural world-view of the person giving us the information. If they have different values or political sympathies from ours, we are predisposed to reject their arguments, irrespective of what we thought previously. It's all about the messenger, in other words.
On the surface it sounds to me like a good idea to use insights like these in public policy to steer people towards wiser judgements.
This taps into something of a zeitgeist: both the Democratic presidential candidate Barack Obama and David Cameron, leader of the Conservative party in the UK, have recently sought the advice of two University of Chicago professors whose recent book, Nudge, describes ways governments might "nudge" people into doing things that are good for them, or for society. For example, introduce automatic enrolment in pension and organ donation schemes unless people opt out (inertia means most people stay in).
[You can give the Nudge authors your own suggestions here].
But this kind of "libertarian paternalism" is not to everyone's taste because it involves a degree of manipulation. It's one thing for a government to intervene to stop people harming others, quite another for it to intervene to stop them harming themselves, particularly if the nudging is undisclosed. Is that what government should be for?
It seems to me this is a debate we shall need to have, especially as we learn more about the way people respond to risks. Of course, there are some smart things societies could do without government intervention – such as demand that the media report shocking or traumatic news in a more balanced way. That includes us!
Psychologists have known for some time that we seriously overestimate our chances of dying in a knife attack or plane crash because extensive graphic media coverage makes it so much easier for us to bring such events to mind, a phenomenon known as the "availability heuristic".
On the other hand, we tend to underestimate our chances of getting diseases because these are usually only reported as statistics. Many researchers claim this bias is the root of much of our poor decision-making. As one of them puts it, we aren't rational enough to be exposed to the modern press.
Until any of these changes happen, the best thing is to stop watching the TV and reading the newspapers. Not the science magazines, though – where else would you get such evidence-based advice as to stop reading the newspapers?
And a final thought, for those who are uncomfortable with any attempt to "nudge" us into better decision-making, remember that in this regard we are already seriously handicapped. If anything, translating good science into nudges could help us choose more wisely than ever before.
Michael Bond, New Scientist consultant