The availability heuristic, or availability bias, is a cognitive bias that leads people to judge probabilities on the basis of how easily examples come to mind.
The availability heuristic, sometimes known as the availability bias, helps explain why people continue to buy lottery tickets.
The reason is that if people really understood their chances of winning the lottery, they would never buy a ticket.
Yet tickets are bought so frequently that well-run lotteries are, for the organisers, virtually a license to print money.
All lotteries exploit a simple heuristic, or mental shortcut, in the way the human mind works: the availability heuristic (that, and people’s desperation).
This is the tendency to judge probabilities on the basis of how easily examples come to mind.
Since lottery organisers heavily promote the jackpot winners, people are continually hearing about those who’ve won big. On the other hand they hear almost nothing about the vast majority of people who haven’t won a bean.
So people assume they are much more likely to win the lottery than they really are.
The UK lottery is promoted with the phrase “It could be you”, which, of course, is technically true. To tell the whole truth it should read: “It could be you, but it almost certainly won’t be.”
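To put a number on “almost certainly won’t be”, here is a quick back-of-the-envelope check, assuming the current UK Lotto format of matching 6 numbers drawn from 59 (a detail not given in this article, so treat the exact figure as illustrative):

```python
from math import comb

# Ways to choose 6 numbers from 59 -- each is an equally likely jackpot line.
jackpot_odds = comb(59, 6)
print(f"Chance of a jackpot: 1 in {jackpot_odds:,}")  # 1 in 45,057,474

# The availability problem: you hear about the one winner every week,
# but never about the tens of millions of losing tickets behind them.
```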
The term ‘availability heuristic’ was coined by the psychologists Amos Tversky and Daniel Kahneman (Kahneman later won the Nobel Prize).
They thought that the bias operates unconsciously, using the principle that “if you can think of it, it must be important.”
What happens is that when you try to make a decision, a number of related ideas and situations automatically come to mind.
These are likely to be biased in certain predictable ways.
For example, when deciding whether to buy a lottery ticket, you are likely to remember the lottery winner who won 100 million last week.
After remembering this, you are more likely to assume that the same will happen to you.
While the heuristic can be useful and effective for making quick decisions, it also leads us astray.
Advertisers frequently exploit the availability heuristic by making us believe that certain unlikely events are common in order to sell their wares (think of insurance and health products).
This availability heuristic affects all kinds of judgements we make that rely on memory. Here are a few more examples:
One of my favourite examples of the availability heuristic, though, is that college students often think that in a multiple-choice test, you should always stick with your first answer, rather than changing it. In fact this is wrong and I explain why here: Multiple-Choice Tests: Why Sticking With Your First Answer is (Probably) Wrong.
And the examples go on. When information is more obvious, vivid or easier to recall, it biases the way we assess the probability of that event.
The reason we have this heuristic or bias is that some of the time it is an effective way to make quick decisions. For example, memorable events—like getting food poisoning from a bad prawn curry—teach important lessons (about cheap restaurants).
Often, though, the availability heuristic serves to make us more nervous than need be. But there is a way to beat it without resorting to a statistics textbook, an actuary and a super-computer. The key is to specifically try and recall instances of the event that aren’t so memorable. For example, how many people do you know who have:
If you’re anything like me, you’ll discover this is an awful lot of people. I find these thoughts are a good recipe for a less stressful life.
The research was published in the journal Cognitive Psychology (Tversky & Kahneman, 1973).
The mind has a tendency to search for illusory correlations everywhere, whether they mean anything or not.
Correlations in statistics are when there appears to be a relationship between two variables.
As one thing changes, so does the other.
For example, as children age their shoe size gets larger, or the more I practice the better I get at tennis.
We say there is a correlation between age and foot size and a correlation between practice and prowess at tennis.
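If you’d like to see what such a relationship looks like as a number, here is a minimal sketch of computing a correlation coefficient; the age and shoe-size figures are invented purely for illustration:

```python
from statistics import correlation  # Python 3.10+

# Invented data for illustration: children's ages and (EU) shoe sizes.
ages = [4, 6, 8, 10, 12, 14]
shoe_sizes = [27, 30, 33, 35, 37, 38]

r = correlation(ages, shoe_sizes)  # Pearson's r, between -1 and +1
print(f"r = {r:.2f}")  # close to +1: as one variable rises, so does the other
```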
However, there are also many things that just happen to change together, but are not really related.
For example, when you turn the light on and there’s a power-cut, or when you stamp your foot and there’s a simultaneous clap of thunder.
For a single moment, you feel like you’ve got super-powers, but then you notice it is just an illusory correlation.
It demonstrates a common phrase in statistics: “correlation does not imply causation”.
However, our minds have a tendency to search out these types of correlations, not always with the best result.
To see how easily the mind jumps to the wrong conclusions, try virtually taking part in a little experiment.
Imagine that you are presented with information about two groups of people about which you know nothing.
Let’s call them the Azaleans and the Begonians.
For each group you are given a list of positive and negative behaviours.
A good one might be: an Azalean was seen helping an old lady across the road.
A bad one might be: a Begonian urinated in the street.
So, you read this list of good and bad behaviours about the Azaleans and Begonians and afterwards you make some judgements about them.
How often does each group perform good and bad behaviours, and what kind of people are they?
What you notice is that it’s the Begonians that seem dodgy.
They are the ones more often to be found shoving burgers into mailboxes and ringing doorbells and running away.
The Azaleans, in contrast, are a sounder bunch; certainly not blameless, but overall better people.
While you’re happy with the judgement, you’re in for a shock.
What’s revealed to you afterwards is that actually the ratio of good to bad behaviours listed for both the Azaleans and Begonians was exactly the same.
For the Azaleans, 18 positive behaviours were listed along with 8 negative.
For the Begonians it was 9 positive and 4 negative.
In reality, you just had less information about the Begonians.
What happened was that you built up an illusory correlation between bad behaviour and the Begonians; their bad behaviours weren’t proportionally more frequent, they just seemed that way.
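The arithmetic is easy to check: the proportion of good behaviour in the two lists was exactly the same.

```python
# Behaviour counts from the Hamilton and Gifford (1976) design described above.
azalean_good, azalean_bad = 18, 8    # the larger 'majority' group
begonian_good, begonian_bad = 9, 4   # the smaller 'minority' group

print(azalean_good / (azalean_good + azalean_bad))     # 0.6923...
print(begonian_good / (begonian_good + begonian_bad))  # 0.6923...
# Identical proportions of good behaviour -- only the amount of
# information about each group differs.
```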
When the experiment is over you find out that most other people had done exactly the same thing, concluding that the Begonians were worse people than the Azaleans.
This experimental method is actually from a classic study by Hamilton and Gifford (1976), which is all about how we perceive other people’s positive and negative traits.
In the experiment, people had different perceptions of the two groups, good for the majority and bad for the minority, purely because they had more information about the majority.
It’s not hard to see why this sort of process might contribute to the formation of prejudice in society at large.
Now, psychologists have not agreed on how to explain this and other types of illusory correlations.
One explanation is that people over-estimate the diagnostic power of infrequent events.
In other words: if there is only one Martian who lives in your street and he/she/it listens to skiffle music, then you tend to think that all Martians must like skiffle.
On the other hand if half the street is filled with law-abiding Martians, only a few of whom like skiffle, you’ll guess that it’s only a minority interest.
Others say that illusory correlations are down to how memory or learning works or just a function of incomplete information.
Whatever the explanation, we do see these illusory correlations everywhere.
Here’s an example of a much less subtle type of illusory correlation from the world of CEOs.
When you are deciding what to pay a CEO, what factors do you take into account?
I’m sure you can list a few but what about golfing ability?
Would you pay a CEO more because they were better at golf?
No?
One analysis has looked at the correlation between golfing ability and American CEO pay (Hogarth & Kolev, 2010).
It found that as CEOs’ golfing ability improved, their pay went up.
Non-golfers were, on average, the lowest paid of all.
And here’s the kicker: the better the CEOs were at golf, the worse their stocks performed.
So, in people’s minds being good at golf was associated with more pay, but in reality it was associated with worse performance!
The assumption is that there’s an illusory correlation going on here.
Somehow it’s assumed that because someone is good at golf, they must also be good at other stuff, like running a multinational corporation, and so they get paid more.
Sticking with the business theme, all sorts of illusory correlations exist in equity markets.
One sign that traders sometimes use to predict price movements is the ‘head-and-shoulders’ chart.
It’s when the stock’s price movement looks like a person’s head and shoulders: in other words, two smaller peaks with one big peak in between.
Although it’s considered a reliable signal, and is associated with increased trading, the head-and-shoulders shape on the chart doesn’t profitably predict price fluctuations (Bender & Simon, 2012).
It’s just another illusory correlation, the kind our meaning-hungry minds see everywhere.
Decision-making is hard, but psychological research can help us combat our mind’s inherent flaws.
Accurate decision-making — about investments, business, relationships and the rest — is very difficult.
Our minds are wonderful and fabulously complex, but not without flaws.
In many ways, while the mind is amazing, it is also a clumsy, cobbled together contraption with many predictable foibles.
One of the biggest challenges to decision-making is the fact that we effectively each have two people inside us.
One is a party animal.
He wants to get as much pleasure as he can right now, to eat, drink and be merry.
The other is the boring guy.
The kind who saves for a rainy day, eats healthily, never drinks too much and does the ‘right thing’.
We’ll call the first guy ‘Want’ and the second guy ‘Should’.
The mental battle between Want and Should has been going on since most of us can remember.
Here are some tips, drawn from psychological research, on the best ways to improve your decision-making skills in the face of these two sides of yourself.
One of the best ways to fox the Want guy is to carry out decision-making in advance.
When we make decisions in advance it’s Should that’s in charge.
Whatever area of life, whether it’s financial, dietary, work or any other, if you make the decision in advance, you’re likely to cut down on detrimental outcomes.
Studies find that when people choose things without comparing the options their Want guy easily gets out of control.
Without comparisons, it’s easier for the Want guy to justify the bad decision.
By comparing options, though, research finds that people are better able to make the choice that is in their long-term interests.
Remember that our brains are not good at evaluating evidence dispassionately.
Force yourself to generate alternatives.
Research has demonstrated the value of counter-factual thinking: thinking about the opposite helps us make better decisions.
Weighing up costs and benefits is common decision-making advice, but it is actually quite tricky to do.
Research shows that our minds prefer to consider either costs or benefits; taking both into account takes considerable effort.
One factor we often forget is the ‘opportunity cost’: when we do one thing, we can’t be doing something else.
When I watch TV the benefit might be relaxation and enjoyment, but the cost is that I can’t be reading that mind-improving book that’s been lying around for weeks.
Our memories are highly contextual so the background to any issue we consider has a huge impact on how we view it.
Politicians, advertisers and other influencers use framing extensively to persuade us of their point of view.
You can fight back by reframing the decision — think about the problem your own way, not the way others want you to.
When evaluating evidence for decision-making, remember that correlation does not equal causation.
To explain: there’s a clear correlation between foot size and being richer, owning your own house and having a better education.
On the other hand people with smaller feet are often still struggling with potty training.
Guessed it yet?
People with small feet are usually children, so of course they have less money, don’t own their own houses and haven’t been to school yet.
Correlation doesn’t equal causation.
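To see how a hidden third variable can manufacture a correlation like this, here is a minimal simulation; the model and all the numbers are invented for illustration, with age driving both foot size and wealth while the two have no direct link to each other:

```python
import random
from statistics import correlation  # Python 3.10+

random.seed(1)  # reproducible illustration

# Invented model: age drives both variables; foot size and wealth
# have no direct causal connection to each other.
ages = [random.uniform(2, 60) for _ in range(1_000)]
foot_cm = [15 + 0.6 * min(age, 18) + random.gauss(0, 1) for age in ages]
wealth = [2_000 * max(age - 18, 0) + random.gauss(0, 5_000) for age in ages]

# A strong positive correlation appears, created entirely by age.
print(f"r(foot size, wealth) = {correlation(foot_cm, wealth):.2f}")
```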
The best of decision-making intentions often break down in the face of vicious temptation.
People find it difficult to predict just how far off course their emotions can pull them (e.g. the projection bias).
Use any method you can to counter your impulsivity: cancel the credit card, join a Christmas Club, avoid the confectionery store.
It’s all about planning ahead.
One way of improving decision-making in the face of impulsivity is by using commitment devices.
We can stop ourselves acting on impulse by committing ourselves to a course of action that is in our long-term interests.
Commitment devices allow us to take the choice away from the Want guy.
Here are some methods people use to pre-commit to long-term interests:
Commitment devices are best when they are tailored to your own psychological preferences and circumstances.
For example, if you’re well-off then a year’s gym membership might not be enough commitment to make you exercise.
Or, if you don’t care about eating six small packets of a ‘bad’ food, one after the other, then this technique won’t work either.
You’ll have to discover what type of commitment device works for your own personal decision-making skills.
When decision-making, focus on concrete plans and goals.
Humans are better at concrete goals; abstract goals like ‘read more’ or ‘lose weight’ get lost in the mix.
Substitute these with: ‘read this book by next Tuesday’ and ‘don’t buy any junk food on the weekly shop’.
It is clearly better to carry out decision-making when relaxed and rested.
That is because feeling stressed changes the way people balance risk against reward, psychologists have found (Mather & Lighthall, 2012).
Surprisingly, being under stress makes people focus on the positive more than they would otherwise.
Professor Mara Mather, who co-authored the research, said:
“Stress seems to help people learn from positive feedback and impairs their learning from negative feedback.”
For example, imagine a person deciding whether to take a new job.
When under stress, they might give more weight to the benefits of a higher salary while ignoring the longer commute.
When not under stress, though, the negative aspects of the choice would matter more.
When we think someone will check up on us we make more cognitive effort, leading to better decision-making.
Even if no-one is checking up on you, imagine their reaction if they did: would you be proud of your decision?
All sorts of weird things start happening when we imagine the choice we are making right now as one in a series.
Often not good things.
We tell ourselves things like: “I’ll have that cake now, then I’ll eat healthily for the rest of the week”.
No.
No ‘ifs’ and ‘buts’ and no tortuous logic to get what we want.
Shut the Want guy down by making one-shot decisions.
Am I going to be good or bad, right here, right now?
When making decisions we are influenced by whatever thoughts and emotions are swirling around in our heads at that moment.
Help distance yourself by thinking about how this decision will affect you in the future.
Big decisions are always better made after a night’s sleep.
Again, it’s common advice but it can be surprisingly difficult to distance yourself.
It’s so easy for us to be swayed by vivid or personal stories when decision-making.
Remember that our minds are naturally fascinated and influenced by the sensational at the cost of the quotidian.
Look carefully at the information source – are you being manipulated?
Some decisions are more important than others.
Not all decisions warrant effortful deliberation: sometimes it’s better just to choose and be done with it.
The trick is knowing which is which – experience should provide strong clues.
The best time to make decisions is when you are not hungry.
Being hungry makes people think more short-term — not just about food, but across many areas, including money (Skrynka & Vincent, 2019).
Hungry people are willing to settle for smaller rewards that they can get sooner, ignoring the chance to make more money by waiting.
Hunger ruins people’s self-control, making them grasp for rewards they would otherwise be prepared to wait for.
Dr Benjamin Vincent, study co-author, said:
“People generally know that when they are hungry they shouldn’t really go food shopping because they are more likely to make choices that are either unhealthy or indulgent.
Our research suggests this could have an impact on other kinds of decisions as well.
Say you were going to speak with a pensions or mortgage advisor – doing so while hungry might make you care a bit more about immediate gratification at the expense of a potentially more rosy future.”
This sounds a little vacuous, but research does suggest that just reminding ourselves to think rationally could help us make better decisions.
Consciously trying to think rationally will also help activate all the other techniques described here.
Our memories being what they are, this is no bad thing.
This article is based on two sources:
Introspection is difficult for people, which means we know surprisingly little about our own personalities, attitudes and even self-esteem.
Introspection in psychology is the process of looking inwards to examine one’s own thoughts and emotions.
Introspection is supposed to bring us self-discovery through self-reflection.
Early psychologists such as Wilhelm Wundt thought that we could research human psychology through introspection.
Wundt trained people to observe their own internal states through introspection and how they reacted to different stimuli.
However, modern psychologists have discovered that the stories we weave about our mental processes are logically appealing, but fatally flawed more often than we’d like to think.
The gulf between how we think our minds work, as accessed through introspection, and how they actually work is sometimes so huge it’s laughable.
The same finding appears again and again in many other studies demonstrating our lack of insight into who we’re attracted to, how we solve problems, where our ideas come from and other areas.
One reason that introspection is so hard is that it turns out that the unconscious is mostly inaccessible (Wilson & Dunn, 2004).
This is quite a different view of the mind than Freud had.
He thought that through introspection you could rummage around and dig things up that would help you understand yourself.
Modern theorists, though, see large parts of the mind as being completely closed off from introspection.
You can’t peer in using introspection and see what’s going on; it’s like the proverbial black box.
The idea that large parts of our minds can’t be accessed is fine for basic processes like movement, seeing or hearing.
Generally, I’m not bothered to introspect about how I work out which muscles to contract to pedal my bicycle; neither do I want access to how I perceive sound.
Other types of introspection would be extremely interesting to know about.
Why do certain memories come back to me more strongly than others?
How extraverted am I really?
Why do I really vote this way rather than that?
Here are three examples of areas that demonstrate the difficulties of introspection and why our self-knowledge is relatively low:
Perhaps personality is one of the best examples of how difficult introspection is.
You’d be pretty sure that you could describe your personality to someone else, right?
You know how extroverted you are, how conscientious, how optimistic?
Don’t be so sure.
When people’s personalities are measured implicitly, i.e. by seeing what they do, rather than what they say they do, the correlations are sometimes quite low (e.g. Asendorpf et al., 2002).
In other words, our lack of insight through introspection means we seem to know something about our own personalities, but not as much as we’d like to think.
Just like in personality, people’s conscious and unconscious attitudes also diverge, again making introspection difficult.
We sometimes lie about our attitudes to make ourselves look better, but this is more than that.
This difference between our conscious and unconscious attitudes occurs on subjects where we couldn’t possibly be trying to make ourselves look better (Wilson et al., 2000).
Rather, we seem to have unconscious attitudes that consciously we know little about (I’ve written about this previously in: Our secret attitude changes).
Once again we say we think one thing, but we act in a way that suggests we believe something different.
Perhaps this is the oddest one of all and a very real challenge to the idea that accurate introspection is possible.
Surely we know how high our own self-esteem is?
Well, psychologists have used sneaky methods of measuring self-esteem indirectly and then compared them with what we explicitly say.
They’ve found only very weak connections between the two (e.g. Spalding & Hardin, 1999).
Amazingly, some studies find no connection at all.
It seems almost unbelievable that, through introspection, we aren’t aware of how high our own self-esteem is, but there it is.
It’s another serious gap between what we think we know about ourselves and what we actually know.
So, what if we want to get more accurate information about ourselves without submitting to psychological testing?
How can we introspect more accurately?
It’s not easy because according to modern theories, there is no way to directly access large parts of the unconscious mind — accurate introspection is not an option.
The only way we can find out is indirectly, by trying to piece it together from various bits of evidence we do have access to.
As you can imagine, this is a very hit-and-miss affair, which is part of the reason we find introspection so difficult.
The result of trying to piece things together is often that we end up worse off than when we started.
Take the emotions.
Studies show that when people try to use introspection to analyse the reasons for their feelings, they end up feeling less satisfied after introspecting (Wilson et al., 1993).
Focusing too much on negative feelings can make them worse and reduce our ability to find solutions.
Perhaps the best way to introspect and gain self-knowledge is to carefully watch our own thoughts and behaviour.
Ultimately, what we do is not only how others judge us but also how we should judge ourselves.
Taking all this together, here are my rough-draft principles for living with an unknowable mind that resists introspection:
Left brain vs right brain dominance is a myth, but there is a small grain of truth in the idea.
Many people have heard the idea that our left-brains are logical, verbal, rational and scientific while our right brains are spatial, emotional, intuitive and creative.
There are online quizzes which purport to assess your left brain vs right brain dominance.
Those with right brain dominance, they say, are creative thinkers.
People with left brain dominance, though, are thought to be more logical and to have strong math skills.
In fact, this is all mostly a myth — we need both sides of our brains in order to function to our fullest potential.
However, like some of the mind-myths covered in this series, there’s a solid grain of truth here, but its extent has been wildly exaggerated.
The brain is divided into two halves, each of which is called a hemisphere.
The left brain and right brain are connected by a tract of nerve fibre called the corpus callosum.
Both the left brain and the right brain look very similar.
The idea of left brain or right brain dominance comes from an early finding that our verbal powers are concentrated in the left side of our brains.
It was Nobel Prize winner Roger W. Sperry who, in the 1960s, first suggested that the left brain or hemisphere is specialised for language (Corballis, 2007).
He was studying patients suffering from crippling epileptic fits who had decided to undergo surgery to try and relieve their symptoms.
The surgery cut the bundle of white matter – the corpus callosum – that connects the two hemispheres of the brain.
Along with successfully treating their epilepsy, these ‘split-brain’ patients exhibited some strange new symptoms.
Sperry found that, after the surgery, patients were unable to name objects using the now-disconnected right side of their brains.
Their left brain, however, seemed to have retained this ability.
This led him to propose that the left hemisphere, or left brain, is specialised for language.
But this specialisation didn’t mean the right hemisphere or right brain has no language powers at all.
Further experiments suggested that the right brain could indeed still process language, just to a lesser degree.
For example, patients were able to point to the written names of objects which were presented to their right brain, although they found themselves unable to say the word.
Not long after the left brain language discovery, researchers began to wonder about the right brain’s skills.
Sure enough, the right hemisphere seemed to perform better in some tasks, especially those related to attention.
This seems to correspond well with the myth; after all, right brains are spatial, emotional and creative, aren’t they?
Well, yes, but the actual differences found in these experiments are relatively small, especially when compared to the specialisation of the left brain in language.
In a classic paper published in the journal Neurology, renowned neuropsychologist Brenda Milner points out that while there are many measurable functional differences between the left brain and the right brain, there are actually many more similarities between the two hemispheres (Milner, 1971).
Perhaps the clearest evidence of this is from studies of brain damage.
To completely lose a particular mental faculty, a person normally needs to suffer damage to a particular area in both the left and right hemispheres.
Research continues apace into the functional differences between our right and left hemispheres.
But while findings about lateralisation continue to point out surprising new differences about our hemispheric twins, the overall message remains the same: apart from language these differences are generally small.
Even in language, to perform at our best, we need both left brain and right brain working together.
More recent research, though, has similarly suggested that there is no left brain vs right brain dominance.
The evidence for left brain or right brain dominance has always been very weak, but researchers have done much to debunk this idea by examining the functional Magnetic Resonance Imaging (fMRI) scans of over 1,000 people (Nielsen et al., 2013).
Each person was lying in the scanner thinking about nothing in particular for 5 to 10 minutes.
These resting brain states were then analysed for evidence of more activity in either the right or left sides of the brain.
The lead author, Jared Nielsen explained the results:
“…we just don’t see patterns where the whole left-brain network is more connected or the whole right-brain network is more connected in some people.
It may be that personality types have nothing to do with one hemisphere being more active, stronger, or more connected.”
None of this means that some people aren’t more creative while others are more analytical and logical, just that it’s not accurate to say that creative people are more ‘right-brained’.
It’s not their over-active right brain that’s making them more creative; it’s their whole brain.
This finding also does not contradict the idea that some of the brain’s functions are biased towards the left brain or right brain.
For example, language processing is biased more towards the left brain (in right-handed people), while attention is biased towards the right brain.
Although some functions are lateralised, then, one side isn’t dominant over the other in some people.
Despite having no solid basis in science, the expressions ‘left-brained’ and ‘right-brained’ will probably survive because it’s an easy way to talk about two aspects of personality.
But be aware that the expression is flawed: it’s far better to talk about people’s creativity or their analytical skills separately, rather than in opposition—especially since many people have plenty of both.
What the ultimatum game tells us about how people cooperate and how they cheat.
Sometimes games have a lot to teach us about human nature, other times they’re just games.
One game that some economists and psychologists claim has much to teach us is called the ultimatum game.
The game is very simple.
It’s played between two people who have to decide how to split an amount of money.
Let’s say it’s $100.
One of the two people is randomly chosen to make an offer to the other about how to split the money between them.
If the other person accepts this offer then they split it on that basis.
But, if the other person rejects it, neither of them gets anything.
That’s it.
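The rules are simple enough to write down in a few lines. Here is a sketch of the one-shot payoff logic; the 30 percent rejection threshold is my illustrative stand-in for the typical rejection point discussed below:

```python
def play_ultimatum(pot, offer, rejection_threshold=0.30):
    """One round of the ultimatum game.

    The proposer offers `offer` out of `pot`; if the responder accepts,
    the pot is split on that basis, otherwise both get nothing. The 0.30
    threshold is an illustrative stand-in for the typical rejection point.
    Returns (proposer_payoff, responder_payoff).
    """
    if offer / pot < rejection_threshold:
        return 0, 0              # rejected: the responder punishes the proposer
    return pot - offer, offer    # accepted: the money is split as offered

print(play_ultimatum(100, 40))   # (60, 40) -- a near-fair offer is accepted
print(play_ultimatum(100, 25))   # (0, 0)   -- a 'derisory' offer is refused
```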
The reason some economists and psychologists have got excited about it is because of how people behave when they play this game.
What you find is that most people make offers of splitting the cash somewhere between 40 percent and 50 percent.
Generally speaking, if an offer is made below about 30 percent, it will be rejected by the other person more often than not.
The ultimatum game has been pointed to as a way of showing that humans are economically irrational.
Why do people reject an offer of 25 percent of the total pot?
If the pot is $100 then they are choosing between getting $25 or nothing at all.
So why do they choose nothing at all?
The answer seems to be that people generally find offers below 30% to be insulting.
It’s insulting that the other person should suggest such a derisory sum, even when it’s free money.
So they prefer to have nothing and punish the other person’s greed.
And remember: in this case the other person is losing $75, whereas the rejecter is only losing $25.
To the economist what players in the simplest version of this game are forgetting is that it’s a one-shot deal.
It doesn’t matter if you aren’t fair, because the other person can’t get back at you.
All you need to do is work out the minimum offer that’s likely to be accepted.
So really what the ultimatum game is showing is that most people act fairly, or at least want others to see them acting in a fair way.
In addition, any unfair behaviour is punished by the recipient of the offer.
Only an economist would argue that this is evidence of human irrationality.
Acting fairly, or at least appearing to act fairly is a highly rational custom in a society in which we have to work together.
Cheats, as they say, do not prosper.
So, does the ultimatum game really tell us anything about human nature, or is it just further proof of how difficult it is to model human behaviour?
The optimist might say it tells us that people are mostly just and fair, or at least want to appear that way.
The pessimist, though, might say that people are being selfish because they have to make a judgement, consciously or not, about what offer will be accepted.
Remember that it is in the interests of the offeror to have his offer accepted or he won’t get any money at all.
What it certainly shows is how many psychological complexities can be drawn out of a very simple game like this.
When taking a multiple-choice test, is your first guess usually right or is it better to think again?
The standard advice for multiple-choice tests is: if in doubt, stick with your first answer.
College students believe it: about 75 percent agree that changing your first choice will lower your score overall (Kruger et al., 2005).
Instructors believe it as well: in one study 55 percent believed it would lower students’ scores while only 16 percent believed it would improve them.
And yet this is wrong.
One survey of 33 different studies conducted over 70 years found that, on average, people who change their answers do better than those who don’t (Benjamin et al., 1984).
In none of these studies did people get a lower score because they changed their minds.
Study after study shows that when you change your answer in a multiple-choice test, you are more likely to be changing it from wrong to right than right to wrong.
So actually sticking with your first answer is, on average, the wrong strategy.
Why do so many people (including many who should know better, like the authors of test-preparation guides) still say that you should stick with your first answer?
Kruger et al. (2005) argue that it’s partly because it feels more painful to get an answer wrong because you changed it than wrong because you didn’t change it.
So we tend to remember much more clearly the times when we changed from right to wrong.
And so when taking a test we anticipate the regret we will feel and convince ourselves that our first instinct is probably right (when it’s probably not).
Loss aversion is a psychological bias in which people prefer avoiding losses to acquiring equivalent gains.
The loss aversion bias in psychology is a finding from Nobel Prize-winning research that reveals the strange ways people make decisions in risky situations.
Here is a simple example of loss aversion: first, I give you $10 for free, and then ask whether you would bet that $10 on the flip of a coin if you stood to win $20.
So you’ve got a 50 percent chance of losing $10 and a 50 percent chance of winning $20.
This seems like a good bet to take and yet studies on loss aversion show that people tend not to take it.
The reason is loss aversion: people hate to lose more than they love to win.
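For comparison, here is what a dispassionate expected-value calculation says about that coin flip:

```python
# The coin-flip bet from above: 50% chance of +$20, 50% chance of -$10.
expected_value = 0.5 * 20 + 0.5 * (-10)
print(f"Expected value per bet: ${expected_value:+.2f}")  # +$5.00 -- a good bet
```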
Before Kahneman and Tversky (1979) published their ground-breaking research on loss aversion in psychology, risky decisions were usually analysed by thinking about the total wealth involved.
When you look at this bet in the context of your total wealth, it makes sense to gamble.
It’s obvious you’ve got more to gain than you have to lose.
So, why do people tend not to?
What Kahneman and Tversky suggested was that, in fact, people think about small gambles like this in terms of losses, gains and neutral outcomes.
It is actually the changes in wealth on which people base their decision-making calculations and it’s here that the loss aversion bias kicks in.
But that doesn’t completely explain why people don’t take the bet.
There’s a further piece to the puzzle.
It turns out that at low levels of risk, such as this coin flip situation, people are more averse to the loss of $10 than they are attracted by the chance of winning the $20.
Studies of loss aversion have shown that people actually need the chance of winning $30 before they’ll consider risking their own $10.
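One way to express this (my framing, not the article’s) is as a loss-aversion coefficient, often written λ: a 50/50 gamble only feels acceptable when the potential gain is at least λ times the potential loss. The $30-for-$10 figure above implies a λ of about 3, while published estimates are often in the 2 to 2.5 range:

```python
def feels_acceptable(gain, loss, lam=3.0):
    """Crude loss-aversion rule for a 50/50 gamble: accept only if the
    potential gain is at least `lam` times the potential loss.
    lam=3.0 matches the $30-for-$10 figure above; estimates vary."""
    return gain >= lam * loss

print(feels_acceptable(gain=20, loss=10))  # False -- the $20 bet is declined
print(feels_acceptable(gain=30, loss=10))  # True  -- now it feels worth it
```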
Just as people show illogical loss aversion in some circumstances, they also show risk-seeking behaviour in other circumstances.
Imagine you have to choose between these two options.
The first is that you have an 85 percent chance of losing $1,000 along with a 15 percent chance of losing nothing.
The second is a 100 percent chance of losing $800.
Not much of a choice, right!?
You’re between a rock and a hard place.
Still, sometimes we have to cut our losses.
According to the maths you should choose the sure loss of $800, but most people don’t.
Most people choose to gamble, simply because of loss aversion.
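Here is “the maths” in question, spelled out:

```python
# Option 1: gamble -- 85% chance of losing $1,000, 15% chance of losing nothing.
expected_gamble = 0.85 * (-1_000) + 0.15 * 0   # -$850 on average

# Option 2: the certain loss.
sure_loss = -800

print(expected_gamble, sure_loss)  # -850.0 -800: the sure loss is smaller
```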
So when the potential for loss is there, suddenly people prefer to take a risk.
They’ve become risk seekers, motivated by loss aversion.
Yet, when there’s the potential for gains, people display loss aversion.
This way of thinking about how people behave in risky situations, which Kahneman and Tversky called prospect theory, has a second major insight that follows on from the risk aversion and risk seeking described above.
What they realised was that people behaved in different ways depending on how the risky situation was presented.
Remember that if a risk is presented in terms of losses, people will be more risk seeking, and if it’s expressed in terms of gains, people will be more risk averse.
Their classic example involves this fictional situation:
“Imagine your country is preparing for the outbreak of a disease expected to kill 600 people.
If program A is adopted, exactly 200 people will be saved.
If program B is adopted there is a 1/3 probability that 600 people will be saved and a 2/3 probability that no people will be saved.”
Here, the risk is presented in terms of gains, so most people (72 percent) choose option A, the sure thing, even though the two programs save the same number of people on average.
Here’s the same problem but this time presented in terms of losses:
“Imagine your country is preparing for the outbreak of a disease expected to kill 600 people.
If program A is adopted, exactly 400 people will die.
If program B is adopted there is a 1/3 probability that no one will die and a 2/3 probability that 600 people will die.”
Now most people (78 percent) choose B because the problem is presented in terms of losses.
People suddenly prefer to take a risk.
In fact, if you look at both situations you’ll see that, mathematically, they’re identical, and yet people’s decisions are heavily influenced by the way the problem is framed, which is exactly where the loss aversion bias bites.
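It is worth verifying that claim: in both framings, each program leads to an expected 200 survivors.

```python
# Gain framing: expected number of people saved.
a_saved = 200
b_saved = (1 / 3) * 600 + (2 / 3) * 0              # 200.0

# Loss framing: expected survivors = 600 minus expected deaths.
a_survivors = 600 - 400                            # 200
b_survivors = 600 - ((1 / 3) * 0 + (2 / 3) * 600)  # 200.0

# All four figures are 200: identical programs, different wording.
print(a_saved, b_saved, a_survivors, b_survivors)
```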
This effect has been termed preference reversal.
After considering these sorts of problems for a few minutes, it’s easy to wonder what all of this abstract reasoning about loss aversion has to do with the real world.
Quite a lot, argue Kahneman and Tversky.
The Nobel Prize committee agreed, awarding the 2002 prize to Kahneman for his work on prospect theory, of which loss aversion forms a part.
Everyday life involves endless ‘gambles’ and betting examples are just one of the easiest ways to understand how humans make decisions in risky situations.
Certainly Kahneman and Tversky’s work on loss aversion has plenty to say about some of the apparently strange decisions people make in everyday life.
So, next time you’re agonising over a decision in terms of losses, try this simple trick.
Re-imagine the whole decision in terms of gains.
I can’t promise it will help you make your decision, but at least you’ll better understand Kahneman and Tversky’s insightful research on loss aversion.
Humans are not as rational as we would like to think.