Introspection is difficult for people, which means we know surprisingly little about our own personalities, attitudes and even self-esteem.
Introspection in psychology is the process of looking inwards to examine one’s own thoughts and emotions.
Introspection is supposed to bring us self-discovery through self-reflection.
Early psychologists such as Wilhelm Wundt thought that we could research human psychology through introspection.
Wundt trained people to observe their own internal states through introspection and how they reacted to different stimuli.
However, modern psychologists have discovered that the stories we weave about our mental processes are logically appealing, but fatally flawed more often than we’d like to think.
The gulf between how we think our minds work, as accessed through introspection, and how they actually work is sometimes so huge it’s laughable.
The same finding appears again and again in many other studies demonstrating our lack of insight into who we're attracted to, how we solve problems, where our ideas come from and other areas.
One reason that introspection is so hard is that it turns out that the unconscious is mostly inaccessible (Wilson & Dunn, 2004).
This is quite a different view of the mind than Freud had.
He thought that through introspection you could rummage around and dig things up that would help you understand yourself.
Modern theorists, though, see large parts of the mind as being completely closed off from introspection.
You can’t peer in using introspection and see what’s going on, it’s like the proverbial black box.
The idea that large parts of our minds can’t be accessed is fine for basic processes like movement, seeing or hearing.
Generally, I’m not bothered to introspect about how I work out which muscles to contract to pedal my bicycle; neither do I want access to how I perceive sound.
Other types of introspection would be extremely interesting to know about.
Why do certain memories come back to me more strongly than others?
How extraverted am I really?
Why do I really vote this way rather than that?
Here are three examples of areas that demonstrate the difficulties of introspection and why our self-knowledge is relatively low:
Perhaps personality is one of the best examples of how difficult introspection is.
You’d be pretty sure that you could describe your personality to someone else, right?
You know how extroverted you are, how conscientious, how optimistic?
Don’t be so sure.
When people’s personalities are measured implicitly, i.e. by seeing what they do, rather than what they say they do, the correlations are sometimes quite low (e.g. Asendorpf et al., 2002).
In other words, our lack of insight through introspection means we seem to know something about our own personalities, but not as much as we’d like to think.
Just like in personality, people’s conscious and unconscious attitudes also diverge, again making introspection difficult.
We sometimes lie about our attitudes to make ourselves look better, but this is more than that.
This difference between our conscious and unconscious attitudes occurs on subjects where we couldn’t possibly be trying to make ourselves look better (Wilson et al., 2000).
Rather, we seem to have unconscious attitudes that consciously we know little about (I’ve written about this previously in: Our secret attitude changes)
Once again we say we think one thing, but we act in a way that suggests we believe something different.
Perhaps this is the oddest one of all and a very real challenge to the idea that accurate introspection is possible.
Surely we know how high our own self-esteem is?
Well, psychologists have used sneaky methods of measuring self-esteem indirectly and then compared them with what we explicitly say.
They’ve found only very weak connections between the two (e.g. Spalding & Hardin, 1999).
Amazingly, some studies find no connection at all.
It seems almost unbelievable that, through introspection, we aren’t aware of how high our own self-esteem is, but there it is.
It’s another serious gap between what we think we know about ourselves and what we actually know.
So, what if we want to get more accurate information about ourselves without submitting to psychological testing?
How can we introspect more accurately?
It’s not easy because according to modern theories, there is no way to directly access large parts of the unconscious mind — accurate introspection is not an option.
The only way we can find out is indirectly, by trying to piece it together from various bits of evidence we do have access to.
As you can imagine, this is a very hit-and-miss affair, which is part of the reason we find introspection so difficult.
The result of trying to piece things together is often that we end up worse off than when we started.
Take the emotions.
Studies show that when people try to use introspection to analyse the reasons for their feelings, they end up feeling less satisfied after introspecting (Wilson et al., 1993).
Focusing too much on negative feelings can make them worse and reduce our ability to find solutions.
Perhaps the best way to introspect and gain self-knowledge is to carefully watch our own thoughts and behaviour.
Ultimately, what we do is not only how others judge us but also how we should judge ourselves.
Taking all this together, here are my rough-draft principles for living with an unknowable mind that resists introspection:
.
Left brain vs right brain dominance is a myth, but there is a small grain of truth in the idea.
Many people have heard the idea that our left-brains are logical, verbal, rational and scientific while our right brains are spatial, emotional, intuitive and creative.
There are online quizzes which purport to assess your left brain vs right brain dominance.
Those with right brain dominance, they say, are creative thinkers.
People with left brain dominance, though, are thought to be more logical and to have strong math skills.
In fact, this is all mostly a myth — we need both sides of our brains in order to function to our fullest potential.
However, like some of the mind-myths covered in this series, there’s a solid grain of truth here, but its extent has been wildly exaggerated.
The brain is divided into two halves, each of which is called a hemisphere.
The left brain and right brain are connected by a tract of nerve fibre called the corpus callosum.
Both the left brain and the right brain look very similar.
The idea of left brain or right brain dominance comes from an early finding that our verbal powers are concentrated in the left side of our brains.
It was Nobel Prize winner Roger W. Sperry who, in the 1960s, first suggested that the left brain or hemisphere is specialised for language (Corballis, 2007).
He was studying patients suffering from crippling epileptic fits who had decided to undergo surgery to try and relieve their symptoms.
The surgery cut the bundle of white matter – the corpus callosum – that connects the two hemispheres of the brain.
Along with successfully treating their epilepsy, these ‘split-brain’ patients exhibited some strange new symptoms.
Sperry found that after the surgery patients were unable to name objects with the, now disconnected, right side of their brains.
Their left brain, however, seemed to have retained this ability.
This led him to propose that the left hemisphere or left brain is specialised for language.
But this specialisation didn’t mean the right hemisphere or right brain has no language powers at all.
Further experiments suggested that the right brain could indeed still process language, just to a lesser degree.
For example, patients were able to point to the written names of objects which were presented to their right brain, although they found themselves unable to say the word.
Not long after the left brain language discovery, researchers began to wonder about the right brain’s skills.
Sure enough the right hemisphere seemed to perform better in some tasks, especially related to attention:
This seems to correspond well with the myth, after all right brains are spatial, emotional and creative, aren’t they?
Well, yes, but the actual differences found in these experiments are relatively small, especially when compared to the specialisation of the left brain in language.
In a classic paper published in the journal Neurology, renowned neuropsychologist Brenda Milner points out that while there are many measurable functional differences between the left brain and the right brain, there are actually many more similarities between the two hemispheres (Milner, 1971).
Perhaps the clearest evidence of this is from studies of brain damage.
To completely lose a particular mental faculty, a person normally needs to suffer damage to a particular area in both the left and right hemispheres.
Research continues apace into the functional differences between our right and left hemispheres.
But while findings about lateralisation continue to point out surprising new differences about our hemispheric twins, the overall message remains the same: apart from language these differences are generally small.
Even in language, to perform at our best, we need both left brain and right brain working together.
More recent research has also suggested that there is no left brain vs right brain dominance.
The evidence for left brain or right brain dominance has always been very weak, but researchers have done much to debunk this idea by examining the functional Magnetic Resonance Imaging (fMRI) scans of over 1,000 people (Nielsen et al., 2013).
Each person was lying in the scanner thinking about nothing in particular for 5 to 10 minutes.
These resting brain states were then analysed for evidence of more activity in either the right or left sides of the brain.
The lead author, Jared Nielsen explained the results:
“…we just don’t see patterns where the whole left-brain network is more connected or the whole right-brain network is more connected in some people.
It may be that personality types have nothing to do with one hemisphere being more active, stronger, or more connected.”
None of this means that some people aren’t more creative, while others more analytical and logical, just that it’s not accurate to say that creative people are more ‘right-brained’.
It’s not their over-active right brain that’s making them more creative; it’s their whole brain.
This finding also does not contradict the idea that some of the brain’s functions are biased towards the left brain or right brain.
For example, language processing is biased more towards the left brain (in right-handed people), while attention is biased towards the right brain.
Although some functions are lateralised, then, it isn't the case that one whole hemisphere is dominant in some people and the other in others.
Despite having no solid basis in science, the expressions ‘left-brained’ and ‘right-brained’ will probably survive because it’s an easy way to talk about two aspects of personality.
But be aware that the expression is flawed: it’s far better to talk about people’s creativity or their analytical skills separately, rather than in opposition—especially since many people have plenty of both.
.
What the ultimatum game tells us about how people cooperate and how they cheat.
Sometimes games have a lot to teach us about human nature, other times they’re just games.
One game that some economists and psychologists claim has much to teach us is called the ultimatum game.
The game is very simple.
It’s played between two people who have to decide how to split an amount of money.
Let’s say it’s $100.
One of the two people is randomly chosen to make an offer to the other about how to split the money between them.
If the other person accepts this offer then they split it on that basis.
But, if the other person rejects it, neither of them gets anything.
That’s it.
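The rules above can be sketched in a few lines of code. This is a minimal illustration of the game's payoff structure, not a model from any study; the 30 percent rejection threshold is just the rough figure discussed below.

```python
# A minimal sketch of one round of the ultimatum game.
# The offer and acceptance threshold are illustrative assumptions.

def play_round(pot, offer, acceptance_threshold):
    """Return (proposer_payoff, responder_payoff) for one round.

    offer: fraction of the pot offered to the responder.
    acceptance_threshold: minimum fraction the responder will accept.
    """
    if offer >= acceptance_threshold:
        responder_share = pot * offer
        return pot - responder_share, responder_share
    # Rejected offers leave both players with nothing.
    return 0.0, 0.0

# A 25% offer on a $100 pot, facing a responder who rejects below 30%:
print(play_round(100, 0.25, 0.30))  # -> (0.0, 0.0)
# A 40% offer is accepted and both players walk away with money:
print(play_round(100, 0.40, 0.30))  # -> (60.0, 40.0)
```

The rejected-offer branch is the whole puzzle: both players end up with nothing, which is exactly the outcome economists find hard to explain.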
The reason some economists and psychologists have got excited about it is because of how people behave when they play this game.
What you find is that most people make offers of splitting the cash somewhere between 40 percent and 50 percent.
Generally speaking if an offer is made below about 30 percent it will be rejected by the other person more often than not.
The ultimatum game has been pointed to as a way of showing that humans are economically irrational.
Why do people reject an offer of 25 percent of the total pot?
If the pot is $100 then they are choosing between getting $25 or nothing at all.
So why do they choose nothing at all?
The answer seems to be that people generally find offers below 30% to be insulting.
It’s insulting that the other person should suggest such a derisory sum, even when it’s free money.
So they prefer to have nothing and punish the other person’s greed.
And remember, in rejecting a $25 offer from a $100 pot, the other person loses $75 while the rejecter gives up only $25.
To the economist what players in the simplest version of this game are forgetting is that it’s a one-shot deal.
It doesn’t matter if you aren’t fair, because the other person can’t get back at you.
All you need to do is work out the minimum offer that’s likely to be accepted.
So really what the ultimatum game is showing is that most people act fairly, or at least want others to see them acting in a fair way.
In addition, any unfair behaviour is punished by the recipient of the offer.
Only an economist would argue that this is evidence of human irrationality.
Acting fairly, or at least appearing to act fairly is a highly rational custom in a society in which we have to work together.
Cheats, as they say, do not prosper.
So, does the ultimatum game really tell us anything about human nature, or is it just further proof of how difficult it is to model human behaviour?
The optimist might say it tells us that people are mostly just and fair–or at least want to appear that way.
The pessimist, though, might say that people are being selfish because they have to make a judgement, consciously or not, about what offer will be accepted.
Remember that it is in the interests of the offeror to have his offer accepted or he won’t get any money at all.
What it certainly shows is how many psychological complexities can be drawn out of a very simple game like this.
.
When taking a multiple-choice test, is your first guess usually right or is it better to think again?
The standard advice for multiple-choice tests is: if in doubt, stick with your first answer.
College students believe it: about 75 percent agree that changing your first choice will lower your score overall (Kruger et al., 2005).
Instructors believe it as well: in one study 55 percent believed it would lower students’ scores while only 16 percent believed it would improve them.
And yet this is wrong.
One survey of 33 different studies conducted over 70 years found that, on average, people who change their answers do better than those who don’t (Benjamin et al., 1984).
In none of these studies did people get a lower score because they changed their minds.
Study after study shows that when you change your answer in a multiple-choice test, you are more likely to be changing it from wrong to right than right to wrong.
So actually sticking with your first answer is, on average, the wrong strategy.
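The arithmetic behind this finding is straightforward: if revisions flip wrong answers to right ones more often than the reverse, changing gains marks on average. A minimal sketch, where the 2-to-1 ratio is an illustrative assumption rather than a figure from any of the studies:

```python
def expected_net_gain(n_changes, p_wrong_to_right):
    """Expected score change when each revision flips one answer."""
    p_right_to_wrong = 1 - p_wrong_to_right
    return n_changes * (p_wrong_to_right - p_right_to_wrong)

# With an assumed 2:1 ratio in favour of wrong-to-right changes,
# revising 30 answers gains about 10 marks on average:
print(round(expected_net_gain(30, 2 / 3), 2))  # -> 10.0
# Changing only pays off when wrong-to-right changes are the majority;
# at exactly 50:50 the strategy is neutral:
print(expected_net_gain(30, 0.5))  # -> 0.0
```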
Why do so many people (including many who should know better, like the authors of test-preparation guides) still say that you should stick with your first answer?
Kruger et al. (2005) argue that it’s partly because it feels more painful to get an answer wrong because you changed it than wrong because you didn’t change it.
So we tend to remember much more clearly the times when we changed from right to wrong.
And so when taking a test we anticipate the regret we will feel and convince ourselves that our first instinct is probably right (when it’s probably not).
.
Loss aversion is a psychological bias in which people prefer avoiding losses to acquiring equivalent gains.
The loss aversion bias in psychology is a finding from Nobel Prize-winning research that reveals the strange ways people make decisions in risky situations.
Here is a simple example of loss aversion: first, I give you $10 free and then ask would you bet that $10 on the flip of a coin if you stood to win $20?
So you’ve got a 50 percent chance of losing $10 and a 50 percent chance of winning $20.
This seems like a good bet to take and yet studies on loss aversion show that people tend not to take it.
The reason is loss aversion: people hate to lose more than they love to win.
Before Kahneman and Tversky (1979) published their ground-breaking research on loss aversion in psychology, risky decisions were usually analysed by thinking about the total wealth involved.
When you look at this bet in the context of the total wealth it makes sense to gamble.
It’s obvious you’ve got more to gain than you have to lose.
So, why do people tend not to?
What Kahneman and Tversky suggested was that, in fact people think about small gambles like this in terms of losses, gains and neutral outcomes.
It is actually the changes in wealth on which people base their decision-making calculations and it’s here that the loss aversion bias kicks in.
But that doesn’t completely explain why people don’t take the bet.
There’s a further piece to the puzzle.
It turns out that at low levels of risk, such as this coin flip situation, people are more averse to the loss of $10 than they are attracted by the chance of winning the $20.
Studies of loss aversion have shown that people actually need the chance of winning $30 before they’ll consider risking their own $10.
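A rough way to see this is to weight losses more heavily than gains when computing the bet's subjective value. The loss-aversion coefficient of 2.25 below is Kahneman and Tversky's commonly cited estimate, used here purely as an illustrative assumption:

```python
# A sketch of the coin-flip bet through loss-averse eyes.
# LOSS_AVERSION = 2.25 is an assumed coefficient, not a law.

LOSS_AVERSION = 2.25

def subjective_value(gain, loss, p_win=0.5):
    """Expected subjective value when losses loom larger than gains."""
    return p_win * gain - (1 - p_win) * LOSS_AVERSION * loss

# Objectively the $10-vs-$20 bet is worth +$5 on average...
expected_value = 0.5 * 20 - 0.5 * 10
print(expected_value)            # -> 5.0
# ...but to a loss-averse player it feels like a losing proposition:
print(subjective_value(20, 10))  # -> -1.25
# A $30 upside comfortably clears the loss-averse break-even point:
print(subjective_value(30, 10))  # -> 3.75
```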
Just as people show illogical loss aversion in some circumstances, they also show risk-seeking behaviour in other circumstances.
Imagine you have to choose between these two options.
The first is that you have an 85 percent chance of losing $1,000 along with a 15 percent chance of losing nothing.
The second is a 100 percent chance of losing $800.
Not much of a choice, right!?
You’re between a rock and hard place.
Still, sometimes we have to cut our losses.
According to the maths you should choose the sure loss of $800, but most people don’t.
Most people choose to gamble, simply because of loss aversion.
So when the potential for loss is there, suddenly people prefer to take a risk.
They’ve become risk seekers, motivated by loss aversion.
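The arithmetic behind the choice is easy to check directly; this is just a restatement of the numbers above, not a model of anyone's behaviour:

```python
# Expected value of each option in the losses example.
gamble = 0.85 * -1000 + 0.15 * 0   # 85% chance of losing $1,000
sure_loss = -800                    # 100% chance of losing $800

print(gamble)     # -> -850.0
print(sure_loss)  # -> -800
# On average the gamble loses $50 more, yet most people choose it.
```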
Yet, when there’s the potential for gains, people display loss aversion.
This way of thinking about how people behave in risky situations in psychology, which Kahneman and Tversky called Prospect Theory, has a second major insight that follows on from the risk aversion and risk seeking described above.
What they realised was that people behaved in different ways depending on how the risky situation was presented.
Remember that if a risk is presented in terms of losses, people will be more risk seeking, and if it’s expressed in terms of gains, people will be more risk averse.
Their classic example involves this fictional situation:
“Imagine your country is preparing for the outbreak of a disease expected to kill 600 people.
If program A is adopted, exactly 200 people will be saved.
If program B is adopted there is a 1/3 probability that 600 people will be saved and a 2/3 probability that no people will be saved.”
Here, the risk is presented in terms of gains, so most people (72 percent) choose option A, the certain outcome.
Here’s the same problem but this time presented in terms of losses:
“Imagine your country is preparing for the outbreak of a disease expected to kill 600 people.
If program A is adopted, exactly 400 people will die.
If program B is adopted there is a 1/3 probability that no one will die and a 2/3 probability that 600 people will die.”
Now most people (78 percent) choose B because the problem is presented in terms of losses.
People suddenly prefer to take a risk.
In fact, if you look at both the situations you’ll see that, mathematically, they’re identical and yet people’s decision is heavily influenced by the way the problem is framed to target the loss aversion bias.
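That mathematical identity is easy to verify with a quick check of the expected outcomes under both framings (this simply restates the numbers given in the problem):

```python
# Verifying that the two disease-problem framings are identical.
# 600 lives are at stake in both versions.
TOTAL = 600

# Gain framing: expected lives saved.
saved_A = 200
saved_B = (1 / 3) * 600 + (2 / 3) * 0

# Loss framing: expected deaths.
deaths_A = 400
deaths_B = (1 / 3) * 0 + (2 / 3) * 600

# Every option leaves the same expected number of survivors:
assert saved_A == TOTAL - deaths_A == 200
assert abs(saved_B - (TOTAL - deaths_B)) < 1e-9
print(saved_A, saved_B)  # expected survivors under programs A and B
```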
This effect has been termed preference reversal.
After considering these sorts of problems for a few minutes, it’s easy to wonder what all of this abstract reasoning about loss aversion has to do with the real world.
Quite a lot, argue Kahneman and Tversky.
The Nobel Prize committee agreed, awarding the prize to Kahneman for his work on prospect theory, of which loss aversion forms a part.
Everyday life involves endless ‘gambles’ and betting examples are just one of the easiest ways to understand how humans make decisions in risky situations.
Certainly Kahneman and Tversky’s work on loss aversion has plenty to say about some of the apparently strange decisions people make in everyday life.
So, next time you’re agonising over a decision in terms of losses, try this simple trick.
Re-imagine the whole decision in terms of gains.
I can’t promise it will help you make your decision, but at least you’ll better understand Kahneman and Tversky’s insightful research on loss aversion.
Humans are not as rational as we would like to think.
.
The illusion of control is people's tendency to overestimate how much control they have over events and choices in their lives.
The ‘illusion of control’ is the finding in psychology that people tend to overestimate their perceived control over events in their lives.
The illusion of control is a bias in a positive direction, just like the above-average effect or the optimism bias, that helps us feel better about life, even if it is at the cost of truth.
The illusion of control is well documented and has been tested over and over in many different studies across four decades.
Here’s an example of the illusion of control: you choose an apple which tastes delicious.
You assume you are very skilled at choosing apples (when in fact the whole batch happens to be good today).
Another example of the illusion of control: you enter the lottery and win millions.
You assume that this is (partly) a result of how good your lucky numbers are.
In fact, lotteries are totally random so you can’t influence them with the numbers you choose.
Although most of us know and accept this, we still harbour an inkling that maybe it does matter which numbers we choose.
Sometimes the illusion of control manifests as magical thinking.
In one study participants watched another person try to shoot a miniature basketball through a hoop (Pronin et al., 2006).
When participants willed the player to make the shot, and they did, they felt it was partly down to them, even though they couldn’t possibly be having any effect.
It's like pedestrians in New York who still press the button to get the lights to change, despite the fact that the buttons do nothing.
Since the late 80s all the traffic signals have been controlled by computer, but the city won’t pay to have the buttons removed.
It’s probably just as well: they help boost people’s illusion of control.
We feel better when we can do something that feels like it might have an effect (even if it doesn’t).
It’s sometimes argued that the illusion of control is beneficial because it can encourage people to take responsibility.
It’s like when a person is diagnosed with an illness; they want to take control through starting medication or changing their diet or other aspect of their lifestyle.
Similarly, studies find that hospital patients who are able to administer their own painkillers typically give themselves lower doses than those who have them prescribed by doctors, but they experience no more pain.
Feeling in control can also urge us on to do things when the chances of success are low.
Would you apply for that job if you knew how little control you had over the decision?
No.
But if you never apply for any jobs, you can’t get them.
So we pump ourselves up, polish our résumé and practice our interview technique.
But the illusion of control isn’t all roses.
To return to the discussion of lotteries, we can see the illusion of control operating in the financial markets.
Traders often feel they have more control over the market than they really do.
Indeed one study has shown that the more traders think they are in control, the worse their actual performance (O’Creevy & Nicholson, 2010).
A word of caution there for those who don’t respect the forces of randomness.
More generally, some argue that the illusion of control stops us learning from our mistakes and makes us insensitive to feedback.
When you feel you’re in charge, you are more likely to ignore the warning signals from the environment that things are not under your control.
Indeed an experiment has shown that the more power you feel, the stronger the illusion of control becomes (Fast et al., 2009).
So far, so orthodox.
What’s fascinating is the idea that the illusion of control itself may be an illusion, or at least only part of the story.
What if the illusion of having control depends heavily on how much control we actually have?
After all, we’re not always totally out-of-the-loop like the experiments above suggest.
Sometimes we have a lot of control over the outcomes in our life.
This has recently been tested in a series of experiments by Gino et al. (2011).
What they found was that the illusion of control flips around when control over a situation is really high.
When participants in their studies actually had plenty of control, suddenly they were more likely to underestimate it.
This is a pretty serious challenge to the illusion of control.
If backed up by other studies, it reverses the idea that the illusion of control is usually beneficial.
Now we’re in a world where sometimes the illusion is keeping us back.
For example, applying for more jobs increases the chance of getting one, exercise does make you more healthy, buying a new car does make you poorer.
All these are areas in which we have high levels of control but which we may well be assuming we don’t.
This effect will have to be renamed the illusion of futility.
In other words: when you have high control, you underestimate how much what you do really matters.
.
Anchoring bias in psychology is an example of a cognitive bias that causes people to rely too much on the first piece of information they get.
The anchoring bias or anchoring effect or anchoring heuristic is a cognitive psychology finding that people over-emphasise the first piece of information they receive.
A simple example of the anchoring bias is the first price quoted for a car: this number will tend to overshadow subsequent negotiations.
The anchoring bias means that people rely too heavily on this first piece of information, even when more is known later on.
To illustrate the anchoring bias or effect, let’s say I ask you how old Mahatma Gandhi was when he died.
For half of you I’ll preface the question by saying: “Did he die before or after the age of 9?”
For the other half I’ll say: “Did he die before or after the age of 140?”
Obviously these are not very helpful statements.
Anyone who has any clue who Gandhi was will know that he was definitely older than 9, while the oldest person on record lived to 122.
So why bother making these apparently stupid statements?
Because, according to the results of a study by Strack and Mussweiler (1997), these initial statements, despite being unhelpful, affect the estimates people make.
This is the anchoring bias, effect or heuristic.
In their experiment, the first group guessed an average age of 50 and the second, 67.
Neither was that close; he was actually assassinated at 78, but you can still see the effect of the anchoring bias from the initial number.
These might seem like silly little experiments that psychologists do to try and suggest that people are idiots, but actually it’s showing us something fundamental about the way we think.
It’s so basic to how we experience the world that we often don’t notice it.
We have a tendency to use anchors or reference points to make decisions and evaluations, and sometimes these lead us astray.
This sort of thing goes on in many different areas of our lives.
Take the emotions for starters.
Psychologists have found it can be difficult to predict our future emotions and one reason is that the anchoring bias affects how we feel right now.
That’s why people who have just had lunch feel like they’ll never be hungry again; compared with those who haven’t, who don’t display the same short-sightedness (I have described the relevant study in the context of the projection bias).
Real estate agents, car sellers or negotiators will be nodding their heads.
That’s because the anchoring bias is vital in all these lines of work, and many more.
The initial price you set for the car, house or, more abstractly, for a deal of some kind, tends to have ramifications right through the process of coming to an agreement.
Whether we like it or not, our minds keep referring back to that initial number.
That doesn’t mean you just set the highest possible price you can get away with (although in reality that’s often what is done).
In real life things are more complicated than the Gandhi experiment.
People usually have a choice about which house or car to buy or which deal to take and they can always walk away.
Still, there’s a good reason sticker prices on car forecourts are mostly so high.
You can see the effect of the anchoring bias in salary negotiations.
There’s some evidence that when the initial anchor figure is set high, the final negotiated amount will usually be higher (Thorsteinson, 2011).
Incidentally, the anchoring bias is another reason that you should open negotiations rather than waiting for the employer to tell you the range: because then you can set the anchor higher (more on this in: Ten Powerful Steps to Negotiating a Higher Salary).
Since the anchoring bias occurs in so many situations, no one theory has satisfactorily explained it.
There is, though, a modern favourite for explaining the anchoring bias in decision-making.
It is thought to stem from our tendency to look for confirmation of things we are unsure of.
So, if I’m told the price of a particular diamond ring is £5,000, I’ll tend to search around looking for evidence that confirms this.
In this case it’s easy: plenty of diamond rings cost about that, no matter the value of this particular ring.
For all I know about diamond rings it could be worth £500 or £50,000.
The problem is that this explanation is less satisfying when the anchor is so manifestly unhelpful, like when you tell people that Gandhi was older than nine when he died.
Perhaps, then, it’s all down to our fundamental laziness.
When given the Gandhi example we can’t be bothered to make the massive adjustment from the anchor we’re given up to the real value, so we go some way and then stop.
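This 'anchoring-and-adjustment' account can be pictured as estimates that start at the anchor and travel only part of the way to the true value. A toy sketch; the 0.5 adjustment factor is purely illustrative, not an empirically fitted parameter:

```python
def adjusted_estimate(anchor, true_value, adjustment=0.5):
    """Move only part-way from the anchor towards the true value."""
    return anchor + adjustment * (true_value - anchor)

# Gandhi was assassinated at 78; different anchors drag the
# partially-adjusted estimates in different directions:
print(adjusted_estimate(9, 78))    # -> 43.5
print(adjusted_estimate(140, 78))  # -> 109.0
```

The low anchor drags guesses down and the high anchor pushes them up, which is the qualitative pattern the experiments found.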
Whatever the reason for it, the anchoring bias is everywhere and can be difficult to avoid.
That’s especially true when we are deciding what to pay for stuff since we are overly influenced by the price that’s been set.
One way of avoiding the anchoring bias — whether it’s emotional or in decision-making — is by trying to pull away from the anchor state.
This can be done by thinking about other comparisons.
That’s what we’re doing when we comparison shop: getting some new price anchors.
In the realm of the emotions it might mean trying to compare with other emotional states, not just how you feel right now (creating a ‘memory palace’ for reference emotions may help with this).
When negotiating, avoiding the anchoring bias might mean thinking about what the other options are (negotiation theorists call this the ‘BATNA’: the Best Alternative To a Negotiated Agreement).
Alternatively, for nullifying the anchoring bias in decision-making, find out more about the area: experts are less susceptible to it.
There’s little doubt it’s hard, though: some studies suggest that even when we know about it and are forewarned, the anchoring bias can still affect our judgements.
It just shows the power that first piece of information can have on how we make decisions.
Aphantasia is when people ‘see’ nothing at all when they try to imagine pictures in their minds.
Imagine a tree sitting atop a hill and on that tree, a small yellow bird.
What do you see in your mind’s eye?
About 95 percent of people can visualise something, varying in detail from vivid to vague, depending on their natural abilities.
However, up to 5 percent of people — as many as 1-in-20 — ‘see’ nothing at all.
They have ‘aphantasia’: a lack of all mental imagery.
While they might be able to imagine the sound of the wind or the feeling of grass between their toes, there is no accompanying image.
What it means to be aphantasic is examined in research that surveyed 267 people with the condition.
Mr Alexei Dawes, the study’s first author, said:
“Aphantasia challenges some of our most basic assumptions about the human mind.
Most of us assume visual imagery is something everyone has, something fundamental to the way we see and move through the world.
But what does having a ‘blind mind’ mean for the mental journeys we take every day when we imagine, remember, feel and dream?”
The researchers compared the experience of aphantasics with 400 people who have mental imagery.
Mr Dawes said:
“We found that aphantasia isn’t just associated with absent visual imagery, but also with a widespread pattern of changes to other important cognitive processes.
People with aphantasia reported a reduced ability to remember the past, imagine the future, and even dream.”
People were asked to recall memories and indicate how vivid their mental imagery was for that moment.
People with aphantasia tended to agree with the statement: “No image at all, I only ‘know’ that I am recalling the memory.”
Those with strong mental imagery agreed with the statement: “Perfectly clear and as vivid as normal vision.”
Mr Dawes explained the results:
“Our data revealed an extended cognitive ‘fingerprint’ of aphantasia characterised by changes to imagery, memory, and dreaming.
We’re only just starting to learn how radically different the internal worlds of those without imagery are.”
Among the aphantasics, one-quarter also had difficulties imagining touch, sound, motion, smell, taste and emotion.
Aphantasics also dream less, which makes sense, considering how important visual imagery is to dreaming.
Professor Joel Pearson, study co-author, said:
“Aphantasics reported dreaming less often, and the dreams they do report seem to be less vivid and lower in sensory detail.
This suggests that any cognitive function involving a sensory visual component—be it voluntary or involuntary—is likely to be reduced in aphantasia.”
Aphantasics find it harder to recall memories, and the memories they do recall are, overall, less vivid.
Mr Dawes said:
“Our work is the first to show that aphantasic individuals also show a reduced ability to remember the past and prospect into the future.
This suggests that visual imagery might play a key role in memory processes.”
Aphantasia can be an isolating experience for some, researchers have found (Zeman et al., 2015).
Tom Ebeyer, 25, from Ontario, Canada, who has aphantasia, didn’t discover he lacked a common mental ability until the age of 21:
“It had a serious emotional impact.
I began to feel isolated — unable to do something so central to the average human experience.
The ability to recall memories and experiences, the smell of flowers or the sound of a loved one’s voice; before I discovered that recalling these things was humanly possible, I wasn’t even aware of what I was missing out on.
The realisation did help me to understand why I am slow at reading text, and why I perform poorly on memorisation tests, despite my best efforts.”
All of Mr Ebeyer’s senses are affected.
He can’t summon up any smell, emotion, sound, texture or taste.
Mr Ebeyer said:
“After the passing of my mother, I was extremely distraught in that I could not reminisce on the memories we had together.
I can remember factually the things we did together, but never an image.
After seven years, I hardly remember her.
To have the condition researched and defined brings me great pleasure.
Not only do I now have an official title to refer to the condition while discussing it with my peers, but the knowledge that professionals are recognising its reality gives me hope that further understanding is still to come.”
Professor Adam Zeman, the study’s first author, said:
“This intriguing variation in human experience has received little attention.
Our participants mostly have some first-hand knowledge of imagery through their dreams: our study revealed an interesting dissociation between voluntary imagery, which is absent or much reduced in these individuals, and involuntary imagery, for example in dreams, which is usually preserved.”
The study was published in the journal Scientific Reports (Dawes et al., 2020).
The psychology of money: including post-purchase rationalisation, the relativity trap, rosy retrospection and the restraint bias.
We all make mistakes with money, some more than others.
And in this economy, who can afford them?
But many of these mistakes are avoidable if we can understand how we think about money.
Here are 10 biases that psychological research has shown affect our judgement…and how to avoid them.
One of the biggest reasons people lose out financially is that they stick with what they know, despite much better options being available.
We tend to choose the same things we chose before.
And we continue to do this even when better options are available, whether it’s goods or services.
Research on investment decisions shows this bias (e.g. Samuelson & Zeckhauser, 1988).
People stick to the same old pension plans, stocks and shares, even though there are better options available.
It’s hard to change because it involves more effort and we want to avoid regretting our decision.
But there is better value out there if you’re prepared to look.
After we buy something that’s not right, we convince ourselves it is right.
Most people refuse to accept they’ve made a mistake, especially with a big purchase.
Marketers know this, so they try to encourage part-ownership first, using things like money-back guarantees.
Once you’ve made a decision, you convince yourself it was the right one (see: cognitive dissonance), and also start to value it more because you own it (e.g. Cohen et al., 1970).
Fight it! If the goods or services aren’t right, return them.
Most countries’ legal systems incorporate a cooling-off period, so don’t rationalise, return it!
We think about prices relatively and businesses know this.
That’s why recommended retail prices are set high, then discounted.
Some expensive options on restaurant menus are there only to make the regular meals look reasonable in comparison.
The relativity trap is also called the anchoring effect.
One price acts like an anchor on our thinking.
It’s easy to fall for, but also easy to surmount by making the comparisons the seller doesn’t want you to make (read more about the relativity trap).
Use price comparison websites.
And try comparing across categories of goods.
Is an iPad really worth a month’s groceries or three years of cinema trips or a new set of clothes?
We value things more when we own them.
So when it comes to selling our stuff, we tend to set the price too high.
It’s why you sometimes see second-hand goods advertised at ridiculous prices.
Unlike professionals, amateur sellers develop an emotional attachment to their possessions (read the research on 6 quirks of ownership).
It also works the other way. When bidding on eBay, it’s possible to feel you already partly own something before you actually buy it.
So you end up paying above the market value.
When buying or selling you have to try and be dispassionate.
Be aware that unless you set limits, your unconscious may take over.
In general humans prefer to get the pleasure right now, and leave the pain for later.
Economists call this hyperbolic discounting.
In a study by Read and van Leeuwen (1998), when making food choices for next week, 74 percent of participants chose fruit.
But when deciding for today, 70 percent chose chocolate.
That’s humans for you: chocolate today, fruit next week.
The same is true of money. Marketers know we are suckers for getting discounts right now, so they hide the pain for later on (think mobile phone deals). Unfortunately buy now, pay later offers are often very bad deals.
One way to get around this is to think about your future self when making a purchasing decision.
Imagine how ‘future you’ will see the decisions of ‘present you’.
If ‘future you’ wouldn’t like it, don’t do it.
People tend to sell things when they go up in price, but hold on to them when they go down.
It’s one demonstration of our natural desire to avoid losses.
This effect has been seen in a number of studies of stock-market trading (e.g. Weber & Camerer, 1998).
The fact that prices are falling, though, is a big clue.
If you can fight the fear of losing, in the end it could leave you better off.
Advertising works partly because we like what we know, even if we only vaguely know it.
We even choose familiar things when there are clear signals that they’re not the best option (Richter & Spath, 2006).
Always check if you’re buying something for the right reasons.
Mere familiarity means the advertisers are winning.
Smaller companies that can’t afford, or won’t pay for, pricey TV commercials often provide better products and services.
We tend to remember our decisions as better than they really were.
This is a problem when we come to make similar decisions again.
We have a bias towards thinking our previous decision was a good one, whether it was the holiday, house or car we chose (e.g. Mitchell & Thompson, 1994).
That’s partly why we end up making the same financial mistakes again: we forget we made the same mistake before.
Before making an important financial decision, try to dredge up the real outcomes of previous decisions.
Only without the rose-tinted spectacles can we avoid repeating our mistakes.
The word ‘free’ has a magical hold on us and marketers know it.
Behavioural economics research shows we sometimes take a worse deal overall just to get something for free.
Watch out if you are offered something for ‘free’ as sometimes the deal is not that good.
Many mistakes with money result from a lack of self-control.
We think we’ll control ourselves, but, when faced with temptation, we can’t.
Studies like Nordgren et al. (2009) show people are woefully optimistic in predicting their self-control.
So, don’t put yourself in the situation of being tempted.
This is why cutting up credit cards is often recommended.
We’re mostly weaker than we think, so we shouldn’t give ourselves the opportunity.