Hypnosis: 8 Myths You Should Know

Hypnosis is a real phenomenon but there are many myths and misunderstandings about it. Here are 8 common ones.

Hypnosis is a mental state in which people display focused attention, vivid fantasies, increased susceptibility to suggestion and reduced peripheral awareness.

In other words, hypnosis puts people in a trance.

Hypnosis certainly can be an effective therapy, particularly for pain and anxiety.

Hypnosis can achieve all sorts of fascinating effects; among other things, people can:

  • have visual or auditory hallucinations,
  • move their bodies without intending to,
  • and feel less pain.

But much of what many people believe about hypnosis is total and utter rubbish.

Here are 8 very common myths:

Myth 1: Only the mentally weak can be hypnotised

It is a myth that only the mentally weak are susceptible to hypnosis.

In fact, the exact reverse is probably more true.

The higher your intelligence and the stronger your self-control, the more easily you are hypnotised.

That’s because entering a hypnotic trance is all about concentration, so people with mental health problems can find it difficult to reach that state.

However, finding it hard to enter a hypnotic state doesn’t mean there’s anything wrong with you.

People naturally vary in how susceptible they are to hypnosis.

Studies have shown that around 30 percent of people are relatively resistant to hypnosis.

With effort, though, a hypnotic state can usually be achieved eventually.

Myth 2: Under hypnosis, people are helpless

It is a myth that people are helpless under hypnosis.

It’s difficult to get people to do things under hypnosis that they wouldn’t normally do.

While hypnotised, people are still in touch with their morals and normal standards of behaviour.

That said, though, it is possible to reduce people’s inhibitions under hypnosis and they will more readily accept suggestions.

Stage hypnotists rely on this heightened suggestibility, along with picking the types who, let’s say, don’t mind a little attention.

That’s how they get people to quack like ducks and the rest.

Don’t we all know someone who would quack like a duck if it meant everyone would look at them?

Myth 3: Hypnosis is sleep

Yes, people look like they’re asleep when they’re hypnotised because their eyes are sometimes closed and they look peaceful.

But it is a myth that under hypnosis they are asleep.

The brain waves of a person who is hypnotised are nothing like those of a person who is asleep.

In fact, the hypnotic trance is a heightened state of concentration.

A high level of alpha waves on an EEG shows that a hypnotised person is awake, alert and very responsive.

Myth 4: Hypnotherapy works in one session

It is a myth that hypnosis alone can cure any ailment in one session.

Nevertheless, some of the most outrageous curative claims are made about hypnotism (although usually not by hypnotherapists themselves).

These have their origins in stage hypnotism as well as hucksters of all types.

Of course, people regularly repeat claims that they were cured in only one session of hypnotherapy because it’s such a good story.

Who wants to hear about how it took you a decade, three divorces and 19,423 nicotine patches to give up smoking?

The truth is that almost no one is cured in one session, if they are cured at all through hypnosis.

Hypnotherapists usually insist that patients commit to 6 sessions, or sometimes 20 sessions.

This isn’t naked profiteering; change takes time.

Even then, hypnotherapy is often used as an added extra to some other kind of treatment, rather than as the main method.

Myth 5: Hypnotists must be flamboyant or weird

It is a myth that practitioners of hypnosis need to look unusual or flamboyant.

That’s just people in showbusiness.

In reality, it would be distracting if the person trying to hypnotise you had swirling eyes, kept talking about black magic and wore very loud ties.

Your average hypnotist is much more likely to wear a grey suit.

Myth 6: Under hypnosis, long-forgotten memories can be retrieved

It is a myth that, under hypnosis, long-forgotten memories can be retrieved.

But, if you believe this one, then you’re in very good company.

Many members of the public think this is true, as do some psychologists and many hypnotherapists themselves.

Except that nowadays most people in the know think that the hypnotic trance isn’t much good for accurately retrieving memories.

Worse, hypnotists can easily implant false memories, because people in a hypnotic trance are easily suggestible.

That scene in the movie where hypnosis helps the victim see the killer’s face is pure Hollywood: entertaining but total fiction.

Myth 7: You can’t lie under hypnosis

It is a myth that you cannot lie under hypnosis — in fact, you can!

Hypnosis is not some kind of magical state in which you can only speak the truth.

This is a natural result of the fact that you are not helpless when hypnotised and your usual moral (and immoral) faculties are still active.

Not only can you lie under hypnosis, but lying is not necessarily any more detectable when hypnotised than when not (Sheehan & Statham, 1988).

Myth 8: You’ve never experienced hypnosis

Many people think they’ve never been hypnotised since they’ve never been to a hypnotherapist or been involved in stage hypnosis.

In reality, most of us have experienced a state of mild hypnosis, at least.

For example, when you drive a long distance and start to feel dissociated from your body and the car, that’s a mild state of hypnosis.

Your unconscious is taking care of all the mechanical aspects of driving while your conscious mind is free to float around.

Or if you’ve ever meditated, then you’ve hypnotised yourself.

Meditation is really a specific type of hypnosis.

Miller’s Magical Number Seven, Plus or Minus Two In Psychology

The magic number 7 plus or minus 2 in psychology refers to the fact that we can fit about seven pieces of information into our short-term memories.

It is sometimes said human beings are nothing more than a collection of memories.

Memories for people, events, places, sounds and sights.

Our whole world is funnelled in through our memories. In fact, they may be our most prized possessions.

The study of memory has always been central to psychology – this article describes one of its most influential findings.

The title of this article comes from a 1956 study by the psychologist George A. Miller in which he describes the capacity of human memory (Miller, 1956).

The article’s opening has become famous amongst historians of psychology:

“My problem is that I have been persecuted by an integer.

For seven years this number has followed me around, has intruded in my most private data, and has assaulted me from the pages of our most public journals.

This number assumes a variety of disguises, being sometimes a little larger and sometimes a little smaller than usual, but never changing so much as to be unrecognizable.”

The magical number seven (plus or minus 2)

It’s not just Miller who was persecuted by this number, though; it’s all of us.

What this magical number represents – 7 plus or minus 2 – is the number of items we can hold in our short-term memory.

So, while most people can generally hold around seven numbers in mind for a short period, almost everyone finds it difficult to hold ten digits in mind.

Remember that memory is a slippery concept: for psychologists, short-term memory refers to the information our brains are using right now.

For example, as you’re reading this post, the words you’ve read go into short-term memory for a very short period; you extract some meaning (hopefully) and then that meaning is either stored or discarded.

You’ll probably still have some faint memory of this article tomorrow, but won’t be able to remember most of the actual words.

Disputing the magical number seven

All sorts of experiments and theories have since followed, disputing the magical number seven approach to memory.

More recent studies have, for example, shown how we put items together in order to ‘chunk’ data.
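
As a rough illustration of chunking (the digit string and the group size below are invented for this sketch, not taken from Miller’s paper or any later study), twelve digits that are hard to hold as separate items become just three items once they are grouped into familiar chunks such as years:

```python
# Illustrative sketch only: grouping digits shrinks the number of items
# short-term memory has to hold, even though the raw information is the same.

def chunk(digits, size=4):
    """Split a string of digits into fixed-size groups ('chunks')."""
    return [digits[i:i + size] for i in range(0, len(digits), size)]

number = "198420011969"      # 12 separate digits -- well over the 7 +/- 2 limit
print(chunk(number))         # ['1984', '2001', '1969'] -- only 3 chunks to remember
```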

Even so, the basic concept that our immediate short-term memory is relatively limited remains valid.

If you think seven isn’t much then be thankful you’re not a six-month-old infant.

Recent research suggests they can only hold one thing in short-term memory (Kaldy & Leslie, 2005).

Poor little chaps — it explains a lot though.

Illusion of Truth Effect: Repetition Makes Lies Sound True

The illusion of truth effect in psychology is the tendency to believe false information if it is repeated often enough.

The illusion of truth effect is very simple: the more often something is repeated to people, the more likely they are to believe it.

With repetition, the human mind finds a statement easier to process than competing ideas that have not been repeated over and over again.

Repetition is used everywhere to persuade people, in advertising, politics and the media, and it certainly works.

Examples of the illusion of truth effect

We see ads for the same products over and over again.

Politicians repeat the same messages endlessly (even when they have nothing to do with the question they’ve been asked).

Journalists repeat the same opinions day after day.

Can all this repetition really be persuasive?

It seems too simplistic that just repeating a persuasive message should increase its effect, but that’s exactly what psychological research finds (again and again).

Repetition is one of the easiest and most widespread methods of persuasion because of the illusion of truth effect.

In fact it’s so obvious that we sometimes forget how powerful it is.

People rate statements that have been repeated just once as more valid or true than things they’ve heard for the first time.

They even rate statements as truer when the person saying them has been repeatedly lying (Begg et al., 1992).

That is how powerful the illusion of truth effect is.

And when we think something is more true, we also tend to be more persuaded by it.

Several studies on the illusion of truth have shown that people are more swayed when they hear statements of opinion and persuasive messages more than once.

How the illusion of truth effect works

The illusion of truth effect works at least partly because familiarity breeds liking.

As we are exposed to a message again and again, it becomes more familiar.

Because of the way our minds work, what is familiar also feels true, hence the illusion of truth.

Familiar things require less effort to process, and that feeling of ease unconsciously signals truth; this is called cognitive fluency.

As every politician knows, there’s not much difference between actual truth and the illusion of truth.

Since illusions are often easier to produce, why bother with the truth?

Reversing the illusion of truth

The exact opposite of the illusion of truth is also true.

If something is hard to think about, then people tend to believe it less.

Naturally this is very bad news for people trying to persuade others of complicated ideas in what is a very complicated world.

Some studies have even tested how many times a message should be repeated for the maximum effect of the illusion of truth.

These suggest that people have the maximum confidence in an idea after it has been repeated between 3 and 5 times (Brinol et al., 2008).

After that, repetition ceases to have the same effect and may even reverse.

Because TV adverts are repeated many more times than this, advertisers now use subtle variations in the ads to recapture our attention and avoid the illusion of truth backfiring.

This is an attempt to avoid the fact that while familiarity can breed liking, over-familiarity tends to breed contempt.

When the illusion of truth fails

Repetition is effective almost across the board when people are paying little attention, but when they are concentrating and the argument is weak, the effect disappears (Moons et al., 2008).

In other words, it’s no good repeating a weak argument to people who are listening carefully — then the illusion of truth does not operate.

But if people aren’t motivated to scrutinise your arguments carefully then repeat away with abandon—the audience will find the argument more familiar and, therefore, more persuasive.

This suggests we should remain critical while watching TV adverts or the illusion of truth effect will creep in under our defences.

You might think it’s better to let the ads wash over you, without thinking too much, but just the reverse is true.

Really we should be highly critical of the illusion of truth otherwise, before we know it, we’re singing the jingle, quoting the tag-line and buying the product.

When the argument is strong, though, it doesn’t matter whether or not the audience is concentrating hard, repetition will increase persuasion and the illusion of truth effect works.

Unfortunately, I find it’s often people with the best arguments who don’t take advantage of the illusion of truth.

Persuading groups

When people are debating an issue together in a meeting, you can see a parallel effect.

When one person in a group repeats their opinion a few times, the other people think that person’s opinion is more representative of the whole group (see my previous article: loudest voice = majority opinion).

The same psychology is at work again: to the human mind there is little difference between appearances and truth.

What appears to be true might as well actually be true, because we tend to process the illusion as though it were the truth.

It’s a depressing enough finding about the human ability to process rational arguments, but recent research has shown an even more worrying effect.

We can effectively persuade ourselves through repetition — which takes the illusion of truth to new heights.

A study has shown that when an idea is retrieved from memory, this has just as powerful a persuasive effect on us as if it had been repeated twice (Ozubko et al., 2010).

The aspiring sceptic, therefore, should be especially alert to thoughts that come quickly and easily to mind—we can easily persuade ourselves with a single recall of a half-remembered thought.

Cocktail Party Effect In Psychology: Definition & Example

The definition of the cocktail party effect in psychology is our ability to tune into one voice among the many conversations going on in a noisy room.

For psychologists the ‘cocktail party effect’ or phenomenon is our impressive and under-appreciated ability to tune our attention to just one voice from a multitude.

For example, at a party, when bored with our current conversational partner — and for the compulsive eavesdropper — we can allow our aural attention to wander around the room.

Perhaps only the most recidivist eavesdroppers are aware how special the cocktail party effect is.

But even they might be surprised — and worried — by just how much we can miss in the voices we decide to tune out.

What is the cocktail party effect?

The cocktail party effect or phenomenon — our ability to separate one conversation from another — is beautifully demonstrated in a classic study carried out by Colin Cherry (Cherry, 1953).

Cherry used the simple method of playing back two different messages at the same time to people, under a variety of conditions.

In doing so he discovered just how good we are at filtering what we hear, which is how we overcome the cocktail party problem.

To accomplish this task, Cherry reports, participants had to close their eyes and concentrate hard.

In the first set of experiments on the cocktail party effect, he played back two different messages, voiced by the same person, through both ears of a pair of headphones and asked participants to ‘shadow’ one of the two messages by speaking it out loud, and later by writing it down.

When doing this they could, with effort, and while hearing the clips over and over again, separate one of the messages from the other.

With the two voices presented together, as though the same person were standing in front of you saying two completely different things at the same time, this task appears to be very hard, but still possible.

Pushing participants further, Cherry found he could confuse listeners, but only by having both messages consist entirely of nonsensical platitudes.

Only then were participants unable to pick apart one message from the other.

This is not a wholly satisfying demonstration of the cocktail party effect.

An example of how the cocktail party effect works

The real surprise, though, came in the second set of experiments on the cocktail party effect or phenomenon.

For these Cherry fed one message to the left ear and one to the right ear — and once again both messages were voiced by the same speaker.

Suddenly participants found the task incredibly easy.

Indeed many were surprised how easily and accurately they could tune in to either one of the messages, and even shift their attention back and forth between the two.

No longer did they have to close their eyes and furrow their brows – this was much easier.

What participants were experiencing here seems much closer to most people’s experience of the cocktail party phenomenon.

At a party people are arrayed all around us and their conversations come from various different directions.

We seem to be able to use this information, which is key to the cocktail party effect, to reject all but the one in which we are interested.

Ignoring rejected speech

Although we are fantastically good at tuning in to one conversation over all the others, we seem to absorb very little information from the conversations we reject.

This is the flipside of the cocktail party effect and where it can get embarrassing.

Cherry’s experiments on the cocktail party effect revealed that people picked up surprisingly little of the information presented to the other, ‘rejected’ ear, often failing to notice blatant changes to the unattended message.

When asked afterwards, participants:

  • could not identify a single phrase from the speech presented to the rejected ear.
  • weren’t sure the language in the rejected ear was even English.
  • failed to notice when it changed to German.
  • mostly didn’t notice when the speech to the rejected ear was being played backwards (though some did report that it sounded a bit strange).

Across all the different conditions in these cocktail party effect studies, there were only two aspects of the speech to the rejected ear the participants could reliably identify.

The first was that it was speech rather than a tone; the second was when the speaker suddenly changed from male to female.

Missed your own name

This research on the cocktail party effect doesn’t bode at all well for people with a habit of tuning out of conversations when they lose interest (you know who you are!).

If you really are listening to someone else it’s likely you won’t hear a word of what’s being said to you directly.

One study has found that two-thirds of people don’t even notice when their own name is slipped into the unattended speech, while those who do notice are likely to be of the extremely distractible variety (Conway et al., 2001).

That demonstrates the power of the cocktail party effect.

Why Gut-Decisions Beat Agonising Over Business Data

The unconscious, or gut instinct, can do just as well as a conscious, deliberate decision in the business environment.

Instinctive, gut decisions may be just as good as those based on data, a study suggests.

Managers who rely on gut instinct to make decisions about new projects are just as likely to be right as those relying on the data, the research found.

However, relying on gut instinct is much faster, as data analysis typically takes a long time.

Think or blink?

Malcolm Gladwell wrote a well-known book called ‘Blink’ about the power of the unconscious to make complex decisions in the blink of an eye.

However, since then studies have failed to back up the idea that the unconscious can outperform the conscious mind.

Still, there is evidence that the unconscious, or gut instinct, can do just as well as a conscious, deliberate decision.

And as this study underlines, sometimes gut decisions have other benefits, such as speed and requiring fewer resources.

The study comes at a time when 92 percent of companies are investing in data initiatives, which might prove unnecessary.

About the study

For the research, 122 managers in digital, advertising, publishing and software companies were asked about how they decided to allocate resources to new projects.

Among the ways they reported making decisions were:

  1. Majority: making the choice that most people wanted.
  2. Experience: going with the option that the most experienced individual preferred.
  3. Tallying: choosing the option with the most positive points.
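
To make the ‘tallying’ rule of thumb concrete, here is a minimal sketch (the projects, attributes and scores are hypothetical; the study itself did not publish any code): each option gets one point per attribute judged positive, and the option with the most points wins.

```python
# A minimal sketch of the "tallying" heuristic: one point per positive
# attribute, highest tally wins. The projects and attributes below are
# invented for illustration only.

def tally(options):
    """Return the option with the most positively-judged attributes."""
    return max(options, key=lambda name: sum(options[name].values()))

projects = {
    "Project A": {"fits_strategy": True,  "low_cost": False, "team_available": True},
    "Project B": {"fits_strategy": True,  "low_cost": True,  "team_available": True},
    "Project C": {"fits_strategy": False, "low_cost": True,  "team_available": False},
}

print(tally(projects))  # -> Project B, with three positive points
```

Nothing is weighted and nothing is modelled; the rule trades nuance for speed, which is part of why it is so much quicker than a full data analysis.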

The results showed that managers relied on the ‘tallying’ approach more often than the other methods.

More analysis did not provide much of a boost to accuracy in decision-making and took considerably longer, the results showed.

Using instinct and rules of thumb, like tallying positive points, was just as accurate as more extensive data analysis.

Dr Oguz A. Acar, study co-author, said:

“This research shows that data-driven decision-making is not the panacea in all situations and may not result in increased accuracy when facing uncertainty.

Under extreme uncertainty, managers, particularly those with more experience, should trust the expertise and instincts that have propelled them to such a position.

The nous developed over years as a leader can be more effective than an analytical tool which, in situations of extreme uncertainty, could act as a hindrance rather than a driver of success.”

The study was published in the journal Psychology & Marketing (West et al., 2021).

The Confirmation Bias: Definition And Examples

The definition of the confirmation bias in psychology is that people search for information that confirms their view of the world and ignore what doesn’t fit.

The confirmation bias is the fact that people search for information that confirms their view of the world and ignore what doesn’t fit.

In an uncertain world, people love to be right because it helps them make sense of things.

Indeed, some psychologists think it’s akin to a basic drive.

What is the confirmation bias?

One of the ways people strive to be correct is by looking for evidence that confirms they are right, sometimes with depressing or comic results:

  • A woman hires a worker who turns out to be incompetent. She doesn’t notice that everyone else is doing his work for him because she is so impressed that he shows up every day, right on time.
  • A sports fan who believes his team is the best only seems to remember the matches they won and none of the embarrassing defeats to inferior opponents.
  • A man who loves the country life, but has to move to the city for a new job, ignores the flight path he lives under and the noisy neighbours from hell, and tells you how much he enjoys the farmer’s market and tending his window box.

We do it automatically, usually without realising.

We do it partly because it’s easier to see where new pieces fit into the picture-puzzle we are working on, rather than imagining a new picture.

It also helps shore up our vision of ourselves as accurate, right-thinking, consistent people who know what’s what.

Psychologists call it the confirmation bias and it creeps into all sorts of areas of our lives.

Here are a few examples of the confirmation bias:

1. Confirmation bias in self-image

“Hey, you look great, have you done something different with your hair?”

Who doesn’t like a compliment? No one. It doesn’t even have to be sincerely delivered; I’ll take it. But what about…

“Hey, you’re a real slime-ball, you know that?”

Who likes insults? Well, we don’t exactly like them but—believe it or not—sometimes we seek them out if they confirm our view of ourselves.

In a study that examined this, people actually sought out information confirming their own view that they were—not exactly slime-balls—but lazy, or slow-witted or not very athletic (Swann et al., 1989).

And this isn’t some kind of self-hating thing; in this study even people with high self-esteem sought out information that confirmed their own negative self-views.

It seems we like to be right, even at a cost to our self-image.

2. Finance example

A study of online stock market investors has looked at how they gathered information about a prospective stock (Park et al., 2010).

The researchers found the confirmation bias writ large.

Investors mostly looked for information that confirmed their hunch about a particular stock.

Those people who displayed the strongest confirmation bias were the most over-confident investors and consequently made the least money.

It seems we like to be right, even if it costs us money.

3. Politics example

People see what they want to see in politics all the time.

The most ironic example is in satire.

Often satire uses sarcasm to make its point: TV satirist Stephen Colbert frequently says the opposite of what he really thinks to make his point (amongst comedians I believe these are called ‘jokes’).

Except the irony is that one study has shown that people who don’t agree with Colbert don’t get that he’s being sarcastic; they think he really means it (LaMarre, 2009).

The beauty is that both liberals and conservatives get what they want: their viewpoints confirmed.

It seems we like to be right, even if it means not getting the joke.

4. Healthcare examples

Despite what many nurses believe, the full moon is NOT linked to busier hospital emergency rooms or more births (Margot, 2015).

The belief that there might be a link is likely down to the confirmation bias.

Despite the belief being remarkably common in hospitals, the study, published in the journal Nursing Research, found no evidence for it.

Similarly, there’s NO evidence that the moon has any influence on:

  • automobile accidents,
  • hospital admissions,
  • surgery outcomes,
  • cancer survival rates,
  • menstruation,
  • births,
  • birth complications,
  • depression,
  • violent behaviour,
  • or even criminal activity.

Blame it on the confirmation bias

Over the years the confirmation bias has picked up the blame for all sorts of dodgy beliefs. Here are a few:

  • People are prejudiced (partly) because they only notice facts which fit with their preconceived notions about other nations or ethnicities.
  • People believe weird stuff about flying saucers, the JFK assassination, astrology, Egyptian pyramids and the moon landings because they only look for confirmation not dis-confirmation.
  • In the early nineteenth century doctors treated any old disease with blood-letting. Their patients sometimes got better so doctors—who conveniently ignored all the people who died—figured it must be doing something. In fact for many ailments some people will always get better on their own without any treatment at all.

Fight the bias

The way to fight the confirmation bias is simple to state but hard to put into practice.

You have to try to think up, and test out, alternative hypotheses.

Sounds easy, but it’s not in our nature.

It’s no fun thinking about why we might be misguided or have been misinformed. It takes a bit of effort.

It’s distasteful reading a book which challenges our political beliefs, or considering criticisms of our favourite film or, even, accepting how different people choose to live their lives.

Trying to be just a little bit more open is part of the challenge that the confirmation bias sets us.

Can we entertain those doubts for just a little longer?

Can we even let the facts sway us and perform that most fantastical of feats: changing our minds?

Precrastination: Why People Complete Tasks Early When There’s No Need

Precrastinators may answer an email too quickly or submit a report too early with too little information.


Allowing The Mind To Wander Is More Pleasant Than We Predict (M)

Most people dislike spending time alone thinking and predict it will be unpleasant.

