Don’t feel bad about this; we are all in the same boat when it comes to making bad decisions or being unduly influenced.  The science behind advertising and persuasion has come a long way, and knowing how advertisers manipulate you and the rest of the public is valuable knowledge.  James Garvey lists three of the ways we are vulnerable to persuasion: the representativeness heuristic, the availability heuristic, and the anchoring effect.  Before we discuss these, though, here is a brief overview of how we think, and of the shortcuts our brains take that make life generally go well, but not always thoughtfully.

[…] by distinguishing between two kinds of thinking: fast, automatic, intuitive thinking and slow, reflective, rational thinking.  You can imagine that these two kinds of mental activities are the work of two parts of your mind, two systems that swing into different kinds of action to accomplish different tasks.  The part that is responsible for the first kind of thinking is called System 1 or the Automatic System, and the part that engages in slower, more careful thought is called System 2 or the Reflective System. 

   System 1 operates quickly and automatically.  This feels instinctive and intuitive, and it requires no effort on your part.  System 1 is in charge when you orient yourself to a sudden sound, wince involuntarily when you see something that disgusts you, read anger in the lines on someone’s face, and recognize written words in your native tongue – it all just clicks fluently and automatically, without you thinking about it at all. 

   The work of System 2, the Reflective System, takes effort, an act of deliberate concentration on your part.  Your deliberative efforts are limited and cannot be sustained for very long without degradation, a phenomenon called ego depletion.   System 2’s work is voluntary, slower than your gut reactions, and associated with the experience of choice and agency. 

[…]

   The two systems interact with each other in a number of surprising ways.  System 1 typically engages in a kind of constant monitoring, throwing up a series of impressions and feelings that System 2 might endorse, ignore, check, focus on, act upon, or simply go along with.  Much of the time System 2 is in a low-power state, aroused only when the Automatic System encounters something it cannot handle. 

[…]

  Our mental resources are therefore limited.  It is an effort to bring System 2 into play, and it can be overloaded by trying to do too much.  So evolution has taught us a number of shortcuts, rules of thumb or heuristics, which conserve our mental energies and serve us well most of the time.

[…]

  But it also means that we go wrong in systematic, predictable ways – we are constitutionally susceptible to cognitive biases, and in turn, we can be nudged.

[…]

   We use shortcuts to arrive at judgments too.  […] It’s a large part of the theoretical framework behind contemporary persuasion, and it’s already shaping our world and changing our lives. 

 

   Consider this description of Steve. 

   ‘Steve is very shy and withdrawn, invariably helpful, but with little interest in people, or in the world of reality.  A meek and tidy soul, he has a need for order and structure, and a passion for detail.’

   What do you think Steve does for a living?  Is he more likely to be a farmer, salesman, airline pilot, librarian, or physician?  Once you have an answer to that question, ask yourself what job he is least likely to have.

[…]

    Very many people, including me when I first read that description, conclude that Steve is most likely a librarian – how could a shy guy like that possibly be a salesman? – and in coming to this conclusion we make use of what Kahneman and Tversky call the representativeness heuristic.   We let our automatic faculties rip and take a shortcut to an answer.  If one slows down and thinks about it, though, there are a lot more farmers than librarians in the world.  That’s extremely pertinent information if you are trying to guess which job on a list is most likely for anybody, and it should lead us to conclude that it’s most likely Steve is a farmer, maybe a shy and withdrawn farmer, but still a farmer.  The probability that Steve is a librarian is instead assessed by the extent to which the description of Steve matches up with or is representative of the stereotype of a librarian we have in our heads. 

[…]

  We do this entirely automatically, and it has an effect on a host of judgments – how likely we think politicians are to be good leaders, how likely a new business is to succeed, and how likely our doctor is to be competent. 

[…]

   People who understand persuasion will take care to fit the right stereotype and make it easier for us to come to conclusions about them automatically. 

 

A second set of biases results from what Kahneman and Tversky call the availability heuristic.  When we think about how likely some event is, we’re affected by how readily examples come to mind. 

[…]

   We are likely to over-estimate the number of wayward politicians, shark attacks and meltdowns at nuclear plants because we can probably easily recall instances of such things.  The problem is that how easily we can recall something has less to do with how likely or common or worrying an occurrence is and more to do with what we happen to have heard about in the news recently and how striking that news was to us.  The news you choose to watch therefore has a lot of power over you.  The stories it repeats reinforce your susceptibility to the availability effect. 

[…]

   We over-react at first, then under-react as time goes on.  […]  Because of its salience, we think homicide is more common than suicide, but it isn’t.  In fact, Americans are more likely to take their own lives than be murdered or die in a car crash, but because murder and car accidents are more newsworthy, dramatic and available than suicide, we concern ourselves more with home alarm systems and airbags than the signs of depression. 

   A final kind of bias identified by Kahneman and Tversky, perhaps the most interesting and difficult to accept of the three, is called the anchoring effect.  When people first think about a number and try to estimate an unknown quantity, the initial number affects their guess, anchors it – the estimate they make tends to stay nearby.  Again, the rule of thumb in play isn’t too bad a guide, and we use it all the time.  What’s the population of Pittsburgh?  If you don’t know, but you do know that Philadelphia is the largest city in Pennsylvania, and it has about 1.5 million people in it, you might feel able to guess about Pittsburgh.  It’s certainly smaller than Philadelphia – maybe it’s half the size, so perhaps Pittsburgh has a population of fewer than 750,000.  Maybe 600,000?

    There are two very weird facts about this familiar process of guessing a quantity.  First, we tend to undercook the adjustments we make from the original guess.  Once we have a number and begin adjusting in the direction we think is right, we tend to stay too close to the anchor, possibly because once we find ourselves in uncertainty, we can’t think of a good reason to carry on, so we play it safe and stop too soon.  Pittsburgh is smaller than Philadelphia, so we adjust downwards, but how far downwards?  In fact, in this example we stayed much too close to the anchor, as we usually do.  Just 300,000 people live in Pittsburgh.

    Second, it doesn’t matter where the first figure comes from, it will still anchor our estimates, even if it has nothing at all to do with the domain in question.  According to at least one understanding of what’s going on in such cases, sometimes System 2 is in charge, finding what it hopes to be a reasonable anchor and adjusting off it to estimate an unknown quantity.  But sometimes System 1 gets hooked on an anchor and freely associates, without our conscious control, and the cascade of associations ends up affecting our later estimate, whether it’s reasonable or not. 

   Tversky and Kahneman illustrated this second kind of anchoring with a rigged roulette wheel – it showed numbers from 0 to 100 but it actually stopped on either 10 or 65.  They spun the wheel and asked a group of students to write the number down, and then answer two questions.

   ‘Is the percentage of African nations among the UN members larger or smaller than the number you just wrote?’

  ‘What is your best guess of the percentage of African nations in the UN?’ 

    The average guess of those who saw the number 10 was 25 percent.  The average guess of those who saw the number 65 was 45 percent.  A roulette wheel is not a particularly informative thing if you’re trying to work out how many African nations are members of the UN, but still, those who saw the high number guessed higher than those who saw the low number.  Even ludicrous anchors have an effect on us.”

– James Garvey, The Persuaders, pp. 55–66

 

Yeah, so being wary of your System 1 answers is probably a good thing.  Bad news for the anchoring effect, as even when you’re told about it, it still works on you. :/
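Garvey’s base-rate point in the Steve example can be made concrete with a quick Bayes’-rule sketch.  To be clear, the numbers below are invented for illustration (they aren’t from Garvey or Kahneman); the point is only that when farmers vastly outnumber librarians, even a description that fits a librarian much better still leaves “farmer” the more probable answer:

```python
# Illustrative Bayes'-rule sketch of the Steve example.
# All numbers are made-up assumptions, chosen only to show the shape
# of the base-rate argument.

# Assumed base rates: farmers vastly outnumber librarians.
prior = {"farmer": 0.95, "librarian": 0.05}

# Assumed chance that someone in each job fits the
# "shy, meek, tidy" description.
fits_description = {"farmer": 0.10, "librarian": 0.60}

# Bayes' rule: posterior is proportional to prior * likelihood.
unnormalized = {job: prior[job] * fits_description[job] for job in prior}
total = sum(unnormalized.values())
posterior = {job: unnormalized[job] / total for job in unnormalized}

# The description "represents" a librarian six times better,
# yet the base rate dominates: Steve is still probably a farmer.
print(posterior)  # farmer ≈ 0.76, librarian ≈ 0.24
```

The representativeness heuristic, in this framing, is System 1 judging only by `fits_description` and throwing the `prior` away.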
