Saturday, June 11, 2011


KLUGE: The Haphazard Construction of the Human Mind, by Gary Marcus

An excellent (introductory) book for whoever wants to start learning about the brain, or to understand why we are the way we are, in terms of how our brain functions in various aspects, across chapters on Memory, Belief, Choice, Pleasure, Language and a few others.

In this book, the author discusses several bugs in our cognitive makeup.

confirmation bias
No matter what we humans think about, we tend to pay more attention to stuff that fits in with our beliefs than stuff that might challenge them. Psychologists call this "confirmation bias." When we have embraced a theory, large or small, we tend to be better at noticing evidence that supports it than evidence that might run counter to it.

mental contamination
Our subjective impression that we are being objective rarely matches the objective reality: no matter how hard we try to be objective, human beliefs, because they are mediated by memory, are inevitably swayed by minutiae that we are only dimly aware of.

The bottom line is that every belief passes through the unpredictable filter of contextual memory. Either we directly recall a belief that we formed earlier, or we calculate what we believe based on whatever memories we happen to bring to mind.

anchoring and adjustment
During the process of anchoring and adjustment, people begin at some arbitrary starting point and keep adjusting until they find an answer they like (for questions or situations where they have no clue about the answer).

This is very interesting and needs an example; consider the scenario below:

Imagine that the nation is preparing for the outbreak of an unusual disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimates of the consequences of the programs are as follows:

If Program A is adopted, 200 people will be saved.
If Program B is adopted, there is a one-third probability that 600 people will be saved and a two-thirds probability that no people will be saved.

Most people would choose Program A, not wanting to put all the lives at risk. But people's preferences flip if the same choices are instead posed this way:

If Program A is adopted, 400 people will die.
If Program B is adopted, there is a one-third probability that nobody will die and a two-thirds probability that 600 people will die.

"Saving 200 lives" for certain (out of 600) somehow seems like a good idea, whereas letting 400 die (out of the same 600) seems bad — even though they represent exactly the same outcome.
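That the two framings describe exactly the same outcomes can be checked with a bit of arithmetic. A minimal sketch (my own illustration, not from the book) using exact fractions to compare the expected numbers of lives saved and lost:

```python
from fractions import Fraction

TOTAL = 600  # total lives at stake in both framings

# Framing 1: expected lives SAVED
saved_A = Fraction(200)                            # certain outcome
saved_B = Fraction(1, 3) * 600 + Fraction(2, 3) * 0  # the gamble

# Framing 2: expected lives LOST
died_A = Fraction(400)                             # certain outcome
died_B = Fraction(1, 3) * 0 + Fraction(2, 3) * 600   # the gamble

# Translating the "deaths" framing back into survivors
# gives the same numbers as the "saved" framing:
print(saved_A, TOTAL - died_A)   # 200 200
print(saved_B, TOTAL - died_B)   # 200 200
```

Every option, certain or gambled, works out to the same 200 expected survivors; only the description changes, which is the whole point of framing.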

Only the wording of the question, what psychologists call framing, has been changed. This is precisely what advertisers, politicians and bureaucrats do.

inadequate self-control,
Which we all know ;)
the ruminative cycle,

the focusing illusion,
shows how easy it is to manipulate people simply by directing their attention to one bit of information or another.

motivated reasoning,
Our tendency to accept what we wish to believe (what we are motivated to believe) with much less scrutiny than what we don't want to believe is a bias known as "motivated reasoning."

and false memory,
not to mention absent-mindedness, an ambiguous linguistic system, and vulnerability to mental disorders.

And the author puts it very well about paranoia:

Once someone starts down that path — for whatever reason, legitimate or otherwise — the person may never leave it, because paranoia begets paranoia. As the old saying puts it, even the paranoid have real enemies; for an organism with confirmation bias and the will to deny counterevidence (that is, motivated reasoning), all that is necessary is one true enemy, if that. The paranoid person notices and recalls evidence that confirms his or her paranoia, discounts evidence that contradicts it, and the cycle repeats itself.

(This blog post contains my notes on a few excerpts; I suggest readers read the complete book.)
