Dear Colleagues! This is Pharma Veterans Blog Post #484. Pharma Veterans welcomes the sharing of knowledge and wisdom by veterans for the benefit of the community at large. The Pharma Veterans Blog is published by Asrar Qureshi on WordPress, the top blog site. Please email your contributions for publishing here.

Continued from Previous……

Summarizing the findings of a book of this magnitude presents many challenges, some of which may be described as follows.

  1. The effort to simplify may affect the integrity and authenticity of the original concept
  2. Many other related, important concepts may be missed out
  3. The entire premise of the book may not be accurately portrayed

In the interest of treating the book respectfully, I would rather focus on the main theme only and leave everything else aside.

The ‘Intuitive’ System 1 and the ‘Attentive’ System 2 have been elaborated enough to establish a basic understanding. We now proceed to see how these systems affect our lives in the short and long term.


Decision making is our most frequently used faculty. We are required to make decisions all the time. These may be as small as whether or not to take tea now, or as big as deciding to buy a particular house. In the organizational context, we make a series of decisions within one large framework, for example, developing and finalizing a three-year business plan or an investment plan, or deciding about a merger or acquisition.

We need to understand that our process of decision making is not as sound as we prefer to think.

“Daniel Kahneman coined the acronym WYSIATI which is an abbreviation for “What You See Is All There Is”. It is one of the human biases that he explores when he describes how human decision-making is not entirely based on rational thought. Traditionally, economists believed in the human being as a rational thinker, that decisions and judgments would be carefully weighed before being taken. And much of traditional economic theory is based on that notion. Dr. Kahneman’s life’s work (along with his co-author Dr. Amos Tversky) explodes that notion and describes many of the short-comings of human decision-making. He found that many human decisions rely on automatic or knee-jerk reactions, rather than deliberative thought. And that these automatic reactions (he calls them System 1 thinking) are based on heuristics or rules of thumb that we develop or have hard-wired into our brains. System 1 thinking is very useful in that it can help the individuals deal with the onslaught of information that impinges on us each and every day, but the risk is when a decision that one is faced with should be thought through, rather than based on a knee-jerk reaction.

WYSIATI is the notion that we form impressions and judgments based on the information that is available to us. For instance we form impressions about people within a few seconds of meeting them. In fact, it has been documented that without careful training interviewers who are screening job applicants will come to a conclusion about the applicant within about 30 seconds of beginning the interview. And when tested, these initial notions are often wrong. Interviewers who are trained to withhold judgment about someone do a better job at applicant screening, and the longer that judgment is delayed the better the decision.”1

Our System 2 often endorses or rationalizes ideas and feelings generated by System 1. We may feel more optimistic about a project if we have a positive impression of its leader, and that impression may be rooted simply in the fact that he or she reminds us of someone we loved. We are likewise likely to disapprove if we dislike the presenter for some unrelated reason in our memory. We convince ourselves that our choices are based on rational thinking, which they are not.

System 2 does examine many of the impulses generated by System 1 and makes necessary corrections by paying attention to more detail. However, it can only look at what is available to it, which is almost always limited. Kahneman says, “we do not always think straight when we reason, and the errors are not always due to intrusive and incorrect intuitions. Often we make mistakes because we (our System 2) do not know any better.”

I would like to conclude this topic in the words of Daniel Kahneman rather than my own, as that is the more appropriate way. [Quote]

What can be done about biases? How can we improve judgments and decisions, both our own and those of the institutions we serve and that serve us? The short answer is that little can be achieved without a considerable investment of effort……

The way to block errors that originate in System 1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2. This is how you will proceed when you next encounter the Müller-Lyer illusion.2 Unfortunately, this sensible procedure is least likely to be applied when it is needed most. We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available, and cognitive illusions are generally more difficult to recognize than perceptual illusions. The voice of reason may be much fainter than the loud and clear voice of an erroneous intuition, and questioning your intuitions is unpleasant when you face the stress of a big decision. More doubt is the last thing you want when you are in trouble. The upshot is that it is much easier to identify a minefield when you observe others wandering into it than when you are about to do so. Observers are less cognitively busy and more open to information than actors…

Organizations are better than individuals when it comes to avoiding errors because they naturally think more slowly and have the power to impose orderly procedures. Organizations can institute and enforce the application of useful checklists, as well as more elaborate exercises, such as reference-class forecasting and the premortem. At least in part by providing a distinctive vocabulary, organizations can also encourage a culture in which people watch out for one another as they approach minefields. Whatever else it produces, an organization is a factory that manufactures judgments and decisions. Every factory must have ways to ensure the quality of its products in the initial design, in fabrication, and in final inspections. The corresponding stages in the production of decisions are the framing of the problem that is to be solved, the collection of relevant information leading to a decision, and reflection and review. An organization that seeks to improve its decision product should routinely look for efficiency improvement at each of these stages. The operative concept is routine. Constant quality control is an alternative to the wholesale review of processes that organizations commonly take in the wake of disasters. There is much to be done to improve decision making. One example out of many is the remarkable absence of systematic training for the essential skill of conducting efficient meetings……

There is a direct link from more precise gossip at the watercooler to better decisions. Decision makers are sometimes better able to imagine the voices of present gossipers and future critics than to hear the hesitant voice of their own doubts. They will make better choices when they trust their critics to be sophisticated and fair, and when they expect their decisions to be judged by how they were made, not only by how they turned out. [Unquote]



Disclaimer. Most pictures in these blogs are taken from Google Images which does not show anyone’s copyright claim. However, if any such claim is presented, we shall remove the image with suitable regrets.
