You Are Not So Smart
Books | Humor / Form / Essays
4.1
(109)
David McRaney
An entertaining illumination of the stupid beliefs that make us feel wise, based on the popular blog of the same name.

Whether you're deciding which smartphone to purchase or which politician to believe, you think you are a rational being whose every decision is based on cool, detached logic. But here's the truth: You are not so smart. You're just as deluded as the rest of us, but that's okay, because being deluded is part of being human.

Growing out of David McRaney's popular blog, You Are Not So Smart reveals that every decision we make, every thought we contemplate, and every emotion we feel comes with a story we tell ourselves to explain them. But often these stories aren't true. Each short chapter, covering topics such as Learned Helplessness, Selling Out, and the Illusion of Transparency, is like a psychology course with all the boring parts taken out. Bringing together popular science and psychology with humor and wit, You Are Not So Smart is a celebration of our irrational, thoroughly human behavior.
More Details:
Author: David McRaney
Pages: 320
Publisher: Penguin Publishing Group
Published Date: 2012-11-06
ISBN: 1592407366 (ISBN-10), 9781592407361 (ISBN-13)
Community Reviews
"This book presents various ways we delude ourselves, or in other words, the ways in which we don't think the way we think that we think. Although the information in it is useful and the writing style vibrant, I was astounded by the number of errors it contains. If you're going to publish summaries of studies you have a duty to read the studies thoroughly and paraphrase them faithfully. McRaney doesn't seem to have done this as carefully as he ought to have.<br/><br/>I sent McRaney a list of errata. He replied to apologize sincerely, to thank me for politely pointing out the errors, and to say that future editions will have them corrected; in the first edition, he had been pressured by a major publisher to turn his popular blog into a book in only three months. Some factual errors you should be aware of if reading an earlier version of the book are:<br/><spoiler><br/> * McRaney mentions a study by Ariely and Wertenbroch on procrastination held over the course of three weeks (50). The study was spread over a twelve-week semester, not over three weeks. (Source: Predictably Irrational by Dan Ariely, p. 114)<br/> * He writes that "To suddenly stop moving and hope for the best is called fear bradycardia" (59). The word "bradycardia" just means a slow heartbeat; it can happen as a result of freezing from fear, but is not itself the proper term for such freezing. The proper term is tonic immobility, which McRaney throws in in the next sentence.<br/> * McRaney assumes that an experiment dividing a sum of $10 between two people scales linearly when dividing $1,000,000 (116-7). I see no evidence to suggest that this is the case. Although people would be offended enough to reject the money if offered only $1 by a person dividing $10, they would probably still accept the same portion ($100,000) of a million. I know of no evidence to suggest that people will turn down massive amounts of money out of petty spite.<br/> * He writes that "a hip-to-waist ratio of .67 to .80 correlates to health" (133). That should say "a waist-to-hip ratio." Obesity does not correlate to health. <br/> * He writes that when subjects "rated their abilities after being primed to think the task was considered simple, people said they performed better than average" (159). This shift sounded so illogical that I looked up the paper referenced to check what it said; I found that this rating wasn't in fact a result of priming but a result of the easy task actually being easier than the difficult one (not just being the same task referred to differently). People think they perform above average on easy tasks and below average on difficult ones, but it's not true that telling them a task is easy will make them think they did above average on it. Think about it: if I were to tell you that a difficult task was "easy" for others, you would assume yourself to be even further below average because it was not easy for you — the opposite of what McRaney writes.<br/> * "The most famous conformity experiment was performed by Stanley Milgram in 1963" (187). The experiments actually began in 1961. They were <i>published</i> in 1963.<br/> * "psychologist Hazel Markus at the University of Michigan says..." (241). Markus studied at the University of Michigan, but she's a professor at Stanford. Also McRaney repeatedly refers to Hazel Markus as "he." 
She's a she.<br/>"</spoiler><br/><br/>One of the principles of self-delustion that struck home with me was hindsight bias: the tendency to believe that we've always known something after we learn about it for the first time. I tend to do that quite a bit, which gets frustrating because sometimes it feels like I'm not learning from what I've read. Recently I've found that that jotting down notes about interesting points the book raised as soon as I come to those points helps to combat hindsight bias. (It also helps to remember and to find the source again later. I recommend doing it.) Some such points in this book are:<br/><spoiler><br/> * People subconsciously connect washing their hands with washing away guilt (2-3).<br/> * When stockings are placed side by side, people are most likely to think the ones on the right are the best, but when asked why they think this, they say it has nothing to do with the position (25).<br/> * Hindsight bias says that after we know something (such as after reading a study), it seems obvious that that thing was true. We'll even agree with opposite viewpoints after hearing different proverbs supporting them (32-5).<br/> * Counteracting procrastination may not be so much about having willpower in the moment as about developing ways to set clear plans and stick to them (44-52).<br/> * When you rate the risks and benefits of something and then read about the risks, you will then regard not only the risks as being greater, but also the benefits as being lesser (144). (The reverse is true if you read about the benefits instead.)<br/> * After witnessing a staged crime and then being asked to pick the criminal out of the lineup 78% of people identified one of the innocent people in the line even though the person who acted as a criminal was not present (179).<br/> * If you give two groups lists of words to unscramble where the first two puzzles are hard for one group and easy for the other — and have everyone move on to the next word once the people in the easy group have finished solving theirs — then by the time you get to a third word (which is the same for both lists) the group with the hard initial words will not solve it because they will have learned helplessness (207-8).<br/> * Sometimes when you are afraid of failure you may handicap yourself ahead of time so that if you fail you can blame it on the handicap rather than your own abilities (227-30). <br/> * Even when people are told that debaters are arguing in favor of a side assigned to them, they still assume that those people actually believe what they're saying (270).<br/></spoiler>"