I have felt vaguely guilty about a nonexistent marshmallow for several years now.
This guilt was triggered by the famous Stanford marshmallow experiment, in which psychologist Walter Mischel gave preschool-aged children a marshmallow (or other treat) and told them they could either eat the single marshmallow now, or hold off for 15 minutes and get a second one. Mischel and his researchers revisited the same test subjects as they grew up, and found that the kids who could wait for the second marshmallow became successful adolescents and adults. Mischel concluded that the kids who waited were wired for delayed gratification.
So why should this study make me feel vaguely guilty? Because I've met me, and I know there was no way four-year-old Emily would have been able to hold off on eating a delicious marshmallow sitting right in front of her. (Nearly 40-year-old Emily would still struggle with this exercise, and I have the empty Jet-Puffed bags to prove it.)
Despite the fact that I have shown the ability to delay gratification when it comes to finances, education, career goals, purchases, vacations, and other important decisions, I believed that some aspect of my brain wiring was suboptimal compared to those who could wait for a sweet treat.
Except Mischel's experiment may not have proved what he thought it did. A recent study was unable to replicate his results, and Mischel's experiment is not the only one with a replication problem. It turns out that a number of the most famous behavioral science studies rest on somewhat shaky research. Replication attempts keep finding that the results from the original studies are inconclusive, lacking important nuance, or just plain wrong. That's why we're likely to see more big ideas from behavioral psychology challenged in the coming years.
So before you rethink your day-to-day habits, consider how these famous studies have been exposed as more myth than truth.
Mischel's original experiment from the late 1960s looked at a small sample of just 90 children, all from the preschool on Stanford's campus. The new study, by NYU's Tyler Watts and UC Irvine's Greg Duncan and Haonan Quan, expanded the number of test subjects to 900 and made sure to include children more representative of the general population in terms of race, ethnicity, and parents' education level.
What the new researchers found was that children's ability to wait correlated most strongly with social and economic factors. A kid from a poorer family may not be able to count on food being there from one day to the next, so she has learned to take what she can when it's available — while a kid from a more affluent family may have learned that delaying gratification can be worth it.
When the researchers controlled for family income, they found that rich kids who ate the marshmallow right away did no worse on standardized tests as adolescents than their counterparts who waited, and poorer kids who waited did no better than those who dug in.
But the kids from higher-income families did tend to show a greater ability to delay gratification, which suggests our "willpower" is less wired into our brains and more a function of how we were raised.
Speaking of willpower, one of the most influential studies on the subject was conducted two decades ago by Roy Baumeister and Dianne Tice. The researchers set out a plate of freshly baked chocolate chip cookies and a bowl of radishes. As participants filed in, they were instructed to take either a cookie or a radish. The test subjects were then given an impossible puzzle to solve. The participants who were allowed to snack on cookies spent twice as long trying to solve the puzzle before giving up as those who had to eat radishes while smelling and staring at the plate of cookies.
Baumeister and Tice described the phenomenon as "ego depletion." They believed that forcing yourself to not eat the cookies right in front of you tuckered out your willpower, leaving you less mental energy to keep working on a frustrating puzzle. If willpower is a finite resource, you can use it up by making little choices before facing a big one.
But other researchers have been unable to replicate the original study, and meta-analyses (that is, analyses pooling all of the studies on this phenomenon) have also found little evidence of ego depletion.
What it comes down to is that our ability to exert willpower can depend heavily on our motivations, beliefs, and mindset. It's not as simple as the original study made it sound — that willpower is like a muscle and can be exhausted like one.
You may have seen the popular TED talk on power poses, presented by researcher Amy Cuddy. Cuddy presents some pretty compelling arguments about the power of body language to affect our feelings of confidence and power. She and her research colleagues found that standing in a Superman pose or other powerful pose for two minutes increased levels of testosterone, decreased cortisol (the stress hormone), and increased levels of risk-taking behavior.
This was great news for anyone feeling nervous about a speech, a job interview, or talking to that cute guy in accounting. Just find a private spot to pretend to be Superman for a couple of minutes, and you can enter into your nerve-wracking encounter with increased confidence, thanks to your hormones.
Except that a follow-up study that included four times as many participants as the original found no such hormonal effect. Even Cuddy now describes herself as "agnostic" on the hormonal effects, although she maintains that assuming such poses does help people feel more powerful.
Cuddy may be correct, although not for replicable, scientific reasons. Thanks to the placebo effect (the phenomenon in which people experience a benefit simply because they expect one) and the huge popularity of power poses, many who try them will indeed feel more powerful after striking a pose. (See also: 5 Mental Biases That Are Keeping You Poor)
Nobel Prize-winning economist Daniel Kahneman wrote about priming in his book Thinking, Fast and Slow. Priming is the theory that subtle cues in an environment can affect an individual's behavior.
Various studies on priming found that participants walked more slowly after hearing and using words associated with aging (like Florida and bingo); that people were more honest when a representation of eyes was nearby as they faced a choice to steal or cheat; that holding a warm drink while speaking with someone made people feel more warmly toward their conversational companion; and that seeing money made people behave more selfishly.
However, follow-up studies have not been able to reproduce these results, suggesting that our behavior is not nearly as impressionable as the original studies made us believe. This is good news: we are more in control of our behavior and reactions, and less captive to our environment, than these studies suggested.
Behavioral science is a fascinating field that is truly helping us to better understand why we make the irrational decisions we do. But it is important to remember that the researchers in this field are human, as are the journalists who report on their studies. There are bound to be imperfect studies, mistakes, and even frauds that are touted as the next great truth about financial and psychological behavior. Don't let those "truths" keep you from following what works for you. Especially if what works for you is eating the marshmallow right away.