Are We Innately Selfish? What the Science Has to Say
One of the key reasons for the unparalleled success of our species is our ability to cooperate. In the modern age, we are able to travel to any continent, feed the billions of people on our planet, and negotiate sweeping international trade agreements—all accomplishments that would not be possible without cooperation on a massive scale.
While intra-species cooperation is not a uniquely human ability, one thing that sets our cooperative behavior apart from that of other animals is our willingness to cooperate with those outside our social group.1 In general, we readily trust strangers for advice, work together with new people, and are willing to look out for and protect people we don't know—even when there is no direct incentive for us to do so.
However, while much of our success can be attributed to cooperation, the motivations underlying this ability have yet to be fully understood. It is clear that we often display cooperative and pro-social tendencies, but is cooperation something we are naturally hardwired to do? Or are our first instincts inherently selfish, such that we can only cooperate with others by consciously suppressing our selfish urges?
Indeed, these questions have been debated by philosophers for millennia. For most of that time, the prevailing view was a pessimistic one: that we are innately selfish.
Plato compared the human soul to a chariot pulled by two opposing horses: one horse is majestic, representing our nobility and pure-heartedness, while the other is evil, representing our passions and base desires. On this view, human behavior is an eternal tug-of-war between these two horses, in which we desperately try to keep the evil horse under control.2
The moral philosopher Arthur Schopenhauer argued for a similar perspective, writing that “Man is at bottom a dreadful wild animal. We know this wild animal only in the tamed state called civilization and we are therefore shocked by occasional outbreaks of its true nature; but if and when the bolts and bars of the legal order once fall apart and anarchy supervenes it reveals itself for what it is.”3
Adam Smith, the father of economics, also echoed this view, famously writing in The Wealth of Nations: “It is not from the benevolence of the butcher, the brewer, or the baker that we expect our dinner, but from their regard to their own interest.”4
These beliefs about our selfish nature are echoed in many of the teachings we encounter in everyday life. In Christianity, for instance, the Seven Deadly Sins and the Golden Rule teach us to repress our innermost selfish desires in order to think about others. Another example is economics, where neoclassical theory is built on the assumption that we are selfish, rational decision-makers.
You may be inclined to agree with these ideas. Everyone has heard stories of cheating, lying, and stealing—moments when our selfish impulses reveal the worst of human nature.
But despite the long legacy of these beliefs, the idea that we are innately selfish is increasingly being challenged. Insights from the behavioral sciences suggest that we have a cooperative instinct, and that selfish behavior emerges only when we have the time and ability to strategize about our decisions.
The interaction of System 1 and System 2
Anyone remotely interested in psychology or economics has probably heard of the dual-system theory of decision-making: the idea that our decisions are governed by two opposing cognitive “systems.” System 1 is the fast, automatic, and emotional mode of thinking, while System 2 is the slow, effortful, and deliberative one.5
These two systems are closely related, and their interaction and relative levels of activation can determine our behavior. Certain conditions can enhance or inhibit one system's influence on the decision-making process. For instance, making a decision while overwhelmed by multiple tasks, time pressure, or mental and physical exhaustion can weaken an individual's System 2 thinking and make them more reliant on System 1 judgments.6
This should be unsurprising: when you're mentally overwhelmed, you probably aren't thinking things through, and you're far more likely to decide on impulse! In a similar fashion, giving people time to make decisions, or incentivizing them to think deeply, can suppress System 1 and strengthen the influence of System 2.
Through this lens of the interaction between System 1 and System 2, researchers in psychology and economics have found a new way to answer this age-old question. By manipulating elements such as time pressure to enhance impulsivity in some subjects and promote deliberation in others, researchers have been able to differentiate the effects of System 1 and System 2 on our behavior to see whether we truly are instinctively selfish or cooperative.
The cooperative instinct
Experiments that require cooperation between participants are used to investigate instinctive versus calculated greed. Take the public goods game, for instance. In this game, players are placed in groups and each given an endowment (typically around $10). They are asked to contribute some portion of their endowment to a “public good”; the pooled contributions are then doubled and split evenly among all players. You should be able to spot the interesting dynamic here: if everyone cooperates and contributes more to the public good, everyone benefits, but by holding back your own contribution, you benefit at the expense of the group (the sketch below makes the payoffs concrete).
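To make the incentive structure concrete, here is a minimal sketch of the payoffs. The group size and exact amounts are illustrative assumptions (four players, a $10 endowment each, contributions doubled and split evenly); the published experiments vary in their parameters.

```python
# Minimal sketch of public goods game payoffs.
# Assumptions for illustration: 4 players, $10 endowment each,
# contributions doubled and the pot split evenly among all players.

def payoffs(contributions, endowment=10, multiplier=2):
    """Each player keeps what they didn't contribute, plus an equal share of the pot."""
    pot = multiplier * sum(contributions)
    share = pot / len(contributions)
    return [endowment - c + share for c in contributions]

print(payoffs([10, 10, 10, 10]))  # everyone cooperates: [20.0, 20.0, 20.0, 20.0]
print(payoffs([0, 10, 10, 10]))   # one free rider:      [25.0, 15.0, 15.0, 15.0]
print(payoffs([0, 0, 0, 0]))      # nobody cooperates:   [10.0, 10.0, 10.0, 10.0]
```

Under these assumed numbers, contributing less always leaves an individual better off in a single round, yet a fully cooperative group ends up richer than a fully selfish one, which is exactly the tension the experiments exploit.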
What happens when you are asked to make this contribution while relying mainly on System 1 (i.e., when System 2 is weakened by time pressure or some other form of cognitive strain)? It turns out that participants required to make their decision within 10 seconds acted more cooperatively: those who acted on impulse contributed more to the public good than those who had time to think about their contributions.7
What was also fascinating about this study was that, when participants were given time and encouraged to think about their decisions, they opted to be greedier. Apparently, when relying on instinct, we are willing to cooperate; but when given the chance to weigh the costs and benefits of our decisions, we think more about our own outcomes than about those of others.
These findings also held true for the prisoner's dilemma, another game built around a cooperative dynamic (if you're from the UK, it is analogous to the “split-or-steal” round of the game show Golden Balls). Similar results emerged when the experiments were run in person rather than through a computer program.
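For readers less familiar with the prisoner's dilemma, here is a minimal sketch of its payoff structure; the specific numbers are illustrative assumptions, not values from the cited experiments.

```python
# Illustrative prisoner's dilemma payoffs: (my payoff, their payoff),
# indexed by (my move, their move). Numbers are assumptions for illustration.
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}

# In a single round, defecting pays more no matter what the other player does...
for their_move in ("cooperate", "defect"):
    assert PAYOFFS[("defect", their_move)][0] > PAYOFFS[("cooperate", their_move)][0]

# ...yet mutual cooperation (3 each) beats mutual defection (1 each):
# the same individual-versus-group tension as in the public goods game.
```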
These findings are certainly fascinating, but you might be thinking that behavior in a lab experiment may not be replicable in real life. Let’s say, for example, someone approached you on the street and asked you to contribute to a charity, and you had no time to make a decision (perhaps you’re late for work). Do you think you would donate? Perhaps more field research is necessary to confirm these findings in real-world scenarios.
Another approach to studying our cooperative instincts is to examine the behavior of babies. Intuitively, babies should represent humankind in its most primal state, when we are most reliant on instinct to make decisions. Biologically, babies are born with underdeveloped brains and are extremely helpless, which is why humans take much longer to mature than other animals do. (We likely evolved this way because a head any larger would struggle to fit through the birth canal.)8 So, investigating the cooperative or selfish tendencies of babies should, in theory, reveal our true human nature.
And indeed, researchers have found that babies display a strong tendency to cooperate. Toddlers as young as 14 to 18 months will pick up and hand you an object you accidentally dropped without any praise or recognition; they will share with others; and they will point out things that benefit another person, even when doing so brings no benefit to themselves.9 This contrasts with infant chimpanzees, who do not display the same degree of cooperativeness at a young age, suggesting that instinctive cooperation may be a uniquely human trait.
Why are we instinctively cooperative?
So it seems possible that the great thinkers of history were wrong—perhaps we are not as selfishly wired as we think. The findings from the public goods game and the infant studies suggest that we may actually be instinctively cooperative rather than selfish. But what could explain this?
From an evolutionary biology perspective, it could be that genes favoring cooperation were selected for because cooperation was an effective survival strategy: those who were more innately cooperative secured better outcomes and survived long enough to pass on their genes to their offspring.10
But there are also many instances where our first impulse is not to cooperate, and many where, after much deliberation, we still decide to cooperate. We've all met people who simply seem less trustworthy, and we can all think of times when we ended up trusting somebody after having a long time to think it over—after mulling over a business deal, for example, or before buying something expensive from a stranger.
The social heuristics hypothesis (SHH) aims to tie these observations together. It predicts that whether our intuitive response is cooperative or selfish depends on our individual history as well as the context we are in.11
Our intuitive responses are largely shaped by behaviors that proved advantageous in the past. Imagine, for instance, that you play for a basketball team. If you find that working together with your teammates helps win matches, you will gradually develop an instinct to cooperate with them in order to keep winning. But if you come to recognize that you are carrying the team and that trusting your teammates is actually hindering its results, you will develop more instinctively selfish behaviors and stop passing to them as frequently.
With this perspective, our instinctive responses all depend on which strategy—cooperation or selfishness—worked for us in the past. This can explain why most participants in the public goods game chose to cooperate: cooperative behaviors are typically advantageous in our daily lives.12
In our modern age, our lives are more interconnected than ever. There are over 7 billion of us, our experiences are instantly shareable on social media, and our businesses depend on close collaboration with partners in order to mutually benefit. Behaving in accordance with social norms13 matters more than ever: we rely on cooperation with others every day, and self-serving behavior often brings social criticism and damage to one's reputation. We quickly learn to cooperate and conform to these norms, and this, in turn, hardwires our instincts towards more cooperative behavior.
Deliberation, on the other hand, allows us to adjust to specific situations and override an intuitive response that is not actually beneficial in the present context. In other words, deliberation lets us strategize and suppress our instinctive desires in order to choose the better option, whether that is cooperation or noncooperation. When there are no future consequences, as in the public goods game experiment, deliberation will likely skew towards selfishness even if our instincts are cooperative: we realize that strategic selfishness leaves us better off and that we won't be punished for free-riding.
However, when there are future consequences, deliberation will favor cooperation or noncooperation depending on which the individual believes is more strategically advantageous. Take the star basketball player again: although his instinctive response is to go it alone, the potential future consequences of selfish play (e.g. unhappy teammates, criticism from observers, being dropped by the coach) may lead him to override his initial impulse and work with his team, since doing so would be strategically advantageous. Our System 2 processes allow us to stop, examine our intuitions, and strategize accordingly.
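To see why deliberation can cut both ways, here is a small sketch comparing a one-shot game with a repeated one. It reuses the illustrative payoffs from the earlier sketch and assumes a hypothetical opponent who simply reciprocates your previous move (tit-for-tat); none of this comes from the cited studies.

```python
# Sketch: with no future consequences, defection pays;
# when moves are reciprocated over many rounds, cooperation pays.
# Payoffs and the tit-for-tat opponent are illustrative assumptions.
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}

def total_against_tit_for_tat(my_move, rounds):
    """My total payoff if I always play `my_move` against a reciprocating opponent."""
    total, their_move = 0, "cooperate"  # tit-for-tat starts by cooperating
    for _ in range(rounds):
        total += PAYOFFS[(my_move, their_move)][0]
        their_move = my_move            # next round, they copy my last move
    return total

print(total_against_tit_for_tat("defect", 1))      # one-shot: defecting earns 5 (vs 3)
print(total_against_tit_for_tat("cooperate", 10))  # repeated: cooperating earns 30
print(total_against_tit_for_tat("defect", 10))     # repeated: defecting earns only 14
```

Under these assumed numbers, the strategically "smart" choice flips once future consequences enter the picture, which is what the social heuristics hypothesis suggests deliberation is tracking.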
Concluding remarks
So, there is compelling evidence against an idea that has shaped our teachings for millennia. The evidence seems to point to the conclusion that, in general, we have an innate desire to cooperate, and in fact, it is only when there are opportunities to be strategically selfish that we reveal our more undesirable tendencies.
Understanding our instinctive human tendencies will be essential as our species confronts some of the biggest challenges it has ever faced. Climate change, political tensions, and inequality threaten our very existence, and they can only be resolved through cooperation on a global scale. Within us lies an instinctive desire to cooperate; knowing this could inspire new and creative ways to rally people to tackle these challenges together.
References
- Melis, A. P., & Semmann, D. (2010). How is human cooperation different? Philosophical Transactions of the Royal Society B: Biological Sciences, 365(1553), 2663–2674. https://doi.org/10.1098/rstb.2010.0157
- Plato. (1972). Plato: Phaedrus (R. Hackforth, Ed.). Cambridge: Cambridge University Press. doi:10.1017/CBO9781316036396
- Schopenhauer, A. (1851). On reading and books. Parerga and Paralipomena.
- Smith, A. (1937). The wealth of nations [1776].
- Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.
- Loewenstein, G. (1996). Out of control: Visceral influences on behavior. Organizational Behavior and Human Decision Processes, 65(3), 272–292.
- Rand, D. G., Greene, J. D., & Nowak, M. A. (2012). Spontaneous giving and calculated greed. Nature, 489(7416), 427–430.
- Knight, M. (2018, June 22). Helpless at birth: Why human babies are different than other animals. Retrieved from: https://geneticliteracyproject.org/2018/06/22/helpless-at-birth-why-human-babies-are-different-than-other-animals/
- Warneken, F., & Tomasello, M. (2006). Altruistic helping in human infants and young chimpanzees. Science, 311(5765), 1301-1303.
- Robison, M. (2014, September 1). Are People Naturally Inclined to Cooperate or Be Selfish? Retrieved from: https://www.scientificamerican.com/article/are-people-naturally-inclined-to-cooperate-or-be-selfish/
- Rand, D. G. (2016). Cooperation, fast and slow: Meta-analytic evidence for a theory of social heuristics and self-interested deliberation. Psychological Science, 27(9), 1192–1206.
- Rand, D. G., & Nowak, M. A. (2013). Human cooperation. Trends in Cognitive Sciences, 17(8), 413–425.
- The Decision Lab. Social norms. Retrieved from: https://thedecisionlab.com/reference-guide/anthropology/social-norm/
About the Author
Tony Jiang
Tony Jiang is a Staff Writer at the Decision Lab. He is highly curious about understanding human behavior through the perspectives of economics, psychology, and biology. Through his writing, he aspires to help individuals and organizations better understand the potential that behavioral insights can have. Tony holds an MSc (Distinction) in Behavioral Economics from the University of Nottingham and a BA in Economics from Skidmore College, New York.