Being overly generous can get you punished as a nonconformist.
From an early age, we are taught that cooperation, generosity, and altruism are generally things we should strive for. But altruistic acts aren’t always lauded, and researchers have found that generous individuals are sometimes punished for their behavior. Studies suggest that people often react negatively to large contributions, are suspicious of those who offer help, and want to expel particularly charitable individuals from cooperative endeavors. These seemingly counterintuitive behaviors are called “antisocial punishment” and are more common than you might think. But why would people want to punish anyone who is particularly charitable?
The answer to that question would explain a puzzling human behavior, and it could have important ramifications for public policy. Tackling many of the major problems we currently face—from climate change to political stalemates—requires cooperation and collaboration. Understanding why people are sometimes willing to undermine joint efforts out of what appears to be nothing more than spite could go a long way toward improving cooperation and discourse in many areas.
Sociologists Kyle Irwin and Christine Horne suggest that our inclination to punish do-gooders may stem from our adherence to social norms. Using a clever experimental design that allowed them to manipulate the level of conformity among group members, the researchers investigated the relationship between antisocial punishment and social norms.
The setup
During the study, 310 undergraduates were asked to take part in a game based on points; the more points a participant ended up with, the better chance they had of winning one of three $100 Amazon gift cards.

The premise was relatively simple. Each participant was given 100 points and randomly assigned to a group of six players. In each round of the game, individuals were asked to contribute however many points they liked to a “group fund” that would be doubled by the experimenters and divided equally among the participants. In this scenario, everyone in the group would end up with twice what they started with if all participants donated all of their points, but free riders who donated fewer points—or even none at all—could still benefit from others’ contributions.
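To make the payoff arithmetic concrete, here’s a minimal sketch (in Python) of the group-fund rules described above. Only the 100-point endowment, the doubling, and the six-way split come from the study; the function name and the example contribution profiles are just for illustration.

```python
def payoffs(contributions, endowment=100, multiplier=2):
    """Each player keeps whatever they didn't contribute, plus an
    equal share of the doubled group fund."""
    pool = sum(contributions) * multiplier
    share = pool / len(contributions)
    return [endowment - c + share for c in contributions]

# If all six players donate everything, everyone ends up with double:
print(payoffs([100] * 6))                     # six payoffs of 200.0
# A free rider who keeps everything does even better than the donors:
print(payoffs([100, 100, 100, 100, 100, 0]))  # donors get ~166.7, the free rider ~266.7
```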
The participants made their choices in a predetermined order and could see each contribution as it was made, but they interacted with other group members through a computer rather than face-to-face.
But there was a pretty significant twist: since the researchers wanted to control some variables while manipulating others, much of what happened in the study was decided in advance (which, of course, was unbeknownst to the participants). There was only one actual study participant in each group; the other five “group members” were computer programs playing out predetermined roles. The human participant was always “randomly” chosen to be the fifth player to donate, and the four contributions that he or she observed before contributing always averaged 50 points, or half the total possible contribution.
By preprogramming these values, the researchers could manipulate the “social norm,” or the way most group members behaved. In the “strong” social norm condition, the contributions varied only slightly, ranging between 45 and 55 points; this represented a situation in which social conformity was high. In the “weak” social norm condition where conformity was lower, the first four predetermined contributions varied between 30 and 70 points.
Lastly, the contribution of the sixth and final group member was also set by the researchers, and it was either overly generous (90 of the 100 possible points) or overly stingy (a mere 10 points).
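For concreteness, here’s one way the scripted donations could be generated, as a rough sketch in Python. The ranges, the 50-point average, and the deviant’s 90- or 10-point contribution come from the study; the mirrored-offset trick and the function name are illustrative assumptions, since the exact preprogrammed values aren’t reported here.

```python
import random

def scripted_contributions(norm_strength, deviant_type, rng=random):
    """Four scripted donations that always average 50 points, drawn from a
    narrow band (strong norm: 45-55) or a wide one (weak norm: 30-70),
    followed by the deviant's contribution (90 if generous, 10 if stingy)."""
    spread = 5 if norm_strength == "strong" else 20
    # Mirror two random offsets around 50 so the four values average exactly 50.
    a, b = rng.randint(1, spread), rng.randint(1, spread)
    first_four = [50 - a, 50 + a, 50 - b, 50 + b]
    deviant = 90 if deviant_type == "generous" else 10
    return first_four, deviant

print(scripted_contributions("strong", "generous"))  # e.g. ([47, 53, 48, 52], 90)
```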
The fallout
The researchers weren’t particularly concerned with the size of the participants’ donations; instead, they wanted to know whether participants would choose to punish the final, nonconforming group member, whom they called the “deviant.”

After all the contributions were made, the participant was given the opportunity to punish any of the other group members. He or she could deduct points from any other player, but this came at a cost: for every three points subtracted from another group member, the punisher also lost a point.
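As a quick sketch of that punishment mechanic (the three-to-one ratio is from the study; the point totals below are made up for illustration):

```python
def punish(punisher_total, target_total, deduction):
    """Subtract points from another player; every 3 points deducted
    costs the punisher 1 point of his or her own."""
    return punisher_total - deduction / 3, target_total - deduction

# Deducting 21 points costs the punisher 7 points -- roughly the average
# cost that participants actually incurred.
print(punish(200, 250, 21))  # (193.0, 229)
```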
Participants weren’t reluctant to punish other players, even though doing so took away from their own earnings; 77 percent of them deducted at least one point from another group member, and the average cost a punisher incurred was nearly 7 points. Not surprisingly, most people (nearly 70 percent) chose to punish the stingy deviant, who contributed much less than the average. After all, this player was benefiting from others’ donations to the group fund without making a large contribution of their own.
But here’s the amazing part: 51 percent of the participants also chose to punish the overly generous deviant. In other words, just over half of the people in this study were willing to reduce their own chance of winning a $100 gift card simply to punish a particularly cooperative group member. Furthermore, many participants actually wanted this individual kicked out of the group. When asked to rate how much they would like each player to remain in the group on a scale of 1 (not at all) to 9 (very much), participants gave the overly generous player an average rating of less than 3.
But why?
Examining the interaction between the strength of the social norm (as set by the range of donations) and the size of the punishment meted out suggests a basis for this puzzling behavior. Irwin and Horne found that strong social norms encouraged punishment of the cooperative player: the more similar the first four preprogrammed donations were, the higher the punishments tended to be for the overly generous deviant. When there is a clear “right way” to behave, the researchers suggest, people respond more strongly to behaviors that don’t fit the norm.

However, the strength of social norms didn’t affect the punishments of the stingy deviant. Players tended to punish this individual equally under both conditions. The researchers suggest that no matter how high or low conformity is among group members, people always see stinginess as a punishable offense.
So it appears that nonconformity is subject to a bit of a double standard, at least under these specific circumstances. We always dislike free riders, but we will also punish cooperators when their behavior is particularly atypical. For now, we can only speculate about the rationale for this behavior; the presence of strong social norms may foster a feeling that the generous contributor is trying to look rich or powerful, or to make everyone else look bad.
When it comes to self-interest, this behavior is completely counterintuitive; it seems absurd to punish these super-cooperators and want to expel them from the group. After all, their generosity increases the other players’ chances of winning, largely at their own expense. But humans’ adherence to conformity is strong, and when the stakes aren’t high, social norms may win out over self-interest.
The researchers acknowledge that under different circumstances—for example, if rewards are large or the type of punishment varies—the outcome might be different. This study had a very homogeneous subject pool and was tightly managed to control for multiple variables, so its external validity and applicability to real-world problems are limited at this point. However, there’s no doubt that in certain situations, “big givers” are subject to punishment, even when this isn’t in anyone’s best interest.
Social Science Research, 2013. DOI: 10.1016/j.ssresearch.2012.10.004