By Thomas Bartlett
We don't make sense. Our reasons for doing what we do, for choosing what we choose, frequently fail to add up. We are more dishonest and less logical than we imagine. And psychologists and economists seemingly never tire of pointing this out to us.
So can anything be done? Or are we hopeless?
In search of an answer, I attended a special session on choices at the Association for Psychological Science's recent annual convention, which boasted some of the biggest names in social science at the moment. Among them was Dan Ariely, a professor of psychology and behavioral economics at Duke University, author of two best-selling books, and a prolific researcher with a winningly dry wit. You don't want to follow his presentation.
In the past few years, Mr. Ariely has shown that people who believe they're honest are still perfectly willing to behave unethically when it suits them and when they think they can get away with it. For instance, in one study, participants were asked to complete a set of simple math puzzles in a limited amount of time (like finding two numbers in a matrix that together equal 10). They were given a small reward, like 50 cents or $2, for each completed puzzle. The more puzzles they completed, the more money they received.
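The matrix task can be made concrete with a short sketch. The numbers below are hypothetical (Ariely's actual stimuli differed); the point is just the shape of the puzzle: scan a grid of values for the one pair that sums to exactly 10.

```python
# A minimal sketch of the matrix-search puzzle described above.
# The values here are made up for illustration; the task is simply
# to find the single pair of numbers that adds up to 10.
from itertools import combinations

matrix = [1.69, 4.67, 2.91, 6.36, 5.82, 5.33,
          3.05, 4.81, 8.19, 7.12, 6.91, 9.08]

def find_pair(numbers, target=10.0, tol=1e-6):
    """Return the first pair of numbers whose sum is (within tol of) the target."""
    for a, b in combinations(numbers, 2):
        if abs(a + b - target) < tol:
            return (a, b)
    return None

print(find_pair(matrix))  # (4.67, 5.33) — the one pair summing to 10
```

A computer solves this instantly, of course; the experiment's interest lies in how many solutions people *claim* under time pressure, not in the search itself.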
In the experiment, participants who were allowed to report their own results and collect the cash claimed significantly more completed puzzles than those whose results had to be verified by an experimenter. From that, Mr. Ariely concludes that when people saw a chance to cheat, they took it. They didn't cheat a lot, but enough to be noticeable. And other research he's done indicates that this same kind of minor corruption pops up on tax returns and expense reports. People who cheat in the lab probably cheat in the real world, too.
But there may be an easy solution: When Mr. Ariely set up the same type of experiment but first reminded people of the university's honor code or asked them to write down as many of the Ten Commandments as they could remember, the cheating was eliminated. Gone. Nothing about the situation was different; the subjects could cheat without consequences if they so desired. But a simple reminder ahead of time that cheating is not OK was enough to keep people honest.
The Duke professor has tried to convince corporations and policy makers of the power of such reminders. He even suggested that the IRS have taxpayers sign the honesty pledge before completing their forms. So far the federal government hasn't warmed to the idea. The tradition of signing at the end of a form is, for whatever reason, difficult to abandon.
Like Mr. Ariely, Eldar Shafir, a professor of psychology and public affairs at Princeton University, has written often about decision making, though lately much of his research has focused on decisions made by people below the poverty line. The poor make unwise decisions, just like the nonpoor, but the repercussions can be much more severe. A middle-class person who makes a few lousy financial choices may have to cut back on creature comforts, while a poor person may end up on the streets.
One decision that hurts plenty of poor people is opting not to put their money in a bank. When encouraging people to use banks rather than a mattress or a freezer, it's good to reduce the hassle factor. Small obstacles like an intimidating form or a long drive, Mr. Shafir says, can stand in the way of smart decisions.
But being in the right frame of mind might help, too. In one study, Mr. Shafir found that poor women who were asked questions about their loved ones—such as "Which of your family members do you feel closest to?"—were more likely to agree to open a savings account that required a $20 monthly deposit than were women who were asked what they did for fun. As with Mr. Ariely's honesty experiments, nothing more than a cue to remember their obligations to others may have done the trick.
Mr. Shafir emphasizes that, while it's easy to be judgmental about the financial decisions of poor people, everyone is subject to a certain amount of irrational thinking. In a recent paper on so-called choice overload, Sheena Iyengar, a professor of business at Columbia University and author of the best seller The Art of Choosing, illustrates exactly that.
When we have too many choices, studies have demonstrated, we tend to get overwhelmed and do nothing, but Ms. Iyengar has shown that choice overload has another downside. In her study, participants were told to choose from a selection of coin-flip gambles. For instance, one gamble might be that if a flipped coin lands on tails, the participant gets $10, but if it lands on heads, he or she gets nothing. Another option might be receiving five bucks no matter how the coin lands. And so on.
One group was given three such options, while another was given 11.
You might think that the group with 11 choices would be more likely to find the most profitable option. In fact, when they had more choices, participants leaned toward the safest choice, like taking five bucks no matter how the coin landed. The problem is that the least risky choice isn't always the best. (For instance, starting a business may be risky but may be a risk worth taking.) But something about having lots of choices makes us seek out safety.
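The pull toward safety is striking because, in examples like these, the safe and risky options can have the same expected value. A quick expected-value check, assuming a fair coin and the illustrative payoffs from the paragraphs above (not Ms. Iyengar's actual stimuli), makes that concrete:

```python
# Expected value of two gambles on a fair coin flip.
# Payoffs are the illustrative ones from the text, not the study's stimuli.
gambles = {
    "risky": {"heads": 0.0, "tails": 10.0},  # $10 on tails, nothing on heads
    "safe":  {"heads": 5.0, "tails": 5.0},   # $5 no matter how the coin lands
}

expected = {name: 0.5 * g["heads"] + 0.5 * g["tails"]
            for name, g in gambles.items()}
print(expected)  # {'risky': 5.0, 'safe': 5.0} — equal on average, different risk
```

On average the two options pay identically; what the larger menu changed was not the math but the participants' appetite for the risky one.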
The implications of this finding go well beyond flipping coins in a laboratory. Ms. Iyengar argues that real-life decisions, like health care and 401(k) plans, are subject to the same choice-overload phenomenon. It might be sensible for companies and the government to remember that before touting the supposed superiority of a lengthy menu.
The takeaway is not that we can remove our tendency to make dumb or unethical decisions, but that it's important to be aware of our natural deficiencies. Maybe we don't make sense, but perhaps we can outsmart ourselves.
File this under: Science Confirms What We Already Knew. Of course, we don't always reflect enough to know what we know, and it's nice to learn it's not just me who does this. The middle section describes experiments that cue a self- or other-orientation, but there's also a question of immediate vs. long-term benefit. Situations where our effects on others are rather remote (filling out tax forms) really highlight the role of ethics in our lives, i.e., to come between our immediate self-interest and our long-term character and community goals.