In his latest book, The Undoing Project, author Michael Lewis introduces us to the fathers of behavioral economics, Amos Tversky and Daniel Kahneman. Its first chapter describes how Houston Rockets General Manager Daryl Morey used behavioral economics to rebuild the team beginning in 2007. The key, Daryl Morey noticed, was that his recruiters and coaching staff invariably fell prey to specific errors in their decision making when selecting players. For example, recruiters who found a candidate they liked tended to overvalue information reinforcing their decision and ignore information suggesting that the player should not, in fact, be drafted. Eliminate these errors in judgment, replacing “judgment calls” with hard data, and they could pay less for better players. This was the same approach the Oakland A’s used to achieve success during their famous 2002 baseball season and the subject of Michael Lewis’ prior book, Moneyball.
What is behavioral economics, and how does it relate to the work we do as business lawyers? In short, behavioral economics is the science of how people make decisions. By understanding the techniques people use to make their decisions, including those that occasionally cause us to make bad decisions, we can accomplish two things. First, we can help other people make better decisions (or perhaps, instead, make the decisions we want them to make). Second, we can better understand our own decision-making processes and, with a more concrete understanding, improve them. Understanding decision making can improve our performance in many arenas, and it certainly assists with tasks such as negotiating deals, structuring contracts, and building compliance systems.
Behavioral economics builds on the traditional economics concept of normative decision theory, which describes the rules by which a fully rational individual makes choices. Normative decision theory makes two basic assumptions. First, that the person making choices has complete information. In other words, the person knows all the information relevant to making the choice. Second, normative decision theory assumes that the person is rational—that is, capable of making choices that are logical and consistent based on that person’s desires. For example, if a person prefers coffee to tea, and also prefers hot chocolate to coffee, then a rational person will ask for a hot chocolate when offered a choice between that and tea (this is called the principle of transitivity). When you know the logical rules by which rational persons make decisions, the argument goes, you can build mathematical and logical models of their behavior, use those models to predict results, and develop responsive strategies. These rules of logical choice, called utility theory, were described by mathematician John von Neumann and economist Oskar Morgenstern in their 1944 work, Theory of Games and Economic Behavior, and form the basis for modern game theory.
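Written out formally (the preference notation below is standard in decision theory, not taken from this article’s sources), transitivity says that preferring one option to a second, and the second to a third, commits a rational person to preferring the first to the third:

```latex
% Transitivity of rational preference; $\succ$ reads "is preferred to."
% With A = hot chocolate, B = coffee, C = tea:
A \succ B \quad \text{and} \quad B \succ C
\;\Longrightarrow\; A \succ C
```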
The problem with utility theory is its limited application in real-world situations. People don’t have complete information when they make decisions, and as Tversky and Kahneman proved, people do not follow the rules of rational decision making when making choices. Instead, decision making employs a variety of cognitive shortcuts, called heuristics. Behavioral scientists, through empirical studies, have identified dozens of these heuristics. They bear names such as “planning fallacy,” “anchoring,” “confirmation bias,” and “loss aversion,” but they essentially describe the rules by which human minds tend to make decisions in place of the strict logical constructs that utility theory describes. In short, behavioral economics provides a useful tool for predicting and understanding decisions where standard economics tends to fail. For example, anchoring refers to a tendency to determine subjective values based on recent exposure to something similar, although unrelated. When asked to guess the percentage of African countries in the UN, people consistently pick a higher number when exposed to the number 65 than when exposed to the number 10 just prior to guessing. The planning fallacy refers to the consistent tendency to underestimate the length of time a task will take, even when a person has extensive experience performing that task. Kahneman and Tversky’s work is significant—Kahneman was awarded the Nobel Prize in Economics in 2002 for his work in the area.
Understanding these rules can play a significant role in negotiations. For example, an attorney who understands how anchoring works can employ the concept to set expectations, both for her client and for the opposing party, that are reasonable and conducive to reaching a negotiated solution. In addition, the attorney can be more aware of situations where anchoring might be affecting her own decision making, resulting in a potentially poor negotiation outcome. This article explores some of the more significant heuristics and how they affect negotiations.
Planning Fallacy
In Thinking, Fast and Slow, Daniel Kahneman describes the process of planning a book for a psychology course. When he polled the group of authors about how long they thought the project would take, they estimated about two years. Kahneman then asked the most experienced member of the group how long similar projects had taken in the past. After a little thought, the expert replied, “I cannot think of any group that finished in less than seven years,” and he added that about 40 percent of the projects had never reached completion at all. Still, even though none of the authors were prepared to make a seven-year investment in a project with only a 60-percent chance of success, they went ahead with designing the book. They finished it eight years later, and it was never used.
Closely related to the optimism heuristic, the “planning fallacy” refers to the tendency for people to consistently underestimate both the time and cost of completing projects. Although the most obvious examples come from large public works projects, any lawyer can think of times when a lawsuit, negotiation, or business deal took longer than expected and cost more than estimated. Empirical studies show that the planning fallacy reflects an underlying psychological tendency to ignore historical evidence when estimating the time and expense of a project. In one study, students were asked to estimate the length of time needed to complete and submit their honors theses. The average estimated time was 34 days. The average actual time was 55 days. Follow-up studies showed that formalized planning and thinking about the results of prior projects had little effect on the planning error. Studies show not only that the planning fallacy is pervasive across different activities, but also that even experienced professionals fall prey to planning errors on a consistent basis.
In a negotiation, the planning fallacy can play a significant role in how each side evaluates its positions. In litigation, both sides will likely underestimate not only the amount of time needed to reach a conclusion, but also the cost of the litigation process. This makes them less likely to settle, because they hold mistaken beliefs about the costs of reaching a non-negotiated resolution. In short, parties take on unanticipated risks based on unrealistic beliefs about the likely results, and they fail to settle when they should because of the planning fallacy.
In a deal situation, the planning fallacy has a different, but equally unfortunate, effect. Parties will underestimate the time needed to work through the negotiations or even the amount of time needed to negotiate and draft the details of the relevant documents. The unpredicted delay creates frustration as tasks a client or her counsel thought would take a couple of days or maybe a week to complete remain unfinished weeks later. In some cases, this frustration, resulting from the original unrealistic expectations, can cause a deal to blow up.
Daniel Kahneman suggests that the best way of avoiding the planning fallacy is to use a technique called “reference class forecasting.” Essentially, reference class forecasting entails a four-step process. First, identify a set of similar activities. When trying to predict how much a lawsuit might cost in legal fees, for example, identify a group of similar lawsuits. This group of similar, prior lawsuits is your reference class. Second, collect data on the reference class. How long did those lawsuits last from beginning to end? How much were the total legal fees? This data provides the baseline for evaluating your own situation. So, if your firm has handled 10 similar types of lawsuits in the past, and the average legal fees incurred were $100,000, then $100,000 is your baseline. Third, evaluate the effect of concrete differences between your particular case and the reference class cases. For example, if your firm’s hourly rates have increased year over year, you will want to adjust the baseline estimate upward to reflect the increases in hourly rates. If some of the prior cases required more witnesses than your case will, you might adjust your estimate downward.
Finally, the fourth, and possibly hardest, step is to actually use the estimate and ignore your inevitable desire to use your original “prediction” about the cost in place of the hard data. By using a data-driven, objective approach to forecasting, you can reduce the planning fallacy effect and make better decisions in negotiations.
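As a rough illustration, the four steps reduce to simple arithmetic. The sketch below works through them in Python with invented figures; the case data, rate increase, and witness adjustment are hypothetical placeholders, not data from any real matter:

```python
# Reference class forecasting, sketched with hypothetical numbers.

# Steps 1 and 2: identify the reference class and collect its data --
# here, total legal fees from ten similar past lawsuits.
past_fees = [82_000, 95_000, 110_000, 105_000, 98_000,
             120_000, 90_000, 101_000, 99_000, 100_000]
baseline = sum(past_fees) / len(past_fees)  # the $100,000 baseline

# Step 3: adjust for concrete, known differences from the reference class.
rate_increase = 1.08    # hourly rates are up 8% since the reference cases
fewer_witnesses = 0.95  # this case needs fewer depositions, so 5% less work
estimate = baseline * rate_increase * fewer_witnesses

# Step 4: use the data-driven estimate, not your gut "prediction."
print(f"Baseline:          ${baseline:,.0f}")
print(f"Adjusted estimate: ${estimate:,.0f}")
```

Under these assumed numbers, the data-driven figure comes to about $102,600; the discipline of step four lies in quoting that number rather than the more optimistic one you had in mind before consulting the reference class.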
Anchoring
Amos Tversky and Daniel Kahneman ran an experiment in which college students spun a number wheel rigged to stop only on the numbers 10 and 65. After each student spun the wheel, he or she had to guess the percentage of African nations in the United Nations. Oddly, the number on which the wheel landed had a profound effect on the guess. Students whose spin resulted in a 10 guessed, on average, that 25 percent of African nations were in the UN, whereas students whose spin resulted in a 65 had an average guess of 45 percent. Kahneman and Tversky called this mental heuristic, the tendency for a recently experienced number to affect decision making, “anchoring.” Kahneman describes anchoring as “one of the most reliable and robust results of experimental psychology.”
Most lawyers are familiar with the concept, although they might tend to think about it in basic terms. Lawyers learn to begin negotiations with either a high number or a low number to set expectations about the final result. Anchoring does work in this context, but anchoring effects also operate in subtle ways that are harder to identify and more effective than one might think. First, the number used as an anchor does not have to be related to the number being anchored. In the African nations experiment, the number on the wheel was unrelated to the question of how many African nations are in the UN, but greatly influenced the students’ decision making. In another common experiment, subjects are asked to write down the last few digits of their Social Security number and then guess the number of marbles in a jar. Subjects with higher Social Security numbers invariably guess higher. Anchoring effects are hard to shake and operate even where the subject has independent information on which to make a reasoned decision. In one experiment, real estate agents were told the listing price of a property and then were asked to appraise it. Even when they had complete information about the property, their appraisals remained anchored to the listing price (including when the listing price was clearly implausible).
According to Kahneman, two different mechanisms cause anchoring: one that operates when we consciously think about the decision, and one that operates when we do not. When we are making conscious decisions about values (what Kahneman refers to as System 2 thinking), we tend to find an initial anchor for the value and adjust from that value. We also tend to under-adjust. As a result, the starting point supplied has a very real effect on the final result. In a negotiation, making the first offer—or even opening negotiations with a discussion that includes appropriately scaled numbers—can help set that anchor point and thus affect the final negotiation results.
Anchoring also affects unconscious decision making (what Kahneman refers to as System 1) through something called the “priming” effect. In this context, the anchoring number can create mental associations that inform the final decision making. Although the mechanism is different, the final effect remains similar.
In negotiations, taking advantage of the anchoring effect means acting quickly, perhaps by making an early offer designed to anchor the final results, or perhaps by opening negotiations with a discussion designed to expose the other party to higher or lower numbers generally. Anchoring doesn’t necessarily have to target the final result. You might seek to anchor the inputs to the other party’s decision-making processes, such as its view of your client’s cost of capital, litigation costs, or other factors. Also consider the setting for negotiations. Conducting a meeting in a cheap coffee shop might create mental associations that help you negotiate a lower price, whereas meeting in an expensive restaurant might have the opposite effect. In any negotiation, anchoring efforts should occur early in the process, before the other party has an opportunity to anchor based on its own decision-making processes or other experiences.
Confirmation Bias
In Predictably Irrational, behavioral economist Dan Ariely describes an experiment in which he asked MIT students to taste-test two types of beer. One was a regular beer; the other was the same beer with some balsamic vinegar added, which the experimenters called “MIT Beer.” Predictably, when forewarned that MIT Beer contained vinegar, the students preferred the regular beer. When they were not forewarned about the secret ingredient, however, the students typically preferred MIT Beer. This and similar experiments demonstrate that people’s prior perception of something strongly affects their interpretation of later experiences. This heuristic is commonly referred to as “confirmation bias”: the idea that we tend to interpret new facts and experiences in ways that reinforce our pre-existing beliefs. When we expect a beer to taste odd because we are told in advance that it contains vinegar, we are more likely to dislike the flavor when we actually drink it.
Confirmation bias affects how people process new information and leads to serious errors in judgment. For example, a lawyer who believes strongly in his client’s case might discount the effect of negative testimony at a deposition, focusing on the parts of the deposition that support his client’s position. As a result, he will fail to properly evaluate his case, concluding that the deposition testimony helped his case more than it hurt. A CEO trying to close an M&A transaction that she championed to her board of directors, with an eye on the big payoff from a successful merger, might ignore information about the buyer’s poor history of successfully integrating prior acquisition targets. Confirmation bias can affect how we and our clients process all kinds of information, including factual information, legal research results, case evaluation, and even the desirability of doing a deal in the first place.
This tendency to focus on facts that confirm pre-existing beliefs, while discounting facts that counter those beliefs, contributes to a related heuristic called “optimism bias.” Optimism bias is the demonstrated tendency for people to overestimate their chances of success in a particular endeavor. For example, one study found that 68 percent of entrepreneurs believe their company is more likely to succeed than other similar companies; by definition, only 50 percent can be. Another study found that optimistic CEOs are 65 percent more likely to complete mergers and more likely to overpay, which can lead to post-merger failure.
In negotiations, confirmation bias and optimism bias can lead to suboptimal decision making on both sides and interfere with parties reaching a deal. A seller might enter a negotiation stressing facts it believes strongly support its position and will sway the buyer’s pricing, failing to realize that the buyer discounts those same facts. Particularly in a dispute, optimism bias can give each side an unrealistic view of its chances of success, leading to negotiation deadlocks. These heuristics are most likely to affect decision making where parties enter negotiations already holding well-defined beliefs.
Understanding these cognitive effects helps us control the extent to which we misjudge the information we receive during a negotiation. Confirmation bias can be countered by approaching issues from an objective viewpoint from the start of a matter. We can also counter bias by seeking input and feedback from objective third parties during the process and listening to what they have to say. We can make a conscious effort to identify negative information and carefully analyze the effect it should have on our decisions. Careful, systematic approaches to evaluating information provide a way to counter confirmation bias in our own decision making.
Other parties in negotiations are also subject to confirmation bias. Where the other side starts from a different set of beliefs or a different viewpoint, it will evaluate the information shared during the negotiation in a different light. An effort to consider the other side’s point of view, and to evaluate how it is receiving and interpreting information, will provide greater insight into its decision-making processes and make negotiations more productive.
Finally, like anchoring, these biases are best countered by opening negotiations and discussions early in the dispute-resolution process, before parties have an opportunity to develop strongly held beliefs about facts and positions.
Loss Aversion
Traditional economic theory posits that a dollar gained is equivalent to a dollar lost. Thus, people who have an item (a mug, for example) should be willing to sell that mug, on average, for the same price that a similar group of people would be willing to pay for one; however, that is not the case. In a famous experiment, people who had a mug wanted, on average, twice as much for the mug as people without a mug were willing to pay. This effect, called the “endowment effect,” has been demonstrated experimentally over and over again. People who have an item inherently give it a higher value than those thinking about acquiring it. The endowment effect goes hand in hand with a concept Kahneman and Tversky called “loss aversion”: the simple idea that people fear losses more than they desire gains, so the value of a thing, whether money or some other object, depends on the person’s point of view. A person who views themselves as losing something places more value on it than someone who views the transaction as receiving it. As Kahneman put it, “losses loom larger than gains.”
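Prospect theory, the Kahneman and Tversky paper cited in the resources below, makes this asymmetry concrete with a value function defined over gains and losses rather than total wealth. A common form, shown here with the illustrative parameter estimates the authors later published (a sketch for intuition, not a formula appearing in this article), is:

```latex
v(x) =
\begin{cases}
  x^{\alpha}               & \text{if } x \ge 0 \ \text{(a gain)} \\[4pt]
  -\lambda \, (-x)^{\beta} & \text{if } x < 0 \ \text{(a loss)}
\end{cases}
\qquad \alpha \approx \beta \approx 0.88, \quad \lambda \approx 2.25
```

The loss-aversion coefficient of roughly 2.25 means a loss is weighted more than twice as heavily as a gain of the same size, which is the arithmetic behind “losses loom larger than gains.”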
Understanding how loss aversion might affect each party’s perception of value in a negotiation helps you understand the incentives behind each party’s positions. Understanding these incentives can help you set up the negotiation to benefit your client, or simply to better align the parties’ economic incentives and increase the chance of reaching a successful result. For example, imagine that you are negotiating whether your client’s business partner is due a performance bonus under a contract. If your client’s starting position is that the bonus was not earned, your client will view paying the bonus as a loss. On the other hand, where the bonus has already been paid or escrowed, and your client later discovered that its business partner was arguably not entitled to it, your client might view any return of the bonus money as a gain. In the first situation, your client is less likely to want to pay any part of the bonus, potentially making settlement more difficult. When your client still holds the funds, however, the other party might view any funds received as a gain and be more willing to settle. Here, loss aversion suggests making sure your client holds the funds pending negotiations; the combination of your client being less willing to pay and the other party being more willing to settle should result in a better settlement for your client.
Loss aversion also explains a benefit of escrows. Both sides should view funds placed in escrow pending resolution of a dispute as held by a third party; each side can then only gain from a resolution, not lose. This increases the willingness to settle and helps bring the parties’ negotiating positions together. Loss aversion suggests using escrow arrangements even when both parties are financially capable of covering any payments that might need to be made.
The lawyer who can frame the negotiation’s effect on the other party as a gain should do better than when the effect is viewed as a loss. This might be done through prenegotiation litigation, such as by attaching a bank account or real estate. When this occurs, the other party gains from the settlement, rather than loses, and because the dollars gained are valued less than dollars lost, the other party should be willing to “pay” more for a settlement. Alternatively, a lawyer might frame the details of the deal so as to make an element appear as a gain instead of a loss. For example, in a shipping contract, a lawyer might start with a higher base price that includes insurance and offer a discount if the customer maintains its own insurance, rather than start with a lower base price and then try to get the customer to pay extra for the shipper to cover insurance. Loss aversion suggests that the customer will view the discount as a gain, and the payment of extra fees as a loss, and be more willing to forgo the discount than pay the extra fees.
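To put hypothetical numbers on the shipping example, suppose the insured price is $120 and the self-insured price is $105 under either framing, so the $15 difference is economically identical and only the label changes. A short sketch using the prospect-theory-style value function above (the prices and parameters are invented for illustration):

```python
# Hypothetical sketch: why a forgone discount "hurts" less than an added fee.
# Parameters are Tversky and Kahneman's illustrative estimates, not firm data.
ALPHA, LAMBDA = 0.88, 2.25

def value(x: float) -> float:
    """Prospect-theory-style value of a gain (x >= 0) or a loss (x < 0)."""
    return x**ALPHA if x >= 0 else -LAMBDA * (-x) ** ALPHA

# Either way, choosing the shipper's insurance costs the customer $15.
cost_of_forgone_discount = value(15)   # framing A: give up a $15 "discount"
cost_of_extra_fee = -value(-15)        # framing B: pay a $15 "fee"

print(f"Forgoing the discount feels like losing: {cost_of_forgone_discount:.1f}")
print(f"Paying the extra fee feels like losing:  {cost_of_extra_fee:.1f}")
# The fee feels more than twice as bad as the forgone discount, so the
# customer offered a discount is more likely to accept the insurance.
```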
Conclusion
This article describes just four of the many biases and heuristics identified by behavioral economists through empirical research. The planning fallacy, anchoring, confirmation bias, and loss aversion strike familiar chords among experienced attorneys. But behavioral economics provides more than vague concepts about behavior. Its research provides deeper insights into exactly how and when these mechanisms operate, and it also provides tools for mathematically modeling behavior in negotiations and litigation. Lawyers who become familiar with this new social science will undoubtedly gain an edge over their competition.
Additional Resources
Behavioral economics generally:
- Lewis, Michael. The Undoing Project (W. W. Norton & Company, 2016) (as much a story about Daniel Kahneman and Amos Tversky as a book about behavioral economics, but an easy introduction to the subject).
- Ariely, Dan. Predictably Irrational (HarperCollins, 2008).
- Thaler, Richard, and Cass Sunstein. Nudge (Yale University Press, 2008).
- Kahneman, Daniel. Thinking, Fast and Slow (Farrar, Straus and Giroux, 2011).
Planning fallacy:
- Kahneman, Daniel, and Amos Tversky. 1979. “Intuitive Prediction: Biases and Corrective Procedures.” TIMS Studies in Management Science 12: 313–27.
- Buehler, Roger, Dale Griffin, and Michael Ross. 1994. “Exploring the Planning Fallacy: Why People Underestimate Their Task Completion Times.” Journal of Personality and Social Psychology 67: 366–81.
Anchoring:
- Mussweiler, Thomas, Fritz Strack, and Tim Pfeiffer. 2000. “Overcoming the Inevitable Anchoring Effect: Considering the Opposite Compensates for Selective Accessibility.” Personality and Social Psychology Bulletin 26: 1142–50.
- Northcraft, Gregory B., and Margaret A. Neale. 1987. “Experts, Amateurs, and Real Estate: An Anchoring-and-Adjustment Perspective on Property Pricing Decisions.” Organizational Behavior and Human Decision Processes 39: 84–97.
- Tversky, Amos, and Daniel Kahneman. 1974. “Judgment under Uncertainty: Heuristics and Biases.” Science 185: 1124–30.
Confirmation bias:
- Baker, Malcolm, Richard S. Ruback, and Jeffrey Wurgler. 2007. “Behavioral Corporate Finance: A Survey.” In Handbook of Corporate Finance: Empirical Corporate Finance, vol. 1, chap. 4.
- Malmendier, Ulrike, and Geoffrey Tate. 2008. “Who Makes Acquisitions? CEO Overconfidence and the Market’s Reaction.” Journal of Financial Economics 89(1): 20–43.
Loss aversion:
- Kahneman, Daniel, and Amos Tversky. 1979. “Prospect Theory: An Analysis of Decision under Risk.” Econometrica 47: 263–91.
- Kahneman, Daniel, Jack L. Knetsch, and Richard H. Thaler. 1990. “Experimental Tests of the Endowment Effect and the Coase Theorem.” Journal of Political Economy 98(6): 1325–48.
- Hossain, Tanjim, and John A. List. 2012. “The Behavioralist Visits the Factory: Increasing Productivity Using Simple Framing Manipulations.” Management Science 58(12): 2151–67.