Hello, welcome back, everyone, and thank you for joining me again. All right, so we start this lecture with a little experiment that we're going to call: is the glass half empty, or half full? Okay, so here I have an empty glass, and I'm going to pour some water into it. So now, is the glass half empty or half full? What do you think? Well, I don't know what you said, of course, but about 69% of people will typically say that it is now half empty. Okay, so now let me start with a full glass. So now I have a glass full of water, and I'm going to pour some water out, trying to leave the same amount as before. What would you say now? Is the glass half empty or half full? I know this is silly, but about 88% of people will now say that it is half full. So there is no difference in the size of the glasses or in the amount of water, as closely as I could manage, but there is a small twist in the context, and that changes everything. So what is really going on here? In this lecture, we're talking about framing: the idea that the form in which a decision problem is described, or framed, has an impact on our judgments and decisions, just like with the water glasses. The framing effect refers to the fact that our preferences over options or outcomes seemingly change as a function of the way a decision is framed, or described, to us. So for example, suppose we told one group of people that the burgers they were about to eat were 75% lean ground beef, and we told another group that their burgers were 25% fat. Then we asked both groups how they liked their burgers. The group that heard about the fat would likely estimate the meat to be lower in quality and taste than the other group. And in fact, experiments show that after both groups tasted the burgers, made, of course, from the same meat,
those who were told about the fat ended up liking the burgers a lot less than the people who were told about the lean meat. So clearly, how we frame the question appears to matter. The classic example of framing was introduced by Tversky and Kahneman, and it became known as the Asian disease problem. So here's the setup: imagine that the US is preparing for the outbreak of an unusual Asian disease which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed, and suppose that the exact scientific estimates of the consequences of these programs are as follows. If program A is adopted, 200 people will be saved. If program B is adopted, there is a one-third probability that 600 people will be saved, and a two-thirds probability that nobody will be saved. So which one is your pick? Given those choices, a substantial majority of people choose program A; in other words, they prefer the sure outcome over the gamble. Tversky and Kahneman then framed the outcomes a little differently in a second version. Suppose now you have programs C and D. If program C is adopted, 400 people will die. And if program D is adopted, there is a one-third probability that nobody will die and a two-thirds probability that 600 people will die. Now which one would you pick? If you look closely and compare the two versions, the consequences of the programs are all identical: programs A and C are identical, and so are B and D. However, in the second version, a large majority of people now choose the gamble. So how the questions are framed appears to make a difference in how choices between gambles and sure outcomes are resolved. The first frame emphasizes the number of lives saved, and when a choice is framed positively, as a potential gain, it is as if the glass is half full: it is perceived as an improvement over the empty glass.
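If you want to convince yourself that the four programs really are the same, a quick back-of-the-envelope check helps. Here is a small Python sketch (not part of the original lecture) that computes the expected number of people saved under each program:

```python
# Check that the four programs in the Asian disease problem
# describe identical expected outcomes for the 600 people at risk.
TOTAL = 600

# Program A: 200 people saved for sure.
expected_saved_A = 200
# Program B: 1/3 chance all 600 saved, 2/3 chance nobody saved.
expected_saved_B = (1 / 3) * 600 + (2 / 3) * 0

# Program C: 400 people die for sure.
expected_dead_C = 400
# Program D: 1/3 chance nobody dies, 2/3 chance all 600 die.
expected_dead_D = (1 / 3) * 0 + (2 / 3) * 600

# A and C are the same outcome stated two ways, as are B and D:
# 200 saved is exactly 400 dead out of 600, in expectation.
print(expected_saved_A, TOTAL - expected_dead_C)
print(expected_saved_B, TOTAL - expected_dead_D)
```

Every program leaves an expected 200 people alive out of 600; only the wording, saved versus die, differs between the two versions.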
On the other hand, the second version emphasizes the number of lives lost, which makes us want to take the extra risk to avoid losing. We instinctively respond differently to the two frames, and we don't even notice that all four programs are actually equivalent. All right, so framing can lead to other freaky decisions as well. Here's one more. Imagine you have $2,000 in the bank, and I offer you a choice: you can either do nothing, or you can take a 50/50 chance of either losing $300 or winning $500. Which one would you do, do nothing or take the gamble? Now let's imagine again that you have $2,000 in the bank, and this time I offer you the following choice: you can either do nothing, or you can take a 50/50 chance of ending up with either $1,700 or $2,500. What would you do, which one would you pick? Well, what happens is, most people reject the first gamble but take the second. This is because the first one is framed to emphasize the amounts you will gain or lose, whereas the second one is framed to emphasize the total amount you end up with. The changes feel bigger, and potentially scarier, under the first frame. And, in fact, the two gambles are economically identical, but how we react to them can, of course, be very different because of the framing. And in the financial world, framing is everywhere. For example, if you sink one percent of your money into one stock that goes to zero, you will probably be very upset. But on the other hand, if your portfolio loses 1% of its value, you are more likely to shrug it off, even though the effect on your wallet is identical. Another example: most people are happier with a 4% raise when inflation is at 3% than they would be with a 2% raise when inflation is at zero. The reason is that 4% is bigger, twice the size of 2%.
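The claim that the two $2,000 gambles are economically identical can be verified in a couple of lines. Here is a minimal Python sketch (again, not part of the lecture itself) that translates both framings into final wealth:

```python
# Show that the two gamble framings describe the same bet on final wealth,
# given the $2,000 starting balance from the lecture.
start = 2000

# Frame 1: stated as changes -- lose $300 or win $500, 50/50.
frame1_outcomes = [start - 300, start + 500]

# Frame 2: stated as final totals -- end with $1,700 or $2,500, 50/50.
frame2_outcomes = [1700, 2500]

print(frame1_outcomes == frame2_outcomes)  # True
```

Once both frames are expressed as final balances, the gambles are literally the same list of outcomes; only the description of the change versus the total differs.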
So it feels better, even though in reality, of course, what matters is what is left over after the cost-of-living adjustment, okay? All right, so in this lecture you saw examples of how framing can change our decisions over sure outcomes versus gambles. Choices between gambles and sure outcomes are resolved differently depending on how the questions are framed for us.
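As a closing check of the raise example, the cost-of-living adjustment can be made concrete with a short Python sketch (not from the lecture; the $50,000 salary is a hypothetical figure chosen just for illustration):

```python
# Compare a 4% raise with 3% inflation against a 2% raise with 0% inflation,
# assuming a hypothetical $50,000 salary.
salary = 50_000

def real_value(nominal_raise, inflation):
    """Next year's salary expressed in today's purchasing power."""
    return salary * (1 + nominal_raise) / (1 + inflation)

option_a = real_value(0.04, 0.03)  # 4% raise, 3% inflation
option_b = real_value(0.02, 0.00)  # 2% raise, no inflation

print(round(option_a, 2))  # roughly a 1% gain in purchasing power
print(round(option_b, 2))  # a full 2% gain in purchasing power
```

Even though 4% sounds twice as good as 2%, the 2% raise with zero inflation leaves you with more purchasing power, which is exactly the point of the example.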