When you know the probability distribution of a random variable, you can start to make calculations for that variable. One of the first things you'd like to know is a set of summary statistics that capture the essence of the distribution, much as you do with observational data. In this video, I'll explain how the mean of a probability distribution is calculated, and also show what happens to the mean when you adjust a random variable or combine different random variables.

The mean of a random variable, denoted by the symbol mu, gives the average outcome you can expect over many observations. It is therefore also called the expected value of the random variable, written with the symbol E. The mean of a discrete random variable is the probability-weighted average of all possible values the variable can take: the sum of each possible value times its probability. For a continuous random variable, essentially the same applies; to account for continuity, the summation is replaced by an integral, and the probability is no longer a discrete value p_i but a function of x.

An example. Suppose you have to travel on a daily basis and cross three traffic lights, and waiting at a traffic light adds two minutes to your total travel time. You have kept a record of how often you had to wait for none up to all three of the traffic lights; this is the probability table. The mean waiting time you can expect on any trip is calculated as follows, leading to a waiting time of 2 minutes and 15 seconds. Interestingly, the specific value of 2 minutes and 15 seconds will never occur: you will always wait exactly 0, 2, 4, or 6 minutes.

Now let's look at some properties of the mean of a random variable. First we consider what happens if a random variable X is adjusted by adding a value a and multiplying it by a value b. The mean is then affected as follows. Let's return to our example.
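The mean calculation above can be sketched in a few lines of code. The actual probability table from the video is not reproduced here, so the probabilities below are hypothetical, chosen only so that the mean comes out to the 2.25 minutes (2 minutes and 15 seconds) quoted above:

```python
# Expected value of a discrete random variable: E(X) = sum of value * probability.
# Waiting times in minutes for 0, 1, 2, or 3 red lights, at 2 minutes per light.
waiting_times = [0, 2, 4, 6]

# Hypothetical probabilities (the video's actual table is not shown here);
# they sum to 1 and are picked to reproduce the 2.25-minute mean.
probabilities = [0.275, 0.425, 0.2, 0.1]

mean = sum(x * p for x, p in zip(waiting_times, probabilities))
print(round(mean, 4))  # 2.25 minutes, i.e. 2 minutes and 15 seconds
```

Note that the expected value 2.25 is not itself a possible outcome; it is the long-run average over many trips.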
As it turns out, you have found a shorter route, saving one minute on your trip. But at the same time, traffic has gotten busier, and your waiting time has increased to two and a half minutes per traffic light: an increase of 25%. The time you save with the shortcut corresponds to the value of a in the equation, while the factor 1.25 increase corresponds to b. The new probability distribution for each outcome is given by the following table. The new mean waiting time turns out to be 2.8125 minutes, about 2 minutes and 49 seconds. And after accounting for the time gained with the shortcut, you would expect an average net delay of about 1 minute and 49 seconds in the new situation. These calculations are equivalent to applying the equation for changing the mean with a = -1 and b = 1.25.

Let's now consider what happens if two random variables are added or subtracted. It turns out that the mean of random variables that are added or subtracted is simply the sum or difference of their individual means, and it doesn't even matter whether the variables are independent. For example, suppose you would like to calculate the mean waiting time for a week. Then you can simply add up the mean waiting times for the individual days that you travel.

Let me summarize what I hope you have taken from this video. The mean, or expected value, of a discrete random variable is the sum over all values the variable can take, each weighted by its probability. If a random variable is changed by adding or multiplying by a constant, the mean changes accordingly. And the mean of several random variables added together is the sum of their individual means, even when the variables are not statistically independent.
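The two properties summarized above can be sketched numerically. This is a minimal illustration: the 2.25-minute mean is the one from the example, while the assumption of five travel days per week is made up for the sake of the weekly calculation:

```python
mu = 2.25  # mean waiting time per trip, in minutes (from the example)

# Property 1: a linear change of the variable changes the mean the same way:
# E(a + b*X) = a + b * E(X). Here a = -1 (the one-minute shortcut) and
# b = 1.25 (waiting times increased by 25%).
a, b = -1, 1.25
net_delay = a + b * mu
print(net_delay)  # 1.8125 minutes average net delay per trip

# Property 2: means add, even for dependent variables: E(X + Y) = E(X) + E(Y).
# Assuming five travel days per week, the weekly mean waiting time is:
weekly_mean = 5 * mu
print(weekly_mean)  # 11.25 minutes per week
```

No independence assumption is needed for the weekly sum: the daily waiting times may well be correlated (busy weeks stay busy), and the means still simply add.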