# Now, when working with probability distributions, we have the probability function, and we have the probability curve.

There's also something known as the cumulative distribution function. The cumulative distribution function, or CDF, of a random variable is denoted by capital F. It represents the probability that the random variable is less than or equal to a particular value: capital F(x) is equal to the probability that big X is less than or equal to little x. So, if you have a continuous random variable, it's the area under the probability curve to the left of x, okay? If you have a discrete distribution, then you just add up the probability values for all outcomes less than or equal to a particular value, and you usually get a stair-step kind of function.

The cumulative distribution function satisfies the following properties. It's nondecreasing: if x2 is bigger than x1, then the area to the left of x2 is at least as big as the area to the left of x1. The CDF at minus infinity is zero; the area to the left of minus infinity is zero. And the CDF at plus infinity is equal to one, so the area to the left of plus infinity has to be equal to one. This just says the total area under the probability curve is equal to one. And because the total area under the probability curve is equal to one, the probability that the random variable is greater than x is one minus the probability that it's less than or equal to x. If we want to know the probability that the random variable is between x1 and x2, it's the CDF evaluated at x2 minus the CDF evaluated at x1.

This last result is one of the main reasons why we look at the cumulative distribution function. If we want to know the probability over an interval, we can just calculate the difference between the CDF values. So, if we have some function that can compute the CDF, then we can compute probabilities. And if we have a continuous random variable, it turns out that the probability curve, the density, is equal to the derivative of the CDF. So, there's a result from calculus.
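The properties above can be sketched in a few lines of Python. I'm using the standard normal as a stand-in continuous distribution purely for illustration; `statistics.NormalDist` is in the standard library and exposes the CDF directly:

```python
from statistics import NormalDist

# Standard normal as an illustrative continuous distribution
# (my choice for the sketch, not one from the lecture).
Z = NormalDist(mu=0.0, sigma=1.0)
F = Z.cdf  # F(x) = P(X <= x)

# Nondecreasing: x2 > x1 implies F(x2) >= F(x1).
assert F(1.0) >= F(-1.0)

# Tail behavior: F approaches 0 at minus infinity and 1 at plus infinity.
assert F(-100.0) < 1e-12 and F(100.0) > 1.0 - 1e-12

# Complement rule: P(X > x) = 1 - F(x).
p_upper = 1.0 - F(1.0)

# Interval rule: P(x1 < X <= x2) = F(x2) - F(x1).
p_interval = F(1.0) - F(-1.0)
print(round(p_interval, 4))  # ~0.6827 for one standard deviation either side
```

The interval rule is the workhorse here: any probability question about a range reduces to two CDF evaluations.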
And this is just the fundamental theorem of calculus. So, let's look at an example. Suppose we have a uniform distribution over zero... actually, before we go to the uniform distribution, let's just look at the discrete distribution for Microsoft stock that we had before.

So, if we look at this distribution here, this is the probability mass function. And if we wanted to know the cumulative distribution, what is the probability that the return is less than or equal to a given value? Well, it's the sum of the probabilities of all the outcomes at or below that value. So, when we calculate the cumulative distribution function, we see that the probability that the return is less than or equal to -40% is zero. The probability that the return is less than or equal to -30% is just the probability that it's equal to -30%; this is the probability that you're in a depression. And then the probability that the return is less than or equal to, say, -10%? Well, again, that's the sum of the probabilities of the outcomes at or below -10%. We just keep adding up the probabilities in this way, and we get a stair step. So, that's a discrete distribution.

Now suppose we have a continuous distribution, say the uniform distribution on [0, 1]. In that case, we have a very simple distribution: the probability curve is a square of height one over the unit interval. And we want to know, what is the probability that this random variable X is less than or equal to little x? Well, that's the area under the curve to the left of x. So, think of some value x. The area under the probability curve to the left of x is just the shaded area here, which I'll shade in, and it represents the probability that X is less than or equal to x. We can do this integration very easily because the probability curve is just equal to one: the probability is the integral of one from zero to x, which is equal to x. So, for the uniform distribution on [0, 1], the CDF is just a straight line: F(x) = x. This is a case where we can analytically derive the CDF by actually evaluating the integral.
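The stair-step construction for the discrete case can be sketched like this. The return values and probabilities below are hypothetical stand-ins, since the exact numbers from the lecture's Microsoft table aren't fully recoverable here:

```python
# Hypothetical pmf for a stock return: outcome -> probability.
# Illustrative numbers only, not the lecture's exact table.
pmf = {-0.30: 0.05, 0.00: 0.20, 0.10: 0.50, 0.20: 0.20, 0.50: 0.05}

def cdf(x):
    """Stair-step CDF: sum the pmf over all outcomes at or below x."""
    return sum(p for r, p in pmf.items() if r <= x)

print(cdf(-0.40))            # 0.0  -- no outcome at or below -40%
print(cdf(-0.30))            # 0.05 -- just the depression outcome
print(round(cdf(0.50), 10))  # 1.0  -- all of the probability mass
```

Between the outcomes the function is flat, and at each outcome it jumps up by that outcome's probability, which is exactly the stair-step shape described above.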
And if we want to find the probability, say, that X is between zero and a half, then what is this? It's just the area under the probability curve between zero and a half: we evaluate the CDF at a half minus the CDF at zero. The CDF at a half, using this result, is a half, and the CDF at zero is zero, so the area between zero and a half is equal to a half, okay? Then, similarly, by the result from calculus, because the CDF is equal to x, if we take the derivative of x, we get one, and that's exactly the probability curve. So, for a continuous random variable, if we take the derivative of the CDF, we get back the probability curve.

All right, I want to make a remark here. In this example, where we had the probability curve on the unit interval, we can think of the probability, say, between 0.5 and 0.6 as the area under the curve over that interval, okay? So, what happens to this probability as the interval gets smaller and smaller and smaller? The idea is to ask: what's the probability at a single point? If we squish this rectangle down until we just have a single point, what's the area associated with a single point? It's zero. So, with a continuous random variable, the probability that the random variable is exactly equal to a particular point is zero.

Now, that's a little counterintuitive, right? Because when you see that the probability of X being equal to a particular point is zero, it's like saying that it can never happen, and that's not true. It's sort of like this: with a continuous random variable, there's an infinite number of values between zero and one. Think about the idea of guessing exactly what value X is going to take; you can have as many decimal places as you want, to whatever accuracy you want. So, in some respects, the probability of a single point for a continuous random variable just doesn't make sense. That's why, for continuous random variables, we define probabilities over intervals and not at distinct points. And because of this result, for a continuous random variable, the probability that X is less than or equal to x is the same as the probability that X is strictly less than x, because the probability at a single point is equal to zero.
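The whole uniform-[0, 1] discussion fits in a few lines: the CDF F(x) = x comes straight from the lecture, the numerical derivative recovers the flat density, and shrinking the interval shows why a single point carries zero probability:

```python
def F(x):
    """CDF of the uniform distribution on [0, 1]: F(x) = x on the interval."""
    return min(max(x, 0.0), 1.0)

# Interval rule: P(0 < X <= 1/2) = F(1/2) - F(0) = 1/2.
print(F(0.5) - F(0.0))  # 0.5

# Fundamental theorem of calculus: F'(x) recovers the flat density f(x) = 1.
eps = 1e-6
print(round((F(0.3 + eps) - F(0.3 - eps)) / (2 * eps), 6))  # 1.0

# Squeezing an interval down toward a single point: the probability
# F(x + eps) - F(x - eps) shrinks toward zero.
for eps in (0.05, 0.005, 0.0005):
    print(eps, F(0.55 + eps) - F(0.55 - eps))
```

The loop makes the remark above concrete: each time the interval is cut by a factor of ten, so is the probability, and in the limit a single point has probability zero even though it is a perfectly possible value.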