0:00

So, as we said last time about the running time of the brute-force algorithm for computing distances, we got a very complicated formula. But one of the things we could say, without going through all of those complications and details, is that the running time of that algorithm is at least 2^(n-1), if you remember. We said that because the algorithm was going to try every possible subset of a set of size n-1, inspect each one of them, and do a lot of work on each. So it is definitely going to do more than 2^(n-1) operations. In other words, we can put a lower bound on the running time, and that lower bound is 2^(n-1).

Now why does that matter? Because we have already shown, when we talked about growth of functions, that 2^n grows very, very fast: even when n was 256, it would take on the order of 10^59 years on a computer that can handle 10 billion operations a second. And 2^n and 2^(n-1) are basically the same function; the latter is just 2^n divided by 2.
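The lecture's figure can be checked with a quick back-of-the-envelope computation, a sketch in Python using the numbers just quoted (n = 256, a machine doing 10 billion operations per second):

```python
# Back-of-the-envelope check: with n = 256, the brute-force algorithm
# inspects every subset of a set of size n - 1, i.e. 2**255 subsets,
# so its running time is at least 2**255 operations.
operations = 2 ** 255

ops_per_second = 10 * 10 ** 9      # 10 billion operations per second
seconds_per_year = 3600 * 24 * 365

years = operations / ops_per_second / seconds_per_year
print(f"{years:.2e}")              # on the order of 10**59 years
```

This confirms the claim: even ignoring all the work done per subset, the machine would need roughly 10^59 years.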

So this technique of putting lower bounds and upper bounds on an algorithm's running time is very helpful, because, as I said from the beginning, we are not trying to get the exact running time of the algorithm. We are trying to get an approximation that gives us a good idea of how long the algorithm would take once we implement it.

So in the case of the brute-force algorithm, bounding the running time from below told me that the algorithm is going to take at least an exponential amount of time, on the order of 2^n. And that is enough for me to know that this algorithm is not practical; it is not worth implementing and giving to the user.

Â 1:53

In the same sense, we can usually also put an upper bound on the running time, so that we get an idea of how much time the algorithm is going to take in the worst case. How fast does this function grow? When we have a function like 5n + 3n^2 - 17n^3 - 19n^17, that is a function we would have to start plugging values of n into in order to evaluate it. But if we use this kind of lower-bound and upper-bound technique, we can get an easier function that we understand better. It is much easier to look at a function like 2^n and decide that it grows very fast than to do the same for a very complicated function full of summations, combinations, "n choose k" terms, factorials, and so on.
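The point about complicated functions can be illustrated numerically. Taking the lecture's example polynomial at face value, a short sketch shows that its growth is dominated by the highest-order term, which is why a simple bound like n^17 captures its behavior:

```python
# The growth of a complicated polynomial is dominated by its highest-order
# term: the ratio |f(n)| / n^17 settles down to the constant 19.
def f(n):
    return 5 * n + 3 * n ** 2 - 17 * n ** 3 - 19 * n ** 17

ratios = [abs(f(n)) / n ** 17 for n in (10, 100, 1000)]
print(ratios)   # each value is close to 19, and closer as n grows
```

So instead of evaluating the whole expression, we can reason about the single dominant term.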

For that, there is a set of mathematical tools that we use, which we refer to collectively as asymptotic notations. Asymptotic notations allow us to put upper and lower bounds on functions, as well as both bounds at the same time, and they give us an idea of how fast a function grows: how bad can it be, and how good can it be as well. There are three main asymptotic notations that we will use. The first one corresponds to putting an upper bound on the running time of an algorithm.

Â 3:26

So suppose I have an algorithm, I have analyzed its running time, and I get a function of the input size. We will assume that the parameter corresponding to the input size is n, and that after doing all my work, I got that the running time is f of n.

Â 4:48

Faster than f of n after some point, okay? So in asymptotic notation, when I have a function f of n, I am interested in finding a function g of n such that, except for a few values of n in the beginning, that is, after a certain point, f of n grows at most as fast as some constant times g of n.

Remember, in asymptotic notation, f of n is the function someone obtained by analyzing the running time of a certain algorithm. I am interested in quickly getting some sort of upper bound on that function, so that I can understand how bad it can be. So we are interested in another function, g of n, that bounds f of n from above: when I draw them, g of n is going to be on top of f of n, but with the caveat that it may first be multiplied by a constant c. That is, there is a constant c such that c times g of n is higher than f of n.

The other issue, as I mentioned, is that in the beginning I can tolerate a few values of n, because remember, we focus on the growth of the function as n goes to infinity; I am not much concerned with what happens when n is very small. So when we talk about putting an upper bound, we are interested in this function g of n together with two constants: one constant that we will call n0, and another constant that we will call c. We want a function g of n such that, ignoring the values smaller than n0, for every value of n greater than or equal to n0, f of n is at most c times g of n. In other words, we want a function

Â 7:07

g of n such that there exist two constants, c and n0, where f of n is at most c times g of n for every n greater than or equal to n0. Again, if you look at this picture, it shows clearly that if you ignore all the values of n smaller than n0, and just focus on the values of n greater than or equal to n0, you will notice that f of n is always smaller than c times g of n. And this is exactly what the definition says: we have a function g of n and two constants c and n0, where c has to be strictly greater than 0 and n0 is greater than or equal to 0, such that f of n is smaller than or equal to c times g of n for every n greater than or equal to n0. The upside-down A in the definition is read, in English, as "for all" or "for every" value of n greater than or equal to n0.

If we have this scenario, we have a term for it: we say that f of n is big O of g of n. This is the first asymptotic notation I am introducing, the big-O notation. f of n is O(g(n)) if I can find two constants c and n0 such that for every n greater than or equal to n0, f of n is at most c times g of n. So this is the first notation we can use, and it lets us put an upper bound on the running time of an algorithm. Again, in this notation, we ignore a finite set of values in the beginning and focus on the running time as the input size grows. So this is the first asymptotic notation that we have learned, and it is about putting an upper bound. This is the definition: if you want to put an upper bound on a function f of n, you find a g of n, a c, and an n0 that satisfy it.
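The definition can be sanity-checked in code. Below is a sketch (the helper name satisfies_big_o and the example polynomial are mine, not from the lecture) that, given a proposed pair of witnesses c and n0, verifies the inequality f(n) <= c * g(n) over a range of n:

```python
# Empirical sanity check for the big-O definition. This is only a sketch,
# not a proof: a real proof must argue f(n) <= c * g(n) for ALL n >= n0,
# not just for a sampled range.
def satisfies_big_o(f, g, c, n0, n_max=10_000):
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# Example: f(n) = 3n^2 + 10n is O(n^2), with witnesses c = 4 and n0 = 10,
# since 3n^2 + 10n <= 4n^2 exactly when 10n <= n^2, i.e. when n >= 10.
f = lambda n: 3 * n ** 2 + 10 * n
g = lambda n: n ** 2

print(satisfies_big_o(f, g, c=4, n0=10))   # True
print(satisfies_big_o(f, g, c=3, n0=10))   # False: c = 3 is too small
```

Note that the choice of witnesses is not unique: any larger c or n0 that makes the inequality hold would do just as well.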

Â 9:40

In the case of the brute-force algorithm that we devised for the distance problem, we actually looked at a lower bound: we said that the algorithm would take at least 2^n amount of time. So what does it mean to put a lower bound? We can look at the same picture, but now think of c times g of n as being a lower bound, and I can think about it as follows. If I have f of n like this here,

Â 10:24

And I can find, a g of n for

Â example c times g of n.

Â And I can focus on the n0.

Â So now I found a, a function g of n such that f of n is going to be greater than or

Â equal to c time g of n.

Â Once n is greater than n 0.

Â In this case here, again it's similar, definition, we need to find the g of n and

Â two constants, such that f of n is greater than or

Â equal to c times g of n for every n greater than or equal to n0.

Â In this case say that f of n equals omega of g of n, okay?

Â So omega is for lower bound, okay?
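The omega definition admits the same kind of empirical sanity check as big O (again a sketch; the helper name satisfies_big_omega is mine). The example ties back to the lecture's brute-force bound, where the constant c = 1/2 works exactly:

```python
# Empirical sanity check for the lower-bound (omega) definition:
# verify f(n) >= c * g(n) for n in [n0, n_max]. A sketch, not a proof.
from fractions import Fraction

def satisfies_big_omega(f, g, c, n0, n_max=1_000):
    return all(f(n) >= c * g(n) for n in range(n0, n_max + 1))

# The brute-force bound: f(n) = 2^(n-1) is omega(2^n), because
# 2^(n-1) = (1/2) * 2^n exactly, so c = 1/2 and n0 = 1 are valid witnesses.
f = lambda n: 2 ** (n - 1)
g = lambda n: 2 ** n

print(satisfies_big_omega(f, g, c=Fraction(1, 2), n0=1))   # True
```

Using Fraction keeps the arithmetic exact, which matters here because 2^n quickly exceeds the range of floating-point numbers.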

The third asymptotic notation is this: I can take this function f of n and bound it from both sides.

Â 11:28

Okay?

Â So we have now c1 times g of n and

Â we have c2 times g of n, and I say f of n is bounded from above and

Â below by g of n times two different constants.

Â In this case, what we have is that f of n is, we say it is theta(g(n)).

Â It is sandwiched between c1 times g of n and c2 times g of n.

Â So we have these three asymptotic notations f of n equals theta of g of n,

Â f of n equals big O of g of n.
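The sandwich definition, too, can be sanity-checked in the same style (the helper name satisfies_big_theta and the example are mine, not from the lecture):

```python
# Empirical sanity check for theta: f is theta(g) when
# c1 * g(n) <= f(n) <= c2 * g(n) for all n >= n0. A sketch, not a proof.
def satisfies_big_theta(f, g, c1, c2, n0, n_max=10_000):
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max + 1))

# Example: f(n) = 3n^2 + 10n is theta(n^2): it is sandwiched between
# 3 * n^2 and 4 * n^2 once n >= 10.
f = lambda n: 3 * n ** 2 + 10 * n
g = lambda n: n ** 2

print(satisfies_big_theta(f, g, c1=3, c2=4, n0=10))   # True
```

Notice that a theta bound is simply a big-O bound and an omega bound with the same g of n, which is why it pins down the growth rate rather than just limiting it from one side.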

Â