First divide it into smaller parts, so

in the case of mergesort it was divided into two parts of size n over 2.

More generally it might be parts of maybe different size.

So we'll pick a factor beta, so it could be n over 3 or n over 4, whatever.

And not only that, it might not add up to n.

So there might be multiple parts, and say three parts of size n over 2 or

7 parts of size n over 8.

Or whatever, so we'll parameterize both the number of parts and

the size of each part.

Usually, they try to do equal size.

If you don't have equal size, then you have even more complications.

So that's the first thing, divide into alpha parts of size n over beta.

And then solve recursively, and then put the solution together somehow with

extra cost that is described by a standard function.

And in the theory of algorithms, remember we don't care about the constant,

we just want to get the order of growth.

So we'll say theta of n to the gamma times log n to the delta.
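Putting the pieces together, the recurrence being described (using the parameters alpha, beta, gamma, and delta from above) can be written as:

```latex
T(n) = \alpha\, T\!\left(\frac{n}{\beta}\right) + \Theta\!\left(n^{\gamma} (\log n)^{\delta}\right)
```

Here alpha is the number of parts, n over beta is the size of each part, and the theta term is the extra cost of putting the solutions together.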

So there's a lot of problems that fall within this paradigm.

And again, it's easy to write computer programs that have this kind of behavior.

Just write a recursive program that does these things.
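As a minimal sketch of such a recursive program, here is mergesort written to make the paradigm's parameters visible: it divides into alpha = 2 parts of size n over beta = n/2, solves each recursively, and merges with Theta(n) extra cost (so gamma = 1, delta = 0).

```python
def merge_sort(a):
    # Base case: a list of size 0 or 1 is already sorted.
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # first part of size n/2
    right = merge_sort(a[mid:])   # second part of size n/2
    # Merge step: the extra Theta(n) cost of combining the solutions.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```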

And so in the theory of algorithms, for decades, people have been developing

computer programs that gain their efficiency by this strategy.

And what we want is to analyze them.

So very early on,

people started studying general models for computer programs like this.

And that's what's called the master theorem that we'll talk about next.

So first, here's some examples.

So we did merge sort.

For merge sort, the subproblem size is n/2, so beta is 2.

Two subproblems of size n/2, so alpha is 2.

And the extra cost is n, so there's no log n factor.

So delta equals 0, gamma equals 1.

So that's mergesort.
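These parameters can be checked numerically: with alpha = 2, beta = 2, gamma = 1, delta = 0, the recurrence is T(n) = 2 T(n/2) + n, and for n a power of two it comes out to exactly n log2 n. A small sketch (the function name T is just for illustration):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Mergesort recurrence: 2 subproblems of size n/2, plus n extra cost.
    if n <= 1:
        return 0
    return 2 * T(n // 2) + n

# For powers of two, T(n) equals n * log2(n) exactly.
for n in (2**4, 2**10, 2**16):
    print(n, T(n), n * math.log2(n))
```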