0:04

Next, we'll look at two more examples of parameters in permutations.

Again, just to gain some comfort with the basic method and

also to get more familiarity with the properties of permutations.

And there are many, many other examples in the book.

We're picking four parameters of permutations, and

there are another five or six or more in the book.

So say you want to know how many 1-cycles there are in

a random permutation of size n.

So for n equals 2,

the average number of 1-cycles is just 1,

because the identity permutation has two 1-cycles,

and the one that's a 2-cycle has zero.

So the accumulated cost is 2,

and dividing by 2 we get 1.

For n equals 3, one permutation has three 1-cycles, which adds 3 to the accumulated cost.

Three of them have one 1-cycle, so that adds another 3 to the cost.

So again the total accumulated cost is 6, and

since there are six permutations, the average is 1.

And you might start to see a pattern, and sure enough, for

permutations of size 4, you can count them here:

there are 24 1-cycles in all these permutations and there are 24 permutations,

so the average is 1.

So we expect our result to show that the

expected number of 1-cycles in a random permutation of size n is 1.
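As a quick brute-force check (my own sketch, not part of the lecture), we can average the number of 1-cycles, i.e. fixed points, over all permutations of small sizes:

```python
# Brute-force check: the average number of 1-cycles (fixed points)
# over all permutations of size n. The claim is that it is always 1.
from itertools import permutations

def fixed_points(perm):
    """Count 1-cycles: positions i whose value is i."""
    return sum(1 for i, v in enumerate(perm) if v == i)

for n in range(2, 7):
    perms = list(permutations(range(n)))
    total = sum(fixed_points(p) for p in perms)  # the cumulated cost
    print(n, total, total / len(perms))          # average is 1.0 for each n
```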

1:51

To show that, we're going to use precisely the same construction that we

used for counting cycles.

So again, we put the new element |p| + 1 at every position in each cycle.

The difference in the analysis is: if the original

perm p has cyc1(p) 1-cycles, how many are there in the set of constructed perms?

Well, you have the same equation to start out with:

whatever number of 1-cycles there

are in the original, there's |p| + 1 times that in all of these.

But then we have to adjust: we add one for the new 1-cycle,

in the perm where our new element itself forms a 1-cycle.

And then we have to subtract off, for

every 1-cycle that was there (in this case, the 2):

we knocked it out by making it into a 2-cycle in one of the perms.

So we have to subtract off cyc1(p).

So that gives us a slightly different formula.

The number of 1-cycles in the set now is |p|

times cyc1(p), plus 1, instead of |p| + 1 times cyc1(p).

So just that plus 1 is the difference between this

3:03

equation and the one that we did for left-to-right minima and for cycles.
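Written out (a sketch in the lecture's notation, where |p| is the size of p and cyc1(p) the number of 1-cycles in p), the adjusted count is:

```latex
\mathrm{cyc}_1(\text{constructed set})
  = (|p|+1)\,\mathrm{cyc}_1(p) + 1 - \mathrm{cyc}_1(p)
  = |p|\,\mathrm{cyc}_1(p) + 1
```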

So let's look at what happens to the analysis when we do that.

So, we write down the CGF,

and then we apply the construction.

And again, the only difference is: there was |p| + 1 before, and now it's |p|.

3:54

And now the second term is just the EGF for

the permutations, that's 1/(1 − z).

Differentiating, the left side is just B′(z), and if

4:13

you take the derivative of the first term, if you look in the upper left at

the first formula, the p cancels out with

the p factorial in the denominator,

and then we are just left with an extra factor of z.

So it's z B′(z),

and then plus 1/(1 − z).

And that's our answer.

So now we have a differential equation, B′(z) = z B′(z) + 1/(1 − z), which means B′(z) = 1/(1 − z)²,

and that's easy to solve: B(z) is just z/(1 − z), and the coefficient of z^n is 1.

So the average number of 1-cycles in a random permutation is 1.
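Collecting the steps (a sketch, with B(z) the cumulated CGF for 1-cycles):

```latex
B'(z) = z\,B'(z) + \frac{1}{1-z}
\;\Longrightarrow\;
B'(z) = \frac{1}{(1-z)^2}
\;\Longrightarrow\;
B(z) = \frac{z}{1-z}
```

so [z^n] B(z) = 1 for n ≥ 1: the average number of 1-cycles is 1.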

5:21

So, for example, in our Students and

Rooms problem, where everyone goes back to a random room,

what's the average number of people who wind up in their own room?

Well, what we just proved is that it's one.

5:38

Now, just to test your understanding of this method,

it's worthwhile to take the time to figure out the

expected number of 2-cycles in a random permutation of size n, and then generalize

that to the expected number of r-cycles in a random permutation of size n.

The answers are 1/2 and 1/r.

By solving this problem you can

get a little practice manipulating these kinds of equations.
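Those answers can be checked by exhaustive enumeration (my own sketch; the names are not from the lecture):

```python
# Check by brute force: the expected number of r-cycles in a random
# permutation of size n is 1/r (for n >= r).
from fractions import Fraction
from itertools import permutations

def r_cycles(perm, r):
    """Count the cycles of length exactly r in perm (on {0,...,n-1})."""
    seen, count = set(), 0
    for start in range(len(perm)):
        length, j = 0, start
        while j not in seen:  # walk this cycle once, if not visited yet
            seen.add(j)
            j = perm[j]
            length += 1
        if length == r:
            count += 1
    return count

for r in (1, 2, 3):
    for n in range(r, 7):
        perms = list(permutations(range(n)))
        avg = Fraction(sum(r_cycles(p, r) for p in perms), len(perms))
        assert avg == Fraction(1, r)
```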

As our last example of studying parameters of permutations, we look at inversions.

An inversion in a permutation is a pair that is out of order;

that's a little bit imprecise.

A better way to look at it is to say: for

each entry, we sum up the number of elements that are larger and to its left.
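A direct transcription of that definition (my own sketch, not the book's code):

```python
def inversions(perm):
    """For each entry, count the larger elements to its left, and sum."""
    return sum(1 for i in range(len(perm))
                 for j in range(i)
                 if perm[j] > perm[i])

print(inversions((1, 2, 4, 3)))  # 1
print(inversions((2, 1, 4, 3)))  # 2
print(inversions((3, 1, 4, 2)))  # 3
```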

6:40

So for example, in the top right corner, 1 2 4 3:

the only pair that is out of order is 3 and 4.

So 3 has one larger element

to its left, the 4.

All the rest have zero larger elements to their left.

The one below that, 2 1 4 3, has two inversions: 1 and 2, and 3 and 4,

are out of order.

1 has one larger element to its left, the 2, and 3 has one larger element to its left, the 4.

The one below that, 3 1 4 2, has three inversions, because 1 has one larger

element to its left, the 3, and 2 has two larger elements to its left, the 4 and 3.

And again for every one of these permutations we have written down

off to the side the number of inversions.

If you add all those numbers up, you get the cumulated cost.

For three, the cumulated cost is 9, so

the average number of inversions is one and a half.

For four, the total cumulated cost is 72,

so the average,

the expected number of inversions, is 3.

7:51

So that's the quantity we want to study, the number of inversions.

An application of this is to analyze another elementary sorting algorithm

called insertion sort, which is also the method of choice in some situations, and

you can read about it in Algorithms.

8:20

What insertion sort does is go through the array.

For every position i,

its job is to keep all of the elements before i in sorted order.

And the way that it does that is when it gets a new element,

it exchanges it with all the larger elements to its left.

8:40

So in this case, when i equals 10 and it's pointing at the M,

it knows that everything to its left is already sorted.

But looking at the current element,

the element to its left is larger, so we exchange.

We keep exchanging as long as the element to its left is larger: the M is still

larger, and the N is larger, and when it gets to a point where the element

just to its left is smaller, then it's inserted into its proper place in the array.

9:12

So it's known as insertion sort.

The exchanges put the current element into place among

the elements to its left.

And so the cost of this sort, the

number of exchanges, is going to be the number of inversions in the permutation.
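A sketch of that claim in code (my own, with hypothetical names): an exchange-based insertion sort whose exchange count equals the inversion count, checked over all small permutations.

```python
from itertools import permutations

def insertion_sort_exchanges(a):
    """Sort list a in place by exchanging each element with the larger
    elements to its left; return the number of exchanges performed."""
    exchanges = 0
    for i in range(1, len(a)):
        j = i
        while j > 0 and a[j - 1] > a[j]:
            a[j - 1], a[j] = a[j], a[j - 1]
            exchanges += 1
            j -= 1
    return exchanges

def inversions(perm):
    """Sum, over each entry, the number of larger elements to its left."""
    return sum(1 for i in range(len(perm))
                 for j in range(i) if perm[j] > perm[i])

# Exchanges made by insertion sort == inversions, for every permutation.
for p in permutations(range(6)):
    assert insertion_sort_exchanges(list(p)) == inversions(p)
```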

So we want to know how many inversions there are in a random permutation in

order to understand insertion sort.

9:48

If you use a recursive method like quicksort or mergesort, when the files get small,

those methods are less efficient than insertion sort.

You want to switch to insertion sort.

If you want to analyze the size at which you should switch to insertion sort,

you have to be able to answer this question.

10:21

But it's got the same transfer theorem and so forth as the star construction.

And it's just an indication of the kind

of freedom that we have in trying to understand combinatorial objects.

So in this construction, which we denote

with a six-pointed star, we take

a permutation of size n − 1 and

create permutations of size n by inserting n in every possible position.

So here the 7 goes from the rightmost down to the first position.

Then there is no renumbering involved,

they all have the labels from 1 to 7 when you do that.

So now you notice that,

as we move down, each constructed perm has one more inversion than the last.

So the first one has no additional inversions,

the same number of inversions as the original.

But then each one, as we move down, everybody to the right

of 7 gets one more inversion added because now 7 is to its left.

11:30

And so we can calculate, again as before:

if we know the number of inversions in the original permutation,

then we know the number of inversions in the set of constructed permutations:

all the inversions in the original are still there

in each constructed permutation, but then we have all these additional inversions,

1 plus 2 plus 3 up to the size of p, which is |p| (|p| + 1) / 2.

So that's the number of inversions in the set of constructed perms.
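In symbols, the count just described (with inv(p) the number of inversions in p) is:

```latex
\mathrm{inv}(\text{constructed set})
  = (|p|+1)\,\mathrm{inv}(p) + \frac{|p|\,(|p|+1)}{2}
```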

Now, we're going to use that equation in our typical construction.

So now our cumulated generating function is for inversions, and

again, rearranging the sum, grouping by the size of

p, gives an equivalent

equation that we can now simplify.

12:33

It's got two terms. In the first term, the |p| + 1 cancels against the (|p| + 1) factorial.

In the second term, the |p| + 1 also cancels.

So the first term is just z B(z).

In the second term, if you group by size k,

it's just k z^(k+1) / 2, because the k! cancels against the k! permutations of size k,

and there's an extra z and the one half from |p| (|p| + 1) / 2.

So now we have the simple equation B(z) = z B(z) plus one half times

that generating function, which is z² times the derivative of the sum of z^k,

so z² / (1 − z)².

Now solve for B(z), and get another factor of 1 − z, so B(z) = z² / (2 (1 − z)³), and

again it's one of our elementary generating functions.

The coefficient of z^n in that gives the average number of inversions,

which is n (n − 1) / 4.
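And the result checks against exhaustive enumeration for small n (my own sketch, not from the lecture):

```python
# Check: the average number of inversions over all permutations of
# size n is n(n-1)/4.
from fractions import Fraction
from itertools import permutations

def inversions(perm):
    """Sum, over each entry, the number of larger elements to its left."""
    return sum(1 for i in range(len(perm))
                 for j in range(i) if perm[j] > perm[i])

for n in range(1, 7):
    perms = list(permutations(range(n)))
    avg = Fraction(sum(inversions(p) for p in perms), len(perms))
    assert avg == Fraction(n * (n - 1), 4)
    print(n, avg)  # e.g. 3/2 for n = 3, and 3 for n = 4
```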

So now, that's the fourth example, and that's enough;

again, if you want many more, there are many more in the book.

Again, that checks against small values.

There's lots of properties and permutations that have been studied in

classical combinatorics that can be handled in a similar manner.

So for example, a rise in a permutation is when the value goes up, and

a fall is when the value goes down.

A peak is when the value goes up and then down.

A valley is when the value goes down, then up.

A run is a sequence of successive values going up.

Left-to-right minima we already talked about,

and an increasing subsequence is a subset of the permutation where

the values go up. All of these properties can be handled in a similar manner.

And again, the book contains several other derivations

that it wouldn't be productive to cover in lecture.

So those four examples indicate an approach towards studying parameters

of permutations that's effective, and the ones that we looked at

have actual applications to understanding the performance of

important algorithms in practical situations.