0:41

So if you have that approximation of G, then taking e to that power, which is what the set construction does, gives you a good approximation of F. And then you can extract coefficients from the standard scale using F.

And again, there's a lot of detail in proving this fact, involving the singularity analysis process, but once it's proven we can apply it easily in lots of applications. There's also a corollary that immediately gives the average number of components in a random object: it's alpha log n, where alpha is the coefficient of the log in the approximation. So that's again a transfer theorem from the last lecture. And now we're going to look at applications, and again we'll just be focusing on the coefficient asymptotics.
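The transfer the theorem provides can be written down concretely. Here is a minimal sketch (in Python; the function names are my own, not from the lecture): if F(z) = e^G(z) and, near its singularity rho, G(z) is asymptotic to alpha log(1/(1 - z/rho)) + beta, then the theorem gives the coefficient estimate below, and the corollary gives alpha log n components on average.

```python
import math

def exp_log_coeff(alpha, beta, rho, n):
    # Exp-log transfer: F(z) = e^G(z) with G(z) ~ alpha*log(1/(1 - z/rho)) + beta
    # near the singularity rho implies
    #   [z^n] F(z) ~ e^beta * n^(alpha - 1) / (Gamma(alpha) * rho^n)
    return math.exp(beta) * n ** (alpha - 1) / (math.gamma(alpha) * rho ** n)

def avg_components(alpha, n):
    # Corollary: mean number of components in a random size-n object ~ alpha * log n
    return alpha * math.log(n)

# Permutations (sets of cycles): alpha = 1, beta = 0, rho = 1, so [z^n]P(z) ~ 1
print(exp_log_coeff(1, 0, 1, 100))  # 1.0
```

Plugging in the parameters for each application below reproduces the estimates quoted in the lecture.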

The first application is all about cycles in permutations. Again, just to refresh memories for people who didn't see the last lecture, take a look at this one: we're interested not just in counting permutations, but in how many cycles there are, on average, in a random permutation. We've looked at this problem as an example all the way back in Part One of Analytic Combinatorics, but now we're going to show how this problem, variants of it, and problems like it are immediately handled with a general schema. Our construction is: a permutation is a

set of cycles. That immediately translates to log of 1 over 1 minus z for the cycles, and e to that power for the set, so the generating function is in exp-log form. That form of the generating function says we're going to use the exp-log theorem. Now what we want to do is calculate alpha, rho, and beta, but those are immediate.

2:52

Because in this case it's not an approximation: the thing that e is raised to is exactly log of 1 over 1 minus z. So alpha equals 1; beta equals 0, there's nothing added on; and rho equals 1. We just plug in: alpha equals 1, and Gamma of 1 is 1; beta equals 0, so e to the beta is 1; rho equals 1, so that factor is 1 to the n. All that's left is 1, from exp-log. So the coefficient of z to the n in P(z) is asymptotic to 1, and it's an exponential generating function, so the number of permutations is asymptotic to n factorial. That doesn't seem of much

interest, but remember, the corollary gives the expected number of components, which here is the expected number of cycles, something that would otherwise require some calculation. It's alpha log n, and alpha is 1, so it's immediate: through singularity analysis we get not just the number of permutations, but also the average number of cycles, which is log n.
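That log n average can be cross-checked exactly: the mean number of cycles in a random permutation of size n is the harmonic number H_n = 1 + 1/2 + ... + 1/n, which is log n plus a constant. A minimal sketch (my own illustration, not from the lecture):

```python
import math

def avg_cycles(n):
    # Exact mean number of cycles in a uniform random permutation of size n:
    # the harmonic number H_n = 1 + 1/2 + ... + 1/n.
    return sum(1.0 / k for k in range(1, n + 1))

n = 1000
print(avg_cycles(n), math.log(n))  # H_n is log n plus Euler's constant, roughly
```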

And again, for a basic problem like this we have lots of other ways to derive the log n result. Our point is that for variants of this structure, or for other similar structures, we can use precisely the same process to get an answer where direct calculation might be extremely complicated, or prohibitive.

For example, we looked at derangements. We can look at derangements again, or we can look at a problem like: how many cycles are there in a random derangement? It might be hard to think about how to even approach a problem like that, but it's immediate from the exp-log schema. And precisely the same process works in more general settings, like cycles in derangements.

We talked about derangements, which are the set of all permutations having no singleton cycles, so we take out the singleton cycles. That's the construction: a set of cycles, all of which have length greater than one. All we do is subtract z from log of 1 over 1 minus z to take out the singleton cycles, so D(z) equals e to the power log of 1 over 1 minus z, minus z. That's exp-log form; the only difference is that now beta is minus one, since the subtracted z equals 1 at the singularity.

When we now apply the theorem, we get the same answer, except that, because of the e to the beta, we get e to the minus 1. And that's a familiar result: to get the number of derangements we multiply by n factorial, and it's n factorial over e.
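The n factorial over e result is easy to check numerically with the classic derangement recurrence D_n = (n-1)(D_{n-1} + D_{n-2}); this sketch (my own illustration, not from the lecture) watches D_n/n! approach 1/e:

```python
import math

def derangements(n):
    # Classic recurrence: D_n = (n - 1) * (D_{n-1} + D_{n-2}), D_0 = 1, D_1 = 0
    if n == 0:
        return 1
    d_prev, d_curr = 1, 0  # D_0, D_1
    for k in range(2, n + 1):
        d_prev, d_curr = d_curr, (k - 1) * (d_prev + d_curr)
    return d_curr

n = 12
print(derangements(n) / math.factorial(n))  # approaches 1/e = 0.367879...
```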

We looked at this as an early example in Part 1 of the course, where that 1 over e started out as a finite sum and we had to show that the tail is negligible, and so forth. This higher-level way of looking at the same problem really gives insight into where this form of the solution comes from: it's all about the minus one.

We subtracted off the singleton-cycle term, and that translated right through the theorem, down to the exponent of e. And not only that, we have the corollary, which tells us the average number of cycles in a random derangement: it's still log n, it doesn't involve beta, so the average number of cycles in a random derangement is still asymptotic to log n. And this generalizes to include any restrictions on the cycle lengths. You could pick t different integers and say: I want to study the class of all permutations having no cycles of length w1, w2, down through wt. The construction is just sets of

cycles with that restriction on the cycle lengths. It immediately translates to a generating function equation where, from log of 1 over 1 minus z, we subtract off the term z to the w over w corresponding to each one of those cycle lengths. When we apply the exp-log theorem, all that matters is the approximation of that function near the singularity, which is rho equals 1. There, those z to the w's become 1, so it all goes into beta: beta is minus the sum of 1 over w over the forbidden lengths. So applying the theorem, we're just going to get e to that negative constant.

Whatever the constraints are, you can compute the constant, and you know that the number of generalized derangements of that form is n factorial over e to that constant, very easy to compute. Pick 2, 3, 5, 9, and 11, and you can compute the constant and know the number of permutations that have no cycles of those lengths. You might be challenged to try to derive a result like that using elementary techniques.
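It is, however, easy to check by machine. The sketch below (my own illustration; the function names are hypothetical, not from the lecture) extracts n! times the coefficient of z^n from exp(log(1/(1-z)) minus the sum of z^w/w over the forbidden lengths w) by power-series exponentiation, and confirms it against a brute-force scan of all permutations for small n:

```python
import math
from fractions import Fraction
from itertools import permutations

def count_restricted(n, forbidden):
    # n! * [z^n] exp( log(1/(1-z)) - sum_{w in forbidden} z^w / w )
    g = [Fraction(0)] * (n + 1)
    for k in range(1, n + 1):
        if k not in forbidden:
            g[k] = Fraction(1, k)
    # f = exp(g) via f' = g'*f, i.e. k*f_k = sum_{j=1..k} j*g_j*f_{k-j}
    f = [Fraction(1)] + [Fraction(0)] * n
    for k in range(1, n + 1):
        f[k] = sum(j * g[j] * f[k - j] for j in range(1, k + 1)) / k
    return int(f[n] * math.factorial(n))

def brute_force(n, forbidden):
    # Count permutations of {0..n-1} having no cycle length in `forbidden`
    count = 0
    for p in permutations(range(n)):
        seen, ok = [False] * n, True
        for i in range(n):
            if not seen[i]:
                length, j = 0, i
                while not seen[j]:
                    seen[j] = True
                    j = p[j]
                    length += 1
                if length in forbidden:
                    ok = False
                    break
        count += ok
    return count

print(count_restricted(4, {1}), brute_force(4, {1}))  # derangements of size 4: 9 9
```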

9:07

Let's look at a much more complicated problem to deal with: 2-regular graphs. It's actually not at all more complicated through analytic combinatorics. These are undirected graphs where all the nodes have degree two, and every node having degree two essentially means the graph has to be a collection of little circles. There's only one 2-regular graph of size three, a little triangle, and there's only one way to label it, because all labelings give the same graph. Down there at the right is the representation of the graph as a set of edges: if we have an edge from 1 to 2, an edge from 1 to 3, and an edge from 2 to 3, then all the nodes have degree two, and that's the only possibility.

For four-node graphs to keep all nodes of degree two, it again has to be a little circle, but now there are a bunch of different ways to label it and get different graphs. In this case there are three different edge sets: you fix one of the nodes and count the permutations of the rest, but then you remove the flips, since either orientation counts the same in this type of graph, because the set of edges won't see the difference in how you draw it. And for five, it turns out there are twelve: again, fixing one node gives twenty-four permutations, and each graph appears twice, once in each of its orientations.

For 6 nodes, now you could have the 6-cycle, and again there are 60 ways to label that, or you could have two 3-node triangles, and that would work fine too. So there are actually 70 different 6-node graphs, each with six edges, where every node has degree two. And for seven nodes there are 465.

As we get bigger, we can have different combinations of the little cycles. So it's a set of components of this type, and a component with k nodes can be labeled in (k minus 1) factorial over 2 ways. You can work with elementary methods to try to count these, but with analytic combinatorics it's actually quite easy. You also might want to know the number of components in a random 2-regular graph. For size six, there's one component for 60 of them and two components for ten of them, so that's an average of 1.143; for size seven it's 1.226.
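These small counts and averages can be reproduced from the specification the lecture develops next: the EGF for 2-regular graphs is exp((1/2) log(1/(1-z)) - z/2 - z^2/4), and dividing the coefficient of the cycle EGF times the graph EGF by the graph coefficient gives the mean number of components. A sketch (my own illustration, not from the lecture):

```python
import math
from fractions import Fraction

def two_regular_series(n_max):
    # C(z) = (1/2)log(1/(1-z)) - z/2 - z^2/4: EGF for one undirected cycle of >= 3 nodes,
    # so c[k] = 1/(2k) for k >= 3 and the z and z^2 terms cancel.
    c = [Fraction(0)] * (n_max + 1)
    for k in range(3, n_max + 1):
        c[k] = Fraction(1, 2 * k)
    # G(z) = exp(C(z)), computed term by term via G' = C'*G
    g = [Fraction(1)] + [Fraction(0)] * n_max
    for k in range(1, n_max + 1):
        g[k] = sum(j * c[j] * g[k - j] for j in range(1, k + 1)) / k
    return c, g

c, g = two_regular_series(7)
print([int(g[k] * math.factorial(k)) for k in range(8)])  # [1, 0, 0, 1, 3, 12, 70, 465]

# Mean number of components: [z^n] C(z)*G(z) divided by [z^n] G(z)
for n in (6, 7):
    mean = sum(c[j] * g[n - j] for j in range(n + 1)) / g[n]
    print(n, float(mean))  # 1.142..., then 1.225...
```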

Maybe in some application in chemistry, or some physical process, you have a random structure of this type and you want to know the number of components. That's the kind of problem we're getting at with this kind of analysis.

12:40

So what is a 2-regular graph? It's a set of undirected cycles of three or more nodes. That's all; it's actually a fairly compact specification of what it is. That immediately translates to the generating function equation. Undirected cycle means you take a factor of one-half, since each cycle would otherwise be counted once per orientation, and then again we have to subtract off the small ones, z over 2 and z squared over 4. That's the classic generating function for 2-regular graphs. Now, that looks a bit complicated at first, until you realize that it's exp-log form: e raised to a power, where what's in that power is some constant times log of 1 over 1 minus z, plus some other constant, asymptotically, near the singularity closest to the origin. The singularity closest to the origin is 1.

Near that, we have beta equals minus three-quarters, from minus z over 2 minus z squared over 4 evaluated at z equals 1; we have alpha equals one-half, that's the coefficient of the log; and we have rho equals 1. Immediately now we can apply the theorem. Beta is minus three-quarters, so we have e to the minus three-quarters. Alpha equals one-half, so we have Gamma of one-half, which is the square root of pi. And that's it.

14:16

So immediately the exp-log theorem translates the generating function equation to the coefficient asymptotics: e to the minus three-quarters over the square root of pi n. And if you want the number of 2-regular graphs, you multiply that by n factorial. It's a very general theorem: when we're using the set construction, the exp-log theorem gets us coefficient asymptotics. And asymptotics of this type would be quite difficult to get using normal techniques.

Now again, underlying this theorem is singularity analysis, so there's quite a bit of machinery under there, but when it comes to applying it, it's no problem. And again, the component corollary applies here: now we have alpha equals one-half, so the average number of components in a random 2-regular graph of size n is one-half log n. It's very easy to get results of this type using the exp-log schema. So that's our second example of a widely applicable schema based on singularity analysis.
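To close the loop, the asymptotic estimate can be compared with exact coefficients. This sketch (my own illustration, not from the lecture) expands G(z) = exp((1/2) log(1/(1-z)) - z/2 - z^2/4) numerically and checks the coefficient of z^n against e^{-3/4} over the square root of pi n:

```python
import math

def two_regular_coeff(n):
    # [z^n] exp( (1/2)log(1/(1-z)) - z/2 - z^2/4 ), via exp of a power series;
    # the exponent has coefficients 1/(2k) for k >= 3 (z and z^2 terms cancel).
    c = [0.0] * (n + 1)
    for k in range(3, n + 1):
        c[k] = 1.0 / (2 * k)
    f = [1.0] + [0.0] * n
    for k in range(1, n + 1):  # exp via f' = c'*f
        f[k] = sum(j * c[j] * f[k - j] for j in range(1, k + 1)) / k
    return f[n]

n = 1000
exact = two_regular_coeff(n)
approx = math.exp(-0.75) / math.sqrt(math.pi * n)
print(exact, approx)  # the two agree closely for large n
```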