The primary topics in this part of the specialization are: greedy algorithms (scheduling, minimum spanning trees, clustering, Huffman codes) and dynamic programming (knapsack, sequence alignment, optimal search trees).

From the course by Stanford University

Greedy Algorithms, Minimum Spanning Trees, and Dynamic Programming

291 ratings

Course 3 of 4 in the Algorithms Specialization

From the lesson

Week 1

Two motivating applications; selected review; introduction to greedy algorithms; a scheduling application; Prim's MST algorithm.

- Tim Roughgarden, Professor, Computer Science

So in this sequence of videos, we're going to apply the greedy algorithm design paradigm to a fundamental graph problem: the problem of computing minimum spanning trees. The MST problem is a really fun playground for greedy algorithm design, because it's the singular problem in which pretty much any greedy algorithm you come up with seems to work. So we'll talk about a couple of the famous ones, show why they're correct, and show how they can be implemented using suitable data structures to be blazingly fast. I'll give you the formal problem definition on the next slide, but first let me just say informally what it is we're trying to accomplish. Essentially, what we want to do is connect a bunch of points together as cheaply as possible.

And, as usual with an abstract problem, the objects can mean something very literal. So maybe the points we're trying to connect are servers in some computer network, or they could represent something more abstract. Maybe we have a model of documents, like web pages, where we represent them as points in space, and we want to somehow connect those together.

Now, the main reason I'm going to spend time on the minimum spanning tree problem is pedagogical. It's just a great problem for sharpening your skills with greedy algorithm design and proofs of correctness. It'll also give us another opportunity to see the beautiful interplay between data structures and fast implementations of graph algorithms. That said, the minimum spanning tree problem does have applications. One very cool one is in clustering, which I'll talk about in detail in a later video. It also comes up in networking: if you do a web search on "spanning tree protocol," you'll find some information about that.

So, as I said at the beginning, the minimum spanning tree problem is remarkable in that it doesn't just admit one greedy algorithm that's correct; it admits multiple greedy algorithms that are correct. We're going to talk about two of them, the two most well-known ones, but believe it or not, there are even some others. The first one we're going to discuss, beginning in the next video, is Prim's MST algorithm. It dates back over 50 years, to 1957. In fact, as you'll see, Prim's algorithm shows a remarkable number of similarities with Dijkstra's shortest-path algorithm, so you might not be surprised to learn that Dijkstra also independently discovered this algorithm a couple of years later. But it was only noticed much later that this exact same algorithm had been first discovered over 25 years earlier by a mathematician named Jarník. For that reason, you'll sometimes hear it called Jarník's algorithm, or the Prim-Jarník algorithm. For brevity, and to be consistent with some of the main textbooks in the area, I'm just going to call it Prim's algorithm throughout the lectures.

The other algorithm we're going to cover, which is also rightfully famous, is Kruskal's MST algorithm. As far as I know, this was indeed first discovered by Kruskal, at roughly the same time Prim was doing his algorithm in the mid-50s. And in what sense do I say these algorithms are blazingly fast? Well, they run in almost linear time, linear in the number of edges of the graph. Specifically, we'll see how using appropriate data structures gets each of them to run in time big-O of m log n, where m is the number of edges in the graph and n is the number of vertices. We'll employ data structures to speed up Prim's algorithm in exactly the same way we did for Dijkstra's algorithm; that is, we'll be using the heap data structure.
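As a concrete preview, here is a minimal Python sketch of the heap-based idea. Note this is the "lazy" variant, which tolerates stale entries in the heap rather than updating keys in place, so it isn't necessarily the exact implementation the later lectures will develop:

```python
import heapq

def prim_mst_cost(adj, start):
    """Lazy Prim's algorithm: repeatedly pop the cheapest edge crossing
    the frontier. adj maps each vertex to a list of (cost, neighbor)."""
    visited = {start}
    heap = list(adj[start])          # candidate edges out of the start vertex
    heapq.heapify(heap)
    total = 0
    while heap:
        cost, v = heapq.heappop(heap)
        if v in visited:
            continue                  # stale entry: v was already absorbed
        visited.add(v)
        total += cost
        for edge in adj[v]:
            if edge[1] not in visited:
                heapq.heappush(heap, edge)
    return total
```

Since the heap holds at most one entry per edge, this runs in O(m log m) time, which is O(m log n) because m is at most n squared.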

One thing that's cool about Kruskal's algorithm is that it'll give us an opportunity to study a new data structure, namely the union-find data structure, and that's a lot of fun to think about in its own right, as you'll see.
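For the curious, here is a minimal union-find sketch in Python. This is one common variant (path halving plus union by size); the version developed in the lectures may differ in its details:

```python
class UnionFind:
    """Disjoint-set data structure: near-constant-time merging and
    querying of components, the workhorse behind Kruskal's algorithm."""

    def __init__(self, n):
        self.parent = list(range(n))  # each element starts as its own root
        self.size = [1] * n

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return False              # already in the same component
        if self.size[rx] < self.size[ry]:
            rx, ry = ry, rx           # attach the smaller tree to the larger
        self.parent[ry] = rx
        self.size[rx] += self.size[ry]
        return True
```

In Kruskal's algorithm, `union` returning False is exactly the signal that an edge would create a cycle and should be skipped.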

So, to put this amazing running time in perspective, I want to emphasize that not only is it awesome in the sense that it's almost linear, it takes barely more time to compute the minimum spanning tree than it does to read the input graph. Reading the input graph alone, remember, takes linear time, O of m time. But moreover, graphs can have an enormous number of spanning trees, an exponential number. So these algorithms are honing in really quickly on a needle in a haystack: there's no way they have time to look at all these spanning trees, and yet they find the one which is the best, which is optimal amongst all of them. How do these seemingly magical algorithms do it? Well, to discuss the details, let's start by formalizing the minimum spanning tree, or MST, problem on the next slide.

The MST problem is a graph problem, so the main part of the input is a graph comprising vertices and edges. I do want to emphasize that for the MST problem we will be considering only undirected graphs. This is different, notice, from when we discussed shortest-path problems in Part 1 of the course; there we worked with directed graphs. There is an analogous problem to the minimum spanning tree problem for directed graphs, often called the optimal branching problem, and there are fast algorithms for it, but those algorithms are just slightly beyond the scope of this course, so we're not going to cover it. We're going to discuss only undirected graphs, and minimum spanning trees for them.

Now, whenever you talk about graph problems, you need to talk about how the graph is actually represented. That's something we discussed at length in Part 1; if you don't remember, I suggest going back and reviewing the video on graph representations. For the MST problem, we're going to assume that the graph is given as an adjacency list. That means we're given an array of vertices and an array of edges, and we have pointers wiring vertices to their incident edges and wiring edges back to their two endpoints. In addition to the graph itself, the input includes a cost for each of the edges; we're going to use the notation c_e to denote the cost of an edge e.
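To make that representation concrete, here is one minimal, hypothetical way to wire up vertices, edges, and costs in Python. The edge names and the toy costs below are made up purely for illustration:

```python
# An edge list: each entry is (endpoint, endpoint, cost c_e).
edges = [('A', 'B', 1), ('B', 'D', 2), ('A', 'D', 3)]

# The "wiring" in the other direction: for each vertex, the indices
# into `edges` of its incident edges.
incident = {}
for i, (u, v, _) in enumerate(edges):
    incident.setdefault(u, []).append(i)
    incident.setdefault(v, []).append(i)
```

With this layout, following a vertex to its incident edges and an edge back to its two endpoints are both constant-time pointer (index) lookups, which is what the graph algorithms rely on.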

And in another contrast to our discussion of shortest-path problems, we're actually not going to care whether the edge costs are positive or negative; they can be any numbers whatsoever. So, no prizes for guessing what the output is supposed to be, it's right there in the problem definition: the output should be a minimum-cost spanning tree of the graph. But let's drill down and explain exactly what we mean by that. First of all, what do we mean by the cost of a tree, or more generally the cost of a subgraph, viewed as a subset of the edges? We're simply going to sum up the costs of the edges in the tree that we output. The other question is, what do I mean by a tree that spans all vertices? Let me tell you exactly what this means: the subgraph T should have two properties. First, there cannot be any cycles, any loops, in this tree. Second, by spanning all vertices, what I mean is that this subgraph is what's called connected; that is, there's a path, using the edges in T, from any vertex of the graph to any other vertex. That's what it means to span all of the vertices.

So, for example, consider the following graph with four vertices and five edges. I've labeled each of the five edges with a cost, which in this case is just an integer between one and five. Let's look at some example subgraphs, starting with the three edges AB, BD, and CD. This subgraph satisfies both properties: it has no cycles, no loops, and it spans all of the vertices. If you start at any one of the four vertices, you can get to any of the others using only red edges. So, in that sense, this red subgraph is a spanning tree. However, it is not the minimum-cost spanning tree. There is another spanning tree which is even cheaper, which has a smaller sum of edge costs, namely the edges AC, AB, and BD. This also has no cycles and is also connected, but the sum of its edge costs is only seven, smaller than the eight of the previous spanning tree. In fact, this pink subgraph is the unique minimum spanning tree of this graph.
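You can sanity-check an example this small by brute force. The slide itself isn't reproduced here, so the specific cost assignment below is an assumption, chosen to be consistent with the narration (integer costs 1 through 5, red tree AB-BD-CD costing 8, pink tree AB-BD-AC costing 7):

```python
from itertools import combinations

# Assumed costs consistent with the lecture's description of the slide.
edges = {('A', 'B'): 1, ('B', 'D'): 2, ('A', 'D'): 3,
         ('A', 'C'): 4, ('C', 'D'): 5}
vertices = {'A', 'B', 'C', 'D'}

def is_spanning_tree(subset):
    """The two properties from the lecture. With |V| - 1 edges,
    connectivity alone already rules out cycles."""
    reached, frontier = {'A'}, ['A']
    while frontier:
        u = frontier.pop()
        for a, b in subset:
            for x, y in ((a, b), (b, a)):
                if x == u and y not in reached:
                    reached.add(y)
                    frontier.append(y)
    return len(subset) == len(vertices) - 1 and reached == vertices

# Exhaustively examine every 3-edge subset and keep the cheapest tree.
trees = [s for s in combinations(edges, 3) if is_spanning_tree(s)]
best = min(trees, key=lambda s: sum(edges[e] for e in s))
```

Of course, exhaustive enumeration is exponential on general graphs, which is exactly why we want the fast greedy algorithms instead.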

There is a subgraph with three edges that has an even smaller sum of edge costs, namely the triangle AB, BD, and AD. But this light blue subgraph, this triangle, is not a spanning tree; in fact, it fails on both counts. It obviously has a cycle, a loop; that's what a triangle is, by definition. It's also not connected: there's no way to get from the vertex C to any of the other three vertices by following only light blue edges. It's disconnected, and so it fails the spanning property as well. So the MST problem in general is: you're given an undirected graph, for example this four-vertex, five-edge graph, or presumably something much larger and more interesting, and you're supposed to quickly identify the minimum spanning tree, like the pink subgraph in this example. What I want to do next is something you're probably quite accustomed to me doing by this point: I want to make a couple of mild simplifying assumptions, just among friends.

These assumptions are not important, in the sense that all of the conclusions of these lectures will remain valid even if the assumptions are violated, but they'll make the lectures a little bit easier. They'll allow us to focus on the main points and not get distracted by less relevant details. So here are the two assumptions we're going to make throughout all of the lectures on minimum spanning trees.

The first assumption is that the input graph G is itself connected; that is, G contains a path from any vertex to any other vertex. Why am I making this assumption? Well, if it's violated, then the problem isn't even well defined: if the graph isn't connected, then certainly none of its subgraphs are connected, so it has no spanning trees, and it's not clear what we're trying to do. Those of you who still remember the material we covered in Part 1, in particular graph search, should recognize that this condition is easy to check in a preprocessing step. Just run something like breadth-first search or depth-first search; remember, we know how to implement those in linear time, and they will, in particular, tell you whether or not the input graph is connected.
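As a sketch, that preprocessing check might look like this in Python, assuming a simple neighbor-list representation of the undirected graph:

```python
from collections import deque

def is_connected(adj):
    """Linear-time BFS connectivity check. adj maps each vertex to a
    list of its neighbors; returns True iff every vertex is reachable
    from every other vertex."""
    if not adj:
        return True
    start = next(iter(adj))
    seen = {start}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return len(seen) == len(adj)    # did BFS reach every vertex?
```

Running this once up front costs O(m + n) time, which is dominated by the O(m log n) budget of the MST algorithms themselves.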

Now, another thing you might be wondering is: suppose the graph was disconnected, then what? Should we really just throw up our hands and give up? You can define a more general version of the minimum spanning tree problem, called minimum spanning forest, where basically you want the minimum-cost subgraph that spans as much as possible; essentially, you're responsible for computing a spanning tree within each of the connected components of the original graph. And the algorithms I'll show you here, Prim's algorithm and Kruskal's algorithm, are easily modified to solve this more general problem on disconnected input graphs as well. But again, for simplicity among friends, let's just focus on the connected graph case, which contains all of the main ideas.

Our second standing assumption throughout all of the minimum spanning tree lectures will be that the edge costs in the input graph are distinct. You're already used to this sort of no-ties assumption from our foray into scheduling algorithms, and we're going to do something similar here. Again, this assumption is not important, in the sense that the algorithms we cover, Prim's algorithm and Kruskal's algorithm, remain correct even if the input has equal-cost edges, irrespective of how ties are broken. So the algorithms are correct as generally as you would want. That said, I'm not going to actually prove for you that they are correct with ties. Remember, with our scheduling application it was a little bit easier to get a proof of correctness without ties; I gave you that, and then, optionally, there was a slightly more complicated argument that handled ties. You can do the same thing here, but I'm just not going to give it to you; I'll leave that for the keen viewer to work out for themselves.

Â Coursera provides universal access to the worldâ€™s best education, partnering with top universities and organizations to offer courses online.