the value at that current index is already bigger than the value

of the last element of the already sorted part of the array.

So when does that happen?

Well, it happens when the array that we start with is already sorted.

Every time we look at a new element, it's already in its correct location

relative to the earlier ones, so we don't need to do anything.

The body of the while loop never executes.

And so when we look at the performance in this case,

that means that every time we start executing the while loop,

we perform just that one test, and it evaluates to false.

And we go away from it, and we go to the next iteration of the for loop.

And so that means that the body of the for loop in this case,

when the array's already sorted, every time we go through the for loop,

we just execute a constant number of instructions.

And so, all together, we have n iterations of the for loop, each time doing

just a constant amount of work, and so the total is just O(n) work.
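
To make this concrete, here's a minimal sketch of insertion sort (the function name and counter are mine, not from the lecture) that counts how many times the while-loop test actually runs. On an already sorted array, each test fails immediately, so the count is linear in n:

```python
def insertion_sort(a):
    """Sort a in place, counting how many while-loop tests execute."""
    tests = 0
    for i in range(1, len(a)):
        j = i
        # Nudge a[j] left while it is smaller than its left neighbor.
        while j > 0:
            tests += 1
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break  # already in the right place relative to the sorted part
    return tests

# On a sorted array, each of the n - 1 iterations does exactly one test.
print(insertion_sort([1, 2, 3, 4, 5]))  # prints 4
```

Each for-loop iteration here pays for just one failed test, which is the constant amount of work described above.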

So when the array is already sorted, in this lucky

best-case scenario, insertion sort takes just O(n) time.

Cool. That's much, much better than the quadratic time

we saw for selection sort.

So that's interesting, okay.

Is it gonna be good all the time?

Is it a big win?

Well, so let's think about the worst case.

So what input might we get that would require us to run many, many steps?

And remember the piece that we're focusing in on is that inner while loop.

Where we're nudging the current value that we're looking at to its correct location

within that first part of the array that's already sorted.

So, if we think about what might make us really unlucky,

it's having to do a lot of that nudging, a lot of those swaps.

And so, if we think of our array as being really far from sorted, then we're gonna

have to do a lot of work to keep moving these elements to their correct locations,

relative to the sorted part of the array.

So, thinking about an array that's given to us in reverse sorted order,

we're going to have to do a lot of work in this algorithm in order to get it

to the correct form,

namely going from one through five, rather than from five through one.

So in this worst-case analysis what we see is that when we are looking,

for example, at the second position, so position equals 1,

then when we want to nudge 4 over to its correct location we have to do a swap.

And then when we look at the next index at position 2, then in order to move 3

to its correct location we're going to have to swap it twice.

And so every time we're going to have to swap all the way over to

the head of the array, and so altogether what's going to happen is that

on average we're gonna have about n/2 swaps, which is O(n), for each position.

And that happens n many times as we go through all the positions.

And so we get to O(n²) work in total.
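
Here's a short sketch (the helper name is mine, not from the lecture) that counts the swaps insertion sort performs, so we can check the reverse-sorted case directly. With the array 5 through 1, position i needs i swaps, and the total is 1 + 2 + 3 + 4 = n(n-1)/2, which is quadratic:

```python
def count_swaps(values):
    """Run insertion sort on a copy of values and return the number of swaps."""
    a = list(values)
    swaps = 0
    for i in range(1, len(a)):
        j = i
        # Swap a[j] leftward until it reaches its correct location.
        while j > 0 and a[j - 1] > a[j]:
            a[j - 1], a[j] = a[j], a[j - 1]
            swaps += 1
            j -= 1
    return swaps

# Reverse-sorted input is the worst case: 1 + 2 + 3 + 4 swaps in total.
print(count_swaps([5, 4, 3, 2, 1]))  # prints 10
```

On an already sorted input the same function returns 0, matching the best-case analysis from earlier.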

And so the analysis at the end was a little brief but

it's really similar to what we do when we think about analyzing selection sort.

So go back and

revisit that support video if you'd like to see a little bit more detail.

Now what we wanna keep in mind, though, is that the worst case for

insertion sort happened when our input array was in reverse order.

And so this is really illustrating how

the structure of the input can determine the performance of our algorithm.

And when we're thinking about worst-case analysis, we want to think about

the kinds of inputs that will make us do the most amount of work.