So one thing to notice here is that our Hessian matrix is symmetrical

across the leading diagonal.

So actually, once I'd worked out the top right region,

I could just have written these directly in for the bottom left region.

This will always be true as long as the function's second partial derivatives are continuous,

meaning that neither the function nor its gradient has any sudden step changes.

We can now simply pass our Hessian an (x, y, z) coordinate,

and it will return a matrix of numbers,

which hopefully tells us something about that point in the space.
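To make that concrete, here is a rough numerical sketch (my own construction, not code from the lecture) of a function that takes an (x, y, z) coordinate and hands back the Hessian as a matrix of numbers, using central finite differences:

```python
import numpy as np

def numerical_hessian(f, point, h=1e-4):
    """Approximate the Hessian of f at the given point by finite differences."""
    point = np.asarray(point, dtype=float)
    n = point.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # Estimate d^2 f / (dx_i dx_j) from four nearby function values.
            p = point.copy()
            p[i] += h; p[j] += h; fpp = f(*p)
            p[j] -= 2 * h;        fpm = f(*p)
            p[i] -= 2 * h;        fmm = f(*p)
            p[j] += 2 * h;        fmp = f(*p)
            H[i, j] = (fpp - fpm - fmp + fmm) / (4 * h**2)
    return H

# For f(x, y, z) = x^2 + y^2 + z^2 the true Hessian is 2I at every point.
print(numerical_hessian(lambda x, y, z: x**2 + y**2 + z**2, (1.0, 2.0, 3.0)))
```

The printed matrix should be close to twice the identity, and you can see the symmetry across the leading diagonal directly in the output.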

In order to visualize this, we're going to have to drop down to two dimensions again.

Consider the simple function f(x, y) = x² + y².

Calculating the Jacobian and the Hessian of this function is fairly straightforward.

And hopefully, you can visualize in your head how this function looks.

However, if you hadn't known what function we were dealing with and

calculated the value of the Jacobian at the point (0,0),

you'd have seen that the gradient vector was also 0.

But how would you know whether this thing was a maximum or a minimum at that point?

You could, of course, go and check some other point and see if it was above or

below, but this isn't very robust.

Instead, we can look at the Hessian,

which in this simple case is no longer even a function of x or y.

Its determinant is clearly just 2 times 2 minus 0 times 0, which is 4.
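If you want to check this on a machine, a short symbolic sketch with sympy (my tooling choice, not part of the lecture) reproduces the Jacobian, the Hessian, and the determinant of 4:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 + y**2

jacobian = sp.Matrix([f]).jacobian([x, y])   # row vector of first derivatives
hessian = sp.hessian(f, (x, y))              # matrix of second derivatives

print(jacobian)                      # Matrix([[2*x, 2*y]])
print(hessian)                       # Matrix([[2, 0], [0, 2]])
print(hessian.det())                 # 4
print(jacobian.subs({x: 0, y: 0}))   # Matrix([[0, 0]]) — zero gradient at the origin
```

Note that for this function the Hessian is constant, so its determinant is 4 everywhere, not just at (0, 0).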

The power of the Hessian is, firstly, that if its determinant is positive,

we know we are dealing with either a maximum or a minimum.

Secondly, we then just look at the first term,

which is sitting at the top left-hand corner of the Hessian.

If this guy is also positive,

we know we've got a minimum, as in this particular case.

Whereas, if it's negative, we've got a maximum.
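The decision procedure just described can be sketched as a small function (the name and structure are my own, not from the lecture):

```python
def classify_critical_point(hessian):
    """Classify a point with zero gradient from its 2x2 Hessian [[fxx, fxy], [fyx, fyy]]."""
    (fxx, fxy), (fyx, fyy) = hessian
    det = fxx * fyy - fxy * fyx
    if det > 0:
        # Positive determinant: a maximum or a minimum; the top-left term decides which.
        return "minimum" if fxx > 0 else "maximum"
    if det < 0:
        return "saddle point"
    return "inconclusive"  # det == 0: this test tells us nothing

print(classify_critical_point([[2, 0], [0, 2]]))    # minimum  (f = x^2 + y^2)
print(classify_critical_point([[-2, 0], [0, -2]]))  # maximum  (f = -x^2 - y^2)
print(classify_critical_point([[2, 0], [0, -2]]))   # saddle point (f = x^2 - y^2)
```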

Lastly, slightly modifying our function to include a minus sign and

recalculating our Jacobian and our Hessian, and

our Hessian determinant, we now see the third interesting case.

This time, our Hessian determinant is negative.

So we know that we're not dealing with a maximum or a minimum.

But clearly at this point, (0,0), the gradient is flat.

So what's going on?

Well, if you look at the animation,

what we've got here is a location with zero gradient, but

with slopes coming down towards it in one direction and rising up towards it in the other.

We call this kind of feature a saddle point, and

saddle points can also cause a lot of confusion when searching for a peak.
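The lecture doesn't write out the modified function, but f(x, y) = x² − y² is the standard example with exactly this behaviour (its Hessian determinant is 2 × (−2) − 0 × 0 = −4), so as a hypothetical illustration:

```python
# Assumed saddle example: f(x, y) = x**2 - y**2 (not stated explicitly in the lecture).
f = lambda x, y: x**2 - y**2

# Near (0, 0) the surface rises along the x-axis but falls along the y-axis.
print([f(t, 0) for t in (-2, -1, 0, 1, 2)])  # [4, 1, 0, 1, 4]   — up in both x directions
print([f(0, t) for t in (-2, -1, 0, 1, 2)])  # [-4, -1, 0, -1, -4] — down in both y directions
```

The gradient at the origin is zero in every direction, yet the point is neither a peak nor a trough.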

In the last module of this course,

you're also going to see another way that the Hessian can help us with optimisation.

But for now, we've simply got another tool to help you navigate the sandpit.

See you next time.