The closed-form formulas for these models usually have this shape.

In this case, since the time series depends only

on the immediately previous time instance,

we call this the autoregressive model of order one, or AR(1) model.

The value of one here indicates that the current value depends

only on the immediately past observation.

Now, there are more extensive autoregressive models;

this is actually the simplest autoregressive model.

A more complex autoregressive model,

for example the AR(2) model,

says that the current observation is once again a combination of a random component.

It also depends on what happened immediately before the current observation,

but it also depends, to some degree, on what happened two time instances back.

So, to predict what is going to happen today,

I need to account for what happened yesterday,

what happened the day before, plus the random event that may happen.

So this is called the AR(2) model.

Again, it's an autoregressive model; however,

it is more complex than AR(1) because what is happening today

depends not only on yesterday but also on the day before.
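To make the difference concrete, here is a minimal sketch in Python that simulates both models. The function names, the coefficient values (0.5 and 0.3), and the use of Gaussian noise for the random component are all illustrative assumptions, not something specified in the lecture.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def simulate_ar1(alpha, n, sigma=1.0):
    """Simulate AR(1): X_t = alpha * X_{t-1} + E_t (E_t is Gaussian noise)."""
    x = [0.0]
    for _ in range(n - 1):
        x.append(alpha * x[-1] + random.gauss(0, sigma))
    return x

def simulate_ar2(alpha1, alpha2, n, sigma=1.0):
    """Simulate AR(2): X_t = alpha1 * X_{t-1} + alpha2 * X_{t-2} + E_t."""
    x = [0.0, 0.0]
    for _ in range(n - 2):
        x.append(alpha1 * x[-1] + alpha2 * x[-2] + random.gauss(0, sigma))
    return x

series = simulate_ar2(0.5, 0.3, 200)  # alpha values chosen for illustration
print(len(series))  # 200
```

The only structural difference between the two functions is how far back the loop looks: one past value for AR(1), two past values for AR(2).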

Note that when we say understanding a time series model,

what we mean is this: given a time series,

we need to be able to determine first whether it's an autoregressive model.

If it's an autoregressive model,

we need to understand whether it's an AR(1) model,

whether it's an AR(2) model or maybe it's an AR(3) model,

maybe it's an AR(5) model.

For each one of them, we also need to be able to

discover the corresponding alpha values.

How strongly does what happens today depend on yesterday?

How strongly does it depend on what happened the day before, and so on?

So that's really what we mean by discovering a model for a time series.
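As a small illustration of what "discovering the alpha values" can look like in practice, the sketch below estimates the alpha of an AR(1) series by least squares, regressing X_t on X_{t-1}. The true alpha of 0.7 and the simulation setup are assumptions made for this example; the lecture does not prescribe an estimation method.

```python
import random

random.seed(0)

# Simulate an AR(1) series with a known alpha (0.7, chosen for illustration).
true_alpha = 0.7
x = [0.0]
for _ in range(5000):
    x.append(true_alpha * x[-1] + random.gauss(0, 1))

# Least-squares estimate of alpha: regress X_t on X_{t-1} (no intercept).
num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
alpha_hat = num / den
print(alpha_hat)  # should land close to the true value of 0.7
```

With 5000 observations, the estimate recovers the true alpha quite closely; with short series the estimate is noisier, which previews why model discovery gets harder as models grow.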

Now, autoregressive models are not the only models that

are available beyond the random models.

A second type of model that goes

beyond the random model is called the moving average model.


In the case of the moving average model,

what is happening today once again depends on what is happening right now randomly,

the external input to the system.

It also depends on what happened yesterday,

but not on the actual observation from yesterday;

it depends on the random event that happened yesterday.

So, the difference.

Please, pay attention.

The difference here is that in the AR(1) model,

X_t depends on X_t minus one.

In the moving average model, however,

X_t depends on E_t minus one.

Remember, E is the external input.

So the moving average models essentially account for the contribution of

the random external events to the current observation.

Now, just as the AR models have different versions, AR(1),

AR(2), and so on, the moving average (MA) models also have MA(1), MA(2), and so on.

In the MA(2) model, what we say is that

the current observation depends on the current random event,

yesterday's random event, and the random event from the day before.

So these type of time series are called Moving Average Time Series Models.
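A short sketch can show the moving average structure directly: the series is built entirely from the current and past random events E, never from past observations X. The theta coefficient names and values below are illustrative assumptions.

```python
import random

random.seed(1)

def simulate_ma(thetas, n, sigma=1.0):
    """Simulate MA(q): X_t = E_t + theta_1*E_{t-1} + ... + theta_q*E_{t-q}."""
    q = len(thetas)
    # Generate q extra random events so every X_t has a full history of E's.
    e = [random.gauss(0, sigma) for _ in range(n + q)]
    return [e[t] + sum(th * e[t - k - 1] for k, th in enumerate(thetas))
            for t in range(q, n + q)]

ma1 = simulate_ma([0.6], 100)        # MA(1): depends on E_t and E_{t-1}
ma2 = simulate_ma([0.6, 0.3], 100)   # MA(2): depends on E_t, E_{t-1}, E_{t-2}
print(len(ma1), len(ma2))  # 100 100
```

Note that unlike the AR sketches, no previous value of the series itself ever appears on the right-hand side; only the random events do.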

Of course, in the real world you usually don't have just autoregressive models

or just moving average models.

In the real world you have a mixture of these.

Usually the time series shows some autoregressive behavior

and some moving average behavior.

Because of that, we usually talk about ARMA models, A-R-M-A models.

The ARMA models essentially show both autoregressive behavior

and moving average behavior,

on top of the current random behavior.

When we talk about ARMA models,

we usually give two parameters to the ARMA model, ARMA(A, M).

A is the number of autoregressive terms the model has.

M is the number of moving average terms the model has.

Obviously, the more terms we add,

the more precise our model may become,

but it also may become more difficult to discover,

more complex, and less easy to use.
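Combining the two behaviors gives the ARMA structure. Here is a minimal sketch of the simplest mixed case, ARMA(1,1), where today's value depends on yesterday's observation (the AR part) and on yesterday's random event (the MA part). The coefficient names and values are assumptions for illustration.

```python
import random

random.seed(7)

def simulate_arma11(alpha, theta, n, sigma=1.0):
    """Simulate ARMA(1,1): X_t = alpha*X_{t-1} + E_t + theta*E_{t-1}."""
    x = [0.0]
    e_prev = random.gauss(0, sigma)
    for _ in range(n - 1):
        e = random.gauss(0, sigma)
        x.append(alpha * x[-1] + e + theta * e_prev)  # AR term + MA term
        e_prev = e
    return x

series = simulate_arma11(0.5, 0.4, 300)  # illustrative alpha and theta
print(len(series))  # 300
```

A general ARMA(A, M) simulator would keep A past observations and M past random events, which is exactly why larger A and M values make the model harder to discover and use.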

Consequently, when we discover models, which we will discuss later,

we usually seek models that fit the data well,

that represent the data well,

but that are also less complex.

We usually don't want very large A values and very large M values.

Those types of models become too difficult to (a) discover accurately

and (b) properly use and interpret.

We will discuss that later;

we will discuss how we discover these ARMA models later.