# Autoregressive


## Definition of 'Autoregressive'

Autoregressive is a statistical term that describes a process that depends on its own past values. This can be contrasted with a moving average process, which depends on past values of the random error (shock) terms rather than on past values of the series itself.

Autoregressive processes are often used in time series analysis to model the behavior of a variable over time. For example, an autoregressive model might be used to predict the future value of a stock price based on its past values.

There are two main types of autoregressive processes:

* **First-order autoregressive (AR(1)):** This is the simplest type of autoregressive process, and it is defined by the following equation:

$$y_t = \alpha y_{t-1} + \epsilon_t$$

where $y_t$ is the value of the variable at time $t$, $\alpha$ is a coefficient (the process is stationary when $|\alpha| < 1$), and $\epsilon_t$ is a random error term.

* **Higher-order autoregressive (AR(p)):** This is a generalization of the first-order autoregressive process, and it is defined by the following equation:

$$y_t = \alpha_1 y_{t-1} + \alpha_2 y_{t-2} + \dots + \alpha_p y_{t-p} + \epsilon_t$$

where $p$ is the order of the autoregressive process and $\alpha_1, \dots, \alpha_p$ are the coefficients on the lagged values.
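The equations above can be sketched as a short simulation. The snippet below (plain Python, with a hypothetical `simulate_ar` helper; standard-normal errors are an assumption for illustration) generates an AR(p) series from a list of coefficients, then checks that an AR(1) series with $\alpha = 0.8$ shows the expected strong correlation between successive values:

```python
import random

def simulate_ar(coeffs, n, seed=0):
    """Simulate an AR(p) process y_t = a_1*y_{t-1} + ... + a_p*y_{t-p} + eps_t
    with standard-normal errors. (Hypothetical helper for illustration.)"""
    p = len(coeffs)
    rng = random.Random(seed)
    y = [0.0] * p  # initial values before the process "warms up"
    for _ in range(n):
        eps = rng.gauss(0.0, 1.0)  # random error term epsilon_t
        y.append(sum(a * y[-i - 1] for i, a in enumerate(coeffs)) + eps)
    return y[p:]  # drop the initial zeros

# AR(1) with alpha = 0.8: successive values are strongly correlated
series = simulate_ar([0.8], n=500)
mean = sum(series) / len(series)
lag1 = (sum((a - mean) * (b - mean) for a, b in zip(series, series[1:]))
        / sum((a - mean) ** 2 for a in series))
print(round(lag1, 2))  # lag-1 sample autocorrelation, close to alpha
```

Passing a longer coefficient list, e.g. `simulate_ar([0.5, 0.3], n=500)`, produces a higher-order AR(2) series in the same way.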

Autoregressive processes are often used in time series analysis because they can capture the dynamic relationship between a variable and its past values. This can be useful for forecasting future values of the variable.

However, autoregressive processes can also be difficult to estimate, and they can be sensitive to outliers. As a result, it is important to carefully consider the appropriateness of an autoregressive model before using it for forecasting.
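For the AR(1) case, one simple way to estimate $\alpha$ is ordinary least squares of the series on its own lag. The sketch below (plain Python; the `estimate_ar1` helper and the no-intercept setup are assumptions for illustration) simulates a series with a true $\alpha$ of 0.7 and checks that the estimate comes out close:

```python
import random

def estimate_ar1(y):
    """Least-squares estimate of alpha in y_t = alpha * y_{t-1} + eps_t
    (no intercept): alpha_hat = sum(y_t * y_{t-1}) / sum(y_{t-1}^2)."""
    num = sum(cur * prev for prev, cur in zip(y, y[1:]))
    den = sum(prev * prev for prev in y[:-1])
    return num / den

# Simulate y_t = 0.7 * y_{t-1} + eps_t with standard-normal errors
rng = random.Random(42)
y = [0.0]
for _ in range(2000):
    y.append(0.7 * y[-1] + rng.gauss(0.0, 1.0))

alpha_hat = estimate_ar1(y)
print(round(alpha_hat, 2))  # close to the true value 0.7
```

Because the estimate is a ratio of sums over the raw data, a single extreme outlier can pull it substantially, which illustrates the sensitivity mentioned above.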

