11.2: Vector Autoregressive Models, VAR(p)

VAR models (vector autoregressive models) are used for multivariate time series. The structure is that each variable is a linear function of past lags of itself and past lags of the other variables.

As an example suppose that we measure three different time series variables, denoted by \(x_{t,1}\), \(x_{t,2}\), and \(x_{t,3}\).

The vector autoregressive model of order 1, denoted as VAR(1), is as follows:

\[x_{t,1} = \alpha_{1} + \phi_{11} x_{t−1,1} + \phi_{12}x_{t−1,2} + \phi_{13}x_{t−1,3} + w_{t,1}\]

\[x_{t,2} = \alpha_{2} + \phi_{21} x_{t−1,1} + \phi_{22}x_{t−1,2} + \phi_{23}x_{t−1,3} + w_{t,2}\]

\[x_{t,3} = \alpha_{3} + \phi_{31} x_{t−1,1} + \phi_{32}x_{t−1,2} + \phi_{33}x_{t−1,3} + w_{t,3}\]

Each variable is a linear function of the lag 1 values for all variables in the set.
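
Equivalently, collecting the three series into a vector \(\mathbf{x}_t = (x_{t,1}, x_{t,2}, x_{t,3})'\), the VAR(1) can be written compactly in matrix form as

\[\mathbf{x}_t = \boldsymbol{\alpha} + \Phi \mathbf{x}_{t-1} + \mathbf{w}_t,\]

where \(\boldsymbol{\alpha}\) is the vector of constants, \(\Phi\) is the \(3 \times 3\) matrix with entries \(\phi_{ij}\), and \(\mathbf{w}_t\) is the vector of error terms.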

In a VAR(2) model, the lag 2 values for all variables are added to the right sides of the equations. In the case of three x-variables there would be six predictors on the right side of each equation: three lag 1 terms and three lag 2 terms.

In general, for a VAR(p) model, the first p lags of each variable in the system would be used as regression predictors for each variable.

VAR models are a special case of the more general VARMA models. VARMA models for multivariate time series include the VAR structure above along with moving average terms for each variable. More generally still, these are special cases of VARMAX models, which allow for the addition of other predictors that are outside the multivariate set of principal interest.

Here, as in Section 5.8 of the text, we’ll focus on VAR models.

On page 304, the authors fit the model of the form

\[\mathbf{x}_t = \Gamma \mathbf{u}_t + \Phi \mathbf{x}_{t-1} + \mathbf{w}_t\]

where \(\mathbf{u}_t = (1, t)'\) includes terms to simultaneously fit the constant and trend. The model arose from macroeconomic data where large changes in the data permanently affect the level of the series.

There is a not-so-subtle difference here from previous lessons in that we are now fitting a model to data that need not be stationary. In previous versions of the text, the authors separately de-trended each series using a linear regression with t, the index of time, as the predictor variable. The de-trended values for each of the three series are the residuals from this linear regression on t. De-trending is useful conceptually because it removes the common steering force that time may have on each series and creates stationarity, as we have seen in past lessons. That approach results in similar coefficients, though slightly different ones, because we are now simultaneously fitting the intercept and trend in a multivariate OLS model.
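
For comparison, here is a minimal sketch of that older detrend-first approach (it assumes the astsa series cmort, tempr, and part have been loaded, as in the code below):

trend = time(cmort)                  #common weekly time index
dmort = residuals(lm(cmort ~ trend)) #de-trended mortality
dtemp = residuals(lm(tempr ~ trend)) #de-trended temperature
dpart = residuals(lm(part ~ trend))  #de-trended pollution

Here each series is regressed on the time index and only the residuals are kept; VAR(..., type = "both") below instead fits the trend simultaneously with the AR coefficients.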

The R vars package, authored by Bernhard Pfaff, has the capability to fit this model with trend. Let's look at two examples: a difference-stationary model and a trend-stationary model.

Difference-Stationary Model

Example 5.10 from the text is a difference-stationary model in that first differences are stationary. Let’s examine the code and example from the text by fitting the model above:

install.packages("vars") #If not already installed
install.packages("astsa") #If not already installed
library(vars)
library(astsa)
x = cbind(cmort, tempr, part)
plot.ts(x, main = "", xlab = "")
summary(VAR(x, p=1, type="both"))

  • The install.packages commands install (if necessary) the vars package and our text's astsa package; the two library commands then load the modeling functions and the data.
  • The cbind command binds the three series into a matrix of response variables (a necessary step for multivariate responses).
  • The VAR command estimates the model by ordinary least squares, simultaneously fitting the intercept, trend, and AR coefficients. The p = 1 argument requests an AR(1) structure, and type = "both" fits both constant and trend. With the vector of responses, it's actually a VAR(1).

Following is the output from the VAR command for the variable tempr (the text provides the output for cmort):

[R output: VAR(1) estimation results for tempr]

The coefficients for a variable are listed in the Estimate column. The .l1 attached to each variable name indicates that they are lag 1 variables.

Using the notation T = temperature, t = time (the data are collected weekly), M = mortality rate, and P = pollution, the equation for temperature is

\[\hat{T}_t = 67.586 - 0.007 t - 0.244 M_{t-1} + 0.487 T_{t-1} + 0.128 P_{t-1}\]

The equation for mortality rate is

\[\hat{M}_t = 73.227 - 0.014 t + 0.465 M_{t-1} - 0.361 T_{t-1} + 0.099 P_{t-1}\]

The equation for pollution is

\[\hat{P}_t = 67.464 - 0.005 t - 0.125 M_{t-1} - 0.477 T_{t-1} + 0.581 P_{t-1}.\]

The covariance matrix of the residuals from the VAR(1) for the three variables is printed below the estimation results. The variances are down the diagonal and could possibly be used to compare this model to higher order VARs. The determinant of that matrix is used in the calculation of the BIC statistic that can be used to compare the fit of the model to the fit of other models (see formulas 5.89 and 5.90 of the text).
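
For instance, the residual covariance matrix and its determinant can be pulled out in R once the fit is saved to an object (the name fit1 here is just illustrative; note that cov divides by n - 1, so it matches the printed matrix only up to a degrees-of-freedom adjustment):

fit1 = VAR(x, p=1, type="both") #save the fit to an object
Sigma = cov(residuals(fit1))    #residual covariance matrix
det(Sigma)                      #determinant used in the BIC formula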

For further references on this technique, see Analysis of Integrated and Cointegrated Time Series with R by Pfaff, and also Campbell and Perron (1991).

In Example 5.11 on page 307, the authors give results for a VAR(2) model for the mortality rate data. In R, you may fit the VAR(2) model with the command

summary(VAR(x, p=2, type="both"))

The output, as displayed by the VAR command is as follows:

[R output: VAR(2) estimation results]

Again, the coefficients for a particular variable are listed in the Estimate column. As an example, the estimated equation for temperature is

\[\hat{T}_t = 49.88 - 0.005 t - 0.109 M_{t-1} + 0.261 T_{t-1} - 0.505 P_{t-1} - 0.041 M_{t-2} + 0.356 T_{t-2} - 0.095 P_{t-2}\]

We will discuss information criterion statistics to compare VAR models of different orders in the homework.
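
For reference, the vars package provides the VARselect command, which computes several information criteria (AIC, HQ, SC/BIC, FPE) over a range of candidate orders; the lag.max value below is just an illustrative choice:

VARselect(x, lag.max = 10, type = "both") #criteria for p = 1, ..., 10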

Residuals are also available for analysis. For example, if we assign the VAR command to an object titled fitvar2 in our program,

fitvar2 = VAR(x, p=2, type="both")

then we have access to the matrix residuals(fitvar2). This matrix will have three columns, one column of residuals for each variable.

For example, we might use

acf(residuals(fitvar2)[,1]) #column 1 = cmort, the mortality rate

to see the ACF of the residuals for mortality rate after fitting the VAR(2) model.

Following is the ACF that resulted from the command just described. It looks good for a residual ACF. (The big spike at the beginning is the unimportant lag 0 correlation.)

[R plot: ACF of the VAR(2) residuals for mortality rate]

The following two commands will create ACFs for the residuals for the other two variables.

acf(residuals(fitvar2)[,2]) #column 2 = tempr

acf(residuals(fitvar2)[,3]) #column 3 = part

They also resemble white noise.

We may also examine these plots in the cross-correlation matrix provided by acf(residuals(fitvar2)):

[R plot: cross-correlation matrix of the VAR(2) residuals]

The plots along the diagonal are the individual ACFs for each model's residuals, which we just discussed above. In addition, we now see the cross-correlation plots of each pair of residual series. Ideally, these would also resemble white noise; however, we do see remaining correlation, especially between temperature and pollution. As our authors note, this model does not adequately capture the complete association between these variables in time.
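
A formal version of this visual check is a multivariate Portmanteau test on the residuals, available through the serial.test command in the vars package (the lags.pt value below is an illustrative choice):

serial.test(fitvar2, lags.pt = 12, type = "PT.adjusted") #H0: residuals are white noise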

Trend-Stationary Model

Let's explore an example where the original data are stationary, and examine the VAR code by fitting the model above with both a constant and trend. Using R, we simulated n = 500 sample values using the VAR(2) model

\[y_{t,1} = 10 + 0.25 y_{t-1,1} - 0.20 y_{t-1,2} - 0.40 y_{t-2,1} - 0.65 y_{t-2,2} + w_{t,1}\]

\[y_{t,2} = 20 + 0.60 y_{t-1,1} - 0.45 y_{t-1,2} + 0.50 y_{t-2,1} + 0.35 y_{t-2,2} + w_{t,2}\]
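
The simulated series are read from data files below. If those files are not at hand, a simulation along the following lines generates comparable data (a sketch: the lesson does not specify the seed or the error variance, so standard normal errors are assumed):

set.seed(1)  #arbitrary seed, for reproducibility only
n = 500
burn = 100   #burn-in values, discarded below
y = matrix(0, n + burn, 2)
for (t in 3:(n + burn)) {
  y[t,1] = 10 + 0.25*y[t-1,1] - 0.20*y[t-1,2] - 0.40*y[t-2,1] - 0.65*y[t-2,2] + rnorm(1)
  y[t,2] = 20 + 0.60*y[t-1,1] - 0.45*y[t-1,2] + 0.50*y[t-2,1] + 0.35*y[t-2,2] + rnorm(1)
}
y1 = y[-(1:burn), 1]
y2 = y[-(1:burn), 2]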

Using the VAR command explained above:

y1=scan("var2daty1.dat")
y2=scan("var2daty2.dat")
summary(VAR(cbind(y1,y2), p=2, type="both"))

We obtain the following output:

[R output: VAR(2) estimation results for y1 and y2]

The estimates are very close to the simulated coefficients and the trend is not significant, as expected. For stationary data, when detrending is unnecessary, you may also use the ar.ols command to fit a VAR model:

fitvar2 = ar.ols(cbind(y1, y2), order=2) #save the fit; its components are used below
fitvar2

[R output: ar.ols coefficient matrices, intercepts, and residual covariance]

In the first matrix given ($ar), read across a row to get the coefficients for a variable. The ', , 1' and ', , 2' headers indicate the lag 1 and lag 2 coefficient matrices, respectively. The intercepts of the equations are given under $x.intercept, one intercept per variable.

The matrix under $var.pred gives the variance-covariance matrix of the residuals from the VAR(2) for the two variables. The variances are down the diagonal and could possibly be used to compare this model to higher order VARs as noted above. 

The standard errors of the AR coefficients are given by the command fitvar2$asy.se.coef. The output is

[R output: asymptotic standard errors of the AR coefficients]

As with the coefficients, read across rows. The first row gives the standard errors of the coefficients for the lag 1 variables that predict y1. The second row gives the standard errors for the coefficients that predict y2.

You may note that the coefficients are close to those from the VAR command, except for the intercept. This is because ar.ols estimates the model for the mean-centered series x - mean(x). To match the intercept provided by the summary(VAR(cbind(y1,y2), p=2, type="const")) command, you must calculate the intercept as follows:

\[(y_{t,1} - \hat{\mu}_1) = \alpha_1 + \phi_{11} (y_{t-1,1} - \hat{\mu}_1) + \phi_{12} (y_{t-1,2} - \hat{\mu}_2) + \phi_{21}(y_{t-2,1} - \hat{\mu}_1) + \phi_{22}(y_{t-2,2} - \hat{\mu}_2) + w_{t,1}\]

\[y_{t,1} = \alpha_1 + \hat{\mu}_{1} (1 - \phi_{11} - \phi_{21}) - \hat{\mu}_{2} (\phi_{12} + \phi_{22}) + \phi_{11}y_{t-1,1} + \phi_{12} y_{t-1,2} + \phi_{21}y_{t-2,1} + \phi_{22}y_{t-2,2} + w_{t,1}\]

In our example, the intercept of the fitted equation for \(y_{t,1}\) equals

-0.043637 - 2.733607*(1 - 0.2930 + 0.4523) - 15.45479*(-0.1913 - 0.6365) = 9.580768,

and the estimated equation for \(y_{t,1}\) is

\[\hat{y}_{t,1} = 9.58 + 0.29 y_{t-1,1} - 0.19 y_{t-1,2} - 0.45 y_{t-2,1} - 0.64 y_{t-2,2}\]
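
The same back-calculation can be done in R for both equations at once. The components x.mean, x.intercept, and ar are all returned by ar.ols (see ?ar.ols):

mu   = fitvar2$x.mean  #sample means of y1 and y2
Phi1 = fitvar2$ar[1,,] #lag 1 coefficient matrix
Phi2 = fitvar2$ar[2,,] #lag 2 coefficient matrix
fitvar2$x.intercept + (diag(2) - Phi1 - Phi2) %*% mu #implied intercepts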

Estimation with Minitab

For Minitab users, here’s the general flow of what to do.

  • Read the data into columns.
  • Use Time Series > Lag to create the necessary lagged columns of the stationary values.
  • Use Stat > ANOVA > General MANOVA.
  • Enter the list of “present time” variables as the response variables.
  • Enter the lagged x variables as covariates (and as the model).
  • Click Results and select “Univariate Analysis” (to see the estimated regression coefficients for each equation).
  • If desired, click “Storage” and select Residuals and/or Fits.
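
For R users, a rough analogue of this Minitab flow is a multivariate regression with lm. This is a sketch that fits a VAR(2) to the mortality data without the trend term; the names present, lag1, and lag2 are ours:

n = nrow(x)
present = x[3:n, ]    #"present time" responses
lag1 = x[2:(n-1), ]   #lag 1 predictors
lag2 = x[1:(n-2), ]   #lag 2 predictors
summary(lm(present ~ lag1 + lag2)) #one set of coefficients per equation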