Structural Vector Autoregression¶
Definition¶
A \(\smash{p}\)th order structural vector autoregression has the form:
\[\smash{B_0\boldsymbol{Y}_{t} = \boldsymbol{k} +
B_{1}\boldsymbol{Y}_{t-1} + B_{2}\boldsymbol{Y}_{t-2} +
\ldots + B_{p}\boldsymbol{Y}_{t-p} +
\boldsymbol{u}_{t}}.\]
- \(\smash{Y_{t} = (Y_{1t},Y_{2t},\ldots,Y_{nt})^{'}}\) is an \(\smash{n \times 1}\) vector of random variables.
- \(\smash{\boldsymbol{k} = (k_1,k_2,\ldots,k_n)^{'}}\) is an \(\smash{n \times 1}\) vector of constants.
- \(\smash{B_{j}}\) is an \(\smash{n \times n}\) matrix of autoregressive coefficients for \(\smash{j = 0,\ldots,p}\).
- \(\smash{\boldsymbol{u}_{t} = (u_{1t},\ldots, u_{nt})^{'} }\) is a vector white noise process:
\[\begin{split}\smash{E[\boldsymbol{u}_{t}] = \boldsymbol{0} \,\,\, \text{and}
\,\,\,
E[\boldsymbol{u}_{t}\boldsymbol{u}^{'}_{\tau}] =
\bigg\{\begin{array}{c} D \hspace{10pt} t = \tau \\ 0
\hspace{10pt} \text{o/w} \\ \end{array}}.\end{split}\]
- \(\smash{\boldsymbol{Y}_{t}}\) is referred to as a \(\smash{SVAR(p)}\).
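The definition above can be sketched with a short simulation. This is a minimal illustration with hypothetical parameter values (the matrices `B0`, `B1`, the constant `k`, and the diagonal shock covariance `D` are all made up for the example), not part of the notes:

```python
import numpy as np

# Simulate a bivariate SVAR(1):  B0 Y_t = k + B1 Y_{t-1} + u_t,
# with u_t ~ N(0, D) and D diagonal.  All parameter values are hypothetical.
rng = np.random.default_rng(0)

B0 = np.array([[1.0, -0.5],
               [0.0,  1.0]])   # contemporaneous relationships
B1 = np.array([[0.4,  0.1],
               [0.2,  0.3]])   # first-lag coefficients
k  = np.array([0.1, -0.2])     # constants
D  = np.diag([1.0, 0.5])       # diagonal covariance of structural shocks

T = 500
Y = np.zeros((T, 2))
B0_inv = np.linalg.inv(B0)
for t in range(1, T):
    u = rng.multivariate_normal(np.zeros(2), D)
    # Solve B0 Y_t = k + B1 Y_{t-1} + u_t for Y_t
    Y[t] = B0_inv @ (k + B1 @ Y[t - 1] + u)

print(Y.shape)  # (500, 2)
```

Note that each period requires solving for \(\smash{\boldsymbol{Y}_{t}}\) through \(\smash{B_0^{-1}}\), which is exactly the reduced-form step discussed below.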
Structural Models¶
\(\smash{SVAR}\)s allow for contemporaneous relationships among the variables in \(\smash{\boldsymbol{Y}_{t}}\).
- Not only is each element \(\smash{Y_{it}}\) related to \(\smash{Y_{1\tau},Y_{2\tau},\ldots,Y_{n\tau}}\) for \(\smash{\tau = t-1, \ldots,t-p}\), but it is also related to \(\smash{Y_{1t},Y_{2t},\ldots,Y_{nt}}\).
- Such relationships allow for the expression of theoretical (structural) economic model relationships.
Example¶
Consider the following bivariate model for asset returns, \(\smash{r_t}\), and order flow, \(\smash{x_t}\):
\[\begin{split}\begin{align*}
r_t & = \alpha_{x0} x_t + \alpha_{r1} r_{t-1} + \alpha_{x1} x_{t-1} +
\varepsilon_{rt} \\
x_t & = \beta_{r1} r_{t-1} + \beta_{x1} x_{t-1} + \varepsilon_{xt}.
\end{align*}\end{split}\]
- \(\smash{r_t}\) is typically measured as the difference in log prices.
- \(\smash{x_t}\) is signed trade volume: total number of shares traded at the offer (demand), less the total number of shares traded on the bid (supply).
- Notice that \(\smash{r_t}\) depends on \(\smash{x_t}\) and lags of both variables, whereas \(\smash{x_t}\) only depends on lags.
Example¶
The model can be expressed as
\[\begin{split}\begin{align}
\left[\begin{array}{cc} 1 & -\alpha_{x0} \\ 0 & 1 \end{array}
\right] \left[\begin{array}{c} r_t \\ x_t \end{array} \right] & =
\left[\begin{array}{cc} \alpha_{r1} & \alpha_{x1} \\ \beta_{r1} &
\beta_{x1} \end{array} \right] \left[\begin{array}{c} r_{t-1} \\
x_{t-1} \end{array} \right] + \left[\begin{array}{c} \varepsilon_{rt} \\
\varepsilon_{xt} \end{array} \right].
\end{align}\end{split}\]
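A quick numerical check confirms that the matrix form reproduces the two equations. The coefficient values below are hypothetical, chosen only for illustration:

```python
import numpy as np

# Verify the matrix form of the returns/order-flow model against the
# equation-by-equation form, using hypothetical coefficient values.
a_x0, a_r1, a_x1 = 0.5, 0.3, 0.2   # alpha coefficients
b_r1, b_x1 = 0.1, 0.4              # beta coefficients

B0 = np.array([[1.0, -a_x0],
               [0.0,  1.0]])
B1 = np.array([[a_r1, a_x1],
               [b_r1, b_x1]])

rng = np.random.default_rng(1)
r_lag, x_lag = rng.normal(size=2)
eps = rng.normal(size=2)           # (eps_rt, eps_xt)

# Equation-by-equation (x_t first, since r_t depends on it):
x_t = b_r1 * r_lag + b_x1 * x_lag + eps[1]
r_t = a_x0 * x_t + a_r1 * r_lag + a_x1 * x_lag + eps[0]

# Matrix form: B0 [r_t, x_t]' = B1 [r_{t-1}, x_{t-1}]' + eps_t
lhs = B0 @ np.array([r_t, x_t])
rhs = B1 @ np.array([r_lag, x_lag]) + eps
print(np.allclose(lhs, rhs))  # True
```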
\(\smash{SVAR}\) as \(\smash{VAR}\)¶
Under the assumption that \(\smash{B_0}\) is invertible, a \(\smash{SVAR}\) can be expressed as a traditional \(\smash{VAR}\):
\[\begin{split}\begin{align}
\boldsymbol{Y}_{t} & = B_0^{-1} \boldsymbol{k} +
B_0^{-1} B_{1}\boldsymbol{Y}_{t-1} + B_0^{-1}
B_{2}\boldsymbol{Y}_{t-2} + \ldots + B_0^{-1}
B_{p}\boldsymbol{Y}_{t-p} + B_0^{-1} \boldsymbol{u}_{t} \\
& = \boldsymbol{c} +
\Phi_{1}\boldsymbol{Y}_{t-1} + \Phi_{2}\boldsymbol{Y}_{t-2} +
\ldots + \Phi_{p}\boldsymbol{Y}_{t-p} +
\boldsymbol{\varepsilon}_{t}.
\end{align}\end{split}\]
- This is the reduced form representation of the \(\smash{SVAR}\).
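The mapping from structural to reduced form can be computed directly. The sketch below uses hypothetical parameter values for the bivariate example; note that the reduced-form error covariance \(\smash{\Omega = B_0^{-1} D B_0^{-1'}}\) is generally not diagonal even when \(\smash{D}\) is:

```python
import numpy as np

# Recover reduced-form VAR coefficients from hypothetical structural ones:
#   c = B0^{-1} k,  Phi_i = B0^{-1} B_i,  eps_t = B0^{-1} u_t.
B0 = np.array([[1.0, -0.5],
               [0.0,  1.0]])
B1 = np.array([[0.3, 0.2],
               [0.1, 0.4]])
k  = np.array([0.1, -0.2])
D  = np.diag([1.0, 0.5])       # diagonal structural covariance

B0_inv = np.linalg.inv(B0)
c    = B0_inv @ k              # reduced-form constant
Phi1 = B0_inv @ B1             # reduced-form lag coefficients

# Covariance of the reduced-form errors: not diagonal in general.
Omega = B0_inv @ D @ B0_inv.T
print(Omega)
```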
\(\smash{VAR}\) as \(\smash{SVAR}\)¶
Recall that a \(\smash{VAR(p)}\) can be expressed in terms of orthogonalized errors via its \(\smash{VMA(\infty)}\) representation:
\[\begin{split}\begin{align*}
\boldsymbol{Y}_{t} & = \boldsymbol{\mu} +
H^{-1}H\boldsymbol{\varepsilon}_{t} +
\Psi_{1}(H^{-1}H)\boldsymbol{\varepsilon}_{t-1} + \ldots \\
& = \boldsymbol{\mu} + J_{0}\boldsymbol{u}_{t} +
J_{1}\boldsymbol{u}_{t-1} + J_{2}\boldsymbol{u}_{t-2} + \ldots
\end{align*}\end{split}\]
- \(\smash{H}\) is the matrix such that \(\smash{\,\,\,\,H \Omega H^{'} = D}\), where \(\smash{\text{E}[\varepsilon_t \varepsilon_t^{\prime}] = \Omega}\).
Consider the system
\[\smash{H \boldsymbol{Y}_{t} = H \boldsymbol{c} +
H \Phi_{1}\boldsymbol{Y}_{t-1} + H
\Phi_{2}\boldsymbol{Y}_{t-2} + \ldots + H
\Phi_{p}\boldsymbol{Y}_{t-p} + H
\boldsymbol{\varepsilon}_{t}}.\]
- This is a \(\smash{SVAR}\) with \(\smash{B_0 = H}\), \(\smash{B_i = H \Phi_i}\), and orthogonal errors \(\smash{\boldsymbol{u}_t = H \boldsymbol{\varepsilon}_t}\).
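One standard way to obtain such an \(\smash{H}\) is from the Cholesky factorization of \(\smash{\Omega}\): writing \(\smash{\Omega = LL'}\) and \(\smash{L = AS}\) with \(\smash{A}\) unit lower triangular and \(\smash{S}\) diagonal gives \(\smash{\Omega = ADA'}\) with \(\smash{D = S^2}\), so \(\smash{H = A^{-1}}\) satisfies \(\smash{H\Omega H' = D}\). A sketch with a hypothetical \(\smash{\Omega}\):

```python
import numpy as np

# Build H with H Omega H' = D (diagonal) from the Cholesky factor of a
# hypothetical reduced-form covariance Omega.
Omega = np.array([[1.0, 0.4],
                  [0.4, 2.0]])

L = np.linalg.cholesky(Omega)        # Omega = L L'
S = np.diag(np.diag(L))              # diagonal scale of L
A = L @ np.linalg.inv(S)             # unit lower triangular: Omega = A D A'
D = S @ S                            # diagonal matrix of variances
H = np.linalg.inv(A)                 # unit lower triangular as well

print(np.allclose(H @ Omega @ H.T, D))  # True
```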
Identification¶
A \(\smash{SVAR}\) is identified if the number of parameters in \(\smash{B_0}\) and \(\smash{D}\) is equal to the number of parameters in \(\smash{\Omega}\).
- Since \(\smash{\Omega}\) is symmetric (\(\smash{n(n+1)/2}\) parameters) and \(\smash{D}\) is diagonal (\(\smash{n}\) parameters), \(\smash{B_0}\) can have at most \(\smash{n(n-1)/2}\) free parameters.
- For example, this condition is satisfied if \(\smash{B_0}\) is lower triangular (with ones on the diagonal).
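The parameter counting above can be checked for any \(\smash{n}\). A small sketch of the count, with \(\smash{B_0}\) taken to be unit lower triangular as in the example:

```python
# Order-condition parameter count for SVAR identification:
# symmetric Omega vs. diagonal D plus unit-lower-triangular B0.
def counts(n):
    omega = n * (n + 1) // 2          # free parameters in symmetric Omega
    d     = n                         # diagonal D
    b0    = n * (n - 1) // 2          # unit-lower-triangular B0
    return omega, d + b0

for n in (2, 3, 5):
    omega, structural = counts(n)
    print(n, omega, structural)       # the two counts match for every n
```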