Stationarity¶
Introduction¶
Time series analysis is concerned with dynamics.
- We may have complete knowledge of the unconditional distribution of a group of random variables but no understanding of their sequential dynamics.
- Time series analysis focuses on understanding the sequential relationships among a group of random variables.
- Hence, the focus is on conditional distributions and autocovariances.
Time Series¶
A time series is a stochastic process indexed by time:
- Stochastic is a synonym for random.
- So a time series is a sequence of (potentially different) random variables ordered by time.
- We will let lower-case letters denote a realization of a time series.
Distributions¶
We will think of \({\bf Y}_T = \{Y_t\}_{t=1}^T\) as a random variable in its own right.
- \({\bf y}_T = \{y_t\}_{t=1}^T\) is a single realization of \({\bf Y}_T = \{Y_t\}_{t=1}^T\).
- The CDF is \(F_{{\bf Y}_T}({\bf y}_T)\) and the PDF is \(f_{{\bf Y}_T}({\bf y}_T)\).
- For example, consider \(\smash{T = 100}\):
\[ F_{{\bf Y}_{100}}({\bf y}_{100}) = P(Y_1 \leq y_1, Y_2 \leq y_2, \ldots, Y_{100} \leq y_{100}). \]
- Notice that \({\bf Y}_T\) is just a collection of random variables and \(f_{{\bf Y}_T}({\bf y}_T)\) is the joint density.
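Since \(f_{{\bf Y}_T}({\bf y}_T)\) is an ordinary joint density, it can always be factored into a sequence of conditional densities; this standard identity (not specific to these notes) is one way to see why conditional distributions are central to time series:
\[ f_{{\bf Y}_T}({\bf y}_T) = f_{Y_1}(y_1) \prod_{t=2}^{T} f_{Y_t | Y_{t-1}, \ldots, Y_1}(y_t | y_{t-1}, \ldots, y_1). \]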
Time Series Observations¶
As statisticians and econometricians, we want many observations of \({\bf Y}_T\) to learn about its distribution:
\[ \left\{{\bf y}_T^{(i)}\right\}_{i=1}^N = \left\{{\bf y}_T^{(1)}, {\bf y}_T^{(2)}, \ldots, {\bf y}_T^{(N)}\right\}. \]
Likewise, if we are only interested in the marginal distribution of \(\smash{Y_{17}}\)
we want many observations: \(\smash{\left\{y_{17}^{(i)}\right\}_{i=1}^N}\).
Unfortunately, we usually only have one observation of \({\bf Y}_T\).
- Think of the daily closing price of Harley-Davidson stock since January 2nd.
- Think of your cardiogram for the past 100 seconds.
In neither case can you repeat history to observe a new sequence of prices or electric heart signals.
- In time series econometrics we typically base inference on a single observation.
- Additional assumptions about the process will allow us to exploit information in the full sequence \({\bf y}_T\) to make inferences about the joint distribution \(F_{{\bf Y}_T}({\bf y}_T)\).
Moments¶
Since the stochastic process is composed of individual random variables, we can consider the moments of each:
\[ \mu_t = E[Y_t] = \int_{-\infty}^{\infty} y_t \, f_{Y_t}(y_t) \, dy_t \]
\[ \gamma_{0t} = E[(Y_t - \mu_t)^2] = \int_{-\infty}^{\infty} (y_t - \mu_t)^2 \, f_{Y_t}(y_t) \, dy_t \]
\[ \gamma_{jt} = E[(Y_t - \mu_t)(Y_{t-j} - \mu_{t-j})] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (y_t - \mu_t)(y_{t-j} - \mu_{t-j}) \, f_{Y_t, Y_{t-j}}(y_t, y_{t-j}) \, dy_t \, dy_{t-j}, \]
where \(\smash{f_{Y_t}}\) and \(\smash{f_{Y_t,Y_{t-j}}}\) are the marginal distributions of \(f_{{\bf Y}_T}\), obtained by integrating over the appropriate elements of \({\bf Y}_T\).
Autocovariance and Autocorrelation¶
- \(\smash{\gamma_{jt}}\) is known as the \(\smash{j}\) th autocovariance of \(\smash{Y_t}\) since it is the covariance of \(\smash{Y_t}\) with its own lagged value.
- The \(\smash{j}\) th autocorrelation of \(\smash{Y_t}\) is defined as
\[ \rho_{jt} = \frac{\gamma_{jt}}{\sqrt{\gamma_{0t}\,\gamma_{0,t-j}}}. \]
Sample Moments¶
If we had \(N\) observations \({\bf y}_T^{(1)},\ldots,{\bf y}_T^{(N)}\), we could estimate the moments of each (univariate) \(\smash{Y_t}\) in the usual way:
\[ \hat{\mu}_t = \frac{1}{N} \sum_{i=1}^N y_t^{(i)} \]
\[ \hat{\gamma}_{0t} = \frac{1}{N} \sum_{i=1}^N \left(y_t^{(i)} - \hat{\mu}_t\right)^2 \]
\[ \hat{\gamma}_{jt} = \frac{1}{N} \sum_{i=1}^N \left(y_t^{(i)} - \hat{\mu}_t\right)\left(y_{t-j}^{(i)} - \hat{\mu}_{t-j}\right). \]
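Although we rarely observe \(N\) independent draws of \({\bf Y}_T\) in practice, it is easy to simulate them. Below is a minimal sketch of these ensemble estimators, assuming a Gaussian white noise process with drift as in the example that follows; the parameter values and seed are illustrative, not from the notes:

```python
import numpy as np

rng = np.random.default_rng(0)

N, T = 10_000, 100           # number of realizations and series length (assumed)
mu, sigma = 2.0, 1.5         # illustrative drift and standard deviation

# Each row is one realization y_T^(i) of Gaussian white noise with drift
y = mu + sigma * rng.standard_normal((N, T))

# Estimate the moments of the marginal Y_17 by averaging across realizations
col = y[:, 16]                              # y_17^(i), i = 1, ..., N (0-indexed)
mu_hat = col.mean()                         # sample mean, approx. mu = 2.0
gamma0_hat = col.var()                      # sample variance, approx. sigma**2 = 2.25
gamma1_hat = np.mean((col - mu_hat) * (y[:, 15] - y[:, 15].mean()))  # approx. 0

print(mu_hat, gamma0_hat, gamma1_hat)
```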
Example¶
Suppose each element of \({\bf Y}_T\) is described by
\[ Y_t = \mu_t + \varepsilon_t, \qquad \varepsilon_t \sim \mathcal{N}(0, \sigma^2_t), \;\; \varepsilon_t \text{ independent across } t. \]
In this case,
\[ E[Y_t] = \mu_t, \qquad \gamma_{0t} = \sigma^2_t, \qquad \gamma_{jt} = 0 \;\; \forall \, j \neq 0. \]
- If \(\smash{\sigma^2_t = \sigma^2}\) \(\smash{\forall t}\), \({\bf \varepsilon}_T\) is known as a Gaussian white noise process.
- In this case, \({\bf Y}_T\) is a Gaussian white noise process with drift.
- \({\bf \mu}_T\) is the drift vector.
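Stacking the example in vector form (an equivalent restatement of the assumptions above for the case \(\smash{\sigma^2_t = \sigma^2}\)), the full vector is jointly Gaussian:
\[ {\bf Y}_T = {\bf \mu}_T + {\bf \varepsilon}_T \sim \mathcal{N}({\bf \mu}_T, \, \sigma^2 I_T), \]
where \(I_T\) is the \(T \times T\) identity matrix, since the \(\smash{\varepsilon_t}\) are independent with common variance \(\smash{\sigma^2}\).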
White Noise¶
Generally speaking, \({\bf \varepsilon}_T\) is a white noise process if
\[ E[\varepsilon_t] = 0, \qquad E[\varepsilon_t^2] = \sigma^2, \qquad E[\varepsilon_t \varepsilon_{\tau}] = 0 \;\; \text{for } t \neq \tau. \]
Notice there is no distributional assumption for \(\varepsilon_t\).
- If \(\smash{\varepsilon_t}\) and \(\smash{\varepsilon_{\tau}}\) are independent for \(\smash{t \neq \tau}\), \({\bf \varepsilon}_T\) is independent white noise.
- Notice that independence \(\smash{\Rightarrow E[\varepsilon_t \varepsilon_{\tau}] = 0}\), but \(E[\varepsilon_t \varepsilon_{\tau}] = 0 \not \Rightarrow\) independence; see the example below.
- If \(\smash{\varepsilon_t \sim \mathcal{N}(0, \sigma^2)}\) \(\forall t\), as in the example above, \({\bf \varepsilon}_T\) is Gaussian white noise.
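A standard counterexample (not part of the example above) makes the gap between uncorrelatedness and independence concrete: let \(\smash{\{z_t\}}\) be i.i.d. \(\smash{\mathcal{N}(0,1)}\) and define \(\smash{\varepsilon_t = z_t z_{t-1}}\). Then
\[ E[\varepsilon_t] = 0, \qquad E[\varepsilon_t^2] = E[z_t^2]\,E[z_{t-1}^2] = 1, \qquad E[\varepsilon_t \varepsilon_{\tau}] = 0 \;\; \text{for } t \neq \tau, \]
so \(\smash{\{\varepsilon_t\}}\) is white noise. But \(\smash{E[\varepsilon_t^2 \varepsilon_{t-1}^2] = E[z_t^2]\,E[z_{t-1}^4]\,E[z_{t-2}^2] = 3 \neq 1 = E[\varepsilon_t^2]\,E[\varepsilon_{t-1}^2]}\), so \(\smash{\varepsilon_t}\) and \(\smash{\varepsilon_{t-1}}\) are not independent.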
Weak Stationarity¶
Suppose the first and second moments of a stochastic process \({\bf Y}_{T}\) don't depend on \(\smash{t}\):
\[ E[Y_t] = \mu \;\; \forall \, t, \qquad E[(Y_t - \mu)(Y_{t-j} - \mu)] = \gamma_j \;\; \forall \, t, j. \]
- In this case \({\bf Y}_{T}\) is weakly stationary or covariance stationary.
- In the previous example, if \(\smash{Y_t = \mu + \varepsilon_t}\) and \(\smash{\sigma^2_t = \sigma^2}\) \(\smash{\forall t}\), \({\bf Y}_{T}\) is weakly stationary.
- However, if \(\smash{\mu_t}\) varies with \(\smash{t}\), \({\bf Y}_{T}\) is not weakly stationary.
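A quick simulation makes the contrast visible. The sketch below (parameter values and the linear trend are illustrative assumptions, not from the notes) compares subsample means for a constant-mean process and one whose mean drifts with \(t\):

```python
import numpy as np

rng = np.random.default_rng(1)
T, mu, sigma = 1_000, 2.0, 1.0                # illustrative values

eps = sigma * rng.standard_normal(T)          # Gaussian white noise
y_const = mu + eps                            # Y_t = mu + eps_t: weakly stationary
y_drift = mu + 0.01 * np.arange(T) + eps      # mu_t = mu + 0.01 t: not stationary

# Subsample means are stable only when E[Y_t] does not depend on t
for label, y in [("constant mean", y_const), ("time-varying mean", y_drift)]:
    print(label, round(y[: T // 2].mean(), 2), round(y[T // 2:].mean(), 2))
```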
Autocorrelation under Weak Stationarity¶
If \({\bf Y}_{T}\) is weakly stationary, the \(\smash{j}\) th autocorrelation reduces to
\[ \rho_j = \frac{\gamma_j}{\gamma_0}. \]
- Note that \(\smash{\rho_0 = 1}\).
Weak Stationarity¶
Under weak stationarity, autocovariances \(\smash{\gamma_j}\) only depend on the distance \(\smash{j}\) between random variables within a stochastic process:
\[ \text{Cov}(Y_t, Y_{t-j}) = \text{Cov}(Y_s, Y_{s-j}) = \gamma_j \;\; \forall \, t, s. \]
This implies
\[ \gamma_{-j} = \gamma_j \;\; \forall \, j, \]
since \(\smash{\text{Cov}(Y_t, Y_{t+j}) = \text{Cov}(Y_{t+j}, Y_t) = \gamma_j}\).
More generally, for any \(\smash{t}\) and \(\smash{s}\),
\[ \text{Cov}(Y_t, Y_s) = \gamma_{|t-s|}. \]
Strict Stationarity¶
\({\bf Y}_{T}\) is strictly stationary if, for any set of indices \(\smash{\{j_1, j_2, \ldots, j_n\} \subset T}\) and any \(\smash{\tau}\),
\[ f_{Y_{j_1}, \ldots, Y_{j_n}}(y_1, \ldots, y_n) = f_{Y_{j_1 + \tau}, \ldots, Y_{j_n + \tau}}(y_1, \ldots, y_n). \]
- Strict stationarity means that the joint distribution of any subset of random variables in \({\bf Y}_{T}\) is invariant to shifts in time, \(\smash{\tau}\).
- Strict stationarity \(\smash{\Rightarrow}\) weak stationarity if the first and second moments of a stochastic process exist. For example, an i.i.d. sequence of Cauchy random variables is strictly stationary but not weakly stationary, since its moments do not exist.
- Weak stationarity \(\smash{\not \Rightarrow}\) strict stationarity: invariance of the first and second moments to time shifts (weak stationarity) does not mean that the entire joint distributions, and hence all higher moments, are invariant to time shifts (strict stationarity).
If \({\bf Y}_{T}\) is Gaussian then weak stationarity \(\smash{\Rightarrow}\) strict stationarity.
- If \({\bf Y}_{T}\) is Gaussian, all marginal distributions of the form \(\smash{(Y_{j_1}, \ldots, Y_{j_n})}\) are also Gaussian.
- Gaussian distributions are fully characterized by their first and second moments, so if those moments are invariant to time shifts, the entire joint distribution is as well.
Ergodicity¶
Given \(\smash{N}\) identically distributed weakly stationary stochastic processes \(\left\{{\bf Y}_{T}^{(i)}\right\}_{i=1}^N\), the ensemble average is
\[ \bar{y}_t = \frac{1}{N} \sum_{i=1}^N y_t^{(i)}, \]
which converges to \(\smash{\mu = E[Y_t]}\) as \(\smash{N \rightarrow \infty}\) when the realizations are independent.
For a single stochastic process, we desire conditions under which the time average
\[ \bar{y} = \frac{1}{T} \sum_{t=1}^T y_t \]
converges to the same limit as \(\smash{T \rightarrow \infty}\).
If \({\bf Y}_{T}\) is weakly stationary and
\[ \sum_{j=0}^{\infty} |\gamma_j| < \infty, \]
then \({\bf Y}_{T}\) is ergodic for the mean and the time average converges in probability to \(\smash{\mu}\).
- The condition above requires that the autocovariances fall to zero sufficiently quickly.
- That is, a long realization of \(\smash{\{y_t\}}\) will contain many segments that are nearly uncorrelated and can be used to approximate an ensemble average.
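A classic textbook failure of ergodicity (standard, though not stated in the notes above): \(\smash{Y_t = Z + \varepsilon_t}\), where \(\smash{Z}\) is a single draw shared by every date in a realization. Here \(\smash{\gamma_j = \text{Var}(Z) > 0}\) for all \(\smash{j \geq 1}\), so the autocovariances never die out, and the time average converges to \(\smash{Z}\) rather than to \(\smash{E[Y_t] = 0}\). A minimal simulation sketch (seed and sample size are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
T = 100_000

# Y_t = Z + eps_t: weakly stationary (E[Y_t] = 0, gamma_j = Var(Z) for j >= 1),
# but not ergodic for the mean, since gamma_j does not decay to zero.
for i in range(3):                      # three independent realizations
    Z = rng.standard_normal()           # permanent, realization-specific draw
    y = Z + rng.standard_normal(T)
    print(f"realization {i}: time average = {y.mean():+.3f}, Z = {Z:+.3f}")
```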
A weakly stationary process is ergodic for the second moments if
\[ \hat{\gamma}_j = \frac{1}{T - j} \sum_{t=j+1}^{T} (y_t - \mu)(y_{t-j} - \mu) \overset{p}{\longrightarrow} \gamma_j \;\; \forall \, j. \]
- Separate sufficient conditions exist under which the convergence above holds.
- If \({\bf Y}_{T}\) is Gaussian and stationary, then \(\smash{\sum_{j=0}^{\infty} |\gamma_j| < \infty}\) ensures that \(\smash{{\bf Y}_{T}}\) is ergodic for all moments.
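When ergodicity does hold, a single long realization suffices to estimate second moments via time averages. A minimal sketch using Gaussian white noise, where \(\smash{\gamma_0 = \sigma^2}\) and \(\smash{\gamma_j = 0}\) for \(\smash{j \neq 0}\) (seed and parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
T, sigma = 100_000, 1.5
y = sigma * rng.standard_normal(T)    # one realization of Gaussian white noise

ybar = y.mean()
for j in range(4):
    # Sample autocovariance at lag j computed as a time average
    gamma_hat = np.mean((y[j:] - ybar) * (y[: T - j] - ybar))
    print(f"gamma_{j} estimate: {gamma_hat:+.4f}")
# gamma_0 should be near sigma**2 = 2.25; gamma_1, ..., gamma_3 near 0
```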