Historical financial data typically consist of values indexed by time; in other words, in finance we deal with time series. A fundamental difficulty in financial time series analysis is that we observe only one sequence of observations (a single realisation) out of the many possible outcomes that could have arisen from the underlying stochastic process. However, if the distribution of the data remains unchanged over time (stationarity), the observations can be viewed as different outcomes from the same distribution. Moreover, if the process does not depend too strongly on its past (ergodicity), each observation contains some information not available from the other observations. Here, we review these notions from a mathematical point of view.

Autocovariance function

The autocovariance and autocorrelation functions describe the dependence structure among the random variables of a time series \(\{X_t \}\). The autocovariance function of a time series \(\{ X_t \}\) is defined as

$$\gamma _{X} (s,t)=Cov(X_s,X_t) = E[(X_s - \mu_s)(X_t - \mu_t)],$$

where \(\mu_t = E[X_t]\) and \(Var(X_t) < \infty\) for all \( t \).
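As an illustrative sketch (not part of the original text), the sample analogue of this definition can be computed directly. The data below are simulated Gaussian white noise, for which \(\gamma_X(0) = 1\) and \(\gamma_X(h) = 0\) for \(h > 0\); the function name and seed are hypothetical choices:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(500)  # white noise: gamma(0) = 1, gamma(h) = 0 for h > 0

def sample_autocovariance(x, lag):
    """Sample autocovariance at the given lag, normalised by n."""
    n = len(x)
    mu = x.mean()
    return np.sum((x[lag:] - mu) * (x[:n - lag] - mu)) / n

gamma0 = sample_autocovariance(x, 0)  # close to Var(X_t) = 1
gamma5 = sample_autocovariance(x, 5)  # close to 0 for white noise
```

Normalising by \(n\) rather than \(n - \ell\) matches the estimator \(\hat{\gamma}(\ell)\) used later in this note.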

Stationarity

Stationarity is one of the most desired assumptions in time series analysis. A stochastic process \(\{X(t)\}\) is said to be strictly stationary if its distribution is time-invariant, i.e., the joint distribution function of \((X(t_1),...,X(t_n))\) is identical to that of \((X(t_1+\ell),...,X(t_n+ \ell))\) for all \(\ell\) and \(n\). Plainly speaking, strict stationarity means that the distribution of a process does not vary over time.

Unfortunately, strict stationarity is often hard to verify empirically as it requires all moments to be constant over time. Instead, a weaker version of stationarity is often considered, relaxing the above assumption to the first two moments, as follows:

\begin{align} E[X(t)] & = \mu, \, \, \, \, \, \text{for all} \, \, \,  t, \nonumber \\  \gamma_{X}(s,t) & = \gamma_{X} (s + \ell, t + \ell), \, \, \, \, \, \text{for all} \, \, \, t, \, s, \, \ell. \nonumber \end{align}

In particular, the mean, variance, and autocovariance do not change over time. A time series \(\{X(t)\}\) satisfying these conditions is said to be weakly stationary. In other words, under weak stationarity the mean of the process is constant and finite, and the autocovariance function depends only on the lag \(\tau = t-s\).
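These two conditions can be checked informally on simulated data. The sketch below (a hypothetical example, assuming a Gaussian AR(1) process \(X_t = \phi X_{t-1} + \varepsilon_t\) with \(|\phi| < 1\), which is weakly stationary with mean \(0\) and variance \(1/(1-\phi^2)\)) compares the first two moments on two halves of one long path:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
phi = 0.5  # |phi| < 1 gives a weakly stationary AR(1)

# AR(1): X_t = phi * X_{t-1} + e_t, with e_t ~ N(0, 1)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.standard_normal()

# Under weak stationarity, the first two moments agree across sub-samples:
first, second = x[: n // 2], x[n // 2 :]
print(first.mean(), second.mean())  # both near the constant mean 0
print(first.var(), second.var())    # both near 1 / (1 - phi**2) = 4/3
```

A random walk (\(\phi = 1\)) would fail this check: its variance grows linearly with time.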

Ergodicity

Suppose that a stochastic process \(\{X(t)\}\) is weakly stationary. The process is weakly (or wide-sense) ergodic if the time series sample mean and autocovariance converge in mean-square to their true (ensemble) statistical quantities, i.e.,

\begin{align} \lim _{n \rightarrow \infty} E[(\hat{\mu} -\mu)^2] & = 0, \nonumber \\ \lim_{n \rightarrow \infty} E[(\hat{\gamma}(\ell)- \gamma(\ell))^2] &=0, \,\,\, 0 \leq \ell <n, \nonumber \end{align}

where \(\hat{\mu} := \frac{1}{n} \sum_{i=1} ^{n} X(i)\) and \(\hat{\gamma}(\ell) := \frac{1}{n} \sum _{i=1 + \ell} ^{n} (X(i) - \hat{\mu})(X(i - \ell) - \hat{\mu})\).

In other words, ergodicity implies that the statistical properties of a process can be deduced given a single and sufficiently long sample path.
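This convergence can be visualised with a small simulation (a hypothetical sketch, again assuming a Gaussian AR(1) process, which is both weakly stationary and ergodic): the mean-square error of the sample mean \(\hat{\mu}\) around the true mean \(\mu = 0\) shrinks as the observed path grows.

```python
import numpy as np

rng = np.random.default_rng(2)
phi = 0.5  # AR(1) coefficient, |phi| < 1

def ar1(n, rng):
    """Simulate one path of X_t = phi * X_{t-1} + e_t, e_t ~ N(0, 1)."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

# Estimate E[(mu_hat - mu)^2] over repeated paths for two path lengths;
# ergodicity means the longer path gives a much smaller mean-square error.
mses = {}
for n in (100, 10_000):
    mses[n] = np.mean([ar1(n, rng).mean() ** 2 for _ in range(100)])
print(mses)
```

For this process the variance of \(\hat{\mu}\) is of order \(1/n\), so the mean-square error at \(n = 10{,}000\) should be roughly a hundredth of that at \(n = 100\).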

Ergodicity plays a central role in the estimation of statistical quantities, where it ensures that time-averaged estimates serve as consistent estimators of the corresponding ensemble parameters.


References

1. P. J. Brockwell and R. A. Davis, Time Series: Theory and Methods, 2nd edition. Springer Science+Business Media, 1991.

2. J. D. Hamilton, Time Series Analysis. Princeton University Press, 1994.

3. R. S. Tsay, Analysis of Financial Time Series, 3rd edition. John Wiley & Sons, 2010.