# Stationary process

In the mathematical sciences, a stationary process (or strict(ly) stationary process or strong(ly) stationary process) is a stochastic process whose joint probability distribution does not change when shifted in time or space. As a result, parameters such as the mean and variance, if they exist, also do not change over time or position.

Stationarity is used as a tool in time series analysis, where the raw data are often transformed to become stationary; for example, economic data are often seasonal and/or dependent on the price level. Processes are described as trend-stationary if they are a linear combination of a stationary process and one or more processes exhibiting a trend. Transforming such data to leave a stationary data set for analysis is referred to as de-trending.
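As a minimal sketch of de-trending, the following fits and subtracts a linear trend from a synthetic trend-stationary series (the slope, noise level, and series length here are illustrative assumptions, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# A synthetic trend-stationary series: linear trend plus stationary noise.
t = np.arange(200)
series = 0.5 * t + rng.normal(0.0, 1.0, size=t.size)

# De-trend by fitting and subtracting a first-degree polynomial (least squares).
slope, intercept = np.polyfit(t, series, deg=1)
detrended = series - (slope * t + intercept)

# The residual fluctuates around zero with no remaining linear trend:
# least-squares residuals are orthogonal to the regressors by construction.
residual_slope, _ = np.polyfit(t, detrended, deg=1)
print(abs(residual_slope) < 1e-8)  # prints True
```

Other common choices for removing a trend include differencing the series or fitting a higher-degree polynomial, depending on the assumed form of the trend.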

## Definition

Formally, let $X_{t}$ be a stochastic process and let $F_{X_{t_{1}},\ldots ,X_{t_{k}}}(x_{t_{1}},\ldots ,x_{t_{k}})$ represent the cumulative distribution function of the joint distribution of $X_{t}$ at times $t_{1},\ldots ,t_{k}$ . Then, $X_{t}$ is said to be stationary if, for all $k$ , for all $\tau$ , and for all $t_{1},\ldots ,t_{k}$ ,

$F_{X_{t_{1}},\ldots ,X_{t_{k}}}(x_{t_{1}},\ldots ,x_{t_{k}})=F_{X_{t_{1}+\tau },\ldots ,X_{t_{k}+\tau }}(x_{t_{1}},\ldots ,x_{t_{k}}).$

## Examples

As an example, white noise is stationary. However, the sound of a cymbal crashing is not stationary because the acoustic power of the crash (and hence its variance) diminishes with time.

An example of a discrete-time stationary process where the sample space is also discrete (so that the random variable may take one of N possible values) is a Bernoulli scheme. Other examples of a discrete-time stationary process include autoregressive and moving average signals.
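As an illustrative sketch of an autoregressive example (the coefficient, seed, and series length are arbitrary choices), a first-order autoregressive process $x_n = \phi x_{n-1} + e_n$ with $|\phi| < 1$ and unit-variance noise is stationary when initialized from its stationary distribution, and its stationary variance is $1/(1-\phi^2)$:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical AR(1) process x[n] = phi*x[n-1] + e[n]; stationary when |phi| < 1.
phi, n = 0.5, 200_000
e = rng.normal(size=n)
x = np.empty(n)
x[0] = e[0] / np.sqrt(1 - phi**2)   # draw x[0] from the stationary distribution
for i in range(1, n):
    x[i] = phi * x[i - 1] + e[i]

# Theoretical stationary variance: 1 / (1 - phi^2) = 4/3 here.
print(abs(x.var() - 1 / (1 - phi**2)) < 0.05)  # prints True
```

Initializing $x_0$ from the stationary distribution matters: started from an arbitrary fixed value, the process is only asymptotically stationary as the effect of the initial condition decays.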

## Weak or wide-sense stationarity

A weaker form of stationarity commonly employed in signal processing is known as weak-sense stationarity, wide-sense stationarity (WSS), second-order stationarity, or covariance stationarity. WSS random processes require only that the first and second moments do not vary with time. Any strictly stationary process which has a finite mean and covariance is also WSS.

So, a continuous-time random process $x(t)$ which is WSS has the following restrictions on its mean function

$\mathbb {E} \{x(t)\}=m_{x}(t)=m_{x}(t+\tau )\,\,\forall \,\tau \in \mathbb {R}$

and correlation function

$\mathbb {E} \{x(t_{1})x(t_{2})\}=R_{x}(t_{1},t_{2})=R_{x}(t_{1}+\tau ,t_{2}+\tau )=R_{x}(t_{1}-t_{2},0)\,\,\forall \,\tau \in \mathbb {R} .$

The first property implies that the mean function $m_{x}(t)$ must be constant. The second property implies that the correlation function depends only on the difference between $t_{1}$ and $t_{2}$, and so needs to be indexed by only one variable rather than two. Thus, instead of writing
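These two properties can be checked empirically by ensemble averaging. The sketch below uses a hypothetical MA(1) process $x_n = e_n + \theta e_{n-1}$ (the coefficient and sizes are arbitrary assumptions), whose true correlation is $R_x(0)=1+\theta^2$, $R_x(1)=\theta$, and $R_x(k)=0$ for $k \ge 2$, and shows that the estimate of $\mathbb{E}\{x(t_1)x(t_2)\}$ depends only on the lag $t_1 - t_2$, not on the time origin:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical MA(1) process x[n] = e[n] + theta*e[n-1], unit-variance noise.
theta, trials, length = 0.8, 50_000, 32
e = rng.normal(size=(trials, length + 1))
x = e[:, 1:] + theta * e[:, :-1]     # one realization per row

# Ensemble estimates of R_x(t1, t2) = E{x(t1) x(t2)}:
r_a = np.mean(x[:, 3] * x[:, 5])     # lag 2, time origin t = 3
r_b = np.mean(x[:, 10] * x[:, 12])   # lag 2, different time origin
r_lag1 = np.mean(x[:, 7] * x[:, 8])  # lag 1

# Same lag, different origins -> same value (here ~0, since R_x(2) = 0).
print(abs(r_a - r_b) < 0.05)         # prints True
print(abs(r_lag1 - theta) < 0.05)    # prints True: R_x(1) = theta
```

Note that the averaging here is across realizations (down each column), which is what the expectation in the definition refers to; for ergodic processes a single long realization would suffice.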

$\,\!R_{x}(t_{1}-t_{2},0)\,$

we usually abbreviate the notation and write

$R_{x}(\tau )\,\!{\mbox{ where }}\tau =t_{1}-t_{2}.$

When processing WSS random signals with linear, time-invariant (LTI) filters, it is helpful to think of the correlation function as a linear operator. Since it is a circulant operator (it depends only on the difference between the two arguments), its eigenfunctions are the Fourier complex exponentials. Additionally, since the eigenfunctions of LTI operators are also complex exponentials, LTI processing of WSS random signals is highly tractable: all computations can be performed in the frequency domain. Thus, the WSS assumption is widely employed in signal processing algorithms.
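A small sketch of this frequency-domain viewpoint, assuming a unit-variance white-noise input (so the input power spectral density is $S_x(f) = 1$) and an arbitrary 3-tap FIR filter $h$: the output power spectral density is $S_y(f) = |H(f)|^2 S_x(f)$, and its inverse Fourier transform recovers the output autocorrelation $R_y(\tau)$.

```python
import numpy as np

# Hypothetical 3-tap FIR filter; any stable LTI filter would do.
h = np.array([0.5, 0.3, 0.2])
nfft = 64

# Frequency domain: output PSD is |H(f)|^2 times the input PSD (here S_x = 1).
H = np.fft.fft(h, nfft)
S_y = np.abs(H) ** 2 * 1.0

# Inverse transform gives the output autocorrelation R_y(tau).
R_y = np.fft.ifft(S_y).real

# Cross-check: R_y(0) is the output variance, sum_n h[n]^2 (Parseval).
expected_R0 = np.sum(h ** 2)
print(np.isclose(R_y[0], expected_R0))  # prints True
```

This is exactly the tractability the paragraph describes: instead of convolving correlation functions in the time domain, one multiplies spectra pointwise in the frequency domain.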