Random signal modeling#
Random processes are characterized by an ensemble of sample functions generated by the process. As an example of a random process, consider the noise generated in a resistor. Across the terminals of the resistor, a noise voltage can be measured, recorded, and plotted versus time. We would like to have expressions for the time average, the average power, the RMS value, and the power spectral density of this signal. This, however, is not possible, since we do not have a time-domain description of a random signal. In many situations, we can instead obtain these quantities from a statistical description of the random process using its so-called probability density function.
To do so, we make recordings for a large number of identical resistors. These recordings are made over a limited time interval \(T\). This so-called ensemble of truncated sample signals \(x_{T_{i}}(t)\), corresponding to the sample signals \(x_{i}(t)\) from the different resistors, can be defined as
Now, we consider a specific time instant \(t_{1}\). The amplitudes of the truncated sample functions at that time instant constitute a random variable \(\underline{x}_{1}(t_{1})\). The possible values of this random variable are denoted as \(x_{1}\). We are now able to set up a histogram by counting the number of samples of \(\underline{x}_{1}(t_{1})\) that fall within a range between \(x_{1}\) and \(x_{1}+\Delta x_{1}\), for all successive values of \(x_{1}\). This histogram is a discrete approximation of the probability density function \(P(\underline{x},t_{1})\) of the random variable \(\underline{x}\) at time instant \(t=t_{1}\). The probability that the random variable \(\underline{x}(t)\) has a value between \(a\) and \(b\) at time instant \(t\) is then obtained as
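In the usual formulation, this probability is the integral of \(P(x,t)\) from \(a\) to \(b\). As a numerical illustration, the sketch below builds an ensemble of noise recordings, approximates \(P(x,t_{1})\) with a histogram, and estimates the probability from it. It is a minimal sketch assuming NumPy; the recording sizes, the noise level, and all variable names are illustrative and not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ensemble of truncated sample functions: many recordings of resistor-like
# noise, each containing n_samples points (all values are illustrative).
n_recordings, n_samples = 10000, 1000
ensemble = rng.normal(0.0, 1e-6, size=(n_recordings, n_samples))   # volts

# The random variable x(t1): the ensemble values at one time index.
i1 = 500
x_t1 = ensemble[:, i1]

# Histogram as a discrete approximation of P(x, t1).
density, edges = np.histogram(x_t1, bins=50, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
widths = np.diff(edges)

# Probability that a < x(t1) < b, approximated by summing density * bin width.
a, b = -1e-6, 1e-6
in_band = (centers > a) & (centers < b)
probability = np.sum(density[in_band] * widths[in_band])   # about 0.68 here
```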
By definition:
Some examples of probability density functions are the uniform and the Gaussian probability density functions. A uniform probability density function \(P(\underline{x},t)\) does not depend on \(x\). A Gaussian probability density function is characterized by its mean value \(\mu\) and its standard deviation \(\sigma\), which equals the root mean square value for \(\mu=0\):
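For reference, the zero-mean Gaussian density has the familiar form \(P(x)=\frac{1}{\sigma\sqrt{2\pi}}\,e^{-x^{2}/2\sigma^{2}}\). The short sketch below implements both densities and checks numerically that each integrates to one; it assumes NumPy, and the chosen \(\sigma\) and interval bounds are illustrative.

```python
import numpy as np

def gaussian_pdf(x, sigma, mu=0.0):
    """Gaussian probability density with mean mu and standard deviation sigma."""
    return np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2)) / (sigma * np.sqrt(2.0 * np.pi))

def uniform_pdf(x, a, b):
    """Uniform probability density: constant 1/(b - a) on [a, b], zero elsewhere."""
    return np.where((x >= a) & (x <= b), 1.0 / (b - a), 0.0)

x = np.linspace(-5e-6, 5e-6, 2001)

# Both densities integrate to one (numerical check with the trapezoidal rule).
assert np.isclose(np.trapz(gaussian_pdf(x, sigma=1e-6), x), 1.0, atol=5e-3)
assert np.isclose(np.trapz(uniform_pdf(x, -2e-6, 2e-6), x), 1.0, atol=5e-3)
```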
Stationary and ergodic processes#
A random process is called stationary if its statistical properties do not change with time. For stationary processes, the averages, the correlation functions and the power spectral densities do not depend on time. It is important to notice that, from the observer’s point of view, a process can be called stationary as long as its properties do not change appreciably over the time of interest.
A process is called ergodic if it can be fully described by a single sample function. For ergodic processes, we may obtain the values of the random variable \(\underline{x}(t)\) from just one truncated sample function \(x_{T}(t)\). Ergodic processes are always stationary; a stationary process, however, is not necessarily ergodic.

Knowledge about the random process that generates the information is indispensable for setting up methods for extracting characteristic properties of the information from the process. This extraction is needed to find the characteristic properties of the information-carrying signals and to formulate the requirements for the information processing system. For the verification of the behavior of an information processing system, relevant properties of the deterministic test signals must be related to these characteristic properties of the information source. For this purpose, we will introduce some characteristic properties of signals generated by random processes and relate them to properties of deterministic signals that are often used as test signals.
Time average and ensemble average#
The ensemble average, or the *expectation* \(E\{\underline{x}(t)\}\), of a random variable \(\underline{x}(t)\) at time instant \(t\) is defined as the weighted sum of all values of the sample functions at that time instant \(t\), each weighted according to its probability:
If we have \(n\) recordings or truncated sample functions \(x_{T_{i}}(t)\), we can estimate the expectation at \(t=t_{1}\) from the average value \(\overline{\underline{x}_{1}(t_{1})}\) of the \(n\) samples:
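In symbols, the estimate is the sum of the \(n\) sample values divided by \(n\). As a minimal numerical sketch (NumPy assumed, all sizes and noise levels illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# n truncated recordings, each 1000 samples long (illustrative values).
n = 5000
ensemble = rng.normal(0.0, 1e-6, size=(n, 1000))

# Estimate of the expectation at t = t1: the average of the n samples
# taken at the same time index i1 in every recording.
i1 = 250
expectation_estimate = ensemble[:, i1].mean()
```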
The time average \(\overline{x_{i}(t)}\) of the sample signal \(x_{i}(t)\) is the average value of the truncated recording \(x_{T_{i}}(t)\). It is obtained as
If a process is stationary, the ensemble average does not depend on time. If the random process is also ergodic, the ensemble average \(E\{\underline{x}_{1}(t_{1})\}\) of a random variable \(\underline{x}(t)\) equals the time average \(\overline{x_{i}(t)}\) of one recording \(x_{i}(t)\).
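The sketch below illustrates this equality for a stationary, ergodic (white Gaussian) process: the ensemble average at one time instant and the time average of a single recording converge to the same value. NumPy is assumed, and the process and its parameters are only illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
ensemble = rng.normal(0.0, 1e-6, size=(5000, 5000))   # 5000 recordings, 5000 samples

ensemble_average = ensemble[:, 100].mean()   # average over recordings at one instant
time_average = ensemble[0, :].mean()         # average over time of one recording

# For this stationary, ergodic process both averages approach the same value
# (zero); the residual difference shrinks as the number of samples grows.
print(ensemble_average, time_average)
```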
Correlation function#
We have seen that a random variable at time instant \(t=t_{1}\) can be described by its probability density function \(P(x,t_{1})\). Now, let us assume that, in another random process, a random variable \(\underline{y}(t)\) at \(t=t_{2}\) is described by its probability density function \(P(y,t_{2})\). The so-called joint probability density function \(P(x,t_{1};y,t_{2})\) gives the probability that \(\underline{x}(t_{1})\) has a value between \(x\) and \(x+dx\), while \(\underline{y}(t_{2})\) has a value between \(y\) and \(y+dy\), which can be written as \(P(x,t_{1};y,t_{2})dxdy\). The correlation function tells us something about the similarity between these two processes. It is defined as the expectation of the product of the two random variables \(\underline{x}(t_{1})\) and \(\underline{y}(t_{2})\) at time instants \(t_{1}\) and \(t_{2}\), respectively:
If both processes are stationary, the correlation function only depends on the time difference \(t_{2}-t_{1}\). If, in addition, the processes are ergodic, we may approximate the correlation using two truncated recordings \(x_{T}(t)\) and \(y_{T}(t)\). For a time difference \(t_{2}-t_{1}=\tau\), we may then estimate the correlation function \(r_{xy}(\tau)\) of the two sample signals \(x_{T}(t)\) and \(y_{T}(t)\) as
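In words: shift one recording by \(\tau\), multiply it with the other, and average the product over the recording time. The sketch below does this for sampled recordings; NumPy is assumed, and the signals and the amount of coupling between them are illustrative.

```python
import numpy as np

def correlation_estimate(x, y, lag):
    """Estimate r_xy(tau) for a lag of `lag` samples: the average of the
    product of x(t) and y(t + tau) over the overlapping part of both recordings."""
    if lag >= 0:
        return np.mean(x[: x.size - lag] * y[lag:])
    return np.mean(x[-lag:] * y[: y.size + lag])

rng = np.random.default_rng(3)
x = rng.normal(size=100000)
y = 0.5 * x + rng.normal(size=100000)        # y is partly correlated with x

r_xy_0 = correlation_estimate(x, y, lag=0)   # close to 0.5: the shared power
```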
From this expression, we intuitively expect that the correlation function can be interpreted as a measure of the joint power of \(x(t)\) and \(y(t)\). This can be expressed by saying that two signals are correlated when they have a nonzero joint power. As a consequence, the similarity between two signals \(x(t)\) and \(y(t)\) can also be found by comparing the average power of the sum of both signals with the sum of the average powers of the two signals individually. The average power of their sum is proportional to
The third term in the expression can be written as \(2r_{xy}(0),\) and if it is zero, the sum of the powers of \(x(t)\) and \(y(t)\) equals the power of the sum of both signals. Two signals \(x(t)\) and \(y(t)\) are said to be uncorrelated or orthogonal when \(r_{xy}(0)=0\).
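A quick numerical check of this statement, assuming NumPy and two independently generated (hence practically uncorrelated) noise recordings:

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=100000)     # two independently generated recordings,
y = rng.normal(size=100000)     # so r_xy(0) is close to zero

p_x = np.mean(x ** 2)           # average power of x
p_y = np.mean(y ** 2)           # average power of y
p_sum = np.mean((x + y) ** 2)   # average power of the sum
r_xy_0 = np.mean(x * y)         # joint power term

# p_sum = p_x + p_y + 2 * r_xy(0); with r_xy(0) close to zero the powers simply add.
print(p_sum, p_x + p_y, 2.0 * r_xy_0)
```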
Autocorrelation function#
Of particular importance is the so-called autocorrelation function. This function tells us something about the correspondence between the values of one random variable at two time instants. If the rate of change of a random variable is large, the correspondence will drop rapidly with increasing difference between the two time instants. The autocorrelation function is defined as the expectation of the product of a random variable at one time instant and the same random variable at a second time instant. It is written as:
For an ergodic process, the autocorrelation function \(r_{x}(\tau)\) may be approximated using one sample function of the process with its replica, shifted \(\tau\) in time. It is an important characteristic for the correspondence between signal values at time distance \(\tau\) and tells us something about the rate of change of a signal:
As an example, consider random noise \(n(t)\), which has been passed through a low-pass filter. Since the output noise of the filter cannot change rapidly, the similarity between values will be large for small values of \(\tau\). For increasing values of \(\tau\), the similarity will become smaller, and finally, \(r_{x}(\tau)\) will drop to zero.
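The sketch below reproduces this behavior numerically: white noise is passed through a first-order low-pass filter, and the autocorrelation estimate of the output is evaluated for a few lags. It assumes NumPy and SciPy's `lfilter`; the filter coefficient and the lags are illustrative.

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(5)
white = rng.normal(size=200000)

# First-order low-pass filter y[k] = (1 - alpha) * y[k-1] + alpha * x[k];
# a small alpha means a small bandwidth and hence a slowly changing output.
alpha = 0.05
filtered = lfilter([alpha], [1.0, -(1.0 - alpha)], white)

def autocorrelation_estimate(x, lag):
    """Estimate r_x(tau): average of x(t) times x(t + tau) for a lag in samples."""
    return np.mean(x[: x.size - lag] * x[lag:])

# r_x(tau) is largest at tau = 0 and decays towards zero as the lag grows.
for lag in (0, 10, 50, 200):
    print(lag, autocorrelation_estimate(filtered, lag))
```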
Mean square value#
The mean square value of a random variable \(\underline{x}(t)\) is defined as the expectation of the squared value of the random variable; it equals the value of the autocorrelation function for \(t_{1}=t_{2}\):
If we are dealing with ergodic processes, the mean square value of a truncated sample function \(x_{T}(t)\) can be estimated from \(r_{x}(0)\) as
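That is, \(r_{x}(0)\) is the time average of the squared recording. A short sketch (NumPy assumed, noise level illustrative):

```python
import numpy as np

x_T = np.random.default_rng(6).normal(0.0, 2e-6, size=100000)  # one truncated recording

mean_square = np.mean(x_T ** 2)    # r_x(0): time average of the squared recording
rms_value = np.sqrt(mean_square)   # close to 2e-6 V for this recording
```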
Wiener-Khintchine theorem#
We have seen that the autocorrelation function \(r_{x}(\tau)\) tells us something about the correspondence between signal values at two time instants. When \(r_{x}(\tau)\) drops to zero for a very small value of \(\tau\), the signal \(x(t)\) changes rapidly and its bandwidth must be large. The relation between the frequency contents of a signal and its autocorrelation function is given by the Wiener-Khintchine theorem. This theorem asserts that for a stationary process, the power spectral density \(S_{x}(\omega)\) is the Fourier transform of the autocorrelation function:
where \(\mathcal{F}\left\{x(t)\right\}\) denotes the Fourier transform of \(x(t)\).
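The sketch below illustrates the theorem numerically: the power spectral density obtained by Fourier-transforming an autocorrelation estimate is compared with a direct periodogram estimate. It assumes NumPy and SciPy (`lfilter`, `periodogram`); the sample rate, filter, and number of lags are illustrative, and the magnitude is taken to discard the linear phase caused by the lag ordering.

```python
import numpy as np
from scipy.signal import lfilter, periodogram

rng = np.random.default_rng(7)
fs = 1.0e6                                                   # sample rate in Hz
x = lfilter([0.1], [1.0, -0.9], rng.normal(size=2 ** 16))    # low-pass noise

# Autocorrelation estimate for lags -m..m samples (biased estimator).
m = 512
lags = np.arange(-m, m + 1)
r_x = np.array([np.mean(x[: x.size - abs(k)] * x[abs(k):]) for k in lags])

# Wiener-Khintchine: the PSD is the Fourier transform of the autocorrelation.
S_wk = np.abs(np.fft.fft(r_x)) / fs                          # two-sided, in W/Hz
f_wk = np.fft.fftfreq(r_x.size, d=1.0 / fs)

# Direct (two-sided) periodogram estimate of the PSD for comparison; apart from
# estimation noise, it follows the same spectral shape as S_wk.
f_direct, S_direct = periodogram(x, fs=fs, return_onesided=False)
```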
Power spectral density#
The power spectral density \(S(f)\) of a signal \(x(t)\) specifies the power or the mean square value of that signal per unit of bandwidth [Hz]. For electrical signals, \(S(f)\) has the dimension of W/Hz. The mean square value of a time-limited recording of a signal \(x(t)\), whose frequency content is limited between \(f_{1}\) and \(f_{2}\), and with a recording time \(T\) over which \(x(t)\) can be considered stationary, can be obtained both from the time-domain and the frequency-domain descriptions:
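In the time domain this is the average of \(x^{2}(t)\) over the recording time \(T\); in the frequency domain it is the integral of \(S(f)\) from \(f_{1}\) to \(f_{2}\). A numerical check of this equality, assuming NumPy and SciPy's `welch` estimator for \(S(f)\); the sample rate and recording length are illustrative, and the frequency-domain power is obtained by integrating the one-sided density over the full band from \(0\) to \(f_{s}/2\):

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(8)
fs = 1.0e6                                   # sample rate in Hz
x = rng.normal(0.0, 1e-6, size=2 ** 18)      # stationary recording of length T

# Time domain: the mean square value of the recording.
p_time = np.mean(x ** 2)

# Frequency domain: integrate the one-sided power spectral density S(f)
# over the band that contains the signal power (here 0 .. fs/2).
f, S = welch(x, fs=fs, nperseg=4096)
p_freq = np.trapz(S, f)

print(p_time, p_freq)    # the two estimates agree closely
```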