In this post, I introduce the cyclic autocorrelation function (CAF). The easiest way to do this is to first review the conventional autocorrelation function. Suppose we have a complex-valued signal $x(t)$ defined on a suitable probability space. Then the mean value of $x(t)$ is given by

$$M_x(t; \tau) = E\left[ x(t + \tau) \right].$$

For stationary signals, and many cyclostationary signals, this mean value is independent of the lag parameter $\tau$, so that

$$M_x(t; \tau) = M_x(t; 0) = M_x(t).$$
The autocorrelation function is the correlation between the random variables corresponding to two time instants $t_1$ and $t_2$ of the random signal, or

$$R_x(t_1, t_2) = E\left[ x(t_1) x^*(t_2) \right].$$
To see how the autocorrelation varies with some particular central time $t$, we can use a more convenient parameterization of the two time instants $t_1$ and $t_2$, such as

$$t_1 = t + \tau/2, \qquad t_2 = t - \tau/2,$$

which yields

$$R_x(t, \tau) = E\left[ x(t + \tau/2) \, x^*(t - \tau/2) \right].$$

So time $t$ represents the center point of the two time instants involved in the correlation and $\tau$ is their separation. If the autocorrelation depends only on the separation $\tau$ between the two time instants, and not on their center point $t$, the signal is stationary of order two, or just stationary, and we have

$$R_x(t, \tau) = R_x(0, \tau) = R_x(\tau).$$

This is a convenient way of saying that the autocorrelation depends only on the difference between $t_1$ and $t_2$, and not on their midpoint.
For nonstationary signals, on the other hand, the autocorrelation does depend on central time $t$. For the special case of nonstationary signals called cyclostationary signals, the autocorrelation is either a periodic function of $t$ or an almost periodic function of $t$. In either case, it can be represented by a Fourier series

$$R_x(t, \tau) = \sum_{\alpha} R_x^\alpha(\tau) \, e^{i 2 \pi \alpha t},$$

where $R_x^\alpha(\tau)$ is a Fourier-series coefficient called the cyclic autocorrelation function. The Fourier frequencies $\alpha$ are called cycle frequencies (CFs). The cyclic autocorrelation functions are obtained in the usual way for Fourier coefficients,

$$R_x^\alpha(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} R_x(t, \tau) \, e^{-i 2 \pi \alpha t} \, dt.$$
If the signal is a cycloergodic signal (or we are using fraction-of-time probability), then the cyclic autocorrelation function can be obtained directly from a sample path of the random process (the observed signal itself),

$$R_x^\alpha(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t + \tau/2) \, x^*(t - \tau/2) \, e^{-i 2 \pi \alpha t} \, dt.$$
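To make the time-average concrete, here is a minimal discrete-time sketch in Python/NumPy. The signal, symbol rate, lag, and thresholds are my own illustrative choices, not from the post. For a rectangular-pulse BPSK signal with $N_0$ samples per symbol, the non-conjugate CAF is nonzero at cycle frequencies $k/N_0$ for lags $0 < |\tau| < N_0$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative signal: rectangular-pulse BPSK with N0 samples per symbol,
# so the non-conjugate cycle frequencies are multiples of 1/N0.
N0 = 10
num_symbols = 20000
symbols = rng.choice([-1.0, 1.0], size=num_symbols)
x = np.repeat(symbols, N0).astype(complex)   # one sample path x[n]

def cyclic_autocorr(x, alpha, tau):
    """Symmetric time-average CAF estimate for an even integer lag tau:
    mean over n of x[n+tau/2] * conj(x[n-tau/2]) * exp(-i*2*pi*alpha*n)."""
    h = tau // 2
    n = np.arange(h, x.size - h)
    return np.mean(x[n + h] * np.conj(x[n - h]) * np.exp(-2j * np.pi * alpha * n))

tau = 4
caf_on = cyclic_autocorr(x, alpha=1.0 / N0, tau=tau)   # a true cycle frequency
caf_off = cyclic_autocorr(x, alpha=0.5 / N0, tau=tau)  # not a cycle frequency
print(abs(caf_on), abs(caf_off))  # clearly nonzero vs. near zero
```

The estimate at the true cycle frequency converges to the Fourier coefficient of the periodically time-varying autocorrelation, while the estimate at any other frequency converges to zero as the data length grows.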
For many cyclostationary signals, such as BPSK, the conjugate autocorrelation function is also non-zero (and is also useful in practice). This function is defined by

$$R_{x^*}(t, \tau) = E\left[ x(t + \tau/2) \, x(t - \tau/2) \right]$$

and is represented by its own Fourier series

$$R_{x^*}(t, \tau) = \sum_{\beta} R_{x^*}^\beta(\tau) \, e^{i 2 \pi \beta t},$$

where $R_{x^*}^\beta(\tau)$ is the conjugate cyclic autocorrelation function and $\beta$ here is a conjugate cycle frequency. I explain in detail why we need two autocorrelation functions in the post on conjugation configurations. The problem is worse when we look at higher-order moments and cumulants, where we need multiple functions (one for each distinct conjugation configuration) to properly characterize a signal at order $n$.
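A small numerical sketch (my own illustrative parameters, not from the post) shows why both functions are needed: for BPSK on a complex carrier at normalized frequency $f_0$, the non-conjugate lag product $x(t+\tau/2)x^*(t-\tau/2)$ cancels the carrier, while the conjugate lag product $x(t+\tau/2)x(t-\tau/2)$ doubles it, producing a conjugate cycle frequency at $2f_0$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative signal: BPSK on a complex carrier at normalized frequency f0,
# N0 samples per symbol (parameters are my own choices).
N0, f0 = 10, 0.05
num_symbols = 5000
a = np.repeat(rng.choice([-1.0, 1.0], size=num_symbols), N0)
n = np.arange(a.size)
x = a * np.exp(2j * np.pi * f0 * n)

def caf_lag0(x, alpha, conjugate_second_factor=True):
    """Time-average cyclic correlation at lag tau = 0.  With the conjugate on
    the second factor this is the ordinary (non-conjugate) CAF; without it,
    it is the conjugate CAF."""
    n = np.arange(x.size)
    second = np.conj(x) if conjugate_second_factor else x
    return np.mean(x * second * np.exp(-2j * np.pi * alpha * n))

# The doubled-carrier feature at 2*f0 appears only in the conjugate CAF:
print(abs(caf_lag0(x, 2 * f0, conjugate_second_factor=False)))  # ~1
print(abs(caf_lag0(x, 2 * f0, conjugate_second_factor=True)))   # ~0
```

Neither function alone reveals all of the signal's cycle frequencies, which is the point of tracking both conjugation configurations.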
Symmetric versus Asymmetric Cyclic Autocorrelation Functions
The conventional autocorrelation and cyclic autocorrelation functions are symmetric in the delay variable $\tau$ in that it appears as $+\tau/2$ in one factor of the delay product and as $-\tau/2$ in the other:

$$x(t + \tau/2) \, x^*(t - \tau/2).$$
When attempting to estimate the cyclic autocorrelation function using discrete-time data, the lag variable $\tau$ and the time-index variable $t$ take on integer values, so that $\tau/2$ does not correspond to a known value of $x(t)$ when $\tau$ is odd. To work around this practical problem, we can operate in the frequency domain, directly estimating the spectral correlation function and inverse Fourier transforming it to obtain the cyclic autocorrelation. But we can also employ the asymmetric cyclic autocorrelation, which is friendly to discrete-time data, and then scale it appropriately to obtain the symmetric version. Let's go through the supporting analysis here.
The asymmetric cyclic autocorrelation is

$$\hat{R}_x^\alpha(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t + \tau) \, x^*(t) \, e^{-i 2 \pi \alpha t} \, dt,$$

but it could be defined in other ways, including

$$\lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t) \, x^*(t - \tau) \, e^{-i 2 \pi \alpha t} \, dt.$$

Staying with the first of these definitions, and using the change of variables $t = u + \tau/2$ in the symmetric definition, we have

$$R_x^\alpha(\tau) = e^{-i \pi \alpha \tau} \, \hat{R}_x^\alpha(\tau),$$

so that we can compute the asymmetric version and then scale it by a simple complex exponential to obtain the symmetric version.
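This phase relationship can be checked numerically. In the sketch below (my own, with an arbitrary test signal and arbitrary $\alpha$ and even $\tau$), the discrete-time symmetric and asymmetric estimates computed on the same data differ exactly by the factor $e^{-i \pi \alpha \tau}$:

```python
import numpy as np

rng = np.random.default_rng(2)
# Arbitrary complex test signal; the identity holds for any data.
x = rng.standard_normal(4096) + 1j * rng.standard_normal(4096)

def caf_asymmetric(x, alpha, tau):
    # mean over n of x[n+tau] * conj(x[n]) * exp(-i*2*pi*alpha*n)
    n = np.arange(x.size - tau)
    return np.mean(x[n + tau] * np.conj(x[n]) * np.exp(-2j * np.pi * alpha * n))

def caf_symmetric(x, alpha, tau):
    # mean over n of x[n+tau/2] * conj(x[n-tau/2]) * exp(-i*2*pi*alpha*n),
    # for even integer tau
    h = tau // 2
    n = np.arange(h, x.size - h)
    return np.mean(x[n + h] * np.conj(x[n - h]) * np.exp(-2j * np.pi * alpha * n))

alpha, tau = 0.123, 6
sym = caf_symmetric(x, alpha, tau)
scaled = np.exp(-1j * np.pi * alpha * tau) * caf_asymmetric(x, alpha, tau)
print(np.isclose(sym, scaled))  # True
```

The two sums contain exactly the same lag products, so the equality is exact up to floating-point rounding, mirroring the change-of-variables argument above.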
And that is it! The definitions in this post are the basic definitions for the (second-order) probabilistic parameters of cyclostationary signals in the time domain. In other posts, I have much to say about their utility, their estimation, their connection to the frequency-domain parameters, and their generalization to higher-order parameters.