# The Cyclic Autocorrelation

In this post, I introduce the cyclic autocorrelation function (CAF). The easiest way to do this is to first review the conventional autocorrelation function. Suppose we have a complex-valued signal $x(t)$ defined on a suitable probability space. Then the mean value of $x(t)$ is given by

$M_x(t, \tau) = E[x(t + \tau)]. \hfill (1)$

For stationary signals, and for many cyclostationary signals (those whose mean value is a constant, often zero), this mean value is independent of the lag parameter $\tau$, so that

$\displaystyle M_x(t, \tau_1) = M_x(t, \tau_2) = M_x(t, 0) = M_x(t). \hfill (2)$

The autocorrelation function is the correlation between the random variables corresponding to two time instants of the random signal, or

$\displaystyle R_x(t_1, t_2) = E[x(t_1)x^*(t_2)]. \hfill (3)$

To see how the autocorrelation varies with some particular central time $t$, we can use a more convenient parameterization of the two time instants $t_1$ and $t_2$, such as

$\displaystyle R_x(t, \tau) = E[x(t+\tau/2)x^*(t-\tau/2)], \hfill (4)$

where

$\displaystyle t_1 = t+\tau/2$

$\displaystyle t_2 = t-\tau/2$

and

$\displaystyle t = (t_1 + t_2)/2$

$\displaystyle \tau = t_1 - t_2$

So time $t$ represents the center point of the two time instants and $\tau$ is their separation. If the autocorrelation depends only on the separation between the two time instants, and not their center point, the signal is stationary of order two, or just stationary, and we have

$\displaystyle R_x(t, \tau) = R_x(0, \tau) = R_x(\tau) \hfill (5)$

for every center time $t$.

For nonstationary signals, on the other hand, the autocorrelation does depend on time $t$. For the special case of nonstationary signals called cyclostationary signals, the autocorrelation is either a periodic function or an almost periodic function. In either case, it can be represented by a Fourier series

$\displaystyle R_x(t, \tau) = \sum_\alpha R_x^\alpha (\tau) e^{i 2 \pi \alpha t}, \hfill (6)$

where $R_x^\alpha(\tau)$ is a Fourier-series coefficient called the cyclic autocorrelation function. The Fourier frequencies $\alpha$ are called cycle frequencies (CFs). The CAFs are obtained in the usual way for Fourier coefficients,

$\displaystyle R_x^\alpha(\tau) = \lim_{T\rightarrow\infty} \frac{1}{T} \int_{-T/2}^{T/2} R_x(t,\tau) e^{-i 2 \pi \alpha t}\,dt. \hfill (7)$
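To make (7) concrete, here is a small numerical sketch (my own toy example, not part of the original development): we synthesize a periodic autocorrelation with a single known cycle frequency and recover its Fourier coefficient by averaging against the corresponding complex exponential.

```python
import numpy as np

# Toy sketch of Eq. (7): recover a Fourier coefficient of a synthetic
# periodic autocorrelation by averaging against a complex exponential.
# The function R and the cycle frequency alpha0 are illustrative choices.
T = 10_000
t = np.arange(T)
alpha0 = 0.01                                    # known cycle frequency
R = 1.0 + 0.5 * np.cos(2 * np.pi * alpha0 * t)   # periodic "R_x(t, tau)" at a fixed tau

# Finite-T version of Eq. (7): the average isolates the coefficient of
# e^{+i 2 pi alpha0 t}, which is 0.5 / 2 = 0.25 here.
R_alpha = np.mean(R * np.exp(-2j * np.pi * alpha0 * t))
print(abs(R_alpha))   # approximately 0.25
```

Because $\alpha_0 T$ is an integer here, the averages of the unwanted complex exponentials vanish and the coefficient is recovered essentially exactly.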

If the signal is a cycloergodic signal (or we are using fraction-of-time probability), then the CAFs can be obtained directly from a sample path (the signal itself),

$\displaystyle R_x^\alpha(\tau) = \lim_{T\rightarrow\infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t+\tau/2) x^* (t-\tau/2) e^{-i 2 \pi \alpha t} \, dt. \hfill (8)$
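As a quick illustrative sketch of (8) (the signal model and all parameter values are my own choices), consider amplitude-modulated noise, $x(t) = w(t)\cos(2\pi f_0 t)$ with white unit-variance $w(t)$: its lag-zero autocorrelation is $\cos^2(2\pi f_0 t)$, which is periodic with cycle frequency $2f_0$ and CAF value $1/4$.

```python
import numpy as np

# Sketch of a finite-time version of Eq. (8) at lag tau = 0 for a toy
# cyclostationary signal (amplitude-modulated noise). Signal model and
# parameter values are illustrative assumptions.
rng = np.random.default_rng(0)
N = 2**16
n = np.arange(N)
f0 = 0.05                                  # carrier frequency (cycles/sample)
w = rng.standard_normal(N)                 # white, unit-variance "message"
x = w * np.cos(2 * np.pi * f0 * n)

# At tau = 0 the lag product is just |x|^2. The doubled-carrier cycle
# frequency alpha = 2*f0 should carry a CAF magnitude of 1/4 for this signal.
alpha = 2 * f0
caf = np.mean(np.abs(x)**2 * np.exp(-2j * np.pi * alpha * n))
print(abs(caf))   # close to 0.25
```

The lag $\tau = 0$ is used deliberately: it sidesteps the half-sample shifts that the symmetric lags in (8) require in discrete time.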

For many cyclostationary signals, the conjugate autocorrelation function is also non-zero (and also useful). This function is defined by

$\displaystyle R_{x^*}(t, \tau) = E[x(t+\tau/2)x(t-\tau/2)], \hfill (9)$

and is represented by its own Fourier series

$\displaystyle R_{x^*}(t, \tau) = \sum_\alpha R_{x^*}^\alpha (\tau) e^{i 2 \pi \alpha t}. \hfill (10)$
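A toy example (my construction, not from the post) shows why the conjugate function carries information the non-conjugate one misses: for complex amplitude-modulated noise on a carrier, $x(t) = w(t)e^{i2\pi f_0 t}$, the carrier cancels in $x x^*$ but doubles in $x x$, so the cycle frequency $2f_0$ shows up only in the conjugate CAF.

```python
import numpy as np

# Toy example: the cycle frequency 2*f0 appears only in the conjugate CAF.
# Signal model and parameters are illustrative assumptions.
rng = np.random.default_rng(1)
N = 2**16
n = np.arange(N)
f0 = 0.05
w = rng.standard_normal(N)                 # real, white, unit variance
x = w * np.exp(2j * np.pi * f0 * n)        # complex AM signal on a carrier

alpha = 2 * f0
# Non-conjugate lag product at tau = 0: x * conj(x) = w^2 -- the carrier
# cancels, so 2*f0 is NOT a non-conjugate cycle frequency.
r_nc = np.mean(x * np.conj(x) * np.exp(-2j * np.pi * alpha * n))
# Conjugate lag product at tau = 0: x * x = w^2 e^{i 2 pi (2 f0) n} -- the
# carrier doubles, so 2*f0 IS a conjugate cycle frequency with value E[w^2] = 1.
r_c = np.mean(x * x * np.exp(-2j * np.pi * alpha * n))
print(abs(r_nc), abs(r_c))                 # near 0, near 1
```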

I explain in detail why we need two autocorrelation functions in a separate post. The problem is worse when we look at higher-order moments and cumulants, where we need $n/2 + 1$ functions to properly characterize a signal at (even) order $n$.

And that is it! These are the basic definitions for the (second-order) probabilistic parameters of cyclostationary signals in the time-domain. In later posts, I’ll have much to say about their utility, their estimation, and their connection to the frequency-domain parameters.

## 33 thoughts on “The Cyclic Autocorrelation”

1. Lelele Viyviy says:

Hello dear,

Would you share your code for the cyclic autocorrelation function?
Have a nice day.

2. Abdul says:

Hi Dr. Spooner,

Thank you for making this blog. Are there certain qualities of a signal that are more noticeable using the Cyclic Autocorrelation Function (CAF) versus the Spectral Correlation Function (SCF)? In other words, are there situations where the usage of the CAF is favorable over the SCF?

Thank you,
Abdul

• I think you asked this in a slightly different way in the comments to this post; see that answer. Thanks!

• manifest7 says:

Haha, my apologies. I initially expected my comment to be viewable immediately. I don’t normally use wordpress and didn’t realize that my comment was under moderation until the second time I tried to comment.

3. manifest7 says:

Hi Dr. Spooner,

Thank you for this blog. Are there situations where it would be advantageous to utilize the Cyclic Autocorrelation Function over the Spectral Correlation Function?

Thanks

• I’ve found the CAF to be useful when doing CSP for OFDM signals (see the work of Octavia Dobre). More generally, the relative merits of the two functions depend on how much prior information you have about your signals. When you don’t know anything, and you want to do RFSA (radio-frequency scene analysis), the SCF is quite useful because of FFT-based estimators like the SSCA and FAM. But if you know quite a lot, such as a cycle frequency and some CAF lags of interest, then just estimating the CAF over a limited range of lags can be quite inexpensive.

• manifest7 says:

Thank you for your response. I’ll take a look at that article and see how they used the CAF for OFDM signal detection. I’ve been meaning to learn about OFDM signals, so this paper will be a good motivator.

So in general, if we know certain characteristics of our signal (e.g. cycle frequencies, lags of interest, etc.), then the CAF would be a computationally inexpensive means of calculating Second Order Cyclostationary Features. Makes sense, thanks for the clarification!

4. liang says:

Hi Dr. Spooner. Great blog. I know that for ergodic signals, the time average equals the mean value. So there should be two integral symbols in equation 8. Why is there only one?
Thank you.

• Thanks Liang. We refer to “cycloergodicity” here, which means that ensemble averages in the stochastic framework equal the output of the sine-wave extraction operator (the expected value in the fraction-of-time probability framework):

$E[F(X(t))] = E^{\{\alpha\}}[F(x(t))]$

where $X(t)$ is the random process and $x(t)$ is a sample path thereof. When cycloergodicity holds, we can obtain the cyclic autocorrelation function directly from the second-order lag product for almost all sample paths (almost all == with probability one).

Agree?

• liang says:

What does $F$ mean? And $\alpha$? Can you point me to some reference papers that prove this equation? I know little about cycloergodicity. Thanks.

• $F$ is just some functional, like $F(x(t)) = x(t+\tau/2)x^*(t-\tau/2)$. $\alpha$ denotes a cycle frequency. Or, more generally, the frequency of a finite-strength additive sine-wave component in $F(x(t))$. For information on cycloergodicity and the fraction-of-time probability framework, see The Literature [R8, R67, R68].

• liang says:

So for equation 8, can I calculate the CAF with fft[x*conj(x)]? Because it has the same form as a DFT operation.

• Well, you can estimate the cyclic autocorrelation by picking out one element of the FFT of the lag product. But notice that the cyclic autocorrelation function is a limiting version of a time average. Because there is no guarantee that the cycle frequency $\alpha$ is exactly equal to a Fourier frequency in the DFT, you’ll get better results by just computing the single DFT directly. That works fine if you know the cycle frequency $\alpha$ in advance. If you don’t, you have to search for the cycle frequencies, and that is best done using the spectral correlation function and the strip spectral correlation analyzer.
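To illustrate the difference numerically (a toy sketch of my own; the signal and all parameter values are illustrative assumptions), put the cycle frequency exactly halfway between two FFT bin frequencies, the worst case for bin-picking, and compare the direct single-frequency DFT with the nearest FFT bin of the lag product.

```python
import numpy as np

# Compare a direct single-frequency DFT at the exact cycle frequency with the
# nearest FFT bin of the lag product. Signal and parameters are illustrative.
rng = np.random.default_rng(2)
N = 2**16
n = np.arange(N)
alpha = (N // 10 + 0.5) / N        # deliberately halfway between two FFT bins
f0 = alpha / 2
w = rng.standard_normal(N)
x = w * np.cos(2 * np.pi * f0 * n)
lag_product = x * x                # tau = 0 lag product; true CAF value is 1/4

# Direct single-frequency DFT at the exact cycle frequency
direct = np.mean(lag_product * np.exp(-2j * np.pi * alpha * n))

# Nearest FFT bin of the lag product (normalized the same way)
X = np.fft.fft(lag_product) / N
k = int(round(alpha * N))          # nearest bin index
binned = X[k]

print(abs(direct), abs(binned))    # direct is near 0.25; the bin is attenuated
```

The direct computation returns a magnitude near the true value of $0.25$, while the best FFT bin is attenuated by roughly the half-bin Dirichlet-kernel factor $2/\pi \approx 0.64$.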