The Cyclic Autocorrelation

In this post, I introduce the cyclic autocorrelation function (CAF). The easiest way to do this is to first review the conventional autocorrelation function. Suppose we have a complex-valued signal x(t) defined on a suitable probability space. Then the mean value of x(t) is given by

M_x(t, \tau) = E[x(t + \tau)]. \hfill (1)

For stationary signals, and many cyclostationary signals, this mean value is independent of the lag parameter \tau, so that

\displaystyle M_x(t, \tau_1) = M_x(t, \tau_2) = M_x(t, 0) = M_x(t). \hfill (2)

The autocorrelation function is the correlation between the random variables corresponding to two time instants of the random signal, or

\displaystyle R_x(t_1, t_2) = E[x(t_1)x^*(t_2)]. \hfill (3)

To see how the autocorrelation varies with some particular central time t, it is convenient to reparameterize the two time instants t_1 and t_2 in terms of their center point and separation, such as

\displaystyle R_x(t, \tau) = E[x(t+\tau/2)x^*(t-\tau/2)], \hfill (4)

where

\displaystyle t_1 = t+\tau/2, \qquad t_2 = t-\tau/2,

or, equivalently,

\displaystyle t = (t_1 + t_2)/2, \qquad \tau = t_1 - t_2.
So time t represents the center point of the two time instants and \tau is their separation. If the autocorrelation depends only on the separation between the two time instants \tau, and not their center point t, the signal is stationary of order two, or just stationary, and we have

\displaystyle R_x(t_1, \tau) = R_x(t_2, \tau) = R_x(0, \tau) = R_x(\tau). \hfill (5)

For nonstationary signals, on the other hand, the autocorrelation does depend on central time t. For the special case of nonstationary signals called cyclostationary signals, the autocorrelation is either a periodic function or an almost periodic function. In either case, it can be represented by a Fourier series

\displaystyle R_x(t, \tau) =  \sum_\alpha R_x^\alpha (\tau) e^{i 2 \pi \alpha t}, \hfill (6)

where R_x^\alpha(\tau) is a Fourier-series coefficient called the cyclic autocorrelation function. The Fourier frequencies \alpha are called cycle frequencies (CFs). The CAFs are obtained in the usual way for Fourier coefficients,

\displaystyle  R_x^\alpha(\tau) =  \lim_{T\rightarrow\infty} \frac{1}{T} \int_{-T/2}^{T/2} R_x(t,\tau) e^{-i 2 \pi \alpha t}\,dt. \hfill (7)

If the signal is a cycloergodic signal (or we are using fraction-of-time probability), then the CAFs can be obtained directly from a sample path (the signal itself),

\displaystyle R_x^\alpha(\tau) = \lim_{T\rightarrow\infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t+\tau/2) x^* (t-\tau/2) e^{-i 2 \pi \alpha t} \, dt. \hfill (8)
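Equation (8) translates directly into a short estimator. The following Python sketch (mine, not from any CSP Blog code release; all parameter values are arbitrary choices) estimates the CAF of a rectangular-pulse BPSK signal from a single sample path. It uses the asymmetric lag product x(t+\tau)x^*(t), which differs from the symmetric-lag form in (8) only by a phase factor e^{-i\pi\alpha\tau}.

```python
import numpy as np

rng = np.random.default_rng(0)

# Rectangular-pulse BPSK: N samples per bit, so the bit-rate cycle
# frequency is alpha = 1/N (and its harmonics).
N = 10                 # samples per bit (arbitrary choice)
num_bits = 20000
bits = rng.choice([-1.0, 1.0], size=num_bits)
x = np.repeat(bits, N).astype(complex)

def caf_estimate(x, alpha, tau):
    """Time-average CAF estimate using the asymmetric lag product
    x[n+tau] x*[n]; cf. the symmetric-lag definition in (8)."""
    n = np.arange(len(x) - tau)
    lag_product = x[n + tau] * np.conj(x[n])
    return np.mean(lag_product * np.exp(-2j * np.pi * alpha * n))

tau = N // 2
caf_at_cf = caf_estimate(x, 1.0 / N, tau)            # a true cycle frequency
caf_off_cf = caf_estimate(x, 1.0 / N + 0.013, tau)   # not a cycle frequency
print(abs(caf_at_cf), abs(caf_off_cf))
```

At the bit-rate cycle frequency \alpha = 1/N the estimate converges to a nonzero value (on the order of 0.3 for these parameters), while at a frequency that is not a cycle frequency it decays toward zero as the averaging time grows.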

For many cyclostationary signals, such as BPSK, the conjugate autocorrelation function is also non-zero (and also useful). This function is defined by

\displaystyle R_{x^*}(t, \tau) = E[x(t+\tau/2)x(t-\tau/2)], \hfill (9)

and is represented by its own Fourier series

\displaystyle R_{x^*}(t, \tau) = \sum_\alpha R_{x^*}^\alpha (\tau) e^{i 2 \pi \alpha t}. \hfill (10)
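For a concrete case where (9) and (10) matter, consider BPSK on a complex carrier: its conjugate CAF has a strong cycle frequency at twice the carrier frequency even at lag zero, where the nonconjugate CAF of this constant-envelope signal shows nothing. A minimal Python sketch (my own construction; the parameter values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

# BPSK on a complex carrier fc (cycles/sample). Rectangular pulses,
# N samples per bit; all parameter values are arbitrary.
N, num_bits, fc = 8, 10000, 0.05
bits = rng.choice([-1.0, 1.0], size=num_bits)
t = np.arange(num_bits * N)
x = np.repeat(bits, N) * np.exp(2j * np.pi * fc * t)

def conj_caf_tau0(x, alpha):
    # Conjugate CAF at tau = 0: average of x[n]^2 e^{-j 2 pi alpha n}
    n = np.arange(len(x))
    return np.mean(x * x * np.exp(-2j * np.pi * alpha * n))

def caf_tau0(x, alpha):
    # Nonconjugate CAF at tau = 0: average of |x[n]|^2 e^{-j 2 pi alpha n}
    n = np.arange(len(x))
    return np.mean(np.abs(x) ** 2 * np.exp(-2j * np.pi * alpha * n))

# x[n]^2 = e^{j 4 pi fc n} exactly (the bits square to 1), so the
# conjugate CAF at alpha = 2*fc is 1, while the nonconjugate CAF at
# that alpha vanishes (|x[n]|^2 is constant for this signal).
conj_val = conj_caf_tau0(x, 2 * fc)
nonconj_val = caf_tau0(x, 2 * fc)
print(abs(conj_val), abs(nonconj_val))
```

This is the doubled-carrier feature that makes the conjugate CAF so useful for BPSK detection and carrier estimation.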

I explain in detail why we need two autocorrelation functions in the post on conjugation configurations. The problem is worse when we look at higher-order moments and cumulants, where we need n/2 + 1 functions to properly characterize a signal at order n.

And that is it! These are the basic definitions for the (second-order) probabilistic parameters of cyclostationary signals in the time domain. In later posts, I’ll have much to say about their utility, their estimation, their connection to the frequency-domain parameters, and their generalization to higher-order parameters.




21 thoughts on “The Cyclic Autocorrelation”

  1. Abdul says:

    Hi Dr. Spooner,

    Thank you for making this blog. Are there certain qualities of a signal that are more noticeable using the Cyclic Autocorrelation Function (CAF) versus the Spectral Correlation Function (SCF)? In other words, are there situations where the usage of the CAF is favorable over the SCF?

    Thank you,

      • manifest7 says:

        Haha, my apologies. I initially expected my comment to be viewable immediately. I don’t normally use WordPress and didn’t realize that my comment was under moderation until the second time I tried to comment.

  2. manifest7 says:

    Hi Dr. Spooner,

    Thank you for this blog. Are there situations where it would be advantageous to utilize the Cyclic Autocorrelation Function over the Spectral Correlation Function?


    • I’ve found the CAF to be useful when doing CSP for OFDM signals (see the work of Octavia Dobre). I think the general answer to your question is that the advantages of the two functions depend on how much prior information you have about your signals. When you don’t know anything, and you want to do RFSA, the SCF is quite useful because of FFT-based estimators like the SSCA and FAM. But if you know quite a lot, such as a cycle frequency and some CAF lags of interest, then just estimating the CAF over a limited range of lags can be quite inexpensive.
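The “quite inexpensive” point can be made concrete with a little Python (my own sketch, using the asymmetric-lag convention and arbitrary parameter values): with a known cycle frequency and a handful of lags of interest, each CAF point is a single rotate-and-average pass over the data.

```python
import numpy as np

def caf_known_cf(x, alpha, taus):
    # One rotate-and-average pass of O(len(x)) work per lag; no FFTs,
    # no search over candidate cycle frequencies.
    x = np.asarray(x, dtype=complex)
    rot = np.exp(-2j * np.pi * alpha * np.arange(len(x)))
    return np.array([
        np.mean(x[tau:] * np.conj(x[: len(x) - tau]) * rot[: len(x) - tau])
        for tau in taus
    ])

# Rect-pulse BPSK, 10 samples per bit, so the known bit-rate cycle
# frequency is 0.1 (all values arbitrary).
rng = np.random.default_rng(2)
x = np.repeat(rng.choice([-1.0, 1.0], size=20000), 10)
caf_slice = caf_known_cf(x, alpha=0.1, taus=[3, 4, 5, 6, 7])
print(np.abs(caf_slice))
```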

      • manifest7 says:

        Thank you for your response. I’ll take a look at that article and see how they used the CAF for OFDM signal detection. I’ve been meaning to learn about OFDM signals, so this paper will be a good motivator.

        So in general, if we know certain characteristics of our signal (e.g. cycle frequencies, lags of interest, etc.), then the CAF would be a computationally inexpensive means of calculating Second Order Cyclostationary Features. Makes sense, thanks for the clarification!

  3. liang says:

    Hi Dr. Spooner. Great blog. I know that for ergodic signals the time average equals the ensemble mean. So shouldn’t there be two integral symbols in Equation (8)? Why did you delete one?
    Thank you.

    • Thanks Liang. We refer to “cycloergodicity” here, which means that ensemble averages in the stochastic framework equal the output of the sine-wave extraction operator (the expected value in the fraction-of-time probability framework):

      E[F(X(t))] = E^{\{\alpha\}}[F(x(t))]

      where X(t) is the random process and x(t) is a sample path thereof. When cycloergodicity holds, we can obtain the cyclic autocorrelation function directly from the second-order lag product for almost all sample paths (almost all == with probability one).


      • liang says:

        What does F mean? And \alpha? Can you give me some reference papers in which this equation is proved? I know little about cycloergodicity. Thanks.

        • F is just some functional, like F(x(t)) = x(t+\tau/2)x^*(t-\tau/2). \alpha denotes a cycle frequency. Or, more generally, the frequency of a finite-strength additive sine-wave component in F(x(t)). For information on cycloergodicity and the fraction-of-time probability framework, see The Literature [R8, R67, R68].

        • Well, you can estimate the cyclic autocorrelation by picking out one element of the FFT of the lag product. But notice that the cyclic autocorrelation function is a limiting version of a time average. Because there is no guarantee that the cycle frequency \alpha is exactly equal to a Fourier frequency in the DFT, you’ll get better results by just computing the single DFT directly. That works fine if you know the cycle frequency \alpha in advance. If you don’t, you have to search for the cycle frequencies, and that is best done using the spectral correlation function and the strip spectral correlation analyzer.
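Here is a small numerical illustration of that point (my own sketch; the asymmetric lag product and all parameter values are assumptions): when the cycle frequency falls between Fourier frequencies of the DFT, picking the nearest FFT bin of the lag product attenuates the estimate, while the direct single-frequency sum at the exact \alpha does not.

```python
import numpy as np

rng = np.random.default_rng(3)

# Rect-pulse BPSK, 10 samples per bit: true cycle frequency alpha = 0.1.
N, tau, alpha = 10, 5, 0.1
x = np.repeat(rng.choice([-1.0, 1.0], size=2000), N).astype(complex)

T = 2 ** 14                                  # analysis length
lp = x[tau:tau + T] * np.conj(x[:T])         # lag product x[n+tau] x*[n]
n = np.arange(T)

# Direct single-frequency sum at the exact cycle frequency:
direct = np.mean(lp * np.exp(-2j * np.pi * alpha * n))

# Nearest-bin FFT pick: alpha*T = 1638.4 is not an integer, so the
# chosen bin sits 0.4 bin away from the true cycle frequency and the
# estimate is attenuated by the Dirichlet-kernel rolloff.
X = np.fft.fft(lp) / T
bin_pick = X[int(round(alpha * T))]
print(abs(direct), abs(bin_pick))
```

For these parameters the nearest-bin magnitude comes out noticeably low relative to the direct sum; zero-padding the lag product before the FFT narrows the gap but only the exact-frequency sum avoids it entirely.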

  4. Sara says:

    Hello Sir,
    Thank you so much for your valuable information. I just wanted to ask how to cite this tutorial?
    Thanks in advance

  5. Johann says:

    Dear Dr. Spooner,

    Many thanks for such an extensive blog on cyclostationarity.

    I am new to cyclostationary analysis, though I have been reading through many of your posts, in particular the one on the cyclic autocorrelation.

    You wrote in one of your replies that the CAF can be useful for OFDM signals, especially when the properties of the signal are known.

    I also read several publications from Octavia Dobre, as you suggested. One of her recent publications (see the link below) is about identification of GSM and LTE signals.

    I have downloaded the sample LTE signal from your “Data Sets” and used it for testing the algorithm from the paper above. The algorithm seemed to be straightforward, however I didn’t obtain the expected results. Basically I computed the CCF at CF alpha. The CF for LTE is known, i.e. 2 kHz (corresponds to 0.5 ms time slot).

    I would like to know if the LTE sample file in your data set is based on OTA measurement or a simulated signal?

    Do you think I should better use SCF instead of CAF for this purpose?

    Thanks in advance.

    Best regards,

  6. Johann says:

    Hi Dr. Spooner,

    Thank you for your prompt reply. I see, I will stick with the CAF then.

    I computed the CAF with a zero delay only, as described in the paper. Maybe I need to include a range of delay values, as you suggested?

    So far I haven’t validated my CAF estimator with a simpler simulated signal. I guess I should do that first.

    I will keep you updated. Thanks again for your help.

    Best regards,

  7. Johann says:

    Dear Dr. Spooner,

    I managed to get similar results to the paper using your LTE and GSM data sets. There was a bug in my original code.

    The corresponding CAF plots can be found on the links below. It’s my first time using imgur, so I don’t know for how long the links will be available.

    For your information, I am still using a zero delay. The CFs for LTE and GSM are equal to multiples of 2 kHz and 1733 Hz, respectively.

    Could you elaborate why you think using a range of delay values might be better?
    Thanks in advance.

    Best regards,

  8. Johann says:

    Hi Dr. Spooner,

    Many thanks for your reply and the nice plots.

    I will try to replicate your plots.

    Best regards,
