The Spectral Correlation Function

Spectral correlation is perhaps the most widely used characterization of the cyclostationarity property. The main reason is that the computational efficiency of the FFT can be harnessed to estimate the spectral correlation of a given signal or data set at a reasonable total computational cost, so that one doesn’t have to wait too long for the result.

Just as the normal power spectrum is actually the power spectral density, or more accurately, the spectral density of time-averaged power (variance), the spectral correlation function is the spectral density of time-averaged correlation (covariance). What does this mean? Consider the following schematic showing two narrowband spectral components of an arbitrary signal:


Figure 1. Illustration of the concept of spectral correlation. The time-series represented by the narrowband spectral components centered at f-\alpha/2 and f+\alpha/2 are downconverted to zero frequency and their correlation is measured. When \alpha=0, the result is the power spectral density function; otherwise it is referred to as the spectral correlation function. It is non-zero only for a countable set of numbers \{\alpha\}, which are equal to the frequencies of sine waves that can be generated by quadratically transforming the data.

The sequence of shaded rectangles on the left is meant to represent a time-series corresponding to the output of a bandpass filter centered at f-\alpha/2 with bandwidth B. Similarly, the sequence of shaded rectangles on the right represents a time-series corresponding to the output of a bandpass filter centered at f+\alpha/2 with bandwidth B.

Let’s call the first time-series Y_1(t, f-\alpha/2) and the second one Y_2(t, f+\alpha/2). Since these time-series, or signals, are bandpass in general, if we attempt to measure their correlation we will get a small value if \alpha \neq 0. However, if we downconvert each of them to baseband (zero frequency), we will obtain lowpass signals, and there is the possibility that these new signals are correlated to some degree.

As a limiting case, suppose each of the Y_j(t, \cdot) signals is a noiseless sine wave. By the construction of the figure, their frequencies must be different if \alpha \neq 0, and so their correlation will be zero. However, if each sine wave is perfectly downconverted to baseband, the resulting signals are simply complex-valued constants, and the correlation between the two constant time-series is high.
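This limiting case is easy to check numerically. Below is a tiny Python/NumPy sketch (the tone frequencies and record length are arbitrary choices for illustration, not values taken from this post): two tones at different frequencies are essentially uncorrelated, but after each is downconverted to zero frequency the resulting constant signals are perfectly correlated.

```python
import numpy as np

n = np.arange(100000)
y1 = np.exp(2j*np.pi*0.10*n)   # tone at f - alpha/2 = 0.10 (illustrative values)
y2 = np.exp(2j*np.pi*0.15*n)   # tone at f + alpha/2 = 0.15, so alpha = 0.05

def corr_mag(a, b):
    """Magnitude of the sample correlation coefficient between a and b."""
    return np.abs(np.vdot(b, a)) / np.sqrt(np.vdot(a, a).real * np.vdot(b, b).real)

print(corr_mag(y1, y2))                              # ~0: different frequencies
print(corr_mag(y1 * np.exp(-2j*np.pi*0.10*n),        # downconvert each tone to
               y2 * np.exp(-2j*np.pi*0.15*n)))       # zero frequency -> 1.0
```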

In general, the narrowband time-series Y_1(t, \cdot) and Y_2(t, \cdot) are not simple sine waves but complicated random processes. Even so, the correlation between separated spectral components, which is what we mean by spectral correlation, is a highly useful characterization for a large class of interesting signals.

The spectral components (the individual downconverted narrowband components of the signal) are most often obtained by using the Fourier transform. As a transform of length T slides along the signal x(t), it produces a set of downconverted spectral components, each with approximate bandwidth 1/T. The two time-series of interest are then denoted X_T(t, f-\alpha/2) and X_T(t, f+\alpha/2). A measure of the spectral correlation is given by the limiting average of the cyclic periodogram, which is defined by

\displaystyle I_T^\alpha (t, f) = \frac{1}{T} X_T(t, f-\alpha/2) X_T^*(t, f+\alpha/2), \hfill (1)

as the amount of processed data increases without bound and then the spectral resolution width (B = 1/T) is allowed to decrease to zero,

\displaystyle S_x^\alpha (f) = \lim_{T\rightarrow\infty} \lim_{U\rightarrow\infty} \displaystyle\frac{1}{U} \int_{-U/2}^{U/2} I_T^\alpha(t, f) \, dt. \hfill (2)
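For readers who want to experiment, here is a minimal Python/NumPy sketch of a time-smoothing estimate built directly from (1) and (2): split the record into length-T blocks, form each block’s cyclic periodogram, and average. It is only a sketch under assumed conventions (normalized sampling rate, no data tapering, my own function name scf_tsm), not the estimator used to produce the figures in this post.

```python
import numpy as np

def scf_tsm(x, alpha, block_len=256):
    """Sketch of a time-smoothed cyclic periodogram, Eqs. (1)-(2).
    Frequencies (including alpha) are normalized to a unit sampling rate."""
    n_blocks = len(x) // block_len
    acc = np.zeros(block_len, dtype=complex)
    for k in range(n_blocks):
        t = k*block_len + np.arange(block_len)    # absolute sample times of this block
        seg = x[k*block_len:(k+1)*block_len]
        X_lo = np.fft.fft(seg * np.exp(+1j*np.pi*alpha*t))   # X_T(t, f - alpha/2)
        X_hi = np.fft.fft(seg * np.exp(-1j*np.pi*alpha*t))   # X_T(t, f + alpha/2)
        acc += X_lo * np.conj(X_hi) / block_len              # cyclic periodogram, Eq. (1)
    freqs = np.fft.fftshift(np.fft.fftfreq(block_len))
    return np.fft.fftshift(acc / n_blocks), freqs
```

Using the absolute sample times in the two exponentials keeps the blocks’ cyclic periodograms phase-aligned, so that they average coherently for any value of alpha.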

The limit spectral correlation function in (2) is a time-smoothed (time-averaged) cyclic periodogram. But the limit function can also be obtained by frequency-smoothing the cyclic periodogram,

\displaystyle S_x^\alpha(f) = \lim_{\Delta\rightarrow 0} \lim_{T\rightarrow\infty} g_\Delta(f) \otimes I_T^\alpha(t, f), \hfill (3)

where g_\Delta(f) is a unit-area pulse-like smoothing kernel (such as a rectangle). In (3), the symbol \otimes denotes convolution.
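The frequency-smoothing route of (3) is even shorter to sketch. In the hedged Python/NumPy snippet below (again with assumed names and a unit sampling rate), a single full-length cyclic periodogram is convolved with a unit-area rectangular kernel of width smooth_bins FFT bins.

```python
import numpy as np

def scf_fsm(x, alpha, smooth_bins=101):
    """Sketch of a frequency-smoothed cyclic periodogram, Eq. (3)."""
    N = len(x)
    n = np.arange(N)
    X_lo = np.fft.fft(x * np.exp(+1j*np.pi*alpha*n))   # X(f - alpha/2) on the FFT grid
    X_hi = np.fft.fft(x * np.exp(-1j*np.pi*alpha*n))   # X(f + alpha/2) on the FFT grid
    I = X_lo * np.conj(X_hi) / N                       # cyclic periodogram, Eq. (1)
    g = np.ones(smooth_bins) / smooth_bins             # unit-area rectangle g_Delta(f)
    S = np.convolve(np.fft.fftshift(I), g, mode='same')
    return S, np.fft.fftshift(np.fft.fftfreq(N))
```

Here the kernel width \Delta corresponds to smooth_bins/N in normalized frequency.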

The Significance of the Frequency \alpha

The spectral correlation function (SCF) S_x^\alpha(f) is typically zero for almost all real numbers \alpha. Those values of \alpha for which the SCF is not identically zero are called cycle frequencies (CFs). The set of SCF cycle frequencies is exactly the same as the set of cycle frequencies for the cyclic autocorrelation function (CAF)! That is, the separation \alpha between correlated narrowband spectral components of x(t) is also the frequency of a sine wave that can be generated by a quadratic nonlinearity (for example, a squarer or a delay-and-multiply device) operating on the original time-series data x(t).
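To make the quadratic-nonlinearity statement concrete, here is a small hedged Python/NumPy demonstration (the BPSK parameters are arbitrary choices for this illustration, not taken from the rest of the post): squaring a noisy complex BPSK signal produces a spectral line at twice the carrier frequency, and a delay-and-multiply device produces lines at harmonics of the bit rate.

```python
import numpy as np

rng = np.random.default_rng(0)
# Rectangular-pulse BPSK with bit rate 1/8 and carrier 0.1 (assumed, normalized).
x = np.repeat(rng.integers(0, 2, 20000)*2.0 - 1.0, 8)       # 8 samples per bit
n = np.arange(len(x))
x = x*np.exp(2j*np.pi*0.1*n) + 0.2*rng.standard_normal(len(x))

# Squarer: x^2 contains a sine wave at twice the carrier (0.2).
F2 = np.abs(np.fft.fft(x**2))
print(np.fft.fftfreq(len(x))[np.argmax(F2)])                 # approximately 0.2

# Delay-and-multiply: the lag product contains sine waves at harmonics of 1/8.
lp = x[4:] * np.conj(x[:-4])                                 # delay of half a bit
lp = lp[:(len(lp)//8)*8]                                     # put 1/8 exactly on the FFT grid
Fd = np.abs(np.fft.fft(lp))
k = np.argmin(np.abs(np.fft.fftfreq(len(lp)) - 1/8))
print(Fd[k] / np.median(Fd))                                 # the bit-rate line towers over the background
```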

The Cyclic Wiener Relationship

It can be shown that the Fourier transform of the CAF is equal to the SCF (The Literature [R1], My Papers [5,6]):

\displaystyle S_x^\alpha(f) = \int_{-\infty}^\infty R_x^\alpha(\tau) e^{-i 2 \pi f \tau}\, d\tau, \hfill (4)

which is called the cyclic Wiener relationship. The Wiener relationship (sometimes called the Wiener-Khintchine theorem) is the name given to the familiar Fourier transform relation between the conventional power spectral density and the autocorrelation

\displaystyle S_x^0 (f) = \int_{-\infty}^\infty R_x^0(\tau) e^{-i 2 \pi f \tau} \, d\tau, \hfill (5)

where S_x^0(f) is the conventional power spectrum and R_x^0(\tau) is the conventional autocorrelation function.

It follows that the cyclic autocorrelation function is the inverse Fourier transform of the spectral correlation function,

\displaystyle R_x^\alpha(\tau) = \int_{-\infty}^\infty S_x^\alpha(f) e^{i 2 \pi f \tau} \, df \hfill (4a)

and the normal autocorrelation is the inverse transform of the power spectral density

\displaystyle R_x^0(\tau) = \int_{-\infty}^\infty S_x^0(f) e^{i 2 \pi f \tau} \, df .\hfill (5a)

The mean-square (power) of the time-series x(t) (or variance if the time-series has a zero mean value) is simply the autocorrelation evaluated at \tau = 0. This implies that the power of the time-series is the integral of the power spectral density

\displaystyle P_x = R_x^0(0) = \int_{-\infty}^\infty S_x^0(f) \, df. \hfill (5b)
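Equation (5b) is easy to verify numerically. The hedged snippet below (white Gaussian noise is used only for convenience) integrates a crude PSD estimate, the periodogram, over frequency and compares the result to the time-averaged power; the two agree, which is just Parseval’s relation in this discrete setting.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(2**16)              # any data record works; noise for convenience
N = len(x)
psd = np.abs(np.fft.fft(x))**2 / N          # periodogram: a crude PSD estimate
df = 1.0 / N                                # FFT bin width in normalized frequency
print(np.sum(psd) * df)                     # integral of the PSD estimate, Eq. (5b)
print(np.mean(np.abs(x)**2))                # time-averaged power P_x (same value)
```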

Conjugate Spectral Correlation

This post has defined the non-conjugate spectral correlation function, which is the correlation between X_T(t, f-\alpha/2) and X_T(t, f+\alpha/2). (The correlation between random variables X and Y is defined as E[XY^*]; that is, the standard correlation includes a conjugation.)

The conjugate SCF is defined as the Fourier transform of the conjugate cyclic autocorrelation function,

\displaystyle S_{x^*}^\alpha (f) = \int_{-\infty}^\infty R_{x^*}^\alpha (\tau) e^{-i 2 \pi f \tau}\, d\tau . \hfill (6)

From this definition, it can be shown that the conjugate SCF is the density of time-averaged correlation between X_T(t, f+\alpha/2) and X_T^*(t, \alpha/2-f),

\displaystyle S_{x^*}^\alpha (f) = \lim_{T\rightarrow\infty} \lim_{U\rightarrow\infty} \displaystyle\frac{1}{U} \int_{-U/2}^{U/2} J_T^\alpha(t, f) \, dt, \hfill (7)

where J_T^\alpha(t, f) is the conjugate cyclic periodogram

J_T^\alpha(t,f) = \displaystyle \frac{1}{T} X_T(t, f+\alpha/2) X_T(t, \alpha/2-f). \hfill (8)

The detailed explanation for why we need two kinds of spectral correlation functions (and, correspondingly, two kinds of cyclic autocorrelation functions) can be found in the post on conjugation configurations.
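For completeness, here is a hedged Python/NumPy sketch of a frequency-smoothed estimate built on (7) and (8), in the same spirit as the non-conjugate sketch earlier in the post (assumed names, unit sampling rate). Note that neither factor is conjugated and that the second factor is frequency-reversed.

```python
import numpy as np

def conj_scf_fsm(x, alpha, smooth_bins=101):
    """Sketch of a frequency-smoothed conjugate cyclic periodogram, Eqs. (7)-(8)."""
    N = len(x)
    n = np.arange(N)
    X_hi = np.fft.fft(x * np.exp(-1j*np.pi*alpha*n))   # X(f + alpha/2) on the FFT grid
    X_rev = np.roll(X_hi[::-1], 1)                     # X(-f + alpha/2) = X(alpha/2 - f)
    J = X_hi * X_rev / N                               # conjugate cyclic periodogram, Eq. (8)
    g = np.ones(smooth_bins) / smooth_bins             # unit-area smoothing kernel
    S = np.convolve(np.fft.fftshift(J), g, mode='same')
    return S, np.fft.fftshift(np.fft.fftfreq(N))
```

Applied to a signal like the BPSK example discussed below, this estimate evaluated at the doubled-carrier cycle frequency should reveal the prominent conjugate feature shown in Figure 2.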

Illustrations

The SCF below is estimated (more on that estimation in another post) from a simulated BPSK signal with a bit rate of 333.3 kHz and a carrier frequency of 100 kHz. A small amount of noise is added to the signal prior to SCF estimation. The (non-conjugate) SCF shows the power spectrum for \alpha = 0 and the bit-rate feature for \alpha = 333.3 kHz. The conjugate SCF plot shows the prominent feature for the doubled-carrier cycle frequency \alpha = 200.0 kHz, as well as features offset from the doubled-carrier feature by \pm 333.3 kHz. More on the spectral correlation of the BPSK signal can be found here and here. The spectral correlation surfaces for a variety of communication signals can be found in this gallery post.

A closely related function called the spectral coherence function is useful for blindly detecting cycle frequencies exhibited by arbitrary data sets.


Figure 2. Non-conjugate and conjugate spectral correlation functions for a BPSK signal with a bit rate of 333.3 kHz and a carrier frequency of 100 kHz.
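For readers who want to reproduce something in the spirit of Figure 2, the hedged snippet below generates a rectangular-pulse BPSK signal and applies the scf_tsm sketch from earlier in this post. The 1 MHz sampling rate is my own assumption (the post does not state one); it makes the 333.3 kHz bit rate and 100 kHz carrier the normalized values 1/3 and 0.1.

```python
import numpy as np
# Assumes the scf_tsm function sketched earlier in this post has been defined.

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 100000)*2.0 - 1.0
x = np.repeat(bits, 3).astype(complex)          # 3 samples/bit at an assumed fs of 1 MHz
n = np.arange(len(x))
x = x*np.exp(2j*np.pi*0.1*n)                    # carrier at 100 kHz (normalized 0.1)
x += 0.1*(rng.standard_normal(len(x)) + 1j*rng.standard_normal(len(x)))  # a little noise

S_psd,  f = scf_tsm(x, alpha=0.0)   # alpha = 0: the power spectrum
S_rate, f = scf_tsm(x, alpha=1/3)   # alpha = bit rate: the feature centered at the carrier
```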

Now consider a similar signal: QPSK with rectangular pulses. Let’s switch to normalized frequencies here for convenience. The signal has a symbol rate of 1/10, a carrier frequency of 0.05, unit power, and a small amount of additive white Gaussian noise. A power spectrum estimate is shown in the following figure:


Figure 3. The power spectrum of a rectangular-pulse QPSK signal and the spectra of four selected narrowband components of the signal. The symbol rate is (normalized) 0.1 and the carrier frequency is 0.05.

Consider also four distinct narrowband (NB) components of this QPSK signal as shown in the figure. The center frequencies are 0.0, 0.1, 0.08, and 0.18. We know that this signal has non-conjugate cycle frequencies that are equal to harmonics of the symbol rate, or k/10 for k = 0, \pm 1, \pm 2, \ldots. This means that the NB components with separations k/10 are correlated. So if we extract such NB components “by hand” and calculate their correlation coefficients as a function of relative delay, we should see large results for the pairs (0.0, 0.1) and (0.08, 0.18), and small results for all other pairs drawn from the four frequencies.

So let’s do that. We apply a simple Fourier-based ideal filter (ideal meaning a rectangular passband) with center frequency f_0 \in \{0.0, 0.1, 0.08, 0.18\}, frequency-shift each filtered signal to complex baseband (zero center frequency), and decimate. The results are our narrowband signal components. Are they correlated when they should be and uncorrelated when they should be?
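Here is a hedged Python/NumPy sketch of that experiment. The bandwidth (B = 0.02), the decimation factor, and the helper name nb_component are my own choices, and only the zero-delay correlation coefficient is computed rather than the full delay sweep shown in Figure 4.

```python
import numpy as np

rng = np.random.default_rng(1)

# Rectangular-pulse QPSK: symbol rate 0.1, carrier 0.05, unit power, a little noise.
n_sym, sps = 10000, 10
syms = np.exp(1j*(np.pi/4 + np.pi/2*rng.integers(0, 4, n_sym)))
x = np.repeat(syms, sps)                                  # rectangular pulses
n = np.arange(len(x))
x = x * np.exp(2j*np.pi*0.05*n)                           # carrier at 0.05
x += 0.05*(rng.standard_normal(len(x)) + 1j*rng.standard_normal(len(x)))

def nb_component(x, f0, B=0.02, dec=25):
    """Ideal (rectangular-passband) filter at f0, shift to zero frequency, decimate."""
    N = len(x)
    X = np.fft.fft(x)
    freqs = np.fft.fftfreq(N)                             # normalized frequencies
    X[np.abs(freqs - f0) > B/2] = 0.0                     # keep only the band around f0
    y = np.fft.ifft(X) * np.exp(-2j*np.pi*f0*np.arange(N))  # downconvert f0 -> 0
    return y[::dec]                                       # decimate (new rate 0.04 > B)

centers = [0.0, 0.1, 0.08, 0.18]
y = {f0: nb_component(x, f0) for f0 in centers}

# Zero-delay correlation coefficients: large only when the separation between
# center frequencies is a (nonzero) harmonic of the symbol rate 0.1.
for i, f1 in enumerate(centers):
    for f2 in centers[i+1:]:
        a, b = y[f1], y[f2]
        rho = np.abs(np.vdot(b, a)) / np.sqrt(np.vdot(a, a).real * np.vdot(b, b).real)
        print(f"({f1:.2f}, {f2:.2f}): |rho| = {rho:.2f}")
```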

Here are the correlation-coefficient results:


Figure 4. Correlation coefficients for various pairs of the narrowband components of the QPSK signal shown in Figure 3. Large correlation coefficients are obtained only when the difference between the narrowband components’ center frequencies equals a harmonic of the symbol rate, k/10.

Here the signals y_j(t) correspond to the center frequencies \{0.0, 0.1, 0.08, 0.18\}. So the spectral correlation concept is verified: the only large correlation coefficients are those corresponding to a spectral-component separation that is equal to a cycle frequency. One can also simply plot the decimated, shifted narrowband components and assess their correlation visually:


Figure 5. Time-series plots of the narrowband components of the QPSK signal shown in Figure 3 (after frequency shifting to zero frequency and decimating). Which pairs appear to be correlated?

Estimators

I’ve written several posts on estimators for the spectral correlation function; they are listed below. I think of them as falling into two categories: exhaustive and focused. For exhaustive spectral correlation estimators, the goal is to estimate the function over its entire (non-redundant) domain of definition as efficiently as possible. For focused estimators, the goal is to estimate the spectral correlation function for one or a small number of cycle frequencies with high accuracy and selectable frequency resolution.

Exhaustive Estimators

Strip Spectral Correlation Analyzer (SSCA)

FFT Accumulation Method (FAM)

Focused Estimators

The Frequency-Smoothing Method (FSM)

The Time-Smoothing Method (TSM)

 


88 thoughts on “The Spectral Correlation Function”

  1. sunson211 says:

    Dear sir,

    May I ask a question: could you upload the MATLAB code that generates the full-size normalized SCF? I am at a beginner level right now, and it’s very hard for me to learn this just by looking at those equations. I saw someone post a function called “autofam”; that function is really close to what I need (full size, with all frequencies normalized), but the SCF it produces is not normalized (the max SCF > 1). Thanks a lot in advance.

    • Sunson211:

      Thanks for reading the CSP blog and for your comment. I don’t give out much code (some signal generation code and some machine-learning code have been posted). The general rules I follow for providing help are laid out here.

      • sunson211 says:

        Hi Chad, thanks for your reply. Sure, I will ask my questions here if possible. But do you know how to attach pictures here? It would be much easier if I could include some pictures. Thank you.

        • I’ve been working on it, but I don’t think it can be done at this time. You can take a look at the post that you want to comment on again, and see if there are new options for uploading an attachment. If not, you might do what some others have done and post your images to another site, such as imgur.com, and then paste a link to them in a comment to the CSP Blog.

          And yes, you cannot create a post on the CSP Blog; you may only comment.

  2. sunson211 says:

    Dear friends,

    I have a general question about the SCF. It is well known that the SCF is very good for signal classification. So my question is: between the original SCF and the normalized SCF, which one is better for classification purposes in practice? Thank you.

  3. Claudio Dias says:

    Dear Chad Spooner,

    I’ve just found your blog after deciding to dive into the cyclostationary-process world. I think I will spend long hours reading through your posts. I would like to thank you very much for your contributions.

    I would like to ask my first question if you don’t mind. What is the meaning of taking the FFT from the autocorrelation of an arbitrary signal vector (like MATLAB randn generated vector)? Is this similar to consider autocorrelation function with an alpha = 0?

    Best Regards.

    • Thanks for visiting the CSP Blog, Claudio!

      What is the meaning of taking the FFT from the autocorrelation of an arbitrary signal vector (like MATLAB randn generated vector)?

      Do you mean “taking the FFT of the autocorrelation”? The Fourier transform of the traditional autocorrelation function is the power spectrum. But maybe by “autocorrelation function” you mean the time-varying autocorrelation function. In that case, the Fourier transform of the time-varying autocorrelation, where the transform is over the time variable t and not the lag variable \tau, will reveal the different cyclic autocorrelation components.

      Is this similar to consider autocorrelation function with an alpha = 0?

      Under the interpretation above, where “autocorrelation function” means “time-varying autocorrelation function”, if you take the FFT of the time-varying autocorrelation function and look at the FFT bin corresponding to zero frequency, you will have the normal stationary-signal autocorrelation value, which is also the cyclic autocorrelation function for \alpha = 0.

      I could help more if the question were made more precise. Do you think you can rephrase it?
