CSP Blog Highlights

Welcome to the CSP Blog!

To help new readers, I’m supplying links here to the posts that have gotten the most attention over the lifetime of the Blog. Omitted from this list are the more esoteric topics as well as the posts that comment on the engineering literature.

What is Cyclostationarity?

Introductory post.

Spectral correlation.

Cyclic autocorrelation.

Higher-order cyclostationarity.

Can I Get Help with my CSP Work Through the CSP Blog?

General rules for getting help.

Second-order estimator development guide.

What is Higher-Order Cyclostationarity and What are Cyclic Cumulants?

Introduction to higher-order cyclostationarity.

Cyclic cumulants and cyclic moments.

Optional conjugations in higher-order parameters.

The cyclic polyspectrum.

How do You Estimate the Parameters of Second-Order Cyclostationarity?

The frequency-smoothing method for spectral correlation estimation, one cycle frequency at a time (see the minimal sketch after this list).

The time-smoothing method for spectral correlation estimation, one cycle frequency at a time.

Exhaustive efficient spectral correlation estimation, all cycle frequencies.

Spectral coherence and blind estimation of significant cycle frequencies.
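To give a flavor of what those posts cover, here is a minimal Python sketch of the frequency-smoothing idea: form the cyclic periodogram from the DFT of a single data block and smooth it across frequency. The function name fsm_scf, the rectangular smoothing window, and the circular-shift handling of the ±alpha/2 offsets are my own illustrative choices, not the implementation developed in the posts.

```python
import numpy as np

def fsm_scf(x, alpha, fs=1.0, smoothing_bins=101):
    """Illustrative frequency-smoothing-method sketch: estimate the
    non-conjugate spectral correlation function of x at cycle frequency
    alpha (Hz) by smoothing the cyclic periodogram across frequency."""
    T = len(x)
    X = np.fft.fft(x)                         # DFT of the full data block
    freqs = np.fft.fftfreq(T, d=1.0 / fs)     # frequency-bin centers (Hz)
    # shift the DFT by +/- alpha/2, rounded to the nearest bin
    k = int(round(alpha * T / (2.0 * fs)))
    X_plus = np.roll(X, -k)                   # approximately X(f + alpha/2)
    X_minus = np.roll(X, k)                   # approximately X(f - alpha/2)
    cyclic_periodogram = X_plus * np.conj(X_minus) / T
    # smooth across frequency with a simple rectangular window
    g = np.ones(smoothing_bins) / smoothing_bins
    scf = np.convolve(cyclic_periodogram, g, mode='same')
    return np.fft.fftshift(freqs), np.fft.fftshift(scf)
```

Dividing such an estimate by the square root of the product of the two frequency-shifted power spectrum estimates gives a coherence-like statistic; the posts above develop these estimators, and their subtleties, in detail.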

Continue reading

CSP Estimators: Cyclic Temporal Moments and Cumulants

In this post we discuss ways of estimating n-th order cyclic temporal moment and cumulant functions. Recall that for n=2, cyclic moments and cyclic cumulants are usually identical; they differ only when the signal contains one or more finite-strength additive sine-wave components. In the common case when such components are absent (as in our recurring numerical example involving rectangular-pulse BPSK), they are equal to each other and, provided the delay vector is chosen appropriately, to the conventional cyclic autocorrelation function.

The more interesting case is when the order n is greater than 2. Most communication signal models possess odd-order moments and cumulants that are identically zero, so the first non-trivial order greater than 2 is n = 4. The task, then, is to estimate n-th order temporal moment and cumulant functions for n ≥ 4 using a sampled-data record of length T.
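To make the estimation task concrete, here is a minimal Python sketch of an n-th order cyclic temporal moment estimator: form the n-fold delay product, apply the desired conjugations, and extract its sine-wave component at the cycle frequency of interest. The function and argument names, the delays in whole samples, and the circular-shift handling of those delays are illustrative assumptions, not the estimators developed in the post; a cumulant estimate would additionally combine lower-order moment estimates.

```python
import numpy as np

def cyclic_temporal_moment(x, alpha, delays, conj_mask, fs=1.0):
    """Illustrative n-th order cyclic temporal moment estimator: average the
    delay product over the block, weighted by exp(-i 2 pi alpha t), to pull
    out the sine-wave component at cycle frequency alpha (Hz)."""
    T = len(x)
    t = np.arange(T)
    prod = np.ones(T, dtype=complex)
    for tau, use_conj in zip(delays, conj_mask):  # tau in samples
        x_shift = np.roll(x, -tau)                # x(t + tau), circular shift for simplicity
        prod *= np.conj(x_shift) if use_conj else x_shift
    return np.mean(prod * np.exp(-2j * np.pi * alpha * t / fs))
```

For n = 2 with delays = [tau, 0] and conj_mask = [False, True], this reduces to an asymmetric-delay estimate of the conventional cyclic autocorrelation mentioned above.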

Continue reading

Can a Machine Learn the Fourier Transform?

Or any transform for that matter. Or the power spectrum? Autocorrelation function? Cyclic moment? Cyclic cumulant?

I ask because the Machine Learners want to do away with what they call Expert Features in multiple classification areas, such as modulation recognition, image classification, facial recognition, etc. The idea is to train the machine (and by machine they almost always seem to mean an artificial neural network, or just neural network for short) by applying labeled data (supervised learning), where the data are the raw data of the classification application. For us, here at the CSP Blog, that means complex-valued data samples obtained through standard RF signal reception techniques. In other words, the samples that we start with in all of our CSP algorithms, such as the frequency-smoothing method, the time-smoothing method, the strip spectral correlation analyzer, the cycle detectors, the time-delay estimators, automatic spectral segmentation, etc.

This is an interesting and potentially valuable line of inquiry, even if it does lead to the superfluousness of my work and the CSP Blog itself. Oh well, gotta face reality.

So can we start with complex samples (commonly called “I-Q samples”, which is short for “inphase and quadrature samples”) corresponding to labeled examples of the involved classes (BPSK, QPSK, AM, FM, etc.) and end up with a classifier with performance that exceeds that of the best Expert Feature classifier? From my point of view, that means that the machine has to learn cyclic cumulants or something even better. I have a hard time imagining something better (that is just a statement about my mental limitations, not about what might exist in the world), so I shift to asking Can a Machine Learn the Cyclic Cumulant?
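For the narrow question in the title, the Fourier transform is a linear operation, so in principle a machine can learn it exactly from labeled examples. The toy Python sketch below (my illustration, not part of the post) recovers the DFT matrix from random input/output pairs by least squares, which is what a single linear layer trained to convergence would do.

```python
import numpy as np

N = 32                     # transform length
num_examples = 200         # labeled training pairs (must exceed N)
rng = np.random.default_rng(0)

# "Training data": random complex inputs and their true DFTs as labels
X = rng.standard_normal((num_examples, N)) + 1j * rng.standard_normal((num_examples, N))
Y = np.fft.fft(X, axis=1)

# "Learn" the transform as a single linear map W minimizing ||X W - Y||^2
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Compare with the true DFT matrix (symmetric, so no transpose needed)
F = np.fft.fft(np.eye(N), axis=0)
print(np.max(np.abs(W - F)))   # essentially zero, up to numerical error
```

The cyclic cumulant, by contrast, is a nonlinear function of the data, so whether a network can learn it (or something better) from raw I/Q samples is a much less obvious question, and it is the one the post takes up.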

Continue reading

Automatic Spectral Segmentation

In this post, I discuss a signal-processing algorithm that has almost nothing to do with cyclostationary signal processing. Almost. The topic is automated spectral segmentation, which I also call band-of-interest (BOI) detection. When attempting to perform automatic radio-frequency scene analysis (RFSA), we may be confronted with a data block that contains multiple signals in a large number of distinct frequency subbands. Moreover, these signals may be turning on and off within the data block. To apply our cyclostationary signal processing tools effectively, we would like to isolate these signals in time and frequency to the greatest extent possible using linear time-invariant filtering (for separating in the frequency dimension) and time-gating (for separating in the time dimension). Then the isolated signal components can be processed serially.

It is very important to remember that even perfect spectral and temporal segmentation will not solve the cochannel-signal problem. It is perfectly possible that an isolated subband will contain more than one cochannel signal.

The basics of my BOI-detection approach are published in a 2007 conference paper (My Papers [32]). I’ll describe this basic approach, illustrate it with examples relevant to RFSA, and also provide a few extensions of interest, including one that relates to cyclostationary signal processing.
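To give a flavor of what BOI detection means in code, here is a minimal Python sketch of a crude energy-based detector: estimate the power spectrum, threshold it relative to an estimated noise floor, and report the contiguous frequency bands that exceed the threshold. This is only an illustration under my own simplifying assumptions (Welch PSD, median noise floor, fixed dB threshold), not the algorithm published in My Papers [32].

```python
import numpy as np
from scipy import signal

def detect_bois(x, fs, threshold_db=10.0, nperseg=1024):
    """Crude band-of-interest detector: threshold the estimated PSD
    relative to a median-based noise floor and return (f_low, f_high)
    pairs for each contiguous run of above-threshold bins."""
    freqs, psd = signal.welch(x, fs=fs, nperseg=nperseg, return_onesided=False)
    order = np.argsort(freqs)                 # put frequencies in ascending order
    freqs, psd = freqs[order], psd[order]
    noise_floor = np.median(psd)              # crude noise-floor estimate
    occupied = 10.0 * np.log10(psd / noise_floor) > threshold_db
    bands, start = [], None
    for i, flag in enumerate(occupied):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            bands.append((freqs[start], freqs[i - 1]))
            start = None
    if start is not None:
        bands.append((freqs[start], freqs[-1]))
    return bands
```

Each detected band can then be extracted with a frequency shift and lowpass (or bandpass) filter, and time-gated if needed, before applying the CSP tools described above.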

Continue reading

More on Pure and Impure Sine Waves

Remember when we derived the cumulant as the solution to the pure n-th order sine-wave problem? It sounded good at the time, I hope. But here I describe a curious special case where the interpretation of the cumulant as the pure component of a nonlinearly generated sine wave seems to break down.

Continue reading

Cyclostationarity of Direct-Sequence Spread-Spectrum Signals

In this post we look at direct-sequence spread-spectrum (DSSS) signals, which can be usefully modeled as a kind of PSK signal. DSSS signals are used in a variety of real-world situations, including the familiar CDMA and WCDMA signals, covert signaling, and GPS. My colleague Antonio Napolitano has done some work on a large class of DSSS signals (The Literature [R11, R17, R95]), resulting in formulas for their spectral correlation functions, and I’ve made some remarks about their cyclostationary properties myself here and there (My Papers [16]).

From the point of view of modulation recognition, a good thing about DSSS signals is that they are easily distinguished from other PSK and QAM signals by their spectral correlation functions. Whereas most PSK/QAM signals have only a single non-conjugate cycle frequency, and no conjugate cycle frequencies, DSSS signals have many non-conjugate cycle frequencies and in some cases also have many conjugate cycle frequencies.
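For readers who want to experiment, here is a minimal Python sketch of the kind of baseband model described above: bipolar data symbols spread by a fixed bipolar chip sequence, so the waveform looks like BPSK at the chip rate with a periodically repeating code. The function name and parameter choices (31 chips per symbol, one sample per chip, no pulse shaping or noise) are my own illustrative assumptions, not the signal models analyzed in the cited papers.

```python
import numpy as np

def dsss_bpsk(num_symbols=1000, chips_per_symbol=31, seed=0):
    """Toy baseband DSSS-BPSK generator: each bipolar data symbol is
    multiplied chip-by-chip by a fixed bipolar spreading code."""
    rng = np.random.default_rng(seed)
    code = rng.choice([-1.0, 1.0], size=chips_per_symbol)    # spreading (chip) sequence
    symbols = rng.choice([-1.0, 1.0], size=num_symbols)      # data symbols
    # spread: repeat each symbol across the code and multiply element-wise
    return (symbols[:, None] * code[None, :]).ravel()

x = dsss_bpsk()   # one sample per chip; add pulse shaping and noise as needed
```

It is the periodic repetition of the spreading code that produces the rich set of cycle frequencies mentioned above.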

Continue reading