Spectral Correlation and Cyclic Correlation Plots for Real-Valued Signals

Spectral correlation surfaces for real-valued and complex-valued versions of the same signal look quite different.

In the real world, the electromagnetic field is a multi-dimensional time-varying real-valued function (volts/meter or newtons/coulomb). But in mathematical physics and signal processing, we often use complex-valued representations of the field, or of quantities derived from it, to facilitate our mathematics or make the signal processing more compact and efficient.

So throughout the CSP Blog I’ve focused almost exclusively on complex-valued signals and data. However, there is a considerable older literature that uses real-valued signals, such as The Literature [R1, R151]. You can use either real-valued or complex-valued signal representations and data, as you prefer, but there are advantages and disadvantages to each choice. Moreover, an author might not be perfectly clear about which one is used, especially when presenting a spectral correlation surface (as opposed to a sequence of equations, where the choice is usually clearer).

An example is the following sequence of four surfaces taken from [R151]:

Figure 1. Four theoretical spectral correlation surfaces (magnitudes) for real-valued PSK signals taken from [R151]. The carrier frequency for the signals is such that there is negligible signal energy for low frequencies near zero.

In this post, I show my own surfaces for real- and complex-valued representations of these common PSK signals. In a previous post, I explained mathematically how the complex-valued representation relates to the real-valued representation. In a future Signal Processing ToolKit post, I’ll go over all the steps involved in obtaining the complex-valued representation of a signal from a real-valued one.
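To see the difference for yourself before that SPTK post arrives, here is a minimal sketch (my own Python, with illustrative signal parameters, and not the exact procedure I’ll present there) of obtaining a complex-valued baseband representation from a real-valued passband BPSK signal via the analytic signal:

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(0)
fc, sps, num_syms = 0.2, 8, 4096        # carrier (cycles/sample), samples/symbol, symbols

# Real-valued passband rectangular-pulse BPSK
bits = rng.integers(0, 2, num_syms) * 2.0 - 1
baseband = np.repeat(bits, sps)
n = np.arange(baseband.size)
x_real = baseband * np.cos(2 * np.pi * fc * n)

# Analytic signal (negative frequencies suppressed), then downconvert to complex baseband
x_complex = hilbert(x_real) * np.exp(-2j * np.pi * fc * n)

# The two representations have different cycle-frequency patterns: the real signal's
# features cluster near zero and near +/- 2*fc, while the complex-baseband signal shows
# the familiar symbol-rate (non-conjugate) and doubled-offset (conjugate) patterns.
print(x_real.dtype, x_complex.dtype)
```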

Continue reading “Spectral Correlation and Cyclic Correlation Plots for Real-Valued Signals”

50,000 Page Views in 2020

And counting …

Last evening the CSP Blog crossed the 50,000 page-view threshold for 2020, a yearly total that has not been achieved previously!

I want to thank each reader, each commenter, and each person that’s clicked the Donate button. You’ve made the CSP Blog the success it is, and I am so grateful for the time you spend here.

On these occasions I put some of the more interesting CSP-Blog statistics below the fold. If you have been wanting to see a post on a particular CSP or Signal Processing ToolKit topic, and it just hasn’t appeared, feel free to leave me a note in the Comments section.

Continue reading “50,000 Page Views in 2020”

Stationary Signal Models Versus Cyclostationary Signal Models

What happens when a cyclostationary time-series is treated as if it were stationary?

In this post let’s consider the difference between modeling a communication signal as stationary or as cyclostationary.

There are two contexts for this kind of issue. The first is when someone recognizes that a particular signal model is cyclostationary, and then takes some action to render it stationary (sometimes called ‘stationarizing the signal’). They then proceed with their analysis or algorithm development using the stationary signal model. The second context is when someone applies stationary-signal processing to a cyclostationary signal model, either without knowing that the signal is cyclostationary, or perhaps knowing but not caring.
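To make the first context concrete, here is a small simulation sketch (mine, not from the post; parameter choices are illustrative): each realization of a rectangular-pulse BPSK signal is given a random symbol-clock phase, which is the classic way of stationarizing the process, and the symbol-rate cyclic feature in the ensemble average disappears.

```python
import numpy as np

rng = np.random.default_rng(1)
sps, num_syms = 8, 2048
alpha, tau = 1.0 / sps, sps // 2       # symbol-rate cycle frequency; half-symbol lag

def bpsk(delay):
    bits = rng.integers(0, 2, num_syms) * 2.0 - 1
    return np.roll(np.repeat(bits, sps).astype(complex), delay)

def cyclic_autocorr(x, alpha, tau):
    # Non-conjugate cyclic autocorrelation estimate (asymmetric-lag convention)
    n = np.arange(x.size - tau)
    return np.mean(x[n + tau] * np.conj(x[n]) * np.exp(-2j * np.pi * alpha * n))

# Cyclostationary model: every realization has the same (zero) symbol-clock phase
fixed = np.mean([cyclic_autocorr(bpsk(0), alpha, tau) for _ in range(200)])

# "Stationarized" model: symbol-clock phase drawn uniformly over one symbol interval
randomized = np.mean([cyclic_autocorr(bpsk(rng.integers(0, sps)), alpha, tau) for _ in range(200)])

print(abs(fixed), abs(randomized))     # roughly 0.3 versus near zero: the feature averages away
```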

At the center of this topic is the difference between the mathematical object known as a random process (or stochastic process) and the mathematical object that is a single infinite-time function (or signal or time-series).

A related paper is The Literature [R68], which discusses the pitfalls of applying tools meant for stationary signals to the samples of cyclostationary signals.

Continue reading “Stationary Signal Models Versus Cyclostationary Signal Models”

DeepSig’s 2018 Data Set: 2018.01.OSC.0001_1024x2M.h5.tar.gz

The third DeepSig data set I’ve examined. It’s better!

Update February 2021. I added material relating to the DeepSig-claimed variation of the roll-off parameter in a square-root raised-cosine pulse-shaping function. It does not appear that the roll-off was actually varied as stated in Table I of [R137].

DeepSig’s data sets are popular in the machine-learning modulation-recognition community, and in that community there are many claims that the deep neural networks are vastly outperforming any expertly hand-crafted tired old conventional method you care to name (none are usually named, though). So I’ve been looking under the hood at these data sets to see what the machine learners think of as high-quality inputs that lead to disruptive upending of the sclerotic mod-rec establishment. In previous posts, I’ve looked at two of the most popular DeepSig data sets from 2016 (here and here). In this post, we’ll look at one more, and then I’ll try to get back to the CSP posts.

Let’s take a look at one more DeepSig data set: 2018.01.OSC.0001_1024x2M.h5.tar.gz.
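If you want to poke at the data yourself, here is a minimal loading sketch; the extracted file name and the 'X'/'Y'/'Z' dataset layout are my assumptions about this release, so check them against the printed names and shapes first.

```python
import h5py

fname = '2018.01.OSC.0001_1024x2M.h5'        # adjust to whatever you actually extracted

with h5py.File(fname, 'r') as f:
    for name, dset in f.items():
        print(name, dset.shape, dset.dtype)  # verify the layout before relying on it
    # Commonly reported layout (an assumption): 'X' = I/Q frames (N, 1024, 2),
    # 'Y' = one-hot modulation labels, 'Z' = labeled SNR in dB
    X = f['X'][:10]
    x = X[0, :, 0] + 1j * X[0, :, 1]         # one frame as a complex vector for CSP work

print(x.shape)
```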

Continue reading “DeepSig’s 2018 Data Set: 2018.01.OSC.0001_1024x2M.h5.tar.gz”

More on DeepSig’s RML Data Sets

The second DeepSig data set I analyze: SNR problems and strange PSDs.

I presented an analysis of one of DeepSig’s earlier modulation-recognition data sets (RML2016.10a.tar.bz2) in the post on All BPSK Signals. There we saw several flaws in the data set as well as curiosities. Most notably, the signals in the data set labeled as analog amplitude-modulated single sideband (AM-SSB) were absent: these signals were only noise. DeepSig has several other data sets on offer at the time of this writing.

In this post, I’ll present a few thoughts and results for the “Larger Version” of RML2016.10a.tar.bz2, which is called RML2016.10b.tar.bz2. This is a good post to offer because it follows naturally from the first RML post, but also because more papers are being published that use the RML 10b data set, and of course more such papers are in review. Maybe the analysis offered here will help reviewers better understand and critique the machine-learning papers. Those papers never contain any side analysis or validation of the RML data sets (let me know in the Comments below if you find one that does), so we can’t rely on the machine learners to assess their inputs. (Update: I analyze a third DeepSig data set here.)
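For readers who want to repeat or extend the analysis, here is a minimal loading sketch; the pickled-dictionary layout (keys are (modulation, SNR) pairs, values are (N, 2, 128) I/Q arrays) and the latin1 encoding of the Python-2-era pickle are my assumptions about this release.

```python
import pickle
import numpy as np

with open('RML2016.10b.dat', 'rb') as f:      # file name inside the .tar.bz2 (assumed)
    data = pickle.load(f, encoding='latin1')  # latin1 handles the old-style pickle

key = sorted(data.keys())[0]                  # e.g., a (modulation, SNR) pair
frames = np.asarray(data[key])
x = frames[0, 0, :] + 1j * frames[0, 1, :]    # one frame as a complex vector

print(key, frames.shape, x.shape)
```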

Continue reading “More on DeepSig’s RML Data Sets”

Blog Notes: New Page with All CSP Blog Posts in Chronological Order

To aid navigating the CSP Blog, I’ve added a new page called “All CSP Blog Posts.” You can find the page link at the top of the home page, or in various lists on the right side of the Blog, such as “Pages” and “Site Navigation.”

Let me know in the Comments if there are other ways that you think I can improve the usability of the site.

h/t: Reader Clint.

All BPSK Signals

An analysis of DeepSig’s 2016.10A data set, used in many published machine-learning papers, and detailed comments on quite a few of those papers.

Update June 2020

I’ll be adding new papers to this post as I find them. At the end of the original post there is a sequence of date-labeled updates that briefly describe the relevant aspects of the newly found papers. Some machine-learning modulation-recognition papers deserve their own post, so check back at the CSP Blog from time to time for “Comments On …” posts.

Continue reading “All BPSK Signals”

Symmetries of Higher-Order Temporal Probabilistic Parameters in CSP

What are the unique parts of the multidimensional cyclic moments and cyclic cumulants?

In this post, we continue our study of the symmetries of CSP parameters. The second-order parameters, spectral correlation and cyclic correlation, are covered in detail in the companion post, including the symmetries for ‘auto’ and ‘cross’ versions of those parameters.

Here we tackle the generalizations of cyclic correlation: cyclic temporal moments and cumulants. We’ll deal with the generalization of the spectral correlation function, the cyclic polyspectra, in a subsequent post. It makes sense to me to focus first on the higher-order temporal parameters, because I consider the temporal parameters to be much more useful in practice than the spectral parameters.

This topic is somewhat harder and more abstract than the second-order topic, but perhaps there are bigger payoffs in algorithm development for exploiting symmetries in higher-order parameters than in second-order parameters because the parameters are multidimensional. So it could be worthwhile to sally forth.
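As a tiny concrete example of the kind of symmetry at issue, here is a numerical check (my own sketch; the QPSK parameters are illustrative) of one basic conjugation symmetry: flip every conjugation in the order-four lag product, negate the cycle frequency, and you get the complex conjugate of the original cyclic temporal moment.

```python
import numpy as np

rng = np.random.default_rng(2)
sps, num_syms, f0 = 8, 4096, 0.02            # samples/symbol, symbols, carrier offset

syms = np.exp(1j * (np.pi / 4 + np.pi / 2 * rng.integers(0, 4, num_syms)))   # QPSK
t = np.arange(num_syms * sps)
x = np.repeat(syms, sps) * np.exp(2j * np.pi * f0 * t)
x += 0.1 * (rng.standard_normal(x.size) + 1j * rng.standard_normal(x.size))  # a little noise

alpha = 4 * f0                               # an (n=4, m=0) cycle frequency for QPSK
# Zero-lag order-four cyclic temporal moment estimates for two conjugation patterns
M_0conj = np.mean(x**4 * np.exp(-2j * np.pi * alpha * t))               # no conjugations, +alpha
M_4conj = np.mean(np.conj(x)**4 * np.exp(-2j * np.pi * (-alpha) * t))   # all conjugated, -alpha

print(M_0conj, np.conj(M_4conj))             # identical, as the symmetry requires
```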

Continue reading “Symmetries of Higher-Order Temporal Probabilistic Parameters in CSP”

New Look for a New Year and New Decade

2020 is the fifth full year of existence for the CSP Blog, and the beginning of a new decade that will be full of CSP explorations. I thought I’d freshen up the look of the Blog, so I’ve switched the theme. It is a cleaner look with fewer colors and no more hexagons. I’m not completely happy with it, so I might change it yet again. Let me know if you have problems viewing the content or posting a comment (cmspooner at ieee dot org).

Happy New Year to all my readers!

The Ambiguity Function and the Cyclic Autocorrelation Function: Are They the Same Thing?

To-may-to, to-mah-to?

Let’s talk about ambiguity and correlation. The ambiguity function is a core component of radar signal processing practice and theory. The autocorrelation function and the cyclic autocorrelation function are key elements of generic signal processing and cyclostationary signal processing, respectively. Ambiguity and correlation both apply a quadratic functional to the data or signal of interest, and they both weight that quadratic functional by a complex exponential (sine wave) prior to integration or summation.

Are they the same thing? Well, my answer is both yes and no.
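On the “yes” side, with one common set of conventions (asymmetric lag, the sine wave applied to the lag product; these choices are mine for the sketch) the two reduce to the same computation, differing only by the averaging normalization:

```python
import numpy as np

rng = np.random.default_rng(3)
sps, num_syms = 8, 4096
x = np.repeat(rng.integers(0, 2, num_syms) * 2.0 - 1, sps).astype(complex)  # rect-pulse BPSK

def lag_product(x, tau):
    n = np.arange(x.size - tau)
    return x[n + tau] * np.conj(x[n]), n

def ambiguity(x, tau, nu):
    prod, n = lag_product(x, tau)
    return np.sum(prod * np.exp(-2j * np.pi * nu * n))        # radar-style: a sum

def cyclic_autocorr(x, tau, alpha):
    prod, n = lag_product(x, tau)
    return np.mean(prod * np.exp(-2j * np.pi * alpha * n))    # CSP-style: a time average

tau, alpha = 4, 1.0 / sps
print(cyclic_autocorr(x, tau, alpha))
print(ambiguity(x, tau, alpha) / (x.size - tau))              # the same number
```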

Continue reading “The Ambiguity Function and the Cyclic Autocorrelation Function: Are They the Same Thing?”

CSP Resources: The Ultimate Guides to Cyclostationary Random Processes by Professor Napolitano

My friend and colleague Antonio Napolitano has just published a new book on cyclostationary signals and cyclostationary signal processing:

Cyclostationary Processes and Time Series: Theory, Applications, and Generalizations, Academic Press/Elsevier, 2020, ISBN: 978-0-08-102708-0. The book is a comprehensive guide to the structure of cyclostationary random processes and signals, and it also provides pointers to the literature on many different applications. The book is mathematical in nature; use it to deepen your understanding of the underlying mathematics that make CSP possible.

You can check out the book on amazon.com using the following link:

Cyclostationary Processes and Time Series

Continue reading “CSP Resources: The Ultimate Guides to Cyclostationary Random Processes by Professor Napolitano”

On Impulsive Noise, CSP, and Correntropy

And I still don’t understand how a random variable with infinite variance can be a good model for anything physical. So there.

I’ve seen several published and pre-published (arXiv.org) technical papers over the past couple of years on the topic of cyclic correntropy (The Literature [R123-R127]). I first criticized such a paper ([R123]) here, but the substance of that review was about my problems with the presented mathematics, not impulsive noise and its effects on CSP. Since the papers apparently keep coming, I’m going to put down some thoughts on impulsive noise and some evidence regarding simple means of mitigation in the context of CSP. Preview: I don’t think we need to go to the trouble of investigating cyclic correntropy as a means of salvaging CSP from the evil clutches of impulsive noise.
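For concreteness, here is one candidate “simple means” (my illustration; the specific methods and evidence are in the post body): blank samples whose magnitudes are huge relative to a robust scale estimate, then proceed with ordinary CSP estimation on the cleaned data.

```python
import numpy as np

def blank_impulses(x, k=6.0):
    # Median-based scale estimate is robust to heavy-tailed (impulsive) samples
    scale = np.median(np.abs(x)) + 1e-12
    mask = np.abs(x) > k * scale
    y = x.copy()
    y[mask] = 0.0                      # blanking; clipping to k*scale is another option
    return y, mask.mean()

rng = np.random.default_rng(4)
sps, num_syms = 8, 4096
x = np.repeat(rng.integers(0, 2, num_syms) * 2.0 - 1, sps).astype(complex)
noise = 0.3 * (rng.standard_normal(x.size) + 1j * rng.standard_normal(x.size))
# Crude impulsive component: rare, very large spikes with random phase
spikes = (rng.random(x.size) < 0.002) * 50.0 * np.exp(2j * np.pi * rng.random(x.size))
r = x + noise + spikes

y, frac = blank_impulses(r)
print(f"blanked {frac:.2%} of samples")
```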

Continue reading “On Impulsive Noise, CSP, and Correntropy”

For the Beginner at CSP

Here is a list of links to CSP Blog posts that I think are suitable for a beginner: read them in the order given.

How to Obtain Help from the CSP Blog

Introduction to CSP

How to Create a Simple Cyclostationary Signal: Rectangular-Pulse BPSK

The Cyclic Autocorrelation Function

The Spectral Correlation Function

The Cyclic Autocorrelation for BPSK

Continue reading “For the Beginner at CSP”

A Gallery of Cyclic Correlations

For your delectation.

There are some situations in which the spectral correlation function is not the preferred measure of (second-order) cyclostationarity. In these situations, the cyclic autocorrelation (non-conjugate and conjugate versions) may be much simpler to estimate and work with in terms of detector, classifier, and estimator structures. So in this post, I’m going to provide plots of the cyclic autocorrelation for each of the signals in the spectral correlation gallery post. The exceptions are those signals I called feature-rich in the spectral correlation gallery post, such as LTE and radar. Recall that such signals possess a large number of cycle frequencies, and plotting their three-dimensional spectral correlation surface is not helpful as it is difficult to interpret with the human eye. So for the cycle-frequency patterns of feature-rich signals, we’ll rely on the stem-style (cyclic-domain profile) plots that I used in the gallery post.
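For readers who want to reproduce plots like these, here are minimal estimators (my sketch, asymmetric-lag convention, illustrative BPSK parameters) for the two variants plotted in the gallery, the non-conjugate and the conjugate cyclic autocorrelation functions:

```python
import numpy as np

def cyclic_acf(x, alpha, taus, conjugate=False):
    # Cyclic autocorrelation estimates over a set of non-negative lags
    out = np.zeros(len(taus), dtype=complex)
    for i, tau in enumerate(taus):
        n = np.arange(x.size - tau)
        second = x[n] if conjugate else np.conj(x[n])   # the conjugate CAF omits the conjugation
        out[i] = np.mean(x[n + tau] * second * np.exp(-2j * np.pi * alpha * n))
    return out

# Example: rectangular-pulse BPSK with a small carrier-frequency offset
rng = np.random.default_rng(5)
sps, num_syms, f0 = 8, 8192, 0.05
x = np.repeat(rng.integers(0, 2, num_syms) * 2.0 - 1, sps).astype(complex)
x *= np.exp(2j * np.pi * f0 * np.arange(x.size))

taus = np.arange(4 * sps)
R_nc = cyclic_acf(x, 1.0 / sps, taus)                    # non-conjugate, symbol-rate CF
R_c = cyclic_acf(x, 2.0 * f0, taus, conjugate=True)      # conjugate, doubled-offset CF
print(np.abs(R_nc).max(), np.abs(R_c).max())
```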

Continue reading “A Gallery of Cyclic Correlations”

On The Shoulders

What modest academic success I’ve had in the area of cyclostationary signal theory and cyclostationary signal processing is largely due to the patient mentorship of my doctoral advisor, William (Bill) Gardner, and the fact that I was able to build on an excellent foundation put in place by Gardner, his advisor Lewis Franks, and key Gardner students such as William (Bill) Brown.

Continue reading “On The Shoulders”

100,000 Page Views!

The CSP Blog has reached 100,000 page views! Also, a while back it passed the “20,000 visitors” milestone. All of this for 53 posts and 10 pages. More to come!


I started the CSP Blog in late 2015, so it has taken a bit over three years to get to 100,000 views. I don’t know if that should be considered fast or slow. But I like it anyway.

I want to thank each and every one of the visitors to the CSP Blog. It has reached so many more people than I thought it ever would when I started it.

Thank you for all your clicks, comments, emails, and downloads! If you’d like, leave a comment to this post if you have an idea for a post you’d like to see.

Below the fold, some graphics that show the vital statistics of the CSP Blog as of the 100,000 page-view milestone.

Continue reading “100,000 Page Views!”

Can a Machine Learn a Power Spectrum Estimator?

I continue with my foray into machine learning (ML) by considering whether we can use widely available ML tools to create a machine that can output accurate power spectrum estimates. Previously we considered the perhaps simpler problem of learning the Fourier transform. See here and here.

Along the way I’ll expose my ignorance of the intricacies of machine learning and my apparent inability to find the correct hyperparameter settings for any problem I look at. But, that’s where you come in, dear reader. Let me know what to do!
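In the meantime, if you want to experiment along these lines yourself, here is a minimal sketch of one way to generate input/target training pairs, with a conventional Welch estimate as the regression target; this is my assumed setup, not necessarily the pipeline used in the post.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(6)

def make_example(num_samples=4096, sps=8):
    # Random-bit rectangular-pulse BPSK in noise; many other signal types could be used
    bits = rng.integers(0, 2, num_samples // sps) * 2.0 - 1
    x = np.repeat(bits, sps).astype(complex)
    x += 0.5 * (rng.standard_normal(x.size) + 1j * rng.standard_normal(x.size))
    freqs, psd = welch(x, fs=1.0, nperseg=256, return_onesided=False)
    return x, psd            # network input, regression target

x, target = make_example()
print(x.shape, target.shape)
```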

Continue reading “Can a Machine Learn a Power Spectrum Estimator?”

Data Set for the Machine-Learning Challenge

Update September 2020. I made a mistake when I created the signal-parameter “truth” files signal_record.txt and signal_record_first_20000.txt. As with the DeepSig RML data sets that I analyzed on the CSP Blog here and here, the SNR parameter in the truth files did not match the actual SNR of the signals in the data files. I’ve updated the truth files and the links below. You can still use the original files for all other signal parameters, but the SNR parameter was in error.
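If you want to sanity-check a labeled SNR yourself, here is a rough sketch (my own, not the procedure used to build the corrected truth files): estimate the noise floor from the quietest part of a Welch PSD, back out the signal power, and compare to the label.

```python
import numpy as np
from scipy.signal import welch

def rough_snr_db(x, noise_fraction=0.25):
    # Rough in-band SNR estimate; assumes flat noise and a signal that leaves some quiet bins
    _, psd = welch(x, fs=1.0, nperseg=256, return_onesided=False)
    noise_psd = np.mean(np.sort(psd)[: int(noise_fraction * psd.size)])  # quietest bins
    noise_power = noise_psd * 1.0                 # flat noise across the full (unit) band
    total_power = np.mean(np.abs(x) ** 2)
    signal_power = max(total_power - noise_power, 1e-300)
    return 10 * np.log10(signal_power / noise_power)

# Quick check on a synthetic frame with known (about 10 dB) SNR
rng = np.random.default_rng(7)
s = np.repeat(rng.integers(0, 2, 128) * 2.0 - 1, 8).astype(complex)       # unit-power BPSK
w = np.sqrt(0.05) * (rng.standard_normal(s.size) + 1j * rng.standard_normal(s.size))
print(rough_snr_db(s + w))
```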

Update July 2020. I originally posted 20,000 signals in the posted data set. I’ve now added another 92,000 for a total of 112,000 signals. The original signals are contained in Batches 1-5, the additional signals in Batches 6-28. I’ve placed these additional Batches at the end of the post to preserve the original post’s content.

Continue reading “Data Set for the Machine-Learning Challenge”

MATLAB’s SSCA: commP25ssca.m

In this short post, I describe some errors that are produced by MATLAB’s strip spectral correlation analyzer function commP25ssca.m. I don’t recommend that you use it; far better to create your own function.
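If you do write your own code, a slow-but-simple frequency-smoothing estimate makes a handy ground-truth check for whatever fast algorithm (SSCA or otherwise) you end up implementing. Here is a minimal sketch; the asymmetric frequency convention and the rectangular smoothing window are my choices, not those of commP25ssca.m or of my own production code.

```python
import numpy as np

def scf_fsm(x, alpha, smoothing_bins=129):
    # Frequency-smoothing estimate of the non-conjugate spectral correlation function at
    # cycle frequency alpha (normalized), using the asymmetric product X(f) X*(f - alpha)
    N = x.size
    X = np.fft.fft(x)
    shift = int(round(alpha * N))                    # alpha should be near a multiple of 1/N
    cyc_periodogram = X * np.conj(np.roll(X, shift)) / N
    kernel = np.ones(smoothing_bins) / smoothing_bins
    scf = np.convolve(cyc_periodogram, kernel, mode='same')
    freqs = np.fft.fftfreq(N)                        # frequencies of the first factor X(f)
    return freqs, scf

# Example: rectangular-pulse BPSK; the symbol-rate cycle frequency should light up
rng = np.random.default_rng(8)
sps, num_syms = 8, 8192
x = np.repeat(rng.integers(0, 2, num_syms) * 2.0 - 1, sps).astype(complex)
freqs, scf = scf_fsm(x, alpha=1.0 / sps)
print(np.abs(scf).max())
```

Note that the returned frequency axis refers to the first factor X(f); shift it by minus alpha over two if you want it to line up with the symmetric-convention spectral frequency.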

Continue reading “MATLAB’s SSCA: commP25ssca.m”

How we Learned CSP

This post is just a blog post. Just some guy on the internet thinking out loud. If you have relevant thoughts or arguments you’d like to advance, please leave them in the Comments section at the end of the post.

How did we, as people, not machines, learn to do cyclostationary signal processing? We’ve successfully applied it to many real-world problems, such as weak-signal detection, interference-tolerant detection, interference-tolerant time-delay estimation, modulation recognition, joint multiple-cochannel-signal modulation recognition (My Papers [25,26,28,38,43]), synchronization (The Literature [R7]), beamforming (The Literature [R102,R103]), direction-finding (The Literature [R104-R106]), detection of imminent mechanical failures (The Literature [R107-R109]), linear time-invariant system identification (The Literature [R110-R115]), and linear periodically time-variant filtering for cochannel signal separation (FRESH filtering) (My Papers [45], The Literature [R6]).

How did this come about? Is it even interesting to ask the question? Well, it is to me. I ask it because of the current hot topic in signal processing: machine learning. And in particular, machine learning applied to modulation recognition (see here and here). The machine learners want to capitalize on the success of machine learning applied to image recognition by directly applying the same sorts of image-recognition techniques to the problem of automatic type-recognition for human-made electromagnetic waves.

Continue reading “How we Learned CSP”