The CSP Blog Turns 10

Raise a glass!

I launched the site way back in September 2015. As with most things in my life, the CSP Blog was not the result of some carefully crafted plan, such as to corner the online market on signal-processing instruction, create a side-hustle, or manage my brand, whatever that might mean. It was a lark. I wanted my wife to start a blog or website as a place to share her writing with the world. “Look, dear, it really is super easy to create your own website,” I said to her, after writing a post or two. And it really is easy.

But then the CSP Blog took on a life of its own. Or, better said, it took over my life. Certainly it took a lot of my time, and still does.

Continue reading “The CSP Blog Turns 10”

SPTK: The Matched Filter

Matchmaker, Matchmaker,
Make me a match,
Find me a find,
Catch me a catch!
–“Matchmaker” from Fiddler on the Roof

Previous SPTK Post: The Characteristic Function | Next SPTK Post: Wavelets

In this post, we take a look at a special linear time-invariant system called the matched filter, which is used to detect the presence of a known signal. In practice, it is often applied to the detection of a periodically repeated known portion of a communication signal, such as a channel-estimation frame or frequency-correction burst. It is also widely used in the detection of radar pulses, where the matched-filtering operation goes by the name pulse compression. Filtering, which is a convolution operation, and correlation are nearly the same thing, so applying a matched filter is sometimes referred to as applying a correlator.
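To make the filtering-equals-correlation point concrete, here is a minimal NumPy sketch, with an assumed rectangular pulse, noise level, and arrival time chosen purely for illustration: the matched filter's impulse response is the time-reversed (conjugated) known signal, and the peak of the filter output marks the time of arrival.

```python
import numpy as np

rng = np.random.default_rng(0)

# Known signal: an assumed rectangular pulse of length 64 (purely illustrative).
pulse = np.ones(64)

# Noisy data record containing one copy of the pulse at an unknown location.
x = rng.normal(scale=1.0, size=1024)
true_start = 400
x[true_start:true_start + pulse.size] += pulse

# Matched filter: impulse response is the time-reversed (and, for complex
# signals, conjugated) known pulse; convolving with it correlates the data
# against the pulse at every possible alignment.
h = np.conj(pulse[::-1])
y = np.convolve(x, h, mode='valid')

# The location of the peak estimates the pulse's time of arrival.
print('estimated start:', int(np.argmax(np.abs(y))), '  true start:', true_start)
```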

Continue reading “SPTK: The Matched Filter”

SPTK: The Characteristic Function

The collision of probability, Fourier analysis, and communication-signal models.

Previous SPTK Post: I and Q | Next SPTK Post: The Matched Filter

Let’s return to the probability section of the Signal Processing ToolKit posts with a look at the characteristic function, which is the Fourier transform of the probability density function. We will see it has a deep connection to the central mathematical entities of CSP, which are moments and cumulants.
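For reference, the standard definitions involved (stated here in one common sign convention; the post develops them in detail) are the characteristic function as the Fourier transform of the density, with moments and cumulants obtained from derivatives at the origin:

```latex
\Phi_X(\omega) = E\!\left[e^{j\omega X}\right]
              = \int_{-\infty}^{\infty} f_X(x)\, e^{j\omega x}\, dx,
\qquad
E[X^n] = (-j)^n \left. \frac{d^n}{d\omega^n}\,\Phi_X(\omega) \right|_{\omega=0},
\qquad
c_n = (-j)^n \left. \frac{d^n}{d\omega^n}\,\ln \Phi_X(\omega) \right|_{\omega=0}.
```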

Continue reading “SPTK: The Characteristic Function”

The Big Time

“The place where I come from is a small town
They think so small, they use small words
But not me, I’m smarter than that
I worked it out
I’ve been stretching my mouth
To let those big words come right out”

–“Big Time” by Peter Gabriel

The CSP Blog is now linked to at the top of cyclostationarity.com, Professor Gardner’s online repository of all things cyclostationary! (See also The Literature [R1].)

Continue reading “The Big Time”

CSP Reduction to Sine-Wave Generation

“Five different voices behind him bellowed, “REDUCTO!” Five curses flew in five different directions and the shelves opposite them exploded as they hit; the towering structure swayed as a hundred glass spheres burst apart, pearly-white figures unfurled into the air and floated there, their voices echoing from who knew what long-dead past amid the torrent of crashing glass and splintered wood now raining down upon the floor…”

J. K. Rowling, Harry Potter and the Order of the Phoenix

We know that if we subject a cyclostationary signal to a squaring or delay-and-multiply operation we will obtain finite-strength additive sine-wave components at the output of the operation, where at least one of the sine waves has a non-zero frequency.

But I want to make a conjecture: All of CSP can be reduced to interpretations involving sine-wave generation by nonlinear operations. Let’s see if we can show this conjecture is true. After I make my attempt, I’ll also show what ChatGPT comes up with. Any guesses about how well it does?
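A quick way to see the squaring claim in action is the following sketch (the pulse shape, samples per symbol, and carrier offset are arbitrary illustrative choices, not parameters from the post): squaring a pulse-amplitude-modulated BPSK signal that has a carrier-frequency offset of f0 produces spectral lines at 2*f0 and at 2*f0 plus integer multiples of the symbol rate.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed illustrative parameters: 10 samples per symbol, normalized carrier offset 0.05.
sps, f0, n_sym = 10, 0.05, 4096
symbols = rng.integers(0, 2, n_sym) * 2 - 1                  # BPSK symbols in {-1, +1}
pulse = np.hanning(sps)                                      # an arbitrary smooth pulse shape
bb = (symbols[:, None] * pulse[None, :]).ravel()             # pulse-amplitude-modulated baseband
n = np.arange(bb.size)
x = bb * np.exp(2j * np.pi * f0 * n)                         # BPSK with a carrier-frequency offset

# A quadratic nonlinearity (squaring) regenerates finite-strength additive sine waves.
X2 = np.abs(np.fft.fftshift(np.fft.fft(x**2))) / x.size
freqs = np.fft.fftshift(np.fft.fftfreq(x.size))

# The strongest line sits at twice the carrier offset; weaker lines appear at
# 2*f0 plus integer multiples of the symbol rate 1/sps.
print('strongest line at f =', freqs[np.argmax(X2)], ' (expect', 2 * f0, ')')
```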

Continue reading “CSP Reduction to Sine-Wave Generation”

CSPB.ML.2018R2.NF

A noise-free version of the 2018 CSP Blog dataset CSPB.ML.2018R2 is posted here. This allows researchers to correctly apply propagation-channel effects to the generated signals, and to easily add their own noise at whatever level they wish.

The format of the files is the same as CSPB.ML.2018R2, and the truth parameters for each file are the same as the truth parameters for the corresponding file in CSPB.ML.2018R2, except for SNR, which is infinite.
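For readers who want to add their own noise, here is a hedged sketch of the arithmetic only (the complex-valued signal, the total-power SNR convention, and the variable names are my assumptions for illustration; loading the files themselves follows the format described on the CSPB.ML.2018R2 post):

```python
import numpy as np

def add_noise(x, snr_db, rng=None):
    """Add circular complex white Gaussian noise to a noise-free signal x
    so that the resulting total-power SNR is approximately snr_db."""
    rng = np.random.default_rng() if rng is None else rng
    sig_power = np.mean(np.abs(x) ** 2)
    noise_power = sig_power / (10.0 ** (snr_db / 10.0))
    noise = np.sqrt(noise_power / 2.0) * (
        rng.standard_normal(x.shape) + 1j * rng.standard_normal(x.shape)
    )
    return x + noise

# Example: bring a loaded noise-free signal (here just a stand-in tone) to 10 dB SNR.
x_clean = np.exp(2j * np.pi * 0.01 * np.arange(32768))
x_noisy = add_noise(x_clean, snr_db=10.0, rng=np.random.default_rng(0))
```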

Continue reading “CSPB.ML.2018R2.NF”

End-of-Year Blog Notes 2024

Hey … I … Oh-oh … I’m still alive… (Alive by Pearl Jam)

Hey everybody! I’m still here.

I want to wish all of you a Happy New Year! I hope all your signal-processing projects succeed and your math skills grow steadily in 2025.

“May your noise be additive and white, and may all your SCFs be right.”

Continue reading “End-of-Year Blog Notes 2024”

Interference Mitigation Course at GTRI

Update December 2024: The likely date for this course at GTRI is February 4-5, 2025.

Update September 2024: This course is postponed until Spring 2025. I’ll post further updates here as they become available.


I’ll be part of a team of researchers and practicing engineers, led by the estimable Dr. Ryan Westafer, that will be teaching a class on radio-frequency interference mitigation in September. The class is hosted by the Georgia Tech Research Institute (GTRI) and will be held on the Georgia Tech campus on September 10-11, 2024.

Continue reading “Interference Mitigation Course at GTRI”

Final Snoap Doctoral-Work Journal Paper: My Papers [56] on Novel Network Layers for Modulation Recognition that Generalizes

Dr. Snoap’s final journal paper related to his recently completed doctoral work has been published in IEEE Transactions on Broadcasting (My Papers [56]).

Continue reading “Final Snoap Doctoral-Work Journal Paper: My Papers [56] on Novel Network Layers for Modulation Recognition that Generalizes”

SPTK: I and Q

Where does IQ (or I/Q) data come from?

Previous SPTK Post: Digital Filters | Next SPTK Post: The Characteristic Function

Let’s really get into the mathematical details of “IQ data,” a phrase that appears in many CSP Blog posts and an awful lot of machine-learning papers on modulation recognition. Just what are “I” and “Q” anyway?
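As a preview of the standard textbook relations (using a generic carrier frequency f_c; the post derives and interprets these carefully), the in-phase and quadrature components i(t) and q(t) of a real bandpass signal x(t) combine into the complex-baseband (IQ) signal z(t):

```latex
x(t) = i(t)\cos(2\pi f_c t) - q(t)\sin(2\pi f_c t)
     = \operatorname{Re}\!\left\{ \left[\, i(t) + j\, q(t) \,\right] e^{j 2\pi f_c t} \right\},
\qquad
z(t) = i(t) + j\, q(t).
```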

Continue reading “SPTK: I and Q”

Desultory CSP: What’s That Under the TV?

“Alive in the Superunknown
First it steals your Mind, and then it steals your … Soul”

–Soundgarden

An advantage of using and understanding the statistics of communication signals™, the basics of signal processing, and the rich details of cyclostationary signal processing is that a practitioner can deal with, to some useful degree, unknown unknowns. The unknown unknowns I’m talking about here on the CSP Blog are, of course, signals. We know about the by-now-familiar known-type detection, multi-class modulation-recognition, and RF scene-analysis problems, in which it is often assumed that we know the signals we are looking for, but we don’t know their times of arrival, some of their parameters, or how they might overlap in time, frequency, and space. Then there are the less-familiar problems involving unknown unknowns.

Sometimes we just don’t know the signals we are looking for. We still want to do as good a job on RF scene analysis as we can, but there might be signals in the scene that do not conform to the body of knowledge we have, to date, of manmade RF signals. Or, in modern parlance, we didn’t even know we left such signals out of our neural-network training dataset; we’re a couple steps back from even worrying about generalization, because we don’t even know we can’t generalize since we are ignorant about what to generalize to.

In this post I look at the broadcast TV band, seen in downtown Monterey, California, sometime in the recent past. I expect to see ATSC DTV signals (of the older 8VSB/16VSB or the newer OFDM types), and I do. But what else is there? Spoiler: Unknown unknowns.

Let’s take a look.

Continue reading “Desultory CSP: What’s That Under the TV?”

CSPB.ML.2023G1

Another dataset aimed at the continuing problem of generalization in machine-learning-based modulation recognition. This one is a companion to CSPB.ML.2023, which features cochannel situations.

Quality datasets containing digital signals with varied parameters and lengths sufficient to permit many kinds of validation checks by signal-processing experts remain in short supply. In this post, we continue our efforts to provide such datasets by offering a companion unlabeled dataset to CSPB.ML.2023.

Continue reading “CSPB.ML.2023G1”

Stupid Laws Getting In My Way

A kvetch.

As the generative-AI crowd continues to feast on copyrighted material of all kinds, they are getting pushback in the form of lawsuits from artists, writers, and journalists. I discussed this recently with Dan and Eunice on the CSP Blog.

OpenAI in particular seems to believe they have some kind of divine right to pursue whatever business they want, whether it is legal or not. Because reasons … including national security … and “meeting the needs of today’s citizens.” But probably just greed and hubris.

In a statement to the UK’s House of Lords, OpenAI says this, and I assume they did so with a straight face, which would have been admirably difficult:

Continue reading “Stupid Laws Getting In My Way”

SPTK: Digital Filters

A look at general linear time-invariant filtering in the discrete-time domain.

Previous SPTK Post: The Z Transform | Next SPTK Post: IQ Data

Linear shift-invariant systems are often called digital filters when they are designed objects as opposed to found objects, which are models, really, of systems occurring in the natural world. A basic goal of digital filtering is to perform the same kind of function as an analog filter performs, but the digital filter is applied after sampling rather than before it. In some cases, the digitally filtered signal is then converted back to an analog signal. These ideas are illustrated in Figure 1.

Figure 1. A typical role for a linear shift-invariant system, or digital filter, in signal processing.
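To make Figure 1 concrete, here is a minimal sketch of a designed digital filter at work (the sampling rate, tone frequencies, cutoff, and filter length are illustrative assumptions): an FIR lowpass designed with SciPy and applied to already-sampled data.

```python
import numpy as np
from scipy import signal

fs = 8000.0                                   # assumed sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)

# Sampled signal: a 300 Hz tone we want to keep plus a 2500 Hz tone to reject.
x = np.cos(2 * np.pi * 300 * t) + 0.5 * np.cos(2 * np.pi * 2500 * t)

# Design a linear-phase FIR lowpass (a designed digital filter) with a 1 kHz cutoff.
h = signal.firwin(numtaps=129, cutoff=1000.0, fs=fs)

# Filtering = convolving the samples with the filter's impulse response.
y = signal.lfilter(h, [1.0], x)
```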
Continue reading “SPTK: Digital Filters”

Introducing Dr. John A. Snoap

An expert signal processor. An expert machine learner. All in one person!

I am very pleased to announce that my signal-processing, machine-learning, and modulation-recognition collaborator and friend John Snoap has successfully defended his doctoral dissertation and is now Dr. Snoap!

I started working with John after we met in the Comments section of the CSP Blog way back in 2019. John was building his own set of CSP software tools, ran into a small bump in the road, and asked for some advice. Just the kind of reader I hope for: independent-minded, gets to the bottom of things, and embraces signal processing.

As we interacted over email and Zoom, it became clear that John was thinking of making a contribution in the area of modulation recognition, and was also interested in learning more about machine learning using neural networks. Since I had been recently engaged in hand-to-hand combat with machine learners who were, in my opinion of course, injecting more confusion than elucidation into the field, I figured this might be a friendly way for me to understand machine learning better, and maybe there would be a way or two to marry signal processing with supervised learning. So off we went.

Fast forward four years and we’ve published five papers, with a sixth in review, that I believe are trailblazing. John is that rare person who has mastered two very different technical areas: cyclostationary signal processing and deep learning. Because I believe that neural networks do not actually learn the things that we hope they will, but need not-so-gentle nudges toward learning the truly valuable things, a researcher with one foot firmly in the signal-processing world and the other firmly in the machine-learning world has a very bright future indeed.

The title of John’s dissertation is Deep-Learning-Based Classification of Digitally Modulated Signals, which he wrote as a student in the Department of Electrical and Computer Engineering at Old Dominion University under the direction of his advisor Professor Dimitrie Popescu.

Congratulations Dr. Snoap! And thank you for everything.

Infinity, Periodicity, and Frequency: Comments on a Recent Signal-Processing Perspectives Paper ([R195])

If a tool isn’t appropriate for your problem, don’t blame the tool. Find another one.

Let’s take a look at a recent perspectives-style paper published in the IEEE Signal Processing Magazine called “On the Concept of Frequency in Signal Processing: A Discussion [Perspectives],” (The Literature [R195]). While I criticize the paper directly, I’m hoping to use this post to provide my own perspective, and perhaps a bit of a tutorial, on the interrelated concepts of frequency, infinity, sine waves, and signal representations.

I appreciate tutorial papers in the signal-processing literature (see, for example, my positive post on Candan’s article about the Dirac delta [impulse] function), because my jaundiced view of the field is that the basics, both of mathematics and of communication-related signal processing, are neglected in favor of fawning over the research flavor of the month. Over time, everybody (students, researchers, professors) is diminished by this lack of attention to foundations.

Continue reading “Infinity, Periodicity, and Frequency: Comments on a Recent Signal-Processing Perspectives Paper ([R195])”

SPTK: The Z Transform

I think of the Z transform as the Laplace transform for discrete-time signals and systems.

Previous SPTK Post: Practical Filters | Next SPTK Post: Digital Filters

In this Signal Processing ToolKit post, we look at the discrete-time version of the Laplace Transform: The Z Transform.
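For reference, the standard definition and its link to the Laplace transform of a sampled signal (sampling interval T) are

```latex
X(z) = \sum_{n=-\infty}^{\infty} x[n]\, z^{-n},
\qquad
z = e^{sT},
```

and evaluating X(z) on the unit circle z = e^{jω} yields the discrete-time Fourier transform, just as evaluating the Laplace transform on the jω-axis yields the Fourier transform.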

Continue reading “SPTK: The Z Transform”

CSPB.ML.2022R2: Correcting an RNG Flaw in CSPB.ML.2022

For completeness, I also correct the CSPB.ML.2022 dataset, which is aimed at facilitating neural-network generalization studies.

The same random-number-generator (RNG) error that plagued CSPB.ML.2018 also corrupts CSPB.ML.2022, so that some of the files in the dataset correspond to identical signal parameters. This makes the CSPB.ML.2022 dataset potentially problematic for training a neural network using supervised learning.

In a recent post, I remedied the error and provided an updated CSPB.ML.2018 dataset and called it CSPB.ML.2018R2. Both are still available on the CSP Blog.

In this post, I provide an update to CSPB.ML.2022, called CSPB.ML.2022R2.

Continue reading “CSPB.ML.2022R2: Correcting an RNG Flaw in CSPB.ML.2022”

CSPB.ML.2018R2: Correcting an RNG Flaw in CSPB.ML.2018

KIRK: Everything that is in error must be sterilised.
NOMAD: There are no exceptions.
KIRK: Nomad, I made an error in creating you.
NOMAD: The creation of perfection is no error.
KIRK: I did not create perfection. I created error.

I’ve had to update the original Challenge for the Machine Learners post, and the associated dataset post, a couple times due to flaws in my metadata (truth) files. Those were fairly minor, so I just updated the original posts.

But a new flaw in CSPB.ML.2018 and CSPB.ML.2022 has come to light due to the work of the estimable research engineers at Expedition Technology. The problem is not with labeling or the fundamental correctness of the modulation types, pulse functions, etc., but with the way a random-number generator was applied in my multi-threaded dataset-generation technique.

I’ll explain after the fold, and this post will provide links to an updated version of the dataset, CSPB.ML.2018R2. I’ll keep the original up for continuity and also place a link to this post there. Moreover, the descriptions of the truth files over at CSPB.ML.2018 are still valid: the truth file posted here has the same format as the truth files available on the CSPB.ML.2018 and CSPB.ML.2022 posts.
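The general hazard is worth a sketch for anyone generating datasets in parallel: if each worker constructs its random-number generator the same way, different files can end up repeating the same parameter draws. The snippet below is a generic NumPy illustration of one standard remedy (independent child streams spawned from a single seed); it is not the actual dataset-generation code, and the parameter names are hypothetical.

```python
import numpy as np

def draw_signal_params(rng):
    """Draw one file's worth of (hypothetical) signal parameters from its own stream."""
    return {
        'carrier_offset': rng.uniform(-0.01, 0.01),
        'symbol_rate': rng.uniform(0.05, 0.25),
        'snr_db': rng.uniform(0.0, 12.0),
    }

# One parent seed, spawned into statistically independent child streams: hand one
# child stream to each worker (or each file), and no two files can repeat the
# same parameter draws.
child_seeds = np.random.SeedSequence(12345).spawn(100)
streams = [np.random.default_rng(s) for s in child_seeds]
all_params = [draw_signal_params(rng) for rng in streams]
```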

Continue reading “CSPB.ML.2018R2: Correcting an RNG Flaw in CSPB.ML.2018”