Introduction to Statistical Signal Processing

Module by: Clayton Scott. E-mail the author

Digital Signal Processing

• Digital ≡ sampled, discrete-time, quantized
• Signal ≡ waveform, sequence of measurements or observations
• Processing ≡ analyze, modify, filter, synthesize

Examples of Digital Signals

• sampled speech waveform
• "pixelized" image
• Dow-Jones Index

DSP Applications

• Filtering (noise reduction)
• Pattern recognition (speech, faces, fingerprints)
• Compression

A Major Difficulty

In many (perhaps most) DSP applications we don't have complete or perfect knowledge of the signals we wish to process. We are faced with many unknowns and uncertainties.

Examples

• noisy measurements
• unknown signal parameters
• noisy system or environmental conditions
• natural variability in the signals encountered

How can we design signal processing algorithms in the face of such uncertainty?

Can we model the uncertainty and incorporate this model into the design process?

Statistical signal processing is the study of these questions.

Modeling Uncertainty

The most widely accepted and commonly used approach to modeling uncertainty is Probability Theory (although alternatives exist, such as Fuzzy Logic).

Probability Theory models uncertainty by specifying the chance of observing certain signals.

Alternatively, one can view probability as specifying the degree to which we believe a signal reflects the true state of nature.

Examples of Probabilistic Models

• errors in a measurement (due to an imprecise measuring device) modeled as realizations of a Gaussian random variable.
• uncertainty in the phase of a sinusoidal signal modeled as a uniform random variable on [0, 2π).
• uncertainty in the number of photons striking a CCD per unit time modeled as a Poisson random variable.
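As a minimal sketch (assuming NumPy is available), the three models above can be simulated by drawing samples from the corresponding distributions; the specific parameter values (unit-variance noise, a mean photon rate of 5.0) are illustrative assumptions, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

# Measurement errors: realizations of a Gaussian random variable
# (assumed zero mean, unit variance)
errors = rng.normal(loc=0.0, scale=1.0, size=1000)

# Unknown sinusoid phase: uniform random variable on [0, 2*pi)
phase = rng.uniform(low=0.0, high=2 * np.pi)

# Photon counts per unit time: Poisson random variable
# (assumed mean rate of 5.0 photons per unit time)
photons = rng.poisson(lam=5.0, size=1000)

print(errors.mean(), phase, photons.mean())
```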

Statistical Inference

A statistic is a function of observed data.

Example 1

Suppose we observe N scalar values x_1, …, x_N. The following are statistics:

• x̄ = (1/N) ∑_{n=1}^{N} x_n (sample mean)
• x_1, …, x_N (the data itself)
• min(x_1, …, x_N) (order statistic)
• (x_1² x_2 sin(x_3), e^{x_1 x_3})
A statistic cannot depend on unknown parameters.
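Since a statistic is just a function of the observed data, each example above is directly computable. A minimal sketch (assuming NumPy and a small made-up data vector):

```python
import numpy as np

# Hypothetical observed data x_1, ..., x_N (N = 4 here, for illustration)
x = np.array([2.0, -1.0, 3.0, 0.5])
N = len(x)

# Sample mean: (1/N) * sum of x_n
sample_mean = x.sum() / N        # -> 1.125

# Order statistic: min(x_1, ..., x_N)
order_stat = x.min()             # -> -1.0

# The more elaborate statistic (x_1^2 x_2 sin(x_3), e^{x_1 x_3})
fancy = (x[0]**2 * x[1] * np.sin(x[2]), np.exp(x[0] * x[3]))

print(sample_mean, order_stat, fancy)
```

Note that each quantity depends only on the data, never on an unknown parameter, so each qualifies as a statistic.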

Probability is used to model uncertainty.

Statistics are used to draw conclusions about probability models.

Probability models our uncertainty about signals we may observe.

Statistics reasons from the measured signal to the population of possible signals.

Statistical Signal Processing

• Step 1: Postulate a probability model (or models) that reasonably capture the uncertainties at hand
• Step 2: Collect data
• Step 3: Formulate statistics that allow us to interpret or understand our probability model(s)

In this class

The two major kinds of problems that we will study are detection and estimation. Most SSP problems fall under one of these two headings.

Detection Theory

Given two (or more) probability models, which one best explains the signal?

Examples

1. Decode a wireless communication signal into a string of 0's and 1's
2. Pattern recognition
• voice recognition
• face recognition
• handwritten character recognition
3. Anomaly detection
• irregular heartbeat
• gamma-ray burst in deep space

Estimation Theory

If our probability model has free parameters, what are the best parameter settings to describe the signal we've observed?

Examples

1. Noise reduction
2. Determine parameters of a sinusoid (phase, amplitude, frequency)
• track trajectories of spacecraft
• automatic control systems
• channel equalization
3. Determine location of a submarine (sonar)
4. Seismology: estimate depth below ground of an oil deposit

Example 2: Detection Example

Suppose we observe N tosses of an unfair coin. We would like to decide which side the coin favors, heads or tails.

• Step 1: Assume each toss is a realization of a Bernoulli random variable, with Pr(Heads) = p = 1 − Pr(Tails). We must decide between p = 1/4 and p = 3/4.
• Step 2: Collect data x_1, …, x_N, where x_i = 1 for Heads and x_i = 0 for Tails.
• Step 3: Formulate a useful statistic k = ∑_{n=1}^{N} x_n. If k < N/2, guess p = 1/4; if k > N/2, guess p = 3/4.
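The three steps above can be sketched in a short simulation (assuming NumPy; the true bias p = 3/4, the seed, and N = 1000 are assumptions chosen purely to make the simulation concrete):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1000
p_true = 0.75  # assumed for simulation only; the detector must decide 1/4 vs 3/4

# Step 2: collect data x_1, ..., x_N (1 = Heads, 0 = Tails)
x = rng.binomial(1, p_true, size=N)

# Step 3: statistic k = number of heads; compare against N/2
k = x.sum()
guess = 0.25 if k < N / 2 else 0.75
print(guess)
```

With N = 1000 tosses of a coin biased toward heads, k overwhelmingly exceeds N/2, so the detector correctly guesses p = 3/4.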

Example 3: Estimation Example

Suppose we take N measurements of a DC voltage A with a noisy voltmeter. We would like to estimate A.

• Step 1: Assume a Gaussian noise model: x_n = A + w_n, where w_n ~ 𝒩(0, 1).
• Step 2: Gather data x_1, …, x_N.
• Step 3: Compute the sample mean, Â = (1/N) ∑_{n=1}^{N} x_n, and use this as the estimate.
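The estimation steps above can likewise be sketched as a simulation (assuming NumPy; the true voltage A = 5.0, the seed, and N = 10000 are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
A = 5.0    # true DC level (unknown in practice; assumed here to simulate data)
N = 10000

# Steps 1-2: generate measurements x_n = A + w_n, with w_n ~ N(0, 1)
x = A + rng.normal(0.0, 1.0, size=N)

# Step 3: sample-mean estimate of A
A_hat = x.sum() / N
print(A_hat)
```

Because the noise is zero-mean, the sample mean concentrates around the true A as N grows; here the estimate lands within a few hundredths of a volt of 5.0.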

In these examples (Example 2 and Example 3), we solved detection and estimation problems using intuition and heuristics (in Step 3).

This course will focus on developing principled and mathematically rigorous approaches to detection and estimation, using the theoretical framework of probability and statistics.

Summary

• DSP ≡ processing signals with computer algorithms.
• SSP ≡ statistical DSP ≡ processing in the presence of uncertainties and unknowns.
