The Sampling Theorem says that if we sample a bandlimited signal fast enough, it can be recovered without error from its samples. Sampling, however, is only the first phase of acquiring data into a computer: computational processing further requires that the sampled amplitudes be quantized, converting analog values into digital form. Together, sampling and quantization perform analog-to-digital (A/D) conversion.
Recalling the plot of average daily highs in Problem FIX ME, why is this plot so jagged? Interpret this effect in terms of analog-to-digital conversion.
The plotted temperatures were quantized to the nearest degree. Thus, the high temperature's amplitude was quantized as a form of A/D conversion.
A phenomenon reminiscent of the errors incurred in representing numbers on a computer prevents signal amplitudes from being converted with no error into a binary number representation. In analog-to-digital conversion, the signal is assumed to lie within a predefined range. Assuming we can scale the signal without affecting the information it expresses, we'll define this range to be [-1, 1]. The A/D converter then assigns each amplitude in this range to one of 2^B integers, where B is the number of bits in the converter.
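This assignment of amplitudes to integers can be sketched in a few lines. The following is an illustrative implementation, not the text's; it assumes the signal has already been scaled to [-1, 1], and the function name `quantize` is our own.

```python
def quantize(x, B):
    """Map an amplitude x in [-1, 1] to one of the 2**B integers 0..2**B - 1."""
    levels = 2 ** B
    delta = 2 / levels             # quantization interval: width of each bin
    x = max(-1.0, min(1.0, x))     # clip to the assumed input range
    index = int((x + 1) / delta)   # which interval does x fall into?
    return min(index, levels - 1)  # the top edge x = 1 maps to the last integer

# A two-bit converter partitions [-1, 1] into four intervals of width 0.5.
print([quantize(x, 2) for x in (-1.0, -0.6, -0.1, 0.4, 0.9)])  # → [0, 0, 1, 2, 3]
```

Note that every amplitude within one interval of width 0.5 maps to the same integer, which is exactly the information loss the exercises below quantify.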
[Figure: A/D converter]
We define a quantization interval to be the range of values assigned to the same integer. For a B-bit converter spanning [-1, 1], the quantization interval is 2/2^B; thus, for our example two-bit A/D converter, the quantization interval is 0.5.
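As a quick numerical check of this definition (assuming the full-scale range [-1, 1]; the helper name is ours):

```python
def quantization_interval(B):
    """Width of each quantization bin for a B-bit converter over [-1, 1]."""
    return 2 / 2 ** B

print(quantization_interval(2))   # two-bit converter → 0.5
print(quantization_interval(3))   # one more bit halves the interval → 0.25
```

Each added bit doubles the number of integers and therefore halves the quantization interval.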
How many bits would be required in the A/D converter to ensure that the maximum amplitude quantization error is at least 60 dB below the signal's peak value?
Over the range [-1, 1] the peak value is 1 and the maximum quantization error is half the quantization interval, or 2^-B. A 60 dB amplitude ratio corresponds to a factor of 10^(-60/20) = 0.001. Solving 2^-B ≤ 0.001 yields B = 10 bits.
Once we have acquired signals with an A/D converter, we can process them using digital hardware or software. It can be shown that if the computer processing is linear, the result of sampling, computer processing, and unsampling is equivalent to some analog linear system. Why go to all the bother if the same function can be accomplished using analog techniques? Knowing when digital processing excels and when it does not is an important issue.