Entropy Coding

Module by: Phil Schniter

Summary: In this module, the concepts of entropy and variable-length coding are introduced, motivating the description of the Huffman encoder.

  • Binary Scalar Encoding: Previously we have focused on the memoryless scalar quantizer y = Q(x), where y takes a value from a set of L reconstruction levels. By coding each quantizer output in binary format, we transmit (store) the information at a rate (cost) of

        R = log2(L)  bits/sample.          (1)

    If, for example, L = 8, then we transmit at 3 bits/sample. Say we can tolerate a bit more quantization error, e.g., as results from L = 5. We hope that this reduction in fidelity reduces our transmission requirements, but with this simple binary encoding scheme we still require R = 3 bits/sample!
  • Idea—Block Coding: Let's assign a symbol to each block of 3 consecutive quantizer outputs. We need a symbol alphabet of size 5^3 = 125, which is adequately represented by a 7-bit word (2^7 = 128). Transmitting these words requires only 7/3 ≈ 2.33 bits/sample!
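
    A minimal Python sketch of the rate arithmetic above (the function names are illustrative, not part of the original module): it computes the per-sample cost of fixed-length binary coding applied to single outputs and to blocks of N outputs.

      import math

      def fixed_rate(L):
          # bits/sample when each of the L levels gets its own binary word
          return math.ceil(math.log2(L))

      def block_rate(L, N):
          # bits/sample when each block of N outputs shares one binary word
          return math.ceil(math.log2(L ** N)) / N

      print(fixed_rate(5))      # 3 bits/sample, even though L = 5 < 8
      print(block_rate(5, 3))   # 7/3 = 2.33... bits/sample with 3-sample blocks
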
  • Idea—Variable Length Coding: Assume some of the quantizer outputs occur more frequently than others. Could we come up with an alphabet consisting of short words for representing frequent outputs and longer words for infrequent outputs that would have a lower average transmission rate?

    Example 1: Variable Length Coding

    Consider the quantizer with L = 4 and output probabilities indicated in Table 1. Straightforward 2-bit encoding requires an average bit rate of 2 bits/sample, while the variable length code in Table 1 gives average

        R = Σ_k P_k n_k = 0.6·1 + 0.25·2 + 0.1·3 + 0.05·3 = 1.55  bits/sample,

    where n_k denotes the length of the codeword for output y_k. (A numerical check of this arithmetic follows Table 1.)

    Table 1
    output   P_k    code
    y1       0.60   0
    y2       0.25   01
    y3       0.10   011
    y4       0.05   111
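
    A minimal Python sketch of that check (the variable names are ours), assuming the probabilities and codeword lengths of Table 1:

      P = [0.60, 0.25, 0.10, 0.05]   # output probabilities from Table 1
      n = [1, 2, 3, 3]               # lengths of the codewords 0, 01, 011, 111

      R = sum(p * length for p, length in zip(P, n))
      print(round(R, 2))             # 1.55 bits/sample, versus 2 bits/sample for fixed 2-bit encoding
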
  • (Just enough information about) Entropy:

    Exercise 1

    Question

    Given an arbitrarily complex coding scheme, what is the minimum bits/sample required to transmit (store) the sequence {y(n)}?

    Answer

    When the random process {y(n)} is i.i.d., the minimum average bit rate is

        R_min = H_y + ϵ,          (2)

    where H_y is the entropy of the random variable y(n) in bits:

        H_y = - Σ_{k=1}^{L} P_k log2(P_k),          (3)

    and ϵ is an arbitrarily small positive constant (see the textbooks by Berger and by Cover & Thomas).

    Notes:

    • Entropy obeys 0 ≤ H_y ≤ log2(L). The lower bound is attained when P_k = 1 for some k, while the upper bound is attained when P_k = 1/L for every k.
    • The term entropy refers to the average information of a single random variable, while the term entropy rate refers to the average information of a sequence of random variables, i.e., a random process.
    • When {y(n)} is not independent (the focus of later sections), a different expression for R_min applies.
    • Though the minimum rate is well specified, the construction of a coding scheme which always achieves this rate is not.

    Example: Entropy of Variable Length Code

    Recalling the setup of Example 1, we find that

        H_y = -(0.6·log2(0.6) + 0.25·log2(0.25) + 0.1·log2(0.1) + 0.05·log2(0.05)) ≈ 1.49  bits.

    Assuming i.i.d. {y(n)}, we have R_min = 1.49 bits/sample. Compare with the variable length code in Table 2, which gave R = 1.55 bits/sample. (A numerical check follows the table.)

    Table 2
    output   P_k    code
    y1       0.60   0
    y2       0.25   01
    y3       0.10   011
    y4       0.05   111
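
    The entropy of Equation 3 and the comparison above can be verified with a short Python sketch (a check under the Table 2 probabilities; the helper name entropy is ours):

      import math

      def entropy(P):
          # H_y = -sum_k P_k log2(P_k), in bits
          return -sum(p * math.log2(p) for p in P if p > 0)

      P = [0.60, 0.25, 0.10, 0.05]   # Table 2 probabilities
      n = [1, 2, 3, 3]               # codeword lengths from Table 2

      print(round(entropy(P), 2))                         # 1.49 bits: the i.i.d. lower bound R_min
      print(round(sum(p * k for p, k in zip(P, n)), 2))   # 1.55 bits/sample for the code in Table 2
      print(math.log2(len(P)))                            # 2.0 bits: the upper bound log2(L) on H_y
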
  • Huffman Encoding: Given quantizer outputs y_k, or fixed-length blocks of outputs (y_j, y_k, ...), the Huffman procedure constructs variable length codes that are optimal in certain respects (see Cover & Thomas). For example, when the probabilities {P_k} are powers of 1/2 (and {y(n)} is i.i.d.), the average bit rate of a Huffman encoded output attains R_min.

    Aside: Huffman Procedure (Binary Case):

    1. Arrange the output probabilities P_k in decreasing order and consider them as the leaf nodes of a tree.
    2. While there exists more than one node:
      • Merge the two nodes with smallest probability to form a new node whose probability equals the sum of the two merged nodes.
      • Arbitrarily assign 1 and 0 to the two branches of the merging pair.
    3. The code assigned to each output is obtained by reading the branch bits sequentially from root node to leaf node. (A minimal code sketch of this procedure appears after Table 3.)
    Figure 1 (img009.png)

    Example 2: Huffman Encoder Attaining R_min

    In the aside above (see Figure 1), a Huffman code was constructed for the output probabilities listed in Table 3 below. Here

        H_y = -(0.5·log2(0.5) + 0.25·log2(0.25) + 2·0.125·log2(0.125)) = 1.75  bits,

    so that R_min = 1.75 bits/sample (with the i.i.d. assumption). Since the average bit rate for the Huffman code is also

        R = 0.5·1 + 0.25·2 + 0.125·3 + 0.125·3 = 1.75  bits/sample,

    Huffman encoding attains R_min for this output distribution.

    Table 3
    output   P_k     code
    y1       0.5     0
    y2       0.25    01
    y3       0.125   011
    y4       0.125   111
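
    A minimal Python sketch of the binary Huffman procedure described in the aside above (using the standard heapq module; the function name and tie-breaking counter are our own choices, and the 0/1 branch assignments are arbitrary, so the codewords it prints may differ from Table 3 even though the codeword lengths, and hence R, agree):

      import heapq

      def huffman_code(probs):
          # Build {symbol: codeword} by repeatedly merging the two least-probable
          # nodes; each merge prepends a branch bit to the codewords in its subtrees.
          heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
          heapq.heapify(heap)
          counter = len(heap)                      # tie-breaker for equal probabilities
          while len(heap) > 1:
              p0, _, c0 = heapq.heappop(heap)      # smallest probability
              p1, _, c1 = heapq.heappop(heap)      # next smallest
              merged = {s: "0" + w for s, w in c0.items()}
              merged.update({s: "1" + w for s, w in c1.items()})
              heapq.heappush(heap, (p0 + p1, counter, merged))
              counter += 1
          return heap[0][2]

      probs = {"y1": 0.5, "y2": 0.25, "y3": 0.125, "y4": 0.125}   # Table 3 probabilities
      code = huffman_code(probs)
      print(code)                                   # e.g. {'y1': '0', 'y2': '10', 'y3': '110', 'y4': '111'}
      print(sum(p * len(code[s]) for s, p in probs.items()))      # 1.75 bits/sample = R_min

    Because the Table 3 probabilities are powers of 1/2, the resulting codeword lengths (1, 2, 3, 3) make the average rate equal the entropy, as in Example 2.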
