Fundamentals of Electrical Engineering I
Compression and the Huffman Code

Module by: Don Johnson. E-mail the author

Summary: The Huffman source coding algorithm is provably maximally efficient.

Shannon's Source Coding Theorem has additional applications in data compression. Here, we have a symbolic-valued signal source, like a computer file or an image, that we want to represent with as few bits as possible. Compression schemes that assign symbols to bit sequences are known as lossless if they obey the Source Coding Theorem; they are lossy if they use fewer bits than the alphabet's entropy. Using a lossy compression scheme means that you cannot recover a symbolic-valued signal from its compressed version without incurring some error. You might be wondering why anyone would want to intentionally create errors, but lossy compression schemes are frequently used where the efficiency gained in representing the signal outweighs the significance of the errors.

Shannon's Source Coding Theorem states that symbolic-valued signals require, on the average, at least H(A) bits to represent each of their values, which are symbols drawn from the alphabet A. In the module on the Source Coding Theorem we find that using a so-called fixed-rate source coder, one that produces a fixed number of bits/symbol, may not be the most efficient way of encoding symbols into bits. What is not discussed there is a procedure for designing an efficient source coder: one guaranteed to produce the fewest bits/symbol on the average. That source coder is not unique, and one approach that achieves that limit is the Huffman source coding algorithm.
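The entropy bound H(A) that a source coder must meet on average can be computed directly from the symbol probabilities. As a minimal sketch (the module itself gives no code; the function name `entropy` is ours):

```python
import math

def entropy(probs):
    """Shannon entropy H(A) in bits for a list of symbol probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The four-symbol alphabet used in the example below
print(entropy([1/2, 1/4, 1/8, 1/8]))  # 1.75
```

Because every probability here is a power of two, the entropy works out to exactly 1.75 bits, which is why a Huffman code can meet the limit exactly for this alphabet.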

Point of Interest:

In the early years of information theory, the race was on to be the first to find a provably maximally efficient source coding algorithm. The race was won by then MIT graduate student David Huffman in 1954, who worked on the problem as a project in his information theory course. We're pretty sure he received an “A.”

The Huffman source coding algorithm proceeds as follows:
  • Create a vertical table for the symbols, the best ordering being in decreasing order of probability.
  • Form a binary tree to the right of the table. A binary tree always has two branches at each node. Build the tree by merging the two lowest-probability symbols at each level, making the probability of the node equal to the sum of the merged nodes' probabilities. If more than two nodes/symbols share the lowest probability at a given level, pick any two; your choice won't affect the average code length B(A).
  • At each node, label each of the emanating branches with a binary number. The bit sequence obtained from passing from the tree's root to the symbol is its Huffman code.
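The steps above can be sketched in code. This is one common way to implement the merge procedure, using a priority queue to find the two lowest-probability nodes at each step (the heap-based structure is our choice, not something the module prescribes):

```python
import heapq

def huffman_code(probs):
    """Build a Huffman code for a dict mapping symbol -> probability.

    Returns a dict mapping symbol -> codeword (a bit string). At each
    step the two lowest-probability nodes are merged, and the branch
    labels 0 and 1 are prepended to the codewords in each subtree.
    """
    # Heap entries: (probability, tiebreaker, {symbol: partial codeword})
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, left = heapq.heappop(heap)
        p2, _, right = heapq.heappop(heap)
        # Label the two emanating branches 0 and 1
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

code = huffman_code({"a1": 1/2, "a2": 1/4, "a3": 1/8, "a4": 1/8})
lengths = {s: len(c) for s, c in code.items()}
print(lengths)  # {'a1': 1, 'a2': 2, 'a3': 3, 'a4': 3}
```

For this alphabet the codeword lengths are 1, 2, 3, and 3 bits, giving an average of (1/2)(1) + (1/4)(2) + (1/8)(3) + (1/8)(3) = 1.75 bits/symbol.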

Example 1

The simple four-symbol alphabet used in the Entropy and Source Coding modules has the following probabilities, Pr[a1] = 1/2, Pr[a2] = 1/4, Pr[a3] = 1/8, Pr[a4] = 1/8, and an entropy of 1.75 bits. This alphabet has the Huffman coding tree shown in Figure 1.

Figure 1: We form a Huffman code for a four-letter alphabet having the indicated probabilities of occurrence. The binary tree created by the algorithm extends to the right, with the root node (the one at which the tree begins) defining the codewords. The bit sequence obtained by traversing the tree from the root to the symbol defines that symbol's binary code.
[Figure: Huffman Coding Tree (sys21.png)]

The code thus obtained is not unique, as we could have labeled the branches coming out of each node differently. The average number of bits required to represent this alphabet equals 1.75 bits, which is the Shannon entropy limit for this source alphabet. If we had the symbolic-valued signal s(m) = {a2, a3, a1, a4, a1, a2, …}, our Huffman code would produce the bitstream b(n) = 101100111010….

If the alphabet probabilities were different, clearly a different tree, and therefore a different code, could well result. Furthermore, we may not be able to achieve the entropy limit. If our symbols had the probabilities Pr[a1] = 1/2, Pr[a2] = 1/4, Pr[a3] = 1/5, and Pr[a4] = 1/20, the average number of bits/symbol resulting from the Huffman coding algorithm would equal 1.75 bits. However, the entropy limit is 1.68 bits. The Huffman code does satisfy the Source Coding Theorem (its average length is within one bit of the alphabet's entropy), but you might wonder whether a better code exists. David Huffman showed mathematically that no other code could achieve a shorter average code length than his. We can't do better.

Exercise 1

Derive the Huffman code for this second set of probabilities, and verify the claimed average code length and alphabet entropy.

Solution

The Huffman coding tree for the second set of probabilities is identical to that for the first (Figure 1). The average code length is (1/2)(1) + (1/4)(2) + (1/5)(3) + (1/20)(3) = 1.75 bits. The entropy calculation is straightforward: H(A) = -((1/2)log2(1/2) + (1/4)log2(1/4) + (1/5)log2(1/5) + (1/20)log2(1/20)), which equals 1.68 bits.
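The two quantities in the solution can be checked numerically, using the codeword lengths 1, 2, 3, 3 from the tree in Figure 1:

```python
import math

probs = {"a1": 1/2, "a2": 1/4, "a3": 1/5, "a4": 1/20}
lengths = {"a1": 1, "a2": 2, "a3": 3, "a4": 3}  # same tree as Figure 1

# Average code length: sum of probability times codeword length
avg = sum(probs[s] * lengths[s] for s in probs)

# Entropy: H(A) = -sum p log2 p
H = -sum(p * math.log2(p) for p in probs.values())

print(round(avg, 2), round(H, 2))  # 1.75 1.68
```

The gap between 1.75 and 1.68 bits arises because codeword lengths must be integers; only when every probability is a negative power of two (as in the first alphabet) can a Huffman code hit the entropy limit exactly.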
