In this module, the concepts of entropy and variable-length coding are introduced, motivating the description of the Huffman encoder.
  • Binary Scalar Encoding: Previously we have focused on the memoryless scalar quantizer y = Q(x), where y takes a value from a set of L reconstruction levels. By coding each quantizer output in binary format, we transmit (store) the information at a rate (cost) of
    R = ⌈log₂ L⌉ bits/sample.
    If, for example, L = 8, then we transmit at 3 bits/sample. Say we can tolerate a bit more quantization error, e.g., as results from L = 5. We hope that this reduction in fidelity reduces our transmission requirements, but with this simple binary encoding scheme we still require R = ⌈log₂ 5⌉ = 3 bits/sample!
  • Idea—Block Coding: Let's assign a symbol to each block of 3 consecutive quantizer outputs. We need a symbol alphabet of size 5³ = 125, which is adequately represented by a 7-bit word (2⁷ = 128). Transmitting these words requires only 7/3 ≈ 2.33 bits/sample!
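    The block-coding arithmetic above can be checked with a short sketch (variable names here are illustrative, not from the source):

    ```python
    import math

    L, n = 5, 3                                      # 5 quantizer levels, blocks of 3 samples
    symbols = L ** n                                 # 125 distinct blocks
    bits_per_block = math.ceil(math.log2(symbols))   # 7 bits, since 2**7 = 128 >= 125
    rate = bits_per_block / n                        # bits per sample
    print(symbols, bits_per_block, round(rate, 2))   # 125 7 2.33
    ```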
  • Idea—Variable Length Coding: Assume some of the quantizer outputs occur more frequently than others. Could we come up with an alphabet consisting of short words for representing frequent outputs and longer words for infrequent outputs that would have a lower average transmission rate?

    Variable length coding

    Consider the quantizer with L = 4 and the output probabilities indicated in [link]. Straightforward 2-bit encoding requires an average bit rate of 2 bits/sample, while the variable length code in [link] gives average
    R = Σₖ Pₖ nₖ = 0.6·1 + 0.25·2 + 0.1·3 + 0.05·3 = 1.55 bits/sample,
    where nₖ is the length (in bits) of the codeword for output yₖ.

    output   Pₖ     code
    y₁       0.60   0
    y₂       0.25   10
    y₃       0.10   110
    y₄       0.05   111
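    As a quick sketch, the average rate computation above can be reproduced from the probabilities and codeword lengths (1, 2, 3, 3 bits) in the table:

    ```python
    # Probabilities P_k and codeword lengths n_k from the table above
    P = [0.60, 0.25, 0.10, 0.05]
    n = [1, 2, 3, 3]
    R = sum(p * k for p, k in zip(P, n))   # average bits/sample
    print(round(R, 2))                     # 1.55, versus 2 for fixed-length coding
    ```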
  • (Just enough information about) Entropy:

    Given an arbitrarily complex coding scheme, what is the minimum number of bits/sample required to transmit (store) the sequence { y(n) }?

    When the random process { y(n) } is i.i.d., the minimum average bit rate is

    R_min = H_y + ε,
    where H_y is the entropy of the random variable y(n) in bits:
    H_y = − Σₖ₌₁ᴸ Pₖ log₂ Pₖ,
    and ε is an arbitrarily small positive constant (see the textbooks by Berger and by Cover & Thomas).
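    A minimal sketch of the entropy formula, applied to the L = 4 example above (the function name is illustrative):

    ```python
    import math

    def entropy_bits(probs):
        # H_y = -sum_k P_k * log2(P_k), skipping zero-probability outputs
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # For the earlier example, H ≈ 1.49 bits: slightly below the
    # 1.55 bits/sample achieved by that variable length code.
    H = entropy_bits([0.60, 0.25, 0.10, 0.05])
    print(round(H, 3))
    ```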

  • Huffman Encoding: Given quantizer outputs yₖ, or fixed-length blocks of outputs (y_j y_k ⋯), the Huffman procedure constructs variable length codes that are optimal in certain respects (see Cover & Thomas). For example, when the probabilities { Pₖ } are powers of 1/2 (and { y(n) } is i.i.d.), the average bit rate of the Huffman code attains R_min.

    Huffman procedure (binary case)

    1. Arrange output probabilities Pₖ in decreasing order and consider them as the leaf nodes of a tree.
    2. While there exists more than one node:
      • Merge the two nodes with smallest probability to form a new node whose probability equals the sum of the two merged nodes.
      • Arbitrarily assign 1 and 0 to the two branches of the merging pair.
    3. The code assigned to each output is obtained by reading the branch bits sequentially from root node to leaf node.
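    The three steps above can be sketched with a priority queue; this is one possible implementation, not the source's, and tie-breaking between equal probabilities (step 2's arbitrary choices) may yield different but equally optimal codes:

    ```python
    import heapq
    from itertools import count

    def huffman(probs):
        """Binary Huffman codes for a list of output probabilities."""
        tiebreak = count()  # breaks probability ties so dicts are never compared
        heap = [(p, next(tiebreak), {i: ""}) for i, p in enumerate(probs)]
        heapq.heapify(heap)
        while len(heap) > 1:
            # merge the two smallest-probability nodes (step 2)
            p0, _, c0 = heapq.heappop(heap)
            p1, _, c1 = heapq.heappop(heap)
            merged = {i: "0" + c for i, c in c0.items()}   # branch bit 0
            merged.update({i: "1" + c for i, c in c1.items()})  # branch bit 1
            heapq.heappush(heap, (p0 + p1, next(tiebreak), merged))
        return heap[0][2]  # codeword read root-to-leaf for each output (step 3)

    print(huffman([0.5, 0.25, 0.125, 0.125]))
    ```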

    Huffman encoder attaining R_min

    In [link], a Huffman code was constructed for the output probabilities listed below. Here
    H_y = −(0.5 log₂ 0.5 + 0.25 log₂ 0.25 + 2 · 0.125 log₂ 0.125) = 1.75 bits,
    so that R_min = 1.75 bits/sample (with the i.i.d. assumption). Since the average bit rate for the Huffman code is also
    R = 0.5·1 + 0.25·2 + 0.125·3 + 0.125·3 = 1.75 bits/sample,
    Huffman encoding attains R_min for this output distribution.

    output   Pₖ      code
    y₁       0.5     0
    y₂       0.25    10
    y₃       0.125   110
    y₄       0.125   111
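    A short sketch confirming that entropy and average Huffman rate coincide for these dyadic (power-of-1/2) probabilities:

    ```python
    import math

    # Probabilities and codeword lengths from the table above
    P = [0.5, 0.25, 0.125, 0.125]
    n = [1, 2, 3, 3]
    H = -sum(p * math.log2(p) for p in P)   # entropy H_y
    R = sum(p * k for p, k in zip(P, n))    # average Huffman bit rate
    print(H, R)  # both equal 1.75, so the Huffman code attains R_min here
    ```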

Source:  OpenStax, An introduction to source-coding: quantization, dpcm, transform coding, and sub-band coding. OpenStax CNX. Sep 25, 2009 Download for free at http://cnx.org/content/col11121/1.2