The Huffman source coding algorithm is provably maximally efficient.

Shannon's Source Coding Theorem has additional applications in data compression. Here, we have a symbolic-valued signal source, like a computer file or an image, that we want to represent with as few bits as possible. Compression schemes that assign symbols to bit sequences are known as lossless if they obey the Source Coding Theorem; they are lossy if they use fewer bits than the alphabet's entropy. Using a lossy compression scheme means that you cannot recover a symbolic-valued signal from its compressed version without incurring some error. You might be wondering why anyone would want to intentionally create errors, but lossy compression schemes are frequently used where the efficiency gained in representing the signal outweighs the significance of the errors.

Shannon's Source Coding Theorem states that symbolic-valued signals require on the average at least H(A) bits to represent each of its values, which are symbols drawn from the alphabet A. In the module on the Source Coding Theorem we find that using a so-called fixed rate source coder, one that produces a fixed number of bits/symbol, may not be the most efficient way of encoding symbols into bits. What is not discussed there is a procedure for designing an efficient source coder: one guaranteed to produce the fewest bits/symbol on the average. That source coder is not unique, and one approach that does achieve that limit is the Huffman source coding algorithm.
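As a concrete illustration of that bound, the brief Python sketch below (not part of the original module) computes H(A) for a four-symbol alphabet with probabilities 1/2, 1/4, 1/8, 1/8 (the example used later in this module) and compares it with the 2 bits/symbol a fixed-rate coder would spend.

```python
from math import log2

# Probabilities of the four-symbol alphabet used later in this module.
probs = [1/2, 1/4, 1/8, 1/8]

entropy = -sum(p * log2(p) for p in probs)   # H(A): lower bound on average bits/symbol
fixed_rate = log2(len(probs))                # a fixed-rate coder spends log2(4) = 2 bits on every symbol

print(entropy)     # 1.75
print(fixed_rate)  # 2.0
```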

In the early years of information theory, the race was on to be the first to find a provably maximally efficient source coding algorithm. The race was won by then-MIT graduate student David Huffman in 1954, who worked on the problem as a project in his information theory course. We're pretty sure he received an “A.”
  • Create a vertical table for the symbols, the best ordering being in decreasing order of probability.
  • Form a binary tree to the right of the table. A binary tree always has two branches at each node. Build the tree by merging the two lowest-probability symbols at each level, making the probability of the node equal to the sum of the merged nodes' probabilities. If more than two nodes/symbols share the lowest probability at a given level, pick any two; your choice won't affect the average code length B(A).
  • At each node, label each of the emanating branches with a binary number. The bit sequence obtained from passing from the tree's root to the symbol is its Huffman code. (A code sketch of these steps follows the list.)
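In Python, the procedure might look like the following minimal sketch (an illustration, not taken from the original module; the symbol names and the dictionary bookkeeping are assumptions). A priority queue repeatedly merges the two lowest-probability nodes, and each merge prepends one bit to the codeword of every symbol beneath it.

```python
import heapq

def huffman_code(probabilities):
    """Return {symbol: codeword} built by repeatedly merging the two least probable nodes."""
    # Each heap entry: (probability, tie-breaker, {symbol: partial codeword so far}).
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probabilities.items())]
    heapq.heapify(heap)
    counter = len(heap)                      # unique tie-breaker so the dicts are never compared
    while len(heap) > 1:
        p0, _, codes0 = heapq.heappop(heap)  # lowest probability
        p1, _, codes1 = heapq.heappop(heap)  # second lowest
        merged = {sym: "0" + code for sym, code in codes0.items()}
        merged.update({sym: "1" + code for sym, code in codes1.items()})
        heapq.heappush(heap, (p0 + p1, counter, merged))
        counter += 1
    return heap[0][2]

# Four-symbol alphabet with probabilities 1/2, 1/4, 1/8, 1/8 (the example below).
probs = {"a1": 1/2, "a2": 1/4, "a3": 1/8, "a4": 1/8}
code = huffman_code(probs)
print(code)  # one valid result: {'a1': '0', 'a2': '10', 'a3': '110', 'a4': '111'}
print(sum(probs[s] * len(code[s]) for s in probs))  # average length: 1.75 bits/symbol
```

Because ties and branch labels can be chosen freely, other equally valid code tables exist; only the average code length is guaranteed.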

The simple four-symbol alphabet used in the Entropy and Source Coding modules has the following probabilities, Pr[a1] = 1/2, Pr[a2] = 1/4, Pr[a3] = 1/8, Pr[a4] = 1/8, and an entropy of 1.75 bits. This alphabet has the Huffman coding tree shown in [link].

Huffman coding tree

We form a Huffman code for a four-letter alphabet having the indicated probabilities of occurrence. The binary tree created by the algorithm extends to the right, with the root node (the one at which the tree begins) defining the codewords. The bit sequence obtained by traversing the tree from the root to the symbol defines that symbol's binary code.

The code thus obtained is not unique as we could have labeled the branches coming out of each node differently. The average number of bits required to represent this alphabet equals 1.75 bits, which is the Shannon entropy limit for this source alphabet. If we had the symbolic-valued signal s(m) = {a2, a3, a1, a4, a1, a2}, our Huffman code would produce the bitstream b(n) = 101100111010….
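As a quick check of that bitstream, the short sketch below (illustrative; the code table is one valid reading of the tree in [link], and branch labels could be swapped) encodes s(m) symbol by symbol.

```python
# One valid code table read off the Huffman tree.
code = {"a1": "0", "a2": "10", "a3": "110", "a4": "111"}

signal = ["a2", "a3", "a1", "a4", "a1", "a2"]        # s(m) from the text
bitstream = "".join(code[sym] for sym in signal)
print(bitstream)                                      # 101100111010
```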

If the alphabet probabilities were different, clearly a different tree, and therefore a different code, could well result. Furthermore, we may not be able to achieve the entropy limit. If our symbols had the probabilities Pr[a1] = 1/2, Pr[a2] = 1/4, Pr[a3] = 1/5, and Pr[a4] = 1/20, the average number of bits/symbol resulting from the Huffman coding algorithm would equal 1.75 bits. However, the entropy limit is 1.68 bits. The Huffman code does satisfy the Source Coding Theorem (its average length is within one bit of the alphabet's entropy), but you might wonder if a better code existed. David Huffman showed mathematically that no other code could achieve a shorter average code length than his. We can't do better.


Derive the Huffman code for this second set of probabilities, and verify the claimed average code length and alphabet entropy.

The Huffman coding tree for the second set of probabilities is identical to that for the first ([link]). The average code length is (1/2)(1) + (1/4)(2) + (1/5)(3) + (1/20)(3) = 1.75 bits. The entropy calculation is straightforward: H(A) = −((1/2)log2(1/2) + (1/4)log2(1/4) + (1/5)log2(1/5) + (1/20)log2(1/20)), which equals 1.68 bits.
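The arithmetic in this answer can be checked with a few lines of Python (an illustrative verification, not part of the original solution; the code lengths 1, 2, 3, 3 come from the tree above).

```python
from math import log2

# Second set of probabilities; the Huffman code lengths are the same as before.
probs = {"a1": 1/2, "a2": 1/4, "a3": 1/5, "a4": 1/20}
lengths = {"a1": 1, "a2": 2, "a3": 3, "a4": 3}

avg_length = sum(probs[s] * lengths[s] for s in probs)
entropy = -sum(p * log2(p) for p in probs.values())

print(round(avg_length, 2))  # 1.75
print(round(entropy, 2))     # 1.68
```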


Source: OpenStax, Fundamentals of Electrical Engineering I. OpenStax CNX. Aug 06, 2008. Download for free at http://legacy.cnx.org/content/col10040/1.9