The Huffman source coding algorithm is provably maximally efficient.

Shannon's Source Coding Theorem has additional applications in data compression. Here, we have a symbolic-valued signal source, like a computer file or an image, that we want to represent with as few bits as possible. Compression schemes that assign symbols to bit sequences are known as lossless if they obey the Source Coding Theorem; they are lossy if they use fewer bits than the alphabet's entropy. Using a lossy compression scheme means that you cannot recover a symbolic-valued signal from its compressed version without incurring some error. You might wonder why anyone would intentionally introduce errors, but lossy compression schemes are frequently used when the efficiency gained in representing the signal outweighs the significance of the errors.

Shannon's Source Coding Theorem states that symbolic-valued signals require, on the average, at least H(A) bits to represent each of their values, which are symbols drawn from the alphabet A. In the module on the Source Coding Theorem we find that using a so-called fixed rate source coder, one that produces a fixed number of bits/symbol, may not be the most efficient way of encoding symbols into bits. What is not discussed there is a procedure for designing an efficient source coder: one guaranteed to produce the fewest bits/symbol on the average. Such a source coder is not unique, and one approach that does achieve the entropy limit is the Huffman source coding algorithm.

In the early years of information theory, the race was on to be the first to find a provably maximally efficient source coding algorithm. The race was won by then-MIT graduate student David Huffman in 1954, who worked on the problem as a project in his information theory course. We're pretty sure he received an “A.”
  • Create a vertical table for the symbols, the best ordering being in decreasing order of probability.
  • Form a binary tree to the right of the table. A binary tree always has two branches at each node. Build the tree by merging the two lowest-probability symbols at each level, making the probability of the node equal to the sum of the merged nodes' probabilities. If more than two nodes/symbols share the lowest probability at a given level, pick any two; your choice won't affect the average code length B(A).
  • At each node, label each of the emanating branches with a binary number. The bit sequence obtained by passing from the tree's root to the symbol is its Huffman code.
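The three steps above can be sketched in plain Python. This is an illustrative sketch, not code from the module: the dictionary-of-codewords representation and symbol names are conveniences, and a heap stands in for re-sorting the table after each merge.

```python
import heapq

def huffman_code(probs):
    """Build a Huffman code for a dict {symbol: probability}.

    Follows the steps above: repeatedly merge the two
    lowest-probability nodes; at each merge, label the two
    branches 0 and 1 by prepending a bit to every codeword
    in the merged subtrees.
    """
    # Heap entries: (probability, tiebreaker, {symbol: partial codeword})
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        p0, _, left = heapq.heappop(heap)   # lowest probability
        p1, _, right = heapq.heappop(heap)  # second lowest
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (p0 + p1, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

# The four-symbol alphabet discussed next in the text:
code = huffman_code({"a0": 1/2, "a1": 1/4, "a2": 1/8, "a3": 1/8})
print(code)  # codeword lengths are 1, 2, 3, 3 bits
```

The exact bit patterns depend on how branches are labeled (the code is not unique, as noted below), but the codeword lengths, and hence the average length, are what the theorem constrains.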

The simple four-symbol alphabet used in the Entropy and Source Coding modules has the following probabilities, Pr[a0] = 1/2, Pr[a1] = 1/4, Pr[a2] = 1/8, Pr[a3] = 1/8, and an entropy of 1.75 bits. This alphabet has the Huffman coding tree shown in [link].

Huffman coding tree

We form a Huffman code for a four-letter alphabet having the indicated probabilities of occurrence. The binary tree created by the algorithm extends to the right, with the root node (the one at which the tree begins) defining the codewords. The bit sequence obtained by traversing the tree from the root to the symbol defines that symbol's binary code.

The code thus obtained is not unique, as we could have labeled the branches coming out of each node differently. The average number of bits required to represent this alphabet equals 1.75 bits, which is the Shannon entropy limit for this source alphabet. If we had the symbolic-valued signal s(m) = {a2, a3, a1, a4, a1, a2}, our Huffman code would produce the bitstream b(n) = 101100111010….
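One branch labeling consistent with the tree, which reproduces the bitstream quoted above, can be checked directly. The specific codewords here are one valid assignment among several; any relabeling of branches gives an equally good code.

```python
# One labeling of the tree's branches consistent with the text:
code = {"a1": "0", "a2": "10", "a3": "110", "a4": "111"}

def encode(symbols, code):
    """Concatenate codewords to form the bitstream b(n)."""
    return "".join(code[s] for s in symbols)

bits = encode(["a2", "a3", "a1", "a4", "a1", "a2"], code)
print(bits)  # -> 101100111010
```

Because the code is prefix-free, the bitstream can also be decoded unambiguously by walking the tree from the root, one bit at a time.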

If the alphabet probabilities were different, clearly a different tree, and therefore a different code, could well result. Furthermore, we may not be able to achieve the entropy limit. If our symbols had the probabilities Pr[a1] = 1/2, Pr[a2] = 1/4, Pr[a3] = 1/5, and Pr[a4] = 1/20, the average number of bits/symbol resulting from the Huffman coding algorithm would equal 1.75 bits. However, the entropy limit is 1.68 bits. The Huffman code does satisfy the Source Coding Theorem (its average length is within one bit of the alphabet's entropy), but you might wonder if a better code existed. David Huffman showed mathematically that no other code could achieve a shorter average code length than his. We can't do better.


Derive the Huffman code for this second set of probabilities, and verify the claimed average code lengthand alphabet entropy.

The Huffman coding tree for the second set of probabilities is identical to that for the first ([link]). The average code length is (1/2)(1) + (1/4)(2) + (1/5)(3) + (1/20)(3) = 1.75 bits. The entropy calculation is straightforward: H(A) = −((1/2) log2 (1/2) + (1/4) log2 (1/4) + (1/5) log2 (1/5) + (1/20) log2 (1/20)), which equals 1.68 bits.
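Both numbers are easy to verify numerically. The codeword lengths below are read off the Huffman tree (1, 2, 3, and 3 bits for the four symbols, as in the first example):

```python
from math import log2

probs = {"a1": 1/2, "a2": 1/4, "a3": 1/5, "a4": 1/20}
lengths = {"a1": 1, "a2": 2, "a3": 3, "a4": 3}  # from the Huffman tree

# Average code length: sum of probability x codeword length
avg_length = sum(p * lengths[s] for s, p in probs.items())

# Entropy: H(A) = -sum of p * log2(p)
entropy = -sum(p * log2(p) for p in probs.values())

print(round(avg_length, 2), round(entropy, 2))  # 1.75 1.68
```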


Source:  OpenStax, Fundamentals of electrical engineering i. OpenStax CNX. Aug 06, 2008 Download for free at http://legacy.cnx.org/content/col10040/1.9