The Source Coding Theorem states that the entropy of an alphabet of symbols specifies to within one bit how many bits on the average need to be used to send the alphabet.

The significance of an alphabet's entropy rests in how we can represent it with a sequence of bits. Bit sequences form the "coin of the realm" in digital communications: they are the universal way of representing symbolic-valued signals. We convert back and forth between symbols and bit sequences with what is known as a codebook: a table that associates symbols with bit sequences. In creating this table, we must be able to assign a unique bit sequence to each symbol so that we can go between symbol and bit sequences without error.

You may be conjuring the notion of hiding information from others when we use the name codebook for the symbol-to-bit-sequence table. There is no relation to cryptology, which comprises mathematically provable methods of securing information. The codebook terminology was developed during the beginnings of information theory just after World War II.

As we shall explore in some detail elsewhere, digital communication is the transmission of symbolic-valued signals from one place to another. When faced with the problem, for example, of sending a file across the Internet, we must first represent each character by a bit sequence. Because we want to send the file quickly, we want to use as few bits as possible. However, we don't want to use so few bits that the receiver cannot determine what each character was from the bit sequence. For example, we could use one bit for every character: file transmission would be fast but useless because the codebook creates errors. Shannon proved in his monumental work what we call today the Source Coding Theorem. Let $B(a_k)$ denote the number of bits used to represent the symbol $a_k$. The average number of bits $\bar{B}(A)$ required to represent the entire alphabet equals $\sum_{k=1}^{K} B(a_k) \Pr[a_k]$. The Source Coding Theorem states that the average number of bits needed to accurately represent the alphabet need only satisfy

$$H(A) \le \bar{B}(A) \le H(A) + 1$$
Thus, the alphabet's entropy specifies to within one bit how many bits on the average need to be used to send the alphabet. The smaller an alphabet's entropy, the fewer bits required for digital transmission of files expressed in that alphabet.
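To make the bound concrete, here is a minimal Python sketch (not part of the original text; the probabilities, codeword lengths, and function names are illustrative assumptions) that computes an alphabet's entropy and a candidate codebook's average codeword length, then checks the Source Coding Theorem inequality:

from math import log2

def entropy(probs):
    # Entropy H(A) in bits for the given symbol probabilities.
    return -sum(p * log2(p) for p in probs if p > 0)

def average_bits(lengths, probs):
    # Average codeword length B(A) = sum over k of B(a_k) * Pr[a_k].
    return sum(b * p for b, p in zip(lengths, probs))

# Hypothetical three-symbol alphabet with candidate codewords 0, 10, 11.
probs = [0.5, 0.3, 0.2]
lengths = [1, 2, 2]

H = entropy(probs)
B = average_bits(lengths, probs)
print(f"H(A) = {H:.3f} bits, average B(A) = {B:.3f} bits")
print("H(A) <= B(A) <= H(A) + 1:", H <= B <= H + 1)

For these example probabilities the average length of 1.5 bits falls between the entropy of about 1.49 bits and that value plus one, just as the theorem requires.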

A four-symbol alphabet has the following probabilities: $\Pr[a_0] = \frac{1}{2}$, $\Pr[a_1] = \frac{1}{4}$, $\Pr[a_2] = \frac{1}{8}$, $\Pr[a_3] = \frac{1}{8}$, and an entropy of $H(A) = -\left(\frac{1}{2}\log_2\frac{1}{2} + \frac{1}{4}\log_2\frac{1}{4} + \frac{1}{8}\log_2\frac{1}{8} + \frac{1}{8}\log_2\frac{1}{8}\right) = 1.75$ bits. Let's see if we can find a codebook for this four-letter alphabet that satisfies the Source Coding Theorem. The simplest code to try is known as the simple binary code: convert the symbol's index into a binary number and use the same number of bits for each symbol by including leading zeros where necessary.

$$a_0 \to 00, \quad a_1 \to 01, \quad a_2 \to 10, \quad a_3 \to 11$$
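As a sketch of how such a codebook can be generated programmatically (a hypothetical illustration; the variable names are mine, not the text's), write each symbol's index in binary and pad it to $\log_2 K$ bits:

from math import ceil, log2

symbols = ["a0", "a1", "a2", "a3"]
K = len(symbols)
width = ceil(log2(K))   # number of bits per symbol; log2(4) = 2 here

# Simple binary code: the symbol's index in binary, with leading zeros.
simple_code = {s: format(k, f"0{width}b") for k, s in enumerate(symbols)}
print(simple_code)      # {'a0': '00', 'a1': '01', 'a2': '10', 'a3': '11'}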
Whenever the number of symbols in the alphabet is a power of two (as in this case), the average number of bits $\bar{B}(A)$ equals $\log_2 K$, which equals 2 in this case. Because the entropy equals 1.75 bits, the simple binary code indeed satisfies the Source Coding Theorem (we are within one bit of the entropy limit), but you might wonder if you can do better. If we choose a codebook with differing numbers of bits for the symbols, a smaller average number of bits can indeed be obtained. The idea is to use shorter bit sequences for the symbols that occur more often. One codebook like this is
$$a_0 \to 0, \quad a_1 \to 10, \quad a_2 \to 110, \quad a_3 \to 111$$
Now $\bar{B}(A) = 1 \cdot \frac{1}{2} + 2 \cdot \frac{1}{4} + 3 \cdot \frac{1}{8} + 3 \cdot \frac{1}{8} = 1.75$. We can reach the entropy limit! The simple binary code is, in this case, less efficient than the unequal-length code. Using the efficient code, we can transmit the symbolic-valued signal having this alphabet 12.5% faster, since $(2 - 1.75)/2 = 12.5\%$. Furthermore, we know that no more efficient codebook can be found because of Shannon's Source Coding Theorem.
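As a quick numerical check (again a hypothetical Python sketch rather than anything from the original text), the following computes the unequal-length code's average length and decodes a bit stream, which works unambiguously because the code is prefix-free: no codeword is the beginning of another.

probs = {"a0": 1/2, "a1": 1/4, "a2": 1/8, "a3": 1/8}
code  = {"a0": "0", "a1": "10", "a2": "110", "a3": "111"}

# Average number of bits: 1*(1/2) + 2*(1/4) + 3*(1/8) + 3*(1/8) = 1.75
avg = sum(len(code[s]) * p for s, p in probs.items())
print("B(A) =", avg, "bits")

def decode(bits, code):
    # Greedy decoding: emit a symbol as soon as the accumulated bits
    # match a codeword. Valid because the code is prefix-free.
    inverse = {v: k for k, v in code.items()}
    out, word = [], ""
    for b in bits:
        word += b
        if word in inverse:
            out.append(inverse[word])
            word = ""
    return out

# "010110111" is a0, a1, a2, a3 concatenated: 0 | 10 | 110 | 111
print(decode("010110111", code))   # ['a0', 'a1', 'a2', 'a3']

The greedy decoder can emit a symbol the moment the accumulated bits match a codeword, which is exactly the property the unequal-length codebook above was designed to have.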

Source:  OpenStax, Fundamentals of electrical engineering i. OpenStax CNX. Aug 06, 2008 Download for free at http://legacy.cnx.org/content/col10040/1.9