
At first glance, this appears paradoxical: source coding is used to remove redundancy, while channel coding is used to add redundancy. But it is not really self-defeating or contradictory, because the redundancy that is removed by source coding does not have a structure or pattern that a computer algorithm at the receiver can exploit to detect or correct errors. The redundancy that is added in channel coding is highly structured, and can be exploited by computer programs implementing the appropriate decoding routines. Thus [link] begins with a message, and uses a source code to remove the redundancy. This is then coded again by the channel encoder to add structured redundancy, and the resulting signal provides the input to the transmitter of the previous chapters. One of the triumphs of modern digital communications systems is that, by clever choice of source and channel codes, it is possible to get close to the Shannon limits and to utilize all the capacity of a channel.
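
As a concrete (if toy) illustration of structured redundancy, consider a rate-1/3 repetition code. This is a minimal sketch for illustration only, not one of the codes developed later in the book: the encoder repeats each bit three times, and the decoder exploits that known structure by taking a majority vote, which corrects any single bit error within a group of three.

    def channel_encode(bits):
        """Add structured redundancy: transmit each bit three times."""
        return [b for b in bits for _ in range(3)]

    def channel_decode(coded):
        """Exploit the known structure: majority vote over each triple."""
        return [int(sum(coded[i:i + 3]) >= 2) for i in range(0, len(coded), 3)]

    message = [1, 0, 1, 1]
    sent = channel_encode(message)   # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
    sent[4] ^= 1                     # the channel flips one bit
    print(channel_decode(sent) == message)  # True: the error is corrected

The redundancy removed by a source coder has no such pattern, so nothing analogous to the majority vote is available at the receiver.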

What is information?

Like many common English words, information has many meanings. The American Heritage Dictionary catalogs six:

  1. Knowledge derived from study, experience, or instruction.
  2. Knowledge of a specific event or situation; intelligence.
  3. A collection of facts or data.
  4. The act of informing or the condition of being informed; communication of knowledge.
  5. Computer Science. A nonaccidental signal or character used as an input to a computer or communication system.
  6. A numerical measure of the uncertainty of an experimental outcome.

It would clearly be impossible to capture all of these senses in a technical definition that would be useful in transmission systems. The final definition is closest to our needs, though it does not specify exactly how the numerical measure should be calculated. Shannon's definition does. His insight was that there is a simple relationship between the amount of information conveyed in a message and the probability of the message being sent. This does not apply directly to “messages” such as sentences, images, or .wav files, but to the symbols of the alphabet that are transmitted.

For instance, suppose that a fair coin has heads H on one side and tails T on the other. The two outcomes are equally uncertain, and receiving either H or T removes the same amount of uncertainty (conveys the same amount of information). But suppose the coin is biased. The extreme case occurs when the probability of H is 1. Then, when H is received, no information is conveyed, because H is the only possible choice! Now suppose that the probability of sending H is 0.9 while the probability of sending T is 0.1. Then, if H is received, it removes a little uncertainty, but not much: H is expected, since it usually occurs. But if T is received, it is somewhat unusual, and hence conveys a lot of information. In general, events that occur with high probability give little information, while events of low probability give considerable information.
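
Shannon's measure makes this precise: an outcome that occurs with probability p conveys log2(1/p) bits of information. A quick computation (a Python sketch, with the helper name info_bits chosen here for illustration) checks this against the coin examples above:

    from math import log2

    def info_bits(p):
        """Information (in bits) conveyed by an outcome of probability p."""
        return log2(1.0 / p)

    print(info_bits(0.5))  # fair coin, H or T: 1.0 bit
    print(info_bits(1.0))  # certain H: 0.0 bits, no uncertainty removed
    print(info_bits(0.9))  # expected H: ~0.15 bits, little information
    print(info_bits(0.1))  # unusual T: ~3.32 bits, considerable information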

To make this relationship between the probability of events and information more plain, imagine a game in which you must guess a word chosen at random from the dictionary. You are given the starting letter as a hint. If the hint is that the first letter is “t,” then this does not narrow down the possibilities very much, since so many words start with “t.” But if the hint is that the first letter is “x,” then there are far fewer choices. The likely letter (the highly probable “t”) conveys little information, while the unlikely letter (the improbable “x”) conveys a lot more information by narrowing down the choices.

Source:  OpenStax, Software receiver design. OpenStax CNX. Aug 13, 2013 Download for free at http://cnx.org/content/col11510/1.3
