Shannon & Weaver's Information Theory

M. Truex, Spring 2002

 

            According to Littlejohn, “Information theory...grew out of the boom in the telecommunications industry after World War II” (41).  As an engineer at Bell Telephone Laboratories, Claude Shannon was fascinated with the subject, and his fascination culminated in a groundbreaking paper, “A Mathematical Theory of Communication,” first published in the Bell System Technical Journal in 1948.  Although complex and difficult to understand (and now more than fifty years old), Information Theory remains critically important to the field of Communication, not only in its own right but also because it laid the foundation for a vast array of modern theories.

 

            At its most basic level, “Information theory...is the area of study most concerned with communication in systems.  Information theory [sic] involves the quantitative study of signals” (Littlejohn 41).  The following is a schematic diagram of a general communication system:

[Figure: Shannon’s schematic of a general communication system: an information source feeds a transmitter, whose signal crosses a channel acted upon by a noise source before reaching the receiver and, finally, the destination.]
Thus, the information source generates a message which is transmitted through a channel to a receiver (the message’s ultimate destination).  Along the channel, noise may interfere with the transmission of the original message, causing the receiver to decode the message in a way that differs from the sender’s original intent.  This model continues to be standard in core communication courses, although the concept of feedback has since been added by the theorist Norbert Wiener.  Feedback refers to messages sent from the receiver back to the sender regarding the receiver’s interpretation of the original message, which allows the sender to clear up any misinterpretation.  The concept of feedback is actually foreign to Shannon’s original intent behind Information Theory.  According to Darnell:

One common misunderstanding about information theory is related to the fact that it is not concerned with information at all--in the ordinary sense.  Information theory is directly concerned with only the technical problem of getting symbols or signals from one place to another without distortion.  It does not relate directly to the interpretation of those symbols (159).

Even so, the concept of feedback is a useful addition to our modern understanding of human communication; it captures our need to double-check the meaning behind messages and thereby helps eliminate misunderstanding among people.
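
The flow of this model can be made concrete with a short sketch.  The following Python fragment is purely illustrative (its names and the “?” corruption marker are my own inventions, not anything from Shannon or Wiener); noise randomly corrupts characters in the channel, and feedback prompts the sender to retransmit until the receiver gets a clean copy:

    import random

    def transmit(message: str, noise_rate: float = 0.1) -> str:
        """Send a message through a noisy channel; each character may be corrupted."""
        return ''.join('?' if random.random() < noise_rate else ch
                       for ch in message)

    def communicate(message: str) -> str:
        received = transmit(message)
        # Feedback (Wiener's addition): the receiver reports corruption,
        # and the sender retransmits until the message arrives intact.
        while '?' in received:
            received = transmit(message)
        return received

    print(communicate("information theory"))

Real systems signal corruption with error-detecting codes rather than a literal “?”, but the loop captures the role Wiener’s feedback plays in clearing up misinterpretation.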

 

A key element guiding Shannon’s Information Theory is the concept of freedom of choice.  The message the sender decides to send has been chosen to the rejection of all other possible messages; this idea is profound when one ponders its role in human communication.  Shannon used logarithms to express this concept of freedom of choice in mathematical terms.  In essence, Shannon postulated that:

 

If there are two things a source can do, and they are equally probable, that source will transmit each time and on the average one bit [sic] of information per transmission.  Bit [sic], incidentally, is short for binary digit [sic] (Darnell 159).

 

Shannon then tied this freedom of choice to the uncertainty of the receiver:

[Shannon] reasoned that the uncertainty of the receiver-destination is equal to the freedom-of-choice of the source; that the receipt of the message in an ideal system informs the destination of the source’s choice and eliminates the destination’s uncertainty; and, therefore, the information contained in the message is equal to the freedom-of-choice of the source and the uncertainty of the destination (Darnell 160).

Such information uncertainty is known as entropy in information theory.  The term entropy, borrowed from the field of thermodynamics, involves the analysis of the “various probabilities involved [in the transmission of a message]--those of getting to certain stages in the process of forming messages, and the probabilities that, when in those stages, certain symbols be chosen next” (Weaver <http://darkwing.uoregon.edu/~felsing/virtual_asia/info.html>).
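
Shannon’s measure of this uncertainty is standard: for a source choosing among symbols with probabilities p1, ..., pn, the entropy is H = -(p1 log2 p1 + ... + pn log2 pn), expressed in bits.  The short Python sketch below (my illustration, not Shannon’s own notation) confirms Darnell’s example above: two equally probable choices carry exactly one bit, while a source with no freedom of choice carries none:

    from math import log2

    def entropy(probabilities):
        """Shannon entropy in bits: H = -sum(p * log2(p))."""
        return sum(-p * log2(p) for p in probabilities if p > 0)

    print(entropy([0.5, 0.5]))   # 1.0 bit: two equally likely choices
    print(entropy([1.0]))        # 0.0 bits: no freedom of choice, no information
    print(entropy([0.25] * 4))   # 2.0 bits: four equally likely choices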

 

As the message moves across the channel between sender and receiver, it is subject to several forms of interference.  One of these, mentioned previously, is noise.  Littlejohn defines noise as “any disturbance in the channel that distorts or otherwise masks the signal” (43).  Regardless of how the message is coded, “the problem of transmission is the same: to reconstruct the message accurately at the destination” (Littlejohn 43).  Two other limiting factors which may influence accurate message transmission are channel capacity and throughput:

 

Channel capacity [sic] is usually defined in terms of the maximum amount of information that can be transmitted over a channel in a given time period.  The actual amount of information in the channel is throughput [sic].  If throughput exceeds channel capacity, distortion will occur or transmission will slow down (43).

 

In the end:

Efficient transmission [of a message] involves coding at a maximum rate that will not exceed channel capacity.  It also means using a code with sufficient redundancy to compensate for the amount of noise present in the channel.  Too much redundancy means transmission will be inefficient; too little means it will be inaccurate (Littlejohn 43).
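
One concrete (and deliberately crude) way to picture this trade-off is a repetition code; the example is my own, not drawn from Littlejohn or Shannon.  Each bit is sent several times and the receiver takes a majority vote: more copies compensate for more noise, but every extra copy consumes channel capacity and slows transmission:

    import random

    def send_bit(bit: int, noise_rate: float) -> int:
        """A binary channel that flips the bit with probability noise_rate."""
        return bit ^ 1 if random.random() < noise_rate else bit

    def send_with_redundancy(bit: int, copies: int, noise_rate: float) -> int:
        received = [send_bit(bit, noise_rate) for _ in range(copies)]
        return 1 if sum(received) > copies / 2 else 0   # majority vote

    # With 20% noise, a single copy arrives wrong 20% of the time; five copies,
    # decoded by majority vote, fail only about 6% of the time -- at the price
    # of using the channel five times per bit.
    trials = 10_000
    errors = sum(send_with_redundancy(1, 5, 0.2) != 1 for _ in range(trials))
    print(f"error rate with 5x redundancy: {errors / trials:.2%}")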

           

Now that I have explained Information Theory, I will move on to discuss the theorist Claude Shannon’s place in its history.

 

            According to Donald K. Darnell:

Information theory began in several places at different times.  Samuel F.B. Morse, for instance, met and solved some of the problems of signal transmission.  He recognized that there is more than one way to transmit a message and that some ways are more efficient than others...Mathematicians from around the world were involved at one time and another with the problems of telegraphic communication and contributed to the development of information theory.  But the credits for full formalization usually are assigned to Norbert Wiener and Claude E. Shannon.  Both men were stimulated by technical problems and requirements for communication that arose during World War II, which accounts for their simultaneous publications.  Although Wiener and Shannon solved essentially the same problems (from slightly different points of view), it is most commonly Shannon’s work that is referred to by the term information theory [sic].

Now that I have anchored the theorist Claude Shannon in place and time, I will explain why it is significant to study Information Theory.  States Weaver:

This is a theory so general that one does not need to say what kinds of symbols are being considered...The theory is deep enough so that the relationships it reveals indiscriminately apply to all these and to other forms of communication.  This means, of course, that the theory is sufficiently imaginatively motivated so that it is dealing with the real inner core of the communication problem--with those basic relationships which hold in general, no matter what special form the actual case may take  (<http://darkwing.uoregon.edu/~felsing/virtual_asia/info.html>).

Adds Darnell:

Information theory, in spite of its mechanistic view and mathematical precision, contains some highly creative ideas and provides a foundation from which a creative person can leap to new insights about human communication...The technical concept [of freedom of choice]--if we can avoid getting burned by the short circuit between two kinds of information--can be very stimulating in the analysis of human behavior (158, 160).

 Thus Darnell, like Weaver, emphasizes that Information Theory has applications to concepts far beyond its original scope.

 

Now that I have explained the workings of Information Theory, anchored its primary creator in place and time, and discussed the significance of studying the concept, I will move on to conduct a critical evaluation of the theory.  Adequately evaluating any communication theory involves determining the answers to the following questions:

1)  Evaluated from the perspective of the educated person (and what he or she knows about life in the “real world”), is the theory reasonable?

 

2)  Is the theory adaptable to the changes in society that are certain to take place in the centuries following its creation?

           

3)  Is the theory general enough to apply to more than one narrow area of human existence?

After having researched and written on Information Theory, I have come to the conclusion that the answer to each of the above three questions is yes; therefore, Information Theory continues to be a valuable tool for addressing problems and meeting practical needs in the real world.  Shannon’s article itself is admittedly difficult: it is not easy for even an educated person to make sense of excerpts such as the following, taken from “A Mathematical Theory of Communication”:

[mathematical excerpt from Shannon’s paper not reproduced here]

(<http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html>)

However, other theorists (Weaver, Littlejohn, Darnell, and others) have explained the theory in language which can be readily comprehended by the educated individual.  Information Theory provides a believable and reasonable analysis of how messages are transmitted from sender to receiver across a channel.  It is general enough to be adapted to the discovery and acknowledgment of new information, such as the importance of feedback in the communication of messages.  As such, Information Theory is applicable to areas far removed from the field of telecommunications, and this versatility adds to its value.  For example, in his article “Information Theory: An Approach to Human Communication,” Donald Darnell analyzes the American jury system from the standpoint of Information Theory.  Darnell’s statement that this application of information theory “is quite beyond the scope of the original theory” (164) only adds to our understanding of how versatile the theory truly is.  The following is Darnell’s finding in his own words:

[The jury] system, which was presumably designed without the benefits of information theory, allows for a large margin of error.  That is, the typical decision between guilty and not-guilty would require a one-bit absorption capacity for satisfactory decisions to be made.  A twelve-man jury would, theoretically, be capable of transmitting twelve bits of information with a unanimous decision--a considerable safety factor.  Given that advocates for the defense and prosecution are allowed to challenge and eliminate jurors suspected of bias, and that jurors are typically not called upon to make a succession of related judgments, the system should be entirely adequate.  However, in the American jury-trial system, jurors do not make their judgments independently, but are permitted to arrive at their collective decision through group deliberation.  Given, for instance, the effects of interpersonal influence investigated by Asch, there is reason to believe that some juries are less capable of rendering a satisfactory decision than a single, qualified judge...If jurors were required to render their decisions independently, a smaller number, or a less-than-unanimous decision would (according to this view) be more satisfactory in every case than the present system.  The principle that independent data units provide more information than dependent data units is clearly recognized in statistics.  It is apparently not recognized in the American legal system, or is considered of less significance than other distorting influences.
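
Darnell’s arithmetic can be checked directly.  The sketch below is my paraphrase of his point rather than his own calculation: twelve independent yes-or-no votes can distinguish 2^12 outcomes, and hence carry up to twelve bits, whereas twelve jurors who always vote together carry only a single bit:

    from math import log2

    independent_outcomes = 2 ** 12      # each of 12 jurors decides independently
    print(log2(independent_outcomes))   # 12.0 bits -- Darnell's "safety factor"

    dependent_outcomes = 2              # deliberation makes all 12 votes agree
    print(log2(dependent_outcomes))     # 1.0 bit -- dependence destroys information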

The previous discussion has provided an overview of all that is involved in Information Theory--its basic concepts, an anchoring of its chief theorist, Claude Shannon, in time and place, an explanation of why it is a significant theory to study, the criteria with which to analyze the theory, the analysis itself, and a discussion of ways in which the theory can be helpful in addressing problems or meeting practical needs in the “real world.”  The implications of the use of Information Theory are impressive, and it has already stood the test of time.  I am confident that it will remain one of the most important theories in the study of Communication for years to come.

 

Works Cited

 

“Brief Excerpts from Warren Weaver’s Introduction to Claude Shannon’s The Mathematical Theory of Communication.”  3 Mar. 2002.  <http://darkwing.uoregon.edu/~felsing/virtual_asia/info.html>.

 

Darnell, Donald K.  “Information Theory: An Approach to Human Communication.”  Approaches to Human Communication.  Ed. Richard W. Budd and Brent D. Ruben.  New York:  Spartan, 1972.

 

Littlejohn, Stephen W.  Theories of Human Communication.  7th ed.  Belmont:  Wadsworth/Thomson Learning, 2002.

 

Shannon, C.E.  “A Mathematical Theory of Communication.”  13 Mar. 2002. 
<http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html>.