Information theory is the branch of mathematics concerned with the analysis of communication. Like games, communication is an ancient characteristic of the human race; like game theory, information theory was not developed until the 20th century. The first work in the field was that of Claude Elwood Shannon (1916– ). There are three major aspects of communication: the actual passing of information from one device to another (via a telephone line, for example, or along the nerves between the ear and the brain); the semantic content of that information (understanding the meaning of what is said); and the effective content of that information (the emotional content of what is said—the use of propagandist words or the tone of voice, for instance). These three areas all affect one another, and there is considerable overlap between them.
The efficiency with which information is passed obviously affects any assessment of its meaning in the other two parts of communication. Shannon principally investigated the first, engineering aspect, because it was the most susceptible to mathematical analysis. Shannon's work revealed a great deal about the way that we communicate and also about what it is possible to communicate (or what it is possible to know about A given knowledge about B and the connections between A and B). For example, about half of the content of spoken English is redundant, which is why it is easy to correct spelling errors caused by inefficient transmission.
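These claims can be made precise with Shannon's standard definitions; the notation below is illustrative rather than the article's own. For a source that emits symbols with probabilities $p_1, \dots, p_n$, the entropy, measured in bits per symbol, is
\[
H = -\sum_{i=1}^{n} p_i \log_2 p_i .
\]
The redundancy of the source compares this with the maximum possible value $H_{\max} = \log_2 n$, attained when all symbols are equally likely:
\[
R = 1 - \frac{H}{H_{\max}} ,
\]
and Shannon's early estimate of $R \approx 0.5$ for written English is the 'about half' figure cited above. What it is possible to know about A given knowledge about B is measured by the mutual information
\[
I(A;B) = H(A) - H(A \mid B) ,
\]
the reduction in uncertainty about A produced by observing B.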
Information theory has had a massive effect on the way that electronic communications have been developed during the 20th century, particularly through coding theory. It also has applications, often unrecognized, in other fields such as economics. SMcL