Information theory studies the transmission,
processing, utilization, and extraction of information. Abstractly, information can
be thought of as the resolution of uncertainty. In
the case of communicating information over a noisy channel, this abstract concept was made concrete in 1948 by Claude
Shannon in A
Mathematical Theory of Communication, in which
“information” is thought of as a set of possible messages, where the goal is to
send these messages over a noisy channel, and then to have the receiver
reconstruct the message with a low probability of error, in spite of the channel
noise.
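
The idea of information as the resolution of uncertainty is usually made quantitative through Shannon entropy, the average number of bits needed to describe the outcome of a random source. A minimal sketch in Python (the `entropy` helper is illustrative, not part of Shannon's original paper):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: -sum(p * log2(p)) over the distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes: exactly 1 bit.
print(entropy([0.5, 0.5]))   # → 1.0

# A biased coin is more predictable, so each flip resolves less uncertainty.
print(entropy([0.9, 0.1]))   # less than 1 bit

# A certain outcome resolves no uncertainty at all.
print(entropy([1.0]))        # → 0.0
```

Intuitively, the less predictable a message source is, the more information each message carries, which is why entropy also bounds how far data from that source can be compressed.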
Information theory is closely associated with a collection of pure and
applied disciplines that have been investigated and reduced to engineering
practice under a variety of rubrics throughout the world over the past half-century or more: adaptive systems, anticipatory systems, artificial intelligence,
complex systems, complexity science, cybernetics, informatics, machine learning, along with systems sciences of many descriptions. Information theory is a broad and deep
mathematical theory, with equally broad and deep applications, amongst which is
the vital field of coding theory. Information theory is also used
in information
retrieval, intelligence gathering, gambling, statistics, and even in musical composition.