Information theory is the mathematical study of the quantification, storage, and communication of information. The field was established and formalized by Claude Shannon in the 1940s, though early contributions were made in the 1920s through the works of Harry Nyquist and Ralph Hartley. It is at the intersection of electronic engineering, mathematics, statistics, computer science, neurobiology, physics, and electrical engineering.
A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (which has two equally likely outcomes) provides less information (lower entropy, less uncertainty) than identifying the outcome from a roll of a die (which has six equally likely outcomes). Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy. Important sub-fields of information theory include source coding, algorithmic complexity theory, algorithmic information theory and information-theoretic security.
Applications of fundamental topics of information theory include source coding/data compression (e.g. for ZIP files), and channel coding/error detection and correction (e.g. for DSL). Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones and the development of the Internet and artificial intelligence. The theory has also found applications in other areas, including statistical inference, cryptography, neurobiology, perception, signal processing, linguistics, the evolution and function of molecular codes (bioinformatics), thermal physics, molecular dynamics, black holes, quantum computing, information retrieval, intelligence gathering, plagiarism detection, pattern recognition, anomaly detection, the analysis of music, art creation, imaging system design, study of outer space, the dimensionality of space, and epistemology.
Overview
Information theory studies the transmission, processing, extraction, and utilization of information. Abstractly, information can be thought of as the resolution of uncertainty. In the case of communication of information over a noisy channel, this abstract concept was formalized in 1948 by Claude Shannon in a paper entitled A Mathematical Theory of Communication, in which information is thought of as a set of possible messages, and the goal is to send these messages over a noisy channel, and to have the receiver reconstruct the message with low probability of error, in spite of the channel noise. Shannon's main result, the noisy-channel coding theorem, showed that, in the limit of many channel uses, the rate of information that is asymptotically achievable is equal to the channel capacity, a quantity dependent merely on the statistics of the channel over which the messages are sent.
Coding theory is concerned with finding explicit methods, called codes, for increasing the efficiency and reducing the error rate of data communication over noisy channels to near the channel capacity. These codes can be roughly subdivided into data compression (source coding) and error-correction (channel coding) techniques. In the latter case, it took many years to find the methods Shannon's work proved were possible.[citation needed]
A third class of information theory codes are cryptographic algorithms (both codes and ciphers). Concepts, methods and results from coding theory and information theory are widely used in cryptography and cryptanalysis, such as the unit ban.[citation needed]
Historical background
The landmark event establishing the discipline of information theory and bringing it to immediate worldwide attention was the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October 1948. Historian James Gleick rated the paper as the most important development of 1948, noting that it was "even more profound and more fundamental" than the transistor. Shannon came to be known as the "father of information theory". He outlined some of his initial ideas of information theory as early as 1939 in a letter to Vannevar Bush.
Prior to this paper, limited information-theoretic ideas had been developed at Bell Labs, all implicitly assuming events of equal probability. Harry Nyquist's 1924 paper, Certain Factors Affecting Telegraph Speed, contains a theoretical section quantifying "intelligence" and the "line speed" at which it can be transmitted by a communication system, giving the relation W = K log m (recalling the Boltzmann constant), where W is the speed of transmission of intelligence, m is the number of different voltage levels to choose from at each time step, and K is a constant. Ralph Hartley's 1928 paper, Transmission of Information, uses the word information as a measurable quantity, reflecting the receiver's ability to distinguish one sequence of symbols from any other, thus quantifying information as H = log S^n = n log S, where S was the number of possible symbols, and n the number of symbols in a transmission. The unit of information was therefore the decimal digit, which has since sometimes been called the hartley in his honor as a unit, scale, or measure of information. Alan Turing in 1940 used similar ideas as part of the statistical analysis of the breaking of the German Second World War Enigma ciphers.[citation needed]
Much of the mathematics behind information theory with events of different probabilities was developed for the field of thermodynamics by Ludwig Boltzmann and J. Willard Gibbs. Connections between information-theoretic entropy and thermodynamic entropy, including the important contributions by Rolf Landauer in the 1960s, are explored in Entropy in thermodynamics and information theory.[citation needed]
In Shannon's revolutionary and groundbreaking paper, the work for which had been substantially completed at Bell Labs by the end of 1944, Shannon for the first time introduced the qualitative and quantitative model of communication as a statistical process underlying information theory, opening with the assertion:
- "The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point."
With it came the ideas of:
- the information entropy and redundancy of a source, and its relevance through the source coding theorem;
- the mutual information, and the channel capacity of a noisy channel, including the promise of perfect loss-free communication given by the noisy-channel coding theorem;
- the practical result of the Shannon–Hartley law for the channel capacity of a Gaussian channel; as well as
- the bit—a new way of seeing the most fundamental unit of information.[citation needed]
Quantities of information
Information theory is based on probability theory and statistics, where quantified information is usually described in terms of bits. Information theory often concerns itself with measures of information of the distributions associated with random variables. One of the most important measures is called entropy, which forms the building block of many other measures. Entropy quantifies the amount of information in a single random variable. Another useful concept is mutual information, defined on two random variables, which describes the amount of information the two variables share and can be used to describe their correlation. The former quantity is a property of the probability distribution of a random variable and gives a limit on the rate at which data generated by independent samples with the given distribution can be reliably compressed. The latter is a property of the joint distribution of two random variables, and is the maximum rate of reliable communication across a noisy channel in the limit of long block lengths, when the channel statistics are determined by the joint distribution.
The choice of logarithmic base in the following formulae determines the unit of information entropy that is used. A common unit of information is the bit or shannon, based on the binary logarithm. Other units include the nat, which is based on the natural logarithm, and the decimal digit, which is based on the common logarithm.
In what follows, an expression of the form p log p is considered by convention to be equal to zero whenever p = 0. This is justified because \lim_{p \to 0^{+}} p \log p = 0 for any logarithmic base.
Entropy of an information source
Based on the probability mass function of each source symbol to be communicated, the Shannon entropy H, in units of bits (per symbol), is given by
H = -\sum_{i} p_i \log_2 p_i
where pi is the probability of occurrence of the i-th possible value of the source symbol. This equation gives the entropy in the units of "bits" (per symbol) because it uses a logarithm of base 2, and this base-2 measure of entropy has sometimes been called the shannon in his honor. Entropy is also commonly computed using the natural logarithm (base e, where e is Euler's number), which produces a measurement of entropy in nats per symbol and sometimes simplifies the analysis by avoiding the need to include extra constants in the formulas. Other bases are also possible, but less commonly used. For example, a logarithm of base 2^8 = 256 will produce a measurement in bytes per symbol, and a logarithm of base 10 will produce a measurement in decimal digits (or hartleys) per symbol.
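To make this concrete, the following minimal Python sketch (the helper name entropy is ad hoc, not from any library) computes the entropy of a source distribution in bits, nats, and hartleys simply by switching the logarithm base:

```python
import math

def entropy(probs, base=2):
    """Shannon entropy of a probability mass function, using the
    convention 0 * log(0) = 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin has 1 bit of entropy per symbol; a fair die has log2(6) ≈ 2.585 bits.
print(entropy([0.5, 0.5]))               # 1.0 bit
print(entropy([1/6] * 6))                # ≈ 2.585 bits (shannons)
print(entropy([1/6] * 6, base=math.e))   # ≈ 1.792 nats
print(entropy([1/6] * 6, base=10))       # ≈ 0.778 hartleys (decimal digits)
```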
Intuitively, the entropy H(X) of a discrete random variable X is a measure of the amount of uncertainty associated with the value of X when only its distribution is known.
The entropy of a source that emits a sequence of N symbols that are independent and identically distributed (iid) is N ⋅ H bits (per message of N symbols). If the source data symbols are identically distributed but not independent, the entropy of a message of length N will be less than N ⋅ H.
If one transmits 1000 bits (0s and 1s), and the value of each of these bits is known to the receiver (has a specific value with certainty) ahead of transmission, it is clear that no information is transmitted. If, however, each bit is independently equally likely to be 0 or 1, 1000 shannons of information (more often called bits) have been transmitted. Between these two extremes, information can be quantified as follows. If 𝕏 is the set of all messages {x1, ..., xn} that X could be, and p(x) is the probability of some x ∈ 𝕏, then the entropy, H, of X is defined:
H(X) = \mathbb{E}_X[I(x)] = -\sum_{x \in \mathbb{X}} p(x) \log p(x)
(Here, I(x) is the self-information, which is the entropy contribution of an individual message, and \mathbb{E}_X is the expected value.) A property of entropy is that it is maximized when all the messages in the message space are equiprobable p(x) = 1/n; i.e., most unpredictable, in which case H(X) = log n.
The special case of information entropy for a random variable with two outcomes is the binary entropy function, usually taken to the logarithmic base 2, thus having the shannon (Sh) as unit:
H_{\mathrm{b}}(p) = -p \log_2 p - (1 - p) \log_2 (1 - p)
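As a quick illustration, a self-contained sketch of the binary entropy function (an assumed helper, not library code) shows that it reaches its maximum of 1 shannon at p = 0.5:

```python
import math

def binary_entropy(p):
    """H_b(p) = -p*log2(p) - (1-p)*log2(1-p), with H_b(0) = H_b(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.01, 0.1, 0.5, 0.9, 0.99):
    print(f"H_b({p}) = {binary_entropy(p):.4f} Sh")
# The function peaks at p = 0.5, where it equals exactly 1 shannon (bit).
```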
Joint entropy
The joint entropy of two discrete random variables X and Y is merely the entropy of their pairing: (X, Y). This implies that if X and Y are independent, then their joint entropy is the sum of their individual entropies.
For example, if (X, Y) represents the position of a chess piece—X the row and Y the column, then the joint entropy of the row of the piece and the column of the piece will be the entropy of the position of the piece.
H(X, Y) = \mathbb{E}_{X,Y}[-\log p(x, y)] = -\sum_{x, y} p(x, y) \log p(x, y)
Despite similar notation, joint entropy should not be confused with cross-entropy.
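For illustration, the short sketch below (with a made-up pair of independent coin distributions) checks numerically that the joint entropy of independent variables is the sum of their individual entropies:

```python
import math
from itertools import product

def H(probs):
    """Shannon entropy in bits, with the convention 0 * log(0) = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Independent X (fair coin) and Y (biased coin): p(x, y) = p(x) * p(y).
px = {0: 0.5, 1: 0.5}
py = {0: 0.9, 1: 0.1}
pxy = {(x, y): px[x] * py[y] for x, y in product(px, py)}

print(H(pxy.values()))                   # joint entropy H(X, Y)
print(H(px.values()) + H(py.values()))   # H(X) + H(Y): equal for independent X, Y
```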
Conditional entropy (equivocation)
The conditional entropy or conditional uncertainty of X given random variable Y (also called the equivocation of X about Y) is the average conditional entropy over Y:
H(X|Y) = \mathbb{E}_Y[H(X|y)] = -\sum_{y \in Y} p(y) \sum_{x \in X} p(x|y) \log p(x|y) = -\sum_{x, y} p(x, y) \log p(x|y)
Because entropy can be conditioned on a random variable or on that random variable being a certain value, care should be taken not to confuse these two definitions of conditional entropy, the former of which is in more common use. A basic property of this form of conditional entropy is that:
H(X|Y) = H(X, Y) - H(Y)
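This identity is easy to verify numerically; the joint distribution in the sketch below is an arbitrary example chosen for illustration:

```python
import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# An arbitrary joint pmf p(x, y) over X in {0, 1}, Y in {0, 1}.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
py = {y: sum(p for (x2, y2), p in pxy.items() if y2 == y) for y in (0, 1)}

# Conditional entropy computed directly: -sum p(x, y) log p(x | y)
H_X_given_Y = -sum(p * math.log2(p / py[y]) for (x, y), p in pxy.items())
print(H_X_given_Y)
print(H(pxy.values()) - H(py.values()))  # H(X, Y) - H(Y): same value
```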
Mutual information (transinformation)
Mutual information measures the amount of information that can be obtained about one random variable by observing another. It is important in communication where it can be used to maximize the amount of information shared between sent and received signals. The mutual information of X relative to Y is given by:
I(X; Y) = \mathbb{E}_{X,Y}[SI(x, y)] = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)}
where SI (Specific mutual Information) is the pointwise mutual information.
A basic property of the mutual information is that
I(X; Y) = H(X) - H(X|Y)
That is, knowing Y, we can save an average of I(X; Y) bits in encoding X compared to not knowing Y.
Mutual information is symmetric:
I(X; Y) = I(Y; X) = H(X) + H(Y) - H(X, Y)
Mutual information can be expressed as the average Kullback–Leibler divergence (information gain) between the posterior probability distribution of X given the value of Y and the prior distribution on X:
I(X; Y) = \mathbb{E}_{p(y)}[D_{\mathrm{KL}}(p(X|Y=y) \| p(X))]
In other words, this is a measure of how much, on the average, the probability distribution on X will change if we are given the value of Y. This is often recalculated as the divergence from the product of the marginal distributions to the actual joint distribution:
I(X; Y) = D_{\mathrm{KL}}(p(X, Y) \| p(X)\, p(Y))
Mutual information is closely related to the log-likelihood ratio test in the context of contingency tables and the multinomial distribution and to Pearson's χ2 test: mutual information can be considered a statistic for assessing independence between a pair of variables, and has a well-specified asymptotic distribution.
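As a numerical sanity check of these identities, the sketch below (reusing an arbitrary example joint distribution) computes the mutual information both from its defining sum and from the entropy identity I(X; Y) = H(X) + H(Y) − H(X, Y):

```python
import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# An arbitrary joint pmf and its marginals.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
px = {x: pxy[(x, 0)] + pxy[(x, 1)] for x in (0, 1)}
py = {y: pxy[(0, y)] + pxy[(1, y)] for y in (0, 1)}

# Defining sum: sum over p(x, y) log [p(x, y) / (p(x) p(y))]
I1 = sum(p * math.log2(p / (px[x] * py[y])) for (x, y), p in pxy.items())
# Via entropies: H(X) + H(Y) - H(X, Y)
I2 = H(px.values()) + H(py.values()) - H(pxy.values())
print(I1, I2)   # both give the same mutual information in bits
```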
Kullback–Leibler divergence (information gain)
The Kullback–Leibler divergence (or information divergence, information gain, or relative entropy) is a way of comparing two distributions: a "true" probability distribution p(X), and an arbitrary probability distribution q(X). If we compress data in a manner that assumes q(X) is the distribution underlying some data, when, in reality, p(X) is the correct distribution, the Kullback–Leibler divergence is the number of average additional bits per datum necessary for compression. It is thus defined
D_{\mathrm{KL}}(p(X) \| q(X)) = \sum_{x \in X} -p(x) \log q(x) - \sum_{x \in X} -p(x) \log p(x) = \sum_{x \in X} p(x) \log \frac{p(x)}{q(x)}
Although it is sometimes used as a 'distance metric', KL divergence is not a true metric since it is not symmetric and does not satisfy the triangle inequality (making it a semi-quasimetric).
Another interpretation of the KL divergence is the "unnecessary surprise" introduced by a prior from the truth: suppose a number X is about to be drawn randomly from a discrete set with probability distribution p(x). If Alice knows the true distribution p(x), while Bob believes (has a prior) that the distribution is q(x), then Bob will be more surprised than Alice, on average, upon seeing the value of X. The KL divergence is the (objective) expected value of Bob's (subjective) surprisal minus Alice's surprisal, measured in bits if the log is in base 2. In this way, the extent to which Bob's prior is "wrong" can be quantified in terms of how "unnecessarily surprised" it is expected to make him.
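A brief sketch (with two made-up three-outcome distributions) shows how the divergence is computed and that it is not symmetric:

```python
import math

def kl_divergence(p, q):
    """D_KL(p || q) in bits; assumes q(x) > 0 wherever p(x) > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]   # the "true" distribution
q = [0.8, 0.1, 0.1]     # Bob's prior / the coding distribution

print(kl_divergence(p, q))  # extra bits per symbol when coding as if q were true
print(kl_divergence(q, p))  # a different value: KL divergence is not symmetric
```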
Directed Information
Directed information, I(X^n → Y^n), is an information theory measure that quantifies the information flow from the random process X^n = {X_1, X_2, ..., X_n} to the random process Y^n = {Y_1, Y_2, ..., Y_n}. The term directed information was coined by James Massey and is defined as
- I(X^n \to Y^n) \triangleq \sum_{i=1}^{n} I(X^i; Y_i \mid Y^{i-1}),
where I(X^i; Y_i | Y^{i-1}) is the conditional mutual information I(X_1, X_2, ..., X_i; Y_i | Y_1, Y_2, ..., Y_{i-1}).
In contrast to mutual information, directed information is not symmetric. The quantity I(X^n → Y^n) measures the information bits that are transmitted causally[clarification needed] from X^n to Y^n. Directed information has many applications in problems where causality plays an important role, such as the capacity of channels with feedback, the capacity of discrete memoryless networks with feedback, gambling with causal side information, compression with causal side information, real-time control communication settings, and statistical physics.
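As a rough illustration of the definition above, the sketch below computes I(X² → Y²) for a length-2 process as the sum of its two conditional mutual information terms; the toy channel with memory and the helper function are made up for this example:

```python
import math
from collections import defaultdict
from itertools import product

def cond_mutual_info(joint, a_idx, b_idx, c_idx):
    """I(A; B | C) in bits, for a joint pmf given as {outcome_tuple: probability}.
    a_idx, b_idx, c_idx are tuples of coordinate positions selecting A, B and C;
    an empty c_idx gives the ordinary mutual information I(A; B)."""
    def marginal(idx):
        m = defaultdict(float)
        for outcome, p in joint.items():
            m[tuple(outcome[i] for i in idx)] += p
        return m
    p_abc = marginal(a_idx + b_idx + c_idx)
    p_ac = marginal(a_idx + c_idx)
    p_bc = marginal(b_idx + c_idx)
    p_c = marginal(c_idx)
    total = 0.0
    for outcome, p in joint.items():
        if p == 0:
            continue
        a = tuple(outcome[i] for i in a_idx)
        b = tuple(outcome[i] for i in b_idx)
        c = tuple(outcome[i] for i in c_idx)
        total += p * math.log2(p_c[c] * p_abc[a + b + c] / (p_ac[a + c] * p_bc[b + c]))
    return total

# Coordinates of each outcome: (x1, x2, y1, y2). Toy channel with memory (probabilities
# made up): X1, X2 are fair bits; Y1 copies X1 with prob 0.9; Y2 copies X2 XOR Y1 with prob 0.9.
joint = defaultdict(float)
for x1, x2 in product((0, 1), repeat=2):
    for y1 in (0, 1):
        p1 = 0.25 * (0.9 if y1 == x1 else 0.1)
        for y2 in (0, 1):
            p2 = 0.9 if y2 == (x2 ^ y1) else 0.1
            joint[(x1, x2, y1, y2)] += p1 * p2

# I(X^2 -> Y^2) = I(X1; Y1) + I(X1, X2; Y2 | Y1)
directed_info = (cond_mutual_info(joint, (0,), (2,), ())
                 + cond_mutual_info(joint, (0, 1), (3,), (2,)))
print(f"I(X^2 -> Y^2) ≈ {directed_info:.4f} bits")
```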
Other quantities
Other important information theoretic quantities include the Rényi entropy and the Tsallis entropy (generalizations of the concept of entropy), differential entropy (a generalization of quantities of information to continuous distributions), and the conditional mutual information. Also, pragmatic information has been proposed as a measure of how much information has been used in making a decision.
Coding theory
Coding theory is one of the most important and direct applications of information theory. It can be subdivided into source coding theory and channel coding theory. Using a statistical description for data, information theory quantifies the number of bits needed to describe the data, which is the information entropy of the source.
- Data compression (source coding): There are two formulations for the compression problem:
- lossless data compression: the data must be reconstructed exactly;
- lossy data compression: allocates bits needed to reconstruct the data, within a specified fidelity level measured by a distortion function. This subset of information theory is called rate–distortion theory.
- Error-correcting codes (channel coding): While data compression removes as much redundancy as possible, an error-correcting code adds just the right kind of redundancy (i.e., error correction) needed to transmit the data efficiently and faithfully across a noisy channel.
This division of coding theory into compression and transmission is justified by the information transmission theorems, or source–channel separation theorems that justify the use of bits as the universal currency for information in many contexts. However, these theorems only hold in the situation where one transmitting user wishes to communicate to one receiving user. In scenarios with more than one transmitter (the multiple-access channel), more than one receiver (the broadcast channel) or intermediary "helpers" (the relay channel), or more general networks, compression followed by transmission may no longer be optimal.
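As a concrete instance of the source-coding side of this division, the sketch below builds a Huffman code for a made-up four-symbol source and compares its expected codeword length with the source entropy, which lower-bounds any lossless symbol code:

```python
import heapq
import math

def huffman_code(probs):
    """Build a binary Huffman code for {symbol: probability}; returns {symbol: codeword}."""
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)  # tie-breaker so dicts are never compared
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.1}
code = huffman_code(probs)
avg_len = sum(probs[s] * len(w) for s, w in code.items())
entropy = -sum(p * math.log2(p) for p in probs.values())
print(code)
print(f"expected length = {avg_len:.3f} bits/symbol, entropy = {entropy:.3f} bits/symbol")
```

Coding blocks of symbols jointly, rather than symbol by symbol, brings the average length per symbol arbitrarily close to the entropy, which is the content of the source coding theorem.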
Source theory
Any process that generates successive messages can be considered a source of information. A memoryless source is one in which each message is an independent identically distributed random variable, whereas the properties of ergodicity and stationarity impose less restrictive constraints. All such sources are stochastic. These terms are well studied in their own right outside information theory.
Rate
Information rate is the average entropy per symbol. For memoryless sources, this is merely the entropy of each symbol, while, in the case of a stationary stochastic process, it is:
r = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, X_{n-3}, \dots);
that is, the conditional entropy of a symbol given all the previous symbols generated. For the more general case of a process that is not necessarily stationary, the average rate is:
r = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \dots, X_n);
that is, the limit of the joint entropy per symbol. For stationary sources, these two expressions give the same result.
The information rate is defined as:
r = \lim_{n \to \infty} \frac{1}{n} I(X_1, X_2, \dots, X_n; Y_1, Y_2, \dots, Y_n);
It is common in information theory to speak of the "rate" or "entropy" of a language. This is appropriate, for example, when the source of information is English prose. The rate of a source of information is related to its redundancy and how well it can be compressed, the subject of source coding.
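For a stationary Markov source, the conditional-entropy limit above reduces to a single-step conditional entropy averaged over the stationary distribution. A minimal sketch with a made-up two-state transition matrix:

```python
import math

# Transition probabilities of a two-state Markov source (values are illustrative):
# P[i][j] = probability of moving from state i to state j; rows sum to 1.
P = [[0.9, 0.1],
     [0.4, 0.6]]

# For a two-state chain the stationary distribution has the closed form
# pi_0 = P[1][0] / (P[0][1] + P[1][0]),  pi_1 = 1 - pi_0.
pi0 = P[1][0] / (P[0][1] + P[1][0])
pi = [pi0, 1.0 - pi0]

# Entropy rate of a stationary Markov source: r = -sum_i pi_i sum_j P[i][j] log2 P[i][j]
rate = -sum(pi[i] * P[i][j] * math.log2(P[i][j])
            for i in range(2) for j in range(2) if P[i][j] > 0)
print(f"stationary distribution = {pi}, entropy rate ≈ {rate:.4f} bits/symbol")
```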
Channel capacity
Communication over a channel is the primary motivation of information theory. However, channels often fail to produce exact reconstruction of a signal; noise, periods of silence, and other forms of signal corruption often degrade quality.
Consider the communications process over a discrete channel. A simple model of the process is as follows:
Here X represents the space of messages transmitted, and Y the space of messages received during a unit time over our channel. Let p(y|x) be the conditional probability distribution function of Y given X. We will consider p(y|x) to be an inherent fixed property of our communications channel (representing the nature of the noise of our channel). Then the joint distribution of X and Y is completely determined by our channel and by our choice of f(x), the marginal distribution of messages we choose to send over the channel. Under these constraints, we would like to maximize the rate of information, or the signal, we can communicate over the channel. The appropriate measure for this is the mutual information, and this maximum mutual information is called the channel capacity and is given by:
C = \max_{f} I(X; Y)
This capacity has the following property related to communicating at information rate R (where R is usually bits per symbol). For any information rate R < C and coding error ε > 0, for large enough N, there exists a code of length N and rate ≥ R and a decoding algorithm, such that the maximal probability of block error is ≤ ε; that is, it is always possible to transmit with arbitrarily small block error. In addition, for any rate R > C, it is impossible to transmit with arbitrarily small block error.
Channel coding is concerned with finding such nearly optimal codes that can be used to transmit data over a noisy channel with a small coding error at a rate near the channel capacity.
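The maximization in the capacity formula can be approximated numerically for small channels. The sketch below grid-searches the input distribution of a binary-input Z-channel (transition probabilities chosen arbitrarily for illustration) to estimate C = max I(X; Y):

```python
import math

def mutual_info(p_x, channel):
    """I(X; Y) in bits for input pmf p_x (list) and channel[x][y] = p(y|x)."""
    p_y = [sum(p_x[x] * channel[x][y] for x in range(len(p_x)))
           for y in range(len(channel[0]))]
    total = 0.0
    for x, px in enumerate(p_x):
        for y, pyx in enumerate(channel[x]):
            if px > 0 and pyx > 0:
                total += px * pyx * math.log2(pyx / p_y[y])
    return total

# Z-channel: a transmitted 0 is always received as 0; a transmitted 1 flips to 0 with prob 0.1.
channel = [[1.0, 0.0],
           [0.1, 0.9]]

# Grid search over input distributions (p, 1-p); C = max_p I(X; Y).
best = max((mutual_info([p, 1 - p], channel), p)
           for p in (i / 1000 for i in range(1001)))
print(f"estimated capacity ≈ {best[0]:.4f} bits/use at p(X=0) ≈ {best[1]:.3f}")
```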
Capacity of particular channel models
- A continuous-time analog communications channel subject to Gaussian noise—see Shannon–Hartley theorem.
- A binary symmetric channel (BSC) with crossover probability p is a binary input, binary output channel that flips the input bit with probability p. The BSC has a capacity of 1 − Hb(p) bits per channel use, where Hb is the binary entropy function to the base-2 logarithm.
- A binary erasure channel (BEC) with erasure probability p is a binary input, ternary output channel. The possible channel outputs are 0, 1, and a third symbol 'e' called an erasure. The erasure represents complete loss of information about an input bit. The capacity of the BEC is 1 − p bits per channel use.
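These closed forms are straightforward to evaluate; the short sketch below tabulates them for a few arbitrary crossover and erasure probabilities:

```python
import math

def binary_entropy(p):
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

def bec_capacity(p):
    """Capacity of a binary erasure channel with erasure probability p."""
    return 1.0 - p

for p in (0.0, 0.05, 0.11, 0.5):
    print(f"p = {p}: BSC capacity = {bsc_capacity(p):.4f}, BEC capacity = {bec_capacity(p):.4f}")
```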
Channels with memory and directed information
In practice many channels have memory. Namely, at time i the channel is given by the conditional probability P(y_i | x_i, x_{i-1}, ..., x_1, y_{i-1}, ..., y_1). It is often more convenient to use the notation x^i = (x_i, x_{i-1}, ..., x_1), so that the channel becomes P(y_i | x^i, y^{i-1}). In such a case the capacity is given by the mutual information rate when there is no feedback available, and by the directed information rate whether or not there is feedback (if there is no feedback the directed information equals the mutual information).
Fungible information
Fungible information is the information for which the means of encoding is not important. Classical information theorists and computer scientists are mainly concerned with information of this sort. It is sometimes referred to as speakable information.
Applications to other fields
Intelligence uses and secrecy applications
Information theoretic concepts apply to cryptography and cryptanalysis. Turing's information unit, the ban, was used in the Ultra project, breaking the German Enigma machine code and hastening the end of World War II in Europe. Shannon himself defined an important concept now called the unicity distance. Based on the redundancy of the plaintext, it attempts to give a minimum amount of ciphertext necessary to ensure unique decipherability.
Information theory leads us to believe it is much more difficult to keep secrets than it might first appear. A brute force attack can break systems based on asymmetric key algorithms or on most commonly used methods of symmetric key algorithms (sometimes called secret key algorithms), such as block ciphers. The security of all such methods comes from the assumption that no known attack can break them in a practical amount of time.
Information theoretic security refers to methods such as the one-time pad that are not vulnerable to such brute force attacks. In such cases, the positive conditional mutual information between the plaintext and ciphertext (conditioned on the key) can ensure proper transmission, while the unconditional mutual information between the plaintext and ciphertext remains zero, resulting in absolutely secure communications. In other words, an eavesdropper would not be able to improve his or her guess of the plaintext by gaining knowledge of the ciphertext but not of the key. However, as in any other cryptographic system, care must be used to correctly apply even information-theoretically secure methods; the Venona project was able to crack the one-time pads of the Soviet Union due to their improper reuse of key material.
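A toy sketch of the one-time pad idea (bytes XORed with a fresh, truly random key of the same length) is shown below; it is purely illustrative, not production cryptography:

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each plaintext byte with a key byte; the key must be
    truly random, as long as the message, and never reused."""
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))   # fresh random key for this message only
ciphertext = otp_encrypt(message, key)
recovered = otp_encrypt(ciphertext, key)  # XOR with the same key decrypts
print(ciphertext.hex(), recovered)
```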
Pseudorandom number generation
Pseudorandom number generators are widely available in computer language libraries and application programs. They are, almost universally, unsuited to cryptographic use as they do not evade the deterministic nature of modern computer equipment and software. A class of improved random number generators is termed cryptographically secure pseudorandom number generators, but even they require random seeds external to the software to work as intended. These can be obtained via extractors, if done carefully. The measure of sufficient randomness in extractors is min-entropy, a value related to Shannon entropy through Rényi entropy; Rényi entropy is also used in evaluating randomness in cryptographic systems. Although related, the distinctions among these measures mean that a random variable with high Shannon entropy is not necessarily satisfactory for use in an extractor and so for cryptographic uses.
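The distinction between these measures can be seen on a skewed, made-up distribution: its Shannon entropy is moderately high, but its min-entropy, which governs how many nearly uniform bits an extractor can produce, is much lower:

```python
import math

def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def min_entropy(probs):
    """H_min = -log2(max_x p(x)): determined entirely by the most likely outcome."""
    return -math.log2(max(probs))

# A distribution with one very likely value and many unlikely ones.
probs = [0.5] + [0.5 / 255] * 255
print(f"Shannon entropy = {shannon_entropy(probs):.2f} bits")
print(f"Min-entropy     = {min_entropy(probs):.2f} bits")
```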
Seismic exploration
One early commercial application of information theory was in the field of seismic oil exploration. Work in this field made it possible to strip off and separate the unwanted noise from the desired seismic signal. Information theory and digital signal processing offer a major improvement of resolution and image clarity over previous analog methods.
Semiotics
Semioticians Doede Nauta and Winfried Nöth both considered Charles Sanders Peirce as having created a theory of information in his works on semiotics. Nauta defined semiotic information theory as the study of "the internal processes of coding, filtering, and information processing."
Concepts from information theory such as redundancy and code control have been used by semioticians such as Umberto Eco and Ferruccio Rossi-Landi
to explain ideology as a form of message transmission whereby a dominant social class emits its message by using signs that exhibit a high degree of redundancy such that only one message is decoded among a selection of competing ones.
Integrated process organization of neural information
Quantitative information-theoretic methods have been applied in cognitive science to analyze the integrated process organization of neural information in the context of the binding problem in cognitive neuroscience. In this context, an information-theoretical measure is defined either as functional clusters (Gerald Edelman and Giulio Tononi's functional clustering model and dynamic core hypothesis (DCH)) or as effective information (Tononi's integrated information theory (IIT) of consciousness), on the basis of a reentrant process organization, i.e. the synchronization of neurophysiological activity between groups of neuronal populations; alternatively, the measure is the minimization of free energy on the basis of statistical methods (Karl J. Friston's free energy principle (FEP), an information-theoretical measure which states that every adaptive change in a self-organized system leads to a minimization of free energy, and the Bayesian brain hypothesis).
Miscellaneous applications
Information theory also has applications in the search for extraterrestrial intelligence, black holes, bioinformatics, and gambling.
See also
- Algorithmic probability
- Bayesian inference
- Communication theory
- Constructor theory – a generalization of information theory that includes quantum information
- Formal science
- Inductive probability
- Info-metrics
- Minimum message length
- Minimum description length
- Philosophy of information
Applications
- Active networking
- Cryptanalysis
- Cryptography
- Cybernetics
- Entropy in thermodynamics and information theory
- Gambling
- Intelligence (information gathering)
- Seismic exploration
History
- Hartley, R.V.L.
- History of information theory
- Shannon, C.E.
- Timeline of information theory
- Yockey, H.P.
- Andrey Kolmogorov
Theory
- Coding theory
- Detection theory
- Estimation theory
- Fisher information
- Information algebra
- Information asymmetry
- Information field theory
- Information geometry
- Information theory and measure theory
- Kolmogorov complexity
- List of unsolved problems in information theory
- Logic of information
- Network coding
- Philosophy of information
- Quantum information science
- Source coding
Concepts
- Ban (unit)
- Channel capacity
- Communication channel
- Communication source
- Conditional entropy
- Covert channel
- Data compression
- Decoder
- Differential entropy
- Fungible information
- Information fluctuation complexity
- Information entropy
- Joint entropy
- Kullback–Leibler divergence
- Mutual information
- Pointwise mutual information (PMI)
- Receiver (information theory)
- Redundancy
- Rényi entropy
- Self-information
- Unicity distance
- Variety
- Hamming distance
- Perplexity
References
- Schneider, Thomas D. (2006). "Claude Shannon: Biologist". IEEE Engineering in Medicine and Biology Magazine: The Quarterly Magazine of the Engineering in Medicine & Biology Society. 25 (1): 30–33. doi:10.1109/memb.2006.1578661. ISSN 0739-5175. PMC 1538977. PMID 16485389.
- Cruces, Sergio; Martín-Clemente, Rubén; Samek, Wojciech (2019-07-03). "Information Theory Applications in Signal Processing". Entropy. 21 (7): 653. Bibcode:2019Entrp..21..653C. doi:10.3390/e21070653. ISSN 1099-4300. PMC 7515149. PMID 33267367.
- Baleanu, D.; Balas, Valentina Emilia; Agarwal, Praveen, eds. (2023). Fractional Order Systems and Applications in Engineering. Advanced Studies in Complex Systems. London, United Kingdom: Academic Press. p. 23. ISBN 978-0-323-90953-2. OCLC 1314337815.
- Horgan, John (2016-04-27). "Claude Shannon: Tinkerer, Prankster, and Father of Information Theory". IEEE. Retrieved 2024-11-08.
- Shi, Zhongzhi (2011). Advanced Artificial Intelligence. World Scientific Publishing. p. 2. doi:10.1142/7547. ISBN 978-981-4291-34-7.
- Sinha, Sudhi; Al Huraimel, Khaled (2020-10-20). Reimagining Businesses with AI (1 ed.). Wiley. p. 4. doi:10.1002/9781119709183. ISBN 978-1-119-70915-2.
- Burnham, K. P.; Anderson, D. R. (2002). Model Selection and Multimodel Inference: A Practical Information-Theoretic Approach (Second ed.). New York: Springer Science. ISBN 978-0-387-95364-9.
- F. Rieke; D. Warland; R Ruyter van Steveninck; W Bialek (1997). Spikes: Exploring the Neural Code. The MIT press. ISBN 978-0262681087.
- Delgado-Bonal, Alfonso; Martín-Torres, Javier (2016-11-03). "Human vision is determined based on information theory". Scientific Reports. 6 (1): 36038. Bibcode:2016NatSR...636038D. doi:10.1038/srep36038. ISSN 2045-2322. PMC 5093619. PMID 27808236.
- cf; Huelsenbeck, J. P.; Ronquist, F.; Nielsen, R.; Bollback, J. P. (2001). "Bayesian inference of phylogeny and its impact on evolutionary biology". Science. 294 (5550): 2310–2314. Bibcode:2001Sci...294.2310H. doi:10.1126/science.1065889. PMID 11743192. S2CID 2138288.
- Allikmets, Rando; Wasserman, Wyeth W.; Hutchinson, Amy; Smallwood, Philip; Nathans, Jeremy; Rogan, Peter K. (1998). "Thomas D. Schneider], Michael Dean (1998) Organization of the ABCR gene: analysis of promoter and splice junction sequences". Gene. 215 (1): 111–122. doi:10.1016/s0378-1119(98)00269-8. PMID 9666097.
- Jaynes, E. T. (1957). "Information Theory and Statistical Mechanics". Phys. Rev. 106 (4): 620. Bibcode:1957PhRv..106..620J. doi:10.1103/physrev.106.620. S2CID 17870175.
- Talaat, Khaled; Cowen, Benjamin; Anderoglu, Osman (2020-10-05). "Method of information entropy for convergence assessment of molecular dynamics simulations". Journal of Applied Physics. 128 (13): 135102. Bibcode:2020JAP...128m5102T. doi:10.1063/5.0019078. OSTI 1691442. S2CID 225010720.
- Bennett, Charles H.; Li, Ming; Ma, Bin (2003). "Chain Letters and Evolutionary Histories". Scientific American. 288 (6): 76–81. Bibcode:2003SciAm.288f..76B. doi:10.1038/scientificamerican0603-76. PMID 12764940. Archived from the original on 2007-10-07. Retrieved 2008-03-11.
- David R. Anderson (November 1, 2003). "Some background on why people in the empirical sciences may want to better understand the information-theoretic methods" (PDF). Archived from the original (PDF) on July 23, 2011. Retrieved 2010-06-23.
- Loy, D. Gareth (2017), Pareyon, Gabriel; Pina-Romero, Silvia; Agustín-Aquino, Octavio A.; Lluis-Puebla, Emilio (eds.), "Music, Expectation, and Information Theory", The Musical-Mathematical Mind: Patterns and Transformations, Computational Music Science, Cham: Springer International Publishing, pp. 161–169, doi:10.1007/978-3-319-47337-6_17, ISBN 978-3-319-47337-6, retrieved 2024-09-19
- Rocamora, Martín; Cancela, Pablo; Biscainho, Luiz (2019-04-05). "Information Theory Concepts Applied to the Analysis of Rhythm in Recorded Music with Recurrent Rhythmic Patterns". Journal of the Audio Engineering Society. 67 (4): 160–173. doi:10.17743/jaes.2019.0003.
- Marsden, Alan (2020). "New Prospects for Information Theory in Arts Research". Leonardo. 53 (3): 274–280. doi:10.1162/leon_a_01860. ISSN 0024-094X.
- Pinkard, Henry; Kabuli, Leyla; Markley, Eric; Chien, Tiffany; Jiao, Jiantao; Waller, Laura (2024). "Universal evaluation and design of imaging systems using information estimation". arXiv:2405.20559 [physics.optics].
- Wing, Simon; Johnson, Jay R. (2019-02-01). "Applications of Information Theory in Solar and Space Physics". Entropy. 21 (2): 140. Bibcode:2019Entrp..21..140W. doi:10.3390/e21020140. ISSN 1099-4300. PMC 7514618. PMID 33266856.
- Kak, Subhash (2020-11-26). "Information theory and dimensionality of space". Scientific Reports. 10 (1): 20733. doi:10.1038/s41598-020-77855-9. ISSN 2045-2322. PMC 7693271. PMID 33244156.
- Harms, William F. (1998). "The Use of Information Theory in Epistemology". Philosophy of Science. 65 (3): 472–501. doi:10.1086/392657. ISSN 0031-8248. JSTOR 188281.
- Gleick 2011, pp. 3–4.
- Horgan, John (2016-04-27). "Claude Shannon: Tinkerer, Prankster, and Father of Information Theory". IEEE. Retrieved 2023-09-30.
- Roberts, Siobhan (2016-04-30). "The Forgotten Father of the Information Age". The New Yorker. ISSN 0028-792X. Retrieved 2023-09-30.
- Tse, David (2020-12-22). "How Claude Shannon Invented the Future". Quanta Magazine. Retrieved 2023-09-30.
- Braverman, Mark (September 19, 2011). "Information Theory in Computer Science" (PDF).
- Reza 1994.
- Ash 1990.
- Massey, James (1990), "Causality, Feedback And Directed Information", Proc. 1990 Intl. Symp. on Info. Th. and its Applications, CiteSeerX 10.1.1.36.5688
- Permuter, Haim Henry; Weissman, Tsachy; Goldsmith, Andrea J. (February 2009). "Finite State Channels With Time-Invariant Deterministic Feedback". IEEE Transactions on Information Theory. 55 (2): 644–662. arXiv:cs/0608070. doi:10.1109/TIT.2008.2009849. S2CID 13178.
- Kramer, G. (January 2003). "Capacity results for the discrete memoryless network". IEEE Transactions on Information Theory. 49 (1): 4–21. doi:10.1109/TIT.2002.806135.
- Permuter, Haim H.; Kim, Young-Han; Weissman, Tsachy (June 2011). "Interpretations of Directed Information in Portfolio Theory, Data Compression, and Hypothesis Testing". IEEE Transactions on Information Theory. 57 (6): 3248–3259. arXiv:0912.4872. doi:10.1109/TIT.2011.2136270. S2CID 11722596.
- Simeone, Osvaldo; Permuter, Haim Henri (June 2013). "Source Coding When the Side Information May Be Delayed". IEEE Transactions on Information Theory. 59 (6): 3607–3618. arXiv:1109.1293. doi:10.1109/TIT.2013.2248192. S2CID 3211485.
- Charalambous, Charalambos D.; Stavrou, Photios A. (August 2016). "Directed Information on Abstract Spaces: Properties and Variational Equalities". IEEE Transactions on Information Theory. 62 (11): 6019–6052. arXiv:1302.3971. doi:10.1109/TIT.2016.2604846. S2CID 8107565.
- Tanaka, Takashi; Esfahani, Peyman Mohajerin; Mitter, Sanjoy K. (January 2018). "LQG Control With Minimum Directed Information: Semidefinite Programming Approach". IEEE Transactions on Automatic Control. 63 (1): 37–52. arXiv:1510.04214. doi:10.1109/TAC.2017.2709618. S2CID 1401958. Archived from the original on Apr 12, 2024 – via TU Delft Repositories.
- Vinkler, Dror A; Permuter, Haim H; Merhav, Neri (20 April 2016). "Analogy between gambling and measurement-based work extraction". Journal of Statistical Mechanics: Theory and Experiment. 2016 (4): 043403. arXiv:1404.6788. Bibcode:2016JSMTE..04.3403V. doi:10.1088/1742-5468/2016/04/043403. S2CID 124719237.
- Jerry D. Gibson (1998). Digital Compression for Multimedia: Principles and Standards. Morgan Kaufmann. ISBN 1-55860-369-7.
- Permuter, Haim Henry; Weissman, Tsachy; Goldsmith, Andrea J. (February 2009). "Finite State Channels With Time-Invariant Deterministic Feedback". IEEE Transactions on Information Theory. 55 (2): 644–662. arXiv:cs/0608070. doi:10.1109/TIT.2008.2009849. S2CID 13178.
- Bartlett, Stephen D.; Rudolph, Terry; Spekkens, Robert W. (April–June 2007). "Reference frames, superselection rules, and quantum information". Reviews of Modern Physics. 79 (2): 555–606. arXiv:quant-ph/0610030. Bibcode:2007RvMP...79..555B. doi:10.1103/RevModPhys.79.555.
- Peres, A.; P. F. Scudo (2002b). A. Khrennikov (ed.). Quantum Theory: Reconsideration of Foundations. Växjö University Press, Växjö, Sweden. p. 283.
- Haggerty, Patrick E. (1981). "The corporation and innovation". Strategic Management Journal. 2 (2): 97–118. doi:10.1002/smj.4250020202.
- Nauta, Doede (1972). The Meaning of Information. The Hague: Mouton. ISBN 9789027919960.
- Nöth, Winfried (January 2012). "Charles S. Peirce's theory of information: a theory of the growth of symbols and of knowledge". Cybernetics and Human Knowing. 19 (1–2): 137–161.
- Nöth, Winfried (1981). "Semiotics of ideology". Semiotica, Issue 148.
- Maurer, H. (2021). "Chapter 10: Systematic Class of Information Based Architecture Types". Cognitive Science: Integrative Synchronization Mechanisms in Cognitive Neuroarchitectures of the Modern Connectionism. Boca Raton/FL: CRC Press. doi:10.1201/9781351043526. ISBN 978-1-351-04352-6.
- Edelman, G.M.; Tononi, G. (2000). A Universe of Consciousness: How Matter Becomes Imagination. New York: Basic Books. ISBN 978-0465013777.
- Tononi, G.; Sporns, O. (2003). "Measuring information integration". BMC Neuroscience. 4: 1–20. doi:10.1186/1471-2202-4-31. PMC 331407. PMID 14641936.
- Tononi, G. (2004a). "An information integration theory of consciousness". BMC Neuroscience. 5: 1–22. doi:10.1186/1471-2202-5-42. PMC 543470. PMID 15522121.
- Tononi, G. (2004b). "Consciousness and the brain: theoretical aspects". In Adelman, G.; Smith, B. (eds.). Encyclopedia of Neuroscience (3rd ed.). Amsterdam, Oxford: Elsevier. ISBN 0-444-51432-5. Archived (PDF) from the original on 2023-12-02.
- Friston, K.; Stephan, K.E. (2007). "Free-energy and the brain". Synthese. 159 (3): 417–458. doi:10.1007/s11229-007-9237-y. PMC 2660582. PMID 19325932.
- Friston, K. (2010). "The free-energy principle: a unified brain theory". Nature Reviews Neuroscience. 11 (2): 127–138. doi:10.1038/nrn2787. PMID 20068583.
- Friston, K.; Breakstear, M.; Deco, G. (2012). "Perception and self-organized instability". Frontiers in Computational Neuroscience. 6: 1–19. doi:10.3389/fncom.2012.00044. PMC 3390798. PMID 22783185.
- Friston, K. (2013). "Life as we know it". Journal of the Royal Society Interface. 10 (86): 20130475. doi:10.1098/rsif.2013.0475. PMC 3730701. PMID 23825119.
- Kirchhoff, M.; Parr, T.; Palacios, E.; Friston, K.; Kiverstein, J. (2018). "The Markov blankets of life: autonomy, active inference and the free energy principle". Journal of the Royal Society Interface. 15 (138): 20170792. doi:10.1098/rsif.2017.0792. PMC 5805980. PMID 29343629.
- Doyle, Laurance R.; McCowan, Brenda; Johnston, Simon; Hanser, Sean F. (February 2011). "Information theory, animal communication, and the search for extraterrestrial intelligence". Acta Astronautica. 68 (3–4): 406–417. Bibcode:2011AcAau..68..406D. doi:10.1016/j.actaastro.2009.11.018.
- Bekenstein, Jacob D (2004). "Black holes and information theory". Contemporary Physics. 45 (1): 31–43. arXiv:quant-ph/0311049. Bibcode:2004ConPh..45...31B. doi:10.1080/00107510310001632523. ISSN 0010-7514.
- Vinga, Susana (2014-05-01). "Information theory applications for biological sequence analysis". Briefings in Bioinformatics. 15 (3): 376–389. doi:10.1093/bib/bbt068. ISSN 1467-5463. PMC 7109941. PMID 24058049.
- Thorp, Edward O. (2008-01-01), Zenios, S. A.; Ziemba, W. T. (eds.), "The kelly criterion in blackjack sports betting, and the stock market*", Handbook of Asset and Liability Management, San Diego: North-Holland, pp. 385–428, doi:10.1016/b978-044453248-0.50015-0, ISBN 978-0-444-53248-0, retrieved 2025-01-20
- Haigh, John (2000). "The Kelly Criterion and Bet Comparisons in Spread Betting". Journal of the Royal Statistical Society: Series D (The Statistician). 49 (4): 531–539. doi:10.1111/1467-9884.00251. ISSN 1467-9884.
Further reading
The classic work
- Shannon, C.E. (1948), "A Mathematical Theory of Communication", Bell System Technical Journal, 27, pp. 379–423 & 623–656, July & October, 1948. PDF. Notes and other formats.
- R.V.L. Hartley, "Transmission of Information", Bell System Technical Journal, July 1928
- Andrey Kolmogorov (1968), "Three approaches to the quantitative definition of information" in International Journal of Computer Mathematics, 2, pp. 157–168.
Other journal articles
- J. L. Kelly Jr., Princeton, "A New Interpretation of Information Rate" Bell System Technical Journal, Vol. 35, July 1956, pp. 917–26.
- R. Landauer, IEEE.org, "Information is Physical" Proc. Workshop on Physics and Computation PhysComp'92 (IEEE Comp. Sci.Press, Los Alamitos, 1993) pp. 1–4.
- Landauer, R. (1961). "Irreversibility and Heat Generation in the Computing Process" (PDF). IBM J. Res. Dev. 5 (3): 183–191. doi:10.1147/rd.53.0183.
- Timme, Nicholas; Alford, Wesley; Flecker, Benjamin; Beggs, John M. (2012). "Multivariate information measures: an experimentalist's perspective". arXiv:1111.6857 [cs.IT].
Textbooks on information theory
- Alajaji, F. and Chen, P.N. An Introduction to Single-User Information Theory. Singapore: Springer, 2018. ISBN 978-981-10-8000-5
- Arndt, C. Information Measures, Information and its Description in Science and Engineering (Springer Series: Signals and Communication Technology), 2004, ISBN 978-3-540-40855-0
- Ash, Robert B. (1990) [1965]. Information Theory. New York: Dover Publications, Inc. ISBN 0-486-66521-6.
- Gallager, R. Information Theory and Reliable Communication. New York: John Wiley and Sons, 1968. ISBN 0-471-29048-3
- Goldman, S. Information Theory. New York: Prentice Hall, 1953. New York: Dover 1968 ISBN 0-486-62209-6, 2005 ISBN 0-486-44271-3
- Cover, Thomas; Thomas, Joy A. (2006). Elements of information theory (2nd ed.). New York: Wiley-Interscience. ISBN 0-471-24195-4.
- Csiszar, I, Korner, J. Information Theory: Coding Theorems for Discrete Memoryless Systems Akademiai Kiado: 2nd edition, 1997. ISBN 963-05-7440-3
- MacKay, David J. C. Information Theory, Inference, and Learning Algorithms Cambridge: Cambridge University Press, 2003. ISBN 0-521-64298-1
- Mansuripur, M. Introduction to Information Theory. New York: Prentice Hall, 1987. ISBN 0-13-484668-0
- McEliece, R. The Theory of Information and Coding. Cambridge, 2002. ISBN 978-0521831857
- Pierce, JR. "An introduction to information theory: symbols, signals and noise". Dover (2nd Edition). 1961 (reprinted by Dover 1980).
- Reza, Fazlollah M. (1994) [1961]. An Introduction to Information Theory. New York: Dover Publications, Inc. ISBN 0-486-68210-2.
- Shannon, Claude; Weaver, Warren (1949). The Mathematical Theory of Communication (PDF). Urbana, Illinois: University of Illinois Press. ISBN 0-252-72548-4. LCCN 49-11922.
- Stone, JV. Chapter 1 of book "Information Theory: A Tutorial Introduction", University of Sheffield, England, 2014. ISBN 978-0956372857.
- Yeung, RW. A First Course in Information Theory Kluwer Academic/Plenum Publishers, 2002. ISBN 0-306-46791-7.
- Yeung, RW. Information Theory and Network Coding Springer 2008, 2002. ISBN 978-0-387-79233-0
Other books
- Leon Brillouin, Science and Information Theory, Mineola, N.Y.: Dover, [1956, 1962] 2004. ISBN 0-486-43918-6
- Gleick, James (2011). The Information: A History, a Theory, a Flood (1st ed.). New York: Vintage Books. ISBN 978-1-4000-9623-7.
- A. I. Khinchin, Mathematical Foundations of Information Theory, New York: Dover, 1957. ISBN 0-486-60434-9
- H. S. Leff and A. F. Rex, Editors, Maxwell's Demon: Entropy, Information, Computing, Princeton University Press, Princeton, New Jersey (1990). ISBN 0-691-08727-X
- Robert K. Logan. What is Information? - Propagating Organization in the Biosphere, the Symbolosphere, the Technosphere and the Econosphere, Toronto: DEMO Publishing.
- Tom Siegfried, The Bit and the Pendulum, Wiley, 2000. ISBN 0-471-32174-5
- Charles Seife, Decoding the Universe, Viking, 2006. ISBN 0-670-03441-X
- Jeremy Campbell, Grammatical Man, Touchstone/Simon & Schuster, 1982, ISBN 0-671-44062-4
- Henri Theil, Economics and Information Theory, Rand McNally & Company - Chicago, 1967.
- Escolano, Suau, Bonev, Information Theory in Computer Vision and Pattern Recognition, Springer, 2009. ISBN 978-1-84882-296-2
- Vlatko Vedral, Decoding Reality: The Universe as Quantum Information, Oxford University Press 2010. ISBN 0-19-923769-7
External links
- "Information", Encyclopedia of Mathematics, EMS Press, 2001 [1994]
- Lambert F. L. (1999), "Shuffled Cards, Messy Desks, and Disorderly Dorm Rooms - Examples of Entropy Increase? Nonsense!", Journal of Chemical Education
- IEEE Information Theory Society and ITSOC Monographs, Surveys, and Reviews Archived 2018-06-12 at the Wayback Machine
Information theory is the mathematical study of the quantification storage and communication of information The field was established and formalized by Claude Shannon in the 1940s though early contributions were made in the 1920s through the works of Harry Nyquist and Ralph Hartley It is at the intersection of electronic engineering mathematics statistics computer science neurobiology physics and electrical engineering A key measure in information theory is entropy Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process For example identifying the outcome of a fair coin flip which has two equally likely outcomes provides less information lower entropy less uncertainty than identifying the outcome from a roll of a die which has six equally likely outcomes Some other important measures in information theory are mutual information channel capacity error exponents and relative entropy Important sub fields of information theory include source coding algorithmic complexity theory algorithmic information theory and information theoretic security Applications of fundamental topics of information theory include source coding data compression e g for ZIP files and channel coding error detection and correction e g for DSL Its impact has been crucial to the success of the Voyager missions to deep space the invention of the compact disc the feasibility of mobile phones and the development of the Internet and artificial intelligence The theory has also found applications in other areas including statistical inference cryptography neurobiology perception signal processing linguistics the evolution and function of molecular codes bioinformatics thermal physics molecular dynamics black holes quantum computing information retrieval intelligence gathering plagiarism detection pattern recognition anomaly detection the analysis of music art creation imaging system design study of outer space the dimensionality of space and epistemology OverviewInformation theory studies the transmission processing extraction and utilization of information Abstractly information can be thought of as the resolution of uncertainty In the case of communication of information over a noisy channel this abstract concept was formalized in 1948 by Claude Shannon in a paper entitled A Mathematical Theory of Communication in which information is thought of as a set of possible messages and the goal is to send these messages over a noisy channel and to have the receiver reconstruct the message with low probability of error in spite of the channel noise Shannon s main result the noisy channel coding theorem showed that in the limit of many channel uses the rate of information that is asymptotically achievable is equal to the channel capacity a quantity dependent merely on the statistics of the channel over which the messages are sent Coding theory is concerned with finding explicit methods called codes for increasing the efficiency and reducing the error rate of data communication over noisy channels to near the channel capacity These codes can be roughly subdivided into data compression source coding and error correction channel coding techniques In the latter case it took many years to find the methods Shannon s work proved were possible citation needed A third class of information theory codes are cryptographic algorithms both codes and ciphers Concepts methods and results from coding theory and information theory are widely used in cryptography and cryptanalysis such as 
the unit ban citation needed Historical backgroundThe landmark event establishing the discipline of information theory and bringing it to immediate worldwide attention was the publication of Claude E Shannon s classic paper A Mathematical Theory of Communication in the Bell System Technical Journal in July and October 1948 Historian James Gleick rated the paper as the most important development of 1948 noting that the paper was even more profound and more fundamental than the transistor He came to be known as the father of information theory Shannon outlined some of his initial ideas of information theory as early as 1939 in a letter to Vannevar Bush Prior to this paper limited information theoretic ideas had been developed at Bell Labs all implicitly assuming events of equal probability Harry Nyquist s 1924 paper Certain Factors Affecting Telegraph Speed contains a theoretical section quantifying intelligence and the line speed at which it can be transmitted by a communication system giving the relation W K log m recalling the Boltzmann constant where W is the speed of transmission of intelligence m is the number of different voltage levels to choose from at each time step and K is a constant Ralph Hartley s 1928 paper Transmission of Information uses the word information as a measurable quantity reflecting the receiver s ability to distinguish one sequence of symbols from any other thus quantifying information as H log Sn n log S where S was the number of possible symbols and n the number of symbols in a transmission The unit of information was therefore the decimal digit which since has sometimes been called the hartley in his honor as a unit or scale or measure of information Alan Turing in 1940 used similar ideas as part of the statistical analysis of the breaking of the German second world war Enigma ciphers citation needed Much of the mathematics behind information theory with events of different probabilities were developed for the field of thermodynamics by Ludwig Boltzmann and J Willard Gibbs Connections between information theoretic entropy and thermodynamic entropy including the important contributions by Rolf Landauer in the 1960s are explored in Entropy in thermodynamics and information theory citation needed In Shannon s revolutionary and groundbreaking paper the work for which had been substantially completed at Bell Labs by the end of 1944 Shannon for the first time introduced the qualitative and quantitative model of communication as a statistical process underlying information theory opening with the assertion The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point With it came the ideas of the information entropy and redundancy of a source and its relevance through the source coding theorem the mutual information and the channel capacity of a noisy channel including the promise of perfect loss free communication given by the noisy channel coding theorem the practical result of the Shannon Hartley law for the channel capacity of a Gaussian channel as well as the bit a new way of seeing the most fundamental unit of information citation needed Quantities of informationThis section does not cite any sources Please help improve this section by adding citations to reliable sources Unsourced material may be challenged and removed April 2024 Learn how and when to remove this message Information theory is based on probability theory and statistics where quantified information is usually 
described in terms of bits Information theory often concerns itself with measures of information of the distributions associated with random variables One of the most important measures is called entropy which forms the building block of many other measures Entropy allows quantification of measure of information in a single random variable Another useful concept is mutual information defined on two random variables which describes the measure of information in common between those variables which can be used to describe their correlation The former quantity is a property of the probability distribution of a random variable and gives a limit on the rate at which data generated by independent samples with the given distribution can be reliably compressed The latter is a property of the joint distribution of two random variables and is the maximum rate of reliable communication across a noisy channel in the limit of long block lengths when the channel statistics are determined by the joint distribution The choice of logarithmic base in the following formulae determines the unit of information entropy that is used A common unit of information is the bit or shannon based on the binary logarithm Other units include the nat which is based on the natural logarithm and the decimal digit which is based on the common logarithm In what follows an expression of the form p log p is considered by convention to be equal to zero whenever p 0 This is justified because limp 0 plog p 0 displaystyle lim p rightarrow 0 p log p 0 for any logarithmic base Entropy of an information source Based on the probability mass function of each source symbol to be communicated the Shannon entropy H in units of bits per symbol is given by H ipilog2 pi displaystyle H sum i p i log 2 p i where pi is the probability of occurrence of the i th possible value of the source symbol This equation gives the entropy in the units of bits per symbol because it uses a logarithm of base 2 and this base 2 measure of entropy has sometimes been called the shannon in his honor Entropy is also commonly computed using the natural logarithm base e where e is Euler s number which produces a measurement of entropy in nats per symbol and sometimes simplifies the analysis by avoiding the need to include extra constants in the formulas Other bases are also possible but less commonly used For example a logarithm of base 28 256 will produce a measurement in bytes per symbol and a logarithm of base 10 will produce a measurement in decimal digits or hartleys per symbol Intuitively the entropy HX of a discrete random variable X is a measure of the amount of uncertainty associated with the value of X when only its distribution is known The entropy of a source that emits a sequence of N symbols that are independent and identically distributed iid is N H bits per message of N symbols If the source data symbols are identically distributed but not independent the entropy of a message of length N will be less than N H The entropy of a Bernoulli trial as a function of success probability often called the binary entropy function Hb p The entropy is maximized at 1 bit per trial when the two possible outcomes are equally probable as in an unbiased coin toss If one transmits 1000 bits 0s and 1s and the value of each of these bits is known to the receiver has a specific value with certainty ahead of transmission it is clear that no information is transmitted If however each bit is independently equally likely to be 0 or 1 1000 shannons of information more often called 
bits have been transmitted Between these two extremes information can be quantified as follows If X displaystyle mathbb X is the set of all messages x1 xn that X could be and p x is the probability of some x X displaystyle x in mathbb X then the entropy H of X is defined H X EX I x x Xp x log p x displaystyle H X mathbb E X I x sum x in mathbb X p x log p x Here I x is the self information which is the entropy contribution of an individual message and EX displaystyle mathbb E X is the expected value A property of entropy is that it is maximized when all the messages in the message space are equiprobable p x 1 n i e most unpredictable in which case H X log n The special case of information entropy for a random variable with two outcomes is the binary entropy function usually taken to the logarithmic base 2 thus having the shannon Sh as unit Hb p plog2 p 1 p log2 1 p displaystyle H mathrm b p p log 2 p 1 p log 2 1 p Joint entropy The joint entropy of two discrete random variables X and Y is merely the entropy of their pairing X Y This implies that if X and Y are independent then their joint entropy is the sum of their individual entropies For example if X Y represents the position of a chess piece X the row and Y the column then the joint entropy of the row of the piece and the column of the piece will be the entropy of the position of the piece H X Y EX Y log p x y x yp x y log p x y displaystyle H X Y mathbb E X Y log p x y sum x y p x y log p x y Despite similar notation joint entropy should not be confused with cross entropy Conditional entropy equivocation The conditional entropy or conditional uncertainty of X given random variable Y also called the equivocation of X about Y is the average conditional entropy over Y H X Y EY H X y y Yp y x Xp x y log p x y x yp x y log p x y displaystyle H X Y mathbb E Y H X y sum y in Y p y sum x in X p x y log p x y sum x y p x y log p x y Because entropy can be conditioned on a random variable or on that random variable being a certain value care should be taken not to confuse these two definitions of conditional entropy the former of which is in more common use A basic property of this form of conditional entropy is that H X Y H X Y H Y displaystyle H X Y H X Y H Y Mutual information transinformation Mutual information measures the amount of information that can be obtained about one random variable by observing another It is important in communication where it can be used to maximize the amount of information shared between sent and received signals The mutual information of X relative to Y is given by I X Y EX Y SI x y x yp x y log p x y p x p y displaystyle I X Y mathbb E X Y SI x y sum x y p x y log frac p x y p x p y where SI Specific mutual Information is the pointwise mutual information A basic property of the mutual information is that I X Y H X H X Y displaystyle I X Y H X H X Y That is knowing Y we can save an average of I X Y bits in encoding X compared to not knowing Y Mutual information is symmetric I X Y I Y X H X H Y H X Y displaystyle I X Y I Y X H X H Y H X Y Mutual information can be expressed as the average Kullback Leibler divergence information gain between the posterior probability distribution of X given the value of Y and the prior distribution on X I X Y Ep y DKL p X Y y p X displaystyle I X Y mathbb E p y D mathrm KL p X Y y p X In other words this is a measure of how much on the average the probability distribution on X will change if we are given the value of Y This is often recalculated as the divergence from the 
Mutual information is closely related to the log-likelihood ratio test in the context of contingency tables and the multinomial distribution and to Pearson's χ² test: mutual information can be considered a statistic for assessing independence between a pair of variables, and has a well-specified asymptotic distribution.

Kullback–Leibler divergence (information gain)
The Kullback–Leibler divergence (or information divergence, information gain, or relative entropy) is a way of comparing two distributions: a "true" probability distribution p(X), and an arbitrary probability distribution q(X). If we compress data in a manner that assumes q(X) is the distribution underlying some data, when, in reality, p(X) is the correct distribution, the Kullback–Leibler divergence is the number of average additional bits per datum necessary for compression. It is thus defined:

D_{\mathrm{KL}}(p(X) \,\|\, q(X)) = \sum_{x \in X} -p(x) \log q(x) - \sum_{x \in X} -p(x) \log p(x) = \sum_{x \in X} p(x) \log \frac{p(x)}{q(x)}

Although it is sometimes used as a "distance metric", KL divergence is not a true metric since it is not symmetric and does not satisfy the triangle inequality (making it a semi-quasimetric).

Another interpretation of the KL divergence is the "unnecessary surprise" introduced by a prior from the truth: suppose a number X is about to be drawn randomly from a discrete set with probability distribution p(x). If Alice knows the true distribution p(x), while Bob believes (has a prior) that the distribution is q(x), then Bob will be more surprised than Alice, on average, upon seeing the value of X. The KL divergence is the (objective) expected value of Bob's (subjective) surprisal minus Alice's surprisal, measured in bits if the log is in base 2. In this way, the extent to which Bob's prior is "wrong" can be quantified in terms of how "unnecessarily surprised" it is expected to make him.
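The "average additional bits per datum" reading, and the asymmetry that keeps KL divergence from being a metric, can both be seen directly in a few lines of Python (an added sketch; the distributions p and q are arbitrary examples):

    import math

    def kl_divergence_bits(p, q):
        # D_KL(p || q) in bits: the expected extra bits per symbol when data
        # from p is coded with a code optimized for q.
        # Assumes q[i] > 0 wherever p[i] > 0.
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    p = [0.5, 0.25, 0.25]   # "true" distribution (illustrative)
    q = [1 / 3, 1 / 3, 1 / 3]   # assumed distribution

    print(kl_divergence_bits(p, q))  # ~0.085 bits of overhead per symbol
    print(kl_divergence_bits(q, p))  # ~0.082 bits: not equal, so D_KL is not symmetric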
Directed information
Directed information, I(X^n \to Y^n), is an information theory measure that quantifies the information flow from the random process X^n = {X_1, X_2, \dots, X_n} to the random process Y^n = {Y_1, Y_2, \dots, Y_n}. The term directed information was coined by James Massey and is defined as

I(X^n \to Y^n) \triangleq \sum_{i=1}^{n} I(X^i; Y_i \mid Y^{i-1})

where I(X^i; Y_i \mid Y^{i-1}) is the conditional mutual information I(X_1, X_2, \dots, X_i; Y_i \mid Y_1, Y_2, \dots, Y_{i-1}).

In contrast to mutual information, directed information is not symmetric. The quantity I(X^n \to Y^n) measures the information bits that are transmitted causally from X^n to Y^n. Directed information has many applications in problems where causality plays an important role, such as the capacity of channels with feedback, the capacity of discrete memoryless networks with feedback, gambling with causal side information, compression with causal side information, real-time control communication settings, and statistical physics.

Other quantities
Other important information-theoretic quantities include the Rényi entropy and the Tsallis entropy (generalizations of the concept of entropy), differential entropy (a generalization of quantities of information to continuous distributions), and the conditional mutual information. Also, pragmatic information has been proposed as a measure of how much information has been used in making a decision.

Coding theory
(Figure: scratches on the readable surface of a CD-R. Music and data CDs are coded using error-correcting codes and thus can still be read even if they have minor scratches, using error detection and correction.)

Coding theory is one of the most important and direct applications of information theory. It can be subdivided into source coding theory and channel coding theory. Using a statistical description for data, information theory quantifies the number of bits needed to describe the data, which is the information entropy of the source.

Data compression (source coding): There are two formulations for the compression problem:
lossless data compression: the data must be reconstructed exactly;
lossy data compression: allocates bits needed to reconstruct the data, within a specified fidelity level measured by a distortion function. This subset of information theory is called rate–distortion theory.
Error-correcting codes (channel coding): While data compression removes as much redundancy as possible, an error-correcting code adds just the right kind of redundancy (i.e., error correction) needed to transmit the data efficiently and faithfully across a noisy channel.

This division of coding theory into compression and transmission is justified by the information transmission theorems, or source–channel separation theorems, that justify the use of bits as the universal currency for information in many contexts. However, these theorems only hold in the situation where one transmitting user wishes to communicate to one receiving user. In scenarios with more than one transmitter (the multiple-access channel), more than one receiver (the broadcast channel) or intermediary "helpers" (the relay channel), or more general networks, compression followed by transmission may no longer be optimal.

Source theory
Any process that generates successive messages can be considered a source of information. A memoryless source is one in which each message is an independent identically distributed random variable, whereas the properties of ergodicity and stationarity impose less restrictive constraints. All such sources are stochastic. These terms are well studied in their own right outside information theory.

Rate
Information rate is the average entropy per symbol. For memoryless sources, this is merely the entropy of each symbol, while, in the case of a stationary stochastic process, it is

r = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, X_{n-3}, \ldots);

that is, the conditional entropy of a symbol given all the previous symbols generated. For the more general case of a process that is not necessarily stationary, the average rate is

r = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \dots, X_n);

that is, the limit of the joint entropy per symbol. For stationary sources, these two expressions give the same result. The information rate is defined as

r = \lim_{n \to \infty} \frac{1}{n} I(X_1, X_2, \dots, X_n; Y_1, Y_2, \dots, Y_n).

It is common in information theory to speak of the "rate" or "entropy" of a language. This is appropriate, for example, when the source of information is English prose. The rate of a source of information is related to its redundancy and how well it can be compressed, the subject of source coding.
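As a small illustration of an information rate for a source with memory, the following Python sketch (added here; the two-symbol transition matrix is an invented example) computes the entropy rate of a stationary first-order Markov source. For such a source, the conditional entropy given the entire past reduces to the conditional entropy given the previous symbol, which is what the code evaluates.

    import math

    def h_bits(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Hypothetical two-symbol stationary Markov source (illustrative values only):
    # row i gives P(next symbol = j | current symbol = i).
    P = [[0.9, 0.1],   # from symbol 0
         [0.2, 0.8]]   # from symbol 1

    # Stationary distribution pi solves pi = pi P; for this two-state chain,
    # pi = (P[1][0], P[0][1]) / (P[0][1] + P[1][0]).
    pi = [0.2 / 0.3, 0.1 / 0.3]

    # Entropy rate r = sum_i pi_i * H(P[i]): the conditional entropy of the
    # next symbol given the current one, averaged over the stationary law.
    rate = sum(pi_i * h_bits(row) for pi_i, row in zip(pi, P))

    print(rate)        # ~0.553 bits per symbol
    print(h_bits(pi))  # ~0.918 bits: entropy of the marginal, an upper bound on the rate

The gap between the two printed values is the redundancy introduced by the source's memory, which is exactly what a good source code can exploit.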
Channel capacity
Communications over a channel is the primary motivation of information theory. However, channels often fail to produce exact reconstruction of a signal; noise, periods of silence and other forms of signal corruption often degrade quality.

Consider the communications process over a discrete channel. A simple model of the process is shown below:

Message W → Encoder f_n → Encoded sequence X^n → Channel p(y|x) → Received sequence Y^n → Decoder g_n → Estimated message Ŵ

Here X represents the space of messages transmitted, and Y the space of messages received during a unit time over our channel. Let p(y|x) be the conditional probability distribution function of Y given X. We will consider p(y|x) to be an inherent fixed property of our communications channel (representing the nature of the noise of our channel). Then the joint distribution of X and Y is completely determined by our channel and by our choice of f(x), the marginal distribution of messages we choose to send over the channel. Under these constraints, we would like to maximize the rate of information, or the signal, we can communicate over the channel. The appropriate measure for this is the mutual information, and this maximum mutual information is called the channel capacity and is given by:

C = \max_{f} I(X; Y).

This capacity has the following property related to communicating at information rate R (where R is usually bits per symbol). For any information rate R < C and coding error ε > 0, for large enough N, there exists a code of length N and rate ≥ R and a decoding algorithm, such that the maximal probability of block error is ≤ ε; that is, it is always possible to transmit with arbitrarily small block error. In addition, for any rate R > C, it is impossible to transmit with arbitrarily small block error.

Channel coding is concerned with finding such nearly optimal codes that can be used to transmit data over a noisy channel with a small coding error at a rate near the channel capacity.

Capacity of particular channel models
A continuous-time analog communications channel subject to Gaussian noise (see Shannon–Hartley theorem).
A binary symmetric channel (BSC) with crossover probability p is a binary-input, binary-output channel that flips the input bit with probability p. The BSC has a capacity of 1 - H_b(p) bits per channel use, where H_b is the binary entropy function to the base-2 logarithm.
A binary erasure channel (BEC) with erasure probability p is a binary-input, ternary-output channel. The possible channel outputs are 0, 1, and a third symbol 'e' called an erasure. The erasure represents complete loss of information about an input bit. The capacity of the BEC is 1 - p bits per channel use.
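The BSC capacity formula can be checked against the definition C = max_f I(X; Y) with a direct numerical search over input distributions. The Python sketch below (an added illustration; the crossover probability 0.11 is an arbitrary example) does this with a coarse grid search and compares the result to the closed forms for the BSC and BEC quoted above.

    import math

    def h2(p):
        # Binary entropy function H_b(p) in bits.
        return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bsc_mutual_info(q, p):
        # I(X; Y) for a BSC with crossover probability p and input P(X = 1) = q,
        # using I(X; Y) = H(Y) - H(Y|X) and H(Y|X) = H_b(p).
        p_y1 = q * (1 - p) + (1 - q) * p
        return h2(p_y1) - h2(p)

    p = 0.11  # illustrative crossover probability

    # Capacity is the maximum of I(X; Y) over input distributions;
    # a coarse grid search shows the maximum is attained at q = 1/2.
    cap_numeric = max(bsc_mutual_info(q / 1000, p) for q in range(1001))

    print(cap_numeric)   # ~0.5 bits per channel use
    print(1 - h2(p))     # closed form 1 - H_b(p), matching the search
    print(1 - p)         # capacity of a BEC with erasure probability p, for comparison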
Channels with memory and directed information
In practice many channels have memory. Namely, at time i the channel is given by the conditional probability P(y_i \mid x_i, x_{i-1}, x_{i-2}, \dots, x_1, y_{i-1}, y_{i-2}, \dots, y_1). It is often more convenient to use the notation x^i = (x_i, x_{i-1}, x_{i-2}, \dots, x_1), in which case the channel becomes P(y_i \mid x^i, y^{i-1}). In such a case, the capacity is given by the mutual information rate when there is no feedback available, and by the directed information rate whether or not there is feedback (if there is no feedback, the directed information equals the mutual information).

Fungible information
Fungible information is the information for which the means of encoding is not important. Classical information theorists and computer scientists are mainly concerned with information of this sort. It is sometimes referred to as speakable information.

Applications to other fields

Intelligence uses and secrecy applications
Information theoretic concepts apply to cryptography and cryptanalysis. Turing's information unit, the ban, was used in the Ultra project, breaking the German Enigma machine code and hastening the end of World War II in Europe. Shannon himself defined an important concept now called the unicity distance. Based on the redundancy of the plaintext, it attempts to give a minimum amount of ciphertext necessary to ensure unique decipherability.

Information theory leads us to believe it is much more difficult to keep secrets than it might first appear. A brute force attack can break systems based on asymmetric key algorithms or on most commonly used methods of symmetric key algorithms (sometimes called secret key algorithms), such as block ciphers. The security of all such methods comes from the assumption that no known attack can break them in a practical amount of time.

Information theoretic security refers to methods such as the one-time pad that are not vulnerable to such brute force attacks. In such cases, the positive conditional mutual information between the plaintext and ciphertext (conditioned on the key) can ensure proper transmission, while the unconditional mutual information between the plaintext and ciphertext remains zero, resulting in absolutely secure communications. In other words, an eavesdropper would not be able to improve his or her guess of the plaintext by gaining knowledge of the ciphertext but not of the key. However, as in any other cryptographic system, care must be used to correctly apply even information-theoretically secure methods; the Venona project was able to crack the one-time pads of the Soviet Union due to their improper reuse of key material.

Pseudorandom number generation
Pseudorandom number generators are widely available in computer language libraries and application programs. They are, almost universally, unsuited to cryptographic use as they do not evade the deterministic nature of modern computer equipment and software. A class of improved random number generators is termed cryptographically secure pseudorandom number generators, but even they require random seeds external to the software to work as intended. These can be obtained via extractors, if done carefully. The measure of sufficient randomness in extractors is min-entropy, a value related to Shannon entropy through Rényi entropy; Rényi entropy is also used in evaluating randomness in cryptographic systems. Although related, the distinctions among these measures mean that a random variable with high Shannon entropy is not necessarily satisfactory for use in an extractor and so for cryptographic uses.
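The gap between Shannon entropy and min-entropy is easy to exhibit concretely. The Python sketch below (an added illustration; the distribution over 1025 outcomes is invented for this purpose) constructs a source whose average-case uncertainty looks healthy while its worst-case guessability, which is what matters for extractors, is poor.

    import math

    def shannon_entropy_bits(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def min_entropy_bits(probs):
        # Min-entropy H_min = -log2(max_i p_i), the relevant measure for extractors.
        return -math.log2(max(probs))

    # Illustrative distribution: one outcome has probability 1/2, and 1024
    # further outcomes share the remaining half uniformly.
    probs = [0.5] + [0.5 / 1024] * 1024

    print(shannon_entropy_bits(probs))  # 6.0 bits: high average uncertainty
    print(min_entropy_bits(probs))      # 1.0 bit: an adversary guesses right half the time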
Seismic exploration
One early commercial application of information theory was in the field of seismic oil exploration. Work in this field made it possible to strip off and separate the unwanted noise from the desired seismic signal. Information theory and digital signal processing offer a major improvement of resolution and image clarity over previous analog methods.

Semiotics
Semioticians Doede Nauta and Winfried Nöth both considered Charles Sanders Peirce as having created a theory of information in his works on semiotics. Nauta defined semiotic information theory as the study of "the internal processes of coding, filtering, and information processing."

Concepts from information theory such as redundancy and code control have been used by semioticians such as Umberto Eco and Ferruccio Rossi-Landi to explain ideology as a form of message transmission whereby a dominant social class emits its message by using signs that exhibit a high degree of redundancy such that only one message is decoded among a selection of competing ones.

Integrated process organization of neural information
Quantitative information theoretic methods have been applied in cognitive science to analyze the integrated process organization of neural information in the context of the binding problem in cognitive neuroscience. In this context, either an information-theoretical measure, such as functional clusters (Gerald Edelman and Giulio Tononi's functional clustering model and dynamic core hypothesis (DCH)) or effective information (Tononi's integrated information theory (IIT) of consciousness), is defined on the basis of a reentrant process organization, i.e. the synchronization of neurophysiological activity between groups of neuronal populations; or the measure of the minimization of free energy is defined on the basis of statistical methods (Karl J. Friston's free energy principle (FEP), an information-theoretical measure which states that every adaptive change in a self-organized system leads to a minimization of free energy, and the Bayesian brain hypothesis).

Miscellaneous applications
Information theory also has applications in the search for extraterrestrial intelligence, black holes, bioinformatics, and gambling.

See also
Algorithmic probability; Bayesian inference; Communication theory; Constructor theory (a generalization of information theory that includes quantum information); Formal science; Inductive probability; Info-metrics; Minimum message length; Minimum description length; Philosophy of information
Applications: Active networking; Cryptanalysis; Cryptography; Cybernetics; Entropy in thermodynamics and information theory; Gambling; Intelligence (information gathering); Seismic exploration
History: Hartley, R.V.L.; History of information theory; Shannon, C.E.; Timeline of information theory; Yockey, H.P.; Andrey Kolmogorov
Theory: Coding theory; Detection theory; Estimation theory; Fisher information; Information algebra; Information asymmetry; Information field theory; Information geometry; Information theory and measure theory; Kolmogorov complexity; List of unsolved problems in information theory; Logic of information; Network coding; Philosophy of information; Quantum information science; Source coding
Concepts: Ban (unit); Channel capacity; Communication channel; Communication source; Conditional entropy; Covert channel; Data compression; Decoder; Differential entropy; Fungible information; Information fluctuation complexity; Information entropy; Joint entropy; Kullback–Leibler divergence; Mutual information; Pointwise mutual information (PMI); Receiver
information theory Redundancy Renyi entropy Self information Unicity distance Variety Hamming distance PerplexityReferencesSchneider Thomas D 2006 Claude Shannon Biologist IEEE Engineering in Medicine and Biology Magazine The Quarterly Magazine of the Engineering in Medicine amp Biology Society 25 1 30 33 doi 10 1109 memb 2006 1578661 ISSN 0739 5175 PMC 1538977 PMID 16485389 Cruces Sergio Martin Clemente Ruben Samek Wojciech 2019 07 03 Information Theory Applications in Signal Processing Entropy 21 7 653 Bibcode 2019Entrp 21 653C doi 10 3390 e21070653 ISSN 1099 4300 PMC 7515149 PMID 33267367 Baleanu D Balas Valentina Emilia Agarwal Praveen eds 2023 Fractional Order Systems and Applications in Engineering Advanced Studies in Complex Systems London United Kingdom Academic Press p 23 ISBN 978 0 323 90953 2 OCLC 1314337815 Horgan John 2016 04 27 Claude Shannon Tinkerer Prankster and Father of Information Theory IEEE Retrieved 2024 11 08 Shi Zhongzhi 2011 Advanced Artificial Intelligence World Scientific Publishing p 2 doi 10 1142 7547 ISBN 978 981 4291 34 7 Sinha Sudhi Al Huraimel Khaled 2020 10 20 Reimagining Businesses with AI 1 ed Wiley p 4 doi 10 1002 9781119709183 ISBN 978 1 119 70915 2 Burnham K P Anderson D R 2002 Model Selection and Multimodel Inference A Practical Information Theoretic Approach Second ed New York Springer Science ISBN 978 0 387 95364 9 F Rieke D Warland R Ruyter van Steveninck W Bialek 1997 Spikes Exploring the Neural Code The MIT press ISBN 978 0262681087 Delgado Bonal Alfonso Martin Torres Javier 2016 11 03 Human vision is determined based on information theory Scientific Reports 6 1 36038 Bibcode 2016NatSR 636038D doi 10 1038 srep36038 ISSN 2045 2322 PMC 5093619 PMID 27808236 cf Huelsenbeck J P Ronquist F Nielsen R Bollback J P 2001 Bayesian inference of phylogeny and its impact on evolutionary biology Science 294 5550 2310 2314 Bibcode 2001Sci 294 2310H doi 10 1126 science 1065889 PMID 11743192 S2CID 2138288 Allikmets Rando Wasserman Wyeth W Hutchinson Amy Smallwood Philip Nathans Jeremy Rogan Peter K 1998 Thomas D Schneider Michael Dean 1998 Organization of the ABCR gene analysis of promoter and splice junction sequences Gene 215 1 111 122 doi 10 1016 s0378 1119 98 00269 8 PMID 9666097 Jaynes E T 1957 Information Theory and Statistical Mechanics Phys Rev 106 4 620 Bibcode 1957PhRv 106 620J doi 10 1103 physrev 106 620 S2CID 17870175 Talaat Khaled Cowen Benjamin Anderoglu Osman 2020 10 05 Method of information entropy for convergence assessment of molecular dynamics simulations Journal of Applied Physics 128 13 135102 Bibcode 2020JAP 128m5102T doi 10 1063 5 0019078 OSTI 1691442 S2CID 225010720 Bennett Charles H Li Ming Ma Bin 2003 Chain Letters and Evolutionary Histories Scientific American 288 6 76 81 Bibcode 2003SciAm 288f 76B doi 10 1038 scientificamerican0603 76 PMID 12764940 Archived from the original on 2007 10 07 Retrieved 2008 03 11 David R Anderson November 1 2003 Some background on why people in the empirical sciences may want to better understand the information theoretic methods PDF Archived from the original PDF on July 23 2011 Retrieved 2010 06 23 Loy D Gareth 2017 Pareyon Gabriel Pina Romero Silvia Agustin Aquino Octavio A Lluis Puebla Emilio eds Music Expectation and Information Theory The Musical Mathematical Mind Patterns and Transformations Computational Music Science Cham Springer International Publishing pp 161 169 doi 10 1007 978 3 319 47337 6 17 ISBN 978 3 319 47337 6 retrieved 2024 09 19 Rocamora Martin Cancela Pablo Biscainho Luiz 2019 04 
05 Information Theory Concepts Applied to the Analysis of Rhythm in Recorded Music with Recurrent Rhythmic Patterns Journal of the Audio Engineering Society 67 4 160 173 doi 10 17743 jaes 2019 0003 Marsden Alan 2020 New Prospects for Information Theory in Arts Research Leonardo 53 3 274 280 doi 10 1162 leon a 01860 ISSN 0024 094X Pinkard Henry Kabuli Leyla Markley Eric Chien Tiffany Jiao Jiantao Waller Laura 2024 Universal evaluation and design of imaging systems using information estimation arXiv 2405 20559 physics optics Wing Simon Johnson Jay R 2019 02 01 Applications of Information Theory in Solar and Space Physics Entropy 21 2 140 Bibcode 2019Entrp 21 140W doi 10 3390 e21020140 ISSN 1099 4300 PMC 7514618 PMID 33266856 Kak Subhash 2020 11 26 Information theory and dimensionality of space Scientific Reports 10 1 20733 doi 10 1038 s41598 020 77855 9 ISSN 2045 2322 PMC 7693271 PMID 33244156 Harms William F 1998 The Use of Information Theory in Epistemology Philosophy of Science 65 3 472 501 doi 10 1086 392657 ISSN 0031 8248 JSTOR 188281 Gleick 2011 pp 3 4 Horgan John 2016 04 27 Claude Shannon Tinkerer Prankster and Father of Information Theory IEEE Retrieved 2023 09 30 Roberts Siobhan 2016 04 30 The Forgotten Father of the Information Age The New Yorker ISSN 0028 792X Retrieved 2023 09 30 Tse David 2020 12 22 How Claude Shannon Invented the Future Quanta Magazine Retrieved 2023 09 30 Braverman Mark September 19 2011 Information Theory in Computer Science PDF Reza 1994 Ash 1990 Massey James 1990 Causality Feedback And Directed Information Proc 1990 Intl Symp on Info Th and its Applications CiteSeerX 10 1 1 36 5688 Permuter Haim Henry Weissman Tsachy Goldsmith Andrea J February 2009 Finite State Channels With Time Invariant Deterministic Feedback IEEE Transactions on Information Theory 55 2 644 662 arXiv cs 0608070 doi 10 1109 TIT 2008 2009849 S2CID 13178 Kramer G January 2003 Capacity results for the discrete memoryless network IEEE Transactions on Information Theory 49 1 4 21 doi 10 1109 TIT 2002 806135 Permuter Haim H Kim Young Han Weissman Tsachy June 2011 Interpretations of Directed Information in Portfolio Theory Data Compression and Hypothesis Testing IEEE Transactions on Information Theory 57 6 3248 3259 arXiv 0912 4872 doi 10 1109 TIT 2011 2136270 S2CID 11722596 Simeone Osvaldo Permuter Haim Henri June 2013 Source Coding When the Side Information May Be Delayed IEEE Transactions on Information Theory 59 6 3607 3618 arXiv 1109 1293 doi 10 1109 TIT 2013 2248192 S2CID 3211485 Charalambous Charalambos D Stavrou Photios A August 2016 Directed Information on Abstract Spaces Properties and Variational Equalities IEEE Transactions on Information Theory 62 11 6019 6052 arXiv 1302 3971 doi 10 1109 TIT 2016 2604846 S2CID 8107565 Tanaka Takashi Esfahani Peyman Mohajerin Mitter Sanjoy K January 2018 LQG Control With Minimum Directed Information Semidefinite Programming Approach IEEE Transactions on Automatic Control 63 1 37 52 arXiv 1510 04214 doi 10 1109 TAC 2017 2709618 S2CID 1401958 Archived from the original on Apr 12 2024 via TU Delft Repositories Vinkler Dror A Permuter Haim H Merhav Neri 20 April 2016 Analogy between gambling and measurement based work extraction Journal of Statistical Mechanics Theory and Experiment 2016 4 043403 arXiv 1404 6788 Bibcode 2016JSMTE 04 3403V doi 10 1088 1742 5468 2016 04 043403 S2CID 124719237 Jerry D Gibson 1998 Digital Compression for Multimedia Principles and Standards Morgan Kaufmann ISBN 1 55860 369 7 Permuter Haim Henry Weissman Tsachy Goldsmith 
Andrea J February 2009 Finite State Channels With Time Invariant Deterministic Feedback IEEE Transactions on Information Theory 55 2 644 662 arXiv cs 0608070 doi 10 1109 TIT 2008 2009849 S2CID 13178 Bartlett Stephen D Rudolph Terry Spekkens Robert W April June 2007 Reference frames superselection rules and quantum information Reviews of Modern Physics 79 2 555 606 arXiv quant ph 0610030 Bibcode 2007RvMP 79 555B doi 10 1103 RevModPhys 79 555 Peres A P F Scudo 2002b A Khrennikov ed Quantum Theory Reconsideration of Foundations Vaxjo University Press Vaxjo Sweden p 283 Haggerty Patrick E 1981 The corporation and innovation Strategic Management Journal 2 2 97 118 doi 10 1002 smj 4250020202 Nauta Doede 1972 The Meaning of Information The Hague Mouton ISBN 9789027919960 Noth Winfried January 2012 Charles S Peirce s theory of information a theory of the growth of symbols and of knowledge Cybernetics and Human Knowing 19 1 2 137 161 Noth Winfried 1981 Semiotics of ideology Semiotica Issue 148 Maurer H 2021 Chapter 10 Systematic Class of Information Based Architecture Types Cognitive Science Integrative Synchronization Mechanisms in Cognitive Neuroarchitectures of the Modern Connectionism Boca Raton FL CRC Press doi 10 1201 9781351043526 ISBN 978 1 351 04352 6 Edelman G M Tononi G 2000 A Universe of Consciousness How Matter Becomes Imagination New York Basic Books ISBN 978 0465013777 Tononi G Sporns O 2003 Measuring information integration BMC Neuroscience 4 1 20 doi 10 1186 1471 2202 4 31 PMC 331407 PMID 14641936 Tononi G 2004a An information integration theory of consciousness BMC Neuroscience 5 1 22 doi 10 1186 1471 2202 5 42 PMC 543470 PMID 15522121 Tononi G 2004b Consciousness and the brain theoretical aspects In Adelman G Smith B eds Encyclopedia of Neuroscience 3rd ed Amsterdam Oxford Elsevier ISBN 0 444 51432 5 Archived PDF from the original on 2023 12 02 Friston K Stephan K E 2007 Free energy and the brain Synthese 159 3 417 458 doi 10 1007 s11229 007 9237 y PMC 2660582 PMID 19325932 Friston K 2010 The free energy principle a unified brain theory Nature Reviews Neuroscience 11 2 127 138 doi 10 1038 nrn2787 PMID 20068583 Friston K Breakstear M Deco G 2012 Perception and self organized instability Frontiers in Computational Neuroscience 6 1 19 doi 10 3389 fncom 2012 00044 PMC 3390798 PMID 22783185 Friston K 2013 Life as we know it Journal of the Royal Society Interface 10 86 20130475 doi 10 1098 rsif 2013 0475 PMC 3730701 PMID 23825119 Kirchhoff M Parr T Palacios E Friston K Kiverstein J 2018 The Markov blankets of life autonomy active inference and the free energy principle Journal of the Royal Society Interface 15 138 20170792 doi 10 1098 rsif 2017 0792 PMC 5805980 PMID 29343629 Doyle Laurance R McCowan Brenda Johnston Simon Hanser Sean F February 2011 Information theory animal communication and the search for extraterrestrial intelligence Acta Astronautica 68 3 4 406 417 Bibcode 2011AcAau 68 406D doi 10 1016 j actaastro 2009 11 018 Bekenstein Jacob D 2004 Black holes and information theory Contemporary Physics 45 1 31 43 arXiv quant ph 0311049 Bibcode 2004ConPh 45 31B doi 10 1080 00107510310001632523 ISSN 0010 7514 Vinga Susana 2014 05 01 Information theory applications for biological sequence analysis Briefings in Bioinformatics 15 3 376 389 doi 10 1093 bib bbt068 ISSN 1467 5463 PMC 7109941 PMID 24058049 Thorp Edward O 2008 01 01 Zenios S A Ziemba W T eds The kelly criterion in blackjack sports betting and the stock market Handbook of Asset and Liability Management San Diego North 
Holland pp 385 428 doi 10 1016 b978 044453248 0 50015 0 ISBN 978 0 444 53248 0 retrieved 2025 01 20 Haigh John 2000 The Kelly Criterion and Bet Comparisons in Spread Betting Journal of the Royal Statistical Society Series D The Statistician 49 4 531 539 doi 10 1111 1467 9884 00251 ISSN 1467 9884 Further readingThe classic work Shannon C E 1948 A Mathematical Theory of Communication Bell System Technical Journal 27 pp 379 423 amp 623 656 July amp October 1948 PDF Notes and other formats R V L Hartley Transmission of Information Bell System Technical Journal July 1928 Andrey Kolmogorov 1968 Three approaches to the quantitative definition of information in International Journal of Computer Mathematics 2 pp 157 168 Other journal articles J L Kelly Jr Princeton A New Interpretation of Information Rate Bell System Technical Journal Vol 35 July 1956 pp 917 26 R Landauer IEEE org Information is Physical Proc Workshop on Physics and Computation PhysComp 92 IEEE Comp Sci Press Los Alamitos 1993 pp 1 4 Landauer R 1961 Irreversibility and Heat Generation in the Computing Process PDF IBM J Res Dev 5 3 183 191 doi 10 1147 rd 53 0183 Timme Nicholas Alford Wesley Flecker Benjamin Beggs John M 2012 Multivariate information measures an experimentalist s perspective arXiv 1111 6857 cs IT Textbooks on information theory Alajaji F and Chen P N An Introduction to Single User Information Theory Singapore Springer 2018 ISBN 978 981 10 8000 5 Arndt C Information Measures Information and its Description in Science and Engineering Springer Series Signals and Communication Technology 2004 ISBN 978 3 540 40855 0 Ash Robert B 1990 1965 Information Theory New York Dover Publications Inc ISBN 0 486 66521 6 Gallager R Information Theory and Reliable Communication New York John Wiley and Sons 1968 ISBN 0 471 29048 3 Goldman S Information Theory New York Prentice Hall 1953 New York Dover 1968 ISBN 0 486 62209 6 2005 ISBN 0 486 44271 3 Cover Thomas Thomas Joy A 2006 Elements of information theory 2nd ed New York Wiley Interscience ISBN 0 471 24195 4 Csiszar I Korner J Information Theory Coding Theorems for Discrete Memoryless Systems Akademiai Kiado 2nd edition 1997 ISBN 963 05 7440 3 MacKay David J C Information Theory Inference and Learning Algorithms Cambridge Cambridge University Press 2003 ISBN 0 521 64298 1 Mansuripur M Introduction to Information Theory New York Prentice Hall 1987 ISBN 0 13 484668 0 McEliece R The Theory of Information and Coding Cambridge 2002 ISBN 978 0521831857 Pierce JR An introduction to information theory symbols signals and noise Dover 2nd Edition 1961 reprinted by Dover 1980 Reza Fazlollah M 1994 1961 An Introduction to Information Theory New York Dover Publications Inc ISBN 0 486 68210 2 Shannon Claude Weaver Warren 1949 The Mathematical Theory of Communication PDF Urbana Illinois University of Illinois Press ISBN 0 252 72548 4 LCCN 49 11922 Stone JV Chapter 1 of book Information Theory A Tutorial Introduction University of Sheffield England 2014 ISBN 978 0956372857 Yeung RW A First Course in Information Theory Kluwer Academic Plenum Publishers 2002 ISBN 0 306 46791 7 Yeung RW Information Theory and Network Coding Springer 2008 2002 ISBN 978 0 387 79233 0 Other books Leon Brillouin Science and Information Theory Mineola N Y Dover 1956 1962 2004 ISBN 0 486 43918 6 Gleick James 2011 The Information A History a Theory a Flood 1st ed New York Vintage Books ISBN 978 1 4000 9623 7 A I Khinchin Mathematical Foundations of Information Theory New York Dover 1957 ISBN 0 486 60434 9 H S Leff and A F Rex 
Editors, Maxwell's Demon: Entropy, Information, Computing, Princeton University Press, Princeton, New Jersey (1990). ISBN 0-691-08727-X
Robert K. Logan, What is Information? Propagating Organization in the Biosphere, the Symbolosphere, the Technosphere and the Econosphere, Toronto: DEMO Publishing.
Tom Siegfried, The Bit and the Pendulum, Wiley, 2000. ISBN 0-471-32174-5
Charles Seife, Decoding the Universe, Viking, 2006. ISBN 0-670-03441-X
Jeremy Campbell, Grammatical Man, Touchstone/Simon & Schuster, 1982. ISBN 0-671-44062-4
Henri Theil, Economics and Information Theory, Rand McNally & Company, Chicago, 1967.
Escolano, Suau, Bonev, Information Theory in Computer Vision and Pattern Recognition, Springer, 2009. ISBN 978-1-84882-296-2
Vlatko Vedral, Decoding Reality: The Universe as Quantum Information, Oxford University Press, 2010. ISBN 0-19-923769-7

External links
"Information", Encyclopedia of Mathematics, EMS Press, 2001 [1994]
Lambert, F. L. (1999), "Shuffled Cards, Messy Desks, and Disorderly Dorm Rooms – Examples of Entropy Increase? Nonsense!", Journal of Chemical Education
IEEE Information Theory Society and ITSOC Monographs, Surveys, and Reviews (archived 2018-06-12)