
# Amount of information


**Information content** or **entropy** (English: entropy) is a concept in information theory: a measure of how unlikely an event is to occur. Learning that a commonplace event has occurred (for example, "the wind is blowing") conveys little information, whereas learning that a rare event has occurred (for example, "someone is playing a song") is regarded as conveying more. Information content can thus also be viewed as a measure of how much information an event essentially carries.

The "information" meant here is a mathematical quantity determined solely by how unlikely the event is (its probability), and has nothing to do with its usefulness to individuals or society. For example, comparing "I won the lottery" with "Mr. A, a stranger, won the lottery", the former seems the more useful news, but the information content of the two is exactly the same (because, under given conditions, the probability of winning the lottery is the same for everyone).

## Self-information (self-entropy) and average information content (entropy)

Besides the information content of each individual event, the average of the information content over all events is also called "information content". When the two must be distinguished, the former is called the **self-information** (or **self-entropy**) and the latter the **average information content** (or **entropy**).

## Self-information

When an event E occurs with probability P(E), the **(self-)information** I(E) received upon learning that E has occurred is defined as

I(E) = −log P(E) = log (1 / P(E)).

The less likely an event is to occur (i.e., the lower its probability), the larger its information content.

Whatever base is chosen for the logarithm in the formula above, the information content changes only by a constant factor, so the choice makes no essential difference; base 2 is the most common choice.

With base 2, an event that occurs with probability 1/2^n has information content n.

### Intuitive meaning

For a positive integer N, log_k N is approximately the number of digits of N when written in base k. Hence the information content −log_k p of an event that occurs with probability p is approximately the number of base-k digits of 1/p.
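As a quick numerical check, this is a minimal Python sketch (the function name `self_information` is illustrative, not from the source):

```python
import math

def self_information(p, base=2):
    """Self-information -log_base(p) of an event that occurs with probability p."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log(p, base)

# An event with probability 1/2**10 carries 10 bits of information,
# matching "roughly the number of binary digits of 1/p = 1024".
print(self_information(1 / 2**10))
# In base 10 the same intuition gives decimal digits: 1/p = 1000 has 3 of them.
print(self_information(1 / 1000, base=10))
```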

### Additivity of information content

Information content is additive: for independent events A and B, the information content of the event "both A and B occur" is the sum of the information content of A and that of B. This follows from the definition, since P(A ∩ B) = P(A)P(B) implies −log P(A ∩ B) = −log P(A) − log P(B).

For example, consider drawing one card at random from a deck of 52 playing cards. By the definition above, the information content of the event "the card drawn is the 4 of hearts" is log 52. Now consider the two events "the suit of the card drawn is hearts" and "the rank of the card drawn is 4": the former carries log 4 and the latter log 13. Their sum is log 4 + log 13 = log (4 × 13) = log 52, which equals the information content of "the card drawn is the 4 of hearts". This meets the intuitive requirement that "independent pieces of information sum to the total information content".
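The card computation can be checked directly with a small Python sketch:

```python
import math

# Information content (base 2) for drawing one card at random from 52.
suit_info = math.log2(4)    # "the suit is hearts": probability 1/4
rank_info = math.log2(13)   # "the rank is 4": probability 1/13
card_info = math.log2(52)   # "the card is the 4 of hearts": probability 1/52

# Suit and rank are independent, so their information contents add up:
# log 4 + log 13 = log 52.
print(suit_info + rank_info, card_info)
```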

### Derivation

The intuitive requirements on a measure of information f(p) are: the lower the probability of occurrence, the larger the value (**monotonic decrease**); the value varies continuously with the probability (**continuity**); and the information content of independent joint events equals the sum of the information contents of the marginal events (**additivity**). Using Cauchy's functional equation, one can show that the functions satisfying these three conditions are exactly f(p) = −C log p for a constant C > 0, so the definition of information content is uniquely determined (up to the constant) by the three conditions. Typically the constant is fixed so that the base of the logarithm is 2, i.e. so that f(1/2) = 1.

## Average information content (entropy)

Let the sample space be a finite set Ω, and let P be a probability distribution on Ω. The expected value of the self-information of the events A ∈ Ω,

H(P) = −Σ_{A∈Ω} P(A) log P(A),

is called the **entropy** of P (also the **average information content**, the **Shannon information content**, or the **information-theoretic entropy**).

Here, when P(A) = 0, the term P(A) log P(A) is taken to be 0. This convention is justified by lim_{x→+0} x log x = 0.

Also, if a random variable X taking values in the finite set Ω follows the probability distribution P, the **entropy** of X is defined by H(X) = H(P). That is,

- H(X) = −Σ_{x∈Ω} Pr[X = x] log Pr[X = x].

Entropy always takes a non-negative value (or infinity).
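The definition translates directly into code; in this minimal Python sketch (`entropy` is an illustrative name), the `p > 0` filter implements the convention 0 log 0 = 0:

```python
import math

def entropy(dist, base=2):
    """Shannon entropy H(P) = -sum p*log(p); terms with p = 0 contribute 0."""
    return -sum(p * math.log(p, base) for p in dist if p > 0)

print(entropy([0.5, 0.5]))    # 1 bit: a fair coin
print(entropy([0.25] * 4))    # 2 bits: uniform over four outcomes
print(entropy([1.0, 0.0]))    # 0: a certain outcome carries no information
```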

If *x*, *y* are the values taken by random variables *X*, *Y*, the pair (*X*, *Y*) can itself be regarded as a random variable. Its entropy is

H(X, Y) = −Σ_{x,y} Pr[X = x, Y = y] log Pr[X = x, Y = y].

This is called the **joint entropy** of *X* and *Y*.

If *X* and *Y* are mutually independent random variables, H(X, Y) coincides with H(X) + H(Y). That is, the total information content is the sum of the information contents of the individual random variables.

However, if *X* and *Y* are not mutually independent, H(X, Y) and H(X) + H(Y) do not coincide, and the latter is larger than the former. The difference between the two is called the **mutual information**,

I(X; Y) = H(X) + H(Y) − H(X, Y).

Mutual information is always non-negative.

The **conditional information content** of event *A* under the condition that event *B* has occurred is defined by −log Pr[A | B]. Given a random variable *X*, the average over *x* of the conditional information content of the events "X = x",

H(X | B) = −Σ_x Pr[X = x | B] log Pr[X = x | B],

is called the **conditional entropy** of *X* given *B*.

Further, given a random variable *Y*, the average over *y* of the conditional entropy of *X* under the condition "Y = y has occurred",

H(X | Y) = Σ_y Pr[Y = y] H(X | Y = y),

is likewise called the **conditional entropy** of *X* given *Y*.

### Basic properties of entropy

- Information content depends only on the probability of the event.
- Information content takes a non-negative value or infinity.
- When a bit string is chosen at random (not necessarily uniformly) from the space of n-bit strings (an information source), the entropy is at most n. The entropy equals n if and only if the bit string is chosen uniformly at random.
- Random variables X and Y are independent if and only if H(X, Y) = H(X) + H(Y), i.e. I(X; Y) = 0.
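The last property can be checked numerically. The sketch below (illustrative Python, not from the source) computes I(X; Y) = H(X) + H(Y) − H(X, Y) for a dependent pair and for an independent pair:

```python
import math

def H(probs):
    """Shannon entropy (base 2); terms with probability 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Joint distribution Pr[X = x, Y = y] of two dependent binary variables.
joint = [[0.4, 0.1],
         [0.1, 0.4]]
px = [sum(row) for row in joint]            # marginal distribution of X
py = [sum(col) for col in zip(*joint)]      # marginal distribution of Y
flat = [p for row in joint for p in row]

mi = H(px) + H(py) - H(flat)                # mutual information I(X;Y)
print(mi > 0)                               # True: X and Y are dependent

# Product distribution: for independent variables the difference vanishes.
indep = [pa * pb for pa in px for pb in py]
print(abs(H(px) + H(py) - H(indep)) < 1e-9) # True: I(X;Y) = 0
```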

### Example: tossing a coin

Suppose a tossed coin comes up heads with probability p and tails with probability 1 − p. The average information content (entropy) obtained by tossing this coin is

H(p) = −p log p − (1 − p) log (1 − p).

This function H(p) is called the **entropy function**.

At p = 0 and p = 1 the entropy is zero: when it is already known before the toss which side will come up, the average information gained by tossing the coin is zero. The entropy is maximal at p = 1/2, and in general the entropy is maximized when all outcomes (events) are equally probable.
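A minimal Python sketch of the entropy function (the name `binary_entropy` is illustrative):

```python
import math

def binary_entropy(p):
    """Entropy function H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))   # 1.0 bit: a fair coin is maximally unpredictable
print(binary_entropy(0.0))   # 0.0: the outcome is known before the toss
print(binary_entropy(0.9))   # a biased coin yields less than 1 bit on average
```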

## Entropy of continuous systems

For a random variable *X* taking real values with probability density function *p*(*x*), the entropy of *X* is defined by

h(X) = −∫ p(x) log p(x) dx.

If *X* is a random variable taking values in a finite set, its Shannon information content H(X) can also be defined. When *X* takes *n* distinct values with distribution *P*, H(X) and the Kullback–Leibler divergence D(P ‖ U_n) satisfy

H(X) = log n − D(P ‖ U_n),

where U_n is the uniform distribution on the *n*-element set (i.e. U_n(x) = 1/n for every x).
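The identity H(P) = log n − D(P ‖ U_n), relating discrete entropy to the divergence from the uniform distribution, can be verified numerically (a Python sketch; `kl` denotes the Kullback–Leibler divergence):

```python
import math

def H(probs):
    """Shannon entropy (base 2)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def kl(p, q):
    """Kullback-Leibler divergence D(P||Q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

P = [0.5, 0.25, 0.125, 0.125]
n = len(P)
U = [1 / n] * n   # uniform distribution on the same 4-element set

print(H(P))                     # 1.75 bits
print(math.log2(n) - kl(P, U))  # 1.75 bits: log n - D(P||U) equals H(P)
```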

## Rényi entropy

Let the sample space be a finite set Ω, let *P* be a probability distribution on Ω, and let α be a non-negative real number with α ≠ 1.

The **Rényi entropy of degree α** of *P* is defined by

H_α(P) = (1 / (1 − α)) log Σ_{A∈Ω} P(A)^α.

For α = 0, 1, ∞, the Rényi entropy is defined by the limits

H_0(P) = log |{A : P(A) > 0}|,  H_1(P) = lim_{α→1} H_α(P),  H_∞(P) = lim_{α→∞} H_α(P).

When one says simply **Rényi entropy**, H_2 is often meant.

Furthermore, if a random variable *X* follows the probability distribution *P*, H_α(X) is defined by H_α(X) = H_α(P).

The Rényi entropy satisfies the following properties:

- H_α(P) ≥ H_β(P) holds for α ≤ β.
- H_1(P) coincides with the Shannon information content H(P).
- If α is an integer greater than or equal to 2, H_α(P) = (1 / (1 − α)) log Pr[X_1 = X_2 = … = X_α] holds. Here X_1, …, X_α are independent random variables each chosen according to the probability distribution P, and Pr[X_1 = … = X_α] is the probability that they all take the same value.
- H_∞(P) = −log max_{A∈Ω} P(A) holds. H_∞(P) is also called the
**min-entropy**.
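The properties above can be checked with a short sketch (illustrative Python; `renyi_entropy` handles the limiting cases α = 1 and α = ∞ explicitly):

```python
import math

def renyi_entropy(probs, alpha):
    """Renyi entropy of degree alpha, base 2."""
    if alpha == 1:          # limit alpha -> 1: Shannon entropy
        return -sum(p * math.log2(p) for p in probs if p > 0)
    if math.isinf(alpha):   # limit alpha -> infinity: min-entropy
        return -math.log2(max(probs))
    return math.log2(sum(p ** alpha for p in probs if p > 0)) / (1 - alpha)

P = [0.5, 0.25, 0.25]
values = [renyi_entropy(P, a) for a in (0, 1, 2, math.inf)]
print(values)  # non-increasing in alpha: H_0 >= H_1 >= H_2 >= H_inf

# H_0 is the Hartley entropy log |support|; H_inf is the min-entropy.
assert values == sorted(values, reverse=True)
```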

## History

The concept of "entropy" was introduced in 1865 by Rudolf Clausius, from a Greek word meaning "transformation", as a state quantity of a gas in thermodynamics. In statistical mechanics it is expressed as a quantity proportional to the logarithm of the number of microscopic states. In 1929, Leó Szilárd showed that there is a direct relationship between an observer's acquisition of information about a gas and the entropy of statistical mechanics, deriving the relation that what is now called 1 bit (1 shannon) corresponds to the statistical-mechanical entropy *k* ln 2.^{[1]}

Entropy was introduced directly into present-day information theory by Claude Shannon's 1948 paper "A Mathematical Theory of Communication", which applied the concept of entropy to information theory.^{[2]} Shannon arrived at this definition without knowing that a related concept was already in use in statistical thermodynamics. When he was pondering what to call it, he consulted von Neumann, who is said to have recommended the name on the grounds that it resembles the entropy of statistical thermodynamics, and that "since few people understand what statistical entropy really is, you will have the advantage in any debate".^{[3]}^{[4]} Shannon acknowledges the conversation with von Neumann but denies its influence.^{[5]}

Before Shannon, Ralph Hartley had in 1928 considered the quantity log |A| for a set *A* (where |A| is the number of elements of *A*). log |A| agrees with the entropy of the uniform distribution on *A*. Today, log |A| is called the **Hartley entropy** of *A*.

## Units

Information content is inherently a dimensionless quantity. However, since its value depends on the base chosen for the logarithm, units are used to distinguish the bases. As noted above, information content is the expected value of the **number of digits** of the reciprocal of the probability, so the units for numbers of digits are borrowed. The units of information content for logarithm bases 2, e, and 10 are the **bit**, the **nat**, and the **dit**, respectively.

In addition, though it is not in mainstream use at present, the 1997 Japanese Industrial Standard JIS X 0016:1997 (consistent with the international standard ISO/IEC 2382-16:1996) specifies separate units for expressing these quantities (see the table below).

| Base | Common unit | Unit defined by JIS and ISO | Remarks |
|---|---|---|---|
| 2 | bit | shannon | lb, binary logarithm |
| e = 2.718… | nat | nat | ln, natural logarithm |
| 10 | dit | hartley | lg, common logarithm |

The unit names "shannon" and "hartley" honor Claude Shannon and Ralph Hartley, who each proposed the concept of information content.

## Footnotes

- Shannon entropy calculator (English)
- *A Mathematical Theory of Communication*, Shannon 1948 (English)

1. **^** Szilard, L. (1929). "Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen". *Zeitschrift für Physik* **53**: 840–856.
2. **^** Cover & Thomas 2006, Historical note.
3. **^** *Feynman Lectures on Computation*, p. 96. Feynman's footnote 8 introduces this anecdote with the caveat "according to legend".
4. **^** Te Sun Han, Kingo Kobayashi, *Mathematics of Information and Coding*.
5. **^** "Claude E. Shannon: An Interview Conducted by Robert Price, 28 July 1982".

## References

- Cover, Thomas M.; Thomas, Joy A. (2006). *Elements of Information Theory* (Second ed.). John Wiley & Sons. ISBN 978-0-471-24195-9. MR2239987

## Related items

- Shannon's theorem (also the sampling theorem)
- Comparison of amounts of data
- Entropy
- Maxwell's demon
- Huffman coding
- Kolmogorov complexity
- Landauer's principle
- Cross entropy
- Joint entropy

## External links