Portal field news


📱 | Scanning food labels with this app will tell you if there is something to avoid




 
To summarize the content roughly:
This is far more convenient and efficient than standing in a supermarket aisle, checking the list of ingredients, and getting lost in the sheer amount of information.
 

Almost everywhere in the world, listing the ingredients on the labels of groceries sold in supermarkets and the like is more or less a legal requirement ... → Continue reading

 Ubergizmo Japan

Ubergizmo Japan is a technology media outlet featuring the latest trend news on gadgets, smartphones, games, technology, and home entertainment products, along with product reviews written from the consumer's perspective.


Related Wikipedia terms

Terms listed without an explanation have no corresponding article on Wikipedia.

Amount of information

The amount of information, or entropy, is a concept in information theory: a measure of how unlikely an event is to occur. Learning that a commonplace event (for example, "the sound of the wind") has occurred conveys little information, whereas learning that a rare event (for example, "a song being played") has occurred is considered to carry more information. The amount of information can also be regarded as a measure of how much information an event essentially contains.

The term "information" here means that the event is unlikely to occur (probability) Is a mathematical quantity determined only by () and has no relation to its usefulness in individuals and society. For example, in the case of "I won the lottery" and "Mr. A, a stranger won the lottery," the former seems to be more useful information, but the amount of information in both is exactly the same (the probability of winning the lottery is given. Because everyone is the same under certain conditions).

Selective information content (self-entropy) and average information content (entropy)

The term "amount of information" refers not only to the information content of each individual event but also to the average of that information content over all events. When the two need to be distinguished, the former is called the selective information content (self-entropy) and the latter the average information content (entropy).

Selective information content

Let P(E) be the probability that an event E occurs. The (selective) information content I(E) received upon learning that the event E has occurred is defined as

I(E) = -\log P(E) = \log \frac{1}{P(E)}.

The less likely an event is to occur (that is, the lower its probability), the larger this value becomes.

Whatever base is chosen for the logarithm in the formula above, the value of the information content changes only by a constant factor, so the choice makes no essential difference; base 2 is the most common choice.

If the base is 2, the information content of an event that occurs with probability 1/2^n is n (for example, an event with probability 1/2 carries exactly 1 bit).
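To make the definition concrete, here is a minimal Python sketch (the helper name self_information is ours, not taken from the text or any particular library) that evaluates -log2 P(E) for a few probabilities:

```python
import math

def self_information(p: float, base: float = 2.0) -> float:
    """Selective information content -log_base(p) of an event with probability p in (0, 1]."""
    if not 0.0 < p <= 1.0:
        raise ValueError("p must be in (0, 1]")
    return -math.log(p, base)

# An event with probability 1/2 carries 1 bit; rarer events carry more.
print(self_information(0.5))    # 1.0
print(self_information(1 / 8))  # 3.0 (= log2 8, up to floating-point rounding)
print(self_information(1.0))    # 0.0 -- a certain event carries no information
```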

Intuitive meaning

For a positive integer n, the logarithm \log_k n is approximately the number of digits of n when written in base k. Hence the information content of an event that occurs with probability 1/n is roughly the number of digits of n.

Additivity of information

Information content is additive: for mutually independent events A and B, the information content of the event "A and B both occur" equals the sum of the information content of A and the information content of B. This is illustrated below.

For example, consider the trial of drawing one card at random from a deck of 52 playing cards. From the definition above, the information content of the event "the card drawn is the 4 of hearts" is \log 52. Now consider the two events "the suit of the card drawn is hearts" and "the number on the card drawn is 4": the information content of the former is \log 4 and of the latter \log 13. Their sum is \log 4 + \log 13 = \log(4 \times 13) = \log 52, which equals the information content of the event "the card drawn is the 4 of hearts." This matches the intuitive requirement that the information content of independent pieces of information should add up to the total information content.
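The card example can be checked numerically; the short sketch below (plain Python, base-2 logarithms) merely restates the arithmetic above:

```python
import math

suit = math.log2(4)    # "the suit is hearts"           (probability 1/4)
rank = math.log2(13)   # "the number is 4"              (probability 1/13)
card = math.log2(52)   # "the card is the 4 of hearts"  (probability 1/52)

print(suit + rank)     # 5.7004...
print(card)            # 5.7004...  -- the sum of the parts equals the whole
```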

Derivation

The intuitive requirements for a measure of information content are: "the lower the probability of occurrence, the larger the value (monotonically decreasing)", "the value changes continuously with the probability (continuity)", and "the information content of independent joint events equals the sum of the information content of the marginal events (additivity)". Using Cauchy's functional equation, one can show that the functions satisfying these three conditions are exactly those of the form I(p) = C \log p with C a negative constant, so the definition of information content is uniquely determined by the three conditions above. Usually C is chosen so that the base of the logarithm is 2, that is, so that I(1/2) = 1.

Average information amount (entropy)

Let the sample space be a finite set and let P be a probability distribution on it. The expected value of the selective information content of each event A,

H(P) = -\sum_{A} P(A) \log P(A),

is called the entropy of P (also the average information content, the Shannon information content, or the information-theoretic entropy).

Here, when P(A) = 0, the term P(A) \log P(A) is taken to be 0. This is justified by the limit \lim_{x \to +0} x \log x = 0.

Also, when a random variable X taking values in a finite set U follows the probability distribution P, the entropy of X is defined by H(X) = H(P). That is,

H(X) = -\sum_{x \in U} \Pr[X = x] \log \Pr[X = x].

Entropy always takes a non-negative value (or infinity).
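A minimal Python sketch of this definition (the function name entropy is ours; the convention 0 · log 0 = 0 from the text is handled by skipping zero-probability terms):

```python
import math

def entropy(dist, base: float = 2.0) -> float:
    """Shannon entropy H(P) = -sum p log p of a distribution given as an iterable of probabilities."""
    return sum(-p * math.log(p, base) for p in dist if p > 0.0)

print(entropy([0.5, 0.5]))                # 1.0 bit   (fair coin)
print(entropy([1.0, 0.0]))                # 0.0 bits  (a certain outcome)
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits  (uniform over four outcomes)
```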

When values x and y are taken by random variables X and Y respectively, the pair (x, y) can also be regarded as a random variable. Writing this random variable as (X, Y), its entropy

H(X, Y) = -\sum_{x, y} \Pr[X = x, Y = y] \log \Pr[X = x, Y = y]

is called the joint entropy of X and Y.

If X and Y are mutually independent random variables, H(X, Y) coincides with H(X) + H(Y); that is, the total amount of information is the sum of the information content of each random variable.

However, if X and Y are not mutually independent, H(X, Y) and H(X) + H(Y) do not coincide, and the latter is larger than the former. The difference between the two is called the mutual information of X and Y and is written

I(X; Y) = H(X) + H(Y) - H(X, Y).

Mutual information is always non-negative.

The conditional information content of an event A under the condition that an event B has occurred is defined as -\log \Pr[A \mid B]. When a random variable X is given, the average over x of the conditional information content of A under the condition that the event "X = x" has occurred is called the conditional entropy and is written

H(A \mid X) = -\sum_{x} \Pr[X = x] \log \Pr[A \mid X = x].

Further, when a random variable Y is given, the average over y of the conditional entropy of X under the condition that the event "Y = y" has occurred,

H(X \mid Y) = \sum_{y} \Pr[Y = y] H(X \mid Y = y),

is likewise called the conditional entropy.
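These quantities can be computed from a small joint distribution. The sketch below uses our own names, a made-up joint distribution, and the standard identity H(X|Y) = H(X,Y) - H(Y), which is equivalent to averaging H(X|Y=y) over y:

```python
import math

def H(probs):
    """Shannon entropy in bits of an iterable of probabilities."""
    return sum(-p * math.log2(p) for p in probs if p > 0.0)

# A made-up joint distribution Pr[X=x, Y=y] of two dependent binary variables.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

p_x = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
p_y = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}

H_X, H_Y, H_XY = H(p_x.values()), H(p_y.values()), H(joint.values())

mutual_information = H_X + H_Y - H_XY   # I(X;Y), always non-negative
H_X_given_Y = H_XY - H_Y                # chain rule: H(X|Y) = H(X,Y) - H(Y)

print(H_X, H_Y, H_XY)        # 1.0 1.0 1.7219...
print(mutual_information)    # 0.2780... (would be 0.0 if X and Y were independent)
print(H_X_given_Y)           # 0.7219...
```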

Basic properties of entropy

  1. The amount of information depends only on the probability.
  2. The amount of information takes a non-negative value or infinity.
  3. When a bit string is chosen at random (not necessarily uniformly) from the space of n-bit strings (an information source), its entropy is at most n. The entropy equals n if and only if the bit string is chosen uniformly at random (see the sketch after this list).
  4. Random variables X and Y are independent if and only if H(X, Y) = H(X) + H(Y) holds.
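Property 3 can be checked numerically for a small n; the sketch below (our own helper, with n = 3 chosen arbitrarily) compares the uniform distribution with a random non-uniform one:

```python
import math, random

def H(probs):
    """Shannon entropy in bits."""
    return sum(-p * math.log2(p) for p in probs if p > 0.0)

n = 3                                    # the space of 3-bit strings has 2**3 = 8 elements
uniform = [1.0 / 2 ** n] * 2 ** n
print(H(uniform))                        # exactly n = 3 bits

# Any non-uniform distribution over the same space has strictly smaller entropy.
weights = [random.random() for _ in range(2 ** n)]
total = sum(weights)
print(H([w / total for w in weights]))   # < 3 (with probability 1)
```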

Example of coin tossing

Let p be the probability that a tossed coin comes up heads and 1 - p the probability that it comes up tails. The average information content (entropy) obtained when this coin is tossed is

H(p) = -p \log_2 p - (1 - p) \log_2 (1 - p).

This function H(p) is called the entropy function.

[Figure: Entropy coin.png – graph of the entropy function H(p) of a coin toss.]

As the figure shows, H(p) is zero at p = 0 and at p = 1. In other words, if it is already known before the toss which side will come up, the average information content obtained by tossing the coin is zero. H(p) attains its maximum (1 bit) at p = 1/2, and in general entropy is maximized when all events (outcomes) are equally probable.
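A short Python sketch of the entropy function, reproducing the behaviour described above (the function name binary_entropy is ours):

```python
import math

def binary_entropy(p: float) -> float:
    """Entropy function H(p) = -p log2 p - (1-p) log2 (1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.0, 0.1, 0.3, 0.5, 0.7, 0.9, 1.0):
    print(f"H({p}) = {binary_entropy(p):.4f}")
# The maximum value, 1 bit, is attained at p = 0.5; the endpoints give 0.
```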

Entropy of a continuous system

For a real-valued random variable X with probability density function p(x), the entropy of X is defined by

H(X) = -\int p(x) \log p(x) \, dx.

If X is a random variable taking values in a finite set, the Shannon information content H_1(X) of X can also be defined. When X takes n distinct values, H(X) and H_1(X) satisfy

H(X) = H_1(X) - H_1(U),

where U is the uniform distribution on the n-element set (that is, H_1(U) = \log n).
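As an illustration of the continuous definition (not taken from the text), the sketch below numerically integrates -p(x) log2 p(x) for a standard normal density and compares the result with the known closed form (1/2) log2(2πe) ≈ 2.047 bits:

```python
import math

def normal_pdf(x: float, sigma: float = 1.0) -> float:
    """Density of a normal distribution with mean 0 and standard deviation sigma."""
    return math.exp(-x * x / (2.0 * sigma * sigma)) / (sigma * math.sqrt(2.0 * math.pi))

# H(X) = -∫ p(x) log2 p(x) dx, approximated by a Riemann sum over [-10, 10].
dx = 0.001
h = 0.0
x = -10.0
while x < 10.0:
    p = normal_pdf(x)
    if p > 0.0:
        h -= p * math.log2(p) * dx
    x += dx

print(h)                                        # ≈ 2.047 bits
print(0.5 * math.log2(2.0 * math.pi * math.e))  # 2.0471... (closed form for sigma = 1)
```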

Rényi entropy

Consider a probability space whose sample space \Omega is a finite set, let P be a probability distribution on \Omega, and let \alpha \ge 0 be a non-negative real number.

For \alpha \ne 1, the Rényi entropy of order \alpha of P is defined by

H_\alpha(P) = \frac{1}{1 - \alpha} \log \sum_{A \in \Omega} P(A)^\alpha.

For \alpha = 1 and \alpha = \infty, the Rényi entropy is defined by the limits

H_1(P) = \lim_{\alpha \to 1} H_\alpha(P), \qquad H_\infty(P) = \lim_{\alpha \to \infty} H_\alpha(P).

When one speaks simply of the Rényi entropy, H_2(P) is often meant.

Furthermore, when a random variable X follows the probability distribution P, H_\alpha(X) is defined by H_\alpha(X) = H_\alpha(P).

The Rényi entropy satisfies the following properties:

  • If \alpha \le \beta, then H_\alpha(P) \ge H_\beta(P).
  • H_1(P) coincides with the Shannon information content (entropy) H(P).
  • If \alpha is an integer greater than or equal to 2, then H_\alpha(P) = \frac{1}{1 - \alpha} \log \Pr[X_1 = X_2 = \cdots = X_\alpha]. Here X_1, \ldots, X_\alpha are independent random variables each distributed according to P, and \Pr[X_1 = \cdots = X_\alpha] is the probability that they all take the same value when each is chosen independently according to P.
  • H_\infty(P) = -\log \max_{A \in \Omega} P(A) holds; H_\infty(P) is also called the min-entropy.
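A Python sketch of the Rényi entropy covering the special cases above (function name ours; the α = 1 and α = ∞ cases are implemented directly as their limiting forms):

```python
import math

def renyi_entropy(dist, alpha: float, base: float = 2.0) -> float:
    """Rényi entropy H_alpha(P) of a distribution given as an iterable of probabilities."""
    probs = [p for p in dist if p > 0.0]
    if alpha == 1.0:                       # limiting case: Shannon entropy
        return sum(-p * math.log(p, base) for p in probs)
    if math.isinf(alpha):                  # limiting case: min-entropy
        return -math.log(max(probs), base)
    return math.log(sum(p ** alpha for p in probs), base) / (1.0 - alpha)

P = [0.5, 0.25, 0.125, 0.125]
for a in (0.0, 0.5, 1.0, 2.0, math.inf):
    print(a, renyi_entropy(P, a))
# The values are non-increasing in alpha; H_1 is the Shannon entropy (1.75 bits here),
# and H_inf = -log2(0.5) = 1 bit is the min-entropy.
```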

History

The concept of "entropy" is1865ToRudolph ClausiusIs the Greek word for "conversion"EtymologyWas introduced as a certain state quantity of gas in thermodynamics. This is represented in statistical mechanics as a quantity proportional to the logarithm of the number of microscopic states.1929ToLeo SillardShows that there is a direct relationship between the observer's acquisition of information about a gas and the entropy in statistical mechanics, and what is now called 1 bit (1 Shannon) is statistical mechanics. k led to a relationship that corresponds to ln 2.[1].

Entropy was introduced directly into present-day information theory by Claude Shannon's 1948 paper "A Mathematical Theory of Communication," which applied the concept of entropy to information theory.[2] Shannon reached this definition without knowing that a related concept was already in use in statistical thermodynamics, but when he was considering what to name it, von Neumann pointed out its resemblance to the entropy of statistical thermodynamics and is said to have advised that "since few people understand what statistical entropy really is, you will have the advantage in any debate."[3][4] Shannon acknowledges the conversation with von Neumann but denies its influence.[5]

Even before Shannon, Ralph Hartley had considered, in 1928, the quantity \log |A| for a set A (where |A| is the number of elements of A). \log |A| agrees with the entropy of the uniform distribution on A defined above. Today \log |A| is called the Hartley entropy of A.

Units

The amount of information is intrinsically a dimensionless quantity. However, since its numerical value depends on the base chosen for the logarithm, units are used to distinguish the choices. As noted above, the amount of information is (the expected value of) the number of digits of the reciprocal of the probability, so units for a number of digits are borrowed. Accordingly, when the base of the logarithm is 2, e, or 10, the unit of information is the bit, the nat, or the dit, respectively.

In addition, although it is not in mainstream use at present, the 1997 Japanese Industrial Standard JIS X 0016:1997 (consistent with the international standard ISO/IEC 2382-16:1996) specifies separate units for expressing these quantities (see the table below).

Logarithmic base and units

  Base          | Common unit | Unit defined by JIS and ISO | Remarks
  2             | bit         | shannon                     | lb, binary logarithm
  e = 2.718...  | nat         | nat                         | ln, natural logarithm
  10            | dit         | hartley                     | lg, common logarithm

The unit names "shannon" and "hartley" are taken from Claude Shannon and Ralph Hartley, each of whom proposed a concept of information content.
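Because the units differ only in the logarithm base, conversion is a single constant factor; a small sketch (helper names ours):

```python
import math

LN2, LN10 = math.log(2.0), math.log(10.0)

def bits_to_nats(b: float) -> float:
    return b * LN2            # 1 bit = ln 2 ≈ 0.6931 nat

def bits_to_dits(b: float) -> float:
    return b * LN2 / LN10     # 1 bit = log10 2 ≈ 0.3010 dit (hartley)

print(bits_to_nats(1.0))              # 0.6931...
print(bits_to_dits(1.0))              # 0.3010...
print(bits_to_dits(math.log2(1000)))  # ≈ 3.0: 1000 ≈ 10^3, i.e. three decimal digits
```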

Footnotes

  1. ^ Szilard, L. (1929) "Über die Entropieverminderung in einem Thermodynamischen System bei Eingriffen Intelligenter Wesen", Zeitschrift für Physik 53: 840-856
  2. ^ Cover & Thomas 2006, Historical note.
  3. ^ Feynman Lectures on Computation, p. 96. Feynman's footnote *8 introduces this story with the disclaimer "according to legend."
  4. ^ Te Sun Han and Kingo Kobayashi, Mathematics of Information and Coding
  5. ^ CLAUDE E. SHANNON: An Interview Conducted by Robert Price, 28 July 1982


Ingredient list


 
