Quantitative measure of information

Lecture



The theory of any phenomenon begins when quantitative relationships emerge between the objects of research, i.e. when principles for measuring the properties of those objects are established. The unit of the quantitative measure of information, the BIT (an abbreviation of "binary digit"), was first proposed by R. Hartley in 1928. One bit is the information about two possible equiprobable states of an object, i.e. the uncertainty of a choice between two equiprobable events. Mathematically, it is represented by the state 1 or 0 of one digit of the binary number system. The amount of information H (in bits) necessary and sufficient to completely remove the uncertainty of the state of an object with N equiprobable states is measured as the base-2 logarithm of the number of possible states:

H = log2 N. (1.4.1)

Accordingly, the binary numerical code of one of the N possible states of the object occupies H = log2 N binary digits.

Example. A load must be lifted to a specific floor of a 16-storey building (floors are numbered 0-15, N = 16). How many bits of information completely define the task?

H = log2 N = log2 16 = 4.

Therefore, 4 bits of information are necessary and sufficient to completely remove the uncertainty of choice. This can be verified by successively halving the interval of states. For example, for the 9th floor:

1. Above the 7th floor? Yes = 1.
2. Above the 11th floor? No = 0.
3. Above the 9th floor? No = 0.
4. Above the 8th floor? Yes = 1.

Result: floor number 9, or 1001 in binary, i.e. four binary digits.
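The halving procedure above can be sketched as a short binary search; `locate` is a hypothetical helper name, and the yes/no answers are mapped to bits exactly as in the list above:

```python
import math

# Floors are numbered 0-15: N = 16 equiprobable states.
N = 16
H = int(math.log2(N))  # 4 bits suffice to identify any floor

def locate(floor, lo=0, hi=N - 1):
    """Identify `floor` by repeatedly halving the interval [lo, hi].
    Each yes/no answer to "above floor mid?" contributes one bit
    (yes = 1, keep the upper half; no = 0, keep the lower half)."""
    bits = ""
    while lo < hi:
        mid = (lo + hi) // 2          # question: "above floor `mid`?"
        if floor > mid:
            bits += "1"
            lo = mid + 1
        else:
            bits += "0"
            hi = mid
    return bits

print(locate(9))  # prints 1001, the binary code of floor 9
```

Every floor is resolved in exactly H = 4 questions, and the answer string is precisely the floor number in binary.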

If, in the example above, each floor has 4 apartments numbered 0-3 (M = 4), then addressing the load to an apartment requires 2 more bits of information. We get the same result if, instead of numbering floors and apartments on each floor independently (two sources of uncertainty), we use continuous numbering of the apartments (one combined source):

H = log2 N + log2 M = log2 16 + log2 4 = 4 + 2 = 6 ≡ log2 (N × M) = log2 64 = 6,

i.e. the amount of information satisfies the additivity requirement: the uncertainty of the combined source equals the sum of the uncertainties of the original sources. This matches the intuitive requirement for a measure of information: it must be unambiguous, and its amount must be the same regardless of how the states are specified.
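The additivity of the logarithmic measure can be checked numerically; the variable names below are illustrative:

```python
import math

N, M = 16, 4  # floors, and apartments per floor

# Uncertainty of each independent source, in bits
h_floor = math.log2(N)       # 4.0
h_flat = math.log2(M)        # 2.0

# Uncertainty of the combined source (continuous apartment numbering)
h_joint = math.log2(N * M)   # 6.0

# Additivity: the sum of the parts equals the joint measure
assert h_floor + h_flat == h_joint
print(h_floor, h_flat, h_joint)  # prints 4.0 2.0 6.0
```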

The base of the logarithm is of no fundamental importance; it determines only the scale, i.e. the unit of uncertainty. For example, if we take a choice among three equiprobable states as the unit of uncertainty, then identifying one counterfeit (lighter) gold coin among 27 outwardly indistinguishable coins requires only H = log3 27 = 3 such units, i.e. three weighings on an equal-arm balance. Working out the weighing logic is left as an exercise for the reader.
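The change of base amounts to a constant rescaling of the measure, which can be verified directly; this is a minimal numerical check, not part of the weighing exercise itself:

```python
import math

# 27 coins; a balance weighing has three outcomes (left heavier,
# right heavier, equal), so one weighing resolves one ternary unit.
N = 27
weighings = math.log(N, 3)   # base-3 measure: ~3 weighings

# Change of base: log3 N = log2 N / log2 3, so switching the base
# only multiplies the measure by a constant factor.
assert math.isclose(weighings, math.log2(N) / math.log2(3))
assert round(weighings) == 3
```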

The binary measure of information has gained general acceptance because information technology is easy to implement on elements with two stable states. In the decimal measure, the unit of information is one decimal digit, the DIT.

created: 2020-11-27
updated: 2021-03-13




Signal and linear systems theory
