Approaches to measuring and assessing the value of information

In the various sciences concerned with information, attempts have been made to measure it.

The first scientific understanding of the notion of the amount of information arose in communication theory, where measures of information were introduced.

In 1924, H. Nyquist showed that the signal transfer rate W over a channel is related to the number n of different code symbols by the relation

W = k log n, (1.9)

where k is a constant determined by the transmission rate of consecutive symbols.

He was the first to propose a logarithmic measure of information.

In 1928, R. Hartley defined the information of a message as

H = m log n, (1.10)

where m is the number of characters in the message and n is the number of characters available for use.

From the point of view of modern information theory, this measure is applicable only when the symbols are selected independently of one another and the choice of any of the symbols is equally probable.
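As a minimal illustrative sketch (not part of the source), the Nyquist relation (1.9) and the Hartley measure (1.10) can be evaluated directly; the constant k, the message length and the alphabet size below are arbitrary assumptions.

```python
import math

def nyquist_rate(n_symbols: int, k: float = 1.0) -> float:
    """Nyquist relation (1.9): W = k * log(n); k is an assumed channel constant."""
    return k * math.log2(n_symbols)

def hartley_information(m_chars: int, n_alphabet: int) -> float:
    """Hartley measure (1.10): H = m * log(n), in bits when the log base is 2."""
    return m_chars * math.log2(n_alphabet)

# Illustrative values: a 10-character message over a 32-symbol alphabet.
print(nyquist_rate(32))             # 5.0 (in units set by k)
print(hartley_information(10, 32))  # 10 * 5 = 50.0 bits
```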

The most complete development of information theory in relation to the problems of information transmission was carried out in the 1940s by C. Shannon, who connected the notion of information with the notion of entropy and proposed the following entropy measure:

H = -Σ p i log2 p i, (1.11)

where p i is the probability that the i-th symbol is selected from the full set of n symbols that the message source can generate; the value H (entropy) is measured in bits (from the English binary digit).
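A minimal sketch (not from the source) of computing the entropy measure (1.11); the symbol probabilities are illustrative.

```python
import math

def shannon_entropy(probs):
    """Entropy measure (1.11): H = -sum(p_i * log2 p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative source with four symbols of unequal probability.
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits
# For equiprobable symbols (1.11) reduces to the Hartley case: log2 n.
print(shannon_entropy([0.25] * 4))                 # 2.0 bits
```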

The work of Shannon and his followers has found wide application in practice - in the optimization of communication channels, in data processing systems, in the development of electronic computers, etc.

With the development of cybernetics and the theory of systems, the concept of information evolved.

To assess the capabilities of various technical means of collecting, transmitting and processing information, R. L. Stratonovich proposed a game-theoretic approach to evaluating information, and M. M. Bongard an algorithmic approach.

Subsequently, information came to be associated not only with the signals of a source and a receiver, but also with its value for the consumer.

Initially, there were attempts to apply the measures discussed above to assess the value of social information as well. However, it is difficult to prove the adequacy of such measures.

There have also been attempts to create a semantic theory of information, of which the best known is the concept of R. Carnap and Y. Bar-Hillel, based on the notion of logical probability as the degree of confirmation of a hypothesis. In accordance with this concept, hypotheses confirmed by reliable knowledge and experiment have the highest value; in this case the logical probability is equated to one and the semantic information to zero. As the degree of confirmation of a hypothesis decreases, the amount of semantic information contained in it increases. It should be noted that hypotheses are formulated in a special language proposed within the theory in question.

Developing the Carnap - Bar-Hillel concept, L. Brillouin proposed a kind of statistical measure of semantic information based on the measurement and reduction of uncertainty (entropy). Relying on the second law of thermodynamics (the Carnot principle), Brillouin introduced the following measure of entropy reduction for measuring information:

I 1 = K ln (P 0 /P 1), (1.12)

where P 0 is the number of outcomes whose occurrence is a priori considered equiprobable (in this case there is no initial information about the problem, i.e. the a priori information I 0 = 0); P 1 is the number of equiprobable outcomes remaining when there is information about measurements in similar, related situations; the information reducing the uncertainty is then I 1.

By analogy with thermodynamics, Brillouin used the natural logarithm and, in order to choose a system of units, introduced a coefficient K; to convert the result to bits, which had by then become the accepted unit, K = 1/ln 2.
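A minimal sketch (not from the source) of the Brillouin measure (1.12) with K = 1/ln 2 so that the result is expressed in bits; the outcome counts are illustrative.

```python
import math

def brillouin_information(p0: float, p1: float) -> float:
    """Entropy-reduction measure (1.12): I1 = K * ln(P0 / P1), K = 1/ln 2 (bits)."""
    k = 1.0 / math.log(2.0)
    return k * math.log(p0 / p1)

# Illustrative counts: 64 a priori equiprobable outcomes, 8 remain after measurement.
print(brillouin_information(64, 8))  # 3.0 bits of uncertainty removed
```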

A. A. Kharkevich linked the value of information with the purpose of an activity, suggesting that Shannon's entropy measure be regarded as a measure of the probability of hitting the target, i.e. as a measure of expediency.

Yu. A. Schreider proposed building a theory of semantic information on the basis of the concept of diversity rather than the concept of removing uncertainty (Figure 1.5), and in particular on the basis of such a property of information as the dependence of the received information on a priori information.

Fig. 1.5. Dependence of information on a priori information

Based on the idea of a thesaurus (in terms of mathematical linguistics and the theory of formal languages, a thesaurus Θ), Yu. A. Schreider determines the amount of semantic information contained in a text T as the change produced in the thesaurus Θ, I(T, Θ).

This approach differs significantly from the concept of choice in the statistical approach, where it is assumed that the information received is the greater, the less a priori information is contained in the information receiver. On the contrary, according to Schreider, the more complex the structure of the thesaurus, the more opportunities there are for it to change under the influence of the same message. This agrees well with W. R. Ashby's law of requisite variety, according to which the controlling (comprehending, understanding, decision-making) system must have a greater requisite variety (complexity) than the variety of the information coming into it from the controlled (comprehended) system (see details in Chapter 3).

The approach proposed by Yu. A. Schreider is also consistent with observations of information exchange processes: as our knowledge of the object under study grows, the amount of information extracted about this object at first grows, but later saturation sets in and the amount of information obtained decreases. Yu. A. Schreider illustrates the dynamics of the saturation of an individual thesaurus by a conditional dependence (Figure 1.5), whose character depends on the specific consumer of information or on the information-accumulating system.

This curve characterizes one of the features of information - the dependence of the information received on a priori information. To quantify the value of semantic information, various measures were proposed.

Thus, in [14] a measure is suggested for the universal thesaurus, which is understood as the totality of interconnected concepts developed by mankind in the process of scientific cognition of nature, society and thought.

The process of scientific cognition is presented as a refinement of old scientific concepts and the formation of new ones, and as the identification and adjustment of connections between concepts; that is, the thesaurus is treated as a specific structure that changes under the influence of new information in the process of cognition. A hypothetical approach is proposed according to which the information carried by a message depends on how it changes the complexity of this structure.

If, under the influence of a message, the internal structure of the universal thesaurus Θ becomes less complex than before this message, then this message contains more information than a message that causes the structure Θ to become more complex.

This conclusion is consistent with experience: it is known that major scientific discoveries on the whole simplify the structure of knowledge through the introduction of new, more general concepts, which is a consequence and manifestation of the cumulative property of information.

In [14], the proposed approach to assessing the value of information depending on the structure of the representation of the universal thesaurus is treated as hypothetical, and the idea is voiced of the need to search for other measures for estimating the value of information.

To measure the satisfaction of information needs, the theory of scientific and technical information introduces measures of relevance and pertinence (see Chapter 6).

Later, when it began to be realized that there are different types of information, E. K. Voishvillo suggested distinguishing perception information (the sign itself) from information-value, the semantic content of the information for the consumer (what is signified by the sign).

Subsequently, information was viewed in several aspects from the point of view of the observer (user): pragmatic, concerning the achievement of the observer's goals; semantic, concerning the meaning and correctness of interpretation; and syntactic (or material-energy, sign, technological), concerning the technology of information transfer. It was considered that the most general concept is that of pragmatic information, while the semantic and syntactic aspects of information have a subordinate value; alternatively, these aspects can be regarded as levels of information, and different units of measurement began to be introduced for different levels (sometimes with rather exotic names).

Thus, in the system of measures for assessing economic information set out in the works of Yu. I. Chernyak, it is proposed to distinguish several aspects (levels) of the representation and measurement of economic information (utility for solving the problem, the meaning or semantics of the text, the syntactics of the sign mapping, the morphology of the formation of signs - words and phrases - and their transfer through communication channels) and to introduce for each level its own measures of information (the pragma, the sem, the sign or symbol, etc.) adopted at that decision level.

In the theory of the information field and its discrete version, the information approach to the analysis of systems of A. A. Denisov [1, 8, 19], as shown in paragraph 1.2, the notions of sensory information J and logical information H (semantic and pragmatic) are introduced. Information for the consumer is the intersection of perception (sensory) information and its potential (logical) information, resulting in a single concept - information meaning, or information complexity C, which in particular cases is the Cartesian product of J and H (1.7).

For the constructive use of these concepts, A. A. Denisov introduced corresponding deterministic and probabilistic estimates.

Sensory information J is introduced as a measure of the reflection in our consciousness of objective reality, of the elemental base of the system, in the form

J i = A i /ΔA i, (1.13)

where A i is the total quantity of some attribute perceived by measuring devices or by our sense organs; ΔA i is the "quantum" to within which we are interested in the perceived information, i.e. the resolving power of the device.

Indeed, J depends essentially, firstly, on the resolving power of the selected measuring device and, secondly, on the significance for us of the measured quantity, i.e. on the purpose of the measurements.

Thus, the presence or absence of a personal car for a particular citizen is fixed to an accuracy of ΔA = 1 car, since this circumstance is very significant for a motorist. On the contrary, aggregate data on the production of cars in the country are given to an accuracy of ΔA = 10 thousand cars, since a smaller number of cars is unimportant for the country as a whole. Similarly, the length of a segment can be measured to an accuracy of 1 m, 1 dm or 1 cm, and correspondingly different values of J are obtained.
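A minimal sketch (not from the source) of the deterministic measure (1.13); the length and the quanta are illustrative and show how the same quantity yields different J for different resolutions.

```python
def sensory_information(a: float, delta_a: float) -> float:
    """Deterministic sensory information (1.13): J = A / dA."""
    return a / delta_a

# The same 3.7 m segment measured with three different resolving powers.
for delta in (1.0, 0.1, 0.01):  # 1 m, 1 dm, 1 cm
    print(delta, sensory_information(3.7, delta))
# J grows as the quantum dA shrinks: 3.7, 37.0, 370.0 (approximately)
```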

Even with a fixed ΔA, information, strictly speaking, is not a number, since within a more or less bounded ΔA it can have any value.

Thus, if a voltmeter with a resolving power of 1 V shows 200 V, the true voltage value most likely lies either in the range from 200 to 201 V or in the range from 199 to 200 V; in the general case both ranges are equally probable, so that we can introduce the logarithmic measure of a unit of information ΔJ = -log2 p = -log2 0.5 = 1 bit, where p is the probability of the presence or absence of the minimum amount of information.

Given relation (1.13), the voltmeter readings give us J = 200 ± 1 bits of information, from which it follows that the information is not a number but a value blurred within 1 bit. This means that in the semantic aspect the information always is J, but at the same time and in the same respect it is not J, i.e. it does not satisfy the logical law of identity and carries within itself an objective but relative truth, whereas a number always carries an absolute truth. Thus, the voltmeter reading is 200 V when the pointer has stopped at this scale division, but at the same time it is not 200 V, since instrument readings are always approximate.

Thus, information always carries a very significant element of subjectivity and is different for different people for the same A. In addition, even for a fixed ΔA, information, strictly speaking, is not a number, because within a more or less limited ΔA it can have any value.

It is also important to note that two or more identical measuring devices, when measuring the same quantity, can give different information within their resolution, but with the same reliability.

This means that information does not satisfy the logical law of the excluded middle, which does not allow the existence of several contradictory but equally true values, whereas it does satisfy the dialectical law of the unity and struggle of opposites.

Thus, information is a concept that cannot be analyzed by means of formal logic alone and requires the use of dialectical logic, which provides the possibility of analyzing not only absolutely true but also relatively true statements.

From this point of view J is analogous to statements of natural language, which always have a blurred and relatively true character. However, in view of the dual nature of J (number and not number), information, unlike verbal forms, is subject to some (not all) mathematical operations.

Sensory information can also be measured in a probabilistic way:

J i = -log p i, (1.14)

where p i is the probability of the event.
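A minimal sketch (not from the source) of the probabilistic form (1.14) using the binary logarithm, consistent with the voltmeter example above; the probabilities are illustrative.

```python
import math

def sensory_information_prob(p: float) -> float:
    """Probabilistic sensory information (1.14): J = -log2 p, in bits."""
    return -math.log2(p)

# Two equiprobable ranges around a voltmeter reading: p = 0.5 gives 1 bit.
print(sensory_information_prob(0.5))    # 1.0
print(sensory_information_prob(0.125))  # 3.0 bits for a less probable event
```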

Logical information (essence) H, unlike J, which always relates to specific objects or properties, characterizes a whole class of objects or properties that are homogeneous in a certain respect, being a semantic synthesis of the laws of logic and of the rules of functioning of the system and its elements that form the functional of its existence.

According to the basic law of Aristotle's classical logic, the essence (content) of the concept of a system is inversely related to the volume (scope) of that concept, i.e.

H = J/n. (1.15)

The scope of a concept depends on the aspect in which the system (element) is considered and usually presupposes its generic membership.

For example, the scope of the concept "manufacturing enterprise" is the total number of all manufacturing enterprises in the city (region, country), while the scope of the concept "this given manufacturing enterprise" is equal to one.

If the system is characterized by the set of its states, like a week consisting of Monday, Tuesday, Wednesday, etc., then this set is the volume of the concept "week", equal to 7, and the essence of three days of the week is H = 3/7.
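A minimal sketch (not from the source) of relation (1.15) for the week example above.

```python
def essence(j: float, n: int) -> float:
    """Relation (1.15): H = J / n, where n is the volume (scope) of the concept."""
    return j / n

# The scope of the concept "week" is 7 states; three days of the week give H = 3/7.
print(essence(3, 7))  # 0.42857...
```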

The method of averaging J can vary; for this, a parameter γ is introduced, which the decision maker can choose. Then:

H = [(1/n) Σ J i^γ]^(1/γ), (1.16)

where J i are the results of measuring A i according to (1.13); n is the volume of the concept, i.e. the number of objects covered by the concept; γ is a parameter of the averaging logic, for various values of which we obtain various expressions for determining H, given in Table 1.1 (in the table, Π denotes the product sign).

Table 1.1

The main methods of measuring J and H according to A. A. Denisov

Deterministic measurement method

Sensory information: J i = A i /ΔA i, where A i is the value of the measured quantity; ΔA i is the "quantum" to within which the decision maker is interested in the perceived information (the unit, the resolving power of the device).

Logical information: H = [(1/n) Σ J i^γ]^(1/γ), where J i are the results of measuring A i; n is the scope of the concept, i.e. the number of objects covered by it; γ is the averaging parameter.

For γ = 1 we obtain the arithmetic mean H = (1/n) Σ J i; for γ = 0, the geometric mean H = (Π J i)^(1/n); for γ = -1, the harmonic mean H = n / Σ (1/J i).

Probabilistic method of measurement

Sensory information: J i = -log p i, where p i is the probability of the event. When information is used to achieve a goal, p i is called the probability of failing to achieve the goal, or the degree of non-compliance.

Logical information: H = -Σ q i log p i, where q i is the probability of using the i-th information element. With q i = p i the Shannon formula (1.17, b) is obtained. For an equiprobable selection of an element, q i = 1/n.

Pragmatic information: H ц = -Σ q i log (1 - p ц i), where p ц i is the probability of achieving the goal (the degree of compliance); q i is the probability that the evaluated component will be used to achieve the goal.
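A minimal sketch (not from the source) of the γ-parameterized averaging in (1.16), reconstructed from its special cases in Table 1.1 (arithmetic mean for γ = 1, geometric for γ → 0, harmonic for γ = -1); the J values are illustrative.

```python
import math

def averaged_essence(j_values, gamma: float) -> float:
    """gamma-parameterized averaging of J, consistent with the special cases in
    Table 1.1: gamma = 1 arithmetic, gamma -> 0 geometric, gamma = -1 harmonic mean."""
    n = len(j_values)
    if gamma == 0:  # limiting case: geometric mean
        return math.exp(sum(math.log(j) for j in j_values) / n)
    return (sum(j ** gamma for j in j_values) / n) ** (1.0 / gamma)

j = [2.0, 4.0, 8.0]
print(averaged_essence(j, 1))   # arithmetic mean: 4.666...
print(averaged_essence(j, 0))   # geometric mean: 4.0
print(averaged_essence(j, -1))  # harmonic mean: 3.428...
```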

We note that both the verbal form of the basic law of logic and its symbolic form (1.15) are blurred, since inverse dependence can take various specific forms, and relation (1.15), although it has the form of inverse proportionality, also stands for a certain range of specific dependencies. In general, the quantitative values of H and J coincide in statics (since n = 1), but, unlike J, the essence H of a concept cannot be an object of immediate sensory perception; it is the result of logical comprehension, which is reflected in the interpretation of n.

In particular, n can be considered as the volume (capacity) of memory occupied with information about the reflected concept, since the concept is formed on the basis of information contained in our memory (or in computer memory).

We also call attention to the fact that the law in question is fundamentally continuous (by virtue of the inverse proportionality to n) and its action should extend to continuous (multivalued) logic.

Logical information H can be determined not only through the parameters of the system synthesizing it (a human being, an automated information system), but also probabilistically.

If we consider that H characterizes not a single object but a class of objects or properties that are homogeneous in a certain sense, then H can be defined in terms of the probability density f(J i) of J taking the value J i:

H = ∫ J i f(J i) dJ i, (1.17)

In the particular case, instead of the probability density, one can characterize the class of homogeneous objects simply by the probability q i and represent J i in logarithmic form; then we get

H = -Σ q i log p i, (1.17, a)

The values ​​of q i and p i may not be equal, but situations are possible when q i = p i, which takes place in the Shannon formula

H = -Σ p i log p i, (1.17, b)

Pragmatic (target) information H ц is described by a model similar to (1.17, b), except that for practical applications it is more convenient to replace the probability p i of not achieving the goal by its conjugate (1 - p ц i):

H ц = -Σ q i log (1 - p ц i), (1.17, c)

where p ц i is the probability of achieving the goal; q i is the probability that the evaluated component will be used to achieve the goal.
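A minimal sketch (not from the source) of the probabilistic measures (1.17, a) - (1.17, c); the probabilities below are illustrative assumptions.

```python
import math

def logical_information(q, p_fail):
    """(1.17, a): H = -sum(q_i * log2 p_i); with q_i = p_i it becomes Shannon's (1.17, b)."""
    return -sum(qi * math.log2(pi) for qi, pi in zip(q, p_fail) if pi > 0)

def pragmatic_information(q, p_goal):
    """(1.17, c): H_c = -sum(q_i * log2(1 - p_goal_i)), where p_goal_i is the
    probability of achieving the goal (degree of compliance)."""
    return -sum(qi * math.log2(1.0 - pg) for qi, pg in zip(q, p_goal) if pg < 1.0)

q = [0.5, 0.3, 0.2]       # probabilities of using each evaluated component
p_fail = [0.4, 0.6, 0.5]  # probabilities of not achieving the goal
p_goal = [1.0 - p for p in p_fail]
print(logical_information(q, p_fail))
print(pragmatic_information(q, p_goal))  # equal to the previous value by construction
```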

Thus, it follows from the foregoing that J and H can be measured in various ways: deterministically and by means of probabilistic characteristics.

Since in some applications both forms of representing the information characteristics, the deterministic and the probabilistic, can be used simultaneously, as well as the transition from one form to the other, it is convenient to use the comparative Table 1.1, which summarizes the basic methods of measuring J and H.

We should note the features of the probabilistic characteristics used in the approach presented. In the particular case, p i can be a statistical probability determined on the basis of a representative sample that obeys one or another statistical regularity. However, the probability space cannot always be strictly defined. In such cases one can use the concept of fuzzy (diffuse) probability in the sense of Zadeh.

In the general case, in A. A. Denisov's theory the probability of achieving the goal and the probability q i of using the evaluated component (property) in making a decision can have a broader interpretation: they are used not in the strict sense of probability theory, valid for stochastic, repetitive phenomena, but to characterize single phenomena and events, in which case p i appears as a degree of purposefulness.

By analogy with the earlier studies of R. Hartley, C. Shannon and A. A. Kharkevich, a unit based on the binary logarithm is used as the unit of measurement of information, giving 1 bit as the minimum unit of information.

This is convenient for the following reasons. In order for (1.14) to give information in bits, it must be assumed that the a priori probability that each scale division of the measuring device belongs to the measured quantity is 0.5. Then, since the scale represents a sequential join of divisions, the joint probability that J of them belong to the measured quantity is p = 2^-J, whence we get J = -log2 p, which at the same time is the solution of the equation:

2^-J = p. (1.18)

At the same time, other measures of compression of the information scale can in principle also be adopted: octal logarithms (bytes, already used to estimate the amount of information in computer technology), or units not yet used, such as decimal logarithms (the unit could be called, for example, the dec), natural logarithms (the neper), and the like.
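A minimal sketch (not from the source) of re-expressing an amount of information in units based on a different logarithm base, in the spirit of the alternatives listed above; the amounts are illustrative.

```python
import math

def bits_to_base(amount_bits: float, base: float) -> float:
    """Re-express an amount of information given in bits in units defined by another
    logarithm base (base e for natural units, base 10 for decimal units, etc.)."""
    return amount_bits * math.log(2.0) / math.log(base)

print(bits_to_base(8.0, 2.0))     # 8 bits remain 8 bits
print(bits_to_base(8.0, math.e))  # about 5.545 natural-logarithm units
print(bits_to_base(8.0, 10.0))    # about 2.408 decimal-logarithm units
```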

There are also other measures of the amount of information that take into account the process of its transmission and the change in the probability of appearance of a message from the a priori value p(x) at the input of the information transfer channel to the a posteriori value p(x/y) at the output of the channel, a change associated with the distortion of information in the channel, i.e. measures based on conditional probability [10, p. 16-21].
