Methods for collecting and analyzing information in management activities

In the most general form, the methods of collecting information used in management can be divided into two groups: industrial espionage and analytical work. The first of these, industrial espionage, essentially refers to obtaining confidential information and will be considered later; in this chapter we focus on analytical work.

Under constantly changing economic relations, with new organizations being formed and existing ones developing and improving their market position, there is a great need for analytical work: information, experience, and knowledge must be collected and accumulated in all areas of management activity. An organization is interested in a detailed study of emerging market situations so that it can take prompt, economically sound decisions that allow it to develop more rapidly.

Analysis is one of the most effective and safest ways of obtaining information. Using open information resources, one can obtain almost all the necessary information about an organization. The process of deriving important information by synthesizing material from a variety of open sources will be called analytical work. It consists of the following steps.

Identification of the source information for analysis and of the ways to obtain it.

Interpretation of information, i.e. revealing the true meaning of a given piece of information. First of all, information received orally needs to be interpreted, since a statement can easily be misunderstood because of foreign speech, intonation, gestures, slang, or phrases taken out of context.

Filtering out extraneous information is one of the most difficult and critical stages. An excess of information, like a lack of it, is a serious problem that complicates and slows down analytical work. In practice, concentrating on a few key details brings greater results than scattering attention among many disparate data. However, it is at this stage that there is a danger of losing important information; as a rule, this happens when information was interpreted incorrectly at the previous stage.

Evaluation of information - ranking the sources of information, the information itself, and the ways of obtaining it according to their reliability and credibility. Sources of information can be specific people, newspapers, television, Internet sites, etc. Some subjectivity is unavoidable when assessing information, and it must be minimized. Typically, the rating system below is used for this (a small illustrative sketch in code follows the three scales).

1. Source estimation:

A - reliable;

B - usually reliable;

C - fairly reliable;

D - not always reliable;

E - unreliable;

F - source of unknown reliability.

2. Evaluation of the information:

I - confirmed by other facts;

II - probably true (about 75% likely);

III - possibly true (about 50% likely);

IV - doubtful (about 25% likely);

V - implausible;

VI - credibility cannot be determined.

3. Evaluation of the way the information was obtained:

I - obtained first-hand (seen or heard personally, etc.);

II - obtained through a permanent source (an informer, open sources, etc.);

III - obtained through a one-time source (an accidentally overheard conversation, rumors, etc.).
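As a rough illustration of how such a three-part rating could be expressed in code, the sketch below maps each scale to numeric weights and combines them into a single score; the weights, the letter and numeral encodings, and the multiplicative combination are assumptions made for this example, not part of the method described above.

```python
# Illustrative sketch of the three rating scales described above.
# The numeric weights and the way they are multiplied together are assumptions
# for demonstration only; a real analyst would calibrate them to the task.

SOURCE_RELIABILITY = {      # 1. Source estimation
    "A": 1.00,   # reliable
    "B": 0.80,   # usually reliable
    "C": 0.60,   # fairly reliable
    "D": 0.40,   # not always reliable
    "E": 0.20,   # unreliable
    "F": 0.50,   # unknown reliability, treated as neutral here
}

INFORMATION_CREDIBILITY = { # 2. Evaluation of the information (I-VI mapped to 1-6)
    1: 1.00,    # confirmed by other facts
    2: 0.75,    # probably true
    3: 0.50,    # possibly true
    4: 0.25,    # doubtful
    5: 0.00,    # implausible
    6: 0.50,    # credibility cannot be determined, treated as neutral here
}

ACQUISITION_METHOD = {      # 3. Evaluation of how the information was obtained
    1: 1.00,    # obtained first-hand
    2: 0.75,    # obtained through a permanent source
    3: 0.50,    # obtained through a one-time source
}

def combined_score(source: str, credibility: int, method: int) -> float:
    """Combine the three ratings into a single rough weight between 0 and 1."""
    return (SOURCE_RELIABILITY[source]
            * INFORMATION_CREDIBILITY[credibility]
            * ACQUISITION_METHOD[method])

# Example: a usually reliable source, a probably true report, a permanent channel.
print(combined_score("B", 2, 2))  # ~0.45
```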

At the evaluation stage it is necessary to establish how far the information can correspond to the truth. It should be taken into account that untrue information may be of the following kinds:

• disinformation brought to the attention of the source;

• information deliberately or inadvertently distorted by the source;

• information arbitrarily or involuntarily changed during transmission.

In intentional disinformation, deliberate falsehood, half-truths, and even truthful information are used in a combination that pushes the person perceiving it toward false conclusions. Distortions arising during the transmission of information can occur for several reasons:

• only part of the message is passed on;

• what was heard is retold in one's own words;

• facts are perceived subjectively.

To detect distorted information in time and to guard against disinformation, it is necessary to distinguish facts from opinions and to take into account the subjective characteristics of the source and its presumed attitude toward the message being issued. One should clearly understand whether the source is actually able to have access to the facts it reports. As a safeguard, it is necessary to have duplicate sources, to use duplicate communication channels, and to try to exclude unnecessary intermediate links in the transmission of information. In addition, it should be remembered that disinformation is especially easy to accept when it agrees well with a previously adopted version, i.e. with what one expects or wishes to receive.

Building preliminary versions is the stage of analytical work that explains the place of the main facts obtained in the chain of events. Here it is necessary to highlight the key points and separate them from less important ones that do not play a principal role. The received information should be clearly classified by the reliability of the source, by the credibility of the information itself, and by the way it was obtained. The most recent and complete information should be considered first. Materials from sources of unknown reliability, whose credibility cannot be determined, are not recommended for use unless absolutely necessary.

Then it is necessary to identify all possible hypotheses that could explain the key events and, arranging them by degree of probability, to check each in turn for consistency with all the data. If a significant discrepancy is found between a preliminary hypothesis and the information obtained, and the latter has sufficiently high reliability estimates, the next hypothesis should be taken up. In this way the most probable assumptions are selected; a rough sketch of this screening is given below.
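The selection of hypotheses just described can be pictured as a simple screening loop. The sketch below is only an illustration of the idea under assumed data structures (a hypothesis with an estimated prior probability and a consistency test, facts carrying reliability scores); none of these names come from the text.

```python
# Illustrative sketch: take hypotheses in order of estimated probability and
# keep the first one that is not contradicted by the reliably rated facts.
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class Fact:
    statement: str
    reliability: float                       # e.g. a combined rating score

@dataclass
class Hypothesis:
    name: str
    prior: float                             # analyst's estimated probability
    consistent_with: Callable[[Fact], bool]  # does the hypothesis fit this fact?

def select_hypothesis(hypotheses: List[Hypothesis],
                      facts: List[Fact],
                      min_reliability: float = 0.6) -> Optional[Hypothesis]:
    reliable_facts = [f for f in facts if f.reliability >= min_reliability]
    for h in sorted(hypotheses, key=lambda h: h.prior, reverse=True):
        if all(h.consistent_with(f) for f in reliable_facts):
            return h        # first hypothesis not contradicted by reliable facts
    return None             # every hypothesis conflicts: more information is needed
```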

At this stage one of the most serious problems of analysis arises: contradictions in the information. To overcome it, one must compare the ratings of the information and of its source and the dates on which the disputed information was obtained. The knowledge, experience, and intuition of the analyst are of decisive importance here. Contradictions in the information should be eliminated in the course of the analysis, and additional information is collected for this purpose.

Determining the need for additional information means deciding what information is still required. At this stage gaps in the information are identified. Some of the gaps are found quite easily, being the result of insufficient research; others may go undetected because they were missed at the preliminary stages, and it is these that cause the most trouble.

When information gaps are identified, their importance for further analysis is determined. If additional information is found to be necessary, all the steps described above are repeated; although this may happen many times, at some point one has to confine oneself to the available data and formalize the findings in a report. On the basis of analytical reports, references, and reviews of various kinds, senior executives make important decisions, a significant part of which relates directly to management activities. Methods of collecting confidential commercial information about competing entrepreneurs include extracting information from spoiled and discarded documents, drafts, carbon paper, damaged floppy disks, and the like: on the one hand, such items can carry very valuable information, and, on the other hand, there is nothing illegal in using them. In addition, important information can be obtained legally through scientific and technical cooperation on the topic of interest, or by analyzing slips of the tongue at joint seminars, conferences, etc.

Different approaches and methods are applied to evaluate and measure the amount of information with respect to the aspects mentioned above; among them are the statistical, semantic, pragmatic, and structural approaches. Historically, the statistical approach is the most developed.

The statistical approach is studied in an extensive branch of cybernetics called information theory. The founder of this approach is C. Shannon, who published his mathematical theory of communication in 1948. A great contribution to information theory before him was made by Nyquist and Hartley, who published works on the theory of telegraphy and information transmission in 1924 and 1928, respectively. Studies in information theory by the Soviet scientists A. N. Kolmogorov, A. Ya. Khinchin, V. A. Kotelnikov, A. A. Kharkevich, and others have been acknowledged all over the world.

C. Shannon introduced the concept of the amount of information received in a message as a measure of the uncertainty of the state of a system. By analogy with a similar concept in statistical mechanics, the quantitatively expressed uncertainty of a state is called entropy. When information is received, the uncertainty, i.e. the entropy of the system, decreases. Obviously, the more information the observer receives, the more uncertainty is removed and the lower the entropy of the system becomes. When the entropy is zero, there is complete information about the system, and it appears entirely ordered to the observer. Thus, the receipt of information is associated with a change in the degree of the recipient's ignorance about the state of the system.

Before receiving a message, the recipient may have some preliminary (a priori) information about the system X. The remaining ignorance is for him a measure of the uncertainty of the state of the system, i.e. its entropy. Denote the a priori entropy of the system X by H(X). After receiving a message, the observer acquires additional information I(X), which decreases his initial ignorance, so that the a posteriori (after receiving the information) uncertainty of the state of the system becomes H'(X).

Then the amount of information can be defined as

\[ I(X) = H(X) - H'(X). \qquad (1.1) \]

In other words, the amount of information is measured by decreasing (changing) the uncertainty of the state of the system.

If the a posteriori entropy of the system turns to zero, the initially incomplete knowledge is replaced by complete knowledge, and the amount of information received by the observer in this case is

\[ I(X) = H(X), \qquad (1.2) \]

i.e. the entropy of the system can be considered a measure of the missing information.

If the system X has N discrete states (i.e. it passes randomly from state to state), and the probabilities of finding the system in each of these states are P_1, P_2, P_3, ..., P_N, then according to Shannon's theorem the entropy of the system is

\[ H(X) = -K_0 \sum_{i=1}^{N} P_i \log_a P_i. \qquad (1.3) \]

Here the coefficient K_0 and the base of the logarithm a determine the system of units for measuring the amount of information. The logarithmic measure of information was proposed by Hartley to represent the technical parameters of communication systems in a form that is more convenient and closer to the perception of a person accustomed to linear comparisons with accepted standards. For example, one feels that two floppy disks of the same type should hold twice the capacity of one, and two identical communication channels should have twice the throughput. The minus sign in (1.3) is there so that the entropy value is positive, since the probabilities P_i do not exceed one and their logarithms are therefore non-positive.

Entropy H has a number of interesting properties. Here are some of them.

Entropy H is equal to zero only when all the probabilities P_i are zero except one, and that unique probability equals one. Thus H = 0 only in the case of complete certainty about the state of the system.

For a given number of system states N, the quantity H is maximal and equal to K_0 log_a N when all the probabilities P_i are equal.
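A minimal sketch of formula (1.3) and of the two properties just mentioned, assuming the usual convention K_0 = 1 and base a = 2 (bits):

```python
import math

def entropy(probabilities, base: float = 2.0, k0: float = 1.0) -> float:
    """Entropy of a discrete system, H = -K0 * sum(P_i * log_a P_i)."""
    return k0 * sum(-p * math.log(p, base) for p in probabilities if p > 0)

# Complete certainty: one state has probability 1, so H = 0.
print(entropy([1.0, 0.0, 0.0]))           # 0.0
# Maximum uncertainty for N = 4 equiprobable states: H = log2(4) = 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
# Any non-uniform distribution over the same 4 states gives a smaller entropy.
print(entropy([0.7, 0.1, 0.1, 0.1]))      # ~1.36
```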

Let us define the units for measuring the amount of information using the expression for the entropy of a system with equiprobable states.

Let the system have two equiprobable states, i.e. N = 2. We assume that removing the uncertainty about the state of such a system yields one unit of information, since with complete removal of the uncertainty the entropy is quantitatively equal to the information, H = 1. Then

\[ 1 = -K_0 \left( \tfrac{1}{2}\log_a \tfrac{1}{2} + \tfrac{1}{2}\log_a \tfrac{1}{2} \right) = K_0 \log_a 2. \qquad (1.4) \]

Obviously, the right-hand side of the equation will be identically equal to one unit of information if we take K_0 = 1 and the base of the logarithm a = 2. In general, for N equiprobable states the amount of information is

\[ I = H = \log_2 N. \qquad (1.5) \]

The expression (1.5) is called the Hartley formula and shows that the amount of information needed to remove the uncertainty about a system with equiprobable states depends only on the number of these states.

Information about system states is transmitted to the recipient in the form of messages, which can be represented in different syntactic forms, for example as code combinations of n digits built from m different symbols, where each digit can contain any of the m symbols. If the code is not redundant, each code combination represents one of the system states, and the number of code combinations is N = m^n.

Substituting this expression in the formula (1.5), we obtain

\[ I = \log_2 m^n = n \log_2 m. \qquad (1.6) \]

If the code is binary, i.e. uses only the symbols 0 and 1, then m = 2 and I = n. In this case the amount of information in the message is n binary units, called bits (from binary digit).

When the base of the logarithm is ten, the resulting units of information are decimal. Since log_2 N = log_10 N / log_10 2 ≈ 3.32 log_10 N, one decimal unit is approximately 3.32 bits.

Sometimes it is convenient to use the natural base of the logarithm, e. The resulting units of information are then called natural units, or nats. The transition from base a to base b requires only multiplication by log_b a.
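As a small numeric illustration of formulas (1.5) and (1.6) and of the unit conversions above (the alphabet size and message length are invented for the example):

```python
import math

m, n = 2, 8                       # a binary alphabet and 8-digit code combinations
N = m ** n                        # number of non-redundant code combinations, N = m^n
I_bits = n * math.log2(m)         # formula (1.6): I = n * log2(m)
I_decimal = math.log10(N)         # the same amount in decimal units
I_nats = math.log(N)              # the same amount in natural units (nats)

print(N)                          # 256 equiprobable states
print(I_bits)                     # 8.0 bits, as formula (1.5) gives for N = 256
print(I_decimal * math.log2(10))  # back to bits: ~8.0 (1 decimal unit ~ 3.32 bits)
print(I_nats / math.log(2))       # back to bits: ~8.0
```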

The statistical measure of the amount of information introduced above is widely used in information theory to evaluate self-information, mutual information, conditional information, and other kinds of information. Consider, for example, self-information, by which we mean the information contained in a particular message. As noted, a specific message tells the recipient that a particular state of the system has occurred. The amount of self-information contained in the message x_i is defined as

\[ I(x_i) = -\log_2 P(x_i), \qquad (1.7) \]

where P(x_i) is the probability of the message x_i.

Self-information has the following properties:

• it is non-negative;

• the less probable the message is, the more information it contains (this is why unexpected reports affect the human psyche so strongly: the large amount of information they carry creates an informational and psychological shock, sometimes leading to tragic consequences);

• if a message has a probability of occurrence equal to one, the information contained in it is zero, since it is known in advance that only this message can arrive and the consumer of the information learns nothing new;

• self-information is additive, i.e. the amount of self-information of several independent messages is equal to the sum of their individual amounts.
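A short sketch of self-information and of its additivity for independent messages (the probabilities are invented for the example):

```python
import math

def self_information(p: float) -> float:
    """Self-information I(x_i) = -log2 P(x_i) of a message with probability p."""
    return -math.log2(p)

print(self_information(0.5))    # 1.0 bit
print(self_information(0.01))   # ~6.64 bits: rare, unexpected messages carry more
print(self_information(1.0))    # 0 (printed as -0.0): a certain message adds nothing

# Additivity: for independent messages the amounts of self-information add up.
p1, p2 = 0.5, 0.25
assert math.isclose(self_information(p1 * p2),
                    self_information(p1) + self_information(p2))
```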

It should be noted again that this statistical approach to the quantitative assessment of information was considered for discrete systems that pass randomly from state to state, so that messages about these states also arise at random.

In addition, the statistical method of determining the amount of information practically does not take into account the semantic and pragmatic aspects of information.

The semantic approach is the most difficult to formalize and has not yet been fully worked out.

The thesaurus measure proposed by Yu. I. Schneider has gained the widest recognition for measuring the semantic content of information. The ideas of the thesaurus method were formulated by the founder of cybernetics, N. Wiener. To understand and use information, the recipient must possess a certain store of knowledge.

If the individual thesaurus of the consumer reflects his knowledge of the subject, then the amount of semantic information contained in a message can be estimated by the degree to which this thesaurus changes under the influence of the message. Obviously, the amount of information depends nonlinearly on the state of the user's individual thesaurus, and although the semantic content of the message is constant, users with different thesauri will receive different amounts of information.

In fact, if the individual thesaurus of the recipient of information is close to zero, then the amount of perceived information is zero.

In other words, the recipient does not understand the received message, and consequently the amount of information he perceives is zero. The situation is equivalent to listening to a message in an unfamiliar foreign language: the message is undoubtedly not meaningless, but it is not understood and therefore carries no information for that listener.

The amount of semantic information in a message will also be zero if the user knows absolutely everything about the subject, i.e. his thesaurus already contains it all and the message gives him nothing new.

It is intuitively clear that between these polar values of the user's thesaurus there is some optimal value at which the amount of information extracted from the message becomes maximal for the recipient.

The thesaurus method confirms the thesis that information has the property of relativity and thus a relative, subjective value. To evaluate scientific information objectively, the concept of a universal thesaurus was introduced; the degree of its change determines the significance of the new knowledge received by mankind.

The pragmatic approach defines the amount of information as a measure of its contribution to achieving a goal. One of the first papers implementing this approach was an article by A. A. Kharkevich, in which he proposed to take, as the measure of the value of information, the amount of information necessary to achieve the goal. The approach is based on Shannon's statistical theory and considers the amount of information as an increment in the probability of achieving the goal. Thus, if the probability of reaching the goal before receiving the information is P_0, and after receiving it P_1, then the pragmatic amount of information I_p is defined as

\[ I_p = \log_2 \frac{P_1}{P_0}. \qquad (1.8) \]

If the base of the logarithm is taken as two, then I_p is measured in bits, as in the statistical approach.
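A minimal numeric sketch of formula (1.8), with invented probabilities of reaching the goal:

```python
import math

def pragmatic_information(p_before: float, p_after: float) -> float:
    """Pragmatic amount of information I_p = log2(P1 / P0)."""
    return math.log2(p_after / p_before)

# A message that raises the chance of reaching the goal from 0.25 to 0.5: +1 bit.
print(pragmatic_information(0.25, 0.5))   # 1.0
# Misinformation can lower that chance, giving a negative amount of information.
print(pragmatic_information(0.5, 0.25))   # -1.0
```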

When assessing the amount of information in the semantic and pragmatic aspects, it is also necessary to take into account its time dependence. The point is that information, especially in systems for managing economic objects, has the property of aging: its value decreases over time, and it is important to use it at the moment of its greatest value.

The structural approach is associated with problems of storage, reorganization and retrieval of information, and as the amount of information accumulated in computers increases, it becomes increasingly important.

In the structural approach one abstracts from the subjectivity and relative value of information and considers the logical and physical structures in which information is organized. With the invention of computers it became possible to store huge volumes of information on machine media, but for it to be used effectively one must define structures of information organization that allow fast searching, retrieval, writing, and modification of the information base.

In machine storage the structural unit of information is the byte, which contains eight bits (binary units of information). A less definite unit, but one that can also be translated into bytes, is the indivisible unit of economic information, the requisite.

Requisites are combined into indicators, indicators into records, records into arrays; complexes of arrays are created from arrays, and information bases from complexes (a simple sketch of this hierarchy is given below). Structural theory makes it possible to construct, at the logical level, an optimal structure of the information base, which is then implemented by particular means at the physical level, the level of technical devices for storing information. The chosen storage structure determines such an important parameter as data access time; it also affects the time needed to write and read information and, consequently, the time needed to create and reorganize the information base.
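The hierarchy requisite - indicator - record - array - complex - information base can be pictured with nested container types, as in the sketch below; all class and field names are illustrative assumptions, not an established data model.

```python
# Illustrative sketch of the structural units of economic information.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Requisite:                 # indivisible unit, e.g. "quantity" or "price"
    name: str
    value: str

@dataclass
class Indicator:                 # a meaningful combination of requisites
    requisites: List[Requisite]

@dataclass
class Record:                    # indicators describing one object or event
    indicators: List[Indicator]

@dataclass
class Array:                     # homogeneous records, e.g. one document type
    records: List[Record]

@dataclass
class Complex:                   # arrays grouped around one management task
    arrays: List[Array]

@dataclass
class InformationBase:           # the complexes making up the organization's data store
    complexes: List[Complex] = field(default_factory=list)
```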

The information base together with the database management system (DBMS) forms an automated data bank.

The importance of the structural theory of information grows when moving from data banks to knowledge banks, in which information is subjected to an even higher degree of structuring.
