Factor analysis and principal component method
In 1901, the distinguished English statistician K. Pearson proposed the method of principal components. Its essence is that, for measurement results plotted on a two-dimensional plane, a straight line is sought that satisfies two conditions: variation along it should be maximal, while variation in the orthogonal direction should be minimal. The first principal component is the one measured along the line found. The analysis can be continued, yielding a set of components ranked by their relevance. If desired, the components judged insignificant can be discarded, and the number of variables is thereby reduced. The method of principal components perfectly illustrates the basic idea of factor analysis, which consists, first, in data reduction and, second, in data classification.
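As a sketch of the idea (the data and the NumPy-based implementation here are illustrative assumptions, not taken from the text), the principal components can be found by diagonalising the covariance matrix of the centred measurements; the eigenvector with the largest eigenvalue is the direction of maximal variation:

```python
import numpy as np

# Hypothetical two-dimensional measurements: the second variable
# roughly doubles the first, plus a little noise.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
data = np.column_stack([x, 2.0 * x + rng.normal(scale=0.3, size=200)])

# Centre the data, then diagonalise its covariance matrix.
centered = data - data.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order

# Rank components by explained variance (largest first).
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
explained = eigvals / eigvals.sum()

# Data reduction: project onto the first component only.
scores = centered @ eigvecs[:, 0]
print(explained)  # the first component carries almost all the variance
```

Discarding the second component here loses little information, which is exactly the reduction the text describes.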
Correlation analysis. Its purpose is to reveal connections between statistical quantities (sample means). Statistical analysis never dispenses with identifying correlations, which are usually characterized by correlation coefficients. Establishing these links is the first step in identifying empirical laws. As a rule, correlation analysis is supplemented by a regression study.
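A minimal sketch of such a coefficient, on invented paired measurements (the data are my assumption), computes Pearson's correlation with NumPy:

```python
import numpy as np

# Hypothetical paired measurements with a genuine linear connection.
rng = np.random.default_rng(1)
x = rng.normal(size=100)
y = 0.8 * x + rng.normal(scale=0.5, size=100)

# Pearson's correlation coefficient: +1 and -1 mean a perfect linear
# relationship, 0 means no linear relationship.
r = np.corrcoef(x, y)[0, 1]
print(r)  # clearly positive: a candidate empirical law worth a regression study
```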
Regression analysis. It is carried out to determine an equation that relates the dependent variable Y to the independent variables Xi. If the degree of dependence of Y on the Xi is known, then we can predict how its value varies with changes in the Xi. Most often, though not always, linear regression is defined as a straight line:

Y = b0 + b1X1 + b2X2 + ... + bnXn
The coefficients bi characterize the contribution of the independent variables to the quantity Y. But when choosing a line, a criterion is needed that allows a single function to be selected from the set of linear dependencies. For this purpose, the least squares method is often used, which minimizes the sum of the squared deviations of the actually observed values of Y from their fitted values.
The least squares method was developed more than two hundred years ago by C. Gauss and A. Legendre. As they found, it is the sum of the squares of the deviations that must be minimized, not the sum of the deviations themselves. It can be shown that the sum of the squared deviations of individual measurements from the sample mean is less than the sum of the squared deviations of those measurements from any other value. The very concept of the sample mean thus brings the least squares method to life.
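Both claims can be sketched on invented data (the numbers and the NumPy routine are my assumptions): a least-squares line fit, and a check that the sample mean beats any other constant in the sum-of-squares sense:

```python
import numpy as np

# Hypothetical observations around the line Y = 3 + 2X.
rng = np.random.default_rng(2)
X = np.linspace(0.0, 10.0, 50)
Y = 3.0 + 2.0 * X + rng.normal(scale=1.0, size=50)

# Least squares: minimise the sum of squared deviations of the observed
# Y from the line b0 + b1*X.
A = np.column_stack([np.ones_like(X), X])         # design matrix [1, X]
(b0, b1), *_ = np.linalg.lstsq(A, Y, rcond=None)  # least-squares coefficients

# Gauss-Legendre property: among all constants, the sample mean gives
# the smallest sum of squared deviations.
ss_mean = np.sum((Y - Y.mean()) ** 2)
ss_other = np.sum((Y - (Y.mean() + 1.0)) ** 2)    # any other value does worse
print(b0, b1, ss_mean < ss_other)
```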
Planning an experiment. The process of cognition never begins from scratch, and this circumstance makes it possible to plan an experiment. In fact, all pre-experimental knowledge is generalized in a model. Planning itself includes a number of stages, which constitute the essence of the design of the experiment.
First, the problem that forces us to turn to experiment is defined. Obviously, an experiment is conducted out of a desire to develop new knowledge: the previous level of knowledge no longer satisfies the researcher.
Second, defining the problem involves choosing the optimization parameters, which are characteristics of the goal. In the simplest case there is a single optimization parameter. If there are several, then, as a rule, a generalized optimization parameter is determined as a function of the initial ones.
Third, the composition of the factors affecting the optimization parameters is analyzed and their list determined. Factors are divided into levels; each level combines a class of factors that does not depend on another class. Sometimes factors are combined into blocks, the basis for such grouping being their similarity in the degree of influence on the optimization parameters. To simplify the experiment, it is permissible to consider only the most essential factors of a block. The main and minor factors are identified, and the degree of correlation between them is determined. Orthogonal factors, by definition, do not depend on each other.
Fourth, the number of trials to be carried out is determined. There should be exactly as many as are needed to achieve the purpose of the experiment; methods developed in statistics make it possible to determine this number.
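One common statistical rule of this kind (the particular formula and all numbers are illustrative assumptions, not from the text) says that to estimate a mean to within a margin E, at a confidence level given by the z-value and an assumed spread sigma, roughly n = (z * sigma / E)^2 trials are needed:

```python
import math

# Illustrative sample-size calculation for estimating a mean:
# z = 1.96 for 95% confidence, assumed standard deviation sigma,
# desired margin of error E.
z, sigma, E = 1.96, 4.0, 1.0
n = math.ceil((z * sigma / E) ** 2)
print(n)  # 62 trials suffice under these assumptions
```

Tighter margins or noisier measurements drive the required number of trials up quadratically.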
Fifth, the sample of data necessary for finding sample means and empirical laws is determined.
Sixth, ways are determined to ensure the reliability of the data and the possibility of repeating the experiment, both by the researcher himself and by other scientists.
Seventh, the experiment should be legitimate not only from the legal but also from the ethical point of view. Therefore, the ethical and legal aspects of the experiment are analyzed.
Eighth, the preliminary plan should make it possible to adjust the experiment depending on the intermediate results obtained.
Concluding the section, special attention should be paid to the fact that the experiment is usually regarded in two capacities: it is claimed to be necessary for (a) testing a theory and (b) developing a new theory. This is, of course, relevant. Nevertheless, the main point of this section is different. Note the line of transduction, which has now reached induction: we have discussed not the relationship experiment → theory, but the place of the processing of experimental results within the line of transduction.
The results of the experiment are facts. It is generally accepted that facts are truly objective events, primary in relation to the theory that is needed to comprehend them. To the question "What exactly exists?" the answer given is "Facts." But do principles, laws, and models exist? Factualists recognize their existence only insofar as they are reduced to facts; laws, they say, express the connection of facts. In the author's opinion, the factualists absolutize the significance of facts, which is why they consider facts the primary link in all conceptual frameworks. Actually, facts are an intermediate link, not the primary or final one, in intra-theoretical transduction.
1. Induction is the actual phase of conceptual transduction.
2. When performing induction, correlation and regression analysis are critical.
3. Induction is the separation of referents, inductive laws and principles.