## Arbitrary number of independent variables

The concepts introduced above are easily generalized to the situation in which the researcher studies the effect of an arbitrary number of independent variables on one dependent variable.

The linear regression equation (9.1) for *k* independent variables, under the condition of a z-transformation of the raw data, takes the following form:

$$\hat{z}_Y = \beta_1 z_1 + \beta_2 z_2 + \dots + \beta_k z_k.$$

The general methodology for calculating the regression coefficients in multiple regression with one dependent variable and *k* independent variables does not differ fundamentally from what we already know from the simple linear regression considered in Chap. 7. In effect, we need to fit a straight line in a space of dimension *k* + 1 to the points whose coordinates were obtained in the experiment. The number of these points is determined by the number of subjects participating in the experiment, while the dimensionality of the space equals the number of independent variables plus one additional dimension assigned to the dependent variable. The optimal solution is the one for which the sum of the squared differences between the observed and predicted values is minimal. It is found using the method of differential calculus known as the **least squares method**.

The minimum of the following sum, which defines the regression error, is then sought:

$$\sum_{i=1}^{N}\left(z_{Y_i} - \sum_{j=1}^{k}\beta_j z_{ij}\right)^2 \rightarrow \min.$$

The partial derivative with respect to each value of βj is set equal to zero. Thus, to find all the regression coefficients, a system of equations of the following form must be solved:

$$\sum_{i=1}^{N} z_{il}\left(z_{Y_i} - \sum_{j=1}^{k}\beta_j z_{ij}\right) = 0, \qquad l = 1, \dots, k.$$

We rewrite this system of equations somewhat differently:

$$\sum_{j=1}^{k} r_{lj}\,\beta_j = r_{Yl}, \qquad l = 1, \dots, k.$$

Obviously, the right-hand side of this system is the vector of bivariate correlations of the dependent variable with all the independent variables, while the left-hand side is the product of the matrix of intercorrelations of the independent variables with each other and the vector of the sought regression coefficients. In other words, using the apparatus of vector algebra, the system of equations under consideration can be rewritten in the following vector form:

$$\mathbf{R}_{jj}\,\boldsymbol{\beta} = \mathbf{r}_{Yj}.$$

Thus, to find the desired vector of the regression coefficients βj, it suffices to multiply the correlation vector *r* by the *inverted matrix of intercorrelations* **R**jj. The desired solution for the vector of standardized regression coefficients then looks like this:

$$\boldsymbol{\beta} = \mathbf{R}_{jj}^{-1}\,\mathbf{r}_{Yj}.$$

It is clear that inverting the matrix of intercorrelations can be quite laborious in "manual" calculation, especially when the number of independent variables is large. Modern computer programs, however, cope with it easily and without requiring large computational resources.
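As an illustration, the matrix solution above can be sketched in a few lines of NumPy. The data, sample size, and seed below are invented purely for demonstration; in practice the z-scores would come from the observed raw data.

```python
import numpy as np

# Minimal sketch of beta = R_jj^{-1} r_Yj: the standardized regression
# coefficients from the intercorrelation matrix of the predictors and
# their bivariate correlations with the dependent variable.
rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 3))                   # three independent variables (synthetic)
y = 0.5 * X[:, 0] - 0.3 * X[:, 1] + rng.normal(scale=0.5, size=n)

Z = (X - X.mean(axis=0)) / X.std(axis=0)      # z-transform the predictors
zy = (y - y.mean()) / y.std()                 # z-transform the criterion

R = np.corrcoef(Z, rowvar=False)              # intercorrelation matrix R_jj
r = Z.T @ zy / n                              # correlations r_Yj of each X_j with Y

beta = np.linalg.inv(R) @ r                   # standardized coefficients beta_j
print(beta)
```

Because the z-scores have zero mean and unit variance, this matrix solution coincides exactly with an ordinary least-squares fit of zy on Z.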

Having obtained the values of the standardized regression coefficients β, we can calculate the regression coefficients *B* of the multiple linear regression equation (9.1). This can be done using the formula

$$B_j = \beta_j\,\frac{s_Y}{s_j}.$$
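The conversion from standardized to raw-score coefficients can be sketched as follows; the tiny data set is invented, and the intercept is recovered from the means (a step the formula above leaves implicit):

```python
import numpy as np

# Sketch: convert standardized coefficients beta_j to raw-score
# coefficients B_j = beta_j * (s_Y / s_j), plus the intercept.
X = np.array([[1.0, 10.0], [2.0, 12.0], [3.0, 11.0], [4.0, 15.0], [5.0, 14.0]])
y = np.array([2.0, 3.5, 3.0, 5.5, 5.0])       # synthetic criterion values

# Standardized solution via the correlation matrix, as in the text.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
zy = (y - y.mean()) / y.std()
R = np.corrcoef(Z, rowvar=False)
r = Z.T @ zy / len(y)
beta = np.linalg.inv(R) @ r

B = beta * y.std() / X.std(axis=0)            # raw-score slopes B_j
B0 = y.mean() - B @ X.mean(axis=0)            # intercept: regression passes through the means
print(B, B0)
```

The result matches an ordinary least-squares fit on the raw data with an intercept term, which is a useful sanity check when doing the conversion by hand.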

**The coefficients** *R* and *R*² for *k* independent variables are defined in exactly the same way as in the case considered earlier, when the number of independent variables was limited to two. In other words, the multiple correlation coefficient *R* is estimated as the ordinary bivariate correlation between the observed and predicted values of *Y* (formula (9.4)), and the coefficient of determination *R*² as the ratio of the variance of the predicted values to the variance of the observed values of the dependent variable (formula (9.5)).

The same coefficients can also be expressed in terms of the regression coefficients β and the bivariate correlations of the dependent variable with each of the independent variables:

$$R^2 = \sum_{j=1}^{k} \beta_j\, r_{Yj}.$$
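The equivalence of the two definitions of *R*² can be checked numerically; the data below are synthetic, generated only to illustrate the identity:

```python
import numpy as np

# Check that R^2 = sum_j beta_j * r_Yj agrees with the variance-ratio
# definition R^2 = var(y_hat) / var(y).
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))                 # synthetic predictors
y = X @ np.array([0.6, -0.2, 0.4]) + rng.normal(scale=0.8, size=100)

Z = (X - X.mean(axis=0)) / X.std(axis=0)
zy = (y - y.mean()) / y.std()
R = np.corrcoef(Z, rowvar=False)
r = Z.T @ zy / len(y)
beta = np.linalg.inv(R) @ r

R2_from_betas = beta @ r                      # R^2 = sum_j beta_j r_Yj
zy_hat = Z @ beta                             # predicted (standardized) values
R2_from_variances = zy_hat.var() / zy.var()   # ratio of predicted to observed variance
print(R2_from_betas, R2_from_variances)
```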

It should be borne in mind, however, that the coefficient of determination *R*² computed in this way **is not an unbiased estimate** of the determination coefficient ρ² for the general population. Therefore, in order to obtain a more realistic estimate of ρ², an adjustment must be made to the calculated *R*². This is achieved using the formula

$$\tilde{R}^2 = 1 - \left(1 - R^2\right)\frac{N - 1}{N - k - 1}.$$

The adjusted coefficient of determination is often called "shrunken", since its value is somewhat less than the value of *R*².
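A small helper makes the shrinkage concrete; the sample values of *R*², *N*, and *k* here are arbitrary illustrations:

```python
# Shrunken (adjusted) coefficient of determination,
# assuming the correction R2_adj = 1 - (1 - R2) * (N - 1) / (N - k - 1).
def adjusted_r2(r2, n, k):
    """Adjust R^2 for sample size n and k independent variables."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# With a modest sample and several predictors the shrinkage is noticeable;
# as n grows, the adjusted value approaches the raw R^2.
print(adjusted_r2(0.40, n=50, k=5))
print(adjusted_r2(0.40, n=5000, k=5))
```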

**Part correlations** for *k* independent variables are treated in a manner similar to what we already know from the example of two independent variables. Thus, the coefficient *sr*² is understood as that part of the variance of the dependent variable *Y* which is associated with the variance of the independent variable *X*j, minus that portion of the variance of *Y* which is simultaneously related to the variance of the other independent variables. This can be expressed by the following equation:

$$sr_j^2 = R^2 - R_{(j)}^2.$$

Here $R_{(j)}^2$ denotes the proportion of the variance of the dependent variable *Y* associated with the variance of all the independent variables except *X*j; in other words, it is the value of the determination coefficient computed without the contribution of the variable for which the part correlation is calculated. A closely related quantity is the **tolerance** of the variable *X*j: if *X*j is regressed on the remaining independent variables and the resulting determination coefficient $R_j^2$ is subtracted from unity, we obtain

$$\mathrm{Tol}_j = 1 - R_j^2.$$

The value of the part correlation for the variable *X*j itself can then be found from the standardized regression coefficient βj and the corresponding tolerance:

$$sr_j = \beta_j \sqrt{\mathrm{Tol}_j}.$$
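Tolerance and the part correlation can be sketched numerically; the data and the deliberate intercorrelation between predictors below are invented for illustration:

```python
import numpy as np

# Sketch of tolerance Tol_j = 1 - R_j^2 (from regressing X_j on the other
# predictors) and the part correlation sr_j = beta_j * sqrt(Tol_j).
rng = np.random.default_rng(2)
X = rng.normal(size=(150, 3))
X[:, 2] += 0.5 * X[:, 0]                      # make the predictors intercorrelated
y = X @ np.array([0.5, -0.3, 0.2]) + rng.normal(scale=0.7, size=150)

Z = (X - X.mean(axis=0)) / X.std(axis=0)
zy = (y - y.mean()) / y.std()
R = np.corrcoef(Z, rowvar=False)
r = Z.T @ zy / len(y)
beta = np.linalg.inv(R) @ r

j, others = 0, [1, 2]
# R_j^2: determination coefficient of X_j regressed on the other predictors
b_j = np.linalg.lstsq(Z[:, others], Z[:, j], rcond=None)[0]
R2_j = (Z[:, others] @ b_j).var() / Z[:, j].var()
tol_j = 1 - R2_j                              # tolerance of X_j
sr_j = beta[j] * np.sqrt(tol_j)               # part correlation of X_j
print(tol_j, sr_j)
```

The same value of $sr_j^2$ is obtained as the drop in the overall $R^2$ when *X*j is removed from the regression, which is exactly the definition given above.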

The part correlation can also be calculated from the available value of the partial correlation coefficient for the variable *X*j and the determination coefficient *R*²:

$$sr_j^2 = \frac{pr_j^2\left(1 - R^2\right)}{1 - pr_j^2}.$$

Formulas (8.7) and (8.8) can be useful if the statistical program with which the regression analysis is performed does not directly report the part correlations.

**Partial correlations** for *k* independent variables are interpreted in the same way as for two independent variables. As we recall, *pr*² represents the ratio of that part of the variance of the dependent variable that is related to the variance of the given independent variable and is not simultaneously related to the variance of the other independent variables, to the variance of the dependent variable that is not associated with any of the other independent variables. Formally, this definition of the partial correlation can be expressed by the following relationship:

$$pr_j^2 = \frac{R^2 - R_{(j)}^2}{1 - R_{(j)}^2}.$$

The part of the variance of the dependent variable that is related to the variance of the independent variable under consideration (the one for which the partial correlation is calculated) and at the same time is not related to the variance of the other independent variables is, by definition, the square of the part correlation. Thus, the square of the partial correlation *pr*² of the variable *X*j with the dependent variable *Y* can be expressed as follows:

$$pr_j^2 = \frac{sr_j^2}{1 - R^2 + sr_j^2}. \qquad (9.8)$$

Note that formula (9.8) demonstrates, among other things, that under no circumstances can the partial correlation be less than the part correlation. On the contrary, its value is almost always greater than that of the part correlation.
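Formula (9.8) and the inequality it implies can be checked with arbitrary example values (the numbers below are invented):

```python
# Squared partial correlation from the part correlation and R^2,
# per formula (9.8): pr_j^2 = sr_j^2 / (1 - R^2 + sr_j^2).
def partial_from_part(sr_j, r2):
    """Return pr_j^2 given the part correlation sr_j and the overall R^2."""
    return sr_j**2 / (1 - r2 + sr_j**2)

sr_j, r2 = 0.25, 0.55                  # illustrative values only
pr2 = partial_from_part(sr_j, r2)
# Since sr_j^2 <= R^2, the denominator is at most 1, so pr_j^2 >= sr_j^2.
print(pr2, pr2 >= sr_j**2)
```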
