
Laws and Approximations as Deduction Steps

When a scientific conception is put in order, it becomes obvious that, in conceptual terms, principles precede laws. Having analyzed the nature of principles, it is reasonable to turn to the status of laws. It is necessary to distinguish hypothetical (deductive) and inductive laws. Consideration of experimental facts allows us to identify laws in the process of induction; they are called experimental laws. But induction is not an experiment. Strictly speaking, laws discovered in the process of induction should be called inductive rather than experimental propositions. Scientists often speak of universal laws. Universal physical and chemical laws are expressed by the so-called "universal conditional statement". Its simplest type is written as follows:

∀x (P(x) → Q(x))   (12.7)

This expression is read as follows: for any x, if x has the sign P, then it also has the sign Q. The law expresses a relationship among the signs of all the elements, in connection with which symbolic variables x_i are used; the index i runs through integer values from 1 to n, where n is the total number of elements x.
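The reading of the universal conditional statement can be made concrete with a small check over a finite class of elements. The predicates below are purely illustrative and not from the source: P(x) stands for "x is divisible by 4" and Q(x) for "x is even". A minimal sketch in Python:

```python
def holds_universally(elements, P, Q):
    """Return True if every x with the sign P also has the sign Q."""
    return all(Q(x) for x in elements if P(x))

# Hypothetical signs chosen purely for illustration:
P = lambda x: x % 4 == 0   # sign P: divisible by 4
Q = lambda x: x % 2 == 0   # sign Q: even

print(holds_universally(range(1, 101), P, Q))   # True: divisibility by 4 implies evenness
print(holds_universally(range(1, 101), Q, P))   # False: 2 is even but not divisible by 4
```

The second call shows how a single counterexample in the class falsifies the universal statement, which is exactly the asymmetry Popper exploits below.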

In the records of laws, the concept of a class of elements is always used. Elements form a class if they have at least one common feature, and this condition is met here. A law deals with classes of elements, not with an arbitrary sample from them. If science refused to consider classes of elements, it would acquire an exceptionally unusual form. The history of its development unambiguously testifies to the expediency of considering precisely classes of elements. But this circumstance makes the status of scientific laws far from obvious.

Above we considered the concept of a universal law. As it turns out, it is not entirely satisfactory. When scientists encounter previously unknown laws, strictly speaking, they are guided not by universal laws but by hypothetical-deductive laws, which are considered valid only for the phenomena under study. It is fully allowed that the results of further cognition will force us to abandon hypothetical laws. Thus, they are subject to certain restrictions.

For a long time scientists believed that hypothetical laws are verified (confirmed) in experiments. Much was clarified after the interventions of the critical rationalist K. Popper, who never tired of emphasizing that a hypothetical law is not verified but falsified. Popper's criticism was directed against the neopositivists, in particular R. Carnap, who under his pressure had to retreat. But, strangely enough, both sides made a certain mistake. The fact is that an inductive law is established, whereas a hypothetico-deductive law is introduced through the operation of abduction. Neither Carnap nor Popper drew a clear distinction between deductive and inductive laws. A hypothetical law is falsified by experiment; for an inductive law, experiment is its condition.

Now let us consider the deductive transition from hypothetical laws to predictable facts. In this connection, the operation of approximation, which will be discussed below, takes on special significance.

It was noted above that the unfolding of a theory acts as transduction. As applied to quantum chemistry, this means that it is not enough simply to write down the Schrödinger equation (the law); it also needs to be solved. The history of the development of quantum chemistry shows that this cannot be done without approximations (from the Latin approximare, to approach). By approximation in science is usually understood the expression of some quantities through others that are considered simpler. Suppose we are considering N electrons. In this case we mean that the N-electron wave function (N being the number of electrons) must be simplified in such a way that it becomes computable. Usually this circumstance is interpreted as follows: for N > 2 an exact description of the electronic wave function is impossible, so there is nothing to do but resort to simplifications; theoretical purity cannot be preserved, at least at the current level of science.

In the author's view, this kind of argument is not deep. Indeed, if the so-called flawlessly correct approach were known, it would be possible to characterize precisely any departure from it. But since it is unknown, we should refrain from describing its opposite, the incorrect approach. Theoretically meaningful approximations should be understood not as simplifications but as necessary stages of transduction. In this context, the topic of simplifications is of secondary importance. In support of this conclusion, we offer the following argument.

Fairly often the approximations used are in excellent agreement with the experimental data. In such cases researchers have no need to insist on their incorrectness. However, this idyll is invariably broken, and then it becomes necessary to introduce more refined approximations. How is this to be understood? As a continuation of scientific transduction, which involves the growth of scientific knowledge. Thus, since transduction cannot be realized without approximations, they act as its legitimate features. The growth of scientific knowledge forces us to reconsider the relevance not of all approximations, but only of those whose consistency has been disproved. The dynamics of scientific knowledge is often interpreted as a series of endless delusions. In fact, it is a string of achievements. The growth of scientific knowledge is ensured not by errors but by achievements.

So, approximations should be interpreted only in the context of transduction. It is no accident that approximations, as a rule, are the result of exceptionally dedicated work by researchers.

The Hartree-Fock method takes center stage among all the approximations used in quantum chemistry, so it makes sense to address it first.

Historical excursion

The history of the development of the Hartree-Fock method is very instructive. E. Schrödinger wrote down his famous equation in 1926. The following year D. Hartree proposed a method for solving it in which the wave function of a multielectron atom is represented as a product of the wave functions of the individual electrons corresponding to their different quantum states in the atom. The motion of each electron is determined by the field created by all the other particles, averaged in a certain way and given by certain potentials. Hartree's intention was to solve the Schrödinger equation ab initio, that is, on the basis of fundamental quantum-mechanical principles. The significance of his theoretical innovations was realized far from immediately, and only after J. Slater showed that the Hartree method puts into theoretical form a variational principle: the one-electron wave functions are chosen from the condition of minimum mean energy. In 1930 V. A. Fock perfected the Hartree method by giving the wave function an antisymmetric form ensuring the fulfillment of the Pauli principle, that is, he took into account the presence of spin in electrons. As a result, Fock linked the method under consideration with the theory of groups. In 1935 Hartree was able to give his method a form suitable for numerical calculations. But their effectiveness was revealed only in the early 1950s, after the advent of electronic computers. Thus, only a quarter of a century after its initial development was the effectiveness of the Hartree-Fock method revealed.
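The variational principle that Slater identified in Hartree's method can be illustrated on a toy problem. The sketch below is an illustration of the principle itself, not of Hartree's actual calculation: for the one-dimensional harmonic oscillator (in units where hbar = m = omega = 1), the trial function exp(-a x^2) gives the mean energy E(a) = a/2 + 1/(8a), and the variational principle guarantees E(a) >= 0.5, the exact ground-state energy, for every a.

```python
import numpy as np

def mean_energy(a):
    # Mean energy of the trial function exp(-a x^2) for the 1D harmonic
    # oscillator: kinetic part a/2 plus potential part 1/(8a).
    return a / 2 + 1 / (8 * a)

# Scan the variational parameter and pick the minimum.
alphas = np.linspace(0.1, 2.0, 1000)
energies = mean_energy(alphas)
best = alphas[np.argmin(energies)]

print(f"optimal a ~ {best:.3f}, E_min ~ {energies.min():.4f}")
```

The minimum lands at a = 0.5 with E = 0.5, i.e. the trial family happens to contain the exact ground state; for a less fortunate trial family the minimum would lie strictly above the exact energy.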

The electronic Schrödinger equation for molecular systems is often solved by the so-called valence bond method, in which the wave function of the molecule is expressed in terms of the wave functions of its constituent atoms. To each valence bond there corresponds not a one-electron but a two-electron function:

Ψ(1, 2) = χ(1, 2) σ(1, 2)   (12.8)

where χ is the spatial and σ the spin wave function, and the numbers 1 and 2 refer to the two electrons.

In the description of molecular systems, as a rule, linear combinations of the wave functions of several valence bonds are used. The coefficients in the linear combination are determined by the variational method from the condition of minimum energy.
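The determination of the coefficients by the variational method reduces, in an orthonormal basis, to a matrix eigenvalue problem: the lowest eigenvalue is the best variational energy, and the corresponding eigenvector holds the coefficients of the linear combination. The matrix elements below are made-up numbers chosen purely for illustration.

```python
import numpy as np

# Hamiltonian matrix <psi_i|H|psi_j> in an orthonormal two-function basis.
# The values are hypothetical, for illustration only.
H = np.array([[-1.0, -0.3],
              [-0.3, -0.6]])

E, C = np.linalg.eigh(H)   # eigenvalues in ascending order

print(f"ground-state variational energy: {E[0]:.4f}")
print(f"coefficients of the linear combination: {C[:, 0]}")
```

Note that the lowest eigenvalue lies below both diagonal elements: mixing the two basis functions lowers the energy, which is the whole point of taking a linear combination.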

The Hartree-Fock method is often combined with perturbation theory, which uses the representation of an unperturbed Hamiltonian H0 and a perturbed Hamiltonian H. The difference V = H - H0 is considered as a perturbation, and of the corrections depending on this difference only those of the lowest orders are taken into account. This is usually sufficient to obtain results compatible with the experimental data.
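The perturbation scheme can be sketched on a two-level matrix model; all numbers are illustrative, not from the source. With H = H0 + V and V small, the first- and second-order corrections to the ground level already reproduce the exact eigenvalue very closely:

```python
import numpy as np

H0 = np.diag([0.0, 2.0])          # unperturbed Hamiltonian (diagonal)
V = np.array([[0.0, 0.1],
              [0.1, 0.0]])        # small perturbation, illustrative values

# Exact ground level of the full Hamiltonian H = H0 + V.
exact = np.linalg.eigvalsh(H0 + V)[0]

# Lowest-order corrections to the ground level E0 = 0.
e1 = V[0, 0]                      # first order: diagonal element (zero here)
e2 = e1 + V[0, 1] ** 2 / (0.0 - 2.0)   # second order: |V01|^2 / (E0 - E1)

print(f"exact: {exact:.6f}, through second order: {e2:.6f}")
```

The residual discrepancy is of third order in V, which is why taking "only corrections of lower orders" is usually enough.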

In the theory of molecular formations containing many-electron atoms, the density functional method occupies a central place. The main idea of density functional theory is to replace the many-electron wave function by the electron density. This leads to an essential simplification of the problem: the many-electron wave function depends on 3N variables (three spatial coordinates for each of the N electrons), while the density is a function of only three spatial coordinates. But the method is correct only in the case of a fairly uniform distribution of the electron density. Its undoubted merit lies in the possibility of calculating molecular systems consisting of hundreds and sometimes thousands of atoms. Of course, it does not dispense with the use of various approximations.
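The 3N-versus-3 counting argument can be made concrete. Assuming a hypothetical grid of g points per coordinate (g = 10 is an arbitrary illustrative resolution), tabulating the many-electron wave function requires g**(3N) values, while the density requires only g**3 regardless of N:

```python
g = 10  # illustrative number of grid points per spatial coordinate

for N in (1, 2, 5, 10):
    wf_points = g ** (3 * N)   # wave function of N electrons: 3N coordinates
    rho_points = g ** 3        # electron density: always 3 coordinates
    print(f"N={N:2d}: wave function {wf_points:.1e} points, "
          f"density {rho_points} points")
```

Already at N = 10 the wave-function table would need 10^30 entries, while the density still fits in a thousand, which is the "essential simplification" referred to above.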

Density functional theory has always been suspected of departing from the ideals of quantum chemistry. Thanks to the research of P. Hohenberg and W. Kohn, the groundlessness of these suspicions has largely been shown. Density functional theory goes back to the works of L. Thomas (1927) and E. Fermi (1928), who were able to calculate the energy of an atom on the basis of the concept of electron density. It was believed that their method had been surpassed by the Hartree-Fock method, but the desire to cope with the calculation of many-electron systems forced chemists to return to the ideas of Thomas and Fermi. The quantum nature of these ideas is explained in many respects by the second Hohenberg-Kohn theorem (1964), according to which the energy of the electronic subsystem, written as a functional of the electron density, has a minimum equal to the ground-state energy; that is, it is a variational principle of quantum mechanics. As proved in the above theorem, the wave function of the ground state Ψ0 is a functional of the electron density in the ground state n0. Thus, the concepts of wave function and electron density are closely related to each other. This is especially obvious for the ground state, but not only for it. Interestingly, the popularity of the density functional method has two peaks, separated by a gap of three decades (the 1960s and the 1990s); in both cases they were associated with the development of computer technology.

A rather cursory review of chemical methods conducted by the author shows the nontrivial content of the various ways of carrying out transduction in quantum chemistry. N. F. Stepanov and Yu. V. Novakovskaya quite rightly point out the necessity of paying "proper attention to which methods and in what approximation can and should be used in solving a particular problem." The path from the fundamental laws, in particular the Schrödinger equation, to direct contact with the experimental data is both difficult and thorny. Conceptual surprises await the researcher at every step. But, which is extremely important, all the steps of deduction are interconnected.

Unfortunately, transduction at the stage of deduction is very often reduced to the use of approximate methods, which allegedly do not correspond to the original strictness of the theory. This erroneous opinion is considered below on the example of certain interpretations of the problem of approximations in quantum chemistry.

Scientists argue

In this regard, the article by V. Ostrovsky, "Towards a philosophy of approximations in 'exact' theories," is extremely interesting. Correctly noting that the problem of approximations is not given due attention in the philosophical literature, he ends his article with four conclusions, which we give here in condensed form.

1. It is inadmissible to regard approximations as weaknesses of the exact sciences; they are everywhere in them. This conclusion is not refuted by the existence of unjustified approximations.

2. Scientifically justified approximations are not defects of theories but a reflection of the characteristics of their nature. The hierarchy of approximations creates a unique way of recreating scientific images of a qualitative nature.

3. Approximations are among the most significant results of scientific research and must be considered in the philosophy of science in the first place.

4. The so-called quantitative methods and qualitative images that we owe to approximations complement each other in the sense of Bohr's complementarity principle.

According to the author, Ostrovsky's theory of approximations is worthy of high evaluation. Of course, like any other scientific position, it deserves critical examination.

From Ostrovsky's point of view, all basic scientific concepts are approximations. In particular, the Schrödinger equation itself is an approximation, since it does not take into account relativistic effects. These can be taken into account, but then it will become clear that the finite size of the particles, etc., has not been considered. All principles are also approximations. In the author's opinion, approximations occupy a definite place in transduction: their hour comes when the transition is made from principles and laws to predictable variables. It is extremely important to express the metamorphosis of deduction, its conceptual switching.

The world of science is not reducible to mere approximations. Any theory is problematic and therefore deserves to be placed under the fire of scientific criticism. But there is no reason to identify the problematic character of a theory with the presence of approximation steps in transduction.

At this point it makes sense to emphasize the appropriateness of distinguishing between approximations and mere approaches. The two are usually identified, but in that case it is difficult to comprehend the conceptual content of transduction. Using approaches, the researcher deliberately, for example in pursuing didactic goals, abandons the most developed theory, which nevertheless hovers before his eyes. Approaches are, as a rule, simplifications, a refusal to consider certain aspects of the reality being studied. The meaning of approximations, by contrast, lies not in simplification but in continuing the line of transduction initiated by the presentation of principles and laws. Approximations free the transduction line from congestion.

This circumstance has been realized only in recent years; a vivid example of such an understanding is Ostrovsky's theory. Historically, it happened that approximations were not distinguished from approaches, and their meaning was interpreted in literal correspondence with the etymology of the Latin word approximare, meaning approach. But in accordance with the conceptual structure of a theory, an approximation appears not as an approach to the law (equation) but as a development of its potential. The growth of scientific knowledge leads to a reassessment of the approximations already undertaken in the process of transduction, but this circumstance should not mislead us. The meanings of approximations and approaches are different.

V. Ostrovsky very accurately characterizes the nature of approximations by examining the sense of the Born-Oppenheimer approximation, considering the existence of shapes in molecules and the motion of electrons along orbits. His line of reasoning, which he calls realistic, consists in invariably closing his argument with a characterization of the actual state of affairs. This is the correct way of argumentation, for it is inadmissible to break off transduction already on the approaches to understanding the experimental results. In this regard, Ostrovsky is critical of the concept of a theoretical (subjective, or ideal) artifact, which is merely an aid in the researcher's activity and has no direct relation to chemical reality.

The Born-Oppenheimer approximation takes into account the difference in mass between nuclei and electrons (M much greater than m) and in their velocities (nuclear velocities much smaller than electronic ones). If both conditions are met, the nuclei are considered fixed, located at a certain distance from each other. But if a condition is not fulfilled, for example with respect to some excited states of molecules, then the mentioned distance ceases to be a sign of atoms and molecules. Ostrovsky shows that the introduction of ideas about the signs of atoms and molecules is always connected with some approximations, but none of them is absolute in nature: if they do not correspond to chemical reality, they should be abandoned.
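A rough numerical sketch of the two conditions. The mass ratio below is the CODATA proton-to-electron value; the velocity estimate assumes comparable kinetic energies of nucleus and electron, which is an assumption of this illustration, not a claim of the source:

```python
import math

# Proton-to-electron mass ratio (CODATA value), the lightest possible nucleus.
M_over_m = 1836.15

# At comparable kinetic energies (1/2) M v_N^2 ~ (1/2) m v_e^2,
# so v_N / v_e ~ sqrt(m / M).
velocity_ratio = math.sqrt(1 / M_over_m)

print(f"v_nucl / v_el ~ {velocity_ratio:.4f}")
```

Even for hydrogen the nuclei move roughly forty times more slowly than the electrons; for heavier nuclei the separation is stronger still, which is why treating the nuclei as fixed is usually legitimate.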

The concept of the orbital has aroused scientists' keen interest. Some methodologists of chemical science began to assert that orbitals do not exist but are just mathematical constructs and therefore cannot be observable. In assessing the question of the reality of orbitals, Ostrovsky's position seems very balanced. He notes that within the framework of the Hartree-Fock approximation, according to which each individual electron moves under the influence of the average field formed by the nuclei and the other electrons, the concept of the orbital is not only appropriate but also inevitable: it has a physical meaning. As for observations of orbitals, they too are possible, for example in the energy approximation. The signs of chemical reality can be judged only on the basis of approximations. On the other hand, a scientifically justified approximation is in one form or another indicative of the features of reality itself.

According to Ostrovsky, philosophical comprehension of the topic of approximations implies an appeal to N. Bohr's principle of complementarity. "Exact" quantitative methods and intuition-inspired approximations form a complementary pair in the universal sense of the complementary relationships that exist, according to Bohr, in society and nature. In this dual relation, quantitative methods represent the more objective side of nature, while the qualitative images generated by approximations remain on the subjective side of the researchers' interpretation of nature. "Very often we progress in science due to the development of approximation methods." Somewhat earlier, Ostrovsky explains the complementarity he introduces as follows: the more "exact" the equations, the less their explanatory power; conversely, the higher the heuristic potential of approximations, the less "exact" they are.

According to the author, Ostrovsky's appeal to Bohr's complementarity principle in his attempt to create a theory of approximations is a philosophical error. Quantitative and qualitative definitions do not stand in a complementary relation, in Bohr's sense, to each other. This can be shown most simply by considering any chemical variable, for example the mass of an atom of a chemical element, m_i. In this case m is the quality, its i-th value is the quantity, and m_i is a certain measure. There is no relation here, presupposed by Bohr's principle, according to which one decreases as the other increases. The essence of the matter does not change with the transition to equations, since the same variables appear in them. The more precise the solution, the more relevant the knowledge of chemical reality; in this case there is no reason to put the word "precise" in quotation marks. Ostrovsky never forgets to put the words exact (for example, science) and exact (in particular, solution) in quotation marks. This shows his caution, for he understands perfectly that without approximations it is impossible to achieve exact solutions that have chemical meaning. But, turning to Bohr's principle, Ostrovsky, forgetting the need for scientific vigilance, compares the exact, quantitative (in quotation marks) with the qualitative (without quotation marks). Only in this case is complementarity so attractive to him. Ostrovsky's attempt to assign the objective mainly to the department of quantity and the subjective to the department of quality is also unsuccessful. This attempt is declarative, because the categories of the subjective and the objective are considered casually, without proper argumentation.

The noted shortcomings of Ostrovsky's theory of approximations do not undo its undoubted merits. In his interpretation, approximations appear as far from ordinary concepts of scientific theory. This conclusion certainly deserves attention. But, according to the author's argument, if we want to understand approximations in a systematic form, they should be considered in the context of transduction. Significant difficulties remain, however, in understanding the internal mechanism of transduction, including in relation to approximations. In the author's opinion, it should be understood as a kind of probabilistic-game strategy.

Another interpretation of approximations deserves consideration, namely as a characteristic of the limited possibilities of cognition. According to the well-known American physicist and cosmologist J. Hartle, our knowledge has limits of three kinds: (a) the difference between the observed and the predicted (we can observe very complex phenomena but predict only relatively simple ones, because the laws are simple); (b) the impossibility of providing the desired volume of calculations; (c) the limited opportunities for testing theories through induction and verification.

Starting from Hartle's ideas, the Italian chemist A. Tontini aims to establish the limits of chemical cognition, paying special attention to the impossibility of synthesizing a desired chemical substance. According to the author, both Hartle and Tontini fail to pay due attention to one extremely essential subtlety. The so-called restrictive theorems point not to the limits of our cognitive abilities but to the structure of the reality under study. The Heisenberg uncertainty relation characterizes the chemical world itself, not our cognitive abilities. The progress of knowledge indicates its unlimited possibilities. Neither in physics nor in chemistry have phenomena been identified whose knowledge is inaccessible to man. The dilemma "the world is complex, the laws are simple" is not a scientific but a speculative contrast. On the basis of scientific material it is only permissible to conclude that the complex world is cognized through scientific laws, and cognition itself is devoid of any boundaries. Cognition is unfinished, this is true, but it does not follow that it is powerless before anything. Approximations express the features of the phenomena studied, not our powerlessness before their complexity.

Conclusions

1. Laws and approximations are stages of deduction in the composition of transduction.

2. The meaning of approximation is to ensure deduction.
