
II. Phases in the development of the SP


Several phases can be distinguished in the development of the SP. Some of these are briefly described in what follows.

The first phase starts with Galileo and is the application of experimentation and speculation to investigate the physical world under certain methodological restrictions. This phase reaches its first climax with Newton's theory of gravitation and the establishment of mechanics as the primary model for natural phenomena. During this phase the notion of determinism is prevalent. Mechanics is developed by physicists such as Leibniz, Maupertuis, Lagrange, Hamilton and Laplace into classical mechanics, with celestial mechanics as its crowning achievement. The mathematical formulation goes under the name of analytical mechanics.

A second phase begins with the realisation that there are non-deterministic aspects that need to be taken into account for the description of phenomena in nature. In the same year, 1859, Maxwell and Darwin publish scientific works that use chance and randomness as tools for modelling processes in nature. This approach quickly develops into statistical mechanics at the hands of Boltzmann, Gibbs and others, and in biology into neo-Darwinism after the 1950s.
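
A compact emblem of this use of randomness (standard in statistical mechanics, though not quoted in the text) is Boltzmann's entropy formula, which ties a macroscopic quantity to a count of microscopic configurations:

    S = k_B \ln W

Here S is the entropy, k_B is Boltzmann's constant and W is the number of microstates compatible with the macrostate.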

A third phase begins with relativity theory and quantum mechanics in the first decades of the 20th century. During this phase, in which the atomic structure of matter is established, one is forced to give up classical realism. In the theory of relativity the notion of space and time as an independent background, as conceived by Newton, is totally changed into that of a four-dimensional space-time continuum in which time and space mix, implying a change in the notion of simultaneity; in general relativity both space and time are dynamical variables that interact with matter.
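
The mixing of space and time, and with it the relativity of simultaneity, can be read off from the Lorentz transformation of special relativity (a standard result, quoted here for illustration). For relative velocity v along the x-axis:

    t' = \gamma \left( t - \frac{v x}{c^2} \right), \qquad x' = \gamma (x - v t), \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}

Two events with the same t but different x acquire different t': whether events are simultaneous depends on the observer.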

A fourth phase starts after World War II with the development of biology as a form of micro-mechanics. The discovery of the double-helix structure of DNA and of the structure of proteins, and their use in the description of the theory of evolution, marks its entry on the scene.

A fifth phase starts with Hubble's discovery in 1929 of the expansion of the universe, followed by the development of elementary particle physics and modern cosmology, leading to the attempt at formulating a theory of everything (TOE).
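
Hubble's observation is summarised in what is now called the Hubble law (a standard formula, supplied here for reference):

    v = H_0 d

where v is the recession velocity of a galaxy, d its distance and H_0 the Hubble constant.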

In this greatly simplified scheme many physicists might feel unhappy not to find their field clearly represented (solid state physics, for example, is here subsumed under quantum physics/mechanics). Nevertheless there is a clear trend: from determinism, which is mechanical and without purpose, to the addition of randomness, which is also without visible purpose, to the implementation of the mechanistic scheme to describe all of nature - classical mechanics, celestial mechanics, solid mechanics, fluid mechanics, continuum mechanics, quantum mechanics etc. - including modern biology. There are of course good reasons for this course of events. The reasons are to be found in the new notion of knowledge that is introduced in the SP and that will be elucidated below.

III. What is needed to establish a scheme of knowledge?

Many scientists are sceptical of metaphysics and of its possible relevance to science. This is in my opinion a mistake, since such aspects enter into any serious self-reflection on one's practice in science. To put up a scheme of knowledge we need at least three elements:

1. An aim for the knowledge,
2. A set of criteria to validate a knowledge claim according to the aim,
3. A common practice to obtain the same, or at least as similar as possible, experiences for us to reach consensus on what the basic observations or so-called facts should be.

The aim determines the validation criteria and will influence the practice. In the SP the archetype of the common practice is the work in the laboratory or the observatory. This includes the understanding of the operation and use of the instruments that extend our senses into the micro or the macro world.

a. An aim for the knowledge

Some people would argue that there is no need for an aim with knowledge. We seek knowledge for its own sake, and more knowledge is better than less. In fact they would probably argue that they simply seek the truth about nature. The problem is not only that even this is an aim, but that we have not the slightest clue as to what could represent the truth. Our senses play tricks on us all the time. It is in order to diminish this problem that both Galileo and, after him, Descartes insist that science should occupy itself only with the so-called primary qualities of matter, such as length, time, mass, velocity etc. The secondary qualities, such as smell, colour, taste etc., are understood to depend upon our senses and should be avoided. Even so, it is not possible to formulate criteria for a knowledge claim that pretends to aim for truth, since we do not know how truth would manifest itself to us, even in the laboratory or the observatory, and not even if we stick to the primary qualities of matter. The history of physical theories, and how they have been overturned, presents a clear illustration of this. As soon as we formulate a subordinate aim for the knowledge we seek, we can set up a set of criteria for it. The result is then a series of maps of nature in accordance with the aim.

Several schemes of knowledge have been set up in the past. Not all of them were complete in all three aspects. Aristotle, however, devised a rather complete scheme. His aim was to find and establish the foreordained hierarchical order of everything in the universe. To this end he required information about four “causes”: the material, formal, efficient and final cause. Only then do we, according to Aristotle, have knowledge and understanding of a thing in relation to the universe. For him basically all human experiences could be used as common practice, although even the Greeks understood the problem with the subjectivity of the senses. It seems to be the lack of a common practice that to some extent stopped the Greeks from advancing in a systematic way into science as we understand it today. We may therefore denote Greek science as the proto-scientific phase.

Francis Bacon suggested that the aim of the Scienza Nova, the new science, the new knowledge of nature, should be to obtain power over nature. He wanted science “to bring relief of man's estate” and to use science to develop utilities for man. He also realised that brooding over possible purposes in nature stood in the way of this aim. “Inquiry into final causes is sterile, and like a virgin consecrated to God, produces nothing” (Bacon 1623) was his incisive formulation of the predicament we were in. Respecting nature as God's creation would stop man from exploiting her. Therefore final aims should be avoided. René Descartes was of the same opinion: final causes are to be banished from science. Ever since, scientists have adhered to this view and decided to study nature as if it were devoid of intention or meaning. This prevails into our times.

Modern science has reformulated the aim slightly, into a less provocative one: the aim is to control and obtain power over nature and to predict the future. Power obviously enables you to predict the future to some extent. The idea of predictability then comes into the foreground. The criteria for knowledge validation are then subject to this aim.

b. A set of criteria to validate a knowledge claim according to the aim

Which processes can be easily predicted? Clearly the repeatable ones! We therefore insist upon the study of repeatable events, and that it should be possible to repeat each experiment several times. We also seek to establish cause-effect relations in the processes of nature, essentially the efficient causes of Aristotle. The prototype for this is the relation between the set-up of an experiment and its outcome. The set-up is the cause, and the outcome is the effect of the cause. In this way each effect can be the cause in a new cause-effect relation, in a chain ad infinitum. Normally it is believed that this chain can be cut at an arbitrary place, and that from this place on the new situation is the cause of the following events. We do not need to consider any memory of the past. We also limit ourselves to what is called locality in space and time: all causes should be found in a limited region in space and time, since otherwise the whole universe, including all its history, would be a potential cause. Such a cause is obviously not possible to have access to.
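
One standard way to formalise this memorylessness (my gloss, not a formulation from the text) is the Markov property: for a sequence of states x_0, x_1, ...,

    P(x_{n+1} \mid x_n, x_{n-1}, \ldots, x_0) = P(x_{n+1} \mid x_n)

The next state depends only on the present one, so the causal chain may indeed be cut at any point without loss.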

In the experimental process there are things that cannot be changed or altered. These aspects are referred to as universal laws, like the law of gravitation or Newton's law, and constants of nature, such as the value of the elementary electric charge, the gravitational constant, the speed of light in vacuum, etc. The other circumstances in the cause are referred to as boundary conditions. It is the experimentalist who must find out which boundary conditions can be varied, and how much relevant information we need to control, in order to perform the experiment in a successful and repeatable way within the given accuracy of measurement. A theory is then suggested as soon as enough data are collected. Too few data do not in general make good starting material for suggesting a theory: too many theories could then be suggested.

Francis Bacon suggested that the inductive method be used in science: by measuring more and more accurately we would finally find out the laws of nature. This strategy has largely been abandoned in favour of the complementary strategy of guessing, or hypothesizing, a law and then subjecting it to tests. This hypothetico-deductive method has been highly successful.
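
As an illustration, here is a minimal Python sketch of the hypothetico-deductive loop, assuming a hypothetical falling-body experiment (the scenario, numbers and function names are mine): a law is guessed, predictions are deduced from it, and the predictions are tested against measurements within their uncertainty.

    import random

    def measure_fall(t, rel_noise=0.05):
        """Simulated experiment: distance fallen in time t, with measurement noise."""
        g_true = 9.81
        return 0.5 * g_true * t**2 * (1 + random.gauss(0, rel_noise))

    def predict_fall(t, g=9.81):
        """Deduction from the guessed law d = g t^2 / 2."""
        return 0.5 * g * t**2

    # Test: the hypothesis survives as long as prediction and observation
    # agree within the stated experimental uncertainty (about 3 sigma here).
    for t in (1.0, 2.0, 3.0):
        observed, predicted = measure_fall(t), predict_fall(t)
        ok = abs(observed - predicted) <= 3 * 0.05 * predicted
        print(f"t = {t} s: observed {observed:.2f} m, predicted {predicted:.2f} m, consistent: {ok}")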

It now becomes clear how the insistence upon determinism enters at the inception of the project in order to ensure predictability. By determinism we mean one-to-one mappings: given a certain situation, another unique and calculable situation should be reached in the next step in time, within the limits of experimental uncertainties. In general there are also one-to-many, many-to-one and many-to-many mappings. In quantum mechanics one-to-many situations are the general case. This diminishes the predictability, and we can only predict the statistical outcome of many experiments. Still, for technological applications this often turns out to be enough.
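
A minimal Python sketch of the contrast just described (the toy maps are purely illustrative): a one-to-one map is predictable event by event, while a one-to-many map is predictable only in its statistics over many repetitions.

    import random

    def deterministic_step(x):
        return 2 * x  # one-to-one: the same input always gives the same output

    def stochastic_step(x):
        return x + random.choice([-1, +1])  # one-to-many: two possible outcomes

    print(deterministic_step(3), deterministic_step(3))  # always identical

    # Single stochastic outcomes cannot be predicted, but the average over
    # many repetitions can: it converges to 3.
    outcomes = [stochastic_step(3) for _ in range(10_000)]
    print(sum(outcomes) / len(outcomes))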

We also use a principle called Occam's razor: we do not introduce more hypotheses than necessary to suggest a theory for the phenomena. The smallest number of assumptions that gives an adequate description of the data is adopted. The theory is then subjected to further tests of its predictions.
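
One modern, quantitative rendering of the razor (my example; the text does not mention it) is model selection with an information criterion: a penalty per fitted parameter lets the simplest adequate model win. A sketch assuming noisy straight-line data:

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0, 10, 50)
    y = 2.0 * x + 1.0 + rng.normal(0, 1.0, x.size)  # the "truth" is a straight line

    for degree in (1, 2, 5):
        coeffs = np.polyfit(x, y, degree)
        rss = np.sum((y - np.polyval(coeffs, x)) ** 2)
        k = degree + 1                                # number of fitted parameters
        aic = x.size * np.log(rss / x.size) + 2 * k   # Akaike criterion: lower is better
        print(f"degree {degree}: AIC = {aic:.1f}")

    # The degree-1 fit typically scores best: extra hypotheses (higher-order
    # terms) are not adopted unless the data demand them.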

An important element in the scientific process is that we communicate our findings honestly, reporting what we have found to be relevant for the experiment. It should in principle be possible for another scientist to repeat the experiment.

Again, final causes are not supposed to be used as elements in the description or explanation of an experiment. The material causes of Aristotle are of course there, as well as the formal causes. Since nature sometimes, if not always, shows up one-to-many situations, the idea of using chance to model them is in line with the effort to avoid "final causes". Chance can by definition not have any final cause or meaning.

In quantum mechanics even the notion of classical realism had to be abandoned. Quanta cannot be assigned arbitrarily exact locations and velocities simultaneously, according to Heisenberg's uncertainty principle. This implies, for example, that electrons do not move in orbits around the atomic nucleus. We cannot picture in a classical way how they move, or rather behave. We can only predict the probability of finding them at various positions or with various momenta (= mass times velocity) at a certain time. “It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature” was Niels Bohr's conclusion about the situation.
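
In standard notation (supplied here for reference) the uncertainty principle reads

    \Delta x \, \Delta p \ge \frac{\hbar}{2}

where \Delta x and \Delta p are the uncertainties in position and momentum and \hbar is the reduced Planck constant: making one of them small forces the other to be large.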

In biology, Darwin's theory of evolution has been reformulated in modern times in terms of DNA as the carrier of the information of the individual. The model states that random mutations (of DNA) and the impact of the physical/biological/geological environment, providing a “natural selection” of the fittest, are the forces that drive evolution. From this it follows that evolution has no direction and no aim or purpose. This is built into the model.
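
A toy Python sketch of this model (entirely my construction): genomes are bit strings, variation is a random bit flip, and the environment is nothing but a fitness function that selects. No goal is programmed in; any apparent direction emerges from selection alone.

    import random

    GENOME_LEN, POP_SIZE, GENERATIONS = 20, 50, 100

    def fitness(genome):
        # Hypothetical environment: organisms with more 1-bits happen to survive better.
        return sum(genome)

    population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                  for _ in range(POP_SIZE)]

    for _ in range(GENERATIONS):
        # Random mutation: each offspring gets one randomly flipped bit.
        offspring = []
        for genome in population:
            child = genome[:]
            child[random.randrange(GENOME_LEN)] ^= 1
            offspring.append(child)
        # "Natural selection": keep the fittest half of parents plus offspring.
        population = sorted(population + offspring, key=fitness, reverse=True)[:POP_SIZE]

    print(sum(fitness(g) for g in population) / POP_SIZE)  # mean fitness after selection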

c. A common practice

As to the third element in the structure of knowledge, we use rational thinking and study cause-effect relations in the laboratory or observatory. By rational thinking is here basically meant mathematical modelling. This is important, since mathematics is the only contradiction-free language we know. Once relations between observable quantities have been formulated in mathematical terms, we usually agree on these relations. An example is given by quantum mechanics. Most physicists agree on the mathematical formulation of quantum mechanics and on how to compute various predictions with it. However, as Max Jammer (1966) has pointed out, there is a large number of different ways to interpret what the mathematical formulation means in ordinary words.

Science has since the time of Galileo been understood as a process of measurement of the primary qualities of matter. However, as time goes on we can see that it is rather the experimental situation that characterizes science. Of course everything is still a kind of measurement: measurement, relative to some standard, of the primary qualities of matter. Nevertheless the configuration of the experimental set-up is a decisive aspect of the experimental situation and of the interpretation of what the measurement says. This becomes even clearer in quantum mechanics, where the measuring device and the object become entangled after their interaction in a measuring situation.
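
Schematically (a standard textbook form, not the author's notation), the post-measurement state of object and device is the entangled superposition

    |\Psi\rangle = \sum_i c_i \, |a_i\rangle_\text{object} \, |M_i\rangle_\text{device}

in which neither subsystem has a state of its own: only correlated pairs of object value a_i and pointer reading M_i occur, with probabilities |c_i|^2.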

In each discipline it takes time to learn to perform experiments, to understand the workings of the instruments, or even to suggest the construction of new instruments for various investigations. It is not always easy for the untrained eye or mind to use scientific instruments and to understand what they show. A well-known example is the astronomical tube (telescope) of Galileo, into which even contemporary philosophers refused to look. Another example is the microscope. When freshmen look at biological specimens in the microscope, they rarely can see anything useful and clear to begin with. Only after learning what to look for, and how to complete the fragments of the picture into an intelligible form, can they discern what the specimen is showing. Similar difficulties are presented by radiography.

In theory building we also require that the new theory be consistent, within the experimental error limits, with older established findings. This most often means that in situations where the old theory is adequate, both the old and the new theory should give the same predictions within the error limits.
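
A standard example of this consistency requirement (added for illustration) is how relativistic kinetic energy reduces to the Newtonian expression at low speed:

    E_k = (\gamma - 1) m c^2 = \tfrac{1}{2} m v^2 + \tfrac{3}{8} \frac{m v^4}{c^2} + \cdots \approx \tfrac{1}{2} m v^2 \quad (v \ll c)

so wherever the old mechanics was confirmed, the new theory gives the same predictions within the error limits.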

Above, I have concentrated on the physical side of the SP at the expense of chemistry in particular. Chemistry has been very important for the development of the structure of matter, both in the way its structural simplification has been worked out, basing chemistry on the stable basic elements of the periodic table, and in the special way carbon chemistry has been worked out in organic chemistry. The chemistry of gases was also instrumental in establishing the theory of atoms. In quantum chemistry the sciences of chemistry and physics meet. In addition to chemistry, I have left out other sciences such as geology, meteorology, etc. But these can be seen to rely to a large extent on the findings of the other sciences as applied to various domains of interest.

