Articles in English

1. Proceedings of the XXII World Congress of Philosophy: Volume 43
Grzegorz Bugajak

The notion of chance plays an important role in some philosophical analyses and interpretations of scientific theories; the most obvious examples are the theory of evolution and quantum mechanics. This notion, however, seems to be notoriously vague. Its application in such analyses more often than not refers to its common-sense understanding, which, by definition, cannot be sufficient when it comes to sound philosophical interpretations of scientific achievements. The paper attempts to formulate a ‘typology of chance’, distinguishing eight different meanings of this notion. Those meanings can be found in classical philosophical accounts of chance, in the common usage of the term, or form logical possibilities of its understanding. Subsequently, the paper points to those forms of the notion in question which may, and which may not, be properly applied to scientific theories and ideas, given the usual characteristics of the natural sciences. It also shows, using the particular theories mentioned above as examples, which of the distinguished forms of the notion of chance are actually applicable in the context of these theories.
2. Proceedings of the XXII World Congress of Philosophy: Volume 43
Georges Chapouthier

The present thesis, compatible with Darwinian theory, endeavours to provide original answers to the question of why the evolution of species leads to beings more complex than those existing before. It is based on the repetition of two main principles alleged to play a role in evolution towards complexity, i.e. "juxtaposition" and "integration". Juxtaposition is the addition of identical entities. Integration is the modification, or specialisation, of these entities, leading to entities on a higher level, which use the previous entities as units. Several concrete examples of the process are given, at the genetic level (introns), at the anatomical level and at the social level. Structures where integration at one level leaves the units at a lower level in a state of relative autonomy can be described using the metaphor of the "mosaic", and the description can also be applied to the human brain and the functioning of thought, where essential functions such as language or memory have a mosaic structure.
3. Proceedings of the XXII World Congress of Philosophy: Volume 43
Xiaoping Chen

Thomas Kuhn’s theory of paradigms reveals a pattern of scientific progress in which normal science alternates with scientific revolution, but he greatly underrated the role of scientific testing in this pattern. Wesley C. Salmon pointed out that, in criticizing the so-called testing pattern of science, Kuhn focused all his attention on a single testing model, namely the hypothetico–deductive (H–D) schema. As a matter of fact, however, many philosophers of science had already abandoned that schema and taken the Bayesian schema as the proper testing model. The main difference between the Bayesian schema and the H–D schema lies in the fact that the former is a testing model for more than one theory while the latter tests only a single theory. Since Kuhn, the multi-theoretical testing model has become a consensus among experts: a theory and its rivals should face testing together, rather than a theory being tested in isolation. Kuhn was correct in finding the H–D schema inappropriate to scientific testing, but he did not grasp the suitability of the Bayesian schema in this field, which led to his disapproval of the logic or method of scientific testing. I largely agree with Salmon’s appraisal of Kuhn’s view on scientific testing, and give a further argument for it. I will employ the Bayesian schema to re-examine Kuhn’s theory of paradigms, uncover its logical, or rational, components, and thereby illustrate the tension between logic and belief, rationality and irrationality, and comparability and incommensurability in the process of scientific revolution.
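
To make the schema contrast concrete: in its usual textbook form (a standard formulation from the confirmation literature, not quoted from the paper), the H–D schema asks only whether evidence E follows deductively from a single theory T, whereas the Bayesian schema distributes the support of E over each rival theory T_i at once:

\[ P(T_i \mid E) \;=\; \frac{P(E \mid T_i)\,P(T_i)}{\sum_j P(E \mid T_j)\,P(T_j)} \]

Because the denominator sums over the competing theories, a theory's posterior depends on how well its rivals accommodate the same evidence; this is the multi-theoretical character of Bayesian testing that the abstract describes.
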
4. Proceedings of the XXII World Congress of Philosophy: Volume 43
Ruey-Lin Chen

This paper discusses Hacking’s experimental realism and brings a concept of realization to bear on the issue of realism. I first rephrase Hacking’s experimental realism by reconstructing it into two theses and three arguments. Then I consider Resnik’s objection to Hacking’s experimental realism. On my understanding of Hacking’s experimental realism, Resnik’s objection fails because of his commitment to theory realism. Nevertheless, I think that there remain two problems concerning the experimental aspect of experimental realism: the argument from pessimistic induction over experimental science, and the argument from the combination of apparatus. I attempt to give a new perspective on the realism issue by proposing a set of related concepts comprising categorization, model, and realization. Last, I show that this conceptual scheme gives a better solution to the two problems and casts new light on the realism issue.
5. Proceedings of the XXII World Congress of Philosophy: Volume 43
Finn Collin

The strong programme in the sociology of science is officially "inductively" based, generalizing a number of highly acclaimed case studies into a general approach to the social study of science. However, at a critical juncture, the programme allies itself with certain radical ideas in philosophical semantics, notably Wittgenstein's "rule following considerations". The result is an implausible, radical conventionalist view of natural science which undermines the empirical programme.
6. Proceedings of the XXII World Congress of Philosophy: Volume 43
Alberto Cordero

The success of a scientific theory T is not an all-or-nothing matter; nor is a theory something one can usually accept or reject in toto (i.e. one may take T as being "approximately true", or take as true just certain "parts" of it, without necessarily affirming every posit and claim specific to T as being either completely right or completely wrong). This, however, raises questions about precisely which parts of T deserve to be taken as approximately true on the basis of its success. A line of thinkers, particularly Kitcher, Leplin and Psillos, variously look for parts of a theory they can claim to have been "essentially" implicated in its distinctive success, which they regard as primary candidates for realist truth ascription. But how is one to determine which parts of any theory are "central" or "peripheral", "essential" or "idle" in the required sense? Attempts at spelling out relevant synchronic links between successful predictions and correct partial theorizing increasingly look like a misguided effort. As an alternative, this paper proposes a weaker, but arguably powerful enough, version of the realist relation between success and truth. Focusing on a pivotal case study in recent debates between realists and anti-realists (the conceptual changes undergone by theories of light in the 19th century), a promising link between success and partial theoretical representation is located in the expansion and stabilization of approximately correct partial theoretical models of the theory's intended domain. The realist link is then formulated accordingly. In the resulting approach (a) predictive success is preserved as a marker of cumulative theoretical gain, but (b) specification of the latter is a diachronic rather than synchronic matter (i.e. gains become clear only after generations of theory change; specification of the particular loci of theoretical gain in connection with a given line of predictive success is not assumed to be generally possible at the time of the success in question). The truth ascriptions that get licensed are partial, of a piecemeal and retrospective sort, focused on methodologically specifiable theoretical subplots from past science.
7. Proceedings of the XXII World Congress of Philosophy: Volume 43
Alberto Cordero

A major realist response to Laudan-type historical arguments against scientific realism proceeds by seeking to identify parts of a successful scientific theory that can be claimed to have been "essentially" implicated in the theory’s distinctive success, and which are then regarded as primary candidates for realist truth ascription. But how is one to determine which parts of any theory are "central" or "peripheral", "essential" or "idle" in the required sense? Attempts at spelling out relevant synchronic links between successful predictions and correct partial theorizing increasingly look like a misguided effort. This paper proposes a weaker, but arguably still powerful, version of the relation between success and the growth of cumulative truth. Focusing on a pivotal case study in recent debates between realists and anti-realists (theories of light in the 19th century), a promising link between success and partial theoretical representation is located in the expansion and stabilization of approximately correct partial modeling of intended domains. The realist link is then formulated accordingly. In the resulting approach (a) predictive success is preserved as a marker of cumulative theoretical gain, but (b) specific gain identification is a diachronic rather than synchronic matter (i.e. specification of the particular loci of theoretical gain associated with a given line of predictive success is not assumed to be generally possible at the time of the success in question). The truth ascriptions that get licensed are partial, of a piecemeal and retrospective sort, focused on methodologically specifiable theoretical subplots from past science.
8. Proceedings of the XXII World Congress of Philosophy: Volume 43
Masaki Hrada

Fundamental notions Husserl introduced in Ideen I, such as epochè, reality, and the empty X as substrate, might be useful for elucidating how the concepts of mathematical physics are produced. However, this is obscured in the context of Husserl’s phenomenology itself. To draw out this possibility, the author modifies Husserl’s fundamental notions, introduced for a pure phenomenology that founds all sciences on the absolute Ego. Subsequently, the author displaces Husserl's phenomenological notions toward the notions operating inside scientific activities themselves, and shows this through a case study of the construction of noncommutative geometry. The perspective in Ideen I on geometry and mathematical physics includes points that are inappropriate to modern geometry and to modern physics, especially to noncommutative geometry and to quantum physics. The first point relates to the intuitive character of geometrical objects in Husserl. The second is linked to the notion of locality, related to the notion of extension, by which Husserl characterizes the essence of physical things. These points show that the notion of the empty X as a substrate, developed in “Phenomenology of Reason” in Ideen I, is helpful for considering the notions of physical reality and of geometrical space, especially reality in quantum physics and space in noncommutative geometry. The salient conclusions include the proposition that a philosophical study of the relationship between the physical object X, which imparts a unity to what is given to sensibility, and the geometrical space X, which imparts a unity of sense to various mathematical operations, opens a reinterpretation of Husserl’s account, supporting an epistemology of mathematical physics.
9. Proceedings of the XXII World Congress of Philosophy: Volume 43
Robert G. Hudson

Recent scholarship (mainly by Michael Friedman, but also by Thomas Uebel) on the philosophy of Rudolf Carnap, covering the period from the publication of Carnap’s 1928 book Der Logische Aufbau der Welt through to the mid-to-late 1930s, has tended to view Carnap as espousing a form of conventionalism (epitomized by his adoption of the principle of tolerance) and not a form of empirical foundationalism. On this view, it follows that Carnap’s 1934 The Logical Syntax of Language is the pinnacle of his work during this era, this book having developed in its most complete form the conventionalist approach to dissolving the pseudoproblems that often attend philosophical investigation. My task in this paper, in opposition to this trend, is to resuscitate the empiricist interpretation of Carnap’s work during this time period. The crux of my argument is that Carnap’s 1934 book, by eschewing for the most part the empiricism he espouses in the Aufbau and in his 1932 The Unity of Science, is led to a form of conventionalism that faces the serious hazard of collapsing into epistemological relativism. My speculation is that Carnap came to recognize this deficiency in his 1934 book, and in subsequent work (“Testability and Meaning”, published in 1936/37) felt the need to re-instate his empiricist agenda. This subsequent work provides a much-improved empiricist epistemology over Carnap’s previous efforts and, as history informs us, sets the standard for future research in the theory of confirmation.
10. Proceedings of the XXII World Congress of Philosophy: Volume 43
Valentin Karpovitch

Postpositivist epistemology treats science as merely a matter of consensus, the main reason being the lack of objectivity. We argue that objectivity is not an essential requirement of scientific methodology. Science as an institutional enterprise is characterized mainly by progressive discourse, not by objectivity. In turn, progressiveness depends on a set of norms and regulative principles. This view of science as progressive discourse provides a more adequate basis for dealing with conflicts of opinion, scientific methodology, and questions of authority in science than does the consensus view.
11. Proceedings of the XXII World Congress of Philosophy: Volume 43
Joonsung Kim

Glennan (2002) argues that the mechanism theory of causation explicates both type-level and token-level causation in terms of mechanisms. I argue against the mechanism theory that it is not sufficient for explicating cause-effect relations at the token level. I put forth two counterexamples to the theory (first, absence of causes; second, a cause preempting another cause), and show that descriptions of a mechanism are inert in explicating cause-effect relations at the token level. I point out that the problems with the mechanism theory are due to its explicating the cause-effect relation in a monolithic way.
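
The shape of the preemption counterexample can be rendered in the structural-equations style standard in the causal-modeling literature (an illustration of the problem, not Kim's own formalism). Suppose two marksmen A_1 and A_2, where the second fires only if the first does not:

\[ S_1 = A_1, \qquad S_2 = A_2 \wedge \neg A_1, \qquad E = S_1 \vee S_2 \]

With A_1 = A_2 = 1, the first shot S_1 actually brings about E, yet E does not counterfactually depend on A_1 (had A_1 been 0, S_2 would have produced E instead); a description of the E-producing mechanism alone does not identify which token path was operative.
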
12. Proceedings of the XXII World Congress of Philosophy: Volume 43
Chunghyoung Lee

I challenge the assumption in the debate on the conventionality of simultaneity that a simultaneity relation of special relativity should be defined relative to a single inertial observer, not relative to multiple inertial observers as such. I construct an example of a simultaneity relation relative to two inertial observers, and demonstrate that it is explicitly definable in terms of the causal connectibility relation and the world lines of the two observers. I argue that, consequently, the simultaneity relation of special relativity is not uniquely definable from the causal connectibility relation.
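
For background, the conventionality debate is usually framed in terms of Reichenbach's synchronization parameter (standard textbook material, not this paper's construction): if a light signal leaves observer A at time t_1, is reflected at B, and returns to A at t_3, the reflection event is assigned the A-time

\[ t_2 \;=\; t_1 + \varepsilon\,(t_3 - t_1), \qquad 0 < \varepsilon < 1, \]

where Einstein's standard synchrony is the choice \varepsilon = 1/2. The conventionalist claim is that any admissible \varepsilon respects causal connectibility; the paper's two-observer relation is directed against the single-observer framing built into this definition.
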
13. Proceedings of the XXII World Congress of Philosophy: Volume 43
Arkadiy Lipkin

The "object theoretic operational view" suggests a new structure of physical knowledge. This view takes branches of physics as basic units. Its main concepts are primary (PIO) and secondary (SIO) ideal objects with the explicit definition of SIO through PIO and the implicit definition of PIOs within appropriate systems of statements, called a "nucleus of a branch of physics" (NBP). Within an NBP (which has a definite structure) the focus shifts from discovering "laws of nature" to definition of a physical object (system) and its states, and the distinct notion "measurable" replaces the vague notion "observable". On this basis the roles of physical models and measurements within physics, as well as two types (PIO- and SIO- type) of theories, activity, and experiment are discussed, and a different junction of "realism” and "constructivism" is presented.
14. Proceedings of the XXII World Congress of Philosophy: Volume 43
Mamchur Elena

The paper analyzes the role of epistemology in contemporary science studies. According to representatives of the cultural approach to scientific cognition, the latter should be considered without regard to issues of truth or falsity, which excludes epistemology from the sphere of the investigation of science. The paper argues that, though inquiry into science as an aspect of human culture is quite possible, this sort of analysis is insufficient. In order to understand the nature of scientific cognition, one has to supplement it with the results of epistemological consideration.
15. Proceedings of the XXII World Congress of Philosophy: Volume 43
Shunkichi Matsumoto

In this paper, I will take advantage of the controversy over the legitimacy of adaptationism in evolutionary biology to further investigate the nature of adaptationistic thinking, and of biological explanations in general. To this end, I will first look at the famous and provocative criticism made by Gould and Lewontin (1979) against then-prevalent adaptationism, a research strategy that accounts for the origin of traits of organisms seemingly adapted to their environment by appealing primarily to natural selection. Then I will consider the counterarguments put forward by Dennett (1995), one of the proponents of adaptationism, in order to scrutinize the intrinsically hypothetical character of adaptationistic thinking. By amplifying Dennett’s points, I will finally reach the conclusion that there are two senses, objective and subjective, in which adaptationistic thinking can be said to be hypothetical, neither of which prevents it from qualifying as scientific practice. In the process, I will also gain, as a spin-off, an insight into the sense in which the theory of natural selection is said to be mechanistic.
16. Proceedings of the XXII World Congress of Philosophy: Volume 43
L.A. Minasyan

Analytical reflections on the tasks and functions of philosophy in the modern world, as well as efforts to derive a novel vision of practically all areas of philosophical thought, can become sound only after considering the innovations with which modern natural science has crossed the boundary between the 20th and 21st centuries. Discoveries in astrophysics at the end of the 20th century offer new and unprecedented perceptions of our world, in which only 4% of the total energy of the Universe is attributed to known forms of matter, 20% constitutes “dark matter” and 76% is “dark energy”. The first decade of the 21st century will go down in the history of civilization as one associated with breakthrough experiments shedding light on the nature of these mysterious types of matter, and with the construction of a unified field theory. It is safe to regard modern science as being on the verge of profound transformations. These changes are bound to alter our comprehension of the outside world, redefining the human being’s place in it. This places a set of serious new requirements on philosophy, which must show itself able to provide adequate and meaningful methodological interpretations of the anticipated discoveries. This article addresses the characteristic features of modern conceptual knowledge of the world and attempts to offer their philosophical and methodological comprehension.
17. Proceedings of the XXII World Congress of Philosophy: Volume 43
Kiyokazu Nakatomi

It is said that the theory of relativity and quantum theory are independent of each other; their relationship is like that of water and oil. It is now very important for modern physics to synthesize them. In physics and mathematics, superstring theory is studied, but with it the ten-dimensional world appears, while our world is three-dimensional. What is the ten-dimensional world? It is even more difficult than the string, which is of Planck length. In the ten-dimensional world, physics faces a darkness and nothingness which man cannot explain with traditional physical words. The solution depends upon philosophy. I have tried to synthesize the two theories and succeeded; the following is an outline of my synthesis. (1) Utility and relativity of mathematical truth: mathematical truth is not absolute but relative. In the universe (outside the solar system) there is no perfect line, because space, and the lines in it, are curved by the gravitation of large astronomical bodies. Mathematical figures and numeration depend upon the conventions of mankind; they are not absolute. Physics, which is grounded upon mathematics for its certainty, is also relative: it expresses not the whole of the universe but a part of it. (2) Commonality and difference between the theory of relativity and quantum theory: what they have in common is the negation of the absoluteness of physical attributes; where they differ is in their assessment of mathematics. The theory of relativity relies on mathematics, but quantum theory does not always do so; according to circumstances, Niels Bohr and the quantum physicists abandoned a frame of reference. (3) The origin of the theory of relativity. (4) The origin of quantum theory. In short, the theory of relativity and quantum theory are not perfect; they only irradiate a part of the universe. Man can reach the whole of the universe only through the philosophical intuition of nothingness and the infinite (the principle of nothingness and love).
18. Proceedings of the XXII World Congress of Philosophy: Volume 43
Igor Nevvazhay

The problem of a final theory concerns the prospects of the development of contemporary science. Arguments of both adherents and opponents of the final-theory idea appeal to philosophical and methodological beliefs. In my paper I am going, firstly, to analyze critically the philosophical ideas underlying the denial of the final-theory project, and, secondly, to show that a defense of that project demands reconsidering beliefs about the structure of a scientific theory and discussing the meaning of scientific laws.
19. Proceedings of the XXII World Congress of Philosophy: Volume 43
Ilkka Niiniluoto

According to the traditional requirement, already formulated by William Whewell in his 1840 account of the “consilience of inductions”, an explanatory scientific theory should be independently testable by new kinds of phenomena. A good theory should have unifying power in the sense that it explains and predicts several mutually independent phenomena. This paper studies the prospects of Bayesianism for motivating this kind of unification criterion for abductive confirmation.
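
One standard Bayesian gloss on Whewell's criterion (a common formulation in the literature, not necessarily the paper's own) runs as follows: by Bayes' theorem, the boost a theory T receives from two phenomena E_1 and E_2 is

\[ \frac{P(T \mid E_1 \wedge E_2)}{P(T)} \;=\; \frac{P(E_1 \wedge E_2 \mid T)}{P(E_1 \wedge E_2)}, \]

so T is confirmed most strongly when it renders jointly probable a conjunction of phenomena that is antecedently improbable precisely because the phenomena are mutually independent. This is one route to motivating unifying power as a criterion of abductive confirmation.
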
20. Proceedings of the XXII World Congress of Philosophy: Volume 43
Oue Yasuhiro

Ethics is a device produced by human consciousness to regulate human behaviour and society in a sound manner. Organisms are nowadays manipulated by the techniques of molecular biology, and it has become difficult to grasp the problems of life manipulation with ethical principles raised at the level of our senses. To regulate a society greatly influenced by the modern life sciences, it is time to utilize mechanistic knowledge about organisms as a basic principle of ethics (molecular ethics). Molecular ethics is not an ethics produced only by humane logic; it adopts as a core principle the properties of the genome operating system (OS) clarified by molecular biology. It is substantiated by the life sciences, but it is not a simple means of scientific decision-making; it promotes scientific experiments with integrity toward knowledge. Scientific knowledge is a projection of the rules of life. Molecular ethics is an ethics which seeks the rules of evolution, i.e. rules for transcending the human being. Traditional ethics (individual ethics, an ethics based on individualism), on the other hand, regards the present human being as the terminus of evolution; in this respect it is against the principle of the genome OS (the system of evolution). The life sciences and technologies have been developing to meet our wants, yet those activities nowadays seem to go against the rules of the genome OS. It is therefore extremely important for us to consider an ethics applicable to the era of life manipulation.