
Displaying: 1-20 of 31 documents

series introduction

1. The Proceedings of the Twenty-First World Congress of Philosophy: Volume 5
Ioanna Kuçuradi


volume introduction

2. The Proceedings of the Twenty-First World Congress of Philosophy: Volume 5
Stephen Voss, Berna Kilinç, Gürol Irzik


section: logic and philosophical logic

3. The Proceedings of the Twenty-First World Congress of Philosophy: Volume 5
Erdinç Sayan

Many wonder at the abundance of disputes, opposing views and schools in philosophy. This abundance is surprising in view of the fact that philosophers are known for their striving and high regard for rationality. (There are, of course, philosophers who attempt to oppose, mostly by rational argumentation, the view that philosophy should be a rational discipline.) Why are all these admirably smart and rational people in so much disagreement with each other? Suvar Köseraif argues that the explanation of this phenomenon may lie in the fact that when two perfectly rational agents A1 and A2 disagree about matters of truth, there seems no way they can settle their dispute in purely rational ways. For suppose A1 believes in the truth of claim Q on the basis of premises P and a valid argument P ∴ Q, and A2 believes that ~Q on the basis of premises R and a valid argument R ∴ ~Q. Then it would seem on logical, hence rational, grounds that A1 must reject A2's reasons R, since P ∴ ~R is also valid. Thus the reasons P, which led A1 to rationally accept Q, also constitute rational reasons for A1 to reject R, and consequently to reject the argument A2 adduces for ~Q. Symmetrically, A2 cannot but reject the reasoning A1 adduces for Q. So the dispute between A1 and A2 concerning the truth of Q cannot be resolved—unless either side compromises its rationality and yields to such nonrational methods as threats, brainwashing, offers of money, etc. If all this is right, we have (rational) reason to be pessimistic about the value of rationality not only in philosophy, but in any sphere of thought, including science. I attempt to offer a rational counterargument against Köseraif's.
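The symmetric-rejection step in Köseraif's argument can be written out as a short derivation (a sketch in classical propositional logic, not taken from the paper itself):

```latex
\begin{align*}
&\text{(1)}\quad P \vDash Q &&\text{(A}_1\text{'s valid argument)}\\
&\text{(2)}\quad R \vDash \neg Q &&\text{(A}_2\text{'s valid argument)}\\
&\text{(3)}\quad \text{Assume } P \wedge R. \text{ By (1) and (2), both } Q \text{ and } \neg Q \text{ follow: contradiction.}\\
&\text{(4)}\quad \text{Hence } P \vDash \neg R, \text{ and, symmetrically, } R \vDash \neg P.
\end{align*}
```

So each agent's own premises give that agent a deductively valid reason to reject the other's premises, which is why neither side can be rationally compelled to yield.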
4. The Proceedings of the Twenty-First World Congress of Philosophy: Volume 5
Leah Savion, Raymundo Morado

One of the key concepts in the philosophy of logic is the notion of inference. In this paper we expand the notion of logical inference and describe its role in a comprehensive theory of rationality. Some recent rationality theories either presuppose an unattainable logical capacity or minimize the role of logic, in light of the vast amount of data on fallacious inferential performance. We defend the view that logical acuity, redefined to include heuristics, is a necessary factor in rationality. We evaluate some presuppositions of algorithmic models and some normative and metatheoretical properties of heuristic models, and defend our model against possible objections. Our revised notion of logical inference functions as the nucleus of the notion of logical acuity, which in turn is a necessary building block for a realistic model of rationality. This model emphasizes the logical role of inferential heuristics, cognitive constraints and contextuality, introduces concepts such as "obvious inference", "cautious deductive closure", and "familiarity", and develops the notions of cognitive economy and contextual limitations as tools for evaluating and predicting rational behavior.
5. The Proceedings of the Twenty-First World Congress of Philosophy: Volume 5
Mark Weinstein

Informal logic offers a radical new perspective on the evaluation of arguments. But little work has been done on how deep concepts in the logical foundations of argument need to be modified in light of such efforts. This paper offers an indication of what might be done by sketching a new approach to the theory of entailment, truth and relevance.
6. The Proceedings of the Twenty-First World Congress of Philosophy: Volume 5
Claude Gratton

Henry W. Johnstone (1996) attempts to use a notion of postponement to give a general account of the viciousness of infinite regresses. Though some of his examples suggest that his notion applies only to beginningless regresses (...eRdRcRbRa), I will show that it also applies to endless ones (aRbRcRdRe...). Unfortunately, despite this expanded application, it does not apply to all vicious regresses, not even to some of his own examples; it is cumbersome and unnecessary, and it fails to explain how some infinite regresses entail a contradiction.
7. The Proceedings of the Twenty-First World Congress of Philosophy: Volume 5
Risto Vilkko

Many historians and philosophers of logic have claimed that during the modern classical era there was a long period of stagnation or even of decline in the field of logic. The aim of this paper is to convince the audience that this standard evaluation of the development of modern logic during the period from Leibniz to Frege is misdirected and needs to be corrected. Even though it is true that the now usual way of understanding logic merely as the doctrine of syntax and semantics of explicit languages would not have appealed even to most 19th century logicians, it is still not the case that there is nothing worth discussing with regard to the development of logic during the modern classical period. The algebraic period culminated with Schröder's contribution, and neither Herbartian formal logic nor Trendelenburg's critical epistemology aroused much interest among the 20th century mathematical logicians and analytic philosophers. Nevertheless, the development of symbolic logic can only be understood properly by relating its emergence to the immediately preceding philosophically-oriented discussion about the reform of logic.
8. The Proceedings of the Twenty-First World Congress of Philosophy: Volume 5
Andrés Bobenrieth

The origin of Paraconsistent Logic is closely related to the argument that from the assertion of two mutually contradictory statements any other statement can be deduced, which can be referred to as ex contradictione sequitur quodlibet (ECSQ). Despite its medieval origin, only in the 1930s did it become the main reason for regarding contradictions as unacceptable in a deductive system. The purpose of this paper is to study what happened before: from Principia Mathematica to the time when ECSQ became well established. The main historical claim that I am going to advance is the following: the first explicit use of ECSQ as the main argument for supporting the necessity of excluding any contradiction from deductive systems is to be found in the first edition (1928) of the book Grundzüge der theoretischen Logik by Hilbert and Ackermann. At the end, I will suggest that the aim of the 20th-century usage of ECSQ was to shift the centuries-long discussion about contradictions from a philosophical register to a more "technical" one. But with Paraconsistent Logic viewed as a technical solution to this restriction, the philosophical problem revives, though now with an improved understanding of it at one's disposal.
9. The Proceedings of the Twenty-First World Congress of Philosophy: Volume 5
Axel Arturo Barceló Aspeitia

Using conjunction as an example, I show a technical and philosophical problem that arises when trying to reconcile the currently prevailing views on the meaning of logical connectives: the inferentialist (also called 'syntactic') one, based on introduction and elimination rules, and the representationalist (also called 'semantic') one, given through truth tables. In particular, I show that the widespread strategy of using the truth-theoretical definition of logical consequence to collapse both definitions must be rejected by inferentialists. An important consequence of my argument is that there are different notions of conjunction at play in standard first order logic, and that the technical and philosophical connections between them are far from well established.
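The two characterizations of conjunction at issue can be illustrated with a small truth-table check (a minimal sketch, not from the paper: it merely verifies that the introduction and elimination rules are sound under the usual truth-table semantics, which is the uncontroversial direction of the correspondence):

```python
from itertools import product

# Truth-table ("semantic") definition of conjunction.
def conj(a: bool, b: bool) -> bool:
    return a and b

# Check that the inferentialist rules are truth-preserving
# on every row of the truth table:
for a, b in product([True, False], repeat=2):
    # Introduction rule: from A and B, infer A-and-B.
    if a and b:
        assert conj(a, b)
    # Elimination rules: from A-and-B, infer A; and infer B.
    if conj(a, b):
        assert a
        assert b

print("introduction and elimination rules are sound for the truth table")
```

The paper's point, of course, is that this soundness check does not by itself show the two definitions pick out one and the same notion of conjunction.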
10. The Proceedings of the Twenty-First World Congress of Philosophy: Volume 5
Wen-Fang Wang

Gideon Rosen proposes a view called "modal fictionalism" which Rosen thinks has all the benefits of modal realism without its ontological costs. Whereas modal realists offer a paraphrase of a modal claim "φ", modal fictionalists claim that the correct translation of "φ" is rather the result of prefixing "according to the hypothesis of a plurality of worlds" to the realist's paraphrase of "φ". Rosen takes the prefix to be primitive and defines other modal notions in terms of it. Bob Hale, however, thinks the fictionalist's project suffers from a "simple" dilemma. The purpose of this paper is to show that Rosen is right in taking the prefix as primitive and that Hale is wrong in thinking fictionalism threatened by the dilemma.

section: philosophy of natural sciences

11. The Proceedings of the Twenty-First World Congress of Philosophy: Volume 5
Robin Attfield

In response to the arguments of Bill McKibben and of Stephen Vogel that nature is at an end and that the very concept of nature should be discarded, I argue that, far from this being the case, the concept of nature is indispensable. A third sense of 'nature' besides the two distinguished by Vogel, that of the nature of an organism, is brought to attention and shown, through five arguments, to be indispensable for environmental philosophy and ethics, and for ethics in general (veterinary and medical ethics included). And it is no coincidence that the same term is used for all three senses of 'nature' in many languages. The indispensability of 'nature' in the third sense is used to suggest the indispensability of 'nature' in the second sense (things unaffected by human activity, a sense needed if we are to understand species, whether wild or domesticated, because of their evolutionary history, and if we are to distinguish social systems from natural systems), and also of 'nature' in the first sense (things that are not supernatural, a sense needed if we are to ask metaphysical questions about whether 'nature' in this sense and in the other two might have a creator).
12. The Proceedings of the Twenty-First World Congress of Philosophy: Volume 5
Gregor Schiemann

From the point of view of the history and philosophy of science, the relationship of Descartes' to Aristotle's concept of nature has not been grasped in an entirely satisfactory way. In this article, the two concepts will be subjected to a comparative analysis, beginning with the outstanding feature that both concepts of nature are characterized by a contradistinction to the non-natural: Aristotle separates nature and technology; Descartes opposes nature to thinking. My thesis is that these meanings have found privileged application in specific contexts of experience: the field of application especially suitable for the Aristotelian concept is the experience of everyday life, while for the Cartesian concept it is subjective experience. Historically, the relationship between meaning and experience is of help in understanding the conditions in which the two concepts arose. The topical relevance of the concepts to modern society is a consequence of the continued existence of the favored contexts of experience. Roughly stated, we sometimes still perceive in an Aristotelian way and at other times think in a Cartesian way.
13. The Proceedings of the Twenty-First World Congress of Philosophy: Volume 5
Jan Such

In this essay I shall try to offer an outline of an answer to the following question: which features of its subject matter and which methodological peculiarities allowed cosmology to be transformed into one of the scientific branches of physics only in this century, despite the fact that cosmological reflections on the Universe, and particularly on its origin, are present in the most archaic cultures and thus belong to some of the oldest springs of human thought?
14. The Proceedings of the Twenty-First World Congress of Philosophy: Volume 5
Małgorzata Szcześniak

In my paper I take up the philosophical problem of the eternity of the world. If we consider this problem philosophically, we must turn to the early stage of the evolution of the world, i.e. to the "pre-physical" stage. By "pre-physical" I mean the early stage of cosmological evolution which, given the extreme conditions then prevailing, cannot be described or explained by the well-founded fundamental theories of present-day physics, such as quantum mechanics, relativity theory or thermodynamics. Today we know only the upper temporal boundary of this stage, the so-called Planck threshold, which amounts to 10⁻⁴³ s. The main problem in giving an ontological characterization of this stage is the attempt to fix its lower temporal boundary, for with this problem one of the oldest and most debated philosophical questions is connected, namely the problem of the eternity of the world: (1) the problem of the duration of the "pre-physical" stage (that is, of fixing its lower temporal boundary: did it last infinitely long, or only a tiny fraction of a second?); (2) is the beginning of time (if there were one) also the beginning of the world?; (3) does the Big Bang constitute the absolute beginning of the world, or only the beginning of one of several stages in its evolution?; (4) does the possibility of an absolute beginning of the world necessarily presuppose creation by God (supernatural creation), or does it admit the possibility that the world came into being in a natural way (natural creation)?; (5) the problem of the "singular" moment. I present all these questions on the basis of recent achievements in physics and cosmology.
15. The Proceedings of the Twenty-First World Congress of Philosophy: Volume 5
Manuel Liz Gutiérrez

There are important cases where properties not referred to by expressions from the languages of physics are enabled in certain times and circumstances to get causal control over some kinds of physical events. I will argue that in those cases we would have to transfer to those properties the causal sufficiency to bring about these events. This would offer a principle of causal inheritance in sharp contrast with the inheritance principle for the causal sufficiency of second order properties defended by Jaegwon Kim in his recent discussion of the causal exclusion problem concerning mental properties. The two principles would be very different. Their domains of application would be distinct. Kim's principle would transfer causal sufficiency to the more "concrete" physical properties able to realize mental properties understood as second order properties. Our principle would transfer causal sufficiency to the more "general" properties able to cause the relevant physical effects in the times and circumstances in question. That way, it would be possible to give a quite simple answer to the problem of causal exclusion posed by Kim in relation to mental properties. Our approach also would have very important consequences in relation to ordinary macrophysical causation.
16. The Proceedings of the Twenty-First World Congress of Philosophy: Volume 5
Ivan Z. Tsekhmistro

The main idea of quantum mechanics, whether formulated in terms of the Planck constant or the noncommutativity of certain observables, must be tied to the recognition of the relativity and nonuniversality of the abstract concept of set (manifold) in the description of quantum systems. This entails the necessarily probabilistic description of quantum systems: since a quantum system ultimately cannot be decomposed into elements or sets, we have to describe it in terms of probabilities of only a relative selection of certain elements or sets in its structure. This gives rise to the potential possibilities of quantum systems in an actual physical situation, and as a result the corresponding probabilities are ontologically real, like any other physically verifiable relationships. In this way, the quantum potential possibilities (and probabilities as their measure) are no less objectively real than the conventional reality which we identify with the physically directly verifiable elements, particles, etc. Indeed, the distribution of probabilities described by the nonfactorizable wave function is as objectively real and concrete as chairs, walls and all other physical things. In the pure quantum state the probabilities of selection of elements from the ultimately detailed state of the system are mutually coordinated and correlated by the phenomenon of wholeness of the system, and form an implicative logical structure governed by this phenomenon of wholeness. This idea of the implicative logical organization of the probabilistic structure of a quantum system in the so-called pure (non-detailable) state, and the governing role of the phenomenon of wholeness (in the redistribution of probabilities depending on the nature of the development of the real experiment), is in good agreement with the results of quantum correlation experiments (for example, the experiments of Alain Aspect, Nicolas Gisin and others).
17. The Proceedings of the Twenty-First World Congress of Philosophy: Volume 5
Mehmet Elgin

Much ink has been spent on Popper's falsificationism. Why, then, am I writing another paper on this subject? This paper is neither a new kind of criticism nor a new kind of defense of falsificationism. Recent debate about the legitimacy of adaptationism among biologists centers on the question of whether Popper's falsificationism or Lakatos' methodology of scientific research programs (SRP) is adequate for understanding science. Stephen Jay Gould and Richard C. Lewontin (1978) argue that adaptationism is unfalsifiable since it easily invites ad hoc adjustments when it makes false predictions. William A. Mitchell and Thomas J. Valone (1990) argue that adaptationism is a research program, and that the charge of unfalsifiability does not apply to a research program. Although both sides make use of the theories of scientific methodology proposed by Karl R. Popper (1934, 1957, 1963, 1971) and Imre Lakatos (1965, 1974, 1978), the differences and the similarities between these philosophers are overlooked. The purpose of the present paper is to explicate the differences and the similarities between the two philosophies of science.

section: philosophy of social sciences

18. The Proceedings of the Twenty-First World Congress of Philosophy: Volume 5
Margarita Vázquez Campos, Manuel Liz Gutierrez

In contrast with the development of big theories in the context of social sciences, there is nowadays an increasing interest in the construction of simulation models for complex phenomena. Those simulation models suggest a certain image of social sciences as a kind of, let us say, "patchwork". In that image, an increase in understanding of the phenomena modeled is obtained through a certain sort of aggregation. There is not an application of sound, established theories to all the phenomena of a certain kind, but an aggregation of the structures supposed, and of the results obtained, when particular systems are modeled. The recent case of the "El Farol Bar" problem, and the models built to address it, are a good example of this. We will analyze that case, trying to make clear what would be implied by the image mentioned above. Special attention will be paid to the need to take seriously the notion of a bounded rationality, linked to the special circumstances generating each decision problem, and to the existence of an irreducible pluralism of models.
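The "El Farol Bar" problem can be sketched as a small agent-based simulation in which boundedly rational agents decide each week whether to attend a bar that is enjoyable only when not overcrowded. The sketch below is a minimal illustration under assumed parameters (the agent count, capacity, and bias-based predictors are illustrative choices, not taken from the paper or from any particular model in the literature):

```python
import random

def el_farol(n_agents=100, capacity=60, weeks=52, memory=5, seed=0):
    """Minimal El Farol sketch; returns the weekly attendance series.

    Each agent forecasts next week's attendance as the mean of the last
    `memory` weeks plus a fixed personal bias (a crude stand-in for the
    heterogeneous predictors of the original problem), and attends only
    if its forecast is at or below capacity.
    """
    rng = random.Random(seed)
    biases = [rng.uniform(-15.0, 15.0) for _ in range(n_agents)]
    history = [rng.randint(0, n_agents) for _ in range(memory)]  # seed weeks
    for _ in range(weeks):
        predicted = sum(history[-memory:]) / memory
        attendance = sum(1 for bias in biases if predicted + bias <= capacity)
        history.append(attendance)
    return history[memory:]

attendance = el_farol()
```

Even this crude version exhibits the characteristic feature of the problem: no uniform best strategy exists, since any forecast shared by everyone is self-defeating, which is why such models aggregate heterogeneous, locally bounded decision rules rather than apply one established theory.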
19. The Proceedings of the Twenty-First World Congress of Philosophy: Volume 5
Antti Saaristo

In this paper I argue, contrary to the modern paradigm of rational choice theory in sociological theorising, that Durkheim was correct to think that collectivistic notions are required if there is to be sui generis social science. However, Durkheim's anti-individualism must be naturalised to be compatible with modern monistic ontology. I argue that the required naturalisation is offered by the notion of humans as strongly social animals in general and the notion of collective intentionality in particular. I argue that such a collectivistic but ontologically naturalistic notion is (i) a fundamental building block of social reality, (ii) supported by empirical studies, (iii) required by theoretical analyses of social action dilemmas and, finally, (iv) capable of providing the key to the construction of a more humane world of the future.
20. The Proceedings of the Twenty-First World Congress of Philosophy: Volume 5
Hans-Herbert Kögler

The paper explores the extent to which 'postmodernism' has affected our conception of social theory, especially with regard to the normative assumptions involved in cultural and social interpretation. It makes a proposal about how to redefine normativity after the postmodern challenge. Postmodernist theorists engage in the rejection of trans-contextual notions of truth and universalistic moralities. Yet since these efforts themselves involve commitments to truth and normativity, we might be inclined to reject them as inherently incoherent. A different, more promising road would consist in taking seriously the postmodern critique of reason, and to inquire whether, instead of necessarily implying a total rejection of reason, it suggests a reformulation of the scope and nature of truth and normativity. In this paper, I prepare such a reconceptualization with regard to the issue of normativity. The aim is to sketch a theory of normative commitment as built into our interpretive practices, if understood as the dialogical reconstruction—and thus recognition—of the other's beliefs and assumptions. To make the case for this proposal, I first present, by means of a comparison between modern and postmodern conceptions of social science, an interpretation of the relevance of the postmodern challenge with regard to modern social theory. Based on this, I will sketch a fourfold discursive field of positions addressing the justification of normative perspectives after postmodernism. This discussion will serve as a backdrop against which the concept of a hermeneutic competence of dialogical perspective-assumption can emerge as a plausible candidate for grounding our normative intuitions.