139 results for historiography of philosophy


Relevance: 90.00%

Abstract:

The purpose of this study is to analyze and develop various forms of abduction as a means of conceptualizing processes of discovery. Abduction was originally presented by Charles S. Peirce (1839-1914) as a "weak", third main mode of inference -- besides deduction and induction -- one which, he proposed, is closely related to many kinds of cognitive processes, such as instincts, perception, practices and mediated activity in general. Both abduction and discovery are controversial issues in philosophy of science. It is often claimed that discovery cannot be a proper subject area for conceptual analysis and, accordingly, abduction cannot serve as a "logic of discovery". I argue, however, that abduction gives essential means for understanding processes of discovery although it cannot give rise to a manual or algorithm for making discoveries. In the first part of the study, I briefly present how the main trend in philosophy of science has, for a long time, been critical towards a systematic account of discovery. Various models have, however, been suggested. I outline a short history of abduction; first Peirce's evolving forms of his theory, and then later developments. Although abduction has not been a major area of research until quite recently, I review some critiques of it and look at the ways it has been analyzed, developed and used in various fields of research. Peirce's own writings and later developments, I argue, leave room for various subsequent interpretations of abduction. The second part of the study consists of six research articles. First I treat "classical" arguments against abduction as a logic of discovery. I show that by developing strategic aspects of abductive inference these arguments can be countered. Nowadays the term 'abduction' is often used as a synonym for the Inference to the Best Explanation (IBE) model. I argue, however, that it is useful to distinguish between IBE ("Harmanian abduction") and "Hansonian abduction"; the latter concentrating on analyzing processes of discovery. The distinctions between loveliness and likeliness, and between potential and actual explanations are more fruitful within Hansonian abduction. I clarify the nature of abduction by using Peirce's distinction between three areas of "semeiotic": grammar, critic, and methodeutic. Grammar (emphasizing "Firstnesses" and iconicity) and methodeutic (i.e., a processual approach) especially, give new means for understanding abduction. Peirce himself held a controversial view that new abductive ideas are products of an instinct and an inference at the same time. I maintain that it is beneficial to make a clear distinction between abductive inference and abductive instinct, on the basis of which both can be developed further. Besides these, I analyze abduction as a part of distributed cognition which emphasizes a long-term interaction with the material, social and cultural environment as a source for abductive ideas. This approach suggests a "trialogical" model in which inquirers are fundamentally connected both to other inquirers and to the objects of inquiry. As for the classical Meno paradox about discovery, I show that abduction provides more than one answer. As my main example of abductive methodology, I analyze the process of Ignaz Semmelweis' research on childbed fever. A central basis for abduction is the claim that discovery is not a sequence of events governed only by processes of chance. 
Abduction treats those processes which both constrain and instigate the search for new ideas, starting from the use of clues as seeds of discovery and continuing with considerations such as elegance and 'loveliness'. The study then continues a Peircean-Hansonian research programme by developing abduction as a way of analyzing processes of discovery.
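For orientation, the inferential form at issue is Peirce's own well-known schema for abduction (not quoted in this abstract, but standard in the literature, often cited as CP 5.189):

    The surprising fact, C, is observed;
    but if A were true, C would be a matter of course;
    hence, there is reason to suspect that A is true.

Whereas deduction derives C from A and induction generalises from observed cases, abduction runs from a surprising fact back to a hypothesis that would explain it; this is the sense in which it can serve as a logic of discovery without amounting to an algorithm for making discoveries.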

Relevance: 90.00%

Abstract:

This work offers a systematic phenomenological investigation of the constitutive significance of embodiment. It provides detailed analyses of subjectivity in relation to itself, to others, and to objective reality, and it argues that these basic structures cannot be made intelligible unless one takes into account how they are correlated with an embodied subject. The methodological and conceptual starting point of the treatise is the philosophy of Edmund Husserl. The investigation employs the phenomenological method and uses the descriptions and analyses provided by Husserl and his successors. The treatise is motivated and outlined systematically, and textual exegesis serves as a means for the systematic phenomenological investigation. The structure of the work conforms to the basic relations of subjectivity. The first part of the thesis explores the intimate relation between lived-body and selfhood, analyzes the phenomena of localization, and argues that self-awareness is necessarily and fundamentally embodied self-awareness. The second part examines the intersubjective dimensions of embodiment, investigates the corporal foundations of empathy, and unravels the bodily aspects of transcendental intersubjectivity. The third part scrutinizes the role of embodiment in the constitution of the surrounding objective reality: it focuses on the complex relationship between transcendental subjectivity and transcendental intersubjectivity, carefully examines the normative aspects of genetic and generative self-constitution, and argues eventually that what Husserl calls the paradox of subjectivity originates in a tension between primordial and intersubjective normativity. The work thus reinterprets the paradox of subjectivity in terms of a normative tension, and claims that the paradox is ultimately rooted in the structures of embodiment. In this manner, as a whole, the work discloses the constitutive significance of embodiment, and argues that transcendental subjectivity must be fundamentally embodied.

Relevance: 90.00%

Abstract:

Bertrand Russell (1872-1970) introduced the English-speaking philosophical world to modern, mathematical logic and the foundational study of mathematics. The present study concerns the conception of logic that underlies his early logicist philosophy of mathematics, formulated in The Principles of Mathematics (1903). In 1967, Jean van Heijenoort published a paper, "Logic as Language and Logic as Calculus", in which he argued that the early development of modern logic (roughly the period 1879-1930) can be understood when considered in the light of a distinction between two essentially different perspectives on logic. According to the view of logic as language, logic constitutes the general framework for all rational discourse, or meaningful use of language, whereas the conception of logic as calculus regards logic more as a symbolism which is subject to reinterpretation. The calculus-view paves the way for systematic metatheory, where logic itself becomes a subject of mathematical study (model theory). Several scholars have interpreted Russell's views on logic with the help of the interpretative tool introduced by van Heijenoort. They have commonly argued that Russell's is a clear-cut case of the view of logic as language. In the present study a detailed reconstruction of this view and its implications is provided, and it is argued that the interpretation is seriously misleading as to what Russell really thought about logic. I argue that Russell's conception is best understood by setting it in its proper philosophical context, which is constituted by Immanuel Kant's theory of mathematics. Kant had argued that purely conceptual thought, basically the logical forms recognised in Aristotelian logic, cannot capture the content of mathematical judgments and reasonings: mathematical cognition is not grounded in logic but in space and time as the pure forms of intuition. Against this view, Russell argued that once logic is developed into a proper tool which can be applied to mathematical theories, Kant's views turn out to be completely wrong. In the present work the view is defended that Russell's logicist philosophy of mathematics, or the view that mathematics is really only logic, is based on what I term the "Bolzanian account of logic". According to this conception, (i) the distinction between form and content is not explanatory in logic; (ii) the propositions of logic have genuine content; and (iii) this content is conferred upon them by special entities, "logical constants". The Bolzanian account, it is argued, is both historically important and throws genuine light on Russell's conception of logic.

Relevance: 90.00%

Abstract:

One of the most fundamental questions in the philosophy of mathematics concerns the relation between truth and formal proof. The position according to which the two concepts are the same is called deflationism, and the opposing viewpoint substantialism. In an important result of mathematical logic, Kurt Gödel proved in his first incompleteness theorem that all consistent formal systems containing arithmetic include sentences that can neither be proved nor disproved within that system. However, such undecidable Gödel sentences can be established to be true once we expand the formal system with Alfred Tarski's semantical theory of truth, as shown by Stewart Shapiro and Jeffrey Ketland in their semantical arguments for the substantiality of truth. According to them, in Gödel sentences we have an explicit case of true but unprovable sentences, and hence deflationism is refuted. Against that, Neil Tennant has shown that instead of Tarskian truth we can expand the formal system with a soundness principle, according to which all provable sentences are assertable, and the assertability of Gödel sentences follows. This way, the relevant question is not whether we can establish the truth of Gödel sentences, but whether Tarskian truth is a more plausible expansion than a soundness principle. In this work I argue that this problem is best approached once we think of mathematics as the full human phenomenon, and not just as consisting of formal systems. When pre-formal mathematical thinking is included in our account, we see that Tarskian truth is in fact not an expansion at all. I claim that what proof is to formal mathematics, truth is to pre-formal thinking, and that the Tarskian account of semantical truth mirrors this relation accurately. However, the introduction of pre-formal mathematics is vulnerable to the deflationist counterargument that, while existing in practice, pre-formal thinking could still be philosophically superfluous if it does not refer to anything objective. Against this, I argue that all truly deflationist philosophical theories lead to the arbitrariness of mathematics. In all other philosophical accounts of mathematics there is room for pre-formal mathematics to refer to something, and the expansion with Tarskian truth can be made naturally. Hence, if we reject the arbitrariness of mathematics, I argue in this work, we must accept the substantiality of truth. Related subjects such as neo-Fregeanism are also covered and shown not to change the need for Tarskian truth. The only remaining route for the deflationist is to change the underlying logic so that our formal languages can include their own truth predicates, which Tarski showed to be impossible for classical first-order languages. With such logics we would have no need to expand the formal systems, and the above argument would fail. Of the alternative approaches, in this work I focus mostly on the Independence Friendly (IF) logic of Jaakko Hintikka and Gabriel Sandu. Hintikka has claimed that an IF language can include its own adequate truth predicate. I argue that while this is indeed the case, we cannot recognize the truth predicate as such within the same IF language, and the need for Tarskian truth remains. In addition to IF logic, second-order logic and Saul Kripke's approach using Kleenean logic are also shown to fail in a similar fashion.
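As a compressed sketch of the formal background the abstract presupposes (formulations vary in the literature, and none of the notation below is taken from the thesis itself): for a consistent, recursively axiomatizable theory S containing enough arithmetic, the diagonal lemma yields a Gödel sentence G such that

    S \vdash G \leftrightarrow \neg\mathrm{Prov}_S(\ulcorner G \urcorner), \qquad S \nvdash G \ (\text{and, under slightly stronger assumptions, } S \nvdash \neg G).

The Shapiro-Ketland route expands S with a compositional Tarskian theory of truth (with induction extended to the new vocabulary), whose truth predicate T satisfies the T-schema

    T(\ulcorner \varphi \urcorner) \leftrightarrow \varphi \quad \text{for each sentence } \varphi,

and in which a global reflection principle, \forall\varphi\,(\mathrm{Prov}_S(\ulcorner\varphi\urcorner) \rightarrow T(\ulcorner\varphi\urcorner)), and hence G itself become provable. Tennant's truth-free alternative instead adds a soundness (reflection) principle of roughly the form \mathrm{Prov}_S(\ulcorner\varphi\urcorner) \rightarrow \varphi, which likewise licenses the assertion of G.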

Relevance: 90.00%

Abstract:

The aim of this study is to examine the relationship of the Roman villa to its environment. The villa was an important feature of the countryside, intended both for agricultural production and for leisure. Manuals of Roman agriculture give instructions on how to select a location for an estate. The ideal location was a moderate slope facing east or south in a healthy area and a good neighborhood, near good water resources and fertile soils. A road, a navigable river or the sea was needed for transporting the produce, and a market for selling it, a town or a village, should have been nearby. The research area is the surroundings of the city of Rome, a key area for the development of the villa. The materials used consist of archaeological settlement sites, literary and epigraphical evidence as well as environmental data. The sites include all settlement sites from the 7th century BC to the 5th century AD, in order to examine changes in the tradition of site selection. Geographical Information Systems were used to analyze the data. Six aspects of location were examined: geology, soils, water resources, terrain, visibility/viewability and relationship to roads and habitation centers. Geology was important for finding building materials, and the large villas from the 2nd century BC onwards are close to sources of building stone. Fertile soils were sought even in the period of the densest settlement. The area is rich in water, both rainfall and groundwater, and finding a water supply was fairly easy. A certain kind of terrain was sought over very long periods: a small spur or ridge shoulder, preferably facing south, with an open area in front of the site. The most popular villa resorts are located on the slopes visible from almost the entire Roman region. A visible villa served the social and political aspirations of the owner, whereas being in the villa created a sense of privacy. The area has a very dense road network ensuring good connectivity from almost anywhere in the region. The best visibility/viewability, dense settlement and most burials by roads coincide, creating a good neighborhood. The locations featuring the most qualities cover nearly a quarter of the area, and more than half of the settlement sites are located in them. The ideal location was based on centuries of practical experience and rationalized by the literary tradition.
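The abstract does not describe the actual GIS workflow; purely as an illustration of the kind of multi-criteria overlay it summarises, the following Python sketch (synthetic data, invented layer names and arbitrary thresholds, none of them from the study) counts how many of six binary suitability criteria each raster cell satisfies and reports what share of the area, and of a set of hypothetical site locations, falls in the most suitable zone.

import numpy as np

rng = np.random.default_rng(0)
shape = (200, 200)  # toy raster grid standing in for the study area

# Six binary suitability layers (all synthetic; real layers would come from
# geological and soil maps, hydrology, a DEM, viewshed analysis and road data).
criteria = {
    "building_stone_nearby": rng.random(shape) < 0.40,
    "fertile_soil":          rng.random(shape) < 0.50,
    "water_available":       rng.random(shape) < 0.70,
    "suitable_terrain":      rng.random(shape) < 0.30,
    "high_viewability":      rng.random(shape) < 0.25,
    "near_road_or_town":     rng.random(shape) < 0.45,
}

# Multi-criteria overlay: count satisfied criteria per cell.
score = np.sum(list(criteria.values()), axis=0)

# Call cells meeting at least 4 of the 6 criteria the "best" locations.
best = score >= 4
print(f"share of area in best locations: {best.mean():.1%}")

# Hypothetical site coordinates standing in for recorded settlement sites.
sites = rng.integers(0, shape[0], size=(300, 2))
in_best = best[sites[:, 0], sites[:, 1]]
print(f"share of sites in best locations: {in_best.mean():.1%}")

The point of the sketch is only the form of the analysis: overlaying independent locational criteria and comparing the resulting suitability zones with the observed distribution of sites.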

Relevance: 90.00%

Abstract:

This thesis presents an interdisciplinary analysis of how models and simulations function in the production of scientific knowledge. The work is informed by three scholarly traditions: studies on models and simulations in the philosophy of science, so-called micro-sociological laboratory studies within science and technology studies, and cultural-historical activity theory. Methodologically, I adopt a naturalist epistemology and combine philosophical analysis with a qualitative, empirical case study of infectious-disease modelling. The study maintains a dual perspective throughout the analysis: it specifies the modelling practices and examines the models as objects of research. The research questions addressed in this study are: 1) How are models constructed and what functions do they have in the production of scientific knowledge? 2) What is interdisciplinarity in model construction? 3) How do models become a general research tool and why is this process problematic? The core argument is that mediating models, as investigative instruments (cf. Morgan and Morrison 1999), take questions as their starting point, and hence their construction is intentionally guided. This argument applies the interrogative model of inquiry (e.g., Sintonen 2005; Hintikka 1981), which conceives of all knowledge acquisition as a process of seeking answers to questions. The first question addresses simulation models as Artificial Nature, which is manipulated in order to answer the questions that initiated the model building. This account further develops the "epistemology of simulation" (cf. Winsberg 2003) by showing the interrelatedness of researchers and their objects in the process of modelling. The second question clarifies why interdisciplinary research collaboration is demanding and difficult to maintain. The nature of the impediments to disciplinary interaction is examined by introducing the idea of object-oriented interdisciplinarity, which provides an analytical framework for studying changes in the degree of interdisciplinarity, the tools and research practices developed to support the collaboration, and the mode of collaboration in relation to the historically mutable object of research. As my interest is in models as interdisciplinary objects, the third research problem asks how we might characterise these objects, what is typical of them, and what kinds of changes happen in the process of modelling. Here I examine the tension between specified, question-oriented models and more general models, and suggest that the specified models form a group of their own. I call these Tailor-made models, in contrast to the process of building a simulation platform that aims at generalisability and utility for health policy. This tension also underlines the challenge of applying research results (or methods and tools) to discuss and solve problems in decision-making processes.
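The thesis does not specify its epidemiological models; purely as a generic illustration of how a simulation model is manipulated to answer a question that initiated the model building (here, a hypothetical question: how much does reducing transmission lower the epidemic peak?), a minimal SIR sketch in Python might look as follows. All names and parameter values are invented.

# Minimal SIR (susceptible-infectious-recovered) model, integrated with a
# simple Euler scheme; purely illustrative, not the thesis's model.

def run_sir(beta, gamma=0.1, s0=0.99, i0=0.01, days=300, dt=0.1):
    s, i, r = s0, i0, 0.0
    peak = i
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt   # new infections this step
        new_rec = gamma * i * dt      # new recoveries this step
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        peak = max(peak, i)
    return peak

# Question-driven use of the model: compare peak prevalence under baseline
# transmission and under a hypothetical 40% reduction in transmission.
baseline = run_sir(beta=0.30)
reduced = run_sir(beta=0.18)
print(f"peak prevalence, baseline: {baseline:.2%}")
print(f"peak prevalence, reduced transmission: {reduced:.2%}")

The point of the sketch is only the interrogative use of such a model: the same artificial system is run under different assumptions in order to answer a question posed to it.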

Relevance: 90.00%

Abstract:

The subject of this doctoral thesis is the analysis and interpretation of instrumental pieces composed by Einojuhani Rautavaara (b. 1928) that have been given angelic titles: Archangel Michael Fighting the Antichrist from the suite Icons (1955)/Before the Icons (2006), Angels and Visitations (1978), the Double Bass Concerto Angel of Dusk (1980), Playgrounds for Angels (1981) and the Seventh Symphony Angel of Light (1994). The aim of the work is to find those musical elements common to these pieces that distinguish them from Rautavaara's other works, and to determine whether they could be thought of as a series. I prove that behind the common elements and titles stands the same extramusical idea, the figure of an angel, which the composer has described in his commentaries. The thesis is divided into three parts. Since all of the compositions possess titles that refer to the spiritual symbol of an angel, the first part offers a theoretical background to demonstrate the significant role played by angels in various religions and beliefs, and the means by which music has attempted to represent this symbol throughout history. This background also traces Rautavaara's aesthetic attitude as a spiritual composer whose output can be studied with reference to his extramusical interests, including literature, psychology, painting, philosophy and myths. The second part focuses on the analysis of the instrumental compositions with angelic titles, without giving consideration to their commentaries and titles. The analyses concentrate in particular on those musical features that distinguish these pieces from Rautavaara's other compositions. In the third part these musical features are interpreted as symbols of the angel through comparison with vocal and instrumental pieces which contain references to the character of an angel, structures of mythical narration, special musical expressions, use of instruments and aspects of brightness. Finally I explore the composer's interpretative codes, drawing on Rilke's cycle of poems Ten Duino Elegies and Jung's theory of archetypes, and analyze the instrumental pieces with angelic titles in the light of the theory of musical ekphrasis.

Relevance: 90.00%

Abstract:

This study is an inquiry into three related topics in Aristotle’s psychology: the perception of seeing, the perception of past perception, and the perception of sleeping. Over the past decades, Aristotle’s account of the perception of perception has been studied in numerous articles and chapters of books. However, there is no monograph that attempts to give a comprehensive analysis of this account and to assess its relation and significance to Aristotle’s psychological theory in general as well as to other theories pertaining to the topics (e.g. theories of consciousness), be they ancient, medieval, modern, or contemporary. This study intends to fill this gap and to further the research into Aristotle’s philosophy and into the philosophy of mind. The present study is based on an accurate analysis of the sources, on their Platonic background, and on later interpretations within the commentary tradition up to the present. From a methodological point of view, this study represents systematically orientated research into the history of philosophy, in which special attention is paid to the philosophical problems inherent in the sources, to the distinctions drawn, and to the arguments put forward as well as to their philosophical assessment. In addition to contributing many new findings concerning the topics under discussion, this study shows that Aristotle’s account of the perception of perception substantially differs from many later theories of consciousness. This study also suggests that Aristotle be regarded as a consistent direct realist, not only in respect of sense perception, but also in respect of memory.

Relevance: 90.00%

Abstract:

The attempt to refer meaningful reality as a whole to a unifying ultimate principle - the quest for the unity of Being - was one of the basic tendencies of Western philosophy from its beginnings in ancient Greece up to Hegel's absolute idealism. However, the different trends of contemporary philosophy tend to regard such a speculative metaphysical quest for unity as obsolete. This study addresses this contemporary situation on the basis of the work of Martin Heidegger (1889-1976). Its methodological framework is Heidegger's phenomenological and hermeneutical approach to the history of philosophy. It seeks to understand, in terms of the metaphysical quest for unity, Heidegger's contrast between the first (Greek) beginning or "onset" (Anfang) of philosophy and another onset of thinking. This other onset is a possibility inherent in the contemporary situation in which, according to Heidegger, the metaphysical tradition has developed to its utmost limits and thereby come to an end. Part I is a detailed interpretation of the surviving fragments of the Poem of Parmenides of Elea (fl. c. 500 BC), an outstanding representative of the first philosophical beginning in Heidegger's sense. It is argued that the Poem is not a simple denial of apparent plurality and difference ("mortal acceptances," doxai) in favor of an extreme monism. Parmenides' point is rather to show in what sense the different instances of Being can be reduced to an absolute level of truth or evidence (aletheia), which is the unity of Being as such (to eon). What in prephilosophical human experience is accepted as being is referred to the source of its acceptability: intelligibility as such, the simple and undifferentiated presence to thinking that ultimately excludes unpresence and otherness. Part II interprets selected key texts from different stages in Heidegger's thinking in terms of the unity of Being. It argues that one aspect of Heidegger's sustained and gradually deepening philosophical quest was to think the unity of Being as singularity, as the instantaneous, context-specific, and differential unity of a temporally meaningful situation. In Being and Time (1927) Heidegger articulates the temporal situatedness of the human awareness of meaningful presence. His later work moves on to study the situational correlation between presence and the human awareness. Heidegger's "postmetaphysical" articulation seeks to show how presence becomes meaningful precisely as situated, in an event of differentiation from a multidimensional context of unpresence. In resigning itself to this irreducibly complicated and singular character of meaningful presence, philosophy also faces its own historically situated finitude. This resignation is an essential feature of Heidegger's "other onset" of thinking.

Relevance: 90.00%

Abstract:

This study concentrates on the contested concept of pastiche in literary studies. It offers the first detailed examination of the history of the concept from its origins in the seventeenth century to the present, showing how pastiche emerged as a critical concept in interaction with the emerging conception of authorial originality and the copyright laws protecting it. One of the key results of this investigation is the contextualisation of the postmodern debate on pastiche. Even though postmodern critics often emphasise the radical novelty of pastiche, they in fact resuscitate older positions and arguments without necessarily reflecting on their historical conditions. This historical background is then used to analyse the distinction between the primarily French conception of pastiche as the imitation of style and the postmodern notion of it as the compilation of different elements. The latter's vagueness and inclusiveness detract from its value as a critical concept. The study thus concentrates on the notion of stylistic pastiche, challenging the widespread prejudice that it is merely an indication of lack of talent. Because it is multiply based on repetition, pastiche is in fact a highly ambiguous or double-edged practice that calls into question the distinction between repetition and original, thereby undermining the received notion of individual unique authorship as a fundamental aesthetic value. Pastiche does not, however, constitute a radical upheaval of the basic assumptions on which the present institution of literature relies, since, in order to mark its difference, pastiche always refers to a source outside itself against which its difference is measured. Finally, the theoretical analysis of pastiche is applied to literary works. The pastiches written by Marcel Proust demonstrate how pastiche can become an integral part of a writer's poetics: imitation of style is shown to provide Proust with a way of exploring the role of style as a connecting point between inner vision and reality. The pastiches of the Sherlock Holmes stories by Michael Dibdin, Nicholas Meyer and the duo Adrian Conan Doyle and John Dickson Carr illustrate the functions of pastiche within a genre, detective fiction, that is itself fundamentally repetitive. A.S. Byatt's Possession and D.M. Thomas's Charlotte use Victorian pastiches to investigate the conditions of literary creation in the age of postmodern suspicion of creativity and individuality. The study thus argues that the concept of pastiche has valuable insights to offer to literary criticism and theory, and that literary pastiches, though often dismissed in reviews and criticism, are a particularly interesting object of study precisely because of their characteristic ambiguity.

Relevance: 90.00%

Abstract:

This paper concentrates on Heraclitus, Parmenides and Lao Zi. The focus is on their ideas on change and on whether the world is essentially One or composed of many entities. In the first chapter I go over some general tendencies in Greek and Chinese philosophy. The differences in cultural background influence the ways philosophy is done, but the paper aims to show that two questions can be brought up when comparing the philosophies of Heraclitus, Parmenides and Lao Zi: Is the world essentially One or Many? Is change real, and if it is, what is its nature and how does it take place? For Heraclitus change is real and, as will be shown later in the chapter, quite essential for the sustainability of the world-order (kosmos). The key concept in the case of Heraclitus is Logos. Heraclitus uses Logos in several senses, the most well known relating to his element-theory. But another important feature of the Logos, the content of real wisdom, is to be able to regard everything as one. This does not mean that the world is essentially one for Heraclitus in the ontological sense, but that we should see the underlying unity of multiple phenomena. Heraclitus regards this as hen panta: All from One, One from All. I characterize Heraclitus as an epistemic monist and an ontological pluralist. It is plausible that the views of Heraclitus on change were the focus of Parmenides' severe criticism. Parmenides held the view that the world is essentially one and that to see it as consisting of many entities was the error of mortals, i.e. the common man and his philosophical predecessors. For Parmenides, what-is can be approached by two routes: the Way of Truth (Aletheia) and the Way of Seeming (Doxa). Aletheia essentially sees the world as one, where even time is an illusion. In Doxa Parmenides gives an explanation of the world seen as consisting of many entities, and this is his contribution to the line of thought of his predecessors. It should be noted that a strong emphasis is given to Aletheia, whereas the world-view given in Doxa is only probable. I go on to describe Parmenides as an ontological monist who grants some plausibility to pluralistic views. In the work of Lao Zi the world can be seen as One or as consisting of many entities. In my interpretation, Lao Zi uses Dao in two different senses: Dao is the totality of things, or the order in change. The wu-aspect (seeing-without-form) attends to the world as one, whereas the you-aspect attends to the world of many entities. In the wu-aspect, Dao refers to the totality of things, while in the you-aspect Dao is the order or law in change. There are two insights in Lao Zi regarding the relationship between the wu- and you-aspects: chapter 1 states that they are two separate aspects of seeing the world, while other chapters hold that you comes from wu. This naturally raises the question whether the One is the peak of seeing the world as many; in other words, whether there is a way from pluralism to monism. All these considerations make it probable that the work attributed to Lao Zi has had new material added to it or is a compilation of oral sayings. At the end of the paper I offer some insights on how Logos and Dao can be compared in a relevant manner. I also compare Parmenides' holistic monism to Lao Zi's Dao as nameless totality (i.e. in its wu-aspect). I briefly touch on Heidegger and the future of comparative philosophy.

Relevance: 90.00%

Abstract:

Epistemological foundationalism has for centuries attempted to unify all scientific inquiry into the context of one grand science, the first philosophy. One of the most important tasks of this tradition has been to ground all knowledge on absolutely certain foundations. In this master's thesis I ask the following question: To what extent and under what conditions is it possible to achieve absolute certainty in the sense attempted by Cartesian foundationalism? By examining how the 20th-century philosophers Edmund Husserl (1859-1938), Hannah Arendt (1906-1975) and Maurice Merleau-Ponty (1908-1961) interpret the epistemological methodology of René Descartes, I claim that the Cartesian achievement of absolute certainty rests on the implicit presupposition of an epistemologically prior form of faith in the world and trust (pistis) in other conscious beings. I show that knowledge is possible only within the context of a common world that is inhabited by several conscious beings sharing a common linguistic system. This threefold element is shown to be the bedrock condition for any kind of philosophical inquiry. The main literature sources for this thesis are The Life of the Mind by Hannah Arendt, Le Visible et l'invisible by Maurice Merleau-Ponty, Meditationes de prima philosophia by René Descartes and Erfahrung und Urteil by Edmund Husserl.