911 results for Nussbaum, Martha Craven, 1947- -- Contributions in philosophy


Relevance: 100.00%

Abstract:

This paper describes a bibliographic analysis of the vision of Marshall McLuhan and the visions adopted by various current authors regarding the use of new interactive learning technologies. The paper also analyzes the transformation that formal educational settings will have to undergo in order to improve their social function. The main points of view and contributions made by these authors are discussed. It is important that all actors involved in the educational process take these contributions into consideration in order to be ready for future changes.

Relevance: 100.00%

Abstract:

The aim of this study is to analyse the content of the interdisciplinary conversations in Göttingen between 1949 and 1961. The task is to compare models for describing reality presented by quantum physicists and theologians. Descriptions of reality in different disciplines are conditioned by the development of the concept of reality in philosophy, physics and theology. Our basic problem is stated in the question: How is it possible for the intramental image to match the external object? Cartesian knowledge presupposes clear and distinct ideas in the mind prior to observation, resulting in a true correspondence between the observed object and the cogitative observing subject. The Kantian synthesis between rationalism and empiricism emphasises an extended character of representation. The human mind is not a passive receiver of external information, but actively constructs intramental representations of external reality in the epistemological process. Heidegger's aim was to reach a more primordial mode of understanding reality than is possible within the Cartesian subject-object distinction. In Heidegger's philosophy, ontology as being-in-the-world is prior to knowledge concerning being. Ontology can be grasped only in the totality of being (Dasein), not merely as an object of reflection and perception. According to Bohr, quantum mechanics introduces an irreducible loss in representation, which, classically understood, is a deficiency in knowledge. The conflicting aspects (particle and wave pictures) in our comprehension of physical reality cannot be completely accommodated into an entire and coherent model of reality. What Bohr rejects is not realism, but the classical Einsteinian version of it. Through the use of complementary descriptions, Bohr tries to save a fundamentally realistic position. The fundamental question in Barthian theology is the problem of God as an object of theological discourse. Dialectics is Barth's way to express knowledge of God while avoiding a speculative theology and a human-centred religious self-consciousness. In Barthian theology, the human capacity for knowledge, independently of revelation, is insufficient to comprehend the being of God. Our knowledge of God is real knowledge in revelation, and our words are made to correspond with the divine reality in an analogy of faith. The point of the Bultmannian demythologising programme was to claim the real existence of God beyond our faculties. We cannot simply define God as a human ideal of existence or a focus of values. The theological programme of Bultmann emphasised the notion that we can talk meaningfully of God only insofar as we have existential experience of his intervention. Common to all these twentieth-century philosophical, physical and theological positions is a form of anti-Cartesianism. Consequently, with regard to their epistemology, they can be labelled antirealist. This common insight also made it possible to find a common meeting point between the different disciplines. In this study, the different standpoints from all three areas and the conversations in Göttingen are analysed within the framework of realism/antirealism. One of the first tasks in the Göttingen conversations was to analyse the nature of the likeness between the complementary structures in quantum physics introduced by Niels Bohr and the dialectical forms in the Barthian doctrine of God.
The reaction against epistemological Cartesianism, the metaphysics of substance and the deterministic description of reality was the common point of departure for theologians and physicists in the Göttingen discussions. With his complementarity, Bohr anticipated the crossing of traditional epistemic boundaries and the generalisation of epistemological strategies by introducing interpretative procedures across various disciplines.

Relevance: 100.00%

Abstract:

In this essay I argue that natural selection is more important to functional explanations than some of the literature in philosophy of biology has assumed. I start by giving a brief overview of the two paradigm cases of functional explanation: etiological functions and causal-role functions. I then consider one particular attempt to reconcile the two perspectives, given by David Buller (1998). Buller's attempt to reconcile etiological functions and causal-role functions results in what he calls a weak etiological theory. I argue that Buller has not succeeded in his construal of the weak etiological theory: he underestimates the role that selective processes play in functional explanations, and so his theory cannot be classified as an etiological theory. As an alternative, I consider the account of etiological functions given by Ruth Millikan (1984) and argue that Millikan's theory is better equipped to assess contentious cases in biology, such as exaptations. Finally, I conclude by analyzing where the adoption of Millikan's theory leaves us. I argue, contrary to Millikan and others, that once we accept the importance of natural selection in functional explanations, there is no strong reason to resist a linguistic reform of the word "function", and hence that attempts to reconcile etiological functions and causal-role functions are misplaced.

Relevance: 100.00%

Abstract:

This thesis takes seriously the proposition that existentialism is a lived philosophy. While Descartes' proof for the existence of God initially sparked my interest in philosophy, the insights of existentialism have allowed me to appropriate philosophy as a way of life. I apply the insights of Kierkegaard's writings to my own spiritual and philosophical development. Philosophy is personal, and Kierkegaard's writings deal with the development of the person in his aesthetic, ethical and religious dimensions. Philosophy is a struggle, and this thesis reveals the existential struggle of the individual in despair. The thesis argues that authentic faith actually entails faith: the existential believer has this faith, whereas the religious believer does not. The subjectively reflective existential believer recognizes that a leap of faith is needed; anything else is just historical, speculative knowledge. The existential believer, or Knight of Faith, realizes that a leap of faith is needed to become open in inwardness and receive the condition to understand the paradoxes that faith presents. I will present Kierkegaard's "Analogy of a House", which is, in essence, the backbone of his philosophy, and I will discuss the challenge of moving from one floor to the next. More specifically, I will discuss the anxiety felt in the very moment of the transition from the first floor to the second. I will outline eight paradoxes that must be resolved in order for the individual to continue on his journey to the top floor of the house. I will argue that Kierkegaard's example of Abraham as a Knight of Faith is mistaken, that Abraham was in fact not a Knight of Faith. I will also argue that we should find our own exemplars in our own lives by looking for Knight of Faith traits in people we know and then trying to emulate those people. Finally, I will discuss Unamuno's "paradoxical faith" and argue that this kind of faith is a strong alternative for those who find that Kierkegaard's existential faith is not a possibility.

Relevance: 100.00%

Abstract:

Kierkegaardian Intersubjectivity and the Question of Ethics and Responsibility, by Kevin Krumrei. Kierkegaard's contributions to philosophy are generally acknowledged and recognized as valuable in the history of Western philosophy: he is one of the great anti-Hegelians, arguably the founder of existentialism, and a religious thinker. However valid this may be, there is likewise a widely accepted critique of Kierkegaard in the Western tradition: that his philosophy of the development of the self leads the individual into an isolated encounter with God and to the abandonment of the social context. In other words, a Kierkegaardian theory of intersubjectivity is a contradiction in terms. This critique is voiced eloquently by Emmanuel Levinas, among others. However, Levinas' own intersubjective ethics bears a striking resemblance to Kierkegaard's with respect to the description and formulation of the basic problem for ethics: the problem of aesthetic egoism. Further, both Kierkegaard and Levinas follow similar paths in responding to the problem, from Kierkegaard's reduplication in Works of Love to Levinas' notion of substitution in Otherwise than Being. In this comparison, it becomes evident that Levinas' reading of Kierkegaard is mistaken, for Kierkegaard's intersubjective ethics in fact postulates the inseparability and necessity of the self's responsible relation to others within the self's relation to God, found in the command, "you shall love your neighbour as yourself."

Relevance: 100.00%

Abstract:

Molecular dynamics calculations of the mean square displacement have been carried out for the alkali metals Na, K and Cs and for an fcc nearest-neighbour Lennard-Jones model applicable to rare gas solids. The computations for the alkalis were done at several temperatures, both for the temperature-dependent volume and for the zero-pressure volume corresponding to each temperature. In the fcc case, results were obtained for a wide range of both temperature and density. Lattice dynamics calculations of the harmonic and the lowest-order anharmonic (cubic and quartic) contributions to the mean square displacement were performed for the same potential models as in the molecular dynamics calculations. The Brillouin zone sums arising in the harmonic and quartic terms were computed for very large numbers of points in q-space, and were extrapolated to obtain results fully converged with respect to the number of points in the Brillouin zone. Excellent agreement between the lattice dynamics and molecular dynamics results was observed for all the alkali metals, except for the zero-pressure case of Cs, where the difference is about 15% near the melting temperature. It was concluded that for the alkalis, lowest-order perturbation theory works well even at temperatures close to the melting temperature. For the fcc nearest-neighbour model it was found that the number of particles (256) used for the molecular dynamics calculations produces a result between 10 and 20% smaller than the value converged with respect to the number of particles. However, the general temperature dependence of the mean square displacement is the same in molecular dynamics and lattice dynamics at all temperatures for the highest densities examined, while at larger volumes and high temperatures the results diverge. This indicates the importance of higher-order perturbation theory contributions in these cases.
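
The quantity being compared is the mean square displacement of each atom about its equilibrium lattice site, <u^2> = <|r_i(t) - R_i|^2>, averaged over atoms and time steps. As a rough, hypothetical sketch of how such an average might be accumulated from an MD trajectory (this is not the code used in the thesis; the array names and shapes are my assumptions, and periodic-boundary unwrapping is omitted):

```python
import numpy as np

def mean_square_displacement(positions, lattice_sites):
    """Mean square displacement of atoms about their equilibrium lattice sites.

    positions     : array (n_frames, n_atoms, 3) of MD coordinates
    lattice_sites : array (n_atoms, 3) of perfect-lattice equilibrium positions
    Returns <|u|^2> averaged over atoms and frames.
    """
    disp = positions - lattice_sites[None, :, :]   # u_i(t) = r_i(t) - R_i
    return np.mean(np.sum(disp**2, axis=-1))       # average over frames and atoms

# Toy usage: 100 frames of 256 atoms jittering about random "lattice" sites.
rng = np.random.default_rng(0)
sites = rng.random((256, 3)) * 10.0
traj = sites[None, :, :] + 0.05 * rng.standard_normal((100, 256, 3))
print(mean_square_displacement(traj, sites))
```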

Relevance: 100.00%

Abstract:

Responding to a series of articles in the sport management literature calling for more diversity in areas of interest and methods, this study warns against the danger of excessively fragmenting the field. The works of Kuhn (1962) and Pfeffer (1993) are taken as the basis of an argument that connects convergence with scientific strength. However, mindful of the many counterarguments directed at this line of reasoning, the study proposes a new model of convergence, which focuses on clusters of research contributions with similar areas of interest, methods, and concepts. The existence of these clusters is determined with the help of a bibliometric analysis of publications in three sport management journals. This examination finds that there are justified reasons to be concerned about the level of convergence in the field, pointing to a reduced ability to create large clusters of contributions in similar areas of interest.
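
One way such clusters might be detected, offered only as a sketch and not as the study's actual bibliometric procedure, is to encode each publication by its keywords or subject terms and group publications by vocabulary overlap; the publication data and threshold below are invented for illustration:

```python
# Hypothetical illustration: cluster publications by shared keyword vocabulary.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import AgglomerativeClustering

# Each string stands in for one article's keywords / areas of interest.
publications = [
    "governance board strategy",
    "fan loyalty consumer behaviour survey",
    "governance policy stakeholder strategy",
    "sponsorship brand consumer survey",
]

X = TfidfVectorizer().fit_transform(publications).toarray()
labels = AgglomerativeClustering(
    n_clusters=None, distance_threshold=1.0, metric="cosine", linkage="average"
).fit_predict(X)
print(labels)   # publications sharing vocabulary fall into the same cluster
```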

Relevance: 100.00%

Abstract:

The objective of this thesis is to understand how a particular worldview, grounded in theological beliefs, contributed to the composition of Sofia Gubaidulina's violin concerto Offertorium. Through this work, the idea of a musico-theological dialogue is explored by proposing ways in which the musical piece can serve as a carrier or interpreter of theological thought. To support this idea, the intertextual approach employed by Heidi Epstein is used; this method facilitates not only interdisciplinary work but also a theological reading of the musical work. The first chapter explores the sources, questions and issues surrounding the musico-theological dialogue. The conclusion drawn is that the study of Offertorium requires a balanced approach, by which we mean an approach that takes theological reflection into account as much as musicological research, while respecting the theological contributions that the musical work itself can make. In the second chapter, a thematic analysis of Offertorium is undertaken, along with a study of the composer's theological and spiritual discourse. It is concluded that Gubaidulina's Russian Orthodox background has strongly influenced her worldview and her artistic approach: the concerto carries liturgical and theological ideas and symbols of Orthodoxy in its structure and thematic construction. The third chapter explores the parallels between Gubaidulina's thought and the writings of several twentieth-century Russian Orthodox theologians. The conclusion of this chapter shows that, even though it is unlikely that the composer knows these authors well, her theological and spiritual understanding arises from the religious climate of the Orthodox Church, which explains the complementarities and similarities between her discourse, her work and the views of the theologians discussed. The fourth chapter evaluates the validity of Offertorium as a means of theological expression and as a generator of theological reflection. The conclusion of the research is that Offertorium can indeed be a theological space: theological ideas can be communicated through the experience of sound, whether through melody or the overall atmosphere. It also implies that music becomes an equal partner within the theological conversation, though one different from traditional methods of reflection.

Relevance: 100.00%

Abstract:

Thesis initially distributed as part of a pilot project of the Presses de l'Université de Montréal/Centre d'édition numérique UdeM (1997-2008), with the author's permission.

Relevance: 100.00%

Abstract:

Supervised learning of large-scale hierarchical networks is currently enjoying spectacular success. Despite this excitement, unsupervised learning remains, according to many researchers, a key element of Artificial Intelligence, where agents must learn from a potentially limited amount of data. This thesis follows that line of thought and addresses various research topics related to the density estimation problem through Boltzmann machines (BMs), the probabilistic graphical models at the heart of deep learning. Our contributions touch on sampling, partition function estimation, optimization and the learning of invariant representations. The thesis begins by presenting a new adaptive sampling algorithm, which automatically adjusts the temperature of the simulated Markov chains in order to maintain a high convergence rate throughout learning. When used in the context of stochastic maximum likelihood (SML) learning, our algorithm yields greater robustness to the choice of learning rate as well as a better convergence rate. Our results are presented for BMs, but the method is general and applicable to the learning of any probabilistic model that relies on Markov chain sampling. While the maximum likelihood gradient can be approximated by sampling, evaluating the log-likelihood requires an estimate of the partition function. In contrast to traditional approaches, which treat a given model as a black box, we propose to exploit the dynamics of learning by estimating the successive changes in the log-partition function incurred at each parameter update. The estimation problem is reformulated as an inference problem similar to Kalman filtering, but on a two-dimensional graph whose dimensions correspond to the time axis and the temperature parameter. On the topic of optimization, we also present an algorithm for applying the natural gradient efficiently to Boltzmann machines with thousands of units. Until now, its adoption had been limited by its high computational cost and memory requirements. Our algorithm, Metric-Free Natural Gradient (MFNG), avoids the explicit computation of the Fisher information matrix (and its inverse) by exploiting a linear solver combined with an efficient matrix-vector product. The algorithm is promising: in terms of the number of function evaluations, MFNG converges faster than SML, although its implementation unfortunately remains inefficient in computation time. This work also explores the mechanisms underlying the learning of invariant representations. To this end, we use the family of "spike & slab" restricted Boltzmann machines (ssRBM), which we modify in order to model binary and sparse distributions. The binary latent variables of the ssRBM can be made invariant to a vector subspace by associating with each of them a vector of continuous latent variables (called "slabs"). This translates into increased invariance at the representation level and a better classification rate when few labelled data are available.
We conclude this thesis with an ambitious topic: learning representations that can separate the factors of variation present in the input signal. We propose a solution based on a bilinear ssRBM (with two groups of latent factors) and formulate the problem as one of "pooling" in complementary vector subspaces.
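
To give a concrete sense of the metric-free idea, here is a sketch under my own assumptions rather than the thesis's implementation: the natural-gradient direction solves F x = g, where F is the Fisher information matrix and g the ordinary gradient. Instead of forming F, one can estimate Fisher-vector products from per-sample gradients and hand them to an iterative linear solver such as conjugate gradient. The gradient samples and damping value below are hypothetical.

```python
# Sketch of a metric-free natural gradient step: the Fisher matrix is never
# formed; only Fisher-vector products are used inside conjugate gradient.
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def natural_gradient_step(grad_samples, damping=1e-3):
    """grad_samples: array (n_samples, n_params) of per-sample gradients."""
    g_bar = grad_samples.mean(axis=0)            # ordinary (mean) gradient
    n, d = grad_samples.shape

    def fisher_vec(v):
        # Centered empirical Fisher-vector product:
        # F v = E[g (g . v)] - g_bar (g_bar . v), plus damping for stability.
        gv = grad_samples @ v                     # shape (n,)
        return grad_samples.T @ gv / n - g_bar * (g_bar @ v) + damping * v

    F = LinearOperator((d, d), matvec=fisher_vec)
    x, _ = cg(F, g_bar, maxiter=50)               # solve F x = g_bar iteratively
    return x                                      # natural-gradient direction

# Toy usage: 128 gradient samples of a 1000-parameter model.
rng = np.random.default_rng(0)
print(natural_gradient_step(rng.standard_normal((128, 1000))).shape)
```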

Relevance: 100.00%

Abstract:

"A Shine of Truth in the 'universal delusional context of reification' (Theodor W. Adorno)" comprises seven chapters, a prologue and an epilogue. Each part is constructed on two levels: (1) from the links woven between contiguous sentences; and (2) from the links woven between non-contiguous sentences. The opening sentences (incipits) of the paragraphs form the main argument of the thesis. The subject of the thesis, Schein (appearance, illusion, shine), is approached in a non-formalist manner, that is, in such a way that the form itself gives an idea of the thing: illusion as imposed contradiction. Although the subject of the thesis is illusion, its aim is truth. Chapter I presents a dialectic of perspectives (those of Marx, Lukács, Hegel, Horkheimer and Adorno) in order to arrive at a criterion of truth, given the universal delusional context of reification; this is the determination of the dissolution of semblance. Chapter II presents the concept of aesthetic semblance, a reversible semblance opposed to the social semblance generated by the culture industry. Chapter III asks whether truth in philosophy and truth in art are two distinct kinds of truth. Chapter IV determines whether the appeal to truth as immediacy of expression, made by the twentieth-century expressionist movement, is new, judged against an important antecedent of musical expressionism: Robert Schumann's "Der Dichter spricht". Chapter V considers whether inorganic montage is more advanced than expressionism. Chapter VI takes up where Peter Bürger ends his essay Theory of the Avant-Garde: it asks to what extent the work of art after Dada and Surrealism corresponds to the Hegelian model of "prose". Chapter VII argues that Dichterliebe, op. 48 (1840), is a true work of art. Three conclusions result from this detailed musical analysis: (1) by exploiting, in certain passages, an ambiguity in the rules of harmony whereby all twelve tones are admitted into the harmony, Opus 48 anticipates Schoenberg while remaining tonal music; (2) Opus 48, no. 1 conceals a secret key: on paper its key is either A major or F-sharp minor, but a new analysis in the Neapolitan of C-sharp major is proposed here; (3) a passing modulation to the Neapolitan in Opus 48, no. 12 contains the other "half" of the interrupted cadence at the end of Opus 48, no. 1. Considered in the light of the false society, the Germany of the 1930s, these three anti-organic features testify to an advanced consciousness. The only life-praxis that art provides, according to Adorno, is remembrance. But the ultimate social effect of keeping lived suffering in memory is far from negligible: universal emancipation.

Relevance: 100.00%

Abstract:

Learning Disability (LD) is a general term that describes specific kinds of learning problems. It is a neurological condition that affects a child's brain and impairs the ability to carry out one or more specific tasks. Learning-disabled children are neither slow nor mentally retarded. The disorder can make it difficult for a child to learn as quickly, or in the same way, as a child who is not affected by a learning disability, yet an affected child can have normal or above-average intelligence. Such children may have difficulty paying attention, with reading or letter recognition, or with mathematics. This does not mean that children with learning disabilities are less intelligent; in fact, many are more intelligent than the average child. Learning disabilities vary from child to child: one child with LD may not have the same kind of learning problems as another. There is no cure for learning disabilities and they are life-long. However, children with LD can be high achievers and can be taught ways to work around the learning disability.

In this research work, data mining using machine learning techniques is used to analyze the symptoms of LD, establish interrelationships between them and evaluate their relative importance. To increase the diagnostic accuracy of learning disability prediction, a knowledge-based tool built on statistical machine learning and data mining techniques, informed by the knowledge obtained from clinical information, is proposed. The basic idea of the tool is to increase the accuracy of learning disability assessment and to reduce the time it requires. Different statistical machine learning techniques in data mining are used in the study. Identifying the important parameters of LD prediction, uncovering the hidden relationships between the symptoms of LD and estimating the relative significance of each symptom are also among the objectives of this research. The developed tool has many advantages over the traditional method of using checklists to determine learning disabilities. To improve the performance of the various classifiers, we developed preprocessing methods for the LD prediction system. A new system based on fuzzy and rough set models is also developed for LD prediction; here too the importance of preprocessing is studied. A Graphical User Interface (GUI) is designed to provide an integrated knowledge-based tool for predicting LD as well as its degree. The tool stores the details of the children in a student database and retrieves their LD reports as and when required. The present study demonstrates the effectiveness of the tool developed on the basis of various machine learning techniques, identifies the important parameters of LD and accurately predicts learning disability in school-age children. The thesis makes several major contributions in technical, general and social areas. The results are beneficial to parents, teachers and institutions, who can diagnose a child's problem at an early stage and seek appropriate treatment or counselling at the right time, avoiding academic and social losses.
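
As a purely illustrative sketch, and not the thesis's actual tool, a symptom checklist can be encoded as a tabular dataset and fed to a standard classifier; the symptom names, data and labels below are invented for illustration only:

```python
# Hypothetical sketch: predicting LD from checklist symptoms with a decision tree.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
symptoms = ["reading_difficulty", "poor_attention", "letter_reversal",
            "math_difficulty", "poor_handwriting"]

# 200 imaginary children; each symptom recorded as present (1) or absent (0).
# The toy label crudely depends on how many symptoms are present.
X = rng.integers(0, 2, size=(200, len(symptoms)))
y = (X.sum(axis=1) >= 3).astype(int)            # 1 = LD flagged, 0 = not

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```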

Relevance: 100.00%

Abstract:

The concept of legalization was recently developed by neoliberal institutionalism as a special kind of institutionalization of international politics. Neoliberals built the concept of legalization using the analytical tools developed by H. L. A. Hart to distinguish law from other mechanisms of social control, such as power and morals. Within Hart's theory, these tools have a normative function: theoretically reconstructing law as a system of rules that is independent of the will and the beliefs of those who interpret and apply legal rules. However, Hart's resulting separation of legal texts from legal practice obscures important contributions that the analytical tradition in philosophy of language has made to the understanding of the relation between language and reality. Specifically, such a separation reduces law to simple forms and empty texts, disregarding the extent to which legal practice gives meaning to legal texts. Adapting Hart's conception of law to International Relations has at least one important methodological consequence: the formal analysis of treaties cannot by itself explain the influence of international law on state behavior. To understand that influence, one must first describe the relation between legal practice and the meaning of legal texts. Thus, a redefinition of the neoliberal research agenda on legalization should focus on the way states and international courts construct the meaning of treaties and other international norms.