26 results for Normative theories

in Helda - Digital Repository of the University of Helsinki


Relevance: 30.00%

Abstract:

Democratic Legitimacy and the Politics of Rights is a study in normative political theory, based on a comparative analysis of contemporary democratic theories, classified roughly as conventional liberal, deliberative democratic and radical democratic. Its focus is on the conceptual relationship between alternative sources of democratic legitimacy: democratic inclusion and liberal rights. The relationship between rights and democracy is studied through the following questions: are rights to be seen as external constraints on democracy or as objects of democratic decision-making processes? Are individual rights threatened by public participation in politics; do constitutionally protected rights limit the inclusiveness of democratic processes? Are liberal values such as individuality, autonomy and liberty, and democratic values such as equality, inclusion and popular sovereignty mutually conflictual or supportive? Analyzing the feminist critique of liberal discourse, the dissertation also raises the question of Enlightenment ideals in current political debates: are the universal norms of liberal democracy inherently dependent on the rationalist grand narratives of modernity and incompatible with the ideal of diversity? Part I of the thesis introduces the sources of democratic legitimacy as presented in the alternative democratic models. Part II analyses how the relationship between rights and democracy is theorized in them. Part III contains arguments by feminists and radical democrats against the tenets of universalist liberal democratic models and responds to that critique by partly endorsing, partly rejecting it. The central argument advanced in the thesis is that while the deconstruction of modern rationalism indicates that rights are political constructions rather than externally given moral constraints on politics, this insight does not delegitimize the politics of universal rights as an inherent part of democratic institutions. The research indicates that democracy and universal individual rights are mutually interdependent rather than oppositional, and that democracy is more dependent on an unconditional protection of universal individual rights when it is conceived as inclusive, participatory and plural rather than as robust majoritarian rule. The central concepts are: liberalism, democracy, legitimacy, deliberation, inclusion, equality, diversity, conflict, public sphere, rights, individualism, universalism and contextuality. The authors discussed include John Rawls, Jürgen Habermas, Seyla Benhabib, Iris Young, Chantal Mouffe and Stephen Holmes. The research focuses on contemporary political theory, but the more classical work of John S. Mill, Benjamin Constant, Isaiah Berlin and Hannah Arendt is also included.

Relevance: 30.00%

Abstract:

This study examines different ways in which the concept of media pluralism has been theorized and used in contemporary media policy debates. Access to a broad range of different political views and cultural expressions is often regarded as a self-evident value in both theoretical and political debates on media and democracy. Opinions on the meaning and nature of media pluralism as a theoretical, political or empirical concept, however, are many, and the concept can easily be adjusted to different political purposes. The study aims to analyse the ambiguities surrounding the concept of media pluralism in two ways: by deconstructing its normative roots from the perspective of democratic theory, and by examining its different uses, definitions and underlying rationalities in current European media policy debates. The first part of the study examines the values and assumptions behind the notion of media pluralism in the context of different theories of democracy and the public sphere. The second part then analyses and assesses the deployment of the concept in contemporary European policy debates on media ownership and public service media. Finally, the study critically evaluates various attempts to create empirical indicators for measuring media pluralism and discusses their normative implications and underlying rationalities. The analysis of contemporary policy debates indicates that the notion of media pluralism has too readily been reduced to an empty catchphrase or conflated with consumer choice and market competition. In this narrow technocratic logic, pluralism is often unreflectively associated with quantitative data in a way that leaves key questions about social and political values, democracy, and citizenship unexamined. The basic argument advanced in the study is that media pluralism needs to be rescued from its depoliticized uses and re-imagined more broadly as a normative value that refers to the distribution of communicative power in the public sphere. Rather than treating media pluralism as something that could simply be measured through the number of media outlets available, the study argues that it should be understood in terms of its ability to challenge inequalities in communicative power and to create a more democratic public sphere.

Relevance: 20.00%

Abstract:

This study examines both theoretically and empirically how well the theories of Norman Holland, David Bleich, Wolfgang Iser and Stanley Fish can explain readers' interpretations of literary texts. The theoretical analysis concentrates on their views on language from the point of view of Wittgenstein's Philosophical Investigations. This analysis shows that many of the assumptions related to language in these theories are problematic. The empirical data show that readers often form very similar interpretations. Thus the study challenges the common assumption that literary interpretations tend to be idiosyncratic. The empirical data consist of freely worded written answers to questions on three short stories. The interpretations were made by 27 Finnish university students. Some of the questions addressed issues that were discussed in large parts of the texts, some referred to issues that were mentioned only in passing or implied. The short stories were "The Witch à la Mode" by D. H. Lawrence, "Rain in the Heart" by Peter Taylor and "The Hitchhiking Game" by Milan Kundera. According to Fish, readers create both the formal features of a text and their interpretation of it according to an interpretive strategy. People who agree form an interpretive community. However, a typical answer usually contains ideas repeated by several readers as well as observations not mentioned by anyone else. Therefore it is very difficult to determine which readers belong to the same interpretive community. Moreover, readers with opposing opinions often seem to pay attention to the same textual features and even acknowledge the possibility of an opposing interpretation; therefore they do not seem to create the formal features of the text in different ways. Iser suggests that an interpretation emerges from the interaction between the text and the reader when the reader determines the implications of the text and in this way fills the "gaps" in the text. Iser believes that the text guides the reader, but as he also believes that meaning is on a level beyond words, he cannot explain how the text directs the reader. The similarity in the interpretations and the fact that the agreement is strongest when related to issues that are discussed broadly in the text do, however, support his assumption that readers are guided by the text. In Bleich's view, all interpretations have personal motives and each person has an idiosyncratic language system. The situation where a person learns a word determines the most important meaning it has for that person. In order to uncover the personal etymologies of words, Bleich asks his readers to associate freely on the basis of a text and note down all the personal memories and feelings that the reading experience evokes. Bleich's theory of the idiosyncratic language system seems to rely on a misconceived notion of the role that ostensive definitions have in language use. The readers' responses show that spontaneous associations to personal life seem to colour the readers' interpretations, but such instances are rather rare. According to Holland, an interpretation reflects the reader's identity theme. Language use is regulated by shared rules, but everyone follows the rules in his or her own way. Words mean different things to different people. The problem with this view is that if there is any basis for language use, it seems to be the shared way of following linguistic rules.
Wittgenstein suggests that our understanding of words is related to the shared ways of using words and our understanding of human behaviour. This view seems to give better grounds for understanding similarity and differences in literary interpretations than the theories of Holland, Bleich, Fish and Iser.

Relevance: 20.00%

Abstract:

The PhD dissertation "Bucking Glances: On Body, Gender, Sexuality and Visual Culture Research" consists of a theoretical introduction and five articles published between 2002 and 2005. The articles analyze the position of visual representations in the processes of knowledge production on acceptable genders, bodies, and sexualities in contemporary Western societies. The research material is heterogeneous, consisting of representations of contemporary art, advertisements, and fashion images. The ideological starting point of the PhD dissertation is the politics of the gaze, and the methods used to expose this are the concepts of oppositional gaze, close reading, and resisting reading. The study situates visual representations in dialogue with the concepts of the grotesque and androgyny, as well as with queer theory and theories of the gaze. The research challenges normative meanings of visual representations and opens up space for more non-conventional readings attached to femininity and masculinity. The visual material is read as troubling the prevailing heteronormative gender system. The dissertation also indicates how visual culture research utilizing the approach of queer theory can be fruitful in opposing and re-visioning changes in the repressive gender system. The article "A Heroic Male and A Beautiful Woman. Teemu Mäki, Orlan and the Ambivalence of the Grotesque Body" problematizes the concept of heroic masculinity through the analysis of the Finnish artist Teemu Mäki's masochistic performance The Good Friday (1989). It also analyzes cosmetic surgery, undertaken by the French artist Orlan, as a cultural tool in constructing and visualizing the contemporary, commercial ideals of female beauty. The article "Boys Will Be Girls Will Be Boys Will Be Girls. The Ambivalence of Androgyny in Calvin Klein's Advertisements" is a close reading of the Calvin Klein perfume advertisement One (1998) in reference to the concept of androgyny. The critical point of the article is that androgynous male bodies allow the extension of the categorical boundaries of masculinity and homosexuality, whereas representations of androgynous women feed into the prevailing stereotypes of femininity, namely the fear of fat. The article "See-through Closet: Female Androgyny in the 1990s Fashion Images, New Woman and Lesbian Chic" analyzes the late 1990s fashion advertisements through the concept of female androgyny. The article argues that the figures of the masculine female androgynes in the late 1990s fashion magazines do not problematize the dichotomous gender binary. The women do not pass as men but produce a variation of heterosexual desirability. At the same time, the representations open up space for lesbian gazing and desiring. The article "Why are there no lesbian advertisements?" addresses the issue of femme gaze and desire in relation to heterosexual fashion advertisements from the British edition of the mainstream fashion magazine Vogue. The article considers possibilities for resistant femme visibility, identification, and desire. The article "Woman, Food, Home. Pirjetta Brander's and Heidi Romo's Works as Bucking Representations of Femininity" analyses the production and queering of heteronormative femininity and family through the analysis of art works. The article discusses how the term queer has been translated into Finnish. The article also introduces a new translation for the term queer: the noun vikuuri, i.e. faulty form, and the verb vikuroida, i.e. to buck.
In Finnish, the term vikuuri is the vernacular or broken form of the term figure, i.e. figuuri. Vikuuri represents all forms situated outside the norm and the normative.

Relevance: 20.00%

Abstract:

This study concentrates on the contested concept of pastiche in literary studies. It offers the first detailed examination of the history of the concept from its origins in the seventeenth century to the present, showing how pastiche emerged as a critical concept in interaction with the emerging conception of authorial originality and the copyright laws protecting it. One of the key results of this investigation is the contextualisation of the postmodern debate on pastiche. Even though postmodern critics often emphasise the radical novelty of pastiche, they in fact resuscitate older positions and arguments without necessarily reflecting on their historical conditions. This historical background is then used to analyse the distinction between the primarily French conception of pastiche as the imitation of style and the postmodern notion of it as the compilation of different elements. The latter's vagueness and inclusiveness detract from its value as a critical concept. The study thus concentrates on the notion of stylistic pastiche, challenging the widespread prejudice that it is merely an indication of lack of talent. Because it is multiply based on repetition, pastiche is in fact a highly ambiguous or double-edged practice that calls into question the distinction between repetition and the original, thereby undermining the received notion of individual unique authorship as a fundamental aesthetic value. Pastiche does not, however, constitute a radical upheaval of the basic assumptions on which the present institution of literature relies, since, in order to mark its difference, pastiche always refers to a source outside itself against which its difference is measured. Finally, the theoretical analysis of pastiche is applied to literary works. The pastiches written by Marcel Proust demonstrate how it can become an integral part of a writer's poetics: imitation of style is shown to provide Proust with a way of exploring the role of style as a connecting point between inner vision and reality. The pastiches of the Sherlock Holmes stories by Michael Dibdin, Nicholas Meyer and the duo Adrian Conan Doyle and John Dickson Carr illustrate the functions of pastiche within a genre, detective fiction, that is itself fundamentally repetitive. A.S. Byatt's Possession and D.M. Thomas's Charlotte use Victorian pastiches to investigate the conditions of literary creation in the age of postmodern suspicion of creativity and individuality. The study thus argues that the concept of pastiche has valuable insights to offer to literary criticism and theory, and that literary pastiches, though often dismissed in reviews and criticism, are a particularly interesting object of study precisely because of their characteristic ambiguity.

Relevance: 20.00%

Abstract:

Design embraces several disciplines dedicated to the production of artifacts and services. These disciplines are quite independent and only recently has psychological interest focused on them. Nowadays, the psychological theories of design, also called the design cognition literature, describe the design process from the information processing viewpoint. These models co-exist with the normative standards of how designs should be crafted. In many places there are concrete discrepancies between the two, in a way that resembles the differences between actual and ideal decision-making. This study aimed to explore one such possible difference, related to problem decomposition. Decomposition is a standard component of human problem-solving models and is also included in the normative models of design. The idea of decomposition is to focus on a single aspect of the problem at a time. Despite its significance, the nature of decomposition in conceptual design is poorly understood and has only been preliminarily investigated. This study addressed the status of decomposition in the conceptual design of products using protocol analysis. Previous empirical investigations have argued that there are implicit and explicit forms of decomposition, but have not provided a theoretical basis for the two. Therefore, the current research began by reviewing the problem-solving and design literature and then composing a cognitive model of the solution search of conceptual design. The result is a synthetic view which describes recognition and decomposition as the basic schemata for conceptual design. A psychological experiment was conducted to explore decomposition. In the test, sixteen (N=16) senior students of mechanical engineering created concepts for two alternative tasks. The concurrent think-aloud method and protocol analysis were used to study decomposition. The results showed that despite the emphasis on decomposition in formal education, only a few designers (N=3) used decomposition explicitly and spontaneously in the presented tasks, although the designers in general applied a top-down control strategy. Instead, as inferred from the use of structured strategies, the designers always relied on implicit decomposition. These results confirm the initial observations found in the literature, but they also suggest that decomposition should be investigated further. In the future, the benefits and possibilities of explicit decomposition should be considered along with the cognitive mechanisms behind decomposition. After that, the current results could be reinterpreted.
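For illustration only: the following minimal Python sketch shows the kind of explicit, top-down decomposition described above, in which a design problem is split into named sub-problems that are attended to one at a time and the partial solutions are then combined. The task, sub-problems and candidate solutions are invented examples, not material from the study.

from typing import Dict, List

# A hypothetical conceptual-design task: "move a load to an upper floor".
# Each key is one aspect (sub-problem) of the overall design problem.
SUB_PROBLEMS: Dict[str, List[str]] = {
    "support the load": ["platform", "sling", "hook"],
    "generate lift": ["electric winch", "hydraulic cylinder", "counterweight"],
    "guide the motion": ["vertical rails", "scissor linkage", "rope and pulley"],
}

def decompose_and_solve(sub_problems: Dict[str, List[str]]) -> Dict[str, str]:
    """Explicit decomposition: handle one sub-problem at a time, ignoring the
    others while a candidate solution for it is chosen."""
    concept: Dict[str, str] = {}
    for aspect, candidates in sub_problems.items():
        # A real designer would evaluate the candidates; the sketch simply
        # takes the first one so that it runs as written.
        concept[aspect] = candidates[0]
    return concept

if __name__ == "__main__":
    for aspect, solution in decompose_and_solve(SUB_PROBLEMS).items():
        print(f"{aspect}: {solution}")

The contrast with the implicit decomposition reported in the abstract is that there the sub-problems are never articulated as such; they are only inferred from the structured way the designers proceed.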

Relevance: 20.00%

Abstract:

Our present-day understanding of the fundamental constituents of matter and their interactions is based on the Standard Model of particle physics, which relies on quantum gauge field theories. On the other hand, the large-scale dynamical behaviour of spacetime is understood via the general theory of relativity of Einstein. The merging of these two complementary aspects of nature, quantum and gravity, is one of the greatest goals of modern fundamental physics, the achievement of which would help us understand the short-distance structure of spacetime, thus shedding light on the events in the singular states of general relativity, such as black holes and the Big Bang, where our current models of nature break down. The formulation of quantum field theories in noncommutative spacetime is an attempt to realize the idea of nonlocality at short distances, which our present understanding of these different aspects of nature suggests, and consequently to find testable hints of the underlying quantum behaviour of spacetime. The formulation of noncommutative theories encounters various unprecedented problems, which derive from their peculiar inherent nonlocality. Arguably the most serious of these is the so-called UV/IR mixing, which makes the derivation of observable predictions especially hard by causing new, tedious divergences to which our previously developed renormalization methods for quantum field theories do not apply. In the thesis I review the basic mathematical concepts of noncommutative spacetime, different formulations of quantum field theories in this context, and the theoretical understanding of UV/IR mixing. In particular, I put forward new results, to be published, which show that the theory of quantum electrodynamics in noncommutative spacetime defined via the Seiberg-Witten map also suffers from UV/IR mixing. Finally, I review some of the most promising ways to overcome the problem. The final solution remains a challenge for the future.
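For orientation, and not as a result of this thesis: the noncommutativity and the UV/IR mixing mentioned above are conventionally summarised as follows (normalisation conventions for the momentum contraction vary between references).

\[
  [\hat{x}^{\mu}, \hat{x}^{\nu}] = i\,\theta^{\mu\nu}, \qquad \theta^{\mu\nu} = -\theta^{\nu\mu} = \text{const.},
\]
\[
  \Lambda_{\mathrm{eff}}^{2} = \frac{1}{1/\Lambda^{2} + p \circ p}, \qquad p \circ p \sim p_{\mu}\,\theta^{\mu\alpha}\theta_{\alpha}{}^{\nu}\,p_{\nu}.
\]

Nonplanar loop diagrams are effectively cut off by Λ_eff rather than by the ultraviolet cutoff Λ, so when Λ is removed the would-be ultraviolet divergence reappears as a singularity at vanishing external momentum p → 0; this is the UV/IR mixing that spoils the ordinary renormalization procedure.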

Relevance: 20.00%

Abstract:

In this research, the cooperation between Finnish municipalities and Evangelical Lutheran parishes is studied from the standpoint of institutional interaction. The most essential theoretical background for the study is the differentiation thesis of secularization theory. Cooperation from the viewpoints of both organizations is examined using the functional approach. Furthermore, market theory and other theories are applied in order to place the studied phenomenon in the wider context of the theories of the sociology of religion. Sacralization in modern society and its relationship with the differentiation thesis of secularization theory are the theoretical foci. In addition, along with a descriptive examination of cooperation, the normative sides of the phenomenon are discussed. The survey was conducted among all municipalities and parishes in continental Finland. The questionnaires were sent to all municipal managers of youth work and afternoon activities and to all managers of child, youth and social work in the parishes. The response rate for the municipalities was 73.9 % and for the parishes 69.5 %. In addition, two qualitative data sets were utilized. The aim of the study is to scrutinize what kind of limitations of differentiation can be caused by the interaction between the secular and the religious. In order to solve the problem, an empirical study of sacralization in the modern context is required. For this purpose, the survey was carried out to determine the effects of the religious on the secular and the impact of the secular on the religious. In the articles of the study the following relationships are discussed: the positions of municipalities and parishes in relation to the state and civil society; cooperation in relation to differentiation; sacralization in relation to the differentiation thesis; and cooperation in relation to pluralism. The results of the study highlighted the significance of the cooperation, which, contrary to secularization theory, was connected to religious sacralization. Acceptance of the appearance of religion in cooperation and of parishes' support for municipal functions was high in the municipalities. Religious cooperation was more active than secular cooperation within all fields. This was also true between fields: religiously orientated child work was more active than the societally orientated social work of the church. Religious cooperation in modern fields of activity underlined sacralization. However, the acceptance of sacralization was weaker in cities than in rural areas. Positive relationships between the welfare function of municipalities and the religious function of parishes emphasized the incompleteness of differentiation and the importance of sacralization. The relationship of the function of municipalities with parishes was neither negative nor neutral. Thus, in the most active fields, that is, child work and the traditional social work of the church, the orientation of parishes in cooperation supported the functions of both organizations. In more passive fields, that is, youth work and the societal social work of the church, parishes were orientated towards supporting the municipal function. The orientation of municipalities to religion underlined the perception that the religious function is necessary for cooperation. However, the official character of cooperation supported accommodation to the requirements of societal pluralism. According to the results, sacralization can also be effective at the institutional level.
The religious effect of voluntary cooperation means that religious sacralization can also readjust to modern society. At the same time, the results of the study stressed the importance of institutional autonomy. Thus, the public sector has a central role in successful cooperation. The conditions of cooperation are weakened if there is no official support of cooperation or adjustment to the individual rights of modern society. The results called into question the one-directional assumptions in the secularization paradigm and the modernization theory in the background. In these assumptions, religion that represents the traditional is seen to give way to the modern, especially at the institutional level. Lack of an interactional view was identified as a central weakness of the secularization paradigm. In the theoretical approach created in the study, an interactional view between religious and secular institutions was made possible by limiting the core of the differentiation thesis to autonomy. The counter forces of differentiation are despecialization and sacralization. These changes in the secularization theory bring about new interactivity on the institutional level. In addition to the interactional approach, that is, the secularization and sacralization theory created as a synthesis of the study, interaction between the religious and the secular is discussed from the standpoint of multiple modernities. The spiritual welfare role of religion is seen as a potential supporter of secular institutions. Religion is set theoretically amongst other ideologies and agents, which can create communal bonds in modern society. Key words: cooperation, municipalities, parishes, sacralization, secularization, modernization, multiple modernities, differentiation, interaction, democracy, secularism, pluralism, civil society

Relevance: 20.00%

Abstract:

In What We Owe to Each Other, T.M. Scanlon formulated a new version of the ethical theory called contractualism. This theory took reasons, that is, considerations that count in favour of judgment-sensitive attitudes, to be the fundamental normative notion. It then used normative reasons first to account for evaluative properties. For an object to be valuable, on this view, is for it to have properties that provide reasons to have favourable attitudes towards the bearer of value. Scanlon also used reasons to account for moral wrongness. His contractualism claims that an act is morally wrong if it is forbidden by any set of moral principles that no one could reasonably reject. My thesis consists of five previously published articles which attempt to clarify Scanlon's theory and to defend it against its critics. The first article defends, against Joshua Gert, the idea that normative reason-relations are fundamental. Gert argues that rationality is a more basic notion than reasons and that reasons can be analysed in terms of their rationally requiring and justifying dimensions. The second article explores the relationship between value and reasons. It defends Scanlon's view, according to which reasons are more basic than value, against those who think that reasons are based on the evaluative realm. The last three articles defend Scanlon's views about moral wrongness. The first of them discusses a classic objection to contractualist theories. This objection is that principles which no one could reasonably reject are redundant in accounting for wrongness. This is because we need a prior notion of wrongness to select those principles and because such principles are not required to make actions wrong or to provide reasons against wrong actions. The fourth article explores the distinctive reasons which contractualists claim there are for avoiding wrong actions. The last article argues against the critics of contractualism who claim that contractualism has implausible normative consequences for situations related to the treatment of different-sized groups of people.

Relevance: 20.00%

Abstract:

The study explores new ideational changes in the information strategy of the Finnish state between 1998 and 2007, after a juncture in Finnish governing in the early 1990s. The study scrutinizes the economic reframing of institutional openness in Finland, which comes with significant and often unintended institutional consequences of transparency. Most notably, the constitutional principle of publicity (julkisuusperiaate), a Nordic institutional peculiarity allowing public access to state information, is now becoming an instrument of economic performance and accountability through results. Finland has a long institutional history in the publicity of government information, acknowledged by law since 1951. Nevertheless, access to government information became a policy concern in the mid-1990s, involving a historical narrative of openness as a Nordic tradition of Finnish governing, 'Nordic openness' (pohjoismainen avoimuus). International interest in the transparency of governance has also marked an opening for institutional re-descriptions in the Nordic context. The essential added value, or contradiction, that transparency has for the Finnish conceptualisation of governing is the innovation that public acts of governing can be economically efficient. This is most apparent in the new attempts at providing standardised information on government and expressing it in numbers. In Finland, the publicity of government information has been a concept with democratic connotations, but new, internationally diffusing ideas of performance and national economic competitiveness are discussed under the notion of transparency and its peer concepts, openness and public (sector) information, which are also newcomers to the Finnish vocabulary of governing. The above concepts often conflict with one another, paving the way to unintended consequences for the reforms conducted in their name. Moreover, the study argues that the policy concerns over openness and public sector information are linked to the new drive for transparency. Drawing on theories of new institutionalism, political economy, and conceptual history, the study argues for a reinvention of Nordic openness in two senses. First, in referring to institutional history, the policy discourse of Nordic openness discovers an administrative tradition in response to new dilemmas of public governance. Moreover, this normatively appealing discourse also legitimizes the new ideational changes. Second, a former mechanism of democratic accountability is being reframed with market and performance ideas, mostly originating from the sphere of transnational governance and governance indices. Mobilizing different research techniques and data (public documents of the Finnish government and international organizations, some 30 interviews with Finnish civil servants, and statistical time series), the study asks how the above ideational changes have been possible, pointing to the importance of nationalistically appealing historical narratives and normative concepts of governing. Concerning institutional developments, the study analyses the ideational changes in central steering mechanisms (political, normative and financial steering) and the introduction of budget transparency and performance management in two cases: census data (Population Register Centre) and foreign political information (Ministry for Foreign Affairs). The new policy domain of governance indices is also explored as a type of transparency.
The study further asks what institutional transformations are to be observed in the above cases and in the accountability system. The study concludes that while the information rights of citizens have been reinforced and recalibrated during the period under scrutiny, there has also been a conversion of institutional practices towards economic performance. As the discourse of Nordic openness has been rather unquestioned, the new internationally circulating ideas of transparency and the knowledge economy have entered this discourse without public notice. Since the mid-1990s, state registry data has been perceived as an exploitable economic resource in Finland and in the EU, under the heading of public sector information. This is a development parallel to the new drive for budget transparency in organisations as vital to the state as the Population Register Centre, which has led to the marketization of census data in Finland, an international exception. In the Finnish Ministry for Foreign Affairs, the post-Cold War rhetorical shift from secrecy to performance-driven openness marked a conversion in institutional practices that now hold information services in high regard. But this has not necessarily led to the increased publicity of foreign political information. In this context, openness is also defined as sharing information with select actors, as a trust-based, non-public activity deemed necessary amid global economic competition. Regarding the accountability system, deliberation and performance now overlap, making it increasingly difficult to identify to whom and for what the public administration is accountable. These evolving institutional practices are characterised by unintended consequences and paradoxes. History is a paradoxical component in the above institutional change, as long-term institutional developments now justify short-term reforms.

Relevance: 20.00%

Abstract:

The dissertation consists of four essays and a comprehensive introduction that discusses the topics, methods, and most prominent theories of philosophical moral psychology. I distinguish three main questions: What are the essential features of moral thinking? What are the psychological conditions of moral responsibility? And finally, what are the consequences of empirical facts about human nature to normative ethics? Each of the three last articles focuses on one of these issues. The first essay and part of the introduction are dedicated to methodological questions, in particular the relationship between empirical (social) psychology and philosophy. I reject recent attempts to understand the nature of morality on the basis of empirical research. One characteristic feature of moral thinking is its practical clout: if we regard an action as morally wrong, we either refrain from doing it even against our desires and interests, or else feel shame or guilt. Moral views seem to have a conceptual connection to motivation and emotions – roughly speaking, we can’t conceive of someone genuinely disapproving an action, but nonetheless doing it without any inner motivational conflict or regret. This conceptual thesis in moral psychology is called (judgment) internalism. It implies, among other things, that psychopaths cannot make moral judgments to the extent that they are incapable of corresponding motivation and emotion, even if they might say largely the words we would expect. Is internalism true? Recently, there has been an explosion of interest in so-called experimental philosophy, which is a methodological view according to which claims about conceptual truths that appeal to our intuitions should be tested by way of surveys presented to ordinary language users. One experimental result is that the majority of people are willing to grant that psychopaths make moral judgments, which challenges internalism. In the first article, ‘The Rise and Fall of Experimental Philosophy’, I argue that these results pose no real threat to internalism, since experimental philosophy is based on a too simple conception of the relationship between language use and concepts. Only the reactions of competent users in pragmatically neutral and otherwise conducive circumstances yield evidence about conceptual truths, and such robust intuitions remain inaccessible to surveys for reasons of principle. The epistemology of folk concepts must still be based on Socratic dialogue and critical reflection, whose character and authority I discuss at the end of the paper. The internal connection between moral judgment and motivation led many metaethicists in the past century to believe along Humean lines that judgment itself consists in a pro-attitude rather than a belief. This expressivist view, as it is called these days, has far-reaching consequences in metaethics. In the second essay I argue that perhaps the most sophisticated form of contemporary expressivism, Allan Gibbard’s norm-expressivism, according to which moral judgments are decisions or contingency plans, is implausible from the perspective of the theory of action. In certain circumstances it is possible to think that something is morally required of one without deciding to do so. Morality is not a matter of the will. Instead, I sketch on the basis of Robert Brandom’s inferentialist semantics a weak form of judgment internalism, according to which the content of moral judgment is determined by a commitment to a particular kind of practical reasoning. 
The last two essays in the dissertation emphasize the role of mutual recognition in the development and maintenance of responsible and autonomous moral agency. I defend a compatibilist view of autonomy, according to which agents who are unable to recognize right and wrong or to act accordingly are not responsible for their actions: it is not fair to praise or blame them, since they lacked the relevant capacity to do otherwise. Conversely, autonomy demands an ability to recognize reasons and act on them. But as a long tradition in German moral philosophy, whose best-known contemporary representative is Axel Honneth, has it, both being aware of reasons and acting on them also requires the right sort of higher-order attitudes toward the self. Without self-respect and self-confidence we remain at the mercy of external pressures, even if we have the necessary normative competence. These attitudes toward the self, in turn, are formed through mutual recognition: we value ourselves when those whom we value value us. Thus, standing in the right sort of relations of recognition is indirectly necessary for autonomy and moral responsibility. Recognition and valuing are concretely manifest in actions and institutions, whose practices make possible participation on an equal footing. Seeing this opens the way for a kind of normative social criticism that is grounded in the value of freedom and autonomy, but is not limited to defending negative rights. It thus offers a new way to bridge the gap between liberalism and communitarianism.

Relevance: 20.00%

Abstract:

Einstein's general relativity is a classical theory of gravitation: it is a postulate on the coupling between the four-dimensional, continuous spacetime and the matter fields in the universe, and it yields their dynamical evolution. It is believed that general relativity must be replaced by a quantum theory of gravity at least at the extremely high energies of the early universe and in regions of strong curvature of spacetime, such as black holes. Various attempts to quantize gravity, including conceptually new models such as string theory, have suggested that modifications to general relativity might show up even at lower energy scales. On the other hand, the late-time acceleration of the expansion of the universe, known as the dark energy problem, might also originate from new gravitational physics. Thus, although there has been no direct experimental evidence contradicting general relativity so far (on the contrary, it has passed a variety of observational tests), it is a question worth asking: why should the effective theory of gravity be of the exact form of general relativity? If general relativity is modified, how do the predictions of the theory change? Furthermore, how far can we go with the changes before we are faced with contradictions with the experiments? Along with the changes, could there be new phenomena which we could measure to find hints of the form of the quantum theory of gravity? This thesis is on a class of modified gravity theories called f(R) models, and in particular on the effects of changing the theory of gravity on stellar solutions. It is discussed how experimental constraints from measurements in the Solar System restrict the form of f(R) theories. Moreover, it is shown that models which do not differ from general relativity at the weak-field scale of the Solar System can produce very different predictions for dense stars like neutron stars. Due to the nature of f(R) models, the role of the independent connection of spacetime is emphasized throughout the thesis.
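For reference, the f(R) family of models discussed above is obtained by generalizing the Einstein-Hilbert action, replacing the Ricci scalar R with a function f(R); the choice f(R) = R recovers general relativity. This is the standard definition of the class, not a result of the thesis:

\[
  S = \frac{1}{16\pi G}\int \mathrm{d}^{4}x\,\sqrt{-g}\,f(R) \;+\; S_{\mathrm{matter}}[g_{\mu\nu}, \psi].
\]

The independent connection mentioned at the end refers to the Palatini-type formulation, in which the connection entering R is varied independently of the metric; for f(R) = R the two formulations coincide, but for a general f they yield different field equations.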

Relevance: 20.00%

Abstract:

Arguments arising from quantum mechanics and gravitation theory, as well as from string theory, indicate that the description of space-time as a continuous manifold is not adequate at very short distances. An important candidate for the description of space-time at such scales is provided by noncommutative space-time, where the coordinates are promoted to noncommuting operators. Thus, the study of quantum field theory in noncommutative space-time provides an interesting interface where ordinary field-theoretic tools can be used to study the properties of quantum space-time. The three original publications in this thesis encompass various aspects of the still developing area of noncommutative quantum field theory, ranging from fundamental concepts to model building. One of the key features of noncommutative space-time is the apparent loss of Lorentz invariance, which has been addressed in different ways in the literature. One recently developed approach is to eliminate the Lorentz-violating effects by integrating over the parameter of noncommutativity. Fundamental properties of such theories are investigated in this thesis. Another issue addressed is model building, which is difficult in the noncommutative setting due to severe restrictions on the possible gauge symmetries imposed by the noncommutativity of the space-time. Possible ways to relieve these restrictions are investigated and applied, and a noncommutative version of the Minimal Supersymmetric Standard Model is presented. While putting the results obtained in the three original publications into their proper context, the introductory part of this thesis aims to provide an overview of the present situation in the field.
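For orientation (standard material in this field, not a result of the thesis): in the most common formulation the noncommuting coordinates are implemented on ordinary fields by replacing the pointwise product with the Moyal star product, for a constant antisymmetric θ^{μν}:

\[
  (f \star g)(x) = f(x)\, \exp\!\Big( \tfrac{i}{2}\, \overleftarrow{\partial}_{\mu}\, \theta^{\mu\nu}\, \overrightarrow{\partial}_{\nu} \Big)\, g(x),
  \qquad
  x^{\mu} \star x^{\nu} - x^{\nu} \star x^{\mu} = i\,\theta^{\mu\nu}.
\]

The gauge-symmetry restrictions mentioned above stem from the fact that star-commutators of Lie-algebra-valued fields do not in general close within the algebra, so only particular groups and representations are directly available; this is one motivation for the model-building workarounds studied in the thesis.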

Relevance: 20.00%

Abstract:

During the last decades, mean-field models, in which large-scale magnetic fields and differential rotation arise due to the interaction of rotation and small-scale turbulence, have been enormously successful in reproducing many of the observed features of the Sun. In the meantime, new observational techniques, most prominently helioseismology, have yielded invaluable information about the interior of the Sun. This new information, however, imposes strict conditions on mean-field models. Moreover, most of the present mean-field models depend on knowledge of the small-scale turbulent effects that give rise to the large-scale phenomena. In many mean-field models these effects are prescribed in an ad hoc fashion due to the lack of this knowledge. With large enough computers it would be possible to solve the MHD equations numerically under stellar conditions. However, the problem is too large by several orders of magnitude for present-day and any foreseeable computers. In our view, a combination of mean-field modelling and local 3D calculations is a more fruitful approach. The large-scale structures are well described by global mean-field models, provided that the small-scale turbulent effects are adequately parameterized. The latter can be achieved by performing local calculations, which allow a much higher spatial resolution than can be achieved in direct global calculations. In the present dissertation three aspects of mean-field theories and models of stars are studied. Firstly, the basic assumptions of different mean-field theories are tested with calculations of isotropic turbulence and hydrodynamic, as well as magnetohydrodynamic, convection. Secondly, even if mean-field theory is unable to give the required transport coefficients from first principles, it is in some cases possible to compute these coefficients from 3D numerical models in a parameter range that can be considered to describe the main physical effects in an adequately realistic manner. In the present study, the Reynolds stresses and turbulent heat transport, responsible for the generation of differential rotation, were determined along the mixing length relations describing convection in stellar structure models. Furthermore, the alpha-effect and magnetic pumping due to turbulent convection in the rapid rotation regime were studied. The third area of the present study is the application of the local results in mean-field models, a task we begin to undertake by applying the results concerning the alpha-effect and turbulent pumping in mean-field models describing the solar dynamo.
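For reference, the mean-field framework referred to above is conventionally written as an induction equation for the averaged magnetic field, with the small-scale turbulence entering through the mean electromotive force; in its simplest isotropic textbook parameterization the alpha-effect, turbulent pumping and turbulent diffusion appear as

\[
  \frac{\partial \overline{\boldsymbol{B}}}{\partial t}
  = \nabla \times \Big( \overline{\boldsymbol{U}} \times \overline{\boldsymbol{B}}
    + \overline{\boldsymbol{\mathcal{E}}}
    - \eta\, \nabla \times \overline{\boldsymbol{B}} \Big),
  \qquad
  \overline{\boldsymbol{\mathcal{E}}}
  \approx \alpha\, \overline{\boldsymbol{B}}
    + \boldsymbol{\gamma} \times \overline{\boldsymbol{B}}
    - \eta_{\mathrm{t}}\, \nabla \times \overline{\boldsymbol{B}},
\]

where α describes the alpha-effect, γ the turbulent pumping velocity and η_t the turbulent diffusivity. These are exactly the kinds of transport coefficients that the local 3D calculations described above are used to determine for the global mean-field models.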