184 results for Personal theories

in Helda - Digital Repository of the University of Helsinki


Relevance:

30.00%

Abstract:

This study examines both theoretically and empirically how well the theories of Norman Holland, David Bleich, Wolfgang Iser and Stanley Fish can explain readers' interpretations of literary texts. The theoretical analysis concentrates on their views on language from the point of view of Wittgenstein's Philosophical Investigations. This analysis shows that many of the assumptions related to language in these theories are problematic. The empirical data show that readers often form very similar interpretations. Thus the study challenges the common assumption that literary interpretations tend to be idiosyncratic. The empirical data consist of freely worded written answers to questions on three short stories. The interpretations were made by 27 Finnish university students. Some of the questions addressed issues that were discussed in large parts of the texts, while others referred to issues that were mentioned only in passing or implied. The short stories were "The Witch à la Mode" by D. H. Lawrence, "Rain in the Heart" by Peter Taylor and "The Hitchhiking Game" by Milan Kundera. According to Fish, readers create both the formal features of a text and their interpretation of it according to an interpretive strategy. People who agree form an interpretive community. However, a typical answer usually contains ideas repeated by several readers as well as observations not mentioned by anyone else. Therefore it is very difficult to determine which readers belong to the same interpretive community. Moreover, readers with opposing opinions often seem to pay attention to the same textual features and even acknowledge the possibility of an opposing interpretation; therefore they do not seem to create the formal features of the text in different ways. Iser suggests that an interpretation emerges from the interaction between the text and the reader when the reader determines the implications of the text and in this way fills the "gaps" in the text. Iser believes that the text guides the reader, but as he also believes that meaning is on a level beyond words, he cannot explain how the text directs the reader. The similarity in the interpretations and the fact that the agreement is strongest on issues that are discussed broadly in the text do, however, support his assumption that readers are guided by the text. In Bleich's view, all interpretations have personal motives and each person has an idiosyncratic language system. The situation in which a person learns a word determines the most important meaning it has for that person. In order to uncover the personal etymologies of words, Bleich asks his readers to associate freely on the basis of a text and note down all the personal memories and feelings that the reading experience evokes. Bleich's theory of the idiosyncratic language system seems to rely on a misconceived notion of the role that ostensive definitions have in language use. The readers' responses show that spontaneous associations to personal life seem to colour the readers' interpretations, but such instances are rather rare. According to Holland, an interpretation reflects the reader's identity theme. Language use is regulated by shared rules, but everyone follows the rules in his or her own way. Words mean different things to different people. The problem with this view is that if there is any basis for language use, it seems to be the shared way of following linguistic rules. Wittgenstein suggests that our understanding of words is related to the shared ways of using words and to our understanding of human behaviour. This view seems to give better grounds for understanding similarities and differences in literary interpretations than the theories of Holland, Bleich, Fish and Iser.

Relevance:

30.00%

Abstract:

This study concentrates on the contested concept of pastiche in literary studies. It offers the first detailed examination of the history of the concept from its origins in the seventeenth century to the present, showing how pastiche took shape as a critical concept in interaction with the emerging conception of authorial originality and the copyright laws protecting it. One of the key results of this investigation is the contextualisation of the postmodern debate on pastiche. Even though postmodern critics often emphasise the radical novelty of pastiche, they in fact resuscitate older positions and arguments without necessarily reflecting on their historical conditions. This historical background is then used to analyse the distinction between the primarily French conception of pastiche as the imitation of style and the postmodern notion of it as the compilation of different elements. The latter's vagueness and inclusiveness detracts from its value as a critical concept. The study thus concentrates on the notion of stylistic pastiche, challenging the widespread prejudice that it is merely an indication of lack of talent. Because it is based on repetition in multiple ways, pastiche is in fact a highly ambiguous or double-edged practice that calls into question the distinction between repetition and original, thereby undermining the received notion of individual, unique authorship as a fundamental aesthetic value. Pastiche does not, however, constitute a radical upheaval of the basic assumptions on which the present institution of literature relies, since, in order to mark its difference, pastiche always refers to a source outside itself against which its difference is measured. Finally, the theoretical analysis of pastiche is applied to literary works. The pastiches written by Marcel Proust demonstrate how it can become an integral part of a writer's poetics: imitation of style is shown to provide Proust with a way of exploring the role of style as a connecting point between inner vision and reality. The pastiches of the Sherlock Holmes stories by Michael Dibdin, Nicholas Meyer and the duo Adrian Conan Doyle and John Dickson Carr illustrate the functions of pastiche within a genre, detective fiction, that is itself fundamentally repetitive. A.S. Byatt's Possession and D.M. Thomas's Charlotte use Victorian pastiches to investigate the conditions of literary creation in the age of postmodern suspicion of creativity and individuality. The study thus argues that the concept of pastiche has valuable insights to offer to literary criticism and theory, and that literary pastiches, though often dismissed in reviews and criticism, are a particularly interesting object of study precisely because of their characteristic ambiguity.

Relevance:

30.00%

Abstract:

Our present-day understanding of the fundamental constituents of matter and their interactions is based on the Standard Model of particle physics, which relies on quantum gauge field theories. On the other hand, the large-scale dynamical behaviour of spacetime is understood via Einstein's general theory of relativity. The merging of these two complementary aspects of nature, the quantum and gravity, is one of the greatest goals of modern fundamental physics; achieving it would help us understand the short-distance structure of spacetime, thus shedding light on the events in the singular states of general relativity, such as black holes and the Big Bang, where our current models of nature break down. The formulation of quantum field theories in noncommutative spacetime is an attempt to realize the idea of nonlocality at short distances, which our present understanding of these different aspects of nature suggests, and consequently to find testable hints of the underlying quantum behaviour of spacetime. The formulation of noncommutative theories encounters various unprecedented problems, which derive from their peculiar inherent nonlocality. Arguably the most serious of these is the so-called UV/IR mixing, which makes the derivation of observable predictions especially hard by causing new, tedious divergences to which the well-developed renormalization methods of ordinary quantum field theory do not apply. In the thesis I review the basic mathematical concepts of noncommutative spacetime, different formulations of quantum field theories in this context, and the theoretical understanding of UV/IR mixing. In particular, I put forward new results to be published, which show that quantum electrodynamics in noncommutative spacetime defined via the Seiberg-Witten map also suffers from UV/IR mixing. Finally, I review some of the most promising ways to overcome the problem. The final solution remains a challenge for the future.
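
For orientation (a standard textbook relation, with the parameter θ my notation rather than text from the abstract), the coordinate noncommutativity referred to above is usually written as

\[
[\hat{x}^\mu, \hat{x}^\nu] = i\,\theta^{\mu\nu},
\]

where θ^{μν} is a constant antisymmetric matrix. Roughly speaking, UV/IR mixing arises because θ couples short-distance (UV) and long-distance (IR) scales in nonplanar loop diagrams, so removing the ultraviolet cutoff generates new singularities at small external momenta.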

Relevance:

30.00%

Abstract:

Democratic Legitimacy and the Politics of Rights is a study in normative political theory, based on a comparative analysis of contemporary democratic theories, classified roughly as conventional liberal, deliberative democratic and radical democratic. Its focus is on the conceptual relationship between alternative sources of democratic legitimacy: democratic inclusion and liberal rights. The relationship between rights and democracy is studied through the following questions: Are rights to be seen as external constraints on democracy or as objects of democratic decision-making processes? Are individual rights threatened by public participation in politics? Do constitutionally protected rights limit the inclusiveness of democratic processes? Are liberal values such as individuality, autonomy and liberty, and democratic values such as equality, inclusion and popular sovereignty, mutually conflictual or supportive? Analyzing the feminist critique of liberal discourse, the dissertation also raises the question of Enlightenment ideals in current political debates: are the universal norms of liberal democracy inherently dependent on the rationalist grand narratives of modernity and incompatible with the ideal of diversity? Part I of the thesis introduces the sources of democratic legitimacy as presented in the alternative democratic models. Part II analyses how the relationship between rights and democracy is theorized in them. Part III contains arguments by feminists and radical democrats against the tenets of universalist liberal democratic models and responds to that critique by partly endorsing, partly rejecting it. The central argument promoted in the thesis is that while the deconstruction of modern rationalism indicates that rights are political constructions, as opposed to externally given moral constraints on politics, this insight does not delegitimize the politics of universal rights as an inherent part of democratic institutions. The research indicates that democracy and universal individual rights are mutually interdependent rather than oppositional, and that democracy is more dependent on an unconditional protection of universal individual rights when it is conceived as inclusive, participatory and plural rather than as robust majoritarian rule. The central concepts are liberalism, democracy, legitimacy, deliberation, inclusion, equality, diversity, conflict, public sphere, rights, individualism, universalism and contextuality. The authors discussed include John Rawls, Jürgen Habermas, Seyla Benhabib, Iris Young, Chantal Mouffe and Stephen Holmes. The research focuses on contemporary political theory, but the more classical work of John S. Mill, Benjamin Constant, Isaiah Berlin and Hannah Arendt is also included.

Relevance:

30.00%

Abstract:

Einstein's general relativity is a classical theory of gravitation: it is a postulate on the coupling between the four-dimensional, continuous spacetime and the matter fields in the universe, and it yields their dynamical evolution. It is believed that general relativity must be replaced by a quantum theory of gravity at least at the extremely high energies of the early universe and in regions of strong spacetime curvature, such as black holes. Various attempts to quantize gravity, including conceptually new models such as string theory, have suggested that modifications to general relativity might show up even at lower energy scales. On the other hand, the late-time acceleration of the expansion of the universe, known as the dark energy problem, might also originate from new gravitational physics. Thus, although there has been no direct experimental evidence contradicting general relativity so far - on the contrary, it has passed a variety of observational tests - it is worth asking why the effective theory of gravity should be of the exact form of general relativity. If general relativity is modified, how do the predictions of the theory change? Furthermore, how far can we go with the changes before we are faced with contradictions with experiment? Along with the changes, could there be new phenomena which we could measure to find hints of the form of the quantum theory of gravity? This thesis is on a class of modified gravity theories called f(R) models, and in particular on the effects of changing the theory of gravity on stellar solutions. It is discussed how experimental constraints from measurements in the Solar System restrict the form of f(R) theories. Moreover, it is shown that models which do not differ from general relativity at the weak-field scale of the Solar System can produce very different predictions for dense stars such as neutron stars. Due to the nature of f(R) models, the role of the independent connection of spacetime is emphasized throughout the thesis.
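
As a point of reference (a standard form of the action, with the symbols κ and S_m my notation rather than text from the abstract), f(R) models generalize the Einstein-Hilbert action of general relativity by replacing the Ricci scalar R with a function f(R):

\[
S = \frac{1}{2\kappa}\int \mathrm{d}^4x\,\sqrt{-g}\,f(R) + S_{\mathrm{m}}[g_{\mu\nu},\psi], \qquad \kappa = 8\pi G,
\]

where S_m is the matter action. General relativity corresponds to f(R) = R. In the Palatini-type formulation alluded to at the end of the abstract, the connection is varied independently of the metric, and the resulting field equations differ from those of the metric formulation whenever f(R) is nonlinear in R.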

Relevance:

30.00%

Abstract:

Arguments arising from quantum mechanics and gravitation theory, as well as from string theory, indicate that the description of space-time as a continuous manifold is not adequate at very short distances. An important candidate for the description of space-time at such scales is provided by noncommutative space-time, where the coordinates are promoted to noncommuting operators. Thus, the study of quantum field theory in noncommutative space-time provides an interesting interface where ordinary field-theoretic tools can be used to study the properties of quantum space-time. The three original publications in this thesis encompass various aspects of the still-developing area of noncommutative quantum field theory, ranging from fundamental concepts to model building. One of the key features of noncommutative space-time is the apparent loss of Lorentz invariance, which has been addressed in different ways in the literature. One recently developed approach is to eliminate the Lorentz-violating effects by integrating over the parameter of noncommutativity. Fundamental properties of such theories are investigated in this thesis. Another issue addressed is model building, which is difficult in the noncommutative setting due to severe restrictions on the possible gauge symmetries imposed by the noncommutativity of the space-time. Possible ways to relieve these restrictions are investigated and applied, and a noncommutative version of the Minimal Supersymmetric Standard Model is presented. While putting the results obtained in the three original publications into their proper context, the introductory part of this thesis aims to provide an overview of the present situation in the field.
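
As background (standard textbook material rather than a quotation from the thesis), a common way to realize noncommuting coordinates is to keep ordinary functions but multiply them with the Moyal star product, defined for a constant noncommutativity parameter θ^{μν} by

\[
(f \star g)(x) = f(x)\,\exp\!\left(\frac{i}{2}\,\overleftarrow{\partial}_{\mu}\,\theta^{\mu\nu}\,\overrightarrow{\partial}_{\nu}\right) g(x),
\qquad
[x^\mu \stackrel{\star}{,} x^\nu] = i\,\theta^{\mu\nu}.
\]

The requirement that gauge transformations close under star multiplication is one source of the restrictions on possible gauge symmetries mentioned above.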

Relevance:

30.00%

Abstract:

During the last decades, mean-field models, in which large-scale magnetic fields and differential rotation arise due to the interaction of rotation and small-scale turbulence, have been enormously successful in reproducing many of the observed features of the Sun. In the meantime, new observational techniques, most prominently helioseismology, have yielded invaluable information about the interior of the Sun. This new information, however, imposes strict conditions on mean-field models. Moreover, most of the present mean-field models depend on knowledge of the small-scale turbulent effects that give rise to the large-scale phenomena. In many mean-field models these effects are prescribed in an ad hoc fashion due to the lack of this knowledge. With large enough computers it would be possible to solve the MHD equations numerically under stellar conditions. However, the problem is too large by several orders of magnitude for present-day and any foreseeable computers. In our view, a combination of mean-field modelling and local 3D calculations is a more fruitful approach. The large-scale structures are well described by global mean-field models, provided that the small-scale turbulent effects are adequately parameterized. The latter can be achieved by performing local calculations, which allow a much higher spatial resolution than can be achieved in direct global calculations. In the present dissertation three aspects of mean-field theories and models of stars are studied. Firstly, the basic assumptions of different mean-field theories are tested with calculations of isotropic turbulence and hydrodynamic, as well as magnetohydrodynamic, convection. Secondly, even if mean-field theory is unable to give the required transport coefficients from first principles, it is in some cases possible to compute these coefficients from 3D numerical models in a parameter range that can be considered to describe the main physical effects in an adequately realistic manner. In the present study, the Reynolds stresses and turbulent heat transport, responsible for the generation of differential rotation, were determined along the mixing length relations describing convection in stellar structure models. Furthermore, the alpha-effect and magnetic pumping due to turbulent convection in the rapid-rotation regime were studied. The third area of the present study is the application of the local results in mean-field models, a task we begin by applying the results concerning the alpha-effect and turbulent pumping in mean-field models describing the solar dynamo.
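
For context (a standard minimal parametrization rather than text from the dissertation), mean-field dynamo models evolve the mean magnetic field with the collective effect of the small-scale turbulence entering through the mean electromotive force:

\[
\partial_t \overline{\boldsymbol{B}} = \nabla\times\big(\overline{\boldsymbol{U}}\times\overline{\boldsymbol{B}} + \overline{\boldsymbol{\mathcal{E}}} - \eta\,\nabla\times\overline{\boldsymbol{B}}\big),
\qquad
\overline{\boldsymbol{\mathcal{E}}} \approx \alpha\,\overline{\boldsymbol{B}} + \boldsymbol{\gamma}\times\overline{\boldsymbol{B}} - \eta_{\mathrm{t}}\,\nabla\times\overline{\boldsymbol{B}},
\]

where α is the alpha-effect, γ the turbulent pumping velocity, η the molecular and η_t the turbulent magnetic diffusivity. Local 3D calculations of the kind described above are one way of supplying such transport coefficients to global mean-field models.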

Relevance:

30.00%

Abstract:

Quantum chromodynamics (QCD) is the theory describing the interactions between quarks and gluons. At low temperatures, quarks are confined, forming hadrons, e.g. protons and neutrons. However, at extremely high temperatures the hadrons break apart and the matter transforms into a plasma of individual quarks and gluons. In this thesis the quark-gluon plasma (QGP) phase of QCD is studied using lattice techniques in the framework of the dimensionally reduced effective theories EQCD and MQCD. Two quantities are of particular interest: the pressure (or grand potential) and the quark number susceptibility. At high temperatures the pressure admits a generalised coupling-constant expansion, where some coefficients are non-perturbative. We determine the first such contribution, of order g^6, by performing lattice simulations in MQCD. This requires high-precision lattice calculations, which we perform with different numbers of colors N_c to obtain the N_c-dependence of the coefficient. The quark number susceptibility is studied by performing lattice simulations in EQCD. We measure both flavor singlet (diagonal) and non-singlet (off-diagonal) quark number susceptibilities. The finite chemical potential results are obtained using analytic continuation. The diagonal susceptibility approaches the perturbative result above 20 T_c, but below that temperature we observe significant deviations. The results agree well with 4d lattice data down to temperatures of 2 T_c.
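
For reference (a standard definition rather than a quotation from the thesis), the quark number susceptibilities discussed above are second derivatives of the pressure with respect to the quark chemical potentials μ_i, evaluated at vanishing chemical potential:

\[
\chi_{ij}(T) = \left.\frac{\partial^2 p(T,\mu)}{\partial\mu_i\,\partial\mu_j}\right|_{\mu = 0},
\]

where i = j gives the diagonal and i ≠ j the off-diagonal susceptibilities measured in the simulations described above.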

Relevance:

30.00%

Abstract:

This study examines different ways in which the concept of media pluralism has been theorized and used in contemporary media policy debates. Access to a broad range of different political views and cultural expressions is often regarded as a self-evident value in both theoretical and political debates on media and democracy. Opinions on the meaning and nature of media pluralism as a theoretical, political or empirical concept, however, are many, and it can easily be adjusted to different political purposes. The study aims to analyse the ambiguities surrounding the concept of media pluralism in two ways: by deconstructing its normative roots from the perspective of democratic theory, and by examining its different uses, definitions and underlying rationalities in current European media policy debates. The first part of the study examines the values and assumptions behind the notion of media pluralism in the context of different theories of democracy and the public sphere. The second part then analyses and assesses the deployment of the concept in contemporary European policy debates on media ownership and public service media. Finally, the study critically evaluates various attempts to create empirical indicators for measuring media pluralism and discusses their normative implications and underlying rationalities. The analysis of contemporary policy debates indicates that the notion of media pluralism has been too readily reduced to an empty catchphrase or conflated with consumer choice and market competition. In this narrow technocratic logic, pluralism is often unreflectively associated with quantitative data in a way that leaves unexamined key questions about social and political values, democracy, and citizenship. The basic argument advanced in the study is that media pluralism needs to be rescued from its depoliticized uses and re-imagined more broadly as a normative value that refers to the distribution of communicative power in the public sphere. Instead of something that could simply be measured through the number of media outlets available, the study argues that media pluralism should be understood in terms of its ability to challenge inequalities in communicative power and create a more democratic public sphere.

Relevance:

30.00%

Abstract:

In this thesis, the possibility of extending the Dirac quantization condition for magnetic monopoles to noncommutative space-time is investigated. The three publications that this thesis is based on are all directly linked to this investigation. Noncommutative solitons have been found within certain noncommutative field theories, but it is not known whether they possess only topological charge or also magnetic charge. This is a consequence of the fact that the noncommutative topological charge need not coincide with the noncommutative magnetic charge, although the two are equivalent in the commutative context. The aim of this work is to begin to fill this gap in knowledge. The method of investigation is perturbative and leaves open the question of whether a nonperturbative source for the magnetic monopole can be constructed, although some aspects of such a generalization are indicated. The main result is that while the noncommutative Aharonov-Bohm effect can be formulated in a gauge-invariant way, the Dirac quantization condition is not satisfied in the case of a perturbative source for the point-like magnetic monopole.
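
For orientation (the standard commutative statement in one common normalization, not the noncommutative analysis of the thesis itself), the Dirac quantization condition relates the electric charge e and the magnetic charge g; in natural units it reads

\[
e\,g = 2\pi n, \qquad n \in \mathbb{Z}.
\]

The thesis asks whether a relation of this kind survives when the space-time coordinates are made noncommutative; the perturbative result reported above is that it does not.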

Relevance:

20.00%

Abstract:

The dissertation analyzes and elaborates upon the changing map of U.S. ethno-racial formation from the vantage point of North American Studies, multi-disciplinary cultural studies, and the criticism of visual culture. The focus is on four contemporary Mexican American (Chicana) women photographers, whose art production is discussed, on the one hand, in the context of the Euro-American history of photographic genres and, on the other hand, in the context of so-called decolonizing cultural and academic discourses produced by Mexican Americans themselves. The manuscript consists of two parts. Part I outlines the theoretical and methodological domain of the study, positioning it in the interstices of American studies, European postmodern criticism, postcolonial feminist theory, and the theories of visual culture, particularly of art photography. In addition, the main issues and paradigms of Chicano Studies (Mexican American ethnic studies) are introduced. Part II consists of seven essays, each of which discusses rather independently a particular photographic work or a series of photographs, formulating and defending arguments about their meaning, their position in the history of photographic genres, and their cultural and socio-political significance. The study closes with a discussion of ethno-racial identity formation and the role of Chicana photography therein - in embodying and reproducing new subjectivities, alternative categories of knowledge, and open-ended historical narratives. It is argued that, symbolically, the "Wild Zone" of gendered and race-specific knowledge becomes associated with the body of the mother, a recurrent image in the Chicana art works under discussion. Embedded in this image, the construction of an alternative notion of a family thus articulates the parameters of a matrifocal ethno-racial community unified by the proliferation of differences rather than by the conformities typical of nationalistic ideologies. While focusing on art photography, the study as a whole simultaneously constructs, from a European vantage point, a "thick" description of Mexican American history, identities, communities, cultural practices, and self-representations, about which very little is known in Finland.

Relevance:

20.00%

Abstract:

The aim of the study is to explain how paradise beliefs are born, from the viewpoint of the mental functions of the human mind. The focus is on the observation that paradise beliefs across the world are mutually more similar than dissimilar. Using recent theories and results from the cognitive and evolutionary study of religion, as well as from studies of environmental preferences, I suggest that this is because pan-human unconscious motivations, the architecture of the mind, and the way the human mind processes information constrain the possible repertoire of paradise beliefs. The study is divided into two parts, theoretical and empirical. The arguments of the theoretical part are tested in the empirical part with two data sets. The first data set was collected using an Internet survey. The second data set was derived from literary sources. The first data set tests the assumption that intuitive conceptions of an environment of dreams generally follow the outlines set by evolved environmental preferences, but that they can be tweaked by modifying the presence of desirable elements. The second data set tests the assumption that familiarity is a dominant factor determining the content of paradise beliefs. The results of the study show that, in addition to the widely studied belief in supernatural agents, belief in supernatural environments wells up from the natural functioning of the human mind, attesting to the view that religious thinking and ideas are natural for the human species and are produced by the same mental mechanisms as other cultural information. The results also help us to understand that the mental structures behind belief in the supernatural have a wider scope than has been previously acknowledged.

Relevance:

20.00%

Abstract:

This study is about governance in contemporary China. The focus is on Qinghai Province, one of the twelve provincial-level units included in the western region development strategy launched in 2000 by the government of China. Qinghai, the subject of the case study, is not a very well-known province. Hence, this study is significant because it provides new knowledge about the province of Qinghai, its governance and diverse challenges, and deepens one's overall knowledge regarding China. Qinghai Province is one of the slowest-developing regions of China. My research problem is to analyze to what extent provincial development correlates with the quality of governance. The central concept of this research is good governance. The dissertation employs a grounded theory approach, while the theoretical framework of the study is built on the Three Worlds approach to analyze the three main themes, namely the environment, economic development, and cultural diversity, and to support the empirical work. Philosophical issues in the humanities and contemporary theories of governance are brought in to provide a deeper understanding of governance, and to understand to what extent, and how, characteristics of good governance (derived from the Western canon) are combined with Chinese tradition. A qualitative research method is chosen to provide a deeper understanding of the contemporary challenges of Qinghai (and China) and to provide some insight into the role and impact of governance on provincial development. The study also focuses on the Tibetan ethnic group in order to develop as full an understanding of the province as possible. The challenges faced by Qinghai concern in particular its environment, economic development, and cultural diversity, all of which are closely interrelated. The findings demonstrate that Qinghai Province is not a powerful actor, because it has weak communications with the central government and weak collaboration with its stakeholders and civil society. How Qinghai's provincial government conducts provincial development remains a key question in terms of shaping the province's future. The question is how Qinghai's government is best able to govern in a way that is beneficial for the people. This study demonstrates that this is a significant question that challenges governance everywhere, and particularly in China given the absence of democracy. The study provides ingredients for reflection on how a provincial government can be motivated to choose to govern in a sustainable way, instead of leaning on growth factors with too little consideration of the impact on the environment and the people.

Relevance:

20.00%

Abstract:

This study examines the Chinese press discussion about democratic centralism in 1978-1981 in newspapers, political journals and academic journals distributed nationwide. It is thus a study of intellectual trends during the Hua Guofeng period and of the methods, strategies, and techniques of public political discussion of the time. In addition, this study presents democratic centralism as a comprehensive theory of democracy and evaluates this theory. It compares the Chinese theory of democratic centralism with Western traditions of democracy, not only with the standard liberal theory but also with traditions of participatory and deliberative democracy, in order to evaluate whether the Chinese theory of democratic centralism forms a legitimate theory of democracy. It shows that the Chinese theory comes close to participatory types of democracy and shares with the theory of deliberative democracy a conception of democracy as communication. Therefore, the Chinese experience provides some empirical evidence of the practicability of these traditions of democracy. Simultaneously, this study uses the experiences of participatory democracies outside China to explain some earlier findings about Chinese practices. The dissertation also compares Chinese theory with some common Western theories and models of Chinese society, as well as with Western understandings of Chinese political processes. It thus aims at opening more dialogue between Chinese and Western political theories and understandings of the Chinese polity. This study belongs to the scholarly traditions of the history of ideas, political philosophy, comparative politics, and China studies. The main finding of this study is that the Chinese theory of democratic centralism is essentially a theory about democracy, but whether its scrupulous practice alone would be sufficient to make a country a democracy depends on which established definition of democracy one applies and on what kinds of democratic deficits are seen as acceptable within a truly democratic system. Nevertheless, since the Chinese theory of democratic centralism fits well with some established definitions of democracy, and since democratic deficits are a reality in all actual democracies, the Chinese themselves are talking about democracy in terms acceptable to Western political philosophy as well.