33 results for DOUBLE-GYRE FLOW

in Helda - Digital Repository of the University of Helsinki


Relevance: 20.00%

Abstract:

This study concentrates on the contested concept of pastiche in literary studies. It offers the first detailed examination of the history of the concept from its origins in the seventeenth century to the present, showing how pastiche emerged as a critical concept in interaction with the emerging conception of authorial originality and the copyright laws protecting it. One of the key results of this investigation is the contextualisation of the postmodern debate on pastiche. Even though postmodern critics often emphasise the radical novelty of pastiche, they in fact resuscitate older positions and arguments without necessarily reflecting on their historical conditions. This historical background is then used to analyse the distinction between the primarily French conception of pastiche as the imitation of style and the postmodern notion of it as the compilation of different elements. The latter's vagueness and inclusiveness detract from its value as a critical concept. The study thus concentrates on the notion of stylistic pastiche, challenging the widespread prejudice that it is merely an indication of lack of talent. Because it rests on repetition in multiple ways, pastiche is in fact a highly ambiguous or double-edged practice that calls into question the distinction between repetition and the original, thereby undermining the received notion of individual unique authorship as a fundamental aesthetic value. Pastiche does not, however, constitute a radical upheaval of the basic assumptions on which the present institution of literature relies, since, in order to mark its difference, pastiche always refers to a source outside itself against which its difference is measured. Finally, the theoretical analysis of pastiche is applied to literary works.
The pastiches written by Marcel Proust demonstrate how pastiche can become an integral part of a writer's poetics: imitation of style is shown to provide Proust with a way of exploring the role of style as a connecting point between inner vision and reality. The pastiches of the Sherlock Holmes stories by Michael Dibdin, Nicholas Meyer and the duo Adrian Conan Doyle and John Dickson Carr illustrate the functions of pastiche within a genre, detective fiction, that is itself fundamentally repetitive. A.S. Byatt's Possession and D.M. Thomas's Charlotte use Victorian pastiches to investigate the conditions of literary creation in the age of postmodern suspicion of creativity and individuality. The study thus argues that the concept of pastiche has valuable insights to offer to literary criticism and theory, and that literary pastiches, though often dismissed in reviews and criticism, are a particularly interesting object of study precisely because of their characteristic ambiguity.

Relevance: 20.00%

Abstract:

Flow experience is often defined either as an experience of high concentration and enjoyment or as a situation where high challenges are matched with high skills. According to core-emotion theories, the experience of any emotion contains two core emotions: valence and arousal. Through an accurate mathematical model, the present study investigated whether the experience of concentration and enjoyment is related to situations where both challenge and skills are high and in balance. Further, it was investigated what sort of core emotions are related to differing relationships between challenge and skills. Finally, university students' experiences of their natural study environments were described in terms of core emotions and in terms of relationships between challenge and skills. The participants were 55 university students who took part in a two-week research period. Altogether 3367 questionnaire answers were collected with the CASS experience-sampling method, running on 3G mobile phones. The relationship between challenge and skills (competence) was defined in an exact way in polar coordinates. An enjoyable and concentrated flow experience was defined as a sum variable of absorption, interest and enthusiasm. Core emotions were calculated with factor analysis from nine emotion variables. As expected, an experience of concentration and enjoyment was, on average, related to situations where both challenge and skills were high and in balance. This was not, however, the case in every situation. Thus, it should be taken into consideration how flow experience is operationalised in experience-sampling studies. When flow experience was defined as a situation of high challenge and high skills, it was often related to high-valence and high-arousal emotions such as excitement or enthusiasm. A happier or more tranquil enjoyment was related to situations of moderate challenge and high skills. Experiences differed clearly between the various natural study environments.
At lectures, students were often bored or mentally absent and did not experience challenges. In small groups, students were often excited or enthusiastic and showed an optimal balance between challenge and skills. At the library, students felt satisfied and were engaged in highly challenging work.
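The polar-coordinate treatment of the challenge-skill relationship described above can be sketched as follows. The function name, the use of raw scores as Cartesian inputs, and the example values are illustrative assumptions; the thesis does not publish its exact parameterisation:

```python
import math

def challenge_skill_polar(challenge, skill):
    """Map a (challenge, skill) pair to polar coordinates.

    The radius reflects the overall intensity of the situation, while
    the angle encodes the challenge/skill ratio: 45 degrees means
    perfect balance, larger angles mean challenge exceeds skill.
    This is an illustrative sketch, not the study's exact model.
    """
    r = math.hypot(challenge, skill)
    theta = math.degrees(math.atan2(challenge, skill))
    return r, theta

# A balanced high-challenge, high-skill situation sits at 45 degrees
r, theta = challenge_skill_polar(5.0, 5.0)
```

Under this convention, "flow as balance" corresponds to angles near 45 degrees combined with a large radius, which makes the definition testable against the self-reported enjoyment measures.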

Relevance: 20.00%

Abstract:

The main goal of this study was to explore the experiences induced by playing digital games (i.e. the meaning of playing). In addition, the study aimed at structuring the larger entities of gaming experience. This was done using theory-driven and data-grounded approaches. Previously, gaming experiences had not been explored as a whole, and the consideration of gaming experiences on the basis of psychological theories and studies has also been rare. The secondary goal of this study was to clarify whether the individual meanings of playing are connected with flow experience in an occasional gaming situation. Flow is an enjoyable experience, and activities that induce flow are usually gladly repeated. Flow has previously been shown to be an essential concept in the context of playing, but the relations between meanings of playing and flow have not been studied. The relations between gender and gaming experiences were examined throughout the study, as well as the relationship between gaming frequency and experiences. The study was divided into two sections, of which the first was composed according to the main goals. Its data were gathered using an Internet questionnaire. The second section covered the themes that were formulated on the basis of the secondary aims. In that section, the participants played a driving game for 40 minutes and then filled in a questionnaire, which measured flow-related experiences. In both sections, the participants were mainly young Finnish adults. All the participants in the second section (n = 60) had already participated in the first section (n = 267). Both qualitative and quantitative research techniques were used in the study. In the first section, freely described gaming experiences were classified according to grounded theory. After that, the most common categories were further classified into the basic structures of gaming experience, some according to the existing theories of experience structure and some according to the data (i.e.
grounded theory). In the second section, flow constructs were measured and used as grouping variables in a cluster analysis. Three meaningful groups were compared with regard to the meanings of gaming explored in the first section. The descriptions of gaming experiences were classified into four main categories: conceptions of the gaming process, emotions, motivations and focused attention. All the theory-driven categories were found in the data. This frame of reference can be utilized in the future when the reliability and validity of existing methods for measuring gaming experiences are assessed or new methods are developed. The connection between the individual relevance of gaming and flow was minor. However, when the scope was narrowed to the relations between the primary meanings of playing and flow, it was noticed that attributing enjoyment to gaming did not lead to the strongest flow experiences. This implies that the issue should be studied further. As a whole, this study shows that gamer-related research from numerous vantage points can benefit from concentrating on gaming experiences.
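The clustering step mentioned above (grouping gaming sessions by flow-construct scores and then comparing the groups) can be illustrated with a minimal k-means sketch. The data points, the choice of k = 2, and the use of k-means itself are assumptions for illustration; the abstract does not name the clustering algorithm used:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: group observations (e.g. per-participant
    flow-construct scores) into k clusters. Illustrative only."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest centroid (squared distance)
            nearest = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
            clusters[nearest].append(p)
        # recompute centroids as cluster means; keep old centroid if empty
        centroids = [
            tuple(sum(vals) / len(cl) for vals in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Two obvious groups of hypothetical (absorption, enjoyment) scores
scores = [(1.0, 1.2), (0.9, 1.0), (4.8, 4.9), (5.0, 5.1)]
_, groups = kmeans(scores, k=2)
```

Once such groups exist, the per-group distribution of the "meanings of playing" categories can be compared, which is the comparison the study reports.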

Relevance: 20.00%

Abstract:

This study explored the possibilities that psychophysiological methodology offers to flow research. Facial electromyography has often been used to index valence, and electrodermal activity to index arousal, the two basic dimensions of emotion. It was hypothesized that these measures can also be used to examine enjoyment, a basic component of flow experience. A digital game was used to induce flow, and the physiological activity of 32 subjects was measured continuously. The Flow State Scale was used to assess flow. Activity of the corrugator supercilii muscle, an index of negative valence, was negatively correlated with flow reports, as hypothesized. Contrary to the hypothesis, skin conductance level, an index of arousal, was unrelated to self-reported flow. The results for the association between flow and zygomaticus major and orbicularis oculi muscle activities, indices of positive valence, were inconclusive, possibly due to an experimental design in which only tonic measures were available. Psychophysiological methods are recommended for future studies of flow. Specifically, the time-series approach may be particularly viable for examining the temporal aspects of flow, an area currently unexplored. Furthermore, it is suggested that digital game research would benefit from psychophysiological study of game-related flow.

Relevance: 20.00%

Abstract:

The role of the immune system is to protect an organism against pathogens while maintaining tolerance against self. T cells are an essential component of the immune system, and they develop in the thymus. The AIRE (autoimmune regulator) gene product plays an important role in T cell development, as it promotes the expression of peripheral tissue antigens in the thymus. Developing T cells (thymocytes) that recognize self-antigens with high affinity are deleted. However, this deletion process is not perfect, and not all autoreactive T cells are destroyed. When the distinction between self and non-self fails, tolerance breaks down and the immune system attacks the host's own tissues. This results in autoimmunity. Regulatory T cells contribute to the maintenance of self-tolerance. They can actively suppress the function of autoreactive cells. Several populations of cells with regulatory properties have been described, but the best characterized population is the natural regulatory T cells (Treg cells), which develop in the thymus and express the transcription factor FOXP3. The thymic development of Treg cells in humans is the subject of this thesis. Thymocytes at different developmental stages were analyzed using flow cytometry. The CD4-CD8- double-negative (DN) thymocytes are the earliest precursors in the T cell lineage. My results show that the Treg cell marker FOXP3 is already up-regulated in a subset of these DN thymocytes. FOXP3+ cells were also found among the more mature CD4+CD8+ double-positive (DP) cells and among the CD4+ and CD8+ single-positive (SP) thymocytes. The different developmental stages of the FOXP3+ thymocytes were isolated and their gene expression examined by quantitative PCR. T cell receptor (TCR) repertoire analysis was used to compare these different thymocyte populations.
My data show that in humans commitment to the Treg cell lineage is an early event and suggest that the development of Treg cells follows a linear developmental pathway, with FOXP3+ DN precursors evolving through the DP stage to become mature CD4+ Treg cells. Most T cells have only one kind of TCR on their cell surface, but a small fraction of cells expresses two different TCRs. My results show that the expression of two different TCRs is enriched among Treg cells. Furthermore, both receptors were capable of transmitting signals when bound by a ligand. By extrapolating flow cytometric data, it was estimated that the majority of peripheral blood Treg cells are indeed dual-specific. The high frequency of dual-specific cells among human Treg cells suggests that dual specificity has a role in directing these cells to the Treg cell lineage. It is known that both genetic predisposition and environmental factors influence the development of autoimmunity. It is also known that the dysfunction or absence of Treg cells leads to the development of autoimmune manifestations. APECED (autoimmune polyendocrinopathy-candidiasis-ectodermal dystrophy) is a rare monogenic autoimmune disease caused by mutations in the AIRE gene. In the absence of the AIRE gene product, the deletion of self-specific T cells is presumably disturbed, and autoreactive T cells escape to the periphery. I examined whether Treg cells are also affected in APECED. I found that the frequency of FOXP3+ Treg cells and the level of FOXP3 expression were significantly lower in APECED patients than in controls. Additionally, when studied in cell cultures, the suppressive capacity of the patients' Treg cells was impaired. Furthermore, repertoire analysis showed that the TCR repertoire of Treg cells was altered. These results suggest that AIRE contributes to the development of Treg cells in humans and that the selection of Treg cells is impaired in APECED patients.
In conclusion, my thesis elucidates the developmental pathway of Treg cells in humans. The differentiation of Tregs begins early during thymic development, and both the cells' dual specificity and AIRE probably affect the final commitment of Treg cells.

Relevance: 20.00%

Abstract:

Standards have been established to regulate microbial and preservative contents in order to assure that foods are safe for the consumer. In the case of a food-related disease outbreak, it is crucial to be able to detect and identify the cause of the disease quickly and accurately. In addition, for everyday control of the microbial and preservative contents of food, the detection methods must be easy to perform on numerous food samples. In the present study, quicker alternative methods were developed for the identification of bacteria by DNA fingerprinting. A flow cytometry method was developed as an alternative to pulsed-field gel electrophoresis, the "golden method". DNA fragment sizing by an ultrasensitive flow cytometer was able to discriminate species and strains reproducibly and comparably to pulsed-field gel electrophoresis. The new method was hundreds of times faster and 200,000 times more sensitive. Additionally, another DNA fingerprinting identification method was developed, based on single-enzyme amplified fragment length polymorphism (SE-AFLP). This method allowed the differentiation of genera, species, and strains of pathogenic bacteria of Bacillus, Staphylococcus, Yersinia, and Escherichia coli. The fingerprinting patterns obtained by SE-AFLP were simpler and easier to analyze than those obtained by traditional double-enzyme amplified fragment length polymorphism. Nisin (E234) is added as a preservative to different types of foods, especially dairy products, around the world. Various detection methods exist for nisin, but they lack sensitivity, speed or specificity. In the present study, a sensitive nisin-induced green fluorescent protein (GFPuv) bioassay was developed using the Lactococcus lactis two-component signal system NisRK and the nisin-inducible nisA promoter. The bioassay was extremely sensitive, with a detection limit of 10 pg/ml in culture supernatant.
In addition, it was suitable for quantification in various food matrices, such as milk, salad dressings, processed cheese, liquid eggs, and canned tomatoes. Wine has good antimicrobial properties due to its alcohol concentration, low pH, and organic content, and is therefore often assumed to be microbially safe to consume. Another aim of this thesis was to study the microbiota of wines returned by customers complaining of food-poisoning symptoms. By partial 16S rRNA gene sequence analysis, ribotyping, and a boar spermatozoa motility assay, one of the wines was found to contain Bacillus simplex strain BAC91, which produced a heat-stable substance toxic to the mitochondria of sperm cells. The antibacterial activity of wine was tested on the vegetative cells and spores of B. simplex BAC91, the B. cereus type strain ATCC 14579 and the cereulide-producing B. cereus F4810/72. Although the vegetative cells and spores of B. simplex BAC91 were sensitive to the antimicrobial effects of wine, the spores of B. cereus strains ATCC 14579 and F4810/72 stayed viable for at least 4 months. According to these results, Bacillus spp., and more specifically their spores, can pose a risk to the wine consumer.

Relevance: 20.00%

Abstract:

There exist various proposals for building a functional and fault-tolerant large-scale quantum computer. Topological quantum computation is one of the more exotic proposals, which makes use of the properties of quasiparticles manifest only in certain two-dimensional systems. These so-called anyons exhibit topological degrees of freedom which, in principle, can be used to execute quantum computation with intrinsic fault tolerance. This feature is the main incentive to study topological quantum computation. The objective of this thesis is to provide an accessible introduction to the theory. The thesis considers the theory of anyons arising in two-dimensional quantum mechanical systems that are described by gauge theories based on so-called quantum double symmetries. The quasiparticles are shown to exhibit interactions and carry quantum numbers which are both of a topological nature. In particular, it is found that the addition of the quantum numbers is not unique, but that the fusion of the quasiparticles is described by a non-trivial fusion algebra. It is discussed how this property can be used to encode quantum information in a manner which is intrinsically protected from decoherence, and how one could, in principle, perform quantum computation by braiding the quasiparticles. As an example of the general discussion, the particle spectrum and the fusion algebra of an anyon model based on the gauge group S_3 are explicitly derived. The fusion algebra is found to branch into multiple proper subalgebras, and the simplest of them is chosen as a model for an illustrative demonstration. The different steps of a topological quantum computation are outlined and the computational power of the model is assessed. It turns out that the chosen model is not universal for quantum computation. However, because the objective was a demonstration of the theory with explicit calculations, none of the other, more complicated fusion subalgebras were considered.
Studying their applicability for quantum computation could be a topic of further research.
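The non-uniqueness of quantum-number addition mentioned above is conventionally expressed as a fusion algebra over the particle labels. The following general form is standard in the anyon literature rather than a formula quoted from this thesis:

```latex
% Fusion of anyonic charges a and b into the possible outcomes c
a \times b = \sum_{c} N_{ab}^{c}\, c
```

Here the fusion multiplicities $N_{ab}^{c}$ are non-negative integers counting the distinct ways in which charges $a$ and $b$ can fuse to $c$. Fusion is non-unique (non-abelian) precisely when $\sum_c N_{ab}^{c} > 1$ for some pair of charges, and it is this degeneracy of fusion outcomes that supplies the protected state space in which quantum information can be encoded and manipulated by braiding.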

Relevance: 20.00%

Abstract:

An asymmetrical flow field-flow fractionation (AsFlFFF) instrument was constructed, and its applicability to industrial, biochemical, and pharmaceutical applications was studied. The effect of several parameters, such as pH, ionic strength, temperature and the reactants' mixing ratios, on the particle sizes, molar masses, and the formation of aggregates of macromolecules was determined by AsFlFFF. In the industrial application, AsFlFFF proved to be a valuable tool in the characterization of the hydrodynamic particle sizes, molar masses and phase transition behavior of various poly(N-isopropylacrylamide) (PNIPAM) polymers as a function of viscosity and phase transition temperatures. The effects of sodium chloride and the molar ratio of cationic and anionic polyelectrolytes on the hydrodynamic particle sizes of poly(methacryloxyethyl trimethylammonium chloride), poly(ethylene oxide)-block-poly(sodium methacrylate) and their complexes were studied. The particle sizes of the PNIPAM polymers and polyelectrolyte complexes measured by AsFlFFF were in agreement with those obtained by dynamic light scattering, and the molar masses of the PNIPAM polymers obtained by AsFlFFF and size exclusion chromatography also agreed well. In addition, AsFlFFF proved to be a practical technique for studying the thermoresponsive behavior of polymers at temperatures up to about 50 °C. The suitability of AsFlFFF for biological, biomedical, and pharmaceutical applications was demonstrated by studying lipid-protein/peptide interactions and the stability of liposomes at different temperatures. AsFlFFF was applied to studies on the hydrophobic and electrostatic interactions between cytochrome c (a basic peripheral protein) and an anionic lipid, oleic acid, and the surfactant sodium dodecyl sulphate. A miniaturized AsFlFFF channel constructed in this study was exploited in elucidating the effect of copper(II), pH, ionic strength, and vortexing on the particle sizes of low-density lipoproteins.
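In flow field-flow fractionation generally, the hydrodynamic particle sizes mentioned above are recovered from measured diffusion coefficients through the Stokes–Einstein relation. This is the standard conversion in the field, not a formula specific to this thesis:

```latex
% Hydrodynamic diameter from the translational diffusion coefficient
d_h = \frac{k_B T}{3 \pi \eta D}
```

where $k_B$ is Boltzmann's constant, $T$ the absolute temperature, $\eta$ the viscosity of the carrier liquid and $D$ the translational diffusion coefficient of the analyte. This dependence on $T$ and $\eta$ is also why careful temperature control matters in thermoresponsive-polymer studies of the kind described here.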

Relevance: 20.00%

Abstract:

The main objective of this study is to evaluate selected geophysical, structural and topographic methods on regional, local, and tunnel and borehole scales as indicators of the properties of fracture zones or fractures relevant to groundwater flow. Such information serves, for example, groundwater exploration and the prediction of the risk of groundwater inflow in underground construction. This study aims to address how the features detected by these methods link to groundwater flow in qualitative and semi-quantitative terms, and how well the methods reveal properties of fracturing affecting groundwater flow in the studied sites. The investigated areas are: (1) the Päijänne water-conveyance tunnel, whose study serves as a verification of structures identified on regional and local scales; (2) the Oitti fuel spill site, to telescope across scales and compare geometries of structural assessment; and (3) Leppävirta, where fracturing and the hydrogeological environment have been studied on the scale of a drilled well. The methods applied in this study include: the interpretation of lineaments from topographic data and their comparison with aeromagnetic data; the analysis of geological structures mapped in the Päijänne Tunnel; borehole video surveying; groundwater inflow measurements; groundwater level observations; and information on the tunnel's deterioration as demonstrated by block falls. The study combined geological and geotechnical information on relevant factors governing groundwater inflow into a tunnel and indicators of fracturing, as well as environmental datasets as overlays for spatial analysis using GIS. Geophysical borehole logging and fluid logging were used in Leppävirta to compare the responses of different methods to fracturing and other geological features on the scale of a drilled well. Results from some of the geophysical measurements of boreholes were affected by the large diameter (gamma radiation) or uneven surface (caliper) of these structures.
However, different anomalies indicating a more fractured upper part of the bedrock traversed by well HN4 in Leppävirta suggest that several methods can be used for detecting fracturing. Fracture trends appear to align similarly on different scales in the zone of the Päijänne Tunnel. For example, similar patterns were found in the regional magnetic trends, which correlate with the orientations of topographic lineaments interpreted as expressions of fracture zones. The same structural orientations as those of the larger structures on local or regional scales were observed in the tunnel, even though a match could not be made in every case. The size and orientation of the observation space (a patch of terrain at the surface, a tunnel section, or a borehole), the characterization method with its typical sensitivity, and the characteristics of the location all influence the identification of the fracture pattern. Through due consideration of the influence of the sampling geometry and by utilizing complementary fracture characterization methods in tandem, some of the complexities of the relationship between fracturing and groundwater flow can be addressed. The flow connections demonstrated by the response of the groundwater level in monitoring wells to the pressure decrease in the tunnel, and by the transport of MTBE through fractures in bedrock in Oitti, highlight the importance of protecting the tunnel water from the risk of contamination. In general, the largest values of drawdown occurred in monitoring wells closest to the tunnel and/or close to the topographically interpreted fracture zones. It seems that, to some degree, the rate of inflow shows a positive correlation with the level of reinforcement, as both are connected with the fracturing in the bedrock.
The following geological features increased the vulnerability of tunnel sections to pollution, especially when several factors affected the same locations: (1) fractured bedrock, particularly with associated groundwater inflow; (2) thin or permeable overburden above fractured rock; (3) a hydraulically conductive layer underneath the surface soil; and (4) a relatively thin bedrock roof above the tunnel. The observed anisotropy of the geological media should ideally be taken into account in the assessment of vulnerability of tunnel sections and eventually for directing protective measures.

Relevance: 20.00%

Abstract:

The future use of genetically modified (GM) plants in food, feed and biomass production requires careful consideration of the possible risks related to the unintended spread of transgenes into new habitats. This may occur via introgression of the transgene into conventional genotypes, due to cross-pollination, and via the invasion of GM plants into new habitats. Assessment of the possible environmental impacts of GM plants requires estimation of the level of gene flow from a GM population. Furthermore, management measures for reducing gene flow from GM populations are needed in order to prevent possible unwanted effects of transgenes on ecosystems. This work develops modeling tools for estimating gene flow from GM plant populations in boreal environments and for investigating the mechanisms of the gene flow process. To describe the spatial dimensions of gene flow, dispersal models are developed for the local- and regional-scale spread of pollen grains and seeds, with special emphasis on wind dispersal. The study provides tools for describing cross-pollination between GM and conventional populations and for estimating the levels of transgenic contamination of conventional crops. For perennial populations, a modeling framework describing the dynamics of plants and genotypes is developed in order to estimate the gene flow process over a sequence of years. The dispersal of airborne pollen and seeds cannot be easily controlled, and small amounts of these particles are likely to disperse over long distances. Wind dispersal processes are highly stochastic due to variation in atmospheric conditions, so that there may be considerable variation between individual dispersal patterns. This, in turn, is reflected in the large variation in annual levels of cross-pollination between GM and conventional populations.
Even though land-use practices affect the average levels of cross-pollination between GM and conventional fields, the level of transgenic contamination of a conventional crop remains highly stochastic. The demographic effects of a transgene have impacts on the establishment of transgenic plants amongst conventional genotypes of the same species. If the transgene gives a plant a considerable fitness advantage over conventional genotypes, the spread of transgenes to conventional populations can be strongly increased. In such cases, dominance of the transgene considerably increases gene flow from GM to conventional populations, due to the enhanced fitness of heterozygous hybrids. The fitness of GM plants in conventional populations can be reduced by linking the selectively favoured primary transgene to a disfavoured mitigation transgene. Recombination between these transgenes is a major risk related to this technique, especially because it tends to take place amongst the conventional genotypes and thus promotes the establishment of invasive transgenic plants in conventional populations.
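The stochastic wind-dispersal process described above can be illustrated with a small Monte Carlo sketch. The exponential distance kernel, the parameter values, and the function names are assumptions chosen for illustration; they are not the dispersal models actually developed in the thesis:

```python
import math
import random

def exponential_kernel_sample(rng, mean_dist):
    """Draw one 2-D displacement from an isotropic exponential
    dispersal kernel, a common (but not the only) choice for
    wind-borne pollen. Parameters are illustrative."""
    d = rng.expovariate(1.0 / mean_dist)   # dispersal distance (m)
    a = rng.uniform(0.0, 2.0 * math.pi)    # random travel direction
    return d * math.cos(a), d * math.sin(a)

def fraction_beyond(rng, mean_dist, threshold, n=10_000):
    """Monte Carlo estimate of the fraction of pollen travelling
    farther than `threshold` metres from the source field."""
    far = sum(
        1 for _ in range(n)
        if math.hypot(*exponential_kernel_sample(rng, mean_dist)) > threshold
    )
    return far / n

rng = random.Random(42)
p_far = fraction_beyond(rng, mean_dist=50.0, threshold=200.0)
# exact tail for an exponential distance kernel: exp(-200/50) ≈ 0.018
```

The heavy tail is the point: even with a modest mean dispersal distance, a small but non-zero fraction of pollen travels far beyond the source field, which is why isolation distances reduce but never eliminate cross-pollination.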

Relevance: 20.00%

Abstract:

Nitrogen (N) and phosphorus (P) are essential elements for all living organisms. However, in excess they contribute to several environmental problems, such as aquatic and terrestrial eutrophication. Globally, human action has multiplied the volume of N and P cycling since the onset of industrialization. This multiplication is a result of intensified agriculture, increased energy consumption and population growth. Industrial ecology (IE) is a discipline in which human interaction with ecosystems is investigated using a systems-analytical approach. The main idea behind IE is that industrial systems resemble ecosystems and, like them, can be described using material, energy and information flows and stocks. Industrial systems are dependent on the resources provided by the biosphere, and the two cannot be separated from each other. When studying substance flows, the aims of the research from the viewpoint of IE can be, for instance, to elucidate how the cycles of a certain substance could be made more closed and how the flows of a certain substance could be decreased per unit of production (i.e. dematerialization). In Finland, N and P have been studied widely in different ecosystems and environmental emissions. A holistic picture comparing different societal systems is, however, lacking. In this thesis, the flows of N and P in Finland were examined using substance flow analysis (SFA) in the following four subsystems: I) forest industry and the use of wood fuels, II) food production and consumption, III) energy, and IV) municipal waste. A detailed analysis of the situation at the end of the 1990s was performed. Furthermore, the historical development of the N and P flows was investigated in the energy system (III) and the municipal waste system (IV). The main research sources were official statistics, literature, monitoring data, and expert knowledge. The aim was to identify and quantify the main flows of N and P in Finland in the four subsystems studied.
Furthermore, the aim was to elucidate whether the nutrient systems are cyclic or linear, and to identify how these systems could be more efficient in the use and cycling of N and P. A final aim was to discuss how this type of an analysis can be used to support decision-making on environmental problems and solutions. Of the four subsystems, the food production and consumption system and the energy system created the largest N flows in Finland. For the creation of P flows, the food production and consumption system (Paper II) was clearly the largest, followed by the forest industry and use of wood fuels and the energy system. The contribution of Finland to N and P flows on a global scale is low, but when compared on a per capita basis, we are one of the largest producers of these flows, with relatively high energy and meat consumption being the main reasons. Analysis revealed the openness of all four systems. The openness is due to the high degree of internationality of the Finnish markets, the large-scale use of synthetic fertilizers and energy resources and the low recycling rate of many waste fractions. Reduction in the use of fuels and synthetic fertilizers, reorganization of the structure of energy production, reduced human intake of nutrients and technological development are crucial in diminishing the N and P flows. To enhance nutrient recycling and replace inorganic fertilizers, recycling of such wastes as wood ash and sludge could be promoted. SFA is not usually sufficiently detailed to allow specific recommendations for decision-making to be made, but it does yield useful information about the relative magnitude of the flows and may reveal unexpected losses. Sustainable development is a widely accepted target for all human action. SFA is one method that can help to analyse how effective different efforts are in leading to a more sustainable society. SFA's strength is that it allows a holistic picture of different natural and societal systems to be drawn. 
Furthermore, when the environmental impact of a certain flow is known, the method can be used to prioritize environmental policy efforts.
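The substance-flow accounting described above reduces, per subsystem, to a mass balance over inflows, outflows and recycled flows. The sketch below is a toy version of that bookkeeping; the function, field names and all figures are illustrative placeholders, not values from the thesis:

```python
def substance_balance(inflows, outflows, recycled):
    """Toy substance-flow accounting for one subsystem.

    Returns the stock change (inputs minus outputs) and the recycling
    rate (share of outputs returned to the cycle), two quantities an
    SFA typically reports per subsystem. Illustrative only.
    """
    total_in = sum(inflows.values())
    total_out = sum(outflows.values())
    stock_change = total_in - total_out
    recycling_rate = recycled / total_out if total_out else 0.0
    return stock_change, recycling_rate

# Hypothetical nitrogen flows (kt N / year) for a food subsystem
inflows = {"fertilizer": 150.0, "feed_imports": 40.0, "deposition": 10.0}
outflows = {"food_products": 60.0, "waste": 90.0, "emissions": 40.0}
delta, rate = substance_balance(inflows, outflows, recycled=27.0)
# a positive delta means the substance accumulates in the subsystem
```

A low recycling rate with large synthetic-fertilizer inflows is exactly the kind of "openness" the thesis reports for the Finnish systems; closing the cycle means raising the recycled share while shrinking the virgin inputs.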