21 results for Los Alamos Scientific Laboratory. Theoretical Division.
in Helda - Digital Repository of the University of Helsinki
Abstract:
Standards have been established to regulate the microbial and preservative contents of foods to ensure that they are safe for the consumer. In the case of a food-related disease outbreak, it is crucial to detect and identify the cause of the disease quickly and accurately. In addition, for everyday control of microbial and preservative contents, the detection methods must be easy to perform on numerous food samples. In the present study, faster alternative methods were developed for the identification of bacteria by DNA fingerprinting. A flow cytometry method was developed as an alternative to pulsed-field gel electrophoresis, the gold-standard method. DNA fragment sizing by an ultrasensitive flow cytometer discriminated species and strains reproducibly and comparably to pulsed-field gel electrophoresis. The new method was hundreds of times faster and 200,000 times more sensitive. Additionally, another DNA fingerprinting method was developed based on single-enzyme amplified fragment length polymorphism (SE-AFLP). This method allowed the differentiation of genera, species, and strains of pathogenic bacilli, staphylococci, Yersinia, and Escherichia coli. The fingerprinting patterns obtained by SE-AFLP were simpler and easier to analyze than those obtained by traditional amplified fragment length polymorphism with double-enzyme digestion. Nisin (E234) is added as a preservative to different types of foods, especially dairy products, around the world. Various detection methods exist for nisin, but they lack sensitivity, speed, or specificity. In the present study, a sensitive nisin-induced green fluorescent protein (GFPuv) bioassay was developed using the Lactococcus lactis two-component signal system NisRK and the nisin-inducible nisA promoter. The bioassay was extremely sensitive, with a detection limit of 10 pg/ml in culture supernatant. In addition, it was suitable for quantifying nisin in various food matrices, such as milk, salad dressings, processed cheese, liquid eggs, and canned tomatoes. Wine has good antimicrobial properties due to its alcohol concentration, low pH, and organic content, and it is therefore often assumed to be microbially safe to consume. Another aim of this thesis was to study the microbiota of wines returned by customers complaining of food-poisoning symptoms. By partial 16S rRNA gene sequence analysis, ribotyping, and a boar spermatozoa motility assay, one of the wines was found to contain Bacillus simplex BAC91, which produced a heat-stable substance toxic to the mitochondria of sperm cells. The antibacterial activity of wine was tested on the vegetative cells and spores of B. simplex BAC91, the B. cereus type strain ATCC 14579, and the cereulide-producing B. cereus F4810/72. Although the vegetative cells and spores of B. simplex BAC91 were sensitive to the antimicrobial effects of wine, the spores of B. cereus strains ATCC 14579 and F4810/72 remained viable for at least 4 months. According to these results, Bacillus spp., and more specifically their spores, can pose a risk to the wine consumer.
Abstract:
The aim of the present thesis was to study the role of the epithelial sodium channel (ENaC) in the clearance of fetal lung fluid in the newborn infant by measuring airway epithelial expression of ENaC, nasal transepithelial potential difference (N-PD), and lung compliance (LC). In addition, the effect of postnatal dexamethasone on airway epithelial ENaC expression was measured in preterm infants with bronchopulmonary dysplasia (BPD). The patient population consisted of selected term newborn infants born in the Department of Obstetrics (Studies II-IV) and selected preterm newborn infants treated in the neonatal intensive care unit of the Hospital for Children and Adolescents (Studies I and IV) of the Helsinki University Central Hospital in Finland. A small population of preterm infants suffering from BPD was included in Study I. Studies I, III, and IV included airway epithelial measurement of ENaC, and Studies II and III measurement of N-PD and LC. In Study I, ENaC expression analyses were performed in the Research Institute of the Hospital for Sick Children in Toronto, Ontario, Canada. In the subsequent studies, analyses were performed in the Scientific Laboratory of the Hospital for Children and Adolescents. N-PD and LC measurements were performed at the bedside in these hospitals. In term newborn infants, the percentage of amiloride-sensitive N-PD, a surrogate for ENaC activity, measured during the first 4 postnatal hours correlated positively with LC measured 1 to 2 days postnatally. Preterm infants with BPD had higher airway epithelial ENaC expression after a therapeutic dose of dexamethasone than before treatment. These patients were subsequently weaned from mechanical ventilation, probably as a result of the clearance of excess fluid from the alveolar spaces. In addition, we found that in preterm infants ENaC expression increases with gestational age (GA). In preterm infants, ENaC expression in the airway epithelium was lower than in term newborn infants. During the early postnatal period, airway epithelial βENaC expression decreased significantly in both preterm and term infants. Term newborn infants delivered vaginally had significantly lower airway epithelial expression of αENaC after the first postnatal day than those delivered by cesarean section. The functional studies showed no difference in N-PD between infants delivered vaginally and by cesarean section. We therefore conclude that the low airway epithelial expression of ENaC in the preterm infant and the correlation of N-PD with LC in the term infant indicate a role for ENaC in perinatal pulmonary adaptation and in the pathogenesis of neonatal respiratory distress. Because dexamethasone raised ENaC expression in preterm infants with BPD, and these infants were subsequently weaned from ventilator therapy, we suggest that studies on the treatment of respiratory distress in the preterm infant should include the induction of ENaC activity.
Abstract:
This dissertation inquires into the relationship between gender and biopolitics. Biopolitics, according to Michel Foucault, is the mode of politics that is situated and exercised at the level of life. The dissertation claims that gender is a technology of biopower specific to the optimisation of the sexual reproduction of human life, deployed through the scientific and governmental problematisation of declining fertility rates in the mid-twentieth century. Just as Michel Foucault claimed that sexuality became a scientific and political discourse in the nineteenth century, gender has since emerged in these fields. In this dissertation, gender is treated as neither a representation of sex nor a cultural construct or category of identity. Rather, a genealogy of gender as an apparatus of biopower is conducted. It demonstrates how scientific and theoretical developments in the twentieth century marshalled gender into the sex/sexuality apparatus as a new technology of liberal biopower. Gender, I argue, has become necessary for the Western liberal order to recapture and re-optimise the life-producing functions of sex that reproduce the very object of biopolitics: life. The concept of the life function is introduced to analyse the life-producing violence of the sex/sexuality/gender apparatus. To do this, the thesis rereads the work of Michel Foucault through Gilles Deleuze for a deeper grasp of the material strategies of biopower and of how it produces categories of difference and divides populations according to them. The work of Judith Butler, in turn, is used as a foil against which to rearticulate the question of how to examine gender genealogically and biopolitically. The dissertation then executes a genealogy of gender, tracing the changing rationalities of sex/sexuality/gender from early feminist thought, through mid-twentieth-century sexological, feminist, and demographic research, to current EU policy. According to this genealogy, in the mid-twentieth century demographers perceived that sexuality/sex, which Foucault observed as the life-producing biopolitical apparatus, was no longer sufficiently disciplining human bodies to reproduce. The life function was escaping the grasp of biopower. The analysis demonstrates how gender theory was taken up as a means of reterritorialising the life function: nature would be disciplined to reproduce by controlling culture. The crucial theoretical and genealogical argument of the thesis, that gender is a discourse with biopolitical foundations and a technology of biopower, radically challenges the premises of gender theory and feminist politics, as well as the emancipatory potential often granted to the concept of gender. The project asks what gender means, what biopolitical function it performs, and what is at stake for feminist politics when it engages with it. In so doing, it identifies biopolitics and the problem of life as possibly the most urgent arena for feminist politics today.
Abstract:
Not available
Abstract:
The Earth's ecosystems are protected from the dangerous part of solar ultraviolet (UV) radiation by stratospheric ozone, which absorbs most of the harmful UV wavelengths. Severe depletion of stratospheric ozone has been observed in the Antarctic region, and to a lesser extent in the Arctic and at midlatitudes. Concern about the effects of increasing UV radiation on human beings and the natural environment has led to ground-based monitoring of UV radiation. In order to achieve high-quality UV time series for scientific analyses, proper quality control (QC) and quality assurance (QA) procedures have to be followed. In this work, QC and QA practices were developed for Brewer spectroradiometers and NILU-UV multifilter radiometers, which measure in the Arctic and Antarctic regions, respectively. These practices are applicable to other UV instruments as well. The spectral features of, and the effects of different factors on, UV radiation were studied for the spectral UV time series at Sodankylä. The QA of the Finnish Meteorological Institute's (FMI) two Brewer spectroradiometers included daily maintenance, laboratory characterizations, the calculation of long-term spectral responsivity, data processing, and quality assessment. New methods were developed for the cosine correction, the temperature correction, and the calculation of long-term changes in spectral responsivity. Reconstructed UV irradiances were used as a QA tool for spectroradiometer data. The actual cosine correction factor was found to vary within 1.08-1.12 and 1.08-1.13 for the two instruments. The temperature characterization showed a linear dependence of the photon counts per cycle on the instrument's internal temperature. Both Brewers have participated in international spectroradiometer comparisons and have shown good stability. The differences between the Brewers and the portable reference spectroradiometer QASUME remained within 5% during 2002-2010. The features of the spectral UV radiation time series at Sodankylä were analysed for the period 1990-2001. No statistically significant long-term changes in UV irradiances were found, and the results depended strongly on the time period studied. Ozone was the dominant factor affecting UV radiation during springtime, whereas clouds played a more important role during summertime. During this work, the Antarctic NILU-UV multifilter radiometer network was established by the Instituto Nacional de Meteorología (INM) as a joint Spanish-Argentinian-Finnish cooperation project. As part of this work, the QC/QA practices of the network were developed. They included training of the operators, daily maintenance, regular lamp tests, and solar comparisons with the travelling reference instrument. Drifts of up to 35% in the sensitivity of the channels of the NILU-UV multifilter radiometers were found during the first four years of operation. This work emphasized the importance of proper QC/QA, including regular lamp tests, for multifilter radiometers as well. The drifts were corrected by scaling the channels of each site NILU-UV to those of the travelling reference NILU-UV. After correction, the mean ratios of erythemally weighted UV dose rates measured during solar comparisons between the reference NILU-UV and the site NILU-UVs were 1.007±0.011 and 1.012±0.012 for Ushuaia and Marambio, respectively, for solar zenith angles up to 80°. Solar comparisons between the NILU-UVs and spectroradiometers agreed to within ±5% near local noon, which can be seen as evidence of successful QC/QA procedures and transfer of irradiance scales. This work also showed that UV measurements made in the Arctic and Antarctic can be comparable with each other.
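The channel-scaling drift correction described above can be sketched in a few lines of Python. This is only an illustration with hypothetical variable names and a simple ratio-of-means estimator, not the thesis's actual algorithm:

import numpy as np

def channel_scale_factors(reference, site):
    """Estimate per-channel scale factors from simultaneous solar
    comparison measurements.

    reference, site: arrays of shape (n_samples, n_channels) recorded
    side by side by the travelling reference NILU-UV and a site NILU-UV.
    """
    # Ratio-of-means estimator: one multiplicative factor per channel.
    return reference.mean(axis=0) / site.mean(axis=0)

def correct_drift(site_record, factors):
    """Rescale a site instrument's record onto the reference scale."""
    return site_record * factors

# Example with synthetic data: a site instrument whose four channels
# have drifted by -20%..+5% relative to the reference.
rng = np.random.default_rng(0)
true_drift = np.array([0.80, 0.90, 0.95, 1.05])
reference = rng.uniform(50, 200, size=(100, 4))
site = reference * true_drift + rng.normal(0, 1, size=(100, 4))

factors = channel_scale_factors(reference, site)
print(np.round(factors * true_drift, 3))  # ~1.0 for each channel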
Abstract:
A better understanding of the limiting step in a first-order phase transition, the nucleation process, is of major importance to a variety of scientific fields ranging from atmospheric sciences to nanotechnology and even cosmology. This is because in most phase transitions the new phase is separated from the mother phase by a free energy barrier, which is crossed in a process called nucleation. Nowadays it is considered that a significant fraction of all atmospheric particles is produced by vapour-to-liquid nucleation. In atmospheric sciences, as in other scientific fields, the theoretical treatment of nucleation is mostly based on a theory known as the Classical Nucleation Theory. However, the Classical Nucleation Theory is known to have only limited success in predicting the rate at which vapour-to-liquid nucleation takes place under given conditions. This thesis studies unary homogeneous vapour-to-liquid nucleation from a statistical mechanics viewpoint. We apply Monte Carlo simulations of molecular clusters to calculate the free energy barrier separating the vapour and liquid phases and compare our results against laboratory measurements and Classical Nucleation Theory predictions. According to our results, the work of adding a monomer to a cluster in equilibrium vapour is accurately described by the liquid drop model applied by the Classical Nucleation Theory once the clusters are larger than some threshold size. The threshold cluster sizes contain only a few to some tens of molecules, depending on the interaction potential and temperature. However, the error made in modelling the smallest clusters as liquid drops results in an erroneous absolute value for the cluster work of formation throughout the size range, as predicted by the McGraw-Laaksonen scaling law. By calculating correction factors to the Classical Nucleation Theory predictions for the nucleation barriers of argon and water, we show that the corrected predictions produce nucleation rates that are in good agreement with experiments. For the smallest clusters, the deviation between the simulation results and the liquid drop values is accurately modelled by the low-order virial coefficients at modest temperatures and vapour densities, or in other words, in the validity range of the non-interacting cluster theory of Frenkel, Band, and Bijl. Our results do not indicate a need for a size-dependent replacement free energy correction. The results also indicate that the Classical Nucleation Theory predicts the size of the critical cluster correctly. We also present a new method for calculating the equilibrium vapour density, the size dependence of the surface tension, and the planar surface tension directly from cluster simulations. We also show that the size dependence of the cluster surface tension at the equimolar surface is a function of virial coefficients, a result confirmed by our cluster simulations.
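For orientation, the liquid drop model that the Classical Nucleation Theory applies writes the work of forming an n-molecule cluster from vapour at saturation ratio S as follows (a textbook formulation, not a result of this thesis):

% Classical (liquid drop) work of cluster formation and nucleation rate
\begin{align}
  W(n) &= -n\,k_{\mathrm{B}}T \ln S + \sigma A_1\, n^{2/3},\\
  J    &\approx J_0 \exp\!\bigl(-W(n^{*})/k_{\mathrm{B}}T\bigr),
\end{align}
% \sigma: planar surface tension; A_1 n^{2/3}: surface area of an
% n-molecule drop; n^{*}: critical size that maximizes W(n).

The corrections discussed in the abstract amount to replacing the liquid-drop W(n) for the smallest clusters with values computed from the Monte Carlo simulations.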
Abstract:
This thesis presents an interdisciplinary analysis of how models and simulations function in the production of scientific knowledge. The work is informed by three scholarly traditions: studies on models and simulations in the philosophy of science, so-called micro-sociological laboratory studies within science and technology studies, and cultural-historical activity theory. Methodologically, I adopt a naturalist epistemology and combine philosophical analysis with a qualitative, empirical case study of infectious-disease modelling. The study maintains a dual perspective throughout the analysis: it specifies the modelling practices and examines the models as objects of research. The research questions addressed are: 1) How are models constructed, and what functions do they have in the production of scientific knowledge? 2) What is interdisciplinarity in model construction? 3) How do models become a general research tool, and why is this process problematic? The core argument is that mediating models as investigative instruments (cf. Morgan and Morrison 1999) take questions as their starting point, and hence their construction is intentionally guided. This argument applies the interrogative model of inquiry (e.g., Sintonen 2005; Hintikka 1981), which conceives of all knowledge acquisition as a process of seeking answers to questions. The first question addresses simulation models as Artificial Nature, which is manipulated in order to answer the questions that initiated the model building. This account further develops the "epistemology of simulation" (cf. Winsberg 2003) by showing the interrelatedness of researchers and their objects in the process of modelling. The second question clarifies why interdisciplinary research collaboration is demanding and difficult to maintain. The nature of the impediments to disciplinary interaction is examined by introducing the idea of object-oriented interdisciplinarity, which provides an analytical framework for studying changes in the degree of interdisciplinarity, the tools and research practices developed to support the collaboration, and the mode of collaboration in relation to the historically mutable object of research. As my interest is in models as interdisciplinary objects, the third research problem asks how we might characterise these objects, what is typical of them, and what kinds of changes happen in the process of modelling. Here I examine the tension between specified, question-oriented models and more general models, and suggest that the specified models form a group of their own. I call these Tailor-made models, in opposition to the process of building a simulation platform that aims at generalisability and utility for health policy. This tension also underlines the challenge of applying research results (or methods and tools) to discuss and solve problems in decision-making processes.
Abstract:
The number of immigrant students in vocational education and training is steadily increasing in Finland. This poses challenges for teachers and schools. This research focuses on emerging questions of intercultural learning in the context of immigrant training, and on a method, the Culture Laboratory, that was developed in an attempt to respond to these challenges. The main methodological and theoretical framework lies in cultural-historical activity theory, developmental work research, and the concepts of the intercultural and hybridity. The empirical material consists of videotaped recordings of discussions in the Culture Laboratory. The five main research questions focused on the strengths and limitations of the Culture Laboratory as a tool for intercultural learning, the significance of disturbances in it, the potential of suggestions for intercultural learning, paper as a mediating artifact, and the concept of intercultural space. The findings showed that the Culture Laboratory offered a solid foundation for developing intercultural learning. The disturbances that manifested revealed a multitude of scripts and activities. It was also suggested that the structure of expansive learning could start from externalization instead of internalization. The suggestions the participants made opened up a hybrid learning space for intercultural development and offered a good springboard for new ideas. Learning in Paperland posed both challenges and opportunities for immigrant students, and different paper trails emerged. Intercultural space in the Culture Laboratory was a developmental zone in which a hybrid process of observing, comparing, and creating took place. Key words: intercultural learning, immigrant training, cultural-historical activity theory, developmental work research
Abstract:
Reciprocal development of the object and subject of learning: the renewal of the learning practices of front-line communities in a telecommunications company as part of a techno-economic paradigm change. Current changes in production have been seen as an indication of a shift from the techno-economic paradigm of the mass-production era to a new paradigm of the information and communication technology era. The rise of knowledge management in the late 1990s can be seen as one aspect of this paradigm shift, as knowledge creation and customer responsiveness were recognized as the prime factors in business competition. However, paradoxical conceptions concerning learning and agency have been presented in the discussion of knowledge management. One prevalent notion in the literature is that learning is based on individuals' voluntary actions, and this has now become incompatible with the growing interest in knowledge-management systems. Furthermore, the commonly held view of learning as a general process independent of the object of learning contradicts the observation that the current need for new knowledge and new competences is caused by ongoing techno-economic changes. Even though the current view acknowledges that individuals and communities have key roles in knowledge creation, this conception defies the idea of individuals' and communities' agency in developing the practices through which they learn. This research therefore presents a new theoretical interpretation of learning and agency based on Cultural-Historical Activity Theory. This approach overcomes the paradoxes in knowledge-management theory and offers means for understanding and analyzing changes in the ways of learning within work communities. The research is also an evaluation of the Competence Laboratory method, which was developed as part of the study as a special application of the Developmental Work Research methodology. The research data comprise the videotaped Competence Laboratory processes of four front-line work communities in a telecommunications company. The findings reported in the five articles included in this thesis are based on analyses of these data. The new theoretical interpretation offered here is based on the assessment that the findings reported in the articles represent one of the front lines of the ongoing historical transformation of work-related learning, since the research site represents one of the key industries of the new "knowledge society". The research can be characterized as the elaboration of a hypothesis concerning the development of work-related learning. According to the new theoretical interpretation, the object of activity is also the object of distributed learning in work communities. The historical socialization of production has increased the number of actors involved in an activity, which has also increased the number of mutual interdependencies as well as the need for communication. Learning practices and organizational systems of learning are historically developed forms of distributed learning mediated by specific forms of division of labor, specific tools, and specific rules. However, the learning practices of the mass-production era are becoming increasingly inadequate for the conditions of the new economy. This was manifested in the front-line work communities at the research site as an aggravating contradiction between the new objects of learning and the prevailing learning practices.
The constituent element of this new theoretical interpretation is the idea of a work community's learning as part of its collaborative mastery of the developing business activity. The development of the business activity is at the same time a practical and an epistemic object for the community. This kind of changing object cannot be mastered using learning practices designed for the stable conditions of mass production, because learning has to change along with the business. According to the model introduced in this thesis, the transformation of learning proceeds through specific stages: predefined learning tasks are first transformed into learning through re-conceptualizing the object of the activity and of the joint learning, and then, as the new object becomes stabilized, into the creation of new kinds of learning practices to master the re-defined object of the activity. This transformation of the form of learning is realized through a stepwise expansion of the work community's agency. To summarize, the conceptual model developed in this study sets the tool-mediated co-development of the subject and the object of learning as the theoretical starting point for developing new, second-generation knowledge-management methods. Key words: knowledge management, learning practice, organizational system of learning, agency
Abstract:
This study examines boundaries in health care organizations. Boundaries are sometimes considered things to be avoided in everyday living. This study suggests that boundaries can be important temporally and spatially emerging locations of development, learning, and change in inter-organizational activity. Boundaries can act as mediators of cultural and social formations and practices. The data were gathered in an intervention project carried out in Helsinki during the years 2000-2002, in which the care of 26 patients with multiple chronic illnesses was improved. The project used the Change Laboratory method, a research-assisted method for developing work. The research questions of the study are: (1) What are the boundary dynamics of development, learning, and change in health care for patients with multiple chronic illnesses? (2) How do individual patients experience boundaries in their health care? (3) How are the boundaries of health care constructed and reconstructed in social interaction? (4) What are the dynamics of boundary crossing in the experimentation with the new tools and the new practice? The methodology of the study, an ethnography of a multi-organizational field of activity, draws on cultural-historical activity theory and anthropological methods. The ethnographic fieldwork involved multiple research techniques and a collaborative strategy for generating research data. The data consist of observations, interviews, transcribed intervention sessions, and patients' health documents. According to the findings, the care of patients with multiple chronic illnesses emerges as fragmented by divisions between patient and professionals, between medical specialties, and between levels of the health care organization. These boundaries have a historical origin in the Finnish health care system. As a consequence of these boundaries, patients frequently experience uncertainty and neglect in their care. However, the boundaries of a single patient's care were transformed in the Change Laboratory discussions among patients, professionals, and researchers. In these discussions, the questioning of the prevailing boundaries was triggered by the observation of gaps in inter-organizational care. Transformation of the prevailing boundaries was achieved through implementation of the collaborative care agreement tool and the practice of negotiated care. However, the new tool and practice did not expand into general use during the project. The study identifies two complementary models for the development of health care organization in Finland: the 'care package model', which is based on productivity and process models adopted from engineering, and the 'model of negotiated care', which is based on co-configuration and the public good.
Abstract:
Education for a Technological Society. Public School Curriculum Construction, 1945-1952. The subject of my research is the significance of technology in the construction process of the public school curriculum during the years 1945-1952. During this period, war reparations and rebuilding created demands and actions to rationalise industry and agriculture. The ambitions of building a technological country and the reformation of the curriculum thereby took place simultaneously. The Fordist frame of reference, whose principles were mass production, rationalisation and standardisation, a hierarchical division of labour, and the partition of assignments, provided a model for the developing curriculum. In this research the curriculum is examined as an artefact which is shaped socio-technically, under the influence of social and technical factors. In the perspective of socio-technical construction, the artefact is represented through the viewpoints of members of relevant social groups. The groups give meaning to the curriculum artefact, which determines the components of the curriculum. The weakness of the curriculum was its ineffectiveness, which was due to three critical problems. Firstly, the curriculum was to be based on scientific work, which meant the development of schools through experiments and scientific research. Secondly, the conception of civilisation in the curriculum was to be composed of theoretical knowledge as well as practical skills. Practical education was useful for both the individual and society. Thirdly, the curriculum was to be reformed in a way that would take the individuality of the pupil into account. It was useful for society that the talents and natural abilities of every pupil were observed and used to direct the pupil to the proper place in the social division of labour, according to the "right man in the right place" principle. The solutions to these critical problems formed the instructions of the public school curriculum, which described the nature and content of education. Technology and its development were an essential part of the whole curriculum process. The quality words connected to the development of technology - progress, rationality and effectiveness - were also suitable qualifiers and justifications for the reform of the curriculum. On the other hand, technology set a point of comparison and demands for the development of all phases of education. The view of technology was not clearly deterministic - it was also considered possible to shape the technological society with the help of education. The public school curriculum process indicates how the principles of technological systems were originally translated into the language of education and accepted in educational content.
Abstract:
The complexity of life is based on an effective energy transduction machinery, which has evolved during the last 3.5 billion years. In aerobic life, this machinery is powered by the high oxidizing potential of molecular oxygen. Oxygen is safely reduced by a membrane-bound enzyme, cytochrome c oxidase (CcO), to produce an electrochemical proton gradient across the mitochondrial or bacterial membrane. This gradient is used for energy-requiring reactions such as the synthesis of ATP by the F0F1-ATPase and active transport. In this thesis, the molecular mechanism by which CcO couples the oxygen reduction chemistry to proton pumping has been studied by theoretical computer simulations. By building both classical and quantum mechanical model systems based on the X-ray structure of CcO from Bos taurus, the dynamics and energetics of the system were studied in different intermediate states of the enzyme. As a result of this work, a mechanism was suggested by which CcO can prevent protons from leaking backwards during proton pumping. The use and activation of the two proton-conducting channels were also elucidated, together with a mechanism by which CcO sorts the chemical protons from the pumped protons. The latter problem is referred to as the gating mechanism of CcO, and it has remained a challenge in the bioenergetics field for more than three decades. Furthermore, a new method for deriving charge parameters for classical simulations of complex metalloenzymes was developed.
Abstract:
Social groups are common across animal species. The reasons for grouping are straightforward when all individuals gain directly from cooperating. However, the situation becomes more complex when helping entails costs to the personal reproduction of individuals. Kin selection theory has offered a fruitful framework for explaining such cooperation by stating that individuals may spread their genes not only through their own reproduction, but also by helping related individuals reproduce. However, kin selection theory also implicitly predicts conflicts when groups consist of non-clonal individuals, i.e. when relatedness is less than one. Then, individual interests are not perfectly aligned, and each individual is predicted to favour the propagation of its own genome over others. Social insects provide an excellent system for studying the interplay between cooperation and conflict. Breeding systems in social insects range from solitary breeding to eusocial colonies displaying a complete division of reproduction between the fertile queen and the sterile worker caste. Within colonies, additional variation is provided by the presence of several reproductive individuals. In many species, the queen mates multiply, which causes the colony to consist of half-sib instead of full-sib offspring. Furthermore, in many species colonies contain multiple breeding queens, which further dilutes relatedness between colony members. Evolutionary biology is thus faced with the challenge of explaining why such variation in social structure exists, and what its consequences are at the individual and population levels. The main part of this thesis takes on this challenge by investigating the dynamics of socially polymorphic ant colonies. The first four chapters investigate the causes and consequences of different social structures, using a combination of field studies, genetic analyses, and laboratory experiments. The thesis ends with a theoretical chapter focusing on different social interactions (altruism and spite) and the evolution of harming traits. The main results of the thesis show that social polymorphism has the potential to affect the behaviour and traits of both individuals and colonies. For example, we found that genetic polymorphism may increase the phenotypic variation between individuals in colonies, and that socially polymorphic colonies may show different life history patterns. We also show that colony cohesion may be enhanced even in multiple-queen colonies through patterns of unequal reproduction between queens. However, the thesis also demonstrates that spatial and temporal variation between both populations and environments may affect individual and colony traits, to the degree that results obtained in one place or at one time may not be applicable in other situations. This opens up potential further areas of research to explain these differences.
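The kin-selection condition referred to above is usually formalised as Hamilton's rule, a standard result quoted here only for orientation, not a finding of this thesis:

% Hamilton's rule: helping is favoured when
\[
  r\,b - c > 0,
\]
% r: genetic relatedness between actor and recipient;
% b: reproductive benefit to the recipient;
% c: reproductive cost to the actor.
% With r < 1, as in the non-clonal groups discussed here, the interests
% of group members diverge, which is the source of the predicted conflicts.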
Abstract:
Pack ice is an aggregate of ice floes drifting on the sea surface. The forces controlling the motion and deformation of pack ice are the air and water drag forces, the sea surface tilt, the Coriolis force, and the internal force due to the interaction between ice floes. In this thesis, the mechanical behavior of compacted pack ice is investigated using theoretical and numerical methods, focusing on three basic material properties: compressive strength, yield curve, and flow rule. A high-resolution three-category sea ice model is applied to investigate sea ice dynamics in two small basins, the whole Gulf of Riga and Pärnu Bay within it, focusing on the calibration of the compressive strength for thin ice. These two basins are on scales of 100 km and 20 km, respectively, with typical ice thicknesses of 10-30 cm. The model is found capable of capturing the main characteristics of the ice dynamics. The compressive strength is calibrated to about 30 kPa, consistent with the values from most large-scale sea ice dynamics studies. In addition, the numerical study in Pärnu Bay suggests that the shear strength drops significantly when the ice-floe size markedly decreases. A characteristic inversion method is developed to probe the yield curve of compacted pack ice. The basis of this method is the relationship between the intersection angle of linear kinematic features (LKFs) in sea ice and the slope of the yield curve. A summary of the observed LKFs shows that they can be divided into three basic groups: intersecting leads, uniaxial opening leads, and uniaxial pressure ridges. Based on the available observed angles, the yield curve is determined to be a curved diamond. Comparisons of this yield curve with those from other methods show that it possesses almost all the advantages identified by the other methods. A new constitutive law is proposed, in which the yield curve is a diamond and the flow rule is a combination of the normal and co-axial flow rules. The non-normal co-axial flow rule is necessary for the Coulombic yield constraint. This constitutive law not only captures the main features of forming LKFs but also has the advantage of avoiding the overestimation of divergence during shear deformation. Moreover, this study provides a method for observing the flow rule of pack ice during deformation.
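The link between LKF intersection angles and the slope of the yield curve can be illustrated with the classical Coulomb relations, quoted here only as a textbook orientation (the inversion method of the thesis is more general):

% Coulomb-type yield: shear stress on a failure plane is limited by
% cohesion c and internal friction mu,
\[
  |\tau| = c + \mu\,\sigma_n, \qquad \mu = \tan\varphi,
\]
% and the two conjugate failure directions intersect at the acute angle
\[
  2\theta = \frac{\pi}{2} - \varphi,
\]
% so a measured intersection angle 2\theta constrains the local slope mu
% of the yield curve. sigma_n: normal stress; varphi: internal friction angle.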
Abstract:
The first quarter of the 20th century witnessed a rebirth of cosmology, the study of our Universe, as a field of scientific research with testable theoretical predictions. The amount of available cosmological data grew slowly from a few galaxy redshift measurements, rotation curves, and local light element abundances to the first detection of the cosmic microwave background (CMB) in 1965. By the turn of the century the amount of data had exploded, incorporating new, exciting cosmological observables such as lensing, Lyman alpha forests, type Ia supernovae, baryon acoustic oscillations, and Sunyaev-Zeldovich regions, to name a few.

The CMB, the ubiquitous afterglow of the Big Bang, carries with it a wealth of cosmological information. Unfortunately, that information, carried by delicate intensity variations, turned out to be hard to extract from the overall temperature. After the first detection, it took nearly 30 years before the first evidence of fluctuations in the microwave background was presented. At present, high-precision cosmology is solidly based on precise measurements of the CMB anisotropy, making it possible to pinpoint cosmological parameters to one-in-a-hundred precision. This progress has made it possible to build and test models of the Universe that differ in the way the cosmos evolved during some fraction of the first second after the Big Bang.

This thesis is concerned with high-precision CMB observations. It presents three selected topics along a CMB experiment analysis pipeline. Map-making and residual noise estimation are studied using an approach called destriping. The approximate methods studied are invaluable for the large datasets of any modern CMB experiment and will undoubtedly become even more so when the next generation of experiments reaches the operational stage.

We begin with a brief overview of cosmological observations and describe the general relativistic perturbation theory. Next we discuss the map-making problem of a CMB experiment and the characterization of the residual noise present in the maps. Finally, the use of modern cosmological data is presented in the study of an extended cosmological model with correlated isocurvature fluctuations. Currently available data are shown to indicate that future experiments are certainly needed to provide more information on these extra degrees of freedom. Any solid evidence of the isocurvature modes would have a considerable impact due to their power in model selection.
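Destriping, as used above, models the correlated low-frequency noise in the time-ordered data as a sequence of constant baseline offsets and solves for them together with the sky map. A minimal sketch under simplifying assumptions (white noise plus one offset per fixed-length chunk, alternating least-squares solution; all names are illustrative, not from the thesis):

import numpy as np

def bin_map(tod, pix, npix):
    """Bin time-ordered data into a map (average of samples per pixel)."""
    hits = np.bincount(pix, minlength=npix)
    summed = np.bincount(pix, weights=tod, minlength=npix)
    return summed / np.maximum(hits, 1)

def destripe(tod, pix, npix, baseline_len, n_iter=20):
    """Estimate per-chunk baseline offsets and a destriped map by
    alternating between map and baseline estimates."""
    nbase = len(tod) // baseline_len
    base_idx = np.minimum(np.arange(len(tod)) // baseline_len, nbase - 1)
    offsets = np.zeros(nbase)
    for _ in range(n_iter):
        # Map estimate with current baselines removed.
        m = bin_map(tod - offsets[base_idx], pix, npix)
        # Baseline estimate from the sky-subtracted residual.
        resid = tod - m[pix]
        offsets = (np.bincount(base_idx, weights=resid, minlength=nbase)
                   / np.bincount(base_idx, minlength=nbase))
        offsets -= offsets.mean()  # overall offset is degenerate with the map
    return bin_map(tod - offsets[base_idx], pix, npix), offsets

# Synthetic test: a 100-pixel sky scanned repeatedly, with slow
# baseline drifts added on top of white noise.
rng = np.random.default_rng(1)
npix, nsamp, blen = 100, 20000, 500
sky = rng.normal(0, 1, npix)
pix = rng.integers(0, npix, nsamp)
drift = np.repeat(rng.normal(0, 5, nsamp // blen), blen)
tod = sky[pix] + drift + rng.normal(0, 0.1, nsamp)
m, _ = destripe(tod, pix, npix, blen)
err = m - sky
print(np.std(err - err.mean()))  # small: map recovered up to a constant offset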