18 results for Kähler Metrics
in Helda - Digital Repository of the University of Helsinki
Abstract:
This PhD thesis is about certain infinite-dimensional Grassmannian manifolds that arise naturally in geometry, representation theory and mathematical physics. From the physics point of view, one encounters these infinite-dimensional manifolds when trying to understand the second quantization of fermions. The many-particle Hilbert space of the second-quantized fermions is called the fermionic Fock space. A typical element of the fermionic Fock space can be thought of as a linear combination of configurations of m particles and n anti-particles. Geometrically, the fermionic Fock space can be constructed as the space of holomorphic sections of a certain (dual) determinant line bundle lying over the so-called restricted Grassmannian manifold, which is a typical example of an infinite-dimensional Grassmannian manifold one encounters in QFT. The construction should be compared with its well-known finite-dimensional analogue, where one realizes an exterior power of a finite-dimensional vector space as the space of holomorphic sections of a determinant line bundle lying over a finite-dimensional Grassmannian manifold. The connection with infinite-dimensional representation theory stems from the fact that the restricted Grassmannian manifold is an infinite-dimensional homogeneous (Kähler) manifold, i.e. it is of the form G/H, where G is a certain infinite-dimensional Lie group and H its subgroup. A central extension of G acts on the total space of the dual determinant line bundle and also on the space of its holomorphic sections; thus G admits a (projective) representation on the fermionic Fock space. This construction also induces the so-called basic representation of loop groups (of compact groups), which in turn are vitally important in string theory and conformal field theory. The thesis consists of three chapters: the first chapter is an introduction to the background material, and the other two chapters are individually written research articles.
The first article revisits a well-known question in Yang-Mills theory: when can one lift the action of the group of gauge transformations on the space of connection one-forms to the total space of the Fock bundle in a way compatible with the second-quantized Dirac operator? In general there is an obstruction to this (called the Mickelsson-Faddeev anomaly), and various geometric interpretations of this anomaly, using such tools as group extensions and bundle gerbes, have been given earlier. In this work we give a new geometric interpretation of the Mickelsson-Faddeev anomaly in terms of differentiable gerbes (certain sheaves of categories) and central extensions of Lie groupoids. The second research article deals with the question of how to define a Dirac-like operator on the restricted Grassmannian manifold, which is an infinite-dimensional space and hence outside the scope of standard Dirac operator theory. The construction relies heavily on infinite-dimensional representation theory, and one of the most technically demanding challenges is to introduce proper normal orderings for certain infinite sums of operators in such a way that all divergences disappear and the infinite sum makes sense as a well-defined operator acting on a suitable Hilbert space of spinors. This research article was motivated by a more extensive ongoing project to construct twisted K-theory classes in Yang-Mills theory via a Dirac-like operator on the restricted Grassmannian manifold.
Abstract:
This thesis studies the human gene expression space using high-throughput gene expression data from DNA microarrays. In molecular biology, high-throughput techniques allow numerical measurement of the expression of tens of thousands of genes simultaneously. In a single study, such data is traditionally obtained from a limited number of sample types with a small number of replicates. For organism-wide analysis, this data has been largely unavailable, and the global structure of the human transcriptome has remained unknown. This thesis introduces a human transcriptome map of different biological entities and an analysis of its general structure. The map is constructed from gene expression data from the two largest public microarray data repositories, GEO and ArrayExpress. The creation of this map contributed to the development of ArrayExpress by identifying and retrofitting previously unusable and missing data and by improving access to its data. It also contributed to the creation of several new tools for microarray data manipulation and to the establishment of data exchange between GEO and ArrayExpress. The data integration for the global map required the creation of a new large ontology of human cell types, disease states, organism parts and cell lines. The ontology was used in a new text-mining and decision-tree based method for the automatic conversion of human-readable free-text microarray data annotations into a categorised format. Comparability of the data, and minimisation of the systematic measurement errors characteristic of each laboratory in this large cross-laboratory integrated dataset, were ensured by computing a range of microarray data quality metrics and excluding incomparable data. The structure of the global map of human gene expression was then explored by principal component analysis and hierarchical clustering, using heuristics and help from another purpose-built sample ontology.
A preface and motivation for the construction and analysis of a global map of human gene expression is given by the analysis of two microarray datasets of human malignant melanoma. The analysis of these sets incorporates an indirect comparison of statistical methods for finding differentially expressed genes and points to the need to study gene expression on a global level.
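The exploratory step mentioned above, hierarchical clustering of expression profiles, can be sketched in miniature with a single-linkage agglomerative procedure. This is an illustrative sketch only: the tiny expression matrix, the Euclidean distance and the cluster count are assumptions for demonstration, not the thesis's actual data or pipeline.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two expression profiles."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def single_linkage(profiles, n_clusters):
    """Agglomerative clustering with single linkage: repeatedly merge
    the two clusters containing the closest pair of samples until
    n_clusters remain."""
    clusters = [[i] for i in range(len(profiles))]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(euclidean(profiles[a], profiles[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i].extend(clusters.pop(j))
    return clusters

# Toy expression matrix: rows are samples, columns are genes.
samples = [
    [9.1, 8.7, 1.2],   # brain-like profile
    [9.0, 8.9, 1.0],   # brain-like profile
    [1.1, 1.3, 9.5],   # liver-like profile
    [0.9, 1.0, 9.7],   # liver-like profile
]
print(single_linkage(samples, 2))  # → [[0, 1], [2, 3]]
```

With real microarray data one would typically use correlation-based distances over thousands of genes, but the merging logic is the same.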
Abstract:
Wireless access is expected to play a crucial role in the future of the Internet. The demands of the wireless environment are not always compatible with the assumptions that were made in the era of wired links. At the same time, new services that take advantage of advances in many areas of technology are being invented. These services include the delivery of mass media such as television and radio, Internet phone calls, and video conferencing. The network must be able to deliver these services to the end user with acceptable performance and quality. This thesis presents an experimental study measuring the performance of bulk data TCP transfers, streaming audio flows, and HTTP transfers that compete for the limited bandwidth of a GPRS/UMTS-like wireless link. The wireless link characteristics are modeled with a wireless network emulator. We analyze how different competing workload types behave with regular TCP and how active queue management, Differentiated Services (DiffServ), and a combination of TCP enhancements affect performance and quality of service. We test four link types, including an error-free link and links with different levels of Automatic Repeat reQuest (ARQ) persistency. The analysis consists of comparing the resulting performance in different configurations based on defined metrics. We observed that DiffServ and Random Early Detection (RED) with Explicit Congestion Notification (ECN) are useful, and in some conditions necessary, for quality of service and fairness, because long queuing delays and congestion-related packet losses cause problems without them. However, we also observed situations where there is still room for significant improvement if the link level is aware of the quality of service. Only a very error-prone link diminishes the benefits to nil. The combination of TCP enhancements, an initial window of four segments, Control Block Interdependence (CBI) and Forward RTO recovery (F-RTO), improves performance.
The initial window of four helps a later-starting TCP flow to start faster but generates congestion under some conditions. CBI prevents slow-start overshoot and balances slow start in the presence of error drops, and F-RTO successfully reduces unnecessary retransmissions.
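One simple way to quantify the fairness of competing flows in comparisons like those above is Jain's fairness index. This is a sketch with hypothetical throughput figures, not the thesis's own metric definitions.

```python
def jain_fairness(throughputs):
    """Jain's fairness index: 1.0 when all n flows receive equal
    throughput, falling toward 1/n as one flow monopolises the link."""
    n = len(throughputs)
    total = sum(throughputs)
    return total * total / (n * sum(x * x for x in throughputs))

# Hypothetical sharing of a 100 kbit/s link by four flows.
print(jain_fairness([25, 25, 25, 25]))  # → 1.0 (perfectly fair)
print(jain_fairness([85, 5, 5, 5]))     # far below 1: one flow dominates
```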
Abstract:
Free and Open Source Software (FOSS) has attracted increasing interest in the computer software industry, but assessing its quality remains a challenge. FOSS development is frequently carried out by globally distributed development teams, and all stages of development are publicly visible. Several product- and process-level quality factors can be measured using the public data. This thesis presents a theoretical background for software quality and metrics and their application in a FOSS environment. The information available from FOSS projects in three information spaces is presented, and a quality model suitable for use in a FOSS context is constructed. The model includes both process and product quality metrics, and takes into account the tools and working methods commonly used in FOSS projects. A subset of the constructed quality model is applied to three FOSS projects, highlighting both theoretical and practical concerns in implementing automatic metric collection and analysis. The experiment shows that useful quality information can be extracted from the vast amount of data available. In particular, projects vary in their growth rate, complexity, modularity and team structure.
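As a minimal sketch of one product-level metric of the kind mentioned, growth rate, the function below computes average relative growth from a series of code-size snapshots. The release sizes are hypothetical, and this is not the quality model constructed in the thesis.

```python
def growth_rate(sizes):
    """Average relative growth per release from a series of code-size
    snapshots (e.g. lines of code measured at each release)."""
    rates = [(b - a) / a for a, b in zip(sizes, sizes[1:])]
    return sum(rates) / len(rates)

# Hypothetical release sizes in kLOC for one project.
kloc = [100, 110, 121, 133.1]
print(round(growth_rate(kloc), 3))  # → 0.1, i.e. 10% growth per release
```

In a real pipeline these snapshots would be mined automatically from the project's public version control history.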
Abstract:
Place identification is the methodology of automatically detecting spatial regions or places that are meaningful to a user by analysing her location traces. Following this approach, several algorithms have been proposed in the literature. Most of the algorithms perform well on a particular data set with a suitable choice of parameter values. However, tuneable parameters make it difficult for an algorithm to generalise to data sets collected from different geographical locations, different periods of time, or containing different activities. This thesis compares the generalisation performance of our proposed DPCluster algorithm along with six state-of-the-art place identification algorithms on twelve location data sets collected using the Global Positioning System (GPS). Spatial and temporal variations present in the data help us to identify strengths and weaknesses of the place identification algorithms under study. We begin by discussing the notion of a place and its importance in location-aware computing. Next, we discuss the different phases of the place identification process found in the literature, followed by a thorough description of seven algorithms. After that, we define evaluation metrics, compare the generalisation performance of the individual place identification algorithms, and report the results. The results indicate that the DPCluster algorithm outperforms all the other algorithms in terms of generalisation performance.
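To illustrate what a place identification algorithm does, here is a naive stay-point sketch. This is not the DPCluster algorithm or any of the surveyed methods; the thresholds, the local metric projection and the trace are illustrative assumptions.

```python
import math

def stay_points(trace, dist_m=200.0, min_dur_s=600.0):
    """Naive stay-point detection: a place candidate is the centroid of a
    run of GPS fixes that stays within dist_m of its first fix for at
    least min_dur_s seconds. trace = [(t_seconds, x_m, y_m), ...] in a
    local metric projection."""
    places, i, n = [], 0, len(trace)
    while i < n:
        j = i + 1
        while j < n and math.hypot(trace[j][1] - trace[i][1],
                                   trace[j][2] - trace[i][2]) <= dist_m:
            j += 1
        if trace[j - 1][0] - trace[i][0] >= min_dur_s:
            xs = [p[1] for p in trace[i:j]]
            ys = [p[2] for p in trace[i:j]]
            places.append((sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j            # continue after the detected stay
        else:
            i += 1           # no stay here, slide the window forward
    return places

# One 20-minute stay at the origin, then a jump far away.
trace = [(t * 60, 0.0, 0.0) for t in range(21)] + [(1500, 5000.0, 5000.0)]
print(stay_points(trace))  # one detected place at the origin
```

The tuneable parameters dist_m and min_dur_s are exactly the kind of knobs that, as the abstract notes, make generalisation across data sets difficult.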
Abstract:
Free and open source software development is an alternative to traditional software engineering as an approach to the development of complex software systems. It is a way of developing software based on geographically distributed teams of volunteers, without an apparent central plan or traditional mechanisms of coordination. The purpose of this thesis is to summarize current knowledge about free and open source software development and to explore ways in which further understanding of it could be gained. The results of research in the field, as well as the research methods, are introduced and discussed. The adaptation of software process metrics to the context of free and open source software development is also illustrated, and the possibility of using them as tools to validate other research is discussed.
Abstract:
The tackling of coastal eutrophication requires water protection measures based on status assessments of water quality. The main purpose of this thesis was to evaluate whether it is possible, both scientifically and within the terms of the European Union Water Framework Directive (WFD), to assess the status of coastal marine waters reliably by using phytoplankton biomass (ww) and chlorophyll a (Chl) as indicators of eutrophication in Finnish coastal waters. Empirical approaches were used to study whether the criteria established for determining an indicator are fulfilled. The first criterion (i) was that an indicator should respond to anthropogenic stresses in a predictable manner and have low variability in its response. Summertime Chl could be predicted accurately from nutrient concentrations, but not from the external annual loads alone, because of the rapid effect of primary production and sedimentation close to the loading sources in summer. The most accurate predictions were achieved in the Archipelago Sea, where total phosphorus (TP) and total nitrogen (TN) alone accounted for 87% and 78% of the variation in Chl, respectively. In river estuaries, the TP mass-balance regression model predicted Chl most accurately when nutrients originated from point sources, whereas land-use regression models were most accurate when nutrients originated mainly from diffuse sources. The inclusion of morphometry (e.g. mean depth) in the nutrient models improved the accuracy of the predictions. The second criterion (ii) was associated with the WFD. It requires that an indicator have type-specific reference conditions, which are defined as "conditions where the values of the biological quality elements are at high ecological status". In establishing reference conditions, the empirical approach could only be used in the outer coastal water types, where historical observations of Secchi depth from the early 1900s are available.
The most accurate prediction was achieved in the Quark. In the inner coastal water types, reference Chl values, estimated from present monitoring data, are imprecise, not only because of the less accurate estimation method but also because the intrinsic characteristics, described for instance by morphometry, vary considerably within these extensive inner coastal types. As for phytoplankton biomass, the reference values were less accurate than in the case of Chl, because reference conditions for biomass could only be estimated using the reconstructed Chl values, not the historical Secchi observations. A paleoecological approach was also applied to estimate annual average reference conditions for Chl. In Laajalahti, an urban embayment off Helsinki that was heavily loaded by municipal waste waters in the 1960s and 1970s, reference conditions prevailed in the mid- and late 1800s. The recovery of the bay from pollution has been delayed as a consequence of benthic release of nutrients, and Laajalahti will probably not achieve the good quality objectives of the WFD on time. The third criterion (iii) was associated with coastal management, including the resources it has available. Analyses of Chl are cheap and fast to carry out compared with analyses of phytoplankton biomass and species composition, a fact which affects the number of samples that can be taken and thereby the reliability of the assessments. However, analyses of phytoplankton biomass and species composition provide more metrics for ecological classification, metrics which reveal aspects of eutrophication that Chl alone does not.
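The nutrient regression models discussed above can be sketched with a plain ordinary-least-squares fit; the percentage-of-variation figures quoted (87%, 78%) are r² statistics of this kind. The TP and Chl values below are hypothetical illustrations, not the thesis's monitoring data.

```python
def linreg(x, y):
    """Ordinary least-squares fit y = a + b*x; returns (a, b, r2),
    where r2 is the fraction of variance in y explained by x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1.0 - ss_res / ss_tot

# Hypothetical summertime observations: TP (ug/l) against Chl a (ug/l).
tp = [15, 20, 25, 30, 40, 55]
chl = [2.1, 3.0, 3.8, 4.9, 6.5, 9.2]
a, b, r2 = linreg(tp, chl)
print(round(b, 3), round(r2, 3))  # positive slope, r2 close to 1
```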
Abstract:
While environmental variation is a ubiquitous phenomenon in the natural world that has long been appreciated by the scientific community, recent changes in global climatic conditions have begun to raise consciousness about the economic, political and sociological ramifications of global climate change. Climate warming has already resulted in documented changes in ecosystem functioning, with direct repercussions on ecosystem services. While predicting the influence of ecosystem changes on vital ecosystem services can be extremely difficult, knowledge of the organisation of ecological interactions within natural communities can help us better understand climate-driven changes in ecosystems. The role of environmental variation as an agent mediating population extinctions is likely to become increasingly important in the future. In previous studies, population extinction risk in stochastic environmental conditions has been tied to an interaction between population density dependence and the temporal autocorrelation of environmental fluctuations. When populations interact with each other, forming ecological communities, the response of such species assemblages to environmental stochasticity can depend, for example, on the trophic structure of the food web and on the similarity of the species-specific responses to environmental conditions. The results presented in this thesis indicate that variation in the correlation structure between species-specific environmental responses (environmental correlation) can have important qualitative and quantitative effects on community persistence and biomass stability in autocorrelated (coloured) environments. In addition, reddened environmental stochasticity and ecological drift processes (such as demographic stochasticity and dispersal limitation) have important implications for patterns in species relative abundances and community dynamics over time and space.
Our understanding of patterns in biodiversity at local and global scale can be enhanced by considering the relevance of different drift processes for community organisation and dynamics. Although the results laid out in this thesis are based on mathematical simulation models, they can be valuable in planning effective empirical studies as well as in interpreting existing empirical results. Most of the metrics considered here are directly applicable to empirical data.
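A minimal sketch of the kind of model behind these results: a single Ricker population driven by AR(1) ("coloured") environmental noise. All parameter values are illustrative assumptions; the thesis's community models are multispecies and far richer.

```python
import math
import random

def simulate(steps, kappa, r=1.5, K=100.0, sigma=0.2, seed=1):
    """Ricker dynamics N' = N * exp(r * (1 - N/K) + eps), where eps is an
    AR(1) environmental noise with autocorrelation kappa (kappa > 0 gives
    reddened noise, kappa = 0 white noise); the scaling keeps the
    stationary variance of eps fixed as kappa changes."""
    rng = random.Random(seed)
    n, eps, series = K, 0.0, []
    for _ in range(steps):
        eps = kappa * eps + math.sqrt(1.0 - kappa ** 2) * rng.gauss(0.0, sigma)
        n = n * math.exp(r * (1.0 - n / K) + eps)
        series.append(n)
    return series

white = simulate(200, kappa=0.0)  # white environmental noise
red = simulate(200, kappa=0.7)    # reddened environmental noise
print(min(white), min(red))       # troughs: low minima signal extinction risk
```

Comparing population minima across noise colours, repeated over many random seeds, is one simple way such models connect autocorrelation to extinction risk.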
Abstract:
Phytoplankton ecology and productivity is one of the main branches of contemporary oceanographic research, and research groups in this branch have increasingly started to utilise bio-optical applications. My main research objective was to critically investigate the advantages and deficiencies of fast repetition rate (FRR) fluorometry for studies of phytoplankton productivity and of the responses of phytoplankton to varying environmental stress. Second, I aimed to clarify the applicability of the FRR system to the optical environment of the Baltic Sea. The FRR system offers a highly dynamic tool for studies of phytoplankton photophysiology and productivity, both in the field and in a controlled environment. The FRR method provides high-frequency in situ determinations of the light-acclimative and photosynthetic parameters of intact phytoplankton communities, and the measurement protocol is relatively easy to use, with no phases requiring analytical determinations. The most notable application of the FRR system lies in its potential for making primary productivity (PP) estimations. However, the realisation of this scheme is not straightforward. FRR-PP estimates, based on the photosynthetic electron flow (PEF) rate, are linearly related to photosynthetic gas-exchange (14C fixation) PP only in environments where photosynthesis is light-limited. If light limitation is not present, as is usually the case in the near-surface layers of the water column, the two PP approaches will deviate. The prompt response of the PEF rate to short-term variability in the natural light field makes field comparisons between PEF-PP and 14C-PP difficult to interpret, because this variability is averaged out in the 14C incubations. Furthermore, FRR-based PP models are tuned to closely follow the vertical pattern of the underwater irradiance.
Due to the photoacclimational plasticity of phytoplankton, this easily leads to overestimates of water-column PP if precautionary measures are not taken. Natural phytoplankton is subject to broad-waveband light. Active non-spectral bio-optical instruments, like the FRR fluorometer, emit light in a relatively narrow waveband, which by its nature does not represent the in situ light field. Thus, the spectrally dependent parameters provided by the FRR system need to be spectrally scaled to the natural light field of the Baltic Sea. In general, the requirement of spectral scaling in water bodies under terrestrial impact concerns all light-adaptive parameters provided by any active non-spectral bio-optical technique. The FRR system can be adapted to studies of all phytoplankton that possess efficient light harvesting in the waveband matching the bluish FRR excitation. Although these taxa cover the large bulk of all phytoplankton taxa, one exception with pronounced ecological significance is found in the Baltic Sea: the FRR system cannot be used to monitor the photophysiology of cyanobacterial taxa that harvest light in the yellow-red waveband. These include the ecologically significant bloom-forming cyanobacterial taxa of the Baltic Sea.
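One plausible form of the spectral scaling mentioned above, presented as an assumption rather than the thesis's actual procedure, is the ratio of phytoplankton absorption weighted by the in situ light spectrum to absorption weighted by the instrument's excitation spectrum, here on a crude four-band grid with toy values.

```python
def spectral_scaling(a_phyto, e_insitu, e_instrument):
    """Ratio of mean phytoplankton absorption weighted by the in situ
    light spectrum to that weighted by the instrument's excitation
    spectrum, on a common evenly spaced wavelength grid."""
    top = sum(a * e for a, e in zip(a_phyto, e_insitu)) / sum(e_insitu)
    bottom = (sum(a * e for a, e in zip(a_phyto, e_instrument))
              / sum(e_instrument))
    return top / bottom

# Toy four-band grid (blue, green, yellow, red):
a_phy = [0.05, 0.02, 0.01, 0.03]  # phytoplankton absorption per band
e_sun = [1.0, 1.0, 1.0, 1.0]      # broad in situ light field
e_frr = [1.0, 0.0, 0.0, 0.0]      # narrow bluish FRR-like excitation
print(spectral_scaling(a_phy, e_sun, e_frr))  # < 1: blue band over-weighted
```

A factor below 1 here simply reflects that a bluish narrow-band excitation samples the absorption spectrum where it is strongest, over-weighting it relative to the broad natural light field.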
Abstract:
Background: The aging population is placing increasing demands on surgical services, simultaneously with a decreasing supply of professional labor and a worsening economic situation. Under growing financial constraints, successful operating room management will be one of the key issues in the struggle for technical efficiency. This study focused on several issues affecting operating room efficiency. Materials and methods: The current formal operating room management in Finland and the use of performance metrics and information systems used to support this management were explored using a postal survey. We also studied the feasibility of a wireless patient tracking system as a tool for managing the process. The reliability of the system as well as the accuracy and precision of its automatically recorded time stamps were analyzed. The benefits of a separate anesthesia induction room in a prospective setting were compared with the traditional way of working, where anesthesia is induced in the operating room. Using computer simulation, several models of parallel processing for the operating room were compared with the traditional model with respect to cost-efficiency. Moreover, international differences in operating room times for two common procedures, laparoscopic cholecystectomy and open lung lobectomy, were investigated. Results: The managerial structure of Finnish operating units was not clearly defined. Operating room management information systems were found to be out-of-date, offering little support to online evaluation of the care process. Only about half of the information systems provided information in real time. Operating room performance was most often measured by the number of procedures in a time unit, operating room utilization, and turnover time. The wireless patient tracking system was found to be feasible for hospital use. 
Automatic documentation of the system facilitated patient flow management by increasing process transparency via more available and accurate data, while lessening work for staff. Any parallel work flow model was more cost-efficient than the traditional way of performing anesthesia induction in the operating room. Mean operating times for two common procedures differed by 50% among eight hospitals in different countries. Conclusions: The structure of daily operative management of an operating room warrants redefinition. Performance measures as well as information systems require updating. Parallel work flows are more cost-efficient than the traditional induction-in-room model.
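The cost-efficiency advantage of parallel work flows can be illustrated with a back-of-the-envelope makespan calculation; the thesis used full computer simulation, and the durations below are hypothetical.

```python
def day_makespan(cases, induction, surgery, turnover, parallel):
    """Total operating-room time (minutes) for a day's case list. With a
    separate induction room (parallel=True), the next patient's
    anaesthesia induction overlaps the previous case's surgery, so only
    the first induction occupies the OR timeline."""
    if parallel:
        return induction + cases * surgery + (cases - 1) * turnover
    return cases * (induction + surgery) + (cases - 1) * turnover

# Hypothetical day: 5 cases, 30 min induction, 90 min surgery, 15 min turnover.
serial = day_makespan(5, 30, 90, 15, parallel=False)
overlap = day_makespan(5, 30, 90, 15, parallel=True)
print(serial, overlap)  # → 660 540
```

Even this crude model shows two hours saved per day; a real comparison must of course add the staffing cost of the induction room, which is why the thesis frames the question as one of cost-efficiency.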
Abstract:
In this thesis we consider the phenomenology of supergravity, and in particular the particle called the "gravitino". We begin with an introductory part, where we discuss the theories of inflation, supersymmetry and supergravity. Gravitino production is then investigated in detail, by considering the research papers included here. First we study the scattering of massive W bosons in the thermal bath of particles during the period of reheating. We show that the process generates non-trivial contributions to the cross section, which eventually lead to unitarity breaking above a certain scale. This happens because, in the annihilation diagram, the longitudinal degrees of freedom in the propagator of the gauge bosons disappear from the amplitude by virtue of the supergravity vertex. Accordingly, the longitudinal polarizations of the on-shell W become strongly interacting in the high-energy limit. By studying the process with both gauge and mass eigenstates, it is shown that the inclusion of diagrams with off-shell scalars of the MSSM does not cancel the divergences. Next, we approach cosmology more closely, and study the decay of a scalar field S into gravitinos at the end of inflation. Once its mass is comparable to the Hubble rate, the field starts coherent oscillations about the minimum of its potential and decays perturbatively. We embed S in a model of gauge mediation with metastable vacua, where the hidden sector is of the O'Raifeartaigh type. First we discuss the dynamics of the field in the expanding background; then radiative corrections to the scalar potential V(S) and to the Kähler potential are calculated. Constraints on the reheating temperature are accordingly obtained by demanding that the gravitinos thus produced account for the observed Dark Matter density. We consistently modify earlier results in the literature, and find that the gravitino number density and T_R are extremely sensitive to the parameters of the model.
This means that it is easy to account for gravitino Dark Matter with an arbitrarily low reheating temperature.
Abstract:
A key trait of Free and Open Source Software (FOSS) development is its distributed nature. Nevertheless, two project-level operations, the fork and the merge of program code, are among the least well understood events in the lifespan of a FOSS project. Some projects have explicitly adopted these operations as the primary means of concurrent development. In this study, we examine the effect of highly distributed software development, as found in the Linux kernel project, on the collection and modelling of software development data. We find that distributed development calls for sophisticated temporal modelling techniques in which several versions of the source code tree can exist at once. Attention must be turned towards the methods of quality assurance and peer review that projects employ to manage these parallel source trees. Our analysis indicates that two new metrics, fork rate and merge rate, could be useful for determining the role of distributed version control systems in FOSS projects. The study presents a preliminary data set consisting of version control and mailing list data.
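The proposed fork rate and merge rate could, under one plausible reading, be computed as events per 30 days over an observation window; the event log below is hypothetical, not the study's data set.

```python
from datetime import date

def event_rate(events, kind, start, end):
    """Events of a given kind (e.g. 'fork' or 'merge') per 30 days over
    an inclusive observation window, from a version-control event log of
    (date, kind) pairs."""
    count = sum(1 for d, k in events if k == kind and start <= d <= end)
    days = (end - start).days + 1
    return count / days * 30

# Hypothetical event log mined from a repository and its mailing list.
log = [
    (date(2024, 1, 3), "fork"), (date(2024, 1, 10), "merge"),
    (date(2024, 1, 17), "fork"), (date(2024, 1, 25), "merge"),
    (date(2024, 1, 30), "merge"),
]
window = (date(2024, 1, 1), date(2024, 1, 30))
print(event_rate(log, "fork", *window), event_rate(log, "merge", *window))
```

A merge rate persistently below the fork rate would suggest parallel source trees accumulating faster than they are reconciled.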
Abstract:
Herbivorous insects, their host plants and natural enemies form the largest and most species-rich communities on Earth. But what forces structure such communities? Do they represent random collections of species, or are they assembled according to given rules? To address these questions, food webs offer excellent tools. As a result of their versatile information content, such webs have become the focus of intensive research over the last few decades. In this thesis, I study herbivore-parasitoid food webs from a new perspective: I construct multiple, quantitative food webs in a spatially explicit setting, at two different scales. Focusing on food webs consisting of specialist herbivores and their natural enemies on the pedunculate oak, Quercus robur, I examine consistency in food web structure across space and time, and how landscape context affects this structure. As an important methodological development, I use DNA barcoding to resolve potential cryptic species in the food webs, and to examine their effect on food web structure. I find that DNA barcoding changes our perception of species identity for as many as a third of the individuals, by reducing misidentifications and by resolving several cryptic species. In terms of the variation detected in food web structure, I find surprising consistency in both space and time. From a spatial perspective, landscape context leaves no detectable imprint on food web structure, while species richness declines significantly with decreasing connectivity. From a temporal perspective, food web structure remains predictable from year to year, despite considerable species turnover in local communities. The rate of such turnover varies between guilds and between species within guilds. The observations are best explained by the abundant and common species, which have a quantitatively dominant imprint on overall structure and suffer the lowest turnover.
By contrast, rare species with little impact on food web structure exhibit the highest turnover rates. These patterns reveal important limitations of modern metrics of quantitative food web structure. While they accurately describe the overall topology of the web and its most significant interactions, they are disproportionately affected by species with given traits, and insensitive to the specific identity of species. As rare species have been shown to be important for food web stability, metrics depicting quantitative food web structure should then not be used as the sole descriptors of communities in a changing world. To detect and resolve the versatile imprint of global environmental change, one should rather use these metrics as one tool among several.
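One standard family of quantitative food-web metrics of the kind critiqued above is based on the Shannon diversity of interaction frequencies. The sketch below shows how heavily skewed link weights make such a metric nearly blind to rare links; the weights themselves are illustrative.

```python
import math

def interaction_diversity(link_weights):
    """Shannon diversity of interaction frequencies in a quantitative
    food web; exp(H) gives the effective number of links. Dominant links
    drive the value, while rare links contribute almost nothing."""
    total = sum(link_weights)
    h = -sum((w / total) * math.log(w / total)
             for w in link_weights if w > 0)
    return math.exp(h)

# Weights = observed rearing events per herbivore-parasitoid link.
even = [10, 10, 10, 10]    # four equally frequent links
skewed = [37, 1, 1, 1]     # one dominant link, three rare ones
print(interaction_diversity(even))              # → 4.0 effective links
print(round(interaction_diversity(skewed), 2))  # well below 4
```

Both webs have four realised links, yet the skewed web scores close to a single effective link, illustrating why rare species leave so faint an imprint on quantitative structure.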