18 results for Standard map

in Helda - Digital Repository of the University of Helsinki


Relevance:

20.00%

Publisher:

Abstract:

This master's thesis explores some of the most recent developments in noncommutative quantum field theory. This old theme, first suggested by Heisenberg in the late 1940s, has had a renaissance during the last decade, due to the firmly held belief that space-time becomes noncommutative at small distances, and due to the discovery that string theory in a background field gives rise to noncommutative field theory as an effective low-energy limit. This has led to interesting attempts to construct a noncommutative standard model, a noncommutative minimal supersymmetric standard model, noncommutative gravity theories, and so on. This thesis reviews themes and problems such as UV/IR mixing, charge quantization, how to deal with noncommutative symmetries, how to solve the Seiberg-Witten map and its connection to fluid mechanics, and the problem of constructing general coordinate transformations to obtain a theory of noncommutative gravity. An emphasis has been put on presenting both the group-theoretical and the string-theoretical results, so that the two can be compared.
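As background for the themes listed above (standard material from the literature, not a result of the thesis): noncommutative field theory replaces ordinary spacetime coordinates by operators satisfying

\[ [\hat x^\mu, \hat x^\nu] = i\theta^{\mu\nu}, \]

which at the level of fields is implemented by multiplying functions with the Moyal-Weyl star product,

\[ (f \star g)(x) = f(x)\,\exp\!\left(\tfrac{i}{2}\,\overleftarrow{\partial}_\mu\,\theta^{\mu\nu}\,\overrightarrow{\partial}_\nu\right) g(x). \]

The Seiberg-Witten map mentioned above relates the gauge fields and gauge parameters of such a theory to those of an ordinary commutative gauge theory, order by order in the constant antisymmetric matrix \(\theta^{\mu\nu}\).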

Relevance:

20.00%

Publisher:

Abstract:

Our present-day understanding of the fundamental constituents of matter and their interactions is based on the Standard Model of particle physics, which relies on quantum gauge field theories. On the other hand, the large-scale dynamical behaviour of spacetime is understood via Einstein's general theory of relativity. The merging of these two complementary aspects of nature, quantum and gravity, is one of the greatest goals of modern fundamental physics, the achievement of which would help us understand the short-distance structure of spacetime, thus shedding light on the singular states of general relativity, such as black holes and the Big Bang, where our current models of nature break down. The formulation of quantum field theories in noncommutative spacetime is an attempt to realize the idea of nonlocality at short distances, which our present understanding of these different aspects of Nature suggests, and consequently to find testable hints of the underlying quantum behaviour of spacetime. The formulation of noncommutative theories encounters various unprecedented problems, which derive from their peculiar inherent nonlocality. Arguably the most serious of these is the so-called UV/IR mixing, which makes the derivation of observable predictions especially hard by causing new, intractable divergences, to which our previous well-developed renormalization methods for quantum field theories do not apply. In the thesis I review the basic mathematical concepts of noncommutative spacetime, different formulations of quantum field theories in this context, and the theoretical understanding of UV/IR mixing. In particular, I put forward new results to be published, which show that quantum electrodynamics in noncommutative spacetime, defined via the Seiberg-Witten map, also suffers from UV/IR mixing. Finally, I review some of the most promising ways to overcome the problem. The final solution remains a challenge for the future.
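The UV/IR mixing discussed above is usually illustrated with the classic nonplanar one-loop mass correction in scalar \(\phi^4\) theory (a standard textbook example, not a result specific to this thesis): the loop is regulated not by the cutoff \(\Lambda\) alone but by an effective cutoff depending on the external momentum,

\[ \Lambda_{\mathrm{eff}}^2 = \frac{1}{1/\Lambda^2 + \tilde p^2}, \qquad \tilde p^\mu = \theta^{\mu\nu} p_\nu . \]

Taking \(\Lambda \to \infty\) at fixed \(p\) gives a finite result proportional to \(1/\tilde p^2\), but that result diverges as \(p \to 0\): the ultraviolet divergence reappears in the infrared, which is why the usual renormalization methods fail.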

Relevance:

20.00%

Publisher:

Abstract:

This thesis studies the human gene expression space using high-throughput gene expression data from DNA microarrays. In molecular biology, high-throughput techniques allow numerical measurement of the expression of tens of thousands of genes simultaneously. In a single study, this data is traditionally obtained from a limited number of sample types with a small number of replicates. For organism-wide analysis, such data has been largely unavailable, and the global structure of the human transcriptome has remained unknown. This thesis introduces a human transcriptome map of different biological entities and an analysis of its general structure. The map is constructed from gene expression data from the two largest public microarray data repositories, GEO and ArrayExpress. The creation of this map contributed to the development of ArrayExpress by identifying and retrofitting previously unusable and missing data and by improving access to its data. It also contributed to the creation of several new tools for microarray data manipulation and to the establishment of data exchange between GEO and ArrayExpress. The data integration for the global map required the creation of a new large ontology of human cell types, disease states, organism parts and cell lines. The ontology was used in a new text-mining and decision-tree based method for the automatic conversion of human-readable free-text microarray data annotations into a categorised format. Data comparability, and the minimisation of the systematic measurement errors characteristic of each laboratory in this large cross-laboratory integrated dataset, was ensured by computing a range of microarray data quality metrics and excluding incomparable data. The structure of the global map of human gene expression was then explored by principal component analysis and hierarchical clustering, using heuristics and help from another purpose-built sample ontology.
A preface and motivation for the construction and analysis of a global map of human gene expression is given by the analysis of two microarray datasets of human malignant melanoma. The analysis of these sets incorporates an indirect comparison of statistical methods for finding differentially expressed genes and points to the need to study gene expression on a global level.
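The exploration step described above (principal component analysis of a large sample-by-gene expression matrix) can be sketched minimally as follows. This is an illustrative toy on random data, not the thesis pipeline; the real analysis rested on curated ontologies and quality filtering.

```python
import numpy as np

def pca_project(expr, n_components=2):
    """Project a samples-x-genes expression matrix onto its
    leading principal components via SVD (toy sketch)."""
    centered = expr - expr.mean(axis=0)           # center each gene
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T         # sample coordinates

# Toy data: 6 "samples" x 50 "genes", with two crude sample groups.
rng = np.random.default_rng(0)
expr = rng.normal(size=(6, 50))
expr[:3] += 3.0                                   # shift group 1
coords = pca_project(expr)
print(coords.shape)                               # → (6, 2)
```

With a group shift this large, the first principal component separates the two sample groups, which is the kind of structure the global map was built to reveal at scale.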

Relevance:

20.00%

Publisher:

Abstract:

Climate change will influence the living conditions of all life on Earth. For some species, the change in environmental conditions that has occurred so far has already increased the risk of extinction, and the extinction risk is predicted to increase for large numbers of species in the future. Some species may have time to adapt to the changing environmental conditions, but the rate and magnitude of the change are too great to allow many species to survive via evolutionary changes. Species responses to climate change have been documented for some decades. Some groups of species, like many insects, respond readily to changes in temperature conditions and have shifted their distributions northwards to new climatically suitable regions. Such range shifts have been well documented, especially in the temperate zones. In this context, butterflies have been studied more than any other group of species, partly because their past geographical ranges are well documented, which facilitates species-climate modelling and other analyses. The aim of the modelling studies is to examine to what extent shifts in species distributions can be explained by climatic and other factors. Models can also be used to predict the future distributions of species. In this thesis, I have studied the response to climate change of one species of butterfly within one geographically restricted area. The study species, the European map butterfly (Araschnia levana), has expanded rapidly northwards in Finland during the last two decades. I used statistical and dynamic modelling approaches, in combination with field studies, to analyse the effects of climate warming and landscape structure on the expansion. I studied the possible role of molecular variation in phosphoglucose isomerase (PGI), a glycolytic enzyme affecting flight metabolism and thereby flight performance, in the observed expansion of the map butterfly at two separate expansion fronts in Finland.
The expansion rate of the map butterfly was shown to be correlated with the frequency of warmer-than-average summers during the study period. This result is in line with the greater probability of occurrence of the second generation during warm summers and with previous results on this species showing greater mobility of second-generation than first-generation individuals. The results of a field study in this thesis indicated low mobility of the first-generation butterflies. Climatic variables alone were not sufficient to explain the observed expansion in Finland. There are also problems in transferring the climate model from the regions for which data were available to construct it to new regions: the climate model predicted a wider distribution in south-western Finland than has been observed. Dynamic modelling of the expansion in response to landscape structure suggested that habitat and landscape structure influence the rate of expansion; in southern Finland the landscape structure may have slowed it down. The results on PGI suggested that allelic variation in this enzyme may influence flight performance and thereby the rate of expansion. Genetic differences between the populations at the two expansion fronts may explain, at least partly, the observed differences in the rate of expansion. Individuals with the genotype associated with a high flight metabolic rate were most frequent in eastern Finland, where the rate of range expansion has been highest.
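The interplay described above between warm summers (which allow a mobile second generation) and landscape structure can be caricatured in a toy one-dimensional front model. All rules and parameters here are illustrative assumptions of mine, not the dynamic model used in the thesis:

```python
import numpy as np

def simulate_expansion(n_cells=100, years=20, warm_prob=0.5,
                       habitat=None, seed=1):
    """Toy 1-D range-expansion model: each year the front advances
    2 cells after a warm summer (two generations, with the mobile
    second generation) and 1 cell after a cool one, but only
    through suitable habitat. Returns the final front position."""
    rng = np.random.default_rng(seed)
    habitat = np.ones(n_cells, bool) if habitat is None else habitat
    front = 0
    for _ in range(years):
        steps = 2 if rng.random() < warm_prob else 1
        for _ in range(steps):
            if front + 1 < n_cells and habitat[front + 1]:
                front += 1                        # advance one cell
    return front

warm = simulate_expansion(warm_prob=0.9)          # mostly warm summers
cool = simulate_expansion(warm_prob=0.1)          # mostly cool summers
print(warm, cool)
```

Because both runs share the same random seed, the warm-summer scenario always reaches at least as far as the cool one, mirroring the correlation with warm-summer frequency reported above; passing a `habitat` mask with unsuitable cells would slow the front, as the landscape-structure results suggest.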

Relevance:

20.00%

Publisher:

Abstract:

Holistic physics education at the upper secondary level, based on an optional physics course. Keywords: physics education, education, holistic, curriculum, world view, values. A physics teacher's task is to put into practice all the goals of the curriculum. In this research, holistic physics education means teaching in which both the school's common educational goals and the goals particular to the physics curriculum are taken into account. These involve knowledge, skills, and personal value and attitude goals. The research task was to clarify how educational goals involving students' values and attitudes can be pursued through the subject content of physics, and how the physics teacher communicates the modern world view through the content of the physics class. The goal of this research was to improve teaching, to find new points of view, and to widen the perspective on how physics is taught. The teacher, who also acted as the researcher, planned and delivered an optional course in which she could study the possibilities of holistic physics education. In 2001-2002, ten girls and two boys of the 9th grade participated in this elective course. Following the principles of action research, the teacher-researcher also reflected on her own teaching. The research method was content analysis, which involved analyzing both student feedback and the relevant features of the teacher's knowledge needed for planning and giving the physics lessons: subject matter knowledge, the curriculum, didactics, and the teacher's pedagogical content knowledge. Didactics here includes knowledge of the learning process, students' motivation, the specific features of physics didactics, and research on physics education. Among other things, the researcher organized the contents of the curriculum, abstracted sentences into keywords, and drew a concept map from them.
The concept maps, for instance the map of educational goals and the map of the essence of physics, were tools for studying the contents included in holistic physics education, and conclusions were reached about the physics domains through which its goals can be achieved. According to this research, the contents supporting holistic physics education are: perception, the essence of science, the development of science, new research topics, and interactions in physics. The starting point of teaching should be connected with the students' life experiences, and the approach to teaching should be broadly relevant to those experiences. The teacher-researcher observed and analyzed the effects of the experimental physics course through the lens of holistic physics education. The students reported that the goals of holistic physics education were achieved in the course. Their discourse indicated that in the experimental course they could express their opinions and feelings and make proposals and evaluations. The students felt they could influence the content of the course; they considered the philosophical physics course interesting, and it awakened questions, increased their self-esteem, and helped them become more aware of their world views. The students' analytic skills developed in the interactive learning environment. The physics teacher needs broad knowledge for planning his or her teaching, which is evaluated in this research through the content maps constructed as teaching tools. In holistic physics education the teacher needs an open and curious mind and good interaction skills. This research indicates the importance of physics teaching in developing attitudes and values alongside the physics subject matter in the classroom.
Considering different points of view on human life makes it possible for students to construct a modern world view and to develop analytic skills and self-esteem, and thus helps them learn. Broad points of view also help to transfer knowledge into practice. Since such contents are not covered when teaching the physics included in the standard curriculum, supplementary teaching material that includes these topics is needed.

Relevance:

20.00%

Publisher:

Abstract:

We report on a search for the standard-model Higgs boson in pp̄ collisions at √s = 1.96 TeV using an integrated luminosity of 2.0 fb⁻¹. We look for production of the Higgs boson decaying to a pair of bottom quarks in association with a vector boson V (W or Z) decaying to quarks, resulting in a four-jet final state. Two of the jets are required to have secondary vertices consistent with B-hadron decays. We set the first 95% confidence level upper limit on the VH production cross section with V(→qq̄/qq̄′)H(→bb̄) decay for Higgs boson masses of 100-150 GeV/c² using data from Run II at the Fermilab Tevatron. For m_H = 120 GeV/c², we exclude cross sections larger than 38 times the standard-model prediction.

Relevance:

20.00%

Publisher:

Abstract:

We combine searches by the CDF and D0 collaborations for a Higgs boson decaying to W⁺W⁻. The data correspond to integrated luminosities of 4.8 fb⁻¹ (CDF) and 5.4 fb⁻¹ (D0) of pp̄ collisions at √s = 1.96 TeV at the Fermilab Tevatron collider. No excess is observed above the background expectation, and the resulting limits on Higgs boson production exclude a standard-model Higgs boson in the mass range 162-166 GeV at the 95% C.L.
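The logic of a 95% C.L. exclusion can be illustrated with a single-bin counting experiment. This toy (the function names and the simple one-sided Poisson criterion are my own) is far simpler than the multi-channel combination machinery actually used by CDF and D0:

```python
import math

def poisson_cdf(n, lam):
    """P(N <= n) for a Poisson variable with mean lam."""
    return sum(math.exp(-lam) * lam**k / math.factorial(k)
               for k in range(n + 1))

def upper_limit(n_obs, bkg, cl=0.95, hi=100.0):
    """Upper limit on signal s in a single counting experiment:
    the largest s with P(N <= n_obs | s + bkg) >= 1 - cl,
    found by bisection. Toy illustration only."""
    lo = 0.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if poisson_cdf(n_obs, mid + bkg) > 1.0 - cl:
            lo = mid        # s = mid not yet excluded; go higher
        else:
            hi = mid        # s = mid excluded; go lower
    return 0.5 * (lo + hi)

print(round(upper_limit(0, 0.0), 3))   # classic n=0, b=0 case → 2.996
```

A mass point is then "excluded at 95% C.L." when the limit on the signal rate falls below the rate the standard model predicts for that Higgs mass, which is how the 162-166 GeV exclusion above should be read.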

Relevance:

20.00%

Publisher:

Abstract:

We present a search for standard model (SM) Higgs boson production using pp̄ collision data at √s = 1.96 TeV, collected with the CDF II detector and corresponding to an integrated luminosity of 4.8 fb⁻¹. We search for Higgs bosons produced in all processes with a significant production rate and decaying to two W bosons. We find no evidence for SM Higgs boson production and place upper limits at the 95% confidence level on the SM production cross section σ(H) for values of the Higgs boson mass m_H in the range from 110 to 200 GeV. These limits are the most stringent for m_H > 130 GeV and are 1.29 times the predicted value of σ(H) for m_H = 165 GeV.


Relevance:

20.00%

Publisher:

Abstract:

We present a search for standard model Higgs boson production in association with a W boson in proton-antiproton collisions at a center-of-mass energy of 1.96 TeV. The search employs data collected with the CDF II detector that correspond to an integrated luminosity of approximately 1.9 fb⁻¹. We select events consistent with a signature of a single charged lepton, missing transverse energy, and two jets. Jets corresponding to bottom quarks are identified with a secondary vertex tagging method, a jet probability tagging method, and a neural network filter. We use kinematic information in an artificial neural network to improve discrimination between signal and background compared to previous analyses. The observed number of events and the neural network output distributions are consistent with the standard model background expectations, and we set 95% confidence level upper limits on the production cross section times branching fraction ranging from 1.2 to 1.1 pb, or 7.5 to 102 times the standard model expectation, for Higgs boson masses from 110 to 150 GeV/c², respectively.

Relevance:

20.00%

Publisher:

Abstract:

In a search for new phenomena in a signature suppressed in the standard model of elementary particles (SM), we compare the inclusive production of events containing a lepton, a photon, significant transverse momentum imbalance (MET), and a jet identified as containing a b-quark to SM predictions. The search uses data produced in proton-antiproton collisions at 1.96 TeV, corresponding to 1.9 fb⁻¹ of integrated luminosity taken with the CDF detector at the Fermilab Tevatron. We find 28 lepton+photon+MET+b events versus an expectation of 31.0 +4.1/−3.5 events. If we further require events to contain at least three jets and large total transverse energy, simulations predict that the largest SM source is top-quark pair production with an additional radiated photon, ttbar+photon. In the data we observe 16 ttbar+photon candidate events versus an expectation from SM sources of 11.2 +2.3/−2.1. Assuming the difference between the observed number and the predicted non-top-quark total is due to SM top-quark production, we estimate the ttbar+photon cross section to be 0.15 ± 0.08 pb.

Relevance:

20.00%

Publisher:

Abstract:

Layering is a widely used method for structuring data in CAD models. During the last few years, national standardisation organisations, professional associations, user groups for particular CAD systems, individual companies etc. have issued numerous standards and guidelines for the naming and structuring of layers in building design. In order to increase the integration of CAD data in the industry as a whole, ISO recently decided to define an international standard for layer usage. The resulting standard proposal, ISO 13567, is a rather complex framework standard which strives to be more of a union than the least common denominator of the capabilities of existing guidelines. A number of principles have been followed in the design of the proposal. The first is the separation of the conceptual organisation of information (semantics) from the way this information is coded (syntax). The second is orthogonality: many ways of classifying information are independent of each other and can be applied in combination. The third, overriding, principle is the reuse of existing national or international standards whenever appropriate. The fourth principle allows users to apply well-defined subsets of the overall superset of possible layer names. This article describes the semantic organisation of the standard proposal as well as its default syntax. Important information categories deal with the party responsible for the information, the type of building element shown, and whether a layer contains the direct graphical description of a building part or additional information needed in an output drawing. Non-mandatory information categories facilitate the structuring of information in rebuilding projects, the use of layers for spatial grouping in large multi-storey projects, and the storing of multiple representations intended for different drawing scales in the same model.
Pilot testing of ISO 13567 is currently being carried out in a number of countries which have been involved in the definition of the standard. In the article, two implementations, carried out independently in Sweden and Finland, are described. The article concludes with a discussion of the benefits and possible drawbacks of the standard. Incremental development within the industry (where "best practice" can become "common practice" via a standard such as ISO 13567) is contrasted with the more idealistic scenario of building product models. The relationship between CAD layering, document management, product modelling and building element classification is also discussed.
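The fixed-position layer-name structure described in the article can be sketched as a compose/parse pair. The field names echo the mandatory categories above, but the field widths and the sample codes here are illustrative assumptions, not the normative ISO 13567 values:

```python
from dataclasses import dataclass

# Illustrative field widths only -- the actual widths and code lists
# are defined by the standard; these values are assumptions.
FIELDS = [("agent", 2), ("element", 6), ("presentation", 2)]

@dataclass
class LayerName:
    agent: str          # party responsible for the information
    element: str        # building-element classification code
    presentation: str   # e.g. graphics vs. annotation for output

def compose(layer: LayerName) -> str:
    """Concatenate the mandatory fields into one fixed-width
    layer name, padding unused positions with '-'."""
    return "".join(getattr(layer, name).ljust(width, "-")[:width]
                   for name, width in FIELDS)

def parse(name: str) -> LayerName:
    """Split a fixed-width layer name back into its fields."""
    parts, pos = {}, 0
    for field, width in FIELDS:
        parts[field] = name[pos:pos + width].rstrip("-")
        pos += width
    return LayerName(**parts)

name = compose(LayerName(agent="A", element="21", presentation="D"))
print(name)                                     # prints A-21----D-
assert parse(name) == LayerName("A", "21", "D")
```

The fixed-width, '-'-padded encoding is what lets different tools apply well-defined subsets of the superset of possible layer names, as the fourth design principle above requires: a consumer can read only the fields it understands and ignore the rest.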