905 results for Running Kinematics
Abstract:
This article discusses the physics programme of the TOTEM experiment at the LHC. A new special beam optics with β* = 90 m, which enables measurements of the total cross-section, elastic pp scattering and diffractive phenomena in early LHC runs, is explained. For this and the various other TOTEM running scenarios, the acceptances of the leading proton detectors and of the forward tracking stations for some physics processes are described.
Abstract:
The production rate and kinematics of photons produced in association with Z bosons are studied using 2 fb^-1 of p\bar{p} collision data collected with the Collider Detector at Fermilab. The cross section for p\bar{p} -> l^+ l^- gamma + X (where the leptons l are either muons or electrons with dilepton mass M_{ll} > 40 GeV/c^2, and where the photon has transverse energy Et_{gamma} > 7 GeV and is well separated from the leptons) is 4.6 +/- 0.2 (stat) +/- 0.3 (syst) +/- 0.3 (lum) pb, which is consistent with standard model expectations. We use the photon Et distribution from Z-gamma events where the Z has decayed to mu^+ mu^-, e^+ e^-, or nu\bar{nu} to set limits on anomalous (non-standard-model) trilinear couplings between photons and Z bosons.
Abstract:
Wear studies of engine components of high-speed diesel engines running under various operating conditions are presented. Tests were conducted under controlled conditions over long periods. The results of the various tests are discussed and attempts have been made to examine the effects of engine operating variables and the quality of the lubricating oil on the wear of engine components.
Abstract:
We report a measurement of the top quark mass, m_t, obtained from ppbar collisions at sqrt(s) = 1.96 TeV at the Fermilab Tevatron using the CDF II detector. We analyze a sample corresponding to an integrated luminosity of 1.9 fb^-1. We select events with an electron or muon, large missing transverse energy, and exactly four high-energy jets in the central region of the detector, at least one of which is tagged as coming from a b quark. We calculate a signal likelihood using a matrix element integration method, with effective propagators to take into account assumptions on event kinematics. Our event likelihood is a function of m_t and a parameter JES that determines in situ the calibration of the jet energies. We use a neural network discriminant to distinguish signal from background events. We also apply a cut on the peak value of each event likelihood curve to reduce the contribution of background and badly reconstructed events. Using the 318 events that pass all selection criteria, we find m_t = 172.7 +/- 1.8 (stat. + JES) +/- 1.2 (syst.) GeV/c^2.
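As a schematic illustration only (the CDF analysis builds its signal probability from a matrix element integration with effective propagators, and uses a neural network discriminant and a likelihood-peak cut; this simple mixture form is an assumption, not the paper's exact construction), event likelihoods of this kind are typically combined as

$$L(m_t, \mathrm{JES}) = \prod_{i=1}^{N_{\mathrm{evt}}} \Big[ f_s\, P_{\mathrm{sig}}(x_i;\, m_t, \mathrm{JES}) + (1-f_s)\, P_{\mathrm{bkg}}(x_i;\, \mathrm{JES}) \Big],$$

where $x_i$ are the observed event kinematics and the jet energy scale JES is calibrated in situ by the same maximization that determines $m_t$.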
Abstract:
The status of the TOTEM experiment is described, as well as the prospects for the measurements in the early LHC runs. The primary goal of TOTEM is the measurement of the total p-p cross section, using a method independent of the luminosity. A final accuracy of 1% is expected with dedicated β∗ = 1540 m runs, while at the beginning a 5% resolution is achievable with a β∗ = 90 m optics. According to the running scenarios, TOTEM will be able to measure elastic scattering in a wide range of t and to study the cross-sections and the topologies of diffractive events. In a later stage, physics studies will be extended to low-x and forward physics, in collaboration with CMS operating as a single experimental apparatus.
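For context, a minimal sketch of the luminosity-independent method, assuming the standard optical-theorem relation (ρ is the ratio of the real to the imaginary part of the forward elastic amplitude): combining the elastic and inelastic rates with the extrapolated forward elastic slope gives

$$\sigma_{\mathrm{tot}} = \frac{16\pi}{1+\rho^2}\cdot\frac{\left. dN_{\mathrm{el}}/dt \right|_{t=0}}{N_{\mathrm{el}}+N_{\mathrm{inel}}}, \qquad \mathcal{L} = \frac{1+\rho^2}{16\pi}\cdot\frac{(N_{\mathrm{el}}+N_{\mathrm{inel}})^2}{\left. dN_{\mathrm{el}}/dt \right|_{t=0}},$$

so both the total cross section and the luminosity follow from TOTEM's own rate measurements, without an external luminosity determination.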
Abstract:
We present a measurement of the top quark mass with t-tbar dilepton events produced in p-pbar collisions at the Fermilab Tevatron at $\sqrt{s}$ = 1.96 TeV and collected by the CDF II detector. A sample of 328 events with a charged electron or muon and an isolated track, corresponding to an integrated luminosity of 2.9 fb$^{-1}$, is selected as t-tbar candidates. To account for the unconstrained event kinematics, we scan over the phase space of the azimuthal angles ($\phi_{\nu_1},\phi_{\nu_2}$) of the neutrinos and reconstruct the top quark mass for each $\phi_{\nu_1},\phi_{\nu_2}$ pair by minimizing a $\chi^2$ function in the t-tbar dilepton hypothesis. We assign $\chi^2$-dependent weights to the solutions in order to build a preferred mass for each event. Preferred mass distributions (templates) are built from simulated t-tbar and background events, and parameterized in order to provide continuous probability density functions. A likelihood fit to the mass distribution in data as a weighted sum of signal and background probability density functions gives a top quark mass of $165.5^{+{3.4}}_{-{3.3}}$(stat.)$\pm 3.1$(syst.) GeV/$c^2$.
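A minimal sketch of the per-event scan described above, assuming a kinematic fitter `fit_chi2` (a hypothetical name standing in for the actual $\chi^2$ minimization) that returns the reconstructed mass and $\chi^2$ for a given $(\phi_{\nu_1},\phi_{\nu_2})$ hypothesis; the $e^{-\chi^2/2}$ weighting and the grid granularity are illustrative assumptions:

```python
import numpy as np

def preferred_mass(event, fit_chi2, n_grid=24):
    """Scan the neutrino azimuthal angles (phi_nu1, phi_nu2), reconstruct
    a top-quark mass at each grid point by chi^2 minimization under the
    ttbar dilepton hypothesis, and combine the solutions with
    chi^2-dependent weights into one preferred mass per event."""
    phis = np.linspace(-np.pi, np.pi, n_grid, endpoint=False)
    masses, weights = [], []
    for phi1 in phis:
        for phi2 in phis:
            m_top, chi2 = fit_chi2(event, phi1, phi2)  # hypothetical fitter
            masses.append(m_top)
            weights.append(np.exp(-0.5 * chi2))        # assumed weight choice
    return np.average(masses, weights=np.array(weights))
```

Per-event preferred masses computed this way would then be histogrammed into the signal and background templates that enter the final likelihood fit.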
Abstract:
Volatile organic compounds (VOCs) are emitted into the atmosphere from natural and anthropogenic sources, vegetation being the dominant source on a global scale. Some of these reactive compounds are deemed major contributors or inhibitors to aerosol particle formation and growth, thus making VOC measurements essential for current climate change research. This thesis discusses ecosystem scale VOC fluxes measured above a boreal Scots pine dominated forest in southern Finland. The flux measurements were performed using the micrometeorological disjunct eddy covariance (DEC) method combined with proton transfer reaction mass spectrometry (PTR-MS), which is an online technique for measuring VOC concentrations. The measurement, calibration, and calculation procedures developed in this work proved to be well suited to long-term VOC concentration and flux measurements with PTR-MS. A new averaging approach based on running averaged covariance functions improved the determination of the lag time between wind and concentration measurements, which is a common challenge in DEC when measuring fluxes near the detection limit. The ecosystem scale emissions of methanol, acetaldehyde, and acetone were substantial. These three oxygenated VOCs made up about half of the total emissions, with the rest comprised of monoterpenes. Contrary to the traditional assumption that monoterpene emissions from Scots pine originate mainly as evaporation from specialized storage pools, the DEC measurements indicated a significant contribution from de novo biosynthesis to the ecosystem scale monoterpene emissions. This thesis offers practical guidelines for long-term DEC measurements with PTR-MS. In particular, the new averaging approach to the lag time determination seems useful in the automation of DEC flux calculations. Seasonal variation in the monoterpene biosynthesis and the detailed structure of a revised hybrid algorithm, describing both de novo and pool emissions, should be determined in further studies to improve biological realism in the modelling of monoterpene emissions from Scots pine forests. The increasing number of DEC measurements of oxygenated VOCs will probably enable better estimates of the role of these compounds in plant physiology and tropospheric chemistry.
Keywords: disjunct eddy covariance, lag time determination, long-term flux measurements, proton transfer reaction mass spectrometry, Scots pine forests, volatile organic compounds
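A minimal sketch of the lag-time idea, assuming evenly sampled wind and concentration series; the function names, the exponential form of the running average, and α = 0.1 are illustrative assumptions rather than the thesis's exact procedure:

```python
import numpy as np

def lagged_cov(w, c, max_lag):
    """Cross-covariance between vertical wind w and VOC concentration c
    for lags 0..max_lag samples (concentration delayed relative to wind,
    e.g. by the sampling line and the PTR-MS measurement cycle)."""
    w = np.asarray(w, float) - np.mean(w)
    c = np.asarray(c, float) - np.mean(c)
    n = len(w)
    return np.array([np.mean(w[:n - k] * c[k:]) for k in range(max_lag + 1)])

def update_running_cov(cov_avg, new_cov, alpha=0.1):
    """Running average of covariance functions over successive flux
    periods; the lag is read off the smoothed curve, which is more
    stable than a single-period peak when the flux is near the
    detection limit."""
    cov_avg = new_cov if cov_avg is None else (1 - alpha) * cov_avg + alpha * new_cov
    return cov_avg, int(np.argmax(np.abs(cov_avg)))
```

The point of averaging the covariance functions themselves, rather than picking the peak of each noisy single-period curve, is that the true lag varies slowly while the noise does not, so the smoothed maximum is a more robust lag estimate.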
Abstract:
We report the results of a study of multi-muon events produced at the Fermilab Tevatron collider and acquired with the CDF II detector using a dedicated dimuon trigger. The production cross section and kinematics of events in which both muon candidates are produced inside the beam pipe of radius 1.5 cm are successfully modeled by known processes which include heavy flavor production. In contrast, we are presently unable to fully account for the number and properties of the remaining events, in which at least one muon candidate is produced outside of the beam pipe, in terms of the same understanding of the CDF II detector, trigger, and event reconstruction.
Abstract:
As globalization and the free movement of capital have increased, so has interest in the effects of global money flows, especially during financial crises. The concern has been that large global money flows will affect the pricing of small local markets, in particular by causing overreaction. The purpose of this thesis is to contribute to the body of work concerning short-term under- and overreaction and the short-term effects of foreign investment flow in the small Finnish equity markets. This thesis also compares foreign execution return to domestic execution return. The results indicate that short-term under- and overreaction occurs in domestic-buy portfolios (domestic net buying) rather than in foreign-buy portfolios. This under- and overreaction, however, is not economically meaningful after controlling for the bid-ask bounce effect. Based on this finding, one can conclude that foreign investors do not have a destabilizing effect on the Finnish markets in the short term. Foreign activity affects short-term returns: when foreign investors are net buyers (sellers), there are positive (negative) market-adjusted returns. The literature on nationality and institutional effects leads us to expect such results. These foreign flows are persistent at a 5% to 21% level, and the persistence of foreign buy flow is higher than that of foreign sell flow. Foreign daily trading execution is worse than domestic execution. The literature that characterizes foreign investors as liquidity demanders, together with the literature on front-running, leads us to expect poorer foreign execution than domestic execution.
Abstract:
The move towards IT outsourcing is the first step towards an environment where compute infrastructure is treated as a service. In utility computing this IT service has to honor Service Level Agreements (SLA) in order to meet the desired Quality of Service (QoS) guarantees. Such an environment requires reliable services in order to maximize the utilization of the resources and to decrease the Total Cost of Ownership (TCO). Such reliability cannot come at the cost of resource duplication, since that increases the TCO of the data center and hence the cost per compute unit. In this paper, we look into projecting the impact of hardware failures on SLAs and the techniques required to take proactive recovery steps in case of a predicted failure. By maintaining health vectors of all hardware and system resources, we predict the failure probability of resources at runtime, based on observed hardware errors/failure events. This in turn influences an availability-aware middleware to take proactive action (even before the application is affected, in cases where the system and the application have low recoverability). The proposed framework has been prototyped on a system running HP-UX. Our offline analysis of the prediction system on hardware error logs indicates no more than 10% false positives. To the best of our knowledge, this work is the first of its kind to perform an end-to-end analysis of the impact of a hardware fault on application SLAs in a live system.
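A minimal sketch of how such a health vector might feed a proactive-recovery decision; the event weights, decay constant, logistic calibration, and threshold below are illustrative assumptions, not the paper's actual model:

```python
import math

# Illustrative event weights; the paper's actual health-vector model,
# weights, and thresholds are not specified in the abstract.
EVENT_WEIGHTS = {"corrected_ecc": 0.1, "disk_retry": 0.3, "fan_warning": 0.2}

class HealthVector:
    """Accumulates observed hardware error/failure events for one resource."""

    def __init__(self, decay=0.99):
        self.score = 0.0
        self.decay = decay  # older events contribute less over time

    def observe(self, event):
        self.score = self.score * self.decay + EVENT_WEIGHTS.get(event, 0.5)

    def failure_probability(self):
        # Map the accumulated error score to (0, 1) with a logistic curve
        # (an assumed calibration, chosen here for illustration).
        return 1.0 / (1.0 + math.exp(-(self.score - 2.0)))

def should_act(hv, threshold=0.7):
    """Signal the availability-aware middleware to migrate or checkpoint
    proactively once the predicted failure probability crosses a threshold."""
    return hv.failure_probability() > threshold
```

The design intent mirrors the abstract: prediction runs continuously on observed error events, so the middleware can act before the application is affected rather than recovering after the fact.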
Abstract:
The shock manifold equation is a first-order nonlinear partial differential equation which describes the kinematics of a shock front in an ideal gas with constant specific heats. However, it was found that there is more than one such shock manifold equation, and that the shock surface can be embedded in a one-parameter family of surfaces obtained as a solution of any of these shock manifold equations. Associated with each shock manifold equation is a set of characteristic curves called 'shock rays'. This paper investigates the nature of the various associated shock ray equations.
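For orientation, the shock rays are the characteristics of the first-order equation, in the standard textbook form (a generic sketch, not the paper's specific shock manifold equation; assume the equation is written $H(\mathbf{x}, t, \mathbf{p}, q) = 0$ with $\mathbf{p} = \nabla\phi$, $q = \phi_t$, and no explicit dependence on $\phi$):

$$\frac{d\mathbf{x}}{d\sigma}=\frac{\partial H}{\partial \mathbf{p}},\qquad \frac{dt}{d\sigma}=\frac{\partial H}{\partial q},\qquad \frac{d\mathbf{p}}{d\sigma}=-\frac{\partial H}{\partial \mathbf{x}},\qquad \frac{dq}{d\sigma}=-\frac{\partial H}{\partial t}.$$

Each of the different shock manifold equations generates its own family of such rays, which is why comparing the associated shock ray equations is meaningful.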
Abstract:
A high-contrast laser writing technique, based on laser-induced efficient chemical oxidation in in situ textured Ge films, is demonstrated. Free-running Nd:YAG laser pulses are used for irradiating the films. The irradiation effects have been characterised using optical microscopy, electron spectroscopy and microdensitometry. Using X-ray initiated Auger electron spectroscopy (XAES) and X-ray photoelectron spectroscopy (XPS), the mechanism for the observed contrast has been identified as the formation of a GeO2 phase upon laser irradiation. The contrast in the present films is found to be nearly five times greater than that known from GeO phase formation in similar films.
Abstract:
The k-colouring problem is to colour a given k-colourable graph with k colours. This problem is known to be NP-hard even for fixed k ≥ 3. The best known polynomial time approximation algorithms require n^δ (for a positive constant δ depending on k) colours to colour an arbitrary k-colourable n-vertex graph. The situation is entirely different if we look at the average performance of an algorithm rather than its worst-case performance. It is well known that a k-colourable graph drawn from certain classes of distributions can be k-coloured almost surely in polynomial time. In this paper, we present further results in this direction. We consider k-colourable graphs drawn from the random model in which each allowed edge is chosen independently with probability p(n) after initially partitioning the vertex set into k colour classes. We present polynomial time algorithms of two different types. The first type of algorithm always runs in polynomial time and succeeds almost surely. Algorithms of this type have been proposed before, but our algorithms have provably exponentially small failure probabilities. The second type of algorithm always succeeds and has polynomial running time on average. Such algorithms are more useful and more difficult to obtain than the first type of algorithms. Our algorithms work as long as p(n) ≥ n^{-1+ε}, where ε is a constant greater than 1/4.
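A minimal sketch of the random model described above; assigning each vertex a uniformly random class is one simple way to realize the initial partition, which the abstract does not fix:

```python
import random

def random_k_colourable_graph(n, k, p, seed=None):
    """Random model from the abstract: partition the vertex set into k
    colour classes, then include each allowed edge (endpoints in
    different classes) independently with probability p."""
    rng = random.Random(seed)
    colour = [rng.randrange(k) for _ in range(n)]  # hidden planted colouring
    edges = [(u, v)
             for u in range(n) for v in range(u + 1, n)
             if colour[u] != colour[v] and rng.random() < p]
    return edges, colour
```

Graphs generated this way are k-colourable by construction (the planted classes form a valid colouring), and the algorithms in the paper are analysed over this distribution for p(n) ≥ n^{-1+ε}.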
Abstract:
This thesis explores the particular framework of evidentiary assessment in three selected appellate national asylum procedures in Europe and discusses the relationship between these procedures, on the one hand, and between these procedures and other legal systems, including the EU legal order and international law, on the other. A theme running throughout the thesis is the EU's striving towards approximation of national asylum procedures, and the study analyses the evidentiary assessment of national procedures with the aim of pinpointing similarities and differences, and the influences which affect these distinctions. The thesis first explores the frames construed for national evidentiary solutions by studying the object of decision-making and the impact of legal systems outside the national one. Second, the study analyses the factual evidentiary assessment of three national procedures: German, Finnish and English. Third, the study explores the interrelationship between these procedures and the legal systems influencing them, and poses questions in relation to the strivings of the EU and methods of convergence. The thesis begins by stating the framework and starting points for the research. It moves on to establish keys of comparison concerning four elements of evidentiary assessment that are of importance to any appellate asylum procedure, and that can be compared between national procedures, on the one hand, and between international, regional and national frameworks, on the other. Four keys of comparison are established: the burden of proof, demands for evidentiary robustness, the standard of proof and requirements for the methods of evidentiary assessment. These keys of comparison are then identified in three national appellate asylum procedures, and in order to come to conclusions on the evidentiary standards of the appellate asylum procedures, relevant elements of the asylum procedures in general are presented. Further, institutional, formal and procedural matters which have an impact on the evidentiary standards in the national appellate procedures are analysed. From there, the thesis moves on to establish the relationship between national evidentiary standards and the legal systems which affect them, and gives reasons for similarities and divergences. Further, the thesis studies the impact of the national frameworks on the regional and international level. Lastly, the dissertation makes a de lege ferenda survey of the relationship between EU developments, the goal of harmonization in relation to national asylum procedures and the particular feature of evidentiary standards in national appellate asylum procedures.
Methodology: The thesis follows legal dogmatic methods. The aim is to analyse legal norms and legal constructions and to give them content and context. The study takes as its starting point an understanding of the purposes of legal research, also regarding evidence and asylum, to determine the contents of valid law through analysis and systematization. However, as evidentiary issues are traditionally only vaguely defined in normative terms, a strict traditional normative dogmatic approach is not applied. For the same reason a traditionalist and strict legal positivism is not applied. The dogmatics applied in the analysis is supported by practical analysis. The aim is not only to reach conclusions concerning the contents of legal norms and the requirements of law, but also to study the use and practical functioning of these norms, giving them a practical context.
Further, the study relies on a comparative method. A functionalist comparative method is employed, and keys of comparison are found in the evidentiary standards of three selected national appellate asylum procedures. The functioning equivalences of German, Finnish and English evidentiary standards of appellate asylum procedures are compared, and they are positioned in a European and international legal setting.
Research results: The thesis provides results regarding the use of evidence in national appellate asylum procedures. It is established that evidentiary solutions do indeed have an impact on the asylum procedure and that the results of the procedure are dependent on the evidentiary solutions made in the procedures. Variations in, amongst other things, the interpretation of the burden of proof, the applied standard of proof and the method for determining evidentiary value are analysed. It is established that national influences play an important role in the adaptation of national appellate procedures to external requirements. Further, it is established that the impact of national procedures on the international framework as well as on EU law varies between the studied countries, partly depending on the position of the Member State in legislative advances at the EU level. In this comparative study it is further established that the impact of EU requirements concerning evidentiary issues may have positive as well as negative effects with regard to the desired harmonization. It is also concluded that harmonization using means of convergence that primarily target legal frameworks may not in all instances be optimal in relation to evidentiary standards, and that more varied and pragmatic means of convergence must be introduced in order to secure harmonization also in terms of evidence. To date, legal culture and traditions seem to prevail over direct efforts at procedural harmonization.
Abstract:
The subject of the present research is historical lighthouse and maritime pilot stations in Finland. If one thinks of these now-abandoned sites as an empty stage, the dissertation aims to recreate the drama that once played out there. The research comprises three main themes. The first, the family problematic, focuses on the relationship between the family members concerned and the public service positions held, as well as the islands on which these people were stationed. The role of the male actors becomes apparent through an examination of the job descriptions of pilots and lighthouse keepers, but the role of the wives appears more problematic: running a household and the insularity of the community came with their own challenges, and the husbands were away for much of the time. In this context the children emerge as crucial. What was their role in the family of a public official? What were the effects of having to move to the mainland for school? The second theme is the station community. A socioecological examination is undertaken which defines the islands as plots, allowing the researcher to study the social behaviours of the isolated communities in question. The development of this theme is based on interpretations of interviews revealing starkly opposed views on the existing neighbourly relations. The premise is that social friction is inevitable among people living within close proximity of each other, and the study proceeds to an analysis that seeks to uncover the sociocultural strategies designed to control the risks of communal living, either by creating distance between neighbours or by enhancing their mutual ties. In connection with this, the question of why some neighbourhoods were open and cooperative while others were restrained and quarrelsome is addressed. Finally, the third main theme discusses the changes in piloting and lighthouse keeping, which became increasingly numerous towards the end of the 20th century. How did individuals react to the central management's technocratic strivings and rationalisations, such as the automation of lighthouses and the intense downsizing of the network of pilot stations? How was piloting, previously a very comprehensive line of work, splintered into specialisations, and how did the entire occupation of lighthouse keeping lose its status before completely disappearing as the new technology took over?