82 results for Data uncertainty


Relevance: 20.00%

Abstract:

This paper presents measurements from the ATLAS experiment of the forward-backward asymmetry in the reaction pp→Z/γ∗→l+l−, with l being electrons or muons, and the extraction of the effective weak mixing angle. The results are based on the full set of data collected in 2011 in pp collisions at the LHC at √s = 7 TeV, corresponding to an integrated luminosity of 4.8 fb−1. The measured asymmetry values are found to be in agreement with the corresponding Standard Model predictions. The combination of the muon and electron channels yields a value of the effective weak mixing angle of 0.2308±0.0005(stat.)±0.0006(syst.)±0.0009(PDF), where the first uncertainty corresponds to data statistics, the second to systematic effects and the third to knowledge of the parton density functions. This result agrees with the current world average from the Particle Data Group fit.
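
For context, the asymmetry is built from the numbers of forward- and backward-going events, and adding the three quoted uncertainty components in quadrature gives a total uncertainty of roughly 0.0012 on the combined value. The expressions below are an illustrative summary (with N_F and N_B denoting forward and backward event counts), not taken verbatim from the paper:

A_{\mathrm{FB}} = \frac{N_{\mathrm{F}} - N_{\mathrm{B}}}{N_{\mathrm{F}} + N_{\mathrm{B}}},
\qquad
\Delta \sin^{2}\theta_{\mathrm{eff}} \approx \sqrt{0.0005^{2} + 0.0006^{2} + 0.0009^{2}} \approx 0.0012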

Relevance: 20.00%

Abstract:

Doctoral Programme in Mathematics and Applications.

Relevance: 20.00%

Abstract:

The results of a search for charged Higgs bosons decaying to a τ lepton and a neutrino, H±→τ±ν, are presented. The analysis is based on 19.5 fb−1 of proton-proton collision data at √s = 8 TeV collected by the ATLAS experiment at the Large Hadron Collider. Charged Higgs bosons are searched for in events consistent with top-quark pair production or in associated production with a top quark. The final state is characterised by the presence of a hadronic τ decay, missing transverse momentum, b-tagged jets, a hadronically decaying W boson, and the absence of any isolated electrons or muons with high transverse momenta. The data are consistent with the expected background from Standard Model processes. A statistical analysis leads to 95% confidence-level upper limits on the product of branching ratios B(t→bH±)×B(H±→τ±ν), between 0.23% and 1.3% for charged Higgs boson masses in the range 80-160 GeV. It also leads to 95% confidence-level upper limits on the production cross section times branching ratio, σ(pp→tH±+X)×B(H±→τ±ν), between 0.76 pb and 4.5 fb, for charged Higgs boson masses ranging from 180 GeV to 1000 GeV. In the context of different scenarios of the Minimal Supersymmetric Standard Model, these results exclude nearly all values of tanβ above one for charged Higgs boson masses between 80 GeV and 160 GeV, and exclude a region of parameter space with high tanβ for H± masses between 200 GeV and 250 GeV.

Relevance: 20.00%

Abstract:

The inclusive jet cross-section is measured in proton-proton collisions at a centre-of-mass energy of 7 TeV using a data set corresponding to an integrated luminosity of 4.5 fb−1 collected with the ATLAS detector at the Large Hadron Collider in 2011. Jets are identified using the anti-kt algorithm with radius parameter values of 0.4 and 0.6. The double-differential cross-sections are presented as a function of the jet transverse momentum and the jet rapidity, covering jet transverse momenta from 100 GeV to 2 TeV. Next-to-leading-order QCD calculations corrected for non-perturbative effects and electroweak effects, as well as Monte Carlo simulations with next-to-leading-order matrix elements interfaced to parton showering, are compared to the measured cross-sections. A quantitative comparison of the measured cross-sections to the QCD calculations using several sets of parton distribution functions is performed.
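
Schematically, a double-differential jet cross-section in a (pT, |y|) bin is obtained from the corrected jet yield, the integrated luminosity and the bin widths; the formula below is a generic sketch (with N_jets^corr and L_int as illustrative symbols) and omits the unfolding and efficiency corrections applied in the actual measurement:

\frac{d^{2}\sigma}{dp_{\mathrm{T}}\, d|y|} \approx \frac{N_{\mathrm{jets}}^{\mathrm{corr}}}{\mathcal{L}_{\mathrm{int}}\, \Delta p_{\mathrm{T}}\, \Delta|y|}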

Relevance: 20.00%

Abstract:

Double-differential three-jet production cross-sections are measured in proton-proton collisions at a centre-of-mass energy of √s = 7 TeV using the ATLAS detector at the Large Hadron Collider. The measurements are presented as a function of the three-jet mass (mjjj), in bins of the sum of the absolute rapidity separations between the three leading jets (|Y∗|). Invariant masses extending up to 5 TeV are reached for 8 < |Y∗| < 10. These measurements use a sample of data recorded using the ATLAS detector in 2011, which corresponds to an integrated luminosity of 4.51 fb−1. Jets are identified using the anti-kt algorithm with two different jet radius parameters, R=0.4 and R=0.6. The dominant uncertainty in these measurements comes from the jet energy scale. Next-to-leading-order QCD calculations corrected to account for non-perturbative effects are compared to the measurements. Good agreement is found between the data and the theoretical predictions based on most of the available sets of parton distribution functions, over the full kinematic range, covering almost seven orders of magnitude in the measured cross-section values.

Relevance: 20.00%

Abstract:

The performance of the ATLAS muon trigger system has been evaluated with proton-proton collision data collected in 2012 at the Large Hadron Collider at a centre-of-mass energy of 8 TeV. The performance was primarily evaluated using events containing a pair of muons from the decay of Z bosons. The efficiency is measured for the single-muon trigger for a kinematic region of the transverse momentum pT between 25 and 100 GeV, with a statistical uncertainty of less than 0.01% and a systematic uncertainty of 0.6%. The performance is also compared in detail to the predictions from simulation. The efficiency was measured over a wide pT range (a few GeV to several hundred GeV) by using muons from J/ψ mesons, W bosons, and top and antitop quarks. It showed highly uniform and stable performance.
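
To give a sense of scale for the quoted statistical precision, an efficiency measured as a pass fraction of N probe muons carries a binomial uncertainty; the numbers below are hypothetical and serve only as an order-of-magnitude check, not as values from the paper:

\sigma_{\varepsilon} = \sqrt{\frac{\varepsilon (1 - \varepsilon)}{N}},
\qquad
\varepsilon = 0.95,\ N = 10^{7} \ \Rightarrow\ \sigma_{\varepsilon} \approx 7 \times 10^{-5} = 0.007\%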

Relevance: 20.00%

Abstract:

This paper describes the trigger and offline reconstruction, identification and energy calibration algorithms for hadronic decays of tau leptons employed for the data collected from pp collisions in 2012 with the ATLAS detector at the LHC center-of-mass energy √s = 8 TeV. The performance of these algorithms is measured in most cases with Z decays to tau leptons using the full 2012 dataset, corresponding to an integrated luminosity of 20.3 fb−1. An uncertainty on the offline reconstructed tau energy scale of 2% to 4%, depending on transverse energy and pseudorapidity, is achieved using two independent methods. The offline tau identification efficiency is measured with a precision of 2.5% for hadronically decaying tau leptons with one associated track, and of 4% for the case of three associated tracks, inclusive in pseudorapidity and for a visible transverse energy greater than 20 GeV. For hadronic tau lepton decays selected by offline algorithms, the tau trigger identification efficiency is measured with a precision of 2% to 8%, depending on the transverse energy. The performance of the tau algorithms, both offline and at the trigger level, is found to be stable with respect to the number of concurrent proton-proton interactions and has supported a variety of physics results using hadronically decaying tau leptons at ATLAS.

Relevance: 20.00%

Abstract:

Project management involves one-time endeavors that demand getting it right the first time. Project scheduling, although one of the most extensively modeled stages of the project management process, still faces a wide gap between theory and practice. Demanding computational models, and the consequent call for simplification, divert the implementation of such models in project management tools from the actual day-to-day project management process. Special attention is given to the robustness of the generated project schedules in the face of omnipresent uncertainty. An "easy" way out is to add more or less cleverly calculated time buffers, which always increase the project duration and, correspondingly, its cost. A better approach to dealing with uncertainty seems to be to exploit the slack that may be present in a given project schedule, a fortiori when a non-optimal schedule is used. Combining this approach with recent advances in modeling resource allocation and scheduling techniques that cope with increasing resource flexibility, as expressed in Flexible Resource-Constrained Project Scheduling Problem (FRCPSP) formulations, should be a promising line of research toward more adequate project management tools. In practice, this approach has often been used by project managers in an ad hoc way.
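
As a minimal illustration of the slack referred to above, the sketch below computes the total float of each activity in a toy project network with a classical critical-path forward/backward pass; the activities, durations and precedences are hypothetical, and resource constraints are ignored.

# Critical-path method sketch: total float (slack) per activity.
# Toy network; durations and precedences are hypothetical.
from collections import defaultdict

durations = {"A": 3, "B": 2, "C": 4, "D": 2, "E": 3}
predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"], "E": ["D"]}

# Forward pass: earliest start/finish (dict order is assumed topological here).
es, ef = {}, {}
for act in durations:
    es[act] = max((ef[p] for p in predecessors[act]), default=0)
    ef[act] = es[act] + durations[act]

project_end = max(ef.values())

# Backward pass: latest finish/start.
successors = defaultdict(list)
for act, preds in predecessors.items():
    for p in preds:
        successors[p].append(act)

lf, ls = {}, {}
for act in reversed(list(durations)):
    lf[act] = min((ls[s] for s in successors[act]), default=project_end)
    ls[act] = lf[act] - durations[act]

# Total float: how much an activity can slip without delaying the project.
for act in durations:
    print(act, "slack =", ls[act] - es[act])

Activities with zero total float form the critical path; the positive float of the others is exactly the schedule slack that a robustness-oriented approach could exploit instead of adding buffers.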

Relevance: 20.00%

Abstract:

Extreme value models are widely used in different areas. The Birnbaum–Saunders distribution is receiving considerable attention due to its physical arguments and its good properties. We propose a methodology based on extreme value Birnbaum–Saunders regression models, which includes model formulation, estimation, inference and checking. We further conduct a simulation study to evaluate its performance. A statistical analysis of real-world extreme value environmental data using the methodology is provided as an illustration.
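
As a rough sketch of the kind of modelling involved, the snippet below extracts annual block maxima from a synthetic environmental series and fits a Birnbaum–Saunders distribution to them using SciPy (where it is implemented as fatiguelife); the data, the block size and the absence of regression covariates are simplifications relative to the full methodology proposed in the paper.

# Fit a Birnbaum-Saunders distribution to block maxima (illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic "daily" environmental measurements over 20 years (hypothetical data).
daily = stats.fatiguelife.rvs(c=0.8, loc=0.0, scale=10.0,
                              size=20 * 365, random_state=rng)

# Annual block maxima.
maxima = daily.reshape(20, 365).max(axis=1)

# Fit the Birnbaum-Saunders (fatigue-life) distribution to the maxima.
shape, loc, scale = stats.fatiguelife.fit(maxima, floc=0.0)
print(f"shape={shape:.3f}, scale={scale:.3f}")

# Example summary: estimated 0.99 quantile of the annual maximum.
print("99th percentile:", stats.fatiguelife.ppf(0.99, shape, loc=loc, scale=scale))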

Relevance: 20.00%

Abstract:

In longitudinal studies of disease, patients may experience several events over a follow-up period. In these studies, the sequentially ordered events are often of interest and lead to problems that have received much attention recently. Issues of interest include the estimation of bivariate survival, marginal distributions and the conditional distribution of gap times. In this work we consider the estimation of the survival function conditional on a previous event. Different nonparametric approaches are considered for estimating these quantities, all based on the Kaplan-Meier estimator of the survival function. We explore the finite sample behavior of the estimators through simulations. The different methods proposed in this article are applied to a data set from a German Breast Cancer Study. The methods are used to obtain predictors for the conditional survival probabilities as well as to study the influence of recurrence on overall survival.
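
To make the conditional estimation concrete, the sketch below implements a plain Kaplan-Meier estimator in NumPy and applies it to the subsample of subjects who already experienced a first event, giving a survival curve for the second gap time conditional on the first; the data are synthetic and the estimator ignores the dependent-censoring corrections discussed in the literature.

# Kaplan-Meier estimator and a conditional-survival example (illustrative only).
import numpy as np

def kaplan_meier(time, event):
    """Return event times and the KM survival estimate at those times."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    surv, s = [], 1.0
    uniq = np.unique(time[event == 1])
    for t in uniq:
        at_risk = np.sum(time >= t)
        deaths = np.sum((time == t) & (event == 1))
        s *= 1.0 - deaths / at_risk
        surv.append(s)
    return uniq, np.array(surv)

rng = np.random.default_rng(0)
n = 500
gap1 = rng.exponential(12.0, n)        # time to first event (synthetic, in months)
gap2 = rng.exponential(18.0, n)        # gap time to second event
censor = rng.uniform(0.0, 60.0, n)     # administrative censoring time

# Keep subjects whose first event was observed before censoring.
had_first = gap1 <= censor
t2 = np.minimum(gap2[had_first], (censor - gap1)[had_first])
d2 = (gap2[had_first] <= (censor - gap1)[had_first]).astype(int)

times, surv = kaplan_meier(t2, d2)
print("Conditional S(12 | first event) approx.",
      surv[times <= 12.0][-1] if np.any(times <= 12.0) else 1.0)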

Relevance: 20.00%

Abstract:

Nitrogen dioxide is a primary pollutant, considered in the estimation of the air quality index, whose excessive presence may cause significant environmental and health problems. In the current work, we suggest characterizing the evolution of NO2 levels using geostatistical approaches that deal with both the space and time coordinates. To develop our proposal, a first exploratory analysis was carried out on daily values of the target variable, measured in Portugal from 2004 to 2012, which led to the identification of three influential covariates (type of site, environment and month of measurement). In a second step, appropriate geostatistical tools were applied to model the trend and the space-time variability, enabling the use of kriging techniques for prediction without requiring data from a dense monitoring network. This methodology has valuable applications, as it can provide an accurate assessment of nitrogen dioxide concentrations at sites where either data have been lost or there is no monitoring station nearby.
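
As a bare-bones illustration of the kriging step, the snippet below performs ordinary kriging at a single unsampled location using an exponential semivariogram with hypothetical parameters and made-up station data; the real methodology also models a space-time trend and covariates, which are omitted here.

# Ordinary kriging at one target location (illustrative sketch).
import numpy as np

def exp_variogram(h, nugget=0.1, sill=1.0, rng_param=15.0):
    """Exponential semivariogram with hypothetical parameters; gamma(0) = 0."""
    gamma = nugget + (sill - nugget) * (1.0 - np.exp(-h / rng_param))
    return np.where(h > 0, gamma, 0.0)

# Hypothetical monitoring-station coordinates (km) and NO2 values (ug/m3).
coords = np.array([[0.0, 0.0], [10.0, 2.0], [4.0, 8.0], [12.0, 12.0]])
values = np.array([28.0, 35.0, 31.0, 22.0])
target = np.array([6.0, 5.0])

n = len(values)
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
d0 = np.linalg.norm(coords - target, axis=-1)

# Ordinary kriging system: semivariogram matrix plus Lagrange multiplier row/column.
A = np.ones((n + 1, n + 1))
A[:n, :n] = exp_variogram(d)
A[n, n] = 0.0
b = np.append(exp_variogram(d0), 1.0)

weights = np.linalg.solve(A, b)[:n]
print("Kriging prediction:", weights @ values)

Because the kriging weights are constrained to sum to one, the predictor remains unbiased for an unknown constant mean, which is the defining feature of ordinary kriging.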

Relevance: 20.00%

Abstract:

Biofilm research is growing more diverse and dependent on high-throughput technologies, and the large-scale production of results aggravates data substantiation. In particular, it is often the case that experimental protocols are adapted to meet the needs of a particular laboratory, and no statistical validation of the modified method is provided. This paper discusses the impact of intra-laboratory adaptation and non-rigorous documentation of experimental protocols on biofilm data interchange and validation. The case study is a non-standard, but widely used, workflow for Pseudomonas aeruginosa biofilm development, considering three analysis assays: the crystal violet (CV) assay for biomass quantification, the XTT assay for respiratory activity assessment, and the colony forming units (CFU) assay for determination of cell viability. The ruggedness of the protocol was assessed by introducing small changes in the biofilm growth conditions, which simulate minor protocol adaptations and non-rigorous protocol documentation. Results show that even minor variations in the biofilm growth conditions may affect the results considerably, and that the biofilm analysis assays lack repeatability. Intra-laboratory validation of non-standard protocols is found to be critical to ensure data quality and to enable the comparison of results within and among laboratories.
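
A ruggedness check of the sort described can be summarized statistically in a few lines; the sketch below compares crystal-violet readings from a baseline protocol and two minor growth-condition variants with a one-way ANOVA and reports the coefficient of variation per condition. All numbers and condition names are made up for illustration and are not the study's data.

# Ruggedness / repeatability summary for assay readings (illustrative data).
import numpy as np
from scipy import stats

# Hypothetical CV-assay absorbance readings (OD570) per growth condition.
baseline = np.array([1.02, 0.98, 1.10, 1.05, 0.95, 1.08])
variant_shaking = np.array([0.83, 0.91, 0.78, 0.88, 0.95, 0.80])  # e.g. different agitation
variant_medium = np.array([1.20, 1.35, 1.12, 1.28, 1.40, 1.18])   # e.g. different medium batch

groups = {"baseline": baseline, "shaking": variant_shaking, "medium": variant_medium}

# Within-condition repeatability: coefficient of variation.
for name, g in groups.items():
    print(f"{name}: mean={g.mean():.2f}, CV={100 * g.std(ddof=1) / g.mean():.1f}%")

# Between-condition effect of the minor protocol changes.
f_stat, p_value = stats.f_oneway(*groups.values())
print(f"one-way ANOVA: F={f_stat:.2f}, p={p_value:.4f}")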

Relevance: 20.00%

Abstract:

For any vacuum initial data set, we define a local, non-negative scalar quantity which vanishes at every point of the data hypersurface if and only if the data are Kerr initial data. Our scalar quantity depends only on the quantities used to construct the vacuum initial data set, namely the Riemannian metric defined on the initial data hypersurface and a symmetric tensor which plays the role of the second fundamental form of the embedded initial data hypersurface. The dependency is algorithmic in the sense that, given the initial data, one can compute the scalar quantity by algebraic and differential manipulations, which makes it suitable for implementation in a numerical code. The scalar could also be useful in studies of the non-linear stability of the Kerr solution, because it serves to measure the deviation of a vacuum initial data set from Kerr initial data in a local and algorithmic way.

Relevance: 20.00%

Abstract:

This article revisits Michel Chevalier's work and his discussions of tariffs. Chevalier shifted from Saint-Simonism to economic liberalism during his life in the 19th century. His influence was soon felt in the political world and in economic debates, mainly because of his discussion of tariffs as instruments of efficient transport policies. This work discusses Chevalier's thoughts on tariffs by revisiting his masterpiece, Le Cours d'Économie Politique. Data Envelopment Analysis (DEA) was conducted to test Chevalier's hypothesis on the inefficiency of French tariffs. This work shows that Chevalier's claims about French tariffs are not validated by DEA.
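
For readers unfamiliar with the method, the sketch below computes input-oriented CCR efficiency scores for a handful of hypothetical decision-making units with scipy.optimize.linprog; it only illustrates the envelopment form of DEA and makes no claim about the inputs and outputs actually used in the article.

# Input-oriented CCR DEA efficiency scores via linear programming (illustrative).
import numpy as np
from scipy.optimize import linprog

# Hypothetical DMUs: rows = units, columns = inputs / outputs.
inputs = np.array([[20.0, 5.0], [30.0, 8.0], [25.0, 6.0], [40.0, 9.0]])
outputs = np.array([[100.0], [140.0], [110.0], [150.0]])
n_dmu, n_in = inputs.shape
n_out = outputs.shape[1]

for o in range(n_dmu):
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
    c = np.zeros(1 + n_dmu)
    c[0] = 1.0

    # Input constraints: sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-inputs[o].reshape(-1, 1), inputs.T])
    b_in = np.zeros(n_in)

    # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((n_out, 1)), -outputs.T])
    b_out = -outputs[o]

    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (1 + n_dmu), method="highs")
    print(f"DMU {o}: efficiency = {res.x[0]:.3f}")

A unit with a score of 1 lies on the efficient frontier; a score below 1 indicates the proportion to which all of its inputs could be radially contracted while keeping its outputs attainable.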

Relevance: 20.00%

Abstract:

Master's dissertation in Industrial Engineering.