62 results for Quantum Chromodynamics, Helicity Rates, One-Loop Corrections, Bremsstrahlung Contributions, Heavy Quarks, Standard Model


Relevance:

100.00%

Publisher:

Abstract:

We present the differential rates and branching ratios of the radiative decays τ→lννγ, with l = e or μ, and μ→eννγ in the Standard Model at next-to-leading order. Radiative corrections are computed taking into account the full dependence on the mass m_l of the final charged leptons, which is necessary for a correct determination of the branching ratios. Only partial agreement is found with previous calculations performed in the m_l → 0 limit. Our results agree with the measurements of the branching ratios B(μ→eννγ) and B(τ→μννγ) for a minimum photon energy of 10 MeV in the μ and τ rest frames, respectively. BaBar's recent precise measurement of the branching ratio B(τ→eννγ), for the same photon energy threshold, differs from our prediction by 3.5 standard deviations.

Relevance:

100.00%

Publisher:

Abstract:

Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power, incentives that work against the formation of a global agreement.

This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be a pure time cost from delaying agreement or a cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time costs and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions.

I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality, and the quality of the good is known only to the seller. Indeed, without the possibility of making repeated offers, it is too risky for the buyer to offer prices that allow for trade in high-quality goods. When repeated offers are allowed, however, both types of goods trade with probability one in equilibrium. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, which reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions.

Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information; these findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information.

In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects for cooperation are shown to depend crucially on i) the degree to which players can renegotiate and gradually build up agreements and ii) the absence of a certain type of externality that can loosely be described as an incentive to free-ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.

Relevance:

100.00%

Publisher:

Abstract:

A search for diphoton events with large missing transverse energy is presented. The data were collected with the ATLAS detector in proton-proton collisions at √s = 7 TeV at the CERN Large Hadron Collider and correspond to an integrated luminosity of 3.1 pb⁻¹. No excess of such events is observed above the standard model background prediction. In the context of a specific model with one universal extra dimension with compactification radius R and gravity-induced decays, values of 1/R < 729 GeV are excluded at 95% C.L., providing the most sensitive limit on this model to date.

Relevance:

100.00%

Publisher:

Abstract:

Antimicrobial resistance is an emerging public health concern, and food-producing animals are known to be a potential source for the transmission of resistant bacteria to humans. As European Union legislation requires, on the one hand, a ban on conventional cages for housing laying hens and, on the other, a high food safety standard for eggs, further investigation of the occurrence of antimicrobial resistance in alternative housing types is required. In this study, we determined antimicrobial resistance in indicator bacteria from 396 cloacal swabs from 99 Swiss laying-hen farms across four alternative housing types in a cross-sectional study. On each farm, four hens were sampled, and exposure to potential risk factors was identified with a questionnaire. The minimal inhibitory concentration was determined using broth microdilution in Escherichia coli (n=371) for 18 antimicrobials and in Enterococcus faecalis (n=138) and Enterococcus faecium (n=153) for 16 antimicrobials. All antimicrobial classes recommended by the European Food Safety Authority for E. coli and enterococci were included in the resistance profile. Sixty per cent of the E. coli isolates were susceptible to all of the considered antimicrobials and 30% were resistant to at least two antimicrobials. In E. faecalis, 33% of the strains were susceptible to all tested antimicrobials and 40% were resistant to two or more antimicrobials, whereas in E. faecium these figures were 14% and 39%, respectively. Risk factor analyses were carried out for bacterial species and antimicrobials with a prevalence of resistance between 15% and 85%. In these analyses, none of the considered housing and management factors showed a consistent association with the prevalence of resistance for more than two combinations of bacterium and antimicrobial. We therefore conclude that the impact of the considered housing and management practices on egg-producing farms on resistance in laying hens is low.

Relevance:

100.00%

Publisher:

Abstract:

RATIONALE AND OBJECTIVES: A feasibility study on measuring kidney perfusion with a contrast-free magnetic resonance (MR) imaging technique is presented. MATERIALS AND METHODS: A flow-sensitive alternating inversion recovery (FAIR) prepared true fast imaging with steady-state precession (TrueFISP) arterial spin labeling sequence was used on a 3.0-T MR scanner. The basis for quantification is a two-compartment exchange model proposed by Parkes that corrects for several assumptions made in single-compartment standard models. RESULTS: Eleven healthy volunteers (mean age, 42.3 years; range, 24-55 years) were examined. The calculated mean renal blood flow values for the exchange model (109 ± 5 [medulla] and 245 ± 11 [cortex] ml/min/100 g) are in good agreement with the literature. Most importantly, the two-compartment exchange model exhibits a stabilizing effect on the evaluation of perfusion values when the finite permeability of the vessel wall and the venous outflow (fast solution) are considered: the values for the one-compartment standard model were 93 ± 18 (medulla) and 208 ± 37 (cortex) ml/min/100 g. CONCLUSION: This improvement will increase the accuracy of contrast-free imaging of kidney perfusion in the treatment of renovascular disease.
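
To make the quantification step concrete, below is a minimal sketch of a simplified single-compartment pulsed-ASL (FAIR) perfusion formula, i.e. the baseline that the two-compartment Parkes model refines. All parameter and signal values are illustrative placeholders, not the study's acquisition settings.

```python
import numpy as np

# Minimal single-compartment FAIR ASL quantification sketch.
# All values are illustrative placeholders, not the study's protocol.

LAMBDA = 0.9      # blood-tissue water partition coefficient [ml/g]
ALPHA = 0.95      # inversion (labeling) efficiency
T1_BLOOD = 1.65   # longitudinal relaxation time of blood at 3 T [s]
TI = 1.2          # inversion time [s]

def perfusion(delta_m, m0):
    """Perfusion f in ml/min/100 g from the label-control difference
    signal delta_m and the equilibrium magnetization m0."""
    f_per_s = (LAMBDA * delta_m) / (2 * ALPHA * m0 * TI * np.exp(-TI / T1_BLOOD))
    return f_per_s * 60 * 100  # convert ml/(g*s) to ml/min/100 g

# Example with made-up signal values:
print(f"{perfusion(delta_m=0.05, m0=1.0):.0f} ml/min/100 g")
```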

Relevance:

100.00%

Publisher:

Abstract:

In this paper, an Insulin Infusion Advisory System (IIAS) for Type 1 diabetes patients who use insulin pumps for Continuous Subcutaneous Insulin Infusion (CSII) is presented. The purpose of the system is to estimate the appropriate insulin infusion rates. The system is based on a Non-Linear Model Predictive Controller (NMPC) which uses a hybrid model. The model comprises a Compartmental Model (CM), which simulates the absorption of glucose into the blood due to meal intake, and a Neural Network (NN), which simulates the glucose-insulin kinetics. The NN is a Recurrent NN (RNN) trained with the Real Time Recurrent Learning (RTRL) algorithm. The output of the model consists of short-term glucose predictions, which provide input to the NMPC so that the latter can estimate the optimum insulin infusion rates. For the development and evaluation of the IIAS, data generated from a Mathematical Model (MM) of a Type 1 diabetes patient have been used. The proposed control strategy is evaluated under multiple meal disturbances, various noise levels and additional time delays. The results indicate that the implemented IIAS is capable of handling multiple meals corresponding to realistic meal profiles, large noise levels and time delays.
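
As a rough illustration of the control scheme described above (a recurrent predictor feeding a model predictive controller), the sketch below rolls a toy RNN forward over a short horizon for each candidate infusion rate and picks the rate minimizing a quadratic tracking cost. The network weights, horizon, target and cost weights are invented stand-ins, not the paper's trained RTRL system.

```python
import numpy as np

# Toy sketch of one NMPC step: simulate an RNN glucose model over a short
# horizon for each candidate infusion rate and pick the cheapest rate.
# Weights are random stand-ins for a trained RTRL network; horizon, target
# and cost settings are assumptions.

rng = np.random.default_rng(0)
N_HIDDEN, HORIZON, TARGET = 8, 6, 5.5  # hidden units, steps, mmol/l

Wx = rng.normal(scale=0.3, size=(N_HIDDEN, 3))   # inputs: [glucose, insulin, meal]
Wh = rng.normal(scale=0.3, size=(N_HIDDEN, N_HIDDEN))
Wo = rng.normal(scale=0.3, size=(1, N_HIDDEN))

def rnn_step(h, glucose, insulin, meal):
    """One Elman-style step: returns new hidden state and predicted glucose."""
    h_new = np.tanh(Wx @ np.array([glucose, insulin, meal]) + Wh @ h)
    return h_new, (Wo @ h_new).item()

def nmpc_action(h, glucose, meal_forecast, candidates=np.linspace(0, 2.0, 21)):
    """Pick the constant infusion rate minimizing the predicted tracking cost."""
    best_u, best_cost = 0.0, np.inf
    for u in candidates:
        hh, g, cost = h.copy(), glucose, 0.0
        for k in range(HORIZON):
            hh, g = rnn_step(hh, g, u, meal_forecast[k])
            cost += (g - TARGET) ** 2 + 0.1 * u ** 2   # tracking + effort penalty
        if cost < best_cost:
            best_u, best_cost = u, cost
    return best_u

u = nmpc_action(np.zeros(N_HIDDEN), glucose=9.0, meal_forecast=[0.5] + [0.0] * 5)
print(f"suggested infusion rate: {u:.2f} U/h")
```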

Relevance:

100.00%

Publisher:

Abstract:

In this article, we perform an extensive study of flavor observables in a two-Higgs-doublet model with generic Yukawa structure (of type III). This model is interesting not only because it is the decoupling limit of the minimal supersymmetric standard model but also because of its rich flavor phenomenology, which allows for sizable effects not only in flavor-changing neutral-current (FCNC) processes but also in tauonic B decays. We examine the possible effects in flavor physics and constrain the model both from tree-level processes and from loop observables. The free parameters of the model are the heavy Higgs mass, tanβ (the ratio of the vacuum expectation values) and the "nonholomorphic" Yukawa couplings ϵ^f_ij (f = u, d, ℓ). In our analysis we constrain the elements ϵ^f_ij in various ways: In a first step we give order-of-magnitude constraints on ϵ^f_ij from 't Hooft's naturalness criterion, finding that all ϵ^f_ij must be rather small unless the third generation is involved. In a second step, we constrain the Yukawa structure of the type-III two-Higgs-doublet model from tree-level FCNC processes (Bs,d→μ+μ−, KL→μ+μ−, D̄0→μ+μ−, ΔF=2 processes, τ−→μ−μ+μ−, τ−→e−μ+μ− and μ−→e−e+e−) and observe that all flavor off-diagonal elements of these couplings, except ϵ^u_32,31 and ϵ^u_23,13, must be very small in order to satisfy the current experimental bounds. In a third step, we consider Higgs-mediated loop contributions to FCNC processes [b→s(d)γ, Bs,d mixing, K–K̄ mixing and μ→eγ], finding that ϵ^u_13 and ϵ^u_23 must also be very small, while the bounds on ϵ^u_31 and ϵ^u_32 are especially weak. Furthermore, considering the constraints from electric dipole moments, we obtain constraints on some parameters ϵ^{u,ℓ}_ij. Taking into account the constraints from FCNC processes, we study the size of possible effects in the tauonic B decays (B→τν, B→Dτν and B→D∗τν) as well as in D(s)→τν, D(s)→μν, K(π)→eν, K(π)→μν and τ→K(π)ν, which are all sensitive to tree-level charged-Higgs exchange. Interestingly, the unconstrained ϵ^u_32,31 are just the elements which directly enter the branching ratios for B→τν, B→Dτν and B→D∗τν. We show that they can explain the deviations from the SM predictions in these processes without fine-tuning. Furthermore, B→τν, B→Dτν and B→D∗τν can even be explained simultaneously. Finally, we give upper limits on the branching ratios of the lepton-flavor-violating neutral B meson decays (Bs,d→μe, Bs,d→τe and Bs,d→τμ) and correlate the radiative lepton decays (τ→μγ, τ→eγ and μ→eγ) to the corresponding neutral-current lepton decays (τ−→μ−μ+μ−, τ−→e−μ+μ− and μ−→e−e+e−). A detailed Appendix contains all relevant information for the considered processes for general scalar-fermion-fermion couplings.
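
For orientation, the generic (type-III) Yukawa structure underlying such an analysis can be written schematically as below, with both Higgs doublets coupling to all fermion types and the ϵ^f_ij parametrizing the nonholomorphic couplings. This is a standard schematic form in MSSM-like notation, not an equation quoted from the article.

```latex
% Schematic type-III Yukawa sector (MSSM-like notation); not quoted from the article.
\mathcal{L}_Y = -\bar{Q}_L \left( Y^d H_1 + \epsilon^d H_2 \right) d_R
                -\bar{Q}_L \left( Y^u \tilde{H}_2 + \epsilon^u \tilde{H}_1 \right) u_R
                -\bar{L}_L \left( Y^\ell H_1 + \epsilon^\ell H_2 \right) \ell_R
                + \mathrm{h.c.},
\qquad \tilde{H}_k = i\sigma_2 H_k^* .
```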

Relevance:

100.00%

Publisher:

Abstract:

This Report summarizes the results of the activities in 2012 and the first half of 2013 of the LHC Higgs Cross Section Working Group. The main goal of the working group was to present the state of the art of Higgs physics at the LHC, integrating all new results that have appeared in the last few years. This report follows the first working group report, Handbook of LHC Higgs Cross Sections: 1. Inclusive Observables (CERN-2011-002), and the second working group report, Handbook of LHC Higgs Cross Sections: 2. Differential Distributions (CERN-2012-002). After the discovery of a Higgs boson at the LHC in mid-2012, this report focuses on refined predictions of Standard Model (SM) Higgs phenomenology around the experimentally observed mass of 125-126 GeV, refined predictions for heavy SM-like Higgs bosons, predictions in the Minimal Supersymmetric Standard Model, and first steps to go beyond these models. The other main focus is on the extraction of the characteristics and properties of the newly discovered particle, such as its couplings to SM particles, spin and CP quantum numbers, etc.

Relevance:

100.00%

Publisher:

Abstract:

Stemmatology, or the reconstruction of the transmission history of texts, is a field that stands particularly to gain from digital methods. Many scholars already take stemmatic approaches that rely heavily on computational analysis of the collated text (e.g. Robinson and O'Hara 1996; Salemans 2000; Heikkilä 2005; Windram et al. 2008, among many others). Although there is great value in computationally assisted stemmatology, providing as it does a reproducible result and access to relevant methodological progress in related fields such as evolutionary biology, computational stemmatics is not without its critics. The current state of the art effectively forces scholars either to apply a preconceived judgment of the significance of textual differences (the Lachmannian or neo-Lachmannian approach, and the weighted phylogenetic approach) or to make no judgment at all (the unweighted phylogenetic approach). Some basis for judging the significance of variation is sorely needed for medieval text criticism in particular. By this, we mean that there is a need for an empirical statistical profile of the text-genealogical significance of the different sorts of variation in different sorts of medieval texts. The rules that apply to copies of Greek and Latin classics may not apply to copies of medieval Dutch story collections; the practices of copying authoritative texts such as the Bible will most likely have been different from the practices of copying the Lives of local saints and other commonly adapted texts. It is nevertheless imperative that we have a consistent, flexible, and analytically tractable model for capturing these phenomena of transmission. In this article, we present a computational model that captures most of the phenomena of text variation, and a method for analyzing one or more stemma hypotheses against the variation model. We apply this method to three 'artificial traditions' (i.e. texts copied under laboratory conditions by scholars to study the properties of text variation) and four genuine medieval traditions whose transmission history is known or deduced to varying degrees. Although our findings are necessarily limited by the small number of texts at our disposal, we demonstrate here some of the wide variety of calculations that can be made using our model. Certain of our results call sharply into question the utility of excluding 'trivial' variation such as orthographic and spelling changes from stemmatic analysis.
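
In the spirit of the method described above, one simple way to score a stemma hypothesis against observed variation is to count, per variant location, the minimum number of independent origins the readings require on the proposed tree (Fitch parsimony): a variant explicable by a single origin is genealogically consistent with the stemma. The sketch below is an illustrative toy, not the article's actual model; the tree shape and readings are invented.

```python
# Toy consistency check of a stemma hypothesis: count the minimum number of
# changes a variant requires on a proposed tree (Fitch parsimony). The tree
# and readings below are invented for illustration.

def fitch(tree, readings):
    """Return the parsimony cost of `readings` (leaf -> reading) on `tree`,
    given as nested tuples of leaf names, e.g. (("A", "B"), ("C", "D"))."""
    def walk(node):
        if isinstance(node, str):                 # leaf witness
            return {readings[node]}, 0
        (s1, c1), (s2, c2) = walk(node[0]), walk(node[1])
        if s1 & s2:
            return s1 & s2, c1 + c2               # subtrees agree: no change
        return s1 | s2, c1 + c2 + 1               # disagreement: one change

    return walk(tree)[1]

stemma = (("A", "B"), ("C", "D"))                 # hypothesized copying tree
variant = {"A": "seide", "B": "seide", "C": "sprach", "D": "sprach"}
print(fitch(stemma, variant))   # 1 -> single origin, consistent with the stemma
```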

Relevance:

100.00%

Publisher:

Abstract:

Both historical and idealized climate model experiments are performed with a variety of Earth system models of intermediate complexity (EMICs) as part of a community contribution to the Intergovernmental Panel on Climate Change Fifth Assessment Report. Historical simulations start at 850 CE and continue through to 2005. The standard simulations include changes in forcing from solar luminosity, Earth's orbital configuration, CO2, additional greenhouse gases, land use, and sulphate and volcanic aerosols. In spite of very different modelled pre-industrial global surface air temperatures, overall 20th century trends in surface air temperature and carbon uptake are reasonably well simulated when compared to observed trends. Land carbon fluxes show much more variation between models than ocean carbon fluxes, and recent land fluxes appear to be slightly underestimated. It is possible that recent modelled climate trends or climate–carbon feedbacks are overestimated, resulting in too much land carbon loss, or that carbon uptake due to CO2 and/or nitrogen fertilization is underestimated. Several one-thousand-year-long idealized 2×CO2 and 4×CO2 experiments are used to quantify standard model characteristics, including transient and equilibrium climate sensitivities and climate–carbon feedbacks. The values from EMICs generally fall within the range given by general circulation models. Seven additional historical simulations, each including a single specified forcing, are used to assess the contributions of different climate forcings to the overall climate and carbon cycle response. The surface air temperature response is the linear sum of the responses to the individual forcings, while the carbon cycle response shows a non-linear interaction between land-use change and CO2 forcings for some models. Finally, the preindustrial portions of the last millennium simulations are used to assess historical model climate–carbon feedbacks. Given the specified forcing, there is a tendency for the EMICs to underestimate the drop in surface air temperature and CO2 between the Medieval Climate Anomaly and the Little Ice Age estimated from palaeoclimate reconstructions. This in turn could be a result of unforced variability within the climate system, uncertainty in the reconstructions of temperature and CO2, errors in the reconstructions of the forcing used to drive the models, or the incomplete representation of certain processes within the models. Given the forcing datasets used in this study, the models calculate significant land-use emissions over the pre-industrial period. This implies that land-use emissions might need to be taken into account when making estimates of climate–carbon feedbacks from palaeoclimate reconstructions.

Relevance:

100.00%

Publisher:

Abstract:

Measurements are presented of production properties and couplings of the recently discovered Higgs boson using the decays into boson pairs, H → γγ, H → ZZ* → 4 leptons and H → WW → 2 leptons + 2 neutrinos. The results are based on the complete pp collision data sample recorded by the ATLAS experiment at the CERN Large Hadron Collider at centre-of-mass energies of 7 TeV and 8 TeV, corresponding to an integrated luminosity of about 25 fb⁻¹. Evidence for Higgs boson production through vector-boson fusion is reported. Results of combined fits probing Higgs boson couplings to fermions and bosons, as well as anomalous contributions to loop-induced production and decay modes, are presented. All measurements are consistent with expectations for the Standard Model Higgs boson.

Relevance:

100.00%

Publisher:

Abstract:

This Letter describes a model-independent search for the production of new resonances in photon + jet events using 20 fb⁻¹ of proton-proton LHC data recorded with the ATLAS detector at a centre-of-mass energy of √s = 8 TeV. The photon + jet mass distribution is compared to a background model obtained from a fit to the data; no significant deviation from the background-only hypothesis is found. Limits are set at 95% credibility level on generic Gaussian-shaped signals and on two benchmark phenomena beyond the Standard Model: non-thermal quantum black holes and excited quarks. Non-thermal quantum black holes are excluded below masses of 4.6 TeV and excited quarks are excluded below masses of 3.5 TeV.
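
As a rough illustration of the search strategy, the sketch below fits a smoothly falling parametric form of the kind commonly used in resonance searches to a toy photon + jet mass spectrum, then inspects a localized mass window for an excess over the fitted background. The functional form, binning and all numbers are illustrative assumptions, not the analysis's actual background model.

```python
import numpy as np
from scipy.optimize import curve_fit

# Minimal bump-hunt sketch: fit a smoothly falling background shape to a
# mass spectrum and compare data to background in a signal window.
# All numbers are invented for the demo.

SQRT_S = 8000.0  # GeV

def background(m, p0, p1, p2):
    """Smoothly falling ansatz f(x) = p0 * (1-x)^p1 * x^p2, x = m / sqrt(s)."""
    x = m / SQRT_S
    return p0 * (1 - x) ** p1 * x ** p2

# Pseudo-data: steeply falling spectrum (counts per bin).
m_bins = np.linspace(500, 3500, 31)
truth = background(m_bins, 1e-3, 10.0, -4.5)
rng = np.random.default_rng(1)
counts = rng.poisson(truth)

popt, _ = curve_fit(background, m_bins, counts, p0=[1e-3, 10.0, -4.5])
residual = counts - background(m_bins, *popt)
window = (m_bins > 1900) & (m_bins < 2100)       # hypothetical signal region
print(f"excess in window: {residual[window].sum():.1f} events")
```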

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: To explore the cost-effectiveness of drug-eluting balloon (DEB) angioplasty, which has been shown to significantly lower rates of target lesion revascularization (TLR) compared with standard balloon angioplasty (BA), for the treatment of femoropopliteal arterial lesions. METHODS: A simplified decision-analytic model based on TLR rates reported in the literature was applied to baseline and follow-up costs associated with in-hospital patient treatment during 1 year of follow-up. Costs were expressed in Swiss francs (sFr) and calculated per 100 patients treated. Budgets were analyzed in the context of current SwissDRG reimbursement figures and calculated from two different perspectives: a general budget of total treatment costs (third-party healthcare payer) as well as a budget focusing on the physician/facility provider perspective. RESULTS: After 1 year, use of DEBs was associated with substantially lower total inpatient treatment costs when compared with BA (sFr 861,916 vs. sFr 951,877) despite the need for a greater investment at baseline related to the higher price of DEBs. In the absence of dedicated reimbursement incentives, however, use of DEBs was shown to be the financially less favorable treatment approach from the physician/facility provider perspective (12-month total earnings: sFr 179,238 vs. sFr 333,678). CONCLUSION: Use of DEBs may be cost-effective through the prevention of TLR at 1 year of follow-up. The introduction of dedicated financial incentives aimed at improving DEB reimbursement may help lower total healthcare costs.
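
The decision-analytic structure lends itself to a compact sketch: one-year cost per 100 patients equals index procedure costs plus TLR-driven reinterventions, compared across the two strategies. The unit costs and TLR rates below are hypothetical placeholders chosen only to mirror the shape of the calculation, not the study's Swiss inputs.

```python
# Sketch of the simplified decision-analytic budget model: per 100 patients,
# one-year cost = index procedure cost + repeat procedures driven by the
# target lesion revascularization (TLR) rate. All unit costs and TLR rates
# are hypothetical placeholders, not the study's inputs.

N_PATIENTS = 100

def one_year_cost(index_cost, tlr_rate, reintervention_cost):
    """Total inpatient cost over one year of follow-up for N_PATIENTS."""
    return N_PATIENTS * (index_cost + tlr_rate * reintervention_cost)

strategies = {
    "standard balloon (BA)": one_year_cost(index_cost=7000, tlr_rate=0.35,
                                           reintervention_cost=7000),
    "drug-eluting balloon (DEB)": one_year_cost(index_cost=8200, tlr_rate=0.10,
                                                reintervention_cost=7000),
}
for name, cost in strategies.items():
    print(f"{name}: sFr {cost:,.0f} per {N_PATIENTS} patients")
```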

Relevance:

100.00%

Publisher:

Abstract:

This Letter presents measurements of the polarization of the top quark in top-antitop quark pair events, using 4.7 fb⁻¹ of proton-proton collision data recorded with the ATLAS detector at the Large Hadron Collider at √s = 7 TeV. Final states containing one or two isolated leptons (electrons or muons) and jets are considered. Two measurements of α_ℓ P, the product of the leptonic spin-analyzing power and the top quark polarization, are performed, assuming that the polarization is introduced by either a CP-conserving or a maximally CP-violating production process. The measurements obtained, α_ℓ P_CPC = −0.035 ± 0.014 (stat) ± 0.037 (syst) and α_ℓ P_CPV = 0.020 ± 0.016 (stat) +0.013/−0.017 (syst), are in good agreement with the standard model prediction of negligible top quark polarization.

Relevance:

100.00%

Publisher:

Abstract:

A search for supersymmetric particles in final states with zero, one, and two leptons, with and without jets identified as originating from b-quarks, in 4.7 fb⁻¹ of √s = 7 TeV pp collisions produced by the Large Hadron Collider and recorded by the ATLAS detector is presented. The search uses a set of variables carrying information on the event kinematics transverse and parallel to the beam line that are sensitive to several topologies expected in supersymmetry. Mutually exclusive final states are defined, allowing a combination of all channels to increase the search sensitivity. No deviation from the Standard Model expectation is observed. Upper limits at 95% confidence level on visible cross-sections for the production of new particles are extracted. Results are interpreted in the context of the constrained minimal supersymmetric extension to the Standard Model and in supersymmetry-inspired models with diverse, high-multiplicity final states.