850 results for Multi-Higgs Models


Relevance: 40.00%

Abstract:

In this work we discuss the strength of the trilinear Higgs boson coupling in composite models in a model-independent way. The coupling is determined as a function of a very general ansatz for the fermionic self-energy, and turns out to be equal to or smaller than that of the Standard Model Higgs boson, depending on the dynamics of the theory. © World Scientific Publishing Company.

Relevance: 40.00%

Abstract:

Recently, Minimal and Ultraminimal Technicolor models were proposed in which the presence of TC fermions in representations other than the fundamental one leads to viable models that do not conflict with the measured value of the S parameter. In this work we apply the results of [5] to compute the Higgs boson masses in the Minimal and Ultraminimal Technicolor models. © 2010 American Institute of Physics.

Relevance: 40.00%

Abstract:

Searches are reported for Higgs bosons in the context of either the standard model extended to include a fourth generation of fermions (SM4) with masses of up to 600 GeV or fermiophobic models. For the former, results from three decay modes (ττ, WW, and ZZ) are combined, whilst for the latter the diphoton decay is exploited. The analysed proton-proton collision data correspond to integrated luminosities of up to 5.1 fb⁻¹ at 7 TeV and up to 5.3 fb⁻¹ at 8 TeV. The observed results exclude the SM4 Higgs boson in the mass range 110-600 GeV at 99% confidence level (CL), and in the mass range 110-560 GeV at 99.9% CL. A fermiophobic Higgs boson is excluded in the mass range 110-147 GeV at 95% CL, and in the range 110-133 GeV at 99% CL. The recently observed boson with a mass near 125 GeV is not consistent with either an SM4 or a fermiophobic Higgs boson. © 2013 CERN.

Relevance: 40.00%

Abstract:

The Brazilian Association of Simmental and Simbrasil Cattle Farmers provided 29,510 records from 10,659 Simmental beef cattle; these were used to estimate (co)variance components and genetic parameters for weights along the growth trajectory, based on multi-trait (MTM) and random regression models (RRM). The (co)variance components and genetic parameters were estimated by restricted maximum likelihood. In the MTM analysis, the likelihood ratio test was used to determine the significance of the random effects included in the model and to define the most appropriate model. All random effects were significant and were included in the final model. In the RRM analysis, different polynomial orders were compared using 5 different criteria to choose the best-fitting model. An RRM of third order for the direct additive genetic, direct permanent environmental, maternal additive genetic, and maternal permanent environmental effects was sufficient to model the variance structures along the growth trajectory of the animals. The (co)variance components were generally similar in MTM and RRM. Direct heritabilities from MTM were slightly lower than those from RRM, ranging from 0.04 to 0.42 and from 0.16 to 0.45, respectively. Additive direct correlations were mostly positive and of high magnitude, being highest at the closest ages. Considering these results, and given that RRM does not require pre-adjustment of the weights to standard ages, RRM is recommended for genetic evaluation of Simmental beef cattle in Brazil. © FUNPEC-RP.
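As a rough illustration of the nested-model comparison described above (the log-likelihood values and the number of extra (co)variance parameters are placeholders, not figures from the study), a likelihood ratio test between a reduced and a full mixed model can be computed as follows.

from scipy.stats import chi2

def likelihood_ratio_test(loglik_reduced, loglik_full, extra_params):
    """Test whether adding a random effect significantly improves the fit.

    loglik_reduced / loglik_full: REML log-likelihoods of the nested models.
    extra_params: number of additional (co)variance parameters in the full model.
    """
    lr_stat = -2.0 * (loglik_reduced - loglik_full)  # likelihood ratio statistic
    p_value = chi2.sf(lr_stat, df=extra_params)      # upper-tail chi-square probability
    return lr_stat, p_value

# Hypothetical log-likelihoods, used only to show the mechanics of the test.
stat, p = likelihood_ratio_test(loglik_reduced=-15234.2, loglik_full=-15220.8, extra_params=2)
print(f"LR statistic = {stat:.2f}, p-value = {p:.4g}")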

Relevance: 40.00%

Abstract:

Searches are presented for heavy scalar (H) and pseudoscalar (A) Higgs bosons posited in two-Higgs-doublet model (2HDM) extensions of the standard model (SM). These searches are based on a data sample of pp collisions collected with the CMS experiment at the LHC at a center-of-mass energy of √s = 8 TeV and corresponding to an integrated luminosity of 19.5 fb⁻¹. The decays H → hh and A → Zh, where h denotes an SM-like Higgs boson, lead to events with three or more isolated charged leptons or with a photon pair accompanied by one or more isolated leptons. The search results are presented in terms of the H and A production cross sections times branching fractions and are further interpreted in terms of 2HDM parameters. We place 95% C.L. upper limits on the cross section times branching fraction σB of approximately 7 pb for H → hh and 2 pb for A → Zh. Also presented are the results of a search for the rare decay of the top quark to a charm quark and an SM Higgs boson, t → ch, the existence of which would indicate a nonzero flavor-changing Yukawa coupling of the top quark to the Higgs boson. We place a 95% C.L. upper limit of 0.56% on B(t → ch).

Relevance: 40.00%

Abstract:

People suffering from end-stage renal disease face two possible treatments: dialysis or an organ transplant. If they wish to pursue the second option, besides being placed on the deceased-donor waiting list, they can find a person, such as a spouse, a relative or a friend, willing to donate one of their kidneys. However, the transplant is not always feasible: donor and recipient may be incompatible in blood group or tissue type. The Kidney Exchange Program (KEP) arose as a response to this kind of problem: a program, already widely adopted across Europe and worldwide, that pools donor/recipient pairs in this situation into a single set in order to carry out, and maximize, crossed kidney exchanges between compatible pairs. This thesis investigates this question by evaluating the possibility of merging donor/recipient pairs coming from several countries into a single international pool. The aim, naturally, is to obtain an ever larger number of transplants. The study addresses, from a mathematical point of view, the issues raised by such a collaboration: the countries that agree to take part in such a program must be guaranteed not only that they gain an advantage from it, but also that this advantage is fairly distributed among all the participating countries.
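As a minimal sketch of the matching idea behind such programs (assuming the pool is modelled as a compatibility graph and restricting to two-way swaps; real KEP formulations usually allow longer cycles and altruistic chains and are solved with integer programming; the pair labels and edges below are invented), pairwise exchanges can be selected with a maximum-cardinality matching:

import networkx as nx

# Each node is an incompatible donor/recipient pair; an edge (i, j) means the
# donor of pair i is compatible with the recipient of pair j and vice versa,
# so a two-way swap is possible.  Toy data for illustration only.
pairs = ["IT-1", "IT-2", "ES-1", "ES-2", "PT-1"]
mutually_compatible = [("IT-1", "ES-2"), ("IT-2", "PT-1"), ("ES-1", "ES-2"), ("IT-1", "PT-1")]

G = nx.Graph()
G.add_nodes_from(pairs)
G.add_edges_from(mutually_compatible)

# A maximum-cardinality matching selects the largest set of disjoint two-way
# exchanges, i.e. it maximizes the number of transplants achievable by swaps.
matching = nx.max_weight_matching(G, maxcardinality=True)
print(f"{2 * len(matching)} transplants from swaps: {sorted(map(sorted, matching))}")

Cross-country fairness, the focus of the thesis, would then enter as additional constraints or objectives on top of this basic matching step.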

Relevance: 40.00%

Abstract:

Invasive exotic plants have altered natural ecosystems across much of North America. In the Midwest, the presence of invasive plants is increasing rapidly, causing changes in ecosystem patterns and processes. Early detection has become a key component in invasive plant management and in the detection of ecosystem change. Risk assessment through predictive modeling has been a useful resource for monitoring and assisting with treatment decisions for invasive plants. Predictive models were developed to assist with early detection of ten target invasive plants in the Great Lakes Network of the National Park Service and for garlic mustard throughout the Upper Peninsula of Michigan. These multi-criteria risk models utilize geographic information system (GIS) data to predict the areas at highest risk for three phases of invasion: introduction, establishment, and spread. An accuracy assessment of the models for the ten target plants in the Great Lakes Network showed an average overall accuracy of 86.3%. The model developed for garlic mustard in the Upper Peninsula resulted in an accuracy of 99.0%. Used as one of many resources, the risk maps created from the model outputs will assist with the detection of ecosystem change, the monitoring of plant invasions, and the management of invasive plants through prioritized control efforts.
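As a loose illustration of the multi-criteria idea described above (not the Great Lakes Network or garlic mustard models themselves; the layer names, weights and data are invented), several GIS-derived criterion layers can be combined into a single risk surface with a weighted sum:

import numpy as np

# Hypothetical criterion rasters, each already rescaled to the 0-1 range
# (e.g. proximity to roads, canopy openness, proximity to known infestations).
rng = np.random.default_rng(0)
shape = (100, 100)
layers = {
    "road_proximity": rng.random(shape),
    "canopy_openness": rng.random(shape),
    "infestation_proximity": rng.random(shape),
}
weights = {"road_proximity": 0.3, "canopy_openness": 0.3, "infestation_proximity": 0.4}

# Weighted linear combination: one risk value per cell, higher = riskier.
risk = sum(weights[name] * raster for name, raster in layers.items())

# Flag the cells to prioritize for early-detection surveys (top 10% of risk).
priority = risk >= np.quantile(risk, 0.9)
print(f"{priority.sum()} of {priority.size} cells flagged as high risk")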

Relevance: 40.00%

Abstract:

In this article, we perform an extensive study of flavor observables in a two-Higgs-doublet model with generic Yukawa structure (of type III). This model is interesting not only because it is the decoupling limit of the minimal supersymmetric standard model but also because of its rich flavor phenomenology, which allows for sizable effects not only in flavor-changing neutral-current (FCNC) processes but also in tauonic B decays. We examine the possible effects in flavor physics and constrain the model both from tree-level processes and from loop observables. The free parameters of the model are the heavy Higgs mass, tanβ (the ratio of vacuum expectation values) and the “nonholomorphic” Yukawa couplings ϵ^f_ij (f = u, d, ℓ). In our analysis we constrain the elements ϵ^f_ij in various ways: In a first step we give order-of-magnitude constraints on ϵ^f_ij from ’t Hooft’s naturalness criterion, finding that all ϵ^f_ij must be rather small unless the third generation is involved. In a second step, we constrain the Yukawa structure of the type-III two-Higgs-doublet model from tree-level FCNC processes (Bs,d→μ+μ−, KL→μ+μ−, D̄0→μ+μ−, ΔF=2 processes, τ−→μ−μ+μ−, τ−→e−μ+μ− and μ−→e−e+e−) and observe that all flavor off-diagonal elements of these couplings, except ϵ^u_32,31 and ϵ^u_23,13, must be very small in order to satisfy the current experimental bounds. In a third step, we consider Higgs-mediated loop contributions to FCNC processes [b→s(d)γ, Bs,d mixing, K−K̄ mixing and μ→eγ], finding that also ϵ^u_13 and ϵ^u_23 must be very small, while the bounds on ϵ^u_31 and ϵ^u_32 are especially weak. Furthermore, considering the constraints from electric dipole moments we obtain constraints on some of the parameters ϵ^u,ℓ_ij. Taking into account the constraints from FCNC processes, we study the size of possible effects in the tauonic B decays (B→τν, B→Dτν and B→D∗τν) as well as in D(s)→τν, D(s)→μν, K(π)→eν, K(π)→μν and τ→K(π)ν, which are all sensitive to tree-level charged-Higgs exchange. Interestingly, the unconstrained ϵ^u_32,31 are just the elements which directly enter the branching ratios for B→τν, B→Dτν and B→D∗τν. We show that they can explain the deviations from the SM predictions in these processes without fine-tuning. Furthermore, B→τν, B→Dτν and B→D∗τν can even be explained simultaneously. Finally, we give upper limits on the branching ratios of the lepton-flavor-violating neutral B meson decays (Bs,d→μe, Bs,d→τe and Bs,d→τμ) and correlate the radiative lepton decays (τ→μγ, τ→eγ and μ→eγ) to the corresponding neutral-current lepton decays (τ−→μ−μ+μ−, τ−→e−μ+μ− and μ−→e−e+e−). A detailed Appendix contains all relevant information for the considered processes for general scalar-fermion-fermion couplings.
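For orientation only (this is the standard textbook expression rather than a result of the article, and the type-II scaling factor shown here neglects the nonholomorphic ϵ^u_31,32 terms whose effect the article actually studies), the tree-level charged-Higgs modification of B→τν is often written as

\mathcal{B}(B\to\tau\nu) \;=\; \frac{G_F^2\, m_B\, m_\tau^2}{8\pi}\left(1-\frac{m_\tau^2}{m_B^2}\right)^{2} f_B^2\,|V_{ub}|^2\,\tau_B \times r_H,
\qquad
r_H^{\text{type II}} \;=\; \left(1-\frac{\tan^2\beta\; m_B^2}{m_{H^\pm}^2}\right)^{2}.

In the type-III model the charged-Higgs coupling to the third generation is shifted by the ϵ^u_31,32 terms, so r_H can deviate from the type-II form; this is how the otherwise unconstrained ϵ^u_32,31 enter the tauonic branching ratios discussed above.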

Relevance: 40.00%

Abstract:

In these proceedings we review the flavour phenomenology of 2HDMs with generic Yukawa structures [1]. We first consider the quark sector and find that, despite the stringent constraints from FCNC processes, large effects in tauonic B decays are still possible. We then consider lepton flavour observables, show correlations between μ → eγ and μ− → e−e+e− in the 2HDM of type III and give upper bounds on the lepton-flavour-violating B decay Bd → μe.

Relevance: 40.00%

Abstract:

The authors are from UPM and form a fairly close group; all of them have been involved in academic or real cases on the subject, at different times given their different ages. A precedent is the work of E. Torroja and A. Páez in Madrid, Spain, on probabilistic safety models for concrete around 1957, a line now represented in the ICOSSAR conferences. Author J.M. Antón, involved since autumn 1967 in European steel construction within CECM, produced a mathematical model for the superposition of independent loads with reduction factors, and used it to derive a load-coefficient pattern for codes in Rome in February 1969, which was practically adopted for European construction; at JCSS in Lisbon in February 1974 he suggested unifying the treatment for concrete, steel and aluminium. That model describes each load with a Gumbel Type I distribution: the 50-year distribution of one load type is reduced to a 1-year distribution so that it can be added to the other independent loads, and the sum is then brought back, within Gumbel theory, to a 50-year return period; parallel models exist. A complete reliability system was produced, including nonlinear effects such as buckling, phenomena considered to some extent in the current Construction Eurocodes derived from the Model Codes. The author also considered the system within CEB in the presence of hydraulic actions from rivers, floods and the sea, with reference to actual practice. When drafting a road drainage norm for MOPU in Spain, the authors developed an optimization model that provides a way to determine the return period, from 10 to 50 years, for the hydraulic flows to be considered in road drainage. Satisfactory examples were a stream in the southeast of Spain modelled with a Gumbel Type I distribution and a paper by Ven Te Chow on the Mississippi at Keokuk using a Gumbel Type II distribution; the model can be modernized with a wider variety of extreme-value laws. In the MOPU drainage norm the drafting commission also acted as an expert panel to set a table of return periods for road drainage elements, in effect a complex multi-criteria decision system. These precedent ideas were used, for example, in widely applied codes and were presented in symposia or meetings, but not published in English-language journals; a condensed account of the authors' contributions is presented here. The authors are also involved in optimization for hydraulic and agricultural planning, and give modest hints of intended applications in agricultural and environmental planning, namely the selection of the criteria and utility functions involved in Bayesian, multi-criteria or mixed decision systems. Modest consideration is given to climate change, to production and commercial systems, and to other aspects such as social and financial ones.
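As a small, generic illustration of the extreme-value reasoning mentioned above (not the CECM or MOPU models themselves; the flow data are simulated), a Gumbel Type I distribution can be fitted to annual-maximum flows and the 10- and 50-year return levels read off as quantiles:

from scipy.stats import gumbel_r

# Hypothetical annual-maximum stream flows (m^3/s), for illustration only.
annual_maxima = gumbel_r.rvs(loc=120.0, scale=35.0, size=40, random_state=1)

# Fit a Gumbel Type I distribution to the annual maxima.
loc, scale = gumbel_r.fit(annual_maxima)

# The T-year return level is the flow exceeded on average once every T years,
# i.e. the quantile with annual non-exceedance probability 1 - 1/T.
for T in (10, 50):
    level = gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
    print(f"{T:>2}-year return level ~ {level:.1f} m^3/s")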

Relevance: 40.00%

Abstract:

This work focuses on the problem of multi-robot patrolling for infrastructure security applications, with the aim of protecting a known environment at critical facilities. Given a set of robots and a set of points of interest, the patrolling task consists of constantly visiting these points at irregular time intervals for security purposes. Existing solutions for these types of applications are predictable and inflexible; moreover, most previous solutions are centralized and deterministic, and only a few efforts have been made to integrate dynamic methods. This work therefore develops new dynamic and decentralized collaborative approaches that solve the aforementioned problem by applying learning models from Game Theory. The model selected in this work, which includes belief-based and reinforcement models as special cases, is Experience-Weighted Attraction. The problem is defined using concepts from Graph Theory to represent the environment, so that it can be treated with such Game Theory techniques. Finally, the proposed methods have been evaluated experimentally using a patrolling simulator, and the results obtained have been compared with previously available approaches.
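As a rough sketch of the Experience-Weighted Attraction idea (the standard Camerer-Ho update with illustrative parameter values, not the variant tuned in this work), the snippet below updates one robot's attractions over a set of candidate actions and converts them into choice probabilities with a logit rule:

import numpy as np

def ewa_update(attractions, experience, chosen, payoffs,
               phi=0.9, delta=0.5, rho=0.9):
    """One Experience-Weighted Attraction (EWA) step for a single agent.

    attractions: current attraction A_j for every candidate action j
    experience:  experience weight N(t-1)
    chosen:      index of the action actually played this round
    payoffs:     payoff each action would have earned against the others' play
    phi, delta, rho: decay of old attractions, weight of foregone payoffs,
                     and decay of the experience weight, respectively.
    """
    new_experience = rho * experience + 1.0
    reinforcement = np.where(np.arange(len(attractions)) == chosen,
                             payoffs,            # realized payoff, full weight
                             delta * payoffs)    # foregone payoffs, weight delta
    new_attractions = (phi * experience * attractions + reinforcement) / new_experience
    return new_attractions, new_experience

def choice_probabilities(attractions, lam=2.0):
    """Logit response: higher attractions get exponentially higher probability."""
    weights = np.exp(lam * (attractions - attractions.max()))  # stabilized softmax
    return weights / weights.sum()

# Toy example: three candidate patrol targets, one update after visiting target 0.
A, N = np.zeros(3), 1.0
A, N = ewa_update(A, N, chosen=0, payoffs=np.array([1.0, 0.4, 0.2]))
print(choice_probabilities(A))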

Relevance: 40.00%

Abstract:

In recent decades there has been increasing interest in systems comprised of several autonomous mobile robots, and as a result there has been substantial development in the field of Artificial Intelligence, especially in Robotics. Several studies in the literature by researchers from the scientific community focus on the creation of intelligent machines and devices capable of imitating the functions and movements of living beings. Multi-Robot Systems (MRS) can often deal with tasks that are difficult, if not impossible, to accomplish with a single robot. In the context of MRS, one of the main challenges is the need to control, coordinate and synchronize the operation of multiple robots to perform a specific task. This requires the development of new strategies and methods which allow us to obtain the desired system behavior in a formal and concise way. This PhD thesis studies the coordination of multi-robot systems and, in particular, addresses the problem of distributing heterogeneous multi-tasks. The main interest in these systems is to understand how, from simple rules inspired by the division of labor in social insects, a group of robots can perform tasks in an organized and coordinated way. We are mainly interested in truly distributed or decentralized solutions in which the robots themselves, autonomously and individually, select a particular task so that all tasks are optimally distributed. In general, to distribute the multi-tasks among a team of robots, the robots have to synchronize their actions and exchange information. Under this approach we can speak of multi-task selection instead of multi-task assignment, meaning that the agents or robots select the tasks themselves instead of being assigned a task by a central controller. The key element in these algorithms is the estimation of the stimuli and the adaptive update of the thresholds: each robot performs this estimate locally, depending on the load or the number of pending tasks to be performed. In addition, the results of each approach are evaluated by introducing noise into the number of pending loads, in order to simulate the robot's error in estimating the real number of pending tasks. The main contribution of this thesis lies in the approach based on self-organization and the division of labor in social insects. An experimental scenario for the coordination problem among multiple robots, the robustness of the approaches and the generation of dynamic tasks are presented and discussed. The particular issues studied are the following:
Threshold models: the experiments conducted to test the response threshold model (a minimal sketch of such a rule is given after this list), with the objective of analyzing the system performance index for the problem of distributing heterogeneous multi-tasks in multi-robot systems; additive noise was introduced into the number of pending loads and dynamic tasks were generated over time.
Learning automata methods: the experiments conducted to test the learning automata-based probabilistic algorithms. The approach was tested to evaluate the system performance index with additive noise and with dynamic task generation, for the same problem of distributing heterogeneous multi-tasks in multi-robot systems.
Ant colony optimization: the experiments presented test the ant colony optimization-based deterministic algorithms for distributing heterogeneous multi-tasks in multi-robot systems. In these experiments the system performance index is evaluated while introducing additive noise and dynamic task generation over time.
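Below is a minimal sketch of a response threshold rule of the kind used in division-of-labor models for social insects (the classic Bonabeau-style form with illustrative parameters; the thesis' exact update of stimuli and thresholds is not reproduced here):

import numpy as np

def engagement_probability(stimulus, threshold, n=2):
    """Classic response threshold rule: P = s^n / (s^n + theta^n).

    A robot with a low threshold for a task type engages even at low stimulus;
    a high-threshold robot waits until the stimulus (e.g. pending load) is large.
    """
    return stimulus**n / (stimulus**n + threshold**n)

rng = np.random.default_rng(42)
thresholds = np.array([2.0, 5.0, 8.0])       # one threshold per robot for this task type
stimulus = 4.0                               # e.g. estimated number of pending loads

probs = engagement_probability(stimulus, thresholds)
engaged = rng.random(thresholds.size) < probs   # each robot decides individually
print(probs.round(2), engaged)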

Relevance: 40.00%

Abstract:

This paper focuses on the general problem of coordinating multiple robots and, more specifically, addresses the self-selection of heterogeneous specialized tasks by autonomous robots. We adopt a distributed, decentralized approach, as we are particularly interested in solutions where the robots themselves, autonomously and individually, are responsible for selecting a particular task so that all existing tasks are optimally distributed and executed. To this end, we have established an experimental scenario for the corresponding multi-task distribution problem and propose a solution using two different approaches: Response Threshold Models and Learning Automata-based probabilistic algorithms. We have evaluated the robustness of the algorithms by perturbing the number of pending loads, to simulate the robot's error in estimating the real number of pending tasks, and by generating loads dynamically over time. The paper ends with a critical discussion of the experimental results.
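As a companion sketch to the threshold rule shown earlier, the snippet below illustrates a linear reward-inaction (L_R-I) learning automaton of the kind commonly used in such probabilistic task-selection schemes (generic textbook form with made-up parameters and a toy environment, not the exact algorithm of this paper):

import numpy as np

def lri_update(probs, chosen, rewarded, learning_rate=0.1):
    """Linear reward-inaction (L_R-I) update.

    If the chosen task selection was rewarded (e.g. the task was really pending
    and got executed), shift probability mass toward it; on penalty, do nothing.
    """
    p = probs.copy()
    if rewarded:
        p[chosen] += learning_rate * (1.0 - p[chosen])
        others = np.arange(p.size) != chosen
        p[others] *= (1.0 - learning_rate)
    return p

rng = np.random.default_rng(7)
probs = np.full(3, 1.0 / 3.0)          # action probabilities over 3 task types

for _ in range(50):
    action = rng.choice(3, p=probs)    # robot picks a task type stochastically
    reward = action == 0               # toy environment: task type 0 is the useful one
    probs = lri_update(probs, action, reward)

print(probs.round(3))                  # probability mass concentrates on task type 0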