970 results for Standard Model


Relevance:

60.00%

Publisher:

Abstract:

This thesis studies the cross-validation criterion for choosing among small area models. The study is limited to unit-level small area models. The basic small area model was introduced by Battese, Harter and Fuller in 1988. It is a linear mixed regression model with a random intercept, and it involves several parameters: the fixed-effect parameter β, the random component, and the variances associated with the residual error. The Battese et al. model is used, in a survey setting, to predict the mean of a variable of interest y in each small area using an administrative auxiliary variable x known for the whole population. The estimation method relies on a normal distribution to model the residual component of the model. Allowing a general residual dependence, that is, one other than the normal law, yields a more flexible methodology. This generalisation leads to a new class of exchangeable models: the generalisation lies in the modelling of the residual dependence, which may be either normal (as in the Battese et al. model) or non-normal. The objective is to determine the small area parameters as precisely as possible, which hinges on choosing the right residual dependence for the model. The cross-validation criterion is studied for this purpose.
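
For reference, the unit-level model of Battese, Harter and Fuller mentioned above is conventionally written as (standard textbook form, not quoted from the thesis):

    y_{ij} = x_{ij}^{\top}\beta + u_i + e_{ij}, \qquad u_i \sim N(0, \sigma_u^2), \quad e_{ij} \sim N(0, \sigma_e^2),

for unit j in small area i, with the area effects u_i and the unit-level errors e_{ij} mutually independent. The generalisation studied in the thesis replaces the normality assumption on this residual dependence with a more general exchangeable one, and cross-validation is used to choose among the resulting models.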

Relevance:

60.00%

Publisher:

Abstract:

The main objective of this dissertation is the production of light charginos (charged supersymmetric particles) at the future international linear e+e− collider (ILC) for different supersymmetry-breaking scenarios. Charginos are particles formed by the mixture of the charged Wino field with the charged Higgsino. The main motivation for studying supersymmetric theories is the large number of Standard Model (SM) problems they can solve, among them neutrino masses, cold dark matter, and fine-tuning. In addition, we study the fundamental principles that guide particle physics, namely the gauge principle and the Higgs mechanism.
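
For reference, the wino-higgsino mixing mentioned above is encoded in the standard MSSM chargino mass matrix (textbook form, not taken from the dissertation), written in the (W̃+, H̃+) basis:

    X = \begin{pmatrix} M_2 & \sqrt{2}\, m_W \sin\beta \\ \sqrt{2}\, m_W \cos\beta & \mu \end{pmatrix}

The chargino mass eigenstates are obtained by diagonalising X, so their masses and wino/higgsino composition depend on the supersymmetry-breaking parameters M_2, μ and tan β, which is why different breaking scenarios lead to different chargino production rates at the ILC.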

Relevance:

60.00%

Publisher:

Abstract:

This investigation describes the path followed in a study centred on product durability as a strategy for preventing or reducing the environmental impacts caused by the excessive production and consumption of Western societies. The work was first based on gathering arguments that justify the need for the investigation. Building on the knowledge obtained about the consequences of the premature disposal of products, including waste generation and the use of natural resources, the investigation was directed towards strategies that encourage reduced consumption by extending product lifetime. The durability of certain products known as Design Classics motivated the development of the investigation. After defining the universe that fits this category, a sample considered representative was selected and a database was created in which the relevant content was systematised. Through qualitative and quantitative analysis of this sample of products, an operative matrix composed of 10 principles was obtained, which can be introduced into the design process of new products to achieve a potentially longer initial useful lifetime. The results of the practical application of the developed strategy, the thesis, will in the future guide the conception and production of artefacts intended to be presented to national industry.

Relevance:

60.00%

Publisher:

Abstract:

We present a measurement of a rare Standard Model process, pp → W±γγ, in the leptonic decays of the W±. The measurement uses 19.4 fb−1 of 8 TeV data collected in 2012 by the CMS experiment. The measured cross section is consistent with the Standard Model prediction and has a significance of 2.9σ. Limits are placed on dimension-8 effective field theory operators describing anomalous quartic gauge couplings. The analysis is particularly sensitive to the fT,0 coupling, and a 95% confidence-level limit is placed at −35.9 < fT,0/Λ4 < 36.7 TeV−4. Studies of the pp → Zγγ process are also presented. The Zγγ signal is in good agreement with the Standard Model and has a significance of 5.9σ.
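
For reference, in the dimension-8 effective field theory convention commonly used for anomalous quartic gauge couplings (Éboli et al.), the T,0 operator constrained here is usually written as (standard form, not quoted from the thesis):

    \mathcal{L}_{T,0} = \frac{f_{T,0}}{\Lambda^4}\, \mathrm{Tr}\big[\widehat{W}_{\mu\nu}\widehat{W}^{\mu\nu}\big]\, \mathrm{Tr}\big[\widehat{W}_{\alpha\beta}\widehat{W}^{\alpha\beta}\big]

where Ŵμν is the SU(2)L field-strength tensor and Λ the scale of new physics, so the quoted limit on fT,0/Λ4 is naturally expressed in TeV−4.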

Relevance:

60.00%

Publisher:

Abstract:

Since it has been found that the MadGraph Monte Carlo generator offers superior flavour-matching capability compared to Alpgen, the suitability of MadGraph for the generation of tt̄bb̄ events is explored, with a view to simulating this background in searches for the Standard Model Higgs production and decay process ttH, H → bb̄. Comparisons are performed between the output of MadGraph and that of Alpgen, showing that satisfactory agreement in their predictions can be obtained with the appropriate generator settings. A search for the Standard Model Higgs boson, produced in association with a top-quark pair and decaying into a bb̄ pair, using 20.3 fb−1 of 8 TeV collision data collected in 2012 by the ATLAS experiment at CERN's Large Hadron Collider, is presented. The GlaNtp analysis framework, together with the RooFit package and associated software, is used to obtain an expected 95% confidence-level limit of 4.2 (+4.1/−2.0) times the Standard Model expectation; the corresponding observed limit is found to be 5.9, which is within experimental uncertainty of the published result of the analysis performed by the ATLAS collaboration. A search for a heavy charged Higgs boson of mass mH± in the range 200 ≤ mH±/GeV ≤ 600, where the Higgs boson mediates the five-flavour beyond-the-Standard-Model process gb → tH± → ttb, with one top quark decaying leptonically and the other hadronically, is presented using the same 20.3 fb−1 8 TeV ATLAS data set. Upper limits on the product of the production cross-section and the branching ratio of the H± boson are computed for six mass points, and these are found to be compatible, within experimental uncertainty, with those obtained by the corresponding published ATLAS analysis.

Relevance:

60.00%

Publisher:

Abstract:

Using the one-loop Coleman-Weinberg effective potential, we derive a general analytic expression for all the derivatives of the effective potential with respect to any number of classical scalar fields. The result is valid for a renormalisable theory in four dimensions with any number of scalars, fermions, or gauge bosons, and corresponds to the zero-external-momentum contribution to a general one-loop diagram with N scalar external legs. We illustrate the use of the general result in two simple scalar-singlet extensions of the Standard Model, obtaining the dominant contributions to the triple couplings of light scalar particles in the zero-external-momentum approximation.
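
For orientation, the one-loop Coleman-Weinberg potential from which these derivatives are taken has the standard form (MS-bar scheme, Landau gauge; not quoted from the paper):

    V^{(1)}(\phi) = \frac{1}{64\pi^2} \sum_i (-1)^{2 s_i}\, n_i\, m_i^4(\phi) \left[ \ln\frac{m_i^2(\phi)}{\mu^2} - c_i \right]

where the sum runs over all scalars, fermions and gauge bosons, s_i is the spin and n_i the number of degrees of freedom of field i, m_i²(φ) are the field-dependent masses, μ is the renormalisation scale, and c_i = 3/2 for scalars and fermions and 5/6 for gauge bosons. Differentiating this expression N times with respect to the classical fields gives the zero-external-momentum N-point contributions discussed above.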

Relevance:

60.00%

Publisher:

Abstract:

Simplifying the Einstein field equation by assuming the cosmological principle yields a set of differential equations that governs the dynamics of the universe as described by the cosmological standard model. The cosmological principle assumes that space appears the same everywhere and in every direction; the principle has earned its position as a fundamental assumption in cosmology by being compatible with the observations of the 20th century. It was not until the current century that observations on cosmological scales showed significant deviations from isotropy and homogeneity, implying a violation of the principle. Among these observations are the inconsistency between local and non-local evaluations of the Hubble parameter, the baryon acoustic features of the Lyman-α forest, and the anomalies of the cosmic microwave background radiation. As a consequence, cosmological models beyond the cosmological principle have been studied extensively; after all, the principle is a hypothesis and, as such, should be tested as frequently as any other assumption in physics. In this thesis, the effects of inhomogeneity and anisotropy, arising as a consequence of discarding the cosmological principle, are investigated. The geometry and matter content of the universe become more involved, and the resulting effects on the Einstein field equation are introduced. The cosmological standard model and its issues, both fundamental and observational, are presented. Particular attention is given to the local Hubble parameter, supernova explosions, baryon acoustic oscillations, and cosmic microwave background observations, as well as the cosmological constant problems. Explored and proposed resolutions that emerge from violating the cosmological principle are reviewed. The thesis concludes with a summary and an outlook on the included research papers.
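
For context, the differential equations referred to here are the Friedmann equations, obtained by inserting the homogeneous and isotropic FLRW metric into the Einstein field equation (standard form, not quoted from the thesis):

    H^2 \equiv \left(\frac{\dot a}{a}\right)^2 = \frac{8\pi G}{3}\rho - \frac{k}{a^2} + \frac{\Lambda}{3}, \qquad \frac{\ddot a}{a} = -\frac{4\pi G}{3}(\rho + 3p) + \frac{\Lambda}{3}

where a(t) is the scale factor, ρ and p the energy density and pressure, k the spatial curvature and Λ the cosmological constant. Dropping the cosmological principle means the metric can no longer be reduced to this simple FLRW form, which is the complication the thesis investigates.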

Relevance:

60.00%

Publisher:

Abstract:

The extreme sensitivity of the mass of the Higgs boson to quantum corrections from high-mass states makes it 'unnaturally' light in the standard model. This 'hierarchy problem' can be solved by symmetries, which predict new particles related, by the symmetry, to standard model fields. The Large Hadron Collider (LHC) can potentially discover these new particles, thereby finding the solution to the hierarchy problem. However, the dynamics of the Higgs boson is also sensitive to this new physics. We show that in many scenarios the Higgs can be a complementary and powerful probe of the hierarchy problem at the LHC and future colliders. If the top-quark partners carry the color charge of the strong nuclear force, the production of Higgs pairs is affected. This effect is tightly correlated with single Higgs production, implying that only modest enhancements in di-Higgs production occur when the top partners are heavy. However, if the top partners are light, we show that di-Higgs production is a useful complementary probe to single Higgs production. We verify this result in the context of a simplified supersymmetric model. If the top partners do not carry color charge, their direct production is greatly reduced. Nevertheless, we show that such scenarios can be revealed through Higgs dynamics. We find that many color-neutral frameworks leave observable traces in Higgs couplings, which, in some cases, may be the only way to probe these theories at the LHC. Some realizations of the color-neutral framework also lead to exotic decays of the Higgs with displaced vertices. We show that these decays are so striking that the projected sensitivity of these searches at hadron colliders is comparable to that of searches for colored top partners. Taken together, these three case studies show the efficacy of the Higgs as a probe of naturalness.
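
As a rough illustration of the sensitivity described above (a textbook estimate, not taken from the thesis), the dominant top-quark loop shifts the Higgs mass-squared by approximately

    \delta m_H^2 \simeq -\frac{3\, y_t^2}{8\pi^2}\, \Lambda^2

where y_t is the top Yukawa coupling and Λ the scale up to which the standard model is assumed to hold. Symmetry-related top partners are introduced precisely to cancel this quadratic sensitivity, which is why their properties feed back into Higgs production and decay.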

Relevance:

60.00%

Publisher:

Abstract:

Dissertation (Master's in Nuclear Technology)

Relevance:

60.00%

Publisher:

Abstract:

In the past few years, there has been a concern among economists and policy makers that increased openness to international trade affects some regions in a country more than others. Recent research has found that local labor markets more exposed to import competition through their initial employment composition experience worse outcomes in several dimensions, such as employment, wages, and poverty. Although there is evidence that regions within a country exhibit variation in the intensity with which they trade with each other and with other countries, trade linkages have been ignored in empirical analyses of the regional effects of trade, which focus on differences in employment composition. In this dissertation, I investigate how local labor markets' trade linkages shape the response of wages to international trade shocks.

In the second chapter, I lay out a standard multi-sector general equilibrium model of trade, where domestic regions trade with each other and with the rest of the world. Using this benchmark, I decompose a region's wage change resulting from a national import cost shock into a direct effect on prices, holding other endogenous variables constant, and a series of general equilibrium effects. I argue the direct effect provides a natural measure of exposure to import competition within the model, since it summarizes the effect of the shock on a region's wage as a function of initial conditions given by its trade linkages. I call my proposed measure linkage exposure, while I refer to the measures used in previous studies as employment exposure. My theoretical analysis also shows that the assumptions previous studies make on trade linkages are not consistent with the standard trade model.

In the third chapter, I calibrate the model to the Brazilian economy in 1991, at the beginning of a period of trade liberalization, to perform a series of experiments. In each of them, I reduce the Brazilian import cost by 1 percent in a single sector and calculate how much of the cross-regional variation in counterfactual wage changes is explained by exposure measures. Over this set of experiments, employment exposure explains, for the median sector, 2 percent of the variation in counterfactual wage changes, while linkage exposure explains 44 percent. In addition, I propose an estimation strategy that incorporates trade linkages in the analysis of the effects of trade on observed wages (a stylised version is sketched after this abstract). In the model, changes in wages are completely determined by changes in market access, an endogenous variable that summarizes the real demand faced by a region. I show that a linkage measure of exposure is a valid instrument for changes in market access within Brazil. Using observed wage changes in Brazil between 1991 and 2000, my estimates imply that a region at the 25th percentile of the change in domestic market access induced by trade liberalization experiences a 0.6 log points larger wage decline (or smaller wage increase) than a region at the 75th percentile. The estimates from a regression of wage changes on exposure imply that a region at the 25th percentile of exposure experiences a 3 log points larger wage decline (or smaller wage increase) than a region at the 75th percentile. I conclude that estimates based on exposure overstate the negative impact of trade liberalization on wages in Brazil.

In the fourth chapter, I extend the standard model to allow for two types of workers according to their education levels: skilled and unskilled. I show that there is substantial variation across Brazilian regions in the skill premium. I use the exogenous variation provided by tariff changes to estimate the impact of market access on the skill premium. I find that decreased domestic market access resulting from trade liberalization resulted in a higher skill premium. I propose a mechanism to explain this result: the manufacturing sector is relatively more intensive in unskilled labor, and I present empirical evidence that supports this hypothesis.
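
A stylised version of the instrumental-variables strategy described in the third chapter (notation mine, not the dissertation's) is a two-stage least-squares specification in which linkage exposure instruments for the change in market access:

    \Delta \mathrm{MA}_r = \pi\, Z_r + \nu_r \quad\text{(first stage)}, \qquad \Delta \ln w_r = \alpha + \beta\, \widehat{\Delta \mathrm{MA}}_r + \varepsilon_r \quad\text{(second stage)}

where r indexes regions, Δln w_r is the observed change in log wages, ΔMA_r the change in market access induced by liberalization, and Z_r the linkage-exposure measure built from initial trade linkages.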

Relevance:

60.00%

Publisher:

Abstract:

Attribute-based signature (ABS) is a novel cryptographic primitive that lets a signer sign a message with fine-grained control over identifying information. An ABS reveals only that the verified message was signed by a user whose set of attributes satisfies a predicate; it can therefore hide any identifying information while enforcing fine-grained control on signing. Many attribute-based signature schemes have been proposed, but most of them are not very efficient. Maji et al. recently presented a complete definition and construction of ABS for monotone predicates and showed three instantiations of their framework. Although the most practical of their instantiations is efficient, it is constructed in the generic group model and has been proved insecure. Okamoto et al. then proposed an attribute-based signature scheme in the standard model that supports generalized non-monotone predicates over access structures; however, their scheme is not efficient in practice. In this paper, we present a framework for ABS and a detailed security model for ABS. Under our framework, we present an attribute-based signature scheme for monotone predicates in the standard model, choosing Waters' signature scheme as the prototype of our attribute-based signature scheme. Compared with Maji's scheme in the generic group model, the proposed scheme is constructed in the standard model. Furthermore, compared with Okamoto's scheme, the proposed scheme is more efficient because its computation cost is lower.
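
For reference, the Waters signature used as the prototype (standard textbook form; the ABS construction built on top of it is not reproduced here) signs an n-bit message m, with bits m_i, as

    \sigma = \Big( g_2^{\alpha}\,\big(u' \textstyle\prod_{i:\, m_i = 1} u_i\big)^{r},\; g^{r} \Big), \qquad \text{verified by } e(\sigma_1, g) = e(g_1, g_2)\cdot e\big(u' \textstyle\prod_{i:\, m_i = 1} u_i,\; \sigma_2\big)

where g generates a bilinear group with pairing e, g_1 = g^α and g_2, u', u_1, ..., u_n are public parameters, g_2^α is the secret key and r is fresh signing randomness.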

Relevance:

60.00%

Publisher:

Abstract:

Vehicular ad hoc networks (VANETs) are an increasingly important paradigm that not only enhances safety but also improves roadway system efficiency. However, the security issues of data confidentiality and access control over transmitted messages in VANETs remain to be solved. In this paper, we propose a secure and efficient message dissemination scheme (SEMD) with policy enforcement in VANET, and construct an outsourced decryption for ciphertext-policy attribute-based encryption (CP-ABE) to provide differentiated access control services, which lets vehicles delegate most of the decryption computation to the nearest roadside unit (RSU). Performance evaluation demonstrates its efficiency in terms of computational complexity, space complexity, and decryption time. A security proof shows that it is secure against replayable chosen-ciphertext attacks (RCCA) in the standard model.
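
The outsourcing pattern referred to here typically follows the Green-Hohenberger-Waters approach to outsourced ABE decryption; as a generic sketch of that idea (not necessarily the exact SEMD construction), the vehicle blinds its CP-ABE key with a random exponent z, hands the resulting transformation key to the RSU, which performs the pairing-heavy partial decryption, and the vehicle finishes with a single exponentiation:

    TK = SK^{1/z}, \qquad T = \mathrm{Transform}(TK, CT) = e(g,g)^{\alpha s / z}, \qquad M = C \,/\, T^{\,z}

where C = M · e(g,g)^{αs} is the message-carrying component of the ciphertext. The transform succeeds only when the key's attributes satisfy the ciphertext policy, and the RSU learns nothing about M while absorbing most of the decryption cost.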

Relevance:

40.00%

Publisher:

Abstract:

The parabolic trough concentrating collector is the most mature, proven, and widespread technology for exploiting solar energy on a large scale for medium-temperature applications. The assessment of the opportunities and possibilities of the collector system relies on its optical performance. A reliable Monte Carlo ray tracing model of a parabolic trough collector is developed using the Zemax software. The optical performance of an ideal collector depends on the solar spectral distribution and the sunshape, and on the spectral selectivity of the associated components. Therefore, each step of the model, including the spectral distribution of the solar energy, the trough reflectance, the glazing anti-reflection coating and the absorber selective coating, is explained and verified. The radiation flux distribution around the receiver and the optical efficiency, two basic outputs of the optical simulation, are calculated using the model and verified against a widely accepted analytical profile and measured values, respectively. Very good agreement is obtained. Further investigations are carried out to analyse the characteristics of the radiation distribution around the receiver tube for different insolation, envelope conditions, and selective coatings on the receiver, and the impact of light scattered from the receiver surface on the efficiency. The model is also capable of analysing the optical performance for variable sunshape, tracking error, and collector imperfections, including absorber misalignment with the focal line and de-focusing of the absorber, as well as different rim angles and geometric concentrations. The current optical model can play a significant role in understanding the optical aspects of a trough collector and can be employed to extract useful information on the optical performance. In the long run, this optical model will pave the way for the construction of a low-cost standalone photovoltaic and thermal hybrid collector in Australia for small-scale domestic hot water and electricity production.
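
The Zemax model itself cannot be reproduced here, but the underlying idea, tracing a bundle of sun rays with a finite sunshape off the parabolic mirror onto the receiver tube and histogramming where they land, can be illustrated with a minimal standalone 2-D Monte Carlo sketch in Python; the focal length, aperture, receiver radius and sunshape half-angle below are assumed illustrative values, not the parameters used in the thesis.

    import numpy as np

    # Illustrative 2-D trough geometry (assumed values, not the thesis parameters)
    FOCAL_LENGTH = 1.71       # m, parabola y = x^2 / (4 f)
    APERTURE = 5.77           # m, total aperture width
    RECEIVER_RADIUS = 0.035   # m, absorber tube radius, centred on the focal line
    SUN_HALF_ANGLE = 4.65e-3  # rad, approximate half-angle of the solar disc
    N_RAYS = 200_000

    rng = np.random.default_rng(0)

    # Sample arrival points across the aperture and a small angular spread about
    # the vertical to mimic a (crudely uniform) sunshape.
    x = rng.uniform(-APERTURE / 2, APERTURE / 2, N_RAYS)
    y = x**2 / (4 * FOCAL_LENGTH)                        # hit points on the mirror
    tilt = rng.uniform(-SUN_HALF_ANGLE, SUN_HALF_ANGLE, N_RAYS)
    d = np.stack([np.sin(tilt), -np.cos(tilt)], axis=1)  # incoming ray directions

    # Mirror normal at each hit point (parabola slope dy/dx = x / 2f).
    n = np.stack([-x / (2 * FOCAL_LENGTH), np.ones(N_RAYS)], axis=1)
    n /= np.linalg.norm(n, axis=1, keepdims=True)

    # Specular reflection: r = d - 2 (d . n) n
    r = d - 2 * np.sum(d * n, axis=1, keepdims=True) * n

    # Intersect each reflected ray with the receiver tube centred on the focus
    # (0, f): |p + t r - c|^2 = R^2 is a quadratic in t; keep rays heading toward
    # the tube (b < 0) and take the first intersection.
    p = np.stack([x, y], axis=1)
    c = np.array([0.0, FOCAL_LENGTH])
    oc = p - c
    b = np.sum(oc * r, axis=1)
    disc = b**2 - (np.sum(oc * oc, axis=1) - RECEIVER_RADIUS**2)
    hit = (disc >= 0) & (b < 0)
    t = -b[hit] - np.sqrt(disc[hit])

    # Angle around the tube at which each ray lands (0 deg = bottom, facing the
    # mirror), and the resulting circumferential flux distribution.
    impact = p[hit] + t[:, None] * r[hit] - c
    theta = np.degrees(np.arctan2(impact[:, 0], -impact[:, 1]))
    counts, edges = np.histogram(theta, bins=72, range=(-180, 180))

    print(f"intercepted {hit.sum()} of {N_RAYS} rays")
    print(f"flux peaks near {edges[np.argmax(counts)]:.0f} deg from the tube bottom")

Replacing the uniform angular spread with a measured sunshape profile, weighting rays by reflectance and coating spectra, and perturbing the absorber position would mimic, step by step, the kinds of effects studied in the thesis.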