943 results for Little Higgs model
Abstract:
There has been a tremendous increase in our knowledge of human motor performance over the last few decades. Our theoretical understanding of how an individual learns to move is sophisticated and complex. It is difficult, however, to relate much of this information in practical terms to physical educators, coaches, and therapists concerned with the learning of motor skills (Shumway-Cook & Woollacott, 1995). Much of our knowledge stems from laboratory testing, which often appears to bear little relation to real-life situations. This lack of ecological validity has slowed the flow of information from theorists and researchers to practitioners. This paper is concerned with taking some small aspects of motor learning theory, unifying them, and presenting them in a usable fashion. The intention is not to present a recipe for teaching motor skills, but to present a framework from which solutions can be found. If motor performance research has taught us anything, it is that every individual and situation presents unique challenges. By increasing our ability to conceptualize the learning situation, we should be able to develop more flexible and adaptive responses to the challenge of teaching motor skills. The model presented here allows a teacher, coach, or therapist to take readily available observations and known characteristics of a motor task and to conceptualize them in a manner that supports appropriate teaching/learning decisions.
Abstract:
Since the beginning of the 20th century, the Garden City model has been a predominant theory emerging from Ecological Urbanism. In his book, Ebenezer Howard observed the disastrous effects of rapid urbanization and, in response, proposed the Garden City. Although Howard's proposal was first published in the late 1800s, the clear imbalance that Howard aimed to address is still prevalent in the UK today. Each year the UK wastes nearly 15 million tons of food; despite this, an estimated 500,000 people in the UK go without sufficient access to food. While the urban population is rapidly increasing and cities are becoming hubs of economic activity, producing wealth, improving education, and widening access to markets, it is within these cities that the imbalance is most evident, with a significant proportion of the world's population with unmet needs living in urban areas. Despite Howard's model being a response to 19th-century London, many still consider the Garden City model to be an effective solution for the 21st century. In his book, Howard details the metrics required for the design of a Garden City. This paper will discuss how, by using this methodology and comparing it with more recent studies by Cornell University and Matthew Wheeland (Pure Energies), it is possible to test the validity of Howard's proposal and establish whether the Garden City model is a viable solution to the increasing pressures of urbanization.
This paper outlines how the analysis of Howard's proposal has shown the model to be flawed: it is incapable of producing enough food to sustain the proposed population of 32,000, with a capacity to produce only 23% of the food required to meet the current average UK consumption rate. Beyond the limited productive capacity of Howard's model, the design itself does little to increase local resilience or the ecological base. This paper will also discuss how a greater understanding of the land-share requirements enables the design of a new urban model, building on the foundations initially laid out by Howard and combining a number of other theories to produce a more resilient and efficient model of ecological urbanism.
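For orientation, the 23% finding is a capacity ratio of the following general form (the symbols are shorthand introduced here, not Howard's or the paper's notation):

$$ f = \frac{A_p\,Y}{N\,C} \approx 0.23, $$

where $A_p$ is the productive land area of the Garden City, $Y$ its average food yield per unit area, $N = 32{,}000$ the design population, and $C$ the average per-capita UK food consumption rate.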
Abstract:
Provenance plays a pivotal role in tracing the origin of something and determining how and why it occurred. With the emergence of the cloud and the benefits it encompasses, there has been a rapid proliferation of services adopted by commercial and government sectors. However, trust and security concerns about such services are on an unprecedented scale. Currently, these services expose very little of their internal workings to their customers; this can cause accountability and compliance issues, especially in the event of a fault or error, where customers and providers are left pointing fingers at each other. Provenance-based traceability provides a means to address part of this problem by capturing and querying events that occurred in the past to understand how and why they took place. However, due to the complexity of the cloud infrastructure, current provenance models lack the expressiveness required to describe the inner workings of a cloud service. For a complete solution, a provenance-aware policy language is also required so that operators and users can define policies for compliance purposes. Current policy standards do not cater for this requirement. To address these issues, in this paper we propose a provenance (traceability) model, cProv, and a provenance-aware policy language, cProvl, to capture traceability data and express policies for validation against the model. For the implementation, we have extended the XACML 3.0 architecture to support provenance, and provided a translator that converts cProvl policies and requests into the XACML format.
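Since neither the cProvl syntax nor the translator implementation is reproduced in the abstract, the following Python sketch only illustrates the general idea of mapping a provenance-aware rule onto an XACML 3.0-style policy document; the rule fields, element names, and the extension point for provenance constraints are all hypothetical:

```python
# Minimal sketch: translating a hypothetical provenance-aware (cProvl-like) rule
# into an XACML-style XML policy. Element and attribute names are illustrative;
# XACML has no native slot for provenance, so the constraint is carried in a
# custom <Condition> element as one plausible extension point.
import xml.etree.ElementTree as ET

rule = {  # hypothetical rule: deny reads of data derived from EU-region sources
    "effect": "Deny",
    "action": "read",
    "provenance": {"wasDerivedFrom": {"region": "EU"}},
}

def to_xacml(rule):
    policy = ET.Element("Policy", RuleCombiningAlgId="deny-overrides")
    r = ET.SubElement(policy, "Rule", RuleId="prov-rule-1", Effect=rule["effect"])
    target = ET.SubElement(r, "Target")
    ET.SubElement(target, "Action").text = rule["action"]
    cond = ET.SubElement(r, "Condition")
    for relation, attrs in rule["provenance"].items():
        elem = ET.SubElement(cond, relation)  # e.g., <wasDerivedFrom region="EU"/>
        for key, value in attrs.items():
            elem.set(key, value)
    return ET.tostring(policy, encoding="unicode")

print(to_xacml(rule))
```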
Abstract:
Iron-chromium alloys are used as a model system to study the microstructural evolution of defects in the irradiated structural steel components of a nuclear reactor. We examine the effects of temperature and chromium concentration on defect evolution and segregation behavior in the early stages of damage. In situ irradiations are conducted in a transmission electron microscope (TEM) at 300°C and 450°C with 150 keV iron ions, in single-crystal Fe14Cr and bicrystal Fe19Cr, to doses of 2 × 10^15 ions/cm^2. The microstructures resulting from annealing and irradiation of the alloy are characterized by analysis of TEM micrographs and diffraction patterns and compared with those of irradiated pure iron. We found the irradiation temperature to have little effect on the microstructural development. We also found that the presence of chromium in the sample leads to defect populations with a small average loop size and no extended or nested loop structures, in contrast to the populations of large extended loops seen in irradiated pure iron. Only a very weak dependence on the specific chromium content of the alloy was found. Chromium was shown to suppress defect growth by inhibiting defect mobility in the alloy. While defects in pure iron are highly mobile and able to grow, those in the FeCr alloys remained small and relatively motionless due to the pinning effect of the chromium.
Abstract:
The underwater environment is an extreme environment that requires a process of human adaptation, with specific psychophysiological demands to ensure survival and productive activity. From the standpoint of existing models of intelligence, personality, and performance, in this explanatory study we have analyzed the contribution of individual differences to explaining the adaptation of military personnel to a stressful environment. Structural equation analysis was employed to verify a model representing the direct effects of psychological variables on individual adaptation to an adverse environment, and we have been able to confirm, during basic military diving courses, the structural relationships among these variables and their ability to predict a third of the variance of a criterion that has been studied very little to date. In this way, we have confirmed in a sample of professionals (N = 575) the direct relationship of emotional adjustment, conscientiousness, and general mental ability with underwater adaptation, as well as the inverse relationship of emotional reactivity. These constructs are the psychological basis for working under water, contributing to improved adaptation to this environment and promoting risk prevention and safety in diving activities.
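Schematically, the structural relation reported above can be written as follows (the linear form and variable names are shorthand for the abstract's sign pattern, not the authors' notation):

$$ \text{Adaptation} = \beta_1\,\text{EmotionalAdjustment} + \beta_2\,\text{Conscientiousness} + \beta_3\,\text{GeneralMentalAbility} - \beta_4\,\text{EmotionalReactivity} + \zeta, $$

with all $\beta_i > 0$ and the model explaining roughly a third of the criterion variance ($R^2 \approx 0.33$).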
Abstract:
Since it has been found that the MadGraph Monte Carlo generator offers superior flavour-matching capability compared to Alpgen, the suitability of MadGraph for the generation of $t\bar{t}b\bar{b}$ events is explored, with a view to simulating this background in searches for the Standard Model Higgs production and decay process $t\bar{t}H,\ H \to b\bar{b}$. Comparisons are performed between the output of MadGraph and that of Alpgen, showing that satisfactory agreement in their predictions can be obtained with the appropriate generator settings. A search for the Standard Model Higgs boson, produced in association with the top quark and decaying into a $b\bar{b}$ pair, using $20.3\,\mathrm{fb}^{-1}$ of 8 TeV collision data collected in 2012 by the ATLAS experiment at CERN's Large Hadron Collider, is presented. The GlaNtp analysis framework, together with the RooFit package and associated software, is used to obtain an expected 95% confidence-level limit of $4.2^{+4.1}_{-2.0}$ times the Standard Model expectation; the corresponding observed limit is found to be 5.9, which is within experimental uncertainty of the published result of the analysis performed by the ATLAS collaboration. A search for a heavy charged Higgs boson of mass $m_{H^\pm}$ in the range $200 \le m_{H^\pm}/\mathrm{GeV} \le 600$, where the Higgs boson mediates the five-flavour beyond-the-Standard-Model physics process $gb \to tH^\pm \to ttb$, with one top quark decaying leptonically and the other decaying hadronically, is presented, using the $20.3\,\mathrm{fb}^{-1}$ 8 TeV ATLAS data set. Upper limits on the product of the production cross-section and the branching ratio of the $H^\pm$ boson are computed for six mass points, and these are found to be compatible within experimental uncertainty with those obtained by the corresponding published ATLAS analysis.
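For orientation, the quoted limits refer to the signal-strength modifier, and the 95% confidence-level criterion is conventionally the CLs prescription used throughout LHC Higgs searches (stated here as standard background, not quoted from the thesis):

$$ \mu = \frac{\sigma}{\sigma_{\mathrm{SM}}}, \qquad \mathrm{CL}_s(\mu_{\mathrm{up}}) = \frac{\mathrm{CL}_{s+b}}{\mathrm{CL}_{b}} = 0.05, $$

so the observed limit of 5.9 excludes, at the 95% confidence level, signal strengths of 5.9 or more times the Standard Model prediction.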
Abstract:
In this paper we present the development and implementation of a content analysis model for observing aspects of the social mission of the public library on Facebook pages and websites. The model is unique and was developed from the literature. Four categories of analysis were designed: Generate social capital and social cohesion, Consolidate democracy and citizenship, Social and digital inclusion, and Fighting illiteracies. The model enabled the collection and analysis of data in a case study of 99 Portuguese public libraries with a Facebook page. With this content analysis model we observed the facets of the social mission and examined the actions with social facets on the Facebook pages and websites of public libraries. At the end we discuss, in parallel, the results of observing the libraries' Facebook pages and their websites. From reading the descriptions of social mission actions, the most immediate general conclusion is that the 99 public libraries rarely publish actions of a social character on Facebook or on their websites, and the results are not very satisfying. The Portuguese public libraries substantially highlight actions in the category Generate social capital and social cohesion.
Abstract:
The complex singlet extension of the Standard Model (CxSM) is the simplest extension that provides scenarios for Higgs pair production with different masses. The model has two interesting phases: the dark matter phase, with a Standard Model-like Higgs boson, a new scalar, and a dark matter candidate; and the broken phase, in which all three neutral scalars mix. In the latter phase, Higgs decays into a pair of two different Higgs bosons are possible. In this study we analyse Higgs-to-Higgs decays in the framework of singlet extensions of the Standard Model (SM), with a focus on the CxSM. After demonstrating that scenarios with large rates for such chain decays are possible, we perform a comparison between the NMSSM and the CxSM. We find that, based on Higgs-to-Higgs decays, the only possibility to distinguish the two models at LHC Run 2 is through final states with two different scalars. This conclusion builds a strong case for searches for final states with two different scalars at LHC Run 2. Finally, we propose a set of benchmark points for the real and complex singlet extensions to be tested at LHC Run 2. They have been chosen such that the discovery prospects of the involved scalars are maximised and such that they fulfil the dark matter constraints. Furthermore, for some of the points the theory is stable up to high energy scales. For the computation of the decay widths and branching ratios we developed the Fortran code sHDECAY, which is based on the implementation of the real and complex singlet extensions of the SM in HDECAY.
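For reference, the scalar potential that defines the CxSM in the parametrization standard in this literature (with $H$ the Higgs doublet and $S$ the complex singlet) is

$$ V = \frac{m^2}{2} H^\dagger H + \frac{\lambda}{4} \left(H^\dagger H\right)^2 + \frac{\delta_2}{2} H^\dagger H\, |S|^2 + \frac{b_2}{2} |S|^2 + \frac{d_2}{4} |S|^4 + \left( \frac{b_1}{4} S^2 + a_1 S + \mathrm{c.c.} \right), $$

where the soft terms $a_1$ and $b_1$ break the global U(1) of the singlet; depending on the vacuum, the model realizes either the dark matter phase or the broken phase described above.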
Abstract:
The extreme sensitivity of the mass of the Higgs boson to quantum corrections from high-mass states makes it 'unnaturally' light in the standard model. This 'hierarchy problem' can be solved by symmetries, which predict new particles related, by the symmetry, to standard model fields. The Large Hadron Collider (LHC) can potentially discover these new particles, thereby finding the solution to the hierarchy problem. However, the dynamics of the Higgs boson is also sensitive to this new physics. We show that in many scenarios the Higgs can be a complementary and powerful probe of the hierarchy problem at the LHC and future colliders. If the top quark partners carry the color charge of the strong nuclear force, the production of Higgs pairs is affected. This effect is tightly correlated with single-Higgs production, implying that only modest enhancements in di-Higgs production occur when the top partners are heavy. However, if the top partners are light, we show that di-Higgs production is a useful complementary probe to single-Higgs production. We verify this result in the context of a simplified supersymmetric model. If the top partners do not carry color charge, their direct production is greatly reduced. Nevertheless, we show that such scenarios can be revealed through Higgs dynamics. We find that many color-neutral frameworks leave observable traces in Higgs couplings, which, in some cases, may be the only way to probe these theories at the LHC. Some realizations of the color-neutral framework also lead to exotic decays of the Higgs with displaced vertices. We show that these decays are so striking that the projected sensitivity for these searches, at hadron colliders, is comparable to that of searches for colored top partners. Taken together, these three case studies show the efficacy of the Higgs as a probe of naturalness.
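The quantitative statement behind 'unnaturally light' is the standard one-loop top-quark estimate (a textbook formula, included for orientation rather than quoted from the thesis): with a cutoff $\Lambda$,

$$ \delta m_h^2 \simeq -\frac{3 y_t^2}{8\pi^2} \Lambda^2, $$

which for $\Lambda \sim 10\,\mathrm{TeV}$ already exceeds $m_h^2 \approx (125\,\mathrm{GeV})^2$ by orders of magnitude; symmetry partners of the top quark cancel the $\Lambda^2$ term, leaving a residual sensitivity governed by the partner mass, which is why light top partners are the benchmark of naturalness.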
Abstract:
Purpose: To determine whether the methanol extract of Pericarpium zanthoxyli exerts anti-anxiety effects, and to explore its probable anti-anxiety mechanism in vivo. Methods: The staircase test, elevated plus-maze test, rota-rod treadmill test, and convulsions induced by strychnine and picrotoxin in mice were used to identify the potential mechanism of the anti-anxiety activity of the plant extract. Results: The plant extract (10 mg/kg, p.o.) significantly reduced rearing numbers in the staircase test, while it increased the time spent in the open arms as well as the number of entries into the open arms in the elevated plus-maze test, suggesting significant anti-anxiety activity. Furthermore, the extract inhibited strychnine-induced convulsion. However, it had little effect on picrotoxin-induced convulsion, suggesting that its anti-anxiety activity may be linked to the strychnine-sensitive glycine receptor rather than the GABA receptor. Conclusion: These results suggest that Pericarpium zanthoxyli extract may be beneficial for the control of anxiety.
Abstract:
Planning, navigation, and search are fundamental human cognitive abilities central to spatial problem solving in search and rescue, law enforcement, and military operations. Despite a wealth of literature concerning naturalistic spatial problem solving in animals, the literature on naturalistic spatial problem solving in humans is comparatively lacking and is generally produced by separate camps among which there is little crosstalk. Addressing this deficiency will allow us to predict spatial decision making in operational environments and to understand the factors leading to those decisions. The present dissertation comprises two related efforts: (1) a set of empirical research studies intended to identify characteristics of planning, execution, and memory in naturalistic spatial problem solving tasks, and (2) a computational modeling effort to develop a model of naturalistic spatial problem solving. The results of the behavioral studies indicate that problem-space hierarchical representations are linear in shape and that human solutions are produced according to multiple optimization criteria. The Mixed Criteria Model presented in this dissertation accounts for global and local human performance in a traditional and a naturalistic Traveling Salesman Problem. The results of the empirical and modeling efforts hold implications for basic and applied science in domains such as problem solving, operations research, human-computer interaction, and artificial intelligence.
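As an illustration of solutions "produced according to multiple optimization criteria," the toy tour builder below blends two criteria, distance to the next city and heading change; the weights and the criteria themselves are hypothetical stand-ins, not the Mixed Criteria Model's actual components:

```python
# Toy "mixed criteria" tour construction for a Traveling Salesman Problem:
# each step greedily minimizes a weighted blend of travel distance and turn angle.
import math

def build_tour(points, w_dist=1.0, w_turn=0.3):
    tour = [0]
    remaining = set(range(1, len(points)))
    while remaining:
        cur = points[tour[-1]]
        prev = points[tour[-2]] if len(tour) > 1 else None
        def cost(j):
            nxt = points[j]
            d = math.dist(cur, nxt)           # criterion 1: distance
            turn = 0.0
            if prev is not None:              # criterion 2: change of heading
                a1 = math.atan2(cur[1] - prev[1], cur[0] - prev[0])
                a2 = math.atan2(nxt[1] - cur[1], nxt[0] - cur[0])
                turn = abs(math.atan2(math.sin(a2 - a1), math.cos(a2 - a1)))
            return w_dist * d + w_turn * turn
        nxt_city = min(remaining, key=cost)
        tour.append(nxt_city)
        remaining.remove(nxt_city)
    return tour

cities = [(0, 0), (2, 1), (1, 3), (4, 2), (3, 0)]
print(build_tour(cities))  # a tour over city indices, starting at city 0
```

Varying the weights shifts the solution between shortest-path-like and smooth, sweep-like tours, which is the kind of trade-off a multiple-criteria account of human tours captures.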
Abstract:
This project centered on the influential literary magazine Timothy McSweeney's Quarterly Concern. Using Bruno Latour's network theory as well as the methods put forth by Robert Scholes and Clifford Wulfman to study modernist little magazines, I analyzed the influence McSweeney's has on contemporary little magazines. I traced the connections between McSweeney's and other paradigmatic examples of little magazines, The Believer and n+1, to show how the McSweeney's aesthetic and business practice creates a model for more recent publications. My thesis argued that The Believer continues the McSweeney's aesthetic mission. In contrast, n+1 positioned itself against the McSweeney's aesthetic, which indirectly created a space within the little magazines for writers, philosophers, and artists to debate the prevailing aesthetic theories of the contemporary period. The creation of this space connects these contemporary magazines back to modernist little magazines, thereby validating my decision to use the methods of Scholes and Wulfman.
Abstract:
On most if not all evaluatively relevant dimensions, such as the temperature level, taste intensity, and nutritional value of a meal, one range of adequate, positive states is framed by two ranges of inadequate, negative states: too much and too little. This distribution of positive and negative states in the information ecology results in a higher similarity of positive objects, people, and events to other positive stimuli, as compared to the similarity of negative stimuli to other negative stimuli. In other words, there are fewer ways in which an object, a person, or an event can be positive than negative. Oftentimes, there is only one way in which a stimulus can be positive (e.g., a good meal has to have an adequate temperature level, taste intensity, and nutritional value). In contrast, there are many different ways in which a stimulus can be negative (e.g., a bad meal can be too hot or too cold, too spicy or too bland, too fat or too lean). This higher similarity of positive as compared to negative stimuli is important, as similarity greatly impacts speed and accuracy on virtually all levels of information processing, including attention, classification, categorization, judgment and decision making, and recognition and recall memory. Thus, if the difference in similarity between positive and negative stimuli is a general phenomenon, it predicts and may explain a variety of valence asymmetries in cognitive processing (e.g., positive stimuli are processed faster but less accurately than negative ones). In my dissertation, I show that the similarity asymmetry is indeed a general phenomenon that is observed in thousands of words and pictures. Further, I show that the similarity asymmetry applies to social groups. Groups stereotyped as average on the two dimensions agency/socio-economic success (A) and conservative-progressive beliefs (B) are stereotyped as positive or high on communion (C), while groups stereotyped as extreme on A and B (e.g., managers, homeless people, punks, and religious people) are stereotyped as negative or low on C. As average groups are more similar to one another than extreme groups, according to this ABC model of group stereotypes, positive groups are mentally represented as more similar to one another than negative groups. Finally, I discuss implications of the ABC model of group stereotypes, pointing to avenues for future research on how stereotype content shapes social perception, cognition, and behavior.
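The ecological argument above can be made concrete with a toy simulation (illustrative only; the three dimensions and the numeric ranges stand in for temperature, taste intensity, and nutritional value, and are not the dissertation's stimuli):

```python
# Toy demonstration of the similarity asymmetry: positive stimuli occupy one
# "adequate" range per dimension, negative stimuli fall into "too little" or
# "too much", so negatives end up less similar (more spread out) than positives.
import itertools
import random

random.seed(0)
DIMS = 3  # stand-ins for temperature, taste intensity, nutritional value

def positive():  # adequate on every dimension
    return [random.uniform(0.4, 0.6) for _ in range(DIMS)]

def negative():  # inadequate (too little or too much) on at least one dimension
    v = positive()
    for i in random.sample(range(DIMS), random.randint(1, DIMS)):
        v[i] = random.uniform(0.0, 0.2) if random.random() < 0.5 else random.uniform(0.8, 1.0)
    return v

def mean_pairwise_distance(items):
    pairs = list(itertools.combinations(items, 2))
    return sum(sum((a - b) ** 2 for a, b in zip(x, y)) ** 0.5 for x, y in pairs) / len(pairs)

pos = [positive() for _ in range(200)]
neg = [negative() for _ in range(200)]
print("positive:", mean_pairwise_distance(pos))  # smaller -> more similar
print("negative:", mean_pairwise_distance(neg))  # larger -> less similar
```

Running the sketch yields a clearly smaller mean pairwise distance for the positive set, mirroring the claim that there is one way to be good but many ways to be bad.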