Abstract:
We study work extraction from the Dicke model achieved using simple unitary cyclic transformations, taking into account both a non-optimal unitary protocol and the energetic cost of creating the initial state. By analyzing the role of entanglement, we find that highly entangled states can be inefficient for energy storage once the energetic cost of creating the state is considered. This surprising result holds even though the criticality of the model at hand can appreciably improve the extraction of work. While showing the advantages of using a many-body system for work extraction, our results demonstrate that entanglement is not necessarily advantageous for energy storage purposes when non-optimal processes are considered. Our work shows the importance of better understanding the complex interconnections between the non-equilibrium thermodynamics of quantum systems and the correlations among their subparts.
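The quantity at stake here is the ergotropy: the maximum work a cyclic unitary can extract from a state ρ under Hamiltonian H, namely tr(ρH) minus the energy of the associated passive state. Below is a minimal, generic Python sketch of that definition (not the protocol studied in the paper, and not specific to the Dicke model):

```python
import numpy as np

def ergotropy(rho: np.ndarray, H: np.ndarray) -> float:
    """Maximum work extractable from rho by a cyclic unitary.

    W = tr(rho H) - tr(pi H), where the passive state pi pairs the
    eigenvalues of rho (descending) with those of H (ascending).
    """
    p = np.sort(np.linalg.eigvalsh(rho))[::-1]  # populations, descending
    e = np.sort(np.linalg.eigvalsh(H))          # energy levels, ascending
    passive_energy = float(np.dot(p, e))
    return float(np.real(np.trace(rho @ H))) - passive_energy

# Example: a population-inverted qubit.
H = np.diag([0.0, 1.0])       # qubit Hamiltonian
rho = np.diag([0.2, 0.8])     # 80% population in the excited state
print(ergotropy(rho, H))      # -> 0.6
```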
Abstract:
Statistical association between a single nucleotide polymorphism (SNP) genotype and a quantitative trait in genome-wide association studies is usually assessed using a linear regression model or, in the case of non-normally distributed trait values, the Kruskal-Wallis test. While linear regression models assume an additive mode of inheritance via equidistant genotype scores, the Kruskal-Wallis test merely tests for global differences in trait values across the three genotype groups. Both approaches thus exhibit suboptimal power when the underlying inheritance mode is dominant or recessive. Furthermore, these tests do not perform well in the common situations where only a few trait values are available in a rare genotype category (imbalance), or where the values associated with the three genotype categories exhibit unequal variance (variance heterogeneity). We propose a maximum test based on a Marcus-type multiple contrast test for relative effect sizes. This test allows model-specific testing of a dominant, additive, or recessive mode of inheritance, and it is robust against variance heterogeneity. We show how to obtain mode-specific simultaneous confidence intervals for the relative effect sizes to aid in interpreting the biological relevance of the results. Further, we discuss the use of a related all-pairwise-comparisons contrast test with range-preserving confidence intervals as an alternative to the Kruskal-Wallis heterogeneity test. We applied the proposed maximum test to the Bogalusa Heart Study dataset and gained a remarkable increase in the power to detect association, particularly for rare genotypes. Our simulation study also demonstrated that, unlike the standard parametric approaches, the proposed non-parametric tests control the family-wise error rate in the presence of non-normality and variance heterogeneity. We provide a publicly available R library, nparcomp, that can be used to estimate simultaneous confidence intervals or compatible multiplicity-adjusted p-values associated with the proposed maximum test.
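The building block of these tests is the relative effect p = P(X < Y) + 0.5·P(X = Y) between two genotype groups, which can be estimated from midranks of the pooled sample. The Python sketch below illustrates only this estimator, on made-up trait values; the Marcus-type maximum test and its simultaneous confidence intervals are what the nparcomp package implements:

```python
import numpy as np
from scipy.stats import rankdata

def relative_effect(x: np.ndarray, y: np.ndarray) -> float:
    """Estimate p = P(X < Y) + 0.5 * P(X = Y) from pooled midranks."""
    n_x, n_y = len(x), len(y)
    ranks = rankdata(np.concatenate([x, y]))   # midranks handle ties
    mean_rank_y = ranks[n_x:].mean()
    return (mean_rank_y - (n_y + 1) / 2) / n_x

# Toy trait values by genotype: aa (rare), Aa, AA
aa = np.array([5.1, 5.4])
Aa = np.array([4.2, 4.8, 5.0, 4.5])
AA = np.array([3.9, 4.1, 4.0, 4.3, 3.8])
print(relative_effect(AA, aa))   # close to 1: aa values tend to exceed AA
```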
Abstract:
This paper reports some experiments in using SVG (Scalable Vector Graphics), rather than the browser default of (X)HTML/CSS, as a potential Web-based rendering technology, in an attempt to create an approach that integrates the structural and display aspects of a Web document in a single XML-compliant envelope. Although the syntax of SVG is XML-based, the semantics of its primitive graphic operations more closely resemble those of page description languages such as PostScript or PDF. The principal usage of SVG, so far, is for inserting complex graphic material into Web pages that are predominantly controlled via (X)HTML and CSS. The conversion of structured and unstructured PDF into SVG is discussed. It is found that unstructured PDF converts into pages of SVG with few problems, but difficulties arise when one attempts to map the structural components of a Tagged PDF onto an XML skeleton underlying the corresponding SVG. These difficulties are not fundamentally syntactic; they arise largely because browsers are innately bound to (X)HTML/CSS as their default rendering model. Some suggestions are made for ways in which SVG could be more fully integrated into browser functionality, with the possibility that future browsers might be able to use SVG as their default rendering paradigm.
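To illustrate the shape of such output (a hypothetical sketch, not the converter described in the paper), a page of unstructured PDF typically becomes SVG text and path elements placed at fixed page coordinates, mirroring PostScript/PDF page-description semantics rather than CSS flow. A minimal Python snippet writing such a page:

```python
# Hypothetical illustration of unstructured PDF-to-SVG output: every text
# run lands at absolute coordinates, with no flow or structural semantics.
page = """<svg xmlns="http://www.w3.org/2000/svg" width="595" height="842">
  <text x="72" y="96" font-family="Times" font-size="18">A Heading</text>
  <text x="72" y="130" font-family="Times" font-size="11">Body text placed</text>
  <text x="72" y="144" font-family="Times" font-size="11">line by line at</text>
  <text x="72" y="158" font-family="Times" font-size="11">fixed coordinates.</text>
</svg>"""

with open("page1.svg", "w") as f:
    f.write(page)
```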
Abstract:
The text of this thesis provides a historical introduction to the two studies "Theoretical Model of Superconductivity and the Martensitic Transformation in A15 Compounds" and "A Comparison of Kadanoff-Migdal Renormalization with New Monte Carlo Results for the XY Model", contained herein as appendices.
Abstract:
The complex singlet extension of the Standard Model (CxSM) is the simplest extension that provides scenarios for Higgs pair production with different masses. The model has two interesting phases: the dark matter phase, with a Standard Model-like Higgs boson, a new scalar, and a dark matter candidate; and the broken phase, in which all three neutral scalars mix. In the latter phase, decays of a Higgs boson into two different Higgs bosons are possible. In this study we analyse Higgs-to-Higgs decays in the framework of singlet extensions of the Standard Model (SM), with a focus on the CxSM. After demonstrating that scenarios with large rates for such chain decays are possible, we perform a comparison between the NMSSM and the CxSM. We find that, based on Higgs-to-Higgs decays, the only way to distinguish the two models at the LHC run 2 is through final states with two different scalars. This conclusion builds a strong case for searches for final states with two different scalars at the LHC run 2. Finally, we propose a set of benchmark points for the real and complex singlet extensions to be tested at the LHC run 2. They have been chosen such that the discovery prospects of the involved scalars are maximised and the dark matter constraints are fulfilled. Furthermore, for some of the points the theory is stable up to high energy scales. For the computation of the decay widths and branching ratios we developed the Fortran code sHDECAY, which is based on the implementation of the real and complex singlet extensions of the SM in HDECAY.
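For orientation, the generic tree-level width for a heavy scalar decaying into two lighter scalars with trilinear coupling g is Γ(H → h₁h₂) = g² λ^{1/2}(M², m₁², m₂²)/(16π M³), with λ the Källén function. The Python sketch below uses hypothetical masses and coupling; the model-specific couplings and branching ratios are what sHDECAY computes:

```python
import math

def kallen(a: float, b: float, c: float) -> float:
    """Källén triangle function λ(a, b, c)."""
    return a*a + b*b + c*c - 2*(a*b + b*c + c*a)

def width_scalar_to_two_scalars(M: float, m1: float, m2: float, g: float) -> float:
    """Tree-level Γ(H -> h1 h2) = g^2 λ^{1/2}(M^2, m1^2, m2^2) / (16 π M^3).

    g is the trilinear coupling in GeV; returns the width in GeV.
    Returns 0 when the decay is kinematically closed.
    """
    lam = kallen(M*M, m1*m1, m2*m2)
    if M <= m1 + m2 or lam <= 0:
        return 0.0
    return g*g * math.sqrt(lam) / (16 * math.pi * M**3)

# Hypothetical numbers, not a CxSM benchmark point:
print(width_scalar_to_two_scalars(M=400.0, m1=125.0, m2=200.0, g=100.0))
```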
Abstract:
The main aim of this study was to determine the impact of innovation on productivity in service sector companies, especially those in the hospitality sector, that regard reducing environmental impact as relevant to the innovation process. We used a structural analysis model based on the one developed by Crépon, Duguet, and Mairesse (1998), known as the CDM model (an acronym of the authors' surnames). These authors produced seminal studies in the field of the relationship between innovation and productivity (see Griliches 1979; Pakes and Griliches 1980). The main advantage of the CDM model is its ability to integrate the innovation process and business productivity from an empirical perspective.
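The CDM framework links an innovation equation to a productivity equation. As a drastically simplified, hypothetical illustration of that two-step logic (not the authors' econometric specification, which handles selection and simultaneity), one could fit a probit innovation equation and then regress productivity on the predicted innovation propensity, here on synthetic data:

```python
import numpy as np
import statsmodels.api as sm

# Synthetic firm-level data, purely illustrative
rng = np.random.default_rng(1)
n = 500
rd = rng.exponential(1.0, n)                     # innovation input (R&D intensity)
green = rng.integers(0, 2, n)                    # firm values environmental impact
innov = (0.8 * rd + 0.5 * green + rng.normal(0, 1, n) > 1).astype(float)
prod = 1.0 + 0.6 * innov + rng.normal(0, 1, n)   # log productivity

# Step 1: innovation equation (probit on inputs)
X1 = sm.add_constant(np.column_stack([rd, green]))
step1 = sm.Probit(innov, X1).fit(disp=0)
innov_hat = step1.predict(X1)                    # predicted innovation propensity

# Step 2: productivity equation on predicted innovation
X2 = sm.add_constant(innov_hat)
step2 = sm.OLS(prod, X2).fit()
print(step2.params)   # slope approximates the innovation-productivity effect
```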
Abstract:
Field lab: Entrepreneurial and innovative ventures
Abstract:
The Ocean Model Intercomparison Project (OMIP) is an endorsed project in the Coupled Model Intercomparison Project Phase 6 (CMIP6). OMIP addresses CMIP6 science questions, investigating the origins and consequences of systematic model biases. It does so by providing a framework for evaluating (including assessment of systematic biases), understanding, and improving ocean, sea-ice, tracer, and biogeochemical components of climate and earth system models contributing to CMIP6. Among the WCRP Grand Challenges in climate science (GCs), OMIP primarily contributes to the regional sea level change and near-term (climate/decadal) prediction GCs. OMIP provides (a) an experimental protocol for global ocean/sea-ice models run with a prescribed atmospheric forcing; and (b) a protocol for ocean diagnostics to be saved as part of CMIP6. We focus here on the physical component of OMIP, with a companion paper (Orr et al., 2016) detailing methods for the inert chemistry and interactive biogeochemistry. The physical portion of the OMIP experimental protocol follows the interannual Coordinated Ocean-ice Reference Experiments (CORE-II). Since 2009, CORE-I (Normal Year Forcing) and CORE-II (Interannual Forcing) have become the standard methods to evaluate global ocean/sea-ice simulations and to examine mechanisms for forced ocean climate variability. The OMIP diagnostic protocol is relevant for any ocean model component of CMIP6, including the DECK (Diagnostic, Evaluation and Characterization of Klima experiments), historical simulations, FAFMIP (Flux Anomaly Forced MIP), C4MIP (Coupled Carbon Cycle Climate MIP), DAMIP (Detection and Attribution MIP), DCPP (Decadal Climate Prediction Project), ScenarioMIP, HighResMIP (High Resolution MIP), as well as the ocean/sea-ice OMIP simulations.
Abstract:
We provide a nonparametric 'revealed preference' characterization of rational household behavior in terms of the collective consumption model, while accounting for general (possibly non-convex) individual preferences. We establish a Collective Axiom of Revealed Preference (CARP), which provides a necessary and sufficient condition for data consistency with collective rationality. Our main result takes the form of a 'collective' version of the Afriat Theorem for rational behavior in terms of the unitary model. This theorem has some interesting implications. With only a finite set of observations, the nature of consumption externalities (positive or negative) in the intra-household allocation process is non-testable. The same non-testability conclusion holds for privateness (with or without externalities) or publicness of consumption. By contrast, concavity of individual utility functions (representing convex preferences) turns out to be testable. In addition, monotonicity is testable for the model that assumes all household consumption is public.
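As a concrete baseline for revealed-preference tests of this kind (a sketch of the standard unitary-model GARP check that the Afriat Theorem characterizes, not the collective CARP condition itself), consistency can be verified directly from a finite set of price-quantity observations; the data below are hypothetical:

```python
import numpy as np
from itertools import product

def satisfies_garp(prices: np.ndarray, quantities: np.ndarray) -> bool:
    """Check the Generalized Axiom of Revealed Preference for T observations.

    prices, quantities: arrays of shape (T, n_goods). Bundle t is directly
    revealed preferred to s when p_t . q_t >= p_t . q_s; GARP forbids any
    (transitively) revealed-preferred bundle from being strictly revealed worse.
    """
    T = len(prices)
    cost = prices @ quantities.T                # cost[t, s] = p_t . q_s
    R = cost.diagonal()[:, None] >= cost        # direct revealed preference
    for k, i, j in product(range(T), repeat=3): # Warshall transitive closure
        if R[i, k] and R[k, j]:
            R[i, j] = True
    for t, s in product(range(T), repeat=2):
        if R[t, s] and cost[s, s] > cost[s, t]:  # q_s strictly prefers q_t back
            return False
    return True

# Two observations forming a revealed-preference cycle:
p = np.array([[1.0, 2.0], [0.5, 4.0]])
q = np.array([[2.0, 0.0], [0.0, 1.0]])
print(satisfies_garp(p, q))   # -> False
```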
Abstract:
In the first part of this thesis we search for physics beyond the Standard Model through the anomalous production of the Higgs boson, using the razor kinematic variables. We search for anomalous Higgs boson production in proton-proton collisions at a center-of-mass energy of √s = 8 TeV, collected by the Compact Muon Solenoid experiment at the Large Hadron Collider and corresponding to an integrated luminosity of 19.8 fb⁻¹.
In the second part we present a novel method for using a quantum annealer to train a classifier to recognize events containing a Higgs boson decaying to two photons. We train that classifier using simulated proton-proton collisions at √s = 8 TeV producing either a Standard Model Higgs boson decaying to two photons or a non-resonant Standard Model process with a two-photon final state.
The production mechanisms of the Higgs boson are precisely predicted by the Standard Model through its association with electroweak symmetry breaking. We measure the yield of Higgs bosons decaying to two photons in kinematic regions predicted to have very little contribution from a Standard Model Higgs boson and search for an excess of events, which would be evidence of either non-standard production or non-standard properties of the Higgs boson. We divide the events into disjoint categories based on kinematic properties and the presence of additional b-quarks produced in the collisions. In each of these categories, we use the razor kinematic variables to characterize events whose topological configurations are incompatible with those typical of Standard Model production of the Higgs boson.
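For reference, the razor variables have standard definitions: M_R = √((|p⃗₁| + |p⃗₂|)² − (p_z,1 + p_z,2)²) from the two 'megajet' momenta, M_T^R built from their transverse momenta and the missing transverse energy, and R² = (M_T^R/M_R)². A minimal Python sketch with hypothetical inputs (not the analysis code):

```python
import numpy as np

def razor_variables(j1, j2, met):
    """Razor variables from two megajet 3-momenta and the missing-ET vector.

    j1, j2: (px, py, pz) of the megajets; met: (mex, mey).
    Returns (MR, Rsq), with MR in the same units as the inputs.
    """
    j1, j2, met = map(np.asarray, (j1, j2, met))
    p1, p2 = np.linalg.norm(j1), np.linalg.norm(j2)
    mr = np.sqrt((p1 + p2)**2 - (j1[2] + j2[2])**2)
    met_mag = np.linalg.norm(met)
    pt_sum = j1[:2] + j2[:2]
    mtr = np.sqrt((met_mag * (np.linalg.norm(j1[:2]) + np.linalg.norm(j2[:2]))
                   - met @ pt_sum) / 2)
    return mr, (mtr / mr)**2

# Hypothetical event (GeV):
print(razor_variables((100, 0, 50), (-80, 20, -30), (30, -10)))
```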
We observe an excess of events with di-photon invariant mass compatible with the Higgs boson mass, localized in a small region of the razor plane: 5 events over a predicted background of 0.54 ± 0.28, an observation with a p-value of 10⁻³ and a local significance of 3.35σ. The background prediction comprises 0.48 predicted non-resonant background events and 0.07 predicted SM Higgs boson events. Investigating the properties of this excess, we find that it forms a compelling peak in the di-photon invariant mass distribution and is physically separated in the razor plane from the predicted background. Using another method of measuring the background and the significance of the excess, we find a 2.5σ deviation from the Standard Model hypothesis over a broader range of the razor plane.
In the second part of the thesis we transform the problem of training a classifier, one that distinguishes events with a Higgs boson decaying to two photons from events with other sources of photon pairs, into the Hamiltonian of a spin system whose ground state is the best classifier. We then use a quantum annealer to find the ground state of this Hamiltonian and thereby train the classifier. We find that we can do this successfully in fewer than 400 annealing runs for a problem of median difficulty at the largest problem size considered. The networks trained in this manner exhibit good classification performance, competitive with more complicated machine-learning techniques, and are highly resistant to overtraining. We also find that the nature of the training gives access to additional solutions that can be used to improve the classification performance by up to 1.2% in some regions.
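A small-scale sketch of this style of formulation (QBoost-like: binary weights select weak classifiers, and the squared training loss becomes a QUBO whose minimum an annealer would find; here exhaustive search stands in for the annealer, and the data are synthetic):

```python
import numpy as np
from itertools import product

def build_qubo(h, y, lam=0.1):
    """QUBO for selecting weak classifiers with binary weights.

    h: (n_samples, n_weak) weak-classifier outputs in {-1, +1};
    y: (n_samples,) labels in {-1, +1}. Minimizing w^T Q w over binary w
    matches the loss ||h @ w / n_weak - y||^2 + lam * sum(w), up to a constant.
    """
    n = h.shape[1]
    Q = h.T @ h / n**2                        # quadratic couplings
    q = -2 * (h.T @ y) / n + lam              # linear terms (w_i^2 = w_i)
    Q = Q.astype(float).copy()
    Q[np.diag_indices(n)] += q                # fold linear terms onto diagonal
    return Q

def solve_brute_force(Q):
    """Stand-in for the annealer: exhaustively minimize w^T Q w (small n only)."""
    n = Q.shape[0]
    best, best_e = None, np.inf
    for bits in product([0, 1], repeat=n):
        w = np.array(bits, dtype=float)
        e = w @ Q @ w
        if e < best_e:
            best, best_e = w, e
    return best

rng = np.random.default_rng(0)
y = rng.choice([-1, 1], size=200)
h = np.sign(y[:, None] * rng.normal(0.3, 1.0, size=(200, 6)))  # noisy weak learners
print("selected weak classifiers:", solve_brute_force(build_qubo(h, y)))
```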
Abstract:
We study nonequilibrium processes in an isolated quantum system, the Dicke model, focusing on the role played by the transition from integrability to chaos and on the presence of excited-state quantum phase transitions. We show that both the diagonal entropy and the entanglement entropy increase abruptly at the onset of chaos, and that in both cases the increase ends just after the system crosses the critical energy of the excited-state quantum phase transition. The link between entropy production, the development of chaos, and the excited-state quantum phase transition is clearest for the entanglement entropy.
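For a pure state of a bipartite system, the entanglement entropy referenced above is the von Neumann entropy of either reduced density matrix. A generic Python sketch via the Schmidt decomposition (illustrative only, not the paper's Dicke-model computation):

```python
import numpy as np

def entanglement_entropy(psi: np.ndarray, dim_a: int, dim_b: int) -> float:
    """Von Neumann entropy of subsystem A for a pure state psi on H_A ⊗ H_B.

    The Schmidt coefficients are the singular values of psi reshaped to a
    (dim_a, dim_b) matrix; S = -sum p_k ln p_k with p_k the squared values.
    """
    s = np.linalg.svd(psi.reshape(dim_a, dim_b), compute_uv=False)
    p = s**2
    p = p[p > 1e-12]                      # drop numerically zero weights
    return float(-(p * np.log(p)).sum())

# Maximally entangled pair of qubits: S = ln 2
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
print(entanglement_entropy(bell, 2, 2))   # ≈ 0.6931
```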
Abstract:
The curated commons is a model in which a flexible library building shell and its infrastructure respond to the specific, time-sensitive needs of differing clients. It applies to faculty research, in particular small-science activities (as opposed to big-science activities, which enjoy major support, including proprietary laboratories and facilities). It provides for the sustained transformation of library facilities, together with their utilitarian and cyber-infrastructures, into flexible, reconfigurable space with cutting-edge technology and sustained funding streams.