25 results for "theory of constraints"
at Universidade do Minho
Abstract:
Doctoral thesis - Doctoral Programme in Industrial and Systems Engineering (PDEIS)
Abstract:
A summary of the constraints from the ATLAS experiment on R-parity-conserving supersymmetry is presented. Results from 22 separate ATLAS searches are considered, each based on analysis of up to 20.3 fb⁻¹ of proton-proton collision data at centre-of-mass energies of √s = 7 and 8 TeV at the Large Hadron Collider. The results are interpreted in the context of the 19-parameter phenomenological minimal supersymmetric standard model, in which the lightest supersymmetric particle is a neutralino, taking into account constraints from previous precision electroweak and flavour measurements as well as from dark-matter-related measurements. The results are presented in terms of constraints on supersymmetric particle masses and are compared to limits from simplified models. The impact of ATLAS searches on parameters such as the dark matter relic density, the couplings of the observed Higgs boson, and the degree of electroweak fine-tuning is also shown. Spectra for surviving supersymmetry model points with low fine-tuning are presented.
Abstract:
Genome-scale metabolic models are valuable tools in the metabolic engineering process, based on the ability of these models to integrate diverse sources of data to produce global predictions of organism behavior. At the most basic level, these models require only a genome sequence to construct, and once built, they may be used to predict essential genes, culture conditions, pathway utilization, and the modifications required to enhance a desired organism behavior. In this chapter, we address two key challenges associated with the reconstruction of metabolic models: (a) leveraging existing knowledge of microbiology, biochemistry, and available omics data to produce the best possible model; and (b) applying available tools and data to automate the reconstruction process. We consider these challenges as we progress through the model reconstruction process, beginning with genome assembly, and culminating in the integration of constraints to capture the impact of transcriptional regulation. We divide the reconstruction process into ten distinct steps: (1) genome assembly from sequenced reads; (2) automated structural and functional annotation; (3) phylogenetic tree-based curation of genome annotations; (4) assembly and standardization of a biochemistry database; (5) genome-scale metabolic reconstruction; (6) generation of a core metabolic model; (7) generation of a biomass composition reaction; (8) completion of the draft metabolic model; (9) curation of the metabolic model; and (10) integration of regulatory constraints. Each of these ten steps is documented in detail.
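Once reconstructed, such models are typically interrogated by flux balance analysis: maximize a biomass flux subject to steady-state stoichiometric constraints and flux bounds. A minimal sketch, using a hypothetical three-reaction toy network (not from the chapter) and SciPy's linear programming solver:

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (rows: metabolites A, B; columns: reactions).
# R1: -> A (uptake), R2: A -> B, R3: B -> biomass (export). All hypothetical.
S = np.array([
    [1.0, -1.0,  0.0],   # metabolite A balance
    [0.0,  1.0, -1.0],   # metabolite B balance
])

# Steady state requires S v = 0; per-reaction flux bounds (uptake capped at 10).
bounds = [(0, 10), (0, 100), (0, 100)]

# Maximize the "biomass" flux v3 (linprog minimizes, so negate it).
c = np.array([0.0, 0.0, -1.0])
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print(res.x)  # optimal flux distribution; biomass flux limited by the uptake bound
```

The steady-state constraint forces all three fluxes to be equal here, so the uptake bound of 10 fixes the optimum.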
Abstract:
A search for a new resonance decaying to a W or Z boson and a Higgs boson in the ℓℓ/ℓν/νν + bb̄ final states is performed using 20.3 fb⁻¹ of pp collision data recorded at √s = 8 TeV with the ATLAS detector at the Large Hadron Collider. The search is conducted by examining the WH/ZH invariant mass distribution for a localized excess. No significant deviation from the Standard Model background prediction is observed. The results are interpreted in terms of constraints on the Minimal Walking Technicolor model and on a simplified approach based on a phenomenological Lagrangian of Heavy Vector Triplets.
Abstract:
This chapter presents a general methodology for the formulation of the kinematic constraint equations at the position, velocity and acceleration levels. A brief characterization of the different types of constraints is also offered, namely holonomic and nonholonomic constraints. The kinematic constraints described here are formulated using generalized coordinates. The chapter ends with a general approach to the kinematic analysis of multibody systems.
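For reference, the position-, velocity- and acceleration-level constraint equations on generalized coordinates q are conventionally written (a standard multibody formulation, stated here as a sketch rather than the chapter's exact notation) as:

```latex
% Position level: holonomic constraints on the generalized coordinates q
\Phi(\mathbf{q}, t) = \mathbf{0}
% Velocity level: differentiate once with respect to time
\Phi_{\mathbf{q}}\,\dot{\mathbf{q}} = -\Phi_t \equiv \boldsymbol{\nu}
% Acceleration level: differentiate a second time
\Phi_{\mathbf{q}}\,\ddot{\mathbf{q}} =
  -\left(\Phi_{\mathbf{q}}\dot{\mathbf{q}}\right)_{\mathbf{q}}\dot{\mathbf{q}}
  - 2\,\Phi_{\mathbf{q}t}\,\dot{\mathbf{q}} - \Phi_{tt} \equiv \boldsymbol{\gamma}
```

Here Φ_q is the constraint Jacobian; the right-hand sides ν and γ collect the terms that do not depend on q̇ and q̈ respectively, which is what makes the velocity and acceleration equations linear systems.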
Abstract:
Doctoral thesis in Business Sciences.
Abstract:
We investigate the impact of cross-delisting on firms’ financial constraints and investment sensitivities. We find that firms that cross-delisted from a U.S. stock exchange face stronger post-delisting financial constraints than their cross-listed counterparts, as measured by investment-to-cash flow sensitivity. Following a delisting, the sensitivity of investment to cash flow increases significantly and firms also tend to save more cash out of cash flows. Moreover, this increase appears to be primarily driven by informational frictions that constrain access to external financing. We document that information asymmetry problems are stronger for firms from countries with weaker shareholder protection and for firms from less developed capital markets.
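Investment-to-cash-flow sensitivity is typically estimated as the cash-flow coefficient in a regression of investment on cash flow plus controls. A minimal illustration on simulated data (all firm names, coefficients and distributions below are hypothetical, not the paper's sample), with a Tobin's q control:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500  # simulated firm-year observations

# Hypothetical firm-level data, scaled by capital stock.
cash_flow = rng.normal(0.2, 0.05, n)
tobins_q = rng.normal(1.5, 0.3, n)
beta_cf = 0.4  # "true" sensitivity used to simulate investment

investment = 0.05 + beta_cf * cash_flow + 0.02 * tobins_q + rng.normal(0, 0.01, n)

# OLS: investment_i = a + b * cash_flow_i + c * q_i + e_i
X = np.column_stack([np.ones(n), cash_flow, tobins_q])
coef, *_ = np.linalg.lstsq(X, investment, rcond=None)
print(coef[1])  # estimated investment-to-cash-flow sensitivity, close to 0.4
```

In the paper's setting, the finding is that this estimated coefficient rises after a cross-delisting, relative to firms that remain cross-listed.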
Abstract:
The kinetics of GnP dispersion in polypropylene melt was studied using a prototype small-scale modular extensional mixer. Its modular nature enabled the sequential application of a mixing step, melt relaxation, and a second mixing step. The latter could reproduce the flow conditions of the first mixing step, or generate milder flow conditions. The effect of these sequences of flow constraints upon GnP dispersion along the mixer length was studied for composites with 2 and 10 wt.% GnP. The samples collected along the first mixing zone showed a gradual decrease in the number and size of GnP agglomerates, at a rate that was independent of the flow conditions imposed on the melt, but dependent on composition. The relaxation zone induced GnP re-agglomeration, and the application of a second mixing step caused variable dispersion results that were largely dependent on the hydrodynamic stresses generated.
Abstract:
Nowadays, many P2P applications proliferate in the Internet. The attractiveness of many of these systems relies on the collaborative approach used to exchange large resources without the dependence and associated constraints of centralized approaches, where a single server is responsible for handling all the requests from the clients. As a consequence, some P2P systems are also interesting and cost-effective approaches to be adopted by content providers and other Internet players. However, there are several coexistence problems between P2P applications and Internet Service Providers (ISPs) due to the unforeseeable behavior of P2P traffic aggregates in ISP infrastructures. In this context, this work proposes a collaborative P2P/ISP system able to underpin the development of novel Traffic Engineering (TE) mechanisms contributing to a better coexistence between P2P applications and ISPs. Using the devised system, two TE methods are described that are able to estimate and control the impact of P2P traffic aggregates on the ISP network links. One of the TE methods allows ISP administrators to foresee the expected impact that a given P2P swarm will have on the underlying network infrastructure. The other TE method enables the definition of ISP-friendly P2P topologies, where specific network links are protected from P2P traffic. As a result, the proposed system and associated mechanisms contribute to improved ISP resource management and foster the deployment of innovative ISP-friendly systems.
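The first TE method, foreseeing a swarm's impact on network links, can be sketched as aggregating expected peer-pair traffic over the ISP's routing paths. This is only an illustration of the idea, not the thesis's actual mechanism; peer names, link names and rates are hypothetical:

```python
from collections import defaultdict

# Hypothetical inputs: the ISP routing path (sequence of links) between each
# pair of peer locations, and the expected exchange rate (Mbps) per peer pair.
paths = {
    ("p1", "p2"): ["l1", "l2"],
    ("p1", "p3"): ["l1", "l3"],
    ("p2", "p3"): ["l2", "l3"],
}
rate = {("p1", "p2"): 5.0, ("p1", "p3"): 3.0, ("p2", "p3"): 2.0}

def link_impact(paths, rate):
    """Estimate per-link load (Mbps) by summing each peer pair's expected
    rate over every link on the path that carries that pair's traffic."""
    load = defaultdict(float)
    for pair, links in paths.items():
        for link in links:
            load[link] += rate.get(pair, 0.0)
    return dict(load)

impact = link_impact(paths, rate)
print(impact)  # {'l1': 8.0, 'l2': 7.0, 'l3': 5.0}
```

The second TE method would then bias the swarm topology away from peer pairs whose paths traverse a protected link, e.g. by dropping those pairs before the aggregation.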
Abstract:
A search is performed for Higgs bosons produced in association with top quarks using the diphoton decay mode of the Higgs boson. Selection requirements are optimized separately for leptonic and fully hadronic final states from the top quark decays. The dataset used corresponds to an integrated luminosity of 4.5 fb⁻¹ of proton-proton collisions at a center-of-mass energy of 7 TeV and 20.3 fb⁻¹ at 8 TeV recorded by the ATLAS detector at the CERN Large Hadron Collider. No significant excess over the background prediction is observed and upper limits are set on the tt̄H production cross section. The observed exclusion upper limit at 95% confidence level is 6.7 times the predicted Standard Model cross section value. In addition, limits are set on the strength of the Yukawa coupling between the top quark and the Higgs boson, taking into account the dependence of the tt̄H and tH cross sections as well as the H→γγ branching fraction on the Yukawa coupling. Lower and upper limits at 95% confidence level are set at −1.3 and +8.0 times the Yukawa coupling strength in the Standard Model.
Abstract:
[Excerpt] The answer to the social and economic challenges that literacy (or its lack) is assumed to pose to developed countries deeply concerns the public policies of governments, namely those of the OECD area. In recent decades, these concerns have given rise to diverse monitoring devices, initiatives and programmes for the development of (mainly) reading, placing a strong emphasis on education. UNESCO (2006, p. 6), for instance, assumes that the literacy challenge can only be met by raising the quality of primary and secondary education and intensifying programmes explicitly oriented towards youth and adult literacy. (...)
Abstract:
A search for pair production of vector-like quarks, both up-type (T) and down-type (B), as well as for four-top-quark production, is presented. The search is based on pp collisions at √s = 8 TeV recorded in 2012 with the ATLAS detector at the CERN Large Hadron Collider and corresponding to an integrated luminosity of 20.3 fb⁻¹. Data are analysed in the lepton-plus-jets final state, characterised by an isolated electron or muon with high transverse momentum, large missing transverse momentum and multiple jets. Dedicated analyses are performed targeting three cases: a T quark with significant branching ratio to a W boson and a b-quark (TT̄ → Wb+X), and both a T quark and a B quark with significant branching ratio to a Higgs boson and a third-generation quark (TT̄ → Ht+X and BB̄ → Hb+X respectively). No significant excess of events above the Standard Model expectation is observed, and 95% CL lower limits are derived on the masses of the vector-like T and B quarks under several branching ratio hypotheses assuming contributions from T → Wb, Zt, Ht and B → Wt, Zb, Hb decays. The 95% CL observed lower limits on the T quark mass range between 715 GeV and 950 GeV for all possible values of the branching ratios into the three decay modes, and are the most stringent constraints to date. Additionally, the most restrictive upper bounds on four-top-quark production are set in a number of new physics scenarios.
Abstract:
Master's dissertation in Industrial Engineering
Abstract:
The morphological evolution of the city of Braga has been the subject of several studies focusing on different urban areas in different periods. Using the accumulated knowledge provided by the available archaeological, historical and iconographic data of Braga, from Roman times to the nineteenth century, we intend to present a working methodology for the 3D representation of urban areas and their evolution, using the ESRI CityEngine tool. Different types of graphic and cartographic data will be integrated in an archaeological information system for the characterization of urban buildings. Linking this information system to the rules of characterization of urban spaces through the CityEngine tool, we can create the 3D urban spaces and their changes. The building characterization rules include several parameters of architectural elements that can be dynamically changed according to the latest information. This methodology will be applied to the best-known areas within the city, allowing the creation of different and dynamic layouts. Considerations about the concepts, challenges and constraints of using the CityEngine tool for recording and representing urban evolution knowledge will be discussed.
Abstract:
Currently, the quality of the Indonesian national road network is inadequate due to several constraints, including overcapacity and overloaded trucks. The high deterioration rate of the road infrastructure in developing countries, along with major budgetary restrictions and high growth in traffic, has led to an emerging need for improving the performance of the highway maintenance system. However, the high number of intervening factors and their complex effects require advanced tools to successfully solve this problem. The high learning capabilities of Data Mining (DM) techniques make them a powerful solution to this problem. In the past, these tools have been successfully applied to solve complex and multi-dimensional problems in various scientific fields. Therefore, it is expected that DM can be used to analyze the large amount of data regarding the pavement and traffic, identify the relationships between variables, and provide predictive information. In this paper, we present a new approach to predict the International Roughness Index (IRI) of pavement based on DM techniques. DM was used to analyze the initial IRI data, including age, Equivalent Single Axle Load (ESAL), cracks, potholes, rutting, and long cracks. This model was developed and verified using data from the Integrated Indonesia Road Management System (IIRMS) that was measured with the National Association of Australian State Road Authorities (NAASRA) roughness meter. The results of the proposed approach are compared with the IIRMS analytical model adapted to the IRI, and the advantages of the new approach are highlighted. We show that the novel data-driven model is able to learn, with high accuracy, the complex relationships between the IRI and contributing factors such as overloaded trucks.
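As an illustration of the kind of data-driven IRI prediction described, here is a minimal k-nearest-neighbours regressor on synthetic pavement records. The feature set, coefficients and units are assumptions for the sketch, not the IIRMS data or the paper's actual DM model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic pavement records: [age (years), ESAL (millions), cracking (%)].
X = rng.uniform([0.0, 0.0, 0.0], [20.0, 5.0, 30.0], size=(200, 3))
# Hypothetical ground truth: IRI (m/km) grows with age, load and cracking.
y = 1.5 + 0.12 * X[:, 0] + 0.4 * X[:, 1] + 0.05 * X[:, 2]

def knn_predict(X_train, y_train, x, k=5):
    """Predict IRI as the mean target of the k nearest training records."""
    d = np.linalg.norm(X_train - x, axis=1)   # Euclidean distance to each record
    nearest = np.argsort(d)[:k]               # indices of the k closest records
    return y_train[nearest].mean()

x_new = np.array([10.0, 2.0, 15.0])           # a new road segment to assess
print(knn_predict(X, y, x_new))               # predicted IRI (m/km)
```

A tree- or neural-network-based learner, as commonly used in DM pipelines, would replace `knn_predict` here; the surrounding workflow (fit on historical IIRMS records, predict roughness for unmeasured segments) stays the same.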