91 results for search engine optimization
Abstract:
This paper presents a search for Higgs bosons decaying to four leptons, either electrons or muons, via one or two light exotic gauge bosons Zd, H→ZZd→4ℓ or H→ZdZd→4ℓ. The search was performed using pp collision data corresponding to an integrated luminosity of about 20 fb⁻¹ at the center-of-mass energy of √s = 8 TeV recorded with the ATLAS detector at the Large Hadron Collider. The observed data are well described by the Standard Model prediction. Upper bounds on the branching ratio of H→ZZd→4ℓ and on the kinetic mixing parameter between the Zd and the Standard Model hypercharge gauge boson are set in the ranges (1–9)×10⁻⁵ and (4–17)×10⁻², respectively, at 95% confidence level, assuming the Standard Model branching ratio of H→ZZ*→4ℓ, for Zd masses between 15 and 55 GeV. Upper bounds on the effective mass mixing parameter between the Z and the Zd are also set using the branching ratio limits in the H→ZZd→4ℓ search, and are in the range (1.5–8.7)×10⁻⁴ for 15 (...)
Abstract:
Many extensions of the Standard Model posit the existence of heavy particles with long lifetimes. This article presents the results of a search for events containing at least one long-lived particle that decays at a significant distance from its production point into two leptons or into five or more charged particles. This analysis uses a data sample of proton–proton collisions at √s = 8 TeV corresponding to an integrated luminosity of 20.3 fb⁻¹ collected in 2012 by the ATLAS detector operating at the Large Hadron Collider. No events are observed in any of the signal regions, and limits are set on model parameters within supersymmetric scenarios involving R-parity violation, split supersymmetry, and gauge mediation. In some of the search channels, the trigger and search strategy are based only on the decay products of individual long-lived particles, irrespective of the rest of the event. In these cases, the provided limits can easily be reinterpreted in different scenarios.
Abstract:
Results of a search for new phenomena in events with large missing transverse momentum and a Higgs boson decaying to two photons are reported. Data from proton–proton collisions at a center-of-mass energy of 8 TeV and corresponding to an integrated luminosity of 20.3 fb⁻¹ have been collected with the ATLAS detector at the LHC. The observed data are well described by the expected Standard Model backgrounds. Upper limits on the cross section of events with large missing transverse momentum and a Higgs boson candidate are also placed. Exclusion limits are presented for models of physics beyond the Standard Model featuring dark-matter candidates.
Abstract:
Integrated master's dissertation in Industrial Electronics and Computer Engineering
Abstract:
Integrated master's dissertation in Mechanical Engineering
Abstract:
Integrated master's dissertation in Engineering and Management of Information Systems
Abstract:
Doctoral thesis in Materials Engineering.
Abstract:
Integrated master's dissertation in Civil Engineering
Abstract:
Doctoral thesis (Doctoral Programme in Biomedical Engineering)
Abstract:
Kinetic models have great potential for metabolic engineering applications. They can be used to test which genetic and regulatory modifications can increase the production of metabolites of interest, while simultaneously monitoring other key functions of the host organism. This work presents a methodology for increasing productivity in biotechnological processes by exploiting dynamic models. It uses multi-objective dynamic optimization to identify the combination of targets (enzymatic modifications) and the degree of up- or down-regulation that must be applied in order to optimize a set of predefined performance metrics subject to process constraints. The capabilities of the approach are demonstrated on a realistic and computationally challenging application: a large-scale metabolic model of Chinese Hamster Ovary (CHO) cells, which are used for antibody production in a fed-batch process. The proposed methodology achieves sustained and robust growth in CHO cells, increasing productivity while simultaneously increasing biomass production and product titer and keeping the concentrations of lactate and ammonia low. The approach presented here can be used to optimize metabolic models by finding the best combination of targets and their optimal level of up/down-regulation. Furthermore, it can accommodate additional trade-offs and constraints with great flexibility.
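A minimal sketch of the underlying idea is given below: a multi-objective screen over enzyme fold-changes applied to a toy kinetic model. The pathway, rate laws, bounds, and the byproduct constraint are illustrative assumptions and are not taken from the CHO model used in the work; a full application would rely on the large-scale dynamic model and a proper multi-objective dynamic optimizer.

```python
# Minimal sketch (not the paper's CHO model): multi-objective search over
# enzyme fold-changes in a toy kinetic model. Names and kinetics are
# illustrative assumptions, not taken from the publication.
import numpy as np

rng = np.random.default_rng(0)

def simulate(fold_changes, t_end=24.0, dt=0.01):
    """Toy fed-batch: substrate -> intermediate -> product, with a byproduct
    branch. fold_changes scale the three enzyme capacities (targets)."""
    v_max = np.array([2.0, 1.5, 0.8]) * fold_changes   # uptake, synthesis, byproduct
    km = np.array([0.5, 0.3, 0.4])
    S, I, P, B = 10.0, 0.0, 0.0, 0.0                    # substrate, intermediate, product, byproduct
    feed = 0.2                                          # constant substrate feed
    for _ in range(int(t_end / dt)):
        v1 = v_max[0] * S / (km[0] + S)                 # substrate uptake
        v2 = v_max[1] * I / (km[1] + I)                 # product synthesis
        v3 = v_max[2] * I / (km[2] + I)                 # byproduct formation
        S += (feed - v1) * dt
        I += (v1 - v2 - v3) * dt
        P += v2 * dt
        B += v3 * dt
    return P, B                                         # product titer, byproduct level

def dominates(a, b):
    """Pareto dominance for (maximize titer, minimize byproduct)."""
    return a[0] >= b[0] and a[1] <= b[1] and a != b

# Candidate interventions: fold-changes in [0.25, 4] (down- or up-regulation).
candidates = rng.uniform(0.25, 4.0, size=(200, 3))
scores = [simulate(fc) for fc in candidates]

# Keep the non-dominated set subject to a byproduct constraint (a toy analogue
# of keeping lactate/ammonia concentrations low).
feasible = [(fc, s) for fc, s in zip(candidates, scores) if s[1] < 2.0]
pareto = [(fc, s) for fc, s in feasible
          if not any(dominates(o, s) for _, o in feasible)]

for fc, (titer, byp) in sorted(pareto, key=lambda x: -x[1][0])[:5]:
    print(f"fold-changes={np.round(fc, 2)}  titer={titer:.2f}  byproduct={byp:.2f}")
```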
Abstract:
Software reconfigurability has become increasingly relevant to the architectural process due to the growing dependency of modern societies on reliable and adaptable systems. Such systems are supposed to adapt themselves to surrounding environmental changes with minimal service disruption, if any. This paper introduces an engine that statically applies reconfigurations to (formal) models of software architectures. Reconfigurations are specified using a domain-specific language, ReCooPLa, which targets the manipulation of software coordination structures, typically used in service-oriented architectures (SOA). The engine is responsible for the compilation of ReCooPLa instances and their application to the relevant coordination structures. The resulting configurations are amenable to formal analysis of qualitative and quantitative (probabilistic) properties.
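For illustration only, the sketch below models a coordination structure as a set of channels between ports and applies a reconfiguration to it statically, i.e. on the model rather than on a running system. The data types and the replace_channel operation are assumptions made for this example; they do not reproduce the ReCooPLa language or its compilation engine.

```python
# Illustrative sketch only: a toy model of a coordination structure (a set of
# channels between ports) and a reconfiguration applied statically to it.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Channel:
    kind: str          # e.g. "sync", "fifo", "lossy"
    source: str        # source port name
    sink: str          # sink port name

@dataclass
class CoordinationStructure:
    channels: set = field(default_factory=set)

    def remove(self, channel: Channel) -> None:
        self.channels.discard(channel)

    def add(self, channel: Channel) -> None:
        self.channels.add(channel)

def replace_channel(cs: CoordinationStructure, old: Channel, new_kind: str) -> None:
    """A toy reconfiguration: swap a channel for one of another kind while
    preserving its end points."""
    cs.remove(old)
    cs.add(Channel(new_kind, old.source, old.sink))

# Build a small structure and apply the reconfiguration on the model,
# before any deployment.
cs = CoordinationStructure({Channel("sync", "client.out", "service.in")})
replace_channel(cs, Channel("sync", "client.out", "service.in"), "fifo")
print(cs.channels)   # {Channel(kind='fifo', source='client.out', sink='service.in')}
```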
Abstract:
[Excerpt] Bioethanol from lignocellulosic materials (LCM), also called second-generation bioethanol, is considered a promising alternative to first-generation bioethanol. An efficient production process for lignocellulosic bioethanol involves an effective pretreatment of LCM to improve the accessibility of cellulose and thus enhance the enzymatic saccharification. One interesting approach is to use the whole slurry from the pretreatment, since it offers economic and industrial benefits: washing steps are avoided, water consumption is lower, and the sugars from the liquid phase can be used, increasing the ethanol concentration [1]. However, during the pretreatment step some compounds (such as furans, phenolic compounds and weak acids) are produced. These compounds have an inhibitory effect on the microorganisms used for hydrolysate fermentation [2]. To overcome this, the use of a robust industrial strain together with agro-industrial by-products as nutritional supplementation was proposed to increase ethanol productivities and yields. (...)
Abstract:
Fluorescence in situ hybridization (FISH) is a molecular technique widely used for the detection and characterization of microbial populations. FISH is affected by a wide variety of abiotic and biotic variables and by the way they interact with each other, which translates into the wide variability of FISH procedures found in the literature. The aim of this work is to systematically study the effects of pH, dextran sulfate and probe concentration in the FISH protocol, using a general peptide nucleic acid (PNA) probe for the Eubacteria domain. For this, response surface methodology was used to optimize these three PNA-FISH parameters for Gram-negative (Escherichia coli and Pseudomonas fluorescens) and Gram-positive species (Listeria innocua, Staphylococcus epidermidis and Bacillus cereus). The results show that a probe concentration higher than 300 nM is favorable for both groups. Interestingly, a clear distinction between the two groups regarding the optimal pH and dextran sulfate concentration was found: a high pH (approx. 10) combined with a lower dextran sulfate concentration (approx. 2% [w/v]) for Gram-negative species, and a near-neutral pH (approx. 8) together with higher dextran sulfate concentrations (approx. 10% [w/v]) for Gram-positive species. This behavior seems to result from an interplay between pH and dextran sulfate and their ability to influence probe concentration and diffusion towards the rRNA target. This study shows that, for an optimal hybridization protocol, dextran sulfate and pH should be adjusted according to the target bacteria.
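The response-surface step can be pictured with the sketch below, which fits a full quadratic model in pH, dextran sulfate and probe concentration and locates the predicted optimum on a grid. The design points and fluorescence values are random placeholders rather than the study's measurements, and the factor ranges are assumptions chosen only to roughly mirror those discussed above.

```python
# Sketch of the response-surface idea, with synthetic placeholder data:
# fit a second-order polynomial in (pH, dextran sulfate %, probe nM) to a
# fluorescence response, then locate the optimum on a grid.
import numpy as np

rng = np.random.default_rng(1)

def design_matrix(X):
    """Full quadratic model: intercept, linear, interaction and squared terms."""
    ph, ds, probe = X.T
    return np.column_stack([
        np.ones(len(X)), ph, ds, probe,
        ph * ds, ph * probe, ds * probe,
        ph**2, ds**2, probe**2,
    ])

# Hypothetical design points (in practice a central composite design would be
# used) and a synthetic response standing in for measured fluorescence.
X = rng.uniform([7.0, 1.0, 100.0], [10.5, 12.0, 500.0], size=(30, 3))
y = rng.normal(loc=100.0, scale=10.0, size=30)

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

# Evaluate the fitted surface on a grid and report the predicted optimum.
grid = np.array(np.meshgrid(
    np.linspace(7.0, 10.5, 20),
    np.linspace(1.0, 12.0, 20),
    np.linspace(100.0, 500.0, 20),
)).reshape(3, -1).T
pred = design_matrix(grid) @ beta
best = grid[np.argmax(pred)]
print(f"predicted optimum: pH={best[0]:.1f}, dextran sulfate={best[1]:.1f}%, probe={best[2]:.0f} nM")
```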
Abstract:
This paper presents an automated optimization framework able to provide network administrators with resilient routing configurations for link-state protocols, such as OSPF or IS-IS. In order to deal with the formulated NP-hard optimization problems, the devised framework is underpinned by computational intelligence optimization engines, such as Multi-objective Evolutionary Algorithms (MOEAs). To demonstrate the framework's capabilities, two illustrative Traffic Engineering methods are described, which attain routing configurations that are robust to changes in the traffic demands and keep the network stable even in the presence of link-failure events. The presented illustrative results clearly corroborate the usefulness of the proposed automated framework along with the devised optimization methods.
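As a rough sketch of the kind of search involved (not the paper's framework), the example below evolves OSPF-style integer link weights on a toy topology, scoring each weight setting by the maximum link utilization in the normal state and under the worst single-link failure. The topology, demands, and the simple scalarized evolutionary loop are assumptions; a real implementation would use a proper MOEA such as NSGA-II. It relies on the networkx package for shortest-path routing.

```python
# Toy illustration: evolve integer link weights so that weight-shortest-path
# routing stays uncongested both normally and under any single link failure.
import random
import networkx as nx

random.seed(0)

EDGES = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "A"), ("A", "C")]
CAPACITY = 10.0
DEMANDS = {("A", "C"): 6.0, ("B", "D"): 4.0, ("A", "D"): 3.0}

def max_utilization(weights, failed_edge=None):
    """Route every demand on its weight-shortest path and return the
    highest link utilization."""
    g = nx.Graph()
    for (u, v), w in zip(EDGES, weights):
        if (u, v) != failed_edge:
            g.add_edge(u, v, weight=w)
    load = {tuple(sorted(e)): 0.0 for e in g.edges}
    for (s, t), d in DEMANDS.items():
        path = nx.shortest_path(g, s, t, weight="weight")
        for u, v in zip(path, path[1:]):
            load[tuple(sorted((u, v)))] += d
    return max(l / CAPACITY for l in load.values())

def fitness(weights):
    normal = max_utilization(weights)
    worst_failure = max(max_utilization(weights, failed_edge=e) for e in EDGES)
    return normal, worst_failure

def mutate(weights):
    w = list(weights)
    i = random.randrange(len(w))
    w[i] = max(1, min(20, w[i] + random.choice([-2, -1, 1, 2])))
    return tuple(w)

# Simple (mu + lambda) loop with a weighted-sum scalarization of the two
# objectives; a real framework would keep a Pareto front instead.
population = [tuple(random.randint(1, 20) for _ in EDGES) for _ in range(20)]
for _ in range(100):
    offspring = [mutate(random.choice(population)) for _ in range(20)]
    population = sorted(set(population + offspring),
                        key=lambda w: 0.5 * sum(fitness(w)))[:20]

best = population[0]
print("weights:", best, "-> (normal, worst-failure) utilization:", fitness(best))
```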
Abstract:
Current data mining engines are difficult to use, requiring optimizations by data mining experts in order to provide optimal results. To address this problem, a new concept was devised that maintains the functionality of current data mining tools while adding pervasive characteristics such as invisibility and ubiquity, which focus on the users and provide better ease of use and usefulness through autonomous and intelligent data mining processes. This article introduces an architecture for implementing a data mining engine composed of four major components: database, Middleware (control), Middleware (processing), and interface. These components are interlinked but scale independently, allowing the system to adapt to the user's needs. A prototype has been developed in order to test the architecture. The results are very promising, demonstrating the architecture's functionality as well as the need for further improvements.
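A minimal sketch of this four-component split is shown below, with the components modelled as decoupled Python classes. The component names follow the article, but every method and the trivial "mining" step are assumptions made purely for illustration.

```python
# Illustrative sketch only: database, control middleware, processing
# middleware and interface as decoupled, independently replaceable parts.
from dataclasses import dataclass, field
from typing import Any

@dataclass
class Database:
    """Stores raw data and mining results."""
    tables: dict = field(default_factory=dict)

    def read(self, name: str) -> list:
        return self.tables.get(name, [])

    def write(self, name: str, rows: list) -> None:
        self.tables[name] = rows

class ProcessingMiddleware:
    """Runs the actual mining task; here, a trivial summary stands in."""
    def run(self, rows: list) -> Any:
        values = [r["value"] for r in rows]
        return {"mean": sum(values) / len(values)} if values else {}

class ControlMiddleware:
    """Decides what to mine and when, hiding the details from the user
    (the 'invisible' part of the concept)."""
    def __init__(self, db: Database, proc: ProcessingMiddleware):
        self.db, self.proc = db, proc

    def handle(self, dataset: str) -> Any:
        result = self.proc.run(self.db.read(dataset))
        self.db.write(f"{dataset}_results", [result])
        return result

class Interface:
    """Thin user-facing layer; the user only asks a question."""
    def __init__(self, control: ControlMiddleware):
        self.control = control

    def ask(self, dataset: str) -> Any:
        return self.control.handle(dataset)

db = Database({"sales": [{"value": 10}, {"value": 14}, {"value": 12}]})
ui = Interface(ControlMiddleware(db, ProcessingMiddleware()))
print(ui.ask("sales"))   # {'mean': 12.0}
```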