50 results for process measurement
at Universidade do Minho
Abstract:
We report the observation of Higgs boson decays to WW* based on an excess over background of 6.1 standard deviations in the dilepton final state, where the Standard Model expectation is 5.8 standard deviations. Evidence for the vector-boson fusion (VBF) production process is obtained with a significance of 3.2 standard deviations. The results are obtained from a data sample corresponding to an integrated luminosity of 25 fb−1 from √s = 7 and 8 TeV pp collisions recorded by the ATLAS detector at the LHC. For a Higgs boson mass of 125.36 GeV, the ratio of the measured value to the expected value of the total production cross section times branching fraction is 1.09 +0.16/−0.15 (stat.) +0.17/−0.14 (syst.). The corresponding ratios for the gluon-fusion and vector-boson-fusion production mechanisms are 1.02 ± 0.19 (stat.) +0.22/−0.18 (syst.) and 1.27 +0.44/−0.40 (stat.) +0.30/−0.21 (syst.), respectively. At √s = 8 TeV, the total production cross sections are measured to be σ(gg → H → WW*) = 4.6 ± 0.9 (stat.) +0.8/−0.7 (syst.) pb and σ(VBF H → WW*) = 0.51 +0.17/−0.15 (stat.) +0.13/−0.08 (syst.) pb. The fiducial cross section is determined for the gluon-fusion process in exclusive final states with zero or one associated jet.
Abstract:
The assessment of existing timber structures is often limited to information obtained from non-destructive or semi-destructive testing, as mechanical testing is in many cases not possible due to its destructive nature. The available data therefore provide only an indirect measurement of the reference mechanical properties of timber elements, often obtained through empirically based correlations. Moreover, the data must result from the combination of different tests in order to provide a reliable source of information for a structural analysis. Even if general guidelines are available for each typology of testing, there is still a need for a global methodology that combines information from different sources and supports inference upon that information in a decision process. In this scope, the present work presents the implementation of a probability-based framework for the safety assessment of existing timber elements. The methodology combines information gathered at different scales and follows a probabilistic framework that allows the structural assessment of existing timber elements, with the possibility of inference on and updating of their mechanical properties through Bayesian methods. The framework is based on four main steps: (i) scale of information; (ii) measurement data; (iii) probability assignment; and (iv) structural analysis. The proposed methodology is implemented in a case study, with data obtained through a multi-scale experimental campaign on old chestnut timber beams that accounts for correlations of non- and semi-destructive tests with mechanical properties. Finally, different inference scenarios are discussed, aiming at the characterization of the safety level of the elements.
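To make the Bayesian updating step concrete, here is a minimal sketch of a conjugate normal-normal update of a timber reference property from indirect test readings. The prior, the noise level of the NDT-to-property correlation, and the data values are illustrative placeholders, not values from the study.

```python
import numpy as np
from scipy.stats import norm

# Conjugate normal-normal Bayesian update of a timber reference property
# (e.g. modulus of elasticity, MOE) from indirect NDT readings.
# All priors, noise levels and data below are illustrative placeholders.

prior_mean, prior_sd = 11.0, 2.0       # prior MOE belief [GPa], e.g. from visual grading
noise_sd = 1.5                          # std. dev. of the NDT-to-MOE correlation error [GPa]
ndt_moe = np.array([9.8, 10.5, 10.1])   # MOE estimates inferred from NDT readings [GPa]

n = len(ndt_moe)
post_var = 1.0 / (1.0 / prior_sd**2 + n / noise_sd**2)
post_mean = post_var * (prior_mean / prior_sd**2 + ndt_moe.sum() / noise_sd**2)

# 5% fractile of the posterior predictive, a common characteristic value
# in structural safety verification
char_value = norm.ppf(0.05, loc=post_mean, scale=np.sqrt(post_var + noise_sd**2))
print(f"posterior MOE: {post_mean:.2f} +/- {np.sqrt(post_var):.2f} GPa")
print(f"5% fractile (predictive): {char_value:.2f} GPa")
```

Each new test campaign can reuse the posterior as the next prior, which is what makes the multi-scale combination of information tractable.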
Abstract:
The innovative Horizon 2020 program sponsored by the European Union (EU) aims to promote and develop processes for integrating waste into construction materials. However, several potential health hazards caused by building materials have been identified, and there is therefore an ongoing need to develop new recycling methods for hazardous wastes and efficient barriers to prevent toxic releases from new construction solutions incorporating wastes. This paper presents an overview focused on two main aspects: the identification of the health risks related to radioactivity and heavy metals present in building materials, and the identification of these toxic substances in new construction solutions that contain recycled wastes. Different waste materials were selected, and distinct toxicity evaluation methodologies are presented to analyse the potential hazards, the feasibility of using those wastes and the achievement of optimal construction solutions involving wastes.
Abstract:
Given the current economic situation of Portuguese municipalities, it is necessary to identify priority investments in order to achieve more efficient financial management. Classifying a municipality's road network according to the occurrence of traffic accidents is fundamental to setting priorities for road interventions. This paper presents a model for road network classification based on traffic accidents, integrated in a geographic information system. Its practical application was developed through a case study in the municipality of Barcelos. An equation was defined to obtain a road safety index through the combination of the following indicators: severity, property damage only and accident costs. In addition to the road network classification, the application of the model allows the spatial coverage of accidents to be analyzed in order to determine the centrality and dispersion of the locations with the highest incidence of road accidents. This analysis can be further refined according to the nature of the accidents, namely collisions, run-off-road crashes and pedestrian crashes.
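The index equation is a weighted combination of the three named indicators; a minimal sketch of that idea follows. The weights, normalisation and segment names are hypothetical, not the equation calibrated in the Barcelos case study.

```python
# Illustrative road safety index as a weighted combination of the three
# indicators named in the abstract. Weights and segment data are invented.

def road_safety_index(severity, pdo, cost, weights=(0.5, 0.2, 0.3)):
    """Combine normalised indicators (each already scaled to [0, 1])
    into a single index; higher values flag higher-priority segments."""
    w_sev, w_pdo, w_cost = weights
    return w_sev * severity + w_pdo * pdo + w_cost * cost

# Rank two hypothetical road segments of the network
segments = {"EN103-km12": (0.9, 0.4, 0.7), "EM522-km03": (0.3, 0.8, 0.2)}
ranked = sorted(segments, key=lambda s: road_safety_index(*segments[s]), reverse=True)
print(ranked)  # highest-priority segment first
```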
Abstract:
Due to the increasing acceptance of BPM, BPM tools are nowadays extensively used in organizations. Core to BPM are process modeling languages, of which BPMN is currently the one receiving most attention. Once a business process is described using BPMN, a process simulation approach can be used to find its optimized form. In this context, the simulation of business processes, such as those defined in BPMN, appears as an obvious way of improving processes. This paper analyzes the business process modeling and simulation areas, identifying the elements that must be present in the BPMN language to allow processes described in BPMN to be simulated. During this analysis, a set of existing BPM tools that support BPMN are compared regarding their limitations in terms of simulation support.
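As a rough illustration of what simulating a BPMN-described process entails, here is a minimal discrete-event sketch using the SimPy library: tasks become timed activities, and a BPMN resource (lane/pool participant) becomes a capacity-limited queue. The process, rates and names are invented, not taken from any of the compared tools.

```python
import random
import simpy  # discrete-event simulation library (pip install simpy)

# Minimal simulation of a one-task BPMN-like process: requests arrive,
# queue for a single "reviewer" resource, then complete.

random.seed(1)
cycle_times = []

def handle(env, reviewer, t_arrived):
    with reviewer.request() as req:   # BPMN task waiting on its resource
        yield req
        yield env.timeout(random.expovariate(1 / 4.0))  # task duration, mean 4 min
    cycle_times.append(env.now - t_arrived)

def arrivals(env, reviewer):
    while True:
        yield env.timeout(random.expovariate(1 / 5.0))  # inter-arrival, mean 5 min
        env.process(handle(env, reviewer, env.now))

env = simpy.Environment()
reviewer = simpy.Resource(env, capacity=1)
env.process(arrivals(env, reviewer))
env.run(until=480)  # one simulated working day in minutes
print(f"mean cycle time: {sum(cycle_times) / len(cycle_times):.1f} min")
```

Running variants of such a model (different capacities, routing probabilities at gateways) is the optimization loop the abstract alludes to.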
Abstract:
Although most of the accidents occurring in Olive Oil Mills (OOM) result from “basic” risks, there is a need to apply adequate tools to support risk decisions that can meet the specificities of this sector. This study aims to analyse the views of Occupational Safety & Health (OSH) practitioners on the risk assessment process in OOM, identifying the key difficulties inherent to the risk assessment process in this sector, as well as identifying some improvements to current practice. The analysis was based on a questionnaire developed and applied to 13 OSH practitioners working at OOM. The results showed that the time available to perform the risk assessment is the most frequent limitation, and that practitioners do not regard the available methodologies as an important limitation to this process. However, a specific risk assessment methodology that includes acceptance criteria adjusted to the OOM reality, using risk metrics supported by the frequency of accidents and workdays lost, was also indicated as an important contribution to improve the process. A semi-quantitative approach, complemented with the use of sector accident statistics, can be a good solution for this sector. However, further strategies should also be adopted, mainly those that can lead to an easy application of the risk assessment process.
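A semi-quantitative metric of the kind suggested could look like the sketch below: frequency scored from sector accident statistics, severity from workdays lost, and an acceptance threshold on their product. The score bands, labels and threshold are hypothetical.

```python
# Sketch of a semi-quantitative risk metric for olive oil mills:
# frequency scored from sector accident statistics, severity from
# workdays lost. Scores, labels and threshold are hypothetical.

FREQ_SCORE = {"rare": 1, "occasional": 2, "frequent": 3}   # from accident counts
SEV_SCORE = {"minor": 1, "lost_days": 2, "permanent": 3}    # from workdays lost

def risk_level(freq, sev, acceptance_threshold=4):
    score = FREQ_SCORE[freq] * SEV_SCORE[sev]
    return score, "acceptable" if score < acceptance_threshold else "action required"

print(risk_level("occasional", "lost_days"))  # (4, 'action required')
```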
Abstract:
Integrated master's dissertation in Industrial Engineering and Management
Abstract:
The performance of parts produced by Free Form Extrusion (FFE), an increasingly popular additive manufacturing technique, depends mainly on their dimensional accuracy, surface quality and mechanical performance. These attributes are strongly influenced by the evolution of the filament temperature and deformation during deposition and solidification. Consequently, the availability of adequate process modelling software would offer a powerful tool to support efficient process set-up and optimisation. This work examines the contribution to the overall heat transfer of the various thermal phenomena developing during the manufacturing sequence, including convection and radiation to the environment, conduction with the support and between adjacent filaments, radiation between adjacent filaments and convection with entrapped air. The magnitude of the mechanical deformation is also studied. Once this exercise is completed, it is possible to select the material properties, process variables and thermal phenomena that should be taken into account for effective numerical modelling of FFE.
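As a toy version of such a thermal model, the sketch below integrates lumped-capacitance cooling of a single deposited filament, keeping only two of the phenomena listed (convection and radiation to the environment). The material and process values are generic ABS-like placeholders, not the paper's data.

```python
# Lumped-capacitance cooling of one deposited filament: explicit Euler
# integration of dT/dt = -(A/V) * q / (rho * c), with q the combined
# convective + radiative flux. All property values are placeholders.

rho, c = 1050.0, 2000.0        # density [kg/m^3], specific heat [J/(kg K)]
d = 0.4e-3                      # filament diameter [m]
h = 60.0                        # convection coefficient [W/(m^2 K)]
eps, sigma = 0.9, 5.67e-8       # emissivity, Stefan-Boltzmann constant
T_env = 300.0                   # environment temperature [K]

area_per_vol = 4.0 / d          # lateral area / volume for a long cylinder
T, dt, t = 500.0, 1e-3, 0.0     # extrusion temperature [K], time step [s]
while T - T_env > 5.0:          # integrate until near-ambient
    q = h * (T - T_env) + eps * sigma * (T**4 - T_env**4)  # flux [W/m^2]
    T -= q * area_per_vol / (rho * c) * dt
    t += dt
print(f"cools to within 5 K of ambient in ~{t:.2f} s")
```

Adding the remaining phenomena (conduction to the support and to neighbouring filaments, radiation between filaments, convection with entrapped air) amounts to extra flux terms of the same form, which is precisely the bookkeeping the paper's contribution analysis supports.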
Abstract:
Business Intelligence (BI) can be seen as a method that gathers information and data from information systems in order to help companies be more accurate in their decision-making process. Traditionally, BI systems were associated with the use of Data Warehouses (DW), whose prime purpose is to serve as a repository storing all the relevant information required for making the correct decision. The need to integrate streaming data became crucial to improve the efficiency and effectiveness of the decision process. In primary and secondary education there is a lack of BI solutions. Given the reality of schools, the main purpose of this study is to provide a Pervasive BI solution able to monitor school and student data anywhere, anytime and in real time, as well as to disseminate the information through ubiquitous devices. The first task consisted in gathering data on the different choices made by each student from enrolment in a given school year until its end. Thereafter, a dimensional model was developed to make it possible to build a BI platform. This paper presents the dimensional model, a set of pre-defined indicators, the Pervasive Business Intelligence characteristics and the prototype designed. The main contribution of this study is to offer schools a tool that can help them make accurate decisions in real time. Data dissemination was achieved through a localized application that can be accessed anywhere and anytime.
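A dimensional model of this kind is a star schema: a fact table of events keyed to dimension tables, from which pre-defined indicators are computed. The toy sketch below illustrates the shape with pandas; the table and column names are invented, and the paper's actual model is not reproduced here.

```python
import pandas as pd

# Toy star schema: one fact table of enrolment events keyed to school
# and time dimensions. All names and numbers are invented placeholders.

dim_school = pd.DataFrame({"school_id": [1, 2], "school": ["Escola A", "Escola B"]})
dim_time = pd.DataFrame({"time_id": [1, 2], "school_year": ["2013/14", "2014/15"]})
fact_enrol = pd.DataFrame({
    "school_id": [1, 1, 2, 2],
    "time_id":   [1, 2, 1, 2],
    "enrolled":  [420, 455, 310, 298],
})

# One pre-defined indicator: enrolment per school per school year
indicator = (fact_enrol
             .merge(dim_school, on="school_id")
             .merge(dim_time, on="time_id")
             .pivot_table(index="school", columns="school_year", values="enrolled"))
print(indicator)
```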
Abstract:
Children are an especially vulnerable population, particularly with respect to drug administration. It is estimated that neonatal and pediatric patients are at least three times more vulnerable than adults to harm from adverse events and medication errors. The development of this framework is intended to provide a Clinical Decision Support System based on a prototype already tested in a real environment. The framework will include features such as the preparation of Total Parenteral Nutrition prescriptions, tables of pediatric and neonatal emergency drugs, medical scales of morbidity and mortality, anthropometry percentiles (weight, length/height, head circumference and BMI), utilities supporting medical decisions on the treatment of neonatal jaundice and anemia, support for technical procedures, and other calculators and widely used tools. The solution under development is an extension of the INTCare project. The main goal is to make this functionality available at all times in clinical practice and outside the hospital environment, for dissemination, education and simulation of hypothetical situations. A further aim is to develop an area for the study and analysis of information and for extracting knowledge from the data collected through the use of the system. This paper presents the architecture, its requirements and functionalities, and a SWOT analysis of the proposed solution.
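The "calculator" utilities mentioned typically wrap simple, safety-capped formulas. The sketch below shows two generic examples of that pattern: a weight-based dose capped at an adult maximum, and a BMI value feeding a percentile lookup. The numbers are placeholders, not clinical guidance from the framework.

```python
# Generic pediatric calculator patterns; all values are placeholders.

def weight_based_dose(weight_kg, mg_per_kg, adult_max_mg):
    """Return the dose in mg, never exceeding the adult maximum."""
    return min(weight_kg * mg_per_kg, adult_max_mg)

def bmi(weight_kg, height_m):
    """Body mass index, the input to an anthropometry percentile lookup."""
    return weight_kg / height_m**2

print(weight_based_dose(18.0, 15.0, 1000.0))  # 270.0 mg for an 18 kg child
print(f"BMI: {bmi(18.0, 1.05):.1f}")           # value fed to a percentile table
```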
Abstract:
Doctoral thesis in Sciences - Specialization in Biology
Abstract:
The jet energy scale (JES) and its systematic uncertainty are determined for jets measured with the ATLAS detector using proton–proton collision data at a centre-of-mass energy of √s = 7 TeV, corresponding to an integrated luminosity of 4.7 fb−1. Jets are reconstructed from energy deposits forming topological clusters of calorimeter cells using the anti-kt algorithm with distance parameters R = 0.4 or R = 0.6, and are calibrated using MC simulations. A residual JES correction is applied to account for differences between data and MC simulations. This correction and its systematic uncertainty are estimated using a combination of in situ techniques exploiting the transverse momentum balance between a jet and a reference object such as a photon or a Z boson, for jets with 20 ≤ pT < 1000 GeV and pseudorapidities |η| < 4.5. The effect of multiple proton–proton interactions is corrected for, and an uncertainty is evaluated using in situ techniques. The smallest JES uncertainty, less than 1%, is found in the central calorimeter region (|η| < 1.2) for jets with 55 ≤ pT < 500 GeV. For central jets at lower pT, the uncertainty is about 3%. A consistent JES estimate is found using measurements of the calorimeter response of single hadrons in proton–proton collisions and test-beam data, which also provide the estimate for pT > 1 TeV. The calibration of forward jets is derived from dijet pT balance measurements. The resulting uncertainty reaches its largest value of 6% for low-pT jets at |η| = 4.5. Additional JES uncertainties due to specific event topologies, such as close-by jets or selections of event samples with an enhanced content of jets originating from light quarks or gluons, are also discussed. The magnitude of these uncertainties depends on the event sample used in a given physics analysis, but typically amounts to 0.5–3%.
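The core of the in situ pT-balance technique can be stated in a few lines: the jet response is measured against a well-calibrated reference object in both data and simulation, and their ratio gives the residual correction per pT bin. The sketch below uses invented toy values, not ATLAS data.

```python
import numpy as np

# Toy illustration of the in situ pT-balance residual correction:
# response = pT(jet) / pT(reference), correction = MC response / data
# response. All arrays below are invented values.

pt_ref = np.array([52.0, 110.0, 240.0])      # photon/Z reference pT [GeV]
pt_jet_data = np.array([49.9, 106.5, 234.0])  # balancing jet pT in data
pt_jet_mc = np.array([50.9, 108.3, 237.1])    # balancing jet pT in simulation

resp_data = pt_jet_data / pt_ref              # jet response in data
resp_mc = pt_jet_mc / pt_ref                  # jet response in simulation
residual_correction = resp_mc / resp_data     # applied to jets in data
print(residual_correction)                    # ~1.01-1.02 per pT bin
```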
Abstract:
A measurement is presented of the tt̄ inclusive production cross section in pp collisions at a center-of-mass energy of √s = 8 TeV using data collected by the ATLAS detector at the CERN Large Hadron Collider. The measurement was performed in the lepton+jets final state using a data set corresponding to an integrated luminosity of 20.3 fb−1. The cross section was obtained using a likelihood discriminant fit, and b-jet identification was used to improve the signal-to-background ratio. The inclusive tt̄ production cross section was measured to be 260 ± 1 (stat) +22/−23 (syst) ± 8 (lumi) ± 4 (beam) pb assuming a top-quark mass of 172.5 GeV, in good agreement with the theoretical prediction of 253 +13/−15 pb. The tt̄ → (e,μ)+jets production cross section in the fiducial region determined by the detector acceptance is also reported.
Abstract:
A measurement of spin correlation in tt̄ production is presented using data collected with the ATLAS detector at the Large Hadron Collider in proton-proton collisions at a center-of-mass energy of 8 TeV, corresponding to an integrated luminosity of 20.3 fb−1. The correlation between the top and antitop quark spins is extracted from dilepton tt̄ events using the difference in azimuthal angle between the two charged leptons in the laboratory frame. In the helicity basis, the measured degree of correlation corresponds to A_helicity = 0.38 ± 0.04, in agreement with the Standard Model prediction. A search is performed for pair production of top squarks with masses close to the top-quark mass, decaying to predominantly right-handed top quarks and a light neutralino, the lightest supersymmetric particle. Top squarks with masses between the top-quark mass and 191 GeV are excluded at the 95% confidence level.
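The observable used here is the azimuthal angle difference between the two charged leptons, conventionally folded into [0, π]. A minimal sketch of that computation, on toy values rather than ATLAS data:

```python
import numpy as np

# |Δφ| between the two charged leptons, folded into [0, π].
def delta_phi(phi1, phi2):
    dphi = np.abs(phi1 - phi2) % (2 * np.pi)
    return np.where(dphi > np.pi, 2 * np.pi - dphi, dphi)

print(delta_phi(0.3, -2.9))  # ~3.08 rad for these toy azimuthal angles
```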
Abstract:
The transverse polarization of Λ and Λ̄ hyperons produced in proton–proton collisions at a center-of-mass energy of 7 TeV is measured. The analysis uses 760 μb−1 of minimum-bias data collected by the ATLAS detector at the LHC in 2010. The measured transverse polarization, averaged over Feynman xF from 5×10−5 to 0.01 and transverse momentum pT from 0.8 to 15 GeV, is −0.010 ± 0.005 (stat) ± 0.004 (syst) for Λ and 0.002 ± 0.006 (stat) ± 0.004 (syst) for Λ̄. It is also measured as a function of xF and pT, but no significant dependence on these variables is observed. Prior to this measurement, the polarization had been measured at fixed-target experiments with center-of-mass energies up to about 40 GeV. The ATLAS results are compatible with the extrapolation of a fit from previous measurements to the xF range covered by this measurement.