914 results for Application of Data-driven Modelling in Water Sciences
Abstract:
The book aims to introduce the reader to DEA in the most accessible manner possible. It is specifically aimed at those who have had no prior exposure to DEA and wish to learn its essentials, how it works, its key uses, and the mechanics of using it. The latter includes using DEA software. Students on degree or training courses will find the book especially helpful. The same is true of practitioners engaging in comparative efficiency assessments and performance management within their organisation. Examples are used throughout the book to help the reader consolidate the concepts covered. Table of contents: List of Tables. List of Figures. Preface. Abbreviations. 1. Introduction to Performance Measurement. 2. Definitions of Efficiency and Related Measures. 3. Data Envelopment Analysis under Constant Returns to Scale: Basic Principles. 4. Data Envelopment Analysis under Constant Returns to Scale: General Models. 5. Using Data Envelopment Analysis in Practice. 6. Data Envelopment Analysis under Variable Returns to Scale. 7. Assessing Policy Effectiveness and Productivity Change Using DEA. 8. Incorporating Value Judgements in DEA Assessments. 9. Extensions to Basic DEA Models. 10. A Limited User Guide for Warwick DEA Software. Author Index. Topic Index. References.
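To make the basic mechanics concrete, the following is a minimal sketch of an input-oriented, constant-returns-to-scale (CCR) efficiency score computed as a linear program with scipy.optimize.linprog; the toy inputs, outputs and decision-making units are illustrative assumptions, not material from the book or the Warwick software.

```python
# Minimal sketch: input-oriented CCR DEA efficiency via linear programming.
# Toy data and names (inputs, outputs) are illustrative assumptions only.
import numpy as np
from scipy.optimize import linprog

# rows = decision-making units (e.g. branches), columns = inputs / outputs
inputs  = np.array([[20.0, 300.0],   # DMU 0: staff, floor space
                    [30.0, 200.0],
                    [40.0, 100.0]])
outputs = np.array([[100.0],         # DMU 0: transactions
                    [120.0],
                    [ 80.0]])

def ccr_efficiency(o):
    """Efficiency of DMU `o`: minimise theta such that a composite unit built
    from the lambdas uses at most theta * inputs_o and produces at least outputs_o."""
    n, m = inputs.shape          # number of DMUs, number of inputs
    s = outputs.shape[1]         # number of outputs
    # decision vector z = [theta, lambda_1 .. lambda_n]
    c = np.r_[1.0, np.zeros(n)]
    # input rows:   sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in  = np.c_[-inputs[o].reshape(m, 1), inputs.T]
    b_in  = np.zeros(m)
    # output rows: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.c_[np.zeros((s, 1)), -outputs.T]
    b_out = -outputs[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (1 + n))
    return res.x[0]

for o in range(3):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```

A score of 1.0 indicates a unit on the efficient frontier; lower values give the proportional input reduction needed to reach it.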
Abstract:
As levels of investment in advanced manufacturing systems increase, effective project management becomes ever more critical. This paper demonstrates how the model proposed by Mintzberg, Raisinghani and Theoret in 1976, which structures complicated strategic decision processes, can be applied to the design of new production systems for both descriptive and analytical research purposes. The paper places a detailed case study of the design and development of an advanced manufacturing system within the Mintzberg decision model, thereby breaking the decision sequence down into its constituent parts. It thus shows how a structured model can provide a framework for the researcher who wishes to study decision episodes in the design of manufacturing facilities in greater depth.
Abstract:
This thesis describes work exploring the application of expert system techniques to the domain of designing durable concrete. The nature of concrete durability design is described and some problems from the domain are discussed. Related work on expert systems for concrete durability is described. Two implementation languages, PROLOG and OPS5, are considered and rejected in favour of a shell, CRYSTAL3 (later CRYSTAL4). Criteria for useful expert system shells in the domain are discussed, and CRYSTAL4 is evaluated in the light of these criteria. Modules in various sub-domains (mix design, sulphate attack, steel corrosion and alkali-aggregate reaction) are developed and organised under a blackboard system (called DEX). Extensions to the CRYSTAL4 modules are considered for different knowledge representations. These include LOTUS123 spreadsheets implementing models that incorporate some of the mathematical knowledge in the domain. Design databases are used to represent tabular design knowledge. Hypertext representations of the original building standards texts are proposed as a tool for providing a well-structured and extensive justification/help facility. A standardised approach to module development is proposed, using hypertext development as a structured basis for expert system development. Some areas of deficient domain knowledge are highlighted, particularly in the use of data from mathematical models and in gaps and inconsistencies in the Digests that served as the original knowledge source.
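For illustration, a minimal sketch of the blackboard pattern referred to above: the knowledge-source names follow the sub-domains listed in the abstract, but the data model, control loop, thresholds and recommendations are hypothetical placeholders, not the DEX rules.

```python
# Minimal blackboard sketch: independent knowledge sources read and post facts
# on a shared blackboard until no module can contribute anything new.
# All domain values below are placeholders, not the thesis's design rules.
from typing import Callable, Dict, List

Blackboard = Dict[str, object]

def mix_design(bb: Blackboard) -> bool:
    """Post a maximum water/cement ratio once the exposure class is known."""
    if "exposure_class" in bb and "max_w_c_ratio" not in bb:
        bb["max_w_c_ratio"] = 0.55 if bb["exposure_class"] == "moderate" else 0.45
        return True
    return False

def sulphate_attack(bb: Blackboard) -> bool:
    """Recommend a cement type when a sulphate class is on the blackboard."""
    if "sulphate_class" in bb and "cement_type" not in bb:
        bb["cement_type"] = "SRPC" if bb["sulphate_class"] >= 3 else "CEM I"
        return True
    return False

def run(blackboard: Blackboard,
        sources: List[Callable[[Blackboard], bool]]) -> Blackboard:
    """Fire knowledge sources repeatedly until none can add anything new."""
    progress = True
    while progress:
        progress = any(ks(blackboard) for ks in sources)
    return blackboard

print(run({"exposure_class": "moderate", "sulphate_class": 3},
          [mix_design, sulphate_attack]))
```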
Abstract:
ACM Computing Classification System (1998): D.2.5, D.2.9, D.2.11.
Abstract:
06/UPR10/10
Abstract:
The Standard Model (SM) of particle physics predicts the existence of a Higgs field responsible for the generation of particle masses. However, some aspects of this theory remain unsolved, suggesting the presence of new physics Beyond the Standard Model (BSM) with the production of new particles at an energy scale above the current experimental limits. The search for additional Higgs bosons is, in fact, predicted by theoretical extensions of the SM, including the Minimal Supersymmetric Standard Model (MSSM). In the MSSM, the Higgs sector consists of two Higgs doublets, resulting in five physical Higgs particles: two charged bosons $H^{\pm}$, two neutral scalars $h$ and $H$, and one pseudoscalar $A$. The work presented in this thesis is dedicated to the search for neutral non-Standard Model Higgs bosons decaying to two muons in the model-independent MSSM scenario. Proton-proton collision data recorded by the CMS experiment at the CERN LHC at a center-of-mass energy of 13 TeV are used, corresponding to an integrated luminosity of $35.9\ \text{fb}^{-1}$. This search is sensitive to neutral Higgs bosons produced either via the gluon-fusion process or in association with a $\text{b}\bar{\text{b}}$ quark pair. The extensive use of machine and deep learning techniques is a fundamental element in the discrimination between simulated signal and background events. A new network structure called a parameterised Neural Network (pNN) has been implemented, replacing a whole set of single neural networks, each trained at a specific mass hypothesis, with one network able to generalise well and interpolate over the entire mass range considered. The results of the pNN signal/background discrimination are used to set a model-independent 95\% confidence level expected upper limit on the production cross section times branching ratio for a generic $\phi$ boson decaying into a muon pair in the 130 to 1000 GeV range.
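To illustrate the parameterised-network idea, the following is a minimal sketch in which the mass hypothesis is appended to the event features, so a single classifier covers, and can be interpolated over, the whole mass grid; the toy features, mass grid and scikit-learn model are assumptions, not the analysis's actual inputs or architecture.

```python
# Minimal sketch of a parameterised classifier: the mass hypothesis is fed to
# the network as an extra input feature, so one model serves all mass points.
# Toy features and mass grid are illustrative, not the analysis's inputs.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
masses = np.array([130.0, 300.0, 600.0, 1000.0])   # GeV, hypothetical grid

def make_events(n, mass=None):
    """One kinematic feature per event plus the mass-hypothesis column."""
    if mass is None:                        # background: broad falling spectrum
        x = rng.exponential(200.0, n) + 100.0
        m_hyp = rng.choice(masses, n)       # background paired with random hypotheses
        return np.c_[x, m_hyp], np.zeros(n)
    x = rng.normal(mass, 0.05 * mass, n)    # signal peaks at the generated mass
    return np.c_[x, np.full(n, mass)], np.ones(n)

X_parts, y_parts = zip(*([make_events(2000)] +
                         [make_events(500, m) for m in masses]))
X, y = np.vstack(X_parts), np.concatenate(y_parts)

pnn = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
pnn.fit(X, y)

# at evaluation time the same network is probed at an unseen mass hypothesis
probe = np.array([[450.0, 450.0],   # feature near the probed mass -> signal-like
                  [200.0, 450.0]])  # feature far from it          -> background-like
print(pnn.predict_proba(probe)[:, 1])
```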
Abstract:
Legionella is a Gram-negative bacterium that represents a public health issue, with heavy social and economic impact. Therefore, a proper environmental surveillance and risk assessment plan is needed to control Legionella in water distribution systems in hospital and community buildings. The thesis combines several methodologies in a single workflow applied to the identification of non-pneumophila Legionella species (n-pL), starting from standard methods such as culture and gene sequencing (mip and rpoB) and moving to innovative approaches such as the MALDI-TOF MS technique and whole-genome sequencing (WGS). The results obtained were compared to identify the Legionella isolates and led to the identification of four presumptive novel Legionella species. One of these four new isolates was characterized and recognized taxonomically under the name Legionella bononiensis (the 64th Legionella species). The workflow applied in this thesis helps to increase knowledge of environmental Legionella species, improving the description of the environment itself and of the events that promote the growth of Legionella in its ecological niche. Correct identification and characterization of the isolates help prevent their spread in man-made environments and contain the occurrence of cases, clusters, or outbreaks. Therefore, the experimental work undertaken could support preventive measures during environmental and clinical surveillance, improving the study of species that are often underestimated or still unknown.
Abstract:
The discovery of new materials and their functions has always been a fundamental component of technological progress. Nowadays, the quest for new materials is stronger than ever: sustainability, medicine, robotics and electronics are all key areas that depend on the ability to create specifically tailored materials. However, designing materials with desired properties is a difficult task, and the complexity of the discipline makes it difficult to identify general criteria. While scientists have developed a set of best practices (often based on experience and expertise), this is still a trial-and-error process. It becomes even more complex when dealing with advanced functional materials. Their properties depend on structural and morphological features, which in turn depend on fabrication procedures and environment, and subtle alterations lead to dramatically different results. Because of this, materials modeling and design is one of the most prolific research fields. Many techniques and instruments are continuously developed to enable new possibilities, both in the experimental and computational realms. Scientists strive to adopt cutting-edge technologies in order to make progress. However, the field is strongly affected by unorganized file management and a proliferation of custom data formats and storage procedures, in both experimental and computational research. Results are difficult to find, interpret and re-use, and a huge amount of time is spent interpreting and re-organizing data. This also strongly limits the application of data-driven and machine learning techniques. This work introduces possible solutions to the problems described above. Specifically, it covers: developing features for specific classes of advanced materials and using them to train machine learning models and accelerate computational predictions for molecular compounds; developing methods for organizing non-homogeneous materials data; automating the use of device simulations to train machine learning models; and dealing with scattered experimental data and using them to discover new patterns.
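As an illustration of the feature-then-model loop mentioned above, the following is a minimal sketch in which hypothetical molecular descriptors are used to train a surrogate regressor that stands in for an expensive computational prediction; the descriptor function, dataset and target property are placeholders, not the actual features or models developed in this work.

```python
# Minimal sketch of the feature -> model -> fast prediction loop.
# The descriptor function, dataset and target property are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def featurize(molecule):
    """Toy descriptors (counts and a size proxy) standing in for real features
    such as fingerprints or structural/morphological descriptors."""
    return [molecule["n_atoms"], molecule["n_rings"], molecule["mass"]]

# pretend these records came from expensive simulations we want to accelerate
molecules = [{"n_atoms": int(a), "n_rings": int(r), "mass": float(m)}
             for a, r, m in zip(rng.integers(5, 60, 200),
                                rng.integers(0, 4, 200),
                                rng.uniform(50, 600, 200))]
X = np.array([featurize(m) for m in molecules])
y = 0.02 * X[:, 2] - 0.5 * X[:, 1] + rng.normal(0, 0.5, len(X))  # surrogate target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out R^2:", round(model.score(X_te, y_te), 3))
```

The same pattern scales to richer descriptors and larger datasets; the point is that, once trained, the surrogate replaces repeated expensive computations for screening candidates.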
Abstract:
This review aimed to verify the methods used by clinical trials that assessed the effect of tactile/kinesthetic stimulation on weight gain in preterm infants and to highlight the similarities and differences among such studies. Studies were collected from two databases, PEDro and PubMed, in July 2014, in addition to bibliographies. Two researchers assessed the relevant titles independently and then chose, by consensus, which studies to read in full and include in this review. Clinical trials were included if they studied tactile stimulation or massage therapy, whether or not associated with kinesthetic stimulation of preterm infants; assessed weight gain after the intervention; had a control group; and were written in English, Portuguese, or Spanish. A total of 520 titles were found and 108 were selected for full-text reading. Repeated studies were excluded, resulting in 40 different studies, of which 31 met all the inclusion criteria. There were many differences in the application of tactile/kinesthetic stimulation techniques among studies, which hindered accurate reproduction of the procedure. In addition, many studies did not describe the adverse events that occurred during stimulation, the course of action taken when such events occurred, or their effect on the outcome. These studies made a relevant contribution towards indicating tactile/kinesthetic stimulation as a promising tool; nevertheless, there was no standard of application among them. Future studies should raise the level of methodological rigor and describe adverse events. This may make other researchers more aware of expected outcomes, and a standard technique could be established.
Abstract:
This paper deals with the application of the lumped dissipation model to the analysis of reinforced concrete structures, emphasizing the nonlinear behaviour of the materials. The presented model is based on the original models developed by Cipollina and Florez-Lopez (1995) [12], Florez-Lopez (1995) [13] and Picon and Florez-Lopez (2000) [14]. However, some modifications were introduced in the functions that control the damage evolution in order to improve the results obtained. The efficiency of the new approach is evaluated by means of a comparison with experimental results on reinforced concrete structures such as simply supported beams, plane frames and beam-to-column connections. Finally, the adequacy of the numerical model in representing the global behaviour of framed structures is investigated and the limits of the analysis are discussed. (C) 2009 Elsevier Ltd. All rights reserved.
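For orientation, a heavily hedged sketch of the kind of relations used in lumped damage models of the Florez-Lopez type, written with assumed notation ($m$: hinge moment, $d$: hinge damage variable, $f_0$: elastic hinge flexibility, $R$: a cracking resistance function); the modified damage-evolution functions introduced in the paper are not reproduced here.

```latex
% Sketch only: generic lumped-damage relations (assumed notation, not the paper's).
\begin{align}
  \phi^{d} &= \frac{d}{1-d}\, f_{0}\, m
    && \text{extra hinge rotation due to damage} \\
  G &= \frac{\partial W^{*}}{\partial d}
     = \frac{f_{0}\, m^{2}}{2\,(1-d)^{2}}
    && \text{energy release rate of the damaged hinge} \\
  \dot{d} &\ge 0, \quad G - R(d) \le 0, \quad \dot{d}\,\bigl(G - R(d)\bigr) = 0
    && \text{damage grows only while the criterion is active}
\end{align}
```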
Abstract:
The use of cytostatic drugs in anticancer therapy is increasing. Health care workers can be occupationally exposed to these drugs, which are classified as carcinogenic, mutagenic or teratogenic. Cytostatic drugs are a heterogeneous group of chemicals widely used in the treatment of cancer; nevertheless, they have also been proved to be mutagens, carcinogens and teratogens. Workers may be exposed to these drugs, with the main focus in hospital settings being on pharmacy and nursing personnel. The alkaline comet assay is one of the most promising short-term genotoxicity assays for human risk assessment and is recommended for monitoring populations chronically exposed to genotoxic agents. 8-oxoguanine DNA glycosylase (OGG1) represents the main mechanism protecting the integrity of human DNA against 8-OHdG, the most well-studied biomarker of oxidative damage.