982 results for ensemble methods
Abstract:
Superheater corrosion causes vast annual losses for power companies. With a reliable corrosion prediction method, plants can be designed accordingly, and knowledge of fuel selection and determination of process conditions can be used to minimize superheater corrosion. Growing interest in using recycled fuels creates additional demands for the prediction of corrosion potential. Models that depend on corrosion theories will fail if the relations between the inputs and the output are poorly known. A prediction model based on fuzzy logic and an artificial neural network is able to improve its performance as the amount of data increases. The corrosion rate of a superheater material can most reliably be determined with a test done in a test combustor or in a commercial boiler. The steel samples can be placed in a special, temperature-controlled probe and exposed to the corrosive environment for a desired time. These tests give information about the average corrosion potential in that environment. Samples may also be cut from superheaters during shutdowns. The analysis of samples taken from probes or superheaters after exposure to a corrosive environment is a demanding task: if the corrosive contaminants can be reliably analyzed, the corrosion chemistry can be determined and an estimate of the material lifetime can be given. In cases where the reason for corrosion is not clear, determining the corrosion chemistry and estimating the lifetime is more demanding. In order to provide a laboratory tool for the analysis and prediction, a new approach was chosen. During this study, the following tools were generated:
· A model for the prediction of superheater fireside corrosion, based on fuzzy logic and an artificial neural network, built upon a corrosion database of fuel and bed material analyses and measured corrosion data. The developed model predicts superheater corrosion with high accuracy at the early stages of a project.
· An adaptive corrosion analysis tool based on image analysis, constructed as an expert system. This system supports the implementation of user-defined algorithms, which allows the development of an artificially intelligent system for the task. Based on the results of the analyses, several new rules were developed for the determination of the degree and type of corrosion.
By combining these two tools, a user-friendly expert system for the prediction and analysis of superheater fireside corrosion was developed. This tool may also be used to minimize corrosion risks in the design of fluidized bed boilers.
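The abstract does not spell out the model's inputs or architecture. Purely as an illustration of the kind of data-driven prediction it describes, the sketch below trains a small neural-network regressor on hypothetical fuel and bed material features (chlorine content, alkali content, material temperature) to predict a corrosion rate. The feature names and all numbers are invented, the fuzzy-logic component is not shown, and scikit-learn stands in for whatever toolchain the thesis actually used.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hedged sketch of a data-driven corrosion-rate predictor in the spirit of the
# model described above. Features, values and network size are hypothetical;
# this is not the thesis's actual fuzzy-logic/ANN model.
rng = np.random.default_rng(0)

# Hypothetical corrosion database: [fuel Cl %, fuel alkali %, material temp degC]
X = rng.uniform([0.0, 0.0, 450.0], [1.0, 0.5, 600.0], size=(200, 3))
# Invented target: corrosion rate grows with Cl, alkali and temperature (plus noise)
y = 5 * X[:, 0] + 8 * X[:, 1] + 0.02 * (X[:, 2] - 450) + rng.normal(0, 0.2, 200)

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
model.fit(X, y)

# Predict the corrosion rate for a new fuel/material combination (made-up values)
print(model.predict([[0.4, 0.2, 550.0]]))
```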
Establishing intercompany relationships: Motives and methods for successful collaborative engagement
Abstract:
This study explores the early phases of intercompany relationship building, a topic of great importance for purchasing and business development practitioners as well as for companies' upper management. There is a lot of evidence that proper engagement with markets increases a company's potential for achieving business success. Taking full advantage of market possibilities requires, however, a holistic view of managing the related decision-making chain. Most of the literature, as well as companies' business processes, lacks this holism. Typically they observe the process from the perspective of individual stages and thus lead to discontinuity and sub-optimization. This study contains a comprehensive introduction to and evaluation of the literature related to the various steps of the decision-making process. The process is studied from a holistic perspective: determining a company's vertical integration position within its demand/supply network context; translating the vertical integration objectives into feasible strategies and objectives; and operationalizing the decisions made through engagement in collaborative intercompany relationships. The empirical part of the research was conducted in two sections. First, the phenomenon of intercompany engagement is studied using two complementary case studies. Second, a survey was conducted among the purchasing and business development managers of several electronics manufacturing companies to analyze the processes, decision-making criteria, and success factors of engagement for collaboration. The aim has been to identify the reasons why companies and their management act the way they do. As a combination of theoretical and empirical research, an analysis has been produced of what would be an ideal way of engaging with markets. Based on the respective findings, the study concludes by proposing a holistic framework for successful engagement. The evidence presented throughout the study demonstrates clear gaps, discontinuities, and limitations in both current research and practical purchasing decision-making chains. The most significant discontinuity is the identified disconnection between the supplier selection process and related criteria and the relationship success factors.
Abstract:
This thesis considers nondestructive optical methods for metal surface and subsurface inspection. The main purpose of the thesis was to study some optical methods in order to determine their applicability to industrial measurements. In laboratory testing the simplest light-scattering approach, measurement of specular reflectance, was used for surface roughness evaluation. Surface roughness, curvature, and the finishing process of metal sheets were determined by specular reflectance measurements. Using a fixed angle of incidence, the specular reflectance method could be automated for industrial inspection. For defect detection, holographic interferometry and thermography were compared. Using either holographic interferometry or thermography, relatively small-size defects in metal plates could be revealed. Holographic techniques have some limitations for industrial measurements. In contrast, thermography has excellent prospects for on-line inspection, especially with scanning techniques.
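The link between specular reflectance and roughness that such measurements rely on is not spelled out in the abstract. As background, the widely used Beckmann (total integrated scatter) approximation relates the two for optically smooth surfaces; the thesis may use a different calibration, so this is only context, with σ the RMS roughness, θ the angle of incidence, λ the wavelength, and R₀ the reflectance of a perfectly smooth surface.

```latex
% Specular reflectance of a slightly rough surface (Beckmann / total-integrated-scatter
% approximation); sigma = RMS roughness, theta = angle of incidence, lambda = wavelength.
\[
  \frac{R_s}{R_0} \;=\; \exp\!\left[-\left(\frac{4\pi\,\sigma\,\cos\theta}{\lambda}\right)^{2}\right]
\]
```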
Abstract:
Canopy characterization is a key factor in improving pesticide application methods in tree crops and vineyards. Development of quick, easy, and efficient methods to determine the fundamental parameters used to characterize canopy structure is thus an important need. In this research the use of ultrasonic and LIDAR sensors has been compared with the traditional manual and destructive canopy measurement procedure. For both methods the values of key parameters such as crop height, crop width, crop volume, and leaf area have been compared. The obtained results indicate that an ultrasonic sensor is an appropriate tool to determine the average canopy characteristics, while a LIDAR sensor provides more accurate and detailed information about the canopy. Good correlations have been obtained between crop volume (CVU) values measured with ultrasonic sensors and the leaf area index, LAI (R2 = 0.51). A good correlation has also been obtained between the canopy volume measured with ultrasonic and LIDAR sensors (R2 = 0.52). Laser measurements of crop height (CHL) allow one to accurately predict the canopy volume. The proposed new technologies seem very appropriate as complementary tools to improve the efficiency of pesticide applications, although further improvements are still needed.
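The abstract does not describe how the ultrasonic crop-volume (CVU) figure is computed. A common, simple approach is to treat each width reading as a rectangular canopy cross-section extruded over the sampling distance; the sketch below illustrates that idea with invented numbers and is not necessarily the exact procedure used in the paper.

```python
import numpy as np

def canopy_volume_ultrasonic(widths_m, section_length_m, crop_height_m):
    """Estimate canopy volume (m^3) along a row from ultrasonic crop-width readings.

    Each reading is treated as a rectangular cross-section (width x height)
    extruded over the distance between consecutive readings.
    """
    widths = np.asarray(widths_m, dtype=float)
    return float(np.sum(widths * crop_height_m * section_length_m))

# Hypothetical example: widths sampled every 0.5 m along a 5 m row segment
widths = [1.2, 1.4, 1.3, 1.5, 1.1, 1.2, 1.3, 1.4, 1.2, 1.3]
print(canopy_volume_ultrasonic(widths, section_length_m=0.5, crop_height_m=2.0))
```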
Abstract:
The present dissertation is entitled "Development and Application of Computational Methodologies in Qualitative Modeling". It encompasses the diverse projects that were undertaken during my time as a PhD student. Instead of a systematic implementation of a framework defined a priori, this thesis should be considered as an exploration of the methods that can help us infer the blueprint of regulatory and signaling processes. This exploration was driven by concrete biological questions rather than theoretical investigation. Even though the projects involved divergent systems (gene regulatory networks of the cell cycle, signaling networks in lung cells) as well as organisms (fission yeast, budding yeast, rat, human), our goals were complementary and coherent. The main project of the thesis is the modeling of the Septation Initiation Network (SIN) in S. pombe. Cytokinesis in fission yeast is controlled by the SIN, a protein kinase signaling network that uses the spindle pole body as a scaffold. In order to describe the qualitative behavior of the system and predict unknown mutant behaviors, we decided to adopt a Boolean modeling approach. In this thesis, we report the construction of an extended Boolean model of the SIN, comprising most SIN components and regulators as individual, experimentally testable nodes. The model uses CDK activity levels as control nodes for the simulation of SIN-related events in different stages of the cell cycle. The model was optimized using single knock-out experiments of known phenotypic effect as a training set, and was able to correctly predict a double knock-out test set. Moreover, the model has made in silico predictions that have been validated in vivo, providing new insights into the regulation and hierarchical organization of the SIN. Another cell cycle related project that is part of this thesis was to create a qualitative, minimal model of cyclin interplay in S. cerevisiae. Clb proteins in budding yeast present a characteristic, sequential activation and decay during the cell cycle, commonly referred to as Clb waves. This event is coordinated with the inverse activation curve of Sic1, which has an inhibitory role in the system. To generate minimal qualitative models that can explain this phenomenon, we selected well-defined experiments and constructed all possible minimal models that, when simulated, reproduce the expected results. The models were filtered using standardized qualitative ODE simulations; only the ones reproducing the wave-like phenotype were kept. The set of minimal models can be used to suggest regulatory relations among the participating molecules, which can subsequently be tested experimentally. Finally, during my PhD I participated in the SBV Improver Challenge. The goal was to infer species-specific (human and rat) networks using phosphoprotein, gene expression, and cytokine data, together with a reference network provided as prior knowledge. Our solution to the challenge took third place; the approach used is explained in detail in the final chapter of the thesis.
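To make the Boolean modeling approach mentioned above concrete, the sketch below simulates a tiny synchronous Boolean network. The three-node wiring and the update rules are hypothetical and are not the actual SIN topology from the thesis; only the technique (synchronous updates, a fixed control node, iteration to a steady state) matches the description.

```python
# Minimal sketch of synchronous Boolean network simulation, in the spirit of the
# SIN model described above. Rules and wiring below are hypothetical.

def step(state):
    """Apply all update rules simultaneously (synchronous update)."""
    cdk, cdc7 = state["CDK"], state["Cdc7"]
    return {
        "CDK": cdk,                 # control node: held fixed during a run
        "Cdc7": not cdk,            # hypothetical rule: active when CDK is low
        "Sid2": cdc7 and not cdk,   # hypothetical rule: needs Cdc7, inhibited by CDK
    }

def simulate(initial, n_steps=10):
    """Iterate the network until a fixed point is reached or n_steps elapse."""
    state = dict(initial)
    trajectory = [state]
    for _ in range(n_steps):
        nxt = step(state)
        trajectory.append(nxt)
        if nxt == state:            # steady state reached
            break
        state = nxt
    return trajectory

# Example: simulate a "low CDK" condition from an all-off starting state
for s in simulate({"CDK": False, "Cdc7": False, "Sid2": False}):
    print(s)
```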
Abstract:
Agile software development methods attempt to answer the software development industry's need for lighter-weight, more agile processes that make it possible to react to changes during the software development process. The objective of this thesis is to analyze and experiment with the possibility of using agile methods or practices also in small software projects, even in projects with only one developer. In the practical part of the thesis, a small software project was executed with some of the agile methods and practices that the theoretical part of the thesis found applicable to the project. In the project, a Bluetooth proxy application that runs on the S60 smartphone platform and a PC was developed further to contain some new features. As a result, it was found that certain agile practices can be useful even in very small projects. The selection of suitable practices depends on the project and the size of the project team.
Abstract:
Headspace gas chromatography with flame-ionization detection (HS-GC-FID) and purge-and-trap gas chromatography-mass spectrometry (P&T-GC-MS) have been used to determine methyl tert-butyl ether (MTBE) and benzene, toluene, and the xylenes (BTEX) in groundwater. In the work discussed in this paper, measures of quality, e.g. recovery (94-111%), precision (4.6-12.2%), limits of detection (0.3-5.7 µg L⁻¹ for HS and 0.001 µg L⁻¹ for P&T), and robustness, were compared for both methods. In addition, for purposes of comparison, groundwater samples from areas suffering from odor problems because of fuel spillage and tank leakage were analyzed by use of both techniques. For high concentration levels there was good correlation between the results from the two methods.
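The abstract quotes recovery, precision, and detection-limit figures without giving the formulas behind them. The sketch below uses the common textbook definitions (recovery against a spiked level, precision as relative standard deviation, LOD as 3 × SD of blanks over the calibration slope); the paper's exact procedures may differ, and all numbers are invented.

```python
import numpy as np

def recovery_percent(measured, spiked):
    """Recovery = mean measured concentration / spiked (known) concentration x 100."""
    return 100.0 * np.mean(measured) / spiked

def precision_rsd_percent(replicates):
    """Precision expressed as relative standard deviation (RSD, %)."""
    r = np.asarray(replicates, dtype=float)
    return 100.0 * r.std(ddof=1) / r.mean()

def lod_from_blanks(blank_signals, slope):
    """Limit of detection as 3 x SD of blank signals divided by calibration slope."""
    return 3.0 * np.std(blank_signals, ddof=1) / slope

replicates = [4.8, 5.1, 4.9, 5.3, 5.0]           # µg/L, hypothetical MTBE replicates
print(recovery_percent(replicates, spiked=5.0))   # ~100 %
print(precision_rsd_percent(replicates))          # a few %
print(lod_from_blanks([0.02, 0.03, 0.025, 0.028], slope=0.9))
```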
Abstract:
OBJECTIVE: Routinely collected health data, collected for administrative and clinical purposes without specific a priori research questions, are increasingly used for observational, comparative effectiveness, and health services research, and for clinical trials. The rapid evolution and availability of routinely collected data for research has brought to light specific issues not addressed by existing reporting guidelines. The aim of the present project was to determine the priorities of stakeholders in order to guide the development of the REporting of studies Conducted using Observational Routinely-collected health Data (RECORD) statement. METHODS: Two modified electronic Delphi surveys were sent to stakeholders. The first determined themes deemed important to include in the RECORD statement and was analyzed using qualitative methods. The second determined quantitative prioritization of the themes based on categorization of manuscript headings. The surveys were followed by a meeting of the RECORD working committee and re-engagement with stakeholders via an online commentary period. RESULTS: The qualitative survey (76 responses from 123 surveys sent) generated 10 overarching themes and 13 themes derived from existing STROBE categories. The highest-rated overall items for inclusion were: disease/exposure identification algorithms; characteristics of the population included in databases; and characteristics of the data. In the quantitative survey (71 responses from 135 sent), the importance assigned to each of the compiled themes varied depending on the manuscript section to which it was assigned. Following the working committee meeting, online ranking by stakeholders provided feedback and resulted in revision of the final checklist. CONCLUSIONS: The RECORD statement incorporated the suggestions provided by a large, diverse group of stakeholders to create a reporting checklist specific to observational research using routinely collected health data. Our findings point to unique aspects of studies conducted with routinely collected health data and the perceived need for better reporting of methodological issues.
Abstract:
This thesis gives an overview of the use of level set methods in the field of image science. The similar fast marching method is discussed for comparison, and the narrow band and particle level set methods are also introduced. The level set method is a numerical scheme for representing, deforming, and recovering structures in arbitrary dimensions. It approximates and tracks moving interfaces, dynamic curves, and surfaces. The level set method does not define how and why some boundary is advancing the way it is but simply represents and tracks the boundary. The principal idea of the level set method is to represent an N-dimensional boundary in N+1 dimensions. This gives the generality to represent even complex boundaries. Level set methods can be powerful tools for representing dynamic boundaries, but they can require a lot of computing power. In particular, the basic level set method carries a considerable computational burden. This burden can be alleviated with more sophisticated versions of the level set algorithm, such as the narrow band level set method, or with a programmable hardware implementation. A parallel approach can also be used in suitable applications. It is concluded that these methods can be used in quite a broad range of image applications, such as computer vision and graphics, scientific visualization, and also to solve problems in computational physics. Level set methods, and methods derived from and inspired by them, will remain at the front line of image processing in the future.
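The "N-dimensional boundary in N+1 dimensions" idea summarized above is conventionally written in the standard Osher-Sethian form; the abstract does not state the exact notation used in the thesis, but the formulation is as follows: the interface is the zero level set of a scalar embedding function φ, and it evolves with normal speed F.

```latex
% Interface as the zero level set of an embedding function \phi
\[
  \Gamma(t) = \{\, \mathbf{x} \in \mathbb{R}^{N} : \phi(\mathbf{x}, t) = 0 \,\}
\]
% Level set evolution equation for an interface moving with normal speed F
\[
  \frac{\partial \phi}{\partial t} + F \,\lvert \nabla \phi \rvert = 0,
  \qquad \phi(\mathbf{x}, 0) \ \text{given (e.g.\ a signed distance function)}
\]
```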
Abstract:
This study was part of a project investigating electronic business and wireless applications, and its objective was to clarify the role of forecasting in the decision-making and planning process and to identify the most suitable and most commonly used technology forecasting methods. The forecasting methods were examined especially from the perspective of new technology and long-term forecasting. The study was based on an analysis of the literature on technological forecasting, long-term planning, and innovation processes. Based on this material, technology forecasting is described as a means of acquiring information in support of organizations' planning processes. The work also evaluates the following technological forecasting methods: trend analysis, the Delphi method, cross-impact analysis, morphological analysis, and scenario analysis. The work brings out the characteristic features, limitations, and application possibilities of each forecasting method. Using the presented methods, useful information about future prospects can be gathered and then exploited in organizations' planning processes.
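As a concrete illustration of the simplest method listed above, trend analysis, the sketch below fits a log-linear (exponential) trend to a short time series and extrapolates it. The data and parameters are invented for illustration; the thesis's own examples are not reproduced here.

```python
import numpy as np

# Hedged illustration of trend extrapolation. The yearly figures are invented.
years = np.array([2000, 2001, 2002, 2003, 2004, 2005])
units = np.array([12.0, 15.0, 19.0, 24.0, 31.0, 39.0])   # e.g. device shipments

# Fit an exponential trend units ~ a * exp(b * t) via least squares on log(units)
t = years - years[0]
b, log_a = np.polyfit(t, np.log(units), 1)

def forecast(year):
    """Extrapolate the fitted exponential trend to a future year."""
    return np.exp(log_a + b * (year - years[0]))

print(round(float(forecast(2008)), 1))   # long-range extrapolation of the trend
```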
Abstract:
Water is vital to humans, and each of us needs at least 1.5 L of safe water a day to drink. Beginning as long ago as 1958, the World Health Organization (WHO) has published guidelines to help ensure water is safe to drink. Focused from the start on monitoring radionuclides in water, and continually cooperating with WHO, the International Organization for Standardization (ISO) has been publishing standards on radioactivity test methods since 1978. As reliable, comparable and "fit for purpose" results are an essential requirement for any public health decision based on radioactivity measurements, international standards of tested and validated radionuclide test methods are an important tool for the production of such measurements. This paper presents the ISO standards already published that could be used as normative references by testing laboratories in charge of radioactivity monitoring of drinking water, as well as those currently being drafted, and the prospect of standardized fast test methods for use in response to a nuclear accident.