76 results for Computational tools
Abstract:
This thesis gives an overview of the use of level set methods in the field of image science. The related fast marching method is discussed for comparison, and the narrow band and particle level set methods are also introduced. The level set method is a numerical scheme for representing, deforming and recovering structures in arbitrary dimensions. It approximates and tracks moving interfaces, dynamic curves and surfaces. The level set method does not define how or why a boundary advances the way it does; it simply represents and tracks the boundary. The principal idea of the level set method is to represent an N-dimensional boundary in N+1 dimensions, which gives the generality to represent even complex boundaries. Level set methods can be powerful tools for representing dynamic boundaries, but they can require a lot of computing power. Especially the basic level set method carries a considerable computational burden. This burden can be alleviated with more sophisticated versions of the level set algorithm, such as the narrow band level set method, or with a programmable hardware implementation. A parallel approach can also be used in suitable applications. It is concluded that these methods can be used in a broad range of imaging applications, such as computer vision and graphics and scientific visualization, and also to solve problems in computational physics. Level set methods, and methods derived from and inspired by them, will remain at the front line of image processing in the future.
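As an illustration of the core idea described above, the following minimal Python/NumPy sketch embeds a one-dimensional boundary (a circle) as the zero level set of a two-dimensional function and advances it with constant normal speed using a first-order upwind scheme. The grid size, speed and time step are arbitrary illustrative choices, not values taken from the thesis.

```python
# Minimal sketch of the level set idea: the boundary is the zero level set of phi,
# and it is moved by solving phi_t + F * |grad(phi)| = 0 with an upwind scheme.
import numpy as np

def evolve_level_set(phi, speed=1.0, dt=0.4, steps=50, h=1.0):
    """Advance the interface {phi == 0} with constant normal speed `speed`."""
    for _ in range(steps):
        dxm = (phi - np.roll(phi, 1, axis=0)) / h   # backward difference in x
        dxp = (np.roll(phi, -1, axis=0) - phi) / h  # forward difference in x
        dym = (phi - np.roll(phi, 1, axis=1)) / h   # backward difference in y
        dyp = (np.roll(phi, -1, axis=1) - phi) / h  # forward difference in y
        if speed > 0:  # Godunov upwind choice depends on the sign of the speed
            grad = np.sqrt(np.maximum(dxm, 0)**2 + np.minimum(dxp, 0)**2 +
                           np.maximum(dym, 0)**2 + np.minimum(dyp, 0)**2)
        else:
            grad = np.sqrt(np.minimum(dxm, 0)**2 + np.maximum(dxp, 0)**2 +
                           np.minimum(dym, 0)**2 + np.maximum(dyp, 0)**2)
        phi = phi - dt * speed * grad
    return phi

# Signed distance function of a circle of radius 20 on a 100 x 100 grid.
x, y = np.meshgrid(np.arange(100), np.arange(100), indexing="ij")
phi0 = np.sqrt((x - 50.0)**2 + (y - 50.0)**2) - 20.0
phi = evolve_level_set(phi0, speed=1.0)                      # interface expands outward
print("cells inside the interface:", int((phi < 0).sum()))  # roughly pi * 40**2
```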
Abstract:
The objective of the study was to find out how to develop the company's current e-service system, an electronic communication and information sharing system based on Internet technology, in the management of the company's business-to-business customer relationships. Another objective was to create proposals for new e-service contract models. In the theoretical part of the study, the aim was to develop a framework model based on previous studies, reference literature and expert input. In the empirical part, the objectives were pursued by interviewing the company's customers and personnel and by examining the current state and development of customer contacts. Based on this information, the needs, profile and readiness of e-service users to use the service, as well as the current attractiveness of the service, were examined. The source material for the theoretical part consisted of literature, articles and statistics on customer management and on the marketing, current state and development of e-services, especially Internet and web services. In addition, literature on value network analysis, customer value, information technology, service quality and customer satisfaction was studied. The empirical part of the study is based on information collected in interviews with the company's personnel and customers, on material previously collected by the company, and on data collected by Taloustutkimus. The study used a case method that combined qualitative and quantitative research. The purpose of the case was to test the validity and usability of the model and to find out whether there are further factors that affect the value received by the customer. The qualitative material is based on customers and company employees interviewed using the thematic interview method. The quantitative research is based on a study by Taloustutkimus and on data collected from the company's customer contacts. Based on the interviews, e-services were seen as useful and as very important in the future. E-services are seen as one important channel, alongside traditional channels, for making the management of business-to-business customer relationships more effective. According to the results of the study, the variation in customers' levels of knowledge, skills, need and interest related to the service shows a clear need for e-service package solutions at different levels. The solution proposal formed from the results comprises the construction of four different e-service packages adapted to the different needs of the customers.
Abstract:
The objective of the study was to examine the state of internal communication in the case companies. The companies belong to two case value networks operating in the field of information and communication technology. Internal communication was chosen as the research area because it forms the basis for external, inter-company communication. The focus of the study was on web-based communication and on the characteristics of the web from the value network perspective. Both qualitative and quantitative methods were used in the research process. The quantitative part of the study was carried out as a web survey, the results of which showed that internal communication in the case companies is based mainly on the use of traditional communication tools. In other words, utilization of the web is minor, which is influenced by many different factors. The web, however, has several characteristics that improve communication in a value network, and therefore these web-based tools should be taken into account when designing a general communication system. In the theoretical part of the study, a classification of communication tools based on the interactivity characteristic was defined. In addition, the concept of a value network was defined. The empirical part consisted of reporting the implementation and results of the web survey, after which a summary chapter compiled the most significant findings and possible topics for further research.
Abstract:
This thesis analyses the calculation of the FanSave and PumpSave energy saving tools. With these programs, the energy consumption of variable speed drive control for fans and pumps can be compared to that of other control methods. FanSave can examine centrifugal and axial fans, while PumpSave deals with centrifugal pumps. The programs can also be used to choose a suitable frequency converter from the ABB range. As initial values, the programs need information about the appliances, such as flow rate and efficiencies. Operation time is an important factor when calculating the annual energy consumption; the required information consists of its length and profile. The basic theory related to fans and pumps is introduced without more precise instructions for dimensioning. FanSave and PumpSave cover various flow control methods, which are introduced in the thesis in terms of their operating principles and suitability. The squirrel cage motor and the frequency converter are also introduced because of their close involvement with fans and pumps. The second part of the thesis contains a comparison between the results of FanSave's and PumpSave's calculations and calculations based on performance curves. Laboratory tests were also made with a centrifugal fan, an axial fan and a centrifugal pump. With the results of this thesis, the calculation in these programs can be adjusted to be more accurate, and some new features can be added.
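For orientation, the sketch below shows the kind of annual energy estimate such tools perform for variable speed drive control. It is not FanSave's or PumpSave's actual calculation: it assumes a pure friction system (no static head), so the affinity laws make shaft power scale with the cube of the relative flow, and all numeric values (flow, head, efficiencies, duty profile) are hypothetical.

```python
# Illustrative annual energy estimate for a pump with variable speed drive control.
RHO_G = 1000.0 * 9.81   # water density [kg/m3] * gravitational acceleration [m/s2]

def shaft_power_kw(flow_m3s, head_m, pump_eff, drive_eff):
    """Shaft power at the nominal operating point [kW]."""
    return RHO_G * flow_m3s * head_m / (pump_eff * drive_eff) / 1000.0

def annual_energy_vsd(p_nom_kw, duty_profile):
    """Energy [kWh/a] with speed control, affinity-law approximation.
    duty_profile: list of (relative_flow, hours_per_year)."""
    return sum(p_nom_kw * rel_q**3 * hours for rel_q, hours in duty_profile)

p_nom = shaft_power_kw(flow_m3s=0.05, head_m=30.0, pump_eff=0.75, drive_eff=0.95)
profile = [(1.0, 1000), (0.8, 3000), (0.6, 2000), (0.4, 2000)]   # 8000 h/a in total
e_vsd = annual_energy_vsd(p_nom, profile)
e_fixed = p_nom * sum(h for _, h in profile)   # running at nominal power for all hours
print(f"nominal {p_nom:.1f} kW, VSD {e_vsd:.0f} kWh/a vs fixed speed {e_fixed:.0f} kWh/a")
```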
Abstract:
Multispectral images are becoming more common in the fields of remote sensing, computer vision and industrial applications. Due to the high accuracy of the multispectral information, it can be used as an important quality factor in the inspection of industrial products. Recently, the development of multispectral imaging systems and the computational analysis of multispectral images have been the focus of growing interest. In this thesis, three areas of multispectral image analysis are considered. First, a method for analyzing multispectral textured images was developed. The method is based on a spectral cooccurrence matrix, which contains information about the joint distribution of spectral classes in the spectral domain. Next, a procedure for estimating the illumination spectrum of color images was developed. The proposed method can be used, for example, in color constancy, color correction, and in content-based search from color image databases. Finally, color filters for optical pattern recognition were designed, and a prototype of a spectral vision system was constructed. The spectral vision system can be used to acquire a low-dimensional component image set for two-dimensional spectral image reconstruction. The amount of data obtained by the spectral vision system is small and therefore convenient for storing and transmitting a spectral image.
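The sketch below illustrates one plausible reading of a spectral cooccurrence matrix: pixels are first assigned to spectral classes, and the matrix counts how often class i occurs next to class j at a given spatial offset. The exact definition in the thesis may differ; the class spectra, offsets and image data here are hypothetical.

```python
# Illustrative spectral cooccurrence matrix for a multispectral image (NumPy only).
import numpy as np

def classify_pixels(image, class_spectra):
    """Assign each pixel (H, W, bands) to its nearest spectral class (Euclidean distance)."""
    d = np.linalg.norm(image[:, :, None, :] - class_spectra[None, None, :, :], axis=-1)
    return d.argmin(axis=2)

def spectral_cooccurrence(class_map, n_classes, dr=0, dc=1):
    """Joint distribution of spectral classes for pixel pairs at offset (dr, dc);
    dr and dc are assumed non-negative to keep the slicing simple."""
    a = class_map[:class_map.shape[0] - dr, :class_map.shape[1] - dc]
    b = class_map[dr:, dc:]
    m = np.zeros((n_classes, n_classes))
    np.add.at(m, (a.ravel(), b.ravel()), 1)   # count class pairs
    return m / m.sum()                        # normalize to a joint probability

rng = np.random.default_rng(0)
img = rng.random((64, 64, 8))                 # hypothetical 8-band image
centers = rng.random((5, 8))                  # 5 hypothetical spectral classes
classes = classify_pixels(img, centers)
P = spectral_cooccurrence(classes, n_classes=5, dr=0, dc=1)
print(P.shape, round(P.sum(), 3))             # (5, 5) 1.0
```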
Abstract:
This thesis concentrates on studying the operational disturbance behavior of machine tools integrated into FMS. Operational disturbances are short-term failures of machine tools which are especially disruptive to unattended or unmanned operation of FMS. The main objective was to examine the effect of operational disturbances on the reliability and operation time distribution of machine tools. The theoretical part of the thesis covers the fundamentals of FMS relating to the subject of this study. The concept of FMS, its benefits and the operator's role in FMS operation are reviewed. The importance of reliability is presented. The terms describing the operation time of machine tools are formed by adopting standards and references. The concept of failure and the indicators describing the reliability and operational performance of machine tools in FMSs are presented. The empirical part of the thesis describes the research methodology, which is a combination of automated data collection (ADC) and manual data collection. By using this methodology it is possible to obtain a complete view of the operation time distribution of the studied machine tools. Data collection was carried out in four FMSs consisting of a total of 17 machine tools. Each FMS's basic features and the signals of the ADC are described. The indicators describing the reliability and operation time distribution of the machine tools were calculated from the collected data. The results showed that operational disturbances have a significant influence on machine tool reliability and operational performance. On average, an operational disturbance occurs every 8.6 hours of operation time and causes a down time of 0.53 hours. Operational disturbances cause a 9.4% loss in operation time, which is twice the loss caused by technical failures (4.3%). Operational disturbances decrease the utilization rate: the poorer the operational disturbance behavior, the lower the utilization rate. It was found that the features of the part family to be machined, and the method technology related to it, define the operational disturbance behavior of the machine tool. The main causes of operational disturbances were related to material quality variations, tool maintenance, NC program errors, the ATC and the machine tool control. The operator's role was emphasized: the failure recording activity of the operators was found to correlate with the utilization rate. The more precisely the operators record the failures, the higher the utilization rate. FMS organizations that record failures more precisely also have fewer operational disturbances.
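The sketch below shows one possible way to compute reliability indicators of this kind from collected operation and down time records. The data layout and the exact indicator definitions are assumptions for illustration and may differ from those used in the thesis; the numeric inputs are hypothetical.

```python
# Illustrative reliability indicators from operation time and down time records.
def disturbance_indicators(operation_hours, disturbance_down_times, failure_down_times):
    """Compute simple indicators from an annual operation time [h] and lists of
    down times [h] for operational disturbances and technical failures."""
    mtbd = operation_hours / len(disturbance_down_times)          # mean operation time between disturbances
    mean_down = sum(disturbance_down_times) / len(disturbance_down_times)
    total = operation_hours + sum(disturbance_down_times) + sum(failure_down_times)
    return {
        "mean_time_between_disturbances_h": mtbd,
        "mean_disturbance_down_time_h": mean_down,
        "disturbance_loss_share": sum(disturbance_down_times) / total,
        "technical_failure_loss_share": sum(failure_down_times) / total,
    }

example = disturbance_indicators(
    operation_hours=1720.0,                  # hypothetical annual operation time
    disturbance_down_times=[0.5, 0.7, 0.4] * 60,   # 180 hypothetical disturbances
    failure_down_times=[8.0, 12.0, 5.0],           # hypothetical technical failures
)
for name, value in example.items():
    print(f"{name}: {value:.3f}")
```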
Abstract:
Workflow management systems aim at the controlled execution of complex application processes in distributed and heterogeneous environments. These systems will shape the structure of information systems in business and non-business environments. E-business and system integration are fertile soil for WF and groupware tools. This thesis aims to study WF and groupware tools in order to gather in-house knowledge of WF to better utilize WF solutions in the future, and to focus on SAP Business Workflow in order to find a global solution for Application Link Enabling (ALE) support for system integration. Piloting this solution at Nokia gathers experience with the SAP R/3 WF tool for other development projects in the future. The literature part of this study guides the reader into the world of business process automation, providing a general description of the history, use and potential of WF and groupware software. The empirical part of this study begins with the background of the case study, describing the IT environment and initiating the case by introducing SAP R/3 at Nokia, the communication technique in use and the WF tool. The case study focuses on one solution built with SAP Business Workflow. This study provides a concept for monitoring communication between ERP systems and for increasing the quality of system integration. The case study describes a way to create a support model for ALE/EDI interfaces. The support model includes the monitoring organization and the workflow processes for solving the most common IDoc-related errors.
Abstract:
In the paper machine, it is not desirable that the boundary layer flows on the fabric and roll surfaces travel into the closing nips and create overpressure. In this thesis, the aerodynamic behavior of grooved and smooth rolls is compared in order to understand the nip flow phenomena, which are the main reason why vacuum and grooved roll constructions are designed. A common method to remove the boundary layer flow from the closing nip is to use a vacuum roll construction. The downside of vacuum rolls is their high operational cost due to pressure losses in the vacuum roll shell. The deep grooved roll has the same goal: to create a pressure difference over the paper web and keep the paper attached to the roll or fabric surface in the drying pocket of the paper machine. A literature review revealed that the aerodynamic functionality of the grooved roll is not very well known. In this thesis, the aerodynamic functionality of the grooved roll in interaction with a permeable or impermeable wall is studied by varying the groove properties. Computational fluid dynamics simulations are utilized as the research tool. The simulations have been performed with commercial fluid dynamics software, ANSYS Fluent. Simulation results obtained with 3- and 2-dimensional fluid dynamics models are compared to laboratory-scale measurements. The measurements have been made with a grooved roll simulator designed for this research. The variables in the comparison are the paper or fabric wrap angle, the surface velocities, the groove geometry and the wall permeability. Present-day computational and modeling resources limit grooved roll fluid dynamics simulations at the paper machine scale. Based on the analysis of the aerodynamic functionality of the grooved roll, a grooved roll simulation tool is proposed. The smooth roll simulations show that the closing nip pressure does not depend on the length of boundary layer development. An increase in surface velocity affects the pressure distribution in the closing and opening nips. The 3D grooved roll model reveals the aerodynamic functionality of the grooved roll. With the optimal groove size it is possible to avoid closing nip overpressure and keep the web attached to the fabric surface over the wrap angle. The groove flow friction and minor losses play different roles when the wrap angle is changed. The proposed 2D grooved roll simulation tool is able to replicate the grooved roll's aerodynamic behavior with reasonable accuracy. With a small wrap angle, the chosen approach for calculating the groove friction losses predicts the pressure distribution correctly. With a large wrap angle, the groove friction loss produces too large pressure gradients, and the way of calculating the air flow friction losses in the groove has to be reconsidered. The aerodynamic functionality of the grooved roll is based on minor and viscous losses in the closing and opening nips as well as in the grooves. The proposed 2D grooved roll model is a simplification made in order to reduce the computational and modeling effort. The simulation tool makes it possible to simulate complex paper machine constructions at the paper machine scale. In order to use the grooved roll as a replacement for the vacuum roll, the grooved roll properties have to be considered on the basis of the web handling application.
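To make the role of friction and minor losses concrete, the sketch below estimates the pressure loss along a single groove as the sum of Darcy-Weisbach wall friction and minor losses. This is only an illustration, not the CFD model of the thesis: the groove is treated as a closed rectangular duct (an assumption, since the real groove is partly bounded by the moving web or fabric), and all dimensions, velocities and loss coefficients are hypothetical.

```python
# Illustrative pressure loss in one groove: wall friction plus minor losses.
def groove_pressure_loss(velocity, length, width, depth, k_minor_sum=1.5,
                         rho=1.2, mu=1.8e-5):
    """Darcy-Weisbach friction plus minor losses for air flow in a groove [Pa]."""
    area = width * depth
    d_h = 4.0 * area / (2.0 * (width + depth))   # hydraulic diameter of the duct
    re = rho * velocity * d_h / mu               # Reynolds number
    if re < 2300.0:
        f = 64.0 / re                            # laminar friction factor
    else:
        f = 0.316 * re**-0.25                    # Blasius correlation for a smooth duct
    dynamic_pressure = 0.5 * rho * velocity**2
    return (f * length / d_h + k_minor_sum) * dynamic_pressure

# Example: 2 mm x 2 mm groove, 0.5 m flow length, 10 m/s air velocity.
print(f"groove pressure loss ~ {groove_pressure_loss(10.0, 0.5, 0.002, 0.002):.0f} Pa")
```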
Abstract:
Virtual screening is a central technique in drug discovery today. Millions of molecules can be tested in silico with the aim of selecting only the most promising ones and testing them experimentally. The topic of this thesis is ligand-based virtual screening tools, which take existing active molecules as the starting point for finding new drug candidates. One goal of this thesis was to build a model that gives the probability that two molecules are biologically similar as a function of one or more chemical similarity scores. Another important goal was to evaluate how well different ligand-based virtual screening tools are able to distinguish active molecules from inactive ones. One more criterion set for the virtual screening tools was their applicability to scaffold-hopping, i.e. finding new active chemotypes. In the first part of the work, a link was defined between the abstract chemical similarity score given by a screening tool and the probability that the two molecules are biologically similar. These results help to decide objectively which virtual screening hits to test experimentally. The work also resulted in a new type of data fusion method for use with two or more tools. In the second part, five ligand-based virtual screening tools were evaluated and their performance was found to be generally poor. Three reasons for this were proposed: false negatives in the benchmark sets, active molecules that do not share the binding mode, and activity cliffs. In the third part of the study, a novel visualization and quantification method is presented for evaluating the scaffold-hopping ability of virtual screening tools.
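The sketch below shows one generic way to link a chemical similarity score to the probability that two molecules are biologically similar, by fitting a simple logistic model to labelled molecule pairs. It is illustrative only and is not the model developed in the thesis; the scores and labels are synthetic.

```python
# Illustrative mapping from a similarity score to P(biologically similar | score).
import numpy as np

def fit_logistic(scores, labels, lr=0.5, epochs=5000):
    """Fit p = sigmoid(a * score + b) to 0/1 labels by gradient descent on log-loss."""
    a, b = 0.0, 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(a * scores + b)))
        grad = p - labels                       # derivative of the log-loss wrt the logit
        a -= lr * np.mean(grad * scores)
        b -= lr * np.mean(grad)
    return a, b

# Hypothetical data: similarity scores in [0, 1] and 0/1 biological-similarity labels.
rng = np.random.default_rng(1)
scores = rng.random(500)
labels = (rng.random(500) < scores**2).astype(float)   # toy ground truth
a, b = fit_logistic(scores, labels)
prob = lambda s: 1.0 / (1.0 + np.exp(-(a * s + b)))
print(f"P(similar | score=0.9) ~ {prob(0.9):.2f}")
```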
Abstract:
Software testing is one of the essential parts of the software engineering process. The objective of the study was to describe software testing tools and their use. The thesis contains examples of software testing tool usage. The study was conducted as a literature study, with a focus on current software testing practices and quality assurance standards. A tool classifier was employed, and the testing tools presented in the study were classified according to it. We found that it is difficult to distinguish currently available tools by specific testing activities, as many of them contain functionality that exceeds the scope of a single testing type.
Abstract:
The problem of understanding how humans perceive the quality of a reproduced image is of interest to researchers in many fields related to vision science and engineering: optics and material physics, image processing (compression and transfer), printing and media technology, and psychology. A measure of visual quality cannot be defined without ambiguity because it is ultimately the subjective opinion of an "end user" observing the product. The purpose of this thesis is to devise computational methods to estimate the overall visual quality of prints, i.e. a numerical value that combines all the relevant attributes of the perceived image quality. The problem is limited to the perceived quality of printed photographs from the viewpoint of a consumer, and moreover, the study focuses only on digital printing methods, such as inkjet and electrophotography. The main contributions of this thesis are two novel methods for estimating the overall visual quality of prints. In the first method, the quality is computed as a visible difference between the reproduced image and the original digital (reference) image, which is assumed to have ideal quality. The second method utilises instrumental print quality measures, such as colour densities, measured from printed technical test fields, and connects the instrumental measures to the overall quality via subjective attributes, i.e. attributes that directly contribute to the perceived quality, using a Bayesian network. Both approaches were evaluated and verified with real data and shown to predict the subjective evaluation results well.
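As a simplified illustration of the first approach, the sketch below quantifies quality as a colour difference between the reproduced image and the reference. The thesis metric combines several perceptual attributes; here only a mean CIE76 colour difference is computed, and both images are assumed to be already registered and converted to CIELAB, with synthetic data standing in for real scans.

```python
# Illustrative visible-difference measure: mean CIE76 delta E between reference and print.
import numpy as np

def mean_delta_e(reference_lab, print_lab):
    """Mean per-pixel Euclidean colour difference in CIELAB (shape H x W x 3)."""
    diff = reference_lab.astype(float) - print_lab.astype(float)
    delta_e = np.sqrt((diff**2).sum(axis=-1))
    return delta_e.mean()

# Toy usage with synthetic Lab images (hypothetical values).
rng = np.random.default_rng(2)
ref = rng.uniform([0, -20, -20], [100, 20, 20], size=(32, 32, 3))
rep = ref + rng.normal(0.0, 2.0, size=ref.shape)   # simulated reproduction error
print(f"mean dE76 = {mean_delta_e(ref, rep):.2f}")
```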
Abstract:
Crystallization is a purification method used to obtain a crystalline product of a certain crystal size. It is one of the oldest industrial unit processes and is commonly used in modern industry due to its good purification capability, even from rather impure solutions, with reasonably low energy consumption. However, the process is extremely challenging to model and control because it involves inhomogeneous mixing and many simultaneous phenomena, such as nucleation, crystal growth and agglomeration. All these phenomena depend on supersaturation, i.e. the difference between the actual liquid phase concentration and the solubility. Homogeneous mass and heat transfer in the crystallizer would greatly simplify modelling and control of crystallization processes; such conditions are, however, not the reality, especially in industrial-scale processes. Consequently, the hydrodynamics of crystallizers, i.e. the combination of mixing, feed and product removal flows, and recycling of the suspension, needs to be thoroughly investigated. Understanding of hydrodynamics is important in crystallization, especially in larger-scale equipment where uniform flow conditions are difficult to attain. It is also important to understand the different size scales of mixing: micro-, meso- and macromixing. Fast processes, like nucleation and chemical reactions, are typically highly dependent on micro- and mesomixing, but macromixing, which equalizes the concentrations of all the species within the entire crystallizer, cannot be disregarded. This study investigates the influence of hydrodynamics on crystallization processes. Modelling of crystallizers with the mixed suspension mixed product removal (MSMPR) theory (ideal mixing), computational fluid dynamics (CFD), and a compartmental multiblock model is compared. The importance of proper verification of CFD and multiblock models is demonstrated. In addition, the influence of different hydrodynamic conditions on reactive crystallization process control is studied. Finally, the effect of extreme local supersaturation is studied using power ultrasound to initiate nucleation. The present work shows that mixing and chemical feeding conditions clearly affect induction time and cluster formation, nucleation, growth kinetics, and agglomeration. Consequently, the properties of crystalline end products, e.g. crystal size and crystal habit, can be influenced by management of mixing and feeding conditions. Impurities may have varying impacts on crystallization processes. As an example, manganese ions were shown to replace magnesium ions in the crystal lattice of magnesium sulphate heptahydrate, increasing the crystal growth rate significantly, whereas sodium ions showed no interaction at all. Modelling of continuous crystallization based on MSMPR theory showed that the model is feasible in a small laboratory-scale crystallizer, whereas in larger pilot- and industrial-scale crystallizers hydrodynamic effects should be taken into account. For that reason, CFD and multiblock modelling are shown to be effective tools for modelling crystallization with inhomogeneous mixing. The present work also shows that the selection of the measurement point, or points in the case of multiprobe systems, is crucial when process analytical technology (PAT) is used to control larger-scale crystallization. The thesis concludes by describing how control of local supersaturation by highly localized ultrasound was successfully applied to induce nucleation and to control polymorphism in the reactive crystallization of L-glutamic acid.
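For reference, the sketch below evaluates the standard steady-state result of the ideal MSMPR population balance used as the baseline (ideal mixing) model mentioned above: the population density follows n(L) = n0 exp(-L / (G tau)), where G is the crystal growth rate and tau the mean residence time. The numeric values are illustrative only, not taken from the thesis.

```python
# Ideal MSMPR crystal size distribution at steady state (illustrative values).
import numpy as np

def msmpr_csd(sizes_m, nucleation_density_n0, growth_rate_g, residence_time_s):
    """Steady-state population density n(L) for an ideal MSMPR crystallizer."""
    return nucleation_density_n0 * np.exp(-sizes_m / (growth_rate_g * residence_time_s))

sizes = np.linspace(0.0, 500e-6, 6)           # crystal sizes up to 500 micrometres
n = msmpr_csd(sizes, nucleation_density_n0=1e12, growth_rate_g=5e-8, residence_time_s=3600)
for size, density in zip(sizes, n):
    print(f"L = {size*1e6:6.1f} um   n(L) = {density:.2e}")
```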
Abstract:
Print quality and the printability of paper are very important attributes when modern printing applications are considered. In prints containing images, high print quality is a basic requirement. Tone unevenness and non-uniform glossiness of printed products are the most disturbing factors influencing overall print quality. These defects are caused by non-ideal interactions of paper, ink and printing devices in high-speed printing processes. Since print quality is a perceptual characteristic, measuring unevenness in accordance with human vision is a significant problem. In this thesis, the mottling phenomenon is studied. Mottling is a printing defect characterized by a spotty, non-uniform appearance in solid printed areas. Print mottle is usually the result of uneven ink laydown or non-uniform ink absorption across the paper surface, and it is especially visible in mid-tone imagery or areas of uniform color, such as solids and continuous-tone screen builds. By using existing knowledge of visual perception and known methods to quantify print tone variation, a new method for print unevenness evaluation is introduced. The method is compared to previous results in the field and is supported by psychometric experiments. Pilot studies were made to estimate the effect of the optical paper characteristics prior to printing on the unevenness of the printed area after printing. Instrumental methods for print unevenness evaluation have been compared, and the results of the comparison indicate that the proposed method produces better results in terms of correspondence with visual evaluation. The method has been successfully implemented as an industrial application and has proved to be a reliable substitute for visual expertise.
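The sketch below shows a simplified way to quantify print tone variation of the kind discussed above: the coefficient of variation of mean lightness over tiles of a nominally uniform printed area. The thesis method additionally weights the variation according to the human visual system; the tile size and the simulated data here are hypothetical.

```python
# Illustrative mottle index: tile-wise lightness variation over a solid printed area.
import numpy as np

def mottle_index(lightness, tile=32):
    """Coefficient of variation [%] of mean lightness over non-overlapping tiles."""
    h, w = lightness.shape
    h, w = h - h % tile, w - w % tile                     # crop to whole tiles
    tiles = lightness[:h, :w].reshape(h // tile, tile, w // tile, tile)
    tile_means = tiles.mean(axis=(1, 3))                  # mean lightness of each tile
    return 100.0 * tile_means.std() / tile_means.mean()

rng = np.random.default_rng(3)
solid = 70.0 + rng.normal(0.0, 1.5, size=(512, 512))      # simulated scan of a solid print
print(f"mottle index = {mottle_index(solid):.3f} %")
```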
Abstract:
The appearance of vibration is a very important problem in long-tool turning and milling. The current solutions for minimizing vibrations provided by different tool suppliers are very expensive. This Master's Thesis presents a new type of vibration-free machining tools produced by Konepaja ASTEX Gear Oy, which have lower production costs compared to competitors' products. Vibration problems in machining and their current solutions are analyzed in this work. The new vibration damping invention is presented and described. Moreover, the production, laboratory experimental modal analysis and practical testing of the new vibration-free prototypes are reported and analyzed in this thesis. Based on the test results, the new invention is considered successful and approved for further study and development.
Abstract:
The use of domain-specific languages (DSLs) has been proposed as an approach to cost-effectively develop families of software systems in a restricted application domain. Domain-specific languages, in combination with the accumulated knowledge and experience of previous implementations, can in turn be used to generate new applications with unique sets of requirements. For this reason, DSLs are considered to be an important approach to software reuse. However, the toolset supporting a particular domain-specific language is also domain-specific and is by definition not reusable. Therefore, creating and maintaining a DSL requires additional resources that could be even larger than the savings associated with using them. As a solution, different tool frameworks have been proposed to simplify and reduce the cost of developing DSLs. Developers of tool support for DSLs need to instantiate, customize or configure the framework for a particular DSL. There are different approaches for this. One approach is to use an application programming interface (API) and to extend the basic framework using an imperative programming language. An example of a tool based on this approach is Eclipse GEF. Another approach is to configure the framework using declarative languages that are independent of the underlying framework implementation. We believe this second approach can bring important benefits, as it puts the focus on specifying what the tool should be like instead of writing a program specifying how the tool achieves this functionality. In this thesis we explore this second approach. We use graph transformation as the basic approach to customize a domain-specific modeling (DSM) tool framework. The contributions of this thesis include a comparison of different approaches for defining, representing and interchanging software modeling languages and models, and a tool architecture for an open domain-specific modeling framework that efficiently integrates several model transformation components and visual editors. We also present several specific algorithms and tool components for the DSM framework. These include an approach for graph queries based on region operators and the star operator, and an approach for reconciling models and diagrams after executing model transformation programs. We exemplify our approach with two case studies, MICAS and EFCO. In these studies we show how our experimental modeling tool framework has been used to define tool environments for domain-specific languages.
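The sketch below illustrates one possible reading of a star-operator-style graph query over a model stored as a labelled adjacency list: Kleene-star reachability, i.e. all nodes reachable via zero or more edges, optionally restricted to one edge label. This interpretation, the data layout and the example model are assumptions for illustration and are not the thesis implementation.

```python
# Illustrative star-style graph query: transitive reachability over labelled model edges.
from collections import deque

def star(graph, start, edge_label=None):
    """All nodes reachable from `start` via zero or more edges (optionally of one label)."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for label, target in graph.get(node, []):
            if (edge_label is None or label == edge_label) and target not in seen:
                seen.add(target)
                queue.append(target)
    return seen

# Hypothetical model graph with labelled containment edges.
model = {
    "Car": [("has", "Engine"), ("has", "Wheel")],
    "Engine": [("has", "Piston")],
    "Wheel": [("has", "Tyre")],
}
print(star(model, "Car", edge_label="has"))   # {'Car', 'Engine', 'Wheel', 'Piston', 'Tyre'}
```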