947 results for modeling and model calibration


Relevance: 100.00%

Abstract:

In recent years, public policy has offered family farmers subsidized credit for machinery purchases. However, there is no methodological procedure for selecting a tractor suited to these farmers' situation. We therefore aimed to develop a tractor selection model for smallholder farmers in the Pelotas region of the state of Rio Grande do Sul. Building a multicriteria decision-aid model is divided into three main stages: the structuring stage (identifying stakeholders, the decision context and model creation), the evaluation stage (quantifying stakeholder preferences) and the recommendation stage (choice selection). The multicriteria method is able to identify and weight the criteria that regional family farmers use in tractor selection. Six main evaluation areas were identified: operational cost (weight 0.20), purchase cost (weight 0.22), maintainability (weight 0.10), tractor capacity (weight 0.26), ergonomics (weight 0.14) and safety (weight 0.08). The best-rated tractor model (14.7 kW rated power) was also the one purchased by 53.3% of local families.
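The six weighted evaluation areas above sum to 1.00, which suggests a weighted additive aggregation of the usual multi-attribute value kind. A minimal sketch of such an aggregation is below; the weights are the ones stated in the abstract, while the per-tractor partial scores and tractor names are hypothetical.

```python
# Weighted additive aggregation over the six evaluation areas from the study.
# The weights are taken from the abstract; the 0-100 partial scores below are
# hypothetical illustration data, not results from the study.
WEIGHTS = {
    "operational_cost": 0.20,
    "purchase_cost": 0.22,
    "maintainability": 0.10,
    "tractor_capacity": 0.26,
    "ergonomics": 0.14,
    "safety": 0.08,
}

def overall_value(scores: dict) -> float:
    """Weighted sum of partial value scores (multi-attribute value model)."""
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

# Hypothetical partial scores for two candidate tractors.
tractor_a = {"operational_cost": 80, "purchase_cost": 70, "maintainability": 60,
             "tractor_capacity": 75, "ergonomics": 65, "safety": 70}
tractor_b = {"operational_cost": 60, "purchase_cost": 85, "maintainability": 70,
             "tractor_capacity": 55, "ergonomics": 75, "safety": 80}

best = max([("A", tractor_a), ("B", tractor_b)], key=lambda t: overall_value(t[1]))
```

With these illustrative scores, tractor A wins on the strength of the heavily weighted capacity criterion, which mirrors how the weights steer the recommendation stage.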

Relevance: 100.00%

Abstract:

Based on experimental tests, equations were obtained for drying, equilibrium moisture content, the latent heat of vaporization of the water contained in the product, and the specific heat of cassava starch pellets, parameters essential for modeling and mathematically simulating the mechanical drying of cassava starch under a newly proposed technique, which consists of preforming the starch by pelleting and then artificially drying the pellets. Drying tests were conducted in an experimental chamber, varying air temperature, relative humidity, air velocity and product load. The specific heat of the starch was determined by differential scanning calorimetry. The generated equations were validated through regression analysis, which found an appropriate correlation with the data, indicating that these equations can be used to accurately model and simulate the drying process of cassava starch pellets.
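The abstract does not reproduce the fitted equation forms, so as a hedged illustration of the kind of regression involved, the sketch below fits the simple Lewis (Newton) thin-layer drying model, MR = exp(-k t), to hypothetical moisture-ratio data by linearized least squares.

```python
import math

# Hypothetical moisture-ratio observations (time in h, MR dimensionless).
# The study's actual equations and data are not reproduced here.
times = [0.5, 1.0, 1.5, 2.0, 3.0, 4.0]
mr =    [0.78, 0.61, 0.47, 0.37, 0.22, 0.13]

# Lewis thin-layer model MR = exp(-k t). Linearizing gives ln(MR) = -k t,
# a one-parameter least-squares problem with a closed-form solution for k.
num = sum(t * (-math.log(m)) for t, m in zip(times, mr))
den = sum(t * t for t in times)
k = num / den  # drying constant, 1/h

def mr_model(t: float) -> float:
    """Predicted moisture ratio at time t (h)."""
    return math.exp(-k * t)
```

In practice each drying condition (air temperature, humidity, velocity, load) would yield its own fitted constant, and goodness of fit would be checked by regression analysis as the abstract describes.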

Relevance: 100.00%

Abstract:

This master's thesis defines an online production optimization method for a biofuel-fired power plant. The specification work is part of a further development project for MW Power's MultiPower CHP power plant concept. From among various existing optimization approaches, a suitable method based on a plant model and a cost function is selected; its results are passed to the automation system as setpoints for PID controllers. The plant's energy and mass balances are calculated from process measurements, and their results are used as input data for the next optimization step. The objective of the optimization is a cost function whose terms are the revenues and costs arising from operating the plant. The process is optimized within the limits given to the controllers so that the total margin is maximized. As the plant accumulates operating hours and historical data, the optimization can be accelerated by statistically searching the historical data for a moment whose conditions match the current situation. The margin of that historical moment is compared with the margin obtained from optimizing the cost function, and the setpoints computed by whichever method yields the better margin are deployed for process control. If neither the cost function calculation nor the history-based search yields an improved margin, their setpoints are not deployed; instead, the optimum is sought with a deterministic optimization algorithm that searches the neighborhood of the current operating point for controller setpoints yielding a better margin. The control system can also be implemented in a predictive form. In the practical part of the work, the power plant model is created with two different modeling programs, one describing the boiler and the other the power plant process. The process values obtained from the modeling are used as input data in calculating the operating margin, which is computed from the cost function.
The largest revenues come from the sale of electricity and heat and from production subsidies, while the largest costs relate to repayment of the investment and fuel purchases. A sensitivity analysis is performed on the cost function, tracking the change in margin as the technical values of the process are varied. The results are compared with verification measurements performed at a reference power plant, and it is found that they are not fully consistent; the differences stem both from shortcomings of the modeling and from the rather short observation periods of the measurements. The practical implementation of the automated optimization system is initiated by specifying the optimization approach to be adopted, the control loops involved and the required input data. The project will continue with programming, testing and tuning of the system in a real power plant environment, and later with the implementation of predictive control.
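The fallback described above, a deterministic algorithm that searches the neighborhood of the current setpoints for a better margin within the controller limits, can be sketched as a simple coordinate hill climb. The margin function, setpoint names and limits below are all hypothetical stand-ins for the plant model.

```python
# Hedged sketch of the deterministic neighborhood search: perturb each
# controller setpoint within its limits and keep any change that improves
# the margin. The margin function is a hypothetical stand-in, not the
# thesis's actual cost function.
def margin(setpoints):
    load, temp = setpoints                     # relative load, steam temp (C)
    revenue = 120.0 * load                     # electricity + heat sales
    fuel_cost = 40.0 * load + 0.1 * (temp - 480.0) ** 2
    return revenue - fuel_cost

def neighborhood_search(setpoints, limits, step=1.0, iterations=100):
    current = list(setpoints)
    best = margin(current)
    for _ in range(iterations):
        improved = False
        for i, (lo, hi) in enumerate(limits):
            for delta in (-step, step):
                candidate = list(current)
                candidate[i] = min(hi, max(lo, candidate[i] + delta))
                m = margin(candidate)
                if m > best:
                    best, current, improved = m, candidate, True
        if not improved:              # local optimum within controller limits
            break
    return current, best

setpoints, best_margin = neighborhood_search([0.7, 470.0],
                                             [(0.3, 1.0), (450.0, 520.0)])
```

In the scheme described by the abstract, the setpoints found this way would be deployed only if they beat both the cost-function optimum and the best matching moment from the historical data.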

Relevance: 100.00%

Abstract:

Formal software development processes and well-defined development methodologies are nowadays seen as the definitive way to produce high-quality software within time limits and budgets. The variety of such high-level methodologies is huge, ranging from rigorous process frameworks like CMMI and RUP to more lightweight agile methodologies. The need to manage this variety, and the fact that practically every software development organization has its own unique set of development processes and methods, have created the profession of software process engineer. Different kinds of informal and formal software process modeling languages are essential tools for process engineers. They are used to define processes in a way that allows easy process management, for example process dissemination, process tailoring and process enactment. Process modeling languages are usually employed as tools for process engineering, where the main focus is on the processes themselves. This dissertation has a different emphasis: it analyses modern software development process modeling from the software developers' point of view. The goal of the dissertation is to investigate whether software process modeling and software process models aid software developers in their day-to-day work, and what the main mechanisms for this are. The focus of the work is on the Software Process Engineering Metamodel (SPEM) framework, currently one of the most influential process modeling notations in software engineering. The research theme is elaborated through six scientific articles representing the dissertation research done on process modeling over an approximately five-year period. The research follows the classical engineering research discipline: the current situation is analyzed, a potentially better solution is developed, and finally its implications are analyzed.
The research applies a variety of research techniques, ranging from literature surveys to qualitative studies among software practitioners. The key finding of the dissertation is that software process modeling notations and techniques are usually developed in process engineering terms. As a consequence, the connection between process models and actual development work is loose. In addition, modeling standards like SPEM are partially incomplete when it comes to pragmatic process modeling needs, such as lightweight modeling and combining pre-defined process components. This leads to a situation where the full potential of process modeling techniques for aiding daily development activities cannot be achieved. Despite these difficulties, the dissertation shows that it is possible to use modeling standards like SPEM to aid software developers in their work. The dissertation presents a lightweight modeling technique that software development teams can use to quickly analyze their work practices in a more objective manner. It also shows how process modeling can be used to compare different software development situations more easily and to analyze their differences in a systematic way. Models also help to share this knowledge with others. A qualitative study among Finnish software practitioners verifies the conclusions of the other studies in the dissertation. Although processes and development methodologies are seen as an essential part of software development, process modeling techniques are rarely used during daily development work. However, the potential of these techniques intrigues practitioners. In conclusion, the dissertation shows that process modeling techniques, most commonly used as tools for process engineers, can also be used as tools for organizing daily software development work.
This work presents theoretical solutions for bringing process modeling closer to ground-level software development activities. These theories are shown to be feasible through several case studies in which the modeling techniques are used, for example, to find differences in the work methods of the members of a software team and to share process knowledge with a wider audience.
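The idea of comparing teams' work practices through lightweight process models can be sketched with a few plain data structures. The element names below are illustrative only, not SPEM's own metaclasses, and the tasks are invented.

```python
from dataclasses import dataclass

# Minimal SPEM-like process elements (illustrative names, not the standard's
# metaclasses): a Task performed by a Role, consuming and producing
# work products identified by name.
@dataclass(frozen=True)
class Task:
    name: str
    role: str
    inputs: frozenset = frozenset()
    outputs: frozenset = frozenset()

def practice_diff(team_a: set, team_b: set) -> dict:
    """Compare two teams' task sets, as in lightweight practice analysis."""
    return {
        "only_a": {t.name for t in team_a - team_b},
        "only_b": {t.name for t in team_b - team_a},
        "shared": {t.name for t in team_a & team_b},
    }

review = Task("code review", "developer",
              frozenset({"patch"}), frozenset({"approved patch"}))
tests = Task("unit testing", "developer",
             frozenset({"code"}), frozenset({"test suite"}))
team_a = {review, tests}
team_b = {review}
diff = practice_diff(team_a, team_b)
```

Even such a toy model makes the practice difference between the teams explicit and shareable, which is the mechanism the dissertation argues for.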

Relevance: 100.00%

Abstract:

Parametric cost modeling is a technique in which cost estimating relationships are built on a product's parameters. Because the parameters are defined directly from product features, product cost can be estimated exactly before even a single product is manufactured and calculated with general cost accounting. A parametric model can be used in product design, in sourcing and in comparing the costs of similar products, and it reveals the origin of costs more clearly than general accounting. The purpose of this thesis was to find parameters for modeling elevator doors and to validate those parameters against actual costs. A further goal was to simulate cost impacts and changes in the cost structure. The results were compared with previous calculations and actual costs, and the model was tested in new product design. The calculations revealed that material consumption is the most significant factor in design effectiveness, along with the complexity of the components and the structure. To develop the model further, more research is needed on the cost structures of other continents and on waste calculation principles.
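A cost estimating relationship of the kind described is often a simple function of the design parameters. The sketch below uses a linear form with hypothetical coefficients and parameters; in the thesis these would be fitted against actual cost data for elevator doors.

```python
# Hedged sketch of a cost estimating relationship (CER) for an elevator door:
# cost as a linear function of design parameters. All coefficients and
# parameter values are hypothetical illustration data.
CER = {
    "base": 120.0,             # fixed cost per door, EUR
    "per_kg_material": 2.4,    # material consumption (dominant, per results)
    "per_component": 6.5,      # complexity of components and structure
    "per_weld_m": 11.0,        # joining effort
}

def door_cost(material_kg: float, n_components: int, weld_m: float) -> float:
    return (CER["base"]
            + CER["per_kg_material"] * material_kg
            + CER["per_component"] * n_components
            + CER["per_weld_m"] * weld_m)

# Simulating a cost impact, as in the thesis: a design change that cuts
# material consumption by 10 kg while keeping the structure unchanged.
baseline = door_cost(80.0, 24, 6.0)
lighter = door_cost(70.0, 24, 6.0)
saving = baseline - lighter
```

Because each term maps to a product feature, the model exposes where the cost originates, which is exactly the advantage over general cost accounting that the abstract highlights.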

Relevance: 100.00%

Abstract:

This work examined the suitability of different design methods for the fatigue assessment of welded structures. The methods used were the structural (hot-spot) stress method, the effective notch stress method and fracture mechanics. In addition, three different techniques were used to determine the structural stress: surface extrapolation, through-thickness linearization and Dong's method. Fatigue strength was determined for two welded-joint details. The calculation was performed with the finite element method on a 3D model of the structure. An FE model of the generator set frame under study already existed, but by using the submodeling technique it was possible to study a small part of the full frame model in greater detail. The structural stress method is based on nominal stresses and does not require modification of the geometry. It is usually applied to fatigue assessment at the weld toe, but in some cases it has been used for the root side; in this work the structural stress method was also used to examine the root side. The effective notch stress was studied by modeling fictitious 1 mm radii at both the weld toe and the root side. The applicability of fracture mechanics was studied using the Franc2D crack growth simulation program. The fatigue assessment results do not differ significantly between the calculation methods; only Dong's structural stress method gives deviating results, mainly because no information was available on the method's calculation distance. The structural stress method, the effective notch stress method and fracture mechanics give results in the same direction. The greatest difference between the methods is the amount of work required for modeling and calculation.
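The surface extrapolation technique mentioned above is commonly carried out with the IIW two-point linear rule, which extrapolates surface stresses read at 0.4t and 1.0t from the weld toe to the toe itself. A minimal sketch is below; the plate thickness and the FE-extracted stresses are hypothetical.

```python
# Linear surface extrapolation of the structural (hot-spot) stress using the
# common IIW reference points at 0.4t and 1.0t from the weld toe:
#   sigma_hs = 1.67 * sigma(0.4t) - 0.67 * sigma(1.0t)
# The thickness and FE-extracted stress values below are hypothetical.
def hot_spot_stress(sigma_04t: float, sigma_10t: float) -> float:
    return 1.67 * sigma_04t - 0.67 * sigma_10t

plate_thickness_mm = 8.0
sigma_04t = 182.0  # MPa, read at 0.4 * t = 3.2 mm from the toe
sigma_10t = 155.0  # MPa, read at 1.0 * t = 8.0 mm from the toe
sigma_hs = hot_spot_stress(sigma_04t, sigma_10t)
```

The extrapolated hot-spot stress is then compared against the appropriate structural-stress S-N (FAT) class; through-thickness linearization and Dong's method are alternative routes to the same structural stress.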

Relevance: 100.00%

Abstract:

The goal of this master's thesis is to apply process management methods, traditionally rooted in industrial production, to the development of reception services in social and health care. The aim is to use modeling of expert service processes to map their current state by identifying problem areas and development targets, and to form proposals for development measures to resolve them by applying the principles of process management methods. In line with the theoretical framework, the study examines, among other things, process modeling, that is, process description and analysis, as well as social and health care as service production. The study employs both qualitative and quantitative methods, and the research material is collected through interviews, observation and statistics. Based on the material, descriptions of the experts' service processes are drawn up, and problem areas and development targets related to the processes, or more broadly to the social and health care organization, are identified. The development measures emphasize customer orientation and productivity, and they take into account the perspectives of the customer, the personnel, the processes and customer guidance. The main results of the work are the process descriptions, the identified problem areas and development targets, and the proposed development measures formed for redefining the processes, whose benefits and effects on, for example, performance are assessed. Based on the results, it can be concluded that process management methods are suitable for developing social and health care operations, provided that the special characteristics and challenges of the operating environment are taken into account.

Relevance: 100.00%

Abstract:

Technological capabilities are built to support different types of collaboration, which justifies a broad examination of how activity environments are influenced by technology. Technology as an enabler can be addressed from perspectives other than the merely technological. A dynamic, evolving environment is at once interesting and challenging. As a multinational collaboration environment, maritime surveillance is a good example of a time-critical, evolving environment in which technological solutions enable new ways of collaboration. The inspiration to use the maritime environment as the baseline for understanding the challenges of creating and maintaining an adequate level of situational awareness derives from the complexity of the collaboration and information sharing environment, whose elements must be taken into account when analyzing criticalities related to decision making. Situational awareness is an important element supporting decision making, and the challenges related to it can also be observed in the maritime environment. This dissertation describes the structures and factors involved in this complex setting, found in the case studies, that should be taken into account when trying to understand how these elements affect the activities. The dissertation focuses on the gray area between a life-threatening situation and normal everyday activities. The multinational experimentation series case studies MNE5 and MNE6 made it possible to observe situations that were not life-threatening for the participants themselves but were not basic everyday activities either. These case studies provided a unique opportunity to see situations in which gaining situational awareness and making decisions are challenged by time-critical crisis situations. Unfortunately, organizations do not normally take advantage of their everyday work to prepare themselves for possible emerging crisis situations.
This dissertation focuses on creating a conceptual model and a concept that support organizations, also outside the maritime community, in improving their ability to gain situational awareness, from individual training all the way to changes in organizational structures, with the aim of better supporting decision making from the individual level to the highest decision-making level. Quick changes and unpredictability are a reality in organizations, and organizations cannot control all the factors that affect their functioning. Since we cannot be prepared for everything or predict every crisis, individual activities within teams and as part of organizations need to be supported with guidance, tools and training in order to support acting in challenging situations. Indeed, the ideology of the conceptual model created lies precisely in not attempting to control everything beforehand, but in supporting organizations with concrete procedures that help individuals react in different, unpredictable situations, instead of focusing on traditional risk prevention and management. Technological capabilities are not automatically solutions to functional challenges, which is why it is justified to broaden the observation of the problem area beyond the technological perspective. This dissertation demonstrates that it is possible to support collaboration in a multinational environment with technological solutions, but doing so requires recognizing technological limitations and accepting the possible restrictions related to technological innovations. Technology should not be considered a value per se; the value of technology should be defined according to how it supports activities, including evaluation of the strategic and operational environment, identification of organizational elements, and consideration of social factors and their challenges.
We are then one step closer to providing technological solutions that support the actual activities by taking into account the variables of the activity environment in question. The multidisciplinary approach to the information sharing and collaboration framework derives especially from the complexity of decision making and of building situational awareness, since these are not built or created in a vacuum but within an organizational framework, by the people doing the work, with technological capabilities enabled by organizational structures. The case studies introduced were related to the maritime environment, but according to the research results it is valid to argue that, based on the lessons learned, it is possible to create and further develop the conceptual model and to create a general concept to support a wider range of organizations in their attempts to gain a better level of situational awareness (SA) and to support decision making. To demonstrate the versatile usage of the developed concept, I have introduced the case study findings to the health care environment and reflected the elements identified in a trauma center against the created concept. The main contribution of this work is the presented situational awareness concept, created with respect to the NATO concept structure. This has been done to tackle the challenge of collaboration by focusing on situational awareness in the information sharing context, providing a theoretical grounding and an understanding of how these issues should be approached and how these elements can be generalized and used to support activities in other environments as well. This dissertation research has been a multi-year, evolving process, reflecting and affecting the presented case studies, and the learning experience from the case studies has also shaped the goals and research questions of the dissertation.
This venture has been written retrospectively, following the ideology of process modeling and design rationale, to show the reader how the entire journey took place and which critical milestones affected the end result, the conceptual model. Support in a challenging information sharing framework can be provided with the right combination of tools, procedures and individual effort. Starting from war technology, this dissertation presents a new approach and possibility for organizations to achieve a better level of awareness and to improve their decision-making capabilities with that combination of tools, procedures and individual effort.

Relevance: 100.00%

Abstract:

The main objective of this master's thesis is to describe, at a sufficient level of detail, the target group's core process from the customer interface to the warranty inspection, together with the processes supporting it. A further objective is to build, on the basis of this description, a simple tool for managing individual projects and for internal training, and thereby to advance the achievement of the group's strategic goals. The work is divided into a theoretical and an empirical part. The theoretical part creates the framework for the empirical part; it covers process thinking and process management as well as the theoretical background of process description and performance measurement. The empirical part describes the core process in line with the group's strategy and defines gates for it according to a stage-gate model, together with the minimum requirements to be checked at each gate. In addition, the processes supporting the core process are presented, a performance measurement system is constructed, and continuous improvement in the group is discussed in practical terms. At the end of the empirical part, the tool built on the basis of the description and the possibilities it offers are presented. The theoretical part is a literature study based on books and articles. The empirical part has features of both a case study and constructive research; its methods include interviews, meetings and observation, and the group's own material has also been utilized. The most important results are the description of the core process and the concrete tool built on it to meet the group's needs; the observed progress of process thinking within the group can also be regarded as a result. The target group's core process consists of six phases: 1) identification of customer needs/sales, 2) tender and contract phase, 3) design, 4) site phase, 5) pre-inspection and handover phase, and 6) final settlement and warranty phase. The phases of the core process are formed through the defined gates, and together they constitute the backbone of the group's process map.
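The stage-gate idea above, where each gate carries minimum requirements that must all be met before the project proceeds, can be sketched with a simple checklist function. The phase names follow the abstract; the requirement items themselves are hypothetical.

```python
# Hedged sketch of the gate model: each gate between phases has minimum
# requirements that must all be met before the project may proceed.
# Phase names follow the abstract; the requirement items are hypothetical.
PHASES = [
    "identification of customer needs/sales",
    "tender and contract",
    "design",
    "site phase",
    "pre-inspection and handover",
    "final settlement and warranty",
]

GATE_REQUIREMENTS = {
    1: ["customer needs documented", "preliminary budget"],   # gate into phase 2
    2: ["signed contract", "risk review"],                    # gate into phase 3
}

def gate_passed(gate: int, completed: set) -> bool:
    """True only if every minimum requirement at the gate is completed."""
    return all(req in completed for req in GATE_REQUIREMENTS.get(gate, []))

ok = gate_passed(1, {"customer needs documented", "preliminary budget"})
blocked = gate_passed(2, {"signed contract"})
```

A project-management tool of the kind the thesis describes would essentially evaluate such checks per project and surface the unmet items at each gate.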

Relevance: 100.00%

Abstract:

The Japanese quail Coturnix japonica, which originated in North Africa, Europe and Asia, is used worldwide as an experimental animal and as a model for aviculture. The current paper characterizes Eimeria bateri, Eimeria tsunodai and Eimeria uzura recovered from C. japonica. Since quails have a global distribution, as do their coccidia, the findings of this study should provide a means of diagnosing these Eimeria spp. in other regions and continents. Eimeria bateri showed the greatest intensity of infection and shed oocysts from the fourth day after infection; in contrast, E. tsunodai and E. uzura shed oocysts from the fifth day after infection. The three species shared a high degree of similarity and were all polymorphic. Nevertheless, the application of linear regressions, histograms and ANOVA provided a means of identifying these species. Finally, the algorithm proved very efficient, since it verified that the resulting values did not overlap.

Relevance: 100.00%

Abstract:

Advancements in IC processing technology have led to innovation and growth in the consumer electronics sector and to the evolution of the IT infrastructure supporting this exponential growth. One of the most difficult obstacles to this growth is the removal of the large amount of heat generated by the processing and communicating nodes of the system. The scaling down of technology and the increase in power density have a direct and consequential effect on the rise in temperature. This has increased cooling budgets and affects both the lifetime reliability and the performance of the system. Hence, reducing on-chip temperatures has become a major design concern for modern microprocessors. This dissertation addresses the thermal challenges at different levels for both 2D planar and 3D stacked systems. It proposes a self-timed thermal monitoring strategy based on the liberal use of on-chip thermal sensors, making use of noise-variation-tolerant and leakage-current-based thermal sensing for monitoring purposes. In order to study thermal management issues from the early design stages, accurate thermal modeling and analysis at design time are essential. In this regard, the spatial temperature profile of the global Cu nanowire for on-chip interconnects has been analyzed. The dissertation presents a 3D thermal model of a multicore system in order to investigate the effects of hotspots and of the placement of silicon die layers on the thermal performance of a modern flip-chip package. For a 3D stacked system, the primary design goal is to maximise performance within the given power and thermal envelopes. Hence, a thermally efficient routing strategy for 3D NoC-Bus hybrid architectures has been proposed to mitigate on-chip temperatures by herding most of the switching activity to the die closest to the heat sink. Finally, an exploration of various thermal-aware placement approaches for both 2D and 3D stacked systems is presented.
Various thermal models have been developed and thermal control metrics have been extracted. An efficient thermal-aware application mapping algorithm for a 2D NoC has been presented, and it has been shown that the proposed mapping algorithm reduces the effective chip area subjected to high temperatures compared with the state of the art.
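The core intuition behind thermal-aware mapping can be sketched with a greedy heuristic: assign the highest-power tasks to the cores with the most thermal headroom. This is a simplified stand-in, not the dissertation's actual algorithm, and the power and temperature figures are hypothetical.

```python
# Greedy thermal-aware mapping sketch: assign the highest-power tasks to the
# coolest cores. The power and temperature figures are hypothetical, and the
# dissertation's actual mapping algorithm is more elaborate than this.
task_power = {"decode": 1.8, "fft": 2.6, "io": 0.4, "ctrl": 0.9}   # W
core_temp = {"c0": 62.0, "c1": 55.0, "c2": 58.0, "c3": 51.0}        # deg C

def thermal_aware_map(tasks: dict, cores: dict) -> dict:
    hot_tasks = sorted(tasks, key=tasks.get, reverse=True)   # hottest first
    cool_cores = sorted(cores, key=cores.get)                # coolest first
    return dict(zip(hot_tasks, cool_cores))                  # pair them off

mapping = thermal_aware_map(task_power, core_temp)
```

Spreading the switching activity this way flattens the temperature profile, the same goal the proposed routing strategy pursues in 3D stacks by steering traffic toward the die nearest the heat sink.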

Relevance: 100.00%

Abstract:

The purpose of this study is to examine whether the prices of warrants on the OMX Helsinki 25 underlying index can be predicted using different option pricing models. The data consist of price time series for warrants tracking the OMXH25 index from 2009 to 2011. Three pricing models were used to study warrant pricing errors. The traditional Black-Scholes model was applied by regressing the implied volatility derived from the warrant data on maturity and strike price, after which a volatility estimate appropriate to each situation was chosen on the basis of the regression. In addition to the Black-Scholes model, two GARCH-based option pricing models were used. The prices estimated by the models were compared with the market prices of the warrants. The results indicate that the models price warrants better over short horizons after calibration. The results varied greatly between years, and no model was found to outperform the others systematically.
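The Black-Scholes model referred to above prices a European call in closed form; a call warrant is then typically the option value scaled by the warrant's conversion ratio. The sketch below shows the standard formula (no dividends); the index level, strike, ratio and volatility are illustrative, not values from the study.

```python
import math

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(s: float, k: float, t: float, r: float, sigma: float) -> float:
    """Black-Scholes price of a European call (no dividends)."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

# A call warrant price as option value times conversion ratio; all inputs
# below are illustrative, not data from the study.
ratio = 0.1
warrant_price = ratio * bs_call(s=2500.0, k=2400.0, t=0.5, r=0.01, sigma=0.25)
```

In the study the sigma fed into such a formula came from the regression of implied volatility on maturity and strike, which is where the calibration, and the decay of pricing accuracy away from the calibration date, enters.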

Relevance: 100.00%

Abstract:

Data management consists of collecting, storing and processing data into a format that provides value-adding information for the decision-making process. The development of data management has enabled the design of increasingly effective database management systems to support business needs. Thus, in addition to advanced systems designed for reporting purposes, operational systems also allow reporting and data analysis. The research method used in the theoretical part is qualitative research, and the empirical part is a case study. The objective of this paper is to examine database management system requirements from the reporting management and data management perspectives. In the theoretical part these requirements are identified and the appropriateness of the relational data model is evaluated; in addition, key performance indicators applied to the operational monitoring of production are studied. The study reveals that appropriate operational key performance indicators of production take into account time, quality, flexibility and cost aspects; manufacturing efficiency in particular has been highlighted. In this paper, reporting management is defined as the continuous monitoring of given performance measures. According to the literature review, a data management tool should cover performance, usability, reliability, scalability and data privacy aspects in order to fulfill the demands of reporting management. Based on these requirements, a framework is created for the system development phase and is used in the empirical part of the thesis, where such a system is designed and created for reporting management purposes for a company operating in the manufacturing industry. Relational data modeling and database architectures are utilized when the system is built on a relational database platform.
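Manufacturing efficiency, highlighted above, is commonly monitored through Overall Equipment Effectiveness (OEE), which combines availability, performance and quality into one operational KPI. A minimal sketch follows; the shift figures are hypothetical, and the abstract does not state that OEE specifically was the metric used.

```python
# Overall Equipment Effectiveness (OEE), a common operational KPI of
# manufacturing efficiency: availability x performance x quality.
# The shift figures below are hypothetical illustration data.
def oee(planned_min, downtime_min, ideal_cycle_s, total_units, good_units):
    run_time = planned_min - downtime_min                 # minutes actually run
    availability = run_time / planned_min                 # time aspect
    performance = (ideal_cycle_s * total_units) / (run_time * 60.0)
    quality = good_units / total_units                    # quality aspect
    return availability * performance * quality

score = oee(planned_min=480, downtime_min=60, ideal_cycle_s=30,
            total_units=780, good_units=760)
```

A reporting system of the kind built in the thesis would compute such measures continuously from operational data, which is what drives the performance and scalability requirements on the underlying database.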

Relevance: 100.00%

Abstract:

Tool center point calibration is a known problem in industrial robotics. The major focus of academic research is to enhance the accuracy and repeatability of next-generation robots; however, operators of currently available robots work within the limits of the robot's repeatability and require calibration methods suitable for these basic applications. This study was conducted in association with Stresstech Oy, which provides solutions for manufacturing quality control. Their sensor, based on the Barkhausen noise effect, requires accurate positioning, and this accuracy requirement gives rise to a tool center point calibration problem when measurements are executed with an industrial robot. Multiple options for automatic tool center point calibration are available on the market: manufacturers provide customized calibrators for most robot types and tools. With the handmade sensors and multiple robot types that Stresstech uses, however, this would require a great deal of labor. This thesis introduces a calibration method suitable for any robot with two free digital input ports. It follows the traditional approach of using a light barrier to detect the tool in the robot coordinate system, but utilizes two parallel light barriers to simultaneously measure and detect the center axis of the tool. Rotations about two axes are defined by the center axis; the last rotation, about the Z-axis, is calculated for tools whose widths differ in the X- and Y-directions. The results indicate that this method is suitable for calibrating the geometric tool center point of a Barkhausen noise sensor. In the repeatability tests, a standard deviation within the robot's repeatability was acquired. The Barkhausen noise signal was also evaluated after recalibration, and the results indicate correct calibration. However, future studies should be conducted using a more accurate manipulator, since the method employs the robot itself as a measuring device.
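The geometric core of the two-barrier idea is that the tool's center axis is the line through the two points where it crosses the parallel barrier planes, and two of the tool rotations follow from that axis direction. The sketch below illustrates this with hypothetical crossing coordinates and one plausible angle convention; the thesis's actual coordinate conventions are not reproduced here.

```python
import math

# Two parallel light barriers detect where the tool's center axis crosses
# their planes; the axis direction is the normalized vector between the two
# detected center points. All coordinates below are hypothetical.
p_lower = (401.2, 250.6, 120.0)   # axis crossing in lower barrier plane, mm
p_upper = (402.2, 250.6, 220.0)   # axis crossing in upper barrier plane, mm

d = tuple(u - l for u, l in zip(p_upper, p_lower))
norm = math.sqrt(sum(c * c for c in d))
axis = tuple(c / norm for c in d)          # unit vector along the tool axis

# Two rotations that tilt the flange Z-axis onto the tool axis (one plausible
# convention; the rotation about Z is resolved separately for tools with
# different widths in X and Y, as described above).
rx = math.atan2(axis[1], axis[2])                       # tilt about X
ry = math.atan2(axis[0], math.hypot(axis[1], axis[2]))  # tilt about Y
```

Because both barrier crossings are found with the robot's own motion, the achievable accuracy is bounded by the manipulator's repeatability, which is exactly the limitation the abstract notes.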

Relevance: 100.00%

Abstract:

Successful management of rivers requires an understanding of the fluvial processes that govern them. This, in turn, cannot be achieved without a means of quantifying their geomorphology and hydrology and the spatio-temporal interactions between them, that is, their hydromorphology. For a long time it has been laborious and time-consuming to measure river topography, especially in the submerged part of the channel. Measuring the flow field has been challenging as well, and hence such measurements have long been sparse in natural environments. Technological advances in remote sensing in recent years have opened up new possibilities for capturing synoptic information on river environments. This thesis presents new developments in fluvial remote sensing of both topography and water flow. A set of close-range remote sensing methods is employed to construct a high-resolution, unified empirical hydromorphological model: river channel and floodplain topography together with the three-dimensional areal flow field. Empirical and hydraulic theory-based optical remote sensing methods are tested and evaluated using normal-colour aerial photographs and sonar calibration and reference measurements on a rocky-bed sub-Arctic river. The empirical optical bathymetry model is developed further by introducing a deep-water radiance parameter estimation algorithm that extends the model's field of application to shallow streams. The effect of this parameter on the model is also assessed in a study of a sandy-bed sub-Arctic river using close-range high-resolution aerial photography, presenting one of the first examples of fluvial bathymetry modelling from unmanned aerial vehicles (UAVs). Further close-range remote sensing methods are added to complete the topography, integrating the river bed with the floodplain to create a seamless high-resolution topography.
Boat-, cart- and backpack-based mobile laser scanning (MLS) is used to measure the topography of the dry part of the channel at high resolution and accuracy. Multitemporal MLS is evaluated, along with UAV-based photogrammetry, against terrestrial laser scanning reference data and merged with UAV-based bathymetry to create a two-year series of seamless digital terrain models. These allow the methodology to be evaluated for conducting high-resolution change analysis of the entire channel. The remote sensing based model of hydromorphology is completed by a new methodology for mapping the flow field in 3D. An acoustic Doppler current profiler (ADCP) is deployed on a remote-controlled boat with a survey-grade global navigation satellite system (GNSS) receiver, allowing the areally sampled 3D flow vectors to be positioned in 3D space as a point cloud; interpolating this point cloud into a 3D matrix enables quantitative volumetric flow analysis. Multitemporal areal 3D flow field data show the evolution of the flow field during a snow-melt flood event. Combining the underwater and dry topography with the flow field yields a complete model of river hydromorphology at the reach scale.
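Empirical optical bathymetry of the kind described is commonly based on a Lyzenga-type relationship, in which depth is modelled as a linear function of the logarithm of the deep-water-corrected radiance and calibrated against sonar depths. The sketch below illustrates this with hypothetical numbers; the deep-water radiance parameter discussed above appears as L_DEEP.

```python
import math

# Lyzenga-type empirical bathymetry: depth ~ a + b * ln(L - L_deep), where L
# is pixel radiance and L_deep the deep-water radiance parameter. Calibrated
# here against sonar depths by ordinary least squares. All numbers are
# hypothetical illustration data, not the thesis's measurements.
L_DEEP = 42.0
radiance = [120.0, 95.0, 78.0, 60.0, 50.0]    # image radiance at sonar points
sonar_depth = [0.42, 0.61, 0.78, 1.05, 1.32]  # reference depths, m

x = [math.log(L - L_DEEP) for L in radiance]
n = len(x)
mx = sum(x) / n
my = sum(sonar_depth) / n
b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, sonar_depth))
     / sum((xi - mx) ** 2 for xi in x))
a = my - b * mx

def depth(L: float) -> float:
    """Predicted water depth (m) for a pixel radiance L."""
    return a + b * math.log(L - L_DEEP)
```

The fitted slope is negative, since brighter pixels correspond to shallower water; a poor choice of the deep-water parameter shifts the whole log argument, which is why its estimation matters for shallow streams.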