17 results for Approximate Bayesian computation, Posterior distribution, Quantile distribution, Response time data

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

100.00%

Publisher:

Abstract:

The aim of this work is to apply approximate Bayesian computation (ABC) in combination with Markov chain Monte Carlo (MCMC) methods in order to estimate the parameters of tuberculosis transmission. The methods are applied to San Francisco data and the results are compared with the outcomes of previous work. In addition, a methodological idea aimed at reducing computational time is described. Although this approach is shown to work appropriately, further analysis is needed to understand and test its behaviour in different cases. Suggestions for its further enhancement are given in the corresponding chapter.
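
To make the ABC-MCMC idea concrete, the following minimal Python sketch shows the generic algorithm on a toy simulator; the tuberculosis transmission model, the San Francisco data and the summary statistics used in the thesis are not reproduced here, so the simulator, prior, tolerance and step size below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=200):
    """Toy stand-in for the epidemic simulator (assumption, not the thesis model)."""
    return rng.poisson(theta, size=n)

def summary(x):
    """Summary statistics used for the ABC distance."""
    return np.array([x.mean(), x.var()])

# "Observed" data and its summary statistic
observed = simulate(3.0)
s_obs = summary(observed)

def log_prior(theta):
    # Exponential(1) prior on the rate parameter (illustrative choice)
    return -theta if theta > 0 else -np.inf

def abc_mcmc(n_iter=5000, eps=0.5, step=0.3):
    theta = 1.0
    chain = np.empty(n_iter)
    for i in range(n_iter):
        prop = theta + step * rng.normal()        # symmetric random-walk proposal
        if np.isfinite(log_prior(prop)):
            s_sim = summary(simulate(prop))
            dist = np.linalg.norm(s_sim - s_obs)
            # Accept only if the simulation is close enough to the data AND
            # the Metropolis-Hastings prior ratio accepts the move.
            if dist < eps and np.log(rng.uniform()) < log_prior(prop) - log_prior(theta):
                theta = prop
        chain[i] = theta
    return chain

posterior_sample = abc_mcmc()
print(posterior_sample[1000:].mean())   # crude posterior mean after burn-in
```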

Relevance:

100.00%

Publisher:

Abstract:

The main objective of this study was to perform a statistical analysis of ecological type from optical satellite data using Tipping's sparse Bayesian algorithm. The thesis applies the Relevance Vector Machine (RVM) algorithm to the ecological classification of forestland versus wetland. This binary classification technique was then used to classify several other tree species and to produce a hierarchical classification of the subclasses within a given target class. We also attempted to use an airborne image of the same forest area: combining it with image analysis and various image processing operations, we extracted suitable features and used them to classify forestland and wetland.
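
As a hedged illustration of the sparse Bayesian idea, not the thesis's RVM implementation or its satellite data, the sketch below uses scikit-learn's ARD regression (Tipping-style automatic relevance determination) on synthetic two-band features and thresholds its output to separate two classes; the feature values, class labels and threshold are assumptions.

```python
import numpy as np
from sklearn.linear_model import ARDRegression

rng = np.random.default_rng(1)

# Synthetic two-band reflectance features for two ecological classes
# (stand-ins for the optical satellite data used in the thesis).
n = 300
X_forest = rng.normal([0.35, 0.20], 0.05, size=(n, 2))
X_wetland = rng.normal([0.25, 0.30], 0.05, size=(n, 2))
X = np.vstack([X_forest, X_wetland])
y = np.hstack([np.ones(n), -np.ones(n)])      # +1 = forestland, -1 = wetland

# Automatic relevance determination: uninformative features are pruned
# by driving their weights towards zero, as in Tipping's sparse framework.
model = ARDRegression()
model.fit(X, y)

pred = np.where(model.predict(X) >= 0.0, "forestland", "wetland")
accuracy = np.mean((pred == "forestland") == (y == 1))
print(f"training accuracy: {accuracy:.2f}")
```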

Relevance:

100.00%

Publisher:

Abstract:

This thesis is concerned with state and parameter estimation in state space models. The estimation of states and parameters is an important task when mathematical modeling is applied to many different application areas, such as global positioning systems, target tracking, navigation, brain imaging, the spread of infectious diseases, biological processes, telecommunications, audio signal processing, stochastic optimal control, machine learning, and physical systems. In Bayesian settings, the estimation of states or parameters amounts to computing the posterior probability density function. Except for a very restricted class of models, it is impossible to compute this density function in closed form; hence, approximation methods are needed.

A state estimation problem involves estimating the states (latent variables) that are not directly observed in the output of the system. In this thesis, we use the Kalman filter, the extended Kalman filter, Gauss–Hermite filters, and particle filters to estimate the states from the available measurements. Among these, particle filters are numerical methods that approximate the filtering distributions of non-linear, non-Gaussian state space models via Monte Carlo. The performance of a particle filter depends heavily on the chosen importance distribution; an inappropriate choice can cause the particle filter algorithm to fail to converge. In this thesis, we analyse the theoretical Lᵖ convergence of the particle filter with general importance distributions, where p ≥ 2 is an integer.

A parameter estimation problem is concerned with inferring the model parameters from measurements. For high-dimensional complex models, parameter estimation can be carried out with Markov chain Monte Carlo (MCMC) methods. In operation, an MCMC method requires the unnormalized posterior distribution of the parameters and a proposal distribution. In this thesis, we show how the posterior density function of the parameters of a state space model can be computed by filtering-based methods, in which the states are integrated out. This type of computation is then applied to estimate the parameters of stochastic differential equations. Furthermore, we compute the partial derivatives of the log-posterior density function and use hybrid Monte Carlo and scaled conjugate gradient methods to infer the parameters of stochastic differential equations.

The computational efficiency of MCMC methods depends heavily on the chosen proposal distribution. A commonly used proposal is the Gaussian, whose covariance matrix must be well tuned; adaptive MCMC methods can be used for this. In this thesis, we propose a new way of updating the covariance matrix using the variational Bayesian adaptive Kalman filter algorithm.
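
The role of the importance distribution is easiest to see in a bootstrap particle filter, which simply uses the state transition density as the importance distribution. The sketch below runs such a filter on a standard univariate non-linear benchmark model, not on any model from the thesis; the model, noise levels and particle count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Univariate non-linear, non-Gaussian benchmark state space model
# (a common textbook example, not taken from the thesis):
#   x_k = 0.5 x_{k-1} + 25 x_{k-1}/(1 + x_{k-1}^2) + 8 cos(1.2 k) + q_k
#   y_k = x_k^2 / 20 + r_k
T, q_std, r_std = 50, np.sqrt(10.0), 1.0
x, xs, ys = 0.0, [], []
for k in range(1, T + 1):
    x = 0.5 * x + 25 * x / (1 + x**2) + 8 * np.cos(1.2 * k) + q_std * rng.normal()
    xs.append(x)
    ys.append(x**2 / 20 + r_std * rng.normal())

N = 1000                           # number of particles
particles = rng.normal(0.0, 2.0, N)
estimates = []
for k, y in enumerate(ys, start=1):
    # Importance distribution = transition prior (bootstrap filter)
    particles = (0.5 * particles + 25 * particles / (1 + particles**2)
                 + 8 * np.cos(1.2 * k) + q_std * rng.normal(size=N))
    # Importance weights from the measurement likelihood
    log_w = -0.5 * ((y - particles**2 / 20) / r_std) ** 2
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    estimates.append(np.sum(w * particles))
    # Multinomial resampling to fight weight degeneracy
    particles = particles[rng.choice(N, size=N, p=w)]

print("RMSE:", np.sqrt(np.mean((np.array(estimates) - np.array(xs)) ** 2)))
```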

Relevance:

100.00%

Publisher:

Abstract:

In mathematical modeling, the estimation of model parameters is one of the most common problems. The goal is to find parameters that fit the measurements as well as possible. There is always error in the measurements, which implies uncertainty in the model estimates. In Bayesian statistics, all unknown quantities are represented as probability distributions. If there is prior knowledge about the parameters, it can be formulated as a prior distribution; Bayes' rule then combines the prior and the measurements into a posterior distribution. Mathematical models are typically nonlinear, so producing statistics for them requires efficient sampling algorithms. In this thesis the Metropolis-Hastings (MH) and Adaptive Metropolis (AM) algorithms as well as Gibbs sampling are introduced, together with different ways of specifying prior distributions. The main issue is the estimation of the measurement error and how to obtain prior knowledge of its variance or covariance. Variance and covariance sampling is combined with the algorithms above, and example hyperprior models are applied to the estimation of model parameters and measurement error in a case with outliers.
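
A minimal sketch of this kind of sampler, under the assumption of a simple nonlinear model y = exp(-θt) observed with Gaussian noise: a Metropolis-Hastings step updates the model parameter, and a conjugate inverse-gamma (hyperprior) step samples the unknown measurement error variance. The model, hyperprior values and step size are illustrative assumptions, not the thesis's examples.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic data from y = exp(-theta * t) + noise (illustrative model)
t = np.linspace(0, 5, 30)
theta_true, sigma2_true = 0.7, 0.05**2
y = np.exp(-theta_true * t) + np.sqrt(sigma2_true) * rng.normal(size=t.size)

def ssr(theta):
    """Sum of squared residuals for a given parameter value."""
    return np.sum((y - np.exp(-theta * t)) ** 2)

n_iter, step = 10000, 0.05
theta, sigma2 = 1.0, 0.1
a0, b0 = 2.0, 0.01 ** 2          # inverse-gamma hyperprior for the error variance
chain = np.empty((n_iter, 2))
for i in range(n_iter):
    # Metropolis-Hastings update of the model parameter (flat prior)
    prop = theta + step * rng.normal()
    if np.log(rng.uniform()) < (ssr(theta) - ssr(prop)) / (2 * sigma2):
        theta = prop
    # Gibbs update of the error variance from its conjugate
    # inverse-gamma conditional posterior
    a = a0 + t.size / 2
    b = b0 + ssr(theta) / 2
    sigma2 = 1.0 / rng.gamma(a, 1.0 / b)
    chain[i] = theta, sigma2

print(chain[2000:].mean(axis=0))   # posterior means of (theta, sigma2) after burn-in
```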

Relevance:

100.00%

Publisher:

Abstract:

The identifiability of the parameters of a heat exchanger model without phase change was studied in this Master's thesis using synthetic data. A fast, two-step Markov chain Monte Carlo (MCMC) method was tested with a couple of case studies and a heat exchanger model. The two-step MCMC method worked well and decreased the computation time compared to the traditional MCMC method. The effect of the measurement accuracy of certain control variables on the identifiability of the parameters was also studied; the accuracy used did not appear to have a notable effect on identifiability. The use of the posterior distribution of the parameters across different heat exchanger geometries was also examined. It would be computationally most efficient to use the same posterior distribution for different geometries in the optimisation of heat exchanger networks. According to the results, this was possible when the frontal surface areas were the same across geometries. In the other cases the same posterior distribution can still be used for optimisation, but it yields a wider predictive distribution. For condensing surface heat exchangers, the numerical stability of the simulation model was studied and, as a result, a stable algorithm was developed.
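
Reusing a posterior sample across geometries amounts to pushing the same parameter draws through the model configured for each geometry and collecting the resulting predictive distributions. The sketch below shows only this generic mechanism with a hypothetical one-parameter lumped heat transfer model; the model, the posterior sample and the geometry values are assumptions and do not come from the thesis.

```python
import numpy as np

rng = np.random.default_rng(6)

def outlet_temp(theta, area, t_in=80.0, t_amb=20.0, mdot_cp=500.0):
    """Hypothetical lumped model of a stream cooling towards ambient:
    theta is a heat transfer coefficient (W/m^2K), area the heat transfer
    surface area (m^2), mdot_cp the stream's capacity rate (W/K)."""
    ntu = theta * area / mdot_cp
    return t_amb + (t_in - t_amb) * np.exp(-ntu)

# Stand-in for MCMC output: posterior sample of theta identified for geometry A
theta_post = rng.normal(150.0, 10.0, size=5000)

# Predictive distributions of the outlet temperature for two geometries
for name, area in [("geometry A", 2.0), ("geometry B", 3.5)]:
    pred = outlet_temp(theta_post, area)
    print(f"{name}: mean {pred.mean():.1f} C, 95% band "
          f"({np.percentile(pred, 2.5):.1f}, {np.percentile(pred, 97.5):.1f}) C")
```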

Relevance:

100.00%

Publisher:

Abstract:

This thesis focuses on statistical analysis methods and proposes the use of Bayesian inference to extract the information contained in experimental data by estimating the parameters of an Ebola model. The model is a system of differential equations expressing the behaviour and dynamics of Ebola. Two data sets (onset and death data) were used together to estimate the parameters, which was not done in previous work (Chowell, 2004). To be able to use both data sets, a new version of the model was built. The model parameters were estimated and then used to calculate the basic reproduction number and to study the disease-free equilibrium. The estimates were used to determine how well the model fits the data and how good the estimates were in terms of the information they provided about the possible relationships between variables. The solution showed that the Ebola model fits the observed onset data at 98.95% and the observed death data at 93.6%. Since Bayesian inference cannot be performed analytically, the Markov chain Monte Carlo approach was used to generate samples from the posterior distribution over the parameters. These samples were used to check the accuracy of the model and other characteristics of the target posteriors.
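
For orientation, the sketch below integrates a generic SEIR-type system and computes its basic reproduction number; the thesis extends the model of Chowell (2004), so the compartment structure and parameter values here are illustrative assumptions only, not the thesis's fitted results.

```python
import numpy as np
from scipy.integrate import odeint

def seir(state, t, beta, k, gamma):
    """Generic SEIR dynamics: S susceptible, E exposed, I infectious, R removed."""
    S, E, I, R = state
    N = S + E + I + R
    dS = -beta * S * I / N
    dE = beta * S * I / N - k * E
    dI = k * E - gamma * I
    dR = gamma * I
    return [dS, dE, dI, dR]

beta, k, gamma = 0.3, 1 / 10, 1 / 7      # assumed rates (per day)
R0 = beta / gamma                        # basic reproduction number for this SEIR
print(f"R0 = {R0:.2f}")

t = np.linspace(0, 300, 301)
sol = odeint(seir, [10_000 - 1, 0, 1, 0], t, args=(beta, k, gamma))
print("peak number of infectious individuals:", int(sol[:, 2].max()))
```

In an MCMC fit, the likelihood of the onset and death data would be evaluated against such trajectories for each proposed parameter vector.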

Relevance:

100.00%

Publisher:

Abstract:

The aim of this work is to invert the ionospheric electron density profile from riometer (relative ionospheric opacity meter) measurements. The new riometer-type instrument KAIRA (Kilpisjärvi Atmospheric Imaging Receiver Array) is used to measure the cosmic HF radio noise absorption that takes place in the D-region ionosphere between 50 and 90 km. In order to invert the electron density profile, synthetic data are used to constrain the unknown parameter Neq with a spline height method, which parameterises the electron density profile at different altitudes. Moreover, a smoothing prior method is used to sample from the posterior distribution by truncating the prior covariance matrix. The smoothing prior approach makes it easier to find the posterior with the MCMC (Markov chain Monte Carlo) method.
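
A common way to encode a smoothing prior for a discretised profile is through a second-difference precision matrix (or its inverse, a covariance matrix). The short sketch below, with an assumed altitude grid and smoothing strength, shows how such a prior can be built and sampled; the thesis's specific truncation of the prior covariance and the KAIRA data handling are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(4)

# Altitude grid for the D-region profile (illustrative: 50-90 km, 1 km steps)
alt = np.arange(50.0, 91.0, 1.0)
n = alt.size

# Second-difference operator: penalises curvature, i.e. favours smooth profiles
D = np.zeros((n - 2, n))
for i in range(n - 2):
    D[i, i:i + 3] = [1.0, -2.0, 1.0]

# Smoothing-prior precision matrix; a small diagonal term keeps it invertible
alpha = 10.0
Q = alpha * D.T @ D + 1e-6 * np.eye(n)
cov = np.linalg.inv(Q)

# Draw a few prior samples of (log) electron density deviations
samples = rng.multivariate_normal(np.zeros(n), cov, size=3)
print(samples.shape)          # (3, 41): three smooth random profiles
```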

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this thesis was to examine the significance of market changes for the operations of one business unit of a company in a particular export area. The influencing factors were assessed particularly from the viewpoint of the customer companies' needs and the cooperation with the agent company. The thesis begins by introducing the commissioning company, the business unit under study and the market situation. The theoretical part discusses organisational buying behaviour and marketing through distribution channels, and presents the service quality gap model used as the conceptual framework of the analysis. The material for the empirical part was collected through qualitative interviews at three levels of the distribution chain. The material was analysed by comparing the parties' views of the operations. The greatest differences emerged in the identification of some of the company's weaknesses and threats, whereas there was agreement on the importance of flexibility, overall service quality and product quality. As results of the thesis, recommendations for action are presented; their common feature is a clearer profiling of the commissioning company as a provider of the quality level defined on the basis of the study. With the proposed measures, the company's ability to offer a complete service and technically high-quality products would be further exploited in committing and motivating its partners.

Relevance:

100.00%

Publisher:

Abstract:

Reducing environmental load, tightening emission limits and dwindling oil reserves have driven the vehicle industry to seek new alternatives for improving the energy efficiency of vehicles. Hybrid technology offers solutions for improving cost-efficiency and environmental friendliness. As hybrid technology becomes more common in mobile work machines as well, it not only enables the development of more energy-efficient machines with lower operating costs, but also brings safety features familiar from passenger vehicles into the work machine environment. Replacing conventional diesel engines with fast-responding and precisely controllable electric motors makes it possible to implement more accurate and versatile control systems than in the conventional environment. In this Master's thesis, a traction control system and an electronic differential are designed for a hybrid work machine environment. Depending on the application, the system can considerably reduce operating costs and enable new applications to be brought to market, such as turning assistance by means of differential steering.

Relevance:

100.00%

Publisher:

Abstract:

Potentiometric sensors are very attractive tools for chemical analysis because of their simplicity, low power consumption and low cost. They are extensively used in clinical diagnostics and in environmental monitoring. Modern applications in both fields require improvements in the conventional construction and in the performance of potentiometric sensors, as the trends are towards portable, on-site diagnostics and autonomous sensing in remote locations. The aim of this PhD work was to improve some of the sensor properties that currently hamper the implementation of potentiometric sensors in modern applications. The first part of the work concentrated on the development of a solid-state reference electrode (RE) compatible with already existing solid-contact ion-selective electrodes (ISE), both of which are needed for all-solid-state potentiometric sensing systems. A poly(vinyl chloride) membrane doped with a moderately lipophilic salt, tetrabutylammonium-tetrabutylborate (TBA-TBB), was found to show satisfactory potential stability in sample solutions of different concentrations. Its response time was nevertheless slow, as it required several minutes to reach equilibrium. The TBA-TBB membrane RE worked well together with solid-state ISEs in several different situations and on different substrates, enabling a miniature design. Solid contacts (SC) that mediate the ion-to-electron transduction are crucial components of well-functioning potentiometric sensors. This transduction process, which converts the ionic conduction of an ion-selective membrane to the electronic conduction of the circuit, was studied with the help of electrochemical impedance spectroscopy (EIS). The solid contacts studied were (i) the conducting polymer (CP) poly(3,4-ethylenedioxythiophene) (PEDOT) and (ii) a carbon cloth with a high surface area. The PEDOT films were doped with a large immobile anion, poly(styrene sulfonate) (PSS-), or with a small mobile anion, Cl-. As could be expected, the studied PEDOT solid contacts mediated the ion-to-electron transduction more efficiently than the bare glassy carbon substrate onto which they were electropolymerized, while the impedance of the PEDOT films depended on the mobility of the doping ion and on the ions in the electrolyte. The carbon cloth was found to be an even more effective ion-to-electron transducer than the PEDOT films, and it also proved to work as a combined electrical conductor and solid contact when covered with an ion-selective membrane or with a TBA-TBB-based reference membrane. The last part of the work focused on improving the reproducibility and the potential stability of the SC-ISEs, a problem that culminates in the stability of the standard potential E°. It was shown that the E° of an SC-ISE with a conducting polymer as the solid contact could be adjusted by reducing or oxidizing the CP solid contact through current pulses or an applied potential, as the redox state of the CP solid contact influences the overall potential of the ISE. The slope, and thus the analytical performance, of the SC-ISEs was retained despite the adjustment of the E°. Short-circuiting the SC-ISE with a conventional large-capacitance RE was found to be a feasible instrument-free method to control the E°.
With this method, the driving force for the oxidation/reduction of the CP was the potential difference between the RE and the SC-ISE, and the position of the adjusted potential could be controlled by choosing a suitable concentration for the short-circuiting electrolyte. The piece-to-piece reproducibility of the adjusted potential was promising, and the day-to-day reproducibility for a specific sensor was excellent. The instrument-free approach to controlling the E° is very attractive considering practical applications.

Relevance:

100.00%

Publisher:

Abstract:

The monitoring and control of hydrogen sulfide (H2S) levels is of great interest for a wide range of application areas including food quality control, defense and antiterrorist applications, and air quality monitoring, e.g. in mines. H2S is a very poisonous and flammable gas. Exposure to low concentrations of H2S can result in eye irritation, a sore throat and cough, shortness of breath, and fluid retention in the lungs. These symptoms usually disappear in a few weeks. Long-term, low-level exposure may result in fatigue, loss of appetite, headache, irritability, poor memory, and dizziness. Concentrations of 700-800 ppm tend to be fatal. H2S has a characteristic smell of rotten eggs; however, because of temporary paralysis of the olfactory nerves, the ability to smell it at concentrations higher than 100 ppm is severely compromised. In addition, volatile H2S is one of the main products of the spoilage of poultry meat under anaerobic conditions. Currently, no commercial H2S sensor is available that can operate under anaerobic conditions and can be easily integrated into food packaging. This thesis presents step-wise progress in the development of printed H2S gas sensors. Efforts were made in the formulation, characterization and optimization of functional printable inks and coating pastes based on composites of a polymer and a metal salt as well as a composite of a metal salt and an organic acid. Different processing techniques including inkjet printing, flexographic printing, screen printing and spray coating were utilized in the fabrication of the H2S sensors. The dispersions were characterized by measuring turbidity, surface tension, viscosity and particle size. The sensing films were characterized using X-ray photoelectron spectroscopy, X-ray diffraction, atomic force microscopy and an electrical multimeter. Thin and thick printed or coated films were developed for gas sensing with the aim of monitoring H2S concentrations in real-life applications. Initially, a H2S gas sensor based on a composite of polyaniline and a metal salt was developed. Both aqueous and solvent-based dispersions were developed and characterized, and these dispersions were then utilized in the fabrication of roll-to-roll printed H2S gas sensors. However, the humidity background, long-term instability and a comparatively poor detection limit made these sensors less favourable for practical applications. To overcome these problems, copper acetate based sensors were developed for H2S gas sensing. Stable inks with excellent printability were obtained by tuning the surface tension, viscosity and particle size. This enabled the formation of inkjet-printed, high-quality copper acetate films with excellent sensitivity towards H2S. Furthermore, these sensors showed negligible humidity effects and improved selectivity, response time, lower limit of detection and coefficient of variation. The lower limit of detection of the copper acetate based sensors was further improved to the sub-ppm level by incorporation of catalytic gold nanoparticles and subsequent plasma treatment of the sensing film. These sensors were then integrated into an inexpensive, wirelessly readable RLC circuit (where R is a resistor, L an inductor and C a capacitor). The performance of these sensors towards biogenic H2S produced during the spoilage of poultry meat in a modified atmosphere package was also demonstrated in this thesis. This serves as a proof of concept that these sensors can be utilized in real-life applications.

Relevance:

100.00%

Publisher:

Abstract:

A review of intelligent machines shows that the demand for new ways of helping people perceive the real world is growing every year. This thesis describes the design and implementation of machine vision for a mobile assembly robot. The work was done as part of an LUT project in the Laboratory of Intelligent Machines. The aim of this work is to create a working vision system, and both qualitative and quantitative research were carried out to complete this task. In the first part, the author presents the theoretical background: digital camera working principles, wireless transmission basics, creation of a live stream, and methods used for pattern recognition. Formulas, dependencies and previous research related to the topic are presented. In the second part, the equipment used in the project is described, with information about the brands, models, capabilities and the requirements needed for implementation. In addition, the author describes the LabVIEW software, its add-ons and OpenCV, which are used in the project. The results are given in a further section of the thesis; they are mainly represented by screenshots from the cameras and the working station and by photos of the system. The key result of this thesis is the vision system created for the needs of the mobile assembly robot, and examples show graphically what was accomplished. Future research in this field includes optimization of the pattern recognition algorithm, which will reduce the response time for recognizing objects. The system presented by the author can also be used for further activities involving artificial intelligence.
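
As a hedged illustration of one simple pattern recognition approach available in OpenCV (the thesis does not specify which method was used), the sketch below locates a template in a camera frame using normalised cross-correlation; the file names and the confidence threshold are placeholder assumptions.

```python
import cv2

# Placeholder file names: a frame captured from the robot's camera and a
# template image of the object to be recognised.
frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("template.png", cv2.IMREAD_GRAYSCALE)
assert frame is not None and template is not None, "input images not found"

# Normalised cross-correlation template matching
result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(result)

if max_val > 0.8:                      # assumed confidence threshold
    h, w = template.shape
    top_left = max_loc
    bottom_right = (top_left[0] + w, top_left[1] + h)
    print(f"object found at {top_left}-{bottom_right}, score {max_val:.2f}")
else:
    print("object not found")
```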

Relevance:

100.00%

Publisher:

Abstract:

In modern society, personal health is a very important issue for everyone. With the development of science and technology, new and improved health monitoring devices and technologies will play a key role in daily medical activities. This paper focuses on making progress in the design of a wearable vital sign system. A vital sign monitoring system has been proposed and designed. The whole detection system is composed of a signal collection subsystem, a signal processing subsystem, a short-range wireless communication subsystem and a user interface subsystem. The signal collection subsystem consists of a light source and a photodiode: after light of two different wavelengths is emitted, the photodiode collects the light signal reflected by human body tissue. The signal processing subsystem is based on the AFE4490 analog front end and peripheral circuits; at this stage the collected analog signal is filtered and converted into a digital signal. After this processing, the signal is transmitted over SPI to the short-range wireless communication subsystem, which is based on the Bluetooth 4.0 protocol and the ultra-low-power System on Chip (SoC) nRF51822. Finally, the signal is transmitted to the user end. After proposing and building the system, the paper focuses on the key component of the system, the photodetector. Based on a study of perovskite materials, a low-temperature-processed photodetector has been proposed, designed and investigated. The device is made up of a light-absorbing layer, an electron-transporting and hole-blocking layer, a hole-transporting and electron-blocking layer, a conductive substrate layer and a metal electrode layer. The light-absorbing layer is the key part of the whole device and is fabricated from perovskite materials. When light is absorbed, electron-hole pairs are produced in this layer and, due to the energy level differences, the electrons and holes are transported to the metal electrode and the conductive substrate electrode through the electron-transporting layer and the hole-transporting layer, respectively; in this way the response current is produced. Based on this structure, the fabrication procedure includes substrate cleaning; PEDOT:PSS layer preparation; perovskite layer preparation; PCBM layer preparation; and C60, BCP and Ag electrode layer preparation. After device fabrication, a series of morphological characterizations and performance tests was carried out. The testing procedure includes film-forming quality inspection, response current versus light wavelength analysis, linearity and response time measurements, and other optical and electrical property tests. The results show that the film is uniform; the device produces a clear response current to incident light with wavelengths from 350 nm to 800 nm, and the response current varies with the wavelength. When the wavelength is kept constant, there is a good linear relationship between the intensity of the response current and the power of the incident light, on the basis of which the device can be used as a photodetector to collect light information. During changes in the light signal, the response time of the device is several microseconds, which is acceptable for a photodetector in our system.
The test results show that the device has good electronic and optical properties, the fabrication procedure is repeatable, and the properties of the devices are uniform, which demonstrates that the fabrication method and procedure can be used to build the photodetector for our wearable system. Based on these results, the paper concludes that the fabricated photodetector can be integrated on a flexible substrate and is suitable for the proposed monitoring system, thus making progress in the research on wearable monitoring systems and devices. Finally, some future prospects in system design as well as device design and fabrication are proposed.

Relevance:

100.00%

Publisher:

Abstract:

During the past five years, consumer electronics has been evolving rapidly. Many products have started to include “smart home” capabilities, enabling communication and interoperability between various smart devices. Even more devices and sensors can be remotely controlled and monitored through cloud services. While smart home systems have become very affordable to the average consumer compared to the early solutions of decades ago, there are still many issues that need to be fixed or improved upon: energy efficiency, connectivity with other devices and applications, security and privacy concerns, reliability, and response time. This paper focuses on designing Internet of Things (IoT) node and platform architectures that take these issues into account, notes other currently used solutions, and selects technologies in order to provide a better solution. The node architecture aims for energy efficiency and modularity, while the goals of the platform architecture are scalability, portability, maintainability, performance, and modularity. Moreover, the platform architecture attempts to improve the user experience by providing higher reliability and lower response time than alternative platforms. The architectures were developed iteratively using a development process involving research, planning, design, implementation, testing, and analysis. Additionally, they were documented using Kruchten’s 4+1 view model, which is used to describe the use cases and the different views of the architectures. The node architecture consists of energy-efficient hardware (an FC3180 microprocessor and a CC2520 RF transceiver), a modular operating system (Contiki), and a communication protocol (AllJoyn) used to provide better interoperability with other IoT devices and applications. The platform architecture provides reliable, low-response-time control, monitoring, and initial setup capabilities by utilizing web technologies on various devices such as smartphones, tablets, and computers. Furthermore, an optional cloud service was provided in order to control devices and monitor sensors remotely, utilizing scalable, high-performance technologies in the backend to enable low response time and high reliability.

Relevance:

100.00%

Publisher:

Abstract:

One challenge in data assimilation (DA) methods is how the error covariance of the model state is computed. Ensemble methods have been proposed for producing error covariance estimates, as the error is propagated in time using the non-linear model. Variational methods, on the other hand, use the concepts of control theory, whereby the state estimate is optimized from both the background and the measurements. Numerical optimization schemes are applied, which avoids the memory storage and huge matrix inversions needed by classical Kalman filter methods. The Variational Ensemble Kalman Filter (VEnKF), a method inspired by the Variational Kalman Filter (VKF), enjoys the benefits of both ensemble and variational methods. It avoids the filter inbreeding problems which emerge when the ensemble spread underestimates the true error covariance; in VEnKF this is tackled by resampling the ensemble every time measurements are available. One advantage of VEnKF over VKF is that it needs neither tangent linear code nor adjoint code. In this thesis, VEnKF has been applied to a two-dimensional shallow water model simulating a dam-break experiment. The model is a public code with water height measurements recorded at seven stations along the mid-line of the 21.2 m long, 1.4 m wide flume. Because the data were too sparse to assimilate the model state vector of dimension 30 171, we chose to interpolate the data both in time and in space. The results of the assimilation were compared with those of a pure simulation. We found that the results produced by VEnKF were more realistic, without the numerical artifacts present in the pure simulation. Creating wrapper code for a model and a DA scheme can be challenging, especially when the two were designed independently or are poorly documented. In this thesis we present a non-intrusive approach for coupling the model and a DA scheme: an external program is used to send and receive information between the model and the DA procedure using files. The advantage of this method is that the model code changes needed are minimal, only a few lines that facilitate input and output. Apart from being simple to implement, the approach can be employed even if the two codes are written in different programming languages, because the communication is not through code. The non-intrusive approach accommodates parallel computing simply by telling the control program to wait until all the processes have ended before the DA procedure is invoked. It is worth mentioning the overhead caused by the approach, as at every assimilation cycle both the model and the DA procedure have to be initialized. Nonetheless, the method can be an ideal approach for a benchmark platform for testing DA methods. The non-intrusive VEnKF has been applied to the multi-purpose hydrodynamic model COHERENS to assimilate Total Suspended Matter (TSM) in lake Säkylän Pyhäjärvi. The lake has an area of 154 km² and an average depth of 5.4 m. Turbidity and chlorophyll-a concentrations from MERIS satellite images for seven days between May 16 and July 6, 2009, were available. The effect of organic matter was computationally eliminated to obtain TSM data. Because of the computational demands of both COHERENS and VEnKF, we chose to use a 1 km grid resolution. The results of the VEnKF were compared with the measurements recorded at an automatic station located in the north-western part of the lake; however, due to the sparsity of the TSM data in both time and space, a good match could not be obtained.
The use of multiple automatic stations with real-time data is important to avoid the temporal sparsity problem. Combined with DA, this will help, for instance, in better understanding environmental hazard variables. We found that using a very large ensemble size does not necessarily improve the results, because there is a limit beyond which additional ensemble members add very little to the performance. The successful implementation of the non-intrusive VEnKF and the existence of an ensemble size limit for performance lead to the emerging area of Reduced Order Modeling (ROM). To save computational resources, running the full-blown model is avoided in ROM. When ROM is applied together with the non-intrusive DA approach, it may result in a cheaper algorithm that relaxes the computational challenges existing in the field of modelling and DA.
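
The ensemble analysis step at the heart of ensemble Kalman filtering, of which VEnKF is a variational variant, can be written in a few lines. The stochastic-perturbation update below is a generic textbook sketch with made-up dimensions and a linear observation operator, not the VEnKF implementation used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(5)

def enkf_analysis(X, y, H, R):
    """Stochastic EnKF analysis step.

    X : (n, N) forecast ensemble of n-dimensional states, N members
    y : (m,)   observation vector
    H : (m, n) linear observation operator
    R : (m, m) observation error covariance
    """
    n, N = X.shape
    Xm = X.mean(axis=1, keepdims=True)
    A = X - Xm                                    # ensemble anomalies
    P = A @ A.T / (N - 1)                         # sample error covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
    # Perturbed observations, one realisation per ensemble member
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=N).T
    return X + K @ (Y - H @ X)                    # analysis ensemble

# Tiny made-up example: 10-dimensional state, 2 observed components, 20 members
n, m, N = 10, 2, 20
X = rng.normal(size=(n, N))
H = np.zeros((m, n))
H[0, 0] = H[1, 5] = 1.0
R = 0.1 * np.eye(m)
y = np.array([1.0, -0.5])
Xa = enkf_analysis(X, y, H, R)
print(Xa.mean(axis=1)[:3])        # first components of the analysis mean
```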