956 results for Probabilistic robotics


Relevance:

20.00%

Publisher:

Abstract:

Electrical resistivity tomography (ERT) is a well-established method for geophysical characterization and has shown potential for monitoring geologic CO2 sequestration, due to its sensitivity to electrical resistivity contrasts generated by liquid/gas saturation variability. In contrast to deterministic inversion approaches, probabilistic inversion provides the full posterior probability density function of the saturation field and accounts for the uncertainties inherent in the petrophysical parameters relating the resistivity to saturation. In this study, the data are from benchtop ERT experiments conducted during gas injection into a quasi-2D brine-saturated sand chamber with a packing that mimics a simple anticlinal geological reservoir. The saturation fields are estimated by Markov chain Monte Carlo inversion of the measured data and compared to independent saturation measurements from light transmission through the chamber. Different model parameterizations are evaluated in terms of the recovered saturation and petrophysical parameter values. The saturation field is parameterized (1) in Cartesian coordinates, (2) by means of its discrete cosine transform coefficients, and (3) by fixed saturation values in structural elements whose shape and location are assumed known or represented by an arbitrary Gaussian bell structure. Results show that the estimated saturation fields are in overall agreement with saturations measured by light transmission, but differ strongly in terms of parameter estimates, parameter uncertainties and computational intensity. Discretization in the frequency domain (as in the discrete cosine transform parameterization) provides more accurate models at a lower computational cost compared to spatially discretized (Cartesian) models. A priori knowledge about the expected geologic structures allows for non-discretized model descriptions with markedly reduced degrees of freedom.
Constraining the solutions to the known injected gas volume improved estimates of saturation and of the petrophysical parameter values.
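The Markov chain Monte Carlo inversion described above can be illustrated with a minimal Metropolis sampler. The forward model below is a hypothetical one-parameter linear stand-in for the ERT physics; the noise level, step size and all numbers are illustrative, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear forward model standing in for the ERT physics:
# d = 2 * theta + noise (all names and numbers are illustrative)
true_theta = 1.5
data = 2.0 * true_theta + rng.normal(0.0, 0.1, size=50)

def log_likelihood(theta):
    # Gaussian likelihood with known noise standard deviation 0.1
    return -0.5 * np.sum((data - 2.0 * theta) ** 2) / 0.1 ** 2

def metropolis(n_steps=5000, step=0.05):
    theta = 0.0
    log_l = log_likelihood(theta)
    samples = []
    for _ in range(n_steps):
        proposal = theta + rng.normal(0.0, step)
        log_l_prop = log_likelihood(proposal)
        # Accept with probability min(1, L(prop)/L(curr)) under a flat prior
        if np.log(rng.uniform()) < log_l_prop - log_l:
            theta, log_l = proposal, log_l_prop
        samples.append(theta)
    return np.array(samples[1000:])  # discard burn-in

posterior_samples = metropolis()
```

The retained samples approximate the full posterior of the parameter, so uncertainty comes for free as the spread of the chain; the paper's Cartesian, DCT and structural parameterizations differ only in what `theta` represents and how the forward model maps it to resistivities.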

Relevance:

20.00%

Publisher:

Abstract:

Probabilistic inversion methods based on Markov chain Monte Carlo (MCMC) simulation are well suited to quantify parameter and model uncertainty of nonlinear inverse problems. Yet, application of such methods to CPU-intensive forward models can be a daunting task, particularly if the parameter space is high dimensional. Here, we present a 2-D pixel-based MCMC inversion of plane-wave electromagnetic (EM) data. Using synthetic data, we investigate how model parameter uncertainty depends on model structure constraints using different norms of the likelihood function and the model constraints, and study the added benefits of joint inversion of EM and electrical resistivity tomography (ERT) data. Our results demonstrate that model structure constraints are necessary to stabilize the MCMC inversion results of a highly discretized model. These constraints decrease model parameter uncertainty and facilitate model interpretation. A drawback is that these constraints may lead to posterior distributions that do not fully include the true underlying model, because some of its features exhibit a low sensitivity to the EM data, and hence are difficult to resolve. This problem can be partly mitigated if the plane-wave EM data is augmented with ERT observations. The hierarchical Bayesian inverse formulation introduced and used herein is able to successfully recover the probabilistic properties of the measurement data errors and a model regularization weight. Application of the proposed inversion methodology to field data from an aquifer demonstrates that the posterior mean model realization is very similar to that derived from a deterministic inversion with similar model constraints.

Relevance:

20.00%

Publisher:

Abstract:

In the past few decades, the rise of criminal, civil and asylum cases involving young people lacking valid identification documents has generated an increase in the demand for age estimation. The chronological age, or the probability that an individual is older or younger than a given age threshold, is generally estimated by means of statistical methods based on observations of specific physical attributes. Among these statistical methods, those developed in the Bayesian framework allow users to provide coherent and transparent assignments which fulfil forensic and medico-legal purposes. The application of the Bayesian approach is facilitated by probabilistic graphical tools, such as Bayesian networks. The aim of this work is to test the performance of the Bayesian network for age estimation recently presented in the scientific literature in classifying individuals as older or younger than 18 years of age. For these exploratory analyses, a sample related to the ossification status of the medial clavicular epiphysis, available in the scientific literature, was used. The results obtained in the classification are promising: in the criminal context, the Bayesian network achieved, on average, a rate of correct classifications of approximately 97%, whilst in the civil context the rate is, on average, close to 88%. These results encourage continued development and testing of the method in order to support its practical application in casework.
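The core of such a Bayesian network can be sketched as a two-node model, Age class → ossification Stage, evaluated by Bayes' theorem. All probabilities below are illustrative placeholders, not values from the paper:

```python
# Two-node Bayesian network: Age class (>=18 / <18) -> ossification Stage.
# Prior and likelihood values are illustrative, not from the study.
prior = {">=18": 0.5, "<18": 0.5}
likelihood = {  # P(stage | age class)
    ">=18": {"complete": 0.85, "partial": 0.12, "absent": 0.03},
    "<18":  {"complete": 0.10, "partial": 0.45, "absent": 0.45},
}

def posterior(stage):
    """Posterior P(age class | observed ossification stage) via Bayes' rule."""
    unnorm = {a: prior[a] * likelihood[a][stage] for a in prior}
    z = sum(unnorm.values())
    return {a: p / z for a, p in unnorm.items()}

post_complete = posterior("complete")  # P(age class | complete fusion)
```

In practice the criminal and civil contexts differ mainly in the prior and in the decision threshold applied to this posterior, which is why the correct-classification rates reported above differ between the two settings.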

Relevance:

20.00%

Publisher:

Abstract:

Due to the rise of criminal, civil and administrative judicial situations involving people lacking valid identity documents, age estimation of living persons has become an important operational procedure for numerous forensic and medico-legal services worldwide. The chronological age of a given person is generally estimated from the observed degree of maturity of selected physical attributes by means of statistical methods. However, their application in the forensic framework suffers from some conceptual and practical drawbacks, as recently pointed out in the specialised literature. The aim of this paper is therefore to offer an alternative solution for overcoming these limits by reiterating the utility of a probabilistic Bayesian approach for age estimation. This approach allows one to deal in a transparent way with the uncertainty surrounding the age estimation process and to produce all the relevant information in the form of a posterior probability distribution for the chronological age of the person under investigation. Furthermore, this probability distribution can also be used to evaluate, in a coherent way, the possibility that the examined individual is younger or older than a given age threshold of particular legal interest. The main novelty introduced by this work is the development of a probabilistic graphical model, i.e. a Bayesian network, for dealing with the problem at hand. The use of this kind of probabilistic tool can significantly facilitate the application of the proposed methodology: examples are presented based on data related to the ossification status of the medial clavicular epiphysis. The reliability and the advantages of this probabilistic tool are presented and discussed.

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes a pose-based algorithm to solve the full SLAM problem for an autonomous underwater vehicle (AUV) navigating in an unknown and possibly unstructured environment. The technique incorporates probabilistic scan matching with range scans gathered from a mechanical scanning imaging sonar (MSIS) and the robot dead-reckoning displacements estimated from a Doppler velocity log (DVL) and a motion reference unit (MRU). The proposed method uses two extended Kalman filters (EKFs). The first estimates the local path travelled by the robot while grabbing the scan, as well as its uncertainty, and provides position estimates for correcting the distortions that the vehicle motion produces in the acoustic images. The second is an augmented-state EKF that estimates and maintains the poses of the registered scans. The raw data from the sensors are processed and fused online. No prior structural information or initial pose is assumed. The algorithm has been tested on an AUV guided along a 600 m path within a marina environment, showing the viability of the proposed approach.
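The predict/correct cycle at the heart of both Kalman filters can be sketched in one dimension. This is a generic linear Kalman update, not the paper's actual EKF formulation; the noise variances and inputs are illustrative:

```python
# Minimal 1-D Kalman filter illustrating the predict/correct cycle used in
# EKF-based SLAM. Q and R are illustrative process/measurement variances.
x, P = 0.0, 1.0       # state estimate and its variance
Q, R = 0.01, 0.25

def predict(x, P, u):
    # Dead-reckoning step: apply control/odometry input u, inflate uncertainty
    return x + u, P + Q

def update(x, P, z):
    # Correction step: blend prediction and measurement z by the Kalman gain
    K = P / (P + R)
    return x + K * (z - x), (1 - K) * P

x, P = predict(x, P, u=1.0)   # e.g. DVL/MRU displacement estimate
x, P = update(x, P, z=1.2)    # e.g. scan-matching position observation
```

In the paper's setting the scalar state becomes a vector of vehicle and scan poses, and the augmented-state EKF grows this vector with each newly registered scan; the gain computation generalizes to the matrix form, but the two-step structure is the same.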

Relevance:

20.00%

Publisher:

Abstract:

Coverage Path Planning (CPP) is the task of determining a path that passes over all points of an area or volume of interest while avoiding obstacles. This task is integral to many robotic applications, such as vacuum cleaning robots, painter robots, autonomous underwater vehicles creating image mosaics, demining robots, lawn mowers, automated harvesters, window cleaners and inspection of complex structures, to name just a few. A considerable body of research has addressed the CPP problem; however, no updated surveys on CPP reflecting recent advances in the field have been presented in the past ten years. In this paper, we present a review of the most successful CPP methods, focusing on the achievements made in the past decade. Furthermore, we discuss reported field applications of the described CPP methods. This work aims to become a starting point for researchers who are initiating their endeavors in CPP, and likewise to present a comprehensive review of the recent breakthroughs in the field, providing links to the most interesting and successful works.
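The simplest CPP strategy on an obstacle-free grid is the boustrophedon (back-and-forth, "lawnmower") sweep, which many of the surveyed methods generalize to cluttered environments by first decomposing the free space into sweepable cells. A minimal sketch on a rectangular grid:

```python
def boustrophedon(rows, cols):
    """Yield (row, col) cells in a back-and-forth sweep covering the grid.

    Even rows are traversed left-to-right, odd rows right-to-left, so that
    consecutive cells are always adjacent and every cell is visited once.
    """
    for r in range(rows):
        cs = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        for c in cs:
            yield (r, c)

path = list(boustrophedon(3, 4))  # covers all 12 cells of a 3x4 grid
```

Cellular-decomposition approaches run exactly this sweep inside each obstacle-free cell and then plan transit paths between cells, which is why the sweep direction and cell ordering become the main optimization targets.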

Relevance:

20.00%

Publisher:

Abstract:

Editorial material

Relevance:

20.00%

Publisher:

Abstract:

This study aimed to describe the probabilistic structure of the annual series of extreme daily rainfall (Preabs) available from the weather station of Ubatuba, State of São Paulo, Brazil (1935-2009), by using the generalized extreme value (GEV) distribution. The autocorrelation function, the Mann-Kendall test, and wavelet analysis were used to evaluate the presence of serial correlation, trends, and periodic components. Considering the results obtained with these three statistical methods, it was possible to adopt the hypothesis that this time series is free from persistence, trends, and periodic components. Based on quantitative and qualitative goodness-of-fit tests, it was found that the GEV may be used to quantify the probabilities of the Preabs data. The best results were obtained when the parameters of this distribution were estimated by the method of maximum likelihood. The method of L-moments also showed satisfactory results.
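The fitting step can be illustrated with the Gumbel distribution (the GEV with zero shape parameter), estimated here by the method of moments rather than the maximum likelihood or L-moment estimators used in the study; the synthetic annual-maximum data and the 100 mm threshold are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic annual maximum daily rainfall (mm); loc/scale are illustrative
annual_max = rng.gumbel(loc=80.0, scale=15.0, size=500)

# Method-of-moments fit of the Gumbel (GEV with zero shape) distribution:
# std = scale * pi / sqrt(6),  mean = loc + gamma * scale
euler_gamma = 0.5772156649
scale = annual_max.std(ddof=1) * np.sqrt(6) / np.pi
loc = annual_max.mean() - euler_gamma * scale

# Probability that the annual maximum exceeds 100 mm under the fitted model,
# from the Gumbel CDF F(x) = exp(-exp(-(x - loc) / scale))
p_exceed = 1.0 - np.exp(-np.exp(-(100.0 - loc) / scale))
```

The same exceedance calculation with the full three-parameter GEV underlies return-period estimates such as the rainfall depth expected once in 50 or 100 years.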

Relevance:

20.00%

Publisher:

Abstract:

In this Master's thesis, a probabilistic risk analysis of external hazards was carried out for the pool-based interim storage facility for spent nuclear fuel at the Olkiluoto nuclear power plant. Probabilistic risk analysis (PRA) is a widely used approach for identifying and assessing risks at nuclear power plants. The purpose of the work was to produce an entirely new external-hazards PRA, since no comparable risk assessments in this research area had previously been carried out in Finland. A further motivation for the risk assessment is the heightened role of external hazards in the safety of spent nuclear fuel interim storage, underlined by natural disasters that have occurred around the world. The structure of the PRA was based on a methodology created at the beginning of the study. The analysis rests on identifying potential external hazards, excluding intentional man-made damage. Based on the occurrence frequencies and damage potential of the identified external hazards, each hazard was either screened out using the screening criteria defined in the study or analysed in more detail. The results showed that data on very rare external hazards are incomplete. Most of these very rare external hazards have never occurred, and will probably never occur, in the area affecting Olkiluoto or even in Finland. For example, the roles of lightning strikes and oil exposure, and their effects on the availability of various components, are known only with considerable uncertainty. The results of the study can be considered significant as a whole, because they indicate which external hazards merit closer investigation. More detailed knowledge of very rare external hazards would refine the estimates of initiating event frequencies.

Relevance:

20.00%

Publisher:

Abstract:

Models of intermolecular interaction are widely used in biology. Analysis of protein-protein contacts and drug research are typical application areas for such models. A model describing such molecular interactions can be formulated using biophysical theory, which tends to result in an extremely heavy computational burden even for simple applications. An alternative way to formulate models is to exploit large databases containing structural measurements made, for example, by X-ray diffraction. When empirical measurement data are used directly, a statistical model makes it possible to account adequately for the uncertainty and inexactness in the data, while keeping the computational burden at a more reasonable level compared to quantum mechanical methods, which in principle should give the optimal results. In this thesis, a 3D model based on Bayesian statistics was developed for the numerical study of intermolecular interaction. The purpose of the model is to produce predictions about which molecular structures, or what kinds of structures, are preferred in a given context, i.e., are more probable in the interaction. The model was tested in essential molecular environments: a small molecule at its binding site in a protein, and the interface between the proteins of a complex. The numerical results obtained correspond well to experimental results previously reported in the literature, for example qualitative binding affinities and chemical knowledge of the spatial bonding abilities of certain amino acids. The thesis also included preliminary tests of the statistical approach for modelling the essential structural adaptability of molecules. In practice, the developed model is intended as part of a more comprehensive analysis method, such as a so-called pharmacophore model.

Relevance:

20.00%

Publisher:

Abstract:

Object detection is a fundamental task of computer vision that is utilized as a core part in a number of industrial and scientific applications, for example, in robotics, where objects need to be correctly detected and localized prior to being grasped and manipulated. Existing object detectors vary in (i) the amount of supervision they need for training, (ii) the type of learning method adopted (generative or discriminative) and (iii) the amount of spatial information used in the object model (model-free, using no spatial information in the object model, or model-based, with an explicit spatial model of the object). Although some existing methods report good performance in the detection of certain objects, the results tend to be application specific and no universal method has been found that clearly outperforms all others in all areas. This work proposes a novel generative part-based object detector. The generative learning procedure of the developed method allows learning from positive examples only. The detector is based on finding semantically meaningful parts of the object (i.e. a part detector) that can provide information beyond the object location, for example its pose. The object class model, i.e. the appearance of the object parts and their spatial constellation and variance, is explicitly modelled in a fully probabilistic manner. The appearance is based on bio-inspired complex-valued Gabor features that are transformed to part probabilities by an unsupervised Gaussian Mixture Model (GMM). The proposed novel randomized GMM enables learning from only a few training examples. The probabilistic spatial model of the part configurations is constructed with a mixture of 2D Gaussians. The appearance of the parts of the object is learned in an object canonical space that removes geometric variations from the part appearance model. Robustness to pose variations is achieved by object pose quantization, which is more efficient than the previously used scale and orientation shifts in the Gabor feature space. Performance of the resulting generative object detector is characterized by high recall with low precision, i.e. the generative detector produces a large number of false positive detections. Thus a discriminative classifier is used to prune the false positive candidate detections produced by the generative detector, improving its precision while keeping recall high. Using only a small number of positive examples, the developed object detector performs comparably to state-of-the-art discriminative methods.
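The unsupervised GMM step that turns features into part probabilities can be sketched with a minimal expectation-maximization loop. This is a generic two-component 1-D mixture fit on synthetic data, not the randomized GMM or the complex-valued Gabor features of the thesis:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic 1-D "feature" data drawn from two clusters (illustrative)
x = np.concatenate([rng.normal(-2.0, 0.5, 200), rng.normal(3.0, 1.0, 200)])

# Initial guesses for the two-component Gaussian mixture
mu = np.array([-1.0, 1.0])
sigma = np.array([1.0, 1.0])
pi = np.array([0.5, 0.5])

def gauss(v, m, s):
    return np.exp(-0.5 * ((v - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

for _ in range(50):
    # E-step: responsibility of each component for each data point
    r = pi * gauss(x[:, None], mu, sigma)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means and standard deviations
    n = r.sum(axis=0)
    pi = n / len(x)
    mu = (r * x[:, None]).sum(axis=0) / n
    sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n)
```

After convergence, the responsibilities `r` play the role of soft part probabilities: each feature is assigned a probability of belonging to each mixture component rather than a hard label.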

Relevance:

20.00%

Publisher:

Abstract:

Resilience is the property of a system to remain trustworthy despite changes. Changes of a different nature, whether due to failures of system components or varying operational conditions, significantly increase the complexity of system development. Therefore, advanced development technologies are required to build robust and flexible system architectures capable of adapting to such changes. Moreover, powerful quantitative techniques are needed to assess the impact of these changes on various system characteristics. Architectural flexibility is achieved by embedding into the system design the mechanisms for identifying changes and reacting to them. Hence a resilient system should have both advanced monitoring and error detection capabilities to recognise changes as well as sophisticated reconfiguration mechanisms to adapt to them. The aim of such reconfiguration is to ensure that the system stays operational, i.e., remains capable of achieving its goals. Design, verification and assessment of the system reconfiguration mechanisms is a challenging and error-prone engineering task. In this thesis, we propose and validate a formal framework for development and assessment of resilient systems. Such a framework provides us with the means to specify and verify complex component interactions, model their cooperative behaviour in achieving system goals, and analyse the chosen reconfiguration strategies. Due to the variety of properties to be analysed, such a framework should have an integrated nature. To ensure the system functional correctness, it should rely on formal modelling and verification, while, to assess the impact of changes on such properties as performance and reliability, it should be combined with quantitative analysis. To ensure scalability of the proposed framework, we choose Event-B as the basis for reasoning about functional correctness.
Event-B is a state-based formal approach that promotes the correct-by-construction development paradigm and formal verification by theorem proving. Event-B has mature industrial-strength tool support: the Rodin platform. Proof-based verification, as well as the reliance on abstraction and decomposition adopted in Event-B, provides the designers with powerful support for the development of complex systems. Moreover, top-down system development by refinement allows the developers to explicitly express and verify critical system-level properties. Besides ensuring functional correctness, to achieve resilience we also need to analyse a number of non-functional characteristics, such as reliability and performance. Therefore, in this thesis we also demonstrate how formal development in Event-B can be combined with quantitative analysis. Namely, we experiment with integrating techniques such as probabilistic model checking in PRISM and discrete-event simulation in SimPy with formal development in Event-B. Such an integration allows us to assess how changes and different reconfiguration strategies affect the overall system resilience. The approach proposed in this thesis is validated by a number of case studies from areas such as robotics, space, healthcare and the cloud domain.
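The kind of quantitative question answered by the probabilistic model checking and simulation mentioned above can be illustrated with a tiny Monte Carlo estimate of steady-state availability for a single failing-and-repairable component. The failure and repair probabilities are illustrative, and this stand-alone sketch is not the thesis's PRISM or SimPy tooling:

```python
import random

random.seed(0)

# Two-state Markov chain: operational <-> failed.
# Per-step failure and repair probabilities are illustrative.
p_fail, p_repair = 0.01, 0.1

up, up_steps, steps = True, 0, 200_000
for _ in range(steps):
    if up:
        up = random.random() >= p_fail      # fail with probability p_fail
    else:
        up = random.random() < p_repair     # recover with probability p_repair
    up_steps += up

# Fraction of time the component was operational
availability = up_steps / steps
```

For this chain the analytic steady-state availability is p_repair / (p_fail + p_repair) ≈ 0.909, which the simulation estimate approaches; a probabilistic model checker such as PRISM computes the same quantity exactly from the transition matrix instead of by sampling.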