906 results for Timed and Probabilistic Automata


Relevance:

30.00%

Abstract:

This paper compares two well-known scan matching algorithms, MbICP and pIC. As a result of the study, the MSISpIC, a probabilistic scan matching algorithm for the localization of an Autonomous Underwater Vehicle (AUV), is proposed. The technique uses range scans gathered with a Mechanical Scanning Imaging Sonar (MSIS) and the robot displacement estimated through dead-reckoning with a Doppler Velocity Log (DVL) and a Motion Reference Unit (MRU). The proposed method is an extension of the pIC algorithm. Its major contributions are: 1) using an EKF to estimate the local path traveled by the robot while gathering the scan, as well as its uncertainty, and 2) a method to group all the data gathered along the robot's path into a single scan with a suitable uncertainty model. The algorithm has been tested on an AUV guided along a 600 m path within a marina environment, with satisfactory results.
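The EKF dead-reckoning step described above can be sketched as follows; this is a minimal illustration assuming a planar pose state [x, y, psi], DVL body-frame velocities and an MRU yaw rate (all names hypothetical), not the authors' implementation:

```python
import numpy as np

def ekf_predict(x, P, v_body, yaw_rate, dt, Q):
    """x = [x, y, psi] pose, P = 3x3 covariance, v_body = [surge, sway] from the DVL,
    yaw_rate from the MRU, Q = process-noise covariance (hypothetical names)."""
    c, s = np.cos(x[2]), np.sin(x[2])
    # Rotate the body-frame velocity into the navigation frame and integrate.
    dx = dt * (c * v_body[0] - s * v_body[1])
    dy = dt * (s * v_body[0] + c * v_body[1])
    x_pred = x + np.array([dx, dy, dt * yaw_rate])
    # Jacobian of the motion model with respect to the state (heading enters dx, dy).
    F = np.array([[1.0, 0.0, -dy],
                  [0.0, 1.0,  dx],
                  [0.0, 0.0,  1.0]])
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

# One prediction step with made-up sensor readings.
x, P = np.zeros(3), np.eye(3) * 1e-4
x, P = ekf_predict(x, P, v_body=np.array([0.5, 0.0]), yaw_rate=0.05, dt=0.1,
                   Q=np.diag([1e-4, 1e-4, 1e-5]))
print(x, np.diag(P))
```

Repeating this step while the sonar head completes a scan yields the local path and its uncertainty, which is the first of the two contributions listed above.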

Relevance:

30.00%

Abstract:

The development of statistical models for forensic fingerprint identification purposes has been the subject of increasing research attention in recent years. This can be seen partly as a response to a number of commentators who claim that the scientific basis for fingerprint identification has not been adequately demonstrated. In addition, key forensic identification bodies such as ENFSI [1] and IAI [2] have recently endorsed and acknowledged the potential benefits of using statistical models as an important tool in support of the fingerprint identification process within the ACE-V framework. In this paper, we introduce a new Likelihood Ratio (LR) model based on Support Vector Machines (SVMs) trained with features discovered via morphometric and spatial analyses of corresponding minutiae configurations for both match and close non-match populations often found in AFIS candidate lists. Computed LR values are derived from a probabilistic framework based on SVMs that discover the intrinsic spatial differences between match and close non-match populations. Lastly, experimentation performed on a set of over 120,000 publicly available fingerprint images (mostly sourced from the National Institute of Standards and Technology (NIST) datasets) and a distortion set of approximately 40,000 images is presented, illustrating that the proposed LR model reliably points towards the correct proposition in the identification assessment of match and close non-match populations. Results further indicate that the proposed model is a promising tool for fingerprint practitioners to use when analysing the spatial consistency of corresponding minutiae configurations.
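As an illustration only, here is a hedged sketch of how an SVM posterior can be turned into a likelihood-ratio-style score. The features, populations and the `likelihood_ratio` helper are invented stand-ins; the paper's morphometric and spatial feature extraction is assumed to happen elsewhere:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_match = rng.normal(0.0, 1.0, size=(200, 5))      # stand-in "match" feature vectors
X_nonmatch = rng.normal(1.5, 1.0, size=(200, 5))   # stand-in "close non-match" vectors
X = np.vstack([X_match, X_nonmatch])
y = np.array([1] * 200 + [0] * 200)                # 1 = match, 0 = close non-match

svm = SVC(kernel="rbf", probability=True).fit(X, y)

def likelihood_ratio(features):
    # With equal class priors in training, the calibrated posterior odds
    # approximate p(features | match) / p(features | close non-match).
    p = svm.predict_proba(features.reshape(1, -1))[0, 1]
    return p / (1.0 - p)

print(likelihood_ratio(rng.normal(0.0, 1.0, size=5)))
```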

Relevance:

30.00%

Abstract:

Many studies have forecasted the possible impact of climate change on plant distributions using models based on ecological niche theory. In their basic implementation, niche-based models do not constrain predictions by dispersal limitations. Hence, most niche-based modelling studies published so far have assumed dispersal to be either unlimited or null. However, depending on the rate of climatic change, landscape fragmentation and the dispersal capabilities of individual species, these assumptions are likely to prove inaccurate, leading to under- or overestimation of future species distributions and yielding large uncertainty between these two extremes. As a result, the concepts of "potentially suitable" and "potentially colonisable" habitat are expected to differ significantly. To quantify to what extent these two concepts can differ, we developed MIGCLIM, a model simulating plant dispersal under climate change and landscape fragmentation scenarios. MIGCLIM implements various parameters, such as dispersal distance, increase in reproductive potential over time, barriers to dispersal, and long-distance dispersal. Several simulations were run for two virtual species in a study area of the western Swiss Alps, varying dispersal distance and other parameters. Each simulation covered the hundred-year period 2001-2100, and three different IPCC-based temperature warming scenarios were considered. Our results indicate that: (i) using realistic parameter values, the future potential distributions generated with MIGCLIM can differ significantly (up to a more than 95% decrease in colonized surface) from those that ignore dispersal; (ii) this divergence increases both with increasing climate warming and over longer time periods; (iii) the uncertainty associated with the warming scenario can be nearly as large as that related to the dispersal parameters; (iv) accounting for dispersal, even roughly, can substantially reduce uncertainty in projections.
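A toy sketch of the "potentially suitable" versus "potentially colonisable" distinction described above (not MIGCLIM itself; grid size, suitability probability and dispersal radius are invented):

```python
import numpy as np

rng = np.random.default_rng(1)
n, steps, disp = 100, 100, 1            # grid size, years, dispersal radius (cells)
suitable = rng.random((n, n)) < 0.35    # climatically suitable habitat
occupied = np.zeros((n, n), dtype=bool)
occupied[n // 2, n // 2] = True         # founding population in the centre

for _ in range(steps):
    # A cell is reachable if any occupied cell lies within 'disp' cells of it.
    reachable = np.zeros_like(occupied)
    for r, c in zip(*np.where(occupied)):
        reachable[max(0, r - disp):r + disp + 1, max(0, c - disp):c + disp + 1] = True
    # Only suitable AND reachable cells can become colonised.
    occupied |= suitable & reachable

print("potentially suitable cells:  ", int(suitable.sum()))
print("potentially colonisable cells:", int(occupied.sum()))
```

With limited dispersal, the colonised area stays well below the suitable area, which is the gap the abstract quantifies.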

Relevance:

30.00%

Abstract:

It is essential for organizations to compress detailed sets of information into more comprehensible sets, thereby establishing sharp data compression and good decision-making. In chapter 1, I review and structure the literature on information aggregation in management accounting research. I outline the cost-benefit trade-off that management accountants need to consider when they decide on the optimal levels of information aggregation. Beyond the fundamental information content perspective, organizations also have to account for cognitive and behavioral perspectives. I elaborate on these aspects, differentiating between research in cost accounting, budgeting and planning, and performance measurement. In chapter 2, I focus on a specific bias that arises when probabilistic information is aggregated. In budgeting and planning, for example, organizations need to estimate mean costs and durations of projects, as the mean is the only measure of central tendency that is linear. Unlike the mean, measures such as the mode or median cannot simply be added up. Given the specific shape of cost and duration distributions, estimating mode or median values will result in underestimations of total project costs and durations. In two experiments, I find that participants tend to estimate mode values rather than mean values, resulting in large distortions of estimates for total project costs and durations. I also provide a strategy that partly mitigates this bias. In chapter 3, I conduct an experimental study to compare two approaches to time estimation for cost accounting, i.e., traditional activity-based costing (ABC) and time-driven ABC (TD-ABC). Contrary to claims made by proponents of TD-ABC, I find that TD-ABC is not necessarily suitable for capacity computations. However, I also provide evidence that TD-ABC seems better suited for cost allocations than traditional ABC.
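A small numerical illustration of the mode-versus-mean aggregation bias discussed in chapter 2, assuming right-skewed (here lognormal) cost distributions with invented parameters:

```python
import numpy as np

mu, sigma, n_tasks = 2.0, 0.8, 10             # lognormal parameters per task (invented)
mean_per_task = np.exp(mu + sigma**2 / 2)     # mean of a lognormal distribution
mode_per_task = np.exp(mu - sigma**2)         # mode of a lognormal distribution

rng = np.random.default_rng(0)
totals = rng.lognormal(mu, sigma, size=(100_000, n_tasks)).sum(axis=1)

print("sum of modes:          ", n_tasks * mode_per_task)   # ~ 39: underestimates the total
print("sum of means:          ", n_tasks * mean_per_task)   # ~ 102
print("simulated expected sum:", totals.mean())             # ~ 102: means add up, modes do not
```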

Relevance:

30.00%

Abstract:

Assessment of locomotion through simple tests such as timed up and go (TUG) or walking trials can provide valuable information for the evaluation of treatment and the early diagnosis of people with Parkinson's disease (PD). Common methods used in clinics are based either on complex motion laboratory settings or on simple timing outcomes obtained with stop watches. The goal of this paper is to present an innovative technology based on wearable on-shoe sensors and a processing algorithm, which provides outcome measures characterizing PD motor symptoms during TUG and gait tests. Our results on ten PD patients and ten age-matched elderly subjects indicate an accuracy ± precision of 2.8 ± 2.4 cm/s and 1.3 ± 3.0 cm for stride velocity and stride length estimation, respectively, compared to optical motion capture, with the advantage of being practical to use at home or in clinics without any discomfort for the subject. In addition, the use of novel spatio-temporal parameters, including turning, swing width, path length, and their intercycle variability, was also validated and showed interesting tendencies for discriminating between patients in ON and OFF states and control subjects.
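A minimal sketch of how such spatio-temporal outcomes and their intercycle variability might be computed from per-stride estimates (synthetic stand-in values, not the sensor pipeline of the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
stride_lengths = rng.normal(1.20, 0.05, size=50)   # metres, one value per stride (synthetic)
stride_times = rng.normal(1.10, 0.04, size=50)     # seconds (synthetic)
stride_velocity = stride_lengths / stride_times

def cv(x):
    # Intercycle variability is commonly reported as a coefficient of variation (%).
    return 100.0 * x.std(ddof=1) / x.mean()

print("mean stride velocity (m/s):     ", stride_velocity.mean())
print("stride length variability (%CV):", cv(stride_lengths))
```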

Relevance:

30.00%

Abstract:

The research considers the problem of spatial data classification using machine learning algorithms: probabilistic neural networks (PNN) and support vector machines (SVM). As a benchmark model, the simple k-nearest neighbour algorithm is considered. PNNs are a neural-network reformulation of well-known nonparametric principles of probability density modelling using kernel density estimators and Bayesian optimal or maximum a posteriori decision rules. PNNs are well suited to problems where not only predictions but also quantification of accuracy and integration of prior information are necessary. An important property of PNNs is that they can easily be used in decision support systems dealing with problems of automatic classification. Support vector machines are an implementation of the principles of statistical learning theory for classification tasks. Recently they have been successfully applied to different environmental topics: classification of soil types and hydro-geological units, optimization of monitoring networks, and susceptibility mapping of natural hazards. In the present paper, both simulated and real data case studies (low and high dimensional) are considered. The main attention is paid to the detection and learning of spatial patterns by the algorithms applied.
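The PNN principle, class-conditional kernel density estimates combined with Bayes' rule, can be sketched as follows (a simplified stand-in with synthetic data and an invented function name, not the exact model used in the study):

```python
import numpy as np

def pnn_posteriors(X_train, y_train, x, bandwidth=0.5):
    classes = np.unique(y_train)
    scores = []
    for c in classes:
        Xc = X_train[y_train == c]
        d2 = ((Xc - x) ** 2).sum(axis=1)
        # Sum of Gaussian kernels = class-conditional kernel density estimate at x,
        # implicitly weighted by the class prior n_c / n.
        scores.append(np.exp(-d2 / (2 * bandwidth**2)).sum() / len(X_train))
    scores = np.array(scores)
    # Normalising gives class probabilities, not just a hard label.
    return classes, scores / scores.sum()

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
print(pnn_posteriors(X, y, np.array([2.5, 2.5])))
```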

Relevance:

30.00%

Abstract:

This paper reports on the purpose, design, methodology and target audience of the E-learning courses in forensic interpretation offered by the authors since 2010, including practical experience gained throughout the implementation period of this project. This initiative was motivated by the fact that reporting results of forensic examinations in a logically correct and scientifically rigorous way is a daily challenge for any forensic practitioner. Indeed, interpretation of raw data and communication of findings in both written and oral statements are topics where knowledge and applied skills are needed. Although most forensic scientists hold educational records in traditional sciences, only a few have actually followed full courses focussed on interpretation issues. Such courses should include foundational principles and methodology - including elements of forensic statistics - for the evaluation of forensic data in a way that is tailored to meet the needs of the criminal justice system. In order to help bridge this gap, the authors' initiative seeks to offer educational opportunities that allow practitioners to acquire knowledge and competence in current approaches to the evaluation and interpretation of forensic findings. These cover, among other aspects, probabilistic reasoning (including Bayesian networks and other methods of forensic statistics, tools and software), case pre-assessment, skills in the oral and written communication of uncertainty, and the development of independence and self-confidence to solve practical inference problems. E-learning was chosen as the general format because it helps to form a trans-institutional online community of practitioners from varying forensic disciplines and fields of experience, such as reporting officers, (chief) scientists and forensic coordinators, but also lawyers, all of whom can interact directly from their personal workplaces without consideration of distances, travel expenses or time schedules. In the authors' experience, the proposed learning initiative supports participants in developing their expertise and skills in forensic interpretation, but also offers an opportunity for the associated institutions and the forensic community to reinforce the development of a harmonized view with regard to interpretation across forensic disciplines, laboratories and judicial systems.

Relevance:

30.00%

Abstract:

This manual summarizes the roadside tree and brush control methods used by all of Iowa's 99 counties. It is based on interviews conducted in spring 2002 with county engineers, roadside managers and others. The target audience of this manual is the novice county engineer or roadside manager. Iowa law is nearly silent on roadside tree and brush control, so individual counties have been left to decide on the level of control they want to achieve and maintain. Different solutions have been developed, but the goal of every county remains the same: to provide safe roads for the traveling public. Counties in eastern and southern Iowa appear to face the greatest brush control challenge. Most control efforts can be divided into two categories: mechanical and chemical. Mechanical control includes cutting tools and supporting equipment. A chain saw is the most widely used cutting tool. Tractor-mounted boom mowers and brush cutters are used to prune miles of brush but have significant safety and aesthetic limitations, and boom mowers are easily broken by inexperienced operators. The advent of tree shears and hydraulic thumbs offers unprecedented versatility. Bulldozers are often considered a method of last resort since they reduce large areas to bare ground. Any chipper that violently grabs brush should not be used. Chemical control is the application of herbicide to different parts of a plant: foliar spray is applied to leaves; basal bark spray is applied to the tree trunk; a cut stump treatment is applied to the cambium ring of a cut surface. There is reluctance by many to apply herbicide into the air due to drift concerns. One-third of Iowa counties do not use foliar spray. By contrast, several accepted control methods are directed toward the ground. Freshly cut stumps should be treated to prevent resprouting. Basal bark spray is highly effective in sensitive areas such as near houses. Interest in chemical control is slowly increasing as herbicides and application methods are refined. Fall burning, a third, distinctly separate technique, is underused as a brush control method and can be effective if timed correctly. In all, control methods tend to reflect agricultural patterns in a county. The use of chain saws and foliar sprays tends to increase in counties where row crops predominate, and boom mowing tends to increase in counties where grassland predominates. For counties with light to moderate roadside brush, rotational maintenance is the key to effective control. The most comprehensive approach to control is to implement an integrated roadside vegetation management (IRVM) program. An IRVM program is usually directed by a roadside manager whose duties may be shared with another position. Funding for control programs comes from the Rural Services Basic portion of a county's budget. The average annual county brush control budget is about $76,000; that figure is thought not to include shared expenses such as fuel and buildings. Start-up costs for an IRVM program are lower if an existing control program is converted. In addition, IRVM budgets from three different northeastern Iowa counties are offered for comparison in this manual. The manual also includes a chapter on temporary traffic control in rural work zones, a summary of the Iowa Code as it relates to brush control, and rules on avoiding seasonal disturbance of the endangered Indiana bat.
Appendices summarize survey and forest cover data, an equipment inventory, sample forms for record keeping, a sample brush control policy, a few legal opinions, a literature search, and a glossary.

Relevance:

30.00%

Abstract:

In this paper we consider diffusion of a passive substance C in a temporally and spatially inhomogeneous two-dimensional medium. As a realization of the latter we choose a phase-separating medium consisting of two substances A and B, whose dynamics is determined by the Cahn-Hilliard equation. Assuming different diffusion coefficients of C in A and B, we find that the variance of the distribution function of this substance grows less than linearly in time. We derive a simple identity for the variance using a probabilistic ansatz and are then able to identify the interface between A and B as the main cause of this nonlinear dependence. We argue that, for very large times, the time-dependent diffusion "constant" approaches a constant asymptotic value D∞ as t^(-1/3). The latter is calculated approximately by employing the effective-medium approximation and by fitting the simulation data to this time dependence.
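A short numerical check of the variance argument, assuming the stated relaxation D(t) = D_inf + c·t^(-1/3) with invented constants (this is only a sketch of the consequence of that ansatz, not the paper's derivation):

```python
import numpy as np

D_inf, c = 0.1, 1.0                           # invented constants
t = np.linspace(1.0, 1e4, 100_000)
D = D_inf + c * t ** (-1.0 / 3.0)             # assumed relaxation of the diffusion "constant"

# Closed form implied by the ansatz: Var(t) = 2*D_inf*t + 3*c*t**(2/3).
var_closed = 2 * D_inf * t + 3 * c * t ** (2.0 / 3.0)
# Trapezoidal check of Var(t) = Var(1) + 2 * integral_1^t D(t') dt'.
var_numeric = var_closed[0] + 2 * np.cumsum(
    np.concatenate([[0.0], 0.5 * (D[1:] + D[:-1]) * np.diff(t)]))

print(var_closed[-1], var_numeric[-1])        # the two agree
# Var(t)/t decreases with time, i.e. growth is slower than linear and only
# approaches the asymptotic slope 2*D_inf at very large times.
```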

Relevance:

30.00%

Abstract:

The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples are analysed in an accurate and reproducible way and that they are compared in an objective and automated way. This latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited to different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin-layer chromatography, despite its reputation of lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model, and therefore to move away from the traditional subjective approach, which is entirely based on experts' opinion and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains obtained over the traditional subjective approach for the search of ink specimens in ink databases and the interpretation of their evidential value.
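For illustration only, a generic score-based likelihood-ratio computation of the kind alluded to above (this is not the authors' comparison algorithm; the scores, populations and profiles are synthetic):

```python
import numpy as np
from scipy.stats import norm, pearsonr

rng = np.random.default_rng(0)
same_scores = rng.normal(0.95, 0.02, 500)        # stand-in same-ink comparison scores
diff_scores = rng.normal(0.70, 0.10, 500)        # stand-in different-ink comparison scores

def score(profile_a, profile_b):
    # Compare two analytical profiles with a simple correlation score.
    return pearsonr(profile_a, profile_b)[0]

def likelihood_ratio(s):
    # Evidential value of the score under "same ink" vs "different ink",
    # each population modelled here with a fitted normal density.
    return norm.pdf(s, same_scores.mean(), same_scores.std(ddof=1)) / \
           norm.pdf(s, diff_scores.mean(), diff_scores.std(ddof=1))

questioned = rng.random(50)
reference = questioned + rng.normal(0, 0.05, 50)  # a very similar profile
print(likelihood_ratio(score(questioned, reference)))
```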

Relevance:

30.00%

Abstract:

The paper deals with the development and application of a generic methodology for the automatic processing (mapping and classification) of environmental data. The General Regression Neural Network (GRNN) is considered in detail and is proposed as an efficient tool to solve the problem of spatial data mapping (regression). The Probabilistic Neural Network (PNN) is considered as an automatic tool for spatial classification. The automatic tuning of isotropic and anisotropic GRNN/PNN models using a cross-validation procedure is presented. Results are compared with the k-Nearest-Neighbours (k-NN) interpolation algorithm using an independent validation data set. Real case studies are based on decision-oriented mapping and classification of radioactively contaminated territories.
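A minimal isotropic GRNN sketch (Nadaraya-Watson kernel regression) with the bandwidth tuned by leave-one-out cross-validation, as a stand-in for the automatic tuning described above (synthetic data, invented function names):

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma):
    # Gaussian-kernel weighted average of training targets (isotropic GRNN).
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma**2))
    return (w @ y_train) / w.sum(axis=1)

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, (200, 2))                 # synthetic sample locations
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200) # synthetic measurements

# Leave-one-out cross-validation over a small grid of bandwidths.
best = min(
    (np.mean([(y[i] - grnn_predict(np.delete(X, i, 0), np.delete(y, i),
                                   X[i:i + 1], s)[0]) ** 2 for i in range(len(X))]), s)
    for s in (0.2, 0.5, 1.0, 2.0))
print("best bandwidth:", best[1], "LOO MSE:", best[0])
```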

Relevance:

30.00%

Abstract:

This paper analyses and discusses arguments that have emerged from a recent discussion about the proper assessment of the evidential value of correspondences observed between the characteristics of a crime stain and those of a sample from a suspect when (i) this individual is found as a result of a database search and (ii) the remaining database members are excluded as potential sources (because of different analytical characteristics). Using a graphical probability approach (i.e., Bayesian networks), this paper intends to clarify that there is no need to (i) introduce a correction factor equal to the size of the searched database (i.e., to reduce a likelihood ratio), nor to (ii) adopt a propositional level not directly related to the suspect matching the crime stain (i.e., a proposition of the kind 'some person in (outside) the database is the source of the crime stain' rather than 'the suspect (some other person) is the source of the crime stain'). The present research thus confirms existing literature on the topic, which has repeatedly demonstrated that the two requirements (i) and (ii) should not be a cause of concern.
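A toy numerical version of the argument (invented population size, database size and match probability; not the paper's Bayesian network):

```python
population = 1_000_000      # potential sources, uniform prior (invented)
database = 10_000           # profiles searched; all but the suspect excluded
gamma = 1e-6                # random-match probability (invented)

lr = 1 / gamma              # likelihood ratio for the suspect's match

# Single-suspect case (no database search): posterior odds = LR x prior odds.
odds_no_search = lr / (population - 1)

# Database-search case: the excluded database members are no longer plausible
# alternative sources, so only the untyped individuals outside the database
# remain. The LR itself is NOT divided by the database size.
odds_after_search = lr / (population - database)

print(odds_after_search > odds_no_search)   # True: the exclusions strengthen the case
```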

Relevance:

30.00%

Abstract:

Detailed knowledge of the anatomy and connectivity pattern of cortico-basal ganglia circuits is essential to an understanding of abnormal cortical function and pathophysiology associated with a wide range of neurological and neuropsychiatric diseases. We aim to study the spatial extent and topography of human basal ganglia connectivity in vivo. Additionally, we explore at an anatomical level the hypothesis of coexistent segregated and integrative cortico-basal ganglia loops. We use probabilistic tractography on magnetic resonance diffusion weighted imaging data to segment basal ganglia and thalamus in 30 healthy subjects based on their cortical and subcortical projections. We introduce a novel method to define voxel-based connectivity profiles that allow representation of projections from a source to more than one target region. Using this method, we localize specific relay nuclei within predefined functional circuits. We find strong correlation between tractography-based basal ganglia parcellation and anatomical data from previously reported invasive tracing studies in nonhuman primates. Additionally, we show in vivo the anatomical basis of segregated loops and the extent of their overlap in prefrontal, premotor, and motor networks. Our findings in healthy humans support the notion that probabilistic diffusion tractography can be used to parcellate subcortical gray matter structures on the basis of their connectivity patterns. The coexistence of clearly segregated and also overlapping connections from cortical sites to basal ganglia subregions is a neuroanatomical correlate of both parallel and integrative networks within them. We believe that this method can be used to examine pathophysiological concepts in a number of basal ganglia-related disorders.
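A sketch of connectivity-profile-based parcellation in this spirit (synthetic streamline counts and target names; not the authors' pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)
targets = ["prefrontal", "premotor", "motor"]
# Rows = subcortical voxels, columns = streamline counts to each cortical target.
counts = rng.poisson(lam=[[50, 5, 5], [10, 40, 30], [5, 10, 60]], size=(3, 3))

# Voxel-wise connectivity profiles: normalised counts can represent projections
# from one voxel to more than one target region.
profiles = counts / counts.sum(axis=1, keepdims=True)
hard_labels = [targets[i] for i in profiles.argmax(axis=1)]      # winner-take-all parcellation
overlap = (profiles > 0.25).sum(axis=1) > 1                      # voxels projecting substantially to >1 target

print(profiles.round(2))
print(hard_labels, overlap)
```

The hard labels correspond to a segregated parcellation, while voxels flagged by `overlap` illustrate the coexisting integrative connections discussed above.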
