901 results for probabilistic roadmap


Relevance: 10.00%

Abstract:

Technological forecasting, defined as quantified probabilistic prediction of the timing and degree of change in technological parameters, capabilities, desirability, or needs at different times in the future, is applied to birth control technology (BCT) as a means of revealing the most promising research paths by identifying the necessary points for breakthroughs. The present status of BCT in the areas of pills and the IUD, male contraceptives, immunological approaches, post-coital pills, abortion, sterilization, luteolytic agents, laser technologies, and control of the sex of the child is summarized and evaluated in turn. Fine mapping is done to identify the most promising areas of BCT. These include efforts to make oral contraception easier, improvement of the design of the IUD, clinical evaluation of the male contraceptive danazol, the effecting of biochemical changes in the seminal fluid, and research into immunological approaches and the effects of other new drugs such as prostaglandins. The areas that require immediate and large research inputs are oral contraception and the IUD. On the basis of population and technological forecasts, it is deduced that research efforts could most effectively aid countries like India through the immediate production of an oral contraceptive pill or IUD with long-lasting effects. Development of a pill for males or an immunization against pregnancy would also have a significant impact. However, the major impediment to birth control programs to date is attitudes, which must be changed through education.

Relevance: 10.00%

Abstract:

Downscaling to station-scale hydrologic variables from large-scale atmospheric variables simulated by general circulation models (GCMs) is usually necessary to assess the hydrologic impact of climate change. This work presents CRF-downscaling, a new probabilistic downscaling method that represents the daily precipitation sequence as a conditional random field (CRF). The conditional distribution of the precipitation sequence at a site, given the daily atmospheric (large-scale) variable sequence, is modeled as a linear chain CRF. CRFs do not make assumptions on independence of observations, which gives them flexibility in using high-dimensional feature vectors. Maximum likelihood parameter estimation for the model is performed using limited memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) optimization. Maximum a posteriori estimation is used to determine the most likely precipitation sequence for a given set of atmospheric input variables using the Viterbi algorithm. Direct classification of dry/wet days as well as precipitation amount is achieved within a single modeling framework. The model is used to project the future cumulative distribution function of precipitation. Uncertainty in precipitation prediction is addressed through a modified Viterbi algorithm that predicts the n most likely sequences. The model is applied for downscaling monsoon (June-September) daily precipitation at eight sites in the Mahanadi basin in Orissa, India, using the MIROC3.2 medium-resolution GCM. The predicted distributions at all sites show an increase in the number of wet days, and also an increase in wet day precipitation amounts. A comparison of current and future predicted probability density functions for daily precipitation shows a change in shape of the density function with decreasing probability of lower precipitation and increasing probability of higher precipitation.
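The MAP decoding step mentioned above is the standard Viterbi recursion for a linear-chain model. A minimal sketch, assuming an invented two-state (dry/wet) chain with arbitrary emission and transition scores rather than the paper's learned CRF features:

```python
# Viterbi decoding for a linear-chain model with two states
# (0 = dry day, 1 = wet day). Scores are illustrative stand-ins for
# learned CRF feature weights; L-BFGS training is not reproduced here.
import numpy as np

def viterbi(emission, transition):
    """emission: (T, S) per-day state scores; transition: (S, S) scores.
    Returns the highest-scoring (MAP) state sequence."""
    T, S = emission.shape
    delta = np.zeros((T, S))           # best score ending in each state
    psi = np.zeros((T, S), dtype=int)  # back-pointers
    delta[0] = emission[0]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + transition + emission[t]
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0)
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t][path[-1]]))
    return path[::-1]

emission = np.array([[1.2, 0.1], [0.3, 0.9], [0.2, 1.1], [1.0, 0.4]])
transition = np.array([[0.5, -0.2], [-0.2, 0.6]])  # dry/wet persistence
print(viterbi(emission, transition))  # [0, 1, 1, 1]
```

The paper's modified Viterbi for the n most likely sequences would keep the n best scores per state instead of the single maximum.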

Relevance: 10.00%

Abstract:

Particle filters find important applications in state and parameter estimation for dynamical systems of engineering interest. Since a typical filtering algorithm involves Monte Carlo simulations of the process equations, the sample variance of the estimator is inversely proportional to the number of particles. The sample variance may be reduced if one uses a Rao-Blackwell marginalization of states and performs analytical computations as much as possible. In this work, we propose a semi-analytical particle filter, requiring no Rao-Blackwell marginalization, for state and parameter estimation of nonlinear dynamical systems with additive Gaussian process/observation noise. Through local linearizations of the nonlinear drift fields in the process/observation equations via explicit Ito-Taylor expansions, the given nonlinear system is transformed into an ensemble of locally linearized systems. Using the most recent observation, conditionally Gaussian posterior density functions of the linearized systems are obtained analytically through the Kalman filter. This information is further exploited within the particle filter algorithm to obtain samples from the optimal posterior density of the states. The potential of the method in state/parameter estimation is demonstrated through numerical illustrations for a few nonlinear oscillators. The proposed filter is found to yield estimates with reduced sample variance and improved accuracy compared to results from a form of sequential importance sampling filter.
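For orientation, the sketch below is a plain bootstrap particle filter, the Monte Carlo baseline whose sample variance the proposed semi-analytical filter aims to reduce; the drift function, noise levels and observations are invented, and the paper's Kalman-updated conditionally Gaussian proposals are not reproduced:

```python
# Minimal bootstrap particle filter for a scalar nonlinear system.
import numpy as np

rng = np.random.default_rng(0)

def f(x):                    # nonlinear drift (illustrative choice)
    return 0.9 * np.sin(x)

def step(particles, y, q=0.1, r=0.2):
    particles = f(particles) + rng.normal(0, q, particles.shape)
    w = np.exp(-0.5 * ((y - particles) / r) ** 2)   # Gaussian likelihood
    w /= w.sum()
    idx = rng.choice(len(particles), len(particles), p=w)  # resample
    return particles[idx]

particles = rng.normal(0, 1, 500)
for y in [0.4, 0.1, -0.3]:           # synthetic observations
    particles = step(particles, y)
    print(particles.mean())          # filtered state estimate
```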

Relevance: 10.00%

Abstract:

House loss during unplanned bushfires is a complex phenomenon in which design, configuration, materials and siting can significantly influence the outcome. In collaboration with the Bushfire Cooperative Research Centre, the CSIRO has developed a tool to assess the vulnerability of a specific house at the urban interface. The tool is based on spatial profiling of urban assets, including their design, materials, surrounding objects and their relationships to one another. The analysis incorporates both probabilistic and deterministic parameters and is based on the impact of radiant heat, flame and embers on the surrounding elements and the structure itself. It provides a breakdown of the attributes and design parameters that contribute to the vulnerability level. This paper describes the tool, which allows the user to explore the vulnerability of a house to varying levels of bushfire attack. The tool is aimed at government agencies interested in building design, town planning and community education for bushfire risk mitigation.

Relevance: 10.00%

Abstract:

The Fuzzy Waste Load Allocation Model (FWLAM), developed in an earlier study, derives the optimal fractional levels for base flow conditions, considering the goals of the Pollution Control Agency (PCA) and the dischargers. The Modified Fuzzy Waste Load Allocation Model (MFWLAM), developed subsequently, is a stochastic model that considers the moments (mean, variance and skewness) of water quality indicators, incorporating uncertainty due to randomness of input variables along with uncertainty due to imprecision. The risk of low water quality is reduced significantly by using this modified model, but the inclusion of new constraints leads to a low value of the acceptability level, A, interpreted as the maximized minimum satisfaction in the system. To improve this value, a new model, which is a combination of FWLAM and MFWLAM, is presented, allowing for some violations of the constraints of MFWLAM. This combined model is a multiobjective optimization model with two objectives: maximization of the acceptability level and minimization of constraint violations. Fuzzy multiobjective programming, goal programming and fuzzy goal programming are used to find the solutions. For the optimization model, Probabilistic Global Search Lausanne (PGSL) is used as a nonlinear optimization tool. The methodology is applied to a case study of the Tunga-Bhadra river system in south India. The model results in a compromise solution with a higher value of the acceptability level compared to MFWLAM, with a satisfactory value of risk. Thus the goal of risk minimization is achieved with a comparatively better value of the acceptability level.
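A minimal sketch of the max-min idea behind the acceptability level: maximize lambda subject to lambda lying below every membership function. The two linear memberships below are invented; the actual water quality constraints and the PGSL search are not reproduced:

```python
# Max-min fuzzy satisfaction as a linear program: variables [x, lam].
# Memberships (invented): mu_1 = x / 10, mu_2 = (8 - x) / 8.
# lam <= x/10      ->  -x/10  + lam <= 0
# lam <= (8 - x)/8 ->   x/8   + lam <= 1
from scipy.optimize import linprog

res = linprog(c=[0, -1],                       # maximize lam
              A_ub=[[-0.1, 1], [0.125, 1]],
              b_ub=[0, 1],
              bounds=[(0, 10), (0, 1)])
print(res.x)  # x ~ 4.44, lam ~ 0.44 (memberships balanced)
```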

Relevance: 10.00%

Abstract:

The problem of identification of the stiffness, mass and damping properties of linear structural systems, based on multiple sets of measurement data originating from static and dynamic tests, is considered. A strategy within the framework of Kalman-filter-based dynamic state estimation is proposed to tackle this problem. The static tests consist of measurements of the response of the structure to slowly moving loads and to static loads whose magnitudes are varied incrementally; the dynamic tests involve measurement of a few elements of the frequency response function (FRF) matrix. These measurements are taken to be contaminated by additive Gaussian noise. An artificial independent variable τ, which simultaneously parameterizes the point of application of the moving load, the magnitude of the incrementally varied static load and the driving frequency in the FRFs, is introduced. The state vector is taken to consist of the system parameters to be identified. The fact that these parameters are independent of the variable τ is taken to constitute the set of 'process' equations. The measurement equations are derived from the mechanics of the problem, with quantities such as displacements and/or strains taken to be measured. A recursive algorithm is developed that employs a linearization strategy based on Neumann's expansion of the structural static and dynamic stiffness matrices and provides posterior estimates of the mean and covariance of the unknown system parameters. The satisfactory performance of the proposed approach is illustrated by considering the identification of the dynamic properties of an inhomogeneous beam and the axial rigidities of the members of a truss structure.
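A minimal sketch of the 'parameters as constant states' device: the process equation is simply theta_{k+1} = theta_k, and each measurement updates the parameter estimate through the Kalman gain. A single hypothetical axial flexibility c = 1/(EA) is identified from noisy displacements u = c F; the Neumann-expansion linearization of the stiffness matrices is not reproduced:

```python
# Scalar Kalman filter identifying a constant parameter from
# incrementally varied static loads.
import numpy as np

rng = np.random.default_rng(1)
c_true, r = 2.0e-3, 1e-4               # true flexibility, noise std
theta, P = 1.0e-3, 1.0                 # prior mean and variance

for F in np.linspace(10.0, 100.0, 10): # incrementally varied load
    y = c_true * F + rng.normal(0, r)  # measured displacement
    H = F                              # measurement sensitivity
    K = P * H / (H * P * H + r**2)     # Kalman gain
    theta = theta + K * (y - H * theta)
    P = (1 - K * H) * P                # posterior variance shrinks
print(theta)                           # close to 2.0e-3
```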

Relevance: 10.00%

Abstract:

The Kachchh region of Gujarat, India, bore the brunt of a disastrous earthquake of magnitude Mw = 7.6 that occurred on January 26, 2001. The major cause of failure of various structures, including earthen dams, was noted to be the presence of liquefiable alluvium in the foundation soil. Results of back-analysis of the failures of the Chang, Tappar, Kaswati and Rudramata earth dams using a pseudo-static limit equilibrium approach, presented in this paper, confirm that the presence of the liquefiable layer contributed to lower factors of safety, leading to a base type of failure that was also observed in the field. Following the earthquake, the earth dams were rehabilitated by the concerned authority, and it is imperative that the reconstructed sections be reanalyzed. It is also increasingly realized that, in view of the large-scale investment involved, risk assessment of dams based on probabilistic analysis is necessary. In this study, it is demonstrated that the probabilistic approach, when used in conjunction with the deterministic approach, helps in providing a rational solution for quantification of the safety of a dam and in the estimation of the risk associated with dam construction.
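To illustrate what the probabilistic layer adds to a deterministic factor-of-safety check, a hedged Monte Carlo sketch with an invented lognormal resistance model (not the dams' actual geometry or strength data):

```python
# Probability that the factor of safety drops below 1, by simulation.
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
resisting = rng.lognormal(mean=np.log(1.5), sigma=0.25, size=n)
fs = resisting / 1.0            # normalized driving moment = 1
print((fs < 1.0).mean())        # estimated probability of failure, ~0.05
```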

Relevance: 10.00%

Abstract:

Anatomical brain networks change throughout life and with diseases. Genetic analysis of these networks may help identify processes giving rise to heritable brain disorders, but we do not yet know which network measures are promising for genetic analyses. Many factors affect the downstream results, such as the tractography algorithm used to define structural connectivity. We tested nine different tractography algorithms and four normalization methods to compute brain networks for 853 young healthy adults (twins and their siblings). We fitted genetic structural equation models to all nine network measures, after a normalization step to increase network consistency across tractography algorithms. Probabilistic tractography algorithms with global optimization (such as Probtrackx and Hough) yielded higher heritability statistics than 'greedy' algorithms (such as FACT) which process small neighborhoods at each step. Some global network measures (probtrackx-derived GLOB and ST) showed significant genetic effects, making them attractive targets for genome-wide association studies.
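As a simpler stand-in for the genetic structural equation (ACE) models fitted in the study, twin correlations give a back-of-envelope heritability via Falconer's formula; the correlation values below are invented:

```python
# Falconer's formula: h^2 = 2 * (r_MZ - r_DZ).
r_mz, r_dz = 0.60, 0.35   # monozygotic / dizygotic twin correlations
h2 = 2 * (r_mz - r_dz)    # additive genetic share of variance
c2 = r_mz - h2            # shared-environment share
e2 = 1 - r_mz             # unique environment + measurement error
print(h2, c2, e2)         # ~0.5, ~0.1, ~0.4
```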

Relevance: 10.00%

Abstract:

Randomness in the source condition, in addition to heterogeneity in the system parameters, can be a major source of uncertainty in the concentration field. Hence, a more general form of the problem formulation is necessary to consider randomness in both the source condition and the system parameters. When the source varies with time, the unsteady problem can be solved using the unit response function. In the case of random system parameters, the response function becomes a random function and depends on the randomness in the system parameters. In the present study, the source is modelled as a random discrete process with either a fixed interval or a random interval (the Poisson process). An attempt is made to assess the relative effects of various types of source uncertainty on the probabilistic behaviour of the concentration in a porous medium while the system parameters are also modelled as random fields. Analytical expressions for the mean and covariance of the concentration due to a random discrete source are derived in terms of the mean and covariance of the unit response function. The probabilistic behaviour of the random response function is obtained using a perturbation-based stochastic finite element method (SFEM), which performs well for mild heterogeneity. The proposed method is applied to both 1-D and 3-D solute transport problems. The results obtained with SFEM are compared with Monte Carlo simulation for the 1-D problems.
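A sketch of the superposition idea: with a unit response function h(t), a source emitting random pulses s_k at times t_k gives c(t) = sum_k s_k h(t - t_k), so the mean concentration follows from E[s_k] and h. The exponential response below is an arbitrary stand-in, not the SFEM-derived random response:

```python
# Mean concentration from a random discrete source, Monte Carlo vs. formula.
import numpy as np

rng = np.random.default_rng(3)

def h(t):                                    # unit response (illustrative)
    return np.where(t >= 0, np.exp(-0.5 * t), 0.0)

t_grid = np.linspace(0, 20, 201)
t_k = np.arange(0, 16, 2.0)                  # fixed-interval release times
trials = [sum(rng.exponential(1.0) * h(t_grid - tk) for tk in t_k)
          for _ in range(2000)]              # random source strengths
mean_c = np.mean(trials, axis=0)
analytic = sum(1.0 * h(t_grid - tk) for tk in t_k)   # E[s_k] = 1
print(np.max(np.abs(mean_c - analytic)))     # small: MC matches the formula
```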

Relevance: 10.00%

Abstract:

We consider the problem of detecting statistically significant sequential patterns in multineuronal spike trains. These patterns are characterized by ordered sequences of spikes from different neurons with specific delays between spikes. We have previously proposed a data-mining scheme to efficiently discover such patterns, which occur often enough in the data. Here we propose a method to determine the statistical significance of such repeating patterns. The novelty of our approach is that we use a compound null hypothesis that not only includes models of independent neurons but also models where neurons have weak dependencies. The strength of interaction among the neurons is represented in terms of certain pair-wise conditional probabilities. We specify our null hypothesis by putting an upper bound on all such conditional probabilities. We construct a probabilistic model that captures the counting process and use this to derive a test of significance for rejecting such a compound null hypothesis. The structure of our null hypothesis also allows us to rank-order different significant patterns. We illustrate the effectiveness of our approach using spike trains generated with a simulator.
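One plausible reading of the counting argument, heavily simplified: if every pair-wise conditional probability is bounded by p_star under the compound null, the chance of a full ordered pattern of length L at any starting spike is at most p_star**(L-1), and the occurrence count is dominated by a binomial. All numbers below are invented:

```python
# Binomial-style significance bound for a repeating pattern count.
from scipy.stats import binom

p_star = 0.2      # null bound on conditional spike probabilities
L = 4             # number of neurons in the ordered pattern
n_windows = 500   # opportunities for the pattern to start
count = 12        # observed occurrences

p_pattern = p_star ** (L - 1)                  # per-window bound under null
p_value = binom.sf(count - 1, n_windows, p_pattern)  # P(X >= count)
print(p_value)    # reject the compound null if sufficiently small
```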

Relevance: 10.00%

Abstract:

The aim of this paper is to provide a Bayesian formulation of the so-called magnitude-based inference approach to quantifying and interpreting effects and, in a case study example, to provide accurate probabilistic statements that correspond to the intended magnitude-based inferences. The model is described in the context of a published small-scale athlete study which employed a magnitude-based inference approach to compare the effect of two altitude training regimens, live high-train low (LHTL) and intermittent hypoxic exposure (IHE), on running performance and blood measurements of elite triathletes. The posterior distributions, and corresponding point and interval estimates, for the parameters and associated effects and comparisons of interest were estimated using Markov chain Monte Carlo simulations. The Bayesian analysis was shown to provide more direct probabilistic comparisons of treatments and was able to identify small effects of interest. The approach avoided asymptotic assumptions and overcame issues such as multiple testing. Bayesian analysis of unscaled effects showed a probability of 0.96 that LHTL yields a substantially greater increase in hemoglobin mass than IHE, a 0.93 probability of a substantially greater improvement in running economy, and a greater than 0.96 probability that both IHE and LHTL yield a substantially greater improvement in maximum blood lactate concentration compared to a placebo. The conclusions are consistent with those obtained using the 'magnitude-based inference' approach that has been promoted in the field. The paper demonstrates that a fully Bayesian analysis is a simple and effective way of analysing small effects, providing a rich set of results that are straightforward to interpret in terms of probabilistic statements.
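A minimal sketch of the kind of probabilistic statement such an analysis yields: with an approximately normal posterior for a treatment difference, the probability that the effect exceeds a smallest-worthwhile-change threshold is a tail area. The posterior mean/SD and threshold below are invented; the paper used full MCMC rather than this shortcut:

```python
# Posterior probability of a 'substantially greater' effect.
from scipy.stats import norm

post_mean, post_sd = 2.1, 0.9   # posterior for a treatment difference
swc = 0.5                       # smallest worthwhile change
p_substantial = 1 - norm.cdf(swc, loc=post_mean, scale=post_sd)
print(p_substantial)            # ~0.96
```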

Relevance: 10.00%

Abstract:

This work focuses on the role of macroseismology in the assessment of seismicity and probabilistic seismic hazard in Northern Europe. The main type of data under consideration is the set of macroseismic observations available for a given earthquake. The macroseismic questionnaires used to collect earthquake observations from local residents since the late 1800s constitute a special part of the seismological heritage of the region. Information on the earthquakes felt on the coasts of the Gulf of Bothnia between 31 March and 2 April 1883 and on 28 July 1888 was retrieved from contemporary Finnish and Swedish newspapers, while the earthquake of 4 November 1898 GMT is an example of an early systematic macroseismic survey in the region. A data set of more than 1200 macroseismic questionnaires is available for the earthquake in Central Finland on 16 November 1931. Basic macroseismic investigations, including preparation of new intensity data point (IDP) maps, were conducted for these earthquakes. Previously disregarded usable observations were found in the press. The improved collection of IDPs for the 1888 earthquake shows that this event was a rare occurrence in the area: in contrast to earlier notions, it was felt on both sides of the Gulf of Bothnia. The data on the earthquake of 4 November 1898 GMT were augmented with historical background information discovered in various archives and libraries. This earthquake was of some concern to the authorities, because extra fire inspections were conducted in at least three towns (Tornio, Haparanda and Piteå) located in the centre of the area of perceptibility. This event posed the indirect hazard of fire, although its magnitude, around 4.6, was minor on the global scale. The distribution of slightly damaging intensities was larger than previously outlined. This may have resulted from amplification of the ground shaking in the soft soil of the coast and river valleys, where most of the population was found. The large data set of the 1931 earthquake provided an opportunity to apply statistical methods and assess methodologies that can be used when dealing with macroseismic intensity. It was evaluated using correspondence analysis. Different approaches, such as gridding, were tested to estimate the macroseismic field from the intensity values distributed irregularly in space. In general, the characteristics of intensity warrant careful consideration. A more pervasive perception of intensity as an ordinal quantity affected by uncertainties is advocated. A parametric earthquake catalogue comprising entries from both the macroseismic and instrumental eras was used for probabilistic seismic hazard assessment. The parametric-historic methodology was applied to estimate seismic hazard at a given site in Finland and to prepare a seismic hazard map for Northern Europe. The interpretation of these results is an important issue, because the recurrence times of damaging earthquakes may well exceed thousands of years in an intraplate setting such as Northern Europe. This application may therefore be seen as an example of short-term hazard assessment.
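The hazard numbers such an assessment produces rest on standard Poisson arithmetic: if intensity exceedances at a site occur at annual rate lam, the probability of at least one exceedance in t years is 1 - exp(-lam t). The rate below is illustrative:

```python
# Poisson probability of exceedance over an exposure time.
import math

lam = 1.0 / 2475.0   # annual exceedance rate (2475-year return period)
t = 50.0             # exposure time in years
print(1.0 - math.exp(-lam * t))   # ~0.02, the familiar '2% in 50 years'
```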

Relevance: 10.00%

Abstract:

Coptotermes Wasmann (Isoptera: Rhinotermitidae) is one of the most economically important subterranean termite genera, and some species are successful invaders. However, despite its important pest status, the taxonomic validity of many named Coptotermes species remains unclear. In this study, we reviewed all named species within the genus and investigated the evidence supporting the validity of each. Species were systematically scrutinized according to the region of their original description: Southeast Asia, India, China, Africa, the Neotropics, and Australia. We estimate that, of the 69 species currently named under accepted nomenclatural rules, only 21 taxa have solid evidence of validity, 44 names have uncertain status, and the remaining names should be synonymized or have been made unavailable. Species with high degrees of invasiveness may be known under additional junior synonyms due to independent parochial descriptions. Molecular data for the vast majority of species are scarce, and significant effort is needed to complete the taxonomic and phylogenetic revision of the genus. Because of the wide distribution of Coptotermes, we advocate an integrative taxonomic effort to establish the distribution of each putative species, provide specimens and corresponding molecular data, check original descriptions and type specimens (if available), and provide evidence for a more robust phylogenetic position of each species. This study embodies both the consensus and the contention among those studying Coptotermes and thus pinpoints the current uncertainty surrounding many species. The project is intended as a roadmap identifying those Coptotermes species names that need to be more thoroughly investigated, as an incentive to complete the necessary revision process.

Relevance: 10.00%

Abstract:

The purpose of this research is to draw up a clear construction of an anticipatory communicative decision-making process and a successful implementation of a Bayesian application that can be used as an anticipatory communicative decision-making support system. This study is a decision-oriented and constructive research project, and it includes examples of simulated situations. As a basis for further methodological discussion about different approaches to management research, a decision-oriented approach is used, which is based on mathematics and logic and is intended to develop problem-solving methods. The approach is theoretical and characteristic of normative management science research. The approach of this study is also constructive; an essential part of the constructive approach is to tie the problem to its solution with theoretical knowledge. Firstly, the basic definitions and behaviours of anticipatory management and managerial communication are provided. These descriptions include discussions of the research environment and the management processes formed, which define and explain the background to the further research. Secondly, managerial communication and anticipatory decision-making are examined in terms of preparation, problem solution and solution search, which are also related to risk management analysis. After that, a solution for the decision-making support application is formed using four different Bayesian methods: the Bayesian network, the influence diagram, the qualitative probabilistic network, and the time-critical dynamic network. The purpose of the discussion is not to compare different theories but to explain the theories that are being implemented. Finally, an application of Bayesian networks to the research problem is presented, and the usefulness of the prepared model in examining a problem is shown, together with the results of the research. The theoretical contribution includes definitions and a model of anticipatory decision-making; the main theoretical contribution of this study has been to develop a process for anticipatory decision-making that includes management with communication, problem solving, and the improvement of knowledge. The practical contribution includes a Bayesian decision support model based on Bayesian influence diagrams. The main contributions of this research are two developed processes: one for anticipatory decision-making, and the other to produce a model of a Bayesian network for anticipatory decision-making. In summary, this research contributes to decision-making support by being one of the few publicly available academic descriptions of an anticipatory decision support system, by representing a Bayesian model grounded in firm theoretical discussion, by publishing algorithms suitable for decision-making support, and by defining the idea of anticipatory decision-making for a parallel version. Finally, based on the results of the research, an analysis of anticipatory management for planned decision-making is presented, based on observation of the environment, analysis of weak signals, and alternatives for creative problem solving and communication.
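A toy two-node example shows the mechanics of the Bayesian updating such a network performs; the variables and probabilities are invented and do not come from the study's model:

```python
# Two-node Bayesian network (Signal -> Event), queried by Bayes' rule.
p_signal = 0.3                          # prior P(weak signal present)
p_event_given = {True: 0.7, False: 0.1} # P(adverse event | signal)

p_event = (p_signal * p_event_given[True]
           + (1 - p_signal) * p_event_given[False])
p_signal_given_event = p_signal * p_event_given[True] / p_event
print(p_signal_given_event)  # 0.75: observing the event raises belief
```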

Relevance: 10.00%

Abstract:

Two algorithms are outlined, each of which has interesting features for modeling the spatial variability of rock depth. In this paper, the reduced level of rock at Bangalore, India, is obtained from data for 652 boreholes in an area covering 220 sq. km. A support vector machine (SVM) and a relevance vector machine (RVM) have been utilized to predict the reduced level of rock in the subsurface of Bangalore and to study the spatial variability of the rock depth. The SVM, which is firmly grounded in statistical learning theory, performs regression using an epsilon-insensitive loss function. The RVM is a probabilistic model similar to the widespread SVM, but its training takes place in a Bayesian framework. Prediction results show the ability of these learning machines to build accurate models of the spatial variability of rock depth with strong predictive capabilities. The paper also highlights the capability of RVM over the SVM model.
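A hedged sketch of the epsilon-insensitive SVM regression the paper adopts, using scikit-learn's SVR on synthetic borehole data as a stand-in (RVM has no counterpart in scikit-learn and is omitted):

```python
# SVM regression of rock level from borehole coordinates.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(4)
X = rng.uniform(0, 10, size=(200, 2))          # borehole (x, y), km
y = 5 + 0.4 * X[:, 0] + np.sin(X[:, 1]) + rng.normal(0, 0.2, 200)

model = SVR(kernel="rbf", C=10.0, epsilon=0.1) # eps-insensitive loss
model.fit(X, y)
print(model.predict([[5.0, 5.0]]))             # predicted rock level
```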