121 results for PROBABILISTIC TELEPORTATION


Relevance:

10.00%

Publisher:

Abstract:

The development of techniques for scaling up classifiers so that they can be applied to problems with large datasets of training examples is one of the objectives of data mining. Recently, AdaBoost has become popular in the machine learning community thanks to its promising results across a variety of applications. However, training AdaBoost on large datasets is a major problem, especially when the dimensionality of the data is very high. This paper discusses the effect of high dimensionality on the training process of AdaBoost. Two preprocessing options for reducing dimensionality, namely principal component analysis and random projection, are briefly examined. Random projection subject to a probabilistic length-preserving transformation is explored further as a computationally light preprocessing step. The experimental results obtained demonstrate the effectiveness of the proposed training process for handling high-dimensional large datasets.
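
A minimal sketch of the preprocessing idea: project high-dimensional data through a scaled Gaussian random matrix, which preserves squared lengths in expectation in the Johnson-Lindenstrauss sense, and train AdaBoost in the reduced space. The data, target dimension and classifier settings below are hypothetical, not those of the paper.

```python
# Sketch: Gaussian random projection as a lightweight preprocessing step before
# AdaBoost on high-dimensional data. Shapes and target dimension are hypothetical.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(0)
n, d, k = 1000, 5000, 200          # samples, original dimension, reduced dimension

X = rng.normal(size=(n, d))
y = (X[:, :10].sum(axis=1) > 0).astype(int)   # synthetic labels

# Random projection matrix scaled so squared lengths are preserved in expectation
# (probabilistic length preservation, Johnson-Lindenstrauss style).
R = rng.normal(size=(d, k)) / np.sqrt(k)
X_low = X @ R

clf = AdaBoostClassifier(n_estimators=50).fit(X_low, y)
print("training accuracy after projection:", clf.score(X_low, y))
```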

Relevance:

10.00%

Publisher:

Abstract:

This study investigates the potential of a Relevance Vector Machine (RVM)-based approach to predict the ultimate capacity of laterally loaded piles in clay. The RVM is a sparse approximate Bayesian kernel method that can be seen as a probabilistic version of the support vector machine. It provides much sparser regressors without compromising performance, and its kernel bases give a small but worthwhile improvement in performance. The RVM model outperforms the two other models considered on the root-mean-square-error (RMSE) and mean-absolute-error (MAE) performance criteria, and it also estimates the prediction variance. The results presented in this paper clearly highlight that the RVM is a robust tool for predicting the ultimate capacity of laterally loaded piles in clay.
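
As a hedged illustration of a sparse Bayesian kernel regressor of this kind, the sketch below builds RBF kernel basis functions and fits them with scikit-learn's ARDRegression, whose automatic-relevance-determination prior prunes most basis functions, then reports RMSE, MAE and the predictive standard deviation. The synthetic data stand in for the pile capacity records; this is not the paper's RVM implementation or dataset.

```python
# Sketch: an RVM-style sparse Bayesian kernel regressor assembled from ARDRegression
# over RBF kernel features, evaluated with the RMSE/MAE criteria used in the paper.
import numpy as np
from sklearn.linear_model import ARDRegression
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.metrics import mean_squared_error, mean_absolute_error

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(120, 2))                  # hypothetical pile/soil features
y = np.sin(X[:, 0]) + 0.3 * X[:, 1] + rng.normal(0, 0.1, 120)

X_train, X_test = X[:80], X[80:]
y_train, y_test = y[:80], y[80:]

# Kernel basis: one RBF function centred on each training point.
Phi_train = rbf_kernel(X_train, X_train, gamma=0.5)
Phi_test = rbf_kernel(X_test, X_train, gamma=0.5)

model = ARDRegression().fit(Phi_train, y_train)        # ARD prior prunes most basis functions
y_pred, y_std = model.predict(Phi_test, return_std=True)

print("RMSE:", np.sqrt(mean_squared_error(y_test, y_pred)))
print("MAE :", mean_absolute_error(y_test, y_pred))
print("mean predictive std:", y_std.mean())
```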

Relevance:

10.00%

Publisher:

Abstract:

This paper is concerned with the calculation of the flame structure of one-dimensional laminar premixed flames using the technique of operator splitting. The technique utilizes an explicit method of solution, with a one-step Euler scheme for chemistry and a novel probabilistic scheme for diffusion. The relationship between the diffusion phenomenon and the Gauss-Markoff process is exploited to obtain an unconditionally stable explicit difference scheme for diffusion. The method has been applied to (a) a model problem, (b) hydrazine decomposition, (c) a hydrogen-oxygen system with 28 reactions under the constant Dρ² approximation, and (d) a hydrogen-oxygen system (28 reactions) with the trace diffusion approximation. Certain interesting aspects of the behaviour of the solution with non-unity Lewis number are brought out in the case of the hydrazine flame. The results of computation in the most complex case are shown to compare very favourably with those of Warnatz, both in terms of accuracy and computational time, thus showing that explicit methods can be effective in flame computations. Computations using the Gear-Hindmarsh method for chemistry and the present approach for diffusion have also been carried out, and a comparison of the two methods is presented.
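
The splitting idea can be sketched as follows, assuming a single toy progress variable with a made-up one-step source term: chemistry is advanced with an explicit Euler step, and diffusion is advanced by convolving the field with the Gaussian transition density of the underlying Gauss-Markoff (Brownian) process, which is stable for any time step. None of the kinetic or transport data of the paper's hydrazine or hydrogen-oxygen cases are used.

```python
# Sketch of the operator-splitting idea: explicit Euler for a model chemistry source
# term, then a diffusion step done by convolution with the Gaussian transition density
# of the underlying Gauss-Markoff process. Toy single-species "flame", toy parameters.
import numpy as np

nx, dx, dt, D = 200, 1e-3, 1e-4, 5e-3      # grid, step sizes, diffusivity (toy values)
x = np.arange(nx) * dx
T = np.where(x < 0.05, 1.0, 0.0)           # progress variable: burnt on the left

def chemistry(T, dt):
    # Explicit Euler for a simple one-step source term w(T) = A*T*(1 - T).
    A = 50.0
    return T + dt * A * T * (1.0 - T)

def diffusion(T, dt):
    # Exact (hence unconditionally stable) diffusion over dt: convolution with a
    # Gaussian of variance 2*D*dt, discretised on the grid (zero padding at the ends).
    sigma = np.sqrt(2.0 * D * dt)
    half = max(1, int(4 * sigma / dx))
    k = np.arange(-half, half + 1) * dx
    w = np.exp(-k**2 / (2 * sigma**2))
    w /= w.sum()
    return np.convolve(T, w, mode="same")

for _ in range(500):                        # split steps: chemistry, then diffusion
    T = diffusion(chemistry(T, dt), dt)

print("flame position ~", x[np.argmin(np.abs(T - 0.5))])
```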

Relevance:

10.00%

Publisher:

Abstract:

The problem of learning correct decision rules to minimize the probability of misclassification is a long-standing problem of supervised learning in pattern recognition. The problem of learning such optimal discriminant functions is considered for the class of problems where the statistical properties of the pattern classes are completely unknown. The problem is posed as a game with a common payoff played by a team of mutually cooperating learning automata. This essentially results in a probabilistic search through the space of classifiers. The approach is also inherently capable of learning discriminant functions that are nonlinear in their parameters. A learning algorithm is presented for the team and its convergence is established. It is proved that the team can obtain the optimal classifier to within an arbitrary approximation. Simulation results with a few examples are presented in which the team learns the optimal classifier.
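
A toy version of such a team, under assumptions chosen for illustration (a two-class Gaussian problem, a linear discriminant with three discretised parameters, and a linear reward-inaction update), might look like the sketch below. It conveys the common-payoff probabilistic search but is not the paper's algorithm or its convergence setup.

```python
# Sketch: a team of learning automata doing a probabilistic search over a discretised
# space of linear discriminants. Each automaton picks one parameter; all receive the
# common payoff (1 if a randomly drawn training sample is classified correctly).
import numpy as np

rng = np.random.default_rng(2)

# Two overlapping Gaussian classes in the plane (statistics treated as unknown).
X0 = rng.normal([-1, -1], 1.0, size=(300, 2))
X1 = rng.normal([+1, +1], 1.0, size=(300, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 300 + [1] * 300)

actions = np.linspace(-2, 2, 9)            # candidate values for each parameter
n_params = 3                               # w1, w2, bias
prob = np.full((n_params, len(actions)), 1.0 / len(actions))
lam = 0.01                                 # linear reward-inaction step size

def classify(w, x):
    return int(w[0] * x[0] + w[1] * x[1] + w[2] > 0)

for _ in range(20000):
    choice = [rng.choice(len(actions), p=prob[i]) for i in range(n_params)]
    w = actions[choice]
    i = rng.integers(len(y))
    reward = 1.0 if classify(w, X[i]) == y[i] else 0.0
    for p, a in zip(range(n_params), choice):      # reward-inaction update
        prob[p] += reward * lam * (np.eye(len(actions))[a] - prob[p])

w_hat = actions[prob.argmax(axis=1)]
acc = np.mean([classify(w_hat, x) == t for x, t in zip(X, y)])
print("learned discriminant:", w_hat, "training accuracy:", acc)
```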

Relevance:

10.00%

Publisher:

Abstract:

We discuss the effect of fluctuations of the random potential in directions transverse to the current flow in a modified Migdal-Kadanoff approach to the probabilistic scaling of conductance with size L in d-dimensional metallic systems. The conductance cumulants are finite and vary as L^(d−1−n) for n ≥ 2, i.e. conductance fluctuations are constant for d = 3. The mean conductance has a non-classical correction for d ≥ 2. The form of the higher cumulants is strongly influenced by the transverse potential fluctuations and may be compared with the results of perturbative diagrammatic approaches.

Relevance:

10.00%

Publisher:

Abstract:

Relaxation labeling processes are a class of mechanisms that solve the problem of assigning labels to objects in a manner that is consistent with respect to some domain-specific constraints. We reformulate this using the model of a team of learning automata interacting with an environment or a high-level critic that gives noisy responses as to the consistency of a tentative labeling selected by the automata. This results in an iterative linear algorithm that is itself probabilistic. Using an explicit definition of consistency we give a complete analysis of this probabilistic relaxation process using weak convergence results for stochastic algorithms. Our model can accommodate a range of uncertainties in the compatibility functions. We prove a local convergence result and show that the point of convergence depends both on the initial labeling and the constraints. The algorithm is implementable in a highly parallel fashion.
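
For orientation, the sketch below shows the classical deterministic relaxation labeling update that the automata formulation reinterprets: each object holds a probability vector over labels, and support from compatible neighbouring labels pulls the labeling towards consistency. The compatibilities and the small three-object, two-label problem are made up for illustration; the paper's stochastic automata algorithm and its convergence analysis are not reproduced here.

```python
# Sketch: classical relaxation labeling on a toy consistency problem.
import numpy as np

n_objects, n_labels = 3, 2
# r[i, a, j, b] = compatibility of label a at object i with label b at object j.
r = np.zeros((n_objects, n_labels, n_objects, n_labels))
for i in range(n_objects):
    for j in range(n_objects):
        if i != j:
            r[i, 0, j, 0] = r[i, 1, j, 1] = 1.0     # same labels support each other
            r[i, 0, j, 1] = r[i, 1, j, 0] = -1.0    # different labels conflict

p = np.array([[0.6, 0.4], [0.55, 0.45], [0.4, 0.6]])   # initial (noisy) labeling

for _ in range(50):
    q = np.einsum('iajb,jb->ia', r, p)                  # support for each (object, label)
    p = p * (1.0 + 0.5 * np.clip(q, -1, 1))             # nonnegative multiplicative update
    p /= p.sum(axis=1, keepdims=True)                   # renormalise per object

print(np.round(p, 3))    # converges to a consistent labeling; here all objects agree
```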

Relevance:

10.00%

Publisher:

Abstract:

Uncertainties associated with the structural model and measured vibration data may lead to unreliable damage detection. In this paper, we show that geometric and measurement uncertainty cause considerable problems in damage assessment, which can be alleviated by using a fuzzy logic-based approach for damage detection. Curvature damage factors (CDFs) of a tapered cantilever beam are used as damage indicators. Monte Carlo simulation (MCS) is used to study the changes in the damage indicator due to uncertainty in the geometric properties of the beam. Variations in these CDF measures due to randomness in the structural parameters, further contaminated with measurement noise, are used for developing and testing a fuzzy logic system (FLS). Results show that the method correctly identifies both single and multiple damages in the structure. For example, the FLS detects damage with an average accuracy of about 95 percent in a beam having geometric uncertainty of 1 percent COV and measurement noise of 10 percent in the single-damage scenario. For the multiple-damage case, the FLS identifies damages in the beam with an average accuracy of about 94 percent in the presence of the above-mentioned uncertainties. The paper brings together the disparate areas of probabilistic analysis and fuzzy logic to address uncertainty in structural damage detection.
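
The Monte Carlo part of such a study can be sketched with a simplified tapered cantilever: geometric parameters are sampled with 1 percent COV, measured curvatures carry 10 percent noise, and damage is modelled as a local stiffness loss. The beam dimensions, damage model and sensor layout below are invented for illustration and are not the paper's test case; the fuzzy logic system itself is not reproduced.

```python
# Sketch: Monte Carlo simulation of a curvature-based damage indicator for a tapered
# cantilever with a tip load, under geometric uncertainty and measurement noise.
import numpy as np

rng = np.random.default_rng(3)
L, P, E = 1.0, 100.0, 70e9                 # length (m), tip load (N), modulus (Pa)
x = np.linspace(0.0, 0.95 * L, 20)         # sensor locations
x_damage = 0.3 * L                         # damaged zone centre

def curvature(b, h0, damaged):
    h = h0 * (1.0 - 0.5 * x / L)           # linear taper in depth
    EI = E * b * h**3 / 12.0
    if damaged:
        EI = np.where(np.abs(x - x_damage) < 0.05 * L, 0.8 * EI, EI)
    return P * (L - x) / EI                # kappa = M / EI

samples = []
for _ in range(2000):
    b = rng.normal(0.05, 0.01 * 0.05)      # width, 1% COV
    h0 = rng.normal(0.08, 0.01 * 0.08)     # root depth, 1% COV
    noise = lambda k: k * (1.0 + rng.normal(0.0, 0.10, k.shape))   # 10% measurement noise
    cdf = np.abs(noise(curvature(b, h0, True)) - noise(curvature(b, h0, False)))
    samples.append(cdf)

samples = np.array(samples)
worst = x[np.argmax(samples.mean(axis=0))]
print("location with largest mean curvature change: x =", worst)
```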

Relevance:

10.00%

Publisher:

Abstract:

The behaviour of laterally loaded piles is considerably influenced by the uncertainties in soil properties. Hence probabilistic models for assessment of allowable lateral load are necessary. Cone penetration test (CPT) data are often used to determine soil strength parameters, whereby the allowable lateral load of the pile is computed. In the present study, the maximum lateral displacement and moment of the pile are obtained based on the coefficient of subgrade reaction approach, considering the nonlinear soil behaviour in undrained clay. The coefficient of subgrade reaction is related to the undrained shear strength of soil, which can be obtained from CPT data. The soil medium is modelled as a one-dimensional random field along the depth, and it is described by the standard deviation and scale of fluctuation of the undrained shear strength of soil. Inherent soil variability, measurement uncertainty and transformation uncertainty are taken into consideration. The statistics of maximum lateral deflection and moment are obtained using the first-order, second-moment technique. Hasofer-Lind reliability indices for component and system failure criteria, based on the allowable lateral displacement and moment capacity of the pile section, are evaluated. The geotechnical database from the Konaseema site in India is used as a case example. It is shown that the reliability-based design approach for pile foundations, considering the spatial variability of soil, permits a rational choice of allowable lateral loads.
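
The first-order second-moment step can be illustrated with a deliberately simplified response model in which the pile-head deflection varies as c·H/s_u, a hypothetical stand-in for the nonlinear subgrade-reaction solution; the means, COVs and the constant c below are invented. For this smooth limit state, the reliability index follows from a first-order Taylor expansion about the mean point.

```python
# Sketch: first-order second-moment (FOSM) evaluation of a serviceability limit state
# for a laterally loaded pile, with a hypothetical deflection model y_max = c*H/s_u.
import numpy as np

y_allow = 0.025                     # allowable pile-head deflection (m)
c = 2.0e-3                          # lumped pile/soil constant (hypothetical)

mu = np.array([60.0, 300.0])        # means of s_u (kPa) and lateral load H (kN); illustrative
cov = np.array([0.30, 0.15])        # coefficients of variation
sigma = mu * cov

def g(v):
    s_u, H = v
    return y_allow - c * H / s_u    # limit state: allowable minus predicted deflection

# First-order Taylor expansion about the mean point (independent variables assumed).
eps = 1e-6
grad = np.array([(g(mu + eps * np.eye(2)[i]) - g(mu)) / eps for i in range(2)])
mu_g = g(mu)
sigma_g = np.sqrt(np.sum((grad * sigma) ** 2))

beta = mu_g / sigma_g
print("FOSM reliability index beta =", round(beta, 2))
```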

Relevance:

10.00%

Publisher:

Abstract:

An iterative algorithm based on probabilistic estimation is described for obtaining the minimum-norm solution of a very large, consistent, linear system of equations Ax = g, where A is an (m × n) matrix with non-negative elements, and x and g are, respectively, (n × 1) and (m × 1) vectors with positive components.
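
For flavour, the sketch below uses randomized Kaczmarz, a well-known probabilistic row-action iteration that, when started from x = 0 on a consistent system, converges to the minimum-norm solution. It is offered as an illustration of this class of methods, not as the algorithm of the paper.

```python
# Sketch: randomized Kaczmarz iteration converging to the minimum-norm solution
# of a consistent, underdetermined system Ax = g with non-negative entries.
import numpy as np

rng = np.random.default_rng(4)
m, n = 30, 100                            # underdetermined, consistent system
A = rng.uniform(0.1, 1.0, size=(m, n))    # non-negative entries, as in the paper
x_true = rng.uniform(0.1, 1.0, size=n)
g = A @ x_true                            # positive right-hand side

x = np.zeros(n)                           # start in the row space of A
row_norms2 = np.sum(A**2, axis=1)
p = row_norms2 / row_norms2.sum()         # pick rows with probability ~ ||a_i||^2

for _ in range(20000):
    i = rng.choice(m, p=p)
    a = A[i]
    x += (g[i] - a @ x) / row_norms2[i] * a

x_min_norm = np.linalg.pinv(A) @ g        # reference minimum-norm solution
print("distance to min-norm solution:", np.linalg.norm(x - x_min_norm))
```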

Relevance:

10.00%

Publisher:

Abstract:

Technological forecasting, defined as quantified probabilistic prediction of the timings and degree of change in technological parameters, capabilities, desirability or needs at different times in the future, is applied to birth control technology (BCT) as a means of revealing the paths of most promising research by identifying the necessary points for breakthroughs. The present status of BCT in the areas of pills and the IUD, male contraceptives, immunological approaches, post-coital pills, abortion, sterilization, luteolytic agents, laser technologies, and control of the sex of the child is summarized and evaluated in turn. Fine mapping is done to identify the most potentially promising areas of BCT. These include efforts to make oral contraception easier, improvement of the design of the IUD, clinical evaluation of the male contraceptive danazol, the effecting of biochemical changes in the seminal fluid, and research on immunological approaches and on the effects of other new drugs such as prostaglandins. The areas that require immediate and large research inputs are oral contraception and the IUD. On the basis of population and technological forecasts, it is deduced that research efforts could most effectively aid countries like India through the immediate production of an oral contraceptive pill or IUD with long-lasting effects. Development of a pill for males or an immunization against pregnancy would also have a significant impact. However, the major impediment to birth control programs to date is attitudes, which must be changed through education.

Relevance:

10.00%

Publisher:

Abstract:

Downscaling to station-scale hydrologic variables from large-scale atmospheric variables simulated by general circulation models (GCMs) is usually necessary to assess the hydrologic impact of climate change. This work presents CRF-downscaling, a new probabilistic downscaling method that represents the daily precipitation sequence as a conditional random field (CRF). The conditional distribution of the precipitation sequence at a site, given the daily atmospheric (large-scale) variable sequence, is modeled as a linear chain CRF. CRFs do not make assumptions on independence of observations, which gives them flexibility in using high-dimensional feature vectors. Maximum likelihood parameter estimation for the model is performed using limited memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) optimization. Maximum a posteriori estimation is used to determine the most likely precipitation sequence for a given set of atmospheric input variables using the Viterbi algorithm. Direct classification of dry/wet days as well as precipitation amount is achieved within a single modeling framework. The model is used to project the future cumulative distribution function of precipitation. Uncertainty in precipitation prediction is addressed through a modified Viterbi algorithm that predicts the n most likely sequences. The model is applied for downscaling monsoon (June-September) daily precipitation at eight sites in the Mahanadi basin in Orissa, India, using the MIROC3.2 medium-resolution GCM. The predicted distributions at all sites show an increase in the number of wet days, and also an increase in wet day precipitation amounts. A comparison of current and future predicted probability density functions for daily precipitation shows a change in shape of the density function with decreasing probability of lower precipitation and increasing probability of higher precipitation.
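
The maximum a posteriori decoding step can be illustrated with a generic linear-chain Viterbi pass over dry/wet states; the unary and transition scores below are made-up numbers standing in for fitted CRF potentials derived from the atmospheric predictors.

```python
# Sketch: Viterbi decoding of the most likely dry/wet day sequence in a linear-chain
# model, the same maximum a posteriori step used in the downscaling framework.
import numpy as np

states = ["dry", "wet"]
# unary[t, s]: log-score of state s on day t given that day's atmospheric variables.
unary = np.log(np.array([
    [0.8, 0.2], [0.6, 0.4], [0.3, 0.7], [0.2, 0.8], [0.5, 0.5],
]))
# trans[s, s']: log-score of moving from state s to state s' (wet spells persist).
trans = np.log(np.array([[0.7, 0.3],
                         [0.4, 0.6]]))

T, S = unary.shape
delta = np.zeros((T, S))                  # best log-score ending in each state
back = np.zeros((T, S), dtype=int)        # backpointers
delta[0] = unary[0]

for t in range(1, T):
    scores = delta[t - 1][:, None] + trans + unary[t][None, :]
    back[t] = scores.argmax(axis=0)
    delta[t] = scores.max(axis=0)

path = [int(delta[-1].argmax())]
for t in range(T - 1, 0, -1):             # trace the best path backwards
    path.append(int(back[t, path[-1]]))
path.reverse()

print([states[s] for s in path])          # most likely dry/wet sequence
```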

Relevance:

10.00%

Publisher:

Abstract:

Particle filters find important applications in the problems of state and parameter estimation of dynamical systems of engineering interest. Since a typical filtering algorithm involves Monte Carlo simulations of the process equations, the sample variance of the estimator is inversely proportional to the number of particles. The sample variance may be reduced if one uses a Rao-Blackwell marginalization of states and performs analytical computations as much as possible. In this work, we propose a semi-analytical particle filter, requiring no Rao-Blackwell marginalization, for state and parameter estimation of nonlinear dynamical systems with additive Gaussian process/observation noises. Through local linearizations of the nonlinear drift fields in the process/observation equations via explicit Ito-Taylor expansions, the given nonlinear system is transformed into an ensemble of locally linearized systems. Using the most recent observation, conditionally Gaussian posterior density functions of the linearized systems are analytically obtained through the Kalman filter. This information is further exploited within the particle filter algorithm for obtaining samples from the optimal posterior density of the states. The potential of the method in state/parameter estimation is demonstrated through numerical illustrations for a few nonlinear oscillators. The proposed filter is found to yield estimates with reduced sample variance and improved accuracy vis-a-vis results from a form of sequential importance sampling filter.
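
For context, the sketch below implements a plain bootstrap (sequential importance resampling) particle filter tracking a Duffing-type oscillator from noisy displacement measurements, i.e. the kind of baseline filter such proposals are compared against. The oscillator parameters, noise levels and particle count are illustrative, and the semi-analytical proposal itself is not reproduced.

```python
# Sketch: bootstrap particle filter for the displacement of a Duffing oscillator.
import numpy as np

rng = np.random.default_rng(5)
dt, c, k, alpha = 0.01, 0.4, 4.0, 1.0      # time step and oscillator parameters
q, r = 0.05, 0.05                          # process / measurement noise intensities

def step(state, noise):
    # Euler-Maruyama step for x'' + c x' + k x + alpha x^3 = process noise.
    x, v = state[..., 0], state[..., 1]
    x_new = x + dt * v
    v_new = v + dt * (-c * v - k * x - alpha * x**3) + np.sqrt(dt) * noise
    return np.stack([x_new, v_new], axis=-1)

# Simulate the "true" system and noisy displacement measurements.
truth = np.zeros((500, 2))
truth[0] = [1.0, 0.0]
for t in range(1, 500):
    truth[t] = step(truth[t - 1], q * rng.normal())
obs = truth[:, 0] + r * rng.normal(size=500)

# Bootstrap particle filter.
N = 1000
particles = rng.normal([1.0, 0.0], 0.2, size=(N, 2))
est = np.zeros(500)
for t in range(500):
    particles = step(particles, q * rng.normal(size=N))
    w = np.exp(-0.5 * ((obs[t] - particles[:, 0]) / r) ** 2)
    w = w + 1e-300                                     # guard against total underflow
    w /= w.sum()
    est[t] = w @ particles[:, 0]                       # posterior mean estimate
    idx = rng.choice(N, size=N, p=w)                   # multinomial resampling
    particles = particles[idx]

print("RMS tracking error:", np.sqrt(np.mean((est - truth[:, 0]) ** 2)))
```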

Relevance:

10.00%

Publisher:

Abstract:

The Fuzzy Waste Load Allocation Model (FWLAM), developed in an earlier study, derives the optimal fractional levels for base flow conditions, considering the goals of the Pollution Control Agency (PCA) and the dischargers. The Modified Fuzzy Waste Load Allocation Model (MFWLAM), developed subsequently, is a stochastic model that considers the moments (mean, variance and skewness) of water quality indicators, incorporating uncertainty due to randomness of the input variables along with uncertainty due to imprecision. The risk of low water quality is reduced significantly by using this modified model, but the inclusion of new constraints leads to a low value of the acceptability level, A, interpreted as the maximized minimum satisfaction in the system. To improve this value, a new model, which is a combination of FWLAM and MFWLAM, is presented, allowing for some violations in the constraints of MFWLAM. This combined model is a multiobjective optimization model with two objectives: maximization of the acceptability level and minimization of the violation of constraints. Fuzzy multiobjective programming, goal programming and fuzzy goal programming are used to find the solutions. For the optimization model, Probabilistic Global Search Lausanne (PGSL) is used as a nonlinear optimization tool. The methodology is applied to a case study of the Tunga-Bhadra river system in south India. The model results in a compromise solution with a higher value of the acceptability level compared to MFWLAM, with a satisfactory value of risk. Thus the goal of risk minimization is achieved with a comparatively better value of the acceptability level.
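
The max-min structure behind the acceptability level can be illustrated with a tiny linear program: one fractional-removal decision variable, linear membership functions for the PCA and a single discharger, and the objective of maximizing the minimum satisfaction. The membership breakpoints are invented, and the real FWLAM/MFWLAM formulations couple many dischargers through water quality simulation.

```python
# Sketch: max-min (acceptability level) fuzzy optimization with one decision variable.
from scipy.optimize import linprog

# Decision vector z = [x, lam]: x = fractional removal level, lam = acceptability level.
# PCA satisfaction rises from 0 at x=0.3 to 1 at x=0.9; discharger satisfaction falls
# from 1 at x=0.2 to 0 at x=0.8. Maximise lam subject to lam <= each membership.
c = [0.0, -1.0]                                  # maximise lam
A_ub = [[-1.0 / 0.6, 1.0],                       # lam <= (x - 0.3) / 0.6
        [+1.0 / 0.6, 1.0]]                       # lam <= (0.8 - x) / 0.6
b_ub = [-0.3 / 0.6, 0.8 / 0.6]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1), (0, 1)])

x_opt, lam_opt = res.x
print(f"fractional removal = {x_opt:.2f}, acceptability level = {lam_opt:.2f}")
```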

Relevance:

10.00%

Publisher:

Abstract:

The problem of identification of the stiffness, mass and damping properties of linear structural systems, based on multiple sets of measurement data originating from static and dynamic tests, is considered. A strategy within the framework of Kalman filter based dynamic state estimation is proposed to tackle this problem. The static tests consist of measuring the response of the structure to slowly moving loads and to static loads whose magnitudes are varied incrementally; the dynamic tests involve measuring a few elements of the frequency response function (FRF) matrix. These measurements are taken to be contaminated by additive Gaussian noise. An artificial independent variable τ, which simultaneously parameterizes the point of application of the moving load, the magnitude of the incrementally varied static load and the driving frequency in the FRFs, is introduced. The state vector is taken to consist of the system parameters to be identified. The fact that these parameters are independent of the variable τ is taken to constitute the set of ‘process’ equations. The measurement equations are derived from the mechanics of the problem, and quantities such as displacements and/or strains are taken to be measured. A recursive algorithm that employs a linearization strategy based on Neumann’s expansion of the structural static and dynamic stiffness matrices, and which provides posterior estimates of the mean and covariance of the unknown system parameters, is developed. The satisfactory performance of the proposed approach is illustrated by considering the problems of identifying the dynamic properties of an inhomogeneous beam and the axial rigidities of the members of a truss structure.
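
The central trick, treating unknown parameters as τ-independent states of a Kalman filter, can be shown in a drastically reduced form: a single spring flexibility identified from displacements under incrementally varied static loads. The numbers are hypothetical, and this scalar linear case needs none of the Neumann-expansion linearisation required for full stiffness, mass and damping matrices.

```python
# Sketch: a constant structural parameter treated as the "state" of a Kalman filter.
# Here the flexibility f = 1/k of a spring is identified from noisy displacements
# under incrementally varied static loads (the load step plays the role of tau).
import numpy as np

rng = np.random.default_rng(6)
f_true = 1.0 / 250.0                    # true flexibility (m/kN), hypothetical
loads = np.linspace(10.0, 100.0, 30)    # incrementally varied static loads (kN)
meas = f_true * loads + rng.normal(0.0, 0.002, loads.size)   # noisy displacements

f_hat, P = 1.0 / 100.0, 1.0             # prior mean and variance of the flexibility
R = 0.002**2                            # measurement noise variance

for F, u in zip(loads, meas):
    # Process equation: the parameter is constant, so prediction leaves (f_hat, P) as is.
    H = F                               # measurement model u = F * f + noise
    K = P * H / (H * P * H + R)         # Kalman gain
    f_hat += K * (u - H * f_hat)        # posterior mean update
    P *= (1.0 - K * H)                  # posterior variance update

print("identified stiffness k =", 1.0 / f_hat, "kN/m (true value 250)")
```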

Relevance:

10.00%

Publisher:

Abstract:

The Kachchh region of Gujarat, India, bore the brunt of a disastrous earthquake of magnitude Mw = 7.6 that occurred on January 26, 2001. The major cause of failure of various structures, including earthen dams, was noted to be the presence of liquefiable alluvium in the foundation soil. Results of back-analysis of the failures of the Chang, Tappar, Kaswati and Rudramata earth dams using the pseudo-static limit equilibrium approach, presented in this paper, confirm that the presence of the liquefiable layer contributed to lower factors of safety, leading to the base type of failure that was also observed in the field. Following the earthquake, the earth dams have been rehabilitated by the concerned authority, and it is imperative that the reconstructed sections of the earth dams be reanalyzed. It is also increasingly realized that, in view of the large-scale investment made, risk assessment of dams through probabilistic analysis is necessary. In this study, it is demonstrated that the probabilistic approach, when used in conjunction with the deterministic approach, helps in providing a rational solution for quantifying the safety of the dam and in estimating the risk associated with the dam construction.
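
One way to pair the two viewpoints is sketched below: a deterministic pseudo-static factor of safety for a sliding block is evaluated at mean values, and the same expression is then sampled by Monte Carlo to estimate a probability of failure. The geometry, strength statistics and seismic coefficient range are invented for illustration and are not the back-analysed dam sections.

```python
# Sketch: deterministic pseudo-static factor of safety plus a Monte Carlo estimate
# of the probability of failure for a simple sliding block on liquefiable-type soil.
import numpy as np

rng = np.random.default_rng(7)
W, alpha, slip_len = 5000.0, np.radians(20.0), 40.0   # block weight (kN/m), slope, slip length (m)

def factor_of_safety(c_u, k_h):
    resisting = c_u * slip_len                         # undrained resistance along the slip surface
    driving = W * (np.sin(alpha) + k_h * np.cos(alpha))
    return resisting / driving

# Deterministic check at mean values.
print("deterministic FS:", round(factor_of_safety(90.0, 0.12), 2))

# Monte Carlo: undrained strength is lognormal (mean 90 kPa, COV 0.3),
# the seismic coefficient is uniform over a plausible range.
n = 200_000
sigma_ln = np.sqrt(np.log(1 + 0.3**2))
c_u = rng.lognormal(np.log(90.0) - 0.5 * sigma_ln**2, sigma_ln, n)
k_h = rng.uniform(0.08, 0.16, n)
fs = factor_of_safety(c_u, k_h)
print("probability of failure P(FS < 1):", np.mean(fs < 1.0))
```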