976 results for Sequential machine theory.
Abstract:
Statistical learning algorithms provide a viable framework for geotechnical engineering modeling. This paper describes two statistical learning algorithms applied to site characterization modeling based on standard penetration test (SPT) data. More than 2700 field SPT values (N) have been collected from 766 boreholes spread over an area of 220 sq. km in Bangalore. To obtain the corrected value N_c, the N values have been corrected for different parameters such as overburden stress, size of borehole, type of sampler, length of connecting rod, etc. In the three-dimensional site characterization model, the function N_c = N_c(X, Y, Z), where X, Y and Z are the coordinates of a point corresponding to an N_c value, is to be approximated so that the N_c value at any half-space point in Bangalore can be determined. The first algorithm uses the least-squares support vector machine (LSSVM), which is related to a ridge regression type of support vector machine. The second algorithm uses the relevance vector machine (RVM), which combines the strengths of kernel-based methods and Bayesian theory to establish the relationships between a set of input vectors and a desired output. The paper also presents a comparative study between the developed LSSVM and RVM models for site characterization. Copyright (C) 2009 John Wiley & Sons, Ltd.
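A minimal sketch of the kind of regression this abstract describes, assuming synthetic borehole data and using scikit-learn's KernelRidge as a stand-in for the LSSVM (LSSVM reduces to kernel ridge regression with an RBF kernel); none of the numbers come from the paper.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

# Synthetic placeholder data: (X, Y, Z) coordinates and corrected SPT values N_c.
rng = np.random.default_rng(0)
XYZ = rng.uniform(0.0, 1.0, size=(200, 3))
Nc = 20 + 10 * XYZ[:, 2] + rng.normal(0, 1, 200)

# Kernel ridge regression with an RBF kernel, standing in for the LSSVM model.
model = KernelRidge(kernel="rbf", alpha=1.0, gamma=5.0)
model.fit(XYZ, Nc)

# Predict N_c at an arbitrary half-space point (X, Y, Z).
print(model.predict([[0.5, 0.5, 0.3]]))
```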
Abstract:
The operation of a stand-alone, as opposed to grid-connected, generation system using a slip-ring induction machine as the electrical generator is considered. In contrast to an alternator, a slip-ring induction machine can run at variable speed and still deliver constant-frequency power to loads. This feature enables optimization of the system when the prime mover is inherently variable speed in nature, e.g. wind turbines, as well as in diesel-driven systems, where there is scope for economizing on fuel consumption. Experimental results from a system driven by a 44 bhp diesel engine are presented. Operation at sub-synchronous as well as super-synchronous speeds is examined. The measurements facilitate the understanding of the system as well as its design.
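A short numeric illustration of why such a machine can hold the output frequency constant while the shaft speed varies: the rotor winding is excited at the slip frequency f_rotor = f_stator - p*n/60. This is the standard slip relation, not a result taken from the paper, and the pole-pair count and speeds below are illustrative only.

```python
def rotor_injection_frequency(f_stator_hz, pole_pairs, speed_rpm):
    """Slip frequency the rotor-side source must supply; a negative value means
    super-synchronous operation (rotor excitation phase sequence reversed)."""
    return f_stator_hz - pole_pairs * speed_rpm / 60.0

# Sub-synchronous, synchronous, and super-synchronous speeds for a 4-pole, 50 Hz system.
for n in (1350, 1500, 1650):
    print(n, "rpm ->", rotor_injection_frequency(50.0, 2, n), "Hz")
```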
Abstract:
This study describes two machine learning techniques applied to predict the liquefaction susceptibility of soil based on standard penetration test (SPT) data from the 1999 Chi-Chi, Taiwan earthquake. The first technique uses an Artificial Neural Network (ANN) based on multi-layer perceptrons (MLP) trained with the Levenberg-Marquardt backpropagation algorithm. The second technique uses the Support Vector Machine (SVM), a classification method firmly grounded in statistical learning theory. ANN and SVM models have been developed to predict liquefaction susceptibility using the corrected SPT value (N_1)_60 and the cyclic stress ratio (CSR). Further, an attempt has been made to simplify the models, requiring only two parameters, (N_1)_60 and the peak ground acceleration (a_max/g), for the prediction of liquefaction susceptibility. The developed ANN and SVM models have also been applied to different case histories available globally. The paper also highlights the capability of the SVM over the ANN models.
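A minimal sketch of the two classifiers on the simplified two-parameter form of the problem, assuming synthetic (N_1)_60 and a_max/g values and labels; scikit-learn's MLPClassifier does not offer Levenberg-Marquardt training, so the 'lbfgs' solver is used here as a stand-in.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Placeholder features: column 0 is (N_1)_60, column 1 is a_max/g.
rng = np.random.default_rng(1)
X = np.column_stack([rng.uniform(5, 40, 300), rng.uniform(0.05, 0.6, 300)])
y = (X[:, 1] > 0.01 * X[:, 0]).astype(int)   # fake liquefied / not-liquefied labels

ann = MLPClassifier(hidden_layer_sizes=(8,), solver="lbfgs", max_iter=2000).fit(X, y)
svm = SVC(kernel="rbf", C=10.0).fit(X, y)

# Predict susceptibility for a site with (N_1)_60 = 15 and a_max/g = 0.3.
print(ann.predict([[15, 0.3]]), svm.predict([[15, 0.3]]))
```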
Abstract:
In this paper, the reduced level of rock at Bangalore, India is obtained from data of 652 boreholes in an area covering 220 sq. km. To predict the reduced level of rock in the subsurface of Bangalore and to study the spatial variability of the rock depth, ordinary kriging and Support Vector Machine (SVM) models have been developed. In ordinary kriging, the knowledge of the semivariogram of the reduced level of rock from the 652 points in Bangalore is used to predict the reduced level of rock at any point in the subsurface of Bangalore where field measurements are not available. A cross-validation (Q1 and Q2) analysis is also done for the developed ordinary kriging model. The SVM, a novel type of learning machine based on statistical learning theory, uses a regression technique with an ε-insensitive loss function and has been used to predict the reduced level of rock from a large set of data. A comparison between the ordinary kriging and SVM models demonstrates that the SVM is superior to ordinary kriging in predicting rock depth.
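A minimal sketch of the two spatial predictors named in the abstract, assuming synthetic borehole coordinates and rock levels: ε-insensitive SVM regression via scikit-learn's SVR, and Gaussian process regression as a rough stand-in for ordinary kriging (the two are closely related but not identical).

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Placeholder data: plan coordinates (km) and reduced level of rock (m).
rng = np.random.default_rng(2)
XY = rng.uniform(0, 10, size=(150, 2))
level = 900 - 2 * XY[:, 0] + rng.normal(0, 0.5, 150)

svr = SVR(kernel="rbf", C=100.0, epsilon=0.2).fit(XY, level)
gpr = GaussianProcessRegressor(kernel=RBF(2.0) + WhiteKernel(0.25)).fit(XY, level)

# Predict the reduced level of rock at a location with no borehole.
print(svr.predict([[5.0, 5.0]]), gpr.predict([[5.0, 5.0]]))
```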
Abstract:
Optimal maintenance policies for a machine whose performance degrades with age and which is subject to failure are derived using optimal control theory. The optimal policies are shown to be, normally, of a bang-coast nature, except in the case where the probability of machine failure is a function of maintenance. It is also shown, in the deterministic case, that a higher depreciation rate tends to reverse this policy to coast-bang. When the probability of failure is a function of maintenance, considerable computational effort is needed to obtain an optimal policy, and the resulting policy is not easily implementable. For this case also, an optimal policy in the class of bang-coast policies is derived using a semi-Markov decision model. A simple procedure for modifying the probability of machine failure with maintenance is employed. The results obtained extend and unify recent results for this problem along both theoretical and practical lines. Numerical examples are presented to illustrate the results obtained.
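A toy finite-horizon dynamic program illustrating the kind of bang-coast choice the abstract refers to: in each period the controller either applies full maintenance ("bang"), which restores performance at a cost, or none ("coast"), letting performance decay. The cost model and constants below are invented for illustration and are not the paper's formulation.

```python
from functools import lru_cache

T, MAINT_COST, DECAY = 12, 2.0, 0.8   # horizon, maintenance cost, per-period decay factor

@lru_cache(maxsize=None)
def best(t, perf):
    """Return (total profit, decision sequence) from period t with performance `perf`."""
    if t == T:
        return 0.0, ()
    coast_v, coast_seq = best(t + 1, round(perf * DECAY, 4))   # let performance decay
    bang_v, bang_seq = best(t + 1, 1.0)                        # restore performance fully
    coast_total = perf + coast_v                               # revenue proportional to performance
    bang_total = perf - MAINT_COST + bang_v
    if bang_total > coast_total:
        return bang_total, ("bang",) + bang_seq
    return coast_total, ("coast",) + coast_seq

profit, policy = best(0, 1.0)
print(profit, policy)
```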
Abstract:
The paper proposes a study of symmetrical and related components, based on the theory of linear vector spaces. Using the concept of equivalence, the transformation matrices of Clarke, Kimbark, Concordia, Boyajian and Koga are shown to be column-equivalent to Fortescue's symmetrical-component transformation matrix. With a constraint on power, criteria are presented for the choice of bases for the voltage and current vector spaces. In particular, it is shown that, for power invariance, either the same orthonormal (self-reciprocal) basis must be chosen for both the voltage and current vector spaces, or the basis of one must be chosen to be reciprocal to that of the other. The original α, β, 0 components of Clarke are modified to achieve power invariance. For machine analysis, it is shown that invariant transformations lead to reciprocal mutual inductances between the equivalent circuits. The relative merits of the various components are discussed.
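A short numerical check of the power-invariance point, using the textbook forms of the matrices rather than anything taken from the paper: the power-invariant Clarke matrix is real orthonormal, Fortescue's symmetrical-component matrix becomes unitary when scaled by 1/sqrt(3), and under either transformation the complex power v·conj(i) of an arbitrary phase-domain pair is preserved.

```python
import numpy as np

a = np.exp(2j * np.pi / 3)

# Power-invariant (orthonormal) Clarke transformation.
clarke = np.sqrt(2 / 3) * np.array([[1.0, -0.5, -0.5],
                                    [0.0, np.sqrt(3) / 2, -np.sqrt(3) / 2],
                                    [1 / np.sqrt(2), 1 / np.sqrt(2), 1 / np.sqrt(2)]])

# Fortescue's symmetrical-component matrix, scaled to be unitary.
fortescue = (1 / np.sqrt(3)) * np.array([[1, 1, 1],
                                         [1, a, a**2],
                                         [1, a**2, a]])

print(np.allclose(clarke @ clarke.T, np.eye(3)))               # orthonormal
print(np.allclose(fortescue @ fortescue.conj().T, np.eye(3)))  # unitary

v = np.array([1.0 + 0.2j, -0.4 + 0.1j, 0.3 - 0.5j])   # arbitrary phase voltages
i = np.array([0.5 - 0.1j, 0.2 + 0.3j, -0.6 + 0.4j])   # arbitrary phase currents
print(np.isclose(v @ i.conj(), (clarke @ v) @ (clarke @ i).conj()))
print(np.isclose(v @ i.conj(), (fortescue @ v) @ (fortescue @ i).conj()))
```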
Abstract:
The design of machine foundations is done on the basis of two principal criteria, viz., the vibration amplitude should be within permissible limits and the natural frequency of the machine-foundation-soil system should be away from the operating frequency (i.e. avoidance of the resonance condition). In this paper the nondimensional amplitude factor (M_m or M_rm) and the nondimensional frequency factor (a_om) at resonance are related using elastic half-space theory, and this relation is used as a new approach to a simplified design procedure for machine foundations for all the modes of vibration, viz. vertical, horizontal, rocking and torsional, for the rigid base pressure distribution and weighted average displacement conditions. The analysis shows that one need not know the value of Poisson's ratio for the rotating mass system for any of the modes of vibration.
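A minimal sketch of the two design checks named in the abstract (amplitude within a permissible limit, operating frequency away from resonance), assuming the standard single-degree-of-freedom mass-spring-dashpot analog of the machine-foundation-soil system rather than the paper's half-space relations; all numerical values are illustrative placeholders.

```python
import math

def vertical_vibration_checks(k, m, c, m_e, e, f_op, amp_limit, margin=0.3):
    """k: soil spring stiffness (N/m), m: machine + foundation mass (kg),
    c: damping (N*s/m), m_e * e: rotating-mass unbalance (kg*m),
    f_op: operating frequency (Hz), amp_limit: permissible amplitude (m)."""
    w_n = math.sqrt(k / m)                      # natural circular frequency (rad/s)
    zeta = c / (2 * math.sqrt(k * m))           # damping ratio
    r = 2 * math.pi * f_op / w_n                # frequency ratio (operating / natural)
    amp = (m_e * e / m) * r**2 / math.sqrt((1 - r**2) ** 2 + (2 * zeta * r) ** 2)
    return {"natural_freq_hz": w_n / (2 * math.pi),
            "frequency_ratio_ok": abs(r - 1.0) > margin,
            "amplitude_m": amp,
            "amplitude_ok": amp < amp_limit}

print(vertical_vibration_checks(k=2.0e8, m=5.0e4, c=1.0e6, m_e=50.0, e=0.002,
                                f_op=12.5, amp_limit=2.0e-4))
```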
Abstract:
We consider a visual search problem studied by Sripati and Olson where the objective is to identify an oddball image embedded among multiple distractor images as quickly as possible. We model this visual search task as an active sequential hypothesis testing problem (ASHT problem). Chernoff in 1959 proposed a policy whose expected delay to decision is asymptotically optimal, the asymptotics being under vanishing error probabilities. We first prove a stronger property on the moments of the delay until a decision, under the same asymptotics. Applying the result to the visual search problem, we then propose a "neuronal metric" on the measured neuronal responses that captures the discriminability between images. From an empirical study we obtain a remarkable correlation (r = 0.90) between the proposed neuronal metric and the speed of discrimination between the images. Although this correlation is lower than with the L1 metric used by Sripati and Olson, this metric has the advantage of being firmly grounded in formal decision theory.
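A simplified simulation of the oddball search as a sequential multi-hypothesis test, assuming Gaussian responses and round-robin sampling of the locations rather than Chernoff's adaptive action selection; all constants are illustrative and none of this reproduces the paper's neuronal data.

```python
import math
import random

K, MU0, MU1, SIGMA, THRESH = 6, 0.0, 1.0, 1.0, 8.0   # locations, means, noise, stopping gap

def log_pdf(x, mu):
    return -0.5 * ((x - mu) / SIGMA) ** 2 - math.log(SIGMA * math.sqrt(2 * math.pi))

def search(oddball, seed=0):
    rng = random.Random(seed)
    loglik = [0.0] * K                    # log-likelihood of "location h is the oddball"
    t = 0
    while True:
        k = t % K                         # round-robin sampling of locations
        x = rng.gauss(MU1 if k == oddball else MU0, SIGMA)
        for h in range(K):                # each hypothesis predicts a mean for this sample
            loglik[h] += log_pdf(x, MU1 if k == h else MU0)
        t += 1
        order = sorted(range(K), key=lambda h: loglik[h], reverse=True)
        if loglik[order[0]] - loglik[order[1]] > THRESH:
            return order[0], t            # declared oddball location and decision delay

print(search(oddball=3))
```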
Abstract:
This paper considers sequential hypothesis testing in a decentralized framework. We start with two simple decentralized sequential hypothesis testing algorithms, one of which is later proved to be asymptotically Bayes optimal. We also consider composite versions of decentralized sequential hypothesis testing. A novel nonparametric version of decentralized sequential hypothesis testing using universal source coding theory is developed. Finally, we design a simple decentralized multihypothesis sequential detection algorithm.
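A minimal sketch of one simple decentralized arrangement consistent with this setting, though not necessarily either of the paper's two algorithms: each sensor runs a local Wald SPRT on its own Gaussian observations and reports its one-bit decision, and the fusion center declares once a majority of sensors agree. All distributions and thresholds are assumed for illustration.

```python
import random

SENSORS, MU0, MU1, SIGMA = 5, 0.0, 0.5, 1.0
A, B = 4.0, -4.0                              # local SPRT thresholds (log scale)

def llr(x):
    """Log-likelihood ratio of one sample, H1: N(MU1, SIGMA^2) vs H0: N(MU0, SIGMA^2)."""
    return ((MU1 - MU0) * x - 0.5 * (MU1**2 - MU0**2)) / SIGMA**2

def decentralized_test(true_mu, seed=0):
    rng = random.Random(seed)
    sums = [0.0] * SENSORS
    decided = {}                              # sensor index -> local decision (0 or 1)
    t = 0
    while True:
        t += 1
        for s in range(SENSORS):
            if s in decided:
                continue
            sums[s] += llr(rng.gauss(true_mu, SIGMA))
            if sums[s] >= A:
                decided[s] = 1
            elif sums[s] <= B:
                decided[s] = 0
        for h in (0, 1):                      # fusion rule: simple majority of local decisions
            if sum(1 for d in decided.values() if d == h) > SENSORS // 2:
                return h, t

print(decentralized_test(true_mu=MU1))        # (decision, number of time steps)
```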
Abstract:
We consider nonparametric or universal sequential hypothesis testing when the distribution under the null hypothesis is fully known but the alternative hypothesis corresponds to some other unknown distribution. These algorithms are primarily motivated by spectrum sensing in cognitive radios and intruder detection in wireless sensor networks. We use easily implementable universal lossless source codes to propose simple algorithms for such a setup. The algorithms are first proposed for discrete alphabets, and their performance and asymptotic properties are studied theoretically. They are later extended to continuous alphabets. Their performance with two well-known universal source codes, the Lempel-Ziv code and the KT estimator with an arithmetic encoder, is compared. These algorithms are also compared with tests using various other nonparametric estimators. Finally, a decentralized version utilizing spatial diversity is also proposed and analysed.
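A minimal sketch of a universal-code based test of this flavour for a binary alphabet: under H0 the symbols are i.i.d. Bernoulli(p0) with p0 known, so the ideal H0 codelength of the observed string is -log2 P0(x^n), while a universal compressor approaches the entropy of whatever source actually generated the data. Declaring H1 when the compressor beats the H0 codelength by a margin is a simplified, fixed-sample-size caricature of the sequential test; zlib's DEFLATE stands in for the Lempel-Ziv code, and all constants are illustrative.

```python
import math
import random
import zlib

def h0_codelength_bits(bits, p0):
    """Ideal codelength of the string under the known null distribution."""
    ones = sum(bits)
    return -(ones * math.log2(p0) + (len(bits) - ones) * math.log2(1 - p0))

def universal_codelength_bits(bits):
    """Length assigned by a universal compressor (zlib as an LZ stand-in)."""
    return 8 * len(zlib.compress(bytes(bits), 9))

def decide(bits, p0, margin_bits=200):
    return "H1" if universal_codelength_bits(bits) + margin_bits < h0_codelength_bits(bits, p0) else "H0"

rng = random.Random(0)
null_data = [1 if rng.random() < 0.5 else 0 for _ in range(20000)]    # matches H0 (p0 = 0.5)
alt_data = [1 if rng.random() < 0.95 else 0 for _ in range(20000)]    # skewed source, unknown to the test
print(decide(null_data, 0.5), decide(alt_data, 0.5))
```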
Abstract:
We show how machine learning techniques based on Bayesian inference can be used to reach new levels of realism in the computer simulation of molecular materials, focusing here on water. We train our machine-learning algorithm using accurate, correlated quantum chemistry, and predict energies and forces in molecular aggregates ranging from clusters to solid and liquid phases. The widely used electronic-structure methods based on density-functional theory (DFT) give poor accuracy for molecular materials like water, and we show how our techniques can be used to generate systematically improvable corrections to DFT. The resulting corrected DFT scheme gives remarkably accurate predictions for the relative energies of small water clusters and of different ice structures, and greatly improves the description of the structure and dynamics of liquid water.
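A minimal sketch of the Bayesian-regression idea, assuming scikit-learn's Gaussian process regressor as the generic Bayesian learner and entirely synthetic descriptors and energies standing in for the paper's water configurations: the model learns the correction from cheap (DFT-like) energies toward accurate (quantum-chemistry-like) energies, with a predictive uncertainty.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Placeholder structural descriptors and energies; not real water data.
rng = np.random.default_rng(3)
descriptor = rng.uniform(0, 1, size=(120, 2))
e_dft = np.sin(3 * descriptor[:, 0]) + descriptor[:, 1]     # cheap-method energies (fake)
e_ref = e_dft + 0.1 * np.cos(5 * descriptor[:, 0])          # accurate reference energies (fake)

# Learn the correction (reference minus DFT), not the total energy.
gp = GaussianProcessRegressor(kernel=RBF(0.3) + WhiteKernel(1e-4), normalize_y=True)
gp.fit(descriptor, e_ref - e_dft)

corr, sigma = gp.predict([[0.4, 0.7]], return_std=True)     # predictive mean and uncertainty
print(corr, sigma)
```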
Abstract:
Biomimetic pattern recognition (BPR), which is based on "cognition" instead of "classification", is much closer to the way human beings recognize patterns. The basis of BPR is the principle of homology-continuity (PHC), which states that the difference between two samples of the same class must change gradually. The aim of BPR is to find an optimal covering in the feature space, emphasizing the "similarity" among homologous group members rather than the "division" of traditional pattern recognition. Some applications of BPR are surveyed, in which the results of BPR are much better than those of the Support Vector Machine. A novel neuron model, the hyper-sausage neuron (HSN), is shown as a kind of covering unit in BPR. The mathematical description of the HSN is given and its two-dimensional discriminant boundary is shown. In two special cases, in which the samples are distributed along a line segment and a circle, both HSN networks and RBF networks are used for covering. The results show that HSN networks generalize better than RBF networks, especially for small sample sets, which is consonant with the results of the applications of BPR. A brief explanation of the advantages of HSN networks in covering generally distributed samples is also given.
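A minimal sketch of a hyper-sausage covering unit as described: the unit covers every point within radius r of the line segment joining two same-class samples, so a class is covered by a chain of such units. The radius, points and helper names below are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def dist_to_segment(x, a, b):
    """Euclidean distance from point x to the segment with endpoints a and b."""
    x, a, b = map(np.asarray, (x, a, b))
    ab = b - a
    t = np.clip(np.dot(x - a, ab) / np.dot(ab, ab), 0.0, 1.0)   # projection clamped to the segment
    return np.linalg.norm(x - (a + t * ab))

def hsn_covers(x, a, b, r):
    """True if x lies inside the 'sausage' of radius r around segment a-b."""
    return dist_to_segment(x, a, b) <= r

a, b = [0.0, 0.0], [4.0, 0.0]                 # two samples of the same class
print(hsn_covers([2.0, 0.5], a, b, r=1.0))    # inside the sausage -> True
print(hsn_covers([6.0, 0.0], a, b, r=1.0))    # beyond the end cap -> False
```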