34 results for "evidence-based approach"


Relevance: 100.00%

Abstract:

Many knowledge-based systems (KBS) transform situation information into an appropriate decision using a built-in knowledge base. As knowledge of real-world situations is often uncertain, the degree of truth of a proposition provides a measure of the uncertainty in the underlying knowledge. This uncertainty can be evaluated by collecting 'evidence' about the truth or falsehood of the proposition from multiple sources. In this paper we propose a simple framework for representing uncertainty using the notion of an evidence space.
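The idea of weighing evidence from multiple sources can be illustrated with a small sketch. Everything below is a hypothetical illustration, not the paper's formalism: each source votes for, against, or abstains on a proposition, and the proposition is mapped to a point summarizing support and coverage.

```python
# A hypothetical "evidence space" point (e, c) for a proposition:
# e is the fraction of committed sources supporting it, and c is the
# fraction of sources committing either way (the rest abstain).

def evidence_point(votes):
    """votes: list of 'for', 'against', or 'unknown' from independent sources."""
    committed = [v for v in votes if v != "unknown"]
    if not committed:
        return (0.5, 0.0)  # no evidence at all: maximal uncertainty
    support = sum(v == "for" for v in committed) / len(committed)
    coverage = len(committed) / len(votes)
    return (support, coverage)

print(evidence_point(["for", "for", "against", "unknown"]))
# -> (0.6666666666666666, 0.75)
```

A proposition with high support but low coverage would thus be distinguished from one with the same support backed by many committed sources.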

Relevance: 100.00%

Abstract:

The behaviour of laterally loaded piles is considerably influenced by the uncertainties in soil properties. Hence probabilistic models for assessment of allowable lateral load are necessary. Cone penetration test (CPT) data are often used to determine soil strength parameters, whereby the allowable lateral load of the pile is computed. In the present study, the maximum lateral displacement and moment of the pile are obtained based on the coefficient of subgrade reaction approach, considering the nonlinear soil behaviour in undrained clay. The coefficient of subgrade reaction is related to the undrained shear strength of soil, which can be obtained from CPT data. The soil medium is modelled as a one-dimensional random field along the depth, and it is described by the standard deviation and scale of fluctuation of the undrained shear strength of soil. Inherent soil variability, measurement uncertainty and transformation uncertainty are taken into consideration. The statistics of maximum lateral deflection and moment are obtained using the first-order, second-moment technique. Hasofer-Lind reliability indices for component and system failure criteria, based on the allowable lateral displacement and moment capacity of the pile section, are evaluated. The geotechnical database from the Konaseema site in India is used as a case example. It is shown that the reliability-based design approach for pile foundations, considering the spatial variability of soil, permits a rational choice of allowable lateral loads.
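The first-order second-moment step can be sketched for a single serviceability mode. This is a minimal sketch, assuming independent normal capacity and demand with g = allowable deflection minus computed deflection; all numbers are illustrative, not from the study.

```python
import math

# FOSM reliability index for a single mode, g = capacity - demand,
# with capacity and demand treated as independent normal variables.

def fosm_beta(mu_cap, sd_cap, mu_dem, sd_dem):
    """Reliability index beta = E[g] / sd[g]."""
    return (mu_cap - mu_dem) / math.sqrt(sd_cap ** 2 + sd_dem ** 2)

# Illustrative numbers (mm): allowable vs. computed lateral deflection.
beta = fosm_beta(mu_cap=25.0, sd_cap=2.5, mu_dem=15.0, sd_dem=4.0)
print(round(beta, 3))  # -> 2.12
```

For correlated non-normal variables the Hasofer-Lind index used in the study requires an iterative search for the design point rather than this closed form.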

Relevance: 100.00%

Abstract:

This paper presents a constraint Jacobian matrix-based approach to obtain the stiffness matrix of widely used deployable pantograph masts with scissor-like elements (SLEs). The stiffness matrix is obtained in symbolic form, and the results agree with those obtained by the force and displacement methods available in the literature. Additional advantages of this approach are that the mobility of a mast can be evaluated, redundant links and joints in the mast can be identified, and practical masts with revolute joints can be analysed. Simulations for a hexagonal mast and an assembly of four hexagonal masts are presented as illustrations.
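The core identity behind a Jacobian-based stiffness matrix can be shown in a few lines. This is a generic sketch (one free node restrained by two springs), not the paper's SLE formulation: if member elongations d relate to joint displacements q through a constraint Jacobian J (d = J q) and members have stiffnesses k_i, the assembled stiffness is K = J^T diag(k) J.

```python
import numpy as np

def stiffness_from_jacobian(J, k):
    """Assemble K = J^T diag(k) J from the constraint Jacobian J
    (rows = members) and member stiffnesses k."""
    return J.T @ np.diag(k) @ J

# Toy example: one free 2-D node held by two springs, one along x and
# one along the 45-degree direction (rows of J are direction cosines).
J = np.array([[1.0, 0.0],
              [np.cos(np.pi / 4), np.sin(np.pi / 4)]])
k = np.array([100.0, 200.0])  # spring stiffnesses, illustrative units
K = stiffness_from_jacobian(J, k)
print(K)  # symmetric 2x2 stiffness matrix
```

In the symbolic setting of the paper, a rank-deficient J would likewise expose mast mobility and redundant members.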

Relevance: 100.00%

Abstract:

The paper focuses on the reliability-based design optimization of gravity-wall bridge abutments subjected to active earth pressure conditions during earthquakes. An analytical study considering the effect of uncertainties in the seismic analysis of bridge abutments is presented. A planar failure surface is considered in conjunction with the pseudostatic limit equilibrium method for the calculation of the seismic active earth pressure. Analysis is conducted to evaluate the external stability of bridge abutments subjected to earthquake loads. Reliability analysis is used to estimate the probability of failure in three failure modes, viz. sliding of the wall on its base, overturning about its toe (or eccentricity failure of the resultant force), and bearing failure of the foundation soil below the base of the wall. The properties of the backfill and of the foundation soil below the base of the abutment are treated as random variables. In addition, the uncertainties associated with characteristics of earthquake ground motions, such as the horizontal seismic acceleration and the shear wave velocity propagating through the backfill soil, are considered. The optimum proportions of the abutment needed to maintain stability against the three failure modes are obtained by targeting various component and system reliability indices. The influence of various parameters on the seismic stability is also examined.
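The pseudostatic planar-wedge computation of seismic active earth pressure that such analyses build on is classically given by the Mononobe-Okabe expression; a sketch with illustrative parameter values (not the paper's):

```python
import math

# Mononobe-Okabe seismic active earth pressure coefficient.
# Valid while phi - theta - i >= 0 (otherwise the square root is undefined).

def mononobe_okabe_kae(phi, delta, beta, i, kh, kv=0.0):
    """All angles in radians. phi: soil friction angle, delta: wall friction,
    beta: wall batter from vertical, i: backfill slope, kh/kv: seismic coeffs."""
    theta = math.atan(kh / (1.0 - kv))
    num = math.cos(phi - theta - beta) ** 2
    root = math.sqrt(math.sin(phi + delta) * math.sin(phi - theta - i)
                     / (math.cos(delta + beta + theta) * math.cos(i - beta)))
    den = (math.cos(theta) * math.cos(beta) ** 2
           * math.cos(delta + beta + theta) * (1.0 + root) ** 2)
    return num / den

phi = math.radians(30.0)
static = mononobe_okabe_kae(phi, delta=phi / 2, beta=0.0, i=0.0, kh=0.0)
seismic = mononobe_okabe_kae(phi, delta=phi / 2, beta=0.0, i=0.0, kh=0.2)
print(round(static, 3), round(seismic, 3))  # K_AE grows with kh
```

In a reliability analysis, phi, kh, and the other inputs above become random variables rather than fixed values.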

Relevance: 100.00%

Abstract:

We propose a novel, language-neutral approach for searching online handwritten text using the Fréchet distance. Online handwritten data, which is available as a time series (x, y, t), is treated as a parameterized curve in two dimensions, and the problem of searching online handwritten text is posed as one of matching two curves in a two-dimensional Euclidean space. The Fréchet distance is a natural measure for matching curves. The main contribution of this paper is the formulation of a variant of the Fréchet distance that can be used to retrieve words even when only a prefix of the word is given as the query. Extensive experiments on the UNIPEN dataset, consisting of over 16,000 words written by 7 users, show that our method outperforms the state-of-the-art DTW method. Experiments were also conducted on a multilingual dataset, generated on a PDA, with encouraging results. Our approach can be used to implement useful features such as auto-completion of handwriting on PDAs.
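The curve-matching idea can be illustrated with the discrete Fréchet distance; the paper's prefix-matching variant is more involved, and this sketch shows only the core dynamic program over two point sequences.

```python
import math
from functools import lru_cache

# Discrete Frechet distance: the minimum, over all monotone couplings of the
# two point sequences, of the longest "leash" needed between coupled points.

def discrete_frechet(P, Q):
    @lru_cache(maxsize=None)
    def c(i, j):
        d = math.dist(P[i], Q[j])
        if i == 0 and j == 0:
            return d
        if i == 0:
            return max(c(0, j - 1), d)
        if j == 0:
            return max(c(i - 1, 0), d)
        return max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d)

    return c(len(P) - 1, len(Q) - 1)

P = [(0, 0), (1, 0), (2, 0)]
Q = [(0, 1), (1, 1), (2, 1)]
print(discrete_frechet(P, Q))  # -> 1.0
```

Because the coupling must advance monotonically along both curves, the measure respects stroke order, which is what makes it suitable for handwriting.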

Relevance: 100.00%

Abstract:

It is well established that a sequence template, along with a database, is a powerful tool for identifying the biological function of proteins. Here, we describe a method for predicting the catalytic nature of certain proteins among the several protein structures deposited in the Protein Data Bank (PDB). For the present study, we considered a catalytic triad template (Ser-His-Asp) found in serine proteases. We found that a geometrically optimized active-site template can be used as a highly selective tool for differentiating an active protein among several inactive proteins, based on their Ser-His-Asp interactions. For any protein to be proteolytic in nature, the bond angle between Ser O-gamma, Ser H-gamma and His N-epsilon2 in the catalytic triad needs to be between 115 degrees and 140 degrees. The hydrogen bond distance between Ser H-gamma and His N-epsilon2 is more flexible in nature, varying from 2.0 angstrom to 2.7 angstrom, while in the case of His H-delta1 and Asp O-delta1 it is from 1.6 angstrom to 2.0 angstrom. In terms of solvent accessibility, most of the active proteins lie in the range of 10-16 square angstroms, which enables easy accessibility to the substrate. These observations hold good for most catalytic triads and can be employed to predict the proteolytic nature of these catalytic triads. (C) 2010 Elsevier B.V. All rights reserved.
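The geometric windows reported above amount to a simple accept/reject filter over triad geometry. A sketch follows, with illustrative coordinates; in practice the atom positions would come from a PDB structure.

```python
import math

def angle_deg(a, b, c):
    """Angle at vertex b (degrees) for 3-D points a-b-c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

def looks_catalytic(ser_og, ser_hg, his_ne2, his_hd1, asp_od1):
    """Apply the angle and hydrogen-bond distance windows from the abstract."""
    ok_angle = 115.0 <= angle_deg(ser_og, ser_hg, his_ne2) <= 140.0
    ok_hb1 = 2.0 <= math.dist(ser_hg, his_ne2) <= 2.7
    ok_hb2 = 1.6 <= math.dist(his_hd1, asp_od1) <= 2.0
    return ok_angle and ok_hb1 and ok_hb2

# Illustrative geometry: 125-degree angle, 2.3 A and 1.8 A hydrogen bonds.
ser_hg = (0.0, 0.0, 0.0)
ser_og = (1.0, 0.0, 0.0)
his_ne2 = (2.3 * math.cos(math.radians(125.0)),
           2.3 * math.sin(math.radians(125.0)), 0.0)
his_hd1, asp_od1 = (5.0, 0.0, 0.0), (6.8, 0.0, 0.0)
print(looks_catalytic(ser_og, ser_hg, his_ne2, his_hd1, asp_od1))  # -> True
```

The solvent-accessibility window mentioned in the abstract would be a further filter on top of this purely geometric one.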

Relevance: 100.00%

Abstract:

This paper presents an algorithm for solid model reconstruction from 2D sectional views based on a volume-based approach. None of the existing work on automatic reconstruction from 2D orthographic views has addressed sectional views in detail. It is believed that the volume-based approach is better suited to handle different types of sectional views. The volume-based approach constructs the 3D solid by a Boolean combination of elementary solids. The elementary solids are formed by sweep operations on loops identified in the input views. The only adjustment to be made for the presence of sectional views is in the identification of the loops that form the elemental solids. In the algorithm, the conventions of engineering drawing for sectional views are used to identify the loops correctly. The algorithm is simple and intuitive in nature. Results have been obtained for full sections, offset sections and half sections. Future work will address other types of sectional views, such as removed, revolved and broken-out sections. (C) 2004 Elsevier Ltd. All rights reserved.

Relevance: 100.00%

Abstract:

In this paper, an analytical study considering the effect of uncertainties in the seismic analysis of geosynthetic-reinforced soil (GRS) walls is presented. Using the limit equilibrium method and assuming a sliding-wedge failure mechanism, analysis is conducted to evaluate the external stability of GRS walls subjected to earthquake loads. A target reliability-based approach is used to estimate the probability of failure in three modes of failure, viz. sliding, bearing, and eccentricity. The properties of the reinforced backfill, retained backfill, foundation soil, and geosynthetic reinforcement are treated as random variables. In addition, the uncertainties associated with the horizontal seismic acceleration and the surcharge load acting on the wall are considered. The optimum length of reinforcement needed to maintain stability against the three modes of failure is obtained by targeting various component and system reliability indices. The influence of various parameters on the seismic stability in the three failure modes is also examined. The results are compared with those given by the first-order second-moment method and Monte Carlo simulation. In an illustrative example, the external stability of two walls, the Gould and Valencia walls, subjected to the Northridge earthquake is reexamined.
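The Monte Carlo side of such a comparison can be sketched for a single external stability mode (sliding). The limit state, the distributions, and every number below are assumptions chosen for illustration, not the paper's values.

```python
import random

# Monte Carlo estimate of the probability of sliding failure for a
# reinforced soil wall: sample the random variables, evaluate the limit
# state g = resisting force - driving force, and count g < 0 outcomes.

def pf_sliding(n=200_000, seed=1):
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        tan_phi = rng.gauss(0.58, 0.06)  # base friction (mean ~ tan 30 deg)
        weight = rng.gauss(900.0, 45.0)  # wall + reinforced fill, kN/m
        thrust = rng.gauss(420.0, 60.0)  # seismic earth pressure resultant
        if weight * tan_phi < thrust:    # g < 0 -> sliding failure
            fails += 1
    return fails / n

print(pf_sliding())
```

The first-order second-moment method mentioned in the abstract would approximate the same probability from the means and variances alone, which is where the two approaches can diverge for nonlinear limit states.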


Relevance: 100.00%

Abstract:

In this paper we propose a general Linear Programming (LP) based formulation and solution methodology for obtaining optimal solutions to the load distribution problem in divisible load scheduling. We exploit the power of the versatile LP formulation to propose algorithms that yield exact solutions to several very general load distribution problems for which either no solutions or only heuristic solutions were previously available. We consider both star (single-level tree) networks and linear daisy-chain networks, with processors equipped with front-ends, which form the generic models for several important network topologies. We consider arbitrary processing node availability (release) times and general models for communication delay and computation time that account for constant overheads such as start-up times in communication and computation. The optimality of the LP-based algorithms is proved rigorously.
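For the simplest star-network case (all processors available at time zero, no start-up overheads), the optimal solution is known to equalize worker finish times; the sketch below solves those equalization equations directly rather than running an LP solver. Here z[i] is the per-unit communication time to worker i and w[i] its per-unit compute time, following the standard divisible-load timing model, not this paper's more general formulation.

```python
# Master sends load fraction alpha_i to worker i in sequence; worker i then
# computes for alpha_i * w[i]. Equal finish times give the recurrence
# alpha_{i+1} = alpha_i * w[i] / (z[i+1] + w[i+1]), then normalize.

def optimal_fractions(z, w):
    ratios = [1.0]
    for i in range(1, len(w)):
        ratios.append(ratios[-1] * w[i - 1] / (z[i] + w[i]))
    total = sum(ratios)
    return [r / total for r in ratios]

z, w = [0.2, 0.2, 0.2], [1.0, 1.0, 1.0]
alphas = optimal_fractions(z, w)
print([round(a, 4) for a in alphas])

# Verify the optimality condition: all workers finish at the same time.
finish, clock = [], 0.0
for a, zi, wi in zip(alphas, z, w):
    clock += a * zi                  # communication completes
    finish.append(clock + a * wi)    # then computation completes
print([round(t, 6) for t in finish])  # all entries identical
```

The LP formulation in the paper generalizes exactly the parts this sketch fixes: release times, affine overheads, and the daisy-chain topology.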

Relevance: 100.00%

Abstract:

We propose a novel algorithm for placement of standard cells in VLSI circuits based on an analogy of this problem with neural networks. By employing some of the organising principles of these nets, we have attempted to improve the behaviour of the bipartitioning method as proposed by Kernighan and Lin. Our algorithm yields better quality placements compared with the above method, and also makes the final placement independent of the initial partition.
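The Kernighan-Lin step that the placement algorithm builds on can be sketched through its gain computation: for each cell, D = external minus internal connection cost, and swapping a with b changes the cut by g = D(a) + D(b) - 2*w(a, b). A toy example (the netlist and weights are illustrative):

```python
# Weighted netlist as {(u, v): weight}; part maps each cell to block 0 or 1.

def cut_size(edges, part):
    return sum(w for (u, v), w in edges.items() if part[u] != part[v])

def best_swap(edges, part):
    """Return (gain, a, b) for the single best a<->b swap across the cut."""
    nodes = list(part)
    D = {n: 0 for n in nodes}
    for (u, v), w in edges.items():
        s = w if part[u] != part[v] else -w  # external +w, internal -w
        D[u] += s
        D[v] += s
    best = None
    for a in nodes:
        for b in nodes:
            if part[a] == 0 and part[b] == 1:
                g = (D[a] + D[b]
                     - 2 * edges.get((a, b), 0) - 2 * edges.get((b, a), 0))
                if best is None or g > best[0]:
                    best = (g, a, b)
    return best

edges = {("a", "b"): 1, ("a", "c"): 3, ("b", "d"): 3, ("c", "d"): 1}
part = {"a": 0, "b": 0, "c": 1, "d": 1}
print(cut_size(edges, part))         # -> 6
g, a, b = best_swap(edges, part)
part[a], part[b] = 1, 0
print(g, cut_size(edges, part))      # -> 4 2
```

Full Kernighan-Lin repeats this greedily with tentative swaps and keeps the best prefix of the swap sequence; the paper's contribution is to steer that process with neural-network-style organizing principles.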

Relevance: 100.00%

Abstract:

Pattern cognition is examined from a functional viewpoint. The need for knowledge in synthesizing such patterns is explained, and various aspects of knowledge-based pattern generation are highlighted. This approach to the generation of patterns is detailed with a concrete example.

Relevance: 100.00%

Abstract:

Our study concerns an important current problem, that of diffusion of information in social networks. This problem has received significant attention from the Internet research community in recent times, driven by many potential applications such as viral marketing and sales promotions. In this paper, we focus on the target set selection problem, which involves discovering a small subset of influential players in a given social network to perform a certain task of information diffusion. The target set selection problem manifests in two forms: 1) the top-k nodes problem and 2) the lambda-coverage problem. In the top-k nodes problem, we are required to find a set of k key nodes that maximizes the number of nodes influenced in the network. The lambda-coverage problem is concerned with finding a minimum-size set of key nodes that can influence a given percentage lambda of the nodes in the entire network. We propose a new way of solving these problems using the concept of the Shapley value, a well-known solution concept in cooperative game theory. Our approach leads to algorithms, which we call the ShaPley value-based Influential Nodes (SPIN) algorithms, for solving the top-k nodes problem and the lambda-coverage problem. We compare the performance of the proposed SPIN algorithms with well-known algorithms in the literature. Through extensive experimentation on four synthetically generated random graphs and six real-world data sets (Celegans, Jazz, NIPS coauthorship, Netscience, High-Energy Physics, and Political Books), we show that the proposed SPIN approach is more powerful and computationally efficient.

Note to Practitioners: In recent times, social networks have received a high level of attention due to their proven ability to improve the performance of web search, recommendations in collaborative filtering systems, the spread of a technology in the market through viral marketing techniques, etc. It is well known that the interpersonal relationships (ties or links) between individuals cause change or improvement in the social system, because the decisions made by individuals are influenced heavily by the behavior of their neighbors. An interesting and key problem in social networks is to discover the nodes that can influence other nodes in a strong and deep way. This problem is called the target set selection problem and has two variants: 1) the top-k nodes problem, where we are required to identify a set of k influential nodes that maximizes the number of nodes influenced in the network, and 2) the lambda-coverage problem, which involves finding a minimum-size set of influential nodes that can influence a given percentage lambda of the nodes in the entire network. There are many existing algorithms in the literature for solving these problems. In this paper, we propose a new algorithm based on a novel interpretation of information diffusion in a social network as a cooperative game. Using this analogy, we develop an algorithm based on the Shapley value of the underlying cooperative game. The proposed algorithm outperforms the existing algorithms in terms of generality, computational complexity, or both. Our results are validated through extensive experimentation on both synthetically generated and real-world data sets.
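The Shapley value idea behind SPIN can be sketched on a toy cooperative game. Here a coalition's value is simply the number of nodes it covers (itself plus its neighbors), a simplified stand-in for the paper's diffusion model, and the Shapley value is approximated by permutation sampling.

```python
import random

def covered(graph, coalition):
    """Coalition value: nodes in the coalition plus their neighbors."""
    out = set(coalition)
    for n in coalition:
        out.update(graph[n])
    return len(out)

def shapley(graph, samples=2000, seed=7):
    """Monte Carlo Shapley values: average marginal contribution of each
    node over random orderings of all nodes."""
    rng = random.Random(seed)
    nodes = list(graph)
    phi = {n: 0.0 for n in nodes}
    for _ in range(samples):
        rng.shuffle(nodes)
        seen, prev = set(), 0
        for n in nodes:
            seen.add(n)
            val = covered(graph, seen)
            phi[n] += val - prev
            prev = val
    return {n: v / samples for n, v in phi.items()}

# Star graph: the hub should receive the largest Shapley value.
graph = {"hub": ["a", "b", "c"], "a": ["hub"], "b": ["hub"], "c": ["hub"]}
sv = shapley(graph)
print(max(sv, key=sv.get))  # -> hub
```

Ranking nodes by these values and picking the top k (or enough to reach lambda coverage) mirrors the structure of the SPIN algorithms, with the paper's diffusion process in place of this toy coverage game.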

Relevance: 100.00%

Abstract:

A new approach based on occupation measures is introduced for studying stochastic differential games. For two-person zero-sum games, the existence of values and optimal strategies for both players is established for various payoff criteria. For N-person games, the existence of equilibria in Markov strategies is established for various cases.

Relevance: 100.00%

Abstract:

In the past few years there have been attempts to develop subspace methods for DoA (direction of arrival) estimation using a fourth-order cumulant, which is known to de-emphasize Gaussian background noise. To gauge the relative performance of cumulant MUSIC (MUltiple SIgnal Classification) (c-MUSIC) and the standard MUSIC, based on the covariance function, an extensive numerical study has been carried out, in which a narrow-band signal source has been considered and Gaussian noise sources, which produce a spatially correlated background noise, have been distributed. These simulations indicate that, even though the cumulant approach is capable of de-emphasizing the Gaussian noise, both the bias and the variance of the DoA estimates are higher than those for MUSIC. To achieve comparable results the cumulant approach requires much larger data, three to ten times that for MUSIC, depending upon the number of sources and how close they are. This is attributed to the fact that estimating the cumulant requires averaging a product of four random variables. Therefore, compared to the evaluation of the covariance function, there are more cross terms, which do not go to zero unless the data length is very large. It is felt that these cross terms contribute to the large bias and variance observed in c-MUSIC. However, the ability to de-emphasize Gaussian noise, white or colored, is of great significance, since the standard MUSIC fails when there is colored background noise. Through simulation it is shown that c-MUSIC does yield good results, but only at the cost of more data.
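The covariance-based MUSIC baseline in this comparison can be sketched for a uniform linear array with one narrow-band source; the array geometry, SNR, and snapshot count below are illustrative choices, not those of the study.

```python
import numpy as np

# Covariance-based MUSIC: eigendecompose the sample covariance, project
# candidate steering vectors onto the noise subspace, and scan for peaks.

rng = np.random.default_rng(0)
m, n_snap, true_deg = 8, 400, 20.0  # elements, snapshots, source DoA

def steering(deg, m):
    k = np.pi * np.sin(np.radians(deg))  # half-wavelength element spacing
    return np.exp(1j * k * np.arange(m))

a = steering(true_deg, m)
s = rng.standard_normal(n_snap) + 1j * rng.standard_normal(n_snap)
noise = 0.3 * (rng.standard_normal((m, n_snap))
               + 1j * rng.standard_normal((m, n_snap)))
x = np.outer(a, s) + noise

R = x @ x.conj().T / n_snap            # sample covariance
w, v = np.linalg.eigh(R)               # eigenvalues ascending
En = v[:, :-1]                         # noise subspace (one source)

grid = np.arange(-90.0, 90.0, 0.1)
spectrum = [1.0 / np.linalg.norm(En.conj().T @ steering(g, m)) ** 2
            for g in grid]
print(round(grid[int(np.argmax(spectrum))], 1))  # peak near 20.0
```

The cumulant variant replaces R with a fourth-order cumulant matrix, which suppresses Gaussian noise terms but, as the study reports, needs far more snapshots for the cross terms to average out.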