914 results for Probabilistic logic
Abstract:
This paper introduces a recursive rule base adjustment to enhance the performance of fuzzy logic controllers. Here the fuzzy controller is constructed on the basis of a decision table (DT), relying on membership functions and fuzzy rules that incorporate heuristic knowledge and operator experience. If the controller performance is not satisfactory, it has previously been suggested that the controller be adjusted by combined tuning of membership functions and controller scaling factors. The alternative approach proposed here entails alteration of the fuzzy rule base instead. The recursive rule base adjustment algorithm proposed in this paper has the benefit of being computationally more efficient in generating a DT, which is an advantage for online implementation. Simulation results are presented to support this claim. (c) 2005 Elsevier B.V. All rights reserved.
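The abstract does not reproduce the algorithm itself; as a rough illustration of the kind of decision-table-based controller and local rule adjustment it describes, the following Python sketch stores the controller as a DT indexed by quantized error and change-of-error and nudges only the consequent of the rule that fired. The quantization, gain, and update rule are assumptions for illustration, not the authors' method.

```python
# Minimal sketch (not the paper's algorithm): a fuzzy controller kept as a
# decision table (DT) over quantized error e and change-of-error de, with a
# simple incremental adjustment of individual rule consequents when the
# observed performance is unsatisfactory.
import numpy as np

N_LEVELS = 7                              # quantization levels (e.g. NB..PB)
dt = np.zeros((N_LEVELS, N_LEVELS))       # decision table: action per (e, de) cell

def quantize(x, x_max):
    """Map a signal onto one of N_LEVELS cells, saturating at +/- x_max."""
    idx = int(round((x / x_max) * (N_LEVELS // 2))) + N_LEVELS // 2
    return min(max(idx, 0), N_LEVELS - 1)

def control_action(e, de, e_max=1.0, de_max=1.0):
    """Look up the control action for the current error / change-of-error."""
    return dt[quantize(e, e_max), quantize(de, de_max)]

def adjust_rule_base(e, de, performance_error, gain=0.1, e_max=1.0, de_max=1.0):
    """Adjust only the DT cell that fired, proportionally to the performance
    error; cheaper than re-tuning membership functions and scaling factors."""
    i, j = quantize(e, e_max), quantize(de, de_max)
    dt[i, j] -= gain * performance_error
```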
Abstract:
In this paper, by investigating the influence of source/drain extension region engineering (also known as gate-source/drain underlap) in nanoscale planar double gate (DG) SOI MOSFETs, we offer new insights into the design of future nanoscale gate-underlap DG devices to achieve ITRS projections for high performance (HP), low standby power (LSTP) and low operating power (LOP) logic technologies. The impact of a high-κ gate dielectric and silicon film thickness, together with parameters associated with the lateral source/drain doping profile, is investigated in detail. The results show that spacer width along with lateral straggle can not only effectively control short-channel effects, thus yielding low off-current in a gate-underlap device, but can also be optimized to achieve lower intrinsic delay and a higher on-off current ratio (I_on/I_off). Based on the investigation of on-current (I_on), off-current (I_off), I_on/I_off, intrinsic delay (τ), energy-delay product and static power dissipation, we present design guidelines for selecting key device parameters to achieve ITRS projections. Using nominal gate lengths for the different technologies, as recommended by the ITRS specification, optimally designed gate-underlap DG MOSFETs with a spacer-to-straggle (s/σ) ratio of 2.3 for HP/LOP and 3.2 for LSTP logic technologies will meet ITRS projections. However, a relatively narrow range of lateral straggle, between 7 and 8 nm, is recommended. A sensitivity analysis of intrinsic delay, on-current and off-current with respect to the important parameters allows a comparative analysis of the various design options and shows that gate workfunction appears to be the most crucial parameter in the design of DG devices for all three technologies. The impact of back-gate misalignment on I_on, I_off and τ is also investigated for optimized underlap devices.
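The figures of merit named in the abstract (intrinsic delay, I_on/I_off, energy-delay product) follow the usual CV/I-style definitions. The sketch below evaluates them for two hypothetical designs keyed by the spacer-to-straggle ratio; the capacitance, supply voltage and current values are placeholders, not the device data reported in the paper.

```python
# Illustrative sketch only: ranking candidate gate-underlap designs by the
# standard CV/I intrinsic-delay metric, the on/off current ratio, and the
# energy-delay product. Numbers are assumed, not taken from the paper.
VDD = 1.0          # supply voltage (V), assumed
CG  = 1e-16        # effective gate capacitance (F), assumed

def figures_of_merit(i_on, i_off):
    """Return (intrinsic delay tau = C*V/I_on, I_on/I_off, energy-delay product)."""
    tau = CG * VDD / i_on
    edp = (CG * VDD**2) * tau          # switching energy times delay
    return tau, i_on / i_off, edp

# Hypothetical designs keyed by spacer-to-straggle ratio s/sigma
designs = {2.3: (1.2e-3, 1.0e-7), 3.2: (0.9e-3, 1.0e-9)}
for ratio, (i_on, i_off) in designs.items():
    tau, on_off, edp = figures_of_merit(i_on, i_off)
    print(f"s/sigma={ratio}: tau={tau:.2e} s, Ion/Ioff={on_off:.1e}, EDP={edp:.2e} J*s")
```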
Abstract:
This paper discusses the relations between extended incidence calculus and assumption-based truth maintenance systems (ATMSs). We first prove that managing labels for statements (nodes) in an ATMS is equivalent to producing incidence sets of these statements in extended incidence calculus. We then demonstrate that the justification set for a node is functionally equivalent to the implication relation set for the same node in extended incidence calculus. As a consequence, extended incidence calculus can provide justifications for an ATMS, because implication relation sets are discovered by the system automatically. We also show that extended incidence calculus provides a theoretical basis for constructing a probabilistic ATMS by associating proper probability distributions with the assumptions. In this way, we can not only produce labels for all nodes in the system, but also calculate the probability of any such node. The nogood environments can also be obtained automatically. Therefore, extended incidence calculus and the ATMS are equivalent in carrying out inferences at both the symbolic level and the numerical level. This extends a result due to Laskey and Lehner.
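As a toy illustration of the correspondence the abstract describes, the sketch below treats an ATMS-style label as a set of environments (sets of assumptions) and puts independent probabilities on the assumptions, so that a node's probability can be computed from its label, in the spirit of a probabilistic ATMS / extended incidence calculus. The label, probabilities and enumeration strategy are purely illustrative assumptions.

```python
# Toy sketch: score a node from its ATMS-style label under independent
# probabilities on assumptions, by enumerating all assumption "worlds".
from itertools import chain, combinations

assumptions = {"A1": 0.8, "A2": 0.6, "A3": 0.5}   # P(assumption), assumed independent

# Label for a node: the node holds in any world containing all assumptions
# of at least one of its environments.
label_n = [{"A1", "A2"}, {"A3"}]

def node_probability(label, probs):
    """Sum the probability of every world in which some environment holds."""
    total = 0.0
    names = list(probs)
    worlds = chain.from_iterable(combinations(names, r) for r in range(len(names) + 1))
    for world in worlds:
        world = set(world)
        p = 1.0
        for a in names:
            p *= probs[a] if a in world else (1.0 - probs[a])
        if any(env <= world for env in label):
            total += p
    return total

print(node_probability(label_n, assumptions))   # P(A1 and A2, or A3) = 0.74
```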
Abstract:
Logistic regression and Gaussian mixture model (GMM) classifiers have been trained to estimate the probability of acute myocardial infarction (AMI) in patients based upon the concentrations of a panel of cardiac markers. The panel consists of two new markers, fatty acid binding protein (FABP) and glycogen phosphorylase BB (GPBB), in addition to the traditional cardiac troponin I (cTnI), creatine kinase MB (CKMB) and myoglobin. The effect of using principal component analysis (PCA) and Fisher discriminant analysis (FDA) to preprocess the marker concentrations was also investigated. The need for classifiers to give an accurate estimate of the probability of AMI is argued, and three categories of performance measure are described, namely discriminatory ability, sharpness, and reliability. Numerical performance measures for each category are given and applied. The optimum classifier, based solely upon the samples taken on admission, was the logistic regression classifier with FDA preprocessing. This gave an accuracy of 0.85 (95% confidence interval: 0.78-0.91) and a normalised Brier score of 0.89. When samples at both admission and a further time, 1-6 h later, were included, the performance increased significantly, showing that logistic regression classifiers can indeed use the information from the five cardiac markers to accurately and reliably estimate the probability of AMI. © Springer-Verlag London Limited 2008.
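For concreteness, the pipeline described in the abstract (FDA preprocessing feeding a logistic regression, scored by accuracy and a Brier-type measure) can be sketched as follows. The example runs on synthetic data standing in for the five cardiac markers, uses scikit-learn's LinearDiscriminantAnalysis as the Fisher discriminant step, and reports the raw Brier score (lower is better) rather than the normalised Brier score quoted in the paper; it is an illustration of the approach, not a reproduction of the study.

```python
# Hedged sketch: Fisher discriminant (LDA) preprocessing + logistic regression,
# evaluated with accuracy and the raw Brier score, on synthetic placeholder data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.metrics import accuracy_score, brier_score_loss

# Stand-in for admission-sample concentrations of FABP, GPBB, cTnI, CKMB, myoglobin
X, y = make_classification(n_samples=500, n_features=5, n_informative=4,
                           n_redundant=0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = make_pipeline(LinearDiscriminantAnalysis(n_components=1),
                      LogisticRegression())
model.fit(X_tr, y_tr)

p_ami = model.predict_proba(X_te)[:, 1]               # estimated probability of AMI
print("accuracy:", accuracy_score(y_te, p_ami > 0.5))
print("Brier score:", brier_score_loss(y_te, p_ami))  # raw Brier score: 0 is perfect
```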