899 results for Computational Intelligence in Medicine
Abstract:
In this paper we investigate the relationship between two prioritized knowledge bases by measuring both the conflict and the agreement between them. First of all, a quantity of conflict and two quantities of agreement are defined. The former is shown to be a generalization of the well-known Dalal distance, which is the Hamming distance between two interpretations. The latter are, respectively, a quantity of strong agreement, which measures the amount of information on which two belief bases “totally” agree, and a quantity of weak agreement, which measures the amount of information that is believed by one source but is unknown to the other. All three quantity measures are based on weighted prime implicants, which represent beliefs in a prioritized belief base. We then define a degree of conflict and two degrees of agreement based on our quantity of conflict and quantities of agreement. We also consider the impact of these measures on belief merging and information source ordering.
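Since the quantity of conflict generalizes the Dalal distance, a minimal sketch of that base notion may help. The snippet below computes the Hamming distance between two propositional interpretations represented as dictionaries from atoms to truth values; the dictionary encoding and variable names are illustrative assumptions, not the paper's notation.

```python
def dalal_distance(interp1, interp2):
    """Hamming distance between two interpretations over the same atoms:
    the number of propositional atoms on which they disagree."""
    assert interp1.keys() == interp2.keys(), "interpretations must share a vocabulary"
    return sum(1 for atom in interp1 if interp1[atom] != interp2[atom])

# Example: two interpretations over atoms p, q, r that disagree only on q.
w1 = {"p": True, "q": True, "r": False}
w2 = {"p": True, "q": False, "r": False}
print(dalal_distance(w1, w2))  # 1
```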
Abstract:
A variety of short time delays inserted between pairs of subjects were found to affect their ability to synchronize a musical task. The subjects performed a clapping rhythm together from separate sound-isolated rooms via headphones and without visual contact. One-way time delays between pairs were manipulated electronically in the range of 3 to 78 ms. We are interested in quantifying the envelope of time delay within which two individuals produce synchronous performances. The results indicate that there are distinct regimes of mutually coupled behavior, and that 'natural time delay' (delay within the narrow range associated with travel times across spatial arrangements of groups and ensembles) supports the most stable performance. Conditions outside of this envelope, with time delays both below and above it, create characteristic interaction dynamics in the mutually coupled actions of the duo. Trials at extremely short delays (corresponding to unnaturally close proximity) had a tendency to accelerate from anticipation. Synchronization lagged at longer delays (larger than usual physical distances) and produced an increasingly severe deceleration and then deterioration of performed rhythms. The study has implications for music collaboration over the Internet and suggests that stable rhythmic performance can be achieved by 'wired ensembles' across distances of thousands of kilometers.
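To make the delay envelope concrete, the back-of-envelope sketch below converts one-way delays into equivalent acoustic distances (assuming sound travels at roughly 343 m/s in air) and estimates the one-way delay of a long fiber link (assuming light propagates at roughly 2.0e8 m/s in fiber); both constants are standard approximations, not figures from the study.

```python
SPEED_OF_SOUND = 343.0      # m/s, in air at roughly 20 C
FIBER_LIGHT_SPEED = 2.0e8   # m/s, roughly 2/3 of c in optical fiber

def delay_to_acoustic_distance(delay_s):
    """Distance at which airborne sound would arrive with this one-way delay."""
    return SPEED_OF_SOUND * delay_s

def fiber_delay(distance_m):
    """One-way propagation delay over a fiber link of the given length."""
    return distance_m / FIBER_LIGHT_SPEED

# The 3-78 ms range studied corresponds to about 1 m up to about 27 m of acoustic travel.
print(delay_to_acoustic_distance(0.003))  # ~1.0 m
print(delay_to_acoustic_distance(0.078))  # ~26.8 m
# A 2000 km fiber path adds about 10 ms one way, inside the stable envelope.
print(fiber_delay(2_000_000))             # ~0.010 s
```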
Abstract:
This paper describes the application of an improved nonlinear principal component analysis (PCA) to the detection of faults in polymer extrusion processes. Since the processes are complex in nature and nonlinear relationships exist between the recorded variables, an improved nonlinear PCA, which incorporates radial basis function (RBF) networks and principal curves, is proposed. This algorithm comprises two stages. The first stage involves the use of a serial principal curve to obtain the nonlinear scores and approximated data. The second stage constructs two RBF networks using a fast recursive algorithm to solve the topology problem in traditional nonlinear PCA. The benefits of this improvement are demonstrated in a practical application to a polymer extrusion process.
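As a rough illustration of the two-stage structure (not the authors' algorithm: the principal-curve stage and the fast recursive RBF training are simplified away here), the sketch below uses a linear principal component as a stand-in for the nonlinear scores and kernel ridge regression with an RBF kernel as a stand-in for the two RBF networks, one mapping inputs to scores and one mapping scores back to reconstructed inputs, with the residual serving as a fault statistic.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)

# Toy process data with a nonlinear relationship between two recorded variables.
t = rng.uniform(-1, 1, size=(300, 1))
X = np.hstack([t, t**2]) + 0.02 * rng.normal(size=(300, 2))

# Stage 1 (stand-in): obtain one-dimensional scores; the paper fits a serial
# principal curve here, which we approximate with a linear first component.
scores = PCA(n_components=1).fit_transform(X)

# Stage 2: two RBF mappings, X -> scores and scores -> reconstructed X.
to_scores = KernelRidge(kernel="rbf", alpha=1e-3, gamma=10.0).fit(X, scores)
from_scores = KernelRidge(kernel="rbf", alpha=1e-3, gamma=10.0).fit(scores, X)

def spe(X_new):
    """Squared prediction error: large residuals flag potential faults."""
    X_hat = from_scores.predict(to_scores.predict(X_new))
    return np.sum((X_new - X_hat) ** 2, axis=1)

threshold = np.percentile(spe(X), 99)  # control limit from normal-operation data
faulty = np.array([[0.5, 1.0]])        # violates the x2 = x1**2 pattern
print(spe(faulty) > threshold)         # expected: [ True ]
```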
Abstract:
Belief revision characterizes the process of revising an agent's beliefs when receiving new evidence. In the field of artificial intelligence, revision strategies have been extensively studied in the context of logic-based formalisms and probability kinematics. However, so far there is not much literature on this topic in evidence theory. In contrast, the combination rules proposed so far in the theory of evidence, especially Dempster's rule, are symmetric. They rely on a basic assumption, namely that the pieces of evidence being combined are on a par, i.e. play the same role. When one source of evidence is less reliable than another, it is possible to discount it, and a symmetric combination operation is still used. In the case of revision, the idea is to let the prior knowledge of an agent be altered by some input information. The change problem is thus intrinsically asymmetric. Assuming the input information is reliable, it should be retained, whilst the prior information should be changed minimally to that effect. To deal with this issue, this paper defines the notion of revision for the theory of evidence in such a way as to bring together the probabilistic and logical views. Several previously proposed revision rules are reviewed, and we advocate one of them as better corresponding to the idea of revision. It is extended to cope with inconsistency between prior and input information. It reduces to Dempster's rule of combination, just as revision in the sense of Alchourrón, Gärdenfors, and Makinson (AGM) reduces to expansion, when the input is strongly consistent with the prior belief function. Properties of this revision rule are also investigated, and it is shown to generalize Jeffrey's rule of updating, Dempster's rule of conditioning, and a form of AGM revision.
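Since the proposed revision rule reduces to Dempster's rule of combination when the input is consistent with the prior, a minimal sketch of that baseline rule may be useful; the frozenset encoding of focal elements and the example masses below are illustrative assumptions.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions given as
    {frozenset of hypotheses: mass}, renormalizing away the conflict."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: combination is undefined")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Two sources over the frame {flu, cold}; the masses are made-up for illustration.
m1 = {frozenset({"flu"}): 0.6, frozenset({"flu", "cold"}): 0.4}
m2 = {frozenset({"cold"}): 0.3, frozenset({"flu", "cold"}): 0.7}
print(dempster_combine(m1, m2))
```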
Abstract:
Previous research has highlighted theoretical and empirical links between measures of both personality and trait emotional intelligence (EI) and the ability to decode facial expressions of emotion. Research has also found that the posed, static characteristics of the photographic stimuli used to explore these links affect the decoding process and differentiate such stimuli from the natural expressions they represent. This undermines the ecological validity of established trait-emotion decoding relationships. This study addresses these methodological shortcomings by testing relationships between the reliability of participant ratings of dynamic, spontaneously elicited expressions of emotion and both personality and trait EI. Fifty participants completed personality and self-report EI questionnaires and used a computer-logging program to continuously rate change in the emotional intensity expressed in video clips. Each clip was rated twice to obtain an intra-rater reliability score. The results provide limited support for links between both trait EI and personality variables and how reliably we decode natural expressions of emotion. Limitations and future directions are discussed.
Abstract:
Multicore computational accelerators such as GPUs are now commodity components for high-performance computing at scale. While such accelerators have been studied in some detail as stand-alone computational engines, their integration in large-scale distributed systems raises new challenges and trade-offs. In this paper, we present an exploration of resource management alternatives for building asymmetric accelerator-based distributed systems. We present these alternatives in the context of a capabilities-aware framework for data-intensive computing, which uses an enhanced implementation of the MapReduce programming model for accelerator-based clusters, compared to the state of the art. The framework can transparently utilize heterogeneous accelerators to derive high performance with low programming effort. Our work is the first to compare heterogeneous types of accelerators, GPUs and Cell processors, in the same environment, and the first to explore the trade-offs between compute-efficient and control-efficient accelerators in data-intensive systems. Our investigation shows that the framework scales well with the number of compute nodes. Furthermore, it runs simultaneously on two different types of accelerators, successfully adapts to the resource capabilities, and performs 26.9% better on average than a static execution approach.
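The framework itself is not described here in enough detail to reproduce, but the sketch below illustrates the general idea of capability-aware work partitioning: map tasks are apportioned to heterogeneous workers in proportion to a measured throughput score rather than split statically. All names and throughput figures are hypothetical, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Worker:
    name: str
    throughput: float  # measured records/s; stands in for "resource capability"

def partition(num_tasks, workers):
    """Split map tasks proportionally to each worker's measured throughput,
    instead of the equal split a capability-oblivious scheduler would use."""
    total = sum(w.throughput for w in workers)
    shares = [int(num_tasks * w.throughput / total) for w in workers]
    shares[0] += num_tasks - sum(shares)  # assign any rounding remainder
    return dict(zip((w.name for w in workers), shares))

# Hypothetical cluster mixing compute-efficient and control-efficient devices.
cluster = [Worker("gpu-node", 9.0), Worker("cell-node", 4.5), Worker("cpu-node", 1.5)]
print(partition(120, cluster))  # {'gpu-node': 72, 'cell-node': 36, 'cpu-node': 12}
```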
Abstract:
It is becoming clear that, contrary to earlier expectations, the application of AI techniques to law is neither as easy nor as effective as some claimed. Unfortunately, most AI researchers seem to have little understanding of just why this is. In this paper I argue, from empirical study of lawyers in action, just why there is a mismatch between the AI view of law and law in practice. While this is important and novel in itself, it also, if my arguments are accepted, demonstrates just why AI will never succeed in producing the computerised lawyer.
Abstract:
Background: There is growing interest in the potential utility of molecular diagnostics in improving the detection of life-threatening infection (sepsis). LightCycler® SeptiFast is a multipathogen, probe-based real-time PCR system that targets DNA sequences of bacteria and fungi present in blood samples, with results available within a few hours. We report here the protocol of the first systematic review of published clinical diagnostic accuracy studies of this technology when compared with blood culture in the setting of suspected sepsis.
Methods/design: Data sources: the Cochrane Database of Systematic Reviews, the Database of Abstracts of Reviews of Effects (DARE), the Health Technology Assessment Database (HTA), the NHS Economic Evaluation Database (NHSEED), The Cochrane Library, MEDLINE, EMBASE, ISI Web of Science, BIOSIS Previews, MEDION and the Aggressive Research Intelligence Facility Database (ARIF). Study selection: diagnostic accuracy studies that compare the real-time PCR technology with standard culture results performed on a patient's blood sample during the management of sepsis. Data extraction: three reviewers, working independently, will determine the level of evidence, the methodological quality and a standard data set relating to demographics and diagnostic accuracy metrics for each study. Statistical analysis/data synthesis: heterogeneity of studies will be investigated using a coupled forest plot of sensitivity and specificity and a scatter plot in Receiver Operating Characteristic (ROC) space. The bivariate model method will be used to estimate summary sensitivity and specificity. The authors will investigate reporting biases using funnel plots based on effective sample size and regression tests of asymmetry. Subgroup analyses are planned for adults, children and infection setting (hospital vs community) if sufficient data are uncovered.
Dissemination: Recommendations will be made to the Department of Health (as part of an open-access HTA report) as to whether the real-time PCR technology has sufficient clinical diagnostic accuracy potential to move forward to efficacy testing during the provision of routine clinical care.
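For readers less familiar with the accuracy metrics being pooled, the sketch below computes per-study sensitivity and specificity from a 2x2 table of index-test results against blood culture; the counts are made-up placeholders, not data from any included study.

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity and specificity from a 2x2 table comparing an index test
    (e.g. real-time PCR) against a reference standard (e.g. blood culture)."""
    sensitivity = tp / (tp + fn)  # proportion of reference-positives detected
    specificity = tn / (tn + fp)  # proportion of reference-negatives cleared
    return sensitivity, specificity

# Hypothetical study: 80 culture-positive and 220 culture-negative samples.
sens, spec = diagnostic_accuracy(tp=68, fp=18, fn=12, tn=202)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
# Each study contributes one (1 - specificity, sensitivity) point in ROC space,
# which is what the planned coupled forest and ROC scatter plots summarize.
```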