888 results for belief rule-based approach
Abstract:
In a world of almost permanent and rapidly increasing electronic data availability, techniques for filtering, compressing, and interpreting this data to transform it into valuable and easily comprehensible information are of utmost importance. One key topic in this area is the capability to deduce future system behavior from a given data input. This book brings together for the first time the complete theory of data-based neurofuzzy modelling and the linguistic attributes of fuzzy logic in a single cohesive mathematical framework. After introducing the basic theory of data-based modelling, new concepts including extended additive and multiplicative submodels are developed and their extensions to state estimation and data fusion are derived. All these algorithms are illustrated with benchmark and real-life examples to demonstrate their efficiency. Chris Harris and his group have carried out pioneering work which has tied together the fields of neural networks and linguistic rule-based algorithms. This book is aimed at researchers and scientists in time series modelling, empirical data modelling, knowledge discovery, data mining, and data fusion.
Abstract:
Prism is a modular classification rule generation method based on the ‘separate and conquer’ approach, an alternative to rule induction using decision trees, also known as ‘divide and conquer’. Prism often achieves a similar level of classification accuracy to decision trees, but tends to produce a more compact, noise-tolerant set of classification rules. As with other classification rule generation methods, a principal problem arising with Prism is overfitting due to over-specialised rules. In addition, over-specialised rules increase the associated computational complexity. These problems can be solved by pruning methods. For the Prism method, two pruning algorithms have recently been introduced for reducing overfitting of classification rules: J-pruning and Jmax-pruning. Both algorithms are based on the J-measure, an information-theoretic means of quantifying the theoretical information content of a rule. Jmax-pruning attempts to exploit the J-measure to its full potential, because J-pruning does not actually achieve this and may even lead to underfitting. A series of experiments has shown that Jmax-pruning can outperform J-pruning in reducing overfitting. However, Jmax-pruning is computationally relatively expensive and may also lead to underfitting. This paper reviews the Prism method and the two existing pruning algorithms above. It also proposes a novel pruning algorithm called Jmid-pruning. The latter is based on the J-measure; it reduces overfitting to a similar level as the other two algorithms but is better at avoiding underfitting and unnecessary computational effort. The authors conduct an experimental study of the performance of the Jmid-pruning algorithm in terms of classification accuracy and computational efficiency. The algorithm is also evaluated comparatively against the J-pruning and Jmax-pruning algorithms.
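The J-measure that all three pruning algorithms build on can be stated compactly. A minimal sketch in Python, assuming the standard Smyth–Goodman formulation for a rule "IF y THEN x"; the probability values in the example are illustrative, not taken from the paper:

```python
import math

def j_measure(p_y, p_x_given_y, p_x):
    """J-measure of a rule 'IF y THEN x' (Smyth & Goodman).

    p_y         -- probability that the rule antecedent fires, P(y)
    p_x_given_y -- accuracy of the rule, P(x | y)
    p_x         -- prior probability of the class, P(x)
    """
    def term(p, q):
        # contribution p * log2(p / q), defined as 0 when p == 0
        return 0.0 if p == 0 else p * math.log2(p / q)

    # cross-entropy between posterior and prior class distributions
    j_inner = term(p_x_given_y, p_x) + term(1 - p_x_given_y, 1 - p_x)
    return p_y * j_inner

# A rule that fires on 30% of examples and is 90% accurate, against a
# 50% class prior, carries about 0.159 bits of information:
print(round(j_measure(0.3, 0.9, 0.5), 4))  # → 0.1593
```

A rule whose accuracy equals the class prior scores exactly zero, which is what makes the measure usable as a pruning criterion.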
Abstract:
HydroShare is an online, collaborative system being developed for the open sharing of hydrologic data and models. The goal of HydroShare is to enable scientists to easily discover and access hydrologic data and models, retrieve them to their desktop, or perform analyses in a distributed computing environment that may include grid, cloud, or high-performance computing model instances as necessary. Scientists may also publish outcomes (data, results or models) into HydroShare, using the system as a collaboration platform for sharing data, models and analyses. HydroShare is expanding the data sharing capability of the CUAHSI Hydrologic Information System by broadening the classes of data accommodated, creating new capability to share models and model components, and taking advantage of emerging social media functionality to enhance information about and collaboration around hydrologic data and models. One of the fundamental concepts in HydroShare is that of a Resource. All content is represented using a Resource Data Model that separates system and science metadata and has elements common to all resources as well as elements specific to the types of resources HydroShare will support. These will include the different data types used in the hydrology community, as well as models and workflows that require metadata on execution functionality. The HydroShare web interface and social media functions are being developed using the Drupal content management system. A geospatial visualization and analysis component enables searching, visualizing, and analyzing geographic datasets. The integrated Rule-Oriented Data System (iRODS) is being used to manage federated data content and perform rule-based background actions on data and model resources, including parsing to generate metadata catalog information and the execution of models and workflows.
This presentation will introduce the HydroShare functionality developed to date, describe key elements of the Resource Data Model and outline the roadmap for future development.
Abstract:
Market risk exposure plays a key role in financial institutions' risk management. A possible measure of this exposure is to evaluate the losses likely to be incurred when the price of the portfolio's assets declines, using Value-at-Risk (VaR) estimates, one of the most prominent measures of financial downside market risk. This paper suggests an evolving possibilistic fuzzy modeling (ePFM) approach for VaR estimation. The approach is based on an extension of possibilistic fuzzy c-means clustering and functional fuzzy rule-based modeling, which employs memberships and typicalities to update clusters and creates new clusters based on a statistical-control, distance-based criterion. ePFM also uses a utility measure to evaluate the quality of the current cluster structure. Computational experiments consider data from the main global equity market indexes of the United States, London, Germany, Spain and Brazil from January 2000 to December 2012 for VaR estimation using ePFM, traditional VaR benchmarks such as Historical Simulation, GARCH, EWMA, and Extreme Value Theory, and state-of-the-art evolving approaches. The results show that ePFM is a potential candidate for VaR modeling, with better performance than the alternative approaches.
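The ePFM model itself is not reproduced here, but the simplest of the benchmarks it is compared against, Historical Simulation, is easy to sketch. A minimal version assuming a plain daily return series and the empirical-quantile definition of VaR; the simulated data below is illustrative only:

```python
import numpy as np

def historical_var(returns, alpha=0.99):
    """One-day Value-at-Risk by historical simulation.

    Returns the loss threshold exceeded with probability 1 - alpha,
    i.e. the negated empirical (1 - alpha)-quantile of the returns.
    """
    returns = np.asarray(returns, dtype=float)
    # the worst (1 - alpha) tail of the observed returns sets the cutoff
    return -np.quantile(returns, 1.0 - alpha)

# Hypothetical P&L history: 10 years of daily returns, 1% volatility
rng = np.random.default_rng(0)
daily_returns = rng.normal(0.0, 0.01, size=2500)
var_99 = historical_var(daily_returns, alpha=0.99)
print(f"99% one-day VaR: {var_99:.4f}")
```

For normally distributed returns with 1% volatility the 99% VaR should land near 0.023, which gives a quick sanity check on the estimator.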
Abstract:
Over the past several decades, the topic of child development in a cultural context has received a great deal of theoretical and empirical investigation. Investigators from the fields of indigenous and cultural psychology have argued that childhood is socially and historically constructed, rather than a universal process with a standard sequence of developmental stages or descriptions. As a result, many psychologists have become doubtful that any stage theory of cognitive or social-emotional development can be found to be valid for all times and places. In placing more theoretical emphasis on contextual processes, they define culture as a complex system of common symbolic action patterns (or scripts) built up through everyday human social interaction by means of which individuals create common meanings and in terms of which they organize experience. Researchers understand culture to be organized and coherent, but not homogeneous or static, and realize that the complex dynamic system of culture constantly undergoes transformation as participants (adults and children) negotiate and re-negotiate meanings through social interaction. These negotiations and transactions give rise to unceasing heterogeneity and variability in how different individuals and groups of individuals interpret values and meanings. However, while many psychologists—both inside and outside the fields of indigenous and cultural psychology—are now willing to give up the idea of a universal path of child development and a universal story of parenting, they have not necessarily foreclosed on the possibility of discovering and describing some universal processes that underlie socialization and development-in-context. The roots of such universalities would lie in the biological aspects of child development, in the evolutionary processes of adaptation, and in the unique symbolic and problem-solving capacities of the human organism as a culture-bearing species.
For instance, according to functionalist psychological anthropologists, shared (cultural) processes surround the developing child and promote in the long view the survival of families and groups if they are to demonstrate continuity in the face of ecological change and resource competition (e.g. Edwards & Whiting, 2004; Gallimore, Goldenberg, & Weisner, 1993; LeVine, Dixon, LeVine, Richman, Leiderman, Keefer, & Brazelton, 1994; LeVine, Miller, & West, 1988; Weisner, 1996, 2002; Whiting & Edwards, 1988; Whiting & Whiting, 1980). As LeVine and colleagues (1994) state: A population tends to share an environment, symbol systems for encoding it, and organizations and codes of conduct for adapting to it (emphasis added). It is through the enactment of these population-specific codes of conduct in locally organized practices that human adaptation occurs. Human adaptation, in other words, is largely attributable to the operation of specific social organizations (e.g. families, communities, empires) following culturally prescribed scripts (normative models) in subsistence, reproduction, and other domains [communication and social regulation]. (p. 12) It follows, then, that in seeking to understand child development in a cultural context, psychologists need to support collaborative and interdisciplinary developmental science that crosses international borders. Such research can advance cross-cultural psychology, cultural psychology, and indigenous psychology, understood as three sub-disciplines composed of scientists who frequently communicate and debate with one another and mutually inform one another's research programs. For example, to turn to parental belief systems, the particular topic of this chapter, it is clear that collaborative international studies are needed to support the goal of cross-cultural psychologists for findings that go beyond simply describing cultural differences in parental beliefs.
Comparative researchers need to shed light on whether parental beliefs are (or are not) systematically related to differences in child outcomes; and they need meta-analyses and reviews to explore between- and within-culture variations in parental beliefs, with a focus on issues of social change (Saraswathi, 2000). Likewise, collaborative research programs can foster the goals of indigenous psychology and cultural psychology and lay out valid descriptions of individual development in their particular cultural contexts and the processes, principles, and critical concepts needed for defining, analyzing, and predicting outcomes of child development-in-context. The project described in this chapter is based on an approach that integrates elements of comparative methodology to serve the aim of describing particular scenarios of child development in unique contexts. The research team of cultural insiders and outsiders allows for a look at American belief systems based on a dialogue of multiple perspectives.
Abstract:
OBJECTIVE: This study proposes a new approach that considers uncertainty in predicting and quantifying the presence and severity of diabetic peripheral neuropathy. METHODS: A rule-based fuzzy expert system was designed by four experts in diabetic neuropathy. The model variables were used to classify neuropathy in diabetic patients, defining it as mild, moderate, or severe. System performance was evaluated by means of the Kappa agreement measure, comparing the results of the model with those generated by the experts in an assessment of 50 patients. Accuracy was evaluated by an ROC curve analysis obtained based on 50 other cases; the results of those clinical assessments were considered to be the gold standard. RESULTS: According to the Kappa analysis, the model was in moderate agreement with expert opinions. The ROC analysis (evaluation of accuracy) determined an area under the curve equal to 0.91, demonstrating very good consistency in classifying patients with diabetic neuropathy. CONCLUSION: The model efficiently classified diabetic patients with different degrees of neuropathy severity. In addition, the model provides a way to quantify diabetic neuropathy severity and allows a more accurate patient condition assessment.
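The paper does not publish its rule base, but the general shape of such a rule-based fuzzy classifier can be sketched. The variables, membership functions, and rules below are purely hypothetical stand-ins; only the max–min (Mamdani-style) inference pattern is standard:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b on support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify(symptom_score, exam_score):
    """Toy fuzzy severity grading: AND = min, aggregation = max.

    Both inputs are assumed to be 0-10 scores; the cutoffs are invented
    for illustration, not taken from the expert panel in the paper.
    """
    low  = lambda v: tri(v, -1, 0, 5)   # fully 'low' at 0, gone by 5
    high = lambda v: tri(v, 5, 10, 11)  # starts at 5, fully 'high' at 10
    rules = {
        "mild":     min(low(symptom_score),  low(exam_score)),
        "severe":   min(high(symptom_score), high(exam_score)),
        "moderate": max(min(low(symptom_score), high(exam_score)),
                        min(high(symptom_score), low(exam_score))),
    }
    # return the severity class with the strongest rule activation
    return max(rules, key=rules.get)

print(classify(1, 2))  # both scores low → "mild"
print(classify(9, 8))  # both scores high → "severe"
```

A production system would defuzzify the aggregated output to yield the quantified severity the abstract describes, rather than just returning the winning label.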
Abstract:
The behavior of composed Web services depends on the results of the invoked services; unexpected behavior of one of the invoked services can threaten the correct execution of an entire composition. This paper proposes an event-based approach to black-box testing of Web service compositions based on event sequence graphs, which are extended by facilities to deal not only with service behavior under regular circumstances (i.e., where cooperating services are working as expected) but also with their behavior in undesirable situations (i.e., where cooperating services are not working as expected). Furthermore, the approach can be used independently of artifacts (e.g., Business Process Execution Language) or type of composition (orchestration/choreography). A large case study, based on a commercial Web application, demonstrates the feasibility of the approach and analyzes its characteristics. Test generation and execution are supported by dedicated tools. In particular, the use of an enterprise service bus for test execution is noteworthy and differs from other approaches. The results of the case study suggest that the new approach has the power to detect faults systematically, performing properly even with complex and large compositions. Copyright © 2012 John Wiley & Sons, Ltd.
Abstract:
Modeling of tumor growth has been performed according to various approaches addressing different biocomplexity levels and spatiotemporal scales. Mathematical treatments range from partial differential equation based diffusion models to rule-based cellular level simulators, aiming at both improving our quantitative understanding of the underlying biological processes and, in the mid- and long term, constructing reliable multi-scale predictive platforms to support patient-individualized treatment planning and optimization. The aim of this paper is to establish a multi-scale and multi-physics approach to tumor modeling taking into account both the cellular and the macroscopic mechanical level. Therefore, an already developed biomodel of clinical tumor growth and response to treatment is self-consistently coupled with a biomechanical model. Results are presented for the free growth case of the imageable component of an initially point-like glioblastoma multiforme tumor. The composite model leads to significant tumor shape corrections that are achieved through the utilization of environmental pressure information and the application of biomechanical principles. Using the ratio of smallest to largest moment of inertia of the tumor material to quantify the effect of our coupled approach, we have found a tumor shape correction of 20% by coupling biomechanics to the cellular simulator as compared to a cellular simulation without preferred growth directions. We conclude that the integration of the two models provides additional morphological insight into realistic tumor growth behavior. Therefore, it might be used for the development of an advanced oncosimulator focusing on tumor types for which morphology plays an important role in surgical and/or radio-therapeutic treatment planning.
Abstract:
The presented approach describes a model for a rule-based expert system that calculates the temporal variability of the release of wet snow avalanches, under the assumption of avalanche triggering without the loading of new snow. The knowledge base of the model is created using investigations of the system behaviour of wet snow avalanches in the Italian Ortles Alps, and is represented by a fuzzy logic rule base. Input parameters of the expert system are numerical and linguistic variables, measurable meteorological and topographical factors, and observable characteristics of the snow cover. The output of the inference method is the quantified release disposition for wet snow avalanches. By combining topographical parameters with the spatial interpolation of the calculated release disposition, a hazard index map is generated dynamically. Furthermore, the spatial and temporal variability of the damage potential on roads exposed to wet snow avalanches can be quantified, expressed as the number of persons at risk. The application of the rule base to the available data in the study area generated plausible results. The study demonstrates the potential for the application of expert systems and fuzzy logic in the field of natural hazard monitoring and risk management.
Abstract:
In computer science, different types of reusable components for building software applications have been proposed as a direct consequence of the emergence of new software programming paradigms. The success of these components for building applications depends on factors such as the flexibility of their combination or the ease of their selection in centralised or distributed environments such as the internet. In this article, we propose a general type of reusable component, called a primitive of representation, inspired by a knowledge-based approach that can promote reusability. The proposal can be understood as a generalisation of existing partial solutions that is applicable to both software and knowledge engineering for the development of hybrid applications that integrate conventional and knowledge-based techniques. The article presents the structure and use of the component and describes our recent experience in the development of real-world applications based on this approach.
Abstract:
In this paper we propose an innovative method for the automatic detection and tracking of road traffic signs using an onboard stereo camera. It involves a combination of monocular and stereo analysis strategies to increase the reliability of the detections, such that it can boost the performance of any traffic sign recognition scheme. Firstly, an adaptive color- and appearance-based detection is applied at the single-camera level to generate a set of traffic sign hypotheses. In turn, stereo information allows for sparse 3D reconstruction of potential traffic signs through a SURF-based matching strategy. Namely, the plane that best fits the cloud of 3D points traced back from feature matches is estimated using a RANSAC-based approach to improve robustness to outliers. Temporal consistency of the 3D information is ensured through a Kalman-based tracking stage. This also allows for the generation of a predicted 3D traffic sign model, which is in turn used to enhance the previously mentioned color-based detector through a feedback loop, thus improving detection accuracy. The proposed solution has been tested with real sequences under several illumination conditions and in both urban areas and highways, achieving very high detection rates in challenging environments, including rapid motion and significant perspective distortion.
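The RANSAC plane-fitting step described above can be sketched independently of the rest of the pipeline. A minimal version assuming a NumPy point cloud; the threshold, iteration count, and synthetic data are illustrative, not the authors' settings:

```python
import numpy as np

def ransac_plane(points, n_iters=200, thresh=0.05, seed=0):
    """Fit a plane to a 3D point cloud with RANSAC.

    Returns ((normal, d), inlier_mask) for the plane n.x + d = 0
    supported by the largest inlier set; robust to outliers such as
    bad feature matches.
    """
    rng = np.random.default_rng(seed)
    best_inliers, best_plane = None, None
    for _ in range(n_iters):
        # 1. sample a minimal set of 3 points and build a candidate plane
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:          # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal @ sample[0]
        # 2. count points within `thresh` of the candidate plane
        inliers = np.abs(points @ normal + d) < thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (normal, d)
    return best_plane, best_inliers

# Synthetic sign plane z ≈ 0 with 20% gross outliers:
rng = np.random.default_rng(1)
plane_pts = np.c_[rng.uniform(-1, 1, (80, 2)), rng.normal(0, 0.01, 80)]
outliers = rng.uniform(-1, 1, (20, 3))
(normal, d), inliers = ransac_plane(np.vstack([plane_pts, outliers]))
print(abs(normal[2]) > 0.99, inliers[:80].mean() > 0.9)
```

In the paper's setting the recovered plane would then seed the Kalman tracking stage; here the final print simply checks that the plane normal is recovered and the true plane points are kept as inliers.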
Abstract:
Advances in screening technologies allowing the identification of growth factor receptors solely by virtue of DNA or protein sequence comparison call for novel methods to isolate the corresponding ligand growth factors. The EPH-like receptor tyrosine kinase (RTK) HEK (human EPH-like kinase) was previously identified as a membrane antigen on the LK63 human pre-B-cell line, and its overexpression in leukemic specimens and cell lines suggested a role in oncogenesis. We developed a biosensor-based approach using the immobilized HEK receptor exodomain to detect and monitor purification of the HEK ligand. A protein purification protocol, which included HEK affinity chromatography, achieved a 1.8 × 10^6-fold purification of an approximately 23-kDa protein from human placental conditioned medium. Analysis of specific sHEK (soluble extracellular domain of HEK) ligand interactions in the first and final purification steps suggested a ligand concentration of 40 pM in the source material and a Kd of 2-3 nM. Since the purified ligand was N-terminally blocked, we generated tryptic peptides, and N-terminal amino acid sequence analysis of 7 tryptic fragments of the S-pyridylethylated protein unequivocally matched the sequence of AL-1, a recently reported ligand for the related EPH-like RTK REK7 (Winslow, J.W., Moran, P., Valverde, J., Shih, A., Yuan, J.Q., Wong, S.C., Tsai, S.P., Goddard, A., Henzel, W.J., Hefti, F., Beck, K.D., & Caras, I.W. (1995) Neuron 14, 973-981). Our findings demonstrate the application of biosensor technology in ligand purification and show that AL-1, as has been found for other ligands of the EPH-like RTK family, binds more than one receptor.
Abstract:
This paper presents an approach to the belief system based on a computational framework with three levels: first, the logic level, with the definition of binary local rules; second, the arithmetic level, with the definition of recursive functions; and finally, the behavioural level, with the definition of a recursive construction pattern. Social communication is achieved when different beliefs are expressed, modified, propagated and shared through social nets. This approach is useful for mimicking the belief system because the defined functions provide different ways to process the same incoming information, as well as a means to propagate it. Our model also provides a means to cross different beliefs, so any incoming information can be processed many times by the same or different functions, as occurs in social nets.
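The abstract does not spell out its binary local rules, but one standard way to realize binary local rules applied through a recursive pattern is an elementary cellular automaton. A hedged sketch only; the Wolfram rule numbering and the "belief" reading of cell states are assumptions, not the paper's formalism:

```python
def local_rule(rule_number):
    """Build a binary local rule (Wolfram numbering) mapping each
    3-cell neighbourhood to the next state of the centre cell."""
    bits = [(rule_number >> i) & 1 for i in range(8)]
    return lambda l, c, r: bits[(l << 2) | (c << 1) | r]

def propagate(state, rule, steps):
    """Recursively apply the rule on a ring of cells; each cell is an
    agent whose binary 'belief' is updated from its neighbours."""
    if steps == 0:
        return state
    n = len(state)
    nxt = [rule(state[(i - 1) % n], state[i], state[(i + 1) % n])
           for i in range(n)]
    return propagate(nxt, rule, steps - 1)

# Rule 254 spreads any '1' to its neighbours each step, so a single
# held belief propagates outward through the net:
state = [0, 0, 0, 1, 0, 0, 0]
print(propagate(state, local_rule(254), 2))  # → [0, 1, 1, 1, 1, 1, 0]
```

Different rule numbers give different ways of processing the same incoming information, which is the property the abstract emphasizes.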
Abstract:
The question of the energy security of the European Union (EU) has been high on the European political agenda since the mid-2000s, as developments in the international energy sector have increasingly been perceived as a threat by the EU institutions and by the Member State governments. The externalisation of the EU’s internal energy market has in that context been presented as a means to ensure energy security. This approach, which can be called ‘post-modern’ with reference to Robert Cooper’s division of the world into different ‘ages’, however, shows insufficiencies in terms of energy security, as a number of EU energy partners belonging to the ‘modern’ world do not agree to play by the same rules. This consequently poses the questions of the relevance of the market-based approach and of the need for alternative solutions. This paper therefore argues that the market-based approach, based on the liberalisation of the European energy market, needs to be complemented by a geopolitical approach to ensure the security of the EU’s energy supplies. Such a geopolitical approach, however, still faces important challenges.
Abstract:
We consider the problem of assessing the number of clusters in a limited number of tissue samples containing gene expressions for possibly several thousands of genes. It is proposed to use a normal mixture model-based approach to the clustering of the tissue samples. One advantage of this approach is that the question of the number of clusters in the data can be formulated in terms of a test on the smallest number of components in the mixture model compatible with the data. This test can be carried out on the basis of the likelihood ratio test statistic, using resampling to assess its null distribution. The effectiveness of this approach is demonstrated on simulated data and on some microarray datasets, as considered previously in the bioinformatics literature. © 2004 Elsevier Inc. All rights reserved.
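The resampling test described above can be sketched as a parametric bootstrap of the likelihood ratio statistic. A minimal version assuming scikit-learn's GaussianMixture as the mixture fitter (the paper predates that library, and the bootstrap count, seeds, and synthetic data are illustrative):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def bootstrap_lrt(X, k, n_boot=20, seed=0):
    """Test H0: k components vs H1: k+1 components in a normal mixture.

    The null distribution of the likelihood ratio statistic is assessed
    by resampling from the fitted k-component model (parametric bootstrap).
    """
    def lr_stat(data):
        # total log-likelihood under k and k+1 components
        ll = [GaussianMixture(n_components=m, n_init=3, random_state=seed)
              .fit(data).score(data) * len(data) for m in (k, k + 1)]
        return 2 * (ll[1] - ll[0])

    observed = lr_stat(X)
    null_model = GaussianMixture(n_components=k, random_state=seed).fit(X)
    null_stats = []
    for _ in range(n_boot):
        Xb, _ = null_model.sample(len(X))   # resample under H0
        null_stats.append(lr_stat(Xb))
    # p-value: fraction of bootstrap statistics at least as large
    return observed, float(np.mean([s >= observed for s in null_stats]))

# Two well-separated synthetic "tissue sample" clusters: testing
# k=1 against k=2 should reject the single-component model.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (60, 2)), rng.normal(6, 1, (60, 2))])
stat, p = bootstrap_lrt(X, k=1, n_boot=20)
print(p < 0.05)
```

Repeating the test for increasing k until it fails to reject gives the smallest number of components compatible with the data, which is the selection rule the abstract describes.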