888 results for belief rule-based approach
Abstract:
Most physiological effects of thyroid hormones are mediated by the two thyroid hormone receptor subtypes, TR alpha and TR beta. Several pharmacological effects mediated by TR beta might be beneficial in important medical conditions such as obesity, hypercholesterolemia and diabetes, and selective TR beta activation may elicit these effects while maintaining an acceptable safety profile. To understand the molecular determinants of affinity and subtype selectivity of TR ligands, we have successfully employed a ligand- and structure-guided pharmacophore-based approach to obtain the molecular alignment of a large series of thyromimetics. Statistically reliable three-dimensional quantitative structure-activity relationship (3D-QSAR) and three-dimensional quantitative structure-selectivity relationship (3D-QSSR) models were obtained using the comparative molecular field analysis (CoMFA) method, and visual analysis of the contour maps drew attention to a number of possible opportunities for the development of analogs with improved affinity and selectivity. Furthermore, the 3D-QSSR analysis allowed the identification of a novel and previously unmentioned halogen bond, bringing new insights to the mechanism of activity and selectivity of thyromimetics.
Abstract:
The purpose of this work is to develop a web-based decision support system, based on fuzzy logic, to assess the motor state of Parkinson patients from their performance in on-screen motor tests in a test battery on a hand computer. A set of well-defined rules, based on an expert's knowledge, was made to diagnose the current state of the patient. At the end of a period, an overall score is calculated which represents the overall state of the patient during that period. Acceptability of the rules is based on the absolute difference between the patient's own assessment of his condition and the diagnosed state. Any inconsistency can be tracked as a highlighted alert in the system. Graphical presentation of data aims at enhanced analysis of the patient's state and performance monitoring by the clinic staff. In general, the system is beneficial for clinic staff, patients, project managers and researchers.
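The rule-based scoring described above can be pictured with a minimal fuzzy sketch. The feature names, membership breakpoints and rule consequents below are invented placeholders, not the actual rule base of the system:

```python
# Hypothetical sketch of a fuzzy rule base scoring an on-screen motor test.
# Feature scales, membership functions and rule values are illustrative only.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def motor_state_score(tap_speed, accuracy):
    # Fuzzify inputs (assumed scales: taps/s in [0, 6], accuracy in [0, 1]).
    slow, fast = tri(tap_speed, 0, 1.5, 3), tri(tap_speed, 2, 4.5, 6)
    poor, good = tri(accuracy, 0, 0.3, 0.6), tri(accuracy, 0.4, 0.8, 1.0)
    # Expert rules: IF slow AND poor THEN state = -2 (very off), etc.
    rules = [(min(slow, poor), -2.0), (min(slow, good), -1.0),
             (min(fast, poor), 1.0), (min(fast, good), 0.0)]
    total = sum(w for w, _ in rules)
    # Defuzzify as firing-strength-weighted average; 0.0 reads as "on state".
    return sum(w * v for w, v in rules) / total if total else 0.0
```

Each test occasion would yield one such score, and a period-level score could simply average them.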
Abstract:
The aim of this work was to design a set of rules for levodopa infusion dose adjustment in Parkinson's disease based on simulation experiments. Using this simulator, optimal infusion doses under different conditions were calculated. There are seven conditions (-3 to +3) appearing in a rating scale for Parkinson's disease patients. By finding the mean of the differences between conditions and optimal dose, two sets of rules were designed. The set of rules was optimized through repeated testing. Its usefulness for optimizing the titration procedure of new infusion patients based on rule-based reasoning was investigated. Results show that both the number of steps and the errors in finding the optimal dose were reduced by the new rules. Finally, the new rules predicted the dose well on each single occasion for the majority of patients in the simulation experiments.
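A condition-driven titration rule of this kind can be sketched as a simple lookup. The rating scale (-3 to +3) follows the abstract; the per-condition adjustment fractions are invented placeholders, not the rules derived in the study:

```python
# Illustrative dose-adjustment rule set: map the rated condition
# (-3 = very off ... 0 = on ... +3 = very dyskinetic) to a relative
# dose change. The fractions below are assumptions for illustration.

ADJUSTMENT = {-3: +0.30, -2: +0.20, -1: +0.10,
               0:  0.00, +1: -0.10, +2: -0.20, +3: -0.30}

def next_dose(current_dose, condition):
    """Return the next infusion dose given the rated condition (-3..+3)."""
    return round(current_dose * (1.0 + ADJUSTMENT[condition]), 2)

def titrate(start_dose, conditions):
    """Apply the rule set over a sequence of rated occasions."""
    dose = start_dose
    for c in conditions:
        dose = next_dose(dose, c)
    return dose
```

Titration then converges in fewer steps the better the adjustment fractions match the simulator's dose-response behaviour.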
Abstract:
A major problem in e-service development is the prioritization of the requirements of different stakeholders. The main stakeholders are governments and their citizens, all of whom have different and sometimes conflicting requirements. In this paper, the prioritization problem is addressed by combining a value-based approach with an illustration technique. This paper examines the following research question: How can multiple stakeholder requirements be illustrated from a value-based perspective in order to be prioritizable? We used an e-service development case taken from a Swedish municipality to elaborate on our approach. Our contributions are: 1) a model of the relevant domains for requirement prioritization for government, citizens, technology, finances and laws and regulations; and 2) a requirement fulfillment analysis tool (RFA) that consists of a requirement-goal-value matrix (RGV), and a calculation and illustration module (CIM). The model reduces cognitive load, helps developers to focus on value fulfillment in e-service development and supports them in the formulation of requirements. It also offers an input to public policy makers, should they aim to target values in the design of e-services.
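The requirement-goal-value idea can be sketched as a small weighted matrix: requirements carry weights toward named values, and stakeholder value priorities turn those weights into a ranking. Requirement names, value names and all numbers below are invented for illustration, not the RGV/CIM tooling itself:

```python
# Hypothetical requirement-goal-value style prioritization.
# Each requirement links to values with a weight in [0, 1].

REQUIREMENTS = {
    "single sign-on": {"citizen convenience": 0.8, "cost efficiency": 0.2},
    "audit logging":  {"legal compliance": 0.9, "cost efficiency": 0.1},
    "mobile access":  {"citizen convenience": 0.7},
}

# Stakeholder-agreed priority of each value (higher = more important).
VALUE_PRIORITY = {"citizen convenience": 3, "legal compliance": 2,
                  "cost efficiency": 1}

def prioritize(requirements, priorities):
    """Order requirements by summed (weight x value priority)."""
    score = lambda links: sum(w * priorities[v] for v, w in links.items())
    return sorted(requirements, key=lambda r: score(requirements[r]),
                  reverse=True)
```

A calculation-and-illustration module would then plot these scores per value domain to make conflicts between stakeholder groups visible.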
Abstract:
The aim of this paper is to evaluate the performance of two divergent methods for delineating commuting regions, also called labour market areas, in a situation where the base spatial units differ greatly in size as a result of an irregular population distribution. Commuting patterns in Sweden have been analyzed with geographical information system technology by delineating commuting regions using two regionalization methods. One, a rule-based method, uses one-way commuting flows to delineate local labour market areas in a top-down procedure based on the selection of predefined employment centres. The other method, the interaction-based Intramax analysis, uses two-way flows in a bottom-up procedure based on numerical taxonomy principles. A comparison of these methods exposes a number of strengths and weaknesses. For both methods, the same data source has been used. The performance of both methods has been evaluated for the country as a whole using resident employed population, self-containment levels and job ratios as criteria. A more detailed evaluation has been done in the Goteborg metropolitan area by comparing regional patterns with the commuting fields of a number of urban centres in this area. It is concluded that both methods could benefit from the inclusion of additional control measures to identify improper allocations of municipalities.
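The core of the top-down, rule-based method can be sketched in a few lines: each municipality is allocated to the predefined employment centre that attracts its largest one-way commuting flow. Place names and flow counts are illustrative assumptions, and the real method adds thresholds and control measures omitted here:

```python
# Toy rule-based allocation of municipalities to employment centres
# by largest one-way commuting flow. Data are invented for illustration.

# flows[(origin, destination)] = number of commuters
flows = {("A", "Centre1"): 500, ("A", "Centre2"): 120,
         ("B", "Centre2"): 300, ("B", "Centre1"): 60}
centres = ["Centre1", "Centre2"]

def allocate(flows, centres):
    """Assign each non-centre origin to the centre receiving its largest flow."""
    regions = {c: [c] for c in centres}
    origins = {o for o, _ in flows} - set(centres)
    for o in sorted(origins):
        best = max(centres, key=lambda c: flows.get((o, c), 0))
        regions[best].append(o)
    return regions
```

Self-containment and job-ratio criteria would then be computed per resulting region to judge the delineation.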
Abstract:
Due to increasing demand for water and hydropower energy, it is becoming more important to operate hydraulic structures efficiently while sustaining multiple demands. In particular, companies, governmental agencies and consulting offices require effective, practical integrated tools and decision support frameworks to operate reservoirs, cascades of run-of-river plants and related elements such as canals, merging hydrological and reservoir simulation/optimization models with various numerical weather predictions, radar and satellite data. Model performance is strongly related to the streamflow forecast, its associated uncertainty and how that uncertainty is considered in decision making. While deterministic weather predictions and their corresponding streamflow forecasts restrict the manager to single deterministic trajectories, probabilistic forecasts can be a key solution by including uncertainty in flow forecast scenarios for dam operation. The objective of this study is to compare deterministic and probabilistic streamflow forecasts on an earlier developed basin/reservoir model for short-term reservoir management. The study is applied to the Yuvacık Reservoir and its upstream basin, the main water supply of Kocaeli City in the northwestern part of Turkey. The reservoir is a typical example with its limited capacity, downstream channel restrictions and high snowmelt potential. Mesoscale Model 5 and Ensemble Prediction System data are used as the main input, and flow forecasts are made for the year 2012 using HEC-HMS. A hydrometeorological rule-based reservoir simulation model is built with HEC-ResSim and integrated with the forecasts. Since the EPS-based hydrological model produces a large number of equally probable scenarios, it indicates how uncertainty spreads into the future and thus provides the operator with risk ranges for spillway discharges and reservoir level when compared with the deterministic approach.
The framework is fully data driven, applicable and useful to the profession, and the knowledge can be transferred to other similar reservoir systems.
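One way an ensemble-aware operating rule can use equally probable inflow scenarios is to release against an upper quantile of the ensemble so the pool stays below a flood-control level. The sketch below is a generic illustration under invented capacities and thresholds, not the HEC-ResSim rule set of the study:

```python
# Hedged sketch of an ensemble-aware release rule. All constants are
# assumed placeholders, not Yuvacik Reservoir parameters.

FLOOD_POOL = 60.0e6   # m^3, assumed flood-control storage level
MAX_RELEASE = 120.0   # m^3/s, assumed downstream channel restriction

def rule_based_release(storage, inflow_scenarios, quantile=0.9, dt=86400):
    """Smallest release keeping the q-quantile next-day storage below the pool."""
    ranked = sorted(inflow_scenarios)                      # m^3/s per scenario
    q_inflow = ranked[min(int(quantile * len(ranked)), len(ranked) - 1)]
    surplus = storage + q_inflow * dt - FLOOD_POOL         # m^3 over the pool
    release = max(0.0, surplus / dt)                       # back to m^3/s
    return min(release, MAX_RELEASE)
```

Running the rule over every ensemble member instead of one quantile would yield the risk ranges for spillway discharge and reservoir level mentioned above.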
Abstract:
The most widely used updating rule for non-additive probabilities is the Dempster-Shafer rule. Schmeidler and Gilboa have developed a model of decision making under uncertainty based on non-additive probabilities, and in their paper "Updating Ambiguous Beliefs" they justify the Dempster-Shafer rule based on a maximum likelihood procedure. This note shows, in the context of Schmeidler-Gilboa preferences under uncertainty, that the Dempster-Shafer rule is in general not ex-ante optimal. This contrasts with Brown's result that Bayes' rule is ex-ante optimal for standard Savage preferences with additive probabilities.
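For a capacity (non-additive probability) v on a finite state space, the Dempster-Shafer update on learning event E is commonly written v_E(A) = (v((A ∩ E) ∪ E^c) − v(E^c)) / (1 − v(E^c)). A minimal sketch, with an arbitrary example capacity of our own (not taken from the note):

```python
# Dempster-Shafer update of a capacity v, stored as {frozenset: value}.
# v_E(A) = (v((A & E) | ~E) - v(~E)) / (1 - v(~E)) for A contained in E.

def ds_update(v, space, E):
    comp_E = space - E
    denom = 1.0 - v[comp_E]
    return {A: (v[(A & E) | comp_E] - v[comp_E]) / denom
            for A in v if A <= E}

f = frozenset
S = f({1, 2, 3})
# Example capacity (illustrative; sub-additive off the full set).
v = {f(): 0.0, f({1}): 0.2, f({2}): 0.2, f({3}): 0.2,
     f({1, 2}): 0.5, f({1, 3}): 0.5, f({2, 3}): 0.5, S: 1.0}
```

For instance, updating on E = {1, 2} gives v_E({1}) = (0.5 − 0.2) / 0.8 = 0.375, while a Bayesian update of an additive measure would use v(A ∩ E) / v(E) instead.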
Abstract:
This study investigates the one-month-ahead, out-of-sample predictive power of a Taylor-rule-based model for exchange rate forecasting. We review relevant studies concluding that macroeconomic models can explain the short-run exchange rate, and also present studies that are skeptical about the ability of macroeconomic variables to predict exchange rate movements. To contribute to the topic, this work presents its own evidence by implementing the model with the best reported predictive performance in Molodtsova and Papell (2009), the "symmetric Taylor rule model with heterogeneous coefficients, smoothing, and a constant". To this end, we used a sample of 14 currencies against the US dollar, generating monthly out-of-sample forecasts from January 2000 to March 2014. Following the criterion adopted by Galimberti and Moura (2012), we focused on countries with a floating exchange rate regime and inflation targeting, but selected currencies of both developed and developing countries. Our results corroborate the study of Rogoff and Stavrakeva (2008) in finding that conclusions about exchange rate predictability depend on the statistical test adopted, so robust and rigorous tests are required for a proper evaluation of the model. After finding that the implemented model cannot be said to provide more accurate forecasts than a random walk, we assessed whether the model is at least capable of generating "rational", or "consistent", forecasts. For this, we used the theoretical and instrumental framework defined and implemented by Cheung and Chinn (1998) and concluded that the forecasts from the Taylor rule model are "inconsistent". Finally, we ran Granger causality tests to verify whether the lagged values of the returns predicted by the structural model explain the observed contemporaneous values.
We find that the fundamentals-based model is unable to anticipate realized returns.
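For reference, a Taylor-rule exchange-rate specification of the kind estimated in Molodtsova and Papell (2009) can be written in the following general form (our notation; a sketch of the regression family, not necessarily the exact estimated equation):

```latex
\Delta s_{t+1} = \alpha
  + \beta_{\pi}\,\pi_t - \beta_{\pi}^{*}\,\pi_t^{*}
  + \beta_{y}\,y_t^{\mathrm{gap}} - \beta_{y}^{*}\,y_t^{*\,\mathrm{gap}}
  + \rho\, i_{t-1} - \rho^{*}\, i_{t-1}^{*}
  + \eta_{t+1}
```

Here s is the log exchange rate, pi is inflation, y^gap is the output gap, and the lagged interest rates i capture interest rate smoothing; starred variables are foreign counterparts. "Heterogeneous coefficients" means home and foreign coefficients are not constrained to be equal, and "symmetric" means no real exchange rate term enters the rule.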
Abstract:
This work examines the social representation of television held by mothers/educators as TV viewers, in order to understand the meaning of this medium in their everyday lives and the relations that occur between teachers and students in the classroom. The study focuses on the educational role of television, based on a social representations approach. Through the discourses of five educators engaged in pedagogic activities in the public elementary schools of the city of Natal, it seeks to reveal a significant experience in the field of media education. It is also a way to understand how the representations educators hold about television can contribute to reflection and critical analysis of the media's meaning in teacher training. Among the questions at the basis of the investigation were: What is television for educators who are also TV viewers? How does it reach the classroom? Does their relation with the media interfere in pedagogic practice? Assuming that verbal technique is one of the formal ways to access representations, the methodological strategy employed was the open interview, guided by a broad and flexible schedule that left the interviewees free to expose their ideas; this attitude, adopted to avoid imposing the interviewer's points of view, yielded rich material. Through this strategy it was possible to confirm or reject presumptions raised at the beginning of the investigation and to modify some planned lines of direction. The study takes as its theoretical presupposition the contribution of the Mexican researcher Guillermo Orozco Gómez who, drawing on the ideas of Paulo Freire and Jesús Martín-Barbero, establishes a dialogue between popular education and communication theories, mainly television reception, in developing an integral view focused on the audience, or the multiple mediations model. The school, as well as the family, is an important mediator of media information.
The relationship teachers establish between television and their representations of it in their lives reflects effectively and directly on their professional practice and on the dialogue about media within the school, and it can contribute to the critical reflection students establish with the media through the educators' mediation.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
In this work, we propose a two-stage algorithm for real-time fault detection and identification of industrial plants. Our proposal is based on the analysis of selected features using recursive density estimation and a new evolving classifier algorithm. More specifically, the proposed approach for the detection stage is based on the concept of density in the data space, which is not the same as a probability density function, but is a very useful measure for abnormality/outlier detection. This density can be expressed by a Cauchy function and can be calculated recursively, which makes it memory and computational power efficient and, therefore, suitable for on-line applications. The identification/diagnosis stage is based on a self-developing (evolving) fuzzy rule-based classifier system proposed in this work, called AutoClass. An important property of AutoClass is that it can start "learning from scratch". Not only do the fuzzy rules not need to be prespecified, but neither does the number of classes for AutoClass (the number may grow, with new class labels being added by the on-line learning process), in a fully unsupervised manner. In the event that an initial rule base exists, AutoClass can evolve/develop it further based on the newly arrived faulty state data. In order to validate our proposal, we present experimental results from a level control didactic process, where control and error signals are used as features for the fault detection and identification systems; the approach is generic, however, and the number of features can be significant due to the computationally lean methodology, since covariance or more complex calculations, as well as storage of old data, are not required. The obtained results are significantly better than those of the traditional approaches used for comparison.
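The detection stage's recursive, Cauchy-type density can be sketched as follows. This is a generic recursive density estimation formulation under our own notation, not necessarily the exact variant used in the paper: the density of a new sample is computed from a recursively updated mean and mean squared norm, so no past samples are stored.

```python
# Sketch of recursive density estimation with a Cauchy-type kernel.
# Density of sample x_k: D = 1 / (1 + ||x_k - mu_k||^2 + X_k - ||mu_k||^2),
# with mu_k and X_k (mean of squared norms) updated recursively.

class RecursiveDensity:
    def __init__(self, dim):
        self.k = 0
        self.mu = [0.0] * dim   # running mean of samples
        self.X = 0.0            # running mean of squared norms

    def update(self, x):
        """Ingest sample x and return its density in (0, 1]."""
        self.k += 1
        a = (self.k - 1) / self.k
        self.mu = [a * m + xi / self.k for m, xi in zip(self.mu, x)]
        self.X = a * self.X + sum(xi * xi for xi in x) / self.k
        dist2 = sum((xi - m) ** 2 for xi, m in zip(x, self.mu))
        norm_mu2 = sum(m * m for m in self.mu)
        return 1.0 / (1.0 + dist2 + self.X - norm_mu2)
```

A sample far from the bulk of the data yields a low density, which can be thresholded to flag a fault before the evolving classifier labels it.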
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Oil spills cause great damage to coastal habitats, especially when rapid and suitable response measures are not taken. Establishing high-priority areas is fundamental for the operation of response teams. In this context, and considering the need to keep all geographical information up to date for emergency use, the present study proposes employing a decision tree coupled with a knowledge-based approach using GIS to assign oil sensitivity indices to Brazilian coastal habitats. The modelled system works from rules set by the official standards of Brazil's federal environmental agency. We tested it on one of the littoral regions of Brazil where transportation of petroleum is most intense: the coast of the municipalities of São Sebastião and Caraguatatuba in the northern littoral of São Paulo state, Brazil. The system automatically ranked the littoral sensitivity index of the study area habitats according to geographical conditions during summer and winter, since the index ranks of some habitats varied between these seasons because of sediment alterations. The obtained results illustrate the great potential of the proposed system for generating ESI maps and for aiding response teams during emergency operations.
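A rule-based sensitivity-index assignment of this kind can be sketched as an ordered IF-THEN table over habitat attributes, with a seasonal override for sediment change. The rule fragment below is invented for illustration and does not reproduce the official Brazilian standard:

```python
# Toy knowledge-based assignment of a littoral sensitivity index (ESI).
# Habitat types, grain classes and index values are assumptions.

RULES = [
    (lambda h: h["type"] == "mangrove",                              10),
    (lambda h: h["type"] == "sheltered tidal flat",                   9),
    (lambda h: h["type"] == "sand beach" and h["grain"] == "fine",    3),
    (lambda h: h["type"] == "sand beach" and h["grain"] == "coarse",  4),
    (lambda h: h["type"] == "exposed rocky shore",                    1),
]

def sensitivity_index(habitat, season="summer"):
    """Apply the first matching rule; seasonal sediment change can reclassify."""
    if season == "winter" and habitat.get("winter_grain"):
        habitat = dict(habitat, grain=habitat["winter_grain"])
    for condition, index in RULES:
        if condition(habitat):
            return index
    return None
```

Running the same rule table over a GIS habitat layer, once per season, yields the seasonal ESI maps described above.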
Abstract:
This paper presents an approach to integrating an artificial intelligence (AI) technique, namely rule-based processing, into mobile agents. In particular, it focuses on the design and implementation of an appropriate inference engine of small size to reduce migration costs. The main goal is to combine two lines of agent research: first, the engineering-oriented approach to mobile agent architectures, and, second, the AI-related approach to inference engines driven by rules expressed in a restricted subset of first-order predicate logic (FOPL). In addition to size reduction, the main functions of this type of engine were isolated, generalized and implemented as dynamic components, making possible not only their migration with the agent, but also their dynamic migration and loading on demand. A set of classes for representing and exchanging knowledge between rule-based systems is also proposed.
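The kernel of such a small rule-driven inference engine is a forward-chaining fixpoint loop over the rule base. The sketch below is our own minimal illustration over ground facts (the restricted FOPL subset, rule syntax and sample knowledge base are assumptions, not the paper's engine):

```python
# Minimal forward-chaining inference engine over ground facts.
# Each rule is (premises, conclusion); rules fire until a fixpoint.

def forward_chain(facts, rules):
    """Derive the closure of `facts` under `rules` (naive fixpoint)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Illustrative mobile-agent rule base.
rules = [
    (("mobile", "low_bandwidth"), "compress_payload"),
    (("compress_payload",), "use_binary_encoding"),
]
```

Shipping only this loop plus the agent's own rule set, and loading other engine functions on demand, is one way the migration-cost reduction described above could play out.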