899 results for Expert System. Rule-based System. Inference Engine. Rules. Alarm Management. Alarm filtering


Relevance: 100.00%

Abstract:

INTRODUCTION: Subclinical hypothyroidism (SCH), defined by elevated TSH concentrations in the presence of normal thyroid hormone levels, has a high prevalence in Brazil, particularly among women and the elderly. Although a growing number of studies have associated SCH with increased risk of coronary artery disease and mortality, there is no randomized clinical trial on the benefit of levothyroxine treatment in reducing these risks, and treatment remains controversial. OBJECTIVE: This consensus, sponsored by the Thyroid Department of the Brazilian Society of Endocrinology and Metabolism and developed by Brazilian experts with extensive clinical experience in thyroid disorders, presents evidence-based recommendations for the clinical management of patients with SCH in Brazil. MATERIALS AND METHODS: After the clinical questions had been structured, the search for available evidence was performed first in the MedLine-PubMed database and then in the Embase and SciELO - Lilacs databases. The strength of the evidence, rated with the Oxford classification system, was established from the study design employed, considering the best available evidence for each question and Brazilian experience. RESULTS: The topics addressed were definition and diagnosis, natural history, clinical significance, treatment, and pregnancy, resulting in 29 recommendations for the clinical management of adult patients with SCH. CONCLUSION: Levothyroxine treatment was recommended for all patients with persistent SCH and serum TSH levels > 10 mU/L, and for some special patient subgroups.
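
Since the search topic is rule-based systems, the consensus decision logic above lends itself to a small, hedged sketch. Only the TSH > 10 mU/L treatment threshold comes from the abstract; the reference ranges and function names are illustrative assumptions, not part of the consensus document.

```python
# A minimal rule-based sketch of the consensus logic summarized above. Only
# the TSH > 10 mU/L treatment threshold comes from the abstract; the reference
# ranges below are illustrative assumptions, not values from the consensus.

TSH_UPPER = 4.0          # assumed upper limit of the TSH reference range (mU/L)
FT4_RANGE = (0.8, 1.8)   # assumed free-T4 reference range (ng/dL)

def classify(tsh, ft4):
    """Classify a thyroid profile and suggest the consensus action."""
    if tsh > TSH_UPPER and FT4_RANGE[0] <= ft4 <= FT4_RANGE[1]:
        # Subclinical hypothyroidism: elevated TSH with normal free T4.
        if tsh > 10.0:
            return "SCH: levothyroxine recommended (TSH > 10 mU/L)"
        return "SCH: individualize treatment (special subgroups only)"
    return "not subclinical hypothyroidism"

print(classify(tsh=12.3, ft4=1.1))
```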

Relevance: 100.00%

Abstract:

The aim of this paper is to present a procedure that uses C-13 NMR for the identification of substituent groups bonded to the carbon skeletons of natural products. To this end, a new version of the program (MACRONO) was developed, featuring a database of 161 substituent types found in a wide variety of terpenoids. This new version was extensively tested in the identification of the substituents of 60 compounds which, after removal of the signals that did not belong to the carbon skeleton, served to test skeleton prediction by other programs of the expert system (SISTEMAT). (C) 2002 Elsevier B.V. Ltd. All rights reserved.
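
A hedged sketch of the kind of lookup such a procedure performs, matching observed 13C shifts against substituent signatures; the database entries and the tolerance are illustrative assumptions, not MACRONO data.

```python
# Match observed 13C chemical shifts against a database of substituent
# signatures. The shift values and the tolerance below are invented for
# illustration; they are not entries from the actual MACRONO database.

SUBSTITUENT_DB = {
    "acetoxy":  [170.0, 21.0],   # assumed characteristic shifts (ppm)
    "methoxy":  [56.0],
    "carboxyl": [178.0],
}

def identify_substituents(observed_ppm, tolerance=1.5):
    """Return substituents whose every characteristic shift is observed."""
    hits = []
    for name, signature in SUBSTITUENT_DB.items():
        if all(any(abs(obs - ref) <= tolerance for obs in observed_ppm)
               for ref in signature):
            hits.append(name)
    return hits

# Example: a spectrum containing acetoxy-like signals.
print(identify_substituents([170.3, 21.2, 78.5, 36.9]))  # -> ['acetoxy']
```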

Relevance: 100.00%

Abstract:

We present a new expert system: a constraints generator for structure determination of natural products. The constraints the system furnishes are: the skeleton (reliability: 95%), large substructures (reliability: 98%), and their associated assignments (reliability: 90%). This system is intended for the structure determination of carbon-rich compounds (sesqui-, di- and triterpenes, sterols, etc.), for which most structure generators are not very effective. We also present a new algorithm that can avoid the combinatorial explosion during subspectrum/substructure analysis.
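
The abstract does not describe the new algorithm itself. The sketch below illustrates one generic way to tame combinatorial explosion in subspectrum/substructure matching, pruning a partial assignment as soon as it becomes infeasible; it is an assumed illustration, not the paper's algorithm.

```python
# Enumerate consistent subspectrum-to-signal assignments with early pruning:
# a branch is abandoned as soon as a shift mismatch makes it infeasible,
# so incompatible subtrees are never expanded.

def match(subspectra, signals, partial=()):
    """Yield tuples assigning each subspectrum shift to a distinct signal."""
    if len(partial) == len(subspectra):
        yield partial
        return
    target = subspectra[len(partial)]
    for sig in signals:
        if sig in partial:
            continue                 # each signal assigned at most once
        if abs(target - sig) > 2.0:  # prune: incompatible shift (ppm)
            continue
        yield from match(subspectra, signals, partial + (sig,))

print(list(match([21.0, 170.0], [20.6, 77.1, 170.4])))  # -> [(20.6, 170.4)]
```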

Relevance: 100.00%

Abstract:

This paper describes a new computer approach for chemotaxonomic studies. The methodology employed enables the search for chemical substructures as taxonomic descriptors using an expert system built with plant natural products. The operation of the system was tested with diterpenes as taxonomic markers in Lamiaceae. © 2001 Elsevier Science Ltd.
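
As a hedged illustration of substructures serving as taxonomic descriptors, one can score how strongly each skeleton's occurrences concentrate in a single taxon. The occurrence data and the scoring rule below are invented, not SISTEMAT output.

```python
# Score each substructure by the fraction of its occurrences that fall in a
# given taxon; a score near 1.0 marks a candidate taxonomic descriptor.
from collections import defaultdict

# taxon -> substructure keys found in its compounds (invented data)
OCCURRENCES = {
    "Lamiaceae":  ["abietane", "abietane", "clerodane", "kaurane"],
    "Asteraceae": ["kaurane", "kaurane", "germacrane"],
}

def marker_scores():
    """Per taxon, fraction of each substructure's occurrences it holds."""
    totals = defaultdict(int)
    per_taxon = defaultdict(lambda: defaultdict(int))
    for taxon, subs in OCCURRENCES.items():
        for s in subs:
            totals[s] += 1
            per_taxon[taxon][s] += 1
    return {t: {s: n / totals[s] for s, n in subs.items()}
            for t, subs in per_taxon.items()}

print(marker_scores()["Lamiaceae"])  # abietane and clerodane score 1.0
```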

Relevance: 100.00%

Abstract:

This work describes the development of a new program, named SISTAX, for the expert system SISTEMAT. The program allows anyone interested in chemotaxonomy to carry out an intelligent search for organic compounds in databases through chemical structures. When coupled with an efficient encoding system, the program recognizes skeletal types and can satisfy any substructural constraints demanded by the user. An example of an application of the program to the diterpene class found in plants is described.
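
A minimal sketch of this kind of constrained structure search, assuming a plain string-tag encoding in place of the SISTEMAT encoding and an invented compound database.

```python
# Filter a compound database by skeletal type plus user-supplied
# substructural constraints. Records and feature tags are illustrative.

COMPOUNDS = [
    {"name": "cpd-1", "skeleton": "labdane",
     "features": {"exocyclic C=C", "OH at C-3"}},
    {"name": "cpd-2", "skeleton": "clerodane",
     "features": {"furan ring"}},
    {"name": "cpd-3", "skeleton": "labdane",
     "features": {"furan ring", "OH at C-3"}},
]

def search(skeleton, required_features):
    """Return compounds of a skeletal type satisfying all constraints."""
    return [c["name"] for c in COMPOUNDS
            if c["skeleton"] == skeleton
            and required_features <= c["features"]]   # set containment

print(search("labdane", {"OH at C-3"}))  # -> ['cpd-1', 'cpd-3']
```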

Relevance: 100.00%

Abstract:

Supply chain management, postponement, and demand management operations are of strategic importance to the economic success of organizations because they directly influence the production process. The aim of this paper is to analyze the influence of postponement in an enterprise production system with make-to-stock operation and seasonal demand. The research method used was a case study; the instruments of data collection were semi-structured interviews, document analysis, and site visits. The research is based on the following concepts. Demand management can be understood as a practice that allows the supply chain to be managed and coordinated in reverse, with consumers triggering the actions that lead to product delivery. Supply chain management makes it possible to add value and exceed consumer expectations by developing win-win relationships with suppliers and customers. The postponement strategy must fit the characteristics of markets that demand a variety of customized products and services at lower cost and higher quality, and it aims to support decision making. The make-to-stock production system is of considerable interest to organizations operating in markets with high demand variability. © 2011 IEEE.

Relevance: 100.00%

Abstract:

Organizations often operate in turbulent environments characterized by intense competitiveness, constant technological progress, new market requirements, and scarce natural resources. This scenario imposes a constant need for change in companies' operations and management. The integration of certifiable management systems is an effective alternative in this respect. The objective of the present study is to propose guidelines for the integration of the ISO 9001 Quality Management System (QMS), the ISO 14001 Environmental Management System (EMS), and the OHSAS 18001 Occupational Health and Safety Management System (OHSMS) in industrial companies. These guidelines were developed based on a theoretical framework and on the results of fourteen case studies performed in Brazilian industrial companies. The proposed guidelines are divided into three phases: a) integration planning, b) integration development, and c) integration control and improvement.

Relevance: 100.00%

Abstract:

Graduate program in Agronomy (Energy in Agriculture) - FCA

Relevance: 100.00%

Abstract:

Graduate program in Mechanical Engineering - FEG

Relevance: 100.00%

Abstract:

Disembedded institutions greatly widen the scope of time-space distanciation, since the separation of time and space is crucial to the extreme dynamism of modernity. This separation is the prime condition of the process of disembedding, in which social relations are lifted out of local contexts of interaction and restructured across indefinite spans of time and space. Accordingly, two types of disembedding mechanisms are intrinsically involved in the development of modern social institutions: symbolic tokens and expert systems. Both mechanisms depend on trust. Within this context, the present paper views the professional life cycle of the teacher from the perspective of the expert system, and the job stability of public servants - more specifically, public school teachers in Brazil - from the perspective of the symbolic token. The paper also proposes some reflections on what disembedding would mean in Brazilian education and on how an eventual impasse between teacher-centrism and student-centrism would affect this process, if such a process is necessary or desired, given what one understands by modernity.

Relevance: 100.00%

Abstract:

Background: Arboviral diseases are major global public health threats. Yet, our understanding of infection risk factors is, with a few exceptions, considerably limited. A crucial shortcoming is the widespread use of analytical methods generally not suited for observational data - particularly null hypothesis-testing (NHT) and step-wise regression (SWR). Using Mayaro virus (MAYV) as a case study, here we compare information theory-based multimodel inference (MMI) with conventional analyses for arboviral infection risk factor assessment. Methodology/Principal Findings: A cross-sectional survey of anti-MAYV antibodies revealed 44% prevalence (n = 270 subjects) in a central Amazon rural settlement. NHT suggested that residents of village-like household clusters and those using closed toilet/latrines were at higher risk, while living in non-village-like areas, using bednets, and owning fowl, pigs or dogs were protective. The "minimum adequate" SWR model retained only residence area and bednet use. Using MMI, we identified relevant covariates, quantified their relative importance, and estimated effect sizes (β ± SE) on which to base inference. Residence area (β(Village) = 2.93 ± 0.41; β(Upland) = -0.56 ± 0.33; β(Riverbanks) = -2.37 ± 0.55) and bednet use (β = -0.95 ± 0.28) were the most important factors, followed by crop-plot ownership (β = 0.39 ± 0.22) and regular use of a closed toilet/latrine (β = 0.19 ± 0.13); domestic animals had insignificant protective effects and were relatively unimportant. The SWR model ranked fifth among the 128 models in the final MMI set. Conclusions/Significance: Our analyses illustrate how MMI can enhance inference on infection risk factors when compared with NHT or SWR. MMI indicates that forest crop-plot workers are likely exposed to typical MAYV cycles maintained by diurnal, forest-dwelling vectors; however, MAYV might also be circulating in nocturnal, domestic-peridomestic cycles in village-like areas. This suggests either a vector shift (synanthropic mosquitoes vectoring MAYV) or a habitat/habits shift (classical MAYV vectors adapting to densely populated landscapes and nocturnal biting); any such ecological/adaptive novelty could increase the likelihood of MAYV emergence in Amazonia.
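
A hedged sketch of the information-theoretic machinery behind MMI may help: candidate models are compared via Akaike weights, and a covariate's relative importance is the summed weight of the models containing it. The model set, AIC values, and covariate names below are invented for illustration; only the weighting procedure itself is standard.

```python
import math

def akaike_weights(aics):
    """Turn a list of AIC values into normalized Akaike weights."""
    best = min(aics)
    raw = [math.exp(-0.5 * (a - best)) for a in aics]
    total = sum(raw)
    return [r / total for r in raw]

# Each candidate model is a subset of covariates with a fitted AIC
# (all values invented for illustration).
models = {
    ("residence", "bednet"):          512.3,
    ("residence",):                   518.9,
    ("bednet",):                      540.1,
    ("residence", "bednet", "crops"): 513.0,
}
weights = dict(zip(models, akaike_weights(list(models.values()))))

# Relative importance of a covariate = sum of weights of models using it.
for cov in ("residence", "bednet", "crops"):
    w = sum(wt for m, wt in weights.items() if cov in m)
    print(f"{cov}: relative importance {w:.2f}")
```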

Relevance: 100.00%

Abstract:

This thesis presents Bayesian solutions to inference problems for three types of social network data structures: a single observation of a social network, repeated observations on the same social network, and repeated observations on a social network developing through time. A social network is conceived as a structure consisting of actors and their social interaction with each other. A common conceptualisation of social networks is to let the actors be represented by nodes in a graph with edges between pairs of nodes that are relationally tied to each other according to some definition. Statistical analysis of social networks is to a large extent concerned with the modelling of these relational ties, which lends itself to empirical evaluation. The first paper deals with a family of statistical models for social networks called exponential random graphs that take various structural features of the network into account. In general, the likelihood functions of exponential random graphs are only known up to a constant of proportionality. A procedure for performing Bayesian inference using Markov chain Monte Carlo (MCMC) methods is presented. The algorithm consists of two basic steps, one in which an ordinary Metropolis-Hastings updating step is used, and another in which an importance sampling scheme is used to calculate the acceptance probability of the Metropolis-Hastings step. In the second paper, a method for modelling reports given by actors (or other informants) on their social interaction with others is investigated in a Bayesian framework. The model contains two basic ingredients: the unknown network structure and functions that link this unknown network structure to the reports given by the actors. These functions take the form of probit link functions. An intrinsic problem is that the model is not identified, meaning that there are combinations of values of the unknown structure and the parameters in the probit link functions that are observationally equivalent. Instead of using restrictions to achieve identification, it is proposed that the different observationally equivalent combinations of parameters and unknown structure be investigated a posteriori. Estimation of parameters is carried out using Gibbs sampling with a switching device that enables transitions between posterior modal regions. The main goal of the procedures is to provide tools for comparisons of different model specifications. Papers 3 and 4 propose Bayesian methods for longitudinal social networks. The premise of the models investigated is that overall change in social networks occurs as a consequence of sequences of incremental changes. Models for the evolution of social networks using continuous-time Markov chains are meant to capture these dynamics. Paper 3 presents an MCMC algorithm for exploring the posteriors of parameters for such Markov chains. More specifically, the unobserved evolution of the network in between observations is explicitly modelled, thereby avoiding the need to deal with explicit formulas for the transition probabilities. This enables likelihood-based parameter inference in a wider class of network evolution models than has been available before. Paper 4 builds on the proposed inference procedure of Paper 3 and demonstrates how to perform model selection for a class of network evolution models.
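
For the first paper's two-step update, a rough sketch may convey the flavor. It relies on the importance-sampling identity z(theta')/z(theta) = E_theta[exp((theta' - theta) s(Y))] for an exponential random graph with statistic s; the one-dimensional statistic, proposal, and demo values below are illustrative assumptions, not the thesis's actual implementation.

```python
import math, random

def log_unnorm(theta, stat):
    """Unnormalized ERGM log-likelihood: theta * s(y) for a 1-D statistic."""
    return theta * stat

def mh_step(theta, stat_obs, sampled_stats, step=0.1):
    """One Metropolis-Hastings update with an estimated constant ratio."""
    theta_new = theta + random.gauss(0.0, step)
    # Importance-sampling estimate of z(theta_new)/z(theta), using graph
    # statistics simulated (elsewhere) from the model at theta.
    ratio_z = sum(math.exp((theta_new - theta) * s)
                  for s in sampled_stats) / len(sampled_stats)
    log_alpha = (log_unnorm(theta_new, stat_obs)
                 - log_unnorm(theta, stat_obs)
                 - math.log(ratio_z))
    return theta_new if math.log(random.random()) < log_alpha else theta

theta, s_obs = 0.2, 35.0
sims = [33.0, 36.5, 34.2, 35.8]   # pretend these were simulated at theta
theta = mh_step(theta, s_obs, sims)
print(theta)
```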

Relevance: 100.00%

Abstract:

In this work we aim to propose a new approach for preliminary epidemiological studies on Standardized Mortality Ratios (SMR) collected across many spatial regions. A preliminary study on SMRs aims to formulate hypotheses to be investigated via individual epidemiological studies, which avoid the biases carried by aggregated analyses. Starting from the collected disease counts and the expected disease counts computed from reference population disease rates, an SMR is derived in each area as the MLE under a Poisson assumption on each observation. Such estimators have high standard errors in small areas, i.e. where the expected count is low either because of the small population underlying the area or because of the rarity of the disease under study. Disease mapping models, and other techniques for screening disease rates across the map in order to detect anomalies and possible high-risk areas, have been proposed in the literature under both the classical and the Bayesian paradigm. Our proposal approaches this issue with a decision-oriented method focused on multiple testing control, without leaving the preliminary-study perspective that an analysis of SMR indicators calls for. We implement control of the FDR, a quantity widely used to address multiple comparison problems in the field of microarray data analysis but not usually employed in disease mapping. Controlling the FDR means providing an estimate of the FDR for a set of rejected null hypotheses. The small-areas issue raises difficulties in applying traditional methods for FDR estimation, which are usually based only on knowledge of the p-values (Benjamini and Hochberg, 1995; Storey, 2003). Tests evaluated by a traditional p-value have weak power in small areas, where the expected number of disease cases is small. Moreover, the tests cannot be assumed independent when spatial correlation between SMRs is expected, nor are they identically distributed when the population underlying the map is heterogeneous. The Bayesian paradigm offers a way to overcome the inappropriateness of p-value-based methods. Another peculiarity of the present work is to propose a hierarchical fully Bayesian model for FDR estimation when testing many null hypotheses of absence of risk. We use concepts from Bayesian disease mapping models, referring in particular to the Besag, York and Mollié model (1991), often used in practice for its flexible prior assumption on the distribution of risks across regions. The borrowing of strength between prior and likelihood, typical of a hierarchical Bayesian model, has the advantage of evaluating a single test (i.e. a test in a single area) by means of all the observations in the map under study, rather than by means of the single observation alone. This improves the power of the test in small areas and addresses more appropriately the spatial correlation issue, which suggests that relative risks are closer in spatially contiguous regions. The proposed model aims to estimate the FDR by means of the MCMC-estimated posterior probabilities b_i of the null hypothesis (absence of risk) in each area. An estimate of the expected FDR conditional on the data (denoted FDR-hat) can be calculated for any set of b_i's relative to areas declared at high risk (where the null hypothesis is rejected) by averaging the b_i's themselves. FDR-hat provides an easy decision rule for selecting high-risk areas, i.e. selecting as many areas as possible such that FDR-hat does not exceed a prespecified value; we call these FDR-based decision (or selection) rules.
The sensitivity and specificity of such a rule depend on the accuracy of the FDR estimate: over-estimation of the FDR causes a loss of power, and under-estimation of the FDR produces a loss of specificity. Moreover, our model has the interesting feature of still being able to provide an estimate of the relative risk values, as in the Besag, York and Mollié model (1991). A simulation study was set up to evaluate the model's performance in terms of FDR estimation accuracy, sensitivity and specificity of the decision rule, and goodness of estimation of the relative risks. We chose a real map from which we generated several spatial scenarios whose disease counts vary according to the degree of spatial correlation, the size of the areas, the number of areas where the null hypothesis is true, and the risk level in the latter areas. In summarizing the simulation results we always consider the FDR estimation in sets constituted by all the b_i's lower than a threshold t. We show graphs of FDR-hat and the true FDR (known by simulation) plotted against the threshold t to assess the FDR estimation. By varying the threshold we can learn which FDR values can be accurately estimated by a practitioner willing to apply the model (from the closeness between FDR-hat and the true FDR). By plotting the calculated sensitivity and specificity (both known by simulation) against FDR-hat we can check the sensitivity and specificity of the corresponding FDR-based decision rules. To investigate the over-smoothing level of the relative risk estimates we compare box-plots of such estimates in high-risk areas (known by simulation), obtained by both our model and the classic Besag, York and Mollié model. All the summary tools are worked out for all simulated scenarios (54 scenarios in total). Results show that the FDR is well estimated (in the worst case we get an over-estimation, hence a conservative FDR control) in the scenarios with small areas, low risk levels and spatially correlated risks, which are our primary aim. In such scenarios we obtain good estimates of the FDR for all values less than or equal to 0.10. The sensitivity of FDR-based decision rules is generally low, but specificity is high; in these scenarios the use of a selection rule based on FDR-hat = 0.05 or FDR-hat = 0.10 can be suggested. In cases where the number of true alternative hypotheses (the number of truly high-risk areas) is small, FDR values up to 0.15 are also well estimated, and decision rules based on FDR-hat = 0.15 gain power while maintaining a high specificity. On the other hand, in scenarios with non-small areas and non-small risk levels the FDR is under-estimated except for very small values (much lower than 0.05), resulting in a loss of specificity of a decision rule based on FDR-hat = 0.05. In such scenarios decision rules based on FDR-hat = 0.05 or, even worse, FDR-hat = 0.10 cannot be suggested, because the true FDR is actually much higher. As regards the relative risk estimation, our model achieves almost the same results as the classic Besag, York and Mollié model. For this reason, our model is interesting for its ability to perform both the estimation of relative risk values and FDR control, except in the scenarios with non-small areas and large risk levels. A case study is finally presented to show how the method can be used in epidemiology.
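
The selection rule lends itself to a compact illustration: given posterior null probabilities b_i, the estimated FDR of a rejection set is the average of the b_i's it contains, and areas are added in order of increasing b_i while that average stays within the target. The probabilities below are invented; this is a sketch of the stated rule, not the thesis's code.

```python
def fdr_selection(b, target=0.05):
    """Select as many areas as possible with estimated FDR <= target."""
    order = sorted(range(len(b)), key=lambda i: b[i])  # most convincing first
    selected, running = [], 0.0
    for i in order:
        if (running + b[i]) / (len(selected) + 1) > target:
            break                       # adding area i would exceed the target
        running += b[i]
        selected.append(i)
    return selected, (running / len(selected) if selected else 0.0)

# Invented posterior probabilities of "no excess risk" for six areas.
b = [0.01, 0.02, 0.30, 0.04, 0.90, 0.08]
areas, est_fdr = fdr_selection(b, target=0.05)
print(areas, round(est_fdr, 4))  # -> [0, 1, 3, 5], estimated FDR 0.0375
```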

Relevance: 100.00%

Abstract:

The supply of mineral resources and the protection of the environment are often regarded as opposing and irreconcilable activities, but in reality they are two indispensable needs of modern societies. Since they are non-renewable, georesources must be exploited efficiently, using tools that guarantee the environmental, social, and economic sustainability of extraction operations. The need to protect the territory and to improve the quality of life of local communities requires public administrations to implement measures for the requalification of degraded areas, but until the early 1990s the sector regulations provided no tools for this purpose, which led to the proliferation of extraction sites closed and abandoned without environmental restoration. This research work provides innovative contributions to the sustainable planning and design of extraction activities, through a multidisciplinary approach to the subject and the expert use of Geographic Information Systems, in particular GRASS GIS. Following an in-depth analysis of the tools and procedures adopted in the planning of extraction activities in Italy, a survey method and an expert system were developed for the prediction and control of ground vibrations induced by blasting in open-pit quarries; these make it possible to optimize blast design and the vibration monitoring system thanks to specific operational tools implemented in GRASS GIS. To support a more effective programming of territorial requalification measures, a procedure was devised for selecting abandoned sites and potential requalification interventions, which optimizes planning activities by identifying interventions characterized by high environmental, economic, and social sustainability. The results obtained demonstrate the need for an expert approach to the planning and design of extraction activities, increasing their sustainability through the adoption of more efficient operational tools.
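
The abstract does not state which vibration-prediction model the expert system implements. The scaled-distance attenuation law PPV = K * (D / sqrt(Q))^(-b) is shown below purely as a common assumption in this field, with invented site constants; in practice K and b are fitted from monitoring data.

```python
# An assumed illustration of blast-vibration prediction via the scaled-
# distance law; K and b below are invented, not fitted site constants.

def ppv(distance_m, charge_kg, K=1140.0, b=1.6):
    """Predicted peak particle velocity (mm/s) from the scaled-distance law."""
    scaled_distance = distance_m / charge_kg ** 0.5   # D / sqrt(Q)
    return K * scaled_distance ** (-b)

# Example: 120 kg of explosive per delay, monitored 300 m from the blast.
print(round(ppv(300.0, 120.0), 2))  # predicted PPV in mm/s (~5.7)
```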

Relevance: 100.00%

Abstract:

In the last couple of decades we have witnessed a reappraisal of spatial design-based techniques. Usually, the information regarding the spatial location of the individuals of a population has been used to develop efficient sampling designs. This thesis offers a new technique for inference on both individual values and global population values that is able to employ, at the estimation stage, the spatial information available before sampling, by rewriting a deterministic interpolator within a design-based framework. The resulting point estimator of the individual values is treated both in the case of finite spatial populations and in that of continuous spatial domains, while the theory for the estimator of the global population value covers the finite population case only. A fairly broad simulation study compares the results of the point estimator with the simple random sampling without replacement estimator in predictive form and with kriging, the benchmark technique for inference on spatial data. The Monte Carlo experiment is carried out on populations generated according to different superpopulation methods in order to control different aspects of the spatial structure. The simulation outcomes show that the proposed point estimator behaves almost the same as the kriging predictor regardless of the parameters adopted for generating the populations, especially for low sampling fractions. Moreover, the use of the spatial information substantially improves design-based spatial inference on individual values.
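
The abstract does not name the deterministic interpolator that is recast in design-based form; the sketch below uses inverse-distance weighting purely as an illustrative stand-in, with invented sample data, to show how a sampled spatial population yields a prediction at an unsampled location.

```python
import math

def idw_predict(target, sample, power=2.0):
    """IDW prediction at `target` from [(x, y, value), ...] sampled points."""
    num = den = 0.0
    for x, y, v in sample:
        d = math.hypot(target[0] - x, target[1] - y)
        if d == 0.0:
            return v            # target coincides with a sampled unit
        w = d ** (-power)       # closer units receive more weight
        num += w * v
        den += w
    return num / den

# A without-replacement spatial sample of three units (invented values).
sample = [(0, 0, 10.0), (1, 0, 12.0), (0, 1, 11.0)]
print(round(idw_predict((0.5, 0.5), sample), 2))  # -> 11.0 (all equidistant)
```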