Abstract:
This paper presents a methodology and software for hazard rate analysis of induction type watt-hour meters, considering the main variables related to the degradation process of these meters, for Elektro Electricity and Services SA. The model developed to calculate the watt-hour meter hazard rate was implemented in a tool with a user-friendly interface, written in the Delphi language. The tool enables not only hazard rate analysis, but also classification by risk range and localization of the installation sites of the analyzed meters, and it allows, through an expert system, the sampling of induction type watt-hour meters based on the risk model developed with artificial intelligence, with the main goal of following and managing the degradation, maintenance and replacement processes of these meters. © 2010 IEEE.
Specialist tool for monitoring the measurement degradation process of induction active energy meters
Abstract:
This paper presents a methodology and a specialist tool for failure probability analysis of induction type watt-hour meters, considering the main variables related to their measurement degradation processes. The database of the metering park of a distribution company, Elektro Electricity and Services Co., was used to determine the most relevant variables and to feed the data into the software. The model developed to calculate the watt-hour meter probability of failure was implemented in a tool with a user-friendly interface, written in the Delphi language. Among the main features of this tool are: analysis of probability of failure by risk range; geographical localization of the meters in the metering park; and automatic sampling of induction type watt-hour meters, based on a risk classification expert system, in order to obtain information to aid the management of these meters. The main goals of the specialist tool are following and managing the measurement degradation, maintenance and replacement processes for induction watt-hour meters. © 2011 IEEE.
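Neither abstract details the rule base, and the original tool was written in Delphi; purely as an illustration of the kind of rule-based risk classification and sampling described above, a minimal Python sketch (with hypothetical variables, thresholds and quotas, not Elektro's model) could look like this:

from dataclasses import dataclass

@dataclass
class Meter:
    meter_id: str
    age_years: float      # time in service
    avg_load_kw: float    # typical load seen by the meter
    error_pct: float      # last measured registration error

def risk_range(m: Meter) -> str:
    """Apply simple if-then rules and return a risk-range label."""
    if m.error_pct > 3.0 or m.age_years > 25:
        return "high"
    if m.error_pct > 1.5 or (m.age_years > 15 and m.avg_load_kw > 5.0):
        return "medium"
    return "low"

def sample_for_inspection(meters, quota_per_range):
    """Pick up to quota_per_range[r] meters of each risk range r for field inspection."""
    picked = {"high": [], "medium": [], "low": []}
    for m in meters:
        r = risk_range(m)
        if len(picked[r]) < quota_per_range.get(r, 0):
            picked[r].append(m.meter_id)
    return picked

park = [Meter("A1", 27, 3.2, 1.0), Meter("B2", 10, 6.1, 2.0), Meter("C3", 5, 1.0, 0.2)]
print(sample_for_inspection(park, {"high": 1, "medium": 1, "low": 1}))

A real expert system of this kind would read its rules and quotas from a maintained knowledge base rather than hard-coding them, but the control flow (classify, then sample by risk range) is the same.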
Abstract:
The aim of this article is to assess risk factors for health and milk production in organic and conventional dairy goat systems in Brazil. Two experimental groups (organic and conventional) were evaluated simultaneously. The study design was completely randomized. The organic herd consisted of 25 goats and 15 kids. In the conventional production system, a dairy herd comprising 40 goats and 20 kids participated in the study. Data on milk production and health management were available from January 2007 to December 2009. The abortion rate in the conventional system was 5% (2/40), whereas no abortion was diagnosed in the organic system (0/25). The mortality rate at weaning was 5% (2/40) in the conventional system and 8% (2/25) in the organic system. Milk production was lower in the organic system (2.20 kg/day) than in the conventional system (2.66 kg/day). Goats and kids on the organic farm had higher faecal egg counts (FEC) (386±104 and 900±204, respectively) (p<0.05) than those on the conventional farm (245±132 and 634±212, respectively). In addition, Saanen kids had higher FEC (p<0.001) than goats. Treatment with antiparasitic drugs was more frequent in the conventional system (50%) than in the organic system (1.3%).
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
This study aims to analyze the implications of the Fundação Itaú Social program "Excelência em Gestão Educacional" (Excellence in Educational Management) for the management of Brazilian public schools, in terms of the theoretical and methodological orientations contained in the documents that frame the partnership. To this end, documentary research was carried out which, through content analysis, examined the documents related to this program. The analyses showed that the management model advocated by the Fundação Itaú for Brazilian education is that of American charter schools, which are funded by the public sector but administered by the private sector. These schools are presented as having significantly improved educational indicators in the USA. However, it was found that the concrete reality does not match what is presented by the Excellence in Management Program: the market-based management model, which links concepts such as quality, participation, decentralization, autonomy and evaluation to the idea of managing resources for the productivity of the educational system, has not been able to improve the American educational system. On the contrary, it has further aggravated the crisis of public education in that country. In Brazil there are already experiences along these lines, and analyses of the charter schools implemented in Pernambuco revealed that their management adopts managerial standards brought from the business world. Thus, the introduction of market principles was observed, such as managerial administration, the definition of goals and results expressed in their strategic plans, merit pay for teachers and the generalization of assessment tests, among others. In this context, school autonomy is understood as greater accountability of teachers and principals, and above all of the school manager as the leader of the whole process, for the school's success or failure. Moreover, in these schools there is no pedagogical autonomy, since the pedagogical project is drawn up according to productivity criteria previously defined by the body responsible for implementing these schools (PROCENTRO). The participation that develops in this context is no more than a mere process of collaboration, one-way, of adhesion, of obedience to decisions taken from the top down. It is clear that this model of management and of schooling does not contribute to the democratization of power relations in the school and, consequently, to the formation of citizenship.
Abstract:
We consider some of the relations that exist between real Szegö polynomials and certain para-orthogonal polynomials defined on the unit circle, which are again related to certain orthogonal polynomials on [-1, 1] through the transformation x = (z^{1/2} + z^{-1/2})/2. Using these relations we study the interpolatory quadrature rule based on the zeros of polynomials which are linear combinations of the orthogonal polynomials on [-1, 1]. In the case of any symmetric quadrature rule on [-1, 1], its associated quadrature rule on the unit circle is also given.
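To make the transformation concrete (an illustrative identity, assuming only that z = e^{iθ} lies on the unit circle):

x = (z^{1/2} + z^{-1/2})/2 = (e^{iθ/2} + e^{-iθ/2})/2 = cos(θ/2), which sweeps [-1, 1] as θ runs over [0, 2π].

Each point of the unit circle is thus carried to a node in [-1, 1], which is the mechanism by which a symmetric quadrature rule on [-1, 1] acquires an associated rule on the unit circle.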
Abstract:
Pós-graduação em Ciências Ambientais - Sorocaba
Abstract:
This qualitative, exploratory, descriptive study was performed with the objective of understanding the perception of the nurses working in the medical-surgical units of a university hospital regarding the strategies developed to perform a pilot test of the PROCEnf-USP electronic system, with the purpose of computerizing clinical nursing documentation. Eleven nurses who took part in a theoretical-practical training program were interviewed, and the data obtained were analyzed using the Content Analysis Technique. The following categories were discussed based on the frameworks of participative management and planned change: aspects favorable to the implementation; aspects unfavorable to the implementation; and expectations regarding the implementation. According to the nurses' perceptions, the preliminary use of the electronic system allowed them to show their potential and to propose improvements, encouraging them to become partners of the managing group in disseminating the system to the other nurses of the institution.
Abstract:
The vast majority of known proteins have not yet been experimentally characterized and little is known about their function. The design and implementation of computational tools can provide insight into the function of proteins based on their sequence, their structure, their evolutionary history and their association with other proteins. Knowledge of the three-dimensional (3D) structure of a protein can lead to a deep understanding of its mode of action and interaction, but currently the structures of <1% of sequences have been experimentally solved. For this reason, it has become urgent to develop new methods that are able to computationally extract relevant information from protein sequence and structure. The starting point of my work has been the study of the properties of contacts between protein residues, since they constrain protein folding and characterize different protein structures. Prediction of residue contacts in proteins is an interesting problem whose solution may be useful in protein fold recognition and de novo design. Predicting these contacts requires studying the inter-residue distances associated with each specific type of amino acid pair, which are encoded in the so-called contact map. An interesting new way of analyzing these structures emerged when network studies were introduced, with pivotal papers demonstrating that protein contact networks also exhibit small-world behavior. In order to highlight constraints for the prediction of protein contact maps, and for applications in the field of protein structure prediction and/or reconstruction from experimentally determined contact maps, I studied to what extent the characteristic path length and clustering coefficient of the protein contact network reveal characteristic features of protein contact maps. Provided that residue contacts are known for a protein sequence, the major features of its 3D structure can be deduced by combining this knowledge with correctly predicted motifs of secondary structure. In the second part of my work I focused on a particular protein structural motif, the coiled-coil, known to mediate a variety of fundamental biological interactions. Coiled-coils are found in a variety of structural forms and in a wide range of proteins including, for example, small units such as leucine zippers that drive the dimerization of many transcription factors, or more complex structures such as the family of viral proteins responsible for virus-host membrane fusion. The coiled-coil structural motif is estimated to account for 5-10% of the protein sequences in the various genomes. Given their biological importance, I introduced a Hidden Markov Model (HMM) that exploits the evolutionary information derived from multiple sequence alignments to predict coiled-coil regions and to discriminate coiled-coil sequences. The results indicate that the new HMM outperforms all the existing programs and can be adopted for coiled-coil prediction and for large-scale genome annotation. Genome annotation is a key issue in modern computational biology, being the starting point towards the understanding of the complex processes involved in biological networks. The rapid growth in the number of protein sequences and structures available poses new fundamental problems that still deserve an interpretation. Nevertheless, these data are at the basis of the design of new strategies for tackling problems such as the prediction of protein structure and function.
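The abstract does not give code; as a minimal sketch of the two network statistics it mentions, the following Python fragment (using networkx and an invented toy contact map) computes the characteristic path length and the clustering coefficient of a contact network:

import numpy as np
import networkx as nx

def contact_network_stats(contact_map):
    """Characteristic path length and clustering coefficient of a binary contact map."""
    G = nx.from_numpy_array(np.asarray(contact_map))
    if not nx.is_connected(G):
        # path length is only defined on a connected graph; keep the largest component
        G = G.subgraph(max(nx.connected_components(G), key=len)).copy()
    return nx.average_shortest_path_length(G), nx.average_clustering(G)

n = 12
cmap = np.zeros((n, n), dtype=int)
for i in range(n - 1):                     # contacts between sequence neighbours
    cmap[i, i + 1] = cmap[i + 1, i] = 1
cmap[0, n - 1] = cmap[n - 1, 0] = 1        # one long-range contact
L, C = contact_network_stats(cmap)
print(f"characteristic path length = {L:.2f}, clustering coefficient = {C:.2f}")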
Experimental determination of the functions of all these proteins would be a hugely time-consuming and costly task and, in most instances, has not been carried out. For example, only approximately 20% of the annotated proteins in the Homo sapiens genome have currently been experimentally characterized. A commonly adopted procedure for annotating protein sequences relies on "inheritance through homology", based on the notion that similar sequences share similar functions and structures. This procedure consists in assigning sequences to a specific group of functionally related sequences which have been grouped through clustering techniques. The clustering procedure is based on suitable similarity rules, since predicting protein structure and function from sequence largely depends on the value of sequence identity. However, additional levels of complexity are due to multi-domain proteins, to proteins that share common domains but do not necessarily share the same function, and to the finding that different combinations of shared domains can lead to different biological roles. In the last part of this study I developed and validated a system that contributes to sequence annotation by taking advantage of a validated inheritance-based transfer procedure for molecular functions and structural templates. After a cross-genome comparison with the BLAST program, clusters were built on the basis of two stringent constraints on sequence identity and coverage of the alignment. The adopted measure explicitly addresses the problem of multi-domain protein annotation and allows a fine-grained division of the whole set of proteomes used, which ensures cluster homogeneity in terms of sequence length. A high level of coverage of structural templates over the length of the protein sequences within clusters ensures that multi-domain proteins, when present, can be templates for sequences of similar length. This annotation procedure includes the possibility of reliably transferring statistically validated functions and structures to sequences, considering the information available in the current databases of molecular functions and structures.
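The actual thresholds are not stated in the abstract; as a hedged illustration of clustering under joint identity and coverage constraints of the kind described (with made-up thresholds and toy pairwise hits that stand in for parsed BLAST output), a union-find sketch in Python:

IDENTITY_MIN = 0.40   # hypothetical: fraction of identical residues over the alignment
COVERAGE_MIN = 0.90   # hypothetical: aligned fraction of BOTH sequences, curbing multi-domain mixing

def cluster(seq_lengths, hits):
    parent = {s: s for s in seq_lengths}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x

    for a, b, identity, aln_len in hits:
        cov_a = aln_len / seq_lengths[a]
        cov_b = aln_len / seq_lengths[b]
        if identity >= IDENTITY_MIN and min(cov_a, cov_b) >= COVERAGE_MIN:
            parent[find(a)] = find(b)       # link the two sequences

    clusters = {}
    for s in seq_lengths:                   # clusters = connected components
        clusters.setdefault(find(s), []).append(s)
    return list(clusters.values())

lengths = {"P1": 300, "P2": 310, "P3": 150}
hits = [("P1", "P2", 0.62, 295), ("P1", "P3", 0.55, 140)]   # P3 fails coverage on P1
print(cluster(lengths, hits))

Requiring coverage on both sequences is what keeps a single shared domain from pulling proteins of very different lengths into one cluster, which is the point the abstract makes about multi-domain proteins.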
Abstract:
The subject of this thesis is in the area of Applied Mathematics known as Inverse Problems. Inverse problems are those where a set of measured data is analysed in order to get as much information as possible on a model which is assumed to represent a system in the real world. We study two inverse problems in the fields of classical and quantum physics: QCD condensates from tau-decay data and the inverse conductivity problem. Despite a concentrated effort by physicists extending over many years, an understanding of QCD from first principles continues to be elusive. Fortunately, data continue to appear which provide a rather direct probe of the inner workings of the strong interactions. We use a functional method which allows us to extract, under rather general assumptions, phenomenological parameters of QCD (the condensates) from a comparison of time-like experimental data with asymptotic space-like results from theory. The price to be paid for the generality of the assumptions is relatively large errors in the values of the extracted parameters. Although we do not claim that our method is superior to other approaches, we hope that our results lend additional confidence to the numerical results obtained with the help of methods based on QCD sum rules. Electrical impedance tomography (EIT) is a technique developed to image the electrical conductivity distribution of a conductive medium. The technique works by performing simultaneous measurements of direct or alternating electric currents and voltages on the boundary of an object. These are the data used by an image reconstruction algorithm to determine the electrical conductivity distribution within the object. In this thesis, two approaches to EIT image reconstruction are proposed. The first is based on reformulating the inverse problem in terms of integral equations; this method uses only a single set of measurements for the reconstruction. The second approach is an algorithm based on linearisation which uses more than one set of measurements. A promising result is that one can qualitatively reconstruct the conductivity inside the cross-section of a human chest. Even though the human volunteer is neither two-dimensional nor circular, such reconstructions can be useful in medical applications: monitoring for lung problems such as accumulating fluid or a collapsed lung, and noninvasive monitoring of heart function and blood flow.
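As a hedged illustration of the linearised approach only (not the thesis algorithm), the core step is a regularised least-squares update of the conductivity around a reference value: boundary voltages respond approximately linearly, v ≈ v0 + J·δσ, so one update solves a Tikhonov-regularised normal equation. A minimal Python sketch with a synthetic sensitivity matrix and invented sizes:

import numpy as np

rng = np.random.default_rng(0)
n_meas, n_pix = 64, 100                      # measurements, conductivity pixels (toy sizes)
J = rng.normal(size=(n_meas, n_pix))         # stand-in sensitivity (Jacobian) matrix
dsigma_true = np.zeros(n_pix)
dsigma_true[40:45] = 1.0                     # a small conductivity perturbation
dv = J @ dsigma_true + 0.01 * rng.normal(size=n_meas)   # noisy boundary data

lam = 1.0                                    # regularisation weight
dsigma_rec = np.linalg.solve(J.T @ J + lam * np.eye(n_pix), J.T @ dv)

print("mean true perturbation on its support:", dsigma_true[40:45].mean())
print("mean reconstructed value there:       ", round(dsigma_rec[40:45].mean(), 2))

In a real EIT code the Jacobian comes from a forward model of the current-voltage physics and several measurement sets are stacked, but the algebra of the linearised update is as above.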
Abstract:
Modern internal combustion engines are becoming increasingly complex in terms of their control systems and strategies. The growing complexity of the control algorithms increases the number of quantities that must be evaluated on board for control purposes. In order to improve combustion efficiency and, simultaneously, limit the amount of pollutant emissions, the on-board evaluation of two quantities in particular has become essential: the indicated torque produced by the engine, and the angular position at which 50% of the fuel mass injected over an engine cycle has burned (MFB50). These quantities can be evaluated through the measurement of in-cylinder pressure. Nonetheless, at present, the installation of in-cylinder pressure sensors on vehicles is extremely uncommon, mainly because of measurement reliability and cost. This work illustrates a methodological approach for the estimation of indicated torque and MFB50 that is based on the measurement of engine speed fluctuations. The methodology is compatible with typical on-board application constraints. Moreover, it requires no additional cost, since speed can be measured using the system already mounted on the vehicle, which consists of a magnetic pick-up facing a toothed wheel. The estimation algorithm consists of two main parts: first, the evaluation of the indicated torque fluctuation based on the speed measurement; and second, the evaluation of the mean value of the indicated torque (over an engine cycle) and of MFB50 by using their relationship with the indicated torque harmonics and other engine quantities. The procedure has been successfully applied to an L4 turbocharged Diesel engine mounted on board a vehicle.
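The paper's calibration maps are not given in the abstract; purely as a sketch of the first step (extracting an engine-order harmonic from a crank-speed trace), with an invented synthetic signal and a purely hypothetical linear calibration linking the harmonic amplitude to mean indicated torque, in Python:

import numpy as np

def order_amplitude(speed_rpm, order, revs_in_window=2):
    """Amplitude of a given engine order from a speed trace spanning revs_in_window revolutions."""
    spectrum = np.fft.rfft(speed_rpm - np.mean(speed_rpm))
    k = int(order * revs_in_window)            # FFT bin of that order in this window
    return 2.0 * np.abs(spectrum[k]) / len(speed_rpm)

N = 720                                        # one sample per crank degree over a full 4-stroke cycle
crank = np.deg2rad(np.arange(N))
speed = 2000 + 15 * np.cos(2 * crank)          # synthetic 2nd-order fluctuation (L4 firing order)
a2 = order_amplitude(speed, order=2)

k_cal, offset = 8.0, 40.0                      # hypothetical calibration coefficients
print(f"2nd-order amplitude ~ {a2:.1f} rpm, estimated mean indicated torque ~ {k_cal * a2 + offset:.0f} Nm")

The actual mapping from speed harmonics to indicated torque and MFB50 depends on the engine's inertia and on calibrated engine models, which the abstract only alludes to.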
Abstract:
BACKGROUND AND OBJECTIVE: Sleep disturbances are prevalent but often overlooked or underestimated. We suspected that sleep disorders might be particularly common among pharmacy customers, and that they could benefit from counselling. Therefore, we described the prevalence and severity of symptoms associated with sleep and wakefulness disorders among Swiss pharmacy customers, and estimated the need for counselling and treatment. METHODS: In 804 Swiss pharmacies (49% of all community pharmacies), clients were invited to complete the Stanford Sleep Disorders Questionnaire (SDQ) and the Epworth Sleepiness Scale (EPW). The SDQ was designed to classify symptoms of sleep and wakefulness into the four most prevalent disorders: sleep apnoea syndrome (SAS), insomnia in psychiatric disorders (PSY), periodic leg movement disorder/restless legs (RLS) and narcolepsy (NAR). Data were entered into an internet-linked database for analysis by an expert system as a basis for immediate counselling by the pharmacist. RESULTS: Of 4901 participants, 3238 (66.1%) were female and 1663 (33.9%) were male. The mean age (SD) of females and males was 52.4 (18.05) and 55.1 (17.10) years, respectively. The percentages of female and male individuals above the cut-off of the SDQ subscales were 11.4% and 19.8% for sleep apnoea, 40.9% and 38.7% for psychiatric sleep disorders, 59.3% and 46.8% for restless legs, and 10.4% and 9.4% for narcolepsy, respectively. The prevalence of an Epworth Sleepiness Scale score >11 was 16.5% in females and 23.9% in males. Reliability assessed by Cronbach's alpha was 0.65 to 0.78 for the SDQ subscales and the Epworth score. CONCLUSIONS: Symptoms of sleep and wakefulness disorders were highly prevalent among Swiss pharmacy customers. The SDQ and the Epworth Sleepiness Scale had satisfactory reliability and are useful for identifying pharmacy customers who might benefit from information and counselling while visiting pharmacies. The internet-based system proved to be a helpful tool for pharmacists when counselling their customers in terms of diagnostic classification and severity of symptoms associated with the sleeping and waking state.
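As an illustration of how such an expert system could turn subscale scores into counselling flags (a minimal sketch: the Epworth threshold of >11 comes from the abstract, while the SDQ subscale cut-offs below are placeholders, since the abstract does not report them):

SDQ_CUTOFFS = {"SAS": 3, "PSY": 3, "RLS": 3, "NAR": 3}   # hypothetical subscale cut-offs
EPWORTH_CUTOFF = 11                                       # quoted in the abstract (score > 11)

def counselling_flags(sdq_scores, epworth_score):
    """Return the list of domains that exceed their cut-off for one customer."""
    flags = [scale for scale, score in sdq_scores.items() if score > SDQ_CUTOFFS[scale]]
    if epworth_score > EPWORTH_CUTOFF:
        flags.append("excessive daytime sleepiness")
    return flags

print(counselling_flags({"SAS": 4, "PSY": 1, "RLS": 2, "NAR": 0}, epworth_score=13))
# -> ['SAS', 'excessive daytime sleepiness']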
Abstract:
We describe a system for performing SLA-driven management and orchestration of distributed infrastructures composed of services supporting mobile computing use cases. In particular, we focus on a Follow-Me Cloud scenario in which we consider mobile users accessing cloud-enabled services. We combine an SLA-driven approach to infrastructure optimization with forecast-based preventive actions against performance degradation and with pattern detection, to support mobile cloud infrastructure management. We present our system's information model and architecture, including the algorithmic support and the proposed scenarios for system evaluation.
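The abstract does not specify the forecasting or decision logic; as a loose sketch of what "forecast-based preventive action against an SLA objective" can mean in practice (the objective, margin, forecaster and action names below are all invented for illustration), in Python:

SLA_LATENCY_MS = 200.0          # hypothetical service-level objective
SAFETY_MARGIN = 0.9             # act before the limit is actually reached

def forecast_next(samples):
    """Naive linear-trend forecast over the last few latency samples."""
    slope = (samples[-1] - samples[0]) / (len(samples) - 1)
    return samples[-1] + slope

def preventive_action(samples):
    if forecast_next(samples) > SAFETY_MARGIN * SLA_LATENCY_MS:
        return "scale out / migrate the service closer to the mobile user"
    return "no action"

print(preventive_action([120, 140, 165, 172]))   # rising trend -> act before the SLA is violated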
Abstract:
Spike timing dependent plasticity (STDP) is a phenomenon in which the precise timing of spikes affects the sign and magnitude of changes in synaptic strength. STDP is often interpreted as the comprehensive learning rule for a synapse - the "first law" of synaptic plasticity. This interpretation is made explicit in theoretical models in which the total plasticity produced by complex spike patterns results from a superposition of the effects of all spike pairs. Although such models are appealing for their simplicity, they can fail dramatically. For example, the measured single-spike learning rule between hippocampal CA3 and CA1 pyramidal neurons does not predict the existence of long-term potentiation, one of the best-known forms of synaptic plasticity. Layers of complexity have been added to the basic STDP model to repair predictive failures, but they have been outstripped by experimental data. We propose an alternate first law: neural activity triggers changes in key biochemical intermediates, which act as a more direct trigger of plasticity mechanisms. One particularly successful model uses intracellular calcium as the intermediate and can account for many observed properties of bidirectional plasticity. In this formulation, STDP is not itself the basis for explaining other forms of plasticity, but is instead a consequence of changes in the biochemical intermediate, calcium. Eventually, a mechanism-based framework for learning rules should include other messengers, discrete changes at individual synapses, the spread of plasticity among neighboring synapses, and the priming of hidden processes that change a synapse's susceptibility to future change. Mechanism-based models provide a rich framework for the computational representation of synaptic plasticity.
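As a minimal sketch of a calcium-threshold plasticity rule of the general kind the abstract refers to (calcium as the intermediate variable: moderate calcium depresses the synapse, high calcium potentiates it), with thresholds and rates that are illustrative rather than fitted to any data set, in Python:

THETA_D, THETA_P = 0.35, 0.55   # hypothetical depression and potentiation calcium thresholds
ETA = 0.1                        # learning rate

def dw(calcium, w):
    """Weight change for one update step, given the current calcium level."""
    if calcium >= THETA_P:
        return ETA * (1.0 - w)       # potentiation, saturating at w = 1
    if calcium >= THETA_D:
        return -ETA * w              # depression, saturating at w = 0
    return 0.0                       # sub-threshold calcium leaves the synapse unchanged

w = 0.5
for ca in (0.2, 0.4, 0.7):
    w += dw(ca, w)
    print(f"calcium = {ca:.1f} -> w = {w:.3f}")

In a full model the calcium trace itself would be driven by the pre- and postsynaptic spike times, so timing-dependent plasticity emerges as a consequence of the calcium dynamics rather than being the primitive rule.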