901 results for System Compositional Approach
Abstract:
Dyslipidaemia is one of the major cardiovascular risk factors. It can be due to primary causes, i.e. monogenic dyslipidaemia characterized by a single gene mutation, or dyslipidaemia of polygenic/environmental origin, or it can be secondary to specific disorders such as obesity, diabetes mellitus or hypothyroidism. Monogenic patients present the most severe phenotype and therefore need to be identified at an early age so that pharmacological treatment can be implemented to decrease cardiovascular risk. However, the majority of hyperlipidaemic patients most likely have a polygenic disease that can largely be controlled by the implementation of a healthy lifestyle alone. Thus, the distinction between monogenic and polygenic dyslipidaemia is important for prompt diagnosis, cardiovascular risk assessment, counselling and treatment. Besides established biomarkers such as LDL, apoB and the apoB/apoA-I ratio, other promising biomarkers for the clinical differentiation between dyslipidaemias, although still needing further research, are apoE, sdLDL, apoC-2 and apoC-3. However, none of these biomarkers can explain the complex lipid profile of the majority of these patients.
Abstract:
257 p.
Abstract:
A thermodynamic information system for the diagnosis and prognosis of an existing power plant was developed. The system is based on an analytic approach that reports the current thermodynamic condition of all cycle components, as well as the improvement in cycle performance that can be obtained by eliminating the detected anomalies. The effects that anomalies and repairs in one component induce on the efficiency of other components, which have proven to be one of the main drawbacks in diagnosis and prognosis analyses, are taken into account through the use of performance curves and corrected performance curves together with the thermodynamic data collected from the distributed control system. The approach used to develop the system is explained, its implementation in a real gas turbine cogeneration combined cycle is described, and the results are discussed.
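As a minimal sketch of the underlying diagnosis step, assuming a hypothetical reference performance curve and operating data (none of the numbers come from the paper), the comparison of a measured efficiency against the expectation from a corrected performance curve could look like this:

```python
import numpy as np

# Hypothetical reference performance curve: isentropic efficiency of a
# compressor versus corrected mass flow (fractions of the design point).
ref_flow = np.array([0.7, 0.8, 0.9, 1.0, 1.1])
ref_eff = np.array([0.82, 0.85, 0.87, 0.88, 0.86])

def expected_efficiency(corrected_flow):
    """Efficiency expected at the current operating point (curve lookup)."""
    return np.interp(corrected_flow, ref_flow, ref_eff)

def diagnose(corrected_flow, measured_eff, tolerance=0.01):
    """Flag an anomaly when the measured efficiency falls below the
    corrected-curve expectation by more than the tolerance, so that true
    anomalies are separated from operating-point-induced variations."""
    expected = expected_efficiency(corrected_flow)
    deviation = expected - measured_eff
    return {"expected": round(float(expected), 4),
            "deviation": round(float(deviation), 4),
            "anomaly": bool(deviation > tolerance)}

# Example: the unit runs at 95 % corrected flow but shows 84 % efficiency.
print(diagnose(0.95, 0.84))
```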
Abstract:
The activity of control centre operators is important to guarantee the effective performance of power systems. Operators' actions are crucial for dealing with incidents, especially severe faults such as blackouts. In this paper we present an Intelligent Tutoring approach for training Portuguese control centre operators in tasks such as incident analysis and diagnosis, and service restoration of power systems. The Intelligent Tutoring System (ITS) approach is used in the training of the operators, taking into account context awareness and unobtrusive integration into the working environment.
Abstract:
This master's thesis develops a framework for the preliminary design of a product data management (PDM) system. The framework has three dimensions: value creation, functionality, and software. It helps identify the value-creation components that can be influenced through the product data management functionalities offered by particular categories of software. The system-design perspective of the framework is exploited in the studied company cases, based on the relationships between the dimensions, which are modeled in the form of a calculation matrix. The matrix is fed with the importance ratings that the value-creation and functionality components received in an interview study carried out at the target company. The output of the matrix is the suitability of a given software product for that company's case. Suitability is a set of indicator values that are analyzed in the results-processing phase. The suitability results assist the target company in choosing its approach to product data management, and they describe the preliminarily designed PDM system. Building the framework requires a thorough approach to defining the relevant value-creation and functionality components as well as the software categories. This definition work builds on the methods and component groupings elaborated in detail in the thesis. Analyzing each of these areas makes it possible to construct the framework and the calculation matrix on the basis of consistent definitions. A characteristic of the framework is its adaptability: in its current form it suits electronics and high-tech companies, but it can also be exploited in other industries by adapting the value-creation components to the interests of each industry. Correspondingly, the software to be analyzed can be selected case by case; the calculation matrix must first be updated with the capabilities of the selected software, after which the framework can produce suitability results for the company case in question.
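As a rough sketch of how such a calculation matrix can work (all component names, weights and scores below are hypothetical, not taken from the thesis), the importance of the value-creation components can be propagated through the dimension relationships to score one software product:

```python
import numpy as np

# Hypothetical importance of three value-creation components (from interviews).
value_importance = np.array([0.5, 0.3, 0.2])

# Hypothetical relation matrix: how strongly each of four PDM functionalities
# (columns) supports each value-creation component (rows), on a 0..1 scale.
relation = np.array([
    [1.0, 0.5, 0.0, 0.2],
    [0.2, 1.0, 0.7, 0.0],
    [0.0, 0.3, 1.0, 0.6],
])

# Hypothetical capability scores of one software product per functionality.
capability = np.array([0.9, 0.6, 0.8, 0.4])

# Weight the functionalities by the value they create, then score the product.
functionality_weight = value_importance @ relation
suitability = functionality_weight @ capability
print("functionality weights:", functionality_weight.round(2))
print("suitability indicator:", round(float(suitability), 3))
```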
Abstract:
Parameter estimation remains a challenge in many important applications, and there is a need for methods that exploit the growing capabilities of modern computational systems. Owing to this, different kinds of Evolutionary Algorithms are becoming a particularly promising field of research. The main aim of this thesis is to explore theoretical aspects of a specific class of Evolutionary Algorithms, the Differential Evolution (DE) method, and to implement the algorithm as code capable of solving a wide range of problems. Matlab, a numerical computing environment provided by MathWorks Inc., has been used for this purpose. Our implementation empirically demonstrates the benefits of stochastic optimizers over deterministic optimizers on stochastic and chaotic problems. Furthermore, the advanced features of Differential Evolution are discussed and taken into account in the Matlab realization. Toy-case test examples are presented to show the advantages and disadvantages introduced by extensions of the basic algorithm. Another aim of this thesis is to apply the DE approach to the parameter estimation problem of a system exhibiting chaotic behaviour, where the well-known Lorenz system with a specific set of parameter values is taken as an example. Finally, the DE approach for the estimation of chaotic dynamics is compared to the Ensemble Prediction and Parameter Estimation System (EPPES) approach, which was recently proposed as a possible solution for similar problems.
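For reference, a minimal sketch of the basic DE/rand/1/bin scheme that such work builds on, written here in Python rather than the Matlab used in the thesis and applied to a standard toy problem:

```python
import numpy as np

def differential_evolution(f, bounds, pop_size=30, F=0.8, CR=0.9, gens=200, seed=0):
    """Basic DE/rand/1/bin minimizer over box constraints."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = len(lo)
    pop = lo + rng.random((pop_size, dim)) * (hi - lo)
    cost = np.array([f(x) for x in pop])
    for _ in range(gens):
        for i in range(pop_size):
            # Mutation: perturb one individual with the difference of two others.
            idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # Binomial crossover, forcing at least one coordinate from the mutant.
            mask = rng.random(dim) < CR
            mask[rng.integers(dim)] = True
            trial = np.where(mask, mutant, pop[i])
            # Greedy selection: keep the trial if it is no worse.
            trial_cost = f(trial)
            if trial_cost <= cost[i]:
                pop[i], cost[i] = trial, trial_cost
    best = int(np.argmin(cost))
    return pop[best], cost[best]

# Toy example: the 2-D Rosenbrock function, whose minimum is at (1, 1).
rosenbrock = lambda x: (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2
print(differential_evolution(rosenbrock, [(-2.0, 2.0), (-2.0, 2.0)]))
```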
Abstract:
This paper presents a general modeling approach to investigate and predict measurement errors in active energy meters of both induction and electronic types. The measurement error modeling is based on the Generalized Additive Model (GAM), the ridge regression method, and experimental meter results provided by a measurement system. The measurement system provides a database of 26 pairs of test waveforms captured in a real electrical distribution system with different load characteristics (industrial, commercial, agricultural, and residential), covering different harmonic distortions as well as balanced and unbalanced voltage conditions. To illustrate the proposed approach, the measurement error models are discussed and several results derived from the experimental tests are presented in the form of three-dimensional graphs and generalized as error equations.
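As an illustrative sketch only, with synthetic data standing in for the 26-waveform database and scikit-learn's polynomial-plus-ridge pipeline standing in for the paper's GAM/ridge formulation, an error model of this kind can be fitted as follows:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Synthetic stand-in for the meter test database: each row is a test waveform
# described by voltage THD (%) and voltage unbalance (%); the target is the
# meter's relative registration error (%). The "error law" below is invented.
rng = np.random.default_rng(1)
X = rng.uniform([0.0, 0.0], [10.0, 3.0], size=(26, 2))   # 26 test conditions
y = (0.05 * X[:, 0] + 0.4 * X[:, 1]
     - 0.02 * X[:, 0] * X[:, 1]
     + rng.normal(0.0, 0.05, 26))

# Smooth additive polynomial terms + a ridge penalty give a regularized
# error surface that can then be plotted or reported as an error equation.
model = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                      Ridge(alpha=1.0))
model.fit(X, y)
print("predicted error at THD=8%, unbalance=2%:",
      round(float(model.predict([[8.0, 2.0]])[0]), 3), "%")
```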
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The translation from psychiatric core symptoms to brain functions, and vice versa, is a largely unresolved issue. In particular, the search for disorders of single brain regions explaining classical symptoms has not yielded the expected results. Based on the assumption that the psychopathology of psychosis is related to a functional imbalance of higher-order brain systems, the authors focused on three candidate brain circuitries: the language, limbic and motor systems. These domains are of particular interest for understanding the disastrous communication breakdown during psychotic disorders. Core symptoms of psychosis were mapped onto these domains by shaping their definitions to match the related brain functions. The resulting psychopathological assessment scale was tested for inter-rater reliability and internal consistency in a group of 168 psychotic patients. The items of the scale were reliable, and a principal component analysis (PCA) was best explained by a solution resembling the three candidate systems. Based on the results, the scale was optimized as an instrument to identify patient subgroups characterized by a prevailing dysfunction of one or more of these systems. In conclusion, the scale is apt to distinguish symptom domains related to the activity of defined brain systems. The PCA showed a certain degree of independence of the system-specific symptom clusters within the patient group, indicating relative subgroups of psychosis. The scale is understood as a research instrument for investigating psychoses with a system-oriented approach. Possible immediate clinical advantages of understanding psychoses in terms of system-specific symptom domains are also discussed.
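A minimal sketch of the PCA step, using synthetic ratings constructed so that three latent "systems" drive nine hypothetical items (the real scale's items and data differ):

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic stand-in for the rating data: 168 patients x 9 symptom items,
# generated so that items 0-2, 3-5 and 6-8 covary within three hypothetical
# domains (language, limbic, motor).
rng = np.random.default_rng(0)
domain_severity = rng.normal(size=(168, 3))      # one latent factor per system
loadings = np.zeros((3, 9))
for k in range(3):
    loadings[k, 3 * k:3 * k + 3] = 1.0           # each item loads on one system
ratings = domain_severity @ loadings + 0.5 * rng.normal(size=(168, 9))

# A three-component solution should recover the three item clusters.
pca = PCA(n_components=3)
scores = pca.fit_transform(ratings)
print("explained variance ratios:", pca.explained_variance_ratio_.round(2))
print("item loadings on first component:", pca.components_[0].round(2))
```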
Abstract:
Testing is nowadays the most widely used technique to validate software and assess its quality. It is integrated into all practical software development methodologies and plays a crucial role in the success of any software project. From the smallest units of code to the most complex components, their integration into a software system and later deployment, all pieces of a software product must be tested thoroughly before the product can be released. The main limitation of software testing is that it remains a mostly manual task, representing a large fraction of the total development cost. In this scenario, test automation is paramount to alleviate such high costs. Test case generation (TCG) is the process of automatically generating test inputs that achieve high coverage of the system under test. Among a wide variety of approaches to TCG, this thesis focuses on structural (white-box) TCG, where one of the most successful enabling techniques is symbolic execution. In symbolic execution, the program under test is executed with its input arguments being symbolic expressions rather than concrete values. This thesis relies on a previously developed constraint-based TCG framework for imperative object-oriented programs (e.g., Java), in which the imperative program under test is first translated into an equivalent constraint logic program, and the translated program is then symbolically executed by relying on the standard evaluation mechanisms of Constraint Logic Programming (CLP), extended with special treatment for dynamically allocated data structures. Improving the scalability and efficiency of symbolic execution constitutes a major challenge. It is well known that symbolic execution quickly becomes impractical due to the large number of paths that must be explored and the size of the constraints that must be handled. Moreover, symbolic-execution-based TCG tends to produce an unnecessarily large number of test cases when applied to medium-sized or large programs. The contributions of this dissertation can be summarized as follows. (1) A compositional approach to CLP-based TCG is developed which overcomes inter-procedural path explosion by separately analyzing each component (method) of the program under test, storing the results as method summaries and incrementally reusing them to obtain whole-program results. A similar compositional strategy based on program specialization (partial evaluation) is also developed for the state-of-the-art symbolic execution tool Symbolic PathFinder (SPF). (2) Resource-driven TCG is proposed as a methodology that uses resource consumption information to drive symbolic execution towards those parts of the program under test that comply with a user-provided resource policy, avoiding the exploration of the parts of the program that violate it. (3) A generic methodology is proposed to guide symbolic execution towards the most interesting parts of a program, using abstractions as oracles to steer the execution through the parts of the program under test that interest the programmer or tester most. (4) A new heap-constraint solver is proposed, which efficiently handles heap-related constraints and aliasing of references during symbolic execution and greatly outperforms the standard state-of-the-art technique known as lazy initialization. (5) All of the above techniques have been implemented in the PET system (and some of them in the SPF tool). Experimental evaluation has confirmed that they considerably improve the scalability and efficiency of symbolic execution and TCG.
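A toy sketch of the compositional idea in contribution (1), using the z3 solver and a hypothetical abs callee rather than the PET or SPF machinery: the callee is summarized once as (path condition, result) pairs, and the caller's paths are obtained by composing those summaries instead of re-exploring the callee at every call site.

```python
# Requires the z3-solver package (pip install z3-solver).
from itertools import product
from z3 import And, Int, Solver, sat

def abs_summary(x):
    """Method summary for abs(x): one (guard, result) pair per callee path."""
    return [(x >= 0, x), (x < 0, -x)]

# Caller under test: f(a, b) = abs(a) + abs(b).
a, b = Int('a'), Int('b')

# Compose the callee summaries at both call sites: 2 x 2 caller paths,
# with no re-execution of the callee's body.
for (ga, ra), (gb, rb) in product(abs_summary(a), abs_summary(b)):
    path_condition = And(ga, gb)
    solver = Solver()
    solver.add(path_condition)
    if solver.check() == sat:            # feasible path -> derive a test case
        m = solver.model()
        ai = m.eval(a, model_completion=True)
        bi = m.eval(b, model_completion=True)
        res = m.eval(ra + rb, model_completion=True)
        print(f"path {path_condition}: test input a={ai}, b={bi}, expected {res}")
```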
Abstract:
This paper applies an integrated modeling approach to the case of Spain to assess the effect of introducing longer and heavier vehicles (LHVs) on the regional consumer price index (CPI) and on the transportation system; the approach combines a random utility-based multiregional input-output model with a road transport network model. The approach captures how changes in transport costs derived from allowing LHVs, together with the economic structure of the regions, produce direct and indirect effects on the economy and on the transportation system. Results show that the introduction of LHVs might reduce the prices consumers pay for a representative basket of goods and services in the regions of Spain and would thus lead to a reduction in the regional CPI. In addition, the magnitude and extent of changes in the transportation system are estimated by using the commodity-based structure of the approach to identify the effect of traffic changes on traffic flows and on pollutant emissions over the whole network.
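As a stylized sketch of the price-side mechanism only (a plain Leontief price model with made-up coefficients, far simpler than the random utility-based multiregional model used in the paper), lowering the transport input coefficients propagates into lower sector prices:

```python
import numpy as np

# Hypothetical 3-sector technical-coefficient matrix A (agriculture,
# manufacturing, transport); A[i, j] is input of sector i per unit output of j.
A = np.array([
    [0.10, 0.15, 0.02],
    [0.20, 0.25, 0.10],
    [0.05, 0.10, 0.05],            # transport inputs into each sector
])
value_added = np.array([0.65, 0.50, 0.83])   # primary inputs per unit output

def prices(A, v):
    """Leontief price model: p = (I - A^T)^{-1} v."""
    return np.linalg.solve(np.eye(len(v)) - A.T, v)

base = prices(A, value_added)

# LHV scenario: road transport requirements per unit of output drop by 10 %.
A_lhv = A.copy()
A_lhv[2, :] *= 0.9
scenario = prices(A_lhv, value_added)
print("sector price change (%):", ((scenario / base - 1) * 100).round(2))
```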
Abstract:
In this article we present a model of the organization of a belief system based on a set of binary recursive functions that characterize the dynamic context that modifies the beliefs. The initial beliefs are modeled by a set of two-bit words that grow, update, and generate other beliefs as the different experiences of the dynamic context appear. Reason is presented as an emergent effect of experience on the beliefs. The system presents a layered structure that allows a functional organization of the belief system. Our approach seems suitable for modeling different ways of thinking and for application to realistic scenarios such as ideologies.
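One possible toy reading of such a system (the encoding and the update rule here are hypothetical, not the article's functions) is a set of two-bit words folded recursively over a stream of context experiences:

```python
def update(belief: int, experience: int) -> int:
    """Combine a two-bit belief with a two-bit experience. Hypothetical rule:
    agreement leaves the belief unchanged; disagreement shifts it by XOR."""
    return belief if belief == experience else (belief ^ experience) & 0b11

def evolve(beliefs: set, experiences: list) -> set:
    """Recursively fold a stream of experiences into the belief set; novel
    results are kept, so the belief system grows with its history."""
    if not experiences:
        return beliefs
    first, rest = experiences[0], experiences[1:]
    updated = beliefs | {update(b, first) for b in beliefs}
    return evolve(updated, rest)

# One initial belief exposed to three context experiences.
print(evolve({0b01}, [0b10, 0b11, 0b01]))
```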
Abstract:
Regular monitoring of wastewater characteristics is undertaken at most wastewater treatment plants. The data acquired during this process are usually filed and forgotten; however, systematic analysis of these data can provide useful insights into plant behaviour. Conventional graphical techniques are inadequate for giving a good overall picture of how wastewater characteristics vary with time and along the lagoon system. An approach based on contour plots was devised that largely overcomes this problem. Superimposing contour plots of different parameters can give a qualitative understanding of the nature and strength of the relationships between the parameters. This is illustrated in an analysis of monitoring data for lagoon 115 East at the Western Treatment Plant near Melbourne, Australia, in which relationships between ammonia removal rates and parameters such as chlorophyll a level and temperature are explored by superimposing contour plots. It is concluded that this approach can help improve our understanding not only of lagoon systems but of other wastewater treatment systems as well.
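A minimal sketch of the superimposition technique, with synthetic fields standing in for the plant's monitoring records: filled contours for one parameter with line contours for another drawn on top.

```python
import matplotlib.pyplot as plt
import numpy as np

# Synthetic monitoring grid: time (weeks) along x, position down the lagoon
# (pond number) along y. Real use would grid the plant's records instead.
time = np.linspace(0, 52, 60)
pond = np.linspace(1, 10, 40)
T, P = np.meshgrid(time, pond)

# Hypothetical seasonal fields for two parameters.
ammonia = 30 - 2.0 * P + 5 * np.sin(2 * np.pi * T / 52)             # mg N/L
chlorophyll = 50 + 4.0 * P + 30 * np.sin(2 * np.pi * (T - 8) / 52)  # ug/L

# Superimpose: filled contours for ammonia, line contours for chlorophyll a,
# so co-varying patterns between the two parameters stand out.
fig, ax = plt.subplots()
filled = ax.contourf(T, P, ammonia, levels=10, cmap="viridis")
lines = ax.contour(T, P, chlorophyll, levels=6, colors="white")
ax.clabel(lines, fmt="%.0f")
fig.colorbar(filled, label="ammonia (mg N/L)")
ax.set_xlabel("week of year")
ax.set_ylabel("pond number along lagoon")
plt.show()
```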
Abstract:
Starting with a UML specification that captures the underlying functionality of a given Java-based concurrent system, we describe a systematic way to construct, from this specification, test sequences for validating an implementation of the system. The approach is to first extend the specification to create UML state machines that directly address those aspects of the system we wish to test. Specifically, the extended UML state machines can capture state information such as the number of waiting threads or the number of threads blocked on a given object. Using the SAL model checker, we can generate from the extended UML state machines sequences that cover all the various combinations of events and states. These sequences can then be directly transformed into test sequences suitable for input to a testing tool such as ConAn. As an illustration, the methodology is applied to generate sequences for testing a Java implementation of the producer-consumer system.
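As a toy sketch of the sequence-generation step, with a hand-written bounded-buffer state machine standing in for the extended UML/SAL model and a breadth-first search standing in for the model checker:

```python
from collections import deque

# Hand-written state machine for a capacity-2 producer-consumer buffer with
# one producer and one consumer thread. States record the buffer level and
# blocked-thread situations; transitions are labeled with the applied event.
transitions = {
    ("empty", "put"): "one",
    ("empty", "get"): "empty_consumer_blocked",
    ("empty_consumer_blocked", "put"): "empty",   # item goes to waiting consumer
    ("one", "put"): "full",
    ("one", "get"): "empty",
    ("full", "get"): "one",
    ("full", "put"): "full_producer_blocked",
    ("full_producer_blocked", "get"): "full",     # freed slot taken by producer
}

def covering_sequences(start):
    """Breadth-first search returning, for every transition, a shortest event
    sequence that exercises it: a small covering set of test sequences."""
    seqs = {}
    frontier = deque([(start, [])])
    seen = {start}
    while frontier:
        state, path = frontier.popleft()
        for (source, event), target in transitions.items():
            if source != state:
                continue
            seqs[(source, event)] = path + [event]
            if target not in seen:
                seen.add(target)
                frontier.append((target, path + [event]))
    return seqs

for transition, seq in covering_sequences("empty").items():
    print(transition, "->", seq)
```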
Abstract:
The development of an information system in Caribbean public sector organisations is usually seen as a matter of installing hardware and software according to a directive from senior management, without much planning. This leads to huge investments in procuring hardware and software without improving overall system performance. Increasingly, Caribbean organisations are looking for assurances on information system performance before making investment decisions, not only to satisfy the funding agencies but also to remain competitive in a dynamic and global business world. This study demonstrates an information system planning approach that uses a process-reengineering framework. First, the stakeholders for the business functions are identified along with their relationships and requirements. Second, process reengineering is carried out to develop the system requirements, and information technology is selected through detailed system requirement analysis. Third, cost-benefit analysis, identification of critical success factors, and risk analysis are carried out to strengthen the selection, as sketched below. The entire methodology is demonstrated through an information system project in the Barbados Drug Service, a public sector organisation in the Caribbean.
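As a purely illustrative sketch of the third step (the options, figures and weights below are invented, not from the study), cost-benefit figures, critical-success-factor alignment and risk can be folded into a single figure of merit per candidate system:

```python
# Hypothetical candidate options for an information system investment:
# option -> (net present benefit in $k, CSF alignment 0-1, risk 0-1).
options = {
    "in-house build":  (420, 0.70, 0.45),
    "packaged system": (350, 0.85, 0.20),
    "outsourced SaaS": (300, 0.80, 0.30),
}

def score(npb, csf, risk, w_benefit=0.5, w_csf=0.3, w_risk=0.2, max_npb=500):
    """Weighted figure of merit: normalized benefit and CSF alignment count
    in favour, risk counts against (weights are illustrative only)."""
    return w_benefit * (npb / max_npb) + w_csf * csf - w_risk * risk

for name, (npb, csf, risk) in options.items():
    print(f"{name}: {score(npb, csf, risk):.3f}")
```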