996 results for Computational Evaluation


Relevance:

30.00%

Publisher:

Abstract:

This paper proposes the application of computational intelligence techniques to assist in solving complex problems concerning lightning in transformers. A neural tool is presented to estimate the currents related to lightning in a transformer; the training vectors were generated with ATP. The input variables of the Artificial Neural Network (ANN) were the wave front time, the wave tail time and the voltage variation rate, and the output variable was the maximum current at the secondary of the transformer. These parameters can define the behavior and severity of lightning. Based on these concepts and on the results obtained, it can be verified that the overvoltages at the secondary of the transformer are affected by the discharge waveform in a way similar to the primary side. Using the developed tool, the high-voltage process in distribution transformers can be mapped and estimated with more precision, aiding the transformer design process, minimizing empiricism and evaluation errors, and contributing to a lower transformer failure rate. (C) 2011 Elsevier Ltd. All rights reserved.
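
A minimal sketch of the kind of ANN regression the abstract describes is given below, in Python with scikit-learn. The synthetic training data stands in for the ATP-generated vectors, and the target function is invented for illustration only; it is not the paper's model.

```python
# Sketch of an ANN mapping (wave front time, wave tail time, dV/dt) to the
# peak current at the transformer secondary. The paper's training vectors
# come from ATP simulations; the synthetic data below is only a placeholder
# so the sketch runs end to end.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Placeholder "training vectors": [front time (us), tail time (us), dV/dt (kV/us)]
X = rng.uniform([1.0, 20.0, 10.0], [10.0, 350.0, 500.0], size=(500, 3))
# Hypothetical target: peak secondary current (A); invented for illustration.
y = 0.8 * X[:, 2] / X[:, 0] + 0.01 * X[:, 1] + rng.normal(0, 1.0, 500)

scaler = StandardScaler().fit(X)
ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
ann.fit(scaler.transform(X), y)

# Estimate the peak secondary current for one candidate lightning waveform.
sample = scaler.transform([[1.2, 50.0, 200.0]])   # the standard 1.2/50 us shape
print(f"estimated peak secondary current: {ann.predict(sample)[0]:.1f} A")
```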

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

The objective of this paper is to show an alternative time-domain representation of a non-transposed three-phase transmission line decomposed into its exact modes by using two transformation matrices. The first matrix is Clarke's matrix, which is real, frequency independent, easily represented in computational transient programs (EMTP) and separates the line into quasi-modes α, β and zero. After that, quasi-modes α and zero are decomposed into their exact modes by using a modal transformation matrix whose elements can be synthesized in the time domain through standard curve-fitting techniques. The main advantage of this alternative representation is the reduction of processing time, because a frequency-dependent modal transformation matrix of a three-phase line has nine elements to be represented in the time domain, while a modal transformation matrix of a two-phase line has only four. This paper shows the modal decomposition process and the eigenvectors of a non-transposed three-phase line with a vertical symmetry plane, whose nominal voltage is 440 kV and whose length is 500 km.
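
The two-step decomposition can be illustrated numerically. The sketch below uses invented impedance values, not the paper's 440 kV line data: it applies Clarke's matrix to a phase impedance matrix with the vertical symmetry the abstract assumes, showing that β decouples exactly while the remaining α-zero block is diagonalized by a 2×2 modal matrix.

```python
# Step 1: Clarke's real, frequency-independent matrix splits the line into
# quasi-modes alpha, beta and zero; beta is already an exact mode for a line
# with a vertical symmetry plane. Step 2: only the 2x2 alpha-zero block is
# diagonalized. Impedance values below are illustrative.
import numpy as np

# Per-unit-length series impedance (ohm/km) at one frequency; the phase on
# the symmetry plane is ordered first, so Z has the required symmetry.
Zaa, Zs, Zm, Zn = 0.30 + 1.2j, 0.28 + 1.1j, 0.10 + 0.55j, 0.09 + 0.50j
Z = np.array([[Zaa, Zm, Zm],
              [Zm,  Zs, Zn],
              [Zm,  Zn, Zs]])

# Clarke's matrix (orthonormal form), columns = quasi-modes alpha, beta, zero.
T = np.array([[ 2,  0, 1],
              [-1,  1, 1],
              [-1, -1, 1]]) / np.sqrt([6, 2, 3])

Zq = T.T @ Z @ T          # quasi-mode impedance matrix
# beta (row/col 1) is exactly decoupled; alpha and zero stay weakly coupled.
print(np.round(Zq, 4))

# Diagonalize only the 2x2 alpha-zero block: 4 elements to synthesize in the
# time domain instead of 9 for a full 3x3 frequency-dependent modal matrix.
block = Zq[np.ix_([0, 2], [0, 2])]
eigvals, M = np.linalg.eig(block)
print("exact alpha/zero mode impedances:", np.round(eigvals, 4))
```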

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a new methodology to evaluate, in a predictive way, the reliability of distribution systems, considering the impact of automatic recloser switches. The developed algorithm is based on state enumeration techniques with Markovian models and on minimal cut set theory. Some computational aspects related to the implementation of the proposed algorithm in typical distribution networks are also discussed. The proposed approach is described using a sample test system. The results obtained with a typical configuration of a Brazilian system (EDP Bandeirante Energia S.A.) are presented and discussed.
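
For readers unfamiliar with the minimal cut set step, the sketch below shows the classical approximate load-point formulas for first- and second-order cuts. The Markovian state enumeration and the recloser modeling of the paper are not reproduced, and the component data is illustrative.

```python
# Load-point indices from minimal cut sets, using the classical approximate
# formulas (failure rates in occ/year, repair times in hours).

def cut_rate_and_repair(cut):
    """cut: list of (lambda_occ_per_year, repair_time_hours) tuples."""
    if len(cut) == 1:
        lam, r = cut[0]
        return lam, r
    if len(cut) == 2:
        (l1, r1), (l2, r2) = cut
        lam = l1 * l2 * (r1 + r2) / 8760.0   # overlapping-failure rate
        r = r1 * r2 / (r1 + r2)              # equivalent repair time
        return lam, r
    raise ValueError("higher-order cuts neglected in this sketch")

# Minimal cut sets of a hypothetical load point: one feeder section in
# series, plus two parallel supply paths forming a second-order cut.
cuts = [
    [(0.20, 4.0)],                 # feeder section: 0.2 f/yr, 4 h repair
    [(0.10, 8.0), (0.15, 6.0)],    # overlapping failure of both paths
]

lam_lp = sum(cut_rate_and_repair(c)[0] for c in cuts)
U_lp = sum(l * r for l, r in map(cut_rate_and_repair, cuts))
print(f"failure rate {lam_lp:.4f} /yr, outage {U_lp:.4f} h/yr, "
      f"avg duration {U_lp / lam_lp:.2f} h")
```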

Relevance:

30.00%

Publisher:

Abstract:

The objective of this paper is to show an alternative time-domain representation of a non-transposed three-phase transmission line decomposed into its exact modes by using two transformation matrices. The first matrix is Clarke's matrix, which is real, frequency independent, easily represented in computational transient programs (EMTP) and separates the line into quasi-modes α, β and zero. After that, quasi-modes α and zero are decomposed into their exact modes by using a modal transformation matrix whose elements can be synthesized in the time domain through standard curve-fitting techniques. The main advantage of this alternative representation is the reduction of processing time, because a frequency-dependent modal transformation matrix of a three-phase line has nine elements to be represented in the time domain, while a modal transformation matrix of a two-phase line has only four. This paper shows the modal decomposition process and the eigenvectors of a non-transposed three-phase line with a vertical symmetry plane, whose nominal voltage is 440 kV and whose length is 500 km. ©2006 IEEE.

Relevance:

30.00%

Publisher:

Abstract:

This paper considers the importance of using a top-down methodology and suitable CAD tools in the development of electronic circuits. It presents an evaluation of the methodology used in a computational tool created to support the synthesis of digital-to-analog converter models by translating between different tools used in a wide variety of applications. This tool, named MS2SV, works directly with two commercial tools: MATLAB/Simulink and SystemVision. Model translation of an electronic circuit is achieved by translating a mixed-signal block diagram developed in Simulink into a lower level of abstraction in VHDL-AMS, together with the simulation project support structure in SystemVision. The method was validated by analyzing the power spectrum of the signal obtained through the discrete Fourier transform of a digital-to-analog converter simulation model. © 2011 IEEE.
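
The validation step can be illustrated generically. The sketch below uses an ideal 8-bit DAC as a stand-in, not an MS2SV-translated model: it estimates the power spectrum of a simulated DAC output with the discrete Fourier transform and reads off the SNR.

```python
# Take a simulated DAC output, estimate its power spectrum via the DFT, and
# read off a quality figure (here SNR). The ideal quantizer below replaces
# the VHDL-AMS/SystemVision model the paper actually analyzes.
import numpy as np

fs, n_bits, n = 1.0e6, 8, 4096
k = 127                                  # coherent-sampling signal bin
t = np.arange(n) / fs
ideal = np.sin(2 * np.pi * (k * fs / n) * t)

# Quantize to n_bits levels (the "DAC output" to be validated).
levels = 2 ** n_bits
dac_out = np.round((ideal + 1) / 2 * (levels - 1)) / (levels - 1) * 2 - 1

# Power spectrum via DFT.
spectrum = np.abs(np.fft.rfft(dac_out)) ** 2
signal_power = spectrum[k]
noise_power = spectrum[1:].sum() - signal_power   # skip DC
snr_db = 10 * np.log10(signal_power / noise_power)
print(f"SNR ~ {snr_db:.1f} dB (ideal {6.02 * n_bits + 1.76:.1f} dB)")
```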

Relevance:

30.00%

Publisher:

Abstract:

The software industry has become increasingly concerned with the appropriate application of the activities that compose requirements engineering as a way to improve the quality of its products. To support these activities, several computational tools are available on the market, although resources for some activities are still lacking. In this context, this paper proposes adding a requirements specification module to a tool called the Requirements Elicitation Support Tool. This module allows requirements to be specified in accordance with the IEEE 830 standard, thus contributing to the documentation of the requirements established for a software system, besides supporting the learning of concepts related to requirements specification, which improves the skills of the tool's users. © 2012 IEEE.
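
As a purely hypothetical illustration of such a module's core, the sketch below stores requirements as records with an IEEE 830-flavored subset of attributes and renders them into a plain-text SRS section; it is not the actual schema of the Requirements Elicitation Support Tool.

```python
# Hypothetical requirement record keeping an IEEE 830-style subset of
# attributes (traceable ID, unambiguous statement, kind, priority, source).
from dataclasses import dataclass

@dataclass
class Requirement:
    req_id: str          # unique, stable identifier (IEEE 830: traceable)
    description: str     # unambiguous, verifiable statement
    kind: str            # "functional" or "non-functional"
    priority: str        # e.g. "essential", "conditional", "optional"
    source: str          # stakeholder or document of origin

def render_srs_section(reqs):
    """Render section 3 ('Specific requirements') of a plain-text SRS."""
    lines = ["3. Specific requirements"]
    for r in sorted(reqs, key=lambda r: r.req_id):
        lines.append(f"  {r.req_id} [{r.kind}/{r.priority}] {r.description}"
                     f" (source: {r.source})")
    return "\n".join(lines)

reqs = [Requirement("REQ-001", "The system shall export reports as PDF.",
                    "functional", "essential", "customer interview #2")]
print(render_srs_section(reqs))
```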

Relevance:

30.00%

Publisher:

Abstract:

The day-to-day of medical practice is marked by a constant search for accurate diagnosis and therapeutic assessment. For this purpose, physicians draw on a wide variety of imaging techniques; nevertheless, methods using ionizing radiation are still the most widely used, because they are considered cheaper and, above all, very efficient when applied with proper control and quality. Optimization of the risk-benefit ratio is considered a major advance in relation to conventional radiology, but this is not yet the reality of computed and digital radiography, for which Brazil has not established standards and protocols. This work aims to optimize computed chest radiographs (anteroposterior projection, AP). To achieve this objective, homogeneous phantoms were used that simulate the absorption and scattering characteristics of radiation close to those of the chest of a standard patient. Another factor studied was the subjective evaluation of image quality, carried out by visual grading assessment (VGA) by specialists in radiology, using an anthropomorphic phantom to identify the best image for a particular pathology (fracture or pneumonia). The images indicated by the radiologists were then characterized by quantifying physical parameters (Detective Quantum Efficiency, DQE; Modulation Transfer Function, MTF; and Noise Power Spectrum, NPS) using MATLAB®. © 2013 Springer-Verlag.
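
Of the physical metrics cited, the NPS is the most compact to illustrate. The sketch below is an equivalent NumPy formulation (the paper uses MATLAB®) applied to synthetic flat-field noise in place of real detector images.

```python
# Noise Power Spectrum from flat-field regions of interest:
# NPS(u,v) = (dx*dy)/(Nx*Ny) * <|DFT2(ROI - mean)|^2>, averaged over ROIs.
# Synthetic Gaussian noise stands in for real flat-field exposures.
import numpy as np

def nps_2d(rois, pixel_mm):
    """rois: array (M, N, N) of flat-field ROIs; returns 2-D NPS (mm^2)."""
    m, ny, nx = rois.shape
    detrended = rois - rois.mean(axis=(1, 2), keepdims=True)
    dft2 = np.fft.fft2(detrended)
    return (pixel_mm ** 2 / (nx * ny)) * (np.abs(dft2) ** 2).mean(axis=0)

rng = np.random.default_rng(1)
rois = rng.normal(100.0, 2.0, size=(64, 128, 128))   # stand-in flat fields
nps = nps_2d(rois, pixel_mm=0.143)
# Sanity check: for white noise, the NPS integrates back to pixel variance.
var_check = nps.sum() / (128 * 128 * 0.143 ** 2)
print(f"pixel variance recovered from NPS: {var_check:.2f} (expected ~4.0)")
```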

Relevance:

30.00%

Publisher:

Abstract:

Hepatocellular carcinoma (HCC) is a primary tumor of the liver. After local therapies, tumor evaluation is based on the mRECIST criteria, which involve measuring the maximum diameter of the viable lesion. This paper describes a computational methodology to measure the maximum diameter of the tumor through the contrast-enhanced area of the lesions. 63 computed tomography (CT) slices from 23 patients were assessed. Non-contrasted liver and typical HCC nodules were evaluated, and a virtual phantom was developed for this purpose. Detection and quantification by the algorithm were optimized using the virtual phantom. After that, the algorithm's findings for the maximum diameter of the target lesions were compared against the radiologists' measurements. The computed results for the maximum diameter are in good agreement with those obtained by radiologist evaluation, indicating that the algorithm was able to properly detect the tumor limits. A comparison of the maximum diameter estimated by radiologists versus the algorithm revealed differences on the order of 0.25 cm for large tumors (diameter > 5 cm), whereas differences below 1.0 cm were found for small tumors. Differences between algorithm and radiologist measurements were small for small tumors, with a trend toward a slight increase for tumors greater than 5 cm. Therefore, traditional methods for measuring lesion diameter should be complemented with non-subjective measurement methods, which would allow a more correct evaluation of the contrast-enhanced areas of HCC according to the mRECIST criteria.
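
A simplified version of such a measurement pipeline is sketched below: threshold the contrast-enhanced lesion, keep the largest connected component, and report the maximum point-to-point diameter. The threshold and the synthetic slice are placeholders; the paper tunes detection on its virtual phantom.

```python
# Segment a hyper-enhancing lesion on a CT slice by thresholding, keep the
# largest connected component, and report its maximum diameter in cm.
import numpy as np
from scipy import ndimage
from scipy.spatial.distance import pdist

def max_diameter_cm(slice_hu, threshold_hu, pixel_mm):
    mask = slice_hu > threshold_hu
    labels, n = ndimage.label(mask)
    if n == 0:
        return 0.0
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    lesion = labels == (np.argmax(sizes) + 1)        # largest component
    pts = np.argwhere(lesion) * pixel_mm             # pixel -> mm coords
    return pdist(pts).max() / 10.0                   # mm -> cm

# Synthetic slice: a 30 mm enhancing disc on a 60 HU liver background.
yy, xx = np.mgrid[:256, :256]
slice_hu = np.where((yy - 128) ** 2 + (xx - 128) ** 2 < (15 / 0.7) ** 2,
                    140.0, 60.0)
print(f"max diameter: {max_diameter_cm(slice_hu, 100.0, 0.7):.2f} cm")
```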

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we address the problem of defining the product mix in order to maximise a system's throughput. This problem is well known to be NP-complete and, therefore, most contributions on the topic focus on developing heuristics that are able to obtain good solutions in a short CPU time. In particular, constructive heuristics are available for the problem, such as those by Fredendall and Lea and by Aryanezhad and Komijan. We propose a new constructive heuristic based on the Theory of Constraints and the Knapsack Problem. The computational results indicate that the proposed heuristic yields better results than the existing heuristics.
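
For context, the classical TOC product-mix rule that constructive heuristics of this family build on is sketched below with invented data; the knapsack-based refinements of the proposed heuristic are not reproduced.

```python
# Theory of Constraints product-mix rule: identify the bottleneck resource,
# rank products by throughput per minute of bottleneck time, fill demand
# greedily until the bottleneck capacity is exhausted.

# product: (throughput $/unit, bottleneck min/unit, market demand)
products = {"P": (45.0, 15.0, 100), "Q": (60.0, 30.0, 50), "R": (40.0, 10.0, 60)}
capacity = 2400.0    # bottleneck minutes available per week

# TOC ranking: throughput per unit of constraint time, descending.
ranked = sorted(products.items(), key=lambda kv: kv[1][0] / kv[1][1],
                reverse=True)

mix, total = {}, 0.0
for name, (tp, minutes, demand) in ranked:
    qty = min(demand, int(capacity // minutes))
    mix[name] = qty
    capacity -= qty * minutes
    total += qty * tp
print(mix, f"throughput ${total:.0f}")
```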

Relevance:

30.00%

Publisher:

Abstract:

We propose a new general Bayesian latent class model, based on a computationally intensive approach, for evaluating the performance of multiple diagnostic tests in situations in which no gold standard test exists. The model represents an interesting and suitable alternative to models with complex structures that involve the general case of several conditionally independent diagnostic tests, covariates, and strata with different disease prevalences. Stratifying the population according to different disease prevalence rates does not add marked complexity to the modeling, but it makes the model more flexible and interpretable. To illustrate the proposed general model, we evaluate the performance of six diagnostic screening tests for Chagas disease, considering some epidemiological variables. Serology at the time of donation (negative, positive, inconclusive) was considered as a stratification factor in the model. The general model with population stratification performed better than its counterparts without stratification. The group formed by the Biomanguinhos FIOCRUZ kit (c-ELISA and rec-ELISA) is the best option in the confirmation process, presenting a false-negative rate of 0.0002% under the serial scheme. We are 100% sure that the donor is healthy when these two tests have negative results, and that the donor is chagasic when they have positive results.
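
The conditional-independence structure at the heart of such models can be shown in a few lines: given sensitivities, specificities and a stratum prevalence (all invented below, and estimated jointly by MCMC in the paper), the posterior probability of disease follows from Bayes' rule.

```python
# Posterior probability of disease under the conditional-independence latent
# class assumption. Parameter values are purely illustrative, not the
# paper's Chagas-disease estimates.
import numpy as np

se = np.array([0.98, 0.97, 0.95])   # sensitivities of three tests
sp = np.array([0.99, 0.98, 0.96])   # specificities
prev = 0.05                          # prevalence in one stratum

def posterior_disease(results, se, sp, prev):
    """results: 1 = positive, 0 = negative, per conditionally independent test."""
    r = np.asarray(results)
    like_d = np.prod(np.where(r == 1, se, 1 - se))   # P(results | diseased)
    like_h = np.prod(np.where(r == 1, 1 - sp, sp))   # P(results | healthy)
    return prev * like_d / (prev * like_d + (1 - prev) * like_h)

# Two positives and one negative still leave a high posterior at 5% prevalence.
print(f"P(D+ | +,+,-) = {posterior_disease([1, 1, 0], se, sp, prev):.4f}")
```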

Relevance:

30.00%

Publisher:

Abstract:

A recent initiative of the European Space Agency (ESA) aims at the definition and adoption of a software reference architecture for use in the on-board software of future space missions. Our PhD project is placed in the context of that effort. At the outset of our work, we gathered the industrial needs relevant to ESA and the main European space stakeholders and consolidated them into a set of technical high-level requirements. The conclusion we reached from that phase confirmed that the adoption of a software reference architecture was indeed the best solution for fulfilling the high-level requirements. The software reference architecture we set out to build rests on four constituents: (i) a component model, to design the software as a composition of individually verifiable and reusable software units; (ii) a computational model, to ensure that the architectural description of the software is statically analyzable; (iii) a programming model, to ensure that the implementation of the design entities conforms to the semantics, assumptions and constraints of the computational model; (iv) a conforming execution platform, to actively preserve at run time the properties asserted by static analysis. The nature, feasibility and fitness of constituents (ii), (iii) and (iv) were already proved by the author in an international project that preceded the commencement of the PhD work. The core of the PhD project was therefore centered on the design and prototype implementation of constituent (i), the component model. Our proposed component model is centered on: (i) rigorous separation of concerns, achieved with support for design views and by careful allocation of concerns to dedicated software entities; (ii) support for the specification and model-based analysis of extra-functional properties; (iii) the inclusion of space-specific concerns.
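
As an entirely hypothetical rendering of the separation of concerns the component model enforces, the sketch below declares a component's functional contract apart from extra-functional annotations that a computational-model analysis could consume; none of the names come from the actual ESA reference architecture.

```python
# Hypothetical component declaration: the functional contract (provided and
# required interfaces) is kept separate from extra-functional properties
# (timing attributes) that a static analysis in the spirit of an analyzable
# computational model can consume.
from dataclasses import dataclass, field

@dataclass
class ProvidedOperation:
    name: str
    wcet_ms: float            # worst-case execution time, an analysis input
    period_ms: float | None   # None for sporadic activation
    deadline_ms: float

@dataclass
class Component:
    name: str
    provides: list[ProvidedOperation] = field(default_factory=list)
    requires: list[str] = field(default_factory=list)   # needed interfaces

def utilization(components):
    """A toy static check: summed utilization of all cyclic operations."""
    ops = [op for c in components for op in c.provides if op.period_ms]
    return sum(op.wcet_ms / op.period_ms for op in ops)

gnc = Component("GNC", [ProvidedOperation("step", 2.0, 125.0, 125.0)],
                requires=["IMU", "Thrusters"])
print(f"cyclic utilization: {utilization([gnc]):.4f}")
```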