948 results for Fault location algorithms
Abstract:
Distributed generation, unlike centralized electrical generation, aims to generate electrical energy on a small scale, as close as possible to load centers, interchanging electric power with the network. This work presents a probabilistic methodology conceived to assist electric system planning engineers in selecting the location of distributed generation, taking into account the hourly load changes, i.e. the daily load cycle. The hourly load centers, one for each hourly load scenario, are calculated deterministically. These location points, weighted according to their load magnitude, are used to fit the best-matching probability distribution. This distribution is then used to determine the maximum-likelihood perimeter of the area where each distributed generation source should preferably be located by the planning engineers. This takes into account, for example, the availability and cost of land lots, factors of special relevance in urban areas, as well as several obstacles important to the final selection of candidate distributed generation sites. The proposed methodology has been applied to a real case, assuming three different bivariate probability distributions: the Gaussian distribution, a bivariate version of Freund's exponential distribution, and the Weibull probability distribution. The methodology has been implemented in MATLAB. Results for a realistic case are presented and discussed, and demonstrate the ability of the proposed methodology to efficiently determine the best location of distributed generation and its corresponding distribution networks.
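The abstract does not detail the fitting step, but the core idea – weighting the hourly load centres by load magnitude, fitting a bivariate distribution, and drawing a maximum-likelihood perimeter – can be sketched for the Gaussian case. A minimal illustration in Python (the authors worked in MATLAB); the data, function name and confidence level below are hypothetical:

```python
import numpy as np
from scipy.stats import chi2

def weighted_gaussian_ellipse(points, weights, confidence=0.95):
    """Fit a bivariate Gaussian to hourly load-centre points weighted by
    load magnitude and return the confidence-ellipse parameters."""
    w = np.asarray(weights, float)
    w = w / w.sum()
    pts = np.asarray(points, float)            # shape (n, 2)
    mean = w @ pts                             # weighted centroid
    d = pts - mean
    cov = (w[:, None] * d).T @ d               # weighted covariance
    evals, evecs = np.linalg.eigh(cov)         # ascending eigenvalues
    k = chi2.ppf(confidence, df=2)             # chi-square quantile, 2 dof
    semi_axes = np.sqrt(k * evals)             # ellipse semi-axis lengths
    angle = np.degrees(np.arctan2(evecs[1, -1], evecs[0, -1]))
    return mean, semi_axes, angle

# Hypothetical data: 24 hourly load centres (km) weighted by hourly load (MW)
rng = np.random.default_rng(0)
centres = rng.normal([5.0, 3.0], 0.4, size=(24, 2))
loads = rng.uniform(10.0, 60.0, size=24)
print(weighted_gaussian_ellipse(centres, loads))
```

The returned ellipse delimits the area where a distributed generation source would preferably be sited, which the planner can then adjust for land availability and other obstacles.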
Abstract:
The ability to locate an individual is an essential part of many applications, especially mobile ones. Obtaining this location in an open environment is relatively simple through GPS (Global Positioning System), but indoors, or even in dense environments, this type of location system does not provide good accuracy. Systems already exist that try to overcome these limitations, but most of them require a structured environment to work. Since Inertial Navigation Systems (INS) remove the need for a structured environment, we propose an INS based on Micro-Electro-Mechanical Systems (MEMS) that is capable of computing the position of an individual anywhere, in real time.
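As a toy illustration of the dead-reckoning principle behind such an INS – not the authors' implementation – the sketch below integrates MEMS acceleration samples along a compass heading; all names and parameters are illustrative:

```python
import numpy as np

def dead_reckon(accels, headings, dt, p0=(0.0, 0.0), v0=0.0):
    """Naive planar dead reckoning: integrate forward acceleration to speed,
    then project each displacement along the compass heading. Drift grows
    quickly without external corrections such as GPS fixes."""
    x, y = p0
    v = v0
    track = [(x, y)]
    for a, h in zip(accels, headings):   # a in m/s^2, h in radians
        v += a * dt
        x += v * np.cos(h) * dt
        y += v * np.sin(h) * dt
        track.append((x, y))
    return track

# Illustrative use: constant gentle acceleration heading due east
print(dead_reckon([0.2] * 5, [0.0] * 5, dt=0.1)[-1])
```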
Abstract:
Purpose/Introduction: To determine the clinical utility of pre-operative diffusion tensor (DT) tractography of the facial nerve in the vicinity of cerebellopontine angle (CPA) tumours. The location of the facial nerve was established pre-operatively by tractography and compared with in-vivo electrode stimulation during microsurgery of vestibular schwannomas and rare CPA masses (meningiomas and arachnoid cysts).
Abstract:
The Gulf of Cadiz, as part of the Azores-Gibraltar plate boundary, is recognized as a potential source of large earthquakes and tsunamis that may affect the bordering countries, as occurred on 1 November 1755. Preparing for the future, Portugal is establishing a national tsunami warning system in which the threat posed by any large-magnitude earthquake in the area is estimated from a comprehensive database of scenarios. In this paper we summarize the knowledge about the active tectonics in the Gulf of Cadiz and integrate the available seismological information in order to propose a generation model of destructive tsunamis to be applied in tsunami warnings. The derived fault model is then used to estimate the recurrence of large earthquakes using the fault slip rates obtained by Cunha et al. (2012) from thin-sheet neotectonic modelling. Finally, we evaluate the consistency of seismicity rates derived from historical and instrumental catalogues with the convergence rates between Eurasia and Nubia given by plate kinematic models.
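For context, a recurrence estimate of the kind described can be sketched from a fault's slip rate via a seismic moment balance; the rigidity, fault area and magnitude below are illustrative assumptions, not the paper's values:

```python
MU = 3.0e10            # crustal rigidity in Pa, a common assumption

def recurrence_years(mw, fault_area_km2, slip_rate_mm_yr):
    """Mean recurrence time of a characteristic earthquake of magnitude mw,
    assuming all tectonic moment on the fault is released seismically."""
    m0 = 10 ** (1.5 * mw + 9.05)                 # Hanks-Kanamori moment, N*m
    moment_rate = (MU * fault_area_km2 * 1e6     # fault area in m^2
                   * slip_rate_mm_yr * 1e-3)     # slip rate in m/yr
    return m0 / moment_rate

# Illustrative numbers only: a 100 km x 50 km fault slipping at 4 mm/yr
print(recurrence_years(mw=8.5, fault_area_km2=100 * 50, slip_rate_mm_yr=4.0))
```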
Abstract:
Cloud computing is increasingly being adopted in different scenarios, such as social networking, business applications, and scientific experiments. Relying on virtualization technology, the construction of these computing environments targets improvements in the infrastructure, such as power efficiency and the fulfilment of users' SLA specifications. The usual approach is to pack all the virtual machines onto the appropriate physical servers. However, failures in these networked computing systems can have a substantial negative impact on system performance, deviating the system from its initial objectives. In this work, we propose adapted algorithms to dynamically map virtual machines to physical hosts, in order to improve the power efficiency of the cloud infrastructure with low impact on the performance required by users. Our decision-making algorithms leverage proactive fault-tolerance techniques to deal with system failures, allied with virtual machine technology to share node resources in an accurate and controlled manner. The results indicate that our algorithms achieve better power efficiency and SLA fulfilment in the face of cloud infrastructure failures.
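A minimal sketch of a failure-aware consolidation heuristic in this spirit: first-fit-decreasing packing that skips hosts flagged by a proactive failure predictor. The data model, names and threshold are assumptions for illustration, not the paper's algorithm:

```python
from dataclasses import dataclass, field

@dataclass
class Host:
    name: str
    capacity: float              # normalised CPU capacity
    failure_risk: float          # predicted probability of failure (0..1)
    used: float = 0.0
    vms: list = field(default_factory=list)

def place_vms(vms, hosts, risk_threshold=0.2):
    """First-fit-decreasing consolidation that avoids hosts whose predicted
    failure risk exceeds a threshold, packing the rest to keep the number
    of powered-on servers low."""
    placement = {}
    safe = sorted((h for h in hosts if h.failure_risk <= risk_threshold),
                  key=lambda h: h.used, reverse=True)  # prefer busy hosts
    for name, demand in sorted(vms.items(), key=lambda kv: kv[1], reverse=True):
        for h in safe:
            if h.capacity - h.used >= demand:
                h.used += demand
                h.vms.append(name)
                placement[name] = h.name
                break
        else:
            placement[name] = None   # would trigger scale-out or migration
    return placement

hosts = [Host("h1", 1.0, 0.05), Host("h2", 1.0, 0.35), Host("h3", 1.0, 0.10)]
vms = {"vm1": 0.5, "vm2": 0.4, "vm3": 0.3}
print(place_vms(vms, hosts))   # h2 is avoided because of its failure risk
```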
Abstract:
Master's in Electrical and Computer Engineering
Abstract:
This work investigates the impact of treating breast cancer using different radiation therapy (RT) techniques – forwardly-planned intensity-modulated (f-IMRT), inversely-planned IMRT and dynamic conformal arc (DCART) RT – and their effects on whole-breast irradiation and on the undesirable irradiation of the surrounding healthy tissues. Two algorithms of the iPlan BrainLAB treatment planning system were compared: Pencil Beam Convolution (PBC) and commercial Monte Carlo (iMC). Seven left-sided breast cancer patients who had undergone breast-conserving surgery were enrolled in the study. For each patient, four RT techniques – f-IMRT, IMRT using 2 fields and 5 fields (IMRT2 and IMRT5, respectively) and DCART – were applied. The dose distributions in the planned target volume (PTV) and the doses to the organs at risk (OAR) were compared by analysing dose-volume histograms; further statistical analysis was performed using IBM SPSS v20 software. For PBC, all techniques provided adequate coverage of the PTV. However, statistically significant dose differences were observed between the techniques in the PTV, in the OAR, and also in the pattern of dose distribution spreading into normal tissues. IMRT5 and DCART spread low doses into greater volumes of normal tissue, right breast, right lung and heart than the tangential techniques. However, IMRT5 plans improved PTV distributions, exhibiting better conformity and homogeneity in the target and reduced high-dose percentages in the ipsilateral OAR. DCART did not present advantages over any of the techniques investigated. Differences were also found between the calculation algorithms: PBC estimated higher doses for the PTV, ipsilateral lung and heart than the iMC algorithm predicted.
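For reference, the conformity and homogeneity figures mentioned are typically computed from DVH-style dose samples; the sketch below shows common definitions (an ICRU 83-style homogeneity index and an RTOG-style conformity index), not the paper's exact metrics:

```python
import numpy as np

def homogeneity_index(ptv_doses):
    """ICRU 83-style homogeneity index: (D2% - D98%) / D50%, where Dx% is
    the minimum dose received by the hottest x% of the PTV."""
    d2, d50, d98 = np.percentile(ptv_doses, [98, 50, 2])
    return (d2 - d98) / d50

def conformity_index(n_ptv_voxels, body_doses, prescription):
    """RTOG-style conformity index: prescription-isodose volume over PTV
    volume, with voxel counts standing in for volumes on a uniform grid."""
    return np.count_nonzero(body_doses >= prescription) / n_ptv_voxels

# Synthetic PTV dose sample (Gy), for illustration only
doses = np.random.default_rng(1).normal(50.0, 1.5, size=10_000)
print(homogeneity_index(doses))
```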
Abstract:
Introduction: Image resizing is a standard feature of Nuclear Medicine digital imaging. Upsampling is performed by manufacturers to better fit the acquired images to the display screen, and is applied whenever there is a need to increase – or decrease – the total number of pixels. This paper aims to compare the "hqnx" and "nxSaI" magnification algorithms with two interpolation algorithms – nearest neighbor and bicubic interpolation – in image upsampling operations. Material and Methods: Three distinct Nuclear Medicine images were enlarged 2 and 4 times with the different digital image resizing algorithms (nearest neighbor, bicubic interpolation, nxSaI and hqnx). To evaluate the pixel changes between the different output images, 3D whole-image plot profiles and surface plots were used in addition to visual inspection of the 4x upsampled images. Results: In the 2x enlarged images the visual differences were not particularly noteworthy, although bicubic interpolation clearly presented the best results. In the 4x enlarged images the differences were significant, with the bicubic interpolated images again presenting the best results. Hqnx-resized images presented better quality than 4xSaI and nearest-neighbor interpolated images; however, the intense "halo effect" greatly degrades the definition and boundaries of the image contents. Conclusion: The hqnx and nxSaI algorithms were designed for images with clear edges, so their use in Nuclear Medicine images is clearly inadequate. Of the algorithms studied, bicubic interpolation seems the most suitable, and its ever-wider adoption appears to confirm it as an efficient algorithm across image types.
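The nearest-neighbour versus bicubic comparison can be reproduced with standard tools (hqnx and nxSaI are pixel-art scalers without a standard Python implementation); a sketch using Pillow, with a hypothetical input file:

```python
from PIL import Image

def upsample(path, factor=4):
    """Enlarge an image with nearest-neighbour and bicubic resampling so the
    two interpolation behaviours can be compared side by side."""
    img = Image.open(path)
    size = (img.width * factor, img.height * factor)
    nearest = img.resize(size, Image.NEAREST)   # blocky, preserves pixel edges
    bicubic = img.resize(size, Image.BICUBIC)   # smooth, may ring at edges
    return nearest, bicubic

# near, cub = upsample("spect_slice.png")   # hypothetical input file
```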
Abstract:
Introduction: A major focus of the data mining process – especially of machine learning research – is to automatically learn to recognize complex patterns and to support adequate decisions based strictly on the acquired data. Since imaging techniques such as MPI – Myocardial Perfusion Imaging in Nuclear Cardiology – can account for a large part of the daily workflow and generate gigabytes of data, computerized analysis of the data can offer advantages over human analysis: shorter time, homogeneity and consistency, automatic recording of analysis results, relatively low cost, etc. Objectives: This study evaluates the efficacy of this methodology in the evaluation of MPI stress studies and in the decision of whether to continue the evaluation of each patient. The objective was to automatically classify a patient test into one of three groups: "Positive", "Negative" and "Indeterminate". "Positive" tests would proceed directly to the rest part of the exam, "Negative" tests would be directly exempted from continuation, and only the "Indeterminate" group would require the clinician's analysis, thereby saving clinicians' effort, increasing workflow fluidity at the technologists' level and probably sparing patients' time. Methods: The WEKA v3.6.2 open-source software was used for a comparative analysis of three WEKA algorithms ("OneR", "J48" and "Naïve Bayes") in a retrospective study of the "SPECT Heart Dataset", available at the University of California, Irvine Machine Learning Repository, using the corresponding clinical results – signed off by expert nuclear cardiologists – as reference. For evaluation purposes, criteria such as "Precision", "Incorrectly Classified Instances" and "Receiver Operating Characteristic (ROC) Areas" were considered. Results: The interpretation of the data suggests that the Naïve Bayes algorithm has the best performance among the three selected algorithms. Conclusions: It is believed – and apparently supported by the findings – that machine learning algorithms could significantly assist, at an intermediary level, in the analysis of scintigraphic data obtained with MPI, namely after the stress acquisition, eventually increasing the efficiency of the entire system and potentially easing the roles of both technologists and nuclear cardiologists. In the continuation of this study, it is planned to use more patient information and to significantly increase the population under study, in order to improve the system's accuracy.
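An analogous workflow can be sketched in Python with scikit-learn (the study itself used WEKA); the data here are random placeholders merely shaped like the SPECT Heart feature vectors, and the triage thresholds are illustrative:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

# Placeholder data standing in for the SPECT Heart instances
rng = np.random.default_rng(0)
X = rng.normal(size=(267, 22))
y = rng.integers(0, 2, size=267)        # 0 = normal, 1 = abnormal

clf = GaussianNB()
print("mean ROC AUC:", cross_val_score(clf, X, y, cv=10,
                                       scoring="roc_auc").mean())

# Three-way triage by posterior probability, mirroring the abstract's idea:
clf.fit(X[:200], y[:200])
proba = clf.predict_proba(X[200:])[:, 1]
triage = np.where(proba >= 0.8, "Positive",
                  np.where(proba <= 0.2, "Negative", "Indeterminate"))
print(triage[:10])   # only "Indeterminate" cases would go to the clinician
```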
Abstract:
1st Mares Conference on Marine Ecosystems Health and Conservation. Olhão, Portugal 17-21 November 2014.
Abstract:
Master's dissertation for obtaining the degree of Master in Mechanical Engineering, Maintenance and Production branch
Abstract:
Project work carried out to obtain the degree of Master in Informatics and Computer Engineering
Abstract:
The development of pedestrian localization systems using dead-reckoning techniques has proven to be an expanding area, in academia and beyond. Some solutions have been created; however, not all of them can be easily brought to market, whether because of expensive hardware or because the system itself was developed with one particular scenario in mind. INPERLYS is a system that aims to provide a pedestrian localization solution independent of the scenario, using resources that can be easily deployed. It is a system that uses a dead-reckoning technique to provide the user's location. In outdoor scenarios, a GPS receiver supplies the user's position, giving the system an absolute position. When GPS cannot be used, a MEMS sensor and a compass are used to obtain positions relative to the last valid GPS fix. The ZigBee™ wireless communication protocol was used to interconnect all the sensors; it was chosen for factors such as its low power consumption and low cost. The system thus becomes easy and comfortable to use, unlike similar systems that rely on cables to interconnect their components. The MEMS accelerometer reads the horizontal acceleration at foot level. This acceleration is fed to an algorithm that recognizes the acceleration pattern in order to detect the steps taken. After a step is detected, the maximum acceleration recorded during that step is sent to the coordinator to obtain the displacement travelled. Some tests were carried out to assess the efficiency of INPERLYS. The tests took place on a flat course, at normal speed and with normal strides. It was found that, at this stage, the system's performance can still be improved, both in the management of communications and in the recognition of the horizontal acceleration pattern, which is essential for step detection. Nevertheless, the system is able to provide the position through GPS, whenever its use is possible, and is able to provide the direction of movement.
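The step-detection stage described – recognizing a pattern in the foot-level horizontal acceleration – can be sketched as a simple threshold-plus-refractory peak detector; the threshold, window and names below are illustrative, not the INPERLYS algorithm:

```python
def detect_steps(accel_mag, threshold=1.5, refractory=30):
    """Count steps as local maxima of the horizontal acceleration magnitude
    that exceed a threshold, then skip a refractory window of samples so a
    single stride is not detected twice."""
    steps, hold = [], 0
    for i in range(1, len(accel_mag) - 1):
        if hold > 0:
            hold -= 1
            continue
        if (accel_mag[i] > threshold
                and accel_mag[i] >= accel_mag[i - 1]
                and accel_mag[i] > accel_mag[i + 1]):
            steps.append(i)          # sample index of the detected step
            hold = refractory
    return steps

# Illustrative trace: two acceleration peaks -> two detected steps
trace = [0.1, 0.3, 2.0, 0.4, 0.2] * 2
print(detect_steps(trace, refractory=3))   # -> [2, 7]
```

The peak acceleration of each detected step would then be mapped to a stride length and projected along the compass heading, as in the dead-reckoning sketch earlier in this listing.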
Abstract:
Dissertation for obtaining the degree of Master in Electrical Engineering - Energy branch