803 results for Sample algorithms
Abstract:
This paper describes a methodology developed for the classification of Medium Voltage (MV) electricity customers. Starting from a sample database resulting from a monitoring campaign, Data Mining (DM) techniques are used to discover a set of typical MV consumer load profiles and, therefore, to extract knowledge regarding electric energy consumption patterns. In the first stage, several hierarchical clustering algorithms were applied and their clustering performance was compared using adequacy measures. In the second stage, a classification model was developed to allow classifying new consumers into one of the clusters obtained in the previous stage. Finally, the interpretation of the discovered knowledge is presented and discussed.
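A minimal sketch of the two-stage procedure described above is given below. The specific clustering algorithms, adequacy measures and load-profile data used in the paper are not stated in the abstract, so the Ward linkage, the nearest-centroid assignment and the synthetic profiles are illustrative assumptions only.

    # Hedged sketch: stage 1 clusters daily load profiles hierarchically; stage 2
    # assigns a new consumer to the nearest cluster centroid. Data are synthetic.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(1)
    # 60 consumers x 24 hourly readings, normalized daily load profiles (illustrative)
    profiles = rng.random((60, 24))
    profiles /= profiles.sum(axis=1, keepdims=True)

    # Stage 1: hierarchical clustering (Ward linkage, assumed) into k typical profiles
    k = 4
    Z = linkage(profiles, method="ward")
    labels = fcluster(Z, t=k, criterion="maxclust")
    centroids = np.array([profiles[labels == c].mean(axis=0) for c in range(1, k + 1)])

    # Stage 2: classify a new MV consumer into the nearest typical load profile
    new_consumer = rng.random(24)
    new_consumer /= new_consumer.sum()
    assigned = np.argmin(np.linalg.norm(centroids - new_consumer, axis=1)) + 1
    print(f"new consumer assigned to cluster {assigned} of {k}")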
Abstract:
Master's degree in Radiotherapy
Abstract:
This paper describes a comparison of adaptations of the QuEChERS (quick, easy, cheap, effective, rugged and safe) approach for the determination of 14 organochlorine pesticide (OCP) residues in strawberry jam by the concurrent use of gas chromatography (GC) coupled to an electron capture detector (ECD) and GC tandem mass spectrometry (GC-MS/MS). Three versions based on the original QuEChERS method were tested. Using the ultrasonic bath, the results were good at five spiked levels (overall average recovery of 89% with 15% RSD). Performance characteristics, such as accuracy, precision, linear range, and limits of detection (LOD) and quantification (LOQ), were determined for each pesticide. LODs ranged from 0.8 to 8.9 μg kg−1; LOQs ranged from 2.5 to 29.8 μg kg−1; and calibration curves were linear (r² > 0.9970) over the whole range of explored concentrations (5–100 μg kg−1). The LODs of these pesticides were much lower than the maximum residue levels (MRLs) allowed in Europe for strawberries. The method was successfully applied to the quantification of OCPs in commercially available jams; the OCP levels found were below the LOD.
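The abstract does not state how the LOD and LOQ values were derived; a common approach estimates them from the calibration regression as 3.3 and 10 times the residual standard deviation divided by the slope. The sketch below illustrates that calculation on hypothetical calibration data, not on the paper's measurements.

    # Illustrative only: LOD/LOQ estimated from a linear calibration fit.
    # The spiked levels and detector responses below are hypothetical.
    import numpy as np

    levels = np.array([5, 10, 25, 50, 100], dtype=float)            # ug/kg
    response = np.array([120, 238, 610, 1215, 2430], dtype=float)   # detector signal

    slope, intercept = np.polyfit(levels, response, 1)              # least-squares line
    pred = slope * levels + intercept
    residual_sd = np.sqrt(np.sum((response - pred) ** 2) / (len(levels) - 2))
    r2 = 1 - np.sum((response - pred) ** 2) / np.sum((response - response.mean()) ** 2)

    lod = 3.3 * residual_sd / slope   # common regression-based estimate (assumption)
    loq = 10.0 * residual_sd / slope
    print(f"r2={r2:.4f}  LOD={lod:.2f} ug/kg  LOQ={loq:.2f} ug/kg")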
Abstract:
Ibuprofen is one of the most used active pharmaceutical ingredients worldwide. A new method for the analysis of ibuprofen and its metabolites, hydroxyibuprofen and carboxyibuprofen, in soils is presented. The extraction of these compounds from the soil matrices was performed by using a modified quick, easy, cheap, effective, rugged, and safe (QuEChERS) method. The method involves a single extraction of the investigated compounds with purified water (acidified at pH 2.5 with hydrochloric acid), and a slow and continuous addition of the QuEChERS content, followed by the addition of acidified acetonitrile (1% acetic acid), prior to the determination by liquid chromatography coupled with fluorescence detection (LC–FLD). Validation studies were carried out using soil samples with a range of organic carbon contents. Recoveries of the fortified samples ranged from 79.5% to 101%. Relative standard deviations for all matrix–compound combinations did not exceed 3%. The method quantification limits were ≤22.4 μg kg−1 in all cases. The developed method was applied to the analysis of sixteen real samples.
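For context, recovery and relative standard deviation figures such as those reported above are typically obtained from replicate measurements of fortified samples; the short sketch below shows that calculation on hypothetical concentrations, not the study's data.

    # Illustrative only: recovery (%) and RSD (%) from replicate fortified samples.
    import statistics

    spiked_level = 50.0                          # ug/kg added to the soil (hypothetical)
    measured = [41.2, 40.5, 39.8, 41.9, 40.1]    # ug/kg found in replicate extractions

    recoveries = [100.0 * m / spiked_level for m in measured]
    mean_recovery = statistics.mean(recoveries)
    rsd = 100.0 * statistics.stdev(measured) / statistics.mean(measured)
    print(f"mean recovery = {mean_recovery:.1f}%  RSD = {rsd:.1f}%")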
Abstract:
In this paper, we present the results of mammography quality control tests performed on Portuguese mammography equipment, both conventional and digital (computed radiography), showing the main differences among the tested units. Quality control in mammography is a very particular area of quality control in radiology, which demands a relatively deep knowledge of physics. Digital imaging is changing the standards of radiographic imaging. In mammography, this is still a controversial issue owing to some limitations of digital detectors, such as resolution. A complete set of results regarding the radiation protection of patients submitted to diagnostic mammography is presented, together with a discussion of the image quality parameters and their interpretation in conventional and digital mammography. In conclusion, we present a sample of results that can be considered characteristic of mammography equipment in Portugal.
Abstract:
This work investigates the impact of treating breast cancer with different radiation therapy (RT) techniques – forwardly-planned intensity-modulated RT (f-IMRT), inversely-planned IMRT and dynamic conformal arc (DCART) RT – and their effects on whole-breast irradiation and on the undesirable irradiation of the surrounding healthy tissues. Two algorithms of the iPlan BrainLAB treatment planning system were compared: Pencil Beam Convolution (PBC) and commercial Monte Carlo (iMC). Seven left-sided breast cancer patients submitted to breast-conserving surgery were enrolled in the study. For each patient, four RT techniques – f-IMRT, IMRT using 2 fields and 5 fields (IMRT2 and IMRT5, respectively) and DCART – were applied. The dose distributions in the planning target volume (PTV) and the doses to the organs at risk (OAR) were compared by analysing dose–volume histograms; further statistical analysis was performed using IBM SPSS v20 software. For PBC, all techniques provided adequate coverage of the PTV. However, statistically significant dose differences were observed between the techniques in the PTV, in the OAR and also in the pattern of dose spread into normal tissues. IMRT5 and DCART spread low doses into greater volumes of normal tissue, right breast, right lung and heart than the tangential techniques. However, IMRT5 plans improved the dose distribution in the PTV, exhibiting better target conformity and homogeneity and reduced high-dose percentages in the ipsilateral OAR. DCART did not present advantages over any of the other techniques investigated. Differences were also found between the calculation algorithms: PBC estimated higher doses for the PTV, ipsilateral lung and heart than the iMC algorithm predicted.
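A minimal sketch of how a cumulative dose–volume histogram can be computed from a 3D dose grid and a binary structure mask is given below; it is not the iPlan/BrainLAB implementation, and the dose grid and PTV mask are synthetic.

    # Hedged sketch: cumulative DVH for one structure from a dose grid and a mask.
    import numpy as np

    def cumulative_dvh(dose_gy, mask, n_bins=200):
        """Return (dose bins, % of structure volume receiving at least that dose)."""
        doses = dose_gy[mask.astype(bool)]
        bins = np.linspace(0.0, doses.max(), n_bins)
        volume_pct = np.array([100.0 * np.mean(doses >= d) for d in bins])
        return bins, volume_pct

    # Hypothetical data: a noisy 50 Gy dose grid and a spherical "PTV" mask
    rng = np.random.default_rng(0)
    dose = rng.normal(50.0, 2.0, size=(60, 60, 60)).clip(min=0.0)
    zz, yy, xx = np.mgrid[:60, :60, :60]
    ptv_mask = (xx - 30) ** 2 + (yy - 30) ** 2 + (zz - 30) ** 2 < 15 ** 2

    bins, vol = cumulative_dvh(dose, ptv_mask)
    idx = np.searchsorted(bins, 47.5)   # 95% of a 50 Gy prescription
    print(f"PTV volume receiving >= 47.5 Gy: {vol[idx]:.1f}%")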
Abstract:
Introduction: Image resizing is a standard feature of Nuclear Medicine digital imaging. Upsampling is done by manufacturers so that the acquired images fit the display screen adequately, and resizing is applied whenever there is a need to increase or decrease the total number of pixels. This paper intends to compare the "hqnx" and "nxSaI" magnification algorithms with two interpolation algorithms – "nearest neighbor" and "bicubic interpolation" – in image upsampling operations. Material and Methods: Three distinct Nuclear Medicine images were enlarged 2 and 4 times with the different digital image resizing algorithms (nearest neighbor, bicubic interpolation, nxSaI and hqnx). To evaluate the pixel changes between the different output images, 3D whole-image plot profiles and surface plots were used in addition to the visual assessment of the 4x upsampled images. Results: In the 2x enlarged images the visual differences were not very noticeable, although bicubic interpolation clearly presented the best results. In the 4x enlarged images the differences were significant, with the bicubic interpolated images again presenting the best results. hqnx-resized images presented better quality than the 4xSaI and nearest-neighbor interpolated images; however, their intense "halo effect" greatly degrades the definition and boundaries of the image contents. Conclusion: The hqnx and nxSaI algorithms were designed for images with clear edges, so their use on Nuclear Medicine images is clearly inadequate. Of the algorithms studied, bicubic interpolation seems the most suitable, and its increasingly wide application appears to confirm it as an efficient algorithm for multiple image types.
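The sketch below reproduces the nearest-neighbor versus bicubic comparison in Python with Pillow on a synthetic image; it is not the study's pipeline or data, and the hqnx/nxSaI algorithms are not included because standard imaging libraries do not ship them.

    # Hedged sketch: 2x/4x upsampling with nearest-neighbor and bicubic interpolation,
    # compared along a central line profile. The input image is synthetic.
    import numpy as np
    from PIL import Image

    size = 64
    y, x = np.mgrid[:size, :size]
    img = (255 * np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 50.0)).astype(np.uint8)
    im = Image.fromarray(img)   # stand-in for a Nuclear Medicine acquisition

    for factor in (2, 4):
        nearest = im.resize((size * factor, size * factor), Image.Resampling.NEAREST)
        bicubic = im.resize((size * factor, size * factor), Image.Resampling.BICUBIC)
        row = size * factor // 2
        profile_nn = np.asarray(nearest)[row, :]   # stair-stepped profile
        profile_bc = np.asarray(bicubic)[row, :]   # smooth profile
        print(f"{factor}x: nearest max={profile_nn.max()}, bicubic max={profile_bc.max()}")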
Abstract:
Introduction: A major focus of the data mining process - especially of machine learning research - is to automatically learn to recognize complex patterns and to help make adequate decisions based strictly on the acquired data. Since imaging techniques like Myocardial Perfusion Imaging (MPI) in Nuclear Cardiology can take up a large part of the daily workflow and generate gigabytes of data, computerized analysis of the data could have advantages over human analysis: shorter time, homogeneity and consistency, automatic recording of analysis results, relatively low cost, etc. Objectives: The aim of this study is to evaluate the efficacy of this methodology in the assessment of MPI stress studies and in the decision-making process concerning whether or not to continue the evaluation of each patient. The objective pursued was to automatically classify each patient's test into one of three groups: "Positive", "Negative" and "Indeterminate". "Positive" patients would proceed directly to the rest part of the exam, "Negative" patients would be directly exempted from continuation, and only the "Indeterminate" group would require clinician analysis, thus saving clinician effort, increasing workflow fluidity at the technologist level and probably saving patients' time. Methods: The open-source WEKA v3.6.2 software was used for a comparative analysis of three WEKA algorithms ("OneR", "J48" and "Naïve Bayes") in a retrospective study on the "SPECT Heart Dataset", available at the University of California, Irvine, Machine Learning Repository, using the corresponding clinical results, signed by nuclear cardiologist experts, as reference. For evaluation purposes, criteria such as "Precision", "Incorrectly Classified Instances" and "Receiver Operating Characteristic (ROC) Areas" were considered. Results: The interpretation of the data suggests that the Naïve Bayes algorithm has the best performance among the three selected algorithms. Conclusions: It is believed - and apparently supported by the findings - that machine learning algorithms could significantly assist, at an intermediate level, in the analysis of scintigraphic data obtained in MPI, namely after the stress acquisition, thus potentially increasing the efficiency of the entire system and easing the roles of both technologists and nuclear cardiologists. In the continuation of this study, it is planned to use more patient information and to significantly increase the population under study, in order to improve system accuracy.
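As a rough analogue of the comparison described above, the sketch below contrasts a Naïve Bayes classifier and a decision tree (scikit-learn stand-ins for WEKA's "Naïve Bayes" and "J48"; "OneR" has no direct scikit-learn equivalent) using ROC area under cross-validation. The binary feature matrix is randomly generated, not the UCI SPECT Heart Dataset, and the numbers printed are not the study's results.

    # Hedged sketch: scikit-learn analogues of the WEKA comparison, on synthetic data.
    import numpy as np
    from sklearn.naive_bayes import BernoulliNB
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(42)
    X = rng.integers(0, 2, size=(267, 22))        # SPECT-like shape: 22 binary features
    y = (X[:, :5].sum(axis=1) > 2).astype(int)    # synthetic stand-in label

    for name, clf in [("NaiveBayes", BernoulliNB()),
                      ("DecisionTree", DecisionTreeClassifier(max_depth=4, random_state=0))]:
        auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
        print(f"{name}: mean ROC area = {auc:.3f}")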
Abstract:
Master's degree in Electrical and Computer Engineering
Abstract:
Master's dissertation in Environment, Health and Safety.
Abstract:
Final Master's project for obtaining the Master's degree in Mechanical Engineering
Abstract:
The paper formulates a genetic algorithm that evolves two types of objects in a plane. The fitness function promotes a relationship between the objects that is optimal when some kind of interface between them occurs. Furthermore, the algorithm adopts a hexagonal tessellation of the two-dimensional space to promote an efficient method of neighbour modelling. The genetic algorithm produces special patterns resembling those revealed in percolation phenomena or in the symbiosis found in lichens. Besides the analysis of the spatial layout, the time evolution is modelled by adopting a distance measure and by modelling in the Fourier domain from the perspective of fractional calculus. The results reveal a consistent, and easy to interpret, set of model parameters for distinct operating conditions.
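Since the abstract does not specify the fitness function or genetic operators, the sketch below is only an assumed, minimal version of the idea: a genetic algorithm assigning two object types to the cells of a hexagonal grid, with a fitness that rewards interfaces between unlike neighbours.

    # Hedged sketch: GA on a hexagonal (axial-coordinate) grid; fitness counts edges
    # between cells holding different object types. All parameters are assumptions.
    import random

    SIZE = 12
    HALF_NEIGHBOURS = [(1, 0), (0, 1), (1, -1)]   # three axial directions: each edge counted once

    def fitness(grid):
        score = 0
        for q in range(SIZE):
            for r in range(SIZE):
                for dq, dr in HALF_NEIGHBOURS:
                    nq, nr = q + dq, r + dr
                    if 0 <= nq < SIZE and 0 <= nr < SIZE and grid[q][r] != grid[nq][nr]:
                        score += 1
        return score

    def random_grid():
        return [[random.randint(0, 1) for _ in range(SIZE)] for _ in range(SIZE)]

    def mutate(grid, rate=0.02):
        return [[1 - c if random.random() < rate else c for c in row] for row in grid]

    def crossover(a, b):
        cut = random.randint(1, SIZE - 1)          # single-point crossover over rows
        return [row[:] for row in (a[:cut] + b[cut:])]

    population = [random_grid() for _ in range(40)]
    for generation in range(200):
        population.sort(key=fitness, reverse=True)
        parents = population[:10]                  # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(30)]
        population = parents + children
    print("best interface count:", fitness(max(population, key=fitness)))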
Abstract:
Dissertation for obtaining the Master's degree in Informatics and Computer Engineering
Abstract:
Dissertation for obtaining the Master's degree in Electrical Engineering, branch of Automation and Industrial Electronics