980 results for Algorithm efficiency
Abstract:
Introduction: A major focus of the data mining process, especially in machine learning research, is to automatically learn to recognize complex patterns and help make adequate decisions strictly based on the acquired data. Since imaging techniques like MPI (Myocardial Perfusion Imaging) in Nuclear Cardiology can take up a large part of the daily workflow and generate gigabytes of data, computerized analysis can offer advantages over human analysis: shorter time, homogeneity and consistency, automatic recording of analysis results, relatively low cost, etc. Objectives: The aim of this study is to evaluate the efficacy of this methodology in the assessment of MPI Stress studies and in deciding whether or not to continue the evaluation of each patient. The objective pursued was to automatically classify each patient test into one of three groups: "Positive", "Negative" and "Indeterminate". "Positive" tests would proceed directly to the Rest part of the exam, "Negative" tests would be directly exempted from continuation, and only the "Indeterminate" group would require the clinician's analysis, thus economizing the clinician's effort, increasing workflow fluidity at the technologist's level and probably saving patients' time. Methods: The WEKA v3.6.2 open source software was used for a comparative analysis of three WEKA algorithms ("OneR", "J48" and "Naïve Bayes") in a retrospective study, using the corresponding clinical results, signed by expert nuclear cardiologists, as the reference, on the "SPECT Heart Dataset", available at the University of California, Irvine, Machine Learning Repository. For evaluation purposes, criteria such as "Precision", "Incorrectly Classified Instances" and "Receiver Operating Characteristic (ROC) Areas" were considered.
Results: The interpretation of the data suggests that the Naïve Bayes algorithm has the best performance among the three selected algorithms. Conclusions: It is believed, and apparently supported by the findings, that machine learning algorithms could significantly assist, at an intermediary level, in the analysis of scintigraphic data obtained in MPI, namely after the Stress acquisition, eventually increasing the efficiency of the entire system and potentially easing the roles of both Technologists and Nuclear Cardiologists. As a continuation of this study, it is planned to use more patient information and to significantly increase the population under study, in order to improve the system's accuracy.
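The SPECT Heart Dataset mentioned above consists of binary features, for which Naïve Bayes takes a particularly simple form. As a rough illustration (not the WEKA implementation, and using invented toy data), a Bernoulli Naïve Bayes classifier can be sketched in a few lines:

```python
import math

def train_bernoulli_nb(X, y, alpha=1.0):
    """Fit a Bernoulli Naive Bayes model on binary feature vectors.

    Returns class priors and Laplace-smoothed conditionals P(x_j = 1 | c).
    """
    classes = sorted(set(y))
    n_features = len(X[0])
    priors, cond = {}, {}
    for c in classes:
        rows = [x for x, label in zip(X, y) if label == c]
        priors[c] = len(rows) / len(y)
        cond[c] = [(sum(r[j] for r in rows) + alpha) / (len(rows) + 2 * alpha)
                   for j in range(n_features)]
    return priors, cond

def predict(priors, cond, x):
    """Return the class with the highest log-posterior for binary vector x."""
    best, best_lp = None, float("-inf")
    for c, prior in priors.items():
        lp = math.log(prior)
        for j, xj in enumerate(x):
            p = cond[c][j]
            lp += math.log(p if xj else 1.0 - p)
        if lp > best_lp:
            best, best_lp = c, lp
    return best
```

On a toy training set such as `train_bernoulli_nb([[1,1,0],[1,1,1],[0,0,1],[0,0,0]], [1,1,0,0])`, the model recovers the obvious decision rule; the "Indeterminate" class of the study would correspond to thresholding the posterior rather than always taking the argmax.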
Abstract:
The BALA project (Biodiversity of Arthropods of Laurisilva of the Azores) is a research initiative to quantify the spatial distribution of arthropod biodiversity in the native forests of the Azores archipelago. Arthropods were collected using a combination of two techniques targeting epigean (ground-dwelling) and canopy (arboreal) arthropods: pitfall traps (with Turquin and Ethylene solutions) and beating samples (using the three most dominant plant species). A total of 109 transects distributed amongst 18 forest fragments on seven of the nine Azorean islands were used in this study. The performance of alternative sampling methods and levels of effort was tested. No significant differences were found in the accumulated number of species captured, whether an alternative method was used or another transect with similar effort was established at another location within the same fragment. A combination of Ethylene and Turquin traps captured more species per individual, Turquin traps and beating captured more species per sample, and Turquin traps captured more species per unit time. An optimization exercise showed that the protocol applied in recent years is very close to optimal, allowing its future replication with confidence. The minimum combinations of sampling effort and methods required to monitor or to inventory diversity, for different proportions of sample completeness, are discussed.
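Comparisons such as "more species per sample" or "more species per unit time" are, in essence, efficiency ratios computed over a species accumulation process. A minimal sketch of both quantities (illustrative only, with invented sample data, not the BALA protocol):

```python
def species_accumulation(samples):
    """Cumulative number of distinct species after each successive sample.

    samples: list of sets, one set of observed species per sample
    (e.g. per pitfall trap or per beating sample).
    """
    seen, curve = set(), []
    for s in samples:
        seen |= s
        curve.append(len(seen))
    return curve

def species_per_unit(samples, effort_units):
    """Efficiency ratio: distinct species captured per unit of effort
    (individuals handled, samples taken, or time spent)."""
    return len(set().union(*samples)) / effort_units
```

For example, three samples `[{'a','b'}, {'b','c'}, {'c'}]` give the accumulation curve `[2, 3, 3]`; the flattening of such curves is what allows an optimization exercise to judge when additional effort stops paying off.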
Abstract:
OBJECTIVE: The Integrated Management of Childhood Illness is a strategy designed to address major causes of child mortality. The aim of this study was to assess the impact of the strategy on the quality of child health care provided at primary facilities. METHODS: Child health quality of care and costs were compared in four states in Northeastern Brazil in 2001. Forty-eight health facilities considered to have had stable strategy implementation for at least two years before the start of the study were compared with 48 matched comparison facilities in the same states. A single measure of correct management of sick children was used to assess the care provided to all sick children. Costs included all resources at the national, state, local and facility levels associated with child health care. RESULTS: Facilities providing strategy-based care managed sick children significantly better at no additional cost to their municipalities relative to the comparison municipalities. At strategy facilities, 72% of children were correctly managed, compared with 56% at comparison facilities (p=0.001). After standardization for population size, the cost per child managed correctly was US$13.20 in the strategy municipalities versus US$21.05 in the comparison municipalities. CONCLUSIONS: The strategy improves the efficiency of primary facilities in Northeastern Brazil. It leads to better health outcomes at no extra cost.
Abstract:
Dissertation for obtaining the degree of Master in Electrical Engineering, Automation and Industrial Electronics branch
Abstract:
Linear unmixing decomposes a hyperspectral image into a collection of reflectance spectra of the materials present in the scene, called endmember signatures, and the corresponding abundance fractions at each pixel in a spatial area of interest. This paper introduces a new unmixing method, called Dependent Component Analysis (DECA), which overcomes the limitations of unmixing methods based on Independent Component Analysis (ICA) and on geometrical properties of hyperspectral data. DECA models the abundance fractions as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. The mixing matrix is inferred by a generalized expectation-maximization (GEM) type algorithm. The performance of the method is illustrated using simulated and real data.
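The generative model underlying DECA can be sketched directly: each pixel is a convex combination of endmember signatures, with abundances drawn from a Dirichlet density, which enforces exactly the non-negativity and sum-to-one constraints mentioned above. A toy simulation (not the DECA inference itself; the two-band signatures below are invented):

```python
import random

def simulate_pixel(endmembers, alpha, rng=None):
    """Simulate one pixel under the linear mixing model.

    Abundances are drawn from a Dirichlet(alpha) distribution via normalized
    Gamma draws, so they are non-negative and sum to one; the pixel is the
    abundance-weighted sum of the endmember spectra (one value per band).
    """
    rng = rng or random.Random()
    draws = [rng.gammavariate(a, 1.0) for a in alpha]
    total = sum(draws)
    abundances = [d / total for d in draws]
    n_bands = len(endmembers[0])
    pixel = [sum(abundances[k] * endmembers[k][b]
                 for k in range(len(endmembers)))
             for b in range(n_bands)]
    return pixel, abundances
```

Because the abundances form a convex combination, every simulated band value lies between the smallest and largest endmember reflectance for that band; DECA inverts this process, inferring the mixing matrix from the observed pixels by a GEM-type algorithm.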
Abstract:
Chapter in book proceedings with peer review. First Iberian Conference, IbPRIA 2003, Puerto de Andratx, Mallorca, Spain, June 4-6, 2003. Proceedings.
Abstract:
Given a set of mixed spectral (multispectral or hyperspectral) vectors, linear spectral mixture analysis, or linear unmixing, aims at estimating the number of reference substances, also called endmembers, their spectral signatures, and their abundance fractions. This paper presents a new method for unsupervised endmember extraction from hyperspectral data, termed vertex component analysis (VCA). The algorithm exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. In a series of experiments using simulated and real data, the VCA algorithm competes with state-of-the-art methods, with a computational complexity between one and two orders of magnitude lower than the best available method.
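The two simplex facts VCA exploits have a consequence worth illustrating: under any linear projection of the data, the maximizing pixel is always a vertex of the simplex, i.e. a pure pixel. The sketch below flags vertex candidates using random projections; it is a simplified illustration on toy 2-D data, not the actual VCA algorithm:

```python
import random

def extreme_pixel_indices(pixels, n_projections=200, seed=0):
    """Indices of pixels that maximize at least one random linear projection.

    For data lying in a simplex, only vertices (endmember-like pure pixels)
    can maximize a linear functional, so strictly mixed pixels in the
    interior are never flagged.
    """
    rng = random.Random(seed)
    dim = len(pixels[0])
    flagged = set()
    for _ in range(n_projections):
        # Draw a random Gaussian direction and find the pixel with the
        # largest projection onto it.
        d = [rng.gauss(0.0, 1.0) for _ in range(dim)]
        idx = max(range(len(pixels)),
                  key=lambda i: sum(p * q for p, q in zip(pixels[i], d)))
        flagged.add(idx)
    return flagged
```

With three vertices and a few interior mixtures, only the vertices ever appear in the flagged set; VCA refines this idea by projecting onto directions orthogonal to the subspace spanned by the endmembers already found, rather than random ones.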
Abstract:
The calculation of the dose is one of the key steps in radiotherapy planning [1-5]. This calculation should be as accurate as possible, and over the years this became feasible through the implementation of new dose calculation algorithms in the treatment planning systems used in radiotherapy. When a breast tumour is irradiated, a precise dose distribution is fundamental to ensure planning target volume (PTV) coverage and prevent skin complications. Some investigations using breast cases showed that the pencil beam convolution (PBC) algorithm overestimates the dose in the PTV and in the proximal region of the ipsilateral lung, but underestimates the dose in the distal region of the ipsilateral lung, when compared with the analytical anisotropic algorithm (AAA). With this study we aim to compare the performance of the PBC and AAA algorithms in breast tumours.
Abstract:
Conference - 16th International Symposium on Wireless Personal Multimedia Communications (WPMC), Jun 24-27, 2013
Abstract:
This work reports an analysis of the efficiency and time of soil remediation using vapour extraction, and compares results obtained with prepared and with real soils. The main objectives were: (i) to analyse the efficiency and time of remediation according to the water and natural organic matter content of the soil; and (ii) to assess whether a previous study performed with prepared soils could help predict the process viability in real conditions. For sandy soils with negligible clay content, artificially contaminated with cyclohexane before vapour extraction, it was concluded that (i) increases in soil water content and, mainly, in natural organic matter content negatively influenced the remediation process, making it less efficient, more time consuming and consequently more expensive; and (ii) a previous study using prepared soils of similar characteristics proved helpful in predicting the process viability in real conditions.
Abstract:
TiO2 nanorod films were deposited on ITO substrates by dc reactive magnetron sputtering. The structure of these nanorod films was modified by varying the oxygen pressure during the sputtering process. Although all the TiO2 nanorod films deposited at different oxygen pressures show an anatase structure, the orientation of the nanorods varies with the oxygen pressure. Only a very weak (101) diffraction peak can be observed for the TiO2 nanorod film prepared at low oxygen pressure. However, as the oxygen pressure is increased, the (220) diffraction peak appears and its intensity increases with the oxygen pressure. SEM results show that these TiO2 nanorods are perpendicular to the ITO substrate. At low oxygen pressure, the sputtered TiO2 nanorods stick together and have a dense structure. As the oxygen pressure is increased, the sputtered TiO2 nanorods gradually separate and form a porous structure. The optical transmittance of the TiO2 nanorod films was measured and then fitted with the OJL model, and the porosities of the films were calculated. The TiO2 nanorod film prepared at high oxygen pressure shows a high porosity. Dye-sensitized solar cells (DSSCs) were assembled using the TiO2 nanorod films prepared at different oxygen pressures as the photoelectrode. The optimum performance was achieved for the DSSC using the TiO2 nanorod film with the strongest (220) diffraction peak and the highest porosity.
Abstract:
Final Master's project for obtaining the degree of Master in Electronics and Telecommunications Engineering
Abstract:
Dissertation for obtaining the degree of Master in Electrical Engineering, Energy branch
Abstract:
Objective of the study: to compare the performance of the Pencil Beam Convolution (PBC) algorithm and the Analytical Anisotropic Algorithm (AAA) in treatment planning of breast tumours with 3D conformal radiotherapy.
Abstract:
Dissertation prepared for obtaining the degree of Master in Civil Engineering, Structures specialization area