984 results for Meta-heuristics algorithms
Abstract:
In this paper, a solution to a highly constrained and non-convex economic dispatch (ED) problem is presented, using a meta-heuristic technique named Sensing Cloud Optimization (SCO). The proposed meta-heuristic is based on a cloud of particles whose central point represents the objective function value, while the remaining particles act as sensors that "fill" the search space and "guide" the central particle so that it moves in the best direction. To demonstrate its performance, a case study with multi-fuel units and valve-point effects is presented.
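No pseudocode is given in this listing; a minimal Python sketch of the cloud-of-sensors idea, under our own assumptions about how the sensors are sampled and how the central particle moves (illustrative only, not the authors' actual SCO), could look like this:

```python
import numpy as np

def sensing_cloud_minimize(f, x0, radius=1.0, n_sensors=20,
                           shrink=0.95, iters=200, seed=0):
    """Toy cloud-of-sensors minimizer (not the authors' SCO):
    sensor particles sample around a central point and pull it
    toward the best direction found."""
    rng = np.random.default_rng(seed)
    center = np.asarray(x0, dtype=float)
    best_val = f(center)
    for _ in range(iters):
        # Sensors "fill" a ball around the central particle.
        sensors = center + radius * rng.normal(size=(n_sensors, center.size))
        vals = np.array([f(s) for s in sensors])
        k = vals.argmin()
        if vals[k] < best_val:           # move toward the best sensor
            center, best_val = sensors[k], vals[k]
        else:                            # no improvement: tighten the cloud
            radius *= shrink
    return center, best_val

# Example: a non-convex test function with valve-point-like ripples.
f = lambda x: np.sum(x**2) + 5 * np.abs(np.sin(5 * x)).sum()
x_best, v_best = sensing_cloud_minimize(f, x0=np.ones(3) * 3.0)
```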
Abstract:
Introduction - The increase in TB burden is usually related to inadequate case detection, diagnosis and cure. The global targets for TB control adopted by the World Health Organization (WHO) are to detect 70% of the estimated incidence of sputum smear-positive TB and to cure 85% of newly detected cases of sputum smear-positive TB. Factors associated with unsuccessful treatment outcomes are closely related to TB risk factors. Objectives - To describe treatment success rates in pulmonary TB cases and to identify factors associated with unsuccessful treatment outcomes, according to ad hoc studies.
Abstract:
This work investigates the impact of treating breast cancer with different radiation therapy (RT) techniques – forwardly-planned intensity-modulated RT (f-IMRT), inversely-planned IMRT and dynamic conformal arc RT (DCART) – and their effects on whole-breast irradiation and on the undesirable irradiation of the surrounding healthy tissues. Two algorithms of the iPlan BrainLAB treatment planning system were compared: Pencil Beam Convolution (PBC) and commercial Monte Carlo (iMC). Seven left-sided breast cancer patients submitted to breast-conserving surgery were enrolled in the study. For each patient, four RT techniques – f-IMRT, IMRT using 2 fields and 5 fields (IMRT2 and IMRT5, respectively) and DCART – were applied. The dose distributions in the planning target volume (PTV) and the doses to the organs at risk (OAR) were compared by analyzing dose–volume histograms; further statistical analysis was performed using IBM SPSS v20 software. For PBC, all techniques provided adequate coverage of the PTV. However, statistically significant dose differences were observed between the techniques in the PTV, in the OAR and also in the pattern of dose distribution spreading into normal tissues. IMRT5 and DCART spread low doses into greater volumes of normal tissue, right breast, right lung and heart than the tangential techniques. However, IMRT5 plans improved the dose distributions in the PTV, exhibiting better conformity and homogeneity in the target and reduced high-dose percentages in the ipsilateral OAR. DCART did not present advantages over any of the other techniques investigated. Differences were also found when comparing the calculation algorithms: PBC estimated higher doses to the PTV, ipsilateral lung and heart than the iMC algorithm predicted.
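As an illustration of the comparison method, a cumulative dose–volume histogram reduces a 3D dose grid over a structure to the fraction of its volume receiving at least each dose level. A minimal sketch on synthetic data (our own illustration, not the iPlan or SPSS workflow):

```python
import numpy as np

def cumulative_dvh(dose, mask, bins=100):
    """Cumulative DVH: fraction of the masked structure (e.g. PTV
    or an organ at risk) receiving at least each dose level."""
    d = dose[mask]
    levels = np.linspace(0.0, d.max(), bins)
    volume_fraction = np.array([(d >= lv).mean() for lv in levels])
    return levels, volume_fraction

# Synthetic example: a fake 3D dose grid and a cubic "structure" mask.
rng = np.random.default_rng(1)
dose = rng.gamma(shape=9.0, scale=5.0, size=(40, 40, 40))  # arbitrary Gy
mask = np.zeros_like(dose, dtype=bool)
mask[10:30, 10:30, 10:30] = True
levels, vol = cumulative_dvh(dose, mask)
```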
Abstract:
Introduction: Image resizing is a standard feature of Nuclear Medicine digital imaging, applied whenever there is a need to increase - or decrease - the total number of pixels; upsampling is performed by manufacturers so that the acquired images better fit the display screen. This paper intends to compare the "hqnx" and "nxSaI" magnification algorithms with two interpolation algorithms – "nearest neighbor" and "bicubic interpolation" – in image upsampling operations. Material and Methods: Three distinct Nuclear Medicine images were enlarged 2 and 4 times with the different digital image resizing algorithms (nearest neighbor, bicubic interpolation, nxSaI and hqnx). To evaluate the pixel changes between the different output images, 3D whole-image plot profiles and surface plots were used in addition to visual inspection of the 4x upsampled images. Results: In the 2x enlarged images the visual differences were not particularly noteworthy, although bicubic interpolation clearly presented the best results. In the 4x enlarged images the differences were significant, with the bicubic-interpolated images presenting the best results. Hqnx-resized images presented better quality than the 4xSaI and nearest-neighbor interpolated images; however, their intense "halo effect" greatly degrades the definition and boundaries of the image contents. Conclusion: The hqnx and nxSaI algorithms were designed for images with clear edges, so their use on Nuclear Medicine images is clearly inadequate. Of the algorithms studied, bicubic interpolation appears to be the most suitable, and its ever wider range of applications seems to confirm this, establishing it as an efficient algorithm across image types.
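For reference, the two baseline interpolation methods are widely available; a hedged sketch of 4x upsampling with nearest-neighbor and bicubic interpolation, here using the Pillow library (version 9.1 or later for Image.Resampling) on a synthetic image rather than the study's scintigrams:

```python
from PIL import Image  # assumes Pillow >= 9.1 is installed
import numpy as np

def upsample(img: Image.Image, factor: int, method) -> Image.Image:
    """Integer upsampling with a chosen interpolation method."""
    w, h = img.size
    return img.resize((w * factor, h * factor), resample=method)

# Synthetic low-resolution "scintigram": a smooth blob on a 64x64 grid.
y, x = np.mgrid[0:64, 0:64]
blob = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 200.0)
img = Image.fromarray((255 * blob).astype(np.uint8))

nn = upsample(img, 4, Image.Resampling.NEAREST)   # blocky edges
bc = upsample(img, 4, Image.Resampling.BICUBIC)   # smooth gradients
```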
Abstract:
Introduction: A major focus of the data mining process - especially of machine learning research - is to automatically learn to recognize complex patterns and to help make adequate decisions based strictly on the acquired data. Since imaging techniques like Myocardial Perfusion Imaging (MPI) in Nuclear Cardiology can take up a large part of the daily workflow and generate gigabytes of data, computerized analysis may offer advantages over human analysis: shorter time, homogeneity and consistency, automatic recording of analysis results, relatively low cost, etc. Objectives: The aim of this study is to evaluate the efficacy of this methodology in the evaluation of MPI stress studies and in deciding whether or not to continue the evaluation of each patient. The objective pursued was to automatically classify each patient test into one of three groups: "Positive", "Negative" and "Indeterminate". "Positive" tests would proceed directly to the rest part of the exam, "Negative" tests would be directly exempted from continuation, and only the "Indeterminate" group would require the clinician's analysis, thus saving clinician effort, increasing workflow fluidity at the technologist's level and probably saving patients' time. Methods: The WEKA v3.6.2 open-source software was used to perform a comparative analysis of three WEKA algorithms ("OneR", "J48" and "Naïve Bayes") in a retrospective study of the "SPECT Heart Dataset", available at the University of California, Irvine Machine Learning Repository, using the corresponding clinical results, signed off by expert nuclear cardiologists, as the reference. For evaluation purposes, criteria such as "Precision", "Incorrectly Classified Instances" and "Receiver Operating Characteristic (ROC) Areas" were considered. Results: The interpretation of the data suggests that the Naïve Bayes algorithm has the best performance among the three selected algorithms. Conclusions: It is believed - and apparently supported by the findings - that machine learning algorithms could significantly assist, at an intermediate level, in the analysis of scintigraphic data obtained by MPI, namely after stress acquisition, eventually increasing the efficiency of the entire system and potentially easing the roles of both technologists and nuclear cardiologists. In the continuation of this study, it is planned to use more patient information and to significantly increase the population under study, in order to improve system accuracy.
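A rough Python analogue of this comparison, assuming scikit-learn in place of WEKA and synthetic binary features standing in for the UCI SPECT Heart attributes (BernoulliNB approximates Naïve Bayes and a depth-limited decision tree loosely approximates J48):

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import BernoulliNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in: 267 patients with 22 binary partial-diagnosis
# features, roughly the shape of the UCI SPECT Heart dataset.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(267, 22))
y = (X[:, :5].sum(axis=1) + rng.random(267) > 3).astype(int)

for name, clf in [("Naive Bayes", BernoulliNB()),
                  ("J48-like tree", DecisionTreeClassifier(max_depth=5))]:
    acc = cross_val_score(clf, X, y, cv=10).mean()
    print(f"{name}: mean 10-fold accuracy = {acc:.3f}")
```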
Abstract:
Introduction: Exhaustive and/or unaccustomed exercise, mainly exercise involving eccentric muscle actions, induces temporary muscle damage, evidenced by delayed onset muscle soreness. Different strategies to alleviate the signs and symptoms of this myogenic condition have been studied, and as a result a significant number of articles on this issue have been published. Purpose: A systematic review was conducted to assess the evidence for physiotherapeutic interventions in exercise-induced muscle damage. Methods: Electronic databases were searched, including MEDLINE (1996-2011), CINAHL (1982-2011), EMBASE (1988-2011), PEDro (1950-2011) and SPORTDiscus (1985-2011). The systematic review was limited to randomized controlled trials (RCTs), written in English or Portuguese, that included physiotherapeutic interventions, namely massage, cryotherapy, stretching and low-intensity exercise, on adult human subjects (18-60 years old) of either gender. Studies were excluded when the intervention could not be assessed independently. The methodological quality of the RCTs was independently assessed with the PEDro Scale by three reviewers. Results: Thirty-three studies were included in the systematic review; eight analyzed the effects of massage, ten analyzed the effects of cryotherapy, eight the effects of stretching and seventeen focused on low-intensity exercise interventions. The results suggest that massage is the most effective intervention and that the evidence for cryotherapy is inconclusive; for the other interventions, namely stretching and low-intensity exercise, there is no evidence to prove their efficacy. Conclusion: The results support the conclusion that massage is the physiotherapeutic intervention demonstrated to be the most effective in relieving the signs and symptoms of exercise-induced muscle damage; consequently, massage should continue to be used for muscle recovery after sports activities.
Abstract:
OBJECTIVE: To analyze the evidence in the literature on the effect of birth weight on the occurrence of metabolic syndrome in adults. METHODS: The PubMed and LILACS databases were searched for articles published between 1966 and May 2006, using the following descriptors: "birth weight", "birthweight", "intra-uterine growth restriction (IUGR)", "fetal growth retardation", "metabolic syndrome", "syndrome X", "Reaven's X syndrome". A total of 224 eligible studies reporting estimates of the association between birth weight and metabolic syndrome or its components were selected. Of these, 11 reported odds ratios and were used in the meta-analysis. RESULTS: With the exception of two studies, all reported an inverse association between birth weight and metabolic syndrome. Compared with people of normal birth weight, the pooled odds ratio for those born with low birth weight was 2.53 (95% CI: 1.57;4.08). The funnel plot suggests publication bias, and the result remains statistically significant even in studies with more than 400 people (pooled effect 2.37; 95% CI: 1.15;4.90). CONCLUSIONS: Low birth weight increases the risk of metabolic syndrome in adulthood.
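A pooled odds ratio of this kind is typically obtained by inverse-variance weighting of the per-study log odds ratios; a minimal fixed-effect sketch with hypothetical study values (not the 11 studies actually used in this review):

```python
import numpy as np

def pooled_or(odds_ratios, ci_lowers, ci_uppers):
    """Fixed-effect inverse-variance pooling of log odds ratios.
    The SE of each log(OR) is recovered from its 95% CI width."""
    log_or = np.log(odds_ratios)
    se = (np.log(ci_uppers) - np.log(ci_lowers)) / (2 * 1.96)
    w = 1.0 / se**2
    pooled = np.sum(w * log_or) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
    return np.exp(pooled), ci

# Hypothetical per-study odds ratios and 95% CIs, for illustration.
ors  = np.array([2.1, 3.0, 1.8, 2.6])
low  = np.array([1.2, 1.5, 0.9, 1.4])
high = np.array([3.7, 6.0, 3.6, 4.8])
print(pooled_or(ors, low, high))
```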
Abstract:
Dissertation submitted to obtain the degree of Master in Informatics Engineering
Abstract:
When defining a product-flow layout, or production line, it is necessary to select the best combinations of tasks to be performed at each station / workplace, so that the work is executed in a feasible sequence and approximately equal amounts of time are required at each station / workplace. This process is called production line balancing. Workstations and equipment can be combined in many different ways; hence, balancing production lines entails distributing sequential activities among workplaces so as to achieve high utilization of labour and equipment and to minimize idle time. Line balancing problems are typically complex to handle, due to the large number of possible combinations. The methods used to solve these problems include trial-and-error methods, heuristic methods, computational methods that evaluate different options until a good solution is found, and optimization methods. The objective of this work was the development of a computational tool to perform production line balancing using genetic algorithms. An application was developed that implements two genetic algorithms, a first one that obtains solutions to the problem and a second one that optimizes those solutions, together with a graphical interface in C# that allows the problem to be entered and the results to be visualized. Feasible results were obtained, demonstrating advantages over heuristic methods, since more than one solution can be obtained. Moreover, for complex problems the use of the developed application becomes more practical. However, the application allows at most six precedences per operation and results with at most nine workstations.
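As a sketch of the general approach, the following toy genetic algorithm balances a small line under our own assumptions (the dissertation's encoding and operators are not reproduced): random-key chromosomes encode task priorities, decoding packs precedence-feasible tasks into stations of a fixed cycle time, and fitness is the number of stations opened.

```python
import random

# Illustrative task durations and precedence relations.
tasks = {"A": 4, "B": 3, "C": 5, "D": 2, "E": 4}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"], "E": ["D"]}
CYCLE = 8  # maximum workload per station

def decode(keys):
    """Turn a priority vector into a feasible station assignment."""
    done, stations, load = set(), [[]], 0
    while len(done) < len(tasks):
        ready = [t for t in tasks if t not in done
                 and all(p in done for p in preds[t])]
        t = max(ready, key=lambda t: keys[t])   # highest-priority ready task
        if load + tasks[t] > CYCLE:             # open a new station
            stations.append([]); load = 0
        stations[-1].append(t); load += tasks[t]; done.add(t)
    return stations

def fitness(keys):
    return len(decode(keys))                    # fewer stations = better

random.seed(0)
pop = [{t: random.random() for t in tasks} for _ in range(30)]
for _ in range(50):
    pop.sort(key=fitness)
    parents = pop[:10]                          # elitist selection
    children = []
    for _ in range(20):
        a, b = random.sample(parents, 2)        # blend crossover + mutation
        children.append({t: (a[t] + b[t]) / 2 + random.gauss(0, 0.1)
                         for t in tasks})
    pop = parents + children
best = min(pop, key=fitness)
print(decode(best), fitness(best))
```

The random-key encoding keeps every chromosome precedence-feasible by construction, which is one common way to sidestep repair operators in line balancing GAs.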
Abstract:
COPD is one of the main worldwide causes of morbidity and mortality, representing a public health problem due to the associated high consumption of health and economic resources. Pulmonary rehabilitation is a standard recommendation in the care of these patients, in order to control the symptoms of the disease and optimize the patients' functional capacity, thereby reducing the health costs associated with exacerbations and with limitations of activity and participation. However, in patients with severe COPD physical exercise may be difficult to perform due to extreme dyspnea, decreased muscle strength and fatigue, or even hypoxemia and dyspnea during small efforts and daily activities, limiting their quality of life. Thus, NIV has been used in combination with exercise in order to improve exercise capacity in these patients, although there is no consensus for its recommendation. Our objective was therefore to verify, through a systematic review and meta-analysis, whether the use of NIV during exercise is more effective than exercise without NIV with respect to dyspnea, walked distance, blood gas values and health status in patients with COPD.
Abstract:
Master's degree in Physiotherapy
Abstract:
Objectives - To identify factors associated with PTB in recently published studies and to quantify significant pooled measures for previously identified PTB risk factors.
Abstract:
The paper formulates a genetic algorithm that evolves two types of objects in a plane. The fitness function promotes a relationship between the objects that is optimal when some kind of interface between them occurs. Furthermore, the algorithm adopts a hexagonal tessellation of the two-dimensional space to provide an efficient method of neighbour modelling. The genetic algorithm produces special patterns resembling those revealed in percolation phenomena or in the symbiosis found in lichens. Besides the analysis of the spatial layout, the time evolution is modelled by adopting a distance measure and by modelling in the Fourier domain from the perspective of fractional calculus. The results reveal a consistent, and easy to interpret, set of model parameters for distinct operating conditions.
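A hedged sketch of the hexagonal neighbourhood bookkeeping and of an interface-rewarding fitness, using axial coordinates (one common representation; the paper's own coordinate scheme and fitness function are not reproduced):

```python
import numpy as np

# Axial-coordinate hexagonal grid: every cell (q, r) has exactly six
# neighbours, which makes neighbourhood bookkeeping uniform and cheap.
HEX_DIRS = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]

def hex_neighbours(q, r):
    return [(q + dq, r + dr) for dq, dr in HEX_DIRS]

def interface_score(grid):
    """Toy fitness in the spirit of the paper: count adjacent pairs of
    unlike object types (0/1), rewarding a large interface between them."""
    score = 0
    for (q, r), kind in grid.items():
        for nb in hex_neighbours(q, r):
            if nb in grid and grid[nb] != kind:
                score += 1
    return score // 2  # each unlike pair was counted from both sides

# Random initial population of the two object types on a 10x10 patch.
rng = np.random.default_rng(0)
grid = {(q, r): int(rng.integers(0, 2))
        for q in range(10) for r in range(10)}
print(interface_score(grid))
```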
Abstract:
Dissertation submitted to obtain the degree of Master in Electrotechnical Engineering, Energy Branch