945 results for "Combinatorial optimization algorithms"


Relevance: 20.00%

Abstract:

Master's degree in Electrical and Computer Engineering

Relevance: 20.00%

Abstract:

This work investigates the impact of treating breast cancer with different radiation therapy (RT) techniques – forwardly-planned intensity-modulated RT (f-IMRT), inversely-planned IMRT and dynamic conformal arc RT (DCART) – and their effects on whole-breast irradiation and on the undesirable irradiation of the surrounding healthy tissues. Two algorithms of the iPlan BrainLAB treatment planning system were compared: Pencil Beam Convolution (PBC) and commercial Monte Carlo (iMC). Seven left-sided breast cancer patients submitted to breast-conserving surgery were enrolled in the study. For each patient, four RT techniques – f-IMRT, IMRT using 2 fields and 5 fields (IMRT2 and IMRT5, respectively) and DCART – were applied. The dose distributions in the planned target volume (PTV) and the dose to the organs at risk (OAR) were compared by analyzing dose–volume histograms; further statistical analysis was performed using IBM SPSS v20 software. For PBC, all techniques provided adequate coverage of the PTV. However, statistically significant dose differences were observed between the techniques in the PTV, in the OAR and also in the pattern of dose distribution spreading into normal tissues. IMRT5 and DCART spread low doses into greater volumes of normal tissue, right breast, right lung and heart than the tangential techniques. However, IMRT5 plans improved the distributions for the PTV, exhibiting better conformity and homogeneity in the target and reduced high-dose percentages in the ipsilateral OAR. DCART presented no advantages over any of the techniques investigated. Differences were also found between the calculation algorithms: PBC estimated higher doses for the PTV, ipsilateral lung and heart than the iMC algorithm predicted.
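As an aside on the method: the plan comparison above rests on cumulative dose–volume histograms. A minimal sketch of how a cumulative DVH can be computed from per-voxel doses follows; the function name and the toy dose arrays are illustrative only, not the iPlan/BrainLAB implementation.

```python
# Minimal sketch of a cumulative dose-volume histogram (DVH), the metric used
# above to compare plans. Illustrative only; not the iPlan/BrainLAB code.
import numpy as np

def cumulative_dvh(doses_gy, bin_width=0.1):
    """Fraction of structure volume receiving at least each dose level.

    doses_gy: 1-D array with the dose of every voxel inside one structure
    (e.g. the PTV or an organ at risk), assuming equal voxel volumes.
    """
    edges = np.arange(0.0, doses_gy.max() + bin_width, bin_width)
    # For each dose level d, V(d) = fraction of voxels with dose >= d.
    volume_fraction = np.array([(doses_gy >= d).mean() for d in edges])
    return edges, volume_fraction

# Toy example: compare two hypothetical plans for the same organ at risk.
rng = np.random.default_rng(0)
plan_a = rng.normal(50.0, 2.0, 10_000).clip(min=0)  # tighter dose distribution
plan_b = rng.normal(48.0, 6.0, 10_000).clip(min=0)  # broader low-dose spread
for name, doses in [("plan A", plan_a), ("plan B", plan_b)]:
    d, v = cumulative_dvh(doses)
    v20 = v[np.searchsorted(d, 20.0)]               # V20: volume with >= 20 Gy
    print(f"{name}: V20 = {v20:.1%}")
```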

Relevance: 20.00%

Abstract:

Introduction: Image resizing is a standard feature of Nuclear Medicine digital imaging. Manufacturers apply upsampling to better fit the acquired images to the display screen, and resizing is applied whenever the total number of pixels needs to be increased or decreased. This paper intends to compare the “hqnx” and “nxSaI” magnification algorithms with two interpolation algorithms – “nearest neighbor” and “bicubic interpolation” – in image upsampling operations. Material and Methods: Three distinct Nuclear Medicine images were enlarged 2 and 4 times with the different digital image resizing algorithms (nearest neighbor, bicubic interpolation, nxSaI and hqnx). To evaluate pixel changes between the different output images, 3D whole-image plot profiles and surface plots were used in addition to visual inspection of the 4x upsampled images. Results: In the 2x enlarged images the visual differences were not particularly noteworthy, although bicubic interpolation clearly presented the best results. In the 4x enlarged images the differences were significant, with the bicubic-interpolated images again presenting the best results. Hqnx-resized images presented better quality than the 4xSaI and nearest-neighbor interpolated images; however, their intense “halo effect” greatly degrades the definition and boundaries of the image contents. Conclusion: The hqnx and nxSaI algorithms were designed for images with clear edges, so their use on Nuclear Medicine images is clearly inadequate. Of the algorithms studied, bicubic interpolation appears to be the most suitable, and its ever wider application supports this, as it behaves as an efficient algorithm across image types.
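For illustration, the two classical methods compared above can be reproduced with scipy's spline-based zoom (order 0 for nearest neighbor, order 3 as a cubic-spline stand-in for bicubic); the hq*x and *xSaI pixel-art scalers are not part of standard scientific Python libraries and are omitted. A minimal sketch with a toy image standing in for a Nuclear Medicine acquisition:

```python
# Sketch of the two classical resizing methods compared above. Spline order 0
# gives nearest-neighbour replication; order 3 gives cubic-spline
# interpolation, used here as a stand-in for bicubic.
import numpy as np
from scipy import ndimage

def upsample(image, factor, method):
    """Enlarge a 2-D image by an integer factor."""
    order = {"nearest": 0, "bicubic": 3}[method]  # spline order picks the kernel
    return ndimage.zoom(image, factor, order=order)

# Toy 'hot spot' image standing in for a Nuclear Medicine acquisition.
img = np.zeros((64, 64))
img[28:36, 28:36] = 100.0

for method in ("nearest", "bicubic"):
    big = upsample(img, 4, method)
    # Nearest neighbour keeps hard blocky edges; the cubic kernel smooths
    # transitions (and can slightly overshoot near edges).
    print(method, big.shape, f"max={big.max():.1f}")
```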

Relevance: 20.00%

Abstract:

Introduction: A major focus of the data mining process – especially of machine learning research – is to automatically learn to recognize complex patterns and to help make adequate decisions based strictly on the acquired data. Since imaging techniques like Myocardial Perfusion Imaging (MPI) in Nuclear Cardiology can account for a large part of the daily workflow and generate gigabytes of data, computerized analysis of the data may offer advantages over human analysis: shorter time, homogeneity and consistency, automatic recording of analysis results, relatively low cost, etc. Objectives: The aim of this study is to evaluate the efficacy of this methodology in the evaluation of MPI stress studies and in the decision process concerning the continuation – or not – of the evaluation of each patient. The objective pursued was to automatically classify each patient test into one of three groups: “Positive”, “Negative” and “Indeterminate”. “Positive” tests would proceed directly to the rest part of the exam, “Negative” tests would be directly exempted from continuation, and only the “Indeterminate” group would require the clinician's analysis, thus economizing clinician effort, increasing workflow fluidity at the technologist level and probably sparing patients' time. Methods: The WEKA v3.6.2 open-source software was used to perform a comparative analysis of three WEKA algorithms (“OneR”, “J48” and “Naïve Bayes”) in a retrospective study, using the corresponding clinical results, signed off by expert nuclear cardiologists, as the reference, on the “SPECT Heart Dataset” available at the University of California, Irvine Machine Learning Repository. For evaluation purposes, criteria such as “Precision”, “Incorrectly Classified Instances” and “Receiver Operating Characteristic (ROC) Area” were considered. Results: The interpretation of the data suggests that the Naïve Bayes algorithm has the best performance among the three selected algorithms. Conclusions: It is believed – and apparently supported by the findings – that machine learning algorithms could significantly assist, at an intermediate level, in the analysis of scintigraphic data obtained in MPI, namely after stress acquisition, eventually increasing the efficiency of the entire system and potentially easing the roles of both technologists and nuclear cardiologists. In the continuation of this study, it is planned to use more patient information and to significantly increase the population under study, in order to improve system accuracy.
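The study itself used WEKA (Java); a rough Python analogue of the Naïve Bayes experiment is sketched below with scikit-learn's BernoulliNB, since the UCI SPECT Heart data consist of one binary class and 22 binary features. The file names are assumptions about local copies of the UCI files, and the "Indeterminate" band is only hinted at in a comment.

```python
# Hedged Python analogue of the WEKA Naive Bayes experiment on the UCI
# "SPECT Heart" data (binary class in column 0, 22 binary features after it).
import numpy as np
from sklearn.naive_bayes import BernoulliNB
from sklearn.metrics import precision_score, roc_auc_score

def load_spect(path):
    # UCI format: comma-separated 0/1 values, class label first.
    data = np.loadtxt(path, delimiter=",", dtype=int)
    return data[:, 1:], data[:, 0]

X_train, y_train = load_spect("SPECT.train")  # hypothetical local file names
X_test, y_test = load_spect("SPECT.test")

model = BernoulliNB().fit(X_train, y_train)
pred = model.predict(X_test)
prob = model.predict_proba(X_test)[:, 1]

print("precision:", precision_score(y_test, pred))
print("error rate:", (pred != y_test).mean())  # 'incorrectly classified instances'
print("ROC area:", roc_auc_score(y_test, prob))
# An "Indeterminate" group could be carved from mid-range probabilities,
# e.g. 0.3 < prob < 0.7, and routed to the clinician for review.
```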

Relevance: 20.00%

Abstract:

Master's degree in Electrical and Computer Engineering, specialization area of Automation and Systems.

Relevance: 20.00%

Abstract:

Master's dissertation for the degree of Master in Mechanical Engineering, Maintenance and Production branch.

Relevance: 20.00%

Abstract:

55th European Regional Science Association Congress, Lisbon, Portugal (25-28 August 2015).

Relevance: 20.00%

Abstract:

When defining a product-flow layout, or production line, it is necessary to select the best combinations of tasks to be performed at each workstation, so that the work is executed in a feasible sequence and approximately equal amounts of time are required at each workstation. This process is called production line balancing. Workstations and equipment can be combined in many different ways; hence, balancing production lines requires distributing sequential activities among workstations so as to achieve high utilization of labour and equipment and to minimize idle time. Line balancing problems are typically complex to handle, owing to the large number of possible combinations. Methods used to solve them include trial and error, heuristic methods, computational evaluation of different options until a good solution is found, and optimization methods. The goal of this work was to develop a computational tool for balancing production lines using genetic algorithms. An application was developed that implements two genetic algorithms, a first one that obtains solutions to the problem and a second one that optimizes those solutions, together with a C# graphical interface for entering the problem and visualizing the results. Feasible results were obtained, showing advantages over heuristic methods, since more than one solution can be obtained. Moreover, for complex problems the developed application is more practical to use. However, the application supports at most six precedences per operation and results with at most nine workstations.
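A minimal sketch of the core idea is given below: one genetic algorithm evolves precedence-feasible task sequences, scored by the number of stations a greedy assignment needs under a fixed cycle time. The task data are made up, and this single-GA simplification does not reproduce the two-GA C# application described above.

```python
# Minimal sketch of a genetic algorithm for line balancing (made-up task data;
# a simplified illustration, not the two-GA application described above).
import random
random.seed(1)

# task -> (duration, set of predecessor tasks)
TASKS = {1: (4, set()), 2: (3, {1}), 3: (5, {1}), 4: (2, {2}),
         5: (4, {2, 3}), 6: (3, {4, 5})}
CYCLE_TIME = 8  # maximum work content per station

def random_sequence():
    """Topological sequence built by repeatedly picking an available task."""
    done, seq = set(), []
    while len(seq) < len(TASKS):
        ready = [t for t in TASKS if t not in done and TASKS[t][1] <= done]
        t = random.choice(ready)
        seq.append(t); done.add(t)
    return seq

def n_stations(seq):
    """Greedy assignment: open a new station when the cycle time is exceeded."""
    stations, load = 1, 0
    for t in seq:
        d = TASKS[t][0]
        if load + d > CYCLE_TIME:
            stations, load = stations + 1, 0
        load += d
    return stations

def mutate(seq):
    """Swap two tasks' priorities and rebuild a valid topological order."""
    order = {t: i for i, t in enumerate(seq)}
    a, b = random.sample(seq, 2)
    order[a], order[b] = order[b], order[a]
    done, repaired = set(), []
    while len(repaired) < len(seq):
        ready = [t for t in TASKS if t not in done and TASKS[t][1] <= done]
        t = min(ready, key=lambda u: order[u])  # honour the mutated priorities
        repaired.append(t); done.add(t)
    return repaired

population = [random_sequence() for _ in range(20)]
for _ in range(50):
    population.sort(key=n_stations)
    parents = population[:10]                   # truncation selection
    population = parents + [mutate(random.choice(parents)) for _ in range(10)]

best = min(population, key=n_stations)
print("best sequence:", best, "-> stations:", n_stations(best))
```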

Relevance: 20.00%

Abstract:

Conference: 39th Annual Conference of the IEEE Industrial Electronics Society (IECON), 10-14 November 2013.

Relevance: 20.00%

Abstract:

A Box–Behnken factorial design coupled with response surface methodology was used to evaluate the effects of temperature, pH and initial concentration on the Cu(II) sorption process onto the marine macroalga Ascophyllum nodosum. The effect of the operating variables on metal uptake capacity was studied in a batch system and a mathematical model showing the influence of each variable and their interactions was obtained. Study ranges were 10–40 °C for temperature, 3.0–5.0 for pH and 50–150 mg L−1 for initial Cu(II) concentration. Within these ranges, the biosorption capacity is slightly dependent on temperature but markedly increases with pH and initial concentration of Cu(II). The uptake capacities predicted by the model are in good agreement with the experimental values. The maximum biosorption capacity of Cu(II) by A. nodosum is 70 mg g−1 and corresponds to the following values of the variables: temperature = 40 °C, pH = 5.0 and initial Cu(II) concentration = 150 mg L−1.
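For illustration, the modelling step behind a Box–Behnken design is a least-squares fit of a full quadratic response surface. The sketch below uses the 13 distinct coded points of a three-factor Box–Behnken design with synthetic responses; it is not the A. nodosum data.

```python
# Sketch of the response-surface step: fit the full quadratic model
# q = b0 + sum(bi xi) + sum(bii xi^2) + sum(bij xi xj) over a Box-Behnken
# design in coded units. Responses below are synthetic, for illustration.
import numpy as np

def quadratic_design_matrix(X):
    """Columns: 1, x1..x3, x1^2..x3^2, x1x2, x1x3, x2x3."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2,
                            x1*x2, x1*x3, x2*x3])

# The 13 distinct coded points of a three-factor Box-Behnken design:
# 12 edge midpoints plus the centre point.
BBD = np.array([[a, b, 0] for a in (-1, 1) for b in (-1, 1)] +
               [[a, 0, b] for a in (-1, 1) for b in (-1, 1)] +
               [[0, a, b] for a in (-1, 1) for b in (-1, 1)] +
               [[0, 0, 0]], dtype=float)

# Synthetic uptake response (mg/g) in temperature, pH, concentration order.
rng = np.random.default_rng(0)
true = lambda x: 40 + 2*x[:, 0] + 10*x[:, 1] + 8*x[:, 2] - 3*x[:, 1]**2
y = true(BBD) + rng.normal(0, 0.5, len(BBD))

beta, *_ = np.linalg.lstsq(quadratic_design_matrix(BBD), y, rcond=None)
print("fitted coefficients:", np.round(beta, 2))

# Predict at the coded corner (+1, +1, +1), i.e. 40 C, pH 5.0, 150 mg/L in the
# study's ranges. Note this corner lies outside the BBD points (extrapolation).
x_opt = quadratic_design_matrix(np.array([[1.0, 1.0, 1.0]]))
print("predicted uptake at (+1, +1, +1):", round(float(x_opt @ beta), 2))
```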

Relevance: 20.00%

Abstract:

A flow-spectrophotometric method is proposed for the routine determination of tartaric acid in wines. The reaction between tartaric acid and vanadate in acetic medium is carried out under flow conditions and the resulting colored complex is monitored at 475 nm. The stability of the complex and the corresponding formation constant are presented. The effects of wavelength and pH were evaluated by batch experiments. The selected conditions were transferred to a flow-injection analytical system, and several flow parameters, such as reactor lengths, flow rate and injection volume, were optimized. Under optimized conditions, linear behavior was observed up to 1000 µg mL−1 of tartaric acid, with an extinction coefficient of 450 L mg−1 cm−1 and ±1% repeatability. Sample throughput was 25 samples per hour. The flow-spectrophotometric method was satisfactorily applied to the quantification of tartaric acid (TA) in wines from different sources. Its accuracy was confirmed by statistical comparison with the conventional Rebelein procedure and with a certified analytical method carried out in a routine laboratory.
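The quantification step behind such a method is a linear calibration based on the Beer–Lambert law, A = ε·l·c. A minimal sketch with made-up standards within the stated linear range:

```python
# Sketch of the calibration behind the flow-spectrophotometric method:
# absorbance at 475 nm assumed linear in tartaric acid concentration
# (Beer-Lambert, A = epsilon * l * c). Standard values below are made up.
import numpy as np

conc = np.array([0, 200, 400, 600, 800, 1000], dtype=float)  # ug/mL standards
absorbance = np.array([0.002, 0.091, 0.179, 0.272, 0.358, 0.451])

slope, intercept = np.polyfit(conc, absorbance, 1)   # least-squares line
r = np.corrcoef(conc, absorbance)[0, 1]
print(f"A = {slope:.3e} * c + {intercept:.3e}  (r = {r:.4f})")

# Quantify an unknown wine sample from its measured absorbance:
a_sample = 0.215
print("tartaric acid:", round((a_sample - intercept) / slope, 1), "ug/mL")
```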

Relevance: 20.00%

Abstract:

Solvent extraction is considered a multi-criteria optimization problem, since several chemical species with similar extraction kinetics are frequently present in the aqueous phase and selective extraction is not practicable. This optimization, applied to mixer–settler units, considers the best parameters and operating conditions as well as the best structure or process flow-sheet. Global process optimization is performed for a specific flow-sheet and Pareto curves for different flow-sheets are compared. The positive weight sum approach, coupled with the sequential quadratic programming method, is used to obtain the Pareto set. In all investigated structures, recovery increases with hold-up, residence time and agitation speed, while purity shows the opposite behaviour. For the same treatment capacity, counter-current arrangements are shown to promote recovery without significant impairment of purity. Recycling the aqueous phase is shown to be irrelevant, but organic recycling with as many stages as economically feasible clearly improves the design criteria and lowers the most efficient organic flow-rate.
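A minimal sketch of the positive weight sum approach follows: sweep a strictly positive weight over the two criteria, solve each scalarized problem with an SQP-type solver, and collect the Pareto set. The recovery and purity functions below are a toy trade-off, not the mixer–settler model from the study.

```python
# Positive-weight-sum scalarization: for each w in (0, 1), minimise
# -(w * recovery + (1 - w) * purity) with an SQP-type solver (SLSQP) and
# collect the resulting Pareto points. Toy objective, for illustration only.
import numpy as np
from scipy.optimize import minimize

def recovery(x):  # increases with residence time x[0] and agitation x[1]
    return 1.0 - np.exp(-1.5 * x[0] - 0.8 * x[1])

def purity(x):    # decreases as the same variables grow: the trade-off
    return np.exp(-0.6 * x[0] - 0.4 * x[1])

pareto = []
for w in np.linspace(0.05, 0.95, 10):          # strictly positive weights
    res = minimize(lambda x: -(w * recovery(x) + (1 - w) * purity(x)),
                   x0=[0.5, 0.5], method="SLSQP",
                   bounds=[(0.0, 3.0), (0.0, 3.0)])
    pareto.append((recovery(res.x), purity(res.x)))

for rec, pur in pareto:
    print(f"recovery = {rec:.3f}, purity = {pur:.3f}")
```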

Relevance: 20.00%

Abstract:

Objective: A new protocol for fixation and slide preservation was evaluated in order to improve the quality of immunocytochemical reactions on cytology slides. Methods: The quality of immunoreactions was evaluated retrospectively on 186 cytology slides (130 direct smears, 56 cytospins) prepared from different cytology samples. Ninety-three of the slides were air-dried, stored at −20 °C and fixed in acetone for 10 minutes (Protocol 1), whereas the other 93 were immediately fixed in methanol at −20 °C for at least 30 minutes, subsequently protected with polyethylene glycol (PEG) and stored at room temperature (Protocol 2). Immunocytochemical staining with eight primary antibodies was performed on a Ventana BenchMark Ultra instrument using the UltraView Universal DAB Detection Kit. The following parameters were evaluated for each immunoreaction: morphology preservation, intensity of specific staining, background and counterstain. The slides were blinded and independently scored by four observers with marks from 0 to 20. Results: The quality of immunoreactions was better on the methanol-fixed slides protected with PEG than on the air-dried slides stored in the freezer: mean score 14.44 ± 3.58 versus 11.02 ± 3.86, respectively (P < 0.001). Conclusion: Immediate fixation of cytology slides in cold methanol with subsequent application of PEG is an easy and straightforward procedure that improves the quality of immunocytochemical reactions and allows storage of the slides at room temperature.
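For illustration, a score comparison of this shape (mean ± SD per protocol, P < 0.001) can be checked with an independent two-sample t-test; the abstract does not name the exact test used, and the scores below are simulated to match the reported summary statistics, not the study's data.

```python
# Independent two-sample t-test on simulated quality scores matching the
# reported means and SDs. A stand-in, since the abstract does not name the test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
protocol1 = rng.normal(11.02, 3.86, 93).clip(0, 20)  # air-dried, -20 C storage
protocol2 = rng.normal(14.44, 3.58, 93).clip(0, 20)  # methanol + PEG

t, p = stats.ttest_ind(protocol2, protocol1)
print(f"protocol 1: {protocol1.mean():.2f} +/- {protocol1.std(ddof=1):.2f}")
print(f"protocol 2: {protocol2.mean():.2f} +/- {protocol2.std(ddof=1):.2f}")
print(f"t = {t:.2f}, P = {p:.2e}")
```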