Abstract:
Background: With the decrease of DNA sequencing costs, sequence-based typing methods are rapidly becoming the gold standard for epidemiological surveillance. These methods provide reproducible and comparable results needed for global-scale bacterial population analysis, while retaining their usefulness for local epidemiological surveys. Online databases that collect the generated allelic profiles and associated epidemiological data are available, but this wealth of data remains underused and frequently poorly annotated, since no user-friendly tool exists to analyze and explore it. Results: PHYLOViZ is platform-independent Java software that allows the integrated analysis of sequence-based typing methods, including SNP data generated from whole-genome sequencing approaches, and associated epidemiological data. goeBURST and its Minimum Spanning Tree expansion are used for visualizing the possible evolutionary relationships between isolates. The results can be displayed as an annotated graph, overlaid with the results of queries on any other available epidemiological data. Conclusions: PHYLOViZ is user-friendly software that allows the combined analysis of multiple data sources for microbial epidemiological and population studies. It is freely available at http://www.phyloviz.net.
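The minimum-spanning-tree idea behind goeBURST can be sketched in a few lines: link allelic profiles by the number of loci at which they differ (Hamming distance) and keep the cheapest spanning tree. This is an illustrative Python sketch, not PHYLOViZ code (which is written in Java); the 7-locus MLST profiles are invented, and goeBURST's tie-break rules are omitted.

```python
def hamming(p, q):
    """Number of loci at which two allelic profiles differ."""
    return sum(a != b for a, b in zip(p, q))

def mst_edges(profiles):
    """Prim's algorithm over pairwise Hamming distances.
    Returns a list of (i, j, distance) edges."""
    n = len(profiles)
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        best = min(((i, j, hamming(profiles[i], profiles[j]))
                    for i in in_tree for j in range(n) if j not in in_tree),
                   key=lambda e: e[2])
        in_tree.add(best[1])
        edges.append(best)
    return edges

# Hypothetical 7-locus MLST profiles for four isolates
profiles = [(1, 3, 1, 1, 1, 1, 3),
            (1, 3, 1, 1, 1, 1, 1),   # single-locus variant of the first
            (1, 3, 4, 1, 1, 1, 1),
            (2, 2, 4, 5, 1, 6, 1)]
print(mst_edges(profiles))  # three edges; the first two are single-locus links
```

Single- and double-locus variants end up directly linked, which is the kind of relationship a goeBURST graph highlights.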
Abstract:
Introduction – In an era in which external radiotherapy (RT) treatments demand ever greater precision, medical imaging makes it possible to measure, quantify and evaluate the impact of errors caused by treatment execution or by organ motion. Objective – To analyse the data available in the literature on setup deviations (SD) in head and neck (H&N) and prostate pathologies, measured with Cone Beam Computed Tomography (CBCT) or an Electronic Portal Imaging Device (EPID). Methodology – For this literature review, articles were searched in the MEDLINE/PubMed and b-on databases. Articles reporting SD in H&N and prostate pathologies measured with CBCT and EPID were included. Validation criteria were then applied to select the studies. Results – After analysis of 35 articles, 13 studies were included and 9 validated. For H&N tumors, the mean (μ) SD lies between 0.0 and 1.2 mm, with a maximum standard deviation (σ) of 1.3 mm. For prostate pathologies, μSD lies between 0.0 and 7.1 mm, with a maximum σ of 7.5 mm. Discussion/Conclusion – SD in H&N pathologies are mostly attributed to the side effects of RT, such as mucositis and pain, which affect swallowing and lead to weight loss, contributing to the instability of the patient's position during treatment and increasing positioning uncertainties. Prostate motion is mainly due to variations in bladder and rectal filling and intestinal gas. Ignorance of SD negatively affects the accuracy of RT. It is important to detect and quantify them in order to calculate adequate margins and the magnitude of the errors, increasing the precision of RT delivery and improving patient safety.
- ABSTRACT - Background and Purpose – In an era where precision is an increasing necessity in external radiotherapy (RT), modern medical imaging techniques provide means for measuring, quantifying and evaluating the impact of treatment execution and movement error. The aim of this paper is to review the current literature on the quantification of setup deviations (SD) in patients with head and neck (H&N) or prostate tumors, using Cone Beam Computed Tomography (CBCT) or an Electronic Portal Imaging Device (EPID). Methods – According to the study protocol, the MEDLINE/PubMed and b-on databases were searched for trials, which were analyzed using selection criteria based on the quality of the articles. Results – After assessment of 35 papers, 13 studies were included in this analysis and nine were validated (6 for prostate and 3 for H&N tumors). The SD in the treatment of H&N cancer patients lies in the interval of 0.0 to 1.2 mm, whereas in prostate cancer this interval is 0.0 to 7.1 mm. Discussion – The reproducibility of patient positioning is the biggest barrier to higher precision in RT; it is affected by geometrical uncertainty, positioning errors and inter- or intra-fraction organ movement. There are random and systematic errors associated with patient positioning, introduced from the treatment planning phase onwards or through physiological organ movement. Conclusion – The H&N SD are mostly attributed to radiotherapy adverse effects, like mucositis and pain, which affect swallowing and lead to weight loss, contributing to the instability of patient positioning during RT treatment and increasing positioning uncertainties. Prostate motion is mainly related to the variation in bladder and rectal filling. Ignoring SD negatively affects the accuracy of RT. Therefore, detection and quantification of SD are crucial in order to calculate appropriate margins and the magnitude of error, and to improve accuracy in RT and patient safety.
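The margin calculation mentioned in the conclusion is commonly done with the van Herk recipe, M = 2.5Σ + 0.7σ, where Σ is the systematic component (SD of per-patient mean setup deviations) and σ the random component (RMS of per-patient SDs). The sketch below assumes that recipe, with invented per-patient data in one axis; the reviewed studies may apply other margin formulas.

```python
import statistics as st

def ctv_ptv_margin(patient_means_mm, patient_sds_mm):
    """Van Herk-style CTV-to-PTV margin: M = 2.5*Sigma + 0.7*sigma.
    Sigma: SD of the per-patient mean deviations (systematic error).
    sigma: RMS of the per-patient SDs (random error)."""
    Sigma = st.stdev(patient_means_mm)
    sigma = (sum(s ** 2 for s in patient_sds_mm) / len(patient_sds_mm)) ** 0.5
    return 2.5 * Sigma + 0.7 * sigma

# Hypothetical per-patient setup deviations (mm) in one axis
means = [0.4, -1.1, 0.9, 0.2, -0.6]
sds = [1.0, 1.5, 1.2, 0.8, 1.1]
print(round(ctv_ptv_margin(means, sds), 2))  # 2.81
```

This makes the point in the discussion concrete: systematic errors are weighted far more heavily than random ones, so unknown SD translates directly into inadequate margins.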
Abstract:
Personal memories composed of digital pictures are very popular at the moment. To retrieve these media items, annotation is required. In recent years, several approaches have been proposed to overcome the image annotation problem. This paper presents our proposals to address this problem. Automatic and semi-automatic learning methods for semantic concepts are presented. The automatic method estimates semantic concepts from visual content, context metadata and audio information. The semi-automatic method is based on results provided by a computer game. The paper describes our proposals and presents their evaluation.
Abstract:
The present research paper presents five different clustering methods to identify typical load profiles of medium voltage (MV) electricity consumers. These methods are intended to be used in a smart grid environment to extract useful knowledge about customers' behaviour. The obtained knowledge can be used to support a decision tool, not only for utilities but also for consumers. Load profiles can be used by utilities to identify the aspects that cause system load peaks and to enable the development of specific contracts with their customers. The framework presented throughout the paper consists of several steps, namely the data pre-processing phase, the application of clustering algorithms, and the evaluation of the quality of the partition, which is supported by cluster validity indices. The process ends with the analysis of the discovered knowledge. To validate the proposed framework, a case study with a real database of 208 MV consumers is used.
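The clustering step of such a framework can be sketched with a plain k-means on daily load curves. This is a toy illustration, not the paper's five methods: the four-period profiles are invented, and the validity-index stage is reduced to inspecting the resulting partition.

```python
import random

def kmeans(profiles, k, iters=50, seed=0):
    """Plain k-means over equal-length load profiles (tuples of readings)."""
    rng = random.Random(seed)
    centroids = rng.sample(profiles, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in profiles:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        # Recompute each centroid as the per-period mean of its members
        centroids = [tuple(sum(col) / len(cl) for col in zip(*cl)) if cl
                     else centroids[i] for i, cl in enumerate(clusters)]
    return centroids, clusters

# Hypothetical 4-period daily profiles: two evening-peak and two flat consumers
profiles = [(1, 2, 8, 9), (1, 2, 9, 8), (5, 5, 5, 5), (6, 5, 5, 6)]
centroids, clusters = kmeans(profiles, k=2)
print(sorted(len(c) for c in clusters))  # [2, 2]
```

The two evening-peak consumers end up in one cluster and the flat consumers in the other, which is exactly the kind of typical-profile separation the paper exploits for contract design.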
Abstract:
A crucial method for investigating patients with coronary artery disease (CAD) is the calculation of the left ventricular ejection fraction (LVEF). It is, consequently, imperative to precisely estimate the value of LVEF, which can be done with myocardial perfusion scintigraphy. Therefore, the present study aimed to establish and compare the estimation performance of the quantitative parameters of two reconstruction methods: filtered backprojection (FBP) and ordered-subset expectation maximization (OSEM). Methods: A beating-heart phantom with known values of end-diastolic volume, end-systolic volume, and LVEF was used. Quantitative gated SPECT/quantitative perfusion SPECT software was used to obtain these quantitative parameters in semiautomatic mode. The Butterworth filter was used in FBP, with cutoff frequencies between 0.2 and 0.8 cycles per pixel combined with orders of 5, 10, 15, and 20. Sixty-three reconstructions were performed using 2, 4, 6, 8, 10, 12, and 16 OSEM subsets, combined with several numbers of iterations: 2, 4, 6, 8, 10, 12, 16, 32, and 64. Results: With FBP, the end-diastolic, end-systolic, and stroke volumes rise as the cutoff frequency increases, whereas the LVEF diminishes. The same pattern is observed with OSEM reconstruction. However, OSEM provides a more precise estimation of the quantitative parameters, especially with the combinations of 2 iterations × 10 subsets and 2 iterations × 12 subsets. Conclusion: The OSEM reconstruction presents better estimations of the quantitative parameters than does FBP. This study recommends the use of 2 iterations with 10 or 12 subsets for OSEM, and a cutoff frequency of 0.5 cycles per pixel with orders 5, 10, or 15 for FBP, as the best settings for quantifying left ventricular volumes and ejection fraction in myocardial perfusion scintigraphy.
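The two quantities the study sweeps can be written out directly: the ejection fraction from the phantom volumes, and the low-pass Butterworth window parameterized by cutoff frequency and order. This is a generic sketch; vendor software may define the Butterworth window slightly differently, and the volumes below are invented.

```python
def lvef(edv_ml, esv_ml):
    """Left ventricular ejection fraction in percent: (EDV - ESV) / EDV."""
    return 100.0 * (edv_ml - esv_ml) / edv_ml

def butterworth(f, cutoff, order):
    """Low-pass Butterworth window as commonly defined in SPECT
    reconstruction software (exact definitions vary between vendors)."""
    return 1.0 / (1.0 + (f / cutoff) ** (2 * order))

print(round(lvef(120.0, 50.0), 1))        # 58.3 (% for a hypothetical phantom)
print(butterworth(0.5, 0.5, 5))           # 0.5 at the cutoff frequency
print(butterworth(0.8, 0.5, 5) < butterworth(0.2, 0.5, 5))  # True
```

Raising the cutoff lets more high frequencies through (sharper but noisier edges), which is why the measured volumes rise with the cutoff frequency while LVEF falls.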
Abstract:
Intensive use of Distributed Generation (DG) represents a change in the paradigm of power systems operation, making small-scale energy generation and storage decision making relevant for the whole system. This paradigm led to the concept of the smart grid, for which efficient management, in both technical and economic terms, must be assured. This paper presents a new approach to solve the economic dispatch problem in smart grids. The proposed methodology for resource management involves two stages. The first uses fuzzy set theory to define the forecast range of the natural resources as well as the load forecast. The second stage uses heuristic optimization to determine the economic dispatch, considering the generation forecast, storage management and demand response.
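The two stages can be sketched with a triangular fuzzy forecast (its alpha-cut gives an interval for the renewable resource) feeding a simple merit-order dispatch. This is a minimal stand-in, not the paper's methodology: the unit names, capacities and costs are invented, and a greedy merit order replaces the heuristic optimizer.

```python
def alpha_cut(low, mode, high, alpha):
    """Interval of a triangular fuzzy forecast at a given alpha level."""
    return (low + alpha * (mode - low), high - alpha * (high - mode))

def merit_order_dispatch(units, load_mw):
    """Greedy stand-in for the heuristic stage: cheapest units first.
    units: list of (name, capacity MW, marginal cost)."""
    dispatch, remaining = {}, load_mw
    for name, cap, cost in sorted(units, key=lambda u: u[2]):
        take = min(cap, remaining)
        if take > 0:
            dispatch[name] = take
            remaining -= take
    return dispatch, remaining

wind_lo, wind_hi = alpha_cut(10.0, 25.0, 35.0, 0.5)  # hypothetical MW forecast
units = [("wind", wind_lo, 0.0),     # dispatch the pessimistic bound
         ("chp", 20.0, 40.0),
         ("diesel", 30.0, 90.0)]
print(merit_order_dispatch(units, 45.0))
```

Using the pessimistic bound of the alpha-cut is one conservative way to couple the fuzzy forecast to the dispatch; the paper's heuristic stage would explore the interval more thoroughly.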
Abstract:
In the context of electricity markets, transmission pricing is an important tool for achieving efficient operation of the electricity system. The electricity market is influenced by several factors; however, transmission network management is one of the most important, because the network is a natural monopoly. Transmission tariffs can help to regulate the market, and for this reason they must follow strict criteria. This paper presents the following methods for pricing the use of transmission networks by electricity market players: the Postage Stamp Method; the MW-Mile Method; Distribution Factors Methods; the Tracing Methodology; Bialek's Tracing Method; and Locational Marginal Prices. A nine-bus transmission network is used to illustrate the application of the tariff methods.
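The two simplest methods in the list can be made concrete in a few lines: postage stamp charges each player in proportion to its demand, while MW-mile allocates each line's cost in proportion to the flow a player causes on it times the line's length. The loads, lines and unit costs below are invented for illustration, and the flow attribution that a real MW-mile study needs (from power-flow results) is taken as given.

```python
def postage_stamp(total_cost, loads):
    """Each player pays in proportion to its share of total demand."""
    total = sum(loads.values())
    return {player: total_cost * mw / total for player, mw in loads.items()}

def mw_mile(cost_per_mw_mile, usage):
    """usage[player][line] = (MW flow caused on the line, line length in miles)."""
    return {player: sum(cost_per_mw_mile[line] * mw * miles
                        for line, (mw, miles) in lines.items())
            for player, lines in usage.items()}

loads = {"A": 60.0, "B": 40.0}                       # hypothetical MW demands
print(postage_stamp(1000.0, loads))                  # {'A': 600.0, 'B': 400.0}
usage = {"A": {"L1": (30.0, 10.0)},
         "B": {"L1": (10.0, 10.0), "L2": (20.0, 5.0)}}
print(mw_mile({"L1": 1.0, "L2": 2.0}, usage))        # {'A': 300.0, 'B': 300.0}
```

The contrast is the point: postage stamp ignores where the power actually flows, while MW-mile rewards players whose transactions use less of the network.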
Abstract:
This paper proposes two meta-heuristics (Genetic Algorithm and Evolutionary Particle Swarm Optimization) for solving a bid-based case, with 15 bids, of ancillary services dispatch in an electricity market. A Linear Programming approach is also included for comparison purposes. A test case based on the dispatch of Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve services is used to demonstrate that meta-heuristics are suitable for solving this kind of optimization problem. Faster execution times and lower computational resource requirements are the most relevant advantages of the meta-heuristics when compared with the Linear Programming approach.
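A genetic algorithm for this kind of bid-based dispatch can be sketched very compactly: a chromosome holds one MW quantity per bid, and fitness is the bid cost plus a penalty for missing the service demand. This toy GA (three invented bids, mutation-plus-elitism only) is far simpler than the paper's GA and EPSO, but it shows the encoding and penalty idea.

```python
import random

def ga_dispatch(bids, demand_mw, pop=40, gens=60, seed=1):
    """Tiny GA: bids are (capacity MW, price); a chromosome is a list of
    accepted MW per bid; fitness penalizes supply/demand imbalance."""
    rng = random.Random(seed)
    caps = [cap for cap, _ in bids]

    def cost(x):
        return (sum(q * price for q, (_, price) in zip(x, bids))
                + 1e4 * abs(sum(x) - demand_mw))   # heavy imbalance penalty

    def mutate(x):
        i = rng.randrange(len(x))
        y = list(x)
        y[i] = rng.uniform(0, caps[i])             # resample one gene in bounds
        return y

    population = [[rng.uniform(0, c) for c in caps] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=cost)
        parents = population[:pop // 2]            # elitist selection
        children = [mutate(rng.choice(parents)) for _ in range(pop - len(parents))]
        population = parents + children
    return min(population, key=cost)

bids = [(20.0, 10.0), (15.0, 12.0), (30.0, 25.0)]  # hypothetical (MW, price) bids
best = ga_dispatch(bids, 40.0)
print(round(sum(best), 1))  # close to the 40 MW demand
```

An LP would solve this instance exactly; the meta-heuristic's appeal, as the abstract notes, is cheap scaling once the problem gains nonlinearities or couplings that break the LP formulation.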
Abstract:
Electricity market players operating in a liberalized environment require access to an adequate decision support tool that allows them to consider all business opportunities and take strategic decisions. Ancillary services represent a good negotiation opportunity that must be considered by market players. For this reason, the decision support tool must include ancillary market simulation. This paper proposes two different methods (Linear Programming and Genetic Algorithm approaches) for ancillary services dispatch. The methodologies are implemented in MASCEM, a multi-agent based electricity market simulator. A test case based on California Independent System Operator (CAISO) data concerning the dispatch of Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve services is included in this paper.
Abstract:
Microbial adhesion is a field of recognized relevance and, as such, an impressive array of tools has been developed to understand its molecular mechanisms and, ultimately, to quantify it. Some of the major limitations of these methodologies concern the incubation time, the small number of cells analyzed, and the operator's subjectivity. To overcome these limitations, we have developed a quantitative method to measure yeast cell adhesion through flow cytometry. In this methodology, a suspension of yeast cells is mixed with green fluorescent polystyrene microspheres (uncoated or coated with host proteins). Within 2 h, an adhesion profile is obtained based on two parameters: the percentage of adherent cells and the distribution pattern of cell-microsphere populations. This flow cytometry protocol represents a useful tool for quantifying yeast adhesion to different substrata on a large scale, providing manifold data in a speedy and informative manner.
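The two read-out parameters can be illustrated with a toy reduction of the cytometry data: given the number of microspheres attached to each cell, compute the percentage of adherent cells and the population distribution. The event list is invented, and real gating of cytometer events is of course far more involved.

```python
from collections import Counter

def adhesion_profile(events):
    """events: microspheres attached per yeast cell (0 = non-adherent).
    Returns (% of cells with at least one microsphere, distribution)."""
    counts = Counter(events)
    adherent = sum(n for k, n in counts.items() if k > 0)
    pct = 100.0 * adherent / len(events)
    return pct, dict(sorted(counts.items()))

# Hypothetical read-out for 10 cells
events = [0, 0, 1, 2, 0, 1, 1, 3, 0, 0]
print(adhesion_profile(events))  # (50.0, {0: 5, 1: 3, 2: 1, 3: 1})
```

The percentage alone can hide differences between strains; the distribution pattern (many cells with one microsphere versus a few cells with many) is what makes the profile informative.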
Abstract:
Objectives: The purpose of this article is to identify differences between surveys using paper and online questionnaires. The author has extensive experience with opinion questions in the development of survey-based research, e.g. the limitations of postal and online questionnaires. Methods: Paper and online questionnaires were used in the physician studies carried out in 1995 (doctors who graduated in 1982-1991), 2000 (graduated 1982-1996), 2005 (graduated 1982-2001) and 2011 (graduated 1977-2006), and in a study of 457 family doctors in 2000. The response rates were 64%, 68%, 64%, 49% and 73%, respectively. Results: The physician studies showed that there were differences between the methods, connected with the use of paper-based versus online questionnaires and with the response rate. The online survey gave a lower response rate than the postal survey. The major advantages of the online survey were the short response time; very low financial resource needs; and the fact that data were loaded directly into the data analysis software, saving the time and resources associated with the data entry process. Conclusions: The current article helps researchers to plan the study design and to choose the right data collection method.
Abstract:
Tomographic images can be degraded, in part, by patient-based attenuation. The aim of this paper is to quantitatively verify the effects of the Chang and CT attenuation correction methods in 111In studies, through the analysis of profiles from abdominal SPECT corresponding to an organ with uniform radionuclide uptake, the left kidney.
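The first-order Chang method mentioned above multiplies each reconstructed point by the reciprocal of its attenuation factor averaged over projection angles. A minimal sketch for a single point, assuming a uniform attenuation coefficient: the path lengths and the value of μ (≈0.12 cm⁻¹ is plausible for 111In photons in soft tissue, but is an assumption here) are invented.

```python
import math

def chang_factor(mu_cm, path_lengths_cm):
    """First-order Chang correction at one point: the reciprocal of the
    attenuation factor exp(-mu*l) averaged over projection angles."""
    mean_att = sum(math.exp(-mu_cm * l) for l in path_lengths_cm) / len(path_lengths_cm)
    return 1.0 / mean_att

# Hypothetical tissue path lengths (cm) from 8 projection angles to a
# point at the depth of the left kidney
paths = [8.0, 9.5, 11.0, 12.5, 13.0, 12.5, 11.0, 9.5]
print(round(chang_factor(0.12, paths), 2))
```

Because the correction depends only on an assumed uniform μ and on the body outline, it can over- or under-correct in the non-uniform abdomen, which is exactly what comparing it against CT-based correction on uptake profiles reveals.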
Abstract:
In the pharmaceutical industry, the cleaning of equipment and surfaces is very important in the manufacturing/packaging process of pharmaceutical products. Possible contaminating residues must be removed from the equipment and surfaces involved in the process. According to Good Manufacturing Practices (GMP), the cleaning procedures and the analytical methods used to determine the amounts of residues must be validated. The analytical method, combined with the sampling method used to collect samples, must be subjected to a recovery assay. This work presents an innovative strategy for the cleaning validation of semi-solid pharmaceutical forms. We propose a sampling method that consists of collecting the sample directly after manufacture, with the residue analysis performed directly on this sample. The products chosen to evaluate the strategy were two dermatological medicines, presented as ointments and produced in a multi-product manufacturing unit by Schering Plough Farma/Merck Sharp & Dohme (Cacém, Portugal). As analytical methods for residue quantification, validated spectrophotometric (HPLC) methods used in finished-product analysis were employed. Cleaning validation was assessed by analysing a known amount of ointment (product B (*)) with the analytical method of the previously manufactured ointment (product A (*)), in order to verify the presence or absence of cleaning agent and active substances left after the cleaning of product A, and vice versa. The residual concentrations of the active substances and of the cleaning agent found after cleaning were null, i.e. below the limit of detection (LOD); the cleaning acceptance criteria were 6.4 × 10-4 mg/g for active substance 1 (*), 1.0 × 10-2 mg/g for active substance 2 (*), 1.0 × 10-3 mg/g for active substance 3 (*), and 10 ppm for the cleaning agent.
In the recovery assay, results above 70% were obtained for all active substances and for the cleaning agent in both ointments. Before this recovery assay, it was necessary to adjust the chromatographic conditions of the analytical methods of both products and of the cleaning agent, so as to obtain system suitability values (tailing factor and resolution) in accordance with the specifications. The precision of the results, reported as relative standard deviation (RSD), was below 2.0%, except in the assays involving active substance 3, whose specification is below 10.0%. The results obtained demonstrate that the cleaning procedures used in this manufacturing unit are effective, thus eliminating the possibility of cross-contamination.
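The two acceptance metrics used above, recovery percentage and relative standard deviation, are simple ratios worth writing out. The spiked amount and replicate values below are invented for illustration; the 70% recovery and 2.0% RSD thresholds are the ones stated in the abstract.

```python
import statistics as st

def recovery_pct(found_mg, spiked_mg):
    """Recovery: fraction of a known spiked amount that the combined
    sampling + analytical method actually finds, in percent."""
    return 100.0 * found_mg / spiked_mg

def rsd_pct(values):
    """Relative standard deviation, the precision criterion of the assay."""
    return 100.0 * st.stdev(values) / st.mean(values)

spiked, found = 0.50, 0.41                    # hypothetical recovery assay, mg
print(round(recovery_pct(found, spiked), 1))  # 82.0 -> above the 70 % criterion
replicates = [98.2, 99.1, 97.8, 98.6, 99.0]   # hypothetical assay results, %
print(round(rsd_pct(replicates), 2))          # well below the 2.0 % criterion
```

Low recovery would mean residues could hide from the method, so the recovery factor is also what converts a measured residue back into an estimate of the true surface contamination.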
Abstract:
In real optimization problems, the analytical expression of the objective function, and of its derivatives, is usually unknown or too complex to use. In these cases it becomes essential to use optimization methods that do not require computing the derivatives, or verifying their existence: Direct Search Methods, or Derivative-free Methods, are one solution. When the problem has constraints, penalty functions are often used. Unfortunately, choosing the penalty parameters is frequently very difficult, because most strategies for choosing them are heuristic. Filter methods appeared as an alternative to penalty functions. A filter algorithm introduces a function that aggregates the constraint violations and constructs a biobjective problem, in which a step is accepted if it reduces either the objective function or the constraint violation. This makes filter methods less parameter-dependent than penalty functions. In this work, we present a new direct search method for general constrained optimization, based on simplex methods, that combines the features of the simplex method and of filter methods. This method does not compute or approximate any derivatives, penalty constants or Lagrange multipliers. The basic idea of the simplex filter algorithm is to construct an initial simplex and use it to drive the search. We illustrate the behavior of our algorithm through some examples. The proposed methods were implemented in Java.
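The filter acceptance rule described above can be sketched directly: keep a list of (f, h) pairs, where h aggregates the constraint violations, and accept a trial point only if no stored pair is at least as good in both coordinates. This is a generic Python illustration of the rule (the paper's implementation is in Java), and some filter variants use a strict-dominance or margin condition instead.

```python
def violation(constraints, x):
    """Aggregate violation h(x) = sum of positive parts of g_i(x) <= 0."""
    return sum(max(0.0, g(x)) for g in constraints)

class Filter:
    """A trial pair (f, h) is acceptable if no stored pair dominates it,
    i.e. no pair has both lower-or-equal f and lower-or-equal h."""
    def __init__(self):
        self.pairs = []

    def acceptable(self, f, h):
        return not any(fp <= f and hp <= h for fp, hp in self.pairs)

    def add(self, f, h):
        # Drop stored pairs that the new entry dominates, then store it
        self.pairs = [(fp, hp) for fp, hp in self.pairs
                      if not (f <= fp and h <= hp)]
        self.pairs.append((f, h))

flt = Filter()
flt.add(5.0, 2.0)
print(flt.acceptable(4.0, 3.0))  # True: lower objective, despite worse violation
print(flt.acceptable(6.0, 3.0))  # False: dominated by (5.0, 2.0)
```

The acceptance test replaces the penalty comparison f + rho*h, which is why no penalty constant rho ever needs to be tuned.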
Abstract:
Doctoral thesis, Geology (Hydrogeology), 17 December 2013, Universidade dos Açores.