28 results for Quantification methods

in Repositório Científico do Instituto Politécnico de Lisboa - Portugal


Relevance:

30.00%

Publisher:

Abstract:

Structures experience various types of loads along their lifetime, which can be either static or dynamic and may be associated with phenomena such as corrosion and chemical attack, among others. As a consequence, different types of structural damage can be produced; the deteriorated structure may have its capacity affected, leading to excessive vibration problems or even possible failure. It is therefore very important to develop methods that are able simultaneously to detect the existence of damage and to quantify its extent. In this paper the authors propose a method to detect and quantify structural damage, using response transmissibilities measured along the structure. Some numerical simulations are presented and a comparison is made with results obtained using frequency response functions. Experimental tests are also undertaken to validate the proposed technique. (C) 2011 Elsevier Ltd. All rights reserved.
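The abstract does not specify how the transmissibilities are estimated or turned into a damage index; purely as an illustration of the underlying idea, the Python sketch below estimates a transmissibility between two measured responses with an H1-style spectral estimator and compares it against a baseline curve. The function names, the estimator choice and the scalar indicator are assumptions, not the authors' method.

import numpy as np
from scipy.signal import csd, welch

def transmissibility(x_i, x_j, fs, nperseg=1024):
    # H1-style estimate of T_ij(f) = X_i(f) / X_j(f) from two measured
    # response signals: cross-spectrum of (reference, response) divided by
    # the auto-spectrum of the reference x_j. More noise-robust than a raw
    # FFT ratio.
    f, S_ji = csd(x_j, x_i, fs=fs, nperseg=nperseg)
    _, S_jj = welch(x_j, fs=fs, nperseg=nperseg)
    return f, S_ji / S_jj

def damage_indicator(T_reference, T_current):
    # Scalar index: relative change in transmissibility magnitude between a
    # baseline (undamaged) curve and the current one; larger values suggest
    # a larger change in the structure between the two measurement points.
    T_reference, T_current = np.abs(T_reference), np.abs(T_current)
    return np.sum(np.abs(T_current - T_reference)) / np.sum(T_reference)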

Relevance:

20.00%

Publisher:

Abstract:

In this work, a questionnaire from the ISAAC (International Study of Asthma and Allergies in Childhood) programme was applied in 14 primary schools of the city of Lisbon, Portugal, in 2009/2010. The questionnaire contained questions to identify children with respiratory diseases (wheeze, asthma and rhinitis). Total particulate matter (TPM) was passively collected inside two classrooms of each of the 14 primary schools. Two types of filter matrices were used to collect TPM: Millipore (Isopore™) polycarbonate and quartz. Three campaigns were selected for the measurement of TPM: spring, autumn and winter. The main difference between the two filter types was that the mass of collected particles was higher on quartz than on polycarbonate filters, although the two measurements were very well correlated. The highest TPM depositions occurred between October 2009 and March 2010 and were related to the proportion of rhinitis. Rhinitis was found to be related to TPM when the data were grouped seasonally and averaged over all the schools. For the 2006/2007 data, the seasonal variation was found to be related to outdoor particle deposition (below 10 μm).

Relevance:

20.00%

Publisher:

Abstract:

The rapid growth of genetics and molecular biology, combined with the development of techniques for genetically engineering small animals, has led to increased interest in in vivo small animal imaging. Small animal imaging is frequently applied to mice and rats, which are ubiquitous in modeling human diseases and testing treatments. The use of PET in small animals allows subjects to serve as their own controls, reducing inter-animal variability; this allows longitudinal studies to be performed on the same animal and improves the accuracy of biological models. However, small animal PET still suffers from several limitations: the amounts of radiotracer needed, limited scanner sensitivity, image resolution and image quantification issues could all clearly benefit from additional research. Because nuclear medicine imaging deals with radioactive decay, the emission of radiation energy through photons and particles, and the detection of these quanta and particles in different materials, the Monte Carlo method is an important simulation tool in both nuclear medicine research and clinical practice. In order to optimize the quantitative use of PET in clinical practice, data- and image-processing methods are also a field of intense interest and development. The evaluation of such methods often relies on simulated data and images, since these offer control of the ground truth. Monte Carlo simulations are widely used for PET simulation since they take into account all the random processes involved in PET imaging, from the emission of the positron to the detection of the photons by the detectors. Simulation techniques have become an important and indispensable complement for a wide range of problems that could not be addressed by experimental or analytical approaches.
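As a toy illustration of what such a simulation tracks (decay position, photon-pair attenuation and detection), the sketch below follows back-to-back 511 keV photons in a uniform circular phantom. The attenuation coefficient, detector efficiency and geometry are assumed values; dedicated Monte Carlo tools (e.g. GATE) model far more physics than this.

import numpy as np

rng = np.random.default_rng(0)

# Toy Monte Carlo of PET coincidence detection (illustrative only): positron
# annihilations in a uniform cylindrical phantom emit back-to-back 511 keV
# photons; each photon survives attenuation with probability exp(-mu * path)
# and is then detected with a fixed efficiency.
MU_WATER_511KEV = 0.096    # linear attenuation coefficient, cm^-1 (approx.)
PHANTOM_RADIUS = 2.0       # cm, mouse-sized phantom (assumed)
DETECTOR_EFFICIENCY = 0.3  # assumed single-photon detection efficiency

def chord_length(x, y, theta, radius):
    # Distance from point (x, y) to the phantom boundary along direction theta.
    dx, dy = np.cos(theta), np.sin(theta)
    b = x * dx + y * dy
    c = x * x + y * y - radius * radius
    return -b + np.sqrt(b * b - c)

def simulate(n_decays=100_000):
    detected = 0
    for _ in range(n_decays):
        # Uniform decay position inside the phantom cross-section.
        r = PHANTOM_RADIUS * np.sqrt(rng.random())
        phi = 2 * np.pi * rng.random()
        x, y = r * np.cos(phi), r * np.sin(phi)
        theta = np.pi * rng.random()  # axis of the photon pair
        # Back-to-back photons travel in opposite directions.
        l1 = chord_length(x, y, theta, PHANTOM_RADIUS)
        l2 = chord_length(x, y, theta + np.pi, PHANTOM_RADIUS)
        p_coinc = (np.exp(-MU_WATER_511KEV * l1) * DETECTOR_EFFICIENCY *
                   np.exp(-MU_WATER_511KEV * l2) * DETECTOR_EFFICIENCY)
        if rng.random() < p_coinc:
            detected += 1
    return detected / n_decays

print(f"coincidence detection fraction ≈ {simulate():.4f}")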

Relevance:

20.00%

Publisher:

Abstract:

Chromium dioxide (CrO2) has been extensively used in the magnetic recording industry. However, it is its ferromagnetic half-metallic nature that has more recently attracted much attention, primarily for the development of spintronic devices. CrO2 is the only stoichiometric binary oxide theoretically predicted to be fully spin polarized at the Fermi level. It has a Curie temperature of ∼396 K, i.e. well above room temperature, and a magnetic moment of 2 μB per formula unit. However, an antiferromagnetic, insulating native layer of Cr2O3 is always present on the CrO2 surface; it enhances the CrO2 magnetoresistance and might be used as a barrier in magnetic tunnel junctions.

Relevance:

20.00%

Publisher:

Abstract:

Background: With the decrease of DNA sequencing costs, sequence-based typing methods are rapidly becoming the gold standard for epidemiological surveillance. These methods provide the reproducible and comparable results needed for global-scale bacterial population analysis, while retaining their usefulness for local epidemiological surveys. Online databases that collect the generated allelic profiles and associated epidemiological data are available, but this wealth of data remains underused and is frequently poorly annotated, since no user-friendly tool exists to analyze and explore it. Results: PHYLOViZ is platform-independent Java software that allows the integrated analysis of sequence-based typing methods, including SNP data generated from whole genome sequencing approaches, and of associated epidemiological data. goeBURST and its Minimum Spanning Tree expansion are used to visualize the possible evolutionary relationships between isolates. The results can be displayed as an annotated graph, overlaying the query results of any other epidemiological data available. Conclusions: PHYLOViZ is user-friendly software that allows the combined analysis of multiple data sources for microbial epidemiological and population studies. It is freely available at http://www.phyloviz.net.
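PHYLOViZ's own goeBURST implementation is not reproduced here; as a minimal sketch of the underlying idea, the snippet below computes pairwise Hamming distances between hypothetical allelic profiles and links isolates with a plain minimum spanning tree (goeBURST adds specific tie-break rules on top of this).

import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

# Hypothetical MLST-like allelic profiles: rows are isolates, columns are
# loci, entries are allele numbers.
profiles = np.array([
    [1, 3, 1, 1, 1, 1, 3],
    [1, 3, 1, 1, 1, 1, 1],
    [1, 3, 2, 1, 1, 1, 1],
    [4, 3, 1, 1, 1, 1, 1],
])

# Pairwise Hamming distance: number of loci at which two profiles differ.
diff = (profiles[:, None, :] != profiles[None, :, :]).sum(axis=2)

# A minimum spanning tree over these distances links each isolate to its
# closest relative.
mst = minimum_spanning_tree(diff).toarray()
for i, j in zip(*np.nonzero(mst)):
    print(f"isolate {i} -- isolate {j} (distance {int(mst[i, j])})")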

Relevance:

20.00%

Publisher:

Abstract:

Background and Purpose – In an era in which external radiotherapy (RT) demands ever greater precision, medical imaging provides the means to measure, quantify and evaluate the impact of errors introduced by treatment execution and organ motion. The aim of this paper is to review the current literature on the quantification of setup deviations (SD) in patients with head and neck (H&N) or prostate tumors, measured with Cone Beam Computed Tomography (CBCT) or an Electronic Portal Imaging Device (EPID). Methods – Following the study protocol, the MEDLINE/PubMed and b-on databases were searched for studies reporting SD in H&N and prostate patients measured with CBCT or EPID; selection criteria based on the quality of the articles were then applied. Results – After assessment of 35 papers, 13 studies were included in this analysis and nine were validated (6 for prostate and 3 for H&N tumors). For H&N tumors, the mean SD lies between 0.0 and 1.2 mm, with a maximum standard deviation of 1.3 mm; for prostate, the mean SD lies between 0.0 and 7.1 mm, with a maximum standard deviation of 7.5 mm. Discussion – The reproducibility of patient positioning is the biggest barrier to higher precision in RT; it is affected by geometrical uncertainty, positioning errors, and inter- and intra-fraction organ movement. Random and systematic errors are associated with patient positioning, introduced from the treatment planning phase onwards or through physiological organ movement. Conclusion – SD in H&N patients are mostly attributed to the adverse effects of RT, such as mucositis and pain, which affect swallowing and lead to weight loss, contributing to the instability of patient positioning during treatment and increasing positioning uncertainties. Prostate motion is mainly related to variations in bladder and rectal filling and in intestinal gas. Ignoring SD negatively affects the accuracy of RT; detecting and quantifying SD is therefore crucial in order to calculate appropriate margins and the magnitude of the errors, improving the accuracy of RT delivery and patient safety.
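As a hedged illustration of why such deviations are quantified, the sketch below summarizes hypothetical daily shifts into the group systematic error (Σ) and random error (σ) and evaluates a widely cited CTV-to-PTV margin recipe (van Herk, 2.5Σ + 0.7σ); neither the data nor the recipe comes from the studies reviewed here.

import numpy as np

def setup_error_summary(shifts_mm):
    # shifts_mm: dict mapping patient id -> 1D array of daily shifts (mm)
    # along one axis, as measured with CBCT or EPID.
    # Returns group mean M, systematic error Sigma (SD of per-patient means)
    # and random error sigma (RMS of per-patient SDs).
    means = np.array([np.mean(s) for s in shifts_mm.values()])
    sds = np.array([np.std(s, ddof=1) for s in shifts_mm.values()])
    M = means.mean()
    Sigma = means.std(ddof=1)
    sigma = np.sqrt(np.mean(sds ** 2))
    return M, Sigma, sigma

# Hypothetical daily shifts (mm) for three patients.
shifts = {
    "p1": np.array([0.5, -0.2, 1.1, 0.3]),
    "p2": np.array([-1.0, -0.4, 0.2, -0.6]),
    "p3": np.array([2.1, 1.5, 0.8, 1.9]),
}
M, Sigma, sigma = setup_error_summary(shifts)
# Widely cited CTV-to-PTV margin recipe (van Herk): 2.5*Sigma + 0.7*sigma.
print(f"M={M:.1f} mm, Sigma={Sigma:.1f} mm, sigma={sigma:.1f} mm, "
      f"margin ≈ {2.5 * Sigma + 0.7 * sigma:.1f} mm")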

Relevance:

20.00%

Publisher:

Abstract:

Personal memories composed of digital pictures are very popular at the moment. To retrieve these media items, annotation is required. In recent years, several approaches have been proposed to address the image annotation problem. This paper presents our proposals to tackle this problem: automatic and semi-automatic learning methods for semantic concepts. The automatic method estimates semantic concepts from visual content, context metadata and audio information. The semi-automatic method is based on results provided by a computer game. The paper describes our proposals and presents their evaluation.
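The abstract does not describe how the visual, context and audio cues are combined; one common and deliberately simple possibility is a weighted late fusion of per-modality confidence scores, sketched below with made-up weights and scores.

# Minimal late-fusion sketch (an assumption, not the paper's actual method):
# each modality produces an independent confidence score for a semantic
# concept, and the scores are combined with fixed weights.
def fuse_concept_scores(visual, context, audio,
                        weights=(0.5, 0.3, 0.2), threshold=0.5):
    # visual, context, audio: per-modality confidences in [0, 1].
    # weights: assumed relative importance of each modality.
    fused = weights[0] * visual + weights[1] * context + weights[2] * audio
    return fused, fused >= threshold

# Example: a "beach" concept detector on a single photo.
score, present = fuse_concept_scores(visual=0.8, context=0.6, audio=0.2)
print(f"fused score = {score:.2f}, concept present: {present}")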

Relevance:

20.00%

Publisher:

Abstract:

A crucial method for investigating patients with coronary artery disease (CAD) is the calculation of the left ventricular ejection fraction (LVEF). It is, consequently, imperative to estimate the value of LVEF precisely, a process that can be done with myocardial perfusion scintigraphy. The present study therefore aimed to establish and compare the estimation performance of the quantitative parameters obtained with the reconstruction methods filtered backprojection (FBP) and ordered-subset expectation maximization (OSEM). Methods: A beating-heart phantom with known values of end-diastolic volume, end-systolic volume, and LVEF was used. Quantitative gated SPECT/quantitative perfusion SPECT software was used to obtain these quantitative parameters in a semiautomatic mode. The Butterworth filter was used in FBP, with cutoff frequencies between 0.2 and 0.8 cycles per pixel combined with orders of 5, 10, 15, and 20. Sixty-three reconstructions were performed using 2, 4, 6, 8, 10, 12, and 16 OSEM subsets, combined with several numbers of iterations: 2, 4, 6, 8, 10, 12, 16, 32, and 64. Results: With FBP, the end-diastolic, end-systolic, and stroke volumes rise as the cutoff frequency increases, whereas the value of LVEF diminishes. The same pattern is observed with OSEM reconstruction; however, OSEM gives a more precise estimation of the quantitative parameters, especially with the combinations of 2 iterations × 10 subsets and 2 iterations × 12 subsets. Conclusion: OSEM reconstruction provides better estimates of the quantitative parameters than does FBP. This study recommends the use of 2 iterations with 10 or 12 subsets for OSEM, and a cutoff frequency of 0.5 cycles per pixel with orders 5, 10, or 15 for FBP, as the best estimations of the left ventricular volumes and ejection fraction in myocardial perfusion scintigraphy.
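For reference, the quantity being compared across reconstructions follows directly from the measured volumes; the short sketch below applies the definition LVEF = (EDV − ESV) / EDV × 100 to hypothetical phantom values (not those of this study).

def ejection_fraction(edv_ml, esv_ml):
    # Left ventricular ejection fraction (%) from end-diastolic and
    # end-systolic volumes: LVEF = (EDV - ESV) / EDV * 100.
    stroke_volume = edv_ml - esv_ml
    return 100.0 * stroke_volume / edv_ml

# Example with hypothetical phantom volumes.
print(f"LVEF = {ejection_fraction(edv_ml=120.0, esv_ml=50.0):.1f}%")  # ≈ 58.3%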

Relevance:

20.00%

Publisher:

Abstract:

Microbial adhesion is a field of recognized relevance and, as such, an impressive array of tools has been developed to understand its molecular mechanisms and, ultimately, to quantify it. Some of the major limitations of these methodologies concern the incubation time, the small number of cells analyzed, and the operator's subjectivity. To overcome these aspects, we have developed a quantitative method to measure yeast cell adhesion by flow cytometry. In this methodology, a suspension of yeast cells is mixed with green fluorescent polystyrene microspheres (uncoated or coated with host proteins). Within 2 h, an adhesion profile is obtained based on two parameters: the percentage of adhesion and the distribution pattern of the cell–microsphere populations. This flow cytometry protocol represents a useful tool to quantify yeast adhesion to different substrata on a large scale, providing abundant data in a fast and informative manner.
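As a simple illustration of the two read-outs mentioned (adhesion percentage and population distribution pattern), the sketch below derives both from hypothetical per-cell microsphere counts; it does not reproduce the authors' gating strategy.

import numpy as np

# Hypothetical per-cell counts of bound microspheres from gated flow
# cytometry events (0 = free cell, >=1 = cell with adhered microspheres).
bound_per_cell = np.array([0, 0, 1, 2, 0, 3, 1, 0, 0, 1, 0, 2, 0, 0, 1])

adhesion_pct = 100.0 * np.count_nonzero(bound_per_cell) / bound_per_cell.size
# Distribution pattern: how many cells carry 1, 2, 3, ... microspheres.
values, counts = np.unique(bound_per_cell[bound_per_cell > 0], return_counts=True)

print(f"adhesion = {adhesion_pct:.1f}%")
for v, c in zip(values, counts):
    print(f"cells with {v} microsphere(s): {c}")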

Relevance:

20.00%

Publisher:

Abstract:

Tomographic images can be degraded, in part by patient-based attenuation. The aim of this paper is to quantitatively verify the effects of the Chang and CT-based attenuation correction methods in 111In studies, through the analysis of count profiles from abdominal SPECT over an organ with uniform radionuclide uptake, the left kidney.
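The abstract compares Chang and CT-based attenuation correction; as a minimal sketch of the first-order Chang method for a uniform circular body, the code below averages the attenuation factor over projection angles and inverts it. The attenuation coefficient and geometry are assumptions chosen only for illustration.

import numpy as np

def chang_correction_factor(x, y, mu, body_radius, n_angles=64):
    # First-order Chang attenuation-correction factor at point (x, y) inside
    # a circular, uniformly attenuating body: the inverse of the attenuation
    # factor exp(-mu * L) averaged over projection angles, where L is the
    # path length from the point to the body contour along each angle.
    angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    dx, dy = np.cos(angles), np.sin(angles)
    b = x * dx + y * dy
    c = x * x + y * y - body_radius ** 2
    path_lengths = -b + np.sqrt(b * b - c)  # distance to the circular contour
    mean_attenuation = np.mean(np.exp(-mu * path_lengths))
    return 1.0 / mean_attenuation

# Example: assumed effective mu of 0.12 cm^-1 for 111In, 15 cm body radius.
print(f"correction at centre   ≈ {chang_correction_factor(0.0, 0.0, 0.12, 15.0):.2f}")
print(f"correction near edge   ≈ {chang_correction_factor(12.0, 0.0, 0.12, 15.0):.2f}")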

Relevance:

20.00%

Publisher:

Abstract:

In the pharmaceutical industry, the cleaning of equipment and surfaces is very important in the manufacturing/packaging of pharmaceutical products. Possible contaminating residues must be removed from the equipment and surfaces involved in the process. According to Good Manufacturing Practices (GMP), the cleaning procedures and the analytical methods used to determine the amounts of residues must be validated. The analytical method, combined with the sampling method used to collect the samples, must be subjected to a recovery test. This work presents an innovative strategy for the cleaning validation of semi-solid pharmaceutical forms. The proposed sampling method consists of collecting the sample directly after manufacture, with the residue analysis performed directly on that sample. The products chosen to evaluate the strategy were two dermatological medicines, presented as ointments and produced in a multi-product manufacturing unit by Schering Plough Farma/Merck Sharp & Dohme (Cacém, Portugal). The residues were quantified with validated spectrophotometric (HPLC) methods used for the analysis of the finished product. Cleaning validation was assessed by analysing a known quantity of ointment (product B (*)) with the analytical method of the previously manufactured ointment (product A (*)), in order to verify whether cleaning agent and active substances remained after the cleaning of product A, and vice versa. The residual concentrations of the active substances and of the cleaning agent found after cleaning were null, i.e., below the limit of detection (LOD); the cleaning acceptance criteria used were 6.4 × 10⁻⁴ mg/g for active substance 1 (*), 1.0 × 10⁻² mg/g for active substance 2 (*), 1.0 × 10⁻³ mg/g for active substance 3 (*) and 10 ppm for the cleaning agent. In the recovery test, results above 70% were obtained for all active substances and for the cleaning agent in both ointments. Before carrying out this recovery test, it was necessary to adjust the chromatographic conditions of the analytical methods of both products and of the cleaning agent, so as to obtain system suitability values (tailing factor and resolution) within specifications. The precision of the results, reported as relative standard deviation (RSD), was below 2.0%, except for the assays involving active substance 3, whose specification is below 10.0%. The results obtained demonstrate that the cleaning procedures used in this manufacturing unit are effective, thus eliminating the risk of cross-contamination.
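As an illustration of the recovery and precision acceptance criteria mentioned above, the sketch below computes the recovery percentage and the relative standard deviation (RSD) from hypothetical replicate measurements; the numbers are not those of the study.

import numpy as np

def recovery_percent(measured_mg, spiked_mg):
    # Recovery of a spiked amount: measured / spiked * 100.
    return 100.0 * measured_mg / spiked_mg

def rsd_percent(values):
    # Relative standard deviation (%) = sample SD / mean * 100.
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Hypothetical replicate recoveries (mg) for a known spiked amount of 1.00 mg.
replicates = [0.78, 0.81, 0.80, 0.79, 0.82, 0.80]
recoveries = [recovery_percent(m, 1.00) for m in replicates]
print(f"mean recovery = {np.mean(recoveries):.1f}%  (acceptance: > 70%)")
print(f"RSD = {rsd_percent(recoveries):.1f}%          (acceptance: < 2.0%)")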

Relevance:

20.00%

Publisher:

Abstract:

Epidemiological studies have shown an increased prevalence of respiratory symptoms and adverse changes in pulmonary function parameters in poultry workers, corroborating the increased exposure to risk factors such as fungal load and fungal metabolites. This study aimed to determine the occupational exposure threat posed by fungal contamination with toxigenic isolates belonging to the Aspergillus flavus species complex and with isolates from the Aspergillus fumigatus species complex. The study was carried out in seven Portuguese poultry units, using cultural and molecular methodologies. For the conventional/cultural methods, air, surface, and litter samples were collected by the impaction method using the Millipore Air Sampler. For the molecular analysis, air samples were collected by the impinger method using the Coriolis μ air sampler. After DNA extraction, samples were analyzed by real-time PCR using specific primers and probes for toxigenic strains of the Aspergillus flavus complex and for the detection of isolates from the Aspergillus fumigatus complex. Through conventional methods, and within the Aspergillus genus, different prevalences were detected for the Aspergillus flavus and Aspergillus fumigatus species complexes, namely: 74.5 versus 1.0% in the air samples, 24.0 versus 16.0% on surfaces, 0 versus 32.6% in new litter, and 9.9 versus 15.9% in used litter. Through molecular biology, we were able to detect the presence of aflatoxigenic strains in pavilions in which Aspergillus flavus did not grow in culture. Aspergillus fumigatus was found in only one indoor air sample by conventional methods; using molecular methodologies, however, the Aspergillus fumigatus complex was detected in seven indoor samples from three different poultry units. The characterization of the fungal contamination caused by Aspergillus flavus and Aspergillus fumigatus raises concern about an occupational threat, not only because of the detected fungal load but also because of the toxigenic potential of these species.

Relevance:

20.00%

Publisher:

Abstract:

Liver steatosis is a common disease usually associated with social and genetic factors. Early detection and quantification are important since it can evolve into cirrhosis. Steatosis is usually a diffuse liver disease, since the organ is globally affected; however, it can also be focal, affecting only some foci that are difficult to discriminate. In both cases, steatosis is detected by laboratory analysis and visual inspection of ultrasound images of the hepatic parenchyma. Liver biopsy is the most accurate diagnostic method, but its invasive nature suggests the use of other, non-invasive methods, while visual inspection of the ultrasound images is subjective and prone to error. In this paper a new Computer Aided Diagnosis (CAD) system for steatosis classification and analysis is presented, in which the Bayes factor, obtained from objective intensity and textural features extracted from US images of the liver, is computed on a local or global basis. The main goal is to provide the physician with an application that makes the diagnosis and quantification of steatosis faster and more accurate, namely in a screening setting. The results showed an overall accuracy of 93.54%, with sensitivities of 95.83% and 85.71% for the normal and steatosis classes, respectively. The proposed CAD system seems suitable as a graphical display for steatosis classification; a comparison with some of the most recent works in the literature is also presented.
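The paper's feature set and density models are not given here; as a hedged sketch of a Bayes-factor decision, the code below compares Gaussian class likelihoods of a hypothetical two-dimensional intensity/texture feature vector.

import numpy as np
from scipy.stats import multivariate_normal

# Illustrative Bayes-factor classifier (assumed Gaussian class models, not the
# paper's exact features): x is a feature vector of intensity/texture measures
# extracted from a liver US region of interest.
def bayes_factor(x, mean_steatosis, cov_steatosis, mean_normal, cov_normal):
    # BF = p(x | steatosis) / p(x | normal); BF > 1 favours steatosis.
    p_s = multivariate_normal.pdf(x, mean=mean_steatosis, cov=cov_steatosis)
    p_n = multivariate_normal.pdf(x, mean=mean_normal, cov=cov_normal)
    return p_s / p_n

# Hypothetical 2D features: (mean echo intensity, texture energy).
mean_s, cov_s = np.array([0.70, 0.40]), np.diag([0.01, 0.02])
mean_n, cov_n = np.array([0.50, 0.25]), np.diag([0.01, 0.02])

x = np.array([0.66, 0.38])  # region of interest to classify
bf = bayes_factor(x, mean_s, cov_s, mean_n, cov_n)
print(f"Bayes factor = {bf:.2f} -> {'steatosis' if bf > 1 else 'normal'}")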

Relevance:

20.00%

Publisher:

Abstract:

This project was developed to fully assess the indoor air quality in archives and libraries from the point of view of fungal flora. It uses classical methodologies, such as traditional culture media for the viable fungi, and modern molecular biology protocols, which are especially relevant for assessing the non-viable fraction of the biological contaminants. Denaturing high-performance liquid chromatography (DHPLC) has emerged as an alternative to denaturing gradient gel electrophoresis (DGGE) and has already been applied to the study of a few bacterial communities. We propose the application of DHPLC to the study of fungal colonization of paper-based archive materials. This technology allows the identification of each component of a mixture of fungi based on their genetic variation. In a highly complex mixture of microbial DNA, this method can be used simply to study the population dynamics, and it also allows sample fractions to be collected and, in many cases, immediately sequenced, circumventing the need for cloning. Some examples of the methodological application are shown. Fragment length analysis is also applied to the study of mixed Candida samples. Both methods can later be applied in various fields, such as clinical and sand sample analysis. So far, the environmental analyses have been extremely useful to determine potentially pathogenic/toxinogenic fungi such as Stachybotrys sp., Aspergillus niger, Aspergillus fumigatus, and Fusarium sp. This work will hopefully lead to a more accurate evaluation of environmental conditions, both for human health and for the preservation of documents.