66 results for Processing methods
in the Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
Microarrays allow thousands of genes to be monitored simultaneously, so that the abundance of transcripts under the same experimental condition can be quantified at the same time. Among the available array technologies, two-channel cDNA microarray experiments, the focus of this work, have arisen in numerous technical protocols associated with genomic studies. Microarray experiments involve many steps, and each one can affect the quality of the raw data. Background correction and normalization are preprocessing techniques that clean and correct the raw data when undesirable fluctuations arise from technical factors. Several recent studies have shown that no preprocessing strategy outperforms the others in all circumstances, so it is difficult to provide general recommendations. In this work, exploratory techniques are proposed to visualize the effects of preprocessing methods on the statistical analysis of cancer two-channel microarray data sets in which the cancer types (classes) are known. The arrow plot was used to select differentially expressed genes, and the graph of profiles resulting from correspondence analysis was used to visualize the results. Six background-correction methods and six normalization methods were combined, yielding 36 preprocessing strategies, which were analyzed on a published cDNA microarray database (Liver), available at http://genome-www5.stanford.edu/, whose microarrays were already classified by cancer type. All statistical analyses were performed in the R statistical software.
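As a rough illustration of how one of the 36 background/normalization combinations evaluated above can be assembled, the following Python sketch applies plain background subtraction followed by global median normalization of the log-ratios to synthetic two-channel intensities. The study itself was carried out in R; all names and values here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-channel intensities for one array: foreground and background
# for the red (Cy5) and green (Cy3) channels.
n_genes = 1000
R_fg, R_bg = rng.lognormal(7, 1, n_genes), rng.lognormal(5, 0.3, n_genes)
G_fg, G_bg = rng.lognormal(7, 1, n_genes), rng.lognormal(5, 0.3, n_genes)

def bg_subtract(fg, bg):
    """Plain background subtraction, floored at a small positive value."""
    return np.maximum(fg - bg, 0.5)

def median_normalize(M):
    """Global median normalization of the log-ratios."""
    return M - np.median(M)

# One possible combination: subtraction background correction + median normalization.
R = bg_subtract(R_fg, R_bg)
G = bg_subtract(G_fg, G_bg)
M = np.log2(R / G)          # log-ratio per gene
A = 0.5 * np.log2(R * G)    # average log-intensity per gene
M_norm = median_normalize(M)

print(f"median M before: {np.median(M):.3f}, after: {np.median(M_norm):.3f}")
```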
Abstract:
The rapid growth in genetics and molecular biology, combined with the development of techniques for genetically engineering small animals, has led to increased interest in in vivo small animal imaging. Small animal imaging is applied frequently to mice and rats, which are ubiquitous in modeling human diseases and testing treatments. The use of PET in small animals allows subjects to serve as their own control, reducing inter-animal variability; this makes longitudinal studies on the same animal possible and improves the accuracy of biological models. However, small animal PET still suffers from several limitations: the amounts of radiotracer needed, limited scanner sensitivity, image resolution and image quantification issues could all benefit from additional research. Because nuclear medicine imaging deals with radioactive decay, the emission of radiation energy through photons and particles, together with the detection of these quanta and particles in different materials, makes the Monte Carlo method an important simulation tool in both nuclear medicine research and clinical practice. To optimize the quantitative use of PET in clinical practice, data- and image-processing methods are also a field of intense interest and development. The evaluation of such methods often relies on simulated data and images, since these offer control of the ground truth. Monte Carlo simulations are widely used for PET simulation because they take into account all the random processes involved in PET imaging, from the emission of the positron to the detection of the photons by the detectors. Simulation techniques have become an important and indispensable complement to a wide range of problems that could not be addressed by experimental or analytical approaches.
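To make the role of Monte Carlo simulation in PET concrete, here is a deliberately minimal Python sketch: a point source at the centre of a water cylinder emits back-to-back 511 keV photons, and a true coincidence is recorded only when both photons escape unscattered and are detected. The attenuation coefficient, phantom size and detector efficiency are assumed round numbers rather than values for any particular scanner, and real simulators model far more physics.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy Monte Carlo: point source at the centre of a water cylinder, ring detector.
MU_WATER = 0.0096       # mm^-1, approximate linear attenuation of water at 511 keV
RADIUS_PHANTOM = 100.0  # mm, assumed phantom radius
DETECTOR_EFF = 0.8      # assumed per-photon detection efficiency
N_DECAYS = 100_000

# For a central source each annihilation photon crosses one phantom radius of water.
p_escape = np.exp(-MU_WATER * RADIUS_PHANTOM)

# A true coincidence requires both photons to escape unscattered and be detected.
p_true = (p_escape * DETECTOR_EFF) ** 2
coincidences = int(np.sum(rng.random(N_DECAYS) < p_true))

print(f"simulated true-coincidence sensitivity: {coincidences / N_DECAYS:.2%}")
```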
Abstract:
Coronary artery disease (CAD) is currently one of the most prevalent diseases in the world population, and calcium deposits in the coronary arteries are a direct risk factor. These can be assessed with the calcium score (CS) application, available via a computed tomography (CT) scan, which gives an accurate indication of the development of the disease. However, the ionising radiation applied to patients is high. This study aimed to optimise the acquisition protocol in order to reduce the radiation dose and to explain the flow of procedures used to quantify CAD. The main differences in the clinical results when automated or semi-automated post-processing is used are shown, and the epidemiology, imaging, risk factors and prognosis of the disease are described. The software steps and the values that allow the risk of developing CAD to be predicted are presented. A 64-row multidetector CT scanner with dual source and two phantoms (pig hearts) were used to demonstrate the advantages and disadvantages of the Agatston method. The tube energy was balanced, and two measurements were obtained in each of the three experimental protocols (64, 128 and 256 mAs). Considerable changes appeared between the CS values when the protocol was varied. The predefined standard protocol provided the lowest radiation dose (0.43 mGy). This study found that the variation in the radiation dose between protocols, taking into consideration the dose control systems attached to the CT equipment and image quality, was not sufficient to justify changing the default protocol provided by the manufacturer.
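The Agatston method referred to above weights each calcified lesion's area by a factor derived from its peak attenuation (130-199 HU → 1, 200-299 → 2, 300-399 → 3, ≥400 → 4) and sums the weighted areas over all slices. A minimal Python sketch of that scoring rule, with made-up lesion measurements, is shown below; it is not the software package used in the study.

```python
def agatston_weight(max_hu):
    """Standard Agatston density weighting for a calcified lesion."""
    if max_hu >= 400:
        return 4
    if max_hu >= 300:
        return 3
    if max_hu >= 200:
        return 2
    if max_hu >= 130:
        return 1
    return 0

def agatston_score(lesions):
    """lesions: iterable of (area_mm2, max_hu) tuples, one per lesion per slice."""
    return sum(area * agatston_weight(hu) for area, hu in lesions if hu >= 130)

# Hypothetical lesions segmented above the 130 HU threshold.
lesions = [(3.2, 210), (1.8, 450), (5.0, 150)]
print(f"Agatston score: {agatston_score(lesions):.1f}")
```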
Abstract:
This paper is an elaboration of the DECA algorithm [1] for blindly unmixing hyperspectral data. The underlying mixing model is linear, meaning that each pixel is a linear mixture of the endmember signatures weighted by the corresponding abundance fractions. The proposed method, like DECA, is tailored to highly mixed data sets in which geometric approaches fail to identify the simplex of minimum volume enclosing the observed spectral vectors. We therefore resort to a statistical framework in which the abundance fractions are modeled as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. With respect to DECA, we introduce two improvements: 1) the number of Dirichlet modes is inferred based on the minimum description length (MDL) principle; 2) the generalized expectation maximization (GEM) algorithm adopted to infer the model parameters is improved by using alternating minimization and augmented Lagrangian methods to compute the mixing matrix. The effectiveness of the proposed algorithm is illustrated with simulated and real data.
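For readers unfamiliar with the linear mixing model assumed above, the short Python sketch below generates synthetic "highly mixed" data: observed spectra are the product of an endmember matrix and Dirichlet-distributed abundance vectors, which are non-negative and sum to one by construction. It only illustrates the data model, not the DECA/GEM inference itself, and all dimensions and parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear mixing model: x = M a + n, with a >= 0 and sum(a) = 1.
n_bands, n_endmembers, n_pixels = 50, 3, 500

M = rng.uniform(0.1, 0.9, size=(n_bands, n_endmembers))   # endmember signatures
# Highly mixed abundances drawn from a Dirichlet density (concentrations > 1
# keep samples away from the simplex vertices, the regime DECA targets).
A = rng.dirichlet(alpha=[5.0, 5.0, 5.0], size=n_pixels).T  # (n_endmembers, n_pixels)
noise = rng.normal(0, 0.01, size=(n_bands, n_pixels))
X = M @ A + noise                                          # observed spectral vectors

print("abundance constraints hold:", np.all(A >= 0), np.allclose(A.sum(axis=0), 1.0))
```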
Abstract:
The use of iris recognition for human authentication has been spreading in recent years. Daugman proposed a method for iris recognition composed of four stages: segmentation, normalization, feature extraction and matching. In this paper we propose some modifications and extensions to Daugman's method to cope with noisy images. These modifications are proposed after a study of images from the CASIA and UBIRIS databases. The major modification is to the computationally demanding segmentation stage, for which we propose a faster and equally accurate template-matching approach. The extensions to the algorithm address the important issue of pre-processing, which depends on the image database and is mandatory when a non-infrared camera, such as a typical webcam, is used. For this scenario, we propose methods for reflection removal and for pupil enhancement and isolation. Tests carried out by our C# application on grayscale CASIA and UBIRIS images show that the template-matching segmentation method is more accurate and faster than the previous one for noisy images. The proposed algorithms are found to be efficient and necessary when dealing with non-infrared images and non-uniform illumination.
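As an illustration of the template-matching idea used for the segmentation stage, the following Python sketch locates a dark circular "pupil" in a synthetic grayscale image by exhaustive sum-of-squared-differences matching against a disc template. It is a toy stand-in for the authors' C# implementation, with assumed image sizes and intensities.

```python
import numpy as np

def match_template_ssd(image, template):
    """Exhaustive template matching by sum of squared differences (SSD).
    Returns the (row, col) of the best match; intended for small images only."""
    ih, iw = image.shape
    th, tw = template.shape
    best, best_pos = np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            ssd = np.sum((image[r:r + th, c:c + tw] - template) ** 2)
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

# Synthetic eye image: bright background with a dark circular "pupil" of radius 10.
img = np.full((60, 80), 200.0)
yy, xx = np.mgrid[:60, :80]
img[(yy - 30) ** 2 + (xx - 45) ** 2 < 100] = 20.0   # pupil centred at (30, 45)

# Dark-disc template of the expected pupil size (21 x 21, radius 10).
tpl = np.full((21, 21), 200.0)
ty, tx = np.mgrid[:21, :21]
tpl[(ty - 10) ** 2 + (tx - 10) ** 2 < 100] = 20.0

row, col = match_template_ssd(img, tpl)
print("pupil centre estimate:", (row + 10, col + 10))   # expect roughly (30, 45)
```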
Abstract:
In this work, 14 primary schools in the city of Lisbon, Portugal, applied a questionnaire from the ISAAC (International Study of Asthma and Allergies in Childhood) programme in 2009/2010. The questionnaire contained questions to identify children with respiratory diseases (wheeze, asthma and rhinitis). Total particulate matter (TPM) was passively collected inside two classrooms in each of the 14 primary schools. Two types of filter matrices were used to collect TPM: Millipore (Isopore™) polycarbonate and quartz. Three campaigns were selected for the measurement of TPM: spring, autumn and winter. The main difference between the two types of filters was that the mass of collected particles was higher on quartz filters than on polycarbonate filters, even though their correlation is excellent. The highest TPM depositions occurred between October 2009 and March 2010, when related to the proportion of rhinitis. Rhinitis was found to be related to TPM when the data were grouped seasonally and averaged over all the schools. For the 2006/2007 data, the seasonal variation was found to be related to outdoor particle deposition (below 10 μm).
Abstract:
The work presented in this dissertation concerns the design, development and experimental realisation of a fault-tolerant static power converter. Research on failure modes of power electronic converters, fault-tolerant converter topologies and fault-detection methods, among other topics, was reviewed. With a view to designing a solution, the main failure modes were identified and analysed for three proposed converter solutions with fault-tolerant topologies in which redundant elements are kept in standby. The various technical aspects of the power and signal-routing circuits were analysed, highlighting the need for dead times between the gate signals of IGBTs in the same leg, galvanic isolation between the different gate-drive stages, and the need to minimise the stray inductances between the DC capacitor and the legs of the power converter. To improve the reliability and operating safety of the fault-tolerant static power converter, an electronic circuit was designed to speed up the normal actuation of contactors, together with another circuit responsible for routing and inhibiting the gate signals. To apply the developed fault-tolerant static power converter to a DC motor drive, a control algorithm was implemented on a digital signal processing (DSP) board, with system supervision and actuation carried out in real time for fault detection, contactor actuation, and motor current and speed control using a PWM command strategy. Tests were carried out in which, upon adequate fault detection, switching between power converter blocks was performed. Experimental results obtained with the laboratory prototype are presented and discussed.
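One of the technical points highlighted above is the need for dead time between the gate signals of the two IGBTs in the same converter leg. The Python sketch below shows, at signal level, how complementary gate commands can be blanked for a short interval after every transition so that both devices are never commanded on at once; the carrier frequency, sampling step and dead time are assumed values chosen only for illustration.

```python
import numpy as np

def complementary_with_dead_time(pwm_high, dead_samples):
    """Generate complementary gate commands for the two IGBTs of one converter leg,
    blanking both signals for `dead_samples` samples after every transition so that
    the devices are never commanded ON simultaneously (no shoot-through command)."""
    high = pwm_high.astype(bool)
    low = ~high
    edges = np.flatnonzero(np.diff(high.astype(int)) != 0) + 1
    for e in edges:
        high[e:e + dead_samples] = False
        low[e:e + dead_samples] = False
    return high, low

# 20 kHz carrier sampled at 1 MHz, 50 % duty cycle, 1 sample (1 us) of dead time.
t = np.arange(0, 200e-6, 1e-6)
pwm = np.mod(t, 50e-6) < 25e-6
hi, lo = complementary_with_dead_time(pwm, dead_samples=1)
print("simultaneous ON commands:", int(np.sum(hi & lo)))   # should be 0
```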
Abstract:
A two-terminal optically addressed image processing device based on two stacked sensing/switching p-i-n a-SiC:H diodes is presented. Charge packets are injected optically into the p-i-n sensing photodiode and confined to the illuminated regions, locally changing the electric field profile across the p-i-n switching diode. A red scanner is used for charge readout. The various design parameters and addressing architecture trade-offs are discussed. The influence on the transfer functions of an a-SiC:H sensing absorber optimized for red transmittance and blue collection, or of a floating anode in between, is analysed. Results show that the thin a-SiC:H sensing absorber confines the readout to the switching diode and filters the light, allowing full colour detection at two appropriate voltages. When the floating anode is used, the spectral response broadens, allowing B&W image recognition with improved light-to-dark sensitivity. A physical model supports the image and colour recognition process.
Abstract:
A nanofiltration process for the treatment/valorisation of cork processing wastewaters was studied. A DS-5 DK 20/40 (GE Water Technologies) nanofiltration membrane/module with 2.09 m² of surface area was used. The hydraulic permeability, determined with pure water, was 5.2 L·h⁻¹·m⁻²·bar⁻¹. The membrane presents rejections of 51% and 99% for NaCl and MgSO4 salts, respectively. Two filtration regimes were used with the wastewaters: total recycling mode and concentration mode. The first regime showed that the most favourable working transmembrane pressure was 7 bar at 25 °C. In the concentration mode experiments, a 30% decline in the permeate fluxes was observed when a volumetric concentration factor of 5 was reached. The permeate COD, BOD5, colour and TOC rejection values remained well above 90%, which therefore allows the concentration of organic matter (namely the tannin fraction) in the concentrate stream, which can be further used by other industries. The permeate characterization showed that it cannot be discharged directly to the environment, as it does not meet the values of the Portuguese discharge legislation. However, the permeate stream can be recycled to the process (boiling tanks), as it presents no colour and low TOC (< 60 ppm); alternatively, if wastewater discharge is envisaged, the permeate biodegradability was observed to be higher than 0.5, which makes conventional wastewater treatment feasible.
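The rejection and volumetric concentration factor figures quoted above follow from two simple definitions, sketched below in Python with hypothetical concentrations and volumes chosen only to reproduce the reported orders of magnitude.

```python
def rejection(c_feed, c_permeate):
    """Observed rejection R = 1 - Cp/Cf, returned as a percentage."""
    return 100.0 * (1.0 - c_permeate / c_feed)

def volumetric_concentration_factor(v_feed, v_concentrate):
    """VCF = initial feed volume / remaining concentrate volume."""
    return v_feed / v_concentrate

# Hypothetical salt concentrations (g/L) and volumes (L), for illustration only.
print(f"NaCl rejection:  {rejection(2.0, 0.98):.0f} %")              # ~51 % in the study
print(f"MgSO4 rejection: {rejection(2.0, 0.02):.0f} %")              # ~99 % in the study
print(f"VCF: {volumetric_concentration_factor(100.0, 20.0):.0f}")    # VCF of 5
```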
Abstract:
Chromium dioxide (CrO2) has been used extensively in the magnetic recording industry. However, it is its ferromagnetic half-metallic nature that has attracted much attention more recently, primarily for the development of spintronic devices. CrO2 is the only stoichiometric binary oxide theoretically predicted to be fully spin polarized at the Fermi level. It presents a Curie temperature of ∼396 K, i.e. well above room temperature, and a magnetic moment of 2 μB per formula unit. However, an antiferromagnetic native insulating layer of Cr2O3 is always present on the CrO2 surface, which enhances the CrO2 magnetoresistance and might be used as a barrier in magnetic tunnel junctions.
Abstract:
Background: With the decrease of DNA sequencing costs, sequence-based typing methods are rapidly becoming the gold standard for epidemiological surveillance. These methods provide the reproducible and comparable results needed for global-scale bacterial population analysis, while retaining their usefulness for local epidemiological surveys. Online databases that collect the generated allelic profiles and associated epidemiological data are available, but this wealth of data remains underused and is frequently poorly annotated, since no user-friendly tool exists to analyse and explore it. Results: PHYLOViZ is a platform-independent Java application that allows the integrated analysis of sequence-based typing data, including SNP data generated from whole genome sequencing approaches, together with associated epidemiological data. goeBURST and its Minimum Spanning Tree expansion are used to visualize the possible evolutionary relationships between isolates. The results can be displayed as an annotated graph overlaying the query results of any other epidemiological data available. Conclusions: PHYLOViZ is user-friendly software that allows the combined analysis of multiple data sources for microbial epidemiological and population studies. It is freely available at http://www.phyloviz.net.
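The Minimum Spanning Tree view that PHYLOViZ builds over allelic profiles can be approximated in a few lines of Python: pairwise Hamming distances between profiles, and Prim's algorithm linking each isolate to its closest neighbour already in the tree. goeBURST adds specific tie-break rules on top of this basic construction; the four 7-locus profiles below are invented for illustration.

```python
def hamming(p, q):
    """Number of loci at which two allelic profiles differ."""
    return sum(a != b for a, b in zip(p, q))

def minimum_spanning_tree(profiles):
    """Prim's algorithm over pairwise Hamming distances; returns tree edges."""
    n = len(profiles)
    in_tree, edges = {0}, []
    while len(in_tree) < n:
        best = None
        for i in in_tree:
            for j in range(n):
                if j in in_tree:
                    continue
                d = hamming(profiles[i], profiles[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        edges.append(best)
        in_tree.add(best[2])
    return edges

# Hypothetical 7-locus MLST allelic profiles for four isolates.
profiles = [(1, 3, 1, 1, 1, 1, 3),
            (1, 3, 1, 1, 1, 1, 1),
            (1, 3, 4, 1, 1, 1, 1),
            (7, 3, 4, 1, 1, 1, 1)]
for d, i, j in minimum_spanning_tree(profiles):
    print(f"isolate {i} -- isolate {j}  (distance {d})")
```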
Abstract:
Cork processing wastewater is a complex aqueous mixture of organic compounds that are extracted from cork planks during the boiling process. These compounds, such as polysaccharides and polyphenols, have different biodegradability rates, which depend not only on the nature of the compound but also on its size. The aim of this study is to determine the biochemical oxygen demand (BOD) and the biodegradation rate constants (k) for cork wastewater fractions with different organic matter characteristics. These wastewater fractions were obtained using membrane separation processes, namely nanofiltration (NF) and ultrafiltration (UF). The molecular weight cut-offs (MWCO) of the nanofiltration and ultrafiltration membranes ranged from 0.125 to 91 kDa. The results showed that the biodegradation rate constant for the cork processing wastewater was around 0.3 d⁻¹ and that the k values for the permeates varied between 0.27 and 0.72 d⁻¹, with the lower values observed for permeates generated by the membranes with higher MWCO and the higher values for permeates generated by the membranes with lower MWCO. These higher k values indicate that the biodegradable organic matter permeated by the membranes with tighter MWCO is more readily biodegraded.
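The rate constants reported above come from the classical first-order BOD model, BOD(t) = L(1 − e^(−kt)), where L is the ultimate demand and k the biodegradation rate constant. A minimal Python sketch of such a fit is shown below; the respirometric readings are invented, and the scipy curve_fit call is just one convenient way to estimate L and k.

```python
import numpy as np
from scipy.optimize import curve_fit

def bod_first_order(t, L, k):
    """First-order BOD model: oxygen demand exerted at time t (days),
    with ultimate demand L (mg O2/L) and rate constant k (d^-1)."""
    return L * (1.0 - np.exp(-k * t))

# Hypothetical respirometric readings for one permeate fraction (mg O2/L vs days).
t_days = np.array([1, 2, 3, 5, 7, 10, 14], dtype=float)
bod = np.array([120, 210, 280, 370, 420, 460, 480], dtype=float)

(L_hat, k_hat), _ = curve_fit(bod_first_order, t_days, bod, p0=(500.0, 0.3))
print(f"ultimate BOD ~ {L_hat:.0f} mg O2/L, k ~ {k_hat:.2f} d^-1")
```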
Abstract:
Structures experience various types of loads along their lifetime, which can be either static or dynamic and may be associated with phenomena such as corrosion and chemical attack, among others. As a consequence, different types of structural damage can be produced; the deteriorated structure may have its capacity affected, leading to excessive vibration problems or even to failure. It is therefore very important to develop methods that can simultaneously detect the existence of damage and quantify its extent. In this paper the authors propose a method to detect and quantify structural damage using response transmissibilities measured along the structure. Some numerical simulations are presented, and a comparison is made with results obtained using frequency response functions. Experimental tests are also undertaken to validate the proposed technique.
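A transmissibility is simply the ratio between the response spectra measured at two points of the structure, and damage can be flagged by comparing it against a baseline. The Python sketch below estimates transmissibilities from synthetic signals and forms a crude scalar indicator from their difference; the signals, frequencies and threshold are assumptions for illustration, not the indicator defined in the paper.

```python
import numpy as np

def transmissibility(x_i, x_j, fs, rel_tol=1e-3):
    """Transmissibility between two response points, estimated as the ratio of
    Fourier spectra, keeping only bins where the reference response is well excited."""
    X_i, X_j = np.fft.rfft(x_i), np.fft.rfft(x_j)
    freqs = np.fft.rfftfreq(len(x_i), d=1.0 / fs)
    mask = np.abs(X_j) > rel_tol * np.abs(X_j).max()
    return freqs[mask], X_i[mask] / X_j[mask]

def damage_indicator(T_ref, T_dam):
    """Crude scalar indicator: summed magnitude difference between the
    baseline and the current transmissibility."""
    return float(np.sum(np.abs(np.abs(T_dam) - np.abs(T_ref))))

# Synthetic responses at two points (hypothetical two-mode signals, 2 s at 1024 Hz).
fs = 1024
t = np.arange(0, 2.0, 1.0 / fs)
x_j = np.sin(2 * np.pi * 20 * t) + 0.5 * np.sin(2 * np.pi * 55 * t)            # reference point
x_i_ref = 0.8 * np.sin(2 * np.pi * 20 * t) + 0.3 * np.sin(2 * np.pi * 55 * t)  # undamaged
x_i_dam = 0.8 * np.sin(2 * np.pi * 19 * t) + 0.3 * np.sin(2 * np.pi * 55 * t)  # shifted mode

_, T_ref = transmissibility(x_i_ref, x_j, fs)
_, T_dam = transmissibility(x_i_dam, x_j, fs)
print(f"damage indicator: {damage_indicator(T_ref, T_dam):.2f}")
```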
Abstract:
Personal memories composed of digital pictures are very popular at the moment. To retrieve these media items, annotation is required. In recent years, several approaches have been proposed to overcome the image annotation problem. This paper presents our proposals to address this problem: automatic and semi-automatic learning methods for semantic concepts. The automatic method estimates semantic concepts from visual content, context metadata and audio information; the semi-automatic method is based on results provided by a computer game. The paper describes both proposals and presents their evaluation.