Abstract:
MRI has evolved into an important diagnostic technique in medical imaging. However, the reliability of the derived diagnosis can be degraded by artifacts, which challenge both radiologists and automatic computer-aided diagnosis. This work proposes a fully automatic method for measuring the image quality of three-dimensional (3D) structural MRI. Quality measures are derived by analyzing the air background of magnitude images and are capable of detecting image degradation from several sources, including bulk motion, residual magnetization from incomplete spoiling, blurring, and ghosting. The method has been validated on 749 3D T1-weighted 1.5T and 3T head scans acquired at 36 Alzheimer's Disease Neuroimaging Initiative (ADNI) study sites operating with various software and hardware combinations. Results are compared against qualitative grades assigned by the ADNI quality control center (taken as the reference standard). The derived quality indices are independent of the MRI system used and agree with the reference standard quality ratings with high sensitivity and specificity (>85%). The proposed quality-assessment procedures could be of great value for both research and routine clinical imaging, and could greatly improve workflow by ruling out the need for a repeat scan while the patient is still in the magnet bore.
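The core idea, deriving quality measures from the statistics of the air background of a magnitude image, can be illustrated with a minimal sketch. This is not the paper's validated procedure: the Rayleigh-ratio heuristic, the precomputed background mask, and all names below are illustrative assumptions.

```python
# Minimal sketch (not the authors' validated method): a heuristic artifact
# index from the air background of a 3D magnitude MRI volume. Assumes a
# background mask is already available; real pipelines segment it carefully.
import numpy as np

def background_artifact_index(volume, background_mask):
    """Heuristic artifact score from the air background of a magnitude image.

    For artifact-free Rayleigh-distributed noise, std/mean equals the fixed
    constant sqrt(4/pi - 1) ~ 0.523; structured residue from motion, ghosting,
    or incomplete spoiling inflates the observed ratio.
    """
    background = volume[background_mask]
    rayleigh_ratio = np.sqrt(4.0 / np.pi - 1.0)  # ~0.5227 for Rayleigh noise
    observed = background.std() / (background.mean() + 1e-12)
    return observed / rayleigh_ratio  # ~1.0 for clean noise, >1 with artifacts

# Synthetic check: Rayleigh air background around a bright "head" region.
rng = np.random.default_rng(0)
vol = rng.rayleigh(scale=5.0, size=(64, 64, 64))
mask = np.ones(vol.shape, dtype=bool)
mask[16:48, 16:48, 16:48] = False   # exclude the object region from the mask
vol[~mask] += 200.0                 # bright signal inside the object
print(round(background_artifact_index(vol, mask), 3))  # close to 1.0
```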
Abstract:
Assessment of image quality for digital x-ray mammography systems used in European screening programs relies mainly on contrast-detail CDMAM phantom scoring and requires the acquisition and analysis of many images in order to reduce variability in threshold detectability. Part II of this study proposes an alternative method based on the detectability index (d') calculated for a non-prewhitened model observer with an eye filter (NPWE). The detectability index was calculated from the normalized noise power spectrum and image contrast, both measured from an image of a 5 cm poly(methyl methacrylate) phantom containing a 0.2 mm thick aluminium square, and from the pre-sampling modulation transfer function. This was done as a function of air kerma at the detector for 11 different digital mammography systems. The calculated d' values were compared against threshold gold thickness (T) results measured with the CDMAM test object and against derived theoretical relationships. A simple relationship was found between T and d' as a function of detector air kerma, and a linear relationship was found between d' and the contrast-to-noise ratio. The threshold thickness values used to specify acceptable performance in the European Guidelines for 0.10 and 0.25 mm diameter discs were equivalent to calculated threshold detectability indices of 1.05 and 6.30, respectively. The NPWE method is a validated alternative to CDMAM scoring for use in the image quality specification, quality control, and optimization of digital x-ray systems for screening mammography.
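For reference, a common form of the NPWE detectability index in the model-observer literature is given below; whether the paper uses exactly this expression is an assumption. Here ΔS is the task (signal contrast) spectrum, MTF the pre-sampling modulation transfer function, E the eye filter, and NNPS the normalized noise power spectrum.

```latex
% Common NPWE detectability expression (illustrative; notation assumed):
\[
  d'^{\,2}_{\mathrm{NPWE}} \;=\;
  \frac{\left[\iint |\Delta S(u,v)|^{2}\,\mathrm{MTF}^{2}(u,v)\,E^{2}(u,v)\,du\,dv\right]^{2}}
       {\iint |\Delta S(u,v)|^{2}\,\mathrm{MTF}^{2}(u,v)\,E^{4}(u,v)\,\mathrm{NNPS}(u,v)\,du\,dv}
\]
```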
Abstract:
One of the most important issues in molecular biology is to understand the regulatory mechanisms that control gene expression. Gene expression is often regulated by proteins called transcription factors, which bind to short (5 to 20 base pairs), degenerate segments of DNA. Experimental efforts towards understanding the sequence specificity of transcription factors are laborious and expensive, but can be substantially accelerated by computational predictions. This thesis describes the use of algorithms and resources for transcription factor binding site analysis in quantitative modelling, where probabilistic models are built to represent the binding properties of a transcription factor and can be used to find new functional binding sites in genomes. Initially, an open-access database (HTPSELEX) was created, holding high-quality binding sequences for two eukaryotic families of transcription factors, namely CTF/NF1 and LEF1/TCF. The binding sequences were elucidated using a recently described experimental procedure called HTP-SELEX, which allows the generation of a large number (>1000) of binding sites using mass sequencing technology. For each HTP-SELEX experiment we also provide accurate primary experimental information about the protein material used, details of the wet-lab protocol, an archive of sequencing trace files, and assembled clone sequences of binding sequences. The database also offers reasonably large SELEX libraries obtained with conventional low-throughput protocols. The database is available at http://wwwisrec.isb-sib.ch/htpselex/ and ftp://ftp.isrec.isb-sib.ch/pub/databases/htpselex. The Expectation-Maximisation (EM) algorithm is one of the most frequently used methods for estimating probabilistic models of the sequence specificity of transcription factors. We present computer simulations to estimate the precision of EM-estimated models as a function of data set parameters (such as the length of the initial sequences, the number of initial sequences, and the percentage of non-binding sequences). We observed a remarkable robustness of the EM algorithm with regard to the length of the training sequences and the degree of contamination. The HTPSELEX database and the benchmark results for the EM algorithm formed part of the foundation for the subsequent project, in which a statistical framework based on hidden Markov models was developed to represent the sequence specificity of the transcription factors CTF/NF1 and LEF1/TCF using the HTP-SELEX experiment data. The hidden Markov model framework is capable of both predicting and classifying CTF/NF1 and LEF1/TCF binding sites. A covariance analysis of the binding sites revealed non-independent base preferences at different nucleotide positions, providing insight into the binding mechanism. We then tested the LEF1/TCF model by computing binding scores for a set of LEF1/TCF binding sequences whose relative affinities had been determined experimentally using non-linear regression. The predicted and experimentally determined binding affinities correlated well.
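As a rough illustration of EM-based estimation of a probabilistic binding model, the sketch below fits a position weight matrix under a one-occurrence-per-sequence assumption with a uniform background. It is a toy version of the technique, not the thesis's implementation; the sequences, pseudocounts, and parameters are invented.

```python
# Toy EM estimation of a position weight matrix (PWM) from binding-site-
# containing sequences (one occurrence per sequence, uniform background).
import numpy as np

ALPHABET = "ACGT"

def em_pwm(seqs, motif_len, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    X = [np.array([ALPHABET.index(c) for c in s]) for s in seqs]
    pwm = rng.dirichlet(np.ones(4), size=motif_len)   # motif_len x 4
    for _ in range(n_iter):
        counts = np.full((motif_len, 4), 0.1)          # pseudocounts
        for x in X:
            n_starts = len(x) - motif_len + 1
            # E-step: posterior over motif start positions (the uniform
            # background cancels up to a constant, so motif likelihoods suffice).
            logp = np.array(
                [np.log(pwm[np.arange(motif_len), x[i:i + motif_len]]).sum()
                 for i in range(n_starts)])
            post = np.exp(logp - logp.max())
            post /= post.sum()
            # M-step contribution: expected base counts per motif position.
            for i, w in enumerate(post):
                counts[np.arange(motif_len), x[i:i + motif_len]] += w
        pwm = counts / counts.sum(axis=1, keepdims=True)
    return pwm

seqs = ["ACGTGACGTT", "TTACGTGACA", "GGACGTGATC"]  # toy data, ACGTGA-like site
print(np.round(em_pwm(seqs, motif_len=6), 2))
```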
Abstract:
Electrical Impedance Tomography (EIT) is an imaging method that enables a volume conductivity map of a subject to be produced from multiple impedance measurements. It has the potential to become a portable, non-invasive imaging technique of particular use in imaging brain function. Accurate numerical forward models may be used to improve image reconstruction but, until now, have employed an assumption of isotropic tissue conductivity. This may be expected to introduce inaccuracy, as body tissues, especially those such as white matter and the skull in head imaging, are highly anisotropic. The purpose of this study was, for the first time, to develop a method for incorporating anisotropy into a numerical forward model for EIT of the head and to assess the resulting improvement in image quality for linear reconstruction in one example of the human head. A realistic Finite Element Model (FEM) of an adult human head, with segments for the scalp, skull, CSF, and brain, was produced from a structural MRI. Anisotropy of the brain was estimated from a diffusion tensor MRI of the same subject, and anisotropy of the skull was approximated from the structural information. A method for incorporating anisotropy into the forward model and using it in image reconstruction was produced. The improvement in reconstructed image quality was assessed in computer simulation by producing forward data and then performing linear reconstruction using a sensitivity matrix approach. The mean boundary data difference between the anisotropic and isotropic forward models for a reference conductivity was 50%. Use of the correct anisotropic FEM in image reconstruction, as opposed to an isotropic one, corrected an error of 24 mm in imaging a 10% conductivity decrease located in the hippocampus, improved localisation by 4-17 mm for conductivity changes deep in the brain and those due to epilepsy, and, overall, led to a substantial improvement in image quality. This suggests that incorporating anisotropy in the numerical models used for image reconstruction is likely to improve EIT image quality.
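The linear, sensitivity-matrix reconstruction step can be sketched as a regularised least-squares solve. The Jacobian below is a random stand-in; in the study it would come from the (an)isotropic FEM forward model, and the Tikhonov regularisation choice is an assumption.

```python
# Minimal sketch of linear EIT reconstruction with a sensitivity matrix:
# boundary-voltage changes dv are mapped to a conductivity update ds via a
# Tikhonov-regularised pseudo-inverse of the Jacobian J.
import numpy as np

def linear_eit_reconstruction(J, dv, lam=1e-3):
    """Solve min ||J ds - dv||^2 + lam ||ds||^2 for conductivity change ds."""
    JtJ = J.T @ J
    return np.linalg.solve(JtJ + lam * np.eye(JtJ.shape[0]), J.T @ dv)

rng = np.random.default_rng(1)
n_meas, n_elem = 258, 500            # measurements x mesh elements (toy sizes)
J = rng.normal(size=(n_meas, n_elem))
ds_true = np.zeros(n_elem)
ds_true[100] = -0.1                  # a local 10% conductivity decrease
dv = J @ ds_true                     # simulated boundary data change
ds_hat = linear_eit_reconstruction(J, dv)
print(int(np.argmin(ds_hat)))        # ideally recovers element 100
```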
Abstract:
Especially in panel surveys, respondent attrition, respondent learning, and interviewer experience effects play a crucial role in data quality. We examine three interview survey quality indicators within the same survey, both cross-sectionally and longitudinally. In the cross-sectional analysis we compare data quality in the mature original sample with that in a refreshment sample surveyed in the same wave. Because an interviewer survey was conducted in the same wave, collecting the interviewers' socio-demographic characteristics, survey attitudes, and burden measures, we are also able to consider interviewer fixed effects. The longitudinal analysis gives more insight into respondent learning effects on the quality indicators considered by following the very same respondents across waves. The Swiss Household Panel, a CATI survey representative of the Swiss residential population, forms an ideal modelling database: the interviewer-respondent assignment is random, both within and across waves. This design avoids possible confounding with other effects stemming from a non-random assignment of interviewers, e.g. area effects or effects from assigning the best interviewers to the hard cases. In order to separate interviewer, respondent, and wave effects, we build cross-classified multilevel models.
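A generic cross-classified specification of the kind described, with crossed random effects for interviewers and respondents and fixed wave effects, can be written as follows; the notation and the exact covariate and error structure are assumptions, not the paper's stated model.

```latex
% Generic cross-classified multilevel model (illustrative notation):
% interviewer j, respondent k, wave t; gamma_t is a fixed wave effect.
\[
  y_{jkt} \;=\; \beta_0 \;+\; \mathbf{x}_{jkt}^{\top}\boldsymbol{\beta}
  \;+\; \gamma_t \;+\; u_j \;+\; v_k \;+\; \varepsilon_{jkt},
\]
\[
  u_j \sim \mathcal{N}(0,\sigma_u^2)\;\;\text{(interviewer)},\qquad
  v_k \sim \mathcal{N}(0,\sigma_v^2)\;\;\text{(respondent)},\qquad
  \varepsilon_{jkt} \sim \mathcal{N}(0,\sigma_\varepsilon^2).
\]
```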
Abstract:
Mountain regions worldwide are particularly sensitive to ongoing climate change. In the Swiss Alps specifically, temperature has increased twice as fast as in the rest of the Northern hemisphere. Water temperature closely follows the annual air temperature cycle, severely impacting streams and freshwater ecosystems. In the last 20 years, brown trout (Salmo trutta L.) catch has declined by approximately 40-50% in many rivers in Switzerland, and increasing water temperature has been suggested as one of the most likely causes of this decline. Temperature has a direct effect on trout population dynamics through development and disease control, but can also impact dynamics indirectly via food-web interactions such as resource availability. We developed a spatially explicit modelling framework that allows spatial and temporal projections of trout biomass, using the Aare river catchment as a model system, in order to assess the spatial and seasonal patterns of trout biomass variation. Given that biomass varies seasonally with trout life history stage, we developed seasonal biomass variation models for three periods of the year (autumn-winter, spring, and summer). Because stream water temperature is a critical parameter for brown trout development, we first calibrated a model predicting water temperature as a function of air temperature, so that climate change scenarios can later be applied. We then built a model of trout biomass variation by linking water temperature to trout biomass measurements collected by electro-fishing at 21 stations from 2009 to 2011. The different modelling components of our framework overall had good predictive ability, and we could show a seasonal effect of water temperature on trout biomass variation. Our statistical framework uses a minimal set of input variables, which makes it easily transferable to other study areas or fish species, but could be improved by including effects of the biotic environment and the evolution of demographic parameters over time. Nevertheless, our framework remains informative for spatially highlighting where potential changes in water temperature could affect trout biomass.
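The air-to-water temperature calibration step can be illustrated with the widely used logistic relation of Mohseni et al. (1998). Whether the study used this exact functional form is an assumption, and the data below are synthetic.

```python
# Sketch of calibrating a nonlinear air-to-water temperature relation using
# the logistic form of Mohseni et al. (1998); assumed here for illustration.
import numpy as np
from scipy.optimize import curve_fit

def mohseni(t_air, mu, alpha, beta, gamma):
    """Logistic stream-temperature model: bounded below by mu, above by alpha;
    beta is the inflection air temperature, gamma the steepness."""
    return mu + (alpha - mu) / (1.0 + np.exp(gamma * (beta - t_air)))

rng = np.random.default_rng(42)
t_air = np.linspace(-5, 30, 200)                       # synthetic air temps (C)
t_water = mohseni(t_air, 0.5, 22.0, 13.0, 0.15) \
          + rng.normal(0, 0.4, t_air.size)             # noisy water temps

params, _ = curve_fit(mohseni, t_air, t_water, p0=[0.0, 20.0, 10.0, 0.2])
print(np.round(params, 2))  # recovered (mu, alpha, beta, gamma)
```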