22 results for Typographical Errors


Relevance: 10.00%

Publisher:

Abstract:

Microarrays are high-throughput biological assays that allow the screening of thousands of genes for their expression. The main idea behind microarrays is to compute for each gene a unique signal that is directly proportional to the quantity of mRNA that was hybridized on the chip. The large number of steps, and the errors associated with each step, make the generated expression signal noisy. As a result, microarray data need to be carefully pre-processed before their analysis can be assumed to lead to reliable and biologically relevant conclusions. This thesis focuses on developing methods for improving the gene signal and on utilizing this improved signal for higher-level analysis. To achieve this, first, approaches for designing microarray experiments using various optimality criteria, considering both biological and technical replicates, are described. A carefully designed experiment leads to a signal with low noise, as the effect of unwanted variation is minimized and the precision of the estimates of the parameters of interest is maximized. Second, a system for improving the gene signal by using three scans at varying scanner sensitivities is developed. A novel Bayesian latent intensity model is then applied to these three sets of expression values, corresponding to the three scans, to estimate the suitably calibrated true signal of each gene. Third, a novel image segmentation approach that segregates the fluorescent signal from the undesired noise is developed using an additional dye, SYBR green RNA II. This technique identifies signal arising only from the hybridized DNA, so that signal corresponding to dust, scratches, spilled dye, and other noise is avoided. Fourth, an integrated statistical model is developed in which signal correction, systematic array effects, dye effects, and differential expression are modelled jointly, as opposed to a sequential application of several methods of analysis.
The methods described here have been tested only on cDNA microarrays but can also, with some modifications, be applied to other high-throughput technologies. Keywords: high-throughput technology, microarray, cDNA, multiple scans, Bayesian hierarchical models, image analysis, experimental design, MCMC, WinBUGS.
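As a rough intuition for the multiple-scan idea, the sketch below combines three scans of the same array taken at increasing scanner sensitivities: bright spots saturate at high sensitivity, so saturated readings are masked and the remaining readings, rescaled to a common gain, are averaged. This is a crude point-estimate stand-in, not the thesis's Bayesian latent intensity model; the saturation ceiling, gains, and intensities are all invented for illustration.

```python
import numpy as np

SATURATION = 65535                       # assumed 16-bit scanner ceiling
gains = np.array([1.0, 4.0, 16.0])       # assumed relative sensitivities

def calibrate(scans, gains, saturation=SATURATION):
    """scans: (n_scans, n_genes) raw intensities -> calibrated estimates."""
    scans = np.asarray(scans, dtype=float)
    rescaled = scans / gains[:, None]    # map every scan to the gain = 1 scale
    valid = scans < saturation           # saturated pixels carry no signal
    counts = valid.sum(axis=0)
    avg = np.where(valid, rescaled, 0.0).sum(axis=0) / np.maximum(counts, 1)
    # Genes saturated in every scan are only bounded from below.
    return np.where(counts > 0, avg, saturation / gains.min())

scans = np.array([[1200.0, 65535.0],
                  [4810.0, 65535.0],
                  [19100.0, 65535.0]])
print(calibrate(scans, gains))
```

The Bayesian model of the thesis replaces this simple averaging with a full posterior over the latent true intensity, which also propagates the measurement uncertainty to downstream analysis.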


This thesis studies the human gene expression space using high-throughput gene expression data from DNA microarrays. In molecular biology, high-throughput techniques allow numerical measurement of the expression of tens of thousands of genes simultaneously. In a single study, such data are traditionally obtained from a limited number of sample types with a small number of replicates. For organism-wide analysis, such data have been largely unavailable, and the global structure of the human transcriptome has remained unknown. This thesis introduces a human transcriptome map of different biological entities and an analysis of its general structure. The map is constructed from gene expression data from the two largest public microarray data repositories, GEO and ArrayExpress. The creation of this map contributed to the development of ArrayExpress by identifying and retrofitting previously unusable and missing data and by improving access to its data. It also contributed to the creation of several new tools for microarray data manipulation and to the establishment of data exchange between GEO and ArrayExpress. The data integration for the global map required the creation of a large new ontology of human cell types, disease states, organism parts and cell lines. The ontology was used in a new text-mining and decision-tree-based method for automatically converting human-readable free-text microarray data annotations into a categorised format. Data comparability, and the minimisation of the systematic measurement errors characteristic of each laboratory in this large cross-laboratory integrated dataset, were ensured by computing a range of microarray data quality metrics and excluding incomparable data. The structure of the global map of human gene expression was then explored by principal component analysis and hierarchical clustering, using heuristics and the help of another purpose-built sample ontology.
A preface and motivation for the construction and analysis of a global map of human gene expression are given by the analysis of two microarray datasets of human malignant melanoma. The analysis of these sets incorporates an indirect comparison of statistical methods for finding differentially expressed genes and points to the need to study gene expression on a global level.
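The exploration step above can be illustrated with a minimal principal component analysis via the SVD of a centred expression matrix. The data below are invented (random noise with one injected sample-group signal), so this only sketches the technique, not the thesis pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_genes = 50, 200
X = rng.normal(size=(n_samples, n_genes))
X[:25, :20] += 3.0               # inject a crude "sample group" signal

Xc = X - X.mean(axis=0)          # centre each gene across samples
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = U * s                      # sample coordinates in PC space
explained = s**2 / np.sum(s**2)  # variance fraction per component

# PC1 should separate the two injected sample groups.
print(explained[:3])
print(pcs[:25, 0].mean(), pcs[25:, 0].mean())
```

On real integrated data, the leading components tend to reflect the dominant biological groupings (and, without the quality filtering described above, laboratory batch effects), which is why careful data curation must precede this analysis.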


Delay and disruption tolerant networks (DTNs) are computer networks in which round-trip delays and error rates are high and disconnections are frequent. Examples of these extreme networks are space communications, sensor networks, connecting rural villages to the Internet, and even interconnecting commodity portable wireless devices and mobile phones. The basic elements of delay tolerant networks are store-and-forward message transfer resembling traditional mail delivery, opportunistic and intermittent routing, and an extensible cross-region resource naming service. Individual nodes of the network take an active part in routing the traffic and provide in-network storage for application data that flows through the network. The application architecture for delay tolerant networks also differs from that used in traditional networks. It has become feasible to design applications that are network-aware and opportunistic, taking advantage of different network connection speeds and capabilities. This might change some of the basic paradigms of network application design. DTN protocols also support the design of applications whose processes must persist over reboots and power failures. DTN protocols could likewise be applicable to traditional networks in cases where high tolerance to delays or errors is desired. It is apparent that challenged networks also challenge the traditional, strictly layered model of network application design. This thesis provides an extensive introduction to delay tolerant networking concepts and applications. Most attention is given to the challenging problems of routing and application architecture. Finally, future prospects of DTN applications and implementations are envisioned through recent research results and an interview with an active DTN researcher.
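The store-and-forward element described above can be sketched as a toy simulation: each node stores bundles and hands them to the next hop only while a contact to that neighbour is up. All names and the static route table are invented; this illustrates the concept, not an actual DTN protocol such as the Bundle Protocol.

```python
from collections import deque

class Node:
    def __init__(self, name):
        self.name = name
        self.store = deque()      # in-network storage for in-transit bundles
        self.contacts = set()     # neighbours currently reachable
        self.delivered = []       # payloads that reached this destination

    def receive(self, bundle):
        if bundle["dst"] == self.name:
            self.delivered.append(bundle["data"])
        else:
            self.store.append(bundle)        # hold until a contact appears

    def forward(self, nodes):
        kept = deque()
        while self.store:
            bundle = self.store.popleft()
            nxt = bundle["route"].get(self.name)
            if nxt in self.contacts:
                nodes[nxt].receive(bundle)   # contact window open: hand over
            else:
                kept.append(bundle)          # keep stored, retry later
        self.store = kept

nodes = {n: Node(n) for n in "ABC"}
nodes["A"].receive({"dst": "C", "data": "hello", "route": {"A": "B", "B": "C"}})
nodes["A"].forward(nodes)        # no contact yet: the bundle stays stored
nodes["A"].contacts.add("B")
nodes["B"].contacts.add("C")
nodes["A"].forward(nodes)        # A-to-B contact window
nodes["B"].forward(nodes)        # B-to-C contact window: delivered
print(nodes["C"].delivered)
```

Note how the message survives the period when no contact exists, which is exactly the property that end-to-end protocols of traditional networks lack.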


Refractive errors, especially myopia, seem to be increasing worldwide. Concurrently, the number of surgical refractive corrections has grown rapidly, with several million procedures performed annually. However, excimer laser surgery was introduced after only a limited number of animal studies, and to date there are still only a few long-term follow-up studies of the results. The present thesis aims to evaluate the safety and functional outcome of photorefractive keratectomy (PRK) and laser-assisted in situ keratomileusis (LASIK), the two most common laser surgical refractive methods, and to quantify the cellular changes and remodelling they induce in the human cornea. In Study I, myopic ophthalmic residents at Helsinki University Eye Hospital underwent a refractive correction by PRK. Five patients were followed up for 6 months to assess their subjective experience in hospital work and their performance in a car-driving simulator and in other visuomotor tasks. Corneal morphological changes were assessed by in vivo confocal microscopy (ivCM). Study II comprised 14 patients who had undergone a PRK operation in 1993-1994. Visual acuity was examined and ivCM examinations were performed 5 years postoperatively. In Study III, 15 patients received LASIK refractive correction for moderate to high myopia (-6 to -12 D). Their corneal recovery was followed by ivCM for 2 years. Diffuse lamellar keratitis (DLK) is a common but variable complication of LASIK, yet its aetiology remains unknown. In Study IV we examined six patients who had developed DLK as a consequence of an intraoperative or post-LASIK epithelial defect, to assess the corneal and conjunctival inflammatory reaction. In the whole series, the mean refractive correction was -6.46 diopters. The best spectacle-corrected visual acuity (BSCVA) improved in 30% of patients, whereas in four patients BSCVA decreased slightly. The mean achieved refraction was undercorrected by 0.35 D.
After PRK, stromal scar formation was most pronounced at 2 to 3 months postoperatively and subsequently decreased. At 5 years, increased reflectivity in the subepithelial stroma was observed in all patients. Interestingly, no Bowman's layer was detected in any patient. Subbasal nerve fiber bundle (SNFB) regeneration could be observed as early as 2 months after PRK in 2 patients. After 5 years, all corneas presented with SNFBs, the density of which, however, was still lower than in control corneas. LASIK induced a hypocellular area on both sides of the flap interface. A decrease in the density of the most anterior keratocytes was also observed. In the corneas that developed DLK, inflammatory cell-type objects were present in the flap interface in half of the patients. The other patients presented only with keratocyte activation and highly reflective extracellular matrix. These changes resolved completely with medication and time. SNFB regeneration was first detected at one month post-LASIK, but even after two years the SNFB density was only 64% of the preoperative value. Performing ophthalmological examinations and microsurgery without spectacles was easier postoperatively, which was appreciated by the residents. Both PRK and LASIK showed moderate to good accuracy and high safety. In terms of visual perception and subjective evaluation, few patients stated any complaints in the whole series of studies. Instead, the majority of patients experienced a marked improvement in everyday life and work performance. PRK and LASIK have shown similar results, with good long-term morphological healing. It seems evident that, even without the benefit of over-20-year follow-up results, these procedures are sufficiently safe and accurate for refractive corrections and corneal reshaping.


Visual acuities at the time of referral and on the day before surgery were compared in 124 patients operated on for cataract at Vaasa Central Hospital, Finland. Preoperative visual acuity and the occurrence of ocular and general disease were compared in samples of consecutive cataract extractions performed in 1982, 1985, 1990, 1995 and 2000 in two hospitals in the Vaasa region of Finland. The repeatability and the standard deviation of random measurement error in visual acuity and refractive error determination in a clinical environment were estimated in cataractous, pseudophakic and healthy eyes by re-examining the visual acuity and refractive error of patients referred to cataract surgery or consultation by ophthalmic professionals. Altogether 99 eyes of 99 persons (41 cataractous, 36 pseudophakic and 22 healthy eyes) with a visual acuity range of Snellen 0.3 to 1.3 (0.52 to -0.11 logMAR) were examined. During an average waiting time of 13 months, visual acuity in the study eye decreased from 0.68 logMAR to 0.96 logMAR (from 0.2 to 0.1 in Snellen decimal values). The average decrease in vision was 0.27 logMAR per year. In the fastest quartile, the visual acuity change per year was 0.75 logMAR, and in the second fastest 0.29 logMAR; the third and fourth quartiles were virtually unaffected. From 1982 to 2000, the incidence of cataract surgery increased from 1.0 to 7.2 operations per 1000 inhabitants per year in the Vaasa region. The average preoperative visual acuity in the operated eye improved by 0.85 logMAR (in decimal values, from 0.03 to 0.2) and in the better eye by 0.27 logMAR (in decimal values, from 0.23 to 0.43) over this period. The proportion of patients profoundly visually handicapped (VA in the better eye <0.1) before the operation fell from 15% to 4%, and that of patients less profoundly visually handicapped (VA in the better eye 0.1 to <0.3) from 47% to 15%.
The repeatability of visual acuity measurement, estimated as a coefficient of repeatability for all 99 eyes, was ±0.18 logMAR, and the standard deviation of measurement error was 0.06 logMAR. Eyes with the lowest visual acuity (0.3-0.45) had the largest variability, with a coefficient of repeatability of ±0.24 logMAR, and eyes with a visual acuity of 0.7 or better had the smallest, ±0.12 logMAR. The repeatability of refractive error measurement was studied in the same patient material as the repeatability of visual acuity. Differences between measurements 1 and 2 were calculated as three-dimensional vector values and as spherical equivalents, and expressed as coefficients of repeatability. Coefficients of repeatability for all eyes for the vertical, torsional and horizontal vectors were ±0.74 D, ±0.34 D and ±0.93 D, respectively, and for the spherical equivalent for all eyes ±0.74 D. Eyes with lower visual acuity (0.3-0.45) had larger variability in vector and spherical equivalent values (±1.14 D), but the difference between visual acuity groups was not statistically significant. The difference in the mean defocus equivalent between measurements 1 and 2 was, however, significantly greater in the lower visual acuity group. If a change of ±0.5 D (measured in defocus equivalents) is accepted as a basis for a change of spectacles for eyes with good vision, the corresponding basis for eyes in the visual acuity range of 0.3 to 0.65 would be ±1 D. Differences in repeated visual acuity measurements are partly explained by errors in refractive error measurements.
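The coefficient of repeatability used above can be computed, in the Bland-Altman sense, as 1.96 times the standard deviation of the differences between two measurements of the same eye; the within-subject (measurement-error) SD is that figure divided by the square root of two. The sketch below shows the arithmetic on invented logMAR data, not on the study's measurements.

```python
import math

m1 = [0.30, 0.18, 0.52, 0.10, 0.40, 0.00, 0.22, 0.48]   # logMAR, visit 1
m2 = [0.22, 0.26, 0.44, 0.16, 0.34, 0.08, 0.30, 0.40]   # logMAR, visit 2

diffs = [a - b for a, b in zip(m1, m2)]
n = len(diffs)
mean_d = sum(diffs) / n
sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))

cr = 1.96 * sd_d             # coefficient of repeatability (95% of retests)
sw = sd_d / math.sqrt(2)     # within-subject (measurement error) SD
print(f"CR = ±{cr:.2f} logMAR, measurement-error SD = {sw:.2f} logMAR")
```

With the study's reported measurement-error SD of 0.06 logMAR, this relation yields a repeatability coefficient of roughly the ±0.18 logMAR magnitude quoted above.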


The greatest effect on reducing mortality in breast cancer comes from the detection and treatment of invasive cancer when it is as small as possible. Although mammography screening is known to be effective, observer errors are frequent, and false-negative cancers can be found in retrospective reviews of prior mammograms. In 2001, 67 women with 69 surgically proven cancers detected at screening in the Mammography Centre of Helsinki University Hospital also had previous mammograms available. These mammograms were analyzed by an experienced screening radiologist, who found that 36 lesions were already visible in previous screening rounds. CAD (Second Look v. 4.01) detected 23 of these missed lesions. Eight readers with different levels of experience in mammography screening read the films of 200 women, with and without CAD. These films included 35 of the missed lesions and 16 screen-detected cancers. CAD sensitivity was 70.6% and specificity 15.8%. Use of CAD lengthened the mean reading time but did not significantly affect the readers' sensitivities or specificities. The use of the applied version of CAD (Second Look v. 4.01) is therefore questionable. Because none of the eight readers found exactly the same cancers, two reading methods were compared: summarized independent reading (at least a single cancer-positive opinion within the group considered decisive) and conference consensus reading (the cancer-positive opinion of the reader majority considered decisive). The greatest sensitivity, 74.5%, was achieved when the independent readings of the 4 best-performing readers were summarized. Overall, the summarized independent readings were more sensitive than the conference consensus readings (64.7% vs. 43.1%), while there was far less difference in mean specificities (92.4% vs. 97.7%). After detecting a suspicious lesion, the radiologist has to decide on the most accurate, fast, and cost-effective means of further work-up.
The feasibility of FNAC and CNB in the diagnosis of breast lesions was compared in a non-randomised, retrospective study of 580 (503 malignant) breast lesions in 572 patients. The absolute sensitivity of CNB was better than that of FNAC: 96% (206/214) vs. 67% (194/289) (p < 0.0001). An additional needle biopsy or a surgical biopsy was performed for 93 and 62 patients, respectively, with FNAC, but for only 2 and 33 patients with CNB. The frequent need for supplementary biopsies and the unnecessary axillary operations due to false-positive findings made FNAC (294 €) more expensive than CNB (223 €), and because the advantage of quick analysis vanishes during the overall diagnostic and referral process, CNB is recommended as the initial biopsy method.
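The two multi-reader decision rules compared above can be stated as simple combination functions: summarized independent reading flags a case if any reader calls it positive, while conference consensus requires a majority of positive opinions. The reader calls below are invented for illustration.

```python
def summarized_independent(reader_calls):
    return any(reader_calls)            # a single positive opinion decides

def conference_consensus(reader_calls):
    return sum(reader_calls) > len(reader_calls) / 2   # majority decides

cases = {
    "lone dissenting positive": [True, False, False, False],
    "clear majority":           [True, True,  True,  False],
    "unanimous negative":       [False, False, False, False],
}

for label, calls in cases.items():
    print(label, summarized_independent(calls), conference_consensus(calls))
```

The any-rule accepts every lone positive call, which is why summarized independent reading gains sensitivity (64.7% vs. 43.1% above) at the cost of some specificity (92.4% vs. 97.7%).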


Numerical weather prediction (NWP) models provide the basis for weather forecasting by simulating the evolution of the atmospheric state. A good forecast requires that the initial state of the atmosphere is known accurately and that the NWP model is a realistic representation of the atmosphere. Data assimilation methods are used to produce initial conditions for NWP models: the NWP model background field, typically a short-range forecast, is updated with observations in a statistically optimal way. The objective of this thesis has been to develop methods that allow the data assimilation of Doppler radar radial wind observations. The work has been carried out in the High Resolution Limited Area Model (HIRLAM) 3-dimensional variational data assimilation framework. Observation modelling is a key element in exploiting indirect observations of the model variables. In radar radial wind observation modelling, the vertical model wind profile is interpolated to the observation location, and the projection of the model wind vector onto the radar pulse path is calculated. The vertical broadening of the radar pulse volume and the bending of the radar pulse path due to atmospheric conditions are taken into account. Radar radial wind observations are modelled to within observation errors, which consist of instrumental, modelling, and representativeness errors. Systematic and random modelling errors can be minimized by accurate observation modelling. The impact of the random part of the instrumental and representativeness errors can be decreased by calculating spatial averages from the raw observations. Model experiments indicate that spatial averaging clearly improves the fit of the radial wind observations to the model in terms of the observation minus model background (OmB) standard deviation. Monitoring the quality of the observations is an important aspect, especially when a new observation type is introduced into a data assimilation system.
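The core of the observation operator described above, the projection of the model wind onto the radar beam, can be sketched as follows. The angle conventions (azimuth clockwise from north, elevation from the horizontal) are assumed here, and the refinements the thesis accounts for (pulse-volume broadening, beam bending) are deliberately omitted.

```python
import math

def radial_wind(u, v, w, azimuth_deg, elevation_deg):
    """Project a model wind vector (u east, v north, w up, in m/s)
    onto a radar beam with the given pointing angles."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    # Horizontal wind projected on the beam azimuth, plus the vertical
    # component along the beam elevation.
    return ((u * math.sin(az) + v * math.cos(az)) * math.cos(el)
            + w * math.sin(el))

# A 10 m/s westerly wind (u = 10) seen by a beam pointing east
# (azimuth 90 degrees) at low elevation is almost entirely radial;
# the same wind viewed along north (azimuth 0) has no radial component.
print(radial_wind(10.0, 0.0, 0.0, 90.0, 0.5))
print(radial_wind(10.0, 0.0, 0.0, 0.0, 0.5))
```

The second case illustrates why a conventional bias computed over all azimuths can vanish: positive and negative radial projections of the same wind error cancel, which motivates the dedicated bias estimation method below.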
Calculating the bias of radial wind observations in the conventional way can result in zero even when there are systematic differences in wind speed and/or direction. A bias estimation method designed for this observation type is therefore introduced in the thesis. Doppler radar radial wind observation modelling, together with the bias estimation method, enables the exploitation of radial wind observations also for NWP model validation. One-month model experiments performed with HIRLAM model versions differing only in a detail of the surface stress parameterization indicate that the use of radar wind observations in NWP model validation is very beneficial.