979 results for ARRAY INTERPOLATION


Relevance: 20.00%

Publisher:

Abstract:

As a noninvasive method, exhaled breath condensate (EBC) has gained importance for monitoring lung diseases and detecting biomarkers. The aim of the study was to investigate whether erythropoietin (EPO) is detectable in EBC. EBC was collected from 22 consecutive patients as well as from healthy individuals. Using a multiplex fluorescent bead immunoassay, we detected EPO together with tumour necrosis factor-alpha (TNF-alpha) in 13 of the 22 patients (EPO 0.21 +/- 0.03 U/mL and TNF-alpha 34.6 +/- 4.2 pg/mL, mean +/- SEM). No significant differences in EPO levels and no correlation between EPO and TNF-alpha were found, but TNF-alpha was significantly higher in patients with chronic obstructive pulmonary disease (COPD) than in non-COPD patients (obstructive sleep apnoea (OSA) and lung-healthy patients). This is the first report of the detection of EPO in EBC. Given the small study size, more data are needed to clarify the role of EPO in EBC.

Relevance: 20.00%

Publisher:

Abstract:

Genomic alterations have been linked to the development and progression of cancer. The technique of Comparative Genomic Hybridization (CGH) yields data consisting of fluorescence intensity ratios of test and reference DNA samples. The intensity ratios provide information about DNA copy number. Practical issues such as the contamination of tumor cells in tissue specimens and normalization errors necessitate the use of statistics for learning about the genomic alterations from array CGH data. As increasing amounts of array CGH data become available, there is a growing need for automated algorithms for characterizing genomic profiles. Specifically, there is a need for algorithms that can identify gains and losses in copy number based on statistical considerations, rather than merely detect trends in the data. We adopt a Bayesian approach, relying on the hidden Markov model to account for the inherent dependence in the intensity ratios. Posterior inferences are made about gains and losses in copy number. Localized amplifications (associated with oncogene mutations) and deletions (associated with mutations of tumor suppressors) are identified using posterior probabilities. Global trends such as extended regions of altered copy number are detected. Since the posterior distribution is analytically intractable, we implement a Metropolis-within-Gibbs algorithm for efficient simulation-based inference. Publicly available data on pancreatic adenocarcinoma, glioblastoma multiforme and breast cancer are analyzed, and comparisons are made with some widely used algorithms to illustrate the reliability and success of the technique.
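The gain/loss calls described above rest on posterior state probabilities under a hidden Markov model. As a rough illustration of that machinery (not the authors' implementation, which samples a richer parameter space with Metropolis-within-Gibbs), the sketch below fixes the emission means, standard deviation and transition matrix, and computes exact posterior state probabilities for simulated log2 ratios with the forward-backward algorithm; all numerical settings are illustrative assumptions.

```python
import numpy as np

def hmm_posteriors(x, means, sd, trans, init):
    """Scaled forward-backward: posterior P(state | all data) for a
    Gaussian-emission HMM with fixed parameters."""
    T, K = len(x), len(means)
    # Gaussian emissions; the shared 1/(sd*sqrt(2*pi)) factor cancels on normalization
    em = np.exp(-0.5 * ((x[:, None] - means) / sd) ** 2)
    alpha = np.zeros((T, K)); c = np.zeros(T)
    alpha[0] = init * em[0]
    c[0] = alpha[0].sum(); alpha[0] /= c[0]
    for t in range(1, T):                          # scaled forward pass
        alpha[t] = em[t] * (alpha[t - 1] @ trans)
        c[t] = alpha[t].sum(); alpha[t] /= c[t]
    beta = np.ones((T, K))
    for t in range(T - 2, -1, -1):                 # scaled backward pass
        beta[t] = (trans @ (em[t + 1] * beta[t + 1])) / c[t + 1]
    post = alpha * beta
    return post / post.sum(axis=1, keepdims=True)

# states: 0 = loss, 1 = neutral, 2 = gain (illustrative log2-ratio means)
means, sd = np.array([-1.0, 0.0, 0.58]), 0.15
trans = np.full((3, 3), 0.01) + np.eye(3) * 0.97   # "sticky" transitions
init = np.array([1 / 3, 1 / 3, 1 / 3])

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, sd, 40),       # neutral clones
                    rng.normal(0.58, sd, 20),      # single-copy gain
                    rng.normal(0.0, sd, 40)])      # neutral clones
post = hmm_posteriors(x, means, sd, trans, init)
```

The sticky transition matrix is what distinguishes this from calling each clone independently: isolated outliers are smoothed away, while runs of elevated ratios produce high posterior gain probability.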

Relevance: 20.00%

Publisher:

Abstract:

DNA sequence copy number has been shown to be associated with cancer development and progression. Array-based Comparative Genomic Hybridization (aCGH) is a recent development that seeks to identify the copy number ratio at large numbers of markers across the genome. Due to experimental and biological variations across chromosomes and across hybridizations, current methods are limited to analyses of single chromosomes. We propose a more powerful approach that borrows strength across chromosomes and across hybridizations. We assume a Gaussian mixture model, with a hidden Markov dependence structure, and with random effects to allow for intertumoral variation, as well as intratumoral clonal variation. For ease of computation, we base estimation on a pseudolikelihood function. The method produces quantitative assessments of the likelihood of genetic alterations at each clone, along with a graphical display for simple visual interpretation. We assess the characteristics of the method through simulation studies and through analysis of a brain tumor aCGH data set. We show that the pseudolikelihood approach is superior to existing methods both in detecting small regions of copy number alteration and in accurately classifying regions of change when intratumoral clonal variation is present.
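The clone-level classification in this model hinges on a Gaussian mixture over alteration states (the hidden Markov dependence and random effects are layered on top). As a minimal, hypothetical sketch of just the mixture component, the code below fits a three-component mixture with a known, shared standard deviation by EM and produces hard loss/neutral/gain calls; the means, sigma and simulated data are assumptions for illustration.

```python
import numpy as np

def fit_mixture(x, mu, sigma, iters=100):
    """EM for a K-component Gaussian mixture with known, shared sigma.
    Returns fitted means, weights and per-point responsibilities."""
    pi = np.full(len(mu), 1.0 / len(mu))
    for _ in range(iters):
        # E-step: responsibilities (shared normalizing constant cancels)
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update means and mixing weights
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        pi = nk / len(x)
    return mu, pi, r

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-0.7, 0.15, 200),   # losses
                    rng.normal(0.0, 0.15, 200),    # neutral
                    rng.normal(0.6, 0.15, 200)])   # gains
mu, pi, r = fit_mixture(x, mu=np.array([-1.0, 0.0, 1.0]), sigma=0.15)
labels = r.argmax(axis=1)  # hard loss/neutral/gain calls
```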

Relevance: 20.00%

Publisher:

Abstract:

In most microarray technologies, a number of critical steps are required to convert raw intensity measurements into the data relied upon by data analysts, biologists and clinicians. These data manipulations, referred to as preprocessing, can influence the quality of the ultimate measurements. In the last few years, the high-throughput measurement of gene expression has been the most popular application of microarray technology. For this application, various groups have demonstrated that the use of modern statistical methodology can substantially improve accuracy and precision of gene expression measurements, relative to ad hoc procedures introduced by designers and manufacturers of the technology. Currently, other applications of microarrays are becoming more and more popular. In this paper we describe a preprocessing methodology for a technology designed for the identification of DNA sequence variants in specific genes or regions of the human genome that are associated with phenotypes of interest, such as disease. In particular, we describe methodology useful for preprocessing Affymetrix SNP chips and obtaining genotype calls with the preprocessed data. We demonstrate how our procedure improves existing approaches using data from three relatively large studies, including one in which a large number of independent calls are available. Software implementing these ideas is available in the Bioconductor oligo package.
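Preprocessing pipelines of this kind typically normalize intensities across arrays before genotype calling. One standard ingredient (used here purely as an illustration, not as the paper's exact procedure) is quantile normalization, which forces every array to share the same empirical intensity distribution; the simulated arrays below are hypothetical.

```python
import numpy as np

def quantile_normalize(X):
    """Quantile-normalize the columns (arrays) of X: each column is
    replaced by the mean sorted profile, assigned back by rank.
    Ties are broken arbitrarily in this simple sketch."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)  # rank of each probe per array
    ref = np.sort(X, axis=0).mean(axis=1)              # shared reference distribution
    return ref[ranks]

rng = np.random.default_rng(2)
# three hypothetical arrays with different scales/offsets
X = np.column_stack([rng.lognormal(7.0, 0.5, 1000),
                     1.3 * rng.lognormal(7.0, 0.5, 1000),
                     rng.lognormal(7.2, 0.5, 1000)])
Xn = quantile_normalize(X)
```

After normalization every array has exactly the same set of values, while the within-array ordering of probes is preserved, so downstream genotype calls compare like with like.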

Relevance: 20.00%

Publisher:

Abstract:

The purpose of this work was to understand microbeam radiation therapy at the ESRF in order to find the best compromise between curing tumors and sparing normal tissue, to obtain a better understanding of survival curves, and to report the method's efficiency. This method uses synchrotron-generated x-ray microbeams. Rats were implanted with 9L gliosarcomas and the tumors were diagnosed by MRI. They were irradiated 14 days after implantation by arrays of 25 μm wide microbeams in unidirectional mode, with a skin entrance dose of 625 Gy. The effect of using 200 or 100 μm center-to-center spacing between the microbeams was compared. The median survival time (post-implantation) was 40 and 67 days at 200 and 100 μm spacing, respectively. However, 72% of rats irradiated at 100 μm spacing showed abnormal clinical signs and weight patterns, whereas only 12% of rats were affected at 200 μm spacing. In parallel, histological lesions of the normal brain were found in the 100 μm series only. Although the increase in lifespan was 273% and 102% for the 100 and 200 μm series, respectively, the 200 μm spacing protocol provides better sparing of healthy tissue and may prove useful in combination with other radiation modalities or additional drugs.

Relevance: 20.00%

Publisher:

Abstract:

The problem of re-sampling spatially distributed data organized into regular or irregular grids to finer or coarser resolution is a common task in data processing. This procedure is known as 'gridding' or 're-binning'. Depending on the quantity the data represents, the gridding algorithm has to meet different requirements. For example, histogrammed physical quantities such as mass or energy have to be re-binned in order to conserve the overall integral. Moreover, if the quantity is positive definite, negative sampling values should be avoided. The gridding process requires a re-distribution of the original data set to a user-requested grid according to a distribution function. The distribution function can be determined on the basis of the given data by interpolation methods. In general, accurate interpolation with respect to multiple boundary conditions of heavily fluctuating data requires polynomial interpolation functions of second or even higher order. However, this may result in unrealistic deviations (overshoots or undershoots) of the interpolation function from the data. Accordingly, the re-sampled data may overestimate or underestimate the given data by a significant amount. The gridding algorithm presented in this work was developed in order to overcome these problems. Instead of a straightforward interpolation of the given data using high-order polynomials, a parametrized Hermitian interpolation curve was used to approximate the integrated data set. A single parameter is determined by which the user can control the behavior of the interpolation function, i.e. the amount of overshoot and undershoot. Furthermore, it is shown how the algorithm can be extended to multidimensional grids. The algorithm was compared to commonly used gridding algorithms using linear and cubic interpolation functions. It is shown that such interpolation functions may overestimate or underestimate the source data by about 10-20%, while the new algorithm can be tuned to significantly reduce these interpolation errors. The accuracy of the new algorithm was tested on a series of x-ray CT images (head and neck, lung, pelvis). The new algorithm significantly improves the accuracy of the sampled images in terms of the mean square error and a quality index introduced by Wang and Bovik (2002 IEEE Signal Process. Lett. 9 81-4).
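The core trick described above, interpolating the *integrated* data rather than the data itself, can be sketched in a few lines. The version below uses piecewise-linear interpolation of the cumulative integral, which already conserves the total integral and cannot produce negative output from non-negative input; the paper's refinement replaces this with a parametrized Hermitian curve to obtain a smoother result with controlled over- and undershoot. Function names and grids are illustrative assumptions.

```python
import numpy as np

def conservative_rebin(edges_in, values, edges_out):
    """Re-bin histogrammed data (bin averages `values` on bin `edges_in`)
    onto `edges_out`, conserving the integral over the common range:
    interpolate the cumulative integral, then difference it at the
    new edges."""
    # cumulative integral, known exactly at the source bin edges
    cum = np.concatenate([[0.0], np.cumsum(values * np.diff(edges_in))])
    # evaluate the (here: piecewise-linear) interpolant at the new edges
    cum_out = np.interp(edges_out, edges_in, cum)
    # difference back to bin averages on the target grid
    return np.diff(cum_out) / np.diff(edges_out)

edges_in = np.linspace(0.0, 1.0, 11)           # 10 source bins
values = np.array([0., 1., 5., 9., 4., 2., 2., 7., 3., 1.])
edges_out = np.linspace(0.0, 1.0, 5)           # 4 coarser target bins
rebinned = conservative_rebin(edges_in, values, edges_out)
```

Because the interpolant of the cumulative integral is monotone wherever the data are non-negative, differencing it can never produce negative bins; a monotonicity-limited Hermite spline keeps that property while smoothing the result.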

Relevance: 20.00%

Publisher:

Abstract:

Cochlear implants have been of great benefit in restoring auditory function to individuals with profound bilateral sensorineural deafness. The implants are used to directly stimulate auditory nerves and send a signal to the brain that is then interpreted as sound. This project focuses on the development of a surgical positioning tool to accurately and effectively place an array of stimulating electrodes deep within the cochlea. This will lead to improved efficiency and performance of the stimulating electrodes, reduced surgical trauma to the cochlea, and as a result, improved overall performance for the implant recipient. The positioning tool reported here consists of multiple fluidic chambers providing localized curvature control along the length of the attached silicon electrode array. The chambers consist of 200 μm inner diameter PET (polyethylene terephthalate) tubes with 4 μm wall thickness. The chambers are molded in a tapered helical configuration to correspond to the cochlear shape upon relaxation of the actuators. This ensures that the optimal electrode placement within the cochlea is retained after the positioning tool becomes dormant (for chronic implants). Actuation is achieved by injecting fluid into the PET chambers and regulating the fluidic pressure. The chambers are arranged in a stacked, overlapping design to provide fluid connectivity with the non-implantable pressure controller and allow for local curvature control of the device. The stacked tube configuration allows for localized curvature control of various areas along the length of the electrode and additional stiffening and actuating power towards the base. Curvature is affected along the entire length of a chamber and the result is cumulative in sections of multiple chambers. The actuating chambers are bonded to the back of a silicon electrode array.

Relevance: 20.00%

Publisher:

Abstract:

OBJECTIVE: The objective of this study was to evaluate the feasibility and reproducibility of high-resolution magnetic resonance imaging (MRI) and quantitative T2 mapping of the talocrural cartilage within a clinically applicable scan time, using a new dedicated ankle coil and high-field MRI. MATERIALS AND METHODS: Ten healthy volunteers (mean age 32.4 years) underwent MRI of the ankle. As morphological sequences, proton density fat-suppressed turbo spin echo (PD-FS-TSE), as a reference, was compared with 3D true fast imaging with steady-state precession (TrueFISP). Furthermore, biochemical quantitative T2 imaging was performed using a multi-echo spin-echo T2 approach. Data analysis was performed three times each by three different observers on sagittal slices, planned on the isotropic 3D-TrueFISP; cartilage thickness was assessed as a morphological parameter, and region-of-interest (ROI) evaluation was done for T2 relaxation times. Reproducibility was determined as a coefficient of variation (CV) for each volunteer, averaged as the root mean square (RMSA) and given as a percentage; statistical evaluation was done using analysis of variance. RESULTS: Cartilage thickness of the talocrural joint showed significantly higher values for the 3D-TrueFISP (ranging from 1.07 to 1.14 mm) compared with the PD-FS-TSE (ranging from 0.74 to 0.99 mm); however, both morphological sequences showed comparably good results, with RMSA of 7.1 to 8.5%. Regarding quantitative T2 mapping, measurements showed T2 relaxation times of about 54 ms with an excellent reproducibility (RMSA) ranging from 3.2 to 4.7%. CONCLUSION: In our study the assessment of cartilage thickness and T2 relaxation times could be performed with high reproducibility in a clinically realizable scan time, demonstrating new possibilities for further investigations in patient groups.
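The reproducibility statistic used above (a per-volunteer coefficient of variation, pooled as a root-mean-square average, RMSA) is straightforward to compute. A small sketch with made-up readings; the numbers below are hypothetical, not the study's data.

```python
import numpy as np

def rmsa_percent(measurements):
    """measurements: shape (n_subjects, n_repeats) of repeated readings
    (e.g. T2 times or cartilage thickness per volunteer).
    Per-subject CV = sample SD / mean; pooled as root mean square, in %."""
    m = np.asarray(measurements, dtype=float)
    cv = m.std(axis=1, ddof=1) / m.mean(axis=1)
    return 100.0 * np.sqrt(np.mean(cv ** 2))

# hypothetical T2 readings (ms): three repeated evaluations per volunteer
t2 = [[54.0, 55.2, 53.1],
      [52.4, 52.9, 51.8],
      [56.1, 54.7, 55.5]]
rmsa = rmsa_percent(t2)
```

Pooling by root mean square (rather than a plain average of CVs) weights poorly reproducible subjects more heavily, which is the conservative choice for a reproducibility claim.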

Relevance: 20.00%

Publisher:

Abstract:

Purpose: Development of an interpolation algorithm for re-sampling spatially distributed CT data with the following features: global and local integral conservation, avoidance of negative interpolation values for positively defined datasets, and the ability to control re-sampling artifacts. Method and Materials: The interpolation can be separated into two steps: first, the discrete CT data has to be continuously distributed by an analytic function considering the boundary conditions. Generally, this function is determined by piecewise interpolation. Instead of using linear or high-order polynomial interpolations, which do not fulfill all the above-mentioned features, a special form of Hermitian curve interpolation is used to solve the interpolation problem with respect to the required boundary conditions. A single parameter is determined, by which the behavior of the interpolation function is controlled. Second, the interpolated data have to be re-distributed with respect to the requested grid. Results: The new algorithm was compared with commonly used interpolation functions based on linear and second-order polynomials. It is demonstrated that these interpolation functions may over- or underestimate the source data by about 10%-20%, while the parameter of the new algorithm can be adjusted in order to significantly reduce these interpolation errors. Finally, the performance and accuracy of the algorithm was tested by re-gridding a series of x-ray CT images. Conclusion: Inaccurate sampling values may occur due to the lack of integral conservation. Re-sampling algorithms using high-order polynomial interpolation functions may result in significant artifacts in the re-sampled data. Such artifacts can be avoided by using the new algorithm based on Hermitian curve interpolation.