Abstract:
As a noninvasive method, exhaled breath condensate (EBC) has gained importance for improving the monitoring of lung diseases and for detecting biomarkers. The aim of the study was to investigate whether erythropoietin (EPO) is detectable in EBC. EBC was collected from 22 consecutive patients as well as from healthy individuals. Using a multiplex fluorescent bead immunoassay, we detected EPO together with tumour necrosis factor-alpha (TNF-alpha) in the EBC of 13 out of 22 patients (EPO 0.21 +/- 0.03 U/mL and TNF-alpha 34.6 +/- 4.2 pg/mL, mean +/- SEM). No significant differences in EPO levels and no correlation between EPO and TNF-alpha were found, but TNF-alpha was significantly higher in patients with chronic obstructive pulmonary disease (COPD) than in non-COPD patients (obstructive sleep apnoea (OSA) and lung-healthy patients). This is the first report of the detection of EPO in EBC. Given the small study size, more data are needed to clarify the role of EPO in EBC.
Abstract:
Genomic alterations have been linked to the development and progression of cancer. Comparative Genomic Hybridization (CGH) yields data consisting of fluorescence intensity ratios of test and reference DNA samples; the intensity ratios provide information about DNA copy number. Practical issues such as the contamination of tumor cells in tissue specimens and normalization errors necessitate the use of statistics for learning about genomic alterations from array-CGH data. As increasing amounts of array-CGH data become available, there is a growing need for automated algorithms for characterizing genomic profiles. Specifically, there is a need for algorithms that can identify copy-number gains and losses on statistical grounds, rather than merely detect trends in the data. We adopt a Bayesian approach, relying on a hidden Markov model to account for the inherent dependence in the intensity ratios. Posterior inferences are made about gains and losses in copy number. Localized amplifications (associated with oncogene mutations) and deletions (associated with mutations of tumor suppressors) are identified using posterior probabilities. Global trends, such as extended regions of altered copy number, are also detected. Since the posterior distribution is analytically intractable, we implement a Metropolis-within-Gibbs algorithm for efficient simulation-based inference. Publicly available data on pancreatic adenocarcinoma, glioblastoma multiforme, and breast cancer are analyzed, and comparisons are made with some widely used algorithms to illustrate the reliability and success of the technique.
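The abstract's full Metropolis-within-Gibbs sampler is beyond a short example, but the hidden-Markov backbone it builds on can be sketched with a fixed-parameter forward-backward pass over log2 intensity ratios. The three states (loss / neutral / gain), emission means, noise level, and transition matrix below are illustrative assumptions, not the paper's estimated values:

```python
import numpy as np

# Illustrative parameters (assumed, not estimated from data):
# state 0 = loss, state 1 = copy-neutral, state 2 = gain.
MEANS = np.array([-0.5, 0.0, 0.5])   # emission means of log2 ratios
SIGMA = 0.15                          # common emission noise
# Transitions strongly favour persistence, encoding dependence along the genome.
TRANS = np.array([[0.98, 0.01, 0.01],
                  [0.01, 0.98, 0.01],
                  [0.01, 0.01, 0.98]])
INIT = np.full(3, 1.0 / 3.0)

def posterior_states(ratios):
    """Forward-backward: posterior P(state | all ratios) at each marker."""
    n, k = len(ratios), len(MEANS)
    # Unnormalized Gaussian emission weights (constants cancel after scaling).
    emit = np.exp(-0.5 * ((ratios[:, None] - MEANS) / SIGMA) ** 2)
    fwd = np.zeros((n, k))
    bwd = np.zeros((n, k))
    fwd[0] = INIT * emit[0]
    fwd[0] /= fwd[0].sum()
    for t in range(1, n):                      # forward pass, rescaled
        fwd[t] = (fwd[t - 1] @ TRANS) * emit[t]
        fwd[t] /= fwd[t].sum()
    bwd[-1] = 1.0
    for t in range(n - 2, -1, -1):             # backward pass, rescaled
        bwd[t] = TRANS @ (emit[t + 1] * bwd[t + 1])
        bwd[t] /= bwd[t].sum()
    post = fwd * bwd
    return post / post.sum(axis=1, keepdims=True)

# Toy profile: a short run of elevated log2 ratios flanked by neutral markers.
ratios = np.array([0.0, 0.05, -0.02, 0.55, 0.48, 0.6, 0.01, -0.03])
post = posterior_states(ratios)
calls = post.argmax(axis=1)   # 1 = neutral, 2 = gain
```

The posterior probabilities, rather than the hard `argmax` calls, are what a Bayesian analysis would report; the full method also samples the parameters instead of fixing them.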
Abstract:
DNA sequence copy number has been shown to be associated with cancer development and progression. Array-based Comparative Genomic Hybridization (aCGH) is a recent development that seeks to identify the copy number ratio at large numbers of markers across the genome. Due to experimental and biological variations across chromosomes and across hybridizations, current methods are limited to analyses of single chromosomes. We propose a more powerful approach that borrows strength across chromosomes and across hybridizations. We assume a Gaussian mixture model, with a hidden Markov dependence structure, and with random effects to allow for intertumoral variation, as well as intratumoral clonal variation. For ease of computation, we base estimation on a pseudolikelihood function. The method produces quantitative assessments of the likelihood of genetic alterations at each clone, along with a graphical display for simple visual interpretation. We assess the characteristics of the method through simulation studies and through analysis of a brain tumor aCGH data set. We show that the pseudolikelihood approach is superior to existing methods both in detecting small regions of copy number alteration and in accurately classifying regions of change when intratumoral clonal variation is present.
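The pseudolikelihood idea, replacing an intractable joint likelihood with a product of simpler pairwise terms for tractable estimation, can be illustrated in miniature. This toy sketch fits only the state-persistence parameter of a two-state Gaussian-emission chain and omits the paper's random effects for intertumoral and intratumoral clonal variation; all numeric settings are assumptions:

```python
import numpy as np
from math import log, pi, sqrt, exp

# Assumed two-state model: copy-neutral vs. altered, on log-ratio scale.
MEANS = (0.0, 0.6)
SIGMA = 0.2

def gauss(y, mu):
    """Gaussian emission density N(y; mu, SIGMA^2)."""
    return exp(-0.5 * ((y - mu) / SIGMA) ** 2) / (SIGMA * sqrt(2.0 * pi))

def pairwise_pseudolikelihood(y, stay):
    """Log pseudolikelihood: product of consecutive-pair marginals
    P(y[t-1], y[t]) under a symmetric 2-state chain with stationary
    distribution (0.5, 0.5) and persistence probability `stay`."""
    trans = ((stay, 1.0 - stay), (1.0 - stay, stay))
    ll = 0.0
    for t in range(1, len(y)):
        pair = sum(0.5 * trans[i][j] * gauss(y[t - 1], MEANS[i]) * gauss(y[t], MEANS[j])
                   for i in range(2) for j in range(2))
        ll += log(pair)
    return ll

# Toy clone sequence: two neutral, three altered, one neutral marker.
y = np.array([0.02, -0.05, 0.63, 0.58, 0.61, 0.04])
# Grid-maximize the pseudolikelihood over the persistence parameter.
best = max(np.linspace(0.5, 0.99, 50), key=lambda s: pairwise_pseudolikelihood(y, s))
```

With 3 same-state pairs and 2 state changes among the 5 consecutive pairs, the maximizer lands near 3/5, illustrating how the pairwise terms alone pin down the dependence parameter.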
Abstract:
In most microarray technologies, a number of critical steps are required to convert raw intensity measurements into the data relied upon by data analysts, biologists, and clinicians. These data manipulations, referred to as preprocessing, can influence the quality of the ultimate measurements. In the last few years, high-throughput measurement of gene expression has been the most popular application of microarray technology. For this application, various groups have demonstrated that the use of modern statistical methodology can substantially improve the accuracy and precision of gene expression measurements, relative to ad hoc procedures introduced by designers and manufacturers of the technology. Currently, other applications of microarrays are becoming more and more popular. In this paper we describe a preprocessing methodology for a technology designed to identify DNA sequence variants, in specific genes or regions of the human genome, that are associated with phenotypes of interest such as disease. In particular, we describe methodology for preprocessing Affymetrix SNP chips and for obtaining genotype calls from the preprocessed data. We demonstrate how our procedure improves on existing approaches using data from three relatively large studies, including one in which a large number of independent calls are available. Software implementing these ideas is available in the Bioconductor oligo package.
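As a hedged illustration of the genotype-calling step (this is not the algorithm implemented in the oligo package), one can summarize preprocessed allele intensities as a log2 contrast and assign each SNP to the nearest genotype cluster center; the cluster centers below are hypothetical:

```python
import numpy as np

# Hypothetical cluster centers on the log2 allele-contrast scale:
# homozygous AA sits near +1, heterozygous AB near 0, homozygous BB near -1.
# Real pipelines estimate such centers per SNP from training data.
CENTERS = {"AA": 1.0, "AB": 0.0, "BB": -1.0}

def call_genotype(a_intensity, b_intensity):
    """Nearest-center genotype call on the log2 contrast of the two
    preprocessed allele intensities."""
    contrast = np.log2(a_intensity / b_intensity)
    return min(CENTERS, key=lambda g: abs(contrast - CENTERS[g]))

# Three toy SNPs: A-dominant, balanced, and B-dominant intensities.
calls = [call_genotype(a, b)
         for a, b in [(2000, 1000), (1500, 1480), (600, 1250)]]
```

A production caller would also report a confidence for each call (e.g. from the distance to the nearest center or a posterior probability), which is what allows quality filtering downstream.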
Abstract:
The purpose of this work was to improve the understanding of microbeam radiation therapy at the ESRF: to find the best compromise between tumor control and normal-tissue sparing, to obtain a better understanding of survival curves, and to report the method's efficiency. This method uses synchrotron-generated x-ray microbeams. Rats were implanted with 9L gliosarcomas and the tumors were diagnosed by MRI. They were irradiated 14 days after implantation by arrays of 25 μm-wide microbeams in unidirectional mode, with a skin entrance dose of 625 Gy. The effect of using 200 or 100 μm center-to-center spacing between the microbeams was compared. The median survival time (post-implantation) was 40 and 67 days at 200 and 100 μm spacing, respectively. However, 72% of rats irradiated at 100 μm spacing showed abnormal clinical signs and weight patterns, whereas only 12% of rats were affected at 200 μm spacing. In parallel, histological lesions of the normal brain were found in the 100 μm series only. Although the increase in lifespan was 273% for the 100 μm series versus 102% for the 200 μm series, the 200 μm spacing protocol provides better sparing of healthy tissue and may prove useful in combination with other radiation modalities or additional drugs.
Abstract:
Cochlear implants have been of great benefit in restoring auditory function to individuals with profound bilateral sensorineural deafness. The implants directly stimulate the auditory nerve, sending a signal to the brain that is then interpreted as sound. This project focuses on the development of a surgical positioning tool to accurately and effectively place an array of stimulating electrodes deep within the cochlea. This will lead to improved efficiency and performance of the stimulating electrodes, reduced surgical trauma to the cochlea, and, as a result, improved overall performance for the implant recipient. The positioning tool reported here consists of multiple fluidic chambers providing localized curvature control along the length of the attached silicon electrode array. The chambers consist of 200 μm inner diameter PET (polyethylene terephthalate) tubes with 4 μm wall thickness. The chambers are molded in a tapered helical configuration so that they correspond to the cochlear shape upon relaxation of the actuators. This ensures that the optimal electrode placement within the cochlea is retained after the positioning tool becomes dormant (for chronic implants). Actuation is achieved by injecting fluid into the PET chambers and regulating the fluidic pressure. The chambers are arranged in a stacked, overlapping design to provide fluid connectivity with the non-implantable pressure controller and to allow local curvature control of the device. The stacked tube configuration allows localized curvature control of various areas along the length of the electrode, with additional stiffening and actuating power towards the base. Curvature is affected along the entire length of a chamber, and the effect is cumulative in sections spanned by multiple chambers. The actuating chambers are bonded to the back of a silicon electrode array.
Abstract:
OBJECTIVE: The objective of this study was to evaluate the feasibility and reproducibility of high-resolution magnetic resonance imaging (MRI) and quantitative T2 mapping of the talocrural cartilage, within a clinically applicable scan time, using a new dedicated ankle coil and high-field MRI. MATERIALS AND METHODS: Ten healthy volunteers (mean age 32.4 years) underwent MRI of the ankle. As morphological sequences, proton density fat-suppressed turbo spin echo (PD-FS-TSE), as a reference, was compared with 3D true fast imaging with steady-state precession (TrueFISP). Furthermore, biochemical quantitative T2 imaging was performed using a multi-echo spin-echo T2 approach. Data analysis was performed three times each by three different observers on sagittal slices planned on the isotropic 3D-TrueFISP; as a morphological parameter, cartilage thickness was assessed, and for T2 relaxation times, region-of-interest (ROI) evaluation was done. Reproducibility was determined as a coefficient of variation (CV) for each volunteer, averaged as a root mean square (RMSA) and given as a percentage; statistical evaluation used analysis of variance. RESULTS: Cartilage thickness of the talocrural joint showed significantly higher values for the 3D-TrueFISP (ranging from 1.07 to 1.14 mm) than for the PD-FS-TSE (ranging from 0.74 to 0.99 mm); however, both morphological sequences showed comparably good results, with RMSA values of 7.1 to 8.5%. Regarding quantitative T2 mapping, measurements showed T2 relaxation times of about 54 ms with excellent reproducibility (RMSA ranging from 3.2 to 4.7%). CONCLUSION: In our study, the assessment of cartilage thickness and T2 relaxation times could be performed with high reproducibility within a clinically realizable scan time, opening new possibilities for further investigations in patient groups.
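The reproducibility summary described, a CV per volunteer averaged as a root mean square and expressed as a percentage, can be sketched as follows; the repeated T2 readings below are hypothetical, not the study's data:

```python
import numpy as np

def cv_percent(measurements):
    """Coefficient of variation of repeated measurements, in percent
    (sample standard deviation divided by the mean)."""
    m = np.asarray(measurements, dtype=float)
    return 100.0 * m.std(ddof=1) / m.mean()

def rmsa(cvs):
    """Root-mean-square average of the per-volunteer CVs."""
    c = np.asarray(cvs, dtype=float)
    return float(np.sqrt(np.mean(c ** 2)))

# Hypothetical repeated T2 readings (ms) for three volunteers.
t2 = [[54.1, 53.2, 55.0],
      [52.8, 54.6, 53.9],
      [55.2, 54.1, 53.5]]
per_volunteer = [cv_percent(v) for v in t2]
overall = rmsa(per_volunteer)
```

The RMS average is used rather than a plain mean so that volunteers with poor repeatability are not washed out by those with very good repeatability.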
Abstract:
Metallic nanocups provide a unique method for redirecting scattered light by creating magnetic plasmon responses at optical frequencies. Despite considerable development of nanocup fabrication processes, simultaneously achieving accurate control over the placement, orientation, and geometry of nanocups has remained a significant challenge. Here we present a technique for fabricating large, periodically ordered arrays of uniformly oriented three-dimensional gold nanocups for manipulating light at subwavelength scales. Nanoimprint lithography, soft lithography, and shadow evaporation were used to fabricate nanocups onto the tips of polydimethylsiloxane nanopillars with precise control over the shapes and optical properties of asymmetric nanocups.
Abstract:
Many applications, such as telepresence, virtual reality, and interactive walkthroughs, require a three-dimensional (3D) model of real-world environments. Methods such as light fields, geometric reconstruction, and computer vision use cameras to acquire visual samples of the environment and construct a model. Unfortunately, obtaining models of real-world locations is a challenging task. In particular, important environments are often actively in use, containing moving objects such as people entering and leaving the scene. The methods listed above have difficulty capturing the color and structure of the environment in the presence of moving and temporary occluders. We describe a class of cameras called lag cameras. The main concept is to generalize a camera to take samples over space and time. Such a camera can easily and interactively detect moving objects while continuously moving through the environment. Moreover, since both the lag camera and the occluder are moving, the scene behind the occluder is captured by the lag camera even from viewpoints where the occluder lies between the lag camera and the hidden scene. We demonstrate an implementation of a lag camera, complete with analysis and captured environments.
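The underlying principle, that samples taken at different times let the static scene be recovered behind a moving occluder, can be illustrated (this is not the authors' algorithm) with a per-pixel temporal median over a toy frame stack:

```python
import numpy as np

# Toy scene: a uniform gray background (value 100) observed in 5 frames,
# with a dark occluder (value 0) sweeping one column per frame. Each pixel
# is unoccluded in the majority of frames, so the per-pixel temporal
# median recovers the static background everywhere.
frames = np.full((5, 4, 4), 100, dtype=np.uint8)
for t in range(5):
    frames[t, :, t % 4] = 0   # occluder covers a different column each frame

background = np.median(frames, axis=0).astype(np.uint8)
```

A real lag camera moves while it samples, so corresponding pixels must first be registered across viewpoints; the median (or another robust statistic) then rejects the transient occluder values exactly as in this fixed-viewpoint toy.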