968 results for Sheets normalization
Abstract:
[EN] Acute hypoxia (AH) reduces maximal O2 consumption (VO2 max), but after acclimatization, and despite increases in both hemoglobin concentration and arterial O2 saturation that can normalize arterial O2 concentration ([O2]), VO2 max remains low. To determine why, seven lowlanders were studied at VO2 max (cycle ergometry) at sea level (SL), after 9-10 wk at 5,260 m [chronic hypoxia (CH)], and 6 mo later at SL in AH (FiO2 = 0.105) equivalent to 5,260 m. Pulmonary and leg indexes of O2 transport were measured in each condition. Both cardiac output and leg blood flow were reduced by approximately 15% in both AH and CH (P < 0.05). At maximal exercise, arterial [O2] in AH was 31% lower than at SL (P < 0.05), whereas in CH it was the same as at SL due to both polycythemia and hyperventilation. O2 extraction by the legs, however, remained at SL values in both AH and CH. Although at both SL and in AH, 76% of the cardiac output perfused the legs, in CH the legs received only 67%. Pulmonary VO2 max (4.1 +/- 0.3 l/min at SL) fell to 2.2 +/- 0.1 l/min in AH (P < 0.05) and was only 2.4 +/- 0.2 l/min in CH (P < 0.05). These data suggest that the failure to recover VO2 max after acclimatization despite normalization of arterial [O2] is explained by two circulatory effects of altitude: 1) failure of cardiac output to normalize and 2) preferential redistribution of cardiac output to nonexercising tissues. Oxygen transport from blood to muscle mitochondria, on the other hand, appears unaffected by CH.
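For context, the O2-transport reasoning in this abstract rests on two standard relations of textbook physiology that are not stated in the abstract itself: the Fick principle and the arterial O2 content equation (with [Hb] in g/dl, SaO2 as a fraction, and PaO2 in mmHg).

```latex
% Fick principle: whole-body O2 uptake equals cardiac output times the
% arteriovenous O2 content difference.
\dot{V}\mathrm{O}_2 = \dot{Q}\left( C_{a\mathrm{O}_2} - C_{\bar{v}\mathrm{O}_2} \right)

% Arterial O2 content: hemoglobin-bound plus dissolved O2 (ml O2 per dl blood).
C_{a\mathrm{O}_2} = 1.34\,[\mathrm{Hb}]\,S_{a\mathrm{O}_2} + 0.003\,P_{a\mathrm{O}_2}
```

Acclimatization can restore CaO2 through higher [Hb] and SaO2, but with cardiac output reduced and a smaller fraction of it directed to the legs, the product in the Fick relation, and hence VO2 max, remains low.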
Abstract:
Phosphatidylethanol (PEth) is a direct ethanol metabolite that has recently attracted attention as a biomarker of ethanol intake. The aims of the current study are: (1) to characterize the normalization time of PEth in a larger sample than previously studied; (2) to elucidate potential gender differences; and (3) to report the correlation of PEth with other biomarkers and self-reported alcohol consumption. Fifty-seven alcohol-dependent patients (ICD-10 F10.25; 9 females, 48 males) entering medical detoxification at three study sites were enrolled, with a mean age of 43.5 years. Mean gamma-glutamyl transpeptidase (GGT) was 209.61 U/l, mean corpuscular volume (MCV) averaged 97.35 fl, mean carbohydrate-deficient transferrin (%CDT) was 8.68, and mean total ethanol intake in the last 7 days was 1653 g. PEth was measured in heparinized whole blood with a high-pressure liquid chromatography method, while GGT, MCV and %CDT were measured using routine methods. PEth levels at day 1 of detoxification ranged between 0.63 and 26.95 micromol/l (mean 6.22, median 4.70, SD 4.97). There were no false negatives at day 1. Sensitivities for the other biomarkers were 40.4% for MCV, 73.1% for GGT and 69.2% for %CDT. No gender differences were found for PEth levels at any time point. Our data suggest that PEth (1) is a suitable intermediate-term marker of ethanol intake in both sexes and (2) is extraordinarily sensitive in alcohol-dependent patients. The results add further evidence that PEth is a candidate for a sensitive and specific biomarker reflecting longer-lasting intake of higher amounts of alcohol, with the advantages over traditional biomarkers noted above.
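A minimal sketch of how sensitivity is computed in a known-positive cohort such as this one; the cutoff and values below are illustrative, not taken from the study.

```python
def sensitivity(values, cutoff):
    """Fraction of known-positive subjects whose marker value meets the cutoff."""
    flagged = sum(1 for v in values if v >= cutoff)
    return flagged / len(values)

# Hypothetical day-1 PEth readings (micromol/l); the study reports a range of
# 0.63-26.95 micromol/l with no false negatives, i.e. sensitivity 1.0.
peth_day1 = [0.63, 2.1, 4.7, 6.2, 11.8, 26.95]
print(sensitivity(peth_day1, cutoff=0.3))  # -> 1.0 with this illustrative cutoff
```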
Abstract:
In most microarray technologies, a number of critical steps are required to convert raw intensity measurements into the data relied upon by data analysts, biologists and clinicians. These data manipulations, referred to as preprocessing, can influence the quality of the ultimate measurements. In the last few years, the high-throughput measurement of gene expression has been the most popular application of microarray technology. For this application, various groups have demonstrated that the use of modern statistical methodology can substantially improve the accuracy and precision of gene expression measurements, relative to ad hoc procedures introduced by designers and manufacturers of the technology. Currently, other applications of microarrays are becoming more and more popular. In this paper we describe a preprocessing methodology for a technology designed for the identification of DNA sequence variants in specific genes or regions of the human genome that are associated with phenotypes of interest such as disease. In particular, we describe methodology useful for preprocessing Affymetrix SNP chips and obtaining genotype calls from the preprocessed data. We demonstrate how our procedure improves existing approaches using data from three relatively large studies, including one in which a large number of independent calls are available. Software implementing these ideas is available in the Bioconductor oligo package.
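As a rough illustration of what genotype calling from preprocessed allele intensities involves (a deliberate simplification: the paper's methodology and the Bioconductor oligo package fit a statistical model across samples, whereas the fixed threshold below is hypothetical):

```python
import math

def call_genotype(intensity_a, intensity_b, delta=1.0):
    """Simplified genotype call from preprocessed allele-A / allele-B
    intensities: the log-ratio separates samples into AA, AB and BB clusters."""
    m = math.log2(intensity_a) - math.log2(intensity_b)  # allele contrast
    if m > delta:
        return "AA"
    if m < -delta:
        return "BB"
    return "AB"

print(call_genotype(1200.0, 150.0))  # -> "AA"
print(call_genotype(400.0, 380.0))   # -> "AB"
```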
Abstract:
The ability to measure gene expression on a genome-wide scale is one of the most promising accomplishments in molecular biology. Microarrays, the technology that first permitted this, were riddled with problems due to unwanted sources of variability. Many of these problems are now mitigated, after a decade's worth of statistical methodology development. The recently developed RNA sequencing (RNA-seq) technology has generated much excitement, in part due to claims of reduced variability in comparison to microarrays. However, we show that RNA-seq data demonstrate unwanted and obscuring variability similar to what was first observed in microarrays. In particular, we find that GC-content has a strong sample-specific effect on gene expression measurements that, if left uncorrected, leads to false positives in downstream results. We also report on commonly observed data distortions that demonstrate the need for data normalization. Here we describe statistical methodology that improves precision by 42% without loss of accuracy. Our resulting conditional quantile normalization (CQN) algorithm combines robust generalized regression, to remove systematic bias introduced by deterministic features such as GC-content, with quantile normalization, to correct for global distortions.
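A minimal sketch of the quantile-normalization component mentioned above. This is not the full CQN algorithm, which additionally removes GC-content effects with a robust regression; the matrix values are made up.

```python
import numpy as np

def quantile_normalize(x):
    """Force each sample (column) of a genes x samples matrix onto the common
    reference distribution given by the row-wise mean of the sorted columns."""
    order = np.argsort(x, axis=0)               # per-sample ranks
    reference = np.sort(x, axis=0).mean(axis=1)  # common target distribution
    out = np.empty_like(x, dtype=float)
    for j in range(x.shape[1]):
        out[order[:, j], j] = reference          # map each rank back to the reference
    return out

expr = np.array([[5.0, 4.0, 3.0],
                 [2.0, 1.0, 4.0],
                 [3.0, 4.0, 6.0],
                 [4.0, 2.0, 8.0]])
print(quantile_normalize(expr))
```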
Abstract:
Small clusters of gallium oxide, a technologically important high-temperature ceramic, together with the interaction of nucleic acid bases with graphene and a small-diameter carbon nanotube, are the focus of the first-principles calculations in this work. A high-performance parallel computing platform was also developed to perform these calculations at Michigan Tech. The first-principles calculations are based on density functional theory, employing either the local density or the gradient-corrected approximation together with plane-wave and Gaussian basis sets. Bulk Ga2O3 is known to be a very good candidate for fabricating electronic devices that operate at high temperatures. To explore the properties of Ga2O3 at the nanoscale, we have performed a systematic theoretical study of small polyatomic gallium oxide clusters. The calculations find that all lowest-energy isomers of GamOn clusters are dominated by Ga-O bonds over metal-metal or oxygen-oxygen bonds. Analysis of atomic charges suggests that the clusters are highly ionic, similar to the case of bulk Ga2O3. In the study of the sequential oxidation of these clusters starting from Ga2O, it is found that the most stable isomers display up to four different backbones of constituent atoms. Furthermore, the predicted configuration of the ground state of Ga2O was recently confirmed by the experimental results of Neumark's group. Guided by the results of the gallium oxide cluster calculations, the performance-related challenge of such computational simulations, namely producing high-performance computing platforms, has been addressed. Several engineering aspects were thoroughly studied during the design, development and implementation of the high-performance parallel computing platform, rama, at Michigan Tech. In an attempt to stay true to the principles of the Beowulf revolution, the rama cluster was extensively customized to make it easy to understand and use, for administrators as well as end users. Following the results of benchmark calculations, and to keep up with the complexity of the systems under study, rama has been expanded to a total of sixty-four processors. Interest in the non-covalent interaction of DNA with carbon nanotubes has steadily increased during the past several years. This hybrid system, at the junction of the biological regime and the nanomaterials world, possesses features which make it very attractive for a wide range of applications. Using the in-house computational power available, we have studied the details of the interaction of nucleic acid bases with a graphene sheet as well as with a high-curvature, small-diameter carbon nanotube. The calculated trend in the binding energies strongly suggests that the polarizability of the base molecules determines the interaction strength of the nucleic acid bases with graphene. When comparing the results obtained here for physisorption on the small-diameter nanotube with those from the study on graphene, it is observed that the interaction strength of the nucleic acid bases is smaller for the tube. Thus, these results show that the effect of introducing curvature is to reduce the binding energy. The binding energies for the two extreme cases of negligible curvature (i.e., a flat graphene sheet) and of very high curvature (i.e., a small-diameter nanotube) may be considered as upper and lower bounds. This finding represents an important step towards a better understanding of the experimentally observed sequence-dependent interaction of DNA with carbon nanotubes.
Abstract:
Quantitative reverse transcriptase real-time PCR (QRT-PCR) is a robust method to quantitate RNA abundance. The procedure is highly sensitive and reproducible as long as the initial RNA is intact. However, breaks in the RNA due to chemical or enzymatic cleavage may reduce the number of RNA molecules that contain intact amplicons. As a consequence, the number of molecules available for amplification decreases. We determined the relation between RNA fragmentation and cycle threshold (Ct) values in subsequent QRT-PCR for four genes in an experimental model of intact and partially hydrolyzed RNA derived from a cell line, and we describe the relation between RNA integrity, amplicon size and Ct values in this biologically homogeneous system. We demonstrate that degradation-related shifts of Ct values can be compensated by calculating delta Ct values between test genes and the mean values of several control genes. These delta Ct values are less sensitive to fragmentation of the RNA and are unaffected by varying amounts of input RNA. The feasibility of the procedure was demonstrated by comparing Ct values from a larger panel of genes in intact and in partially degraded RNA. We compared Ct values from intact RNA derived from well-preserved tumor material and from fragmented RNA derived from formalin-fixed, paraffin-embedded (FFPE) samples of the same tumors. We demonstrate that the relative abundance of gene expression can be determined from FFPE material even when the amount of RNA in the sample and the extent of fragmentation are not known.
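A minimal sketch of the delta Ct compensation described above: each test gene is referenced to the mean Ct of several control genes from the same sample, so a shift common to all genes (from fragmentation or varying input amount) cancels out. The gene values below are illustrative only.

```python
def delta_ct(test_ct, control_cts):
    """Ct of the test gene minus the mean Ct of the control genes."""
    return test_ct - sum(control_cts) / len(control_cts)

# Intact vs. partially degraded RNA from the same sample: all Ct values shift
# upward with degradation, but the delta Ct stays nearly constant.
print(delta_ct(24.0, [18.2, 19.1, 18.7]))   # intact:   ~5.3
print(delta_ct(27.1, [21.3, 22.0, 21.8]))   # degraded: ~5.4
```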
Abstract:
Methods for optical motion capture often require time-consuming manual processing before the data can be used for subsequent tasks such as retargeting or character animation. These processing steps restrict the applicability of motion capture, especially for dynamic VR environments with real-time requirements. To solve these problems, we present two additional, fast and automatic processing stages based on our motion capture pipeline presented in [HSK05]. A normalization step aligns the recorded coordinate systems with the skeleton structure to yield a common and intuitive data basis across different recording sessions. A second step computes a parameterization based on automatically extracted main movement axes to generate a compact motion description. Our method restricts neither the placement of marker bodies nor the recording setup, and only requires a short calibration phase.
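One plausible reading of "automatically extracted main movement axes" is a principal component analysis of the recorded trajectories; the sketch below illustrates that idea under this assumption and is not necessarily the parameterization used in the paper.

```python
import numpy as np

def main_movement_axes(positions):
    """positions: frames x 3 trajectory; returns axes (as columns) sorted by
    the variance of the motion along them."""
    centered = positions - positions.mean(axis=0)
    cov = centered.T @ centered / len(centered)
    eigvals, eigvecs = np.linalg.eigh(cov)        # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]
    return eigvecs[:, order], eigvals[order]

# Synthetic trajectory that mostly moves along x: the first axis is ~(1, 0, 0).
t = np.linspace(0.0, 1.0, 100)
trajectory = np.column_stack([t, 0.1 * np.sin(6.0 * t), 0.01 * t])
axes, variances = main_movement_axes(trajectory)
print(axes[:, 0], variances)
```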
Abstract:
Given arbitrary pictures, we explore the possibility of using new techniques from computer vision and artificial intelligence to create customized visual games on-the-fly. These include popular games such as coloring books, link-the-dots and spot-the-difference. The feasibility of these systems is discussed, and we describe prototype implementations that work well in practice in an automatic or semi-automatic way.
Abstract:
Self-assembly is a powerful tool for the construction of highly organized nanostructures [1]. Therefore, the possibility to control and predict pathways of molecular ordering at the nanoscale level is a critical issue for the production of materials with tunable and adaptive macroscopic properties. Herein, we demonstrate that the designed molecule Py3 forms dimensionally defined supramolecular assemblies under thermodynamic conditions in water [2]. To study Py3 self-assembly, we carried out a whole set of spectroscopic and microscopic experiments. The factors influencing the stability, morphology and behavior of the "nanosheets" in multicomponent systems are discussed.
Abstract:
Disc degeneration, usually associated with low back pain and changes of intervertebral stiffness, represents a major health issue. As the intervertebral disc (IVD) morphology influences its stiffness, the link between mechanical properties and degenerative grade is partially lost without an efficient normalization of the stiffness with respect to the morphology. Moreover, although the behavior of soft tissues is highly nonlinear, only linear normalization protocols have been defined so far for the disc stiffness. Thus, the aim of this work is to propose a nonlinear normalization based on finite element (FE) simulations and evaluate its impact on the stiffness of human anatomical specimens of lumbar IVDs. First, a parameter study involving simulations of biomechanical tests (compression, flexion/extension, bilateral torsion and bending) on 20 FE models of IVDs with various dimensions was carried out to evaluate the effect of the disc's geometry on its compliance and establish the stiffness/morphology relations necessary for the nonlinear normalization. The computed stiffness was then normalized by height (H), cross-sectional area (CSA), polar moment of inertia (J) or moments of inertia (Ixx, Iyy) to quantify the effect of both linear and nonlinear normalizations. In the second part of the study, T1-weighted MRI images were acquired to determine H, CSA, J, Ixx and Iyy of 14 human lumbar IVDs. Based on the measured morphology and pre-established relations with stiffness, linear and nonlinear normalization routines were then applied to the compliance of the specimens for each quasi-static biomechanical test. The variability of the stiffness prior to and after normalization was assessed via the coefficient of variation (CV). The FE study confirmed that larger and thinner IVDs were stiffer, while the normalization strongly attenuated the effect of the disc geometry on its stiffness. Yet, notwithstanding the results of the FE study, the experimental stiffness showed consistently higher CV after normalization. Since both geometry and material properties affect the mechanical response, they can also compensate for one another. Therefore, the larger CV after normalization can be interpreted as a strong variability of the material properties, previously hidden by the geometry's own influence. In conclusion, a new normalization protocol for the intervertebral disc stiffness in compression, flexion, extension, bilateral torsion and bending was proposed, with the possible use of MRI and FE to acquire the discs' anatomy and determine the nonlinear relations between stiffness and morphology. Such a protocol may be useful to relate the disc's mechanical properties to its degree of degeneration.
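A small sketch of the variability metric used above: the coefficient of variation of specimen stiffness before and after dividing out a morphology factor. The linear compression normalization k·H/CSA and all numbers below are illustrative assumptions, not the study's data or its nonlinear FE-based relations.

```python
import numpy as np

def coefficient_of_variation(values):
    """CV = sample standard deviation / mean."""
    values = np.asarray(values, dtype=float)
    return values.std(ddof=1) / values.mean()

stiffness = np.array([1800.0, 2400.0, 2100.0, 2900.0])  # N/mm, hypothetical
height    = np.array([9.0, 11.0, 10.0, 12.0])           # disc height H, mm
csa       = np.array([1500.0, 1900.0, 1700.0, 2200.0])  # cross-sectional area, mm^2

normalized = stiffness * height / csa                    # example linear normalization
print(coefficient_of_variation(stiffness))
print(coefficient_of_variation(normalized))
```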
Abstract:
Thermal convection in the Antarctic and Greenland ice sheets has been dismissed on the grounds that radio-echo stratigraphy is undisturbed for long distances. However, the undisturbed stratigraphy lies, for the most part, above the density inversion in polar ice sheets and therefore does not disprove convection. An echo-free zone is widespread below the density inversion, yet nobody has cited this as a strong indication that convection is indeed present at depth. A generalized Rayleigh criterion for thermal convection in elastic-viscoplastic polycrystalline solids heated from below is developed and applied to ice-sheet convection. An infinite Rayleigh number at the onset of primary creep decreases with time and becomes constant when secondary creep dominates, suggesting that any thermal buoyancy stress can initiate convection but convection cannot be sustained below a buoyancy stress of about 3 kPa. An analysis of the temperature profile down the Byrd Station core hole suggests that about 1000 m of ice below the density inversion will sustain convection. Creep along the Byrd Station strain network, radar sounding in East Antarctica, and seismic sounding in West Antarctica are examined for evidence of convective creep superimposed on advective creep. It is concluded that the evidence for convection is there, if we look for it with the intention of finding it.
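For reference, the classical Rayleigh number for a fluid layer of thickness d heated from below is given here as the familiar starting point; the generalized criterion developed in this work replaces the constant Newtonian viscosity with a time-dependent effective viscosity for primary and secondary creep, so this expression is only background, not the paper's criterion.

```latex
\mathrm{Ra} = \frac{g\,\alpha\,\Delta T\,d^{3}}{\nu\,\kappa}
% g: gravitational acceleration, \alpha: thermal expansivity,
% \Delta T: temperature difference across the layer, d: layer thickness,
% \nu: kinematic viscosity, \kappa: thermal diffusivity.
% Convection sets in when Ra exceeds a critical value of order 10^3.
```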