964 results for swd: Normalization
Abstract:
Acute hypoxia (AH) reduces maximal O2 consumption (VO2 max), but after acclimatization, and despite increases in both hemoglobin concentration and arterial O2 saturation that can normalize arterial O2 concentration ([O2]), VO2 max remains low. To determine why, seven lowlanders were studied at VO2 max (cycle ergometry) at sea level (SL), after 9-10 wk at 5,260 m [chronic hypoxia (CH)], and 6 mo later at SL in AH (FiO2 = 0.105) equivalent to 5,260 m. Pulmonary and leg indexes of O2 transport were measured in each condition. Both cardiac output and leg blood flow were reduced by approximately 15% in both AH and CH (P < 0.05). At maximal exercise, arterial [O2] in AH was 31% lower than at SL (P < 0.05), whereas in CH it was the same as at SL due to both polycythemia and hyperventilation. O2 extraction by the legs, however, remained at SL values in both AH and CH. Although at both SL and in AH, 76% of the cardiac output perfused the legs, in CH the legs received only 67%. Pulmonary VO2 max (4.1 +/- 0.3 l/min at SL) fell to 2.2 +/- 0.1 l/min in AH (P < 0.05) and was only 2.4 +/- 0.2 l/min in CH (P < 0.05). These data suggest that the failure to recover VO2 max after acclimatization despite normalization of arterial [O2] is explained by two circulatory effects of altitude: 1) failure of cardiac output to normalize and 2) preferential redistribution of cardiac output to nonexercising tissues. Oxygen transport from blood to muscle mitochondria, on the other hand, appears unaffected by CH.
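As a quick sanity check on this explanation, the Fick principle (O2 uptake = blood flow x arteriovenous O2 difference) can be applied to the fractions reported above. The sketch below uses only those fractions, with sea level set to an arbitrary reference of 1.0 rather than a measured quantity.

```python
# Back-of-the-envelope check of the circulatory explanation, using only the
# fractions reported in the abstract (sea-level baseline set to 1.0).

sl_cardiac_output = 1.0    # arbitrary sea-level reference
sl_leg_fraction   = 0.76   # share of cardiac output perfusing the legs at SL (and in AH)
ch_cardiac_output = 0.85   # ~15% reduction in chronic hypoxia
ch_leg_fraction   = 0.67   # leg share of cardiac output in chronic hypoxia
ch_arterial_o2    = 1.0    # arterial [O2] normalized by acclimatization

# With O2 extraction unchanged, leg O2 uptake scales with leg O2 delivery:
sl_leg_delivery = sl_cardiac_output * sl_leg_fraction * 1.0
ch_leg_delivery = ch_cardiac_output * ch_leg_fraction * ch_arterial_o2

print(ch_leg_delivery / sl_leg_delivery)  # ~0.75: leg O2 delivery ~25% below SL
```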
Abstract:
Phosphatidylethanol (PEth) is a direct ethanol metabolite that has recently attracted attention as a biomarker of ethanol intake. The aims of the current study were: (1) to characterize the normalization time of PEth in a larger sample than previously studied; (2) to elucidate potential gender differences; and (3) to report the correlation of PEth with other biomarkers and with self-reported alcohol consumption. Fifty-seven alcohol-dependent patients (ICD-10 F10.25; 9 females, 48 males) entering medical detoxification at three study sites were enrolled; mean age was 43.5 years. Mean gamma-glutamyl transpeptidase (GGT) was 209.61 U/l, mean corpuscular volume (MCV) averaged 97.35 fl, mean carbohydrate-deficient transferrin (%CDT) was 8.68%, and mean total ethanol intake in the preceding 7 days was 1653 g. PEth was measured in heparinized whole blood with a high-pressure liquid chromatography method, while GGT, MCV and %CDT were measured using routine methods. PEth levels at day 1 of detoxification ranged between 0.63 and 26.95 micromol/l (mean 6.22, median 4.70, SD 4.97). There were no false negatives at day 1. Sensitivities for the other biomarkers were 40.4% for MCV, 73.1% for GGT and 69.2% for %CDT. No gender differences were found for PEth levels at any time point. Our data suggest that PEth is (1) a suitable intermediate-term marker of ethanol intake in both sexes and (2) extraordinarily sensitive in alcohol-dependent patients. The results add further evidence that PEth is a candidate for a sensitive and specific biomarker that reflects longer-lasting intake of higher amounts of alcohol and appears to offer the advantages over traditional biomarkers noted above.
Abstract:
In most microarray technologies, a number of critical steps are required to convert raw intensity measurements into the data relied upon by data analysts, biologists and clinicians. These data manipulations, referred to as preprocessing, can influence the quality of the ultimate measurements. In the last few years, high-throughput measurement of gene expression has been the most popular application of microarray technology. For this application, various groups have demonstrated that the use of modern statistical methodology can substantially improve the accuracy and precision of gene expression measurements, relative to ad hoc procedures introduced by designers and manufacturers of the technology. Currently, other applications of microarrays are becoming increasingly popular. In this paper we describe a preprocessing methodology for a technology designed to identify DNA sequence variants, in specific genes or regions of the human genome, that are associated with phenotypes of interest such as disease. In particular, we describe methodology useful for preprocessing Affymetrix SNP chips and obtaining genotype calls from the preprocessed data. We demonstrate how our procedure improves on existing approaches using data from three relatively large studies, including one in which a large number of independent calls are available. Software implementing these ideas is available in the Bioconductor oligo package.
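As one concrete illustration of what a preprocessing step can look like, the sketch below quantile-normalizes a probes-by-arrays intensity matrix so that all arrays share a common reference distribution before genotype calling. It is a generic, simplified example with invented data, not the algorithm implemented in the oligo package.

```python
import numpy as np

def quantile_normalize(intensities: np.ndarray) -> np.ndarray:
    """Force every array (column) to share the same intensity distribution.

    intensities: probes x arrays matrix of raw intensities.
    """
    # Rank of each probe within its own array.
    ranks = np.argsort(np.argsort(intensities, axis=0), axis=0)
    # Reference distribution: mean of the per-array sorted intensities.
    reference = np.sort(intensities, axis=0).mean(axis=1)
    return reference[ranks]

# Toy example: 5 probes measured on 3 arrays with different overall scales.
raw = np.array([[110., 230.,  95.],
                [ 80., 160.,  70.],
                [140., 300., 120.],
                [ 60., 120.,  55.],
                [100., 210.,  90.]])
normalized = quantile_normalize(raw)
print(normalized)  # columns now share an identical empirical distribution
```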
Abstract:
The ability to measure gene expression on a genome-wide scale is one of the most promising accomplishments in molecular biology. Microarrays, the technology that first permitted this, were riddled with problems due to unwanted sources of variability. Many of these problems are now mitigated, after a decade's worth of statistical methodology development. The recently developed RNA sequencing (RNA-seq) technology has generated much excitement, in part due to claims of reduced variability in comparison to microarrays. However, we show that RNA-seq data exhibit unwanted and obscuring variability similar to what was first observed in microarrays. In particular, we find that GC-content has a strong sample-specific effect on gene expression measurements that, if left uncorrected, leads to false positives in downstream results. We also report on commonly observed data distortions that demonstrate the need for data normalization. Here we describe statistical methodology that improves precision by 42% without loss of accuracy. Our resulting conditional quantile normalization (CQN) algorithm combines robust generalized regression, to remove systematic bias introduced by deterministic features such as GC-content, with quantile normalization, to correct for global distortions.
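To make the two ingredients concrete, the following is a minimal, simplified sketch of a CQN-style correction on simulated data: a per-sample GC-content trend is estimated and removed, and the samples are then quantile-normalized. The quadratic fit is purely illustrative; the published method uses robust/quantile regression with flexible fits rather than this shortcut.

```python
import numpy as np

def simple_cqn(log_expr: np.ndarray, gc: np.ndarray) -> np.ndarray:
    """Illustrative two-step correction inspired by CQN.

    log_expr: genes x samples matrix of log-scale expression measurements.
    gc: per-gene GC-content (fraction between 0 and 1).
    """
    corrected = np.empty_like(log_expr)
    # Step 1: remove a sample-specific GC-content effect
    # (a low-degree polynomial stands in for the flexible fits of the real method).
    for j in range(log_expr.shape[1]):
        coef = np.polyfit(gc, log_expr[:, j], deg=2)
        fitted = np.polyval(coef, gc)
        corrected[:, j] = log_expr[:, j] - fitted + fitted.mean()
    # Step 2: quantile-normalize samples to a common reference distribution.
    ranks = np.argsort(np.argsort(corrected, axis=0), axis=0)
    reference = np.sort(corrected, axis=0).mean(axis=1)
    return reference[ranks]

# Toy usage with simulated data (sample-specific GC slopes plus noise).
rng = np.random.default_rng(0)
gc = rng.uniform(0.3, 0.7, size=500)
log_expr = 5 + 3 * gc[:, None] * rng.uniform(0.8, 1.2, size=3) + rng.normal(size=(500, 3))
print(simple_cqn(log_expr, gc).shape)  # (500, 3)
```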
Abstract:
Quantitative reverse transcriptase real-time PCR (QRT-PCR) is a robust method for quantifying RNA abundance. The procedure is highly sensitive and reproducible as long as the initial RNA is intact. However, breaks in the RNA due to chemical or enzymatic cleavage may reduce the number of RNA molecules that contain intact amplicons. As a consequence, the number of molecules available for amplification decreases. We determined the relation between RNA fragmentation and threshold cycle (Ct) values in subsequent QRT-PCR for four genes in an experimental model of intact and partially hydrolyzed RNA derived from a cell line, and we describe the relation between RNA integrity, amplicon size and Ct values in this biologically homogeneous system. We demonstrate that degradation-related shifts of Ct values can be compensated for by calculating delta Ct values between test genes and the mean values of several control genes. These delta Ct values are less sensitive to fragmentation of the RNA and are unaffected by varying amounts of input RNA. The feasibility of the procedure was demonstrated by comparing Ct values from a larger panel of genes in intact and in partially degraded RNA. We compared Ct values from intact RNA derived from well-preserved tumor material with those from fragmented RNA derived from formalin-fixed, paraffin-embedded (FFPE) samples of the same tumors. We demonstrate that relative gene expression can be determined from FFPE material even when the amount of RNA in the sample and the extent of fragmentation are not known.
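The compensation described above amounts to referencing each test gene's Ct to the mean Ct of several control genes measured in the same sample. A minimal sketch, with invented values chosen only to show that a uniform degradation-related shift cancels out:

```python
from statistics import mean

def delta_ct(ct_test: float, ct_controls: list[float]) -> float:
    """Delta Ct of a test gene relative to the mean of several control genes.

    Degradation shifts all Ct values upward by a similar amount, so the
    difference is far less sensitive to RNA fragmentation than the raw Ct.
    """
    return ct_test - mean(ct_controls)

# Invented example: the same sample measured intact vs. partially degraded.
intact   = {"test": 24.0, "controls": [20.0, 21.0, 19.5]}
degraded = {"test": 27.1, "controls": [23.0, 24.2, 22.6]}

print(delta_ct(intact["test"], intact["controls"]))      # ~3.83
print(delta_ct(degraded["test"], degraded["controls"]))  # ~3.83 despite the shift
```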
Abstract:
Disc degeneration, usually associated with low back pain and changes in intervertebral stiffness, represents a major health issue. As the intervertebral disc (IVD) morphology influences its stiffness, the link between mechanical properties and degenerative grade is partially lost without an efficient normalization of the stiffness with respect to the morphology. Moreover, although the behavior of soft tissues is highly nonlinear, only linear normalization protocols have been defined so far for disc stiffness. Thus, the aim of this work is to propose a nonlinear normalization based on finite element (FE) simulations and to evaluate its impact on the stiffness of human anatomical specimens of lumbar IVDs. First, a parameter study involving simulations of biomechanical tests (compression, flexion/extension, bilateral torsion and bending) on 20 FE models of IVDs with various dimensions was carried out to evaluate the effect of the disc's geometry on its compliance and to establish the stiffness/morphology relations necessary for the nonlinear normalization. The computed stiffness was then normalized by height (H), cross-sectional area (CSA), polar moment of inertia (J) or moments of inertia (Ixx, Iyy) to quantify the effect of both linear and nonlinear normalizations. In the second part of the study, T1-weighted MRI images were acquired to determine H, CSA, J, Ixx and Iyy of 14 human lumbar IVDs. Based on the measured morphology and the pre-established relation with stiffness, linear and nonlinear normalization routines were then applied to the compliance of the specimens for each quasi-static biomechanical test. The variability of the stiffness before and after normalization was assessed via the coefficient of variation (CV). The FE study confirmed that larger and thinner IVDs were stiffer, while the normalization strongly attenuated the effect of the disc geometry on its stiffness. Yet, notwithstanding the results of the FE study, the experimental stiffness showed consistently higher CV after normalization. Since both geometry and material properties affect the mechanical response, they can also compensate for one another; the larger CV after normalization can therefore be interpreted as a strong variability of the material properties, previously hidden by the influence of the geometry. In conclusion, a new normalization protocol for intervertebral disc stiffness in compression, flexion, extension, bilateral torsion and bending was proposed, with the possible use of MRI and FE to acquire the discs' anatomy and determine the nonlinear relations between stiffness and morphology. Such a protocol may be useful for relating the disc's mechanical properties to its degree of degeneration.
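To illustrate the kind of normalization involved, the sketch below applies only the linear step for torsion: the shape term of an elastic cylinder is factored out (stiffness = G*J/H, so the material parameter is stiffness*H/J) and the coefficient of variation is compared before and after. The specimen values and the idealized elliptical cross-sections are invented; the nonlinear normalization proposed in the paper instead relies on FE-derived stiffness/morphology relations.

```python
import math
import statistics

def ellipse_section_properties(a: float, b: float) -> dict:
    """Cross-sectional properties of an idealized elliptical disc section.

    a, b: semi-axes (m). Returns CSA (m^2) and Ixx, Iyy, J (m^4).
    """
    csa = math.pi * a * b
    ixx = math.pi * a * b**3 / 4.0
    iyy = math.pi * a**3 * b / 4.0
    return {"CSA": csa, "Ixx": ixx, "Iyy": iyy, "J": ixx + iyy}

def coefficient_of_variation(values: list[float]) -> float:
    return statistics.stdev(values) / statistics.mean(values)

# Invented specimens: (torsional stiffness in N*m/rad, semi-axes a, b and height H in m).
specimens = [(4.2, 0.025, 0.017, 0.011),
             (5.1, 0.027, 0.018, 0.010),
             (3.6, 0.023, 0.016, 0.012)]

raw = [k for k, *_ in specimens]
# Linear normalization of torsional stiffness by J/H (shape factor of an elastic cylinder).
normalized = [k * h / ellipse_section_properties(a, b)["J"] for k, a, b, h in specimens]

print(coefficient_of_variation(raw), coefficient_of_variation(normalized))
```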
Abstract:
Some consequences of using Atomic Units are treated here.
Abstract:
The Frobenius solution to Legendre's equation is developed in detail, as is Rodrigues' formula, which is employed to normalize Legendre polynomials.
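For reference, the objects mentioned above are, in standard notation (stated from common usage, not reproduced from the paper):

```latex
\begin{align}
  (1 - x^2)\,y'' - 2x\,y' + n(n+1)\,y &= 0
    && \text{(Legendre's equation)} \\
  P_n(x) &= \frac{1}{2^n n!}\,\frac{d^n}{dx^n}\bigl[(x^2 - 1)^n\bigr]
    && \text{(Rodrigues' formula)} \\
  \int_{-1}^{1} P_m(x)\,P_n(x)\,dx &= \frac{2}{2n+1}\,\delta_{mn}
    && \text{(normalization)}
\end{align}
```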
Abstract:
This paper proposes an architecture, based on statistical machine translation, for developing the text normalization module of a text-to-speech conversion system. The main target is to generate a language-independent text normalization module, based on data and flexible enough to deal with all situations presented in this task. The proposed architecture is composed of three main modules: a tokenizer module for splitting the text input into a token graph (tokenization), a phrase-based translation module (token translation) and a post-processing module for removing some tokens. This paper presents initial experiments for numbers and abbreviations. The very good results obtained validate the proposed architecture.
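A toy end-to-end sketch of the three-stage pipeline follows; the lookup table and helper names are invented and merely stand in for the trained phrase-based translation model and the token graph the paper describes.

```python
import re

# A tiny lookup table stands in for the trained phrase-based translation model;
# entries and helper names are invented for illustration.
EXPANSIONS = {"dr.": "doctor", "3": "three", "etc.": "et cetera"}

def tokenize(text: str) -> list[str]:
    """Stage 1: split the input into tokens (a flat list stands in for the token graph)."""
    return re.findall(r"\w+\.?|[^\w\s]", text.lower())

def translate(tokens: list[str]) -> list[str]:
    """Stage 2: map non-standard tokens (numbers, abbreviations) to spoken words."""
    out: list[str] = []
    for tok in tokens:
        out.extend(EXPANSIONS.get(tok, tok).split())
    return out

def postprocess(tokens: list[str]) -> str:
    """Stage 3: drop tokens that should not be spoken (here: punctuation)."""
    return " ".join(tok for tok in tokens if any(ch.isalnum() for ch in tok))

print(postprocess(translate(tokenize("Dr. Smith has 3 cats, etc."))))
# doctor smith has three cats et cetera
```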
Abstract:
Sedentarism has become one of the major concerns of our times. Nowadays people spend most of their time sitting down and moving by mechanical means instead of exercising. Younger generations do only a little more sport today than their counterparts did a decade ago. In other words, sedentary habits have become common in our society, especially among the young. What cultural mechanisms have contributed to this? What are the consequences of a sedentary lifestyle for our health and well-being? These are the questions we have posed in this study. We conducted qualitative research among Spanish young people, and the results have provided important clues that help us better understand how "active sedentarism" has become the norm among young people.
Abstract:
Acidic and basic fibroblast growth factors (FGFs) share a wide range of diverse biological activities. To date, low levels of FGF have not been correlated with a pathophysiologic state. We report that blood vessels of spontaneously hypertensive rats show a marked decrement in endothelial basic FGF content. This decrement correlates both with hypertension and with a decrease in the endothelial content of nitric oxide synthase. Restoration of FGF to physiological levels in the vascular wall, either by systemic administration or by in vivo gene transfer, significantly augmented the number of endothelial cells with positive immunostaining for nitric oxide synthase, corrected hypertension, and ameliorated endothelium-dependent responses to vasoconstrictors. These results suggest an important role for FGFs in blood pressure homeostasis and open new avenues for understanding the etiology and treatment of hypertension.
Abstract:
Robotics is one of the most active research areas, and building robots requires bringing together a large number of disciplines. Given these premises, one problem is the management of information from multiple heterogeneous sources: each component, hardware or software, produces data of a different nature in terms of temporal frequency, processing needs, size, type, etc. Nowadays, technologies and software engineering paradigms such as service-oriented architectures are applied to solve this problem in other areas. This paper proposes the use of these technologies to implement a robotic control system based on services. Such a system allows the integration and collaborative operation of the different elements that make up a robotic system.
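A minimal sketch of the idea, with invented component names and payloads: heterogeneous hardware and software elements are wrapped as named services behind a uniform interface, so a control layer can consume them without knowing their individual data rates or formats. Real service-oriented robotic middleware would add discovery, messaging and quality-of-service handling on top of this.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class Service:
    """One hardware or software component exposed through a uniform interface."""
    name: str
    rate_hz: float                # how often the component produces data
    read: Callable[[], object]    # uniform access to heterogeneous payloads

@dataclass
class ServiceRegistry:
    """Minimal registry through which the control layer reaches every component."""
    services: Dict[str, Service] = field(default_factory=dict)

    def register(self, service: Service) -> None:
        self.services[service.name] = service

    def call(self, name: str) -> object:
        return self.services[name].read()

# Invented example components with very different data characteristics.
registry = ServiceRegistry()
registry.register(Service("camera", rate_hz=30.0, read=lambda: {"frame": "<jpeg bytes>"}))
registry.register(Service("range_finder", rate_hz=10.0, read=lambda: {"distance_m": 1.7}))
registry.register(Service("wheel_motors", rate_hz=100.0, read=lambda: {"velocity": [0.2, 0.2]}))

print(registry.call("range_finder"))  # {'distance_m': 1.7}
```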