769 results for normalization


Relevance:

20.00%

Publisher:

Abstract:

The problem of a spacecraft orbiting the Neptune-Triton system is presented. The new ingredients in this restricted three-body problem are Neptune's oblateness and the highly inclined, retrograde motion of Triton. First we present some simulations showing the role played by the oblateness on a satellite of Neptune disturbed by Triton. We also give an extensive numerical exploration of the case in which the spacecraft orbits Triton, considering the Sun, Neptune and its planetary oblateness as disturbers. In the a × I plane (a = semi-major axis, I = inclination), we plot the stable regions where the massless body can survive for thousands of years. Both retrograde and direct orbits were considered and, as usual, the region of stability is much larger for direct spacecraft orbits (Triton's orbit is retrograde). Next we explore the dynamics in the vicinity of the Lagrangian points. The Birkhoff normalization is constructed around L2, followed by its reduction to the center manifold. In this reduced dynamics, a convenient Poincaré section shows the interplay of the Lyapunov and halo periodic orbits, Lissajous and quasi-halo tori, as well as the stable and unstable manifolds of the planar Lyapunov orbit. To show the effect of the oblateness, the planar Lyapunov family emanating from the Lagrangian points and the three-dimensional halo orbits are obtained by numerical continuation. Published by Elsevier Ltd. on behalf of COSPAR.
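
For orientation, a standard form (background knowledge, not quoted from this abstract): the center-manifold reduction around a collinear point such as L2 conventionally starts from a power-series expansion of the Hamiltonian whose quadratic part, after a linear symplectic change of variables, separates one hyperbolic direction from two oscillatory ones; the Birkhoff normalization then removes the hyperbolic part order by order, leaving the dynamics on the center manifold.

```latex
% Quadratic part of the CR3BP Hamiltonian around a collinear libration
% point in diagonalizing coordinates: one saddle (\lambda) plus planar
% and vertical oscillators (\omega_p, \omega_v); the higher-order terms
% H_n are what the Birkhoff normalization processes order by order.
H = \lambda\, x\, p_x
  + \frac{\omega_p}{2}\left(y^{2} + p_y^{2}\right)
  + \frac{\omega_v}{2}\left(z^{2} + p_z^{2}\right)
  + \sum_{n \geq 3} H_n(x, y, z, p_x, p_y, p_z)
```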

Relevance:

20.00%

Publisher:

Abstract:

Includes bibliography

Relevance:

20.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

20.00%

Publisher:

Abstract:

Selection of reference genes is an essential consideration for increasing the precision and quality of relative expression analysis by quantitative RT-PCR. The stability of eight expressed sequence tags was evaluated to define potential reference genes for studying the differential expression of common bean target genes under biotic (incompatible interaction between common bean and the fungus Colletotrichum lindemuthianum) and abiotic (drought, salinity, cold) stresses. Amplification efficiencies and quantification cycle (Cq) values were determined using LinRegPCR software. The stability of the candidate reference genes was assessed using geNorm and NormFinder software, whereas the normalization of the differential expression of target genes [beta-1,3-glucanase 1 (BG1) for biotic stress and dehydration responsive element binding (DREB) for abiotic stress] was performed with REST software. High stability was obtained for the insulin degrading enzyme (IDE), actin-11 (Act11), unknown 1 (Ukn1) and unknown 2 (Ukn2) genes during biotic stress, and for the SKP1/ASK-interacting protein 16 (Skip16), Act11, tubulin beta-8 (beta-Tub8) and Ukn1 genes under abiotic stresses. IDE and Act11 were indicated as the best combination of reference genes for biotic stress analysis, whereas Skip16 and Act11 were the best combination for studying abiotic stress. These genes should be useful for the normalization of gene expression in RT-PCR analysis of common bean, the most important edible legume.
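
As a concrete illustration of this kind of normalization (a minimal sketch, not the authors' geNorm/REST workflow; all Cq values below are invented), relative expression is commonly computed against the geometric mean of the chosen reference genes, which on the Cq (log2) scale is just the arithmetic mean:

```python
# Minimal sketch of reference-gene normalization (hypothetical values,
# assuming ~100% amplification efficiency): the arithmetic mean of Cq
# values corresponds to the geometric mean of expression levels.
import numpy as np

def relative_expression(cq_target, cq_refs):
    """Return 2^-(dCq), the target expression normalized to the
    geometric mean of the reference genes (e.g., IDE and Act11)."""
    d_cq = cq_target - np.mean(cq_refs)
    return 2.0 ** (-d_cq)

# Hypothetical Cq values for a control and a stressed sample
control = relative_expression(24.1, [18.3, 19.0])
stressed = relative_expression(21.6, [18.4, 18.9])
print(f"fold change under stress: {stressed / control:.2f}")
```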

Relevance:

20.00%

Publisher:

Abstract:

Background: With the development of DNA hybridization microarray technologies, it is now possible to assess the expression levels of thousands to tens of thousands of genes simultaneously. Quantitative comparison of microarrays uncovers distinct patterns of gene expression, which define different cellular phenotypes or cellular responses to drugs. Due to technical biases, normalization of the intensity levels is a prerequisite to performing further statistical analyses, so choosing a suitable normalization approach can be critical and deserves judicious consideration. Results: We considered three commonly used normalization approaches, namely Loess, Splines and Wavelets, and two non-parametric regression methods that had not yet been used for normalization, namely kernel smoothing and Support Vector Regression. The results were compared using artificial microarray data and benchmark studies. They indicate that Support Vector Regression is the most robust to outliers and that kernel smoothing is the worst normalization technique, while no practical differences were observed between Loess, Splines and Wavelets. Conclusion: In light of these results, Support Vector Regression is favored for microarray normalization because of its robustness in estimating the normalization curve.
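
A minimal sketch of the favored approach (not the authors' implementation; the data and SVR hyperparameters below are invented): fit the intensity-dependent bias of the log-ratios with scikit-learn's SVR and subtract the fitted curve.

```python
# Sketch of SVR-based normalization on synthetic MA-plot data:
# M (log-ratio) carries a smooth intensity-dependent bias in A
# (average log-intensity); SVR estimates and removes that trend.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
A = rng.uniform(6.0, 14.0, 1000)                    # average log-intensity
M = 0.3 * np.sin(A) + rng.normal(0.0, 0.2, A.size)  # biased log-ratio

svr = SVR(kernel="rbf", C=1.0, epsilon=0.1)
svr.fit(A.reshape(-1, 1), M)                        # estimate bias curve
M_norm = M - svr.predict(A.reshape(-1, 1))          # subtract the trend

print(f"spread before: {M.std():.3f}, after: {M_norm.std():.3f}")
```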

Relevance:

20.00%

Publisher:

Abstract:

Acute hypoxia (AH) reduces maximal O2 consumption (VO2 max), but after acclimatization, and despite increases in both hemoglobin concentration and arterial O2 saturation that can normalize arterial O2 concentration ([O2]), VO2 max remains low. To determine why, seven lowlanders were studied at VO2 max (cycle ergometry) at sea level (SL), after 9-10 wk at 5,260 m [chronic hypoxia (CH)], and 6 mo later at SL in AH (FiO2 = 0.105) equivalent to 5,260 m. Pulmonary and leg indexes of O2 transport were measured in each condition. Both cardiac output and leg blood flow were reduced by approximately 15% in both AH and CH (P < 0.05). At maximal exercise, arterial [O2] in AH was 31% lower than at SL (P < 0.05), whereas in CH it was the same as at SL due to both polycythemia and hyperventilation. O2 extraction by the legs, however, remained at SL values in both AH and CH. Although at both SL and in AH, 76% of the cardiac output perfused the legs, in CH the legs received only 67%. Pulmonary VO2 max (4.1 +/- 0.3 l/min at SL) fell to 2.2 +/- 0.1 l/min in AH (P < 0.05) and was only 2.4 +/- 0.2 l/min in CH (P < 0.05). These data suggest that the failure to recover VO2 max after acclimatization despite normalization of arterial [O2] is explained by two circulatory effects of altitude: 1) failure of cardiac output to normalize and 2) preferential redistribution of cardiac output to nonexercising tissues. Oxygen transport from blood to muscle mitochondria, on the other hand, appears unaffected by CH.

Relevance:

20.00%

Publisher:

Abstract:

Phosphatidylethanol (PEth) is a direct ethanol metabolite that has recently attracted attention as a biomarker of ethanol intake. The aims of the current study were: (1) to characterize the normalization time of PEth in larger samples than previously studied; (2) to elucidate potential gender differences; and (3) to report the correlation of PEth with other biomarkers and self-reported alcohol consumption. Fifty-seven alcohol-dependent patients (ICD-10 F10.25; 48 males, 9 females; mean age 43.5 years) entering medical detoxification at three study sites were enrolled. Mean gamma-glutamyl transpeptidase (GGT) was 209.61 U/l, mean corpuscular volume (MCV) was 97.35 fl, mean carbohydrate-deficient transferrin (%CDT) was 8.68, and mean total ethanol intake in the last 7 days was 1653 g. PEth was measured in heparinized whole blood with a high-pressure liquid chromatography method, while GGT, MCV and %CDT were measured using routine methods. PEth levels at day 1 of detoxification ranged between 0.63 and 26.95 micromol/l (mean 6.22, median 4.70, SD 4.97). There were no false negatives at day 1. Sensitivities for the other biomarkers were 40.4% for MCV, 73.1% for GGT and 69.2% for %CDT. No gender differences were found in PEth levels at any time point. Our data suggest that PEth is (1) a suitable intermediate-term marker of ethanol intake in both sexes and (2) extraordinarily sensitive in alcohol-dependent patients. The results add further evidence that PEth is a candidate for a sensitive and specific biomarker reflecting longer-lasting intake of higher amounts of alcohol, with certain advantages over traditional biomarkers as noted above.

Relevance:

20.00%

Publisher:

Abstract:

In most microarray technologies, a number of critical steps are required to convert raw intensity measurements into the data relied upon by data analysts, biologists and clinicians. These data manipulations, referred to as preprocessing, can influence the quality of the ultimate measurements. In the last few years, the high-throughput measurement of gene expression has been the most popular application of microarray technology. For this application, various groups have demonstrated that the use of modern statistical methodology can substantially improve the accuracy and precision of gene expression measurements, relative to ad hoc procedures introduced by designers and manufacturers of the technology. Other applications of microarrays are now becoming more and more popular. In this paper we describe a preprocessing methodology for a technology designed to identify DNA sequence variants, in specific genes or regions of the human genome, that are associated with phenotypes of interest such as disease. In particular, we describe methodology useful for preprocessing Affymetrix SNP chips and obtaining genotype calls from the preprocessed data. We demonstrate how our procedure improves on existing approaches using data from three relatively large studies, including one in which a large number of independent calls is available. Software implementing these ideas is available in the Bioconductor oligo package.

Relevance:

20.00%

Publisher:

Abstract:

The ability to measure gene expression on a genome-wide scale is one of the most promising accomplishments in molecular biology. Microarrays, the technology that first permitted this, were riddled with problems due to unwanted sources of variability. Many of these problems are now mitigated, after a decade's worth of statistical methodology development. The recently developed RNA sequencing (RNA-seq) technology has generated much excitement, in part due to claims of reduced variability in comparison to microarrays. However, we show that RNA-seq data exhibit unwanted and obscuring variability similar to what was first observed in microarrays. In particular, we find that GC-content has a strong sample-specific effect on gene expression measurements that, if left uncorrected, leads to false positives in downstream results. We also report on commonly observed data distortions that demonstrate the need for data normalization. Here we describe statistical methodology that improves precision by 42% without loss of accuracy. The resulting conditional quantile normalization (CQN) algorithm combines robust generalized regression, to remove the systematic bias introduced by deterministic features such as GC-content, with quantile normalization, to correct for global distortions.
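
For reference, a sketch of the quantile-normalization half of the method (numpy only; the GC-content regression that makes CQN "conditional" is omitted, and ties are broken arbitrarily):

```python
# Plain quantile normalization: force every sample (column) onto the
# same empirical distribution, the mean of the sorted columns.
import numpy as np

def quantile_normalize(counts):
    """counts: (genes x samples) array; returns the normalized array."""
    order = np.argsort(counts, axis=0)      # per-sample ranks (ties arbitrary)
    reference = np.sort(counts, axis=0).mean(axis=1)  # target distribution
    out = np.empty_like(counts, dtype=float)
    for j in range(counts.shape[1]):
        out[order[:, j], j] = reference     # map each rank to the reference
    return out

x = np.array([[5., 4., 3.],
              [2., 1., 4.],
              [3., 4., 6.],
              [4., 2., 8.]])
print(quantile_normalize(x))
```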

Relevance:

20.00%

Publisher:

Abstract:

Quantitative reverse-transcriptase real-time PCR (QRT-PCR) is a robust method for quantitating RNA abundance. The procedure is highly sensitive and reproducible as long as the initial RNA is intact. However, breaks in the RNA due to chemical or enzymatic cleavage reduce the number of RNA molecules that contain intact amplicons and, as a consequence, the number of molecules available for amplification. We determined the relation between RNA fragmentation and threshold cycle (Ct) values in subsequent QRT-PCR for four genes in an experimental model of intact and partially hydrolyzed RNA derived from a cell line, and we describe the relation between RNA integrity, amplicon size and Ct values in this biologically homogeneous system. We demonstrate that degradation-related shifts of Ct values can be compensated by calculating delta Ct values between test genes and the mean values of several control genes. These delta Ct values are less sensitive to fragmentation of the RNA and are unaffected by varying amounts of input RNA. The feasibility of the procedure was demonstrated by comparing Ct values from a larger panel of genes in intact and partially degraded RNA: we compared Ct values from intact RNA derived from well-preserved tumor material with those from fragmented RNA derived from formalin-fixed, paraffin-embedded (FFPE) samples of the same tumors. We demonstrate that the relative abundance of gene expression can be based on FFPE material even when the amount of RNA in the sample and the extent of fragmentation are not known.
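
A toy illustration of the compensation described above (values invented, and the degradation-related shift assumed amplicon-independent for simplicity):

```python
# Fragmentation shifts all Ct values upward, but the delta Ct between
# a test gene and the mean Ct of several control genes is preserved.
import numpy as np

controls_intact = np.array([22.1, 23.4, 21.8])  # control-gene Ct values
test_intact = 25.0

shift = 3.2                                     # degradation-related shift
controls_ffpe = controls_intact + shift
test_ffpe = test_intact + shift

d_ct_intact = test_intact - controls_intact.mean()
d_ct_ffpe = test_ffpe - controls_ffpe.mean()
print(d_ct_intact, d_ct_ffpe)                   # identical: the shift cancels
```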

Relevance:

20.00%

Publisher:

Abstract:

Methods for optical motion capture often require time-consuming manual processing before the data can be used for subsequent tasks such as retargeting or character animation. These processing steps restrict the applicability of motion capture, especially in dynamic VR environments with real-time requirements. To solve these problems, we present two additional fast and automatic processing stages based on our motion capture pipeline presented in [HSK05]. A normalization step aligns the recorded coordinate systems with the skeleton structure to yield a common and intuitive data representation across different recording sessions. A second step computes a parameterization based on automatically extracted main movement axes to generate a compact motion description. Our method restricts neither the placement of marker bodies nor the recording setup, and only requires a short calibration phase.
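
The abstract does not spell out how the main movement axes are extracted; one standard choice, sketched here as an assumption, is principal component analysis of the centered marker trajectory:

```python
# Hedged sketch: main movement axes via PCA/SVD of one marker's
# trajectory (the paper's exact extraction method is not specified).
import numpy as np

def main_movement_axes(positions):
    """positions: (frames x 3) array; returns (axes, variances),
    with axes as rows sorted by captured variance."""
    centered = positions - positions.mean(axis=0)
    _, s, axes = np.linalg.svd(centered, full_matrices=False)
    return axes, s**2 / (len(positions) - 1)

# Synthetic trajectory: motion mostly along one direction, plus noise
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
traj = np.outer(t, [1.0, 0.5, 0.0]) + rng.normal(0.0, 0.01, (200, 3))

axes, var = main_movement_axes(traj)
print("dominant axis:", axes[0], "variance share:", var[0] / var.sum())
```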

Relevance:

20.00%

Publisher:

Abstract:

Disc degeneration, usually associated with low back pain and changes of intervertebral stiffness, represents a major health issue. As the intervertebral disc (IVD) morphology influences its stiffness, the link between mechanical properties and degenerative grade is partially lost without an efficient normalization of the stiffness with respect to the morphology. Moreover, although the behavior of soft tissues is highly nonlinear, only linear normalization protocols have been defined so far for disc stiffness. The aim of this work is therefore to propose a nonlinear normalization based on finite element (FE) simulations and to evaluate its impact on the stiffness of human anatomical specimens of lumbar IVDs. First, a parameter study involving simulations of biomechanical tests (compression, flexion/extension, bilateral torsion and bending) on 20 FE models of IVDs with various dimensions was carried out to evaluate the effect of the disc's geometry on its compliance and to establish the stiffness/morphology relations necessary for the nonlinear normalization. The computed stiffness was then normalized by height (H), cross-sectional area (CSA), polar moment of inertia (J) or moments of inertia (Ixx, Iyy) to quantify the effect of both linear and nonlinear normalizations. In the second part of the study, T1-weighted MRI images were acquired to determine H, CSA, J, Ixx and Iyy of 14 human lumbar IVDs. Based on the measured morphology and the pre-established relation with stiffness, linear and nonlinear normalization routines were then applied to the compliance of the specimens for each quasi-static biomechanical test. The variability of the stiffness before and after normalization was assessed via the coefficient of variation (CV). The FE study confirmed that larger and thinner IVDs were stiffer, while the normalization strongly attenuated the effect of the disc geometry on its stiffness. Yet, notwithstanding the results of the FE study, the experimental stiffness showed consistently higher CV after normalization. Since both geometry and material properties affect the mechanical response, they can also compensate for one another; the larger CV after normalization can therefore be interpreted as a strong variability of the material properties, previously hidden by the geometry's own influence. In conclusion, a new normalization protocol for intervertebral disc stiffness in compression, flexion, extension, bilateral torsion and bending is proposed, with the possible use of MRI and FE to acquire the discs' anatomy and determine the nonlinear relations between stiffness and morphology. Such a protocol may be useful for relating the disc's mechanical properties to its degree of degeneration.
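
As a sketch of what such a nonlinear normalization could look like (the power-law form, parameter values and data below are assumptions for illustration, not taken from the paper): fit a stiffness/morphology relation on FE results, then divide each specimen's measured stiffness by the stiffness predicted for its own MRI-derived morphology.

```python
# Hedged sketch of a nonlinear stiffness normalization: the functional
# form k = k0 * CSA^a / H^b is an assumption, not the paper's relation.
import numpy as np
from scipy.optimize import curve_fit

def stiffness_model(X, k0, a, b):
    csa, h = X                      # cross-sectional area, disc height
    return k0 * csa**a / h**b

# Hypothetical FE results (CSA [mm^2], H [mm] -> stiffness [N/mm])
rng = np.random.default_rng(2)
csa = np.array([1200., 1400., 1600., 1800., 2000.])
h = np.array([8., 9., 10., 11., 12.])
k_fe = stiffness_model((csa, h), 2.0, 0.9, 0.5) + rng.normal(0.0, 5.0, csa.size)

params, _ = curve_fit(stiffness_model, (csa, h), k_fe, p0=(2.0, 1.0, 0.5))

# Normalize a measured specimen by its morphology-predicted stiffness,
# leaving a ratio that reflects material rather than geometry.
k_measured, csa_spec, h_spec = 450.0, 1500.0, 9.5
k_norm = k_measured / stiffness_model((csa_spec, h_spec), *params)
print(f"normalized stiffness: {k_norm:.2f}")
```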

Relevance:

20.00%

Publisher:

Abstract:

Some consequences of using Atomic Units are treated here.