978 results for Normalization constraint


Relevance: 20.00%

Abstract:

In a majority of species, leaf development is thought to proceed in a bilaterally symmetric fashion without systematic asymmetries. This is despite the left and right sides of an initiating primordium occupying niches that differ in their distance from sinks and sources of auxin. Here, we revisit an existing model of auxin transport sufficient to recreate spiral phyllotactic patterns and find previously overlooked asymmetries between auxin distribution and the centers of leaf primordia. We show that it is the direction of the phyllotactic spiral that determines the side of the leaf these asymmetries fall on. We empirically confirm the presence of an asymmetric auxin response using a DR5 reporter and observe morphological asymmetries in young leaf primordia. Notably, these morphological asymmetries persist in mature leaves, and we observe left-right asymmetries in the superficially bilaterally symmetric leaves of tomato (Solanum lycopersicum) and Arabidopsis thaliana that are consistent with modeled predictions. We further demonstrate that auxin application to a single side of a leaf primordium is sufficient to recapitulate the asymmetries we observe. Our results provide a framework to study a previously overlooked developmental axis and provide insights into the developmental constraints imposed upon leaf morphology by auxin-dependent phyllotactic patterning.

Relevance: 20.00%

Abstract:

In most microarray technologies, a number of critical steps are required to convert raw intensity measurements into the data relied upon by data analysts, biologists and clinicians. These data manipulations, referred to as preprocessing, can influence the quality of the ultimate measurements. In recent years, the high-throughput measurement of gene expression has been the most popular application of microarray technology. For this application, various groups have demonstrated that the use of modern statistical methodology can substantially improve the accuracy and precision of gene expression measurements, relative to ad hoc procedures introduced by designers and manufacturers of the technology. Other applications of microarrays are now becoming increasingly popular. In this paper we describe a preprocessing methodology for a technology designed to identify DNA sequence variants in specific genes or regions of the human genome that are associated with phenotypes of interest such as disease. In particular, we describe methodology useful for preprocessing Affymetrix SNP chips and obtaining genotype calls from the preprocessed data. We demonstrate how our procedure improves on existing approaches using data from three relatively large studies, including one in which a large number of independent calls is available. Software implementing these ideas is available in the Bioconductor oligo package.
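
As a toy illustration of the preprocessing-plus-calling idea described above (not the paper's actual method, which is implemented in the R-based Bioconductor oligo package), the sketch below quantile-normalizes allele-specific intensities across samples and makes a naive genotype call from the allele contrast; all names, thresholds and data are hypothetical.

```python
# Illustrative sketch only: quantile-normalize allele-specific SNP intensities
# across samples, then call genotypes from the contrast between the two alleles.
import numpy as np

def quantile_normalize(X):
    """Force every column (sample) of X onto the same empirical distribution."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)
    reference = np.sort(X, axis=0).mean(axis=1)   # average sorted profile
    return reference[ranks]

def call_genotypes(a_intensity, b_intensity, threshold=0.5):
    """Naive call from the log-ratio of allele A to allele B intensity:
    AA if A dominates, BB if B dominates, AB otherwise."""
    contrast = np.log2(a_intensity) - np.log2(b_intensity)
    calls = np.full(contrast.shape, "AB", dtype=object)
    calls[contrast > threshold] = "AA"
    calls[contrast < -threshold] = "BB"
    return calls

# Example: 4 SNPs x 3 samples of simulated raw intensities for alleles A and B.
rng = np.random.default_rng(0)
A = quantile_normalize(rng.lognormal(7.0, 0.5, size=(4, 3)))
B = quantile_normalize(rng.lognormal(7.0, 0.5, size=(4, 3)))
print(call_genotypes(A, B))
```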

Relevance: 20.00%

Abstract:

The ability to measure gene expression on a genome-wide scale is one of the most promising accomplishments in molecular biology. Microarrays, the technology that first permitted this, were riddled with problems due to unwanted sources of variability. Many of these problems are now mitigated after a decade's worth of statistical methodology development. The recently developed RNA sequencing (RNA-seq) technology has generated much excitement, in part due to claims of reduced variability in comparison to microarrays. However, we show that RNA-seq data exhibit unwanted and obscuring variability similar to what was first observed in microarrays. In particular, we find that GC-content has a strong sample-specific effect on gene expression measurements that, if left uncorrected, leads to false positives in downstream results. We also report on commonly observed data distortions that demonstrate the need for data normalization. Here we describe statistical methodology that improves precision by 42% without loss of accuracy. Our resulting conditional quantile normalization (CQN) algorithm combines robust generalized regression, to remove systematic bias introduced by deterministic features such as GC-content, with quantile normalization, to correct for global distortions.
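
The two-stage idea can be sketched in a few lines of Python: remove a sample-specific GC-content trend with a robust fit, then quantile-normalize the corrected profiles. This is only a simplified, hypothetical illustration (binned medians stand in for the robust generalized regression of the actual CQN algorithm), run on simulated data.

```python
# Simplified sketch of conditional-quantile-normalization-style preprocessing:
# (1) subtract a per-sample GC-content trend, (2) quantile-normalize.
import numpy as np

def remove_gc_trend(log_expr, gc, n_bins=20):
    """Subtract, per sample, the median log-expression within GC-content bins
    (a crude robust stand-in for a regression of expression on GC-content)."""
    edges = np.quantile(gc, np.linspace(0, 1, n_bins + 1))
    bin_idx = np.clip(np.digitize(gc, edges[1:-1]), 0, n_bins - 1)
    out = log_expr.copy()
    for j in range(log_expr.shape[1]):            # each sample has its own trend
        for b in range(n_bins):
            mask = bin_idx == b
            if mask.any():
                out[mask, j] -= np.median(log_expr[mask, j])
    return out

def quantile_normalize(X):
    """Map every sample (column) onto the average empirical distribution."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)
    return np.sort(X, axis=0).mean(axis=1)[ranks]

# Simulated genes x samples counts with a sample-specific GC effect.
rng = np.random.default_rng(1)
gc = rng.uniform(0.3, 0.7, size=500)                      # per-gene GC fraction
gc_effect = gc[:, None] * rng.uniform(0.5, 2.0, size=3)   # differs by sample
counts = rng.poisson(50 + 100 * gc_effect, size=(500, 3))
normalized = quantile_normalize(remove_gc_trend(np.log2(counts + 1), gc))
```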

Relevance: 20.00%

Abstract:

Quantitative reverse transcriptase real-time PCR (QRT-PCR) is a robust method for quantitating RNA abundance. The procedure is highly sensitive and reproducible as long as the initial RNA is intact. However, breaks in the RNA due to chemical or enzymatic cleavage may reduce the number of RNA molecules that contain intact amplicons. As a consequence, the number of molecules available for amplification decreases. We determined the relation between RNA fragmentation and threshold cycle values (Ct values) in subsequent QRT-PCR for four genes in an experimental model of intact and partially hydrolyzed RNA derived from a cell line, and we describe the relation between RNA integrity, amplicon size and Ct values in this biologically homogeneous system. We demonstrate that degradation-related shifts of Ct values can be compensated for by calculating delta Ct values between test genes and the mean values of several control genes. These delta Ct values are less sensitive to fragmentation of the RNA and are unaffected by varying amounts of input RNA. The feasibility of the procedure was demonstrated by comparing Ct values from a larger panel of genes in intact and in partially degraded RNA. We compared Ct values from intact RNA derived from well-preserved tumor material and from fragmented RNA derived from formalin-fixed, paraffin-embedded (FFPE) samples of the same tumors. We demonstrate that the relative abundance of gene expression can be determined from FFPE material even when the amount of RNA in the sample and the extent of fragmentation are not known.
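
The compensation is simple to state numerically: the delta Ct of a test gene is its Ct minus the mean Ct of the control genes, so a degradation-related upward shift that affects all genes similarly cancels out. The toy values below are hypothetical and only illustrate the arithmetic.

```python
# Toy illustration of the delta Ct compensation; gene Ct values are hypothetical.
import numpy as np

def delta_ct(ct_test, ct_controls):
    """delta Ct = Ct(test gene) - mean Ct of several control genes."""
    return ct_test - np.mean(ct_controls)

# Intact vs. partially degraded RNA from the same sample: absolute Ct values
# shift upward with fragmentation, but the delta Ct stays nearly constant.
controls_intact   = [18.2, 19.0, 18.6]   # mean 18.6
controls_degraded = [21.1, 22.0, 21.5]   # mean ~21.5
print(delta_ct(24.5, controls_intact))    # ~5.9
print(delta_ct(27.4, controls_degraded))  # ~5.9
```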

Relevance: 20.00%

Abstract:

This paper proposes a frequency-based explanation of the Ditransitive Person-Role Constraint, a cross-linguistic generalization that can be formulated as follows: "Combinations of bound pronouns with the roles Recipient and Theme are disfavored if the Theme pronoun is first or second person and the Recipient pronoun is third person."

Relevance: 20.00%

Abstract:

Methods for optical motion capture often require time-consuming manual processing before the data can be used for subsequent tasks such as retargeting or character animation. These processing steps restrict the applicability of motion capture, especially for dynamic VR environments with real-time requirements. To solve these problems, we present two additional, fast and automatic processing stages based on our motion capture pipeline presented in [HSK05]. A normalization step aligns the recorded coordinate systems with the skeleton structure to yield a common and intuitive data basis across different recording sessions. A second step computes a parameterization based on automatically extracted main movement axes to generate a compact motion description. Our method does not restrict the placement of marker bodies or the recording setup, and only requires a short calibration phase.
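
A hypothetical sketch of the "main movement axes" idea is given below, using PCA of a marker trajectory as one plausible realization; the actual pipeline of [HSK05] and the paper's parameterization may differ in detail.

```python
# Hypothetical sketch: extract dominant movement axes of a marker trajectory
# with PCA and express the motion compactly in those axes.
import numpy as np

def main_movement_axes(trajectory):
    """trajectory: (frames, 3) array of one marker's positions.
    Returns the principal axes (rows of vt) and the trajectory in those axes."""
    centered = trajectory - trajectory.mean(axis=0)
    # Right singular vectors = directions of maximal motion (PCA axes).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt, centered @ vt.T

# Example: a marker oscillating mostly along one slanted direction in space.
t = np.linspace(0.0, 2.0 * np.pi, 200)
traj = np.column_stack([np.cos(t), 0.5 * np.cos(t), 0.05 * np.sin(t)])
axes, coords = main_movement_axes(traj)
print(axes[0])             # dominant movement axis
print(coords.std(axis=0))  # motion is concentrated in the first coordinate
```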

Relevance: 20.00%

Abstract:

Disc degeneration, usually associated with low back pain and changes of intervertebral stiffness, represents a major health issue. As the intervertebral disc (IVD) morphology influences its stiffness, the link between mechanical properties and degenerative grade is partially lost without an efficient normalization of the stiffness with respect to the morphology. Moreover, although the behavior of soft tissues is highly nonlinear, only linear normalization protocols have been defined so far for disc stiffness. Thus, the aim of this work is to propose a nonlinear normalization based on finite element (FE) simulations and to evaluate its impact on the stiffness of human anatomical specimens of lumbar IVDs. First, a parameter study involving simulations of biomechanical tests (compression, flexion/extension, bilateral torsion and bending) on 20 FE models of IVDs with various dimensions was carried out to evaluate the effect of disc geometry on compliance and to establish the stiffness/morphology relations necessary for the nonlinear normalization. The computed stiffness was then normalized by height (H), cross-sectional area (CSA), polar moment of inertia (J) or moments of inertia (Ixx, Iyy) to quantify the effect of both linear and nonlinear normalizations. In the second part of the study, T1-weighted MRI images were acquired to determine H, CSA, J, Ixx and Iyy of 14 human lumbar IVDs. Based on the measured morphology and the pre-established relations with stiffness, linear and nonlinear normalization routines were then applied to the compliance of the specimens for each quasi-static biomechanical test. The variability of the stiffness prior to and after normalization was assessed via the coefficient of variation (CV). The FE study confirmed that larger and thinner IVDs were stiffer, while the normalization strongly attenuated the effect of disc geometry on stiffness. Yet, notwithstanding the results of the FE study, the experimental stiffness showed consistently higher CVs after normalization. Since both geometry and material properties affect the mechanical response, they can also compensate for one another. Therefore, the larger CV after normalization can be interpreted as a strong variability of the material properties, previously hidden by the influence of the geometry. In conclusion, a new normalization protocol for intervertebral disc stiffness in compression, flexion, extension, bilateral torsion and bending was proposed, with the possible use of MRI and FE to acquire disc anatomy and determine the nonlinear relations between stiffness and morphology. Such a protocol may be useful to relate the disc's mechanical properties to its degree of degeneration.
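
As a sketch of what the linear normalization step amounts to, the snippet below rescales a measured stiffness by classical beam-type factors (H/CSA for compression, H/J for torsion, H/Ixx and H/Iyy for bending) and compares the coefficient of variation before and after. The data and the choice of these factors are illustrative assumptions; the paper's nonlinear normalization instead relies on FE-derived stiffness/morphology relations.

```python
# Illustrative linear normalization of disc stiffness by morphology, assuming
# beam-type relations (e.g. axial stiffness ~ E*CSA/H); data are hypothetical.
import numpy as np

def normalize_linear(stiffness, geometry, mode):
    """Return a geometry-independent, modulus-like quantity per specimen."""
    factor = {"compression":       geometry["H"] / geometry["CSA"],
              "torsion":           geometry["H"] / geometry["J"],
              "flexion_extension": geometry["H"] / geometry["Ixx"],
              "lateral_bending":   geometry["H"] / geometry["Iyy"]}[mode]
    return stiffness * factor

def coefficient_of_variation(values):
    return np.std(values, ddof=1) / np.mean(values)

# Hypothetical compressive stiffness (N/mm) and morphology of three lumbar discs.
k = np.array([1800.0, 2400.0, 1500.0])
geom = {"H":   np.array([10.0, 8.0, 12.0]),        # disc height, mm
        "CSA": np.array([1500.0, 1800.0, 1300.0]), # cross-sectional area, mm^2
        "J":   np.array([4.0e5, 5.5e5, 3.2e5]),    # polar moment of inertia, mm^4
        "Ixx": np.array([2.1e5, 2.9e5, 1.7e5]),
        "Iyy": np.array([1.9e5, 2.6e5, 1.5e5])}
k_norm = normalize_linear(k, geom, "compression")
print(coefficient_of_variation(k), coefficient_of_variation(k_norm))
```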

Relevance: 20.00%

Abstract:

In this paper, we propose a new method for stitching multiple fluoroscopic images taken by a C-arm instrument. We employ an X-ray radiolucent ruler with numbered graduations while acquiring the images, and the image stitching is based on detecting and matching ruler parts in the images to the corresponding parts of a virtual ruler. To achieve this goal, we first detect the regularly spaced graduations on the ruler and their numbers. After graduation labeling, for each image we have the location and the associated number of every graduation on the ruler. Then, we initialize the panoramic X-ray image with the virtual ruler and “paste” each image by aligning the detected ruler part in the original image to the corresponding part of the virtual ruler on the panoramic image. Our method is based on ruler matching but does not require matching feature points between pairs of images; thus, overlap between the images is not strictly necessary. We tested our method on eight different datasets of X-ray images, including long bones and a complete spine. Qualitative and quantitative experiments show that our method achieves good results.
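
A much-reduced, one-dimensional sketch of the pasting step is shown below: each detected graduation (pixel position plus its printed number) fixes where an image sits on a virtual ruler laid out at a known pixel spacing, and the image is copied into the panorama at that offset. Graduation and number detection are omitted, and all names, spacings and data are hypothetical.

```python
# Hypothetical 1-D sketch of ruler-based pasting into a panoramic image.
import numpy as np

PIXELS_PER_GRADUATION = 40   # assumed spacing of the virtual ruler on the panorama

def image_offset(graduations):
    """graduations: list of (pixel_x, ruler_number) pairs detected in one image.
    Returns the panorama x-offset aligning them to the virtual ruler
    (averaged over all detections, i.e. a pure-translation model)."""
    px = np.array([p for p, _ in graduations], dtype=float)
    target = np.array([n * PIXELS_PER_GRADUATION for _, n in graduations], dtype=float)
    return float(np.mean(target - px))

def stitch(images, detections, height, length):
    panorama = np.zeros((height, length))
    for img, grads in zip(images, detections):
        x0 = int(round(image_offset(grads)))
        panorama[:, x0:x0 + img.shape[1]] = img   # paste; later images drawn on top
    return panorama

# Two synthetic 100x300 "fluoroscopic" strips whose rulers share no detected marks.
img_a = np.full((100, 300), 0.3)
img_b = np.full((100, 300), 0.6)
pano = stitch([img_a, img_b],
              [[(20, 1), (60, 2)], [(10, 6), (50, 7)]],  # (pixel_x, graduation number)
              height=100, length=800)
```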

Relevance: 20.00%

Abstract:

Some consequences of using Atomic Units are treated here.
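
For reference, the standard (Hartree) atomic-unit convention and one immediate consequence can be written as follows.

```latex
% Hartree atomic units set
\hbar = m_e = e = \frac{1}{4\pi\varepsilon_0} = 1 ,
% so that the hydrogen-atom Schroedinger equation
-\frac{\hbar^{2}}{2m_e}\nabla^{2}\psi - \frac{e^{2}}{4\pi\varepsilon_0 r}\,\psi = E\psi
% reduces to
-\tfrac{1}{2}\nabla^{2}\psi - \frac{1}{r}\,\psi = E\psi ,
% with lengths in Bohr radii $a_0$ and energies in hartrees $E_h$.
```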

Relevance: 20.00%

Abstract:

The Frobenius solution to Legendre's equation is developed in detail, as is Rodrigues' formula, which is employed to normalize the Legendre polynomials.
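
For reference, Legendre's equation, Rodrigues' formula, and the normalization it yields are:

```latex
% Legendre's equation (bounded solutions on [-1, 1] for integer n are P_n):
(1 - x^{2})\,y'' - 2x\,y' + n(n + 1)\,y = 0
% Rodrigues' formula for the Legendre polynomials:
P_n(x) = \frac{1}{2^{n} n!}\,\frac{d^{n}}{dx^{n}}\bigl(x^{2} - 1\bigr)^{n}
% and the resulting orthogonality/normalization relation:
\int_{-1}^{1} P_n(x)\,P_m(x)\,dx = \frac{2}{2n + 1}\,\delta_{nm}
```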