205 results for QUANTITATIVE PROTEOMICS


Relevance:

20.00%

Publisher:

Abstract:

The epithelial amiloride-sensitive sodium channel (ENaC) controls transepithelial Na+ movement in Na(+)-transporting epithelia and is associated with Liddle syndrome, an autosomal dominant form of salt-sensitive hypertension. Detailed analysis of ENaC channel properties and the functional consequences of mutations causing Liddle syndrome has been, so far, limited by lack of a method allowing specific and quantitative detection of cell-surface-expressed ENaC. We have developed a quantitative assay based on the binding of 125I-labeled M2 anti-FLAG monoclonal antibody (M2Ab*) directed against a FLAG reporter epitope introduced in the extracellular loop of each of the alpha, beta, and gamma ENaC subunits. Insertion of the FLAG epitope into ENaC sequences did not change its functional and pharmacological properties. The binding specificity and affinity (Kd = 3 nM) allowed us to correlate in individual Xenopus oocytes the macroscopic amiloride-sensitive sodium current (INa) with the number of ENaC wild-type and mutant subunits expressed at the cell surface. These experiments demonstrate that: (i) only heteromultimeric channels made of alpha, beta, and gamma ENaC subunits are maximally and efficiently expressed at the cell surface; (ii) the overall ENaC open probability is one order of magnitude lower than previously observed in single-channel recordings; (iii) the mutation causing Liddle syndrome (beta R564stop) enhances channel activity by two mechanisms, i.e., by increasing ENaC cell surface expression and by changing channel open probability. This quantitative approach provides new insights on the molecular mechanisms underlying one form of salt-sensitive hypertension.
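The open-probability estimate in this abstract rests on the relation I_Na = N x P_o x i, where N is the number of surface channels counted by antibody binding and i is the single-channel current. Below is a minimal sketch of that arithmetic; the conductance, driving force, current and channel count are illustrative placeholders, not values from the study.

```python
# Minimal sketch of I_Na = N * Po * i used to infer whole-cell open probability
# from a macroscopic current and an antibody-based surface-channel count.
# All numerical values are illustrative placeholders, not data from the study.

def single_channel_current_pA(gamma_pS, driving_force_mV):
    """Single-channel current i = gamma * (V - E_rev); pS * mV = 1e-3 pA."""
    return gamma_pS * driving_force_mV * 1e-3

def open_probability(i_na_pA, n_channels, i_pA):
    """Po = macroscopic current / (number of surface channels * single-channel current)."""
    return i_na_pA / (n_channels * i_pA)

if __name__ == "__main__":
    i = single_channel_current_pA(gamma_pS=5.0, driving_force_mV=-100.0)   # about -0.5 pA
    # Hypothetical oocyte: -2 uA amiloride-sensitive current, 4e7 surface channels
    po = open_probability(i_na_pA=-2.0e6, n_channels=4e7, i_pA=i)
    print(f"single-channel current i = {i:.2f} pA, estimated Po = {po:.2f}")
```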

Relevance:

20.00%

Publisher:

Abstract:

This article describes the composition of fingermark residue as a complex system containing numerous compounds from different sources and evolving over time, from the initial composition (the composition immediately after deposition) to the aged composition (the initial composition as it evolves over time). This complex system additionally varies with numerous influence factors, grouped into five classes: donor characteristics, deposition conditions, substrate nature, environmental conditions and applied enhancement techniques. The initial and aged compositions, as well as the influence factors, are therefore considered in this article to provide a qualitative and quantitative review of all compounds identified in fingermark residue to date. The analytical techniques used to obtain these data are also enumerated. This review highlights the fact that, despite the numerous analytical processes already proposed and tested to elucidate fingermark composition, advanced knowledge is still lacking. There is thus a real need for future research on the composition of fingermark residue, focusing particularly on quantitative measurements, aging kinetics and the effects of influence factors. Such research is particularly important for advancing fingermark enhancement and dating techniques.

Relevance:

20.00%

Publisher:

Abstract:

With the availability of new-generation sequencing technologies, bacterial genome projects have received a major boost. Still, chromosome completion requires costly and time-consuming gap closure, especially when the genome contains highly repetitive elements. However, incomplete genome data may be sufficiently informative to derive the pursued information. For emerging pathogens, i.e. newly identified pathogens, withholding genome data during the gap-closure stage is clearly medically counterproductive. We therefore investigated the feasibility of a dirty genome approach, i.e. the release of unfinished genome sequences to develop serological diagnostic tools. We showed that almost the whole genome sequence of the emerging pathogen Parachlamydia acanthamoebae could be retrieved even with relatively short reads from Genome Sequencer 20 and Solexa. The bacterial proteome was analyzed to select immunogenic proteins, which were then expressed and used to develop the first steps of an ELISA. This work constitutes a proof of principle for the dirty genome approach, i.e. the use of unfinished genome sequences of pathogenic bacteria, coupled with proteomics, to rapidly identify new immunogenic proteins useful for developing specific diagnostic tests such as ELISA, immunohistochemistry and direct antigen detection. Although applied here to an emerging pathogen, this combined dirty-genome-sequencing/proteomics approach may be used for any pathogen for which better diagnostics are needed. The genome sequences may also be very useful for developing DNA-based diagnostic tests. All these diagnostic tools will allow further evaluation of the pathogenic potential of this obligate intracellular bacterium.

Relevance:

20.00%

Publisher:

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches beyond the local scale still represents a major challenge, yet it is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields. The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive, low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity and the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges.
In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems, in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigate two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach proves to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
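A gradual-deformation proposal of the kind described combines the current model realization with an independent fresh realization, m' = m cos(theta) + z sin(theta), which preserves a Gaussian prior while theta controls the perturbation strength. The sketch below illustrates this inside a simple Metropolis sampler for a toy linear inverse problem with standard-normal (white-noise) model parameters; the actual crosshole georadar forward model, geostatistical prior and tuning are simplified away and are assumptions of this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def gradual_deformation(m, theta, rng):
    """Propose m' = m*cos(theta) + z*sin(theta) with z a fresh prior draw.
    This preserves a standard-normal prior; theta sets the perturbation strength."""
    z = rng.standard_normal(m.shape)
    return np.cos(theta) * m + np.sin(theta) * z

def log_likelihood(m, d_obs, G, sigma):
    """Gaussian data misfit for an assumed linear forward operator G."""
    r = d_obs - G @ m
    return -0.5 * np.sum((r / sigma) ** 2)

# Toy linear inverse problem: 50 model parameters, 30 noisy data
n_m, n_d, sigma = 50, 30, 0.2
G = rng.standard_normal((n_d, n_m)) / np.sqrt(n_m)
m_true = rng.standard_normal(n_m)
d_obs = G @ m_true + sigma * rng.standard_normal(n_d)

m = rng.standard_normal(n_m)                 # start from a prior draw
ll = log_likelihood(m, d_obs, G, sigma)
theta, n_iter, accepted = 0.1, 5000, 0

for _ in range(n_iter):
    m_prop = gradual_deformation(m, theta, rng)
    ll_prop = log_likelihood(m_prop, d_obs, G, sigma)
    # For this prior-preserving move, prior and proposal densities cancel in the
    # Metropolis-Hastings ratio, so only the likelihood ratio remains.
    if np.log(rng.random()) < ll_prop - ll:
        m, ll = m_prop, ll_prop
        accepted += 1

print(f"acceptance rate: {accepted / n_iter:.2f}")
```

Smaller theta gives gentler perturbations and higher acceptance at the cost of slower exploration, which is exactly the flexibility in perturbation strength the abstract refers to.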

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: Prospective studies have shown that quantitative ultrasound (QUS) techniques predict the risk of fracture of the proximal femur with standardised risk ratios similar to those of dual-energy x-ray absorptiometry (DXA). Few studies have investigated these devices for the prediction of vertebral fractures. The Basel Osteoporosis Study (BOS) is a population-based prospective study to assess the performance of QUS devices and DXA in predicting incident vertebral fractures. METHODS: 432 women aged 60-80 years were followed up for 3 years. Incident vertebral fractures were assessed radiologically. Bone measurements using DXA (spine and hip) and QUS measurements (calcaneus and proximal phalanges) were performed. Measurements were assessed for their value in predicting incident vertebral fractures using logistic regression. RESULTS: QUS measurements at the calcaneus and DXA measurements discriminated between women with and without incident vertebral fracture (20% height reduction). The relative risks (RRs) for vertebral fracture, adjusted for age, were 2.3 for the Stiffness Index (SI) and 2.8 for the Quantitative Ultrasound Index (QUI) at the calcaneus, and 2.0 for bone mineral density at the lumbar spine. The predictive value (AUC) of QUS measurements at the calcaneus remained highly significant (0.70 for SI, 0.72 for the QUI, and 0.67 for DXA at the lumbar spine) even after adjustment for other confounding variables. CONCLUSIONS: QUS of the calcaneus and bone mineral density measurements were shown to be significant predictors of incident vertebral fracture. The RRs for QUS measurements at the calcaneus are of similar magnitude to those for DXA measurements.
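The analysis pattern described (age-adjusted logistic regression of incident fracture on a bone measurement, reported as risk per standard deviation together with an AUC) can be sketched as follows on simulated data; the variable names, effect sizes and cohort values are assumptions, not the BOS data.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

# Simulated cohort (illustrative only): 432 women aged 60-80, 3-year follow-up
n = 432
age = rng.uniform(60, 80, n)
si = rng.normal(85 - 0.5 * (age - 60), 12, n)     # calcaneal Stiffness Index (assumed scale)
si_z = (si - si.mean()) / si.std()

# Assumed true model: lower SI and higher age increase fracture risk
logit_p = -2.6 - 0.8 * si_z + 0.04 * (age - 70)
fracture = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Age-adjusted logistic regression of incident fracture on standardized SI
X = sm.add_constant(np.column_stack([si_z, age]))
fit = sm.Logit(fracture, X).fit(disp=0)

or_per_sd_decrease = np.exp(-fit.params[1])       # odds ratio per 1 SD decrease in SI
auc = roc_auc_score(fracture, fit.predict(X))
print(f"age-adjusted OR per SD decrease in SI: {or_per_sd_decrease:.2f}")
print(f"AUC: {auc:.2f}")
```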

Relevance:

20.00%

Publisher:

Abstract:

Despite the central role of quantitative PCR (qPCR) in the quantification of mRNA transcripts, most analyses of qPCR data are still delegated to the software that comes with the qPCR apparatus. This is especially true for the handling of the fluorescence baseline. This article shows that baseline estimation errors are directly reflected in the observed PCR efficiency values and are thus propagated exponentially in the estimated starting concentrations as well as 'fold-difference' results. Because of the unknown origin and kinetics of the baseline fluorescence, the fluorescence values monitored in the initial cycles of the PCR reaction cannot be used to estimate a useful baseline value. An algorithm that estimates the baseline by reconstructing the log-linear phase downward from the early plateau phase of the PCR reaction was developed and shown to lead to very reproducible PCR efficiency values. PCR efficiency values were determined per sample by fitting a regression line to a subset of data points in the log-linear phase. The variability, as well as the bias, in qPCR results was significantly reduced when the mean of these PCR efficiencies per amplicon was used in the calculation of an estimate of the starting concentration per sample.
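The central computation is the per-sample fit of a regression line to log-transformed fluorescence within the log-linear phase: the slope gives the cycle-to-cycle amplification efficiency, and back-extrapolation gives a value proportional to the starting concentration. Below is a minimal sketch on a simulated, baseline-subtracted curve; the window-selection rule used here (a fixed fluorescence band) is a simplification of the plateau-based baseline reconstruction the article describes.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated, baseline-subtracted amplification curve (illustrative values)
cycles = np.arange(1, 41)
true_eff, n0, plateau = 1.92, 5e-6, 1.0          # efficiency, start level, plateau
raw = n0 * true_eff ** cycles
signal = plateau * raw / (plateau + raw)          # saturating plateau behaviour
signal *= rng.normal(1.0, 0.01, signal.size)      # mild multiplicative noise

def efficiency_and_n0(cycles, signal, lower=0.01, upper=0.1):
    """Fit log10(signal) vs cycle over an assumed log-linear window
    (here: fluorescence between 1% and 10% of the observed maximum)."""
    win = (signal > lower * signal.max()) & (signal < upper * signal.max())
    slope, intercept = np.polyfit(cycles[win], np.log10(signal[win]), 1)
    eff = 10 ** slope                  # amplification factor per cycle (between 1 and 2)
    n0_est = 10 ** intercept           # back-extrapolated signal at cycle 0
    return eff, n0_est

eff, n0_est = efficiency_and_n0(cycles, signal)
print(f"estimated efficiency: {eff:.2f} (true {true_eff})")
print(f"estimated starting level: {n0_est:.2e} (true {n0:.2e})")
```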

Relevance:

20.00%

Publisher:

Abstract:

Proteomics has changed the way proteins are analyzed in living systems. This approach has been applied to blood products, and protein profiling has evolved in parallel with the development of techniques. The identification of proteins belonging to red blood cells, platelets or plasma was achieved at the end of the last century. Questions about possible applications then emerged. Hence, several studies have focused on problems related to blood banking and blood products, such as the aging of blood products, the identification of biomarkers, related diseases and protein-protein interactions. More recently, a mass spectrometry-based proteomics approach to quality control has been applied in order to offer solutions and improve the quality of blood products. The current challenge is to develop a closer relationship between transfusion medicine and proteomics. In this article, these issues will be approached by focusing first on the proteome identification of blood products and then on the applications and future developments within the field of proteomics and blood products.

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: The optimal coronary MR angiography sequence has yet to be determined. We sought to quantitatively and qualitatively compare four coronary MR angiography sequences. SUBJECTS AND METHODS: Free-breathing coronary MR angiography was performed in 12 patients using four imaging sequences (turbo field-echo, fast spin-echo, balanced fast field-echo, and spiral turbo field-echo). Quantitative comparisons, including signal-to-noise ratio, contrast-to-noise ratio, vessel diameter, and vessel sharpness, were performed using a semiautomated analysis tool. Accuracy for detection of hemodynamically significant disease (> 50%) was assessed in comparison with radiographic coronary angiography. RESULTS: Signal-to-noise and contrast-to-noise ratios were markedly increased using the spiral (25.7 +/- 5.7 and 15.2 +/- 3.9) and balanced fast field-echo (23.5 +/- 11.7 and 14.4 +/- 8.1) sequences compared with the turbo field-echo (12.5 +/- 2.7 and 8.3 +/- 2.6) sequence (p < 0.05). Vessel diameter was smaller with the spiral sequence (2.6 +/- 0.5 mm) than with the other techniques (turbo field-echo, 3.0 +/- 0.5 mm, p = 0.6; balanced fast field-echo, 3.1 +/- 0.5 mm, p < 0.01; fast spin-echo, 3.1 +/- 0.5 mm, p < 0.01). Vessel sharpness was highest with the balanced fast field-echo sequence (61.6% +/- 8.5% compared with turbo field-echo, 44.0% +/- 6.6%; spiral, 44.7% +/- 6.5%; fast spin-echo, 18.4% +/- 6.7%; p < 0.001). The overall accuracies of the sequences were similar (range, 74% for turbo field-echo, 79% for spiral). Scanning time for the fast spin-echo sequences was longest (10.5 +/- 0.6 min), and for the spiral acquisitions was shortest (5.2 +/- 0.3 min). CONCLUSION: Advantages in signal-to-noise and contrast-to-noise ratios, vessel sharpness, and the qualitative results appear to favor spiral and balanced fast field-echo coronary MR angiography sequences, although subjective accuracy for the detection of coronary artery disease was similar to that of other sequences.
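The quantitative indexes compared here are simple region-of-interest statistics. A minimal sketch assuming the common definitions SNR = mean(blood ROI) / SD(noise ROI) and CNR = (mean(blood) - mean(myocardium)) / SD(noise); the ROI values are simulated placeholders, and per-patient values for each sequence would then be compared with paired tests as in the study.

```python
import numpy as np

rng = np.random.default_rng(3)

def snr_cnr(blood_roi, myocardium_roi, noise_roi):
    """SNR and CNR from region-of-interest pixel values (common definitions, assumed here)."""
    noise_sd = noise_roi.std(ddof=1)
    snr = blood_roi.mean() / noise_sd
    cnr = (blood_roi.mean() - myocardium_roi.mean()) / noise_sd
    return snr, cnr

# Simulated ROIs for one hypothetical acquisition (arbitrary signal units)
blood = rng.normal(500, 30, 200)        # pixels in the coronary / blood-pool ROI
myocardium = rng.normal(220, 30, 200)   # adjacent myocardium ROI
noise = rng.normal(0, 20, 400)          # background (air) ROI

snr, cnr = snr_cnr(blood, myocardium, noise)
print(f"SNR = {snr:.1f}, CNR = {cnr:.1f}")
```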

Relevance:

20.00%

Publisher:

Abstract:

Inter-individual differences in gene expression are likely to account for an important fraction of phenotypic differences, including susceptibility to common disorders. Recent studies have shown extensive variation in gene expression levels in humans and other organisms, and that a fraction of this variation is under genetic control. We investigated the patterns of gene expression variation in a 25 Mb region of human chromosome 21, which has been associated with many Down syndrome (DS) phenotypes. TaqMan real-time PCR was used to measure expression variation of 41 genes in lymphoblastoid cells of 40 unrelated individuals. For 25 genes found to be differentially expressed, additional analysis was performed in 10 CEPH families to determine heritabilities and map loci harboring regulatory variation. Seventy-six percent of the differentially expressed genes had significant heritabilities, and genome-wide linkage analysis led to the identification of significant eQTLs for nine genes. Most eQTLs were in trans, with the best result (P = 7.46 x 10^-8) obtained for TMEM1 on chromosome 12q24.33. A cis-eQTL identified for CCT8 was validated by performing an association study in 60 individuals from the HapMap project. SNP rs965951, located within CCT8, was found to be significantly associated with its expression levels (P = 2.5 x 10^-5), confirming cis-regulatory variation. The results of our study provide a representative view of expression variation of chromosome 21 genes, identify loci involved in their regulation and suggest that genes for which expression differences are significantly larger than 1.5-fold in control samples are unlikely to be involved in DS phenotypes present in all affected individuals.
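The cis-eQTL validation amounts to testing whether CCT8 expression differs by rs965951 genotype. Below is a minimal sketch of such an association test on simulated values (an additive-model regression plus the equivalent one-way ANOVA); the genotype frequencies and effect size are assumptions, not the HapMap data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Simulated data for 60 individuals (illustrative only)
n = 60
genotype = rng.binomial(2, 0.4, n)                          # rs965951 coded as 0/1/2 minor alleles
expression = 10 + 0.6 * genotype + rng.normal(0, 0.8, n)    # CCT8 expression (arbitrary units)

# Additive-model association test: linear regression of expression on allele count
res = stats.linregress(genotype, expression)
print(f"effect per allele: {res.slope:.2f}, p = {res.pvalue:.2e}")

# Equivalent one-way ANOVA across the three genotype groups
groups = [expression[genotype == g] for g in (0, 1, 2)]
f, p = stats.f_oneway(*groups)
print(f"ANOVA: F = {f:.2f}, p = {p:.2e}")
```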

Relevance:

20.00%

Publisher:

Abstract:

Natural selection is typically exerted at some specific life stages. If natural selection takes place before a trait can be measured, using conventional models can cause wrong inference about population parameters. When the missing data process relates to the trait of interest, a valid inference requires explicit modeling of the missing process. We propose a joint modeling approach, a shared parameter model, to account for nonrandom missing data. It consists of an animal model for the phenotypic data and a logistic model for the missing process, linked by the additive genetic effects. A Bayesian approach is taken and inference is made using integrated nested Laplace approximations. From a simulation study we find that wrongly assuming that missing data are missing at random can result in severely biased estimates of additive genetic variance. Using real data from a wild population of Swiss barn owls Tyto alba, our model indicates that the missing individuals would display large black spots; and we conclude that genes affecting this trait are already under selection before it is expressed. Our model is a tool to correctly estimate the magnitude of both natural selection and additive genetic variance.
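The key point, that missingness driven by the genetic component of the trait biases naive variance estimates, can be illustrated with a toy simulation (this is not the authors' animal-model/INLA analysis): breeding values and phenotypes are simulated, records are dropped with a probability that depends on the breeding value through a logistic model, and the genetic variance among the surviving records is compared with the truth.

```python
import numpy as np

rng = np.random.default_rng(5)

n = 20000
sigma_a2, sigma_e2 = 1.0, 1.0                  # true additive-genetic and residual variances
a = rng.normal(0, np.sqrt(sigma_a2), n)        # breeding values
y = a + rng.normal(0, np.sqrt(sigma_e2), n)    # phenotypes

# Nonrandom missingness: individuals with low breeding values are more likely
# to be missing before the trait is expressed -- a logistic "survival" model.
p_observed = 1 / (1 + np.exp(-(0.5 + 1.5 * a)))
observed = rng.random(n) < p_observed

print(f"fraction observed: {observed.mean():.2f}")
print(f"variance of breeding values, all individuals:  {a.var():.2f}")
print(f"variance of breeding values, observed only:    {a[observed].var():.2f}")
# The observed subset is selected, so its genetic variance is shrunk; a model that
# treats the data as missing at random inherits this downward bias.
```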

Relevance:

20.00%

Publisher:

Abstract:

Neuropathic pain is defined as pain caused by a lesion of the somatosensory nervous system. It is characterized by exaggerated, spontaneous pain, or by pain triggered by normally non-painful (allodynia) or painful (hyperalgesia) stimuli. Although it affects 7% of the population, its biological mechanisms have not yet been elucidated. Studying variations in gene expression in the key tissues of the sensory pathways (notably the dorsal root ganglion and the dorsal horn of the spinal cord) at different time points after a peripheral nerve lesion would make it possible to identify new therapeutic targets. Such variations can be detected with high sensitivity by reverse transcription quantitative real-time polymerase chain reaction (RT-qPCR). To guarantee reliable results, guidelines have recently recommended validating the reference genes used for data normalization ("Minimum information for publication of quantitative real-time PCR experiments", Bustin et al. 2009). After searching the literature for reference genes frequently used in our spared nerve injury (SNI) model of peripheral neuropathic pain and in nervous tissue in general, we established a list of promising candidates: Actin beta (Actb), glyceraldehyde-3-phosphate dehydrogenase (GAPDH), ribosomal proteins 18S (18S), L13a (RPL13a) and L29 (RPL29), hypoxanthine phosphoribosyltransferase 1 (HPRT1) and hydroxymethylbilane synthase (HMBS). We evaluated the expression stability of these genes in the dorsal root ganglion and in the dorsal horn at different time points after the nerve lesion (SNI) by calculating coefficients of variation and using the geNorm algorithm, which compares expression levels between the candidates and determines the most stable remaining pair of genes. It was also possible to rank the genes according to their stability and to identify the number of genes required for the most accurate normalization. The genes most frequently cited as references in the SNI model were GAPDH, HMBS, Actb, HPRT1 and 18S. Only HPRT1 and 18S had previously been validated in RT-qPCR arrays. In our study, all genes tested in the dorsal root ganglion and in the dorsal horn met the stability criterion of an M-value below 1. However, with a coefficient of variation (CV) above 50% in the dorsal root ganglion, 18S cannot be retained. The most stable pair of genes was HPRT1 and Actb in the dorsal root ganglion, and RPL29 and RPL13a in the dorsal horn. Using two stable reference genes is sufficient for reliable normalization. We therefore ranked and validated Actb, RPL29, RPL13a, HMBS, GAPDH, HPRT1 and 18S as reference genes usable in the dorsal horn for the SNI model in the rat. In the dorsal root ganglion, 18S did not meet our criteria. We also determined that a combination of two stable reference genes is sufficient for accurate normalization. Expression changes of potential genes of interest under identical experimental conditions (SNI, tissue and time points post SNI) can now be measured on the basis of reliable normalization. This will make it possible not only to identify regulations potentially important in the genesis of neuropathic pain, but also to observe the different phenotypes evolving over time after the nerve lesion.
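geNorm's stability measure, used above to rank the candidate reference genes, is easy to state: for genes j and k, the pairwise variation is the standard deviation across samples of their log2 expression ratio, and M_j is the average of these pairwise variations over all other candidates, with lower M meaning a more stable gene. Below is a minimal sketch on simulated sample-by-gene expression values; the data are invented, only the M-value computation follows geNorm.

```python
import numpy as np

def genorm_m_values(expr, gene_names):
    """geNorm stability measure M for each candidate reference gene.

    expr: samples x genes matrix of relative expression quantities.
    For genes j and k, V_jk = SD over samples of log2(expr[:, j] / expr[:, k]);
    M_j is the mean of V_jk over all k != j. Lower M = more stable.
    """
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    m_values = {}
    for j in range(n_genes):
        pairwise_sd = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
                       for k in range(n_genes) if k != j]
        m_values[gene_names[j]] = float(np.mean(pairwise_sd))
    return m_values

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    genes = ["Actb", "GAPDH", "18S", "RPL13a", "RPL29", "HPRT1", "HMBS"]
    # Simulated relative quantities for 12 samples; 18S made deliberately noisy.
    expr = rng.lognormal(mean=0, sigma=0.15, size=(12, len(genes)))
    expr[:, genes.index("18S")] *= rng.lognormal(0, 0.8, 12)
    for gene, m in sorted(genorm_m_values(expr, genes).items(), key=lambda x: x[1]):
        print(f"{gene:8s} M = {m:.2f}")
```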

Relevance:

20.00%

Publisher:

Abstract:

Introduction: In the mid-1990s, the discovery of endogenous ligands for cannabinoid receptors opened a new era in this research field. Amides and esters of arachidonic acid have been identified as these endogenous ligands. Arachidonoylethanolamide (anandamide, AEA) and 2-arachidonoylglycerol (2-AG) appear to be the most important of these lipid messengers. In addition, virodhamine (VA), noladin ether (2-AGE), and N-arachidonoyl dopamine (NADA) have been shown to bind to CB receptors with varying affinities. In recent years, it has become increasingly evident that the endocannabinoid (EC) system is part of fundamental regulatory mechanisms in many physiological processes, such as stress and anxiety responses, depression, anorexia and bulimia, schizophrenia disorders, neuroprotection, Parkinson disease, anti-proliferative effects on cancer cells, drug addiction, and atherosclerosis. Aims: This work presents the challenges of EC analysis and the contribution of information-dependent acquisition based on a hybrid triple quadrupole linear ion trap (QqQLIT) system to the profiling of these lipid mediators. Methods: The method was developed on an LC Ultimate 3000 series (Dionex, Sunnyvale, CA, USA) coupled to a QTrap 4000 system (Applied Biosystems, Concord, ON, Canada). The ECs were separated on an XTerra C18 MS column (50 × 3.0 mm i.d., 3.5 μm) with a 5 min gradient elution. For confirmatory analysis, an information-dependent acquisition experiment was performed with selected reaction monitoring (SRM) as the survey scan and enhanced product ion (EPI) as the dependent scan. Results: The assay was found to be linear in the concentration ranges of 0.1-5 ng/mL for AEA, 0.3-5 ng/mL for VA, 2-AGE and NADA, and 1-20 ng/mL for 2-AG using 0.5 mL of plasma. Repeatability and intermediate precision were below 15% over the tested concentration ranges. Under non-pathophysiological conditions, only AEA and 2-AG were actually detected in plasma, with concentrations ranging from 104 to 537 pg/mL and from 2160 to 3990 pg/mL, respectively. We focused in particular on the evaluation of changes in EC levels in biological matrices during drug addiction and atherosclerosis processes. We will present preliminary data obtained during a pilot study after administration of cannabis to human patients. Conclusion: ECs have been shown to play a key role in the regulation of many pathophysiological processes. Medical research in these different fields continues to grow in order to understand and highlight the predominant role of ECs in CNS and peripheral tissue signalling. The profiling of these lipids requires rapid, highly sensitive and selective analytical methods.
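Quantification in an assay like this rests on a calibration curve: the peak-area ratio of analyte to internal standard is regressed against spiked concentration, and unknowns are back-calculated from the fit. Below is a minimal sketch using made-up calibrators spanning the quoted AEA range (0.1-5 ng/mL); real assays typically use weighted regression and matrix-matched, internal-standard-normalized calibrators.

```python
import numpy as np

# Hypothetical AEA calibrators (ng/mL) and measured peak-area ratios (analyte / internal standard)
conc = np.array([0.1, 0.25, 0.5, 1.0, 2.5, 5.0])
area_ratio = np.array([0.021, 0.049, 0.098, 0.205, 0.497, 1.012])

# Ordinary least-squares calibration line: ratio = slope * conc + intercept
slope, intercept = np.polyfit(conc, area_ratio, 1)

def back_calculate(ratio):
    """Concentration (ng/mL) of an unknown from its measured area ratio."""
    return (ratio - intercept) / slope

unknown_ratio = 0.062                 # e.g., a plasma sample
print(f"slope = {slope:.4f}, intercept = {intercept:.4f}")
print(f"estimated AEA concentration: {back_calculate(unknown_ratio):.3f} ng/mL")

# Back-calculate the calibrators themselves to check accuracy/bias (%)
recalc = back_calculate(area_ratio)
bias_pct = 100 * (recalc - conc) / conc
print("calibrator bias (%):", np.round(bias_pct, 1))
```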

Relevance:

20.00%

Publisher:

Abstract:

Boundaries for delta, representing a "quantitatively significant" or "substantively impressive" distinction, have not been established, analogous to the boundary of alpha, usually set at 0.05, for the stochastic or probabilistic component of "statistical significance". To determine what boundaries are being used for the "quantitative" decisions, we reviewed pertinent articles in three general medical journals. For each contrast of two means, contrast of two rates, or correlation coefficient, we noted the investigators' decisions about stochastic significance, stated in P values or confidence intervals, and about quantitative significance, indicated by interpretive comments. The boundaries between impressive and unimpressive distinctions were best formed by a ratio of greater than or equal to 1.2 for the smaller to the larger mean in 546 comparisons, by a standardized increment of greater than or equal to 0.28 and odds ratio of greater than or equal to 2.2 in 392 comparisons of two rates; and by an r value of greater than or equal to 0.32 in 154 correlation coefficients. Additional boundaries were also identified for "substantially" and "highly" significant quantitative distinctions. Although the proposed boundaries should be kept flexible, indexes and boundaries for decisions about "quantitative significance" are particularly useful when a value of delta must be chosen for calculating sample size before the research is done, and when the "statistical significance" of completed research is appraised for its quantitative as well as stochastic components.
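The reported boundaries can be expressed as simple screening rules. A minimal sketch follows; the standardized-increment formula and the reading of the mean-ratio boundary as larger-to-smaller mean are assumptions about conventions not spelled out in the abstract.

```python
import numpy as np

# Boundaries quoted in the abstract
RATIO_MEANS = 1.2       # contrast of two means (read here as larger/smaller mean)
STD_INCREMENT = 0.28    # standardized increment for two rates
ODDS_RATIO = 2.2        # odds ratio for two rates
CORRELATION = 0.32      # correlation coefficient

def means_impressive(m1, m2):
    """Contrast of two means: ratio boundary."""
    return max(m1, m2) / min(m1, m2) >= RATIO_MEANS

def rates_impressive(p1, p2):
    """Contrast of two rates: standardized increment or odds ratio boundary.
    The standardized increment is taken as |p1 - p2| / sqrt(p*(1-p)) with p the
    average rate -- one common convention, assumed here."""
    p = (p1 + p2) / 2
    std_inc = abs(p1 - p2) / np.sqrt(p * (1 - p))
    odds_ratio = (p1 / (1 - p1)) / (p2 / (1 - p2))
    return std_inc >= STD_INCREMENT or max(odds_ratio, 1 / odds_ratio) >= ODDS_RATIO

def correlation_impressive(r):
    return abs(r) >= CORRELATION

print(means_impressive(13.0, 10.0))     # ratio 1.3        -> True
print(rates_impressive(0.30, 0.15))     # odds ratio ~2.4  -> True
print(correlation_impressive(0.25))     # |r| below 0.32   -> False
```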