31 results for Quantitative sensory test

at Université de Lausanne, Switzerland


Relevance:

100.00%

Publisher:

Abstract:

INTRODUCTION: Quantitative sensory testing (QST) is widely used in human research to investigate the integrity of sensory function in patients with pain of neuropathic origin or of other causes, such as low back pain. The reliability of QST has been evaluated on both sides of the face, hands, and feet, as well as on the trunk (Th3-L3). To apply these tests to other body parts, such as the lower lumbar spine, reliability must first be established in healthy individuals. The aim of this study was to investigate the intra-rater reliability of thermal QST in healthy adults at two sites within the L5 dermatome, on the lumbar spine and the lower extremity. METHODS: Test-retest reliability of thermal QST was determined at the L5 level of the lumbar spine and in the same dermatome on the lower extremity in 30 healthy persons under 40 years of age. Results were analyzed using descriptive statistics and the intraclass correlation coefficient (ICC). Values were compared to normative data using a Z-transformation. RESULTS: Mean intraindividual differences were small for cold and warm detection thresholds but larger for pain thresholds. ICC values showed excellent reliability for warm detection and heat pain thresholds, good-to-excellent reliability for the cold pain threshold, and fair-to-excellent reliability for the cold detection threshold. The 95% confidence intervals of the ICC values were wide. CONCLUSION: In healthy adults, thermal QST on the lumbar spine and lower extremity demonstrated fair-to-excellent test-retest reliability.
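
As a rough illustration of the kind of analysis described in this abstract, the sketch below computes a two-way random-effects intraclass correlation coefficient, ICC(2,1), from a small test-retest matrix and z-transforms thresholds against normative data. All numbers and the normative mean/SD are invented, and the specific ICC form is an assumption; the abstract does not state which one was used.

```python
# Minimal sketch: test-retest ICC(2,1) and z-scores vs. normative data.
# All values are illustrative; the ICC form (2,1) is an assumption.
import numpy as np

def icc_2_1(data):
    """Two-way random-effects, absolute agreement, single measurement.
    data: (n_subjects, k_sessions) array of thresholds."""
    n, k = data.shape
    grand = data.mean()
    ms_r = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
    ms_c = n * ((data.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # sessions
    sse = ((data - data.mean(axis=1, keepdims=True)
                 - data.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_e = sse / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Warm detection thresholds (degC) for 5 subjects, 2 sessions (made up).
wdt = np.array([[34.1, 34.4], [35.0, 34.8], [33.9, 34.2], [36.1, 35.7], [34.6, 34.9]])
print(f"ICC(2,1) = {icc_2_1(wdt):.2f}")

# Z-transformation against hypothetical normative mean and SD.
norm_mean, norm_sd = 34.5, 1.2
print("z-scores:", np.round((wdt[:, 0] - norm_mean) / norm_sd, 2))
```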

Relevance:

80.00%

Publisher:

Abstract:

Chronic exposure to food of low quality may exert conflicting selection pressures on foraging behaviour. On the one hand, more active search behaviour may allow the animal to find patches with slightly better, or more, food; on the other hand, such active foraging is energetically costly, and thus may be opposed by selection for energetic efficiency. Here, we test these alternative hypotheses in Drosophila larvae. We show that populations which experimentally evolved improved tolerance to larval chronic malnutrition have shorter foraging path length than unselected control populations. A behavioural polymorphism in foraging path length (the rover-sitter polymorphism) exists in nature and is attributed to the foraging locus (for). We show that a sitter strain (for(s2)) survives better on the poor food than the rover strain (for(R)), confirming that the sitter foraging strategy is advantageous under malnutrition. Larvae of the selected and control populations did not differ in global for expression. However, a quantitative complementation test suggests that the for locus may have contributed to the adaptation to poor food in one of the selected populations, either through a change in for allele frequencies, or by interacting epistatically with alleles at other loci. Irrespective of its genetic basis, our results provide two independent lines of evidence that sitter-like foraging behaviour is favoured under chronic larval malnutrition.

Relevance:

40.00%

Publisher:

Abstract:

The evolution of altruism is a fundamental and enduring puzzle in biology. In a seminal paper Hamilton showed that altruism can be selected for when rb - c > 0, where c is the fitness cost to the altruist, b is the fitness benefit to the beneficiary, and r is their genetic relatedness. While many studies have provided qualitative support for Hamilton's rule, quantitative tests have not yet been possible due to the difficulty of quantifying the costs and benefits of helping acts. Here we use a simulated system of foraging robots to experimentally manipulate the costs and benefits of helping and determine the conditions under which altruism evolves. By conducting experimental evolution over hundreds of generations of selection in populations with different c/b ratios, we show that Hamilton's rule always accurately predicts the minimum relatedness necessary for altruism to evolve. This high accuracy is remarkable given the presence of pleiotropic and epistatic effects as well as mutations with strong effects on behavior and fitness (effects not directly taken into account in Hamilton's original 1964 rule). In addition to providing the first quantitative test of Hamilton's rule in a system with a complex mapping between genotype and phenotype, these experiments demonstrate the wide applicability of kin selection theory.
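
A minimal numerical reading of Hamilton's rule as stated in the abstract: altruism is favoured when rb - c > 0, so the minimum relatedness for a given cost/benefit ratio is simply c/b. The sketch below just tabulates this threshold for a few hypothetical c/b ratios; it is not the robot simulation itself, and the ratios are arbitrary.

```python
# Hamilton's rule: altruism favoured when r*b - c > 0, i.e. r > c/b.
# The c/b ratios below are arbitrary illustrations, not the study's values.
def altruism_favoured(r, b, c):
    return r * b - c > 0

for c_over_b in (0.1, 0.25, 0.5, 0.75):
    b, c = 1.0, c_over_b          # scale benefit to 1 without loss of generality
    r_min = c / b                 # minimum relatedness predicted by Hamilton's rule
    print(f"c/b = {c_over_b:.2f}: altruism requires r > {r_min:.2f}; "
          f"e.g. r = 0.5 favoured? {altruism_favoured(0.5, b, c)}")
```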

Relevance:

40.00%

Publisher:

Abstract:

Several methods and algorithms have recently been proposed that allow for the systematic evaluation of simple neuron models from intracellular or extracellular recordings. Models built in this way generate good quantitative predictions of the future activity of neurons under temporally structured current injection. It is, however, difficult to compare the advantages of various models and algorithms, since each model is designed for a different set of data. Here, we report on one of the first attempts to establish a benchmark test that permits a systematic comparison of methods and of their performance in predicting the activity of rat cortical pyramidal neurons. We present early submissions to the benchmark test and discuss implications for the design of future tests and of simple neuron models.
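
The abstract does not specify how predicted and recorded spike trains are scored. As a purely hypothetical illustration of one way such a benchmark comparison might be done, the sketch below counts how many recorded spikes have a predicted spike within a ±2 ms window. The metric, the window, and the spike times are all assumptions, not the benchmark's actual scoring rule.

```python
# Hypothetical spike-train comparison: fraction of recorded spikes that have a
# predicted spike within +/- 2 ms. Metric, window, and data are assumptions.
import numpy as np

def coincidence_fraction(recorded_ms, predicted_ms, window_ms=2.0):
    recorded = np.asarray(recorded_ms, dtype=float)
    predicted = np.asarray(predicted_ms, dtype=float)
    if recorded.size == 0 or predicted.size == 0:
        return 0.0
    # Distance from every recorded spike to its nearest predicted spike.
    nearest = np.abs(recorded[:, None] - predicted[None, :]).min(axis=1)
    return float(np.mean(nearest <= window_ms))

recorded = [12.0, 45.3, 78.9, 130.2]           # made-up spike times (ms)
predicted = [11.5, 44.0, 80.5, 128.9, 160.0]
print(f"fraction of recorded spikes matched: {coincidence_fraction(recorded, predicted):.2f}")
```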

Relevance:

30.00%

Publisher:

Abstract:

Molecular monitoring of BCR/ABL transcripts by real-time quantitative reverse transcription PCR (qRT-PCR) is an essential technique for the clinical management of patients with BCR/ABL-positive CML and ALL. Although quantitative BCR/ABL assays are performed in hundreds of laboratories worldwide, results among these laboratories cannot be reliably compared due to heterogeneity in test methods, data analysis, and reporting, and to the lack of quantitative standards. Recent efforts towards standardization have been limited in scope. Aliquots of RNA were sent to clinical test centers worldwide in order to evaluate methods and reporting for e1a2, b2a2, and b3a2 transcript levels using their own qRT-PCR assays. Total RNA was isolated from tissue culture cells that expressed each of the different BCR/ABL transcripts. Serial log dilutions, ranging from 10^0 to 10^-5, were prepared in RNA isolated from HL60 cells. Laboratories performed 5 independent qRT-PCR reactions for each sample type at each dilution. In addition, 15 qRT-PCR reactions of the 10^-3 b3a2 RNA dilution were run to assess reproducibility within and between laboratories. Participants were asked to run the samples following their standard protocols and to report cycle threshold (Ct) values, quantitative values for BCR/ABL and housekeeping genes, and ratios of BCR/ABL to housekeeping genes for each sample RNA. Thirty-seven (n=37) participants submitted qRT-PCR results for analysis (36, 37, and 34 labs generated data for b2a2, b3a2, and e1a2, respectively). The limit of detection for this study was defined as the lowest dilution at which a Ct value could be detected for all 5 replicates. For b2a2, 15, 16, 4, and 1 lab(s) showed a limit of detection at the 10^-5, 10^-4, 10^-3, and 10^-2 dilutions, respectively. For b3a2, 20, 13, and 4 labs showed a limit of detection at the 10^-5, 10^-4, and 10^-3 dilutions, respectively. For e1a2, 10, 21, 2, and 1 lab(s) showed a limit of detection at the 10^-5, 10^-4, 10^-3, and 10^-2 dilutions, respectively. Log %BCR/ABL ratio values provided a method for comparing results between the different laboratories for each BCR/ABL dilution series. Linear regression analysis revealed concordance among the majority of participant data over the 10^-1 to 10^-4 dilutions. The overall slope values showed comparable results among the majority of b2a2 (mean=0.939; median=0.9627; range 0.399-1.1872), b3a2 (mean=0.925; median=0.922; range 0.625-1.140), and e1a2 (mean=0.897; median=0.909; range 0.5174-1.138) laboratory results (Figs. 1-3). Thirty-four (n=34) of the 37 laboratories reported Ct values for all 15 replicates, and only those with a complete data set were included in the inter-lab calculations. Eleven laboratories either did not report their copy number data or used other reporting units such as nanograms or cell numbers; therefore, only 26 laboratories were included in the overall analysis of copy numbers. The median copy number was 348.4, with a range from 15.6 to 547,000 copies (approximately a 4.5 log difference); the median intra-lab %CV was 19.2%, with a range from 4.2% to 82.6%. While our international performance evaluation using serially diluted RNA samples has reinforced the fact that heterogeneity exists among clinical laboratories, it has also demonstrated that performance within a laboratory is overall very consistent. Accordingly, the availability of defined BCR/ABL RNAs may facilitate the validation of all phases of quantitative BCR/ABL analysis and may be extremely useful as a tool for monitoring assay performance.
Ongoing analyses of these materials, along with the development of additional control materials, may solidify consensus around their application in routine laboratory testing and possible integration in worldwide efforts to standardize quantitative BCR/ABL testing.
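
As a sketch of the kind of per-laboratory analysis described above (log %BCR/ABL ratio versus log dilution, regression slope, and intra-lab %CV of replicate measurements), the snippet below works on invented numbers; real submissions would use each laboratory's reported values.

```python
# Sketch of the dilution-series analysis described above, on invented data.
import numpy as np

# Hypothetical %BCR/ABL ratios for the 10^-1 .. 10^-4 dilutions of one lab.
log_dilution = np.array([-1.0, -2.0, -3.0, -4.0])
pct_ratio = np.array([8.5, 0.92, 0.11, 0.009])     # BCR/ABL / housekeeping * 100

# Linearity: slope of log10(%ratio) vs. log10(dilution); ideal slope = 1.
slope, intercept = np.polyfit(log_dilution, np.log10(pct_ratio), 1)
print(f"regression slope = {slope:.3f} (ideal: 1.0)")

# Intra-lab reproducibility: %CV of replicate copy-number estimates at the
# 10^-3 dilution (15 replicates, as in the study design; numbers invented).
copies = np.array([350, 380, 310, 365, 342, 298, 401, 355, 372, 330,
                   318, 389, 344, 361, 327], dtype=float)
cv = 100 * copies.std(ddof=1) / copies.mean()
print(f"intra-lab %CV = {cv:.1f}%")
```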

Relevance:

30.00%

Publisher:

Abstract:

Unraveling the effect of selection vs. drift on the evolution of quantitative traits is commonly achieved by one of two methods. Either one contrasts population differentiation estimates for genetic markers and quantitative traits (the Q(st)-F(st) contrast) or multivariate methods are used to study the covariance between sets of traits. In particular, many studies have focused on the genetic variance-covariance matrix (the G matrix). However, both drift and selection can cause changes in G. To understand their joint effects, we recently combined the two methods into a single test (accompanying article by Martin et al.), which we apply here to a network of 16 natural populations of the freshwater snail Galba truncatula. Using this new neutrality test, extended to hierarchical population structures, we studied the multivariate equivalent of the Q(st)-F(st) contrast for several life-history traits of G. truncatula. We found strong evidence of selection acting on multivariate phenotypes. Selection was homogeneous among populations within each habitat and heterogeneous between habitats. We found that the G matrices were relatively stable within each habitat, with proportionality between the among-populations (D) and the within-populations (G) covariance matrices. The effect of habitat heterogeneity is to break this proportionality because of selection for habitat-dependent optima. Individual-based simulations mimicking our empirical system confirmed that these patterns are expected under the selective regime inferred. We show that homogenizing selection can mimic some effect of drift on the G matrix (G and D almost proportional), but that incorporating information from molecular markers (multivariate Q(st)-F(st)) allows disentangling the two effects.
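
For readers unfamiliar with the Q(st)-F(st) contrast mentioned above, a minimal sketch: Q(st) is commonly estimated as the among-population additive variance over the total, var_between / (var_between + 2*var_within), and compared with F(st) from neutral markers. The variance components and F(st) value below are invented for illustration only.

```python
# Minimal Qst-Fst contrast on invented variance components.
# Qst = var_between / (var_between + 2 * var_within_additive)
def qst(var_between, var_within_additive):
    return var_between / (var_between + 2.0 * var_within_additive)

fst_markers = 0.12                      # hypothetical neutral-marker Fst
q = qst(var_between=0.8, var_within_additive=1.5)
print(f"Qst = {q:.2f} vs Fst = {fst_markers:.2f}")
print("Qst > Fst suggests divergent selection; Qst < Fst suggests homogenizing "
      "selection; Qst ~ Fst is consistent with drift.")
```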

Relevance:

30.00%

Publisher:

Abstract:

Quantitative ultrasound (QUS) appears to be developing into an acceptable, low-cost and readily accessible alternative to dual X-ray absorptiometry (DXA) measurement of bone mineral density (BMD) in the detection and management of osteoporosis. Perhaps the major difficulty with its widespread use is that many different QUS devices exist that differ substantially from each other in the parameters they measure and in the strength of the empirical evidence supporting their use. Another problem is that virtually no data exist outside Caucasian or Asian populations. In general, heel QUS appears to be the most tested and most effective. Some, but not all, heel QUS devices are effective in assessing fracture risk in some, but not all, populations; the evidence is strongest for Caucasian females > 55 years old, though some evidence also exists for Asian females > 55 and for Caucasian and Asian males > 70. Certain devices may allow estimation of the likelihood of osteoporosis, but very limited evidence supports the use of QUS for initiating or monitoring osteoporosis treatment. QUS is likely most effective when combined with an assessment of clinical risk factors (CRF), with DXA reserved for individuals who are not identified as either high or low risk using QUS and CRF. However, monitoring and maintenance of test and instrument accuracy, precision, and reproducibility are essential if QUS devices are to be used in clinical practice, and further research in non-Caucasian, non-Asian populations is clearly required to validate this tool for more widespread use.

Relevance:

30.00%

Publisher:

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields. The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity and the larger-scale trend of the prevailing hydraulic conductivity field.
The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems, in order to allow for a more accurate and realistic quantification of the uncertainty associated with the inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigate two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach proves highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
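
As a minimal illustration of the gradual-deformation idea referred to in the second part of the abstract (a generic sketch, not the thesis code), two independent Gaussian realizations can be combined as m(theta) = m1*cos(theta) + m2*sin(theta); for any theta the result keeps the same mean and covariance, so theta controls the strength of an MCMC proposal perturbation.

```python
# Gradual-deformation perturbation sketch (simplified, generic):
# m_new = m_old*cos(theta) + m_indep*sin(theta) preserves the Gaussian
# statistics for any deformation parameter theta.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
m_old = rng.standard_normal(n)       # current model realization
m_indep = rng.standard_normal(n)     # independent realization from the same prior

def gradual_deformation(m1, m2, theta):
    return m1 * np.cos(theta) + m2 * np.sin(theta)

for theta in (0.1, 0.5, np.pi / 2):  # small step ... full resampling
    m_new = gradual_deformation(m_old, m_indep, theta)
    corr = np.corrcoef(m_old, m_new)[0, 1]
    print(f"theta = {theta:.2f}: var = {m_new.var():.2f}, corr with m_old = {corr:.2f}")
```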

Relevance:

30.00%

Publisher:

Abstract:

Neutrality tests in quantitative genetics provide a statistical framework for the detection of selection on polygenic traits in wild populations. However, the existing method based on comparisons of divergence at neutral markers and quantitative traits (Q(st)-F(st)) suffers from several limitations that hinder a clear interpretation of the results with typical empirical designs. In this article, we propose a multivariate extension of this neutrality test based on empirical estimates of the among-populations (D) and within-populations (G) covariance matrices by MANOVA. A simple pattern is expected under neutrality: D = 2F(st)/(1 - F(st))G, so that neutrality implies both proportionality of the two matrices and a specific value of the proportionality coefficient. This pattern is tested using Flury's framework for matrix comparison [common principal-component (CPC) analysis], a well-known tool in G matrix evolution studies. We show the importance of using a Bartlett adjustment of the test for the small sample sizes typically found in empirical studies. We propose a dual test: (i) that the proportionality coefficient is not different from its neutral expectation [2F(st)/(1 - F(st))] and (ii) that the MANOVA estimates of mean square matrices between and among populations are proportional. These two tests combined provide a more stringent test for neutrality than the classic Q(st)-F(st) comparison and avoid several statistical problems. Extensive simulations of realistic empirical designs suggest that these tests correctly detect the expected pattern under neutrality and have enough power to efficiently detect mild to strong selection (homogeneous, heterogeneous, or mixed) when it is occurring on a set of traits. This method also provides a rigorous and quantitative framework for disentangling the effects of different selection regimes and of drift on the evolution of the G matrix. We discuss practical requirements for the proper application of our test in empirical studies and potential extensions.
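
A minimal numerical sketch of the neutral expectation stated above, D = 2F(st)/(1 - F(st))G: given estimated D and G matrices (invented here), a single proportionality coefficient can be fitted by least squares and compared with the neutral value. The real test uses Flury's CPC framework with a Bartlett adjustment, which this toy comparison does not attempt.

```python
# Toy check of the neutral expectation D = 2*Fst/(1 - Fst) * G.
# D and G below are invented 2x2 covariance matrices for two traits.
import numpy as np

fst = 0.15
neutral_coef = 2 * fst / (1 - fst)             # ~0.353

G = np.array([[1.00, 0.30],
              [0.30, 0.60]])                   # within-population (G) estimate
D = np.array([[0.40, 0.11],
              [0.11, 0.23]])                   # among-population (D) estimate

# Least-squares proportionality coefficient c minimizing ||D - c*G||_F.
c_hat = np.sum(D * G) / np.sum(G * G)
print(f"fitted coefficient = {c_hat:.3f}, neutral expectation = {neutral_coef:.3f}")
# A fitted coefficient well above the neutral value would point to divergent
# selection; well below, to homogenizing selection (drift predicts equality).
```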

Relevance:

30.00%

Publisher:

Abstract:

There are as yet no validated criteria for the diagnosis of sensory neuronopathy (SNN). In a preliminary monocenter study, a set of criteria relying on clinical and electrophysiological data showed good sensitivity and specificity for a diagnosis of probable SNN. The aim of this study was to test these criteria in a French multicenter study. 210 patients with sensory neuropathies from 15 francophone reference centers for neuromuscular diseases were included, with an expert diagnosis of non-SNN, SNN, or suspected SNN according to the investigations performed in these centers. The expert diagnosis was reached independently of the set of criteria to be tested and was taken as the reference against which the proposed SNN criteria were evaluated. The criteria relied on clinical and electrophysiological data easily obtainable with routine investigations. 9/61 (16.4%) of non-SNN patients, 23/36 (63.9%) of suspected SNN patients, and 102/113 (90.3%) of SNN patients according to the expert diagnosis were classified as SNN by the criteria. Tested against the expert diagnosis in the SNN and non-SNN groups, the SNN criteria had 90.3% (102/113) sensitivity, 85.2% (52/61) specificity, 91.9% (102/111) positive predictive value, and 82.5% (52/63) negative predictive value. Discordance between the expert diagnosis and the SNN criteria occurred in 20 cases; after analysis of these cases, 11 could be reallocated to a correct diagnosis in accordance with the SNN criteria. The proposed criteria may be useful for the diagnosis of probable SNN in patients with sensory neuropathy and can be applied with simple clinical and paraclinical investigations.
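
The diagnostic figures quoted above follow directly from the 2x2 counts reported in the abstract (102/113 SNN and 9/61 non-SNN patients classified as SNN by the criteria); the short sketch below simply recomputes sensitivity, specificity, and predictive values from those counts.

```python
# Recompute the reported diagnostic metrics from the 2x2 counts in the abstract.
tp, fn = 102, 11        # SNN patients classified as SNN / not SNN (of 113)
fp, tn = 9, 52          # non-SNN patients classified as SNN / not SNN (of 61)

sensitivity = tp / (tp + fn)            # 102/113
specificity = tn / (tn + fp)            # 52/61
ppv = tp / (tp + fp)                    # 102/111
npv = tn / (tn + fn)                    # 52/63
print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}, "
      f"PPV = {ppv:.1%}, NPV = {npv:.1%}")
```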

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: To objectively compare quantitative parameters related to image quality attained at coronary magnetic resonance (MR) angiography of the right coronary artery (RCA) performed at 7 T and 3 T. MATERIALS AND METHODS: Institutional review board approval was obtained, and volunteers provided signed informed consent. Ten healthy adult volunteers (mean age ± standard deviation, 25 years ± 4; seven men, three women) underwent navigator-gated three-dimensional MR angiography of the RCA at 7 T and 3 T. For 7 T, a custom-built quadrature radiofrequency transmit-receive surface coil was used. At 3 T, a commercial body radiofrequency transmit coil and a cardiac coil array for signal reception were used. Segmented k-space gradient-echo imaging with spectrally selective adiabatic fat suppression was performed, and imaging parameters were similar at both field strengths. Contrast-to-noise ratio between blood and epicardial fat; signal-to-noise ratio of the blood pool; RCA vessel sharpness, diameter, and length; and navigator efficiency were quantified at both field strengths and compared by using a Mann-Whitney U test. RESULTS: The contrast-to-noise ratio between blood and epicardial fat was significantly improved at 7 T when compared with that at 3 T (87 ± 34 versus 52 ± 13; P = .01). Signal-to-noise ratio of the blood pool was increased at 7 T (109 ± 47 versus 67 ± 19; P = .02). Vessel sharpness obtained at 7 T was also higher (58% ± 9 versus 50% ± 5; P = .04). At the same time, RCA vessel diameter and length and navigator efficiency showed no significant field strength-dependent difference. CONCLUSION: In our quantitative and qualitative study comparing in vivo human imaging of the RCA at 7 T and 3 T in young healthy volunteers, parameters related to image quality attained at 7 T equal or surpass those from 3 T.
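
As a sketch of the statistical comparison named in the abstract (a Mann-Whitney U test on per-volunteer image-quality metrics), the snippet below compares two small groups of hypothetical contrast-to-noise values; the numbers are invented and are not the study's data.

```python
# Hypothetical comparison of per-volunteer CNR at 7 T vs 3 T with a
# Mann-Whitney U test, as named in the abstract. Values are invented.
from scipy.stats import mannwhitneyu

cnr_7t = [95, 60, 120, 88, 74, 130, 69, 101, 83, 50]
cnr_3t = [55, 48, 62, 40, 58, 70, 45, 51, 66, 39]

stat, p_value = mannwhitneyu(cnr_7t, cnr_3t, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")
```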

Relevance:

30.00%

Publisher:

Abstract:

A critical feature of cooperative animal societies is the reproductive skew, a shorthand term for the degree to which a dominant individual monopolizes overall reproduction in the group. Our theoretical analysis of the evolutionarily stable skew in matrifilial (i.e., mother-daughter) societies, in which relatednesses to offspring are asymmetrical, predicts that reproductive skews in such societies should tend to be greater than those of semisocial societies (i.e., societies composed of individuals of the same generation, such as siblings), in which relatednesses to offspring are symmetrical. Quantitative data on reproductive skews in semisocial and matrifilial associations within the same species for 17 eusocial Hymenoptera support this prediction. Likewise, a survey of reproductive partitioning within 20 vertebrate societies demonstrates that complete reproductive monopoly is more likely to occur in matrifilial than in semisocial societies, also as predicted by the optimal skew model.

Relevance:

30.00%

Publisher:

Abstract:

Synthesis report: Osteoporosis is recognized as a major public health problem. Since effective preventive treatments are now available to minimize fracture risk, it is essential to develop new strategies for identifying women at risk of fracture. Specific urinary markers of bone remodeling and quantitative ultrasound of the heel have both been studied as clinical tools to predict fracture risk in elderly women. However, very few data exist on combining these two tools to improve fracture-risk prediction. This case-control study, conducted in 368 women (mean age 76 years) from a Swiss cohort of ambulatory women, evaluated the ability of two urinary markers of bone resorption, pyridinolines and deoxypyridinolines, and of two heel quantitative ultrasound devices, the Achilles+ (GE-Lunar, Madison, USA) and the Sahara (Hologic, Waltham, USA), to discriminate between 195 women with low-trauma non-vertebral fractures and 173 women without fractures. The 195 patients with a fracture were matched to the 173 controls for age, body mass index, medical center, and duration of follow-up until fracture. The study shows that urinary markers of bone resorption have approximately the same capacity as heel quantitative ultrasound to discriminate between patients with low-trauma non-vertebral fractures and controls. Combining the two tests, however, performs no better than either test alone. These results may help in designing future fracture-risk detection strategies for elderly women that integrate clinical, radiological, and biochemical risk factors.
Abstract: Summary: This nested case-control analysis of a Swiss ambulatory cohort of elderly women assessed the discriminatory power of urinary markers of bone resorption and heel quantitative ultrasound for non-vertebral fractures. The tests all discriminated between cases and controls, but combining the two strategies yielded no additional relevant information. Introduction: Data are limited regarding the combination of bone resorption markers and heel quantitative bone ultrasound (QUS) in the detection of women at risk for fracture. Methods: In a nested case-control analysis, we studied 368 women (mean age 76.2 years), 195 with low-trauma non-vertebral fractures and 173 without, matched for age, BMI, medical center, and follow-up duration, from a prospective study designed to predict fractures. Urinary total pyridinolines (PYD) and deoxypyridinolines (DPD) were measured by high-performance liquid chromatography. All women underwent bone evaluations using Achilles+ and Sahara heel QUS. Results: Areas under the receiver operating characteristic curve (AUC) for discriminative models of the fracture group, with 95% confidence intervals, were 0.62 (0.56-0.68) and 0.59 (0.53-0.65) for PYD and DPD, and 0.64 (0.58-0.69) and 0.65 (0.59-0.71) for Achilles+ and Sahara QUS, respectively. The combination of resorption markers and QUS added no significant discriminatory information to either measurement alone, with an AUC of 0.66 (0.60-0.71) for Achilles+ with PYD and 0.68 (0.62-0.73) for Sahara with PYD. Conclusions: Urinary bone resorption markers and QUS are equally discriminatory between non-vertebral fracture patients and controls. However, the combination of bone resorption markers and QUS is not better than either test used alone.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Sedation and therapeutic hypothermia (TH) delay neurological responses and might reduce the accuracy of clinical examination in predicting outcome after cardiac arrest (CA). We examined the accuracy of quantitative pupillary light reactivity (PLR), measured with an automated infrared pupillometer, in predicting the outcome of post-CA coma, in comparison to standard PLR, EEG, and somatosensory evoked potentials (SSEP). METHODS: We prospectively studied, over a 1-year period (June 2012-June 2013), 50 consecutive comatose CA patients treated with TH (33 °C, 24 h). Quantitative PLR (expressed as the percentage of pupillary response to a calibrated light stimulus) and standard PLR were measured at day 1 (TH and sedation; on average 16 h after CA) and day 2 (normothermia, off sedation; on average 46 h after CA). Neurological outcome was assessed at 90 days with Cerebral Performance Categories (CPC), dichotomized as good (CPC 1-2) versus poor (CPC 3-5). Predictive performance was analyzed using the area under the ROC curve (AUC). RESULTS: Patients with good outcome [n = 23 (46%)] had higher quantitative PLR than those with poor outcome [n = 27; 16 (range 9-23) vs. 10 (1-30)% at day 1, and 20 (13-39) vs. 11 (1-55)% at day 2, both p < 0.001]. The best cut-off of quantitative PLR for outcome prediction was <13%. The AUC for predicting poor outcome was higher for quantitative than for standard PLR at both time points (day 1, 0.79 vs. 0.56, p = 0.005; day 2, 0.81 vs. 0.64, p = 0.006). The prognostic accuracy of quantitative PLR was comparable to that of EEG and SSEP (0.81 vs. 0.80 and 0.73, respectively, both p > 0.20). CONCLUSIONS: Quantitative PLR is more accurate than standard PLR in predicting the outcome of post-anoxic coma, irrespective of temperature and sedation, and has prognostic accuracy comparable to that of EEG and SSEP.
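
As an illustration of the AUC-based prognostication described above, the sketch below computes the area under the ROC curve for a set of invented quantitative PLR values, with lower values taken to predict poor outcome (consistent with the <13% cut-off); it does not use the study's data.

```python
# Toy AUC computation for quantitative PLR predicting poor outcome.
# Invented values; lower PLR is taken to indicate poor outcome (cut-off < 13%).
from sklearn.metrics import roc_auc_score

plr_percent = [16, 22, 9, 4, 19, 11, 25, 7, 14, 3]      # quantitative PLR (%)
poor_outcome = [0, 0, 1, 1, 0, 1, 0, 1, 0, 1]           # 1 = CPC 3-5

# roc_auc_score expects scores that increase with the positive class,
# so negate the PLR values.
auc = roc_auc_score(poor_outcome, [-x for x in plr_percent])
print(f"AUC for predicting poor outcome = {auc:.2f}")
```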

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: HIV-1 RNA viral load is a key parameter for reliable treatment monitoring of HIV-1 infection. Accurate HIV-1 RNA quantitation can be impaired by primer and probe sequence polymorphisms resulting from the tremendous genetic diversity and ongoing evolution of HIV-1. A novel dual HIV-1 target amplification approach was implemented in the quantitative COBAS AmpliPrep/COBAS TaqMan HIV-1 Test, v2.0 (HIV-1 TaqMan test v2.0) to cope with the high genetic diversity of the virus. OBJECTIVES AND STUDY DESIGN: The performance of the new assay was evaluated for sensitivity, dynamic range, precision, subtype inclusivity, diagnostic and analytical specificity, interfering substances, and correlation with its predecessor, the COBAS AmpliPrep/COBAS TaqMan HIV-1 test (HIV-1 TaqMan test v1.0), in patient specimens. RESULTS: The new assay demonstrated a sensitivity of 20 copies/mL and a linear measuring range of 20-10,000,000 copies/mL, with a lower limit of quantitation of 20 copies/mL. HIV-1 Group M subtypes and HIV-1 Group O were quantified within ±0.3 log(10) of the assigned titers. Specificity was 100% in 660 tested specimens; no cross-reactivity was found for 15 pathogens, nor any interference from endogenous substances or 29 drugs. Good comparability with the predecessor assay was demonstrated in 82 positive patient samples. In selected clinical samples, 35/66 specimens were found to be underquantitated in the predecessor assay; all were quantitated correctly in the new assay. CONCLUSIONS: The dual-target approach of the HIV-1 TaqMan test v2.0 enables superior HIV-1 Group M subtype coverage, including HIV-1 Group O detection. Correct quantitation of specimens underquantitated by the HIV-1 TaqMan test v1.0 was demonstrated.