12 results for Value-Adding

at Université de Lausanne, Switzerland


Relevance:

20.00%

Publisher:

Abstract:

AIMS/HYPOTHESIS: Several susceptibility genes for type 2 diabetes have been discovered recently. Individually, these genes increase the disease risk only minimally. The goals of the present study were to determine, at the population level, the risk of diabetes in individuals who carry risk alleles within several susceptibility genes for the disease and the added value of this genetic information over the clinical predictors. METHODS: We constructed an additive genetic score using the most replicated single-nucleotide polymorphisms (SNPs) within 15 type 2 diabetes-susceptibility genes, weighting each SNP with its reported effect. We tested this score in the extensively phenotyped population-based cross-sectional CoLaus Study in Lausanne, Switzerland (n = 5,360), involving 356 diabetic individuals. RESULTS: The clinical predictors of prevalent diabetes were age, BMI, family history of diabetes, WHR, and triacylglycerol/HDL-cholesterol ratio. After adjustment for these variables, the risk of diabetes was 2.7 (95% CI 1.8-4.0, p = 0.000006) for individuals with a genetic score within the top quintile, compared with the bottom quintile. Adding the genetic score to the clinical covariates improved the area under the receiver operating characteristic curve slightly (from 0.86 to 0.87), yet significantly (p = 0.002). BMI was similar in these two extreme quintiles. CONCLUSIONS/INTERPRETATION: In this population, a simple weighted 15 SNP-based genetic score provides additional information over clinical predictors of prevalent diabetes. At this stage, however, the clinical benefit of this genetic information is limited.
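The additive weighted score described above can be sketched in a few lines: each SNP contributes its risk-allele count (0, 1, or 2) multiplied by its reported effect size. The SNP identifiers and weights below are purely illustrative placeholders, not the study's actual values.

```python
# Sketch of a weighted additive genetic risk score: sum over SNPs of
# risk-allele count times the reported per-allele effect (log odds ratio).
# SNP names and weights are hypothetical, not the CoLaus study's values.

ILLUSTRATIVE_WEIGHTS = {    # SNP -> log(OR) per risk allele (placeholder)
    "rs7903146": 0.31,
    "rs5219":    0.13,
    "rs13266634": 0.11,
}

def genetic_score(genotypes, weights):
    """Weighted additive score: sum of risk-allele counts x effect sizes."""
    return sum(weights[snp] * genotypes.get(snp, 0) for snp in weights)

# A carrier of two risk alleles at the first SNP and one at the second:
score = genetic_score({"rs7903146": 2, "rs5219": 1}, ILLUSTRATIVE_WEIGHTS)
print(round(score, 2))  # 2*0.31 + 1*0.13 = 0.75
```

Individuals are then ranked on this score and compared across quintiles, as in the study.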

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: To determine the value of applying finger trap distraction during direct MR arthrography of the wrist to assess intrinsic ligament and triangular fibrocartilage complex (TFCC) tears. MATERIALS AND METHODS: Twenty consecutive patients were prospectively investigated by three-compartment wrist MR arthrography. Imaging was performed with 3-T scanners using a three-dimensional isotropic (0.4 mm) T1-weighted gradient-recalled echo sequence, with and without finger trap distraction (4 kg). In a blind and independent fashion, two musculoskeletal radiologists measured the width of the scapholunate (SL), lunotriquetral (LT) and ulna-TFC (UTFC) joint spaces. They evaluated the amount of contrast medium within these spaces using a four-point scale, and assessed SL, LT and TFCC tears, as well as the disruption of Gilula's carpal arcs. RESULTS: With finger trap distraction, both readers found a significant increase in width of the SL space (mean Δ = +0.1 mm, p ≤ 0.040), and noticed more contrast medium therein (p ≤ 0.035). In contrast, the differences in width of the LT (mean Δ = +0.1 mm, p ≥ 0.057) and UTFC (mean Δ = 0 mm, p ≥ 0.728) spaces, as well as the amount of contrast material within these spaces, were not statistically significant (p = 0.607 and ≥ 0.157, respectively). Both readers detected more SL (Δ = +1, p = 0.157) and LT (Δ = +2, p = 0.223) tears, although statistical significance was not reached, and Gilula's carpal arcs were more frequently disrupted during finger trap distraction (Δ = +5, p = 0.025). CONCLUSION: The application of finger trap distraction during direct wrist MR arthrography may enhance both detection and characterisation of SL and LT ligament tears by widening the SL space and increasing the amount of contrast within the SL and LT joint spaces.
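The reported mean deltas come from a paired design: each wrist is measured with and without distraction, and per-patient differences are summarised. A minimal sketch, with entirely invented width values:

```python
# Paired comparison underlying the reported mean deltas: each patient's
# joint-space width is measured without and with finger trap distraction,
# and the per-patient difference (mm) is averaged. Values are invented.

def paired_deltas(without, with_traction):
    """Per-patient width change (mm) under distraction."""
    return [w2 - w1 for w1, w2 in zip(without, with_traction)]

def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical scapholunate widths (mm) for five patients:
sl_without = [1.8, 2.0, 1.9, 2.1, 1.7]
sl_with    = [1.9, 2.1, 2.0, 2.2, 1.8]

deltas = paired_deltas(sl_without, sl_with)
print(round(mean(deltas), 1))  # 0.1 (a +0.1 mm mean widening)
```

In the study, significance of such paired differences would then be assessed with a nonparametric paired test.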

Relevance:

20.00%

Publisher:

Abstract:

We assessed the blockade of the renin-angiotensin system (RAS) achieved with 2 angiotensin (Ang) antagonists given either alone at different doses or with an ACE inhibitor. First, 20 normotensive subjects were randomly assigned to 100 mg OD losartan (LOS) or 80 mg OD telmisartan (TEL) for 1 week; during another week, the same doses of LOS and TEL were combined with 20 mg OD lisinopril. Then, 10 subjects were randomly assigned to 200 mg OD LOS and 160 mg OD TEL for 1 week and 100 mg BID LOS and 80 mg BID TEL during the second week. Blockade of the RAS was evaluated with the inhibition of the pressor effect of exogenous Ang I, an ex vivo receptor assay, and the changes in plasma Ang II. Trough blood pressure response to Ang I was blocked by 35+/-16% (mean+/-SD) with 100 mg OD LOS and by 36+/-13% with 80 mg OD TEL. When combined with lisinopril, blockade was 76+/-7% with LOS and 79+/-9% with TEL. With 200 mg OD LOS, trough blockade was 54+/-14%, but with 100 mg BID it increased to 77+/-8% (P<0.01). Telmisartan (160 mg OD and 80 mg BID) produced a comparable effect. Thus, at their maximal recommended doses, neither LOS nor TEL blocks the RAS for 24 hours; hence, the addition of an ACE inhibitor provides an additional blockade. A 24-hour blockade can be achieved with an angiotensin antagonist alone, provided higher doses or a BID regimen is used.
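Percent blockade of the pressor effect is typically derived by comparing the blood-pressure rise after exogenous Ang I on treatment with the rise off treatment. A small sketch with illustrative numbers:

```python
# Sketch of percent blockade of the Ang I pressor response: the on-drug
# blood-pressure rise is expressed relative to the off-drug rise.
# The mmHg values below are illustrative, not the study's raw data.

def percent_blockade(rise_baseline_mmhg, rise_on_drug_mmhg):
    """Percent inhibition of the Ang I pressor effect."""
    return 100.0 * (1.0 - rise_on_drug_mmhg / rise_baseline_mmhg)

# A 25 mmHg baseline rise blunted to 16 mmHg at trough corresponds to
# roughly the 35-36% blockade reported for the once-daily doses:
print(round(percent_blockade(25.0, 16.0)))  # 36
```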

Relevance:

20.00%

Publisher:

Abstract:

Although not specific, an increase in peripheral blood eosinophils may contribute substantially to the diagnosis of numerous infectious, allergic and inflammatory diseases. The scope of this article is to detail the pathologies associated with peripheral eosinophilia in order of frequency and to guide further investigations.

Relevance:

20.00%

Publisher:

Abstract:

AIM: To confirm the accuracy of the sentinel node biopsy (SNB) procedure and its morbidity, and to investigate predictive factors for SN status and prognostic factors for disease-free survival (DFS) and disease-specific survival (DSS). MATERIALS AND METHODS: Between October 1997 and December 2004, 327 consecutive patients in one centre with clinically node-negative primary skin melanoma underwent an SNB by the triple technique, i.e. lymphoscintigraphy, blue-dye and gamma-probe. Multivariate logistic regression analyses as well as Kaplan-Meier analyses were performed. RESULTS: Twenty-three percent of the patients had at least one metastatic SN, which was significantly associated with Breslow thickness (p<0.001). The success rate of SNB was 99.1% and its morbidity was 7.6%. With a median follow-up of 33 months, the 5-year DFS/DSS were 43%/49% for patients with positive SN and 83.5%/87.4% for patients with negative SN, respectively. The false-negative rate of SNB was 8.6% and sensitivity 91.4%. On multivariate analysis, DFS was significantly worsened by Breslow thickness (RR=5.6, p<0.001), positive SN (RR=5.0, p<0.001) and male sex (RR=2.9, p=0.001). The presence of a metastatic SN (RR=8.4, p<0.001), male sex (RR=6.1, p<0.001), Breslow thickness (RR=3.2, p=0.013) and ulceration (RR=2.6, p=0.015) were significantly associated with a poorer DSS. CONCLUSION: SNB is a reliable procedure with high sensitivity (91.4%) and low morbidity. Breslow thickness was the only statistically significant parameter predictive of SN status. DFS was worsened, in decreasing order, by Breslow thickness, metastatic SN and male gender. Similarly, DSS was significantly worsened by a metastatic SN, male gender, Breslow thickness and ulceration. These data reinforce the SN status as a powerful staging procedure.
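The false-negative rate and sensitivity quoted above are complements of each other: the false-negative rate is the share of truly node-positive patients missed by the biopsy. A minimal sketch with hypothetical counts chosen only to reproduce the same percentages:

```python
# False-negative rate and sensitivity of a diagnostic procedure such as
# SNB: FN rate = missed positives / all positives; sensitivity is its
# complement. The counts below are illustrative, not the study's data.

def snb_accuracy(true_positives, false_negatives):
    """Return (false-negative rate, sensitivity)."""
    fn_rate = false_negatives / (true_positives + false_negatives)
    return fn_rate, 1.0 - fn_rate

# e.g. 64 detected vs 6 missed node-positive patients:
fn_rate, sensitivity = snb_accuracy(64, 6)
print(f"{fn_rate:.1%}  {sensitivity:.1%}")  # 8.6%  91.4%
```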

Relevance:

20.00%

Publisher:

Abstract:

Attrition in longitudinal studies can lead to biased results. The study is motivated by the unexpected observation that alcohol consumption decreased despite increased availability, which may be due to sample attrition of heavy drinkers. Several imputation methods have been proposed, but rarely compared in longitudinal studies of alcohol consumption. The imputation of consumption level measurements is computationally particularly challenging due to alcohol consumption being a semi-continuous variable (dichotomous drinking status and continuous volume among drinkers), and the non-normality of data in the continuous part. Data come from a longitudinal study in Denmark with four waves (2003-2006) and 1771 individuals at baseline. Five techniques for missing data are compared: Last value carried forward (LVCF) was used as a single, and Hotdeck, Heckman modelling, multivariate imputation by chained equations (MICE), and a Bayesian approach as multiple imputation methods. Predictive mean matching was used to account for non-normality, where instead of imputing regression estimates, "real" observed values from similar cases are imputed. Methods were also compared by means of a simulated dataset. The simulation showed that the Bayesian approach yielded the most unbiased estimates for imputation. The finding of no increase in consumption levels despite a higher availability remained unaltered.
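Predictive mean matching, the device the study used to respect the non-normal consumption distribution, can be sketched minimally: instead of imputing the regression prediction itself, the donor pool is the k observed cases whose predictions lie closest to the missing case's prediction, and one donor's actually observed value is imputed. All data and predictions below are invented.

```python
# Minimal sketch of predictive mean matching (PMM): impute a *real*
# observed value drawn from the k donors whose model predictions are
# nearest to the missing case's prediction. Data are hypothetical.

import random

def pmm_impute(pred_missing, preds_observed, values_observed, k=3, rng=None):
    """Impute one missing case by drawing from its k nearest donors."""
    rng = rng or random.Random(0)
    ranked = sorted(zip(preds_observed, values_observed),
                    key=lambda pv: abs(pv[0] - pred_missing))
    donors = [v for _, v in ranked[:k]]
    return rng.choice(donors)

# Observed weekly drinks and their (invented) model predictions:
preds = [2.1, 5.0, 9.8, 14.7, 30.2]
vals  = [0,   4,   10,  16,  35]

imputed = pmm_impute(10.4, preds, vals, k=3)
print(imputed in vals)  # True: an actually observed value is imputed
```

Because only observed values are ever imputed, the imputed distribution cannot stray outside the support of the data, which is why PMM copes with skewed, semi-continuous outcomes.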

Relevance:

20.00%

Publisher:

Abstract:

CONTEXT: In populations of older adults, prediction of coronary heart disease (CHD) events through traditional risk factors is less accurate than in middle-aged adults. Electrocardiographic (ECG) abnormalities are common in older adults and might be of value for CHD prediction. OBJECTIVE: To determine whether baseline ECG abnormalities or development of new and persistent ECG abnormalities are associated with increased CHD events. DESIGN, SETTING, AND PARTICIPANTS: A population-based study of 2192 white and black older adults aged 70 to 79 years from the Health, Aging, and Body Composition Study (Health ABC Study) without known cardiovascular disease. Adjudicated CHD events were collected over 8 years between 1997-1998 and 2006-2007. Baseline and 4-year ECG abnormalities were classified according to the Minnesota Code as major and minor. Using Cox proportional hazards regression models, the addition of ECG abnormalities to traditional risk factors was examined to predict CHD events. MAIN OUTCOME MEASURE: Adjudicated CHD events (acute myocardial infarction [MI], CHD death, and hospitalization for angina or coronary revascularization). RESULTS: At baseline, 276 participants (13%) had minor and 506 (23%) had major ECG abnormalities. During follow-up, 351 participants had CHD events (96 CHD deaths, 101 acute MIs, and 154 hospitalizations for angina or coronary revascularizations). Both baseline minor and major ECG abnormalities were associated with an increased risk of CHD after adjustment for traditional risk factors (17.2 per 1000 person-years among those with no abnormalities; 29.3 per 1000 person-years; hazard ratio [HR], 1.35; 95% CI, 1.02-1.81; for minor abnormalities; and 31.6 per 1000 person-years; HR, 1.51; 95% CI, 1.20-1.90; for major abnormalities).
When ECG abnormalities were added to a model containing traditional risk factors alone, 13.6% of intermediate-risk participants with both major and minor ECG abnormalities were correctly reclassified (overall net reclassification improvement [NRI], 7.4%; 95% CI, 3.1%-19.0%; integrated discrimination improvement, 0.99%; 95% CI, 0.32%-2.15%). After 4 years, 208 participants had new and 416 had persistent abnormalities. Both new and persistent ECG abnormalities were associated with an increased risk of subsequent CHD events (HR, 2.01; 95% CI, 1.33-3.02; and HR, 1.66; 95% CI, 1.18-2.34; respectively). When added to the Framingham Risk Score, the NRI was not significant (5.7%; 95% CI, -0.4% to 11.8%). CONCLUSIONS: Major and minor ECG abnormalities among older adults were associated with an increased risk of CHD events. Depending on the model, adding ECG abnormalities was associated with improved risk prediction beyond traditional risk factors.

Relevance:

20.00%

Publisher:

Abstract:

Many people regard the concept of hypothesis testing as fundamental to inferential statistics. Various schools of thought, in particular frequentist and Bayesian, have promoted radically different solutions for deciding between competing hypotheses. Comprehensive philosophical comparisons of their advantages and drawbacks are widely available and continue to fuel extensive debate in the literature. More recently, a controversial discussion was initiated by the editorial decision of a scientific journal [1] to refuse any paper submitted for publication that contains null hypothesis testing procedures. Since the large majority of papers published in forensic journals evaluate statistical evidence based on so-called p-values, it is of interest to bring the discussion of this journal's decision to the forensic science community. This paper aims to provide forensic science researchers with a primer on the main concepts and their implications for making informed methodological choices.
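In the primer spirit of the abstract, the p-value at the centre of the controversy can be made concrete: it is the probability, under the null hypothesis, of data at least as extreme as those observed. A self-contained sketch using an exact two-sided binomial test on an invented example (15 heads in 20 coin tosses):

```python
# What a p-value is, concretely: the null probability of outcomes at
# least as extreme as the observed one. Exact two-sided binomial test,
# "extreme" meaning "no more likely than the observed outcome".
# The 15-heads-in-20-tosses scenario is illustrative.

from math import comb

def binom_pmf(k, n, p=0.5):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def two_sided_p(k, n, p=0.5):
    """Sum the null probabilities of all outcomes no more likely than k."""
    pk = binom_pmf(k, n, p)
    return sum(binom_pmf(i, n, p) for i in range(n + 1)
               if binom_pmf(i, n, p) <= pk + 1e-12)

print(round(two_sided_p(15, 20), 3))  # 0.041: below the usual 0.05 cutoff
```

The frequentist reads this as evidence against the null at the 5% level; the Bayesian critique is that it says nothing directly about the probability of either hypothesis given the data, which is the crux of the debate the paper surveys.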