90 results for Relative risk aversion


Relevance: 80.00%

Abstract:

BACKGROUND Meta-analyses of continuous outcomes typically provide enough information for decision-makers to evaluate the extent to which chance can explain apparent differences between interventions. The interpretation of the magnitude of these differences - from trivial to large - can, however, be challenging. We investigated clinicians' understanding and perceptions of usefulness of 6 statistical formats for presenting continuous outcomes from meta-analyses (standardized mean difference, minimal important difference units, mean difference in natural units, ratio of means, relative risk and risk difference). METHODS We invited 610 staff and trainees in internal medicine and family medicine programs in 8 countries to participate. Paper-based, self-administered questionnaires presented summary estimates of hypothetical interventions versus placebo for chronic pain. The estimates showed either a small or a large effect for each of the 6 statistical formats for presenting continuous outcomes. Questions addressed participants' understanding of the magnitude of treatment effects and their perception of the usefulness of the presentation format. We randomly assigned participants 1 of 4 versions of the questionnaire, each with a different effect size (large or small) and presentation order for the 6 formats (1 to 6, or 6 to 1). RESULTS Overall, 531 (87.0%) of the clinicians responded. Respondents best understood risk difference, followed by relative risk and ratio of means. Similarly, they perceived the dichotomous presentation of continuous outcomes (relative risk and risk difference) to be most useful. Presenting results as a standardized mean difference, the longest standing and most widely used approach, was poorly understood and perceived as least useful. INTERPRETATION None of the presentation formats were well understood or perceived as extremely useful. Clinicians best understood the dichotomous presentations of continuous outcomes and perceived them to be the most useful. Further initiatives to help clinicians better grasp the magnitude of the treatment effect are needed.
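As an illustration of how the six formats relate to one another, the sketch below derives each of them from the same hypothetical summary statistics for a chronic-pain outcome; all values (means, SD, MID, responder proportions) are invented for the example and are not taken from the study questionnaires.

```python
# Hypothetical summary statistics for a 0-10 chronic-pain score; the numbers are
# illustrative only and are not taken from the study questionnaires.
mean_treat, mean_placebo = 3.8, 5.0                  # mean post-treatment pain scores
pooled_sd = 2.4                                      # pooled standard deviation
mid = 1.0                                            # assumed minimal important difference (MID)
p_improved_treat, p_improved_placebo = 0.45, 0.30    # proportions improved (dichotomized outcome)

mean_difference = mean_treat - mean_placebo              # mean difference in natural units
smd = mean_difference / pooled_sd                        # standardized mean difference
mid_units = mean_difference / mid                        # mean difference in MID units
ratio_of_means = mean_treat / mean_placebo               # ratio of means
relative_risk = p_improved_treat / p_improved_placebo    # relative risk (dichotomized)
risk_difference = p_improved_treat - p_improved_placebo  # risk difference (dichotomized)

print(f"MD={mean_difference:.2f}  SMD={smd:.2f}  MID units={mid_units:.2f}  "
      f"RoM={ratio_of_means:.2f}  RR={relative_risk:.2f}  RD={risk_difference:.2f}")
```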

Relevance: 80.00%

Abstract:

OBJECTIVES This study sought to compare rates of stent thrombosis and major adverse cardiac and cerebrovascular events (MACCE) (composite of death, myocardial infarction, or stroke) after coronary stenting with drug-eluting stents (DES) versus bare-metal stents (BMS) in patients who participated in the DAPT (Dual Antiplatelet Therapy) study, an international multicenter randomized trial comparing 30 versus 12 months of dual antiplatelet therapy in subjects undergoing coronary stenting with either DES or BMS. BACKGROUND Despite the antirestenotic efficacy of coronary DES compared with BMS, the relative risk of stent thrombosis and adverse cardiovascular events is unclear. Many clinicians perceive BMS to be associated with fewer adverse ischemic events and to require shorter-duration dual antiplatelet therapy than DES. METHODS Prospective propensity-matched analysis of subjects enrolled in a randomized trial of dual antiplatelet therapy duration was performed. DES- and BMS-treated subjects were propensity-score matched in a many-to-one fashion. The study design was observational for all subjects 0 to 12 months following stenting. A subset of eligible subjects without major ischemic or bleeding events was randomized at 12 months to continued thienopyridine versus placebo; all subjects were followed through 33 months. RESULTS Among 10,026 propensity-matched subjects, DES-treated subjects (n = 8,308) had a lower rate of stent thrombosis through 33 months compared with BMS-treated subjects (n = 1,718; 1.7% vs. 2.6%; weighted risk difference -1.1%, p = 0.01) and a noninferior rate of MACCE (11.4% vs. 13.2%, respectively; weighted risk difference -1.8%, p = 0.053, noninferiority p < 0.001). CONCLUSIONS DES-treated subjects have long-term rates of stent thrombosis that are lower than those of BMS-treated subjects. (The Dual Antiplatelet Therapy Study [DAPT study]; NCT00977938).
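As a rough sanity check on the reported stent-thrombosis comparison, the sketch below recomputes an unweighted risk difference with a normal-approximation confidence interval from the proportions and group sizes in the abstract; the published analysis used propensity-score weighting, so this simplified calculation is not expected to reproduce the weighted estimate exactly.

```python
from math import sqrt

# Stent thrombosis through 33 months, as reported in the abstract.
n_des, n_bms = 8308, 1718
p_des, p_bms = 0.017, 0.026

rd = p_des - p_bms                          # unweighted risk difference
se = sqrt(p_des * (1 - p_des) / n_des + p_bms * (1 - p_bms) / n_bms)
ci_low, ci_high = rd - 1.96 * se, rd + 1.96 * se
print(f"Risk difference {rd:.1%} (95% CI {ci_low:.1%} to {ci_high:.1%})")
```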

Relevance: 80.00%

Abstract:

BACKGROUND Anticoagulation is required during transcatheter aortic valve replacement (TAVR) procedures. Although an optimal regimen has not been determined, heparin is mainly used. Direct thrombin inhibition with bivalirudin may be an effective alternative to heparin as the procedural anticoagulant agent in this setting. OBJECTIVES The goal of this study was to determine whether bivalirudin offers an alternative to heparin as the procedural anticoagulant agent in patients undergoing TAVR. METHODS A total of 802 patients with aortic stenosis were randomized to undergo transfemoral TAVR with bivalirudin versus unfractionated heparin during the procedure. The 2 primary endpoints were major bleeding within 48 h or before hospital discharge (whichever occurred first) and 30-day net adverse clinical events, defined as the combination of major adverse cardiovascular events (all-cause mortality, myocardial infarction, or stroke) and major bleeding. RESULTS Anticoagulation with bivalirudin versus heparin did not demonstrate superiority because it did not result in significantly lower rates of major bleeding at 48 h (6.9% vs. 9.0%; relative risk: 0.77; 95% confidence interval [CI]: 0.48 to 1.23; p = 0.27) or net adverse cardiovascular events at 30 days (14.4% vs. 16.1%; relative risk: 0.89; 95% CI: 0.64 to 1.24; risk difference: -1.72; 95% CI: -6.70 to 3.25; p = 0.50); regarding the latter, the prespecified noninferiority hypothesis was met (p for noninferiority < 0.01). Rates of major adverse cardiovascular events at 48 h were not significantly different (3.5% vs. 4.8%; relative risk: 0.73; 95% CI: 0.37 to 1.43; p = 0.35). At 48 h, the bivalirudin group had significantly fewer myocardial infarctions but more acute kidney injury events than the heparin group; at 30 days, these differences were no longer significant. CONCLUSIONS In this randomized trial of TAVR procedural pharmacotherapy, bivalirudin did not reduce rates of major bleeding at 48 h or net adverse cardiovascular events within 30 days compared with heparin. Although superiority was not shown, the noninferiority hypothesis was met with respect to the latter endpoint. Given the lower cost, heparin should remain the standard of care, and bivalirudin can be an alternative anticoagulant option in patients unable to receive heparin in TAVR. (International, Multi-center, Open-label, Randomized Controlled Trial in Patients Undergoing TAVR to Determine the Treatment Effect [Both Safety and Efficacy] of Using Bivalirudin Instead of UFH [BRAVO-2/3]; NCT01651780).
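For readers who want to see how a relative risk and its confidence interval are typically obtained, the sketch below reconstructs the 48-hour major-bleeding comparison from the reported proportions, assuming roughly equal arms of 401 patients (the abstract gives only the overall total of 802), so the counts and the resulting interval are approximate.

```python
from math import exp, log, sqrt

# Approximate arm sizes and event counts reconstructed from the reported percentages;
# the trial's exact counts may differ slightly.
n_biv = n_hep = 401
events_biv = round(0.069 * n_biv)   # major bleeding at 48 h, bivalirudin arm
events_hep = round(0.090 * n_hep)   # major bleeding at 48 h, heparin arm

rr = (events_biv / n_biv) / (events_hep / n_hep)
se_log_rr = sqrt(1 / events_biv - 1 / n_biv + 1 / events_hep - 1 / n_hep)
ci = (exp(log(rr) - 1.96 * se_log_rr), exp(log(rr) + 1.96 * se_log_rr))
print(f"RR {rr:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f})")  # close to the reported 0.77 (0.48 to 1.23)
```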

Relevance: 80.00%

Abstract:

IMPORTANCE Despite the antirestenotic efficacy of coronary drug-eluting stents (DES) compared with bare metal stents (BMS), the relative risk of stent thrombosis and adverse cardiovascular events is unclear. Although dual antiplatelet therapy (DAPT) beyond 1 year provides ischemic event protection after DES, ischemic event risk is perceived to be less after BMS, and the appropriate duration of DAPT after BMS is unknown. OBJECTIVE To compare (1) rates of stent thrombosis and major adverse cardiac and cerebrovascular events (MACCE; composite of death, myocardial infarction, or stroke) after 30 vs 12 months of thienopyridine in patients treated with BMS taking aspirin and (2) treatment duration effect within the combined cohorts of randomized patients treated with DES or BMS as prespecified secondary analyses. DESIGN, SETTING, AND PARTICIPANTS International, multicenter, randomized, double-blinded, placebo-controlled trial comparing extended (30-month) thienopyridine vs placebo in patients taking aspirin who completed 12 months of DAPT without bleeding or ischemic events after receiving stents. The study was initiated in August 2009 with the last follow-up visit in May 2014. INTERVENTIONS Continued thienopyridine or placebo at months 12 through 30 after stent placement, in 11,648 randomized patients treated with aspirin, of whom 1687 received BMS and 9961 DES. MAIN OUTCOMES AND MEASURES Stent thrombosis, MACCE, and moderate or severe bleeding. RESULTS Among 1687 patients treated with BMS who were randomized to continued thienopyridine vs placebo, rates of stent thrombosis were 0.50% vs 1.11% (n = 4 vs 9; hazard ratio [HR], 0.49; 95% CI, 0.15-1.64; P = .24), rates of MACCE were 4.04% vs 4.69% (n = 33 vs 38; HR, 0.92; 95% CI, 0.57-1.47; P = .72), and rates of moderate/severe bleeding were 2.03% vs 0.90% (n = 16 vs 7; P = .07), respectively. Among all 11,648 randomized patients (both BMS and DES), stent thrombosis rates were 0.41% vs 1.32% (n = 23 vs 74; HR, 0.31; 95% CI, 0.19-0.50; P < .001), rates of MACCE were 4.29% vs 5.74% (n = 244 vs 323; HR, 0.73; 95% CI, 0.62-0.87; P < .001), and rates of moderate/severe bleeding were 2.45% vs 1.47% (n = 135 vs 80; P < .001). CONCLUSIONS AND RELEVANCE Among patients undergoing coronary stent placement with BMS who tolerated 12 months of thienopyridine, continuing thienopyridine for an additional 18 months compared with placebo did not result in statistically significant differences in rates of stent thrombosis, MACCE, or moderate or severe bleeding. However, the BMS subset may have been underpowered to identify such differences, and further trials are suggested. TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT00977938.
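One way to put the combined-cohort stent-thrombosis rates into absolute terms is the number needed to treat; the sketch below applies that arithmetic to the proportions quoted above (an illustrative calculation, not an analysis reported by the trial).

```python
# Stent thrombosis among all randomized patients: continued thienopyridine vs placebo.
p_thienopyridine, p_placebo = 0.0041, 0.0132

arr = p_placebo - p_thienopyridine   # absolute risk reduction
nnt = 1 / arr                        # number needed to treat over the additional 18 months
print(f"ARR = {arr:.2%}; NNT = {nnt:.0f} patients treated to prevent one stent thrombosis")
```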

Relevance: 80.00%

Abstract:

BACKGROUND Evidence suggests that cannabinoids can prevent chemotherapy-induced nausea and vomiting. The use of tetrahydrocannabinol (THC) has also been suggested for the prevention of postoperative nausea and vomiting (PONV), but evidence is very limited and inconclusive. To evaluate the effectiveness of IV THC in the prevention of PONV, we performed this double-blind, randomized, placebo-controlled trial with patient stratification according to the risk of PONV. Our hypothesis was that THC would reduce the relative risk of PONV by 25% compared with placebo. METHODS With IRB approval and written informed consent, 40 patients at high risk for PONV received either 0.125 mg/kg IV THC or placebo at the end of surgery before emergence from anesthesia. The primary outcome parameter was PONV during the first 24 hours after emergence. Secondary outcome parameters included early and late nausea, emetic episodes and PONV, and side effects such as sedation or psychotropic alterations. RESULTS The relative risk reduction of overall PONV in the THC group was 12% (95% confidence interval, -37% to 43%), potentially less than the clinically significant 25% relative risk reduction demonstrated by other drugs used for PONV prophylaxis. Calculation of the effect of treatment group on overall PONV by logistic regression adjusted for anesthesia time gave an odds ratio of 0.97 (95% confidence interval, 0.21 to 4.43; P = 0.97). Psychotropic THC side effects were clinically relevant and mainly consisted of sedation and confusion that were not tempered by the effects of anesthesia. The study was discontinued after 40 patients because of the inefficacy of THC against PONV and the finding of clinically unacceptable side effects that would impede the use of THC in the studied setting. CONCLUSIONS Because of an unacceptable side effect profile and uncertain antiemetic effects, IV THC administered at the end of surgery before emergence from anesthesia cannot be recommended for the prevention of PONV in high-risk patients.
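The relative risk reduction quoted above is simply one minus the ratio of PONV incidences; the sketch below shows the arithmetic with hypothetical incidences chosen only for illustration, since the abstract does not report the raw proportions.

```python
# Hypothetical PONV incidences chosen only to make the arithmetic concrete; the abstract
# reports the relative risk reduction (12%) but not the underlying proportions.
p_placebo, p_thc = 0.75, 0.66

relative_risk = p_thc / p_placebo
rrr = 1 - relative_risk              # relative risk reduction
print(f"RR = {relative_risk:.2f}, RRR = {rrr:.0%}")  # 12% here; the prespecified target was 25%
```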

Relevance: 40.00%

Abstract:

BACKGROUND: Elderly individuals who provide care to a spouse suffering from dementia bear an increased risk of coronary heart disease (CHD). OBJECTIVE: To test the hypothesis that the Framingham CHD Risk Score would be higher in dementia caregivers relative to non-caregiving controls. METHODS: We investigated 64 caregivers providing in-home care for their spouse with Alzheimer's disease and 41 gender-matched non-caregiving controls. All subjects (mean age 70 ± 8 years, 75% women, 93% Caucasian) had a negative history of CHD and cerebrovascular disease. The original Framingham CHD Risk Score was computed by adding up categorical scores for age, blood lipids, blood pressure, diabetes, and smoking, with adjustment for sex. RESULTS: The average CHD risk score was higher in caregivers than in controls, even when co-varying for socioeconomic status, health habits, medication, and psychological distress (8.0 ± 2.9 vs. 6.3 ± 3.0 points, p = 0.013). The difference showed a medium effect size (Cohen's d = 0.57). A relatively higher blood pressure in caregivers than in controls made the greatest contribution to this difference. The probability (area under the receiver operating characteristic curve) that a randomly selected caregiver had a greater CHD risk score than a randomly selected non-caregiver was 65.5%. CONCLUSIONS: Based on the Framingham CHD Risk Score, the potential to develop overt CHD in the following 10 years was predicted to be greater in dementia caregivers than in non-caregiving controls. The magnitude of the difference in CHD risk between caregivers and controls appears to be clinically relevant. Clinicians may want to monitor caregiving status as a routine part of the standard evaluation of their elderly patients' cardiovascular risk.
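The effect size and the probability that a randomly selected caregiver outscores a randomly selected control are two views of the same difference: under a normal model the area under the curve equals Φ(d/√2). A short check against the figures in the abstract:

```python
from math import sqrt
from statistics import NormalDist

d = 0.57                              # Cohen's d reported for the CHD risk score difference
auc = NormalDist().cdf(d / sqrt(2))   # common-language effect size under a normal model
print(f"Predicted AUC = {auc:.1%}")   # roughly 65.7%, close to the reported 65.5%
```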

Relevance: 40.00%

Abstract:

BACKGROUND Mortality risk for people with chronic kidney disease is substantially greater than that for the general population, increasing to a 7-fold greater risk for those on dialysis therapy. Higher body mass index, generally due to higher energy intake, appears protective for people on dialysis therapy, but the relationship between energy intake and survival in those with reduced kidney function is unknown. STUDY DESIGN Prospective cohort study with a median follow-up of 14.5 (IQR, 11.2-15.2) years. SETTING & PARTICIPANTS Blue Mountains Area, west of Sydney, Australia. Participants in the general community enrolled in the Blue Mountains Eye Study (n=2,664) who underwent a detailed interview, food frequency questionnaire, and physical examination including body weight, height, blood pressure, and laboratory tests. PREDICTORS Relative energy intake, food components (carbohydrates, total sugars, fat, protein, and water), and estimated glomerular filtration rate (eGFR). Relative energy intake was dichotomized at 100%, and eGFR, at 60 mL/min/1.73 m². OUTCOMES All-cause and cardiovascular mortality. MEASUREMENTS All-cause and cardiovascular mortality using unadjusted and adjusted Cox proportional hazards regression models. RESULTS 949 people died during follow-up, 318 of cardiovascular events. In people with eGFR < 60 mL/min/1.73 m² (n=852), there was an increased risk of all-cause mortality (HR, 1.48; P=0.03), but no increased risk of cardiovascular mortality (HR, 1.59; P=0.1) among those with higher relative energy intake compared with those with lower relative energy intake. Increasing intake of carbohydrates (HR per 100 g/d, 1.50; P=0.04) and total sugars (HR per 100 g/d, 1.62; P=0.03) was significantly associated with increased risk of cardiovascular mortality. LIMITATIONS Under-reporting of energy intake, baseline laboratory and food intake values only, white population. CONCLUSIONS Increasing relative energy intake was associated with increased all-cause mortality in patients with eGFR < 60 mL/min/1.73 m². This effect may be mediated by the impact of increasing total sugars intake on subsequent cardiovascular events.
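The hazard ratios above are expressed per 100 g/d of intake; because the Cox model is log-linear in the covariate, the same estimate can be rescaled to other increments, as in the brief sketch below (illustrative arithmetic only).

```python
from math import exp, log

hr_per_100g = 1.50                       # cardiovascular mortality HR per 100 g/d of carbohydrate

# Rescale the log-hazard to a smaller increment, e.g. per 10 g/d.
hr_per_10g = exp(log(hr_per_100g) / 10)
print(f"HR per 10 g/d = {hr_per_10g:.3f}")   # about 1.041
```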

Relevance: 30.00%

Abstract:

Learned irrelevance (LIrr) refers to a form of selective learning that develops as a result of prior noncorrelated exposures of the predicted and predictor stimuli. In learning situations that depend on the associative link between the predicted and predictor stimuli, LIrr is expressed as a retardation of learning. It represents a form of modulation of learning by selective attention. Given the relevance of selective attention impairment to both positive and cognitive schizophrenia symptoms, the question remains whether LIrr impairment represents a state (relating to symptom manifestation) or trait (relating to schizophrenia endophenotypes) marker of human psychosis. We examined this by evaluating the expression of LIrr in an associative learning paradigm in (1) asymptomatic first-degree relatives of schizophrenia patients (SZ-relatives) and in (2) individuals exhibiting prodromal signs of psychosis ("ultrahigh risk" [UHR] patients) in each case relative to demographically matched healthy control subjects. There was no evidence for aberrant LIrr in SZ-relatives, but LIrr as well as associative learning were attenuated in UHR patients. It is concluded that LIrr deficiency in conjunction with a learning impairment might be a useful state marker predictive of psychotic state but a relatively weak link to a potential schizophrenia endophenotype.

Relevance: 30.00%

Abstract:

The parasite Echinococcus multilocularis was first detected in The Netherlands in 1996 and repeated studies have shown that the parasite subsequently spread in the local population of foxes in the province of Limburg. It was not possible to quantify the human risk of alveolar echinococcosis because no relationship between the amount of parasite eggs in the environment and the probability of infection in humans was known. Here, we used the spread of the parasite in The Netherlands as a predictor, together with recently published historical records of the epidemiology of alveolar echinococcosis in Switzerland, to achieve a relative quantification of the risk. Based on these analyses, the human risk in Limburg was simulated and up to three human cases are predicted by 2018. We conclude that the epidemiology of alveolar echinococcosis in The Netherlands might have changed from a period of negligible risk in the past to a period of increasing risk in the forthcoming years.

Relevance: 30.00%

Abstract:

Knowledge of the relative importance of alternative sources of human campylobacteriosis is important in order to implement effective disease prevention measures. The objective of this study was to assess the relative importance of three key exposure pathways (travelling abroad, poultry meat, pet contact) for different patient age groups in Switzerland. Using a stochastic exposure model, data on Campylobacter incidence for the years 2002-2007 were linked with data for the three exposure pathways and the results of a case-control study. Mean values for the population attributable fractions (PAF) over all age groups and years were 27% (95% CI 17-39) for poultry consumption, 27% (95% CI 22-32) for travelling abroad, 8% (95% CI 6-9) for pet contact and 39% (95% CI 25-50) for other risk factors. This model provided robust results when using data available for Switzerland, but the uncertainties remained high. The output of the model could be improved if more accurate input data become available to estimate the infection rate per exposure. In particular, the relatively high proportion of cases attributed to 'other risk factors' requires further attention.
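A population attributable fraction combines the prevalence of an exposure with its relative risk. The sketch below shows Levin's formula together with a simple Monte Carlo loop to propagate uncertainty, in the spirit of, but not reproducing, the study's stochastic exposure model; all inputs are hypothetical.

```python
import random

def paf(prevalence, rr):
    """Levin's population attributable fraction for a single exposure."""
    return prevalence * (rr - 1) / (1 + prevalence * (rr - 1))

# Hypothetical inputs for one exposure pathway; the study's exposure model is far more detailed.
random.seed(1)
samples = []
for _ in range(10_000):
    prevalence = random.uniform(0.60, 0.80)       # uncertain exposure prevalence
    rr = random.lognormvariate(0.4, 0.2)          # uncertain relative risk (median about 1.5)
    samples.append(paf(prevalence, rr))

samples.sort()
mean_paf = sum(samples) / len(samples)
lower, upper = samples[int(0.025 * len(samples))], samples[int(0.975 * len(samples))]
print(f"PAF mean {mean_paf:.0%} (95% interval {lower:.0%} to {upper:.0%})")
```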

Relevance: 30.00%

Abstract:

Published opinions regarding outcomes and complications in older patients cover a broad spectrum, and there is disagreement as to whether surgery in older patients entails a higher risk. This study therefore examines the risk of surgery for lumbar spinal stenosis relative to age in the pooled data set of the Spine Tango registry.

Relevance: 30.00%

Abstract:

Objectives: Neurofunctional alterations are correlates of vulnerability to psychosis, as well as of the disorder itself. How these abnormalities relate to different probabilities for later transition to psychosis is unclear. We investigated vulnerability-related versus disease-related versus resilience biomarkers of psychosis during working memory (WM) processing in individuals with an at-risk mental state (ARMS). Experimental design: Patients with “first-episode psychosis” (FEP, n = 21), short-term ARMS (ARMS-ST, n = 17), long-term ARMS (ARMS-LT, n = 16), and healthy controls (HC, n = 20) were investigated with an n-back WM task. We examined functional magnetic resonance imaging (fMRI) and structural magnetic resonance imaging (sMRI) data in conjunction using the biological parametric mapping (BPM) toolbox. Principal observations: There were no differences in accuracy, but the FEP and ARMS-ST groups had longer reaction times compared with the HC and ARMS-LT groups. With the 2-back > 0-back contrast, we found reduced functional activation in ARMS-ST and FEP compared with the HC group in parietal and middle frontal regions. Relative to ARMS-LT individuals, FEP patients showed decreased activation in the bilateral inferior frontal gyrus and insula, and in the left prefrontal cortex. Compared with the ARMS-LT group, the ARMS-ST subjects showed reduced activation in the right inferior frontal gyrus and insula. Reduced insular and prefrontal activation was associated with gray matter volume reduction in the same area in the ARMS-LT group. Conclusions: These findings suggest that vulnerability to psychosis was associated with neurofunctional alterations in fronto-temporo-parietal networks in a WM task. Neurofunctional differences within the ARMS were related to the duration of the prodromal state and to resilience factors.

Relevance: 30.00%

Abstract:

Biomarkers are currently best used as mechanistic "signposts" rather than as "traffic lights" in the environmental risk assessment of endocrine-disrupting chemicals (EDCs). In field studies, biomarkers of exposure [e.g., vitellogenin (VTG) induction in male fish] are powerful tools for tracking single substances and mixtures of concern. Biomarkers also provide linkage between field and laboratory data, thereby playing an important role in directing the need for and design of fish chronic tests for EDCs. It is the adverse effect end points (e.g., altered development, growth, and/or reproduction) from such tests that are most valuable for calculating an adverse-effect NOEC (no observed effect concentration) or adverse-effect EC10 (effective concentration for a 10% response) and subsequently deriving predicted no effect concentrations (PNECs). With current uncertainties, biomarker NOEC or biomarker EC10 data should not be used in isolation to derive PNECs. In the future, however, there may be scope to increasingly use biomarker data in environmental decision making, if plausible linkages can be made across levels of organization such that adverse outcomes might be envisaged relative to biomarker responses. For biomarkers to fulfil their potential, they should be mechanistically relevant and reproducible (as measured by interlaboratory comparisons of the same protocol). VTG is a good example of such a biomarker in that it provides an insight into the mode of action (estrogenicity) that is vital to fish reproductive health. Interlaboratory reproducibility data for VTG are also encouraging; recent comparisons (using the same immunoassay protocol) have provided coefficients of variation (CVs) of 38-55% (comparable to published CVs of 19-58% for fish survival and growth end points used in regulatory test guidelines). While concern over environmental xenoestrogens has led to the evaluation of reproductive biomarkers in fish, it must be remembered that many substances act via diverse mechanisms of action, such that the environmental risk assessment for EDCs is a broad and complex issue. Also, biomarkers such as secondary sexual characteristics, gonadosomatic indices, plasma steroids, and gonadal histology have significant potential for guiding interspecies assessments of EDCs and designing fish chronic tests. To strengthen the utility of EDC biomarkers in fish, we need to establish a historical control database (also considering natural variability) to help differentiate between statistically detectable and biologically significant responses. In conclusion, as research continues to develop a range of useful EDC biomarkers, environmental decision-making needs to move forward, and it is proposed that the "biomarkers as signposts" approach is a pragmatic way forward in the current risk assessment of EDCs.
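Where the abstract contrasts NOEC-style with ECx-style effect metrics, the sketch below illustrates one common way an EC10 is obtained: fitting a log-logistic concentration-response curve and solving for the concentration giving a 10% effect. The data points and the two-parameter model are hypothetical, not taken from any particular fish chronic test.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical concentration-response data (response expressed as a fraction of the control).
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])           # exposure concentrations
response = np.array([1.00, 0.98, 0.92, 0.70, 0.35, 0.10])   # e.g. relative fecundity

def log_logistic(c, ec50, slope):
    """Two-parameter log-logistic model with upper limit 1 and lower limit 0."""
    return 1.0 / (1.0 + (c / ec50) ** slope)

(ec50, slope), _ = curve_fit(log_logistic, conc, response,
                             p0=[3.0, 1.0], bounds=([0.01, 0.1], [100.0, 10.0]))

# Concentration producing a 10% reduction from control, i.e. a response of 0.9.
ec10 = ec50 * (0.1 / 0.9) ** (1.0 / slope)
print(f"EC50 = {ec50:.2f}, EC10 = {ec10:.2f} (same units as the concentrations)")
```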

Relevance: 30.00%

Abstract:

OBJECTIVE: To assess the influence of recipient's and donor's factors as well as surgical events on the occurrence of reperfusion injury after lung transplantation. DESIGN AND SETTING: Retrospective study in the surgical intensive care unit (ICU) of a university hospital. METHODS: We collected data on 60 lung transplantation donor/recipient pairs from June 1993 to May 2001, and compared the demographic, peri- and postoperative variables of patients who experienced reperfusion injury (35%) and those who did not. RESULTS: The occurrence of high systolic pulmonary pressure immediately after transplantation and/or its persistence during the first 48 h after surgery was associated with reperfusion injury, independently of preoperative values. Reperfusion injury was associated with difficult hemostasis during transplantation (p = 0.03). Patients with reperfusion injury were more likely to require the administration of catecholamine during the first 48 h after surgery (p = 0.014). The extubation was delayed (p = 0.03) and the relative odds of ICU mortality were significantly greater (OR 4.8, 95% CI: 1.06, 21.8) in patients with reperfusion injury. Our analysis confirmed that preexisting pulmonary hypertension increased the incidence of reperfusion injury (p < 0.01). CONCLUSIONS: Difficulties in perioperative hemostasis were associated with reperfusion injury. Occurrence of reperfusion injury was associated with postoperative systolic pulmonary hypertension, longer mechanical ventilation and higher mortality. Whether early recognition and treatment of pulmonary hypertension during transplantation can prevent the occurrence of reperfusion injury needs to be investigated.
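The relative odds of ICU mortality quoted above are an odds ratio; the sketch below shows how such an estimate and a Woolf-type confidence interval are computed from a 2x2 table. The cell counts are hypothetical and do not reproduce the study's figure of 4.8.

```python
from math import exp, log, sqrt

# Hypothetical 2x2 table (died / survived by reperfusion-injury status); illustrative only.
died_ri, alive_ri = 5, 16          # patients with reperfusion injury (21 of 60)
died_no_ri, alive_no_ri = 2, 37    # patients without reperfusion injury (39 of 60)

odds_ratio = (died_ri / alive_ri) / (died_no_ri / alive_no_ri)
se_log_or = sqrt(1 / died_ri + 1 / alive_ri + 1 / died_no_ri + 1 / alive_no_ri)
ci = (exp(log(odds_ratio) - 1.96 * se_log_or), exp(log(odds_ratio) + 1.96 * se_log_or))
print(f"OR {odds_ratio:.1f} (95% CI {ci[0]:.2f} to {ci[1]:.1f})")
```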

Relevance: 30.00%

Abstract:

BACKGROUND AND PURPOSE: Time delays from stroke onset to arrival at the hospital are the main obstacles for widespread use of thrombolysis. In order to decrease the delays, educational campaigns try to inform the general public how to act optimally in case of stroke. To determine the content of such a campaign, we assessed the stroke knowledge in our population. METHODS: Stroke knowledge was studied by means of a closed-ended questionnaire. 422 randomly chosen inhabitants of Bern, Switzerland, were interviewed. RESULTS: The knowledge of stroke warning signs (WS) was classified as good in 64.7%. A good knowledge of stroke risk factors (RF) was noted in 6.4%. 4.2% knew both the WS and the RF of stroke, indicating a very good global knowledge of stroke. Only 8.3% recognized TIA as symptoms of stroke resolving within 24 hours, and only 2.8% identified TIA as a disease requiring immediate medical help. In multivariate analysis, being a woman, advancing age, and having an afflicted relative were associated with a good knowledge of WS (p = 0.048, p < 0.001, and p = 0.043). Good knowledge of RF was related to university education (p < 0.001). Good knowledge of TIA did not depend on age, sex, level of education, or having an afflicted relative. CONCLUSIONS: The study brings to light relevant deficits in stroke knowledge in our population. A small number of participants could recognize TIA as stroke-related symptoms resolving completely within 24 hours. Only a third of the surveyed persons would seek immediate medical help in case of TIA. The information obtained will be used in the development of future educational campaigns.
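Survey percentages such as the 64.7% above carry sampling uncertainty; a short sketch of a Wilson score interval for that proportion, given the 422 respondents (a routine calculation, not one reported in the abstract):

```python
from math import sqrt

n, p = 422, 0.647   # respondents and the proportion with good knowledge of warning signs
z = 1.96

# Wilson score interval for a binomial proportion.
centre = (p + z**2 / (2 * n)) / (1 + z**2 / n)
half_width = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / (1 + z**2 / n)
print(f"95% CI: {centre - half_width:.1%} to {centre + half_width:.1%}")
```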