940 results for Risk adjustment
Abstract:
BACKGROUND: Extracapsular tumor spread (ECS) has been identified as a possible risk factor for breast cancer recurrence, but controversy exists regarding its role in decision making for regional radiotherapy. This study evaluates ECS as a predictor of local, axillary, and supraclavicular recurrence. PATIENTS AND METHODS: International Breast Cancer Study Group Trial VI accrued 1475 eligible pre- and perimenopausal women with node-positive breast cancer who were randomly assigned to receive three to nine courses of classical combination chemotherapy with cyclophosphamide, methotrexate, and fluorouracil. ECS status was determined retrospectively in 933 patients based on review of pathology reports. Cumulative incidence and hazard ratios (HRs) were estimated using methods for competing risks analysis. Adjustment factors included treatment group and baseline patient and tumor characteristics. The median follow-up was 14 years. RESULTS: In univariable analysis, ECS was significantly associated with supraclavicular recurrence (HR = 1.96; 95% confidence interval 1.23-3.13; P = 0.005). HRs for local and axillary recurrence were 1.38 (P = 0.06) and 1.81 (P = 0.11), respectively. Following adjustment for number of lymph node metastases and other baseline prognostic factors, ECS was not significantly associated with any of the three recurrence types studied. CONCLUSIONS: Our results indicate that the decision for additional regional radiotherapy should not be based solely on the presence of ECS.
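For reference, the competing-risks quantity estimated here, the cumulative incidence of recurrence type k by time t, is conventionally written (standard notation, not taken from the trial report) as

\[ \mathrm{CIF}_k(t) \;=\; \Pr(T \le t,\ \text{cause}=k) \;=\; \int_0^t S(u^-)\,\lambda_k(u)\,du, \]

where S(u^-) is the probability of being free of all recurrence types and death just before time u and \lambda_k(u) is the cause-specific hazard of recurrence type k; the reported hazard ratios are then obtained from a regression model on the cause-specific (or subdistribution) hazards.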
Abstract:
BACKGROUND: Elderly individuals who provide care to a spouse suffering from dementia bear an increased risk of coronary heart disease (CHD). OBJECTIVE: To test the hypothesis that the Framingham CHD Risk Score would be higher in dementia caregivers relative to non-caregiving controls. METHODS: We investigated 64 caregivers providing in-home care for their spouse with Alzheimer's disease and 41 gender-matched non-caregiving controls. All subjects (mean age 70 ± 8 years, 75% women, 93% Caucasian) had a negative history of CHD and cerebrovascular disease. The original Framingham CHD Risk Score was computed by adding up categorical scores for age, blood lipids, blood pressure, diabetes, and smoking, with adjustment made for sex. RESULTS: The average CHD risk score was higher in caregivers than in controls even when co-varying for socioeconomic status, health habits, medication, and psychological distress (8.0 ± 2.9 vs. 6.3 ± 3.0 points, p = 0.013). The difference showed a medium effect size (Cohen's d = 0.57). A relatively higher blood pressure in caregivers than in controls made the greatest contribution to this difference. The probability (area under the receiver operating characteristic curve) that a randomly selected caregiver had a greater CHD risk score than a randomly selected non-caregiver was 65.5%. CONCLUSIONS: Based on the Framingham CHD Risk Score, the potential to develop overt CHD in the following 10 years was predicted to be greater in dementia caregivers than in non-caregiving controls. The magnitude of the difference in CHD risk between caregivers and controls appears to be clinically relevant. Clinicians may want to monitor caregiving status as a routine part of the standard evaluation of their elderly patients' cardiovascular risk.
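The reported overlap statistic can be checked against the effect size: under a normality assumption, the probability that a randomly chosen caregiver scores higher than a randomly chosen control (the area under the ROC curve) equals Φ(d/√2). A minimal check in Python (scipy assumed available):

from scipy.stats import norm

d = 0.57                    # Cohen's d reported for the CHD risk score difference
auc = norm.cdf(d / 2**0.5)  # common-language effect size = AUC under normality
print(round(auc, 3))        # ~0.657, close to the reported 65.5%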
Abstract:
Because of the important morbidity and mortality associated with osteoporosis, it is essential to detect subjects at risk with screening methods such as bone quantitative ultrasound (QUS). Several studies have shown that QUS can predict fractures. None, however, has prospectively compared different QUS devices, and few quality control (QC) data have been published. The Swiss Evaluation of the Methods of Measurement of Osteoporotic Fracture Risk is a prospective multicenter study that compared three QUS devices for the assessment of hip fracture risk in a population of 7609 women aged ≥70 yr. Because the inclusion phase lasted 20 mo and 10 centers participated in this study, QC became a major issue. We therefore developed a QC procedure to assess the stability and precision of the devices, and for their cross-calibration. Our study focuses on the two heel QUS devices. The water bath system (Achilles+) had a higher precision than the dry system (Sahara). The QC results were highly dependent on temperature. QUS stability was acceptable, but the Sahara must be calibrated regularly. Sufficient homogeneity among all the Sahara devices could be demonstrated, whereas significant differences were found among the Achilles+ devices. For speed of sound, 52% of the differences among the Achilles+ devices were explained by the water's temperature. However, for broadband ultrasound attenuation, a maximal difference of 23% persisted after adjustment for temperature. Because such differences could influence measurements in vivo, it is crucial to develop standardized phantoms to be used in prospective multicenter studies.
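The device precision referred to here is typically summarized as a coefficient of variation over repeated phantom measurements; a minimal sketch with illustrative values (not the study's QC data):

import numpy as np

# repeated phantom measurements on one device (hypothetical BUA values, dB/MHz)
x = np.array([110.2, 109.8, 110.5, 109.9, 110.1])
cv_percent = 100 * x.std(ddof=1) / x.mean()   # short-term precision error as %CV
print(round(cv_percent, 2))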
Abstract:
Background Whereas it is well established that various soluble biomarkers can predict the level of liver fibrosis, their ability to predict liver-related clinical outcomes is less clearly established, in particular among HIV/viral hepatitis co-infected persons. We investigated plasma hyaluronic acid’s (HA) ability to predict the risk of liver-related events (LRE; hepatic coma or liver-related death) in the EuroSIDA study. Methods Patients included were positive for anti-HCV and/or HBsAg with at least one available plasma sample. The earliest collected plasma sample was tested for HA (normal range 0–75 ng/mL) and levels were associated with risk of LRE. Change in HA per year of follow-up was estimated after measuring HA levels in the latest sample before the LRE for those experiencing this outcome (cases) and in a random selection of one sixth of the remaining patients (controls). Results During a median of 8.2 years of follow-up, 84/1252 (6.7%) patients developed an LRE. Baseline median (IQR) HA in those without and with an LRE was 31.8 (17.2–62.6) and 221.6 ng/mL (74.9–611.3), respectively (p<0.0001). After adjustment, HA levels predicted the risk of an LRE; incidence rate ratios for HA levels 75–250 or ≥250 vs. <75 ng/mL were 5.22 (95% CI 2.86–9.26, p<0.0007) and 28.22 (95% CI 14.95–46.00, p<0.0001), respectively. Median HA levels increased substantially prior to developing an LRE (107.6 ng/mL, IQR 0.8 to 251.1), but remained stable for controls (1.0 ng/mL, IQR –5.1 to 8.2) (p<0.0001 comparing cases and controls), and greater increases predicted risk of an LRE in adjusted models (p<0.001). Conclusions An elevated level of plasma HA, particularly if the level further increases over time, substantially increases the risk of an LRE over the next five years. HA is an inexpensive, standardized and non-invasive supplement to other methods aimed at identifying HIV/viral hepatitis co-infected patients at risk of hepatic complications.
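Incidence rate ratios of the kind quoted are commonly estimated from a Poisson model with person-years of follow-up as the exposure term; a minimal statsmodels sketch with hypothetical aggregated counts (not the EuroSIDA data):

import numpy as np
import pandas as pd
import statsmodels.api as sm

# hypothetical event counts and person-years by baseline HA category
df = pd.DataFrame({
    "events":    [10, 25, 49],
    "pyears":    [5000.0, 1500.0, 800.0],
    "ha_75_250": [0, 1, 0],      # 75-250 ng/mL
    "ha_ge_250": [0, 0, 1],      # >=250 ng/mL; reference is <75 ng/mL
})
X = sm.add_constant(df[["ha_75_250", "ha_ge_250"]])
model = sm.GLM(df["events"], X, family=sm.families.Poisson(),
               exposure=df["pyears"]).fit()
print(np.exp(model.params))    # incidence rate ratios vs. the <75 ng/mL group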
Abstract:
Background Few data exist on tuberculosis (TB) incidence according to time from HIV seroconversion in high-income countries, or on whether rates following initiation of combination antiretroviral treatment (cART) differ from those soon after seroconversion. Methods Data on individuals with well estimated dates of HIV seroconversion were used to analyse post-seroconversion TB rates, ending at the earliest of 1 January 1997, death or last clinic visit. TB rates were also estimated following cART initiation, ending at the earliest of death or last clinic visit. Poisson models were used to examine the effect of current and past level of immunosuppression on TB risk after cART initiation. Results Of 19 815 individuals at risk during 1982–1996, TB incidence increased from 5.89/1000 person-years (PY) (95% CI 3.77 to 8.76) in the first year after seroconversion to 10.56 (4.83 to 20.04, p=0.01) at 10 years. Among 11 178 TB-free individuals initiating cART, the TB rate in the first year after cART initiation was 4.23/1000 PY (3.07 to 5.71) and dropped thereafter, remaining constant from year 2 onwards and averaging 1.64/1000 PY (1.29 to 2.05). Current CD4 count was inversely associated with TB rates, while nadir CD4 count was not associated with TB rates after adjustment for current CD4 count and HIV-RNA at cART initiation. Conclusions TB risk increases with duration of HIV infection in the absence of cART. Following cART initiation, TB incidence rates were lower than levels immediately following seroconversion. Implementation of current recommendations to prevent TB in early HIV infection could be beneficial.
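Rates per 1000 person-years with confidence intervals like those quoted can be reproduced with an exact (Garwood) Poisson interval; a small sketch with hypothetical counts:

from scipy.stats import chi2

def poisson_rate_ci(events, person_years, alpha=0.05):
    """Exact (Garwood) confidence interval for a Poisson rate."""
    lower = chi2.ppf(alpha / 2, 2 * events) / 2 if events > 0 else 0.0
    upper = chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / 2
    return lower / person_years, upper / person_years

# hypothetical: 20 TB cases over 3400 person-years, expressed per 1000 PY
lo, hi = poisson_rate_ci(20, 3400)
print(round(20 / 3400 * 1000, 2), round(lo * 1000, 2), round(hi * 1000, 2))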
Abstract:
Background Persons infected with human immunodeficiency virus (HIV) have increased rates of coronary artery disease (CAD). The relative contribution of genetic background, HIV-related factors, antiretroviral medications, and traditional risk factors to CAD has not been fully evaluated in the setting of HIV infection. Methods In the general population, 23 common single-nucleotide polymorphisms (SNPs) were shown to be associated with CAD through genome-wide association analysis. Using the Metabochip, we genotyped 1875 HIV-positive, white individuals enrolled in 24 HIV observational studies, including 571 participants with a first CAD event during the 9-year study period and 1304 controls matched on sex and cohort. Results A genetic risk score built from 23 CAD-associated SNPs contributed significantly to CAD (P = 2.9 × 10⁻⁴). In the final multivariable model, participants with an unfavorable genetic background (top genetic score quartile) had a CAD odds ratio (OR) of 1.47 (95% confidence interval [CI], 1.05–2.04). This effect was similar to hypertension (OR = 1.36; 95% CI, 1.06–1.73), hypercholesterolemia (OR = 1.51; 95% CI, 1.16–1.96), diabetes (OR = 1.66; 95% CI, 1.10–2.49), ≥1 year lopinavir exposure (OR = 1.36; 95% CI, 1.06–1.73), and current abacavir treatment (OR = 1.56; 95% CI, 1.17–2.07). The effect of the genetic risk score was additive to the effect of nongenetic CAD risk factors, and did not change after adjustment for family history of CAD. Conclusions In the setting of HIV infection, the effect of an unfavorable genetic background was similar to traditional CAD risk factors and certain adverse antiretroviral exposures. Genetic testing may provide prognostic information complementary to family history of CAD.
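A genetic risk score of this kind is usually a weighted sum of risk-allele counts across the selected SNPs, with weights taken from published per-allele log odds ratios; an illustrative sketch with hypothetical SNPs and weights (not the study's 23 loci):

import numpy as np

# rows = individuals, columns = SNPs; entries are risk-allele counts (0, 1 or 2)
genotypes = np.array([
    [0, 1, 2],
    [1, 1, 0],
    [2, 0, 1],
])
log_or = np.log([1.10, 1.25, 1.08])           # hypothetical per-allele odds ratios

grs = genotypes @ log_or                      # weighted genetic risk score
top_quartile = grs >= np.quantile(grs, 0.75)  # "unfavorable genetic background"
print(np.round(grs, 3), top_quartile)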
Abstract:
Environmental factors can determine which group size will maximize the fitness of group members. This is particularly important in cooperative breeders, where group members often serve different purposes. Experimental studies are still lacking, however, that test whether an ecologically mediated need for help changes the propensity of dominant group members to accept immigrants. Here, we manipulated the perceived risk of predation for dominant breeders of the cooperatively breeding cichlid fish Neolamprologus pulcher to test their response to unrelated and previously unknown immigrants. Potential immigrants were more readily accepted if groups were exposed to fish predators or egg predators than to herbivorous fish or to control situations lacking predation risk. Our data are consistent with both risk-dilution and helping effects. Egg predators were presented before spawning, which might suggest that the fish adjust acceptance rates also to a potential future threat. Dominant group members of N. pulcher apparently consider both the present and the future need for help based on ecological demand. This suggests that acceptance of immigrants and, more generally, tolerance of group members on demand could be a widespread response to ecological conditions in cooperatively breeding animals.
Abstract:
BACKGROUND Empirical research has illustrated an association between study size and relative treatment effects, but conclusions have been inconsistent about the association of study size with risk of bias items. Small studies generally give imprecisely estimated treatment effects, and study variance can serve as a surrogate for study size. METHODS We conducted a network meta-epidemiological study analyzing 32 networks including 613 randomized controlled trials, and used Bayesian network meta-analysis and meta-regression models to evaluate the impact of trial characteristics and study variance on the results of network meta-analysis. We examined changes in relative effects and between-studies variation in network meta-regression models as a function of the variance of the observed effect size and of indicators for the adequacy of each risk of bias item. Adjustment was performed both within and across networks, allowing for between-networks variability. RESULTS Imprecise studies with large variances tended to exaggerate the effects of the active or new intervention in the majority of networks, with a ratio of odds ratios of 1.83 (95% CI: 1.09, 3.32). Inappropriate or unclear conduct of random sequence generation and allocation concealment, as well as lack of blinding of patients and outcome assessors, did not materially impact on the summary results. Imprecise studies also appeared to be more prone to inadequate conduct. CONCLUSIONS Compared with more precise studies, studies with large variance may give substantially different answers that alter the results of network meta-analyses for dichotomous outcomes.
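A much-simplified, non-Bayesian analogue of the adjustment described is a pairwise meta-regression of each study's log odds ratio on its variance, so that the intercept approximates the effect an infinitely precise study would give; a sketch with hypothetical study-level estimates (the paper's actual models are hierarchical and network-wide):

import numpy as np
import statsmodels.api as sm

# hypothetical study-level log odds ratios and their variances
log_or = np.array([-0.45, -0.30, -0.60, -0.10, -0.70])
var    = np.array([ 0.02,  0.05,  0.15,  0.01,  0.30])

X = sm.add_constant(var)                        # intercept + variance term
fit = sm.WLS(log_or, X, weights=1 / var).fit()  # inverse-variance weighting
print(fit.params)   # a nonzero slope suggests imprecise studies give larger effects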
Abstract:
Many persons in the U.S. gain weight during young adulthood, and the prevalence of obesity has been increasing among young adults. Although obesity and physical inactivity are generally recognized as risk factors for coronary heart disease (CHD), the magnitude of their effect on risk may have been seriously underestimated due to failure to adequately handle the problem of cigarette smoking. Since cigarette smoking causes weight loss, physically inactive cigarette smokers may remain relatively lean because they smoke cigarettes. We hypothesize that cigarette smoking modifies the association between weight gain during young adulthood and risk of coronary heart disease during middle age, and that the true effect of weight gain during young adulthood on risk of CHD can be assessed only in persons who have not smoked cigarettes. Specifically, we hypothesize that weight gain during young adulthood is positively associated with risk of CHD during middle age in nonsmokers but that the association is much smaller or absent entirely among cigarette smokers. The purpose of this study was to test this hypothesis. The population for analysis comprised 1,934 middle-aged, employed men whose average age at the baseline examination was 48.7 years. Information collected at the baseline examinations in 1958 and 1959 included recalled weight at age 20, present weight, height, smoking status, and other CHD risk factors. To decrease the effect of intraindividual variation, the mean values of the 1958 and 1959 baseline examinations were used in analyses. Change in body mass index (ΔBMI) during young adulthood was the primary exposure variable and was measured as BMI at baseline (kg/m²) minus BMI at age 20 (kg/m²). Proportional hazards regression analysis was used to generate relative risks of CHD mortality by category of ΔBMI and cigarette smoking status after adjustment for age, family history of CVD, major organ system disease, BMI at age 20, and number of cigarettes smoked per day. Adjustment was not performed for systolic blood pressure or total serum cholesterol, as these were regarded as intervening variables. Vital status was known for all men on the 25th anniversary of their baseline examinations. A total of 705 deaths (including 319 CHD deaths) occurred over 40,136 person-years of experience. ΔBMI was positively associated with risk of CHD mortality in never-smokers, but not in ever-smokers (p for interaction = 0.067). For never-smokers with ΔBMI categorized as stable, low gain, moderate gain, and high gain, adjusted relative risks were 1.00, 1.62, 1.61, and 2.78, respectively (p for trend = 0.010). For ever-smokers with ΔBMI categorized as stable, low gain, moderate gain, and high gain, adjusted relative risks were 1.00, 0.74, 1.07, and 1.06, respectively (p for trend = 0.422). These results support the research hypothesis that cigarette smoking modifies the association between weight gain and CHD mortality. Current estimates of the magnitude of the effect of obesity and physical inactivity on risk of coronary mortality may have been seriously underestimated due to inadequate handling of cigarette smoking.
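The proportional hazards analysis described can be sketched with the lifelines package; the simulated data and column names below are stand-ins for the cohort, not the study's variables:

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# simulated cohort: follow-up (years, censored at 25), CHD-death indicator,
# a high weight-gain indicator and baseline age (all hypothetical)
rng = np.random.default_rng(0)
n = 500
high_gain = rng.integers(0, 2, n)
age = rng.normal(48.7, 5, n)
rate = 0.02 * np.exp(0.6 * high_gain + 0.03 * (age - 48.7))
time = rng.exponential(1 / rate)
df = pd.DataFrame({"time": np.minimum(time, 25.0),
                   "event": (time < 25.0).astype(int),
                   "high_gain": high_gain, "age": age})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.hazard_ratios_)   # adjusted relative risk for high weight gain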
Abstract:
A cohort of 418 United States Air Force (USAF) personnel from over 15 different bases deployed to Morocco in 1994. This was the first study of its kind and was designed with two primary goals: to determine if the USAF was medically prepared to deploy with its changing mission in the new world order, and to evaluate factors that might improve or degrade USAF medical readiness. The mean length of deployment was 21 days. The cohort was 95% male, 86% enlisted, 65% married, and 78% white. This study shows major deficiencies indicating that the USAF medical readiness posture has not fully responded to meet its new mission requirements. Lack of required logistical items (e.g., mosquito nets, rainboots, DEET insecticide cream, etc.) revealed a low state of preparedness. The most notable deficiency was that 82.5% (95% CI = 78.4, 85.9) did not have permethrin-pretreated mosquito nets and 81.0% (95% CI = 76.8, 84.6) lacked mosquito net poles. Additionally, 18% were deficient on vaccinations and 36% had not received a tuberculin skin test. Excluding injections, the overall compliance for preventive medicine requirements had a mean frequency of only 50.6% (95% CI = 45.36, 55.90). Several factors had a positive impact on compliance with logistical requirements. The most prominent was "receiving a medical intelligence briefing" from USAF Public Health. After adjustment for mobility and age, individuals who underwent a briefing were 17.2 (95% CI = 4.37, 67.99) times more likely to have received an immunoglobulin shot and 4.2 (95% CI = 1.84, 9.45) times more likely to start their antimalarial prophylaxis at the proper time. "Personnel on mobility" had the second strongest positive effect on medical readiness. When mobility and briefing were included in models, "personnel on mobility" were 2.6 (95% CI = 1.19, 5.53) times as likely to have DEET insecticide and 2.2 (95% CI = 1.16, 4.16) times as likely to have had a TB skin test. Five recommendations to improve the medical readiness of the USAF were outlined: upgrade base-level logistical support, improve medical intelligence messages, include medical requirements on travel orders, place more personnel on mobility or only deploy personnel on mobility, and conduct research dedicated to capitalizing on the powerful effect of predeployment briefings. Since this is the first study of its kind, more studies should be performed in different geographic theaters to assess medical readiness and establish acceptable compliance levels for the USAF.
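Adjusted odds ratios like those quoted (the briefing effect adjusted for mobility and age) come from a logistic model; a minimal statsmodels sketch on simulated stand-in data (not the deployment records):

import numpy as np
import pandas as pd
import statsmodels.api as sm

# simulated records: outcome = received an immunoglobulin shot
rng = np.random.default_rng(1)
n = 400
briefing = rng.integers(0, 2, n)
mobility = rng.integers(0, 2, n)
age = rng.normal(28, 6, n)
logit_p = -1.0 + 2.0 * briefing + 0.7 * mobility + 0.02 * (age - 28)
got_shot = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(pd.DataFrame({"briefing": briefing,
                                  "mobility": mobility, "age": age}))
res = sm.Logit(got_shot, X).fit(disp=0)
print(np.exp(res.params))      # adjusted odds ratios
print(np.exp(res.conf_int()))  # 95% confidence intervals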
Abstract:
AIMS The genetic polymorphism of apolipoprotein E (APOE) has been suggested to modify the effect of smoking on the development of coronary artery disease (CAD) in apparently healthy persons. The interaction of these factors in persons undergoing coronary angiography is not known. METHODS AND RESULTS We analysed the association between APOE genotype, smoking, angiographic CAD, and mortality in 3263 participants of the LUdwigshafen RIsk and Cardiovascular Health study. APOE genotypes were associated with CAD [ε22 or ε23: odds ratio (OR) 0.56, 95% confidence interval (CI) 0.43-0.71; ε24 or ε34 or ε44: OR 1.10, 95% CI 0.89-1.37 compared with ε33] and moderately with cardiovascular mortality [ε22 or ε23: hazard ratio (HR) 0.71, 95% CI 0.51-0.99; ε33: HR 0.92, 95% CI 0.75-1.14 compared with ε24 or ε34 or ε44]. HRs for total mortality were 1.39 (95% CI 0.39-1.67), 2.29 (95% CI 1.85-2.83), 2.07 (95% CI 1.64-2.62), and 2.95 (95% CI 2.10-4.17) in ex-smokers, current smokers, current smokers without, and current smokers with one ε4 allele, respectively, compared with never-smokers. Carrying ε4 increased mortality in current smokers, but not in ex-smokers (HR 1.66, 95% CI 1.04-2.64 for the interaction). These findings applied to cardiovascular mortality, were robust against adjustment for cardiovascular risk factors, and were consistent across subgroups. No interaction of smoking and ε4 was seen regarding non-cardiovascular mortality. Smokers with ε4 had reduced average low-density lipoprotein (LDL) diameters, elevated oxidized LDL, and elevated lipoprotein-associated phospholipase A2. CONCLUSION In persons undergoing coronary angiography, there is a significant interaction between APOE genotype and smoking. The presence of the ε4 allele in current smokers increases cardiovascular and all-cause mortality.
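The genotype-by-smoking interaction reported above corresponds to a multiplicative term in the survival model; in generic notation (an illustration, not the authors' exact specification),

\[ h(t \mid x) \;=\; h_0(t)\,\exp\!\big(\beta_1\,\mathrm{smoker} + \beta_2\,\varepsilon4 + \beta_3\,(\mathrm{smoker}\times\varepsilon4)\big), \]

so the quoted interaction estimate (HR 1.66) plays the role of \exp(\beta_3), the additional factor by which carrying ε4 multiplies the hazard among current smokers beyond the two main effects.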
Abstract:
Aims: The reported rate of stent thrombosis (ST) after drug-eluting stent (DES) implantation varies among registries. To investigate differences in baseline characteristics and clinical outcome between European and Japanese all-comers registries, we performed a pooled analysis of patient-level data. Methods and results: The j-Cypher registry (JC) is a multicentre observational study conducted in Japan, including 12,824 patients undergoing sirolimus-eluting stent (SES) implantation. From the Bern-Rotterdam registry (BR), which enrolled patients at two academic hospitals in Switzerland and the Netherlands, 3,823 patients with SES were included in the current analysis. Patients in BR were younger, more frequently smokers, and presented more frequently with ST-elevation myocardial infarction (MI). Conversely, JC patients more frequently had diabetes and hypertension. At five years, the definite ST rate was significantly lower in JC than in BR (JC 1.6% vs. BR 3.3%, p<0.001), while the unadjusted mortality tended to be lower in BR than in JC (BR 13.2% vs. JC 14.4%, log-rank p=0.052). After adjustment, the j-Cypher registry was associated with a significantly lower risk of all-cause mortality (HR 0.56, 95% CI: 0.49-0.64) as well as of definite stent thrombosis (HR 0.46, 95% CI: 0.35-0.61). Conclusions: The baseline characteristics of the two large registries were different. After statistical adjustment, JC was associated with lower mortality and ST.
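The unadjusted five-year mortality comparison (log-rank p = 0.052) is the standard two-sample survival test; a minimal lifelines sketch with placeholder data (not the registry data):

from lifelines.statistics import logrank_test

# placeholder follow-up times (years) and death indicators for the two registries
t_jc, died_jc = [4.9, 5.0, 3.1, 5.0, 2.4, 5.0], [0, 0, 1, 0, 1, 0]
t_br, died_br = [5.0, 1.8, 5.0, 4.2, 0.9, 5.0], [0, 1, 0, 1, 1, 0]

result = logrank_test(t_jc, t_br, event_observed_A=died_jc, event_observed_B=died_br)
print(result.p_value)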
Abstract:
BACKGROUND Polypharmacy, defined as the concomitant use of multiple medications, is very common in the elderly and may trigger drug-drug interactions and increase the risk of falls in patients receiving vitamin K antagonists. OBJECTIVE To examine whether polypharmacy increases the risk of bleeding in elderly patients who receive vitamin K antagonists for acute venous thromboembolism (VTE). DESIGN We used a prospective cohort study. PARTICIPANTS In a multicenter Swiss cohort, we studied 830 patients aged ≥ 65 years with VTE. MAIN MEASURES We defined polypharmacy as the prescription of more than four different drugs. We assessed the association between polypharmacy and the time to a first major and clinically relevant non-major bleeding, accounting for the competing risk of death. We adjusted for known bleeding risk factors (age, gender, pulmonary embolism, active cancer, arterial hypertension, cardiac disease, cerebrovascular disease, chronic liver and renal disease, diabetes mellitus, history of major bleeding, recent surgery, anemia, thrombocytopenia) and periods of vitamin K antagonist treatment as a time-varying covariate. KEY RESULTS Overall, 413 (49.8 %) patients had polypharmacy. The mean follow-up duration was 17.8 months. Patients with polypharmacy had a significantly higher incidence of major (9.0 vs. 4.1 events/100 patient-years; incidence rate ratio [IRR] 2.18, 95 % confidence interval [CI] 1.32-3.68) and clinically relevant non-major bleeding (14.8 vs. 8.0 events/100 patient-years; IRR 1.85, 95 % CI 1.27-2.71) than patients without polypharmacy. After adjustment, polypharmacy was significantly associated with major (sub-hazard ratio [SHR] 1.83, 95 % CI 1.03-3.25) and clinically relevant non-major bleeding (SHR 1.60, 95 % CI 1.06-2.42). CONCLUSIONS Polypharmacy is associated with an increased risk of both major and clinically relevant non-major bleeding in elderly patients receiving vitamin K antagonists for VTE.
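The unadjusted incidence rate ratio can be recovered directly from the quoted rates (the confidence interval additionally needs the underlying event counts, which the abstract does not give):

rate_poly, rate_no_poly = 9.0, 4.1         # major bleeding, events per 100 patient-years
print(round(rate_poly / rate_no_poly, 2))  # ~2.2, in line with the reported IRR of 2.18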
Abstract:
OBJECTIVE Whether or not a high risk of falls increases the risk of bleeding in patients receiving anticoagulants remains a matter of debate. METHODS We conducted a prospective cohort study involving 991 patients ≥ 65 years of age who received anticoagulants for acute venous thromboembolism (VTE) at nine Swiss hospitals between September 2009 and September 2012. The study outcomes were the time to a first major bleeding episode and the time to a first clinically relevant nonmajor bleeding episode. We determined the associations between the risk of falls and the time to a first episode of bleeding using competing risk regression, accounting for death as a competing event. We adjusted for known bleeding risk factors and anticoagulation as a time-varying covariate. RESULTS Four hundred fifty-eight of 991 patients (46%) were at high risk of falls. The mean duration of follow-up was 16.7 months. Patients at high risk of falls had a higher incidence of major bleeding (9.6 vs. 6.6 events/100 patient-years; P = 0.05) and a significantly higher incidence of clinically relevant nonmajor bleeding (16.7 vs. 8.3 events/100 patient-years; P < 0.001) than patients at low risk of falls. After adjustment, a high risk of falls was associated with clinically relevant nonmajor bleeding [subhazard ratio (SHR) = 1.74, 95% confidence interval (CI) = 1.23-2.46], but not with major bleeding (SHR = 1.24, 95% CI = 0.83-1.86). CONCLUSION In elderly patients who receive anticoagulants because of VTE, a high risk of falls is significantly associated with clinically relevant nonmajor bleeding, but not with major bleeding. Whether or not a high risk of falls is a reason to withhold anticoagulation beyond 3 months should be decided on the basis of patient preferences and the risk of VTE recurrence.
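The sub-hazard ratios in this and the preceding study come from competing-risk regression of the Fine-Gray type, which models the subdistribution hazard of bleeding with death as the competing event; in standard notation (not taken from the paper),

\[ \lambda^{\mathrm{sub}}_k(t) \;=\; -\frac{d}{dt}\,\log\!\big(1-\mathrm{CIF}_k(t)\big), \qquad \lambda^{\mathrm{sub}}_k(t \mid x) \;=\; \lambda^{\mathrm{sub}}_{k,0}(t)\,\exp(\beta^{\top}x), \]

with \exp(\beta) reported as the subhazard ratio (SHR).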
Abstract:
OBJECTIVE To determine the prognostic accuracy of cardiac biomarkers, alone and in combination with clinical scores, in elderly patients with non-high-risk pulmonary embolism (PE). DESIGN Ancillary analysis of a Swiss multicentre prospective cohort study. SUBJECTS A total of 230 patients aged ≥65 years with non-high-risk PE. MAIN OUTCOME MEASURES The study end-point was a composite of PE-related complications, defined as PE-related death, recurrent venous thromboembolism or major bleeding during a follow-up of 30 days. The prognostic accuracy of the Pulmonary Embolism Severity Index (PESI), the Geneva Prognostic Score (GPS), N-terminal pro-brain natriuretic peptide (NT-proBNP) and high-sensitivity cardiac troponin T (hs-cTnT) was determined using sensitivity, specificity, predictive values, receiver operating characteristic (ROC) curve analysis, logistic regression and reclassification statistics. RESULTS The overall complication rate during follow-up was 8.7%. hs-cTnT achieved the highest prognostic accuracy [area under the ROC curve: 0.75, 95% confidence interval (CI): 0.63-0.86, P < 0.001]. At the predefined cut-off values, the negative predictive values of the biomarkers were above 95%. For levels above the cut-off, the risk of complications increased fivefold for hs-cTnT [odds ratio (OR): 5.22, 95% CI: 1.49-18.25] and 14-fold for NT-proBNP (OR: 14.21, 95% CI: 1.73-116.93) after adjustment for both clinical scores and renal function. Reclassification statistics indicated that adding hs-cTnT to the GPS or the PESI significantly improved the prognostic accuracy of both clinical scores. CONCLUSION In elderly patients with nonmassive PE, NT-proBNP or hs-cTnT could be an adequate alternative to clinical scores for identifying low-risk individuals suitable for outpatient management.
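The accuracy measures used here are straightforward to compute once the biomarker is dichotomized at its cut-off; a minimal sketch with scikit-learn and hypothetical values (not the cohort data):

import numpy as np
from sklearn.metrics import roc_auc_score

# hypothetical data: 1 = PE-related complication within 30 days
complication = np.array([1, 0, 0, 1, 0, 0, 0, 1, 0, 0])
hs_ctnt      = np.array([45, 8, 14, 60, 5, 12, 9, 30, 7, 11])  # ng/L
positive     = hs_ctnt >= 14                                    # assumed cut-off

tp = np.sum(positive & (complication == 1))
tn = np.sum(~positive & (complication == 0))
fp = np.sum(positive & (complication == 0))
fn = np.sum(~positive & (complication == 1))

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
npv = tn / (tn + fn)
auc = roc_auc_score(complication, hs_ctnt)   # area under the ROC curve
print(sensitivity, round(specificity, 2), npv, round(auc, 2))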