983 results for fixed-width confidence interval
Abstract:
Introduction.- Knowledge of predictors of an unfavourable outcome, e.g. non-return to work after an injury, makes it possible to identify patients at risk and to target interventions at modifiable predictors. It has recently been shown that INTERMED, a tool measuring biopsychosocial complexity in four domains (biological, psychological, social and care; total score 0-60 points), can be useful in this context. The aim of this study was to build a predictive model for non-return to work using INTERMED in patients undergoing vocational rehabilitation after orthopaedic injury. Patients and methods.- In this longitudinal prospective study, the cohort consisted of 2156 consecutively included inpatients with orthopaedic trauma attending a rehabilitation hospital after a work-, traffic- or sport-related injury. Two years after discharge, a questionnaire regarding return to work was sent (1502 questionnaires were returned). In addition to INTERMED, 18 predictors known at the start of rehabilitation were selected based on previous research. A multivariable logistic regression was performed. Results.- In the multivariable model, not returning to work at 2 years was significantly predicted by the INTERMED score: odds ratio (OR) 1.08 (95% confidence interval, CI [1.06; 1.11]) per one-point increase on the scale; by qualified work status before the injury, OR = 0.74, CI (0.54; 0.99); by using French as the preferred language, OR = 0.60, CI (0.45; 0.80); by upper-extremity injury, OR = 1.37, CI (1.03; 1.81); by higher education (> 9 years), OR = 0.74, CI (0.55; 1.00); and by a 10-year increase in age, OR = 1.15, CI (1.02; 1.29). The area under the receiver operating characteristic (ROC) curve was 0.733 for the full model (INTERMED plus 18 variables). Discussion.- These results confirm that the total INTERMED score is a significant predictor of return to work. The full model combining the 18 predictors with the total INTERMED score has good predictive value. However, the number of variables to measure (19) is high for use as a screening tool in the clinic.
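The modelling step above (a multivariable logistic regression with odds ratios, 95% CIs and an ROC AUC for the full model) can be sketched in Python. The DataFrame `df` and its column names below are illustrative assumptions, not the study's actual variables.

```python
# Hypothetical sketch, not the study's code: multivariable logistic regression
# of non-return to work on the INTERMED total score plus baseline covariates,
# reporting odds ratios with 95% CIs and the ROC AUC of the full model.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score

def fit_return_to_work_model(df: pd.DataFrame) -> None:
    predictors = ["intermed_total", "qualified_work", "french_preferred",
                  "upper_extremity_injury", "education_gt9y", "age_decades"]
    X = sm.add_constant(df[predictors])
    y = df["no_return_to_work"]          # 1 = not back at work at 2 years

    model = sm.Logit(y, X).fit(disp=False)

    # Odds ratios and Wald 95% confidence intervals on the OR scale
    ors = np.exp(model.params)
    ci = np.exp(model.conf_int())
    print(pd.concat([ors.rename("OR"),
                     ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))

    # Discrimination of the full model (area under the ROC curve)
    auc = roc_auc_score(y, model.predict(X))
    print(f"AUC = {auc:.3f}")
```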
Abstract:
Purpose: There is evidence indicating that adolescent females smoke as a way to control weight. The aim of our research was to assess whether daily smoking is a marker for weight control practices among adolescent females. Methods: Data were drawn from the 2002 Swiss Multicenter Adolescent Survey on Health (SMASH02) database, a survey including 7,548 in-school adolescents (3,838 females) aged 16-20 years in Switzerland. Among females self-reporting BMI (N = 3,761), two groups were drawn: daily smokers (DS, N = 1,273), comprising all those smoking at least 1 cigarette/day, and never smokers (NS, N = 1,888), comprising all those who had never smoked. Former (N = 177) and occasional (N = 423) smokers were not included. Groups were compared on weight control practices (being on a diet, self-induced vomiting, use of doctor-prescribed or over-the-counter appetite suppressants), controlling for possible confounding variables (age, BMI, feeling fat, body image, use of other substances, depression, sport practice, academic track and perceived advanced puberty). Analyses were performed with STATA 9. Bivariate analyses are presented as point prevalence, and multivariate analyses (using logistic regression) are presented as adjusted odds ratios (AOR) with [95% confidence interval]. Results: In the bivariate analysis, DS females were significantly more likely (p < 0.001) than NS to be on a diet (DS: 33.2%, NS: 22.2%), to self-induce vomiting (DS: 9.0%, NS: 3.3%), and to use doctor-prescribed (DS: 2.3%, NS: 0.9%) or over-the-counter (DS: 3.2%, NS: 1.2%) appetite suppressants. In the multivariate analysis, DS females were more likely than NS to be on a diet (AOR: 1.40 [1.17/1.68]), to self-induce vomiting (AOR: 2.07 [1.45/2.97]), and to use doctor-prescribed appetite suppressants (AOR: 1.99 [1.00/3.96]). Conclusions: Weight control practices are more frequent among female daily smokers than among never smokers. This finding seems to confirm cigarette smoking as a way to control weight among adolescent females. Health professionals should ask adolescent female smokers about weight control practices, and this association must be kept in mind when discussing tobacco cessation options with adolescent females. Sources of Support: The SMASH02 survey was funded by the Swiss Federal Office of Public Health and the participating cantons.
Abstract:
BACKGROUND: The risk of end stage renal disease (ESRD) is increased among individuals with low income and in low income communities. However, few studies have examined the relation of both individual and community socioeconomic status (SES) with incident ESRD. METHODS: Among 23,314 U.S. adults in the population-based Reasons for Geographic and Racial Differences in Stroke study, we assessed participant differences across geospatially-linked categories of county poverty [outlier poverty, extremely high poverty, very high poverty, high poverty, neither (reference), high affluence and outlier affluence]. Multivariable Cox proportional hazards models were used to examine associations of annual household income and geospatially-linked county poverty measures with incident ESRD, while accounting for death as a competing event using the Fine and Gray method. RESULTS: There were 158 ESRD cases during follow-up. Incident ESRD rates were 178.8 per 100,000 person-years (10^5 py) in high poverty outlier counties and 76.3/10^5 py in affluent outlier counties, p trend = 0.06. In unadjusted competing risk models, persons residing in high poverty outlier counties had a higher incidence of ESRD (which was not statistically significant) when compared to persons residing in counties with neither high poverty nor affluence [hazard ratio (HR) 1.54, 95% Confidence Interval (CI) 0.75-3.20]. This association was markedly attenuated following adjustment for socio-demographic factors (age, sex, race, education, and income); HR 0.96, 95% CI 0.46-2.00. However, in the same adjusted model, income was independently associated with risk of ESRD [HR 3.75, 95% CI 1.62-8.64, comparing the < $20,000 income group to the > $75,000 group]. There were no statistically significant associations of county measures of poverty with incident ESRD, and no evidence of effect modification. CONCLUSIONS: In contrast to annual family income, geospatially-linked measures of county poverty have little relation with risk of ESRD. Efforts to mitigate socioeconomic disparities in kidney disease may be best targeted at the individual level.
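The competing-risks analysis above uses the Fine and Gray subdistribution hazard model, which is commonly fitted in R with the cmprsk package. As a minimal Python sketch of the same competing-risks idea, the snippet below estimates the cumulative incidence of ESRD while treating death as a competing event, using lifelines' Aalen-Johansen estimator (a related but different technique); the DataFrame and column names are hypothetical.

```python
# Illustrative sketch only: cumulative incidence of ESRD with death as a
# competing event, via the Aalen-Johansen estimator in lifelines.
import pandas as pd
from lifelines import AalenJohansenFitter

def esrd_cumulative_incidence(df: pd.DataFrame) -> pd.DataFrame:
    # event_code: 0 = censored, 1 = incident ESRD, 2 = death before ESRD
    ajf = AalenJohansenFitter()
    ajf.fit(durations=df["years_followup"],
            event_observed=df["event_code"],
            event_of_interest=1)          # ESRD is the event of interest
    return ajf.cumulative_density_        # cumulative incidence of ESRD over time
```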
Abstract:
BACKGROUND. Glomerular hyperfiltration (GHF) is a well-recognized early renal alteration in diabetic patients. As the prevalence of GHF is largely unknown in populations in the African region with respect to normal fasting glucose (NFG), impaired fasting glucose (IFG) and type 2 diabetes [diabetes mellitus (DM)], we conducted a cross-sectional study in the Seychelles islands among families including at least one member with hypertension. METHODS. The glomerular filtration rate (GFR), effective renal plasma flow (ERPF) and proximal tubular sodium reabsorption were measured using inulin, p-aminohippurate (PAH) and endogenous lithium clearance, respectively. Twenty-four-hour urine was collected on the preceding day. RESULTS. Of the 363 participants (mean age 44.7 years), 6.6% had IFG, 9.9% had DM and 63.3% had hypertension. The prevalence of GHF, defined as a GFR >140 ml/min, was 17.2%, 29.2% and 52.8% in NFG, IFG and DM, respectively (P trend <0.001). Compared to NFG, the adjusted odds ratio for GHF was 1.99 [95% confidence interval (CI) 0.73-5.44] for IFG and 5.88 (95% CI 2.39-14.45) for DM. Lithium clearance and fractional excretion of lithium were lower in DM and IFG than NFG (P < 0.001). CONCLUSION. In this population of African descent, subjects with impaired fasting glucose or type 2 diabetes had a high prevalence of GHF and enhanced proximal sodium reabsorption. These findings provide further insight into the elevated incidence of nephropathy reported among African diabetic individuals.
Abstract:
BACKGROUND: Multiple risk prediction models have been validated in all-age patients presenting with acute coronary syndrome (ACS) and treated with percutaneous coronary intervention (PCI); however, they have not been validated specifically in the elderly. METHODS: We calculated the GRACE (Global Registry of Acute Coronary Events) score, the logistic EuroSCORE, the AMIS (Acute Myocardial Infarction Swiss registry) score, and the SYNTAX (Synergy between Percutaneous Coronary Intervention with TAXUS and Cardiac Surgery) score in a consecutive series of 114 patients ≥75 years presenting with ACS and treated with PCI within 24 hours of hospital admission. Patients were stratified according to score tertiles and analysed retrospectively by comparing the lower/mid tertiles as an aggregate group with the upper tertile group. The primary endpoint was 30-day mortality. Secondary endpoints were the composite of death and major adverse cardiovascular events (MACE) at 30 days, and 1-year MACE-free survival. Model discrimination ability was assessed using the area under the receiver operating characteristic curve (AUC). RESULTS: Thirty-day mortality was higher in the upper tertile compared with the aggregate lower/mid tertiles according to the logistic EuroSCORE (42% vs 5%; odds ratio [OR] = 14, 95% confidence interval [CI] = 4-48; p <0.001; AUC = 0.79), the GRACE score (40% vs 4%; OR = 17, 95% CI = 4-64; p <0.001; AUC = 0.80), the AMIS score (40% vs 4%; OR = 16, 95% CI = 4-63; p <0.001; AUC = 0.80), and the SYNTAX score (37% vs 5%; OR = 11, 95% CI = 3-37; p <0.001; AUC = 0.77). CONCLUSIONS: In elderly patients presenting with ACS and referred for PCI within 24 hours of admission, the GRACE score, the EuroSCORE, the AMIS score, and the SYNTAX score predicted 30-day mortality. The predictive value of the clinical scores was improved by using them in combination.
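A rough sketch of the score evaluation described above: stratify a risk score into tertiles, compare 30-day mortality in the upper tertile with the pooled lower/mid tertiles, and quantify discrimination with the AUC. `df`, `score_col` and `died_30d` are assumed, hypothetical names.

```python
# Illustrative sketch: tertile stratification and AUC for a risk score.
import pandas as pd
from sklearn.metrics import roc_auc_score

def evaluate_score(df: pd.DataFrame, score_col: str) -> None:
    df = df.copy()
    df["tertile"] = pd.qcut(df[score_col], q=3, labels=["low", "mid", "high"])
    upper = df["tertile"] == "high"

    # 30-day mortality: upper tertile vs pooled lower/mid tertiles
    print("mortality, upper tertile:    ", df.loc[upper, "died_30d"].mean())
    print("mortality, lower/mid tertiles:", df.loc[~upper, "died_30d"].mean())

    # Discrimination of the continuous score
    print("AUC =", roc_auc_score(df["died_30d"], df[score_col]))
```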
Abstract:
OBJECTIVE: Hypopituitarism is associated with an increased mortality rate, but the reasons underlying this have not been fully elucidated. The purpose of this study was to evaluate mortality and associated factors within a large GH-replaced population of hypopituitary patients. DESIGN: In KIMS (Pfizer International Metabolic Database) 13,983 GH-deficient patients with 69,056 patient-years of follow-up were available. METHODS: This study analysed standardised mortality ratios (SMRs) by Poisson regression. IGF1 SDS was used as an indicator of adequacy of GH replacement. Statistical significance was set at P<0.05. RESULTS: All-cause mortality was 13% higher compared with normal population rates (SMR, 1.13; 95% confidence interval, 1.04-1.24). Factors significantly associated with mortality were female gender, younger age at follow-up, underlying diagnoses of Cushing's disease, craniopharyngioma and aggressive tumour, and the presence of diabetes insipidus. After controlling for confounding factors, there were statistically significant negative associations between IGF1 SDS after 1, 2 and 3 years of GH replacement and SMR. For cause-specific mortality, there was a negative association between 1-year IGF1 SDS and SMR for deaths from cardiovascular diseases (P=0.017) and malignancies (P=0.044). CONCLUSIONS: GH-replaced patients with hypopituitarism demonstrated a modest increase in mortality rate; this appears lower than that previously published in GH-deficient patients. Factors associated with increased mortality included female gender, younger attained age, aetiology and lower IGF1 SDS during therapy. These data indicate that GH replacement in hypopituitary adults with GH deficiency may be considered a safe treatment.
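The SMR analysis can be sketched as a Poisson regression of observed deaths with the log of expected deaths (from general-population rates) as an offset, so that exponentiated coefficients are standardised mortality ratios. The DataFrame `df` and its columns are assumptions, not the KIMS data.

```python
# Illustrative sketch: SMR modelling via Poisson regression with a
# log(expected deaths) offset. Each row of df is a stratum (e.g. by sex and
# attained age) with observed and expected death counts.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def smr_poisson(df: pd.DataFrame) -> pd.DataFrame:
    X = sm.add_constant(df[["female", "age_at_followup"]])
    model = sm.GLM(df["observed"], X,
                   family=sm.families.Poisson(),
                   offset=np.log(df["expected"])).fit()
    # exp(intercept) ~ SMR in the reference group; exp(covariate coefficients)
    # ~ relative SMRs per unit of the covariate
    smr = np.exp(model.params)
    ci = np.exp(model.conf_int())
    return pd.concat([smr.rename("SMR ratio"),
                      ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1)
```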
Abstract:
BACKGROUND: Prevalence of hypertension in HIV infection is high, and information on blood pressure control in HIV-infected individuals is insufficient. We modeled blood pressure over time and the risk of cardiovascular events in hypertensive HIV-infected individuals. METHODS: All patients from the Swiss HIV Cohort Study with confirmed hypertension (systolic or diastolic blood pressure above 139 or 89 mm Hg on 2 consecutive visits and presence of at least 1 additional cardiovascular risk factor) between April 1, 2000 and March 31, 2011 were included. Patients with previous cardiovascular events, patients already on antihypertensive drugs, and pregnant women were excluded. Change in blood pressure over time was modeled using linear mixed models for repeated measurements. RESULTS: Hypertension was diagnosed in 2595 of 10,361 eligible patients. Of those, 869 initiated antihypertensive treatment. For patients treated for hypertension, we found a mean (95% confidence interval) change in systolic and diastolic blood pressure of -0.82 (-1.06 to -0.58) and -0.89 (-1.05 to -0.73) mm Hg per year, respectively. Factors associated with a decline in systolic blood pressure were baseline blood pressure, presence of chronic kidney disease, cardiovascular events, and the typical risk factors for cardiovascular disease. In patients with hypertension, increase in systolic blood pressure [hazard ratio 1.18 (1.06 to 1.32) per 10 mm Hg increase], total cholesterol, smoking, age, and cumulative exposure to protease inhibitor-based and triple nucleoside regimens were associated with cardiovascular events. CONCLUSIONS: Insufficient control of hypertension was associated with increased risk of cardiovascular events, indicating the need for improved management of hypertension in HIV-infected individuals.
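A minimal sketch of the repeated-measures model described above: a linear mixed model for systolic blood pressure over time with a random intercept per patient, fitted with statsmodels. The long-format DataFrame `bp` and its column names are assumptions.

```python
# Illustrative sketch: linear mixed model for repeated blood pressure
# measurements, random intercept per patient.
import statsmodels.formula.api as smf

def fit_bp_trajectory(bp):
    # bp columns (assumed): patient_id, sbp, years_on_treatment, age, ckd (0/1)
    model = smf.mixedlm("sbp ~ years_on_treatment + age + ckd",
                        data=bp, groups=bp["patient_id"])
    result = model.fit()
    # The years_on_treatment coefficient estimates the mean annual change in
    # systolic blood pressure (mm Hg per year) while on treatment.
    return result.summary()
```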
Abstract:
Bacterial factors may contribute to the global emergence and spread of drug-resistant tuberculosis (TB). Only a few studies have reported on the interactions between different bacterial factors. We studied drug-resistant Mycobacterium tuberculosis isolates from a nationwide study conducted from 2000 to 2008 in Switzerland. We determined quantitative drug resistance levels of first-line drugs by using Bactec MGIT-960 and drug resistance genotypes by sequencing the hot-spot regions of the relevant genes. We determined recent transmission by molecular methods and collected clinical data. Overall, we analyzed 158 isolates that were resistant to isoniazid, rifampin, or ethambutol, 48 (30.4%) of which were multidrug resistant. Among 154 isoniazid-resistant strains, katG mutations were associated with high-level and inhA promoter mutations with low-level drug resistance. Only katG(S315T) (65.6% of all isoniazid-resistant strains) and inhA promoter -15C/T (22.7%) were found in molecular clusters. M. tuberculosis lineage 2 (includes Beijing genotype) was associated with any drug resistance (adjusted odds ratio [OR], 3.0; 95% confidence interval [CI], 1.7 to 5.6; P < 0.0001). Lineage 1 was associated with inhA promoter -15C/T mutations (OR, 6.4; 95% CI, 2.0 to 20.7; P = 0.002). We found that the genetic strain background influences the level of isoniazid resistance conveyed by particular mutations (interaction tests of drug resistance mutations across all lineages; P < 0.0001). In conclusion, M. tuberculosis drug resistance mutations were associated with various levels of drug resistance and transmission, and M. tuberculosis lineages were associated with particular drug resistance-conferring mutations and phenotypic drug resistance. Our study also supports a role for epistatic interactions between different drug resistance mutations and strain genetic backgrounds in M. tuberculosis drug resistance.
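The lineage-resistance associations above are adjusted odds ratios from multivariable models; as a simpler illustrative sketch, the snippet below computes a crude odds ratio with its 95% CI from a 2x2 table (lineage 2 versus other lineages, by resistant versus susceptible) using statsmodels. The counts are made up.

```python
# Illustrative sketch with hypothetical counts, not the study's data.
import numpy as np
from statsmodels.stats.contingency_tables import Table2x2

# rows: lineage 2 yes/no; columns: any drug resistance yes/no
table = np.array([[30, 20],
                  [128, 250]])
t22 = Table2x2(table)
print(f"crude OR = {t22.oddsratio:.2f}")
print("95% CI   =", t22.oddsratio_confint())
print(f"p-value  = {t22.oddsratio_pvalue():.4f}")
```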
Abstract:
Purpose: Young cannabis users are at increased risk of cigarette initiation and later progression to nicotine dependence. The present study assesses to what extent cannabis users are exposed to nicotine through mulling, a widespread practice consisting of mixing tobacco with cannabis for its consumption. Methods: Data come from an ongoing observational study taking place in Switzerland. A total of 267 eligible participants (mean age 19 years, 46.4% males) completed an anonymous self-administered questionnaire on their tobacco and cannabis intake in the previous 5 days. They also provided a urine sample that was blindly analyzed for cotinine (a key metabolite of nicotine) using liquid chromatography coupled with mass spectrometry. After the exclusion of cannabis users who had not smoked at least one joint/blunt in which tobacco had been mixed (n = 2), and participants reporting sources of nicotine exposure other than cigarettes or mulling (n = 37), four groups were created: cannabis and cigarette abstainers (ABS, n = 69), cannabis-only smokers (CAS, n = 33), cigarette-only smokers (CIS, n = 62), and cannabis and cigarette smokers (CCS, n = 64). Cotinine measures of CAS were compared to those of ABS, CIS and CCS. All comparisons were performed using ANCOVA, controlling for age, gender, ethnicity, BMI and environmental exposure to cigarette smoke in the past month (at home, in school/at work, in social settings). The number of mixed joints/blunts smoked in the previous 5 days was additionally taken into account when comparing CAS to CCS. Cotinine values (ng/ml) are reported as means with 95% confidence interval (95% CI). Results: In the previous 5 days, CAS had smoked on average 10 mixed joints/blunts, CIS 30 cigarettes, and CCS 8 mixed joints/blunts and 41 cigarettes. Cotinine levels differed considerably between groups. The lowest measure was found among ABS (3.2 [0.5-5.9]), followed in increasing order by CAS (294.6 [157.1-432.0]), CIS (362.8 [258.4-467.3]), and CCS (649.9 [500.7-799.2]). In the multivariate analysis, cotinine levels of CAS were significantly higher than those of ABS (p < .001), lower than those of CCS (p = .003), but did not differ from levels of CIS (p = .384). Conclusions: Our study reveals that cannabis users are significantly exposed to nicotine through mulling, even after controlling for several possible confounders such as environmental exposure to cigarette smoke. Moreover, mixing tobacco with cannabis can result in substantial nicotine exposure, as cotinine levels of cannabis-only smokers were as high as those of moderate cigarette smokers. Our findings also suggest that mulling adds to the already substantial nicotine exposure of cigarette smokers. Because of the addictiveness of nicotine, mulling should be part of a comprehensive assessment of substance use among adolescents and young adults, especially when supporting their cannabis and cigarette quitting attempts. Sources of Support: This study was funded by the Public Health Service of the Canton de Vaud. Dr. Bélanger's contribution was made possible through grants from the Royal College of Physicians and Surgeons of Canada, the CHUQ/CMDP Foundation and the Laval University McLaughlin program, Québec, Canada.
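The ANCOVA described above can be sketched as an ordinary least squares model of urinary cotinine on a categorical smoking-group term plus the covariates, followed by a type II ANOVA table for the adjusted group effect. `df` and its column names are assumptions, not the study data.

```python
# Illustrative sketch of an ANCOVA-style comparison of cotinine across groups
# (ABS, CAS, CIS, CCS) adjusting for covariates.
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

def cotinine_ancova(df):
    model = smf.ols("cotinine ~ C(group) + age + C(gender) + bmi + env_smoke",
                    data=df).fit()
    # Type II ANOVA table: tests the group effect adjusted for the covariates
    print(anova_lm(model, typ=2))
    # Adjusted group contrasts relative to the reference category of `group`
    print(model.params.filter(like="C(group)"))
    return model
```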
Abstract:
RATIONALE: Concomitant deep vein thrombosis (DVT) in patients with acute pulmonary embolism (PE) has an uncertain prognostic significance. OBJECTIVES: In a cohort of patients with PE, this study compared the risk of death in those with and those without concomitant DVT. METHODS: We conducted a prospective cohort study of outpatients diagnosed with a first episode of acute symptomatic PE. Patients underwent bilateral lower extremity venous compression ultrasonography to assess for concomitant DVT. MEASUREMENTS AND MAIN RESULTS: The primary study outcome, all-cause mortality, and the secondary outcome of PE-specific mortality were assessed during the 3 months of follow-up after PE diagnosis. Multivariate Cox proportional hazards regression was done to adjust for significant covariates. Of 707 patients diagnosed with PE, 51.2% (362 of 707) had concomitant DVT and 10.9% (77 of 707) died during follow-up. Patients with concomitant DVT had an increased all-cause mortality (adjusted hazard ratio [HR], 2.05; 95% confidence interval [CI], 1.24 to 3.38; P = 0.005) and PE-specific mortality (adjusted HR, 4.25; 95% CI, 1.61 to 11.25; P = 0.04) compared with those without concomitant DVT. In an external validation cohort of 4,476 patients with acute PE enrolled in the international multicenter RIETE Registry, concomitant DVT remained a significant predictor of all-cause (adjusted HR, 1.66; 95% CI, 1.28 to 2.15; P < 0.001) and PE-specific mortality (adjusted HR, 2.01; 95% CI, 1.18 to 3.44; P = 0.01). CONCLUSIONS: In patients with a first episode of acute symptomatic PE, the presence of concomitant DVT is an independent predictor of death in the ensuing 3 months after diagnosis. Assessment of the thrombotic burden should assist with risk stratification of patients with acute PE.
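A minimal sketch of the survival analysis reported above: a Cox proportional hazards model for all-cause mortality with concomitant DVT as the exposure, adjusted for covariates, using lifelines. The DataFrame `df` and its column names are hypothetical.

```python
# Illustrative sketch: adjusted Cox model for 3-month all-cause mortality.
from lifelines import CoxPHFitter

def fit_pe_mortality(df):
    # df columns (assumed): days_to_event, died (0/1), concomitant_dvt (0/1),
    # age, cancer (0/1)
    cph = CoxPHFitter()
    cph.fit(df[["days_to_event", "died", "concomitant_dvt", "age", "cancer"]],
            duration_col="days_to_event", event_col="died")
    cph.print_summary()          # hazard ratios with 95% confidence intervals
    return cph
```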
Abstract:
Purpose/Objective(s): Adenosquamous carcinoma (AC) of the head and neck is a distinct entity first described in 1968. Its natural history is more aggressive than that of squamous cell carcinoma, but this is based on very small series reported in the literature. The goal of this study was to assess the clinical profile, outcome, patterns of failure and prognostic factors in patients with AC of the head and neck treated with radiation therapy (RT) with or without chemotherapy (CT). Materials/Methods: Data from 18 patients with Stage I (n = 3), II (n = 1), III (n = 4), or IVa (n = 10) AC, treated between 1989 and 2009, were collected in a retrospective multicenter Rare Cancer Network study. Median age was 60 years (range, 48-73 years). Fourteen patients were male and 4 female. Risk factors, including perineural invasion, lymphangitis, vascular invasion and positive margins, were present in 83% of the patients. Tumor sites included the oral cavity in 4, oropharynx in 4, hypopharynx in 2, larynx in 2, salivary glands in 2, nasal vestibule in 2, nasopharynx in 1, and maxillary sinus in 1 patient. Surgery (S) was performed in all but 5 patients. S alone was performed in only 1 patient, and definitive RT alone in 3 patients. Fourteen patients received combined-modality treatment (S+RT in 10, RT+CT in 2, and all three modalities in 2 patients). Median RT dose to the primary and to the nodes was 66 Gy (range, 50-72 Gy) and 53 Gy (range, 44-66 Gy), respectively (1.8-2.0 Gy/fraction, 5 fractions/week). In 4 patients, the planning treatment volume included the primary tumor site only. Seven patients were treated with 2D RT, 7 with 3D conformal RT, and 2 with intensity-modulated RT. Results: After a median follow-up of 38 months (range, 9-62 months), 8 patients developed distant metastases (lung, bone, mediastinum, and liver), 6 presented nodal recurrences, and only 4 had a local relapse at the primary site (all in-field recurrences). At last follow-up, 6 patients were alive without disease, 1 was alive with disease, 9 had died from progressive disease, and 2 had died from intercurrent disease. The 3-year and median overall survival, disease-free survival (DFS) and locoregional control rates were 52% (95% confidence interval [CI]: 28-76%) and 39 months, 36% (95% CI: 13-49%) and 12 months, and 54% (95% CI: 26-82%) and 40 months, respectively. In multivariate analysis (Cox model), DFS was negatively influenced by the presence of extracapsular extension (p = 0.02) and advanced stage (IV versus I-III, p = 0.003). Conclusions: The overall prognosis of locoregionally advanced AC remains poor, and distant metastases and nodal relapse occur in almost half of the cases. However, local control is relatively good, and early-stage AC patients had prolonged DFS when treated with combined-modality treatment.
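The survival estimates quoted above (3-year rates with 95% CIs and median times) are Kaplan-Meier-type estimates; a minimal sketch with lifelines is shown below. `months` and `died` are assumed column names in a hypothetical DataFrame `df`.

```python
# Illustrative sketch: Kaplan-Meier overall survival with 95% CI and median.
from lifelines import KaplanMeierFitter

def overall_survival(df):
    kmf = KaplanMeierFitter()
    kmf.fit(durations=df["months"], event_observed=df["died"], label="OS")
    print("3-year OS:", float(kmf.survival_function_at_times(36).iloc[0]))
    print("95% CI near 36 months:\n", kmf.confidence_interval_.loc[:36].tail(1))
    print("median OS (months):", kmf.median_survival_time_)
    return kmf
```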
Abstract:
OBJECTIVES: Because early etiologic identification is critical to selecting appropriate specific status epilepticus (SE) management, we aimed to validate a clinical tool we developed that uses history and readily available investigations to guide prompt etiologic assessment. METHODS: This prospective multicenter study included all adult patients treated for SE of any cause except anoxia at four academic centers. The proposed tool is designed as a checklist covering frequent precipitating factors for SE. The study team completed the checklist at the time the patient was identified through the electroencephalography (EEG) request. Only information available in the emergency department or at the time of in-hospital SE identification was used. Concordance between the etiology indicated by the tool and the etiology determined at hospital discharge was analyzed, together with interrater agreement. RESULTS: Two hundred twelve patients were included. Concordance between the etiology hypothesis generated using the tool and the finally determined etiology was 88.7% (95% confidence interval (CI) 86.4-89.8) (κ = 0.88). Interrater agreement was 83.3% (95% CI 80.4-96) (κ = 0.81). SIGNIFICANCE: This tool is valid and reliable for early identification of the etiology of SE. Physicians managing patients in SE may benefit from using it to promptly identify the underlying etiology, thus facilitating selection of the appropriate treatment.
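The agreement statistics above (percent concordance and Cohen's kappa) can be reproduced on any pair of label sequences; the sketch below uses scikit-learn with made-up etiology labels.

```python
# Illustrative sketch with hypothetical labels, not the study data.
from sklearn.metrics import cohen_kappa_score

tool_etiology      = ["withdrawal", "stroke", "tumor", "metabolic", "stroke"]
discharge_etiology = ["withdrawal", "stroke", "tumor", "stroke",    "stroke"]

concordance = sum(a == b for a, b in zip(tool_etiology, discharge_etiology)) / len(tool_etiology)
kappa = cohen_kappa_score(tool_etiology, discharge_etiology)
print(f"concordance = {concordance:.1%}, kappa = {kappa:.2f}")
```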
Abstract:
BACKGROUND: Fractional flow reserve (FFR) has become an established tool for guiding treatment, but its graded relationship to clinical outcomes as modulated by medical therapy versus revascularization remains unclear. OBJECTIVES: The study hypothesized that FFR displays a continuous relationship between its numeric value and prognosis, such that lower FFR values confer a higher risk and therefore receive larger absolute benefits from revascularization. METHODS: A meta-analysis of study- and patient-level data investigated prognosis after FFR measurement. An interaction term between FFR and revascularization status allowed for an outcomes-based threshold. RESULTS: A total of 9,173 (study-level) and 6,961 (patient-level) lesions were included, with a median follow-up of 16 and 14 months, respectively. Clinical events increased as FFR decreased, and revascularization showed larger net benefit for lower baseline FFR values. Outcomes-derived FFR thresholds generally occurred around the range 0.75 to 0.80, although this estimate was limited by confounding by indication. FFR measured immediately after stenting also showed an inverse relationship with prognosis (hazard ratio: 0.86, 95% confidence interval: 0.80 to 0.93; p < 0.001). An FFR-assisted strategy led to revascularization roughly half as often as an anatomy-based strategy, but with 20% fewer adverse events and 10% better angina relief. CONCLUSIONS: FFR demonstrates a continuous and independent relationship with subsequent outcomes, modulated by medical therapy versus revascularization. Lesions with lower FFR values receive larger absolute benefits from revascularization. Measurement of FFR immediately after stenting also shows an inverse gradient of risk, likely from residual diffuse disease. An FFR-guided revascularization strategy significantly reduces events and increases freedom from angina with fewer procedures than an anatomy-based strategy.
Abstract:
BACKGROUND: Because of the known relationship between exposure to combination antiretroviral therapy and cardiovascular disease (CVD), it has become increasingly important to intervene against risk of CVD in human immunodeficiency virus (HIV)-infected patients. We evaluated changes in risk factors for CVD and the use of lipid-lowering therapy in HIV-infected individuals and assessed the impact of any changes on the incidence of myocardial infarction. METHODS: The Data Collection on Adverse Events of Anti-HIV Drugs Study is a collaboration of 11 cohorts of HIV-infected patients that included follow-up for 33,389 HIV-infected patients from December 1999 through February 2006. RESULTS: The proportion of patients at high risk of CVD increased from 35.3% during 1999-2000 to 41.3% during 2005-2006. Of 28,985 patients, 2801 (9.7%) initiated lipid-lowering therapy; initiation of lipid-lowering therapy was more common for those with abnormal lipid values and those with traditional risk factors for CVD (male sex, older age, higher body mass index [calculated as the weight in kilograms divided by the square of the height in meters], family and personal history of CVD, and diabetes mellitus). After controlling for these, use of lipid-lowering drugs became relatively less common over time. The incidence of myocardial infarction (0.32 cases per 100 person-years [PY]; 95% confidence interval [CI], 0.29-0.35 cases per 100 PY) appeared to remain stable. However, after controlling for changes in risk factors for CVD, the rate decreased over time (relative rate in 2003 [compared with 1999-2000], 0.73 cases per 100 PY [95% CI, 0.50-1.05 cases per 100 PY]; in 2004, 0.64 cases per 100 PY [95% CI, 0.44-0.94 cases per 100 PY]; in 2005-2006, 0.36 cases per 100 PY [95% CI, 0.24-0.56 cases per 100 PY]). Further adjustment for lipid levels attenuated the relative rates towards unity (relative rate in 2003 [compared with 1999-2000], 1.06 cases per 100 PY [95% CI, 0.63-1.77 cases per 100 PY]; in 2004, 1.02 cases per 100 PY [95% CI, 0.61-1.71 cases per 100 PY]; in 2005-2006, 0.63 cases per 100 PY [95% CI, 0.36-1.09 cases per 100 PY]). CONCLUSIONS: Although the CVD risk profile among patients in the Data Collection on Adverse Events of Anti-HIV Drugs Study has decreased since 1999, rates have remained relatively stable, possibly as a result of a more aggressive approach towards managing the risk of CVD.
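The incidence rates quoted above (events per 100 person-years with 95% CIs) can be computed with an exact (Garwood) Poisson confidence interval based on the chi-square distribution; the sketch below uses made-up counts.

```python
# Illustrative sketch: incidence rate per 100 person-years with an exact
# Poisson (Garwood) confidence interval. Counts are hypothetical.
from scipy.stats import chi2

def rate_per_100py(events: int, person_years: float, alpha: float = 0.05):
    rate = 100 * events / person_years
    lower = 100 * chi2.ppf(alpha / 2, 2 * events) / (2 * person_years) if events > 0 else 0.0
    upper = 100 * chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / (2 * person_years)
    return rate, lower, upper

print(rate_per_100py(events=345, person_years=107_000))   # hypothetical numbers
```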
Abstract:
BACKGROUND: We retrospectively reviewed the long-term outcome and late side effects of endometrial cancer (EC) patients treated with different techniques of postoperative radiotherapy (PORT). METHODS: Between 1999 and 2012, 237 patients with EC were treated with PORT. Two-dimensional external beam radiotherapy (2D-EBRT) was used in 69 patients (30 %), three-dimensional EBRT (3D-EBRT) in 51 (21 %), and intensity-modulated RT (IMRT) with helical Tomotherapy in 47 (20 %). All patients received a vaginal brachytherapy (VB) boost. Seventy patients (29 %) received VB alone. RESULTS: After a median of 68 months (range, 6-154) of follow-up, overall survival was 75 % [95 % confidence interval (CI), 69-81], disease-free survival was 72 % (95% CI, 66-78), cancer-specific survival was 85 % (95 % CI, 80-89), and locoregional control was 86 % (95 % CI, 81-91). The 5-year estimates of grade 3 or more toxicity and second cancer rates were 0 and 7 % (95 % CI, 1-13) for VB alone, 6 % (95 % CI, 1-11) and 0 % for IMRT + VB, 9 % (95 % CI, 1-17) and 5 % (95 % CI, 1-9) for 3D-EBRT + VB, and 22 % (95 % CI, 12-32) and 12 % (95 % CI, 4-20) for 2D-EBRT + VB (P = 0.002 and P = 0.01), respectively. CONCLUSIONS: Pelvic EBRT should be tailored to patients with high-risk EC because the severe late toxicity observed might outweigh the benefits. When EBRT is prescribed for EC, IMRT should be considered, because it was associated with a significant reduction of severe late side effects.