114 results for Leaflet incidence
Abstract:
The variation with latitude of incidence and mortality for cutaneous malignant melanoma (CMM) in the non-Maori population of New Zealand was assessed. For those aged 20 to 74 years, the effects of age, time period, birth cohort, gender, and region (latitude), as well as some interactions between them, were evaluated by log-linear regression methods. Age-standardized incidence and mortality rates increased with increasing proximity to the equator for both men and women. These latitude gradients were greater for males than for females. The relative risk of melanoma in the most southern part of New Zealand (latitude 44 degrees S) compared with the most northern region (latitude 36 degrees S) was 0.63 (95 percent confidence interval [CI] = 0.60-0.67) for incidence and 0.76 (CI = 0.68-0.86) for mortality, both genders combined. The mean percentage change in CMM rates per degree of latitude for males was greater than that reported in other published studies. Differences between men and women in melanoma risk with latitude suggest that regional sun-behavior patterns or other risk factors may contribute to the latitude gradient observed.
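As a rough illustration of the log-linear (Poisson) rate regression used to estimate a latitude gradient like the one above, a minimal Python sketch is given below. The file name, column names (cases, person_years, latitude, age_group, sex) and covariate set are hypothetical placeholders, not the authors' actual model.

```python
# Minimal sketch, assuming an aggregated table of case counts and person-years
# per latitude band, age group and sex (hypothetical columns).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("melanoma_rates.csv")  # hypothetical aggregated data

# Log-linear (Poisson) model of counts with a log person-years offset,
# so the latitude coefficient acts on the rate scale.
model = smf.glm(
    "cases ~ latitude + C(age_group) + C(sex)",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["person_years"]),
).fit()

beta = model.params["latitude"]
print(f"Rate ratio per degree of latitude: {np.exp(beta):.3f}")
print(f"Percent change in rate per degree: {(np.exp(beta) - 1) * 100:.1f}%")
```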
Abstract:
Objective. Mandibular osteoradionecrosis (ORN) is a serious complication of radiotherapy (RT) in head and neck cancer patients. The aim of this study was to analyze the incidence of and risk factors for mandibular ORN in squamous cell carcinoma (SCC) of the oral cavity and oropharynx. Study Design. Case series with chart review. Setting. University tertiary care center for head and neck oncology. Subjects and Methods. Seventy-three patients treated for stage I to IV SCC of the oral cavity and oropharynx between 2000 and 2007, with a minimum follow-up of 2 years, were included in the study. Treatment modalities included both RT with curative intent and adjuvant RT following tumor surgery. The log-rank test and Cox model were used for univariate and multivariate analyses. Results. The incidence of mandibular ORN was 40% at 5 years. Using univariate analysis, the following risk factors were identified: oral cavity tumors (P < .01), bone invasion (P < .02), any surgery prior to RT (P < .04), and bone surgery (P < .0001). By multivariate analysis, mandibular surgery proved to be the most important risk factor and the only one reaching statistical significance (P < .0002). Conclusion. Mandibular ORN is a frequent long-term complication of RT for oral cavity and oropharynx cancers. Mandibular surgery before irradiation is the only independent risk factor. These aspects must be considered when planning treatment for these tumors.
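A minimal sketch of the kind of log-rank (univariate) and Cox (multivariate) analyses described above, using the lifelines package; the data file and column names (time_to_orn, orn_event, bone_surgery, oral_cavity, bone_invasion) are hypothetical placeholders.

```python
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("orn_cohort.csv")  # hypothetical chart-review data

# Univariate comparison (log-rank test), e.g. bone surgery vs no bone surgery
g1, g0 = df[df["bone_surgery"] == 1], df[df["bone_surgery"] == 0]
lr = logrank_test(g1["time_to_orn"], g0["time_to_orn"],
                  event_observed_A=g1["orn_event"],
                  event_observed_B=g0["orn_event"])
print(f"log-rank p-value: {lr.p_value:.4f}")

# Multivariate Cox model over several candidate risk factors
cph = CoxPHFitter()
cph.fit(df[["time_to_orn", "orn_event", "bone_surgery", "oral_cavity", "bone_invasion"]],
        duration_col="time_to_orn", event_col="orn_event")
cph.print_summary()  # hazard ratios with 95% confidence intervals
```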
Abstract:
Between 1984 and 2006, 12 959 people with HIV/AIDS (PWHA) in the Swiss HIV Cohort Study contributed a total of 73 412 person-years (py) of follow-up, 35 551 of which derived from PWHA treated with highly active antiretroviral therapy (HAART). Five hundred and ninety-seven incident Kaposi sarcoma (KS) cases were identified, of which 52 occurred among HAART users. Cox regression was used to estimate hazard ratios (HR) and corresponding 95% confidence intervals (CI). Kaposi sarcoma incidence fell abruptly in 1996-1998 and thereafter plateaued at 1.4 per 1000 py. Sex between men and birth in Africa or the Middle East were associated with KS in both non-users and users of HAART, but the risk pattern by CD4 cell count differed. Only a very low CD4 cell count (<50 cells/μl) at enrollment or at HAART initiation was significantly associated with KS among HAART users. The HR for KS declined steeply in the first months after HAART initiation and remained low 7-10 years afterwards (HR, 0.06; 95% CI, 0.02-0.17). Thirty-three of the 52 (63.5%) KS cases among HAART users arose in PWHA who had stopped treatment or had used HAART for less than 6 months.
Abstract:
BACKGROUND: Few studies describe recent changes in the incidence, treatment, and outcomes of cardiogenic shock. OBJECTIVE: To examine temporal trends in the incidence, therapeutic management, and mortality rates of patients with the acute coronary syndrome (ACS) and cardiogenic shock, and to assess associations of therapeutic management with death and cardiogenic shock developing during hospitalization. DESIGN: Analysis of registry data collected among patients admitted to hospitals between 1997 and 2006. SETTING: 70 of the 106 acute cardiac care hospitals in Switzerland. PATIENTS: 23 696 adults with ACS enrolled in the AMIS (Acute Myocardial Infarction in Switzerland) Plus Registry. MEASUREMENTS: Cardiogenic shock incidence; treatment, including rates of percutaneous coronary intervention; and in-hospital mortality rates. RESULTS: Rates of overall cardiogenic shock (8.3% of patients with ACS) and cardiogenic shock developing during hospitalization (6.0% of patients with ACS and 71.5% of patients with cardiogenic shock) decreased during the past decade (P < 0.001 for temporal trend), whereas rates of cardiogenic shock on admission remained constant (2.3% of patients with ACS and 28.5% of patients with cardiogenic shock). Rates of percutaneous coronary intervention increased among patients with cardiogenic shock (7.6% to 65.9%; P = 0.010), whereas in-hospital mortality decreased (62.8% to 47.7%; P = 0.010). Percutaneous coronary intervention was independently associated with lower risk for both in-hospital mortality in all patients with ACS (odds ratio, 0.47 [95% CI, 0.30 to 0.73]; P = 0.001) and cardiogenic shock development during hospitalization in patients with ACS but without cardiogenic shock on admission (odds ratio, 0.59 [CI, 0.39 to 0.89]; P = 0.012). LIMITATIONS: There was no central review of cardiogenic shock diagnoses, and follow-up duration was confined to the hospital stay. Unmeasured or inaccurately measured characteristics may have confounded observed associations of treatment with outcomes. CONCLUSION: Over the past decade, rates of cardiogenic shock developing during hospitalization and in-hospital mortality decreased among patients with ACS. Increased percutaneous coronary intervention rates were associated with decreased mortality among patients with cardiogenic shock and with decreased development of cardiogenic shock during hospitalization.
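A minimal sketch of how an adjusted odds ratio such as the one reported for percutaneous coronary intervention can be obtained by logistic regression; the file, variable names and covariates (pci, age, killip_class) are hypothetical placeholders, not the registry's actual adjustment set.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("amis_plus_subset.csv")  # hypothetical patient-level data

# Logistic regression of in-hospital death on PCI, adjusted for placeholder covariates
model = smf.logit("in_hospital_death ~ pci + age + C(killip_class)", data=df).fit()

or_pci = np.exp(model.params["pci"])
ci_low, ci_high = np.exp(model.conf_int().loc["pci"])
print(f"Adjusted odds ratio for PCI: {or_pci:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```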
Abstract:
BACKGROUND: Estimates of the incidence of drug resistance to modern first-line combination antiretroviral therapies against human immunodeficiency virus (HIV) type 1 are complicated by the limited availability of genotypic drug resistance tests (GRTs) and the uncertain timing of resistance emergence. METHODS: Five first-line combinations were studied (all paired with lamivudine or emtricitabine): efavirenz (EFV) plus zidovudine (AZT) (n = 524); EFV plus tenofovir (TDF) (n = 615); lopinavir (LPV) plus AZT (n = 573); LPV plus TDF (n = 301); and ritonavir-boosted atazanavir (ATZ/r) plus TDF (n = 250). Virological treatment outcomes were classified into 3 risk strata for emergence of resistance, based on whether undetectable HIV RNA levels were maintained during therapy and, if not, whether viral loads were >500 copies/mL during treatment. Probabilities for the presence of resistance mutations were estimated from GRTs (n = 2876) according to risk stratum and therapy received at the time of testing. On the basis of these data, events of resistance emergence were imputed for each individual and were assessed using survival analysis. Imputation was repeated 100 times, and results were summarized by median values (2.5th-97.5th percentile range). RESULTS: Six years after treatment initiation, EFV plus AZT showed the highest cumulative resistance incidence (16%); all other regimens remained below 11%. Confounder-adjusted Cox regression confirmed that first-line EFV plus AZT (reference) was associated with a higher median hazard of resistance emergence than the other treatments: EFV plus TDF (hazard ratio [HR], 0.57; range, 0.42-0.76), LPV plus AZT (HR, 0.63; range, 0.45-0.89), LPV plus TDF (HR, 0.55; range, 0.33-0.83), and ATZ/r plus TDF (HR, 0.43; range, 0.17-0.83). Two-thirds of resistance events were associated with detectable HIV RNA levels ≤500 copies/mL during treatment, and only one-third with virological failure (HIV RNA level >500 copies/mL). CONCLUSIONS: The inclusion of TDF instead of AZT and ATZ/r was correlated with lower rates of resistance emergence, most likely because of improved tolerability and pharmacokinetics resulting from a once-daily dosage.
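The "impute, analyse, summarise" pattern described above can be sketched as follows. The cohort file, column names, and the stand-in imputation step are hypothetical; the study itself drew resistance events from GRT-based, stratum-specific probabilities.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

cohort = pd.read_csv("first_line_cohort.csv")  # hypothetical patient-level table

def impute_resistance_events(df, rng):
    # Hypothetical stand-in: draw a resistance indicator from each patient's
    # stratum-specific probability ('p_resistance') and place the event
    # uniformly within follow-up ('followup_years').
    out = df.copy()
    out["resistance"] = rng.binomial(1, out["p_resistance"])
    out["time"] = np.where(out["resistance"] == 1,
                           rng.uniform(0, out["followup_years"]),
                           out["followup_years"])
    return out

rng = np.random.default_rng(0)
hazard_ratios = []
for _ in range(100):  # 100 imputations, as in the study
    completed = impute_resistance_events(cohort, rng)
    cph = CoxPHFitter().fit(
        completed[["time", "resistance", "efv_tdf", "lpv_azt", "lpv_tdf", "atzr_tdf"]],
        duration_col="time", event_col="resistance")
    hazard_ratios.append(np.exp(cph.params_["efv_tdf"]))  # vs EFV plus AZT reference

hr = np.median(hazard_ratios)
lo, hi = np.percentile(hazard_ratios, [2.5, 97.5])
print(f"HR, EFV+TDF vs EFV+AZT: {hr:.2f} (range {lo:.2f}-{hi:.2f})")
```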
Abstract:
Immigrants from high-burden countries and HIV-coinfected individuals are risk groups for tuberculosis (TB) in countries with low TB incidence. Therefore, we studied their role in transmission of Mycobacterium tuberculosis in Switzerland. We included all TB patients from the Swiss HIV Cohort and a sample of patients from the national TB registry. We identified molecular clusters by spoligotyping and mycobacterial interspersed repetitive-unit-variable-number tandem-repeat (MIRU-VNTR) analysis and used weighted logistic regression adjusted for age and sex to identify risk factors for clustering, taking sampling proportions into account. In total, we analyzed 520 TB cases diagnosed between 2000 and 2008; 401 were foreign born, and 113 were HIV coinfected. The Euro-American M. tuberculosis lineage dominated throughout the study period (378 strains; 72.7%), with no evidence for another lineage, such as the Beijing genotype, emerging. We identified 35 molecular clusters with 90 patients, indicating recent transmission; 31 clusters involved foreign-born patients, and 15 involved HIV-infected patients. Birth origin was not associated with clustering (adjusted odds ratio [aOR], 1.58; 95% confidence interval [CI], 0.73 to 3.43; P = 0.25, comparing Swiss-born with foreign-born patients), but clustering was reduced in HIV-infected patients (aOR, 0.49; 95% CI, 0.26 to 0.93; P = 0.030). Cavitary disease, male sex, and younger age were all associated with molecular clustering. In conclusion, most TB patients in Switzerland were foreign born, but transmission of M. tuberculosis was not more common among immigrants and was reduced in HIV-infected patients followed up in the national HIV cohort study. Continued access to health services and clinical follow-up will be essential to control TB in this population.
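A minimal sketch of a weighted logistic regression of molecular clustering on patient characteristics, with weights reflecting sampling proportions; the file, column names and weighting scheme are hypothetical, and a full analysis would also need design-based standard errors rather than the plain frequency-weighted ones shown here.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("tb_molecular_sample.csv")    # hypothetical sample
df["weight"] = 1.0 / df["sampling_fraction"]   # inverse of the sampling proportion

model = smf.glm("clustered ~ foreign_born + age + C(sex)",
                data=df,
                family=sm.families.Binomial(),
                freq_weights=df["weight"]).fit()

print(np.exp(model.params))      # adjusted odds ratios for clustering
print(np.exp(model.conf_int()))  # 95% confidence intervals
```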
Abstract:
BACKGROUND: Solid-organ transplant recipients are at high risk for the development of herpes zoster. Epidemiologic data in lung transplant recipients are lacking. We determined the incidence and clinical characteristics of herpes zoster, and the risk factors for developing herpes zoster, after lung transplantation. METHODS: We retrospectively reviewed all adult (>18 years old) lung transplants performed at our institution between January 2001 and December 2005. Clinical characteristics of herpes zoster and potential risk factors associated with herpes zoster were assessed. RESULTS: Two hundred thirty-nine lung transplant recipients were included in the analysis. Median time of follow-up was 722 days (range 18 to 1,943 days). Thirty-five episodes of herpes zoster occurred in 29 patients, with a calculated incidence of 55.1 cases per 1,000 person-years of follow-up. The cumulative probability of herpes zoster was 5.8% at 1 year, 18.1% at 3 years, and 20.2% at 5 years post-transplant. Only 2 of the 35 (5.7%) patients had disseminated cutaneous infection, and none had visceral involvement. Recurrence of herpes zoster was seen in 13.8% of patients. Post-herpetic neuralgia was detected in 20% of cases. Antiviral prophylaxis, primarily for cytomegalovirus (CMV), was protective against herpes zoster. No significant epidemiologic risk factors associated with herpes zoster could be identified. CONCLUSIONS: Herpes zoster is a common complication after lung transplantation, with a peak incidence between 1 and 4 years post-transplant. Preventive strategies would be beneficial for this population.
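Two of the descriptive statistics reported above, a crude incidence rate per 1,000 person-years with an exact Poisson confidence interval and Kaplan-Meier cumulative probabilities at fixed time points, can be sketched as follows; the data file and column names (zoster, followup_days) are hypothetical.

```python
import pandas as pd
from scipy.stats import chi2
from lifelines import KaplanMeierFitter

df = pd.read_csv("lung_tx_zoster.csv")  # hypothetical, one row per recipient
events = int(df["zoster"].sum())
person_years = df["followup_days"].sum() / 365.25

# Crude rate per 1,000 person-years with an exact (Poisson) 95% CI
rate = 1000 * events / person_years
ci_low = 1000 * chi2.ppf(0.025, 2 * events) / (2 * person_years)
ci_high = 1000 * chi2.ppf(0.975, 2 * (events + 1)) / (2 * person_years)
print(f"{rate:.1f} per 1,000 person-years (95% CI {ci_low:.1f}-{ci_high:.1f})")

# Kaplan-Meier cumulative probability of a first episode at 1, 3 and 5 years
kmf = KaplanMeierFitter().fit(df["followup_days"] / 365.25, event_observed=df["zoster"])
for t in (1, 3, 5):
    print(f"Cumulative probability at {t} years: {1 - kmf.predict(t):.1%}")
```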
Abstract:
OBJECTIVE: Ultrasound is a useful tool for detecting indirect evidence of pulmonary embolism. The aim of this study was to determine the incidence of acute cor pulmonale and deep venous thrombosis revealed by ultrasonographic techniques in a population of patients presenting with pulmonary embolism. METHODS: 96 consecutive patients with a mean (± SD) age of 65 ± 15 years, admitted to our hospital for pulmonary embolism, were included in this study. The diagnosis of pulmonary embolism was made either by spiral computed tomography or by selective pulmonary angiography. Each patient subsequently underwent both trans-thoracic echocardiography and venous ultrasonography. The echocardiographic criterion for acute cor pulmonale was a right-to-left ventricular end-diastolic area ratio of 0.6 or greater. The diagnosis of deep venous thrombosis was supported by visualization of thrombi, vein incompressibility, absence of venous flow, or loss of flow variability on venous ultrasonography. RESULTS: On ultrasound, acute cor pulmonale was found in 63% of patients, 79% had deep venous thrombosis, and 92% had acute cor pulmonale, deep venous thrombosis, or both. All of the patients with proximal pulmonary embolism had acute cor pulmonale and/or deep venous thrombosis. The presence of acute cor pulmonale on echocardiography was significantly more frequent in patients with proximal pulmonary embolism (p < 0.0001). CONCLUSION: This study emphasizes the potential value of ultrasonographic techniques in the diagnosis of acute pulmonary embolism.
Abstract:
OBJECTIVES: The aim of this study was to evaluate the risk factors associated with Contegra graft (Medtronic, Minneapolis, MN, USA) infection after reconstruction of the right ventricular outflow tract. METHODS: One hundred and six Contegra grafts were implanted between April 1999 and April 2010 for the Ross procedure (n = 46), isolated pulmonary valve replacement (n = 32), tetralogy of Fallot (n = 24), double-outlet right ventricle (n = 7), truncus arteriosus (n = 4), switch operation (n = 1) and redo of pulmonary valve replacement (n = 2). The median age of the patients was 13 years (range 0-54 years). Follow-up was complete in all cases, with a median duration of 7.6 years (range 1.7-12.7 years). RESULTS: There were 3 cases of in-hospital mortality. The 7-year survival rate was 95.7%. Despite lifelong endocarditis prophylaxis, Contegra graft infection was diagnosed in 12 (11.3%) patients at a median time of 4.4 years (range 0.4 to 8.7 years). Univariate analysis of preoperative, perioperative and postoperative variables identified the following risk factors for time to infection: female gender with a hazard ratio (HR) of 0.19 (P = 0.042), systemic-to-pulmonary shunt (HR 6.46, P < 0.01), hypothermia (HR 0.79, P = 0.014), postoperative renal insufficiency (HR 11.97, P = 0.015) and implantation of a permanent pacemaker during hospitalization (HR 5.29, P = 0.075). In 2 cases, conservative therapy was successful and, in 10 patients, replacement of the infected valve was performed. The Contegra graft was replaced by a homograft in 2 cases and by a new Contegra graft in 8 cases. A Cox proportional hazards model indicated that time to graft infection was significantly associated with tetralogy of Fallot (HR 0.06, P = 0.01), systemic-to-pulmonary shunt (HR 64.71, P < 0.01) and hypothermia (HR 0.77, P < 0.01). CONCLUSION: Contegra graft infection affected 11.3% of cases in our cohort and may therefore be considered a frequent entity that can be predicted by both intraoperative and early postoperative factors. Once the diagnosis of infection associated with the Contegra graft was confirmed, surgical treatment was the therapy of choice.
Abstract:
BACKGROUND: The incidence and outcomes of respiratory viral infections in lung transplant recipients (LTR) are not well defined. The objective of this prospective study conducted from June 2008 to March 2011 was to characterise the incidence and outcomes of viral respiratory infections in LTR. METHODS: Patients were seen in three contexts: study-specific screenings covering all seasons; routine post-transplantation follow-up; and emergency visits. Nasopharyngeal specimens were collected systematically and bronchoalveolar lavage (BAL) was performed when clinically indicated. All specimens underwent testing with a wide panel of molecular assays targeting respiratory viruses. RESULTS: One hundred and twelve LTR had 903 encounters: 570 (63%) were screening visits, 124 (14%) were routine post-transplantation follow-up and 209 (23%) were emergency visits. Respiratory viruses were identified in 174 encounters, 34 of these via BAL. The incidence of infection was 0.83 per patient-year (95% CI 0.45 to 1.52). The viral infection rates upon screening, routine and emergency visits were 14%, 15% and 34%, respectively (p<0.001). Picornavirus was identified most frequently in nasopharyngeal (85/140; 60.7%) and BAL specimens (20/34; 59%). Asymptomatic viral carriage, mainly of picornaviruses, was found at 10% of screening visits. Infections were associated with transient lung function loss and high calcineurin inhibitor blood levels. The hospitalisation rate was 50% (95% CI 30% to 70.9%) for influenza and parainfluenza and 16.9% (95% CI 11.2% to 23.9%) for other viruses. Acute rejection was not associated with viral infection (OR 0.4, 95% CI 0.1 to 1.3). CONCLUSIONS: There is a high incidence of viral infection in LTR; asymptomatic carriage is rare. Viral infections contribute significantly to this population's respiratory symptomatology. No temporal association was observed between infection and acute rejection.
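As a small illustration, proportion-type outcomes such as the hospitalisation rates above can be reported with 95% confidence intervals; the counts in the sketch are hypothetical placeholders, not the study's raw numbers.

```python
from statsmodels.stats.proportion import proportion_confint

hospitalised, infections = 10, 20  # hypothetical counts for influenza/parainfluenza
low, high = proportion_confint(hospitalised, infections, alpha=0.05, method="wilson")
print(f"Hospitalisation rate {hospitalised / infections:.0%} "
      f"(95% CI {low:.1%} to {high:.1%})")
```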