51 results for Trost, Kirk


Relevance: 10.00%

Abstract:

Background Non-AIDS-defining cancers (NADC) are an important cause of morbidity and mortality in HIV-positive individuals. Using data from a large international cohort of HIV-positive individuals, we describe the incidence of NADC from 2004 to 2010, subsequent mortality, and predictors of both. Methods Individuals were followed from the later of 1 January 2004 or study enrolment until the earliest of a new NADC, 1 February 2010, death, or six months after the patient's last visit. Incidence rates were estimated for each year of follow-up, overall and stratified by gender, age and mode of HIV acquisition. Cumulative risk of mortality following NADC diagnosis was summarised using Kaplan-Meier methods, with follow-up from the date of NADC diagnosis until the patient's death, 1 February 2010, or six months after the patient's last visit. Factors associated with mortality following NADC diagnosis were identified using multivariable Cox proportional hazards regression. Results Over 176,775 person-years (PY), 880 patients (2.1%) developed a new NADC (incidence: 4.98/1000 PY [95% confidence interval 4.65, 5.31]). Over a third of these patients (327, 37.2%) had died by 1 February 2010. Time trends for lung cancer, anal cancer and Hodgkin's lymphoma were broadly consistent. Kaplan-Meier cumulative mortality estimates at 1, 3 and 5 years after NADC diagnosis were 28.2% [95% CI 25.1-31.2], 42.0% [38.2-45.8] and 47.3% [42.4-52.2], respectively. Significant predictors of poorer survival after NADC diagnosis were lung cancer (compared with other cancer types), male gender, non-white ethnicity, and smoking status. Later year of diagnosis and higher CD4 count at NADC diagnosis were associated with improved survival. The incidence of NADC remained stable over the period 2004–2010 in this large observational cohort. Conclusions The prognosis after diagnosis of NADC, in particular lung cancer and disseminated cancer, is poor but has improved somewhat over time. Modifiable risk factors, such as smoking and low CD4 counts, were associated with mortality following a diagnosis of NADC.
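The crude incidence rate and its confidence interval can be reproduced from the event and person-year counts reported above. This is a minimal sketch using a normal approximation to the Poisson rate (the authors' exact interval method is not stated in the abstract):

```python
import math

def incidence_rate_per_1000(events: int, person_years: float) -> tuple:
    """Crude incidence rate per 1000 PY with a 95% normal-approximation
    Poisson confidence interval (SE of the count is sqrt(events))."""
    rate = events / person_years * 1000
    half_width = 1.96 * math.sqrt(events) / person_years * 1000
    return rate, rate - half_width, rate + half_width

# Values reported in the abstract: 880 NADC over 176,775 PY
rate, lo, hi = incidence_rate_per_1000(880, 176_775)
print(f"{rate:.2f}/1000 PY (95% CI {lo:.2f}, {hi:.2f})")  # reproduces 4.98 (4.65, 5.31)
```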

Abstract:

Background Few studies have monitored late presentation (LP) of HIV infection across the European continent, including Eastern Europe. The study objectives were to explore the impact of LP on AIDS and mortality. Methods and Findings LP was defined in the Collaboration of Observational HIV Epidemiological Research Europe (COHERE) as HIV diagnosis with a CD4 count <350/mm³ or an AIDS diagnosis within 6 months of HIV diagnosis, among persons presenting for care between 1 January 2000 and 30 June 2011. Logistic regression was used to identify factors associated with LP, and Poisson regression to explore its impact on AIDS/death. 84,524 individuals from 23 cohorts in 35 countries contributed data, of whom 45,488 (53.8%) were late presenters. LP was most common in heterosexual males (66.1%), Southern European countries (57.0%), and persons originating from Africa (65.1%). LP decreased from 57.3% in 2000 to 51.7% in 2010/2011 (adjusted odds ratio [aOR] per year 0.96; 95% CI 0.95–0.97). LP decreased over time in both Central and Northern Europe among homosexual men and male and female heterosexuals, but increased over time among female heterosexuals and male intravenous drug users (IDUs) from Southern Europe and among male and female IDUs from Eastern Europe. 8,187 AIDS events/deaths occurred during 327,003 person-years of follow-up. In the first year after HIV diagnosis, LP was associated with over a 13-fold increased incidence of AIDS/death in Southern Europe (adjusted incidence rate ratio [aIRR] 13.02; 95% CI 8.19–20.70) and over a 6-fold increased rate in Eastern Europe (aIRR 6.64; 95% CI 3.55–12.43). Conclusions LP has decreased over time across Europe but remains a significant issue in the region in all HIV exposure groups. LP increased in male IDUs and female heterosexuals from Southern Europe and in IDUs from Eastern Europe. LP was associated with an increased rate of AIDS/death, particularly in the first year after HIV diagnosis, with significant variation across Europe.
Earlier and more widespread testing, timely referrals after testing positive, and improved retention in care strategies are required to further reduce the incidence of LP.
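As a back-of-envelope illustration only (the published estimate comes from an adjusted logistic regression, so this is not the authors' analysis), a per-calendar-year adjusted odds ratio of 0.96 compounds multiplicatively over the 11 years of the study period:

```python
# Per-year aOR for late presentation reported above; illustrative arithmetic
# showing how a small annual effect accumulates over the study period.
aor_per_year = 0.96
years = 11  # 2000 to 2010/2011
cumulative_or = aor_per_year ** years
print(f"Implied cumulative odds ratio over {years} years: {cumulative_or:.2f}")
```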

Abstract:

Background. Few studies consider the incidence of individual AIDS-defining illnesses (ADIs) at higher CD4 counts, which is relevant on a population level for monitoring and resource allocation. Methods. Individuals from the Collaboration of Observational HIV Epidemiological Research Europe (COHERE) aged ≥14 years with ≥1 CD4 count of ≥200 cells/µL between 1998 and 2010 were included. Incidence rates (per 1000 person-years of follow-up [PYFU]) were calculated for each ADI within different CD4 strata; Poisson regression, using generalized estimating equations and robust standard errors, was used to model rates of ADIs at current CD4 counts ≥500 cells/µL. Results. A total of 12 135 ADIs occurred at a CD4 count of ≥200 cells/µL among 207 539 persons with 1 154 803 PYFU. Incidence rates declined from 20.5 per 1000 PYFU (95% confidence interval [CI], 20.0–21.1) with current CD4 200–349 cells/µL to 4.1 per 1000 PYFU (95% CI, 3.6–4.6) with current CD4 ≥1000 cells/µL. Persons with a current CD4 of 500–749 cells/µL had a significantly higher rate of ADIs (adjusted incidence rate ratio [aIRR], 1.20; 95% CI, 1.10–1.32), whereas those with a current CD4 of ≥1000 cells/µL had a similar rate (aIRR, 0.92; 95% CI, 0.79–1.07), compared with a current CD4 of 750–999 cells/µL. Results were consistent in persons with high or low viral load. Findings were stronger for malignant ADIs (aIRR, 1.52; 95% CI, 1.25–1.86) than for nonmalignant ADIs (aIRR, 1.12; 95% CI, 1.01–1.25), comparing persons with a current CD4 of 500–749 cells/µL to those with 750–999 cells/µL. Discussion. The incidence of ADIs was higher in individuals with a current CD4 count of 500–749 cells/µL than in those with 750–999 cells/µL, but did not decrease further at higher CD4 counts. Results were similar in patients virologically suppressed on combination antiretroviral therapy, suggesting that immune reconstitution is not complete until the CD4 count increases to >750 cells/µL.

Abstract:

OBJECTIVES: To assess health care utilisation for patients co-infected with TB and HIV (TB-HIV), and to develop a weighted health care index (HCI) score based on commonly used interventions and compare it with patient outcome. METHODS: A total of 1061 HIV patients diagnosed with TB in four regions (Central/Northern Europe, Southern Europe, Eastern Europe and Argentina) between January 2004 and December 2006 were enrolled in the TB-HIV study. A weighted HCI score (range 0–5) was derived from independent prognostic factors identified in multivariable Cox models; the final score included performance of TB drug susceptibility testing (DST), an initial TB regimen containing a rifamycin, isoniazid and pyrazinamide, and start of combination antiretroviral treatment (cART). RESULTS: The mean HCI score was highest in Central/Northern Europe (3.2, 95%CI 3.1–3.3) and lowest in Eastern Europe (1.6, 95%CI 1.5–1.7). The cumulative probability of death 1 year after TB diagnosis decreased from 39% (95%CI 31–48) among patients with an HCI score of 0 to 9% (95%CI 6–13) among those with a score of ≥4. In an adjusted Cox model, a 1-unit increase in the HCI score was associated with 27% reduced mortality (relative hazard 0.73, 95%CI 0.64–0.84). CONCLUSIONS: Our results suggest that DST, standard anti-tuberculosis treatment and early cART may improve outcome for TB-HIV patients. The proposed HCI score provides a tool for future research and monitoring of the management of TB-HIV patients. The highest HCI score may serve as a benchmark to assess TB-HIV management, encouraging continuous health care improvement.
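The HCI described above can be sketched as a simple additive score. The component weights below are illustrative assumptions only: the abstract gives the score range (0–5) and the components, but not the published weighting derived from the Cox models.

```python
def hci_score(dst_performed: bool, rifamycin: bool, isoniazid: bool,
              pyrazinamide: bool, cart_started: bool) -> int:
    """Toy weighted health care index (range 0-5). One point per intervention
    is an assumed weighting for illustration, not the published one."""
    components = [dst_performed, rifamycin, isoniazid, pyrazinamide, cart_started]
    return sum(1 for received in components if received)

# A patient receiving every measured intervention reaches the maximum score.
print(hci_score(True, True, True, True, True))   # 5
print(hci_score(False, False, False, False, False))  # 0
```

Under the reported adjusted model, each 1-unit increase in such a score corresponded to a relative hazard of 0.73, i.e. roughly 27% lower mortality.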

Abstract:

OBJECTIVES: The aim of this study was to determine whether the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI)- or Cockcroft-Gault (CG)-based estimated glomerular filtration rate (eGFR) performs better in the cohort setting for predicting moderate/advanced chronic kidney disease (CKD) or end-stage renal disease (ESRD). METHODS: A total of 9521 persons in the EuroSIDA study contributed 133 873 eGFRs. Poisson regression was used to model the incidence of moderate and advanced CKD (confirmed eGFR <60 and <30 mL/min/1.73 m², respectively) or ESRD (fatal/nonfatal) using CG and CKD-EPI eGFRs. RESULTS: Of 133 873 eGFR values, the ratio of CG to CKD-EPI was ≥1.1 in 22 092 (16.5%), and the difference between them (CG minus CKD-EPI) was ≥10 mL/min/1.73 m² in 20 867 (15.6%). Differences between CKD-EPI and CG were much greater when CG was not standardized for body surface area (BSA). A total of 403 persons developed moderate CKD using CG [incidence 8.9/1000 person-years of follow-up (PYFU); 95% confidence interval (CI) 8.0-9.8] and 364 using CKD-EPI (incidence 7.3/1000 PYFU; 95% CI 6.5-8.0). CG-derived eGFRs performed as well as CKD-EPI-derived eGFRs in predicting ESRD (n = 36) and death (n = 565), as measured by the Akaike information criterion. CG-based moderate and advanced CKD were associated with ESRD [adjusted incidence rate ratio (aIRR) 7.17; 95% CI 2.65-19.36 and aIRR 23.46; 95% CI 8.54-64.48, respectively], as were CKD-EPI-based moderate and advanced CKD (aIRR 12.41; 95% CI 4.74-32.51 and aIRR 12.44; 95% CI 4.83-32.03, respectively). CONCLUSIONS: Differences between eGFRs using CG adjusted for BSA and CKD-EPI were modest. In the absence of a gold standard, the two formulae predicted clinical outcomes with equal precision and can be used to estimate GFR in HIV-positive persons.
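The two estimating equations compared above are standard. A sketch of both, assuming the 2009 CKD-EPI creatinine equation and Cockcroft-Gault standardized to 1.73 m² body surface area via the DuBois formula (serum creatinine in mg/dL, weight in kg, height in cm); the study's exact implementation details are not given in the abstract:

```python
def ckd_epi_2009(scr: float, age: int, female: bool, black: bool = False) -> float:
    """eGFR in mL/min/1.73 m^2 (2009 CKD-EPI creatinine equation)."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141 * min(scr / kappa, 1.0) ** alpha
            * max(scr / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

def cockcroft_gault_bsa(scr: float, age: int, weight: float, height: float,
                        female: bool) -> float:
    """Cockcroft-Gault creatinine clearance, standardized to 1.73 m^2 BSA
    (DuBois formula), as done in the study above."""
    crcl = (140 - age) * weight / (72 * scr)
    if female:
        crcl *= 0.85
    bsa = 0.007184 * height ** 0.725 * weight ** 0.425  # DuBois & DuBois
    return crcl * 1.73 / bsa

# Example: 50-year-old man, creatinine 1.0 mg/dL, 80 kg, 180 cm
print(round(ckd_epi_2009(1.0, 50, female=False), 1))           # ~87
print(round(cockcroft_gault_bsa(1.0, 50, 80, 180, False), 1))  # ~87
```

For this hypothetical patient the two formulae agree closely, consistent with the modest differences the study reports once CG is BSA-standardized.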

Abstract:

High-brightness electron sources are of great importance for the operation of hard X-ray free-electron lasers. Field-emission cathodes based on double-gate metallic field-emitter arrays (FEAs) can potentially offer higher brightness than the currently used sources. We report on the successful application of electron-beam lithography to the fabrication of large-scale single-gate as well as double-gate FEAs. We demonstrate operational high-density single-gate FEAs with sub-micron pitch and a total number of tips of up to 10⁶, as well as large-scale double-gate FEAs with large collimation gate apertures. The details of the design, the fabrication procedure and successful measurements of the emission current from the single- and double-gate cathodes are presented.

Abstract:

Patients suffering from bipolar affective disorder show deficits in working memory functions. In a previous functional magnetic resonance imaging study, we observed abnormal hyperactivity of the amygdala in bipolar patients during articulatory rehearsal in verbal working memory. In the present study, we investigated the dynamic neurofunctional interactions between the right amygdala and the brain systems that underlie verbal working memory in both bipolar patients and healthy controls. In total, 18 euthymic bipolar patients and 18 healthy controls performed a modified version of the Sternberg item-recognition (working memory) task. We used the psychophysiological interaction approach to assess functional connectivity between the right amygdala and the brain regions involved in verbal working memory. In healthy subjects, we found significant negative functional interactions between the right amygdala and multiple cortical brain areas involved in verbal working memory. In comparison with the healthy control subjects, bipolar patients exhibited significantly reduced functional interactions of the right amygdala, particularly with the right-hemispheric, i.e., ipsilateral, cortical regions supporting verbal working memory. Together with our previous finding of amygdala hyperactivity in bipolar patients during verbal rehearsal, the present results suggest that a disturbed right-hemispheric "cognitive-emotional" interaction between the amygdala and the cortical brain regions underlying working memory may be responsible for the amygdala hyperactivation and may contribute to the verbal working memory deficits observed in bipolar patients.

Abstract:

INTRODUCTION Proteinuria (PTU) is an important marker for the development and progression of renal disease, cardiovascular disease and death, but there is limited information about the prevalence of, and factors associated with, confirmed PTU in predominantly white European HIV-positive persons, especially in those with an estimated glomerular filtration rate (eGFR) >60 mL/min/1.73 m². PATIENTS AND METHODS Baseline was defined as the first of two consecutive dipstick urine protein (DPU) measurements during prospective follow-up after 1 June 2011 (when systematic data collection began). PTU was defined as two consecutive DPU measurements >1+ (>30 mg/dL) taken >3 months apart; persons with an eGFR <60 at either DPU measurement were excluded. Logistic regression was used to investigate factors associated with PTU. RESULTS A total of 1,640 persons were included; participants were mainly white (n=1,517; 92.5%), male (n=1,296; 79.0%) and men who have sex with men (n=809; 49.3%). Median age at baseline was 45 years (IQR 37-52), and median CD4 count was 570/mm³ (IQR 406-760). The median baseline date was February 2012 (IQR November 2011-June 2012), and median eGFR was 99 mL/min/1.73 m² (IQR 88-109). Sixty-nine persons had PTU (4.2%, 95% CI 3.2-4.7%). Persons with diabetes had increased odds of PTU, as did those with a prior non-AIDS (1) or AIDS event and those with prior exposure to indinavir. Females, those with a normal eGFR (>90) and those with prior abacavir use had lower odds of PTU (Figure 1). CONCLUSIONS One in 25 persons with an eGFR >60 had confirmed proteinuria at baseline. Factors associated with PTU were similar to those associated with CKD. The lack of association with antiretrovirals, particularly tenofovir, may be due to the cross-sectional design of this study, and additional follow-up is required to address progression to PTU in those without PTU at baseline. It may also suggest that other markers are needed at higher eGFRs to capture the deteriorating renal function associated with antiretrovirals. Our findings suggest PTU is an early marker of impaired renal function.

Abstract:

INTRODUCTION Rates of both TB/HIV co-infection and multidrug-resistant (MDR) TB are increasing in Eastern Europe (EE). Data on the clinical management of TB/HIV co-infected patients are scarce. Our aim was to study the clinical characteristics of TB/HIV patients in Europe and Latin America (LA) at TB diagnosis, to identify factors associated with MDR-TB, and to assess the activity of initial TB treatment regimens given the results of drug susceptibility testing (DST). MATERIAL AND METHODS We enrolled 1,413 TB/HIV patients from 62 clinics in 19 countries in EE, Western Europe (WE), Southern Europe (SE) and LA from January 2011 to December 2013. Among patients who completed DST within the first month of TB therapy, we linked initial TB treatment regimens to the DST results and calculated the distribution of patients receiving 0, 1, 2, 3 and ≥4 active drugs in each region. Risk factors for MDR-TB were identified in logistic regression models. RESULTS Significant differences were observed between EE (n=844), WE (n=152), SE (n=164) and LA (n=253) in the use of combination antiretroviral therapy (cART) at TB diagnosis (17%, 40%, 44% and 35%; p<0.0001), a definite TB diagnosis (culture- and/or PCR-positive for Mycobacterium tuberculosis; 47%, 71%, 72% and 40%; p<0.0001) and MDR-TB prevalence (34%, 3%, 3% and 11% among those with DST results; p<0.0001). A history of injecting drug use (adjusted OR [aOR] 2.03, 95% CI 1.00-4.09), prior TB treatment (aOR 3.42, 95% CI 1.88-6.22) and living in EE (aOR 7.19, 95% CI 3.28-15.78) were associated with MDR-TB. Among 569 patients with available DST, the initial TB treatment contained ≥3 active drugs in 64% of patients in EE, compared with 90-94% of patients in the other regions (Figure 1a). Had the patients instead received standard therapy [rifampicin, isoniazid, pyrazinamide, ethambutol (RHZE)], the corresponding proportions would have been 64% vs. 86-97%, respectively (Figure 1b).
CONCLUSIONS In EE, TB/HIV patients less often received cART, less often had a definitive TB diagnosis and more often had MDR-TB than patients in other parts of Europe and LA. Initial TB therapy in EE was sub-optimal, with less than two-thirds of patients receiving at least three active drugs, and improved compliance with standard RHZE treatment alone does not seem to be the solution. Improved management of TB/HIV patients requires routine use of DST, initial TB therapy according to prevailing resistance patterns, and more widespread use of cART.
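The linkage of initial regimens to DST results described in the methods reduces to counting the drugs to which the isolate remains susceptible. A minimal sketch (drug names and sets are illustrative, not the study's data structures):

```python
def count_active_drugs(regimen: set, resistant_to: set) -> int:
    """Number of drugs in the initial regimen to which the isolate
    is susceptible according to DST results."""
    return len(regimen - resistant_to)

rhze = {"rifampicin", "isoniazid", "pyrazinamide", "ethambutol"}

# MDR-TB is defined by resistance to at least rifampicin and isoniazid,
# so standard RHZE provides only two active drugs against an MDR isolate,
# which is why RHZE alone cannot fix the treatment gap in high-MDR settings.
print(count_active_drugs(rhze, {"rifampicin", "isoniazid"}))  # 2
print(count_active_drugs(rhze, set()))                        # 4
```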

Abstract:

BACKGROUND Dual antiplatelet therapy is recommended after coronary stenting to prevent thrombotic complications, yet the benefits and risks of treatment beyond 1 year are uncertain. METHODS Patients were enrolled after they had undergone a coronary stent procedure in which a drug-eluting stent was placed. After 12 months of treatment with a thienopyridine drug (clopidogrel or prasugrel) and aspirin, patients were randomly assigned to continue receiving thienopyridine treatment or to receive placebo for another 18 months; all patients continued receiving aspirin. The coprimary efficacy end points were stent thrombosis and major adverse cardiovascular and cerebrovascular events (a composite of death, myocardial infarction, or stroke) during the period from 12 to 30 months. The primary safety end point was moderate or severe bleeding. RESULTS A total of 9961 patients were randomly assigned to continue thienopyridine treatment or to receive placebo. Continued treatment with thienopyridine, as compared with placebo, reduced the rates of stent thrombosis (0.4% vs. 1.4%; hazard ratio, 0.29 [95% confidence interval {CI}, 0.17 to 0.48]; P<0.001) and major adverse cardiovascular and cerebrovascular events (4.3% vs. 5.9%; hazard ratio, 0.71 [95% CI, 0.59 to 0.85]; P<0.001). The rate of myocardial infarction was lower with thienopyridine treatment than with placebo (2.1% vs. 4.1%; hazard ratio, 0.47; P<0.001). The rate of death from any cause was 2.0% in the group that continued thienopyridine therapy and 1.5% in the placebo group (hazard ratio, 1.36 [95% CI, 1.00 to 1.85]; P=0.05). The rate of moderate or severe bleeding was increased with continued thienopyridine treatment (2.5% vs. 1.6%, P=0.001). An elevated risk of stent thrombosis and myocardial infarction was observed in both groups during the 3 months after discontinuation of thienopyridine treatment. 
CONCLUSIONS Dual antiplatelet therapy beyond 1 year after placement of a drug-eluting stent, as compared with aspirin therapy alone, significantly reduced the risks of stent thrombosis and major adverse cardiovascular and cerebrovascular events but was associated with an increased risk of bleeding. (Funded by a consortium of eight device and drug manufacturers and others; DAPT ClinicalTrials.gov number, NCT00977938.).
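As illustrative arithmetic only (not an analysis reported by the trial), the absolute risk differences above can be converted into numbers needed to treat or harm:

```python
def nnt(risk_a: float, risk_b: float) -> float:
    """Number needed to treat (or harm) = 1 / absolute risk difference."""
    return 1 / abs(risk_a - risk_b)

# Stent thrombosis: 1.4% placebo vs 0.4% continued thienopyridine
print(round(nnt(0.014, 0.004)))  # 100 patients treated to prevent one event
# Moderate/severe bleeding: 2.5% vs 1.6% -> number needed to harm
print(round(nnt(0.025, 0.016)))  # ~111 treated per additional bleed
```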

Abstract:

OBJECTIVES In Europe and elsewhere, health inequalities among HIV-positive individuals are of concern. We investigated late HIV diagnosis and late initiation of combination antiretroviral therapy (cART) by educational level, a proxy for socioeconomic position. DESIGN AND METHODS We used data from nine HIV cohorts within COHERE in Austria, France, Greece, Italy, Spain and Switzerland, collecting data on level of education in the categories of the UNESCO International Standard Classification of Education: uncompleted basic, basic, secondary and tertiary education. We included individuals diagnosed with HIV between 1996 and 2011, aged at least 16 years, with known educational level and at least one CD4 cell count within 6 months of HIV diagnosis. We examined trends by educational level in presentation with advanced HIV disease (AHD) (CD4 <200 cells/μl or AIDS within 6 months) using logistic regression, and the distribution of CD4 cell count at cART initiation, overall and among presenters without AHD, using median regression. RESULTS Among 15 414 individuals, 52, 45, 37 and 31% of those with uncompleted basic, basic, secondary and tertiary education, respectively, presented with AHD (P for trend <0.001). Compared with patients with tertiary education, adjusted odds ratios of AHD were 1.72 (95% confidence interval 1.48-2.00) for uncompleted basic, 1.39 (1.24-1.56) for basic and 1.20 (1.08-1.34) for secondary education (P < 0.001). In unadjusted and adjusted analyses, median CD4 cell count at cART initiation was lower with poorer educational level. CONCLUSIONS Socioeconomic inequalities in delayed HIV diagnosis and initiation of cART are present in European countries with universal healthcare systems, and individuals with lower educational level do not benefit equally from timely cART initiation.

Abstract:

OBJECTIVES The aim of the study was to investigate the organization and delivery of HIV and tuberculosis (TB) health care and to analyse potential differences between treatment centres in Eastern (EE) and Western Europe (WE). METHODS Thirty-eight European HIV and TB treatment centres participating in the TB:HIV study within EuroCoord completed a survey on health care management for co-infected patients in 2013 (EE: 17 respondents; WE: 21; 76% of all TB:HIV centres). Descriptive statistics were obtained for regional comparisons. The reported data on health care strategies were compared with actual clinical practice at the patient level using data derived from the TB:HIV study. RESULTS Respondent centres in EE comprised Belarus (n = 3), Estonia (1), Georgia (1), Latvia (1), Lithuania (1), Poland (4), Romania (1), the Russian Federation (4) and Ukraine (1); those in WE comprised Belgium (1), Denmark (1), France (1), Italy (7), Spain (2), Switzerland (1) and the UK (8). Compared with WE, treatment of HIV and TB in EE was less often located at the same site (47% in EE versus 100% in WE; P < 0.001) and less often provided by the same doctors (41% versus 90%, respectively; P = 0.002), whereas regular screening of HIV-infected patients for TB (80% versus 40%, respectively; P = 0.037) and directly observed treatment (88% versus 20%, respectively; P < 0.001) were more common in EE. The reported availability of rifabutin and of second- and third-line anti-TB drugs was lower, and opioid substitution therapy (OST) was available at fewer centres, in EE compared with WE (53% versus 100%, respectively; P < 0.001). CONCLUSIONS Major differences exist between EE and WE in the organization and delivery of health care for HIV/TB-coinfected patients and in the availability of anti-TB drugs and OST. Significant discrepancies between reported and actual clinical practices were found in EE.

Abstract:

BACKGROUND To cover the shortage of cadaveric organs, new approaches to expand the donor pool are needed. Here we report a case of domino liver transplantation (DLT) using an organ harvested from a compound-heterozygous patient with primary hyperoxaluria (PHO) who underwent combined liver and kidney transplantation. The DLT recipient developed early renal failure with oxaluria. The time to progression to oxalosis with renal failure in such situations is unknown but, based on animal data, we hypothesize that calcineurin inhibitors may play a detrimental role. METHODS A cadaveric liver and kidney transplantation was performed in a 52-year-old male with PHO. His liver was used for a 64-year-old patient with a non-resectable but limited cholangiocarcinoma. RESULTS While the course of the PHO donor was uneventful, the DLT recipient developed early post-operative, dialysis-dependent renal failure with hyperoxaluria. Histology of a kidney biopsy revealed massive calcium oxalate crystal deposition as the leading aetiological cause. CONCLUSIONS DLT using PHO organs for marginal recipients represents a possible therapeutic approach with regard to graft function of the liver. However, it may negatively alter the renal outcome of the recipient in an unpredictable manner, especially with concomitant use of cyclosporin. Therefore, we suggest that, although DLT should be promoted, PHO organs are better excluded from such procedures.

Abstract:

PURPOSE To evaluate risk factors for survival in a large international cohort of patients with primary urethral cancer (PUC). METHODS A series of 154 patients (109 men, 45 women) were diagnosed with PUC in ten referral centers between 1993 and 2012. Kaplan-Meier analysis with the log-rank test was used to investigate potential prognostic factors for recurrence-free survival (RFS) and overall survival (OS). Multivariate models were constructed to evaluate independent risk factors for recurrence and death. RESULTS Median age at definitive treatment was 66 years (IQR 58-76). Histology was urothelial carcinoma in 72 (47%), squamous cell carcinoma in 46 (30%), adenocarcinoma in 17 (11%), and mixed and other histology in 11 (7%) and nine (6%), respectively. A high degree of concordance between clinical and pathologic nodal staging (cN+/cN0 vs. pN+/pN0; p < 0.001) was noted. For clinical nodal staging, the sensitivity, specificity and overall accuracy for predicting pathologic nodal stage were 92.8, 92.3 and 92.4%, respectively. In multivariable Cox regression analysis of patients staged cM0 at initial diagnosis, RFS was significantly associated with clinical nodal stage (p < 0.001), tumor location (p < 0.001) and age (p = 0.001), whereas clinical nodal stage was the only independent predictor of OS (p = 0.026). CONCLUSIONS These data suggest that clinical nodal stage is a critical parameter for outcomes in PUC.
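The staging-concordance metrics reported above follow the standard confusion-matrix definitions. A sketch with hypothetical counts (the abstract does not report the underlying 2x2 table, so the numbers below are for illustration only):

```python
def staging_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Sensitivity, specificity and overall accuracy of clinical nodal
    staging (cN) against the pathologic reference standard (pN)."""
    return {
        "sensitivity": tp / (tp + fn),   # cN+ among pN+
        "specificity": tn / (tn + fp),   # cN0 among pN0
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Hypothetical counts chosen for illustration; they are not the study's data.
m = staging_metrics(tp=26, fp=6, tn=72, fn=2)
print({k: round(v * 100, 1) for k, v in m.items()})
```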