122 results for Reiss, Harry.
Abstract:
Sustained growth of solid tumours can rely on both the formation of new and the co-option of existing blood vessels. Current models suggest that binding of angiopoietin-2 (Ang-2) to its endothelial Tie2 receptor prevents receptor phosphorylation, destabilizes blood vessels, and promotes vascular permeability. In contrast, binding of angiopoietin-1 (Ang-1) induces Tie2 receptor activation and supports the formation of mature blood vessels covered by pericytes. Despite the intense research to decipher the role of angiopoietins during physiological neovascularization and tumour angiogenesis, a mechanistic understanding of angiopoietin function on vascular integrity and remodelling is still incomplete. We therefore assessed the vascular morphology of two mouse mammary carcinoma xenotransplants (M6378 and M6363) which differ in their natural angiopoietin expression. M6378 displayed Ang-1 in tumour cells but no Ang-2 in tumour endothelial cells in vivo. In contrast, M6363 tumours expressed Ang-2 in the tumour vasculature, whereas no Ang-1 expression was present in tumour cells. We stably transfected M6378 mouse mammary carcinoma cells with human Ang-1 or Ang-2 and investigated the consequences on the host vasculature, including ultrastructural morphology. Interestingly, M6378/Ang-2 and M6363 tumours displayed a similar vascular morphology, with intratumoural haemorrhage and non-functional and abnormal blood vessels. Pericyte loss was prominent in these tumours and was accompanied by increased endothelial cell apoptosis. Thus, overexpression of Ang-2 converted the vascular phenotype of M6378 tumours into a phenotype similar to M6363 tumours. Our results support the hypothesis that Ang-1/Tie2 signalling is essential for vessel stabilization and endothelial cell/pericyte interaction, and suggest that Ang-2 is able to induce a switch of vascular phenotypes within tumours.
Abstract:
CONTEXT: The incidence of bladder cancer increases with advancing age. Considering the increasing life expectancy and the growing proportion of elderly people in the general population, radical cystectomy will be considered for a growing number of elderly patients who suffer from muscle-invasive or recurrent bladder cancer. OBJECTIVE: This article reviews contemporary complication and mortality rates after radical cystectomy in elderly patients and the relationship between age and short-term outcome after this procedure. EVIDENCE ACQUISITION: A literature review was performed using the PubMed database with combinations of the following keywords: cystectomy, elderly, complications, and comorbidity. English-language articles published in the year 2000 or later were reviewed. Papers were included in this review if the authors investigated any relationship between age and complication rates of radical cystectomy for bladder cancer or if they reported complication rates stratified by age group. EVIDENCE SYNTHESIS: Perioperative morbidity and mortality are increased, and continence rates after orthotopic urinary diversion are impaired, in elderly patients undergoing radical cystectomy. Complications are frequent in this population, particularly when an extended postoperative period (90 d instead of 30 d) is considered. CONCLUSIONS: Although age alone does not preclude radical cystectomy for muscle-invasive or recurrent bladder cancer or for certain types of urinary diversion, careful surveillance is required, even beyond the first 30 d after surgery. Excellent perioperative management, supplementary to the skills of the surgeon, may contribute to the prevention of the morbidity and mortality of radical cystectomy, and is probably one reason for the better perioperative results obtained in high-volume centers.
Abstract:
BACKGROUND: This collaboration of seven observational clinical cohorts investigated risk factors for treatment-limiting toxicities in both antiretroviral-naive and -experienced patients starting nevirapine-based combination antiretroviral therapy (NVPc). METHODS: Patients starting NVPc after 1 January 1998 were included. CD4 cell count at the start of NVPc was classified as high (>400/microl for men, >250/microl for women) or low. Cox models were used to investigate risk factors for discontinuation due to hypersensitivity reactions (HSR, n = 6547) and discontinuation of NVPc due to treatment-limiting toxicities and/or patient/physician choice (TOXPC, n = 10,186). Patients were classified according to prior antiretroviral treatment experience and CD4 cell count/viral load at the start of NVPc. Models were stratified by cohort and adjusted for age, sex, nadir CD4 cell count, calendar year of starting NVPc and mode of transmission. RESULTS: Median times from starting NVPc to TOXPC and HSR were 162 days [interquartile range (IQR) 31-737] and 30 days (IQR 17-60), respectively. In adjusted Cox analyses, compared to naive patients with a low CD4 cell count, treatment-experienced patients with a high CD4 cell count and viral load of more than 400 had a significantly increased risk of HSR [hazard ratio 1.45, confidence interval (CI) 1.03-2.03] and of TOXPC within 18 weeks (hazard ratio 1.34, CI 1.08-1.67). In contrast, treatment-experienced patients with a high CD4 cell count and viral load of less than 400 had no increased risk of HSR (hazard ratio 1.10, CI 0.82-1.46) or of TOXPC within 18 weeks (hazard ratio 0.94, CI 0.78-1.13). CONCLUSION: Our results suggest that initiating NVPc in antiretroviral-experienced patients with high CD4 cell counts may be relatively well tolerated, provided there is no detectable viremia.
Abstract:
BACKGROUND: The development of arsenical and diamidine resistance in Trypanosoma brucei is associated with loss of drug uptake by the P2 purine transporter as a result of alterations in the corresponding T. brucei adenosine transporter 1 gene (TbAT1). Previously, specific TbAT1 mutant-type alleles linked to melarsoprol treatment failure were significantly more prevalent in T. b. gambiense from relapse patients at Omugo health centre in Arua district. Relapse rates of up to 30% prompted a shift from melarsoprol to eflornithine (alpha-difluoromethylornithine, DFMO) as first-line treatment at this centre. The aim of this study was to determine the status of TbAT1 in recent isolates collected from T. b. gambiense sleeping sickness patients from Arua and Moyo districts in Northwestern Uganda after this shift in first-line drug choice. METHODOLOGY AND RESULTS: Blood and cerebrospinal fluids of consenting patients were collected for DNA preparation and subsequent amplification. All of the 105 isolates from Omugo that we successfully analysed by PCR-RFLP possessed the TbAT1 wild-type allele. In addition, PCR-RFLP analysis was performed for 74 samples from Moyo, where melarsoprol is still the first-line drug; 61 samples displayed the wild-type genotype, six were mutant, and seven showed a mixed pattern of both mutant and wild-type TbAT1. The melarsoprol treatment failure rate at Moyo over the same period was nine out of 101 stage II cases that were followed up at least once. Five of the relapse cases harboured mutant TbAT1, one had the wild type, while no amplification was achieved from the remaining three samples. CONCLUSIONS/SIGNIFICANCE: The apparent disappearance of mutant alleles at Omugo may correlate with melarsoprol withdrawal as first-line treatment. Our results suggest that melarsoprol could successfully be reintroduced following a time lag subsequent to its replacement.
A field-applicable test to predict melarsoprol treatment outcome and identify patients for whom the drug can still be beneficial is clearly required. This will facilitate cost-effective management of HAT in rural resource-poor settings, given that eflornithine has a much higher logistical requirement for its application.
Abstract:
BACKGROUND: Prognostic models for children starting antiretroviral therapy (ART) in Africa are lacking. We developed models to estimate the probability of death during the first year receiving ART in Southern Africa. METHODS: We analyzed data from children ≤10 years old who started ART in Malawi, South Africa, Zambia or Zimbabwe from 2004-2010. Children lost to follow-up or transferred were excluded. The primary outcome was all-cause mortality in the first year of ART. We used Weibull survival models to construct two prognostic models: one with CD4%, age, WHO clinical stage, weight-for-age z-score (WAZ) and anemia, and one without CD4%, because it is not routinely measured in many programs. We used multiple imputation to account for missing data. RESULTS: Among 12,655 children, 877 (6.9%) died in the first year of ART. 1,780 children were lost to follow-up or transferred and were excluded from the main analyses, leaving 10,875 children. With the CD4% model, the probability of death at 1 year ranged from 1.8% (95% CI: 1.5-2.3) in children 5-10 years with CD4% ≥10%, WHO stage I/II, WAZ ≥-2 and without severe anemia to 46.3% (95% CI: 38.2-55.2) in children <1 year with CD4% <5%, stage III/IV, WAZ <-3 and severe anemia. The corresponding range for the model without CD4% was 2.2% (95% CI: 1.8-2.7) to 33.4% (95% CI: 28.2-39.3). Model discrimination was good (C-statistics = 0.753 and 0.745 for the models with and without CD4%, respectively). CONCLUSION: These models may be useful to counsel children and caregivers, for program planning, and to assess program outcomes after allowing for differences in patients' disease severity characteristics.
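The prognostic models above are Weibull survival models that map a child's risk-factor profile to a 1-year mortality probability. A minimal sketch of that mapping, with purely illustrative coefficients and risk-factor names (none of them are the published values):

```python
import math

def weibull_death_prob(linear_predictor, shape, t=365.0):
    # Weibull survival: S(t) = exp(-(t / scale)^shape), with the
    # time scale driven by the linear predictor (AFT-style form).
    scale = math.exp(linear_predictor)
    survival = math.exp(-((t / scale) ** shape))
    return 1.0 - survival

# Illustrative covariate effects (NOT the study's coefficients):
# each risk factor shortens the time scale, raising mortality.
coefs = {"intercept": 8.5, "age_lt1": -0.6, "cd4_lt5": -0.5,
         "who_stage_3_4": -0.4, "waz_lt_minus3": -0.3,
         "severe_anemia": -0.3}

def one_year_risk(risk_factors, shape=1.2):
    lp = coefs["intercept"] + sum(coefs[f] for f in risk_factors)
    return weibull_death_prob(lp, shape)

low_risk = one_year_risk([])  # older child, no risk factors
high_risk = one_year_risk(["age_lt1", "cd4_lt5", "who_stage_3_4",
                           "waz_lt_minus3", "severe_anemia"])
```

With these toy coefficients the risk spans roughly 4% to 42%, qualitatively mirroring (but not reproducing) the 1.8%-46.3% range reported above.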
Abstract:
BACKGROUND Since 2005, increasing numbers of children have started antiretroviral therapy (ART) in sub-Saharan Africa and, in recent years, WHO and country treatment guidelines have recommended ART initiation for all infants and very young children, and at higher CD4 thresholds for older children. We examined temporal changes in patient and regimen characteristics at ART start using data from 12 cohorts in 4 countries participating in the IeDEA-SA collaboration. METHODOLOGY/PRINCIPAL FINDINGS Data from 30,300 ART-naïve children aged <16 years at ART initiation who started therapy between 2005 and 2010 were analysed. We examined changes in median values of continuous variables using Cuzick's test for trend over time. We also examined changes in the proportions of patients with particular disease severity characteristics (expressed as binary variables, e.g. WHO stage III/IV vs I/II) using logistic regression. Between 2005 and 2010 the number of children starting ART each year increased, and median age declined from 63 months (2006) to 56 months (2010). The proportions of children <1 year and ≥10 years of age increased from 12% to 19% and from 18% to 22%, respectively. Children had less severe disease at ART initiation in later years, with significant declines in the percentage with severe immunosuppression (81% to 63%), WHO stage III/IV disease (75% to 62%), severe anemia (12% to 7%) and weight-for-age z-score <-3 (31% to 28%). Similar results were seen when restricting the analysis to infants, with significant declines in the proportions with severe immunodeficiency (98% to 82%) and stage III/IV disease (81% to 63%). First-line regimen use followed country guidelines. CONCLUSIONS/SIGNIFICANCE Between 2005 and 2010 increasing numbers of children initiated ART, with a decline in disease severity at the start of therapy. However, even in 2010, a substantial number of infants and children started ART with advanced disease.
These results highlight the importance of efforts to improve access to HIV diagnostic testing and ART in children.
Abstract:
BACKGROUND There is limited evidence on the optimal timing of antiretroviral therapy (ART) initiation in children 2-5 y of age. We conducted a causal modelling analysis using the International Epidemiologic Databases to Evaluate AIDS-Southern Africa (IeDEA-SA) collaborative dataset to determine the difference in mortality when starting ART in children aged 2-5 y immediately (irrespective of CD4 criteria), as recommended in the World Health Organization (WHO) 2013 guidelines, compared to deferring to lower CD4 thresholds, for example, the WHO 2010 recommended threshold of CD4 count <750 cells/mm(3) or CD4 percentage (CD4%) <25%. METHODS AND FINDINGS ART-naïve children enrolling in HIV care at IeDEA-SA sites who were between 24 and 59 mo of age at first visit and with ≥1 visit prior to ART initiation and ≥1 follow-up visit were included. We estimated mortality for ART initiation at different CD4 thresholds for up to 3 y using g-computation, adjusting for measured time-dependent confounding of CD4 percent, CD4 count, and weight-for-age z-score. Confidence intervals were constructed using bootstrapping. The median (first; third quartile) age at first visit of 2,934 children (51% male) included in the analysis was 3.3 y (2.6; 4.1), with a median (first; third quartile) CD4 count of 592 cells/mm(3) (356; 895) and median (first; third quartile) CD4% of 16% (10%; 23%). The estimated cumulative mortality after 3 y for ART initiation at different CD4 thresholds ranged from 3.4% (95% CI: 2.1-6.5) (no ART) to 2.1% (95% CI: 1.3%-3.5%) (ART irrespective of CD4 value). Estimated mortality was overall higher when initiating ART at lower CD4 values or not at all. There was no mortality difference between starting ART immediately, irrespective of CD4 value, and ART initiation at the WHO 2010 recommended threshold of CD4 count <750 cells/mm(3) or CD4% <25%, with mortality estimates of 2.1% (95% CI: 1.3%-3.5%) and 2.2% (95% CI: 1.4%-3.5%) after 3 y, respectively. 
The analysis was limited by loss to follow-up and the unavailability of WHO staging data. CONCLUSIONS The results indicate no mortality difference for up to 3 y between ART initiation irrespective of CD4 value and ART initiation at a threshold of CD4 count <750 cells/mm(3) or CD4% <25%, but point estimates of mortality are consistently higher when ART is initiated at lower CD4 values.
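The g-computation used above estimates mortality under hypothetical treatment rules by standardizing stratum-specific risks over the confounder distribution. A minimal point-treatment sketch of the g-formula on made-up records (the study's version additionally handles time-dependent confounding of CD4 and weight-for-age, which this does not):

```python
# Hypothetical toy records: (treated, low_cd4, died). Counts are
# invented purely to illustrate the standardization step.
data = (
    [(1, 1, 1)] * 10 + [(1, 1, 0)] * 90 +   # treated, low CD4: 10% die
    [(1, 0, 1)] * 2  + [(1, 0, 0)] * 98 +   # treated, high CD4: 2% die
    [(0, 1, 1)] * 30 + [(0, 1, 0)] * 70 +   # untreated, low CD4: 30% die
    [(0, 0, 1)] * 5  + [(0, 0, 0)] * 95     # untreated, high CD4: 5% die
)

def g_formula_risk(records, treatment):
    # Standardized risk under do(A = treatment):
    #   sum over strata z of P(death | A=treatment, Z=z) * P(Z=z)
    n = len(records)
    risk = 0.0
    for z in (0, 1):
        stratum = [r for r in records if r[1] == z]
        p_z = len(stratum) / n
        arm = [r for r in stratum if r[0] == treatment]
        p_death = sum(r[2] for r in arm) / len(arm)
        risk += p_death * p_z
    return risk

risk_treat_all = g_formula_risk(data, 1)   # everyone treated
risk_treat_none = g_formula_risk(data, 0)  # no one treated
```

Standardizing over the CD4 distribution is what lets the comparison mimic a trial of "treat immediately" versus "defer", despite CD4 confounding treatment choice.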
Abstract:
We report the case of a patient in whom successful radiofrequency catheter ablation of an idiopathic ventricular tachycardia (VT) originating in the main stem of the pulmonary artery was performed. After successful ablation of the index arrhythmia, which was an idiopathic right ventricular outflow tract VT, a second VT with a different QRS morphology was reproducibly induced. Mapping of the second VT revealed the presence of myocardium approximately 2 cm above the pulmonary valve. Application of radiofrequency energy at this site resulted in termination and noninducibility of this VT. After 6-month follow-up, the patient remained free from VT recurrences.
Abstract:
Background Non-AIDS-defining cancers (NADC) are an important cause of morbidity and mortality in HIV-positive individuals. Using data from a large international cohort of HIV-positive individuals, we describe the incidence of NADC from 2004 to 2010, subsequent mortality, and predictors of both. Methods Individuals were followed from the later of 1 January 2004 or enrolment in the study until the earliest of a new NADC, 1 February 2010, death, or six months after the patient’s last visit. Incidence rates were estimated for each year of follow-up, overall and stratified by gender, age and mode of HIV acquisition. Cumulative mortality following NADC diagnosis was summarised using Kaplan-Meier methods, with follow-up for these analyses from the date of NADC diagnosis until the earliest of the patient’s death, 1 February 2010, or 6 months after the patient’s last visit. Factors associated with mortality following NADC diagnosis were identified using multivariable Cox proportional hazards regression. Results Over 176,775 person-years (PY), 880 (2.1%) patients developed a new NADC (incidence: 4.98/1000 PY [95% confidence interval 4.65, 5.31]). Over a third of these patients (327, 37.2%) had died by 1 February 2010. Time trends for lung cancer, anal cancer and Hodgkin’s lymphoma were broadly consistent. Kaplan-Meier cumulative mortality estimates at 1, 3 and 5 years after NADC diagnosis were 28.2% [95% CI 25.1-31.2], 42.0% [38.2-45.8] and 47.3% [42.4-52.2], respectively. Significant predictors of poorer survival after NADC diagnosis were lung cancer (compared to other cancer types), male gender, non-white ethnicity, and smoking status. Later year of diagnosis and higher CD4 count at NADC diagnosis were associated with improved survival. The incidence of NADC remained stable over the period 2004-2010 in this large observational cohort. Conclusions The prognosis after diagnosis of NADC, in particular lung cancer and disseminated cancer, is poor but has improved somewhat over time.
Modifiable risk factors, such as smoking and low CD4 counts, were associated with mortality following a diagnosis of NADC.
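The cumulative mortality figures above come from Kaplan-Meier estimation; cumulative mortality at a time point is 1 minus the survival estimate. A minimal product-limit estimator sketched on hypothetical follow-up data:

```python
def kaplan_meier(times, events):
    # Product-limit survival estimate.
    # times: follow-up time per subject; events: 1 = death, 0 = censored.
    # Returns (time, survival) pairs at each distinct event time, using
    # the standard convention that ties' deaths precede censorings.
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    survival = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = removed = 0
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            removed += 1
            i += 1
        if deaths:
            survival *= 1.0 - deaths / at_risk
            curve.append((t, survival))
        at_risk -= removed
    return curve

# Hypothetical follow-up (years) after NADC diagnosis; 0 = censored.
times = [0.5, 1.0, 1.0, 2.0, 3.0, 4.0]
events = [1, 1, 0, 1, 0, 1]
curve = kaplan_meier(times, events)
```

Handling censored subjects this way (they contribute person-time while under observation, then drop out of the risk set) is what distinguishes the Kaplan-Meier estimate from a naive deaths/total proportion.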
Abstract:
Background Few studies have monitored late presentation (LP) of HIV infection over the European continent, including Eastern Europe. Study objectives were to explore the impact of LP on AIDS and mortality. Methods and Findings LP was defined in Collaboration of Observational HIV Epidemiological Research Europe (COHERE) as HIV diagnosis with a CD4 count <350/mm3 or an AIDS diagnosis within 6 months of HIV diagnosis among persons presenting for care between 1 January 2000 and 30 June 2011. Logistic regression was used to identify factors associated with LP and Poisson regression to explore the impact on AIDS/death. 84,524 individuals from 23 cohorts in 35 countries contributed data; 45,488 were LP (53.8%). LP was highest in heterosexual males (66.1%), Southern European countries (57.0%), and persons originating from Africa (65.1%). LP decreased from 57.3% in 2000 to 51.7% in 2010/2011 (adjusted odds ratio [aOR] 0.96; 95% CI 0.95–0.97). LP decreased over time in both Central and Northern Europe among homosexual men, and male and female heterosexuals, but increased over time for female heterosexuals and male intravenous drug users (IDUs) from Southern Europe and in male and female IDUs from Eastern Europe. 8,187 AIDS/deaths occurred during 327,003 person-years of follow-up. In the first year after HIV diagnosis, LP was associated with over a 13-fold increased incidence of AIDS/death in Southern Europe (adjusted incidence rate ratio [aIRR] 13.02; 95% CI 8.19–20.70) and over a 6-fold increased rate in Eastern Europe (aIRR 6.64; 95% CI 3.55–12.43). Conclusions LP has decreased over time across Europe, but remains a significant issue in the region in all HIV exposure groups. LP increased in male IDUs and female heterosexuals from Southern Europe and IDUs in Eastern Europe. LP was associated with an increased rate of AIDS/deaths, particularly in the first year after HIV diagnosis, with significant variation across Europe. 
Earlier and more widespread testing, timely referrals after testing positive, and improved retention in care strategies are required to further reduce the incidence of LP.
Abstract:
Background. Few studies consider the incidence of individual AIDS-defining illnesses (ADIs) at higher CD4 counts, which is relevant on a population level for monitoring and resource allocation. Methods. Individuals from the Collaboration of Observational HIV Epidemiological Research Europe (COHERE) aged ≥14 years with ≥1 CD4 count of ≥200 cells/µL between 1998 and 2010 were included. Incidence rates (per 1000 person-years of follow-up [PYFU]) were calculated for each ADI within different CD4 strata; Poisson regression, using generalized estimating equations and robust standard errors, was used to model rates of ADIs with current CD4 ≥500 cells/µL. Results. A total of 12,135 ADIs occurred at a CD4 count of ≥200 cells/µL among 207,539 persons with 1,154,803 PYFU. Incidence rates declined from 20.5 per 1000 PYFU (95% confidence interval [CI], 20.0–21.1) with current CD4 200–349 cells/µL to 4.1 per 1000 PYFU (95% CI, 3.6–4.6) with current CD4 ≥1000 cells/µL. Persons with a current CD4 of 500–749 cells/µL had a significantly higher rate of ADIs (adjusted incidence rate ratio [aIRR], 1.20; 95% CI, 1.10–1.32), whereas those with a current CD4 of ≥1000 cells/µL had a similar rate (aIRR, 0.92; 95% CI, 0.79–1.07), compared to a current CD4 of 750–999 cells/µL. Results were consistent in persons with high or low viral load. Findings were stronger for malignant ADIs (aIRR, 1.52; 95% CI, 1.25–1.86) than for nonmalignant ADIs (aIRR, 1.12; 95% CI, 1.01–1.25), comparing persons with a current CD4 of 500–749 cells/µL to 750–999 cells/µL. Discussion. The incidence of ADIs was higher in individuals with a current CD4 count of 500–749 cells/µL than in those with a CD4 count of 750–999 cells/µL, but did not decrease further at higher CD4 counts. Results were similar in patients virologically suppressed on combination antiretroviral therapy, suggesting that immune reconstitution is not complete until the CD4 count increases to >750 cells/µL.
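The crude rates above are events divided by person-years at risk. A sketch of that calculation with a log-scale normal-approximation 95% CI, applied to the abstract's overall totals (the published stratified rates and rate ratios come from Poisson regression, which this does not reproduce):

```python
import math

def incidence_rate_per_1000(events, person_years):
    # Crude incidence rate per 1000 PYFU with a 95% CI from the
    # normal approximation on the log scale: SE(log rate) = 1/sqrt(events).
    rate = events / person_years
    se_log = 1.0 / math.sqrt(events)
    lower = rate * math.exp(-1.96 * se_log)
    upper = rate * math.exp(1.96 * se_log)
    return rate * 1000, lower * 1000, upper * 1000

# Overall totals from the abstract: 12,135 ADIs over 1,154,803 PYFU.
rate, lower, upper = incidence_rate_per_1000(12135, 1154803)
```

With this many events the interval is very narrow, which is why the abstract's stratum-level CIs (built from far fewer events per stratum) are wider.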
Abstract:
Context: In virologically suppressed, antiretroviral-treated patients, the effect on bone biomarkers of switching to tenofovir (TDF), compared with remaining on stable antiretroviral therapy, is unknown. Methods: We examined bone biomarkers (osteocalcin [OC], procollagen type 1 amino-terminal propeptide, and C-terminal cross-linking telopeptide of type 1 collagen) and bone mineral density (BMD) over 48 weeks in virologically suppressed patients (HIV RNA < 50 copies/ml) randomized to switch to TDF/emtricitabine (FTC) or remain on first-line zidovudine (AZT)/lamivudine (3TC). Parathyroid hormone (PTH) was also measured. Between-group differences in bone biomarkers were assessed by Student's t tests; associations between changes in bone biomarkers and BMD measures were assessed by Pearson correlation and multivariable linear regression. All data are expressed as mean (SD), unless otherwise specified. Results: Of 53 subjects (aged 46.0 y; 84.9% male; 75.5% Caucasian), 29 switched to TDF/FTC. There were reductions in total hip and lumbar spine BMD in those switching to TDF/FTC (total hip, TDF/FTC, −1.73 (2.76)% vs AZT/3TC, −0.39 (2.41)%, between-group P = .07; lumbar spine, TDF/FTC, −1.50 (3.49)% vs AZT/3TC, +0.25 (2.82)%, between-group P = .06), but these reductions did not reach statistical significance. Greater declines in lumbar spine BMD correlated with greater increases in OC (r = −0.28; P = .05). The effect of TDF/FTC on bone biomarkers remained significant when adjusted for baseline biomarker levels, gender, and ethnicity. There was no difference between treatment groups in the change in PTH levels over 48 weeks (between-group P = .23). All biomarkers increased significantly from week 0 to week 48 in the switch group, with no significant change in those remaining on AZT/3TC (between-group, all biomarkers, P < .0001).
Conclusion: A switch to TDF/FTC compared to remaining on a stable regimen is associated with increases in bone turnover that correlate with reductions in BMD, suggesting that TDF exposure directly affects bone metabolism in vivo.
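The reported correlation between BMD decline and rising osteocalcin (r = −0.28) is a Pearson coefficient. A minimal sketch of the computation on hypothetical paired changes (all values invented for illustration; they produce a much stronger correlation than the study's):

```python
import math

def pearson_r(xs, ys):
    # Pearson correlation: covariance divided by the product of
    # the standard deviations (here via centered sums).
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sy = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired 48-week changes: % lumbar spine BMD vs osteocalcin.
bmd_change = [-3.0, -1.5, 0.2, -2.0, 1.0]
oc_change = [4.0, 2.0, 0.5, 1.5, -1.0]
r = pearson_r(bmd_change, oc_change)
```

A negative r here means larger BMD losses pair with larger biomarker increases, the same direction as the association reported above.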