993 results for Reiss, Harry.



Abstract:

BACKGROUND: This collaboration of seven observational clinical cohorts investigated risk factors for treatment-limiting toxicities in both antiretroviral-naive and antiretroviral-experienced patients starting nevirapine-based combination antiretroviral therapy (NVPc). METHODS: Patients starting NVPc after 1 January 1998 were included. CD4 cell count at the start of NVPc was classified as high (>400 cells/µl in men and >250 cells/µl in women) or low. Cox models were used to investigate risk factors for discontinuations due to hypersensitivity reactions (HSR, n = 6547) and for discontinuation of NVPc due to treatment-limiting toxicities and/or patient/physician choice (TOXPC, n = 10,186). Patients were classified according to prior antiretroviral treatment experience and CD4 cell count/viral load at the start of NVPc. Models were stratified by cohort and adjusted for age, sex, nadir CD4 cell count, calendar year of starting NVPc and mode of transmission. RESULTS: Median times from starting NVPc to TOXPC and HSR were 162 days [interquartile range (IQR) 31-737] and 30 days (IQR 17-60), respectively. In adjusted Cox analyses, compared to naive patients with a low CD4 cell count, treatment-experienced patients with a high CD4 cell count and a viral load of more than 400 had a significantly increased risk of HSR [hazard ratio 1.45, confidence interval (CI) 1.03-2.03] and of TOXPC within 18 weeks (hazard ratio 1.34, CI 1.08-1.67). In contrast, treatment-experienced patients with a high CD4 cell count and a viral load of less than 400 had no increased risk of HSR (hazard ratio 1.10, CI 0.82-1.46) or of TOXPC within 18 weeks (hazard ratio 0.94, CI 0.78-1.13). CONCLUSION: Our results suggest that initiating NVPc in antiretroviral-experienced patients with high CD4 cell counts may be relatively well tolerated, provided there is no detectable viremia.
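A minimal Python sketch of this kind of stratified, adjusted Cox analysis is shown below. It is illustrative only (not the study's code), uses the lifelines library, and the input file and column names are hypothetical placeholders.

# Illustrative sketch: time to NVPc discontinuation modelled with a Cox model
# stratified by cohort and adjusted for baseline covariates. All file and
# column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("nvpc_patients.csv")    # hypothetical file: one row per patient

# Hypothetical columns: days_to_event, event (1 = discontinuation, 0 = censored),
# age, sex, nadir_cd4, calendar_year, exposure_group, cohort.
covariates = ["age", "sex", "nadir_cd4", "calendar_year", "exposure_group"]
model_df = pd.get_dummies(
    df[["days_to_event", "event", "cohort"] + covariates],
    columns=["sex", "exposure_group"],   # encode categorical covariates
    drop_first=True,
)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="days_to_event", event_col="event",
        strata=["cohort"])               # separate baseline hazard per cohort
cph.print_summary()                      # adjusted hazard ratios with 95% CIs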


Abstract:

BACKGROUND: The development of arsenical and diamidine resistance in Trypanosoma brucei is associated with loss of drug uptake by the P2 purine transporter as a result of alterations in the corresponding T. brucei adenosine transporter 1 gene (TbAT1). Previously, specific mutant TbAT1 alleles linked to melarsoprol treatment failure were found to be significantly more prevalent in T. b. gambiense from relapse patients at Omugo health centre in Arua district. Relapse rates of up to 30% prompted a shift from melarsoprol to eflornithine (alpha-difluoromethylornithine, DFMO) as first-line treatment at this centre. The aim of this study was to determine the status of TbAT1 in recent isolates collected from T. b. gambiense sleeping sickness patients from Arua and Moyo districts in Northwestern Uganda after this shift in first-line drug choice. METHODOLOGY AND RESULTS: Blood and cerebrospinal fluid samples from consenting patients were collected for DNA preparation and subsequent amplification. All 105 isolates from Omugo that we successfully analysed by PCR-RFLP possessed the wild-type TbAT1 allele. In addition, PCR-RFLP analysis was performed for 74 samples from Moyo, where melarsoprol is still the first-line drug; 61 samples displayed the wild-type genotype, six were mutant, and seven showed a mixed pattern of both mutant and wild-type TbAT1. The melarsoprol treatment failure rate at Moyo over the same period was nine out of 101 stage II cases that were followed up at least once. Five of the relapse cases harboured mutant TbAT1, one had the wild type, and no amplification was achieved from the remaining three samples. CONCLUSIONS/SIGNIFICANCE: The apparent disappearance of mutant alleles at Omugo may correlate with the withdrawal of melarsoprol as first-line treatment. Our results suggest that melarsoprol could successfully be reintroduced after a time lag following its replacement. A field-applicable test to predict melarsoprol treatment outcome and identify patients for whom the drug can still be beneficial is clearly required. This would facilitate cost-effective management of HAT in rural, resource-poor settings, given that eflornithine has much higher logistical requirements for its use.


Abstract:

In a recent policy document of the organized employers in the care and welfare sector in The Netherlands (the MO Group), directors and board members of care and welfare institutions present themselves as "social entrepreneurs", managing their institutions as if they were commercial companies. They are hardly criticized, and there is no countervailing power of significance. The workers focus on their own specialized professional fields and are divided as a whole. Many government officials are in favour or are indifferent. The relatively small number of intellectual workers in Dutch care and welfare are fragmented and pragmatic. From a democratic point of view this is a worrying situation. From a professional point of view, the purpose and functions of professional care and welfare work are at stake. The penetration of market mechanisms and the take-over by commercially oriented managers result from the unquestioned adoption of Anglo-Saxon policy in The Netherlands in the 1990s, following the crisis of the welfare state in the late 1980s. The polder country is now fully confronted with the pressure and negative effects of unbalanced power in its institutions, i.e. managerialism. After years of silence, the two principal authentic critics of Dutch care and welfare, Harry Kunneman and Andries Baart, are no longer voices crying in the wilderness, but are getting a response from a growing number of worried workers and intellectuals. Kunneman and Baart warn against the restriction of professional space and the loss of normative values and standards in the profession. They are right. It is high time to make room for criticism and to start a debate about the future of the social professions in The Netherlands, or better, in Europe. Research, discussion and action have to show how worrying the everyday situation of professional workers is, what goals have to be set and what strategy is to be chosen.


Abstract:

In most Western countries, the professional status of social workers is unstable and insecure. Of course, most Western countries are themselves unstable, ridden with feelings of insecurity and in search of reassurance and promises of control. But social work hardly lends itself as a projection screen for visions of professional control and efficiency in the face of insecurity. On the contrary: within the present cultural and political climate, social work is associated primarily with unpopular social problems and with people unable to cope adequately with the competitiveness and the rate of change of post-industrial societies; that is to say, it connotes dependency and helplessness rather than autonomy and control. Moreover, whereas public discourse in most Western countries is dominated by a neo-liberal perspective and the intricate network of economic, managerial, consumerist and military metaphors connected with it, social work still carries with it a legacy of 'progressive politics' that is increasingly labeled as outdated and inadequate. Although the values of solidarity and social justice connected with this 'progressive heritage' certainly have not faded away completely, the loudest and most popular voices on the level of public discourse keep underscoring the necessity to adapt to the 'realities' of present-day post-industrial societies and their dependence on economic growth, technological innovation and the dynamics of an ever more competitive world market. This 'unavoidable' adaptation involves both the 'modernization' and progressive diminishment of 'costly' welfare-state arrangements and a radical reorientation of social work as a profession. Instead of furthering the dependency of clients in the name of solidarity, social workers should stimulate them to face their own responsibilities and help them to function more adequately in a world where individual autonomy and economic progress are dominant values. This shift has far-reaching consequences for the organization of the work itself. Efficiency and transparency are the new code words, professional autonomy is dramatically limited, and the interventions of social workers are increasingly bound to 'objective' standards of success and cost-effectiveness.


Abstract:

BACKGROUND: Prognostic models for children starting antiretroviral therapy (ART) in Africa are lacking. We developed models to estimate the probability of death during the first year receiving ART in Southern Africa. METHODS: We analyzed data from children ≤10 years old who started ART in Malawi, South Africa, Zambia or Zimbabwe from 2004-2010. Children lost to follow-up or transferred were excluded. The primary outcome was all-cause mortality in the first year of ART. We used Weibull survival models to construct two prognostic models: one with CD4%, age, WHO clinical stage, weight-for-age z-score (WAZ) and anemia, and one without CD4%, because it is not routinely measured in many programs. We used multiple imputation to account for missing data. RESULTS: Among 12,655 children, 877 (6.9%) died in the first year of ART. A total of 1,780 children were lost to follow-up or transferred and were excluded from the main analyses; 10,875 children were included. With the CD4% model, the probability of death at 1 year ranged from 1.8% (95% CI: 1.5-2.3) in children 5-10 years with CD4% ≥10%, WHO stage I/II, WAZ ≥-2 and without severe anemia to 46.3% (95% CI: 38.2-55.2) in children <1 year with CD4% <5%, stage III/IV, WAZ <-3 and severe anemia. The corresponding range for the model without CD4% was 2.2% (95% CI: 1.8-2.7) to 33.4% (95% CI: 28.2-39.3). Agreement between predicted and observed mortality was good (C-statistics = 0.753 and 0.745 for the models with and without CD4%, respectively). CONCLUSION: These models may be useful to counsel children and caregivers, for program planning, and to assess program outcomes after allowing for differences in patients' disease severity characteristics.
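An illustrative Python sketch of this type of Weibull survival model is shown below (not the study's code). It uses the lifelines library, the file and column names are hypothetical, and multiple imputation is omitted for brevity.

# Illustrative sketch: a Weibull survival model for first-year mortality on ART
# and the predicted probability of death by 1 year for each child. Binary
# predictors (WHO stage III/IV, severe anemia) are assumed to be coded 0/1.
import pandas as pd
from lifelines import WeibullAFTFitter
from lifelines.utils import concordance_index

df = pd.read_csv("paediatric_art.csv")   # hypothetical file

predictors = ["cd4_percent", "age_years", "who_stage_34", "waz", "severe_anemia"]
aft = WeibullAFTFitter()
aft.fit(df[["days_on_art", "died"] + predictors],
        duration_col="days_on_art", event_col="died")

# Probability of death within the first year = 1 - S(365) under the fitted model.
surv_365 = aft.predict_survival_function(df[predictors], times=[365.0])
df["p_death_1y"] = 1.0 - surv_365.loc[365.0].values

# Discrimination (C-statistic), analogous to the values reported above.
c_stat = concordance_index(df["days_on_art"], -df["p_death_1y"], df["died"])
print(f"C-statistic: {c_stat:.3f}")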


Abstract:

BACKGROUND Since 2005, increasing numbers of children have started antiretroviral therapy (ART) in sub-Saharan Africa and, in recent years, WHO and country treatment guidelines have recommended ART initiation for all infants and very young children, and at higher CD4 thresholds for older children. We examined temporal changes in patient and regimen characteristics at ART start using data from 12 cohorts in 4 countries participating in the IeDEA-SA collaboration. METHODOLOGY/PRINCIPAL FINDINGS Data from 30,300 ART-naïve children aged <16 years at ART initiation who started therapy between 2005 and 2010 were analysed. We examined changes in median values of continuous variables using Cuzick's test for trend over time. We also examined changes in the proportions of patients with particular disease severity characteristics (expressed as binary variables, e.g. WHO stage III/IV vs. I/II) using logistic regression. Between 2005 and 2010 the number of children starting ART each year increased and the median age declined from 63 months (2006) to 56 months (2010). The proportions of children <1 year and ≥10 years of age increased from 12% to 19% and from 18% to 22%, respectively. Children had less severe disease at ART initiation in later years, with significant declines in the percentage with severe immunosuppression (81% to 63%), WHO stage III/IV disease (75% to 62%), severe anemia (12% to 7%) and weight-for-age z-score <-3 (31% to 28%). Similar results were seen when restricting to infants, with significant declines in the proportion with severe immunodeficiency (98% to 82%) and stage III/IV disease (81% to 63%). First-line regimen use followed country guidelines. CONCLUSIONS/SIGNIFICANCE Between 2005 and 2010 increasing numbers of children initiated ART, with a decline in disease severity at the start of therapy. However, even in 2010, a substantial number of infants and children started ART with advanced disease. These results highlight the importance of efforts to improve access to HIV diagnostic testing and ART in children.
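A minimal Python sketch of these trend analyses follows (illustrative only, hypothetical file and column names). Cuzick's test is not available in the standard Python scientific stack, so Kendall's tau against calendar year is used here as a rough nonparametric stand-in for the trend test on continuous variables.

# Illustrative sketch: temporal trend analyses of the kind described above.
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import kendalltau

df = pd.read_csv("art_initiation.csv")   # hypothetical file: one row per child

# Trend in a continuous characteristic (e.g. age in months) over calendar year.
tau, p_value = kendalltau(df["year"], df["age_months"])
print(f"Kendall tau = {tau:.3f}, p = {p_value:.4f}")

# Trend in a binary severity marker (e.g. WHO stage III/IV vs I/II, coded 0/1):
# the odds ratio per calendar year quantifies the change over time.
logit = smf.logit("who_stage_34 ~ year", data=df).fit()
print(logit.summary())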


Abstract:

BACKGROUND There is limited evidence on the optimal timing of antiretroviral therapy (ART) initiation in children 2-5 y of age. We conducted a causal modelling analysis using the International Epidemiologic Databases to Evaluate AIDS-Southern Africa (IeDEA-SA) collaborative dataset to determine the difference in mortality when starting ART in children aged 2-5 y immediately (irrespective of CD4 criteria), as recommended in the World Health Organization (WHO) 2013 guidelines, compared to deferring to lower CD4 thresholds, for example the WHO 2010 recommended threshold of CD4 count <750 cells/mm³ or CD4 percentage (CD4%) <25%. METHODS AND FINDINGS ART-naïve children enrolling in HIV care at IeDEA-SA sites who were between 24 and 59 mo of age at first visit, with ≥1 visit prior to ART initiation and ≥1 follow-up visit, were included. We estimated mortality for ART initiation at different CD4 thresholds for up to 3 y using g-computation, adjusting for measured time-dependent confounding of CD4 percent, CD4 count, and weight-for-age z-score. Confidence intervals were constructed using bootstrapping. The median (first; third quartile) age at first visit of the 2,934 children (51% male) included in the analysis was 3.3 y (2.6; 4.1), with a median (first; third quartile) CD4 count of 592 cells/mm³ (356; 895) and a median (first; third quartile) CD4% of 16% (10%; 23%). The estimated cumulative mortality after 3 y for ART initiation at different CD4 thresholds ranged from 3.4% (95% CI: 2.1%-6.5%) (no ART) to 2.1% (95% CI: 1.3%-3.5%) (ART irrespective of CD4 value). Estimated mortality was overall higher when initiating ART at lower CD4 values or not at all. There was no mortality difference between starting ART immediately, irrespective of CD4 value, and ART initiation at the WHO 2010 recommended threshold of CD4 count <750 cells/mm³ or CD4% <25%, with mortality estimates of 2.1% (95% CI: 1.3%-3.5%) and 2.2% (95% CI: 1.4%-3.5%) after 3 y, respectively. The analysis was limited by loss to follow-up and the unavailability of WHO staging data. CONCLUSIONS The results indicate no mortality difference for up to 3 y between ART initiation irrespective of CD4 value and ART initiation at a threshold of CD4 count <750 cells/mm³ or CD4% <25%, but overall higher point estimates for mortality when ART is initiated at lower CD4 values.
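The sketch below illustrates the g-computation idea in a much-simplified, point-treatment form with a bootstrap percentile confidence interval; the study itself used a longitudinal g-formula with time-dependent confounding, which is considerably more involved. All file and column names are hypothetical.

# Illustrative sketch: point-treatment g-computation for the 3-year mortality
# risk difference between immediate ART and no ART, with a bootstrap CI.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("children_2_5y.csv")    # hypothetical file

def g_comp_risk_difference(data):
    # 1) Outcome model: 3-year mortality on treatment plus baseline confounders.
    m = smf.logit("died_3y ~ art_immediate + cd4_percent + cd4_count + waz",
                  data=data).fit(disp=0)
    # 2) Predict the outcome for everyone under each treatment strategy.
    treat_all = data.assign(art_immediate=1)
    treat_none = data.assign(art_immediate=0)
    # 3) Standardise (average) the predictions over the observed confounders.
    return m.predict(treat_all).mean() - m.predict(treat_none).mean()

estimate = g_comp_risk_difference(df)

# Bootstrap the whole procedure for a percentile confidence interval.
rng = np.random.default_rng(42)
boot = [g_comp_risk_difference(df.sample(frac=1.0, replace=True,
                                         random_state=int(rng.integers(2**31 - 1))))
        for _ in range(500)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"Risk difference: {estimate:.3f} (95% CI {lo:.3f} to {hi:.3f})")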


Abstract:

We report the case of a patient in whom successful radiofrequency catheter ablation of an idiopathic ventricular tachycardia (VT) originating in the main stem of the pulmonary artery was performed. After successful ablation of the index arrhythmia, an idiopathic right ventricular outflow tract VT, a second VT with a different QRS morphology was reproducibly induced. Mapping of the second VT revealed the presence of myocardium approximately 2 cm above the pulmonary valve. Application of radiofrequency energy at this site resulted in termination and noninducibility of this VT. After 6 months of follow-up, the patient remained free from VT recurrence.


Abstract:

Background Non-AIDS-defining cancers (NADC) are an important cause of morbidity and mortality in HIV-positive individuals. Using data from a large international cohort of HIV-positive individuals, we describe the incidence of NADC from 2004 to 2010, subsequent mortality, and predictors of both. Methods Individuals were followed from 1 January 2004 or enrolment in the study until the earliest of a new NADC, 1 February 2010, death, or six months after the patient's last visit. Incidence rates were estimated for each year of follow-up, overall and stratified by gender, age and mode of HIV acquisition. Cumulative risk of mortality following NADC diagnosis was summarised using Kaplan-Meier methods, with follow-up for these analyses from the date of NADC diagnosis until the patient's death, 1 February 2010 or 6 months after the patient's last visit. Factors associated with mortality following NADC diagnosis were identified using multivariable Cox proportional hazards regression. Results Over 176,775 person-years (PY), 880 (2.1%) patients developed a new NADC (incidence: 4.98/1000 PY [95% confidence interval 4.65, 5.31]). Over a third of these patients (327, 37.2%) had died by 1 February 2010. Time trends for lung cancer, anal cancer and Hodgkin's lymphoma were broadly consistent. Kaplan-Meier cumulative mortality estimates at 1, 3 and 5 years after NADC diagnosis were 28.2% [95% CI 25.1-31.2], 42.0% [38.2-45.8] and 47.3% [42.4-52.2], respectively. Significant predictors of poorer survival after diagnosis of NADC were lung cancer (compared to other cancer types), male gender, non-white ethnicity, and smoking status. Later year of diagnosis and higher CD4 count at NADC diagnosis were associated with improved survival. The incidence of NADC remained stable over the period 2004-2010 in this large observational cohort. Conclusions The prognosis after diagnosis of NADC, in particular lung cancer and disseminated cancer, is poor but has improved somewhat over time. Modifiable risk factors, such as smoking and low CD4 counts, were associated with mortality following a diagnosis of NADC.
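A short Python sketch of the Kaplan-Meier estimates reported above follows (illustrative only; hypothetical file and column names, using the lifelines library).

# Illustrative sketch: cumulative mortality after a NADC diagnosis at fixed
# time points, estimated with Kaplan-Meier.
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("nadc_patients.csv")    # hypothetical file: one row per NADC case

kmf = KaplanMeierFitter()
kmf.fit(durations=df["years_from_nadc"], event_observed=df["died"])

# Cumulative mortality = 1 - S(t), evaluated at 1, 3 and 5 years as above.
for t in (1, 3, 5):
    print(f"{t} y: {1 - kmf.predict(t):.1%}")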


Abstract:

Background Few studies have monitored late presentation (LP) of HIV infection across the European continent, including Eastern Europe. The study objectives were to explore the impact of LP on AIDS and mortality. Methods and Findings LP was defined in the Collaboration of Observational HIV Epidemiological Research Europe (COHERE) as HIV diagnosis with a CD4 count <350 cells/mm³ or an AIDS diagnosis within 6 months of HIV diagnosis among persons presenting for care between 1 January 2000 and 30 June 2011. Logistic regression was used to identify factors associated with LP and Poisson regression to explore the impact on AIDS/death. 84,524 individuals from 23 cohorts in 35 countries contributed data; 45,488 (53.8%) were LP. LP was highest in heterosexual males (66.1%), Southern European countries (57.0%), and persons originating from Africa (65.1%). LP decreased from 57.3% in 2000 to 51.7% in 2010/2011 (adjusted odds ratio [aOR] 0.96; 95% CI 0.95–0.97). LP decreased over time in both Central and Northern Europe among homosexual men and male and female heterosexuals, but increased over time among female heterosexuals and male intravenous drug users (IDUs) from Southern Europe and among male and female IDUs from Eastern Europe. 8,187 AIDS events/deaths occurred during 327,003 person-years of follow-up. In the first year after HIV diagnosis, LP was associated with over a 13-fold increased incidence of AIDS/death in Southern Europe (adjusted incidence rate ratio [aIRR] 13.02; 95% CI 8.19–20.70) and over a 6-fold increased rate in Eastern Europe (aIRR 6.64; 95% CI 3.55–12.43). Conclusions LP has decreased over time across Europe but remains a significant issue in the region in all HIV exposure groups. LP increased in male IDUs and female heterosexuals from Southern Europe and in IDUs from Eastern Europe. LP was associated with an increased rate of AIDS/death, particularly in the first year after HIV diagnosis, with significant variation across Europe. Earlier and more widespread testing, timely referral after testing positive, and improved retention-in-care strategies are required to further reduce the incidence of LP.
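The sketch below illustrates the two regression steps described in the methods (illustrative only; hypothetical file and column names): logistic regression for factors associated with late presentation, and Poisson regression for AIDS/death rates with person-years as the offset.

# Illustrative sketch: modelling late presentation and subsequent AIDS/death rates.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("cohere_lp.csv")        # hypothetical file: one row per person

# Odds of presenting late by calendar year, exposure group and region.
lp_model = smf.logit("late ~ year + C(exposure_group) + C(region)", data=df).fit()
print(lp_model.summary())

# Incidence rate ratios for AIDS/death, offset by log person-years of follow-up.
rate_model = smf.glm("aids_or_death ~ late + C(region)", data=df,
                     family=sm.families.Poisson(),
                     offset=np.log(df["pyfu"])).fit()
print(np.exp(rate_model.params))         # rate ratios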


Abstract:

Background. Few studies consider the incidence of individual AIDS-defining illnesses (ADIs) at higher CD4 counts, which is relevant at a population level for monitoring and resource allocation. Methods. Individuals from the Collaboration of Observational HIV Epidemiological Research Europe (COHERE) aged ≥14 years with ≥1 CD4 count of ≥200 cells/µL between 1998 and 2010 were included. Incidence rates (per 1000 person-years of follow-up [PYFU]) were calculated for each ADI within different CD4 strata; Poisson regression, using generalized estimating equations and robust standard errors, was used to model rates of ADIs with current CD4 ≥500 cells/µL. Results. A total of 12,135 ADIs occurred at a CD4 count of ≥200 cells/µL among 207,539 persons with 1,154,803 PYFU. Incidence rates declined from 20.5 per 1000 PYFU (95% confidence interval [CI], 20.0–21.1 per 1000 PYFU) with current CD4 200–349 cells/µL to 4.1 per 1000 PYFU (95% CI, 3.6–4.6 per 1000 PYFU) with current CD4 ≥1000 cells/µL. Persons with a current CD4 of 500–749 cells/µL had a significantly higher rate of ADIs (adjusted incidence rate ratio [aIRR], 1.20; 95% CI, 1.10–1.32), whereas those with a current CD4 of ≥1000 cells/µL had a similar rate (aIRR, 0.92; 95% CI, 0.79–1.07), compared to a current CD4 of 750–999 cells/µL. Results were consistent in persons with high or low viral load. Findings were stronger for malignant ADIs (aIRR, 1.52; 95% CI, 1.25–1.86) than for nonmalignant ADIs (aIRR, 1.12; 95% CI, 1.01–1.25), comparing persons with a current CD4 of 500–749 cells/µL to 750–999 cells/µL. Discussion. The incidence of ADIs was higher in individuals with a current CD4 count of 500–749 cells/µL compared to those with a CD4 count of 750–999 cells/µL, but did not decrease further at higher CD4 counts. Results were similar in patients virologically suppressed on combination antiretroviral therapy, suggesting that immune reconstitution is not complete until the CD4 count increases to >750 cells/µL.
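A minimal Python sketch of this kind of Poisson GEE model is given below (illustrative only; the file name, column names and stratum labels are hypothetical). GEE provides robust (sandwich) standard errors by default, and person-years of follow-up enter as the offset.

# Illustrative sketch: ADI rates across current-CD4 strata, one row per
# person-interval of follow-up, with the interval's person-years as the offset.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("cohere_adi.csv")       # hypothetical file

model = smf.gee(
    "adi_events ~ C(cd4_stratum, Treatment(reference='750-999'))",
    groups="patient_id",                 # repeated intervals within a patient
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["pyfu"]),
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()

print(np.exp(model.params))              # adjusted incidence rate ratios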


Abstract:

Context: In virologically suppressed, antiretroviral-treated patients, the effect of switching to tenofovir (TDF) on bone biomarkers, compared with remaining on stable antiretroviral therapy, is unknown. Methods: We examined bone biomarkers (osteocalcin [OC], procollagen type 1 amino-terminal propeptide, and C-terminal cross-linking telopeptide of type 1 collagen) and bone mineral density (BMD) over 48 weeks in virologically suppressed patients (HIV RNA <50 copies/ml) randomized to switch to TDF/emtricitabine (FTC) or remain on first-line zidovudine (AZT)/lamivudine (3TC). Parathyroid hormone (PTH) was also measured. Between-group differences in bone biomarkers and associations between changes in bone biomarkers and BMD measures were assessed by Student's t tests, Pearson correlation, and multivariable linear regression, respectively. All data are expressed as mean (SD) unless otherwise specified. Results: Of 53 subjects (mean age 46.0 y; 84.9% male; 75.5% Caucasian), 29 switched to TDF/FTC. There were reductions in total hip and lumbar spine BMD in those switching to TDF/FTC (total hip: TDF/FTC, −1.73 (2.76)% vs AZT/3TC, −0.39 (2.41)%, between-group P = .07; lumbar spine: TDF/FTC, −1.50 (3.49)% vs AZT/3TC, +0.25 (2.82)%, between-group P = .06), but these differences did not reach statistical significance. Greater declines in lumbar spine BMD correlated with greater increases in OC (r = −0.28; P = .05). The effect of TDF/FTC on bone biomarkers remained significant when adjusted for baseline biomarker levels, gender, and ethnicity. There was no difference in the change in PTH levels over 48 weeks between treatment groups (between-group P = .23). All biomarkers increased significantly from week 0 to week 48 in the switch group, with no significant change in those remaining on AZT/3TC (between-group, all biomarkers, P < .0001). Conclusion: A switch to TDF/FTC, compared with remaining on a stable regimen, is associated with increases in bone turnover that correlate with reductions in BMD, suggesting that TDF exposure directly affects bone metabolism in vivo.
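A brief Python sketch of the correlation and adjusted regression analyses described above follows (illustrative only; hypothetical file and column names).

# Illustrative sketch: correlation between changes in a bone turnover marker and
# changes in lumbar spine BMD, plus an adjusted linear model for the change in
# the marker.
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import pearsonr

df = pd.read_csv("bone_substudy.csv")    # hypothetical file: one row per subject

r, p = pearsonr(df["delta_osteocalcin"], df["delta_spine_bmd"])
print(f"r = {r:.2f}, p = {p:.2f}")

ols = smf.ols("delta_osteocalcin ~ switch_to_tdf + baseline_osteocalcin"
              " + C(gender) + C(ethnicity)", data=df).fit()
print(ols.summary())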