85 results for sub-Saharan Africa
Abstract:
Tenofovir (TDF) is increasingly used in second-line antiretroviral treatment (ART) in sub-Saharan Africa. We compared outcomes of second-line ART containing and not containing TDF in cohort studies from Zambia and the Republic of South Africa (RSA).
Abstract:
BACKGROUND: Tuberculin skin testing (TST) and preventive treatment of tuberculosis (TB) are recommended for all persons with human immunodeficiency virus (HIV) infection. We aimed to assess the effect of TST and preventive treatment of TB on the incidence of TB in the era of combination antiretroviral therapy in an area with low rates of TB transmission. METHODS: We calculated the incidence of TB among participants who entered the Swiss HIV Cohort Study after 1995, and we studied the associations of TST results, epidemiological and laboratory markers, preventive TB treatment, and combination antiretroviral therapy with TB incidence. RESULTS: Of 6160 participants, 142 (2.3%) had a history of TB at study entry, and 56 (0.91%) developed TB during a total follow-up period of 25,462 person-years, corresponding to an incidence of 0.22 cases per 100 person-years. TST was performed for 69% of patients; 9.4% of patients tested had positive results (induration ≥5 mm in diameter). Among patients with positive TST results, TB incidence was 1.6 cases per 100 person-years if preventive treatment was withheld, but none of the 193 patients who received preventive treatment developed TB. Positive TST results (adjusted hazard ratio [HR], 25; 95% confidence interval [CI], 11-57), missing TST results (HR, 12; 95% CI, 4.8-20), origin from sub-Saharan Africa (HR, 5.8; 95% CI, 2.7-12.5), low CD4+ cell counts, and high plasma HIV RNA levels were associated with an increased risk of TB, whereas the risk was reduced among persons receiving combination antiretroviral therapy (HR, 0.44; 95% CI, 0.2-0.8). CONCLUSION: Screening for latent TB using TST and administering preventive treatment for patients with positive TST results is an efficacious strategy to reduce TB incidence in areas with low rates of TB transmission. Combination antiretroviral therapy reduces the incidence of TB.
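For readers checking the figures above, the crude incidence is simply cases divided by person-time. A minimal Python sketch; the exact Poisson (Garwood) interval is a standard add-on for illustration, not something the abstract itself reports:

```python
from scipy.stats import chi2

def incidence_per_100_py(cases: int, person_years: float) -> float:
    """Crude incidence rate expressed per 100 person-years of follow-up."""
    return cases / person_years * 100

def exact_poisson_ci(cases: int, person_years: float, alpha: float = 0.05):
    """Garwood exact confidence interval for a Poisson rate, per 100 person-years."""
    lower = chi2.ppf(alpha / 2, 2 * cases) / 2 if cases > 0 else 0.0
    upper = chi2.ppf(1 - alpha / 2, 2 * (cases + 1)) / 2
    return lower / person_years * 100, upper / person_years * 100

# Whole cohort: 56 incident TB cases over 25,462 person-years -> ~0.22 per 100 PY
print(round(incidence_per_100_py(56, 25_462), 2))
print(exact_poisson_ci(56, 25_462))
```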
Abstract:
Trypanosoma brucei rhodesiense and T. b. gambiense are the causative agents of sleeping sickness, a fatal disease that affects 36 countries in sub-Saharan Africa. Nevertheless, only a handful of clinically useful drugs are available. These drugs suffer from severe side-effects. The situation is further aggravated by the alarming incidence of treatment failures in several sleeping sickness foci, apparently indicating the occurrence of drug-resistant trypanosomes. For these reasons, and because vaccination does not appear feasible owing to the trypanosomes' ever-changing coat of variable surface glycoproteins (VSGs), new drugs are urgently needed. The entry of Trypanosoma brucei into the post-genomic age raises hopes for the identification of novel kinds of drug targets and, in turn, new treatments for sleeping sickness. The pragmatic definition of a drug target is a protein that is essential for the parasite and does not have homologues in the host. Such proteins are identified by comparing the predicted proteomes of T. brucei and Homo sapiens, then validated by large-scale gene disruption or gene silencing experiments in trypanosomes. Once all proteins that are essential and unique to the parasite are identified, inhibitors may be found by high-throughput screening. However powerful, this functional genomics approach will miss a number of attractive targets. Several current, successful parasiticides attack proteins that have close homologues in the human proteome. Drugs like DFMO or pyrimethamine inhibit parasite and host enzymes alike; a therapeutic window is opened only by subtle differences in the regulation of the targets, which cannot be recognized in silico. Also working against the post-genomic approach is the fact that essential proteins tend to be more highly conserved between species than non-essential ones. Here we advocate drug targeting, i.e. uptake or activation of a drug via parasite-specific pathways, as a chemotherapeutic strategy to selectively inhibit enzymes that have equally sensitive counterparts in the host. The T. brucei purine salvage machinery offers opportunities for both metabolic and transport-based targeting: unusual nucleoside and nucleobase permeases may be exploited for selective import, and salvage enzymes for selective activation of purine antimetabolites.
Abstract:
BACKGROUND: Few data are available on the long-term immunologic response to antiretroviral therapy (ART) in resource-limited settings, where ART is being rapidly scaled up using a public health approach, with a limited repertoire of drugs. OBJECTIVES: To describe immunologic response to ART among ART patients in a network of cohorts from sub-Saharan Africa, Latin America, and Asia. STUDY POPULATION/METHODS: Treatment-naive patients aged 15 and older from 27 treatment programs were eligible. Multilevel, linear mixed models were used to assess associations between predictor variables and CD4 cell count trajectories following ART initiation. RESULTS: Of 29 175 patients initiating ART, 8933 (31%) were excluded due to insufficient follow-up time and early loss to follow-up or death. The remaining 19 967 patients contributed 39 200 person-years on ART and 71 067 CD4 cell count measurements. The median baseline CD4 cell count was 114 cells/µl, with 35% having less than 100 cells/µl. Substantial intersite variation in baseline CD4 cell count was observed (range 61-181 cells/µl). Women had higher median baseline CD4 cell counts than men (121 vs. 104 cells/µl). The median CD4 cell count increased from 114 cells/µl at ART initiation to 230 [interquartile range (IQR) 144-338] at 6 months, 263 (IQR 175-376) at 1 year, 336 (IQR 224-472) at 2 years, 372 (IQR 242-537) at 3 years, 377 (IQR 221-561) at 4 years, and 395 (IQR 240-592) at 5 years. In multivariable models, baseline CD4 cell count was the most important determinant of subsequent CD4 cell count trajectories. CONCLUSION: These data demonstrate robust and sustained CD4 response to ART among patients remaining on therapy. Public health and programmatic interventions leading to earlier HIV diagnosis and initiation of ART could substantially improve patient outcomes in resource-limited settings.
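The CD4 trajectories above were modelled with multilevel linear mixed models. A minimal sketch of that kind of model using statsmodels, assuming a hypothetical long-format table (the column names cd4, months_on_art, baseline_cd4, sex and patient_id are illustrative) and simplifying the published multilevel structure to patient-level random effects:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per CD4 measurement.
df = pd.read_csv("cd4_long.csv")   # assumed columns: patient_id, site,
                                   # months_on_art, cd4, baseline_cd4, sex

# Random intercept and time slope per patient; the published analysis was
# multilevel (site and patient), which would need variance components on top.
model = smf.mixedlm(
    "cd4 ~ months_on_art + baseline_cd4 + sex",
    data=df,
    groups=df["patient_id"],
    re_formula="~months_on_art",
)
result = model.fit()
print(result.summary())
```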
Abstract:
OBJECTIVES: To investigate delayed HIV diagnosis and late initiation of antiretroviral therapy (ART) in the Swiss HIV Cohort Study. METHODS: Two sub-populations were included: 1915 patients with HIV diagnosis from 1998 to 2007 and within 3 months of cohort registration (group A), and 1730 treatment-naïve patients with CD4 ≥200 cells/µL before their second cohort visit (group B). In group A, predictors for low initial CD4 cell counts were examined with a median regression. In group B, we studied predictors for CD4 <200 cells/µL without ART despite cohort follow-up. RESULTS: Median initial CD4 cell count in group A was 331 cells/µL; 31% and 10% were <200 and <50 cells/µL, respectively. Risk factors for low CD4 count were age and non-White race. Homosexual transmission, intravenous drug use and living alone were protective. In group B, 30% initiated ART with CD4 ≥200 cells/µL; 18% and 2% dropped to CD4 <200 and <50 cells/µL without ART, respectively. Sub-Saharan origin was associated with lower probability of CD4 <200 cells/µL without ART during follow-up. Median CD4 count at ART initiation was 207 and 253 cells/µL in groups A and B, respectively. CONCLUSIONS: CD4 <200 cells/µL and, particularly, CD4 <50 cells/µL before starting ART are predominantly caused by late presentation. Earlier HIV diagnosis is paramount.
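Group A's predictors of the initial CD4 cell count were examined with median regression. A minimal sketch using quantile regression at the 50th percentile in statsmodels, with hypothetical column names (initial_cd4, age, nonwhite, msm, idu, lives_alone) standing in for the cohort variables:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical patient-level data for group A (one row per patient).
df = pd.read_csv("group_a.csv")    # assumed columns: initial_cd4, age,
                                   # nonwhite, msm, idu, lives_alone

# Median regression: model the 50th percentile of the initial CD4 cell count.
model = smf.quantreg("initial_cd4 ~ age + nonwhite + msm + idu + lives_alone", df)
result = model.fit(q=0.5)
print(result.summary())
```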
Abstract:
BACKGROUND: The retention of patients in antiretroviral therapy (ART) programmes is an important issue in resource-limited settings. Loss to follow up can be substantial, but it is unclear what the outcomes are in patients who are lost to programmes. METHODS AND FINDINGS: We searched the PubMed, EMBASE, Latin American and Caribbean Health Sciences Literature (LILACS), Indian Medlars Centre (IndMed) and African Index Medicus (AIM) databases and the abstracts of three conferences for studies that traced patients lost to follow up to ascertain their vital status. Main outcomes were the proportion of patients traced, the proportion found to be alive and the proportion that had died. Where available, we also examined the reasons why some patients could not be traced, why patients found to be alive did not return to the clinic, and the causes of death. We combined mortality data from several studies using random-effects meta-analysis. Seventeen studies were eligible. All were from sub-Saharan Africa, except one study from India, and none were conducted in children. A total of 6420 patients (range 44 to 1343 patients) were included. Patients were traced using telephone calls, home visits and through social networks. Overall the vital status of 4021 patients could be ascertained (63%, range across studies: 45% to 86%); 1602 patients had died. The combined mortality was 40% (95% confidence interval 33%-48%), with substantial heterogeneity between studies (P<0.0001). Mortality in African programmes ranged from 12% to 87% of patients lost to follow-up. Mortality was inversely associated with the rate of loss to follow up in the programme: it declined from around 60% to 20% as the percentage of patients lost to the programme increased from 5% to 50%. Among patients not found, telephone numbers and addresses were frequently incorrect or missing. Common reasons for not returning to the clinic were transfer to another programme, financial problems and improving or deteriorating health. Causes of death were available for 47 deaths: 29 (62%) died of an AIDS defining illness. CONCLUSIONS: In ART programmes in resource-limited settings a substantial minority of adults lost to follow up cannot be traced, and among those traced 20% to 60% had died. Our findings have implications both for patient care and the monitoring and evaluation of programmes.
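The combined mortality of 40% was obtained by random-effects meta-analysis. A minimal sketch of DerSimonian-Laird pooling of study-level mortality proportions on the logit scale, using made-up study counts rather than the review's actual data:

```python
import numpy as np

def dersimonian_laird(y, v):
    """Random-effects (DerSimonian-Laird) pooling of effects y with variances v."""
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)              # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)         # between-study variance
    w_star = 1.0 / (v + tau2)
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Hypothetical per-study counts: deaths among patients successfully traced.
deaths = np.array([30, 120, 55])
traced = np.array([100, 250, 180])

# Pool on the logit scale, then back-transform to a proportion.
p = deaths / traced
y = np.log(p / (1 - p))                             # logit of each study's mortality
v = 1.0 / deaths + 1.0 / (traced - deaths)          # approximate variance of the logit
pooled_logit, se, tau2 = dersimonian_laird(y, v)
pooled_p = 1 / (1 + np.exp(-pooled_logit))
ci = 1 / (1 + np.exp(-(pooled_logit + np.array([-1.96, 1.96]) * se)))
print(pooled_p, ci, tau2)
```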
Abstract:
Rapid diagnostic tests (RDT) are sometimes recommended to improve the home-based management of malaria. The accuracy of an RDT for the detection of clinical malaria and the presence of malarial parasites has recently been evaluated in a high-transmission area of southern Mali. During the same study, the cost-effectiveness of a 'test-and-treat' strategy for the home-based management of malaria (based on an artemisinin-combination therapy; ACT) was compared with that of a 'treat-all' strategy. Overall, 301 patients, of all ages, each of whom had been considered a presumptive case of uncomplicated malaria by a village health worker, were checked with a commercial RDT (Paracheck-Pf). The sensitivity, specificity, and positive and negative predictive values of this test, compared with the results of microscopy and two different definitions of clinical malaria, were then determined. The RDT was found to be 82.9% sensitive (with a 95% confidence interval of 78.0%-87.1%) and 78.9% (63.9%-89.7%) specific compared with the detection of parasites by microscopy. In the detection of clinical malaria, it was 95.2% (91.3%-97.6%) sensitive and 57.4% (48.2%-66.2%) specific compared with a general practitioner's diagnosis of the disease, and 100.0% (94.5%-100.0%) sensitive but only 30.2% (24.8%-36.2%) specific when compared against the fulfillment of the World Health Organization's (2003) research criteria for uncomplicated malaria. Among children aged 0-5 years, the cost of the 'test-and-treat' strategy, per episode, was about twice that of the 'treat-all' (U.S.$1.0 v. U.S.$0.5). In older subjects, however, the two strategies were equally costly (approximately U.S.$2/episode). In conclusion, for children aged 0-5 years in a high-transmission area of sub-Saharan Africa, use of the RDT was not cost-effective compared with the presumptive treatment of malaria with an ACT. In older patients, use of the RDT did not reduce costs. The question remains whether either of the strategies investigated can be made affordable for the affected population.
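The accuracy measures quoted above come from a standard 2x2 comparison of the RDT against each reference. A minimal sketch; the cell counts below are illustrative reconstructions chosen to roughly match the reported sensitivity and specificity versus microscopy, not figures taken from the paper:

```python
def diagnostic_accuracy(tp: int, fp: int, fn: int, tn: int):
    """Standard 2x2 accuracy measures for an index test (here, the RDT)
    against a reference standard (here, microscopy)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

# Illustrative counts only; the study's own 2x2 table is not given in the abstract.
sens, spec, ppv, npv = diagnostic_accuracy(tp=218, fp=8, fn=45, tn=30)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} PPV={ppv:.1%} NPV={npv:.1%}")
```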
Abstract:
Famines are often linked to drought in semi-arid areas of Sub-Saharan Africa where not only pastoralists, but also increasingly agro-pastoralists are affected. This study addresses the interplay between drought and famine in the rural semi-arid areas of Makueni district, Kenya, by examining whether, and how crop production conditions and agro-pastoral strategies predispose smallholder households to drought-triggered food insecurity. If this hypothesis holds, then approaches to deal with drought and famine have to target factors causing household food insecurity during non-drought periods. Data from a longitudinal survey of 127 households, interviews, workshops, and daily rainfall records (1961–2003) were analysed using quantitative and qualitative methods. This integrated approach confirms the above hypothesis and reveals that factors other than rainfall, like asset and labour constraints, inadequate policy enforcement, as well as the poverty-driven inability to adopt risk-averse production systems play a key role. When linking these factors to the high rainfall variability, farmer-relevant definitions and forecasts of drought have to be applied.
Abstract:
OBJECTIVES In resource-constrained settings, tuberculosis (TB) is a common opportunistic infection and cause of death in HIV-infected persons. TB may be present at the start of antiretroviral therapy (ART), but it is often under-diagnosed. We describe approaches to TB diagnosis and screening in ART programs in low- and middle-income countries. METHODS AND FINDINGS We surveyed ART programs treating HIV-infected adults in sub-Saharan Africa, Asia and Latin America in 2012 using online questionnaires to collect program-level and patient-level data. Forty-seven sites from 26 countries participated. Patient-level data were collected on 987 adult TB patients from 40 sites (median age 34.7 years; 54% female). Sputum smear microscopy and chest radiograph were available in 47 (100%) sites, TB culture in 44 (94%), and Xpert MTB/RIF in 23 (49%). Xpert MTB/RIF was rarely available in Central Africa and South America. In sites with access to these diagnostics, microscopy was used in 745 (76%) patients diagnosed with TB, culture in 220 (24%), and chest X-ray in 688 (70%) patients. When free of charge, culture was done in 27% of patients, compared with 21% when there was a fee (p = 0.033). Corresponding percentages for Xpert MTB/RIF were 26% and 15% of patients (p = 0.001). Screening practices for active disease before starting ART included symptom screening (46 sites, 98%), chest X-ray (38, 81%), sputum microscopy (37, 79%), culture (16, 34%), and Xpert MTB/RIF (5, 11%). CONCLUSIONS Mycobacterial culture was infrequently used despite its availability at most sites, while Xpert MTB/RIF was not generally available. Use of available diagnostics was higher when offered free of charge.
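The free-versus-fee comparisons above are tests of two proportions. A minimal sketch using a two-sample proportions z-test; the denominators are hypothetical because the abstract reports only the percentages and p-values:

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts: patients who had mycobacterial culture, split by whether
# the test was free of charge (the abstract gives 27% vs 21%, p = 0.033, but
# not the underlying denominators).
cultured = [135, 105]    # culture done (free, fee)
totals = [500, 500]      # patients at sites offering culture (free, fee)

stat, p_value = proportions_ztest(cultured, totals)
print(f"z = {stat:.2f}, p = {p_value:.3f}")
```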
Abstract:
BACKGROUND In adults it is well documented that there are substantial losses to the programme between HIV testing and start of antiretroviral therapy (ART). The magnitude and reasons for loss to follow-up and death between HIV diagnosis and start of ART in children are not well defined. METHODS We searched the PubMed and EMBASE databases for studies on children followed between HIV diagnosis and start of ART in low-income settings. We examined the proportion of children with a CD4 cell count/percentage after being diagnosed with HIV infection, the number of treatment-eligible children starting ART and predictors of loss to programme. Data were extracted in duplicate. RESULTS Eight studies from sub-Saharan Africa and two studies from Asia with a total of 10,741 children were included. Median age ranged from 2.2 to 6.5 years. Between 78.0 and 97.0% of HIV-infected children subsequently had a CD4 cell count/percentage measured, 63.2 to 90.7% of children with an eligibility assessment met the eligibility criteria for the particular setting and time, and 39.5 to 99.4% of the eligible children started ART. Three studies reported an association between low CD4 count/percentage and ART initiation while no association was reported for gender. Only two studies reported on pre-ART mortality and found rates of 13 and 6 per 100 person-years. CONCLUSION Most children who presented for HIV care met eligibility criteria for ART. There is an urgent need for strategies to improve access to and retention in care of HIV-infected children in resource-limited settings.
Abstract:
BACKGROUND Monitoring of HIV viral load in patients on combination antiretroviral therapy (ART) is not generally available in resource-limited settings. We examined the cost-effectiveness of qualitative point-of-care viral load tests (POC-VL) in sub-Saharan Africa. DESIGN Mathematical model based on longitudinal data from the Gugulethu and Khayelitsha township ART programmes in Cape Town, South Africa. METHODS Cohorts of patients on ART monitored by POC-VL, CD4 cell count or clinically were simulated. Scenario A considered the more accurate detection of treatment failure with POC-VL only, and Scenario B also considered the effect on HIV transmission. Scenario C further assumed that the risk of virologic failure is halved with POC-VL due to improved adherence. We estimated the change in costs per quality-adjusted life-year gained (incremental cost-effectiveness ratios, ICERs) of POC-VL compared with CD4 and clinical monitoring. RESULTS POC-VL tests with detection limits less than 1,000 copies/ml increased costs due to unnecessary switches to second-line ART, without improving survival. Assuming POC-VL unit costs between US$5 and US$20 and detection limits between 1,000 and 10,000 copies/ml, the ICER of POC-VL was US$4,010-US$9,230 compared with clinical and US$5,960-US$25,540 compared with CD4 cell count monitoring. In Scenario B, the corresponding ICERs were US$2,450-US$5,830 and US$2,230-US$10,380. In Scenario C, the ICER ranged between US$960 and US$2,500 compared with clinical monitoring and between cost-saving and US$2,460 compared with CD4 monitoring. CONCLUSION The cost-effectiveness of POC-VL for monitoring ART is improved by a higher detection limit, by taking the reduction in new HIV infections into account and assuming that failure of first-line ART is reduced due to targeted adherence counselling.
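The ICERs above follow the usual definition: incremental cost divided by incremental QALYs between two strategies. A minimal sketch with hypothetical per-patient costs and QALYs (the published ranges come from the full simulation model, not from this arithmetic):

```python
def icer(cost_new: float, qaly_new: float, cost_ref: float, qaly_ref: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per QALY gained when
    moving from the reference strategy to the new strategy."""
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

# Hypothetical discounted per-patient costs (US$) and QALYs for the three
# monitoring strategies considered in the model.
strategies = {
    "clinical": (2400.0, 7.10),
    "cd4":      (2600.0, 7.15),
    "poc_vl":   (3100.0, 7.25),
}

cost_vl, qaly_vl = strategies["poc_vl"]
for name in ("clinical", "cd4"):
    cost_ref, qaly_ref = strategies[name]
    print(f"POC-VL vs {name}: ICER = US${icer(cost_vl, qaly_vl, cost_ref, qaly_ref):,.0f} per QALY")
```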
Abstract:
BACKGROUND Since 2005, increasing numbers of children have started antiretroviral therapy (ART) in sub-Saharan Africa and, in recent years, WHO and country treatment guidelines have recommended ART initiation for all infants and very young children, and at higher CD4 thresholds for older children. We examined temporal changes in patient and regimen characteristics at ART start using data from 12 cohorts in 4 countries participating in the IeDEA-SA collaboration. METHODOLOGY/PRINCIPAL FINDINGS Data from 30,300 ART-naïve children aged <16 years at ART initiation who started therapy between 2005 and 2010 were analysed. We examined changes in median values for continuous variables using Cuzick's test for trend over time. We also examined changes in the proportions of patients with particular disease severity characteristics (expressed as a binary variable, e.g. WHO Stage III/IV vs I/II) using logistic regression. Between 2005 and 2010, the number of children starting ART each year increased and median age declined from 63 months (2006) to 56 months (2010). The proportions of children aged <1 year and ≥10 years both increased, from 12 to 19% and from 18 to 22%, respectively. Children had less severe disease at ART initiation in later years, with significant declines in the percentage with severe immunosuppression (81 to 63%), WHO Stage III/IV disease (75 to 62%), severe anemia (12 to 7%) and weight-for-age z-score<-3 (31 to 28%). Similar results were seen when restricting to infants, with significant declines in the proportions with severe immunodeficiency (98 to 82%) and Stage III/IV disease (81 to 63%). First-line regimen use followed country guidelines. CONCLUSIONS/SIGNIFICANCE Between 2005 and 2010, increasing numbers of children initiated ART, with a decline in disease severity at the start of therapy. However, even in 2010, a substantial number of infants and children started ART with advanced disease. These results highlight the importance of efforts to improve access to HIV diagnostic testing and ART in children.
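Trends in the binary severity indicators were assessed with logistic regression on calendar year. A minimal sketch using statsmodels, with hypothetical column names (year, who_stage34) standing in for the IeDEA-SA variables:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical patient-level data: one row per child at ART initiation.
df = pd.read_csv("art_start.csv")   # assumed columns: year (2005-2010),
                                    # who_stage34 (1 = WHO Stage III/IV, 0 = I/II)

# Odds of presenting with WHO Stage III/IV disease as a function of calendar year.
model = smf.logit("who_stage34 ~ year", data=df)
result = model.fit()
print(result.summary())
# exp(coefficient on year) = odds ratio per later calendar year of ART start
```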
Abstract:
Background. Although tenofovir (TDF) use has increased as part of first-line antiretroviral therapy (ART) across sub-Saharan Africa, renal outcomes among patients receiving TDF remain poorly understood. We assessed changes in renal function and mortality in patients starting TDF- or non-TDF-containing ART in Lusaka, Zambia. Methods. We included patients aged ≥16 years who started ART from 2007 onward, with documented baseline weight and serum creatinine. Renal dysfunction was categorized as mild (eGFR 60-89 mL/min), moderate (30-59 mL/min) or severe (<30 mL/min) using the CKD-EPI formula. Differences in eGFR during ART were analyzed using linear mixed-effect models, the odds of developing a moderate or severe eGFR decrease with logistic regression, and mortality with competing risk regression. Results. We included 62,230 adults, of whom 38,716 (62%) initiated a TDF-based regimen. The proportion with moderate or severe renal dysfunction at baseline was lower in the TDF compared to the non-TDF group (1.9% vs. 4.0%). Among patients with no or mild renal dysfunction, those on TDF were more likely to develop moderate (adjusted OR: 3.11; 95%CI: 2.52-3.87) or severe eGFR decrease (adjusted OR: 2.43; 95%CI: 1.80-3.28), although the incidence of such episodes was low. Among patients with moderate or severe renal dysfunction at baseline, renal function improved independently of ART regimen, and mortality was similar in both treatment groups. Conclusions. TDF use did not attenuate renal function recovery or increase mortality in patients with renal dysfunction. Further studies are needed to determine the role of routine renal function monitoring before and during ART use in Africa.
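Renal function in this study was categorized from the estimated GFR. A minimal sketch of the 2009 CKD-EPI creatinine equation and the mild/moderate/severe cut-offs used above; the example patient values are hypothetical:

```python
def ckd_epi_egfr(scr_mg_dl: float, age: float, female: bool, black: bool = False) -> float:
    """2009 CKD-EPI creatinine equation, in mL/min/1.73 m^2."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = 141 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

def renal_category(egfr: float) -> str:
    """Categories used in the abstract: mild 60-89, moderate 30-59, severe <30."""
    if egfr >= 90:
        return "normal"
    if egfr >= 60:
        return "mild"
    if egfr >= 30:
        return "moderate"
    return "severe"

# Hypothetical patient: serum creatinine 1.4 mg/dL, 38-year-old woman.
egfr = ckd_epi_egfr(scr_mg_dl=1.4, age=38, female=True)
print(round(egfr, 1), renal_category(egfr))
```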
Abstract:
The Bodélé Depression (Chad) in the central Sahara/Sahel region of Northern Africa is the most important source of mineral dust to the atmosphere globally. The Bodélé Depression is purportedly the largest source of Saharan dust reaching the Amazon Basin by transatlantic transport. Here, we have undertaken a comprehensive study of surface sediments from the Bodélé Depression and dust deposits (Chad, Niger) in order to characterize this dust source geochemically and isotopically (Sr, Nd and Pb isotopes), and to evaluate its importance in present and past African dust records. We similarly analyzed sedimentary deposits from the Amazonian lowlands in order to assess postulated accumulation of African mineral dust in the Amazon Basin, as well as its possible role in fertilizing the Amazon rainforest. Our results identify distinct sources of different ages and provenance in the Bodélé Depression versus the Amazon Basin, effectively ruling out an origin for the Amazonian deposits, such as the Belterra Clay Layer, by long-term deposition of Bodélé Depression material. Similarly, no evidence for contributions from other potential source areas is provided by existing isotope data (Sr, Nd) on Saharan dusts. Instead, the composition of these Amazonian deposits is entirely consistent with derivation from in-situ weathering and erosion of the Precambrian Amazonian craton, with little, if any, Andean contribution. In the Amazon Basin, the mass accumulation rate of eolian dust is only around one-third of the vertical erosion rate in shield areas, suggesting that Saharan dust is “consumed” by tropical weathering, contributing nutrients and stimulating plant growth, but never accumulates as such in the Amazon Basin. The chemical and isotope compositions found in the Bodélé Depression vary at the local scale, and have contrasting signatures in the “silica-rich” dry lake-bed sediments and in the “calcium-rich” mixed diatomites and surrounding sand material. This unexpected finding implies that the Bodélé Depression material is not “pre-mixed” at the source to provide a homogeneous source of dust. Rather, different isotope signatures can be emitted depending on subtle vagaries of dust-producing events. Our characterization of the Bodélé Depression components indicates that the Bodélé “calcium-rich” component, identified here, is most likely released via eolian processes of sand grain saltation and abrasion and may be significant in the overall global budget of dust carried by the Harmattan low-level jet during the winter.
Abstract:
Biofuel production, while highly contested, is supported by a number of policies worldwide. Ethiopia was among the first sub-Saharan countries to devise a biofuel policy strategy to guide the associated demand toward sustainable development. In this paper, I discuss Ethiopia’s biofuel policy from an interpretative research position using a frames approach and argue that useful insights can be obtained by paying more attention to national contexts and values represented in the debates on whether biofuel production can or will contribute to sustainable development. To this end, I was able to distinguish three major frames used in the Ethiopian debate on biofuels: an environmental rehabilitation frame, a green revolution frame and a legitimacy frame. The article concludes that actors advocating for frames related to social and human issues have difficulties entering the debate and forming alliances, and that those voices need to be included in order for Ethiopia to develop a sustainable biofuel sector.