Abstract:
OBJECTIVES Cotrimoxazole prophylactic treatment (CPT) prevents opportunistic infections in HIV-infected or HIV-exposed children, but estimates of its effectiveness in preventing malaria vary. We reviewed studies that examined the effect of CPT on incidence of malaria in children in sub-Saharan Africa. METHODS We searched PubMed and EMBASE for randomised controlled trials (RCTs) and cohort studies on the effect of CPT on incidence of malaria and mortality in children and extracted data on the prevalence of sulphadoxine-pyrimethamine resistance-conferring point mutations. Incidence rate ratios (IRR) from individual studies were combined using random effects meta-analysis; confounder-adjusted estimates were used for cohort studies. The importance of resistance was examined in meta-regression analyses. RESULTS Three RCTs and four cohort studies with 5039 children (1692 HIV-exposed; 2800 HIV-uninfected; 1486 HIV-infected) were included. Children on CPT were less likely to develop clinical malaria episodes than those without prophylaxis (combined IRR 0.37, 95% confidence interval: 0.21-0.66), but there was substantial between-study heterogeneity (I² = 94%, P < 0.001). The protective efficacy of CPT was highest in an RCT from Mali, where the prevalence of antifolate-resistant plasmodia was low. In meta-regression analyses, there was some evidence that the efficacy of CPT declined with increasing levels of resistance. Mortality was reduced with CPT in an RCT from Zambia, but not in a cohort study from Côte d'Ivoire. CONCLUSIONS Cotrimoxazole prophylactic treatment reduces incidence of malaria and mortality in children in sub-Saharan Africa, but study designs, settings and results were heterogeneous. CPT appears to be beneficial for HIV-infected and HIV-exposed as well as HIV-uninfected children.
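The pooling step described above, combining log incidence rate ratios with random-effects weights, can be sketched in a few lines. The sketch below uses the standard DerSimonian-Laird estimator (an assumption: the abstract does not name its estimator), and the three input studies are hypothetical, not the reviewed trials.

```python
# Minimal random-effects meta-analysis on the log-IRR scale
# (DerSimonian-Laird). Inputs are per-study IRRs with 95% CIs.
import numpy as np

def random_effects_irr(irr, lo, hi):
    """Combine per-study IRRs given their 95% CIs; return pooled IRR, CI, I2."""
    y = np.log(irr)                               # per-study log-IRRs
    se = (np.log(hi) - np.log(lo)) / (2 * 1.96)   # SE from CI width
    w = 1 / se**2                                 # fixed-effect weights
    ybar = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - ybar)**2)                 # Cochran's Q
    k = len(y)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1 / (se**2 + tau2)                     # random-effects weights
    mu = np.sum(w_re * y) / np.sum(w_re)
    se_mu = np.sqrt(1 / np.sum(w_re))
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    return np.exp(mu), np.exp(mu - 1.96 * se_mu), np.exp(mu + 1.96 * se_mu), i2

# Hypothetical study-level estimates, not the trials in the review:
pooled, lo, hi, i2 = random_effects_irr([0.2, 0.5, 0.8],
                                        [0.1, 0.3, 0.6],
                                        [0.4, 0.8, 1.1])
print(f"combined IRR {pooled:.2f} (95% CI {lo:.2f}-{hi:.2f}), I^2 = {i2:.0f}%")
```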
Abstract:
The incidence of Kaposi's Sarcoma (KS) is high in South Africa, but the impact of antiretroviral therapy (ART) is not well defined. We examined incidence and survival of KS in HIV-infected patients enrolled in South African ART programs. We analyzed data of three ART programs: the Khayelitsha township and Tygerberg Hospital programs in Cape Town and the Themba Lethu program in Johannesburg. We included patients aged >16 years. ART was defined as a regimen of at least three drugs. We estimated incidence rates of KS for patients on ART and not on ART. We fitted Cox models adjusted for age, sex and time-updated CD4 cell counts and HIV-1 RNA. A total of 18,254 patients (median age 34.5 years, 64% female, median CD4 cell count at enrolment 105 cells/μL) were included. During 37,488 person-years of follow-up, 162 patients developed KS. The incidence was 1,682/100,000 person-years (95% confidence interval [CI] 1,406-2,011) among patients not receiving ART and 138/100,000 person-years (95% CI 102-187) among patients on ART. The adjusted hazard ratio comparing time on ART with time not on ART was 0.19 (95% CI 0.13-0.28). Low CD4 cell counts (time-updated) and male sex were also associated with KS. Estimated survival of KS patients at one year was 72.2% (95% CI 64.9-80.2) and was higher in men than in women. The incidence of KS is substantially lower among patients on ART than among those not on ART. Timely initiation of ART is essential to prevent KS and KS-associated morbidity and mortality in South Africa and other regions in Africa with a high burden of HIV.
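Crude rates of this kind are events divided by person-time, with an approximate Poisson confidence interval on the log scale. A minimal sketch, using the abstract's overall totals (162 events over 37,488 person-years) rather than the separate on-/off-ART person-time, which the abstract does not report:

```python
# Incidence rate per 100,000 person-years with a log-scale Poisson CI.
import math

def incidence_rate(events, person_years, per=100_000):
    rate = events / person_years * per
    se_log = 1 / math.sqrt(events)        # SE of log(rate) under Poisson
    lo = rate * math.exp(-1.96 * se_log)
    hi = rate * math.exp(+1.96 * se_log)
    return rate, lo, hi

# Overall crude rate across ART exposure (totals from the abstract):
rate, lo, hi = incidence_rate(events=162, person_years=37_488)
print(f"{rate:.0f}/100,000 person-years (95% CI {lo:.0f}-{hi:.0f})")
```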
Abstract:
Congenital distal renal tubular acidosis (dRTA) from mutations of the B1 subunit of the V-ATPase is considered an autosomal recessive disease. We analyzed a dRTA kindred with a truncation mutation of B1 (p.Phe468fsX487) previously shown to fail to assemble into the V1 domain of the V-ATPase. All heterozygous carriers in this kindred have normal plasma bicarbonate concentrations and have thus evaded the diagnosis of RTA. However, inappropriately high urine pH, hypocitraturia, and hypercalciuria are present, either individually or in combination, in the heterozygotes at baseline. Two of the heterozygotes studied also showed inappropriate urinary acidification on acute ammonium chloride loading and an impaired urine-blood pCO2 gradient during bicarbonaturia, indicating the presence of both H+ gradient and flux defects. In normal human renal papillae, wild-type B1 is located primarily on the plasma membrane; in contrast, papillary tissue secured at surgery from one of the heterozygotes, who had kidney stones, showed B1 both on the plasma membrane and in a diffuse intracellular distribution. Titrating increasing amounts of the mutant B1 subunit exhibited no negative dominance over the expression, cellular distribution, or H+-pump activity of wild-type B1 in mammalian HEK293 cells or in V-ATPase-deficient S. cerevisiae. This is the first demonstration of renal acidification defects and nephrolithiasis in heterozygous carriers of a mutant B1 subunit, and these defects cannot be attributed to negative dominance. We propose that heterozygosity may lead to mild renal acidification defects due to haploinsufficiency. B1 heterozygosity should be considered in patients with calcium nephrolithiasis and urinary abnormalities such as alkalinuria or hypocitraturia.
Abstract:
In several studies of antiretroviral treatment (ART) programs for persons with human immunodeficiency virus infection, investigators have reported higher rates of loss to follow-up (LTFU) among patients initiating ART in recent years than among patients who initiated ART during earlier time periods. This finding is frequently interpreted as reflecting deterioration of patient retention in the face of increasing patient loads. However, in this paper we demonstrate by simulation that transient gaps in follow-up could lead to bias when standard survival analysis techniques are applied. We created a simulated cohort of patients with different dates of ART initiation. Rates of ART interruption, ART resumption, and mortality were assumed to remain constant over time, but when we applied a standard definition of LTFU, the simulated probability of being classified LTFU at a particular ART duration was substantially higher in recently enrolled cohorts. This suggests that much of the apparent trend towards increased LTFU may be attributed to bias caused by transient interruptions in care. Alternative statistical techniques need to be used when analyzing predictors of LTFU: for example, using "prospective" definitions of LTFU in place of "retrospective" definitions. Similar considerations may apply when analyzing predictors of LTFU from treatment programs for other chronic diseases.
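A stripped-down version of such a simulation shows the bias directly: interruption and resumption rates are constant, yet a retrospective LTFU definition (no visit in the final six months before database closure) flags recent enrollees far more often at short ART durations. All rates and the closure date below are illustrative assumptions, and mortality is omitted for brevity.

```python
# Patients alternate between in-care and out-of-care at constant rates;
# each transition marks a visit. Retrospective LTFU: out of care at closure
# with no visit in the final WINDOW years.
import numpy as np

rng = np.random.default_rng(0)
CLOSURE, WINDOW = 10.0, 0.5        # years; illustrative choices
GAP_RATE, RESUME_RATE = 0.3, 1.0   # per-year rates, constant over time

def ltfu_in_first_year(enrol_year, n=20_000):
    """Fraction classified LTFU with last visit within 1 year of starting ART."""
    flagged = 0
    for _ in range(n):
        t, last_seen, in_care = enrol_year, enrol_year, True
        while True:
            rate = GAP_RATE if in_care else RESUME_RATE
            t += rng.exponential(1.0 / rate)   # time to next transition
            if t >= CLOSURE:
                break
            in_care = not in_care              # drop out of / return to care
            last_seen = t                      # each transition is a visit
        retro_ltfu = (not in_care) and (CLOSURE - last_seen) > WINDOW
        flagged += retro_ltfu and (last_seen - enrol_year) <= 1.0
    return flagged / n

for year in (2, 6, 9):   # same underlying rates, different enrolment dates
    print(f"enrolled in year {year}: "
          f"{ltfu_in_first_year(year):.1%} classified LTFU in first year")
```

Under these assumptions the early-enrolled cohort shows essentially no first-year LTFU while the recently enrolled cohort shows several percent, reproducing the apparent (but artefactual) worsening over calendar time.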
Abstract:
BACKGROUND The CD4 cell count or percent (CD4%) at the start of combination antiretroviral therapy (cART) is an important prognostic factor in children starting therapy and an important indicator of program performance. We describe trends and determinants of CD4 measures at cART initiation in children from low-, middle-, and high-income countries. METHODS We included children aged <16 years from clinics participating in a collaborative study spanning sub-Saharan Africa, Asia, Latin America, and the United States. Missing CD4 values at cART start were estimated through multiple imputation. Severe immunodeficiency was defined according to World Health Organization criteria. Analyses used generalized additive mixed models adjusted for age, country, and calendar year. RESULTS A total of 34,706 children from 9 low-income, 6 lower middle-income, and 4 upper middle-income countries and 1 high-income country (United States) were included; 20,624 children (59%) had severe immunodeficiency. In low-income countries, the estimated prevalence of children starting cART with severe immunodeficiency declined from 76% in 2004 to 63% in 2010. Corresponding figures for lower middle-income countries were from 77% to 66% and for upper middle-income countries from 75% to 58%. In the United States, the percentage decreased from 42% to 19% during the period 1996 to 2006. In low- and middle-income countries, infants and children aged 12-15 years had the highest prevalence of severe immunodeficiency at cART initiation. CONCLUSIONS Despite progress in most low- and middle-income countries, many children continue to start cART with severe immunodeficiency. Early diagnosis and treatment of HIV-infected children to prevent morbidity and mortality associated with immunodeficiency must remain a global public health priority.
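Missing CD4 values were handled by multiple imputation. A minimal sketch of one imputation pass using scikit-learn's IterativeImputer, with a made-up data frame; the study's actual imputation model and covariate set are not specified in the abstract.

```python
# One pass of chained-equations imputation of missing CD4% at cART start.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(1)
n = 1_000
df = pd.DataFrame({                                # hypothetical cohort
    "age_years": rng.uniform(0, 16, n),
    "calendar_year": rng.integers(2004, 2011, n),
    "cd4_percent": rng.normal(15, 8, n).clip(0, 50),
})
df.loc[rng.random(n) < 0.3, "cd4_percent"] = np.nan  # 30% missing at start

imputer = IterativeImputer(sample_posterior=True, random_state=0)
df[["age_years", "calendar_year", "cd4_percent"]] = imputer.fit_transform(df)
# In practice this is repeated with different seeds to create several imputed
# datasets, and the severe-immunodeficiency prevalence is pooled across them
# (Rubin's rules) before fitting the trend models.
```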
Abstract:
OBJECTIVES Many paediatric antiretroviral therapy (ART) programmes in Southern Africa rely on CD4⁺ cell counts to monitor ART. We assessed the benefit of replacing CD4⁺ monitoring with viral load monitoring. DESIGN A mathematical modelling study. METHODS A simulation model of HIV progression over 5 years in children on ART, parameterized with data from seven South African cohorts. We simulated treatment programmes with 6-monthly CD4⁺ monitoring or 6- or 12-monthly viral load monitoring. We compared mortality, second-line ART use, immunological failure and time spent on failing ART. In further analyses, we varied the rate of virological failure and assumed that this rate is higher with CD4⁺ than with viral load monitoring. RESULTS About 7% of children were predicted to die within 5 years, independent of the monitoring strategy. Compared with CD4⁺ monitoring, 12-monthly viral load monitoring reduced the 5-year risk of immunological failure from 1.6% to 1.0% and the mean time spent on failing ART from 6.6 to 3.6 months; 1% of children with CD4⁺ monitoring compared with 12% with viral load monitoring switched to second-line ART. Differences became larger when assuming higher rates of virological failure. When assuming higher virological failure rates with CD4⁺ than with viral load monitoring, up to 4.2% of children with CD4⁺ monitoring compared with 1.5% with viral load monitoring experienced immunological failure; the mean time spent on failing ART was 27.3 months with CD4⁺ monitoring and 6.0 months with viral load monitoring. CONCLUSION Viral load monitoring did not affect 5-year mortality, but reduced time on failing ART, improved immunological response and increased switching to second-line ART.
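The mechanism behind these results, failure accruing between monitoring visits and being detected with strategy-dependent sensitivity, can be illustrated with a toy cohort simulation. All probabilities below are invented for illustration; they are not the fitted South African parameters.

```python
# Toy monthly simulation: children fail virologically at a constant rate;
# failure is only detected (triggering a switch) at monitoring visits, with
# low per-visit sensitivity for CD4 criteria and high sensitivity for VL.
import numpy as np

rng = np.random.default_rng(0)

def simulate(monitor_every_m, detect_p, n=50_000, months=60, fail_p=0.004):
    """Return mean months on failing ART and % switched by 5 years."""
    time_failing = np.zeros(n)
    switched = np.zeros(n, dtype=bool)
    failing = np.zeros(n, dtype=bool)
    for m in range(1, months + 1):
        failing |= (~switched) & (rng.random(n) < fail_p)  # new failures
        time_failing += failing & ~switched
        if m % monitor_every_m == 0:                       # monitoring visit
            detected = failing & ~switched & (rng.random(n) < detect_p)
            switched |= detected
            failing &= ~detected                           # moved to 2nd line
    return time_failing.mean(), switched.mean() * 100

for label, interval, sens in [("CD4 6-monthly", 6, 0.15),
                              ("VL 12-monthly", 12, 0.95)]:
    t, s = simulate(interval, sens)
    print(f"{label}: {t:.1f} months on failing ART, {s:.1f}% switched")
```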
Abstract:
BACKGROUND Even among HIV-infected patients who fully suppress plasma HIV RNA replication on antiretroviral therapy, genetic (e.g. CCL3L1 copy number), viral (e.g. tropism) and environmental (e.g. chronic exposure to microbial antigens) factors influence CD4 recovery. These factors differ markedly around the world, and therefore the expected CD4 recovery during HIV RNA suppression may differ globally. METHODS We evaluated HIV-infected adults from North America, West Africa, East Africa, Southern Africa and Asia starting non-nucleoside reverse transcriptase inhibitor-based regimens containing efavirenz or nevirapine, who achieved at least one HIV RNA level <500 copies/ml in the first year of therapy, and observed CD4 changes during HIV RNA suppression. We used piecewise linear regression to estimate the influence of region of residence on CD4 recovery, adjusting for socio-demographic and clinical characteristics. We observed 28 217 patients from 105 cohorts over 37 825 person-years. RESULTS After adjustment, patients from East Africa showed diminished CD4 recovery compared with other regions. Three years after antiretroviral therapy initiation, the mean CD4 count for a prototypical patient with a pre-therapy CD4 count of 150 cells/μl was 529 cells/μl [95% confidence interval (CI): 517-541] in North America, 494 cells/μl (95% CI: 429-559) in West Africa, 515 cells/μl (95% CI: 508-522) in Southern Africa, 503 cells/μl (95% CI: 478-528) in Asia and 437 cells/μl (95% CI: 425-449) in East Africa. CONCLUSIONS CD4 recovery during HIV RNA suppression is diminished in East Africa compared with other regions of the world, and the observed differences are large enough to potentially influence clinical outcomes. Epidemiological analyses on a global scale can identify macroscopic effects unobservable at the clinical, national or individual regional level.
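A piecewise linear model of CD4 recovery can be fitted by ordinary least squares with a hinge basis. In the sketch below, the single knot at 3 months, the synthetic trajectory, and the noise level are all assumptions for illustration; the study's knot placement is not given in the abstract.

```python
# Piecewise linear (one-knot) fit of CD4 count versus time on ART.
import numpy as np

rng = np.random.default_rng(2)
t = rng.uniform(0, 36, 500)                    # months since ART start
cd4 = (150 + 25 * np.minimum(t, 3)             # fast early recovery
       + 6 * np.maximum(t - 3, 0)              # slower late recovery
       + rng.normal(0, 40, t.size))            # measurement noise

knot = 3.0
X = np.column_stack([np.ones_like(t),
                     np.minimum(t, knot),       # slope before the knot
                     np.maximum(t - knot, 0)])  # slope after the knot
beta, *_ = np.linalg.lstsq(X, cd4, rcond=None)
print(f"baseline {beta[0]:.0f} cells/ul, early slope {beta[1]:.1f}/month, "
      f"late slope {beta[2]:.1f}/month")
```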
Abstract:
Introduction: The ProAct study has shown that a pump switch to the Accu-Chek® Combo system (Roche Diagnostics Deutschland GmbH, Mannheim, Germany) in type 1 diabetes patients results in stable glycemic control, with significant improvements in glycated hemoglobin (HbA1c) in patients with unsatisfactory baseline HbA1c and shorter pump usage time. Patients and Methods: In this post hoc analysis of the ProAct database, we investigated glycemic control and glycemic variability at baseline by determining several established parameters and scores (HbA1c, hypoglycemia frequency, J-score, Hypoglycemia and Hyperglycemia Indexes, and Index of Glycemic Control) in participants with different daily bolus and blood glucose measurement frequencies (fewer than four, four or five, and more than five per day, for both measures). The data were derived from up to 299 patients (172 females, 127 males; age [mean±SD], 39.4±15.2 years; pump treatment duration, 7.0±5.2 years). Results: Participants with frequent glucose readings had better glycemic control than those with few readings (more than five readings per day vs. fewer than four readings per day: HbA1c, 7.2±1.1% vs. 8.0±0.9%; mean daily blood glucose, 151±22 mg/dL vs. 176±30 mg/dL; percentage of readings per month >300 mg/dL, 10±4% vs. 14±5%; percentage of readings in target range [80-180 mg/dL], 59% vs. 48% [P<0.05 in all cases]) and had lower glycemic variability (J-score, 49±13 vs. 71±25 [P<0.05]; Hyperglycemia Index, 0.9±0.5 vs. 1.9±1.2 [P<0.05]; Index of Glycemic Control, 1.9±0.8 vs. 3.1±1.6 [P<0.05]; Hypoglycemia Index, 0.9±0.8 vs. 1.2±1.3 [not significant]). Frequent self-monitoring of blood glucose was associated with a higher number of bolus applications (6.1±2.2 boluses/day vs. 4.5±2.0 boluses/day [P<0.05]). Accordingly, a similar but less pronounced effect on glycemic variability in favor of more daily bolus applications was observed (more than five vs. fewer than four boluses per day: J-score, 57±17 vs. 63±25 [not significant]; Hypoglycemia Index, 1.0±1.0 vs. 1.5±1.4 [P<0.05]; Hyperglycemia Index, 1.3±0.6 vs. 1.6±1.1 [not significant]; Index of Glycemic Control, 2.3±1.1 vs. 3.1±1.7 [P<0.05]). Conclusions: Pump users who perform frequent daily glucose readings have better glycemic control with lower glycemic variability.
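Of the variability scores used above, the J-score has a simple closed form: J = 0.001 × (mean + SD)² with glucose in mg/dL (Wojcicki's J-index). A minimal sketch with made-up self-monitoring readings; the Hypoglycemia/Hyperglycemia Indexes and Index of Glycemic Control follow analogous but parameterized formulas and are omitted here.

```python
# J-index from a list of self-monitored blood glucose readings (mg/dL).
import statistics

def j_index(readings_mg_dl):
    """J = 0.001 * (mean + SD)^2; higher values mean worse control."""
    mean = statistics.mean(readings_mg_dl)
    sd = statistics.stdev(readings_mg_dl)
    return 0.001 * (mean + sd) ** 2

readings = [95, 142, 188, 120, 260, 105, 170, 90]  # hypothetical SMBG values
print(f"J-score = {j_index(readings):.1f}")        # lower is better
```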
Abstract:
The formation of blood vessels is a complex, tissue-specific process that plays a pivotal role in development, wound healing, cancer progression, fibrosis and other pathologies. To study vasculogenesis and vascular remodeling in the context of the lung, we developed an in vitro microvascular model that closely mimics the human lung microvasculature in terms of 3D architecture, accessibility, functionality and cell types. Human pericytes from the distal airway were isolated and characterized using flow cytometry. To assess their role in the generation of normal microvessels, lung pericytes were mixed in fibrin gel and seeded into well-defined microcompartments together with primary endothelial cells (HUVEC). Patent microvessels covering an area of 3.1 mm² formed within 3-5 days and were stable for up to 14 days. Soluble signals from the lung pericytes were necessary to establish perfusability, and pericytes migrated towards endothelial microvessels. Cell-cell communication in the form of adherens and tight junctions, as well as secretion of basement membrane, was confirmed using transmission electron microscopy and immunocytochemistry on chip. Direct co-culture of pericytes with endothelial cells decreased the microvascular permeability by one order of magnitude, from 17.8×10⁻⁶ cm/s to 2.0×10⁻⁶ cm/s, and led to vessels with significantly smaller and less variable diameter. Upon phenylephrine administration, vasoconstriction was observed in microvessels lined with pericytes but not in endothelial-only microvessels. Perfusable microvessels were also generated with human lung microvascular endothelial cells and lung pericytes. Human lung pericytes were thus shown to have a prominent influence on microvascular morphology, permeability, vasoconstriction and long-term stability in an in vitro microvascular system. This biomimetic platform opens new possibilities for testing the functions and interactions of patient-derived cells in a physiologically relevant microvascular setting.
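Permeability values of this kind are commonly estimated from the time course of tracer fluorescence: for a cylindrical vessel, P = (1/ΔI)·(dI/dt)·(d/4), where ΔI is the intensity step on vessel filling and d the vessel diameter. The formula and the measurements below are a generic sketch under that assumption, not the authors' exact protocol.

```python
# Apparent permeability of a cylindrical microvessel from tracer intensity.
def permeability_cm_per_s(delta_i, di_dt_per_s, diameter_um):
    """P = (1/dI) * (dI/dt) * (d/4), with d converted from um to cm."""
    diameter_cm = diameter_um * 1e-4
    return (1.0 / delta_i) * di_dt_per_s * diameter_cm / 4.0

# Hypothetical measurements for one vessel segment:
p = permeability_cm_per_s(delta_i=1000.0, di_dt_per_s=0.8, diameter_um=100.0)
print(f"P = {p:.1e} cm/s")   # ~2e-6 cm/s, in the pericyte co-culture range
```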
Abstract:
BACKGROUND HIV-1 viral load (VL) testing is recommended to monitor antiretroviral therapy (ART) but is not universally available. We examined monitoring of first-line ART and switching to second-line ART in sub-Saharan Africa, 2004-2013. METHODS Adult HIV-1 infected patients starting combination ART in 16 countries were included. Switching was defined as a change from a non-nucleoside reverse-transcriptase inhibitor (NNRTI)-based regimen to a protease inhibitor (PI)-based regimen, with a change of ≥1 NRTI. Virological and immunological failures were defined per World Health Organization criteria. We calculated cumulative probabilities of switching and hazard ratios with 95% confidence intervals (CI) comparing routine VL monitoring, targeted VL monitoring, CD4 cell monitoring and clinical monitoring, adjusted for programme and individual characteristics. FINDINGS Of 297,825 eligible patients, 10,352 (3.5%) switched during 782,412 person-years of follow-up. Compared with CD4 monitoring, hazard ratios for switching were 3.15 (95% CI 2.92-3.40) for routine VL monitoring, 1.21 (1.13-1.30) for targeted VL monitoring and 0.49 (0.43-0.56) for clinical monitoring. Overall, 58.0% of patients with confirmed virological failure and 19.3% of patients with confirmed immunological failure switched within 2 years. Among patients who switched, the percentage with evidence of treatment failure based on a single CD4 or VL measurement ranged from 32.1% with clinical monitoring to 84.3% with targeted VL monitoring. Median CD4 counts at switching were 215 cells/µl under routine VL monitoring but lower (114-133 cells/µl) under the other monitoring strategies. INTERPRETATION Overall, few patients switched to second-line ART, and in the absence of routine viral load monitoring switching occurred late. Switching was more common and occurred earlier with targeted or routine viral load testing.
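Adjusted hazard ratios of this form come from a Cox proportional hazards model with monitoring strategy coded against a CD4-monitoring reference. A minimal sketch using the lifelines library; the data frame, event rates and covariate set below are wholly synthetic stand-ins for the cohort data.

```python
# Cox model for time to regimen switch by monitoring strategy (lifelines).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 5_000
strategy = rng.choice(["cd4", "routine_vl", "targeted_vl", "clinical"], size=n)
df = pd.DataFrame({
    "years_observed": rng.exponential(4, n).clip(0.1, 10),
    "switched": (rng.random(n) < 0.035).astype(int),      # event indicator
    "routine_vl": (strategy == "routine_vl").astype(int),  # CD4 = reference
    "targeted_vl": (strategy == "targeted_vl").astype(int),
    "clinical_only": (strategy == "clinical").astype(int),
    "age_at_start": rng.normal(35, 10, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_observed", event_col="switched")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```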
Abstract:
Subseafloor environments preserved in Archean greenstone belts provide an analogue for investigating potential subsurface habitats on Mars. The c. 3.5-3.4 Ga pillow lava metabasalts of the mid-Archean Barberton greenstone belt, South Africa, have been argued to contain the earliest evidence for microbial subseafloor life. This includes candidate trace fossils in the form of titanite microtextures, and sulfur isotopic signatures of pyrite preserved in metabasaltic glass of the c. 3.472 Ga Hooggenoeg Formation. It has been contended that similar microtextures in altered martian basalts may represent potential extraterrestrial biosignatures of microbe-fluid-rock interaction. However, despite numerous studies describing these putative early traces of life, a detailed metamorphic characterization of the microtextures and their host alteration conditions in the ancient pillow lava metabasites has been lacking. Here, we present a new nondestructive technique with which to study the in situ metamorphic alteration conditions associated with potential biosignatures in mafic-ultramafic rocks of the Hooggenoeg Formation. Our approach combines quantitative microscale compositional mapping by electron microprobe with inverse thermodynamic modeling to derive low-temperature chlorite crystallization conditions. We found that the titanite microtextures formed under subgreenschist to greenschist facies conditions. Two chlorite temperature groups were identified in the maps surrounding the titanite microtextures, recording peak metamorphic conditions at 315 ± 40°C (XFe3+(chlorite) = 25-34%) and lower-temperature chlorite veins/microdomains at T = 210 ± 40°C (XFe3+(chlorite) = 40-45%). These results provide the first metamorphic constraints in textural context on the Barberton titanite microtextures and thereby improve our understanding of the local preservation conditions of these potential biosignatures. We suggest that this approach may prove to be an important tool in future studies to assess the biogenicity of these earliest candidate traces of life on Earth. Furthermore, we propose that this mapping approach could also be used to investigate altered mafic-ultramafic extraterrestrial samples containing candidate biosignatures.
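Chlorite thermometry in general maps chlorite composition to crystallization temperature. As a simple illustration (a different, cruder thermometer than the inverse thermodynamic modeling used in the study), the empirical Cathelineau (1988) calibration estimates T from tetrahedral Al on a 14-oxygen basis; the Al(IV) values below are hypothetical microprobe-derived inputs.

```python
# Empirical chlorite geothermometer (Cathelineau, 1988):
# T(degC) = -61.92 + 321.98 * Al(IV), Al(IV) in atoms per formula unit
# on a 14-oxygen basis. Illustrative only; not the study's method.
def cathelineau_1988_temp(al_iv_apfu):
    return -61.92 + 321.98 * al_iv_apfu

for al_iv in (0.9, 1.2):   # hypothetical Al(IV) values from microprobe maps
    t = cathelineau_1988_temp(al_iv)
    print(f"Al(IV) = {al_iv:.2f} apfu -> T ~ {t:.0f} degC")
```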