80 results for Art Built In
Abstract:
BACKGROUND There is limited evidence on the optimal timing of antiretroviral therapy (ART) initiation in children 2-5 y of age. We conducted a causal modelling analysis using the International Epidemiologic Databases to Evaluate AIDS-Southern Africa (IeDEA-SA) collaborative dataset to determine the difference in mortality when starting ART in children aged 2-5 y immediately (irrespective of CD4 criteria), as recommended in the World Health Organization (WHO) 2013 guidelines, compared to deferring to lower CD4 thresholds, for example, the WHO 2010 recommended threshold of CD4 count <750 cells/mm³ or CD4 percentage (CD4%) <25%. METHODS AND FINDINGS ART-naïve children enrolling in HIV care at IeDEA-SA sites who were between 24 and 59 mo of age at first visit and with ≥1 visit prior to ART initiation and ≥1 follow-up visit were included. We estimated mortality for ART initiation at different CD4 thresholds for up to 3 y using g-computation, adjusting for measured time-dependent confounding of CD4 percent, CD4 count, and weight-for-age z-score. Confidence intervals were constructed using bootstrapping. The median (first; third quartile) age at first visit of 2,934 children (51% male) included in the analysis was 3.3 y (2.6; 4.1), with a median (first; third quartile) CD4 count of 592 cells/mm³ (356; 895) and median (first; third quartile) CD4% of 16% (10%; 23%). The estimated cumulative mortality after 3 y for ART initiation at different CD4 thresholds ranged from 3.4% (95% CI: 2.1%-6.5%) (no ART) to 2.1% (95% CI: 1.3%-3.5%) (ART irrespective of CD4 value). Estimated mortality was overall higher when initiating ART at lower CD4 values or not at all. There was no mortality difference between starting ART immediately, irrespective of CD4 value, and ART initiation at the WHO 2010 recommended threshold of CD4 count <750 cells/mm³ or CD4% <25%, with mortality estimates of 2.1% (95% CI: 1.3%-3.5%) and 2.2% (95% CI: 1.4%-3.5%) after 3 y, respectively. 
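The study's g-computation handles time-dependent confounding over follow-up; as a minimal illustration of the underlying g-formula, here is a point-treatment standardization over a single baseline confounder. The data, variable names, and risks below are entirely made up for illustration and are not from the study:

```python
from collections import defaultdict

# Toy records: (confounder z, treatment a, death y) — invented data,
# only to illustrate the point-treatment g-formula (standardization).
# Each stratum of z must contain both treatment arms for this to work.
records = [
    (0, 0, 0), (0, 0, 1), (0, 1, 0), (0, 1, 0),
    (1, 0, 1), (1, 0, 1), (1, 1, 0), (1, 1, 1),
    (1, 0, 0), (0, 1, 0), (1, 1, 0), (0, 0, 0),
]

def g_formula_risk(records, a_fixed):
    """E[Y^a] = sum_z E[Y | A=a, Z=z] * P(Z=z): average stratum-specific
    risks under treatment a over the marginal confounder distribution."""
    n = len(records)
    strata = defaultdict(lambda: [0, 0])  # z -> [deaths, count] among A=a_fixed
    z_counts = defaultdict(int)
    for z, a, y in records:
        z_counts[z] += 1
        if a == a_fixed:
            strata[z][0] += y
            strata[z][1] += 1
    return sum((strata[z][0] / strata[z][1]) * (z_counts[z] / n)
               for z in z_counts)

# With these toy data: treated risk 1/6 ≈ 0.17, untreated risk 0.5.
print(g_formula_risk(records, 1), g_formula_risk(records, 0))
```

The longitudinal version used in the paper iterates this standardization over time-varying confounders (CD4, weight-for-age), which is what distinguishes g-computation from simple regression adjustment.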
The analysis was limited by loss to follow-up and the unavailability of WHO staging data. CONCLUSIONS The results indicate no mortality difference for up to 3 y between ART initiation irrespective of CD4 value and ART initiation at a threshold of CD4 count <750 cells/mm(3) or CD4% <25%, but there are overall higher point estimates for mortality when ART is initiated at lower CD4 values. Please see later in the article for the Editors' Summary.
Abstract:
BACKGROUND There is debate over using tenofovir or zidovudine alongside lamivudine in second-line antiretroviral therapy (ART) following stavudine failure. We analyzed outcomes in cohorts from South Africa, Zambia and Zimbabwe. METHODS Patients aged ≥16 years who switched from a first-line regimen including stavudine to a ritonavir-boosted lopinavir-based second-line regimen with lamivudine or emtricitabine and zidovudine or tenofovir in seven ART programs in southern Africa were included. We estimated the causal effect of receiving tenofovir or zidovudine on mortality and virological failure using Cox proportional hazards marginal structural models. Model parameters were estimated using inverse probability of treatment weights. Baseline characteristics were age, sex, calendar year and country. CD4 cell count, creatinine and hemoglobin levels were included as time-dependent confounders. RESULTS 1,256 patients on second-line ART, including 958 on tenofovir, were analyzed. Patients on tenofovir were more likely to have switched to second-line ART in recent years, spent more time on first-line ART (33 vs. 24 months) and had lower CD4 cell counts (172 vs. 341 cells/μL) at initiation of second-line ART. The adjusted hazard ratio comparing tenofovir with zidovudine was 1.00 (95% confidence interval 0.59-1.68) for virological failure and 1.40 (0.57-3.41) for death. CONCLUSIONS We did not find any difference in treatment outcomes between patients on tenofovir or zidovudine; however, the precision of our estimates was limited. There is an urgent need for randomized trials to inform second-line ART strategies in resource-limited settings.
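The marginal structural model above is fitted with inverse-probability-of-treatment weights. A minimal sketch of the stabilized weights sw = P(A=a) / P(A=a | Z=z), with invented toy data and nonparametric stratum propensities standing in for the logistic treatment model used in practice:

```python
from collections import Counter

# Toy data: (confounder z, treatment a) — illustrative only.
data = [(0, 1), (0, 1), (0, 0), (0, 1),
        (1, 0), (1, 0), (1, 1), (1, 0)]

n = len(data)
p_a = Counter(a for _, a in data)   # marginal treatment counts
strata = Counter(data)              # joint counts of (z, a)
p_z = Counter(z for z, _ in data)   # confounder counts

def stabilized_weight(z, a):
    """sw = P(A=a) / P(A=a | Z=z). Stabilized weights keep the weighted
    pseudo-population the same size as the original sample."""
    marginal = p_a[a] / n
    conditional = strata[(z, a)] / p_z[z]
    return marginal / conditional

weights = [stabilized_weight(z, a) for z, a in data]
# By construction the stabilized weights average to 1.
print(sum(weights) / n)
```

A weighted Cox model fitted to this pseudo-population then estimates the marginal (causal) hazard ratio, provided the treatment model captures all measured confounding.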
Abstract:
Background. Although tenofovir (TDF) use has increased as part of first-line antiretroviral therapy (ART) across sub-Saharan Africa, renal outcomes among patients receiving TDF remain poorly understood. We assessed changes in renal function and mortality in patients starting TDF- or non-TDF-containing ART in Lusaka, Zambia. Methods. We included patients aged ≥16 years who started ART from 2007 onward, with documented baseline weight and serum creatinine. Renal dysfunction was categorized as mild (eGFR 60-89 mL/min), moderate (30-59 mL/min) or severe (<30 mL/min) using the CKD-EPI formula. Differences in eGFR during ART were analyzed using linear mixed-effect models, the odds of developing moderate or severe eGFR decrease with logistic regression and mortality with competing risk regression. Results. We included 62,230 adults, 38,716 (62%) of whom initiated a TDF-based regimen. The proportion with moderate or severe renal dysfunction at baseline was lower in the TDF compared to the non-TDF group (1.9% vs. 4.0%). Among patients with no or mild renal dysfunction, those on TDF were more likely to develop moderate (adjusted OR: 3.11; 95%CI: 2.52-3.87) or severe eGFR decrease (adjusted OR: 2.43; 95%CI: 1.80-3.28), although the incidence of such episodes was low. Among patients with moderate or severe renal dysfunction at baseline, renal function improved independently of ART regimen and mortality was similar in both treatment groups. Conclusions. TDF use did not attenuate renal function recovery or increase mortality in patients with renal dysfunction. Further studies are needed to determine the role of routine renal function monitoring before and during ART use in Africa.
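The CKD-EPI formula mentioned in the methods is, in its standard 2009 form, the creatinine equation below; a sketch of it together with the abstract's renal dysfunction categories (serum creatinine in mg/dL, eGFR in mL/min per 1.73 m²):

```python
def ckd_epi_egfr(scr_mg_dl, age, female, black=False):
    """CKD-EPI 2009 creatinine equation.
    eGFR = 141 * min(Scr/k, 1)^a * max(Scr/k, 1)^-1.209
           * 0.993^age * 1.018 [female] * 1.159 [black]."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = (141
            * min(ratio, 1.0) ** alpha
            * max(ratio, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

def renal_category(egfr):
    """eGFR bands used in the abstract."""
    if egfr >= 90:
        return "normal"
    if egfr >= 60:
        return "mild"
    if egfr >= 30:
        return "moderate"
    return "severe"

# e.g. a 40-year-old man with Scr 1.0 mg/dL has eGFR ≈ 94 ("normal").
print(ckd_epi_egfr(1.0, 40, female=False))
```

Whether the study applied the race coefficient is not stated in the abstract, so `black=False` here is only a default, not a claim about the paper's implementation.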
Abstract:
Safe disposal of toxic wastes in geologic formations requires minimal water and gas movement in the vicinity of storage areas. Ventilation of repository tunnels or caverns built in solid rock can desaturate the near field up to a distance of meters from the rock surface, even when the surrounding geological formation is saturated and under hydrostatic pressures. A tunnel segment at the Grimsel test site, located in the Aare granite of the Bernese Alps (central Switzerland), has been subjected to a resaturation and, subsequently, to a controlled desaturation. Using thermocouple psychrometers (TP) and time domain reflectometry (TDR), the water potentials ψ and water contents θ were measured within the unsaturated granodiorite matrix near the tunnel wall at depths between 0 and 160 cm. During the resaturation, the water potentials in the first 30 cm from the rock surface changed within weeks from values of less than -1.5 MPa to near saturation. They returned to the negative initial values during desaturation. The dynamics of this saturation-desaturation regime could be monitored very sensitively using the thermocouple psychrometers. The TDR measurements indicated that water contents changed close to the surface, but at deeper installation depths the observed changes were within the experimental noise. The field-measured data of the desaturation cycle were used to test the predictive capabilities of the hydraulic parameter functions that were derived from the water retention characteristics ψ(θ) determined in the laboratory. 
A depth-invariant saturated hydraulic conductivity k_s = 3.0 × 10⁻¹¹ m s⁻¹ was estimated from the ψ(t) data at all measurement depths, using the one-dimensional, unsaturated water flow and transport model HYDRUS (Vogel et al., 1996). For individual measurement depths, the estimated k_s varied between 9.8 × 10⁻¹² and 6.1 × 10⁻¹¹ m s⁻¹. The fitted k_s values fell within the range of previously estimated k_s for this location and led to a satisfactory description of the data, even though the model did not include transport of water vapor.
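The abstract does not name the retention parameterization; HYDRUS conventionally uses the van Genuchten (1980) form, so the following sketch of a ψ(θ) retention curve is an assumption, with all parameter values invented for illustration:

```python
def van_genuchten_theta(psi, theta_r, theta_s, alpha, n):
    """Van Genuchten (1980) retention curve, assumed here:
    theta(psi) = theta_r + (theta_s - theta_r) * Se,
    Se = (1 + (alpha * |psi|)^n)^-(1 - 1/n), for psi < 0.
    psi in the same length units as 1/alpha; theta dimensionless."""
    if psi >= 0:
        return theta_s  # at or above saturation
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * abs(psi)) ** n) ** (-m)
    return theta_r + (theta_s - theta_r) * se

# Illustrative parameters only (not fitted to the Grimsel data):
# residual 0.01, saturated 0.40, alpha 0.5 m^-1, n 1.5.
for psi in (0.0, -1.0, -10.0):
    print(psi, van_genuchten_theta(psi, 0.01, 0.40, 0.5, 1.5))
```

Water content decreases monotonically as the matric potential becomes more negative, which is the behavior the TDR/psychrometer measurements trace through the saturation-desaturation cycle.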
Transformation and Innovation of Rising Gothic in the Northern Holy Roman Empire: Transferring Gothic
Abstract:
In this study we present the analysis of the human remains from tomb K93.12 in the Ancient Egyptian necropolis of Dra’ Abu el-Naga, located opposite the modern city of Luxor in Upper Egypt on the western bank of the Nile. Archaeological findings indicate that the rock tomb was originally built in the early 18th dynasty. Remains of two tomb-temples of the 20th dynasty and the looted burial of the High Priest of Amun Amenhotep have been identified. After the New Kingdom the tomb was reused as a burial place until the 26th dynasty. The skeletal and mummified material of the different tomb areas underwent a detailed anthropological and paleopathological analysis. The human remains were mostly damaged and scattered due to extensive grave robberies. In total, 79 individuals could be partly reconstructed and investigated. The age and sex distribution revealed a male predominance and a high percentage of young children (< 6 years) and adults in the range of 20 to 40 years. The paleopathological analysis showed a high prevalence of stress markers such as cribra orbitalia in the younger individuals, and other pathological conditions such as dental diseases, degenerative diseases and a possible case of ankylosing spondylitis. Additionally, 13 mummies of an intrusive waste pit could be attributed to three different groups belonging to earlier time periods based on their style of mummification and materials used. The study revealed important information on the age and sex distribution and diseases of the individuals buried in tomb K93.12.
Abstract:
OBJECTIVE: The presence of minority nonnucleoside reverse transcriptase inhibitor (NNRTI)-resistant HIV-1 variants prior to antiretroviral therapy (ART) has been linked to virologic failure in treatment-naive patients. DESIGN: We performed a large retrospective study to determine the number of treatment failures that could have been prevented by implementing minority drug-resistant HIV-1 variant analyses in ART-naïve patients in whom no NNRTI resistance mutations were detected by routine resistance testing. METHODS: Of 1608 patients in the Swiss HIV Cohort Study who had initiated first-line ART with two nucleoside reverse transcriptase inhibitors (NRTIs) and one NNRTI before July 2008, 519 patients were eligible by means of HIV-1 subtype, viral load and sample availability. Key NNRTI drug resistance mutations K103N and Y181C were measured by allele-specific PCR in 208 of 519 randomly chosen patients. RESULTS: Minority K103N and Y181C drug resistance mutations were detected in five out of 190 (2.6%) and 10 out of 201 (5%) patients, respectively. Focusing on 183 patients for whom virologic success or failure could be examined, virologic failure occurred in seven out of 183 (3.8%) patients; minority K103N and/or Y181C variants were present prior to ART initiation in only two of those patients. NNRTI-containing first-line ART was nevertheless effective in 10 patients with preexisting minority NNRTI-resistant HIV-1 variants. CONCLUSION: As case-control studies have shown, minority NNRTI-resistant HIV-1 variants can have an impact on ART outcomes. However, the sole implementation of minority NNRTI-resistant HIV-1 variant analysis in addition to genotypic resistance testing (GRT) cannot be recommended in routine clinical settings. Additional associated risk factors need to be discovered.
Abstract:
SETTING Drug resistance threatens tuberculosis (TB) control, particularly among human immunodeficiency virus (HIV) infected persons. OBJECTIVE To describe practices in the prevention and management of drug-resistant TB under antiretroviral therapy (ART) programs in lower-income countries. DESIGN We used online questionnaires to collect program-level data on 47 ART programs in Southern Africa (n = 14), East Africa (n = 8), West Africa (n = 7), Central Africa (n = 5), Latin America (n = 7) and the Asia-Pacific (n = 6) in 2012. Patient-level data were collected on 1,002 adult TB patients seen at 40 of the participating ART programs. RESULTS Phenotypic drug susceptibility testing (DST) was available in 36 (77%) ART programs, but was only used for 22% of all TB patients. Molecular DST was available in 33 (70%) programs and was used in 23% of all TB patients. Twenty ART programs (43%) provided directly observed therapy (DOT) during the entire course of treatment, 16 (34%) during the intensive phase only, and 11 (23%) did not follow DOT. Fourteen (30%) ART programs reported no access to second-line anti-tuberculosis regimens; 18 (38%) reported TB drug shortages. CONCLUSIONS Capacity to diagnose and treat drug-resistant TB was limited across ART programs in lower-income countries. DOT was not always implemented and drug supplies were regularly interrupted, which may contribute to the global emergence of drug resistance.
Abstract:
BACKGROUND The risk of Kaposi sarcoma (KS) among HIV-infected persons on antiretroviral therapy (ART) is not well defined in resource-limited settings. We studied KS incidence rates and associated risk factors in children and adults on ART in Southern Africa. METHODS We included patient data of 6 ART programs in Botswana, South Africa, Zambia, and Zimbabwe. We estimated KS incidence rates in patients on ART, measuring time from 30 days after ART initiation to KS diagnosis, last follow-up visit, or death. We assessed risk factors (age, sex, calendar year, WHO stage, tuberculosis, and CD4 counts) using Cox models. FINDINGS We analyzed data from 173,245 patients (61% female, 8% children aged <16 years) who started ART between 2004 and 2010. Five hundred and sixty-four incident cases were diagnosed during 343,927 person-years (pys). The overall KS incidence rate was 164/100,000 pys [95% confidence interval (CI): 151 to 178]. The incidence rate was highest 30-90 days after ART initiation (413/100,000 pys; 95% CI: 342 to 497) and declined thereafter [86/100,000 pys (95% CI: 71 to 105), >2 years after ART initiation]. Male sex [adjusted hazard ratio (HR): 1.34; 95% CI: 1.12 to 1.61], low current CD4 counts (≥500 versus <50 cells/μL, adjusted HR: 0.36; 95% CI: 0.23 to 0.55), and age (5-9 years versus 30-39 years, adjusted HR: 0.20; 95% CI: 0.05 to 0.79) were relevant risk factors for developing KS. INTERPRETATION Despite ART, KS risk in HIV-infected persons in Southern Africa remains high. Early HIV testing and maintaining high CD4 counts are needed to further reduce KS-related morbidity and mortality.
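The overall rate and interval above can be reproduced from the reported counts. The abstract does not state which CI method was used, but a standard log-normal approximation, rate × exp(±1.96/√events), matches the reported 151-178 interval:

```python
import math

def incidence_rate_ci(events, person_years, per=100_000, z=1.96):
    """Incidence rate per `per` person-years with a log-normal
    approximate CI: rate * exp(+/- z / sqrt(events))."""
    rate = events / person_years * per
    se_log = 1.0 / math.sqrt(events)  # SE of log(rate) for Poisson counts
    return rate, rate * math.exp(-z * se_log), rate * math.exp(z * se_log)

# Reported counts: 564 incident KS cases over 343,927 person-years.
rate, lo, hi = incidence_rate_ci(564, 343_927)
print(round(rate), round(lo), round(hi))  # → 164 151 178, as in the abstract
```

With 564 events the normal approximation on the log scale is very close to an exact Poisson interval, which is why it recovers the published bounds.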
Abstract:
Microsoft Project is one of the most widely used software packages for project management. For the scheduling of resource-constrained projects, the package applies a priority-based procedure using a specific schedule-generation scheme. This procedure performs relatively poorly when compared against other software packages or state-of-the-art methods for resource-constrained project scheduling. In Microsoft Project 2010, it is possible to work with schedules that are infeasible with respect to the precedence or the resource constraints. We propose a novel schedule-generation scheme that makes use of this possibility. Under this scheme, the project tasks are scheduled sequentially while taking into account all temporal and resource constraints that a user can define within Microsoft Project. The scheme can be implemented as a priority-rule based heuristic procedure. Our computational results for two real-world construction projects indicate that this procedure outperforms the built-in procedure of Microsoft Project.
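As background to the schedule-generation schemes discussed above, here is a minimal sketch of the classic serial scheme for the resource-constrained project scheduling problem with one renewable resource. The task data and capacity are invented; the paper's proposed scheme, which exploits temporarily infeasible schedules, is more general than this textbook variant:

```python
# Hypothetical task data: name -> (duration, resource demand, predecessors).
tasks = {
    "A": (3, 2, []),
    "B": (2, 3, ["A"]),
    "C": (4, 2, ["A"]),
    "D": (2, 2, ["B", "C"]),
}
CAPACITY = 4  # units of the single renewable resource

def serial_sgs(tasks, capacity, priority):
    """Serial schedule-generation scheme: visit tasks in a
    precedence-feasible priority order and start each at the earliest
    time where all predecessors are finished and the resource profile
    stays within capacity for the task's whole duration."""
    finish = {}
    usage = {}  # time period -> resource units in use
    for name in priority:
        dur, demand, preds = tasks[name]
        t = max((finish[p] for p in preds), default=0)
        while any(usage.get(t + i, 0) + demand > capacity for i in range(dur)):
            t += 1
        for i in range(dur):
            usage[t + i] = usage.get(t + i, 0) + demand
        finish[name] = t + dur
    return finish

finish = serial_sgs(tasks, CAPACITY, ["A", "B", "C", "D"])
print(finish, max(finish.values()))  # makespan 11 with these toy data
```

Different priority rules (e.g. latest finish time, most total successors) feed different orders into the same scheme, which is what makes it the backbone of priority-rule heuristics like the one the paper benchmarks against.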
Abstract:
OBJECTIVES Many paediatric antiretroviral therapy (ART) programmes in Southern Africa rely on CD4⁺ cell counts to monitor ART. We assessed the benefit of replacing CD4⁺ by viral load monitoring. DESIGN A mathematical modelling study. METHODS A simulation model of HIV progression over 5 years in children on ART, parameterized by data from seven South African cohorts. We simulated treatment programmes with 6-monthly CD4⁺ or 6- or 12-monthly viral load monitoring. We compared mortality, second-line ART use, immunological failure and time spent on failing ART. In further analyses, we varied the rate of virological failure, and assumed that the rate is higher with CD4⁺ than with viral load monitoring. RESULTS About 7% of children were predicted to die within 5 years, independent of the monitoring strategy. Compared with CD4⁺ monitoring, 12-monthly viral load monitoring reduced the 5-year risk of immunological failure from 1.6 to 1.0% and the mean time spent on failing ART from 6.6 to 3.6 months; 1% of children with CD4⁺ compared with 12% with viral load monitoring switched to second-line ART. Differences became larger when assuming higher rates of virological failure. When assuming higher virological failure rates with CD4⁺ than with viral load monitoring, up to 4.2% of children with CD4⁺ compared with 1.5% with viral load monitoring experienced immunological failure; the mean time spent on failing ART was 27.3 months with CD4⁺ monitoring and 6.0 months with viral load monitoring. CONCLUSION Viral load monitoring did not affect 5-year mortality, but reduced time on failing ART, improved immunological response and increased switching to second-line ART.