45 results for "sediment retention in reservoirs"


Relevance: 100.00%

Abstract:

BACKGROUND The use of combination antiretroviral therapy (cART) comprising three antiretroviral medications from at least two classes of drugs is the current standard treatment for HIV infection in adults and children. Current World Health Organization (WHO) guidelines for antiretroviral therapy recommend early treatment regardless of immunologic thresholds or the clinical condition for all infants (less than one year of age) and children under the age of two years. For children aged two to five years, current WHO guidelines recommend (based on low-quality evidence) that clinical and immunological thresholds be used to identify those who need to start cART (advanced clinical stage, or CD4 counts ≤ 750 cells/mm³, or per cent CD4 ≤ 25%). This Cochrane review summarizes the currently available evidence regarding the optimal time for treatment initiation in children aged two to five years, with the goal of informing the revision of the WHO 2013 recommendations on when to initiate cART in children. OBJECTIVES To assess the evidence for the optimal time to initiate cART in treatment-naive, HIV-infected children aged 2 to 5 years. SEARCH METHODS We searched the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, EMBASE, the AEGIS conference database, specific relevant conferences, www.clinicaltrials.gov, the World Health Organization International Clinical Trials Registry platform and reference lists of articles. The date of the most recent search was 30 September 2012. SELECTION CRITERIA Randomised controlled trials (RCTs) that compared immediate with deferred initiation of cART, and prospective cohort studies that followed children from enrolment to the start of cART and on cART.
DATA COLLECTION AND ANALYSIS Two review authors considered studies for inclusion in the review, assessed the risk of bias, and extracted data on the primary outcome of death from all causes and several secondary outcomes, including the incidence of CDC category C and B clinical events and per cent CD4 cells (CD4%) at study end. For RCTs we calculated relative risks (RR) or mean differences with 95% confidence intervals (95% CI). For cohort data, we extracted relative risks with 95% CI from adjusted analyses. We combined results from RCTs using a random-effects model and examined statistical heterogeneity. MAIN RESULTS Two RCTs in HIV-positive children aged 1 to 12 years were identified. One trial was the pilot study for the larger second trial, and both compared initiation of cART regardless of clinical-immunological conditions with deferred initiation until per cent CD4 dropped to <15%. The two trials were conducted in Thailand, and in Thailand and Cambodia, respectively. Unpublished analyses of the 122 children enrolled at ages 2 to 5 years were included in this review. There was one death in the immediate cART group and no deaths in the deferred group (RR 2.9; 95% CI 0.12 to 68.9). In the subgroup analysis of children aged 24 to 59 months, there was one CDC C event in each group (RR 0.96; 95% CI 0.06 to 14.87) and 8 and 11 CDC B events in the immediate and deferred groups, respectively (RR 0.95; 95% CI 0.24 to 3.73). In this subgroup, the mean difference in CD4 per cent at study end was 5.9% (95% CI 2.7 to 9.1). One cohort study from South Africa, which compared the effect of delaying cART for up to 60 days in 573 HIV-positive children starting tuberculosis treatment (median age 3.5 years), was also included. The adjusted hazard ratio for the effect on mortality of delaying ART for more than 60 days was 1.32 (95% CI 0.55 to 3.16).
AUTHORS' CONCLUSIONS This systematic review shows that there is insufficient evidence from clinical trials in support of either early or CD4-guided initiation of ART in HIV-infected children aged 2 to 5 years. Programmatic issues such as the retention in care of children in ART programmes in resource-limited settings will need to be considered when formulating WHO 2013 recommendations.
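The zero-event comparison reported above (one death versus none; RR 2.9, 95% CI 0.12 to 68.9) is typically computed with a 0.5 continuity correction on the log-risk-ratio scale. A minimal sketch of that calculation, assuming roughly equal arms of 61 children each (the abstract does not state the exact arm sizes, so the result only approximates the published RR):

```python
import math

def rr_with_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Relative risk with a 95% CI, adding a 0.5 continuity
    correction to every cell when either arm has zero events."""
    if events_a == 0 or events_b == 0:
        events_a += 0.5
        events_b += 0.5
        n_a += 1
        n_b += 1
    rr = (events_a / n_a) / (events_b / n_b)
    # Standard error of log(RR) (Katz method)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical arm sizes of 61 per group (not stated in the abstract):
rr, lo, hi = rr_with_ci(1, 61, 0, 61)
```

The extremely wide interval simply reflects how little information a single death carries.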

Relevance: 100.00%

Abstract:

Introduction: Lesotho was among the first countries to adopt decentralization of care from hospitals to nurse-led health centres (HCs) to scale up the provision of antiretroviral therapy (ART). We compared outcomes between patients who started ART at HCs and hospitals in two rural catchment areas in Lesotho. Methods: The two catchment areas comprise two hospitals and 12 HCs. Patients ≥16 years starting ART at a hospital or HC between 2008 and 2011 were included. Loss to follow-up (LTFU) was defined as not returning to the facility for ≥180 days after the last visit, no follow-up (no FUP) as not returning after starting ART, and retention in care as alive and on ART at the facility. The data were analysed using logistic regression, competing risk regression and Kaplan-Meier methods. Multivariable analyses were adjusted for sex, age, CD4 cell count, World Health Organization stage, catchment area and type of ART. All analyses were stratified by gender. Results: Of 3747 patients, 2042 (54.5%) started ART at HCs. Both women and men at hospitals had more advanced clinical and immunological stages of disease than those at HCs. Over 5445 patient-years, 420 died and 475 were LTFU. Kaplan-Meier estimates for three-year retention were 68.7 and 69.7% at HCs and hospitals, respectively, among women (p=0.81) and 68.8% at HCs versus 54.7% at hospitals among men (p<0.001). These findings persisted in adjusted analyses, with similar retention at HCs and hospitals among women (odds ratio (OR): 0.89, 95% confidence interval (CI): 0.73-1.09) and higher retention at HCs among men (OR: 1.53, 95% CI: 1.20-1.96). The latter result was mainly driven by a lower proportion of patients LTFU at HCs (OR: 0.68, 95% CI: 0.51-0.93). Conclusions: In rural Lesotho, overall retention in care did not differ significantly between nurse-led HCs and hospitals. 
However, men seemed to benefit most from starting ART at HCs, as they were more likely to remain in care at these facilities than at hospitals.
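The three-year retention figures above are Kaplan-Meier estimates. A minimal sketch of the product-limit estimator on toy follow-up data (illustrative only; the cohort's individual-level data are not reproduced here):

```python
def kaplan_meier(durations, observed, t):
    """Product-limit estimate of S(t): survival (here, retention)
    probability at time t, given follow-up durations and event
    indicators (1 = event, 0 = censored)."""
    s = 1.0
    # Walk over distinct event times up to t, in order
    for u in sorted({d for d, e in zip(durations, observed) if e and d <= t}):
        at_risk = sum(d >= u for d in durations)
        events = sum(e for d, e in zip(durations, observed) if d == u)
        s *= 1 - events / at_risk
    return s

# Toy follow-up times (years) and event flags:
durations = [1, 2, 2, 3, 4]
observed = [1, 0, 1, 1, 0]
s3 = kaplan_meier(durations, observed, 3)  # 4/5 * 3/4 * 1/2 = 0.3
```

Censored patients (flag 0) contribute to the risk sets before dropping out but never to the event counts, which is what separates this from a naive proportion.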

Relevance: 100.00%

Abstract:

The tropical montane forests of the eastern Andean cordillera in Ecuador receive episodic Saharan dust inputs that particularly increase Ca deposition. We added CaCl2 to isolate the effect of Ca deposition by Saharan dust on tropical montane forest from the simultaneously occurring pH effect. We examined components of the Ca cycle at four control plots and four plots with added Ca (2 × 5 kg ha⁻¹ Ca annually as CaCl2) in a random arrangement. Between August 2007 and December 2009 (four applications of Ca), we determined Ca concentrations and fluxes in litter leachate, mineral soil solution (0.15 and 0.30 m depths), throughfall, and fine litterfall, and Al concentrations and speciation in soil solutions. After 1 y of Ca addition, we assessed fine-root biomass, leaf area, and tree growth. Only <3% of the applied Ca leached below the acid organic layer (pH 3.5–4.8). The added CaCl2 did not change electrical conductivity in the root zone after 2 y. In the second year of fertilization, Ca retention in the canopy of the Ca treatment tended to decrease relative to the control. After 2 y, 21% of the applied Ca had been recycled to the soil with throughfall and litterfall. One year after the first Ca addition, fine-root biomass had decreased significantly. The decreasing fine-root biomass might be attributed to a direct or an indirect beneficial effect of Ca on the soil decomposer community. Because of the almost complete association of Al with dissolved organic matter and high free Ca²⁺:Al³⁺ activity ratios in the solution of all plots, Al toxicity was unlikely. We conclude that the added Ca was retained in the system and had beneficial effects on some plants.

Relevance: 100.00%

Abstract:

Background Few studies have monitored late presentation (LP) of HIV infection across the European continent, including Eastern Europe. The study objectives were to explore the impact of LP on AIDS and mortality. Methods and Findings LP was defined in the Collaboration of Observational HIV Epidemiological Research Europe (COHERE) as HIV diagnosis with a CD4 count <350/mm³ or an AIDS diagnosis within 6 months of HIV diagnosis among persons presenting for care between 1 January 2000 and 30 June 2011. Logistic regression was used to identify factors associated with LP and Poisson regression to explore the impact on AIDS/death. 84,524 individuals from 23 cohorts in 35 countries contributed data; 45,488 (53.8%) were late presenters. LP was highest in heterosexual males (66.1%), Southern European countries (57.0%), and persons originating from Africa (65.1%). LP decreased from 57.3% in 2000 to 51.7% in 2010/2011 (adjusted odds ratio [aOR] 0.96; 95% CI 0.95–0.97). LP decreased over time in both Central and Northern Europe among homosexual men and male and female heterosexuals, but increased over time for female heterosexuals and male intravenous drug users (IDUs) from Southern Europe and for male and female IDUs from Eastern Europe. 8,187 AIDS/deaths occurred during 327,003 person-years of follow-up. In the first year after HIV diagnosis, LP was associated with over a 13-fold increased incidence of AIDS/death in Southern Europe (adjusted incidence rate ratio [aIRR] 13.02; 95% CI 8.19–20.70) and over a 6-fold increased rate in Eastern Europe (aIRR 6.64; 95% CI 3.55–12.43). Conclusions LP has decreased over time across Europe but remains a significant issue in the region in all HIV exposure groups. LP increased in male IDUs and female heterosexuals from Southern Europe and in IDUs from Eastern Europe. LP was associated with an increased rate of AIDS/deaths, particularly in the first year after HIV diagnosis, with significant variation across Europe.
Earlier and more widespread testing, timely referral after testing positive, and improved retention-in-care strategies are required to further reduce the incidence of LP.
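The headline proportion and the calendar-year trend above can be checked with simple arithmetic; note that compounding the per-year aOR over the full window is an illustrative extrapolation, not a figure reported by the study:

```python
# Late presenters among all individuals contributing data:
lp, total = 45488, 84524
prop = 100 * lp / total  # percentage presenting late (~53.8%)

# The adjusted odds ratio of 0.96 applies per calendar year;
# compounded over the ~11 years from 2000 to 2010/2011 it implies
# roughly a one-third reduction in the odds of late presentation:
aor_per_year = 0.96
cumulative = aor_per_year ** 11
```

A per-year aOR close to 1 can therefore still mean a substantial cumulative shift over a decade, which is why the abstract reports both the endpoint percentages and the per-year trend.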

Relevance: 100.00%

Abstract:

ABSTRACT. Here we present datasets from a hydroacoustic survey in July 2011 at Lake Torneträsk, northern Sweden. Our hydroacoustic data exhibit lake-floor morphologies formed by glacial erosion and accumulation processes, insights into lacustrine sediment accumulation since the beginning of deglaciation, and information on seismic activity along the Pärvie Fault. Features of glacial scouring with a high-energy relief, steep slopes, and relative reliefs of more than 50 m are observed in the large western basin. The remainder of the lacustrine subsurface appears to host a broad variety of well-preserved formations from glacial accumulation related to the last retreat of the Fennoscandian ice sheet. Deposition of glaciolacustrine and lacustrine sediments is focused in areas situated in proximity to major inlets. Sediment accumulation in distal areas of the lake seldom exceeds 2 m or is not observable. We assume that the lack of sediment deposition in the lake is a result of different factors, including low rates of erosion in the catchment, a previously high lake level leading to deposition of sediments in higher-elevated paleodeltas, tributaries carrying low suspension loads as a result of sedimentation in upstream lakes, and an overall low productivity in the lake. A clear offshore trace of the Pärvie Fault could not be detected from our hydroacoustic data. However, an absence of sediment disturbance in close proximity to the presumed fault trace implies minimal seismic activity since deposition of the glaciolacustrine and lacustrine sediments.

Relevance: 100.00%

Abstract:

Glacier highstands since the Last Glacial Maximum are well documented for many regions, but little is known about glacier fluctuations and lowstands during the Holocene. This is because the traces of minimum extents are difficult to identify and at many places are still ice covered, limiting access to sample material. Here we report a new approach to assessing minimal glacier extent, using a 72-m-long surface-to-bedrock ice core drilled on Khukh Nuru Uul, a glacier in the Tsambagarav mountain range of the Mongolian Altai (4130 m asl, 48°39.338′N, 90°50.826′E). The small ice cap has low ice temperatures and flat bedrock topography at the drill site. This indicates minimal lateral glacier flow and thereby preserved climate signals. The upper two-thirds of the ice core contain 200 years of climate information with annual resolution, whereas the lower third is subject to strong thinning of the annual layers, with a basal ice age of approximately 6000 years before present (BP). We interpret the basal ice age as indicative of ice-free conditions in the Tsambagarav mountain range at 4100 m asl prior to 6000 years BP. This age marks the onset of the Neoglaciation and the end of the Holocene Climate Optimum. The ice-free conditions allow for adjusting the Equilibrium Line Altitude (ELA) and deriving the glacier extent in the Mongolian Altai during the Holocene Climate Optimum. Based on the ELA shift, we conclude that most of the glaciers are not remnants of the Last Glacial Maximum but were formed during the second part of the Holocene. The ice-core-derived accumulation reconstruction suggests important changes in the precipitation pattern over the last 6000 years. During formation of the glacier, conditions were more humid than at present, followed by a long dry period from 5000 years BP until 250 years ago. Present conditions are more humid than during the past millennia. This is consistent with the precipitation evolution derived from lake-sediment studies in the Altai.

Relevance: 100.00%

Abstract:

In terms of changing flow and sediment regimes of rivers, dams are often regarded as the most dominant form of human impact on fluvial systems. Dams can decrease the flux of water and sediments, leading to channel changes such as upstream aggradation and downstream degradation. The opposite effects occur when dams are removed. Channel degradation often requires further intervention in terms of river bed and bank protection works. The situation becomes more complex in river systems that are impacted by a series of dams, due to feedback processes between the different system compartments. A number of studies have recently investigated geomorphic systems using connectivity approaches to improve the understanding of geomorphic system response to change. This paper presents a case study investigating the impact of dam construction, dam removal and dam-related river bed and bank protection measures on the sediment connectivity and channel morphology of the Fugnitz and the Kaja Rivers using a combination of DEM analyses, field surveys and landscape evolution modelling. For both river systems the results revealed low sediment connectivity, accompanied by a fine river bed sediment facies, in river sections upstream of active dams and of removed dams with protection measures. Conversely, high sediment connectivity, accompanied by a coarse river bed sediment facies, was observed in river sections located downstream either of active dams or of removed dams with upstream protection. In terms of channel changes, significant channel degradation was observed at locations downstream of active dams and of removed dams. Channel bed and bank protection measures prevent erosion and channel slope recovery after dam removal. Landscape evolution modelling revealed a complex geomorphic response to dam construction and dam removal, as sediment output rates and therefore geomorphic processes have been shown to act in a non-linear manner.
These insights are deemed to have major implications for river management and conservation, as quality and state of riverine habitats are determined by channel morphology and river bed sediment composition.

Relevance: 100.00%

Abstract:

•Symbioses between plant roots and mycorrhizal fungi are thought to enhance plant uptake of nutrients through a favourable exchange for photosynthates. Ectomycorrhizal fungi are considered to play this vital role for trees in nitrogen (N)-limited boreal forests. •We followed symbiotic carbon (C)–N exchange in a large-scale boreal pine forest experiment by tracing 13CO2 absorbed through tree photosynthesis and 15N injected into a soil layer in which ectomycorrhizal fungi dominate the microbial community. •We detected little 15N in tree canopies, but high levels in soil microbes and in mycorrhizal root tips, illustrating effective soil N immobilization, especially in late summer, when tree belowground C allocation was high. Additions of N fertilizer to the soil before labelling shifted the incorporation of 15N from soil microbes and root tips to tree foliage. •These results were tested in a model for C–N exchange between trees and mycorrhizal fungi, suggesting that ectomycorrhizal fungi transfer small fractions of absorbed N to trees under N-limited conditions, but larger fractions if more N is available. We suggest that greater allocation of C from trees to ectomycorrhizal fungi increases N retention in soil mycelium, driving boreal forests towards more severe N limitation at low N supply.

Relevance: 100.00%

Abstract:

Rainfall controls fire in tropical savanna ecosystems through impacting both the amount and flammability of plant biomass, and consequently, predicted changes in tropical precipitation over the next century are likely to have contrasting effects on the fire regimes of wet and dry savannas. We reconstructed the long-term dynamics of biomass burning in equatorial East Africa, using fossil charcoal particles from two well-dated lake-sediment records in western Uganda and central Kenya. We compared these high-resolution (5 years/sample) time series of biomass burning, spanning the last 3800 and 1200 years, with independent data on past hydroclimatic variability and vegetation dynamics. In western Uganda, a rapid (<100 years) and permanent increase in burning occurred around 2170 years ago, when climatic drying replaced semideciduous forest by wooded grassland. At the century time scale, biomass burning was inversely related to moisture balance for much of the next two millennia until ca. 1750 AD, when burning increased strongly despite the regional climate becoming wetter. A sustained decrease in burning since the mid-20th century reflects the intensified modern-day landscape conversion into cropland and plantations. In contrast, in semiarid central Kenya, biomass burning peaked at intermediate moisture-balance levels, whereas it was lower during both the wettest and driest multidecadal periods of the last 1200 years. Here, burning has steadily increased since the mid-20th century, presumably due to more frequent deliberate ignitions for bush clearing and cattle ranching. Both the observed historical trends and regional contrasts in biomass burning are consistent with spatial variability in fire regimes across the African savanna biome today. They demonstrate the strong dependence of East African fire regimes on both climatic moisture balance and vegetation, and the extent to which this dependence is now being overridden by anthropogenic activity.

Relevance: 100.00%

Abstract:

Recently, many studies about a network active during rest and deactivated during tasks have emerged in the literature: the default mode network (DMN). Spatial and temporal DMN features are important markers for psychiatric diseases. Another prominent indicator of cognitive functioning, yielding information about the mental condition in health and disease, is working memory (WM) processing. In EEG studies, frontal-midline theta power has been shown to increase with load during WM retention in healthy subjects. From these findings, the conclusion can be drawn that an increase in resting-state DMN activity may go along with an increase in theta power in high-load WM conditions. We followed this hypothesis in a study on 17 healthy subjects performing a visual Sternberg WM task. The DMN was obtained by a BOLD-ICA approach and its dynamics represented by the percent strength during pre-stimulus periods. DMN dynamics were temporally correlated with EEG theta spectral power from retention intervals. This so-called covariance mapping yielded the spatial distribution of the theta EEG fluctuations associated with the dynamics of the DMN. In line with previous findings, theta power was increased at frontal-midline electrodes in high- versus low-load conditions during early WM retention. However, load-dependent correlations of DMN with theta power resulted in primarily positive correlations in low-load conditions, while during high-load conditions negative correlations of DMN activity and theta power were observed at frontal-midline electrodes. This DMN-dependent load effect reached significance during later retention. Our results show a complex and load-dependent interaction of pre-stimulus DMN activity and theta power during retention, varying over the course of the retention period. Since both WM performance and DMN activity are markers of mental health, our results could be important for further investigations of psychiatric populations.

Relevance: 100.00%

Abstract:

Recently, multiple studies showed that spatial and temporal features of a task-negative default mode network (DMN) (Greicius et al., 2003) are important markers for psychiatric diseases (Balsters et al., 2013). Another prominent indicator of cognitive functioning, yielding information about the mental condition in health and disease, is working memory (WM) processing. In EEG and MEG studies, frontal-midline theta power has been shown to increase with load during WM retention in healthy subjects (Brookes et al., 2011). Negative correlations between DMN activity and theta amplitude have been found during resting state (Jann et al., 2010) as well as during WM (Michels et al., 2010). Likewise, WM training resulted in higher resting-state theta power as well as increased small-worldness of the resting brain (Langer et al., 2013). Further, increased fMRI connectivity between nodes of the DMN correlated with better WM performance (Hampson et al., 2006). Hence, the brain's default state might influence its functioning during task. We therefore hypothesized correlations between pre-stimulus DMN activity and EEG theta power during WM maintenance, depending on the WM load. 17 healthy subjects performed a Sternberg WM task while being measured simultaneously with EEG and fMRI. Data were recorded within a multicenter study: 12 subjects were measured in Zurich with a 64-channel MR-compatible system (Brain Products) in a 3T Philips scanner, and 5 subjects with a 96-channel MR-compatible system (Brain Products) in a 3T Siemens scanner in Bern. The DMN component was obtained by a group BOLD-ICA approach over the full task duration (figure 1). The subject-wise dynamics were obtained by back-reconstruction onto each subject's fMRI data and normalized to percent signal change values. The single-trial pre-stimulus DMN activation was then temporally correlated with the single-trial EEG theta (3-8 Hz) spectral power during retention intervals.
This so-called covariance mapping (Jann et al., 2010) yielded the spatial distribution of the theta EEG fluctuations during retention associated with the dynamics of the pre-stimulus DMN. In line with previous findings, theta power was increased at frontal-midline electrodes in high- versus low-load conditions during early WM retention (figure 2). However, correlations of DMN with theta power resulted in primarily positive correlations in low-load conditions, while during high-load conditions negative correlations of DMN activity and theta power were observed at frontal-midline electrodes. This DMN-dependent load effect reached significance in the middle of the retention period (TANOVA, p<0.05) (figure 3). Our results show a complex and load-dependent interaction of pre-stimulus DMN activity and theta power during retention, varying over time. While at a more global, load-independent view pre-stimulus DMN activity correlated positively with theta power during retention, the correlation was inverted during certain time windows in high-load trials, meaning that in trials with enhanced pre-stimulus DMN activity theta power decreased during retention. Since both WM performance and DMN activity are markers of mental health, our results could be important for further investigations of psychiatric populations.
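Covariance mapping, as described above, amounts to correlating single-trial pre-stimulus DMN activity with single-trial theta power across trials, separately at each electrode. A minimal numpy sketch under that reading (array names and the toy data are illustrative, not from the study):

```python
import numpy as np

def covariance_map(dmn, theta):
    """Correlate pre-stimulus DMN activity (n_trials,) with
    retention-interval theta power (n_trials, n_channels),
    returning one Pearson r per electrode."""
    dmn_c = dmn - dmn.mean()
    theta_c = theta - theta.mean(axis=0)
    num = dmn_c @ theta_c
    den = np.sqrt((dmn_c ** 2).sum() * (theta_c ** 2).sum(axis=0))
    return num / den

# Toy data: channel 0 scales with DMN activity, channel 1 opposes it
rng = np.random.default_rng(0)
dmn = rng.standard_normal(100)
theta = np.stack([2 * dmn + 1, -dmn + 0.5], axis=1)
r = covariance_map(dmn, theta)  # r[0] near +1, r[1] near -1
```

The resulting per-electrode r values are what gets plotted as a scalp map; the sign flip between low- and high-load conditions reported above would show up as a sign flip in r at frontal-midline channels.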

Relevance: 100.00%

Abstract:

Background. Cryptococcal meningitis is a leading cause of death in people living with human immunodeficiency virus (HIV)/acquired immune deficiency syndrome. The World Health Organization recommends pre-antiretroviral treatment (ART) cryptococcal antigen (CRAG) screening in persons with CD4 below 100 cells/µL. We assessed the prevalence and outcome of cryptococcal antigenemia in rural southern Tanzania. Methods. We conducted a retrospective study including all ART-naive adults with CD4 <150 cells/µL prospectively enrolled in the Kilombero and Ulanga Antiretroviral Cohort between 2008 and 2012. Cryptococcal antigen was assessed in cryopreserved pre-ART plasma. Cox regression estimated the composite outcome of death or loss to follow-up (LTFU) by CRAG status and fluconazole use. Results. Of 750 ART-naive adults, 28 (3.7%) were CRAG-positive, corresponding to a prevalence of 4.4% (23 of 520) in CD4 <100 and 2.2% (5 of 230) in CD4 100-150 cells/µL. Within 1 year, 75% (21 of 28) of CRAG-positive and 42% (302 of 722) of CRAG-negative patients were dead or LTFU (P < .001), with no differences across CD4 strata. Cryptococcal antigen positivity was an independent predictor of death or LTFU after adjusting for relevant confounders (hazard ratio [HR], 2.50; 95% confidence interval [CI], 1.29-4.83; P = .006). Cryptococcal meningitis occurred in 39% (11 of 28) of CRAG-positive patients, with similar retention-in-care regardless of meningitis diagnosis (P = .8). Cryptococcal antigen titer >1:160 was associated with meningitis development (odds ratio, 4.83; 95% CI, 1.24-8.41; P = .008). Fluconazole receipt decreased death or LTFU in CRAG-positive patients (HR, 0.18; 95% CI, .04-.78; P = .022). Conclusions. Cryptococcal antigenemia predicted mortality or LTFU among ART-naive HIV-infected persons with CD4 <150 cells/µL, and fluconazole increased survival or retention-in-care, suggesting that targeted pre-ART CRAG screening may decrease early mortality or LTFU.
A CRAG screening threshold of CD4 <100 cells/µL missed 18% of CRAG-positive patients, suggesting guidelines should consider a higher threshold.
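The 18% figure in the closing sentence follows directly from the stratified counts reported above:

```python
# CRAG-positive counts and denominators by CD4 stratum (from the abstract)
crag_pos_lt100, n_lt100 = 23, 520
crag_pos_100_150, n_100_150 = 5, 230

prev_lt100 = 100 * crag_pos_lt100 / n_lt100        # ~4.4% in CD4 <100
prev_100_150 = 100 * crag_pos_100_150 / n_100_150  # ~2.2% in CD4 100-150

# Share of all CRAG-positive patients missed by a CD4 <100 threshold:
missed = 100 * crag_pos_100_150 / (crag_pos_lt100 + crag_pos_100_150)
```

Screening only below CD4 100 cells/µL leaves the 5 of 28 antigenemic patients in the 100-150 stratum undetected, which is the basis for the suggested higher threshold.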

Relevance: 100.00%

Abstract:

The main goals of this study were to identify the alpine torrent catchments that are sensitive to climatic changes and to assess the robustness of the methods for the elaboration of flood and debris-flow hazard zone maps to specific effects of climate change. In this study, a procedure for the identification and localization of torrent catchments in which climate scenarios will modify the hazard situation was developed. In two case studies, the impacts of a potential increase of precipitation intensities on the delimited hazard zones were studied. The identification and localization of the torrent and river catchments where unfavourable changes in the hazard situation occur could eliminate speculative and unnecessary measures against the impacts of climate change, such as a general enlargement of hazard zones or a general over-dimensioning of protection structures for the whole territory. The results showed a high spatial variability of the sensitivity of catchments to climate changes. In sensitive catchments, sediment management in alpine torrents will face future challenges due to a higher rate of sediment removal from retention basins. The case studies showed a remarkable increase of the areas affected by floods and debris flows when considering possible future precipitation intensities in hazard mapping. However, the calculated increase in extent of future hazard zones lay within the uncertainty of the methods used today for the delimitation of hazard zones. Thus, consideration of the uncertainties inherent in the methods for the elaboration of hazard zone maps in the torrent and river catchments sensitive to climate changes would provide a useful instrument for the consideration of potential future climate conditions. The study demonstrated that weak points in protection structures will become more important in future risk management activities.

Relevance: 100.00%

Abstract:

• Background and Aims The uptake, translocation and redistribution of the heavy metals zinc, manganese, nickel, cobalt and cadmium are relevant for plant nutrition as well as for the quality of harvested plant products. The long-distance transport of these heavy metals within the root system and the release to the shoot in young wheat (Triticum aestivum ‘Arina’) plants were investigated. • Methods After the application of 65Zn, 54Mn, 63Ni, 57Co and 109Cd for 24 h to one seminal root (the other seminal roots being excised) of 54-h-old wheat seedlings, the labelled plants were incubated for several days in hydroponic culture on a medium without radionuclides. • Key Results The content of 65Zn decreased quickly in the labelled part of the root. After the transfer of 65Zn from the roots to the shoot, a further redistribution in the phloem from older to younger leaves was observed. In contrast to 65Zn, 109Cd was released more slowly from the roots to the leaves and was subsequently redistributed in the phloem to the youngest leaves only at trace levels. The content of 63Ni decreased quickly in the labelled part of the root, moving to the newly formed parts of the root system and also accumulating transiently in the expanding leaves. The 54Mn content decreased quickly in the labelled part of the root and increased simultaneously in leaf 1. A strong retention in the labelled part of the root was observed after supplying 57Co. • Conclusions The dynamics of redistribution of 65Zn, 54Mn, 63Ni, 57Co and 109Cd differed considerably. The rapid redistribution of 63Ni from older to younger leaves throughout the experiment indicated a high mobility in the phloem, while 54Mn was mobile only in the xylem and 57Co was retained in the labelled root without being loaded into the xylem.

Relevance: 100.00%

Abstract:

BACKGROUND: To date, an estimated 10% of children eligible for antiretroviral treatment (ART) receive it, and the frequency of retention in programs is unknown. We evaluated the 2-year risks of death and loss to follow-up (LTFU) of children after ART initiation in a multicenter study in sub-Saharan Africa. METHODS: Pooled analysis of routine individual data from 16 participating clinics produced overall Kaplan-Meier estimates of the probabilities of death or LTFU after ART initiation. Risk factor analysis used Weibull regression, accounting for between-cohort heterogeneity. RESULTS: The median age of the 2405 children at ART initiation was 4.9 years (12% younger than 12 months), 52% were male, 70% had severe immunodeficiency, and 59% started ART with a nonnucleoside reverse transcriptase inhibitor. The 2-year risk of death after ART initiation was 6.9% (95% confidence interval [CI]: 5.9 to 8.1), independently associated with baseline severe anemia (adjusted hazard ratio [aHR]: 4.10 [CI: 2.36 to 7.13]), immunodeficiency (aHR: 2.95 [CI: 1.49 to 5.82]), and severe clinical status (aHR: 3.64 [CI: 1.95 to 6.81]); the 2-year risk of LTFU was 10.3% (CI: 8.9 to 11.9), higher in children with severe clinical status. CONCLUSIONS: Once on treatment, the 2-year risk of death is low, but the LTFU risk is substantial. ART is still mainly initiated at an advanced disease stage in African children, reinforcing the need for early HIV diagnosis, early initiation of ART, and procedures to increase program retention.