905 results for Embryonic Mortality, Fungal Infection, Habitat Choice, Soil pH, Terrestrial Nesting
Abstract:
BACKGROUND Current guidelines give recommendations for preferred combination antiretroviral therapy (cART). We investigated factors influencing the choice of initial cART in clinical practice and its outcome. METHODS We analyzed treatment-naive adults with human immunodeficiency virus (HIV) infection participating in the Swiss HIV Cohort Study who started cART from January 1, 2005, through December 31, 2009. The primary end point was the choice of the initial antiretroviral regimen. Secondary end points were virologic suppression, the increase in CD4 cell counts from baseline, and treatment modification within 12 months after starting treatment. RESULTS A total of 1957 patients were analyzed. Tenofovir-emtricitabine (TDF-FTC)-efavirenz was the most frequently prescribed cART (29.9%), followed by TDF-FTC-lopinavir/r (16.9%), TDF-FTC-atazanavir/r (12.9%), zidovudine-lamivudine (ZDV-3TC)-lopinavir/r (12.8%), and abacavir-lamivudine (ABC-3TC)-efavirenz (5.7%). Prescription patterns differed among Swiss HIV Cohort Study sites (P < .001). In multivariate analysis, compared with TDF-FTC-efavirenz, starting TDF-FTC-lopinavir/r was associated with prior AIDS (relative risk ratio, 2.78; 95% CI, 1.78-4.35), HIV-RNA greater than 100 000 copies/mL (1.53; 1.07-2.18), and CD4 cell counts greater than 350 cells/μL (1.67; 1.04-2.70); TDF-FTC-atazanavir/r with a depressive disorder (1.77; 1.04-3.01), HIV-RNA greater than 100 000 copies/mL (1.54; 1.05-2.25), and an opiate substitution program (2.76; 1.09-7.00); and ZDV-3TC-lopinavir/r with female sex (3.89; 2.39-6.31) and CD4 cell counts greater than 350 cells/μL (4.50; 2.58-7.86). At 12 months, 1715 patients (87.6%) had achieved a viral load of less than 50 copies/mL, and CD4 cell counts had increased by a median (interquartile range) of 173 (89-269) cells/μL. Virologic suppression was more likely with TDF-FTC-efavirenz, and the CD4 increase was higher with ZDV-3TC-lopinavir/r. No differences in outcome were observed among Swiss HIV Cohort Study sites. CONCLUSIONS Large differences in prescription, but not in outcome, were observed among study sites. A trend toward individualized cART was noted, suggesting that initial cART is significantly influenced by physician preference and patient characteristics. Our study highlights the need for evidence-based data to determine the best initial regimen for different HIV-infected persons.
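The relative risk ratios quoted above, each alternative regimen compared against the TDF-FTC-efavirenz reference, are the exponentiated coefficients of a multinomial logistic regression. Below is a minimal sketch of that kind of analysis, assuming statsmodels and an entirely hypothetical patient-level table (the DataFrame, column names, and data are invented, not the cohort's):

```python
# Hedged sketch: relative risk ratios for the choice of initial regimen via
# multinomial logistic regression. All data and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    # 0 = TDF-FTC-efavirenz (reference), 1-3 = alternative regimens
    "regimen": rng.integers(0, 4, n),
    "prior_aids": rng.integers(0, 2, n),
    "rna_gt_100k": rng.integers(0, 2, n),  # HIV-RNA > 100 000 copies/mL
    "cd4_gt_350": rng.integers(0, 2, n),   # CD4 > 350 cells/uL
})

X = sm.add_constant(df[["prior_aids", "rna_gt_100k", "cd4_gt_350"]])
fit = sm.MNLogit(df["regimen"], X).fit(disp=False)

# Exponentiated coefficients are the relative risk ratios vs. the reference.
print(np.exp(fit.params))
```

With real cohort data, each column of the output would correspond to one alternative regimen, yielding figures like the 2.78 reported for prior AIDS with TDF-FTC-lopinavir/r.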
Abstract:
Acer saccharum Marsh. (sugar maple) is one of the most valuable trees in the northern hardwood forests. Severe dieback was recently reported by area foresters in the western Upper Great Lakes Region. Sugar maple has had a history of dieback over the last 100 years throughout its range, and different variables have been identified as the predisposing and inciting factors in different regions at different times. Some of the most common factors attributed to previous maple dieback episodes were insect defoliation outbreaks, inadequate precipitation, poor soils, atmospheric deposition, fungal pathogens, poor management, or a combination of these. The current sugar maple dieback was evaluated to determine its etiology, severity, and change over time on both industry and public lands. A network of 120 sugar maple health evaluation plots was established in the Upper Peninsula of Michigan, northern Wisconsin, and eastern Minnesota and evaluated annually from 2009 to 2012. Mean sugar maple crown dieback over this period was 12.4% (range, 0.8-75.5%) across the region. Overall, mean dieback decreased by 5% during the sampling period, but individual plots and trees continued to decline. Relationships were examined between sugar maple dieback and growth, habitat conditions, ownership, climate, soil, foliage nutrients, and the maple pathogen sapstreak. The only statistically significant factor was a high level of forest-floor disturbance due to exotic earthworm activity. Sugar maple on soils with lower pH had less earthworm impact, less dieback, and higher growth rates than those on soils more favorable to earthworms. The nutritional status of foliage and soil was correlated with dieback and growth, suggesting that perturbation of nutrient cycling may be predisposing trees to, or contributing to, dieback. The previous winter's snowfall totals, the length of time snow stayed on the ground, and the number of days with freezing temperatures had a significant positive relationship with sugar maple growth rates. Sapstreak disease, caused by Ceratocystis virescens, may be contributing to dieback in some stands but was not related to the amount of dieback in the region. The ultimate goal of this research is to help forest managers in the Great Lakes Region anticipate, prevent, reduce, and/or salvage stands with dieback and loss in the future. An improved understanding of the complex etiology associated with sugar maple dieback in the Upper Great Lakes Region is necessary to make appropriate silvicultural decisions. Forest health education helps increase awareness and promotes proactive forest management in the face of changing forest ecosystems. Lessons are included to assist educators in incorporating forest health into standard biological disciplines in secondary school curricula.
Abstract:
Experience with anidulafungin against Candida krusei is limited. Immunosuppressed mice were injected with 1.3 × 10⁷ to 1.5 × 10⁷ CFU of C. krusei. Animals were treated for 5 days with saline, 40 mg/kg fluconazole, 1 mg/kg amphotericin B, or 10 or 20 mg/kg anidulafungin. Anidulafungin improved survival and significantly reduced both the fungal burden (CFU/g) in the kidneys and serum beta-glucan levels.
Abstract:
Cirrhotic patients with chronic hepatitis C virus (HCV) infection remain at risk for complications following sustained virological response (SVR). Therefore, we aimed to evaluate treatment efficacy with the number needed to treat (NNT) to prevent clinical endpoints. Mortality and cirrhosis-related morbidity were assessed in an international multicentre cohort of consecutively treated patients with HCV genotype 1 infection and cirrhosis. The NNT to prevent death or clinical disease progression (any cirrhosis-related event or death) in one patient was determined from the adjusted (event-free) survival among patients without SVR and the adjusted hazard ratio of SVR. Overall, 248 patients were followed for a median of 8.3 (IQR 6.2-11.1) years. Fifty-nine (24%) patients attained SVR. Among patients without SVR, the adjusted 5-year survival and event-free survival were 94.4% and 80.0%, respectively. SVR was associated with reduced all-cause mortality (HR 0.15, 95% CI 0.05-0.48, P = 0.002) and clinical disease progression (HR 0.16, 95% CI 0.07-0.36, P < 0.001). The NNT to prevent one death in 5 years declined from 1052 (95% CI 937-1755) at 2% SVR (interferon monotherapy) to 61 (95% CI 54-101) at 35% SVR (peginterferon and ribavirin). At 50% SVR, which might be expected with triple therapy, the estimated NNT was 43 (95% CI 38-71). The NNT to prevent clinical disease progression in one patient in 5 years was 302 (95% CI 271-407), 18 (95% CI 16-24) and 13 (95% CI 11-17) at 2%, 35% and 50% SVR, respectively. In conclusion, the NNT to prevent clinical endpoints among cirrhotic patients with HCV genotype 1 has declined enormously with the improvement of antiviral therapy.
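The NNT arithmetic can be reproduced from the figures quoted in the abstract. A worked sketch follows, under the assumption (consistent with the reported numbers) that adjusted survival with SVR is modeled as the no-SVR survival raised to the hazard ratio, and that the NNT at an SVR rate p is 1/(p × absolute risk reduction):

```python
# Worked NNT calculation using the figures quoted in the abstract.
# Assumption: adjusted survival with SVR = S0 ** HR (proportional hazards).

def nnt(s0, hr, svr_rate):
    """NNT to prevent one event in 5 years at a given SVR rate."""
    arr = svr_rate * (s0 ** hr - s0)  # absolute risk reduction per treated patient
    return 1.0 / arr

S0_DEATH, HR_DEATH = 0.944, 0.15  # 5-y survival without SVR; HR of SVR
S0_EVENT, HR_EVENT = 0.800, 0.16  # 5-y event-free survival; HR of SVR

for p in (0.02, 0.35, 0.50):
    print(f"SVR {p:.0%}: NNT(death) = {nnt(S0_DEATH, HR_DEATH, p):.0f}, "
          f"NNT(progression) = {nnt(S0_EVENT, HR_EVENT, p):.0f}")
# Output closely matches the reported 1052/61/43 and 302/18/13 (rounding aside).
```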
Abstract:
Fungal plant pathogens are common in natural communities where they affect plant physiology, plant survival, and biomass production. Conversely, pathogen transmission and infection may be regulated by plant community characteristics such as plant species diversity and functional composition that favor pathogen diversity through increases in host diversity while simultaneously reducing pathogen infection via increased variability in host density and spatial heterogeneity. Therefore, a comprehensive understanding of multi-host multi-pathogen interactions is of high significance in the context of biodiversity-ecosystem functioning. We investigated the relationship between plant diversity and aboveground obligate parasitic fungal pathogen ("pathogens" hereafter) diversity and infection in grasslands of a long-term, large-scale, biodiversity experiment with varying plant species (1-60 species) and plant functional group diversity (1-4 groups). To estimate pathogen infection of the plant communities, we visually assessed pathogen-group presence (i.e., rusts, powdery mildews, downy mildews, smuts, and leaf-spot diseases) and overall infection levels (combining incidence and severity of each pathogen group) in 82 experimental plots on all aboveground organs of all plant species per plot during four surveys in 2006. Pathogen diversity, assessed as the cumulative number of pathogen groups on all plant species per plot, increased log-linearly with plant species diversity. However, pathogen incidence and severity, and hence overall infection, decreased with increasing plant species diversity. In addition, co-infection of plant individuals by two or more pathogen groups was less likely with increasing plant community diversity. We conclude that plant community diversity promotes pathogen-community diversity while at the same time reducing pathogen infection levels of plant individuals.
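As an illustration of the log-linear relationship reported above, here is a sketch with invented numbers (the per-plot data are not given in the abstract; only the 1-60 species range is):

```python
# Illustrative only: fitting pathogen-group richness as a linear function of
# the logarithm of plant species richness. The data points below are invented.
import numpy as np

plant_species = np.array([1, 2, 4, 8, 16, 60])   # sown richness levels (1-60)
pathogen_groups = np.array([1, 2, 2, 3, 4, 5])   # hypothetical cumulative counts

slope, intercept = np.polyfit(np.log(plant_species), pathogen_groups, 1)
print(f"pathogen groups ~= {intercept:.2f} + {slope:.2f} * ln(plant species)")
```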
Abstract:
BACKGROUND High early mortality in patients with HIV-1 starting antiretroviral therapy (ART) in sub-Saharan Africa, compared to Europe and North America, is well documented. Longer-term comparisons between settings have been limited by poor ascertainment of mortality in high burden African settings. This study aimed to compare mortality up to four years on ART between South Africa, Europe, and North America. METHODS AND FINDINGS Data from four South African cohorts in which patients lost to follow-up (LTF) could be linked to the national population register to determine vital status were combined with data from Europe and North America. Cumulative mortality, crude and adjusted (for characteristics at ART initiation) mortality rate ratios (relative to South Africa), and predicted mortality rates were described by region at 0-3, 3-6, 6-12, 12-24, and 24-48 months on ART for the period 2001-2010. Of the adults included (30,467 [South Africa], 29,727 [Europe], and 7,160 [North America]), 20,306 (67%), 9,961 (34%), and 824 (12%) were women. Patients began treatment with markedly more advanced disease in South Africa (median CD4 count 102, 213, and 172 cells/µl in South Africa, Europe, and North America, respectively). High early mortality after starting ART in South Africa occurred mainly in patients starting ART with CD4 count <50 cells/µl. Cumulative mortality at 4 years was 16.6%, 4.7%, and 15.3% in South Africa, Europe, and North America, respectively. Mortality was initially much lower in Europe and North America than South Africa, but the differences were reduced or reversed (North America) at longer durations on ART (adjusted rate ratios 0.46, 95% CI 0.37-0.58, and 1.62, 95% CI 1.27-2.05 between 24 and 48 months on ART comparing Europe and North America to South Africa). While bias due to under-ascertainment of mortality was minimised through death registry linkage, residual bias could still be present due to differing approaches to and frequency of linkage. CONCLUSIONS After accounting for under-ascertainment of mortality, with increasing duration on ART, the mortality rate on HIV treatment in South Africa declines to levels comparable to or below those described in participating North American cohorts, while substantially narrowing the differential with the European cohorts.
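For readers unfamiliar with the rate-ratio scale used above: a crude mortality rate ratio is simply the ratio of deaths per person-year between two regions over the same interval on ART, before adjustment for baseline characteristics. A toy example with invented numbers:

```python
# Toy example (invented numbers): crude mortality rate ratio between two
# regions over the same window on ART, before covariate adjustment.
deaths_region_a, person_years_a = 120, 25_000.0
deaths_region_b, person_years_b = 300, 28_000.0

rate_a = deaths_region_a / person_years_a
rate_b = deaths_region_b / person_years_b
print(f"crude mortality rate ratio (A vs B): {rate_a / rate_b:.2f}")
```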
Abstract:
Candida albicans is the most common opportunistic fungal pathogen of humans. The balance between commensal and pathogenic C. albicans is maintained largely by phagocytes of the innate immune system. Analysis of transcriptional changes after macrophage phagocytosis indicates the C. albicans response is broadly similar to starvation, including up-regulation of alternate carbon metabolism. Systems known and suspected to be part of acetate/acetyl-CoA metabolism were also up-regulated, notably the ACH and ACS genes, which manage acetate/acetyl-CoA interconversion, and the nine-member ATO gene family, thought to participate in transmembrane acetate transport and also linked to the process of environmental alkalinization. Studies of the roles of Ach, Acs1, and Acs2 in alternate carbon metabolism revealed a substantial role for Acs2 and lesser, but distinct, roles for Ach and Acs1. Deletion mutants were made in C. albicans and evaluated phenotypically both in vitro and in vivo. Loss of Ach function resulted in mild growth defects on ethanol and acetate and no significant attenuation of virulence in a disseminated mouse model of infection. While loss of Acs1 did not produce any significant phenotypes, loss of Acs2 greatly impaired growth on multiple carbon sources, including glucose, ethanol, and acetate. We also concluded that ACS1 and ACS2 likely comprise an essential gene pair. Expression analyses indicated that ACS2 is the predominant form under most growth conditions. ATO gene function had been linked to the process of environmental alkalinization, an ammonium-mediated phenomenon described here for the first time in C. albicans. During growth in glucose-poor, amino acid-rich conditions, C. albicans can rapidly raise its extracellular pH. This process was glucose-repressible and was accompanied by hyphal formation and changes in colony morphology. We showed that introduction of the ATO1 G53D point mutant into C. albicans blocked alkalinization, as did over-expression of C. albicans ATO2, the only C. albicans ATO gene to lack the conserved N-terminal domain. A screen for alkalinization-deficient mutants revealed that ACH1 is essential for alkalinization; however, addition of acetate to the media restored alkalinization in the ach1 mutant. We propose a model of ATO function in which Ato proteins regulate the cellular co-export of ammonium and acetate.
Abstract:
Disseminated Mycobacterium avium complex (dMAC) infection is the third most prevalent opportunistic infection in AIDS patients. To understand the role MAC infection plays in the survival of AIDS patients, a cohort of 203 veterans with suspected dMAC seen at the Houston Veterans Affairs Medical Center between August 14, 1987 and December 31, 1991 was analyzed. The criteria for suspected dMAC infection were HIV-positive men with a CD4+ level ≤200 cells/mm³, on zidovudine treatment for ≥1 month, who had any of the following: (a) a confirmed respiratory MAC infection, (b) fever ≥101°F for ≥48 hours, (c) unexplained weight loss of 10 lbs or ≥10% of body weight over 3 months, or (d) Hgb ≤7.5 g/dl or a decrease in Hgb of ≥3.0 g/dl while on 500-600 mg/day AZT. The study was conducted before the commencement of effective anti-mycobacterial therapy for MAC, so the true course of MAC infection was observed without the confounder of a therapeutic regimen. Kaplan-Meier and Cox regression survival analyses were used to compare 45 MAC culture-positive and 118 MAC culture-negative veterans. The 1-year survival rate of veterans with documented dMAC infection was 0.37, compared with 0.50 for veterans not acquiring dMAC infection. Significant differences between subgroups were also seen for the variables PCP prophylaxis, the AIDS indicator disease Candida esophagitis, CD4+ lymphocyte level, CD4 lymphocyte percentage, WBC level, and Hgb and Hct levels. Using multivariate modeling, it was determined that PCP prophylaxis (RR = 6.12, CI 2.24-16.68) was a predictor of survival, and that both a CD4 lymphocyte percentage ≤6.0% (RR = 0.33, CI 0.17-0.68) and a WBC level ≤3000 cells/mm³ (RR = 0.60, CI 0.39-0.93) were predictors of mortality. A CD4+ level ≤50 cells/mm³ was not a significant predictor of mortality. Although MAC culture status was a significant predictor of mortality in the univariate model, a positive dMAC culture was not a significant predictor of AIDS mortality in the multivariate model. A positive dMAC culture did, however, affect mortality in a stratified analysis when baseline laboratory values were CD8+ lymphocytes >600 cells/mm³, Hgb >11.0 g/dl, Hct >31.0%, and WBC >3000 cells/mm³.
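A minimal sketch of the survival comparison described above, assuming the lifelines library and a hypothetical patient table (columns and values invented; the actual analysis used the cohort's clinical data):

```python
# Hedged sketch: Kaplan-Meier curves by MAC culture status plus a Cox model.
# The DataFrame is hypothetical; lifelines is an assumed tooling choice.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "months":       [3, 8, 12, 15, 20, 24, 30, 36, 10, 18],
    "died":         [1, 1,  1,  0,  1,  0,  1,  0,  1,  0],
    "mac_positive": [1, 1,  0,  1,  0,  0,  1,  0,  1,  0],
})

# Kaplan-Meier survival within each culture-status group.
kmf = KaplanMeierFitter()
for status, group in df.groupby("mac_positive"):
    kmf.fit(group["months"], group["died"], label=f"MAC culture positive={status}")
    print(kmf.median_survival_time_)

# Cox proportional-hazards model with culture status as the covariate.
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")
cph.print_summary()  # hazard ratios with 95% confidence intervals
```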
Abstract:
Ocean acidification, caused by increased atmospheric carbon dioxide (CO2) concentrations, is currently an important environmental problem. It is therefore necessary to investigate the effects of ocean acidification on all life stages of a wide range of marine organisms. However, few studies have examined the effects of increased CO2 on early life stages of organisms, including corals. Using a range of pH values (pH 7.3, 7.6, and 8.0) in manipulative duplicate aquarium experiments, we have evaluated the effects of increased CO2 on early life stages (larval and polyp stages) of Acropora spp. with the aim of estimating CO2 tolerance thresholds at these stages. Larval survival rates did not differ significantly between the reduced pH and control conditions. In contrast, polyp growth and algal infection rates were significantly decreased at reduced pH levels compared to control conditions. These results suggest that future ocean acidification may lead to reduced primary polyp growth and delayed establishment of symbiosis. Stress exposure experiments using longer experimental time scales and lower CO2 concentrations than those used in this study are needed to establish the threshold of CO2 emissions required to sustain coral reef ecosystems.
Abstract:
Predicted future CO2 levels can affect the reproduction, growth, and behaviour of many marine organisms. However, the capacity of species to adapt to predicted changes in ocean chemistry is largely unknown. We used a unique field-based experiment to test for differential survival associated with variation in CO2 tolerance in a wild population of coral-reef fishes. Juvenile damselfish exhibited variation in their response to elevated (700 µatm) CO2 when tested in the laboratory, and this influenced their behaviour and risk of mortality in the wild. Individuals that were sensitive to elevated CO2 were more active and moved further from shelter in natural coral-reef habitat and, as a result, suffered significantly higher mortality from predation than individuals from the same treatment that were tolerant of elevated CO2. If individual variation in CO2 tolerance is heritable, this selection of phenotypes tolerant to elevated CO2 could help mitigate the effects of ocean acidification.
Abstract:
Energy availability and local adaptation are major components in mediating the effects of ocean acidification (OA) on marine species. In a long-term study, we investigated the effects of food availability and elevated pCO2 (ca 400, 1000, and 3000 µatm) on the growth of newly settled Amphibalanus (Balanus) improvisus through to reproduction, and on their offspring. We also compared two populations presumed to differ in their sensitivity to pCO2 because of differing habitat conditions: Kiel Fjord, Germany (Western Baltic Sea), with naturally strong pCO2 fluctuations, and the Tjärnö Archipelago, Sweden (Skagerrak), with far lower fluctuations. Over 20 weeks, survival, growth, reproduction, and shell strength of Kiel barnacles were all unaffected by elevated pCO2, regardless of food availability. Moulting frequency and shell corrosion increased with increasing pCO2 in adults. Larval development and juvenile growth of the F1 generation were tolerant of increased pCO2, irrespective of parental treatment. In contrast, elevated pCO2 had a strong negative impact on the survival of Tjärnö barnacles. Specimens from this population were able to withstand moderate levels of elevated pCO2 over 5 weeks when food was plentiful but showed reduced growth under food limitation. Severe levels of elevated pCO2 negatively impacted the growth of Tjärnö barnacles in both food treatments. We demonstrate a conspicuously higher tolerance to elevated pCO2 in Kiel barnacles than in Tjärnö barnacles; this tolerance was carried over from adults to their offspring. Our findings indicate that populations from fluctuating pCO2 environments are more tolerant of elevated pCO2 than populations from more stable pCO2 habitats. We furthermore provide evidence that energy availability can mediate the ability of barnacles to withstand moderate CO2 stress. Considering the high tolerance of Kiel specimens and the possibility of adaptation over many generations, near-future OA alone does not appear to present a major threat to A. improvisus.