964 results for grass sickness
Abstract:
This study aimed at isolating Trypanosoma brucei gambiense from human African trypanosomiasis (HAT) patients from south Sudan. Fifty HAT patients identified during active screening surveys were recruited, most of whom (49/50) were in second-stage disease. Blood and cerebrospinal fluid samples collected from the patients were cryopreserved using Triladyl as the cryomedium. The samples were stored at -150 degrees C in liquid nitrogen vapour in a dry shipper. Eighteen patient stabilates could be propagated in immunosuppressed Mastomys natalensis and/or SCID mice. Parasitaemia was highest in SCID mice. Further subpassages in M. natalensis increased the virulence of the trypanosomes and all 18 isolates recovered from M. natalensis or SCID mice became infective to other immunosuppressed mouse breeds. A comparison of immunosuppressed M. natalensis and Swiss White, C57/BL and BALB/c mice demonstrated that all rodent breeds were susceptible after the second subpassage and developed a parasitaemia >10^6/ml by Day 5 post infection. The highest parasitaemias were achieved in C57/BL and BALB/c mice. These results indicate that propagation of T. b. gambiense isolates after initial isolation in immunosuppressed M. natalensis or SCID mice can be done in a range of immunosuppressed rodents.
Abstract:
AIM: Acute mountain sickness (AMS) can result in pulmonary and cerebral oedema, with overperfusion of microvascular beds, elevated hydrostatic capillary pressure, capillary leakage and consequent oedema as pathogenetic mechanisms. Data on changes in glomerular filtration rate (GFR) at altitudes above 5000 m are very limited. METHODS: Thirty-four healthy mountaineers, who were randomized to two acclimatization protocols, undertook an expedition on Muztagh Ata Mountain (7549 m) in China. Tests were performed at five altitudes: Zurich pre-expedition (PE, 450 m), base camp (BC, 4497 m), Camp 1 (C1, 5533 m), Camp 2 (C2, 6265 m) and Camp 3 (C3, 6865 m). Cystatin C- and creatinine-based (Mayo Clinic quadratic equation) GFR estimates (eGFR) were assessed together with the Lake Louise AMS score and other tests. RESULTS: eGFR decreased significantly from PE to BC (P < 0.01). However, when analysing changes between BC and C3, only cystatin C-based estimates indicated a significant decrease in GFR (P = 0.02). There was a linear decrease in eGFR from PE to C3, of approximately 3.1 mL min⁻¹ 1.73 m⁻² per 1000 m increase in altitude. No differences between the eGFR of the two groups with different acclimatization protocols could be observed. There was a significant association between eGFR and haematocrit (P = 0.01), whereas no significant association between eGFR and aldosterone, renin or brain natriuretic peptide could be observed. Finally, higher AMS scores were significantly associated with higher eGFR (P = 0.01). CONCLUSIONS: Renal function declines when ascending from low to high altitude. Cystatin C-based eGFR decreases during ascent on a high-altitude expedition but increases with AMS scores. For individuals with eGFR <40 mL min⁻¹ 1.73 m⁻², caution may be necessary when planning trips to high altitudes above 4500 m above sea level.
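The abstract reports an approximately linear eGFR decline of 3.1 mL min⁻¹ 1.73 m⁻² per 1000 m of ascent. A minimal sketch of that arithmetic is given below; the camp altitudes are those named in the abstract, while the baseline eGFR of 100 is a hypothetical value used only to show how the slope projects across the expedition profile, not data from the study.

```python
# Sketch of the reported linear eGFR decline with altitude:
# roughly 3.1 mL min^-1 1.73 m^-2 per 1000 m of ascent.
# The baseline eGFR of 100 is a hypothetical value for illustration only.

DECLINE_PER_1000_M = 3.1  # slope reported in the abstract


def projected_egfr(baseline_egfr, baseline_altitude_m, altitude_m):
    """Project eGFR at a target altitude from a baseline measurement,
    assuming the linear decline reported in the abstract."""
    ascent_km = (altitude_m - baseline_altitude_m) / 1000.0
    return baseline_egfr - DECLINE_PER_1000_M * ascent_km


# Camp altitudes named in the abstract (m above sea level)
camps = [("PE", 450), ("BC", 4497), ("C1", 5533), ("C2", 6265), ("C3", 6865)]
for name, altitude in camps:
    print(f"{name}: {projected_egfr(100.0, 450, altitude):.1f} mL/min/1.73 m^2")
```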
Abstract:
Reed canary grass (Phalaris arundinacea L.) is an invasive species originally from Europe that has now expanded across a large range within the United States. Reed canary grass possesses a number of traits that allow it to thrive under a wide range of environmental conditions, including high rates of sedimentation, bouts of flooding, and high levels of nutrient inputs. Therefore, the goals of our study were to determine whether 1) certain types of wetland were more susceptible to Reed canary grass invasion, and 2) disturbances facilitated Reed canary grass invasion. This study was conducted within the Keweenaw Bay Indian Community reservation in Baraga County, in the Upper Peninsula of Michigan. We selected 28 wetlands for analysis. At each wetland, we identified and sampled distinct vegetative communities and their corresponding environmental attributes, which included water table depth, pH, conductivity, calcium and magnesium concentrations, and percent organic matter. Disturbances at each site were catalogued and their severity estimated with the aid of aerial photos. A GIS dataset was also developed containing the location of Reed canary grass within the study wetlands, the surrounding roads, and the level of roadside Reed canary grass invasion. In all, 287 plant species were identified and classified into 16 communities, which were then further grouped into three broad wetland types: non-forested graminoid, Sphagnum peatlands, and forested wetlands. The two most common disturbances identified were roads and off-road recreation trails, both occurring at 23 of the 28 sites. Logging activity surrounding the wetlands was the next most common disturbance and was found at 18 of the sites. Occurrence of Reed canary grass was most common in the non-forested graminoid communities. Reed canary grass was very infrequent in forested wetlands and almost never occurred in the Sphagnum peatlands. Disturbance intensity was the most significant environmental factor in explaining Reed canary grass occurrence within wetlands. Statistically significant relationships between the level of road development and the severity of Reed canary grass invasion along roadsides were identified at distances of 1000 m, 500 m, and 250 m from the studied wetlands. Further analysis revealed a significant relationship between roadside Reed canary grass populations and the level of road development (e.g. paved, graded, and ungraded).
Abstract:
Renewable hydrocarbon biofuels are being investigated as possible alternatives to conventional liquid transportation fossil fuels such as gasoline, kerosene (aviation fuel), and diesel. A diverse range of biomass feedstocks, including corn stover, sugarcane bagasse, switchgrass, waste wood, and algae, are being evaluated as candidates for pyrolysis and catalytic upgrading to produce drop-in hydrocarbon fuels. This research developed preliminary life cycle assessments (LCA) for each feedstock-specific pathway and compared the greenhouse gas (GHG) emissions of the hydrocarbon biofuels with those of current fossil fuels. As a comprehensive study, this analysis attempts to account for all of the GHG emissions associated with each feedstock pathway through the entire life cycle. Emissions from all stages are included: feedstock production, land use change, pyrolysis, stabilizing the pyrolysis oil for transport and storage, and upgrading the stabilized pyrolysis oil to a hydrocarbon fuel. In addition to GHG emissions, the energy requirements and water use were evaluated over the entire life cycle. The goal of this research is to help understand the relative advantages and disadvantages of the feedstocks and the resultant hydrocarbon biofuels based on three environmental indicators: GHG emissions, energy demand, and water utilization. Results indicate that liquid hydrocarbon biofuels produced through this pyrolysis-based pathway can achieve greenhouse gas emission savings of greater than 50% compared with petroleum fuels, thus potentially qualifying these biofuels under the US EPA RFS2 program. GHG emissions from the biofuels ranged from 10.7 to 74.3 g/MJ, with biofuels derived from sugarcane bagasse and wild algae at the extremes of this range, respectively. The cumulative energy demand (CED) shows that the energy in every biofuel process is primarily from renewable biomass, with the remaining energy demand mostly from fossil fuels. The CED for the biofuels ranges from 1.25 to 3.25 MJ/MJ, from sugarcane bagasse-derived to wild algae-derived biofuel respectively, while the other feedstock-derived biofuels are around 2 MJ/MJ. Water utilization is primarily from cooling water use during the pyrolysis stage if irrigation is not used during the feedstock production stage. Water use ranges from 1.7 to 17.2 gallons of water per kg of biofuel, from sugarcane bagasse to open-pond algae, respectively.
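The >50% GHG savings claim behind potential RFS2 qualification is a simple comparison of each biofuel's life-cycle emissions against a petroleum baseline. The sketch below illustrates that comparison; the baseline of 93 g CO2e/MJ is an assumed value typical of such LCAs, not a figure given in the abstract.

```python
# Sketch of the greenhouse-gas savings comparison used to judge whether a
# pyrolysis-derived biofuel might qualify under the US EPA RFS2 program.
# The petroleum baseline below is an assumed illustrative value,
# not a number taken from the abstract.

PETROLEUM_BASELINE_G_PER_MJ = 93.0  # assumed fossil-fuel baseline, g CO2e/MJ
RFS2_SAVINGS_THRESHOLD = 0.50       # >50% reduction mentioned in the abstract


def ghg_savings_fraction(biofuel_g_per_mj):
    """Fractional GHG reduction of a biofuel relative to the assumed baseline."""
    return 1.0 - biofuel_g_per_mj / PETROLEUM_BASELINE_G_PER_MJ


# Endpoints of the 10.7-74.3 g/MJ range reported in the abstract
for feedstock, emissions in [("sugarcane bagasse", 10.7), ("wild algae", 74.3)]:
    savings = ghg_savings_fraction(emissions)
    print(f"{feedstock}: {savings:.0%} savings, "
          f"qualifies: {savings > RFS2_SAVINGS_THRESHOLD}")
```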
Abstract:
Antibiotics are emerging contaminants worldwide. Insufficient policy regulation, limited public awareness, and the constant exposure of the environment to antibiotic sources have created a major environmental concern. Wastewater treatment plants (WWTP) are not equipped to filter out these compounds before the discharge of the disinfected effluent into water sources (e.g., lakes and streams), and currently available technologies are not equipped to remediate these compounds from environmental sources. Hence, the challenge remains to establish a biological system to remove these antibiotics from wastewater. An in vitro hydroponic remediation system was developed using vetiver grass (Chrysopogon zizanioides L. Nash) to remediate tetracycline (TC) from water. Comparative metabolomics studies were conducted to investigate the metabolites/pathways associated with tetracycline metabolism in plants and TC-degrading bacteria. The results show that vetiver plants effectively take up tetracycline from water sources. Vetiver root-associated bacteria recovered during the hydroponic remediation trial were highly tolerant to TC (as high as 600 ppm) and could use TC as a sole carbon and energy source. Growth conditions (pH, temperature, and oxygen requirement) for TC-tolerant bacteria were optimized for higher TC remediation capability from water sources. The plant (roots and shoots) and bacterial species were further characterized for the metabolites produced during the TC degradation process using GC-MS to identify the possible biochemical mechanism involved. Also, the plant root zone was screened for metabolites/enzymes that were secreted during antibiotic degradation and could potentially enhance the degradation process. The root zone was selected for this analysis because this region of the plant has shown a greater capacity for antibiotic degradation compared with the shoot zone. Analysis of the role of antioxidant enzymes in the TC degradation process revealed glutathione-S-transferases (GSTs) as an important group of enzymes in both plants and bacteria potentially involved in TC degradation. Metabolomics results also suggest potential GST activity in the TC remediation/transformation process used by plants. This information could be useful in gaining insights for the application of biological remediation systems for the mitigation of antibiotics from wastewater.
Abstract:
OBJECTIVE: Acute mountain sickness is a frequent and debilitating complication of high-altitude exposure, but there is little information on the prevalence and time course of acute mountain sickness in children and adolescents after rapid ascent by mechanical transportation to 3500 m, an altitude at which major tourist destinations are located throughout the world. METHODS: We performed serial assessments of acute mountain sickness (Lake Louise scores) in 48 healthy nonacclimatized children and adolescents (mean +/- SD age: 13.7 +/- 0.3 years; 20 girls and 28 boys), with no previous high-altitude experience, 6, 18, and 42 hours after arrival at the Jungfraujoch high-altitude research station (3450 m), which was reached through a 2.5-hour train ascent. RESULTS: We found that the overall prevalence of acute mountain sickness during the first 3 days at high altitude was 37.5%. Rates were similar for the 2 genders and decreased progressively during the stay (25% at 6 hours, 21% at 18 hours, and 8% at 42 hours). None of the subjects needed to be evacuated to lower altitude. Five subjects needed symptomatic treatment and responded well. CONCLUSION: After rapid ascent to high altitude, the prevalence of acute mountain sickness in children and adolescents was relatively low; the clinical manifestations were benign and resolved rapidly. These findings suggest that, for the majority of healthy nonacclimatized children and adolescents, travel to 3500 m is safe and pharmacologic prophylaxis for acute mountain sickness is not needed.
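The reported prevalences are simple proportions of the 48 subjects. In the sketch below the case counts are hypothetical numbers chosen to be consistent with the reported percentages, included only to show the arithmetic.

```python
# Prevalence of acute mountain sickness as a simple proportion of the cohort.
# The case counts are hypothetical values consistent with the percentages
# reported in the abstract (n = 48), shown only to illustrate the arithmetic.

N_SUBJECTS = 48


def prevalence(cases, n=N_SUBJECTS):
    return cases / n


print(f"overall: {prevalence(18):.1%}")   # ~37.5% during the first 3 days
print(f"6 h:     {prevalence(12):.1%}")   # ~25%
print(f"18 h:    {prevalence(10):.1%}")   # ~21%
print(f"42 h:    {prevalence(4):.1%}")    # ~8%
```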
Abstract:
Bloch, Konrad E., Alexander J. Turk, Marco Maggiorini, Thomas Hess, Tobias Merz, Martina M. Bosch, Daniel Barthelmes, Urs Hefti, Jacqueline Pichler, Oliver Senn, and Otto D. Schoch. Effect of ascent protocol on acute mountain sickness and success at Muztagh Ata, 7546 m. High Alt. Med. Biol. 10:25-32, 2009.-Data on acclimatization during expedition-style climbing to > 5000 m are scant. We evaluated the hypothesis that minor differences in ascent protocol influence acute mountain sickness (AMS) symptoms and mountaineering success in climbers to Muztagh Ata (7546 m), Western China. We performed a randomized, controlled trial during a high altitude medical research expedition to Muztagh Ata. Thirty-four healthy mountaineers (mean age 45 yr, 7 women) were randomized to follow one of two protocols, ascending within 15 or 19 days to the summit of Muztagh Ata at 7546 m, respectively. The main outcome measures, AMS symptom scores and the number of proceeding climbers, were assessed daily. Mean +/- SD AMS-C scores of 16 climbers randomized to slow ascent were 0.06 +/- 0.18, 0.26 +/- 0.08, 0.41 +/- 0.45, 0.53 +/- 0.77 at camps I (5533 m), II (6265 m), III (6865 m), and the summit (7546 m), respectively. Corresponding values in 18 climbers randomized to fast ascent were significantly higher: 0.17 +/- 0.23, 0.43 +/- 0.75, 0.49 +/- 0.36, and 0.69 +/- 0.54 (p < 0.008, vs. slow ascent in regression analysis accounting for weather-related protocol deviation). Climbers randomized to slow ascent were able to ascend according to the protocol without AMS for significantly more days than climbers randomized to fast ascent (p = 0.04, Kaplan-Meier analysis). More climbers randomized to slow ascent were successful in reaching the highest camp at 6865 m without AMS (odds ratio 9.5; 95% confidence interval 1.02 to 89). In climbers ascending to very high altitudes, differences of a few days in acclimatization have a significant impact on symptom severity, the prevalence of AMS, and mountaineering success. ClinicalTrials.gov Identifier NCT00603122.
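The success comparison is summarized as an odds ratio with a 95% confidence interval (9.5; 1.02 to 89). A minimal sketch of the standard odds-ratio calculation with a Woolf (logit) interval is shown below; the 2x2 counts are hypothetical and are not the trial's data.

```python
# Minimal sketch of an odds ratio with a Woolf (logit) 95% confidence interval,
# the kind of summary reported for reaching the highest camp without AMS.
# The 2x2 counts used in the example are hypothetical, not the trial's data.

import math


def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and approximate 95% CI for a 2x2 table:
    a = slow ascent, success;  b = slow ascent, no success
    c = fast ascent, success;  d = fast ascent, no success
    """
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper


# Hypothetical counts for illustration only
print(odds_ratio_ci(12, 4, 6, 12))
```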
Abstract:
Pastures containing alfalfa-grass or smooth bromegrass were stocked with .6, .8, or 1.0 cow-calf units (ccu) per acre to compare cow and calf production in rotational grazing systems managed for optimum forage quality. To remove excess forage early in the grazing season, yearling heifers or steers grazed with the cows in each pasture at a stocking rate of .6 ccu per acre for the first 28, 37, and 40 days of grazing in years one, two, and three. Live forage density and days of grazing per paddock were estimated by sward height. Cows, calves, and yearlings were weighed, and cows were condition scored, every 28 days. All cows grazed for 140 days unless forage became limiting. The cows on the smooth bromegrass pasture stocked at 1.0 ccu per acre were removed after 119 days in 1994, 129 days in 1995, and 125 days in 1996. Cows on one of the alfalfa-grass pastures stocked at 1.0 ccu per acre were removed after 136 days of grazing in 1996 because of lack of forage. Alfalfa-grass pastures tended to have a more consistent supply of forage over the grazing season than the bromegrass pastures. Cows grazing the alfalfa-grass pastures had greater seasonal weight gains and body condition score increases, and yearlings on those pastures had lower weight gains, than those on the smooth bromegrass pastures. Daily and total calf weight gains and total animal production also tended to be greater in alfalfa-cool season grass pastures. Increasing stocking rates resulted in significantly lower cow body condition increases and yearling weight gains, and also increased the amounts of calf and total growing animal produced.
Abstract:
Pastures containing alfalfa-smooth bromegrass or smooth bromegrass were stocked with .6, .8, or 1.0 cow-calf units (ccu) per acre to compare cow and calf production in rotational grazing systems managed for optimum forage quality. To remove excess forage early in the grazing season, yearling heifers grazed with the cows in each pasture at a stocking rate of .6 heifers per acre for the first 28 days of grazing. Live forage density and days of grazing per paddock were estimated by sward height. Cows, calves, and heifers were weighed, and cows were condition scored, every 28 days. All cows grazed for 140 days except those grazing the smooth bromegrass pasture stocked at 1.0 ccu per acre; these were removed after 119 days in 1994 and 129 days in 1995 because of lack of forage. Alfalfa-grass pastures tended to have a more consistent supply of forage over the grazing season than the bromegrass pastures. Cows grazing the alfalfa-cool season grass pastures had greater seasonal weight gains and body condition score increases, and heifers on those pastures had lower weight gains, than those on the smooth bromegrass pastures. Daily and total calf weight gains and total animal production also tended to be greater in alfalfa-cool season grass pastures. Increasing stocking rates resulted in significantly lower cow condition increases and heifer weight gains, while increasing the amounts of calf and total growing animal produced.