918 results for bladder irrigation
Abstract:
In dryland cotton cropping systems, the main weeds and the effectiveness of management practices were identified, and the economic impact of weeds was estimated, using information collected in a postal survey and a field survey of southern Queensland and northern New South Wales. Forty-eight completed questionnaires were returned, and 32 paddocks were monitored in early and late summer for weed species and density. The main problem weeds were bladder ketmia (Hibiscus trionum), common sowthistle (Sonchus oleraceus), barnyard grasses (Echinochloa spp.), liverseed grass (Urochloa panicoides) and black bindweed (Fallopia convolvulus), but the relative importance of these differed with crops, fallows and crop rotations. The weed flora was diverse, with 54 genera identified in the field survey. Control of weed growth in rotational crops and fallows depended largely on herbicides, particularly glyphosate in fallow and atrazine in sorghum, although effective control was not consistently achieved. Weed control in dryland cotton involved numerous combinations of selective herbicides, several non-selective herbicides, inter-row cultivation and some manual chipping. Despite this, residual weeds were found at 38-59% of initial densities in about three-quarters of the survey paddocks. The on-farm financial costs of weeds ranged from $148 to $224/ha.year depending on the rotation, resulting in an estimated annual economic cost of $19.6 million. The approach of managing weed populations across the whole cropping system needs wider adoption to reduce the weed pressure in dryland cotton and the economic impact of weeds in the long term. Strategies that optimise herbicide performance and minimise the return of weed seed to the soil are needed. Data from the surveys provide direction for research to improve weed management in this cropping system. The economic framework provides a valuable means of evaluating likely future returns from new technologies or weed management improvements.
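The abstract above gives both a per-hectare weed cost ($148-224/ha.year) and an aggregate annual cost ($19.6 million). As a rough consistency sketch, the cropped area those two figures jointly imply can be back-calculated; the implied area is my inference, not a figure reported by the study.

```python
# Back-calculate the cropping-system area implied by the abstract's figures.
# TOTAL_COST and COST_PER_HA come from the abstract; the implied area is a
# derived illustration, not a number reported in the study.
TOTAL_COST = 19.6e6          # estimated annual economic cost, $/year
COST_PER_HA = (148, 224)     # on-farm weed cost range, $/ha.year

area_range_ha = tuple(TOTAL_COST / c for c in COST_PER_HA)
low, high = min(area_range_ha), max(area_range_ha)
print(f"Implied area: {low:,.0f} to {high:,.0f} ha")
# -> Implied area: 87,500 to 132,432 ha
```

This suggests the estimate covers a dryland cropping area on the order of 100,000 ha, consistent with a regional-scale survey.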
Abstract:
This paper is the first of a series that investigates whether new cropping systems with permanent raised beds (PRBs) or Flat land could be used successfully to increase farmers' incomes from rainfed crops in Lombok, Eastern Indonesia. This paper discusses the rice phase of the cropping system. Low grain yields of dry-seeded rice (Oryza sativa) grown on Flat land on Vertisols in the rainfed region of southern Lombok are probably due mainly to (a) erratic rainfall (870-1220 mm/yr), with water often limiting at sensitive growth stages, (b) consistently high temperatures (average maximum ~31°C), and (c) low solar radiation. Farmers are therefore poor, and labour is hard and costly, as all operations are manual. Two replicated field experiments were run at Wakan (annual rainfall = 868 mm) and Kawo (1215 mm) for 3 years (2001/2002 to 2003/2004) on Vertisols in southern Lombok. Dry-seeded rice was grown in 4 treatments, with or without manual tillage, on (a) PRBs 1.2 m wide and 200 mm high, separated by well-graded furrows 300 mm wide and 200 mm deep, with no rice sown in the furrows, and (b) well-graded Flat land. Excess surface water was harvested from each treatment and used for irrigation after the vegetative stage of the rice. All operations were manual. There were no differences between treatments in grain yield of rice (mean grain yield = 681 g/m²), which could be partly explained by total number of tillers/hill and mean panicle length, but not by number of productive tillers/hill, plant height or 1000-grain weight. When the data from both treatments on PRBs and from both treatments on Flat land, each year at each site, were analysed, there were also no differences in grain yield of rice (g/m²).
When rainfall in the wet season up to harvest was over 1000 mm (Year 2; Wakan, Kawo), or plants were water-stressed during crop establishment (Year 1; Wakan) or during grain-fill (Year 3; Kawo), there were significant differences in grain yield (g/1.5 m²) between treatments; generally the grain yield (g/1.5 m²) on PRBs, with or without tillage, was less than that on Flat land, with or without tillage. However, when the data from both treatments on PRBs and from both treatments on Flat land, each year at each site, were analysed, the greater grain yield of dry-seeded rice on Flat land (mean 1092 g/1.5 m²) than on PRBs (mean 815 g/1.5 m²) was mainly because there were 25% more plants on Flat land. Overall, when the data in the 2 outer rows and the 2 inner rows on PRBs were each combined, there was a higher number of productive tillers in the combined outer rows (mean 20.7 tillers/hill) than in the combined inner rows on each PRB (mean 18.2 tillers/hill). However, there were no differences in grain yield between combined rows (mean 142 g/m row). Hence, with a gap of 500 mm (the distance between the outer rows of plants on adjacent raised beds), plants did not compensate in grain yield for missing plants in furrows. This suggests that rice (a) also sown in furrows, (b) sown in 7 rows with narrower row spacing, or (c) sown in 6 rows with slightly wider row spacing and a narrower gap between outer rows on adjacent beds, may further increase grain yield (g/1.5 m²) in this system of PRBs. The growth and grain yield (y in g/m²) of rainfed rice (with on-site rainfall the only source of water for irrigation) depended mainly on the rainfall (x in mm) in the wet season up to harvest (due either to site or year), with y = 1.1x − 308 (r² = 0.54; P < 0.005). However, 280 mm (i.e. 32%) of the rainfall was not directly used to produce grain (i.e. x when y = 0 g/m²).
Manual tillage did not affect growth or grain yield of rice (g/m²; g/1.5 m²), either on PRBs or on Flat land.
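The rainfall-yield regression reported above (y = 1.1x − 308) can be evaluated directly; its x-intercept is the 280 mm of "ineffective" rainfall the abstract cites. A quick sketch confirming the arithmetic:

```python
# Rainfall-yield regression from the abstract: y = 1.1x - 308
# (y = grain yield in g/m^2, x = wet-season rainfall up to harvest in mm).
def predicted_yield(rain_mm: float) -> float:
    return 1.1 * rain_mm - 308

# Rainfall at which predicted yield is zero (rain not converted to grain).
ineffective_mm = 308 / 1.1
print(f"x-intercept: {ineffective_mm:.0f} mm")   # -> x-intercept: 280 mm

# Example: predicted yield for Kawo's 1215 mm annual rainfall (illustrative;
# the regression strictly uses wet-season rainfall up to harvest).
print(f"predicted yield: {predicted_yield(1215):.0f} g/m^2")
```

The intercept (308/1.1 = 280 mm) matches the abstract's figure, and 280/868 mm ≈ 32% is the share of Wakan's rainfall that produced no grain.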
Abstract:
An experiment using herds of ~20 cows (farmlets) assessed the effects of high stocking rates on the production and profitability of feeding systems based on dryland and irrigated perennial ryegrass-based pastures in a Mediterranean environment in South Australia over 4 years. A target level of milk production of 7000 L/cow.year was set, based on predicted intakes of 2.7 t DM/cow.year as concentrates, pasture intakes from 1.5 to 2.7 t/cow.year, and purchased fodder. In years 1 and 2, up to 1.5 t DM/cow.year of purchased fodder was used, and in years 3 and 4 the amounts were increased where necessary to enable milk production per cow to be maintained at target levels. Cows in dryland farmlets calved in March to May inclusive and were stocked at 2.5, 2.9, 3.3, 3.6 and 4.1 cows/ha, while those in irrigated farmlets calved in August to October inclusive and were stocked at 4.1, 5.2, 6.3 and 7.4 cows/ha. In the first 2 years, when inputs of purchased fodder were limited, milk production per cow was reduced at higher stocking rates (P < 0.01), but in years 3 and 4 there were no differences. Mean production was 7149 kg/cow.year in years 1 and 2, and 8162 kg/cow.year in years 3 and 4. Production per hectare was very closely related to stocking rate in all years (P < 0.01), increasing from 18 to 34 t milk/ha.year for dryland farmlets (1300 to 2200 kg milk solids/ha) and from 30 to 60 t milk/ha.year for irrigated farmlets (2200 to 4100 kg milk solids/ha). Almost all of these increases were attributed to the increases in grain and purchased fodder inputs associated with the increases in stocking rate. Net pasture accumulation rates and pasture harvest were generally not altered by stocking rate, although as stocking rate increased more of the pasture was grazed and less was conserved in both dryland and irrigated farmlets. Total pasture harvest averaged ~8 and 14 t DM/ha.year for dryland and irrigated pastures, respectively.
An exception was at the highest stocking rate under irrigation, where pugging during winter was associated with a 14% reduction in annual pasture growth. There were several indications that these high stocking rates may not be sustainable without substantial changes in management practice. There were large, positive nutrient balances and associated increases in soil mineral content (P < 0.01), especially for phosphorus and nitrate nitrogen, with both stocking rate and succeeding years. Levels under irrigation were considerably higher (up to 90 and 240 mg/kg of soil for nitrate nitrogen and phosphorus, respectively) than under dryland pastures (60 and 140 mg/kg, respectively). Soil organic carbon levels did not change with stocking rate, indicating a high level of utilisation of the forage grown. Weed ingress was also high (up to 22% DM) in all treatments, and especially in heavily stocked irrigated pastures during winter. It was concluded that the higher stocking rates used exceeded those feasible for Mediterranean pastures in this environment; suggested upper stocking levels are 2.5 cows/ha for dryland pastures and 5.2 cows/ha for irrigated pastures. Sustaining these suggested stocking rates will require further development of management practices to avoid large increases in soil minerals and weed invasion of pastures.
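The per-hectare production range quoted above is, to a close approximation, stocking rate multiplied by per-cow yield. A quick consistency check using the abstract's figures; pairing the years-1/2 per-cow mean with the lowest stocking rate and the years-3/4 mean with the highest is my illustrative assumption, not a pairing stated in the abstract.

```python
# Consistency check: milk/ha ~= stocking rate x per-cow yield.
# Stocking rates and per-cow means are from the abstract; which mean pairs
# with which rate is an illustrative assumption.
def milk_per_ha_t(cows_per_ha: float, kg_per_cow: float) -> float:
    return cows_per_ha * kg_per_cow / 1000  # tonnes milk/ha.year

dryland_low  = milk_per_ha_t(2.5, 7149)   # ~17.9 t, cf. "18" in the abstract
dryland_high = milk_per_ha_t(4.1, 8162)   # ~33.5 t, cf. "34"
irrig_low    = milk_per_ha_t(4.1, 7149)   # ~29.3 t, cf. "30"
irrig_high   = milk_per_ha_t(7.4, 8162)   # ~60.4 t, cf. "60"
print(dryland_low, dryland_high, irrig_low, irrig_high)
```

The computed values bracket the reported 18-34 t/ha.year (dryland) and 30-60 t/ha.year (irrigated) ranges, supporting the abstract's statement that per-hectare gains tracked stocking rate.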
Abstract:
Bemisia tabaci biotype B, commonly known as the silverleaf whitefly (SLW), is an alien species that invaded Australia in the mid-1990s. This paper reports on the invasion ecology of SLW and the factors likely to have contributed to the first outbreak of this major pest in an Australian cotton cropping system. Population dynamics of SLW within whitefly-susceptible crop (cotton and cucurbit) and non-crop vegetation (sowthistle, Sonchus spp.) components of the cropping system were investigated over four consecutive growing seasons (September-June), 2001/02-2004/05, in the Emerald Irrigation Area (EIA) of Queensland, Australia. Based on fixed geo-referenced sampling sites, variation in the spatial and temporal abundance of SLW within each system component was quantified to provide baseline data for the development of ecologically sustainable pest management strategies. Parasitism of large (3rd and 4th instar) SLW nymphs by native aphelinid wasps was quantified to determine the potential for natural control of SLW populations. Following the initial outbreak in 2001/02, SLW abundance declined and stabilised over the next three seasons. The population dynamics of SLW were characterised by inter-seasonal population cycling between the non-crop (weed) and cotton components of the EIA cropping system. Cotton was the largest sink for, and source of, SLW during the study period. Over-wintering populations dispersed from weed host plants to cotton in spring, followed by a reverse dispersal in late summer and autumn to broad-leaved crops and weeds. A basic spatial source-sink analysis showed that SLW adult and nymph densities throughout spring were higher in cotton fields closer to over-wintering weed sources than in fields further away. Cucurbit fields were not significant sources of SLW and did not appear to contribute significantly to the regional population dynamics of the pest.
Substantial parasitism of nymphal stages throughout the study period indicates that native parasitoid species and other natural enemies are important sources of SLW mortality in Australian cotton production systems. Weather conditions and the use of broad-spectrum insecticides for pest control are implicated in the initial outbreak and ongoing pest status of SLW in the region.
Abstract:
Over recent decades, Australian piggeries have commonly employed anaerobic ponds to treat effluent to a standard suitable for recycling for shed flushing and for irrigation onto nearby agricultural land. Anaerobic ponds are generally sized according to the Rational Design Standard (RDS) developed by Barth (1985), resulting in large ponds, which can be expensive to construct, occupy large land areas, and are difficult and expensive to desludge, potentially disrupting the whole piggery operation. Limited anecdotal and scientific evidence suggests that anaerobic ponds that are undersized according to the RDS operate satisfactorily, without excessive odour emission, impaired biological function or high rates of solids accumulation. Based on these observations, this paper questions the validity of rigidly applying the principles of the RDS and presents a number of alternative design approaches resulting in smaller, more highly loaded ponds that are easier and cheaper to construct and manage. Based on limited data on pond odour emission, it is suggested that higher pond loading rates may reduce overall odour emission by decreasing pond volume and surface area. Other management options that could be implemented to reduce pond volumes include permeable pond covers, various solids separation methods, and bio-digesters with impermeable covers used in conjunction with biofilters and/or systems designed for biogas recovery. To ensure that new effluent management options are accepted by regulatory authorities, it is important for researchers to address both industry and regulator concerns and uncertainties regarding new technology, and to demonstrate, beyond reasonable doubt, that new technologies do not increase the risk of adverse impacts on the environment or community amenity.
Further development of raw research outcomes to produce relatively simple, practical guidelines and implementation tools also increases the potential for acceptance and implementation of new technology by regulators and industry.
Abstract:
Seed persistence is poorly quantified for invasive plants of subtropical and tropical environments, and Lantana camara, one of the world's worst weeds, is no exception. We investigated germination, seedling emergence, and seed survival of two lantana biotypes (pink and pink-edged red [PER]) in southeastern Queensland, Australia. Controlled experiments were undertaken in 2002 and repeated in 2004, with treatments comprising two environmental regimes (irrigated and natural rainfall) and two sowing depths (0 and 2 cm). Seed survival and seedling emergence were significantly affected by all factors (time, biotype, environment, sowing depth, and cohort) (P < 0.001). Seed dormancy varied with treatment (environment, sowing depth, biotype, and cohort) (P < 0.001), but declined rapidly after 6 mo. Significant differential responses by the two biotypes to sowing depth and environment were detected for both seed survival and seedling emergence (P < 0.001). Seed mass was consistently lower in the PER biotype at the population level (P < 0.001), but this variation did not adequately explain the differential responses. Moreover, under natural rainfall the magnitude of the biotype effect was unlikely to result in ecologically significant differences. Seed survival after 36 mo under natural rainfall ranged from 6.8 to 21.3%. Best-fit regression analysis of the decline in seed survival over time yielded a five-parameter exponential decay model with a lower asymptote approaching −0.38: % seed survival = [(55 − (−0.38)) · e^(k·t)] + (−0.38); R² = 88.5%; 9 df. Environmental conditions and burial significantly affected the slope parameter, or k value (P < 0.01). Seed survival projections from the model were greatest for buried seeds under natural rainfall (11 yr) and least under irrigation (3 yr). Experimental data and model projections suggest that lantana has a persistent seed bank, and this should be considered in management programs, particularly those aimed at eradication.
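The decay model above can be evaluated once a rate constant k is chosen. The abstract does not report k (it varied with environment and burial), so the sketch below back-solves an illustrative k from the reported 21.3% survival at 36 months; this is purely for illustration, not a fitted value from the study.

```python
import math

# Seed-survival decay model from the abstract:
#   survival(t) = (55 - (-0.38)) * exp(k * t) + (-0.38)
# t in months, survival in %. k is not reported in the abstract; here it is
# back-solved from one reported endpoint (21.3% at 36 mo) for illustration.
A, C = 55.0, -0.38                        # initial level and lower asymptote

k = math.log((21.3 - C) / (A - C)) / 36   # ~ -0.026 per month (illustrative)

def survival(t_months: float) -> float:
    return (A - C) * math.exp(k * t_months) + C

print(f"k = {k:.4f} per month")
print(f"survival at 0 mo:  {survival(0):.1f}%")    # 55.0, the model's start
print(f"survival at 36 mo: {survival(36):.1f}%")   # 21.3, by construction
```

With a more negative k (faster decay, as under irrigation) the curve approaches the −0.38 asymptote sooner, which is how the model yields shorter persistence projections for irrigated than for buried rainfed seeds.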
Abstract:
Surface losses of nitrogen from horticulture farms in coastal Queensland, Australia, may have the potential to eutrophicate sensitive coastal marine habitats nearby. A case study of the potential extent of such losses was undertaken in a coastal macadamia plantation. Nitrogen losses were quantified in 5 consecutive runoff events during the 13-month study. Irrigation did not contribute to surface flows. Runoff was generated by storms with combined intensities and durations of 20–40 mm/h for >9 min. These intensities and durations were within expected short-term (1 year) and long-term (up to 20 years) rainfall frequencies in the study area. Surface flow volumes were 5.3 ± 1.1% of the episodic rainfall generated by such storms; most of each rainfall event was therefore attributed to infiltration and drainage in this farm soil (Kandosol). The estimated annual loss of total nitrogen in runoff was 0.26 kg N/ha.year, representing a minimal loading of nitrogen in surface runoff compared with other studies. The weighted average concentrations of total sediment nitrogen (TSN) and total dissolved nitrogen (TDN) in the farm runoff were 2.81 ± 0.77% N and 1.11 ± 0.27 mg N/L, respectively. These concentrations were considerably greater than ambient levels in an adjoining catchment waterway, where TSN and TDN were 0.11 ± 0.02% N and 0.50 ± 0.09 mg N/L, respectively. The steep concentration gradient of TSN and TDN between the farm runoff and the waterway demonstrated nutrient loading from the farming landscapes to the waterway. The TDN levels in the stream exceeded the currently specified threshold of 0.2–0.3 mg N/L for eutrophication of such a waterway.
Therefore, while the estimate of annual loading of N from runoff losses was comparatively low, it was evident that the stream catchment and associated agricultural land uses were already characterised by significant nitrogen loadings that pose eutrophication risks. The reported levels of nitrogen and the proximity of such waterways (8 km) to the coastline may also have implications for the nearshore (oligotrophic) marine environment during periods of turbulent flow.
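The runoff fraction and TDN concentration reported above combine into a simple per-storm dissolved-N load estimate. The 100 mm storm depth below is an arbitrary illustrative choice, not a figure from the study.

```python
# Per-storm dissolved-N load from the abstract's figures.
# Runoff fraction (5.3% of rainfall) and TDN (1.11 mg N/L) are from the
# abstract; the 100 mm storm depth is an arbitrary illustrative input.
RUNOFF_FRACTION = 0.053     # surface flow as a fraction of storm rainfall
TDN_MG_PER_L = 1.11         # weighted mean total dissolved N in runoff

def tdn_load_kg_per_ha(storm_mm: float) -> float:
    runoff_mm = storm_mm * RUNOFF_FRACTION
    litres_per_ha = runoff_mm * 10_000         # 1 mm depth over 1 ha = 10,000 L
    return litres_per_ha * TDN_MG_PER_L / 1e6  # mg -> kg

print(f"{tdn_load_kg_per_ha(100):.3f} kg N/ha")  # ~0.06 kg N/ha per 100 mm storm
```

Loads of this order across a handful of storms per year are consistent with the abstract's low annual estimate of 0.26 kg N/ha.year (which also includes sediment-bound N).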
Abstract:
Promotion of better procedures for releasing undersize fish, advocacy of catch-and-release angling, and changes to minimum legal sizes are increasingly being used as tools for sustainable management of fish stocks. However, without knowing the proportion of released fish that survive, the conservation value of any of these measures is uncertain. We developed a floating vertical enclosure to estimate the short-term survival of released line-caught tropical and subtropical reef-associated species, and used it to compare the effectiveness of two barotrauma-relief procedures (venting and shotline releasing) on red emperor (Lutjanus sebae). Barotrauma signs varied with capture depth, but not with fish size. Fish from the greatest depths (40-52 m) exhibited extreme signs less frequently than those from intermediate depths (30-40 m), possibly as a result of swim bladder gas being vented externally through a rupture in the body wall. All but two fish survived the experiment, and as neither release technique significantly improved the short-term survival of red emperor over non-treatment, we see little benefit in promoting either venting or shotline releasing for this comparatively resilient species. Floating vertical enclosures can improve short-term post-release mortality estimates because they overcome many of the problems encountered when constraining fish in submerged cages.
Abstract:
Farmlets, each of 20 cows, were established to field test five milk production systems and provide a learning platform for farmers and researchers in a subtropical environment. The systems were developed through desktop modelling and industry consultation in response to the need for substantial increases in farm milk production following deregulation of the industry. Four of the systems were based on grazing and the continued use of existing farmland resource bases, whereas the fifth comprised a feedlot and associated forage base developed as a greenfield site. The field evaluation was conducted over 4 years under more adverse environmental conditions than anticipated, with below average rainfall and restrictions on irrigation. For the grazed systems, mean annual milk yield per cow ranged from 6330 kg/year (1.9 cows/ha) for a herd based on rain-grown tropical pastures to 7617 kg/year (3.0 cows/ha) where animals were based on temperate and tropical irrigated forages. For the feedlot herd, production of 9460 kg/cow.year (4.3 cows/ha of forage base) was achieved. For all herds, the level of production achieved required annual inputs of concentrates of ~3 t DM/animal and purchased conserved fodder from 0.3 to 1.5 t DM/animal. This level of supplementary feeding made a major contribution to total farm nutrient inputs, contributing 50% or more of the nitrogen, phosphorus and potassium entering the farming system, and presents challenges for the management of the manure and urine that result from the higher stocking rates enabled. Mean annual milk production for the five systems ranged from 88 to 105% of that predicted by the desktop modelling. This level of agreement for the grazed systems was achieved with minimal overall change in predicted feed inputs; however, the feedlot system required a substantial increase in inputs over those predicted. Reproductive performance for all systems was poorer than anticipated, particularly over the summer mating period.
We conclude that the desktop model, developed as a rapid response to assist farmers modify their current farming systems, provided a reasonable prediction of inputs required and milk production. Further model development would need to consider more closely climate variability, the limitations summer temperatures place on reproductive success and the feed requirements of feedlot herds.
Abstract:
In February 2004, Redland Shire Council, with help from a Horticulture Australia research project, established a stable grass cover of seashore paspalum (Paspalum vaginatum) on a Birkdale park where the soil had previously proved too salty to grow anything else. Following the success of this small 0.2 ha demonstration area, Redland Shire has since invested hundreds of thousands of dollars in successfully turfing other similarly “impossible” park areas with seashore paspalum. Urban salinity can arise for different reasons in different places. In inland areas such as southern NSW and the WA wheatbelt, the usual cause is rising groundwater bringing salt to the surface. In coastal sites, salt spray or periodic tidal inundation can cause problems. In Redland Shire’s case, the issue was compacted marine sediments (mainly mud) dug up and dumped to create foreshore parkland in the course of artificial canal developments. At Birkdale, this had created a site that was both strongly acid and too salty for most plants, with bare saline scalds interspersed with areas of unthrifty grass. Finding a salt-tolerant grass is no “silver bullet” or easy solution to salinity problems. Rather, it buys time to implement sustainable long-term establishment and maintenance practices, which are even more critical than with conventional turfgrasses. These practices include annual slicing or coring in conjunction with gypsum/dolomite amendment and light topdressing with sandy loam soil (to about 1 cm depth), adequate maintenance fertiliser, weed control measures, regular leaching irrigation to flush salts below the root zone, and irrigation scheduling to maximise infiltration and minimise runoff. Three other halophytic turfgrass species were also identified, each adapted to different environments, management regimes and uses. These have been shortlisted for larger-scale plantings in future work.
Abstract:
Soil water repellency occurs widely in horticultural and agricultural soils when they are very dry. The gradual accumulation and breakdown of surface organic matter over time produces wax-like organic acids, which coat soil particles and prevent uniform entry of water into the soil. Water repellency is usually managed by regular surfactant applications. Surfactants are, literally, surface-active agents (SURFace ACTive AgeNTS). Their mode of action is to reduce the surface tension of water, allowing it to penetrate and wet the soil more easily and completely. This practice improves water use efficiency (by requiring less water to wet the soil and by capturing rainfall and irrigation more effectively and rapidly). It also reduces nutrient losses through run-off, erosion or leaching; these nutrients have the potential to pollute the surrounding environment and watercourses. This project investigated potential improvements to standard practices (product combination and scheduling) for surfactant use to overcome localised dry spots on water-repellent soils and thus improve turf quality and water use efficiency. Weather conditions for the duration of the trial prevented the identification of improved practices in terms of combination and scheduling. However, the findings support previous research showing that soil surfactants decreased the time for water to infiltrate dry soil samples taken from a previously severely hydrophobic site. Data will continue to be collected from this trial site on a private contractual basis, with the hope that improvements to standard practices will be observed during the drier winter months, when moisture availability limits turfgrass growth and quality.
Abstract:
Salinity is an increasingly important issue in both rural and urban areas throughout much of Australia. The use of recycled/reclaimed water and other sources of poorer-quality water to irrigate turf is also increasing. Hybrid Bermudagrass (Cynodon dactylon (L.) Pers. x C. transvaalensis Burtt Davey), together with the parent species C. dactylon, is among the most widely used warm-season turfgrass groups. Twelve hybrid Bermudagrass genotypes and one accession each of Bermudagrass (C. dactylon), African Bermudagrass (C. transvaalensis) and seashore paspalum (Paspalum vaginatum Sw.) were grown in a glasshouse experiment with six salinity treatments applied hydroponically through the irrigation water (ECw <0.1, 6, 12, 18, 24 or 30 dS/m) in a flood-and-drain system. Each pot was clipped progressively at 2-weekly intervals over the 12-week experimental period to determine dry matter production; leaf firing was rated visually on 3 occasions during the last 6 weeks of salinity treatment. At the end of the experiment, dry weights of roots and crowns below clipping height were also determined. Clipping yields declined sharply after about the first 6 weeks of salinity treatment, but then remained stable at substantially lower levels of dry matter production from weeks 8 to 12. Growth data over this final 4-week period are therefore a more accurate guide to the relative salinity tolerance of the 15 entries than data from the preceding 8 weeks. Based on these data, the 12 hybrid Bermudagrass genotypes showed moderate salinity tolerance, with FloraDwarf™, 'Champion Dwarf', Novotek™ and 'TifEagle' ranking as the most salt tolerant and 'Patriot', 'Santa Ana', 'Tifgreen' and TifSport™ the least tolerant within the hybrid group. Nevertheless, Santa Ana, for example, maintained relatively strong root growth as salinity increased, and so may show better salt tolerance in practice than predicted from the growth data alone.
The 12 hybrid Bermudagrasses and the single African Bermudagrass genotype were all ranked above FloraTeX™ Bermudagrass in terms of salt tolerance. However, seashore paspalum, which is widely acknowledged as a halophytic species with high salt tolerance, ranked well above all 14 Cynodon genotypes.
Abstract:
Temperate species and tropical crop silage are the basis of forage production for the dairy industry in the Australian subtropics. Irrigation is the key resource needed for production, with little survival of temperate species under rain-grown conditions except for lucerne. Annual ryegrass (Lolium multiflorum), fertilised with inorganic nitrogen or grown with clovers, is the main cool-season forage for the dairy industry. It is sown into fully prepared seedbeds, oversown into tropical grasses, especially kikuyu (Pennisetum clandestinum), or sown after mulching. There has been continual improvement in the performance of annual and hybrid ryegrass cultivars over the last 25 years. In small-plot cutting experiments, yields of annual ryegrass typically range from 15 to 21 t DM/ha, with equivalent on-farm yields of 7 to 14 t DM/ha of utilised material. Rust (Puccinia coronata) remains the major concern, although resistance is more stable than in oats. There have also been major improvements in the performance of perennial ryegrass (L. perenne) cultivars, although their persistence under grazing is insufficient to make them a reliable forage source for the subtropics. On the other hand, tall fescue (Festuca arundinacea) and prairie grass (Bromus willdenowii) cultivars perform well under cutting and grazing, although farmer resistance to the use of tall fescue is strong. White clover (Trifolium repens) is a reliable and persistent performer, although disease usually reduces its performance in the third year after sowing. Persian (shaftal) annual clover (T. resupinatum) gives good winter production, but the performance of berseem clover (T. alexandrinum) is less reliable, and the sub clovers (T. subterraneum) are generally not suited to clay soils of neutral to alkaline pH. Lucerne (Medicago sativa), either as a pure stand or in mixtures, is a high-producing legume under both irrigation and natural rainfall.
Understanding the importance of leaf and crown diseases, and the development of resistant cultivars, have been the reasons for its reliability. Insects on temperate species are not as serious a problem in the subtropics as in New Zealand (NZ). Fungal and viral diseases, on the other hand, cause many problems and forage performance would benefit from more research into resistance.
Abstract:
A 2000-03 study to improve the irrigation efficiency of grassed urban public areas in northern Australia found it would be difficult to grow most species in dry areas without supplementary watering. Sporobolus virginicus and sand couch (Zoysia macrantha) were relatively drought-tolerant. Managers of sporting fields, parks and gardens could more than halve their current water use by irrigating over a longer cycle, irrigating according to seasonal conditions, using grasses with low water use, and adopting sound soil management practices that encourage deep rooting. The use of effluent water provides irrigation and fertiliser cost savings and reduces nitrogen and phosphorus discharge to local waterways. Projected savings are $8000/ha.year in water costs for a typical sporting field.
Abstract:
Uropathogenic Escherichia coli is the primary cause of urinary tract infections, which affects over 60% of women during their lifetime. UPEC exhibits a number of virulence traits that facilitate colonization of the bladder, including inhibition of cytokine production by bladder epithelial cells. The goal of this study was to identify the mechanism of this inhibition. We observed that cytokine suppression was associated with rapid cytotoxicity toward epithelial cells. We found that cytotoxicity, cytokine suppression and alpha-hemolysin production were all tightly linked in clinical isolates. We screened a UPEC fosmid library and identified clones that gained the cytotoxicity and cytokine-suppression phenotypes. Both clones contained fosmids encoding a PAI II(J96)-like domain and expressed the alpha-hemolysin (hlyA) encoded therein. Mutation of the fosmid-encoded hly operon abolished cytotoxicity and cytokine suppression. Similarly, mutation of the chromosomal hlyCABD operon of UPEC isolate F11 also abolished these phenotypes, and they could be restored by introducing the PAI II(J96)-like domain-encoding fosmid. We also examined the role of alpha-hemolysin in cytokine production both in the murine UTI model as well as patient specimens. We conclude that E. coli utilizes alpha-hemolysin to inhibit epithelial cytokine production in vitro. Its contribution to inflammation during infection requires further study.