13 results for Tool for water management
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Benchmarking irrigation performance has always been a challenge. As part of the Rural Water Use Efficiency (RWUE3) project, the team, in collaboration with the Cotton Catchments Communities CRC, National Certificate of Educational Achievement and the Knowledge Management Phase 2 project, aimed to standardise the irrigation indices in use within the cotton industry. This was achieved through:
- the delivery of training workshops
- access to benchmarking tools
- promotion of benchmarking as a best practice to be adopted on farm.
Abstract:
The shelf life of mangoes is limited by two main postharvest diseases when they are not consistently managed: anthracnose (Colletotrichum gloeosporioides) and stem end rots (SER) (Fusicoccum parvum). Management of these diseases has relied mainly on fungicides, applied either as field sprays or as postharvest dips. These have served the industry fairly well, allowing fruit to be transported, stored and sold at markets distant from the areas of production. There are, however, concerns about the continued use of these fungicides as the main or only tool for managing these diseases. This has prompted a re-think of how these diseases could be sustainably managed into the future using a systems approach focused on integrated crop management. It is a holistic approach that considers all crop protection strategies, including the genetics of the plant and its ability to defend itself naturally from infection, supported by plant activators and growth regulators. It also considers other cultural or agronomic management tools, such as crop nutrition, timely application of irrigation water and regular pruning of trees to reduce inoculum levels in orchards. The ultimate aim of this approach is to increase yields and achieve long-term sustainable production. It is guided by the sustainable crop production principle, which states that producers should apply as little input as possible, but as much as needed.
Abstract:
Quantifying the local crop response to irrigation is important for establishing adequate irrigation management strategies. This study evaluated the effect of irrigation applied with subsurface drip irrigation on field corn (Zea mays L.) evapotranspiration (ETc), yield, water use efficiencies (WUE = yield/ETc, and IWUE = yield/irrigation), and dry matter production in the semiarid climate of west central Nebraska. Eight treatments were imposed, with irrigation amounts ranging from 53 to 356 mm in 2005 and from 22 to 226 mm in 2006. A soil water balance approach (based on FAO-56) was used to estimate daily soil water and ETc. Treatments resulted in seasonal ETc of 580-663 mm and 466-656 mm in 2005 and 2006, respectively. Yields among treatments differed by as much as 22% in 2005 and 52% in 2006. In both seasons, irrigation significantly affected yields, which increased with irrigation up to the point where irrigation became excessive. Distinct relationships were obtained each season. Yields increased linearly with seasonal ETc (R² = 0.89) and with ETc/ETp (R² = 0.87) (ETp = ETc with no water stress). The yield response factor (ky), which indicates the relative reduction in yield for a given relative reduction in ETc, averaged 1.58 over the two seasons. WUE increased non-linearly with seasonal ETc and with yield. WUE was more sensitive to irrigation during the drier 2006 season than in 2005. In both seasons, IWUE decreased sharply with irrigation. Irrigation significantly affected dry matter production and its partitioning into the different plant components (grain, cob, and stover). On average, the grain accounted for the majority of the above-ground plant dry mass (≈59%), followed by the stover (≈33%) and the cob (≈8%). The dry mass of the plant and of each plant component tended to increase with seasonal ETc. The strong relationships obtained in this study between crop performance indicators and seasonal ETc demonstrate that accurate estimates of ETc on a daily and seasonal basis can be valuable for making tactical in-season irrigation management decisions and for strategic irrigation planning and management.
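The abstract defines all three indices explicitly (WUE = yield/ETc, IWUE = yield/irrigation, and ky as the relative yield reduction per relative ETc reduction); a minimal Python sketch of those definitions, using placeholder numbers rather than the study's data:

```python
# Minimal sketch of the irrigation indices as defined in the abstract.
# All input values below are illustrative placeholders, not the study's data.

def wue(yield_kg_ha: float, etc_mm: float) -> float:
    """Water use efficiency: yield per unit of seasonal crop ET (WUE = yield/ETc)."""
    return yield_kg_ha / etc_mm

def iwue(yield_kg_ha: float, irrigation_mm: float) -> float:
    """Irrigation water use efficiency: yield per unit of irrigation applied."""
    return yield_kg_ha / irrigation_mm

def yield_response_factor(ya: float, yp: float, eta: float, etp: float) -> float:
    """ky: relative yield reduction divided by relative ETc reduction,
    i.e. the FAO-style relation 1 - Ya/Yp = ky * (1 - ETa/ETp)."""
    return (1.0 - ya / yp) / (1.0 - eta / etp)

# Placeholder example values
print(wue(12000.0, 620.0))    # kg grain per ha per mm of ETc
print(iwue(12000.0, 200.0))   # kg grain per ha per mm of irrigation
print(yield_response_factor(10000.0, 12500.0, 520.0, 640.0))
```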
Abstract:
The main weeds and weed management practices undertaken in broadacre dryland cropping areas of north-eastern Australia have been identified. The information was collected in a comprehensive postal survey of both growers and agronomists, from Dubbo in New South Wales (NSW) through to Clermont in central Queensland, in which 237 surveys were returned. A very diverse weed flora of 105 weeds from 91 genera was identified for the three cropping zones within the region (central Queensland, southern Queensland and northern NSW). Twenty-three weeds were common to all cropping zones. The major common weeds were Sonchus oleraceus, Rapistrum rugosum, Echinochloa spp. and Urochloa panicoides. The main weeds were identified for both summer and winter fallows, and for sorghum, wheat and chickpea crops in each of the zones, with both shared and zone-specific flora recorded. More genera were recorded in fallows than in crops, and more in summer fallows than in winter fallows. Across the region, weed management relied heavily on herbicides. In fallows, glyphosate and glyphosate mixes were very common, although the importance of the glyphosate mix partner differed among the cropping zones. The use and importance of pre-emergence herbicides in-crop varied considerably among the zones. In wheat, more graminicides were used in northern NSW than in southern Queensland, and virtually none were used in central Queensland, reflecting the differences in winter grass weed flora across the region. Atrazine was the major herbicide used in sorghum, although metolachlor was also used, predominantly in northern NSW. Fallow and inter-row cultivation were used more often in the southern areas of the region. Grazing of fallows was more prominent in northern NSW. High crop seeding rates were not commonly recorded, indicating that growers are not using crop competition as a tool for weed management. Although many management practices were recorded overall, few growers were using integrated weed management, and herbicide resistance has been, and continues to be, an issue for the region.
Abstract:
Water regulations have decreased irrigation water supplies in Nebraska and some other areas of the US Great Plains. When the available water is not enough to meet crop water requirements over the entire growing cycle, it becomes critical to know the irrigation timing that will maximise yields and profits. This study evaluated the effect of the timing of a deficit-irrigation allocation (150 mm) on crop evapotranspiration (ETc), yield, water use efficiency (WUE = yield/ETc), irrigation water use efficiency (IWUE = yield/irrigation), and dry mass (DM) of corn (Zea mays L.) irrigated with subsurface drip irrigation in the semiarid climate of North Platte, NE. During 2005 and 2006, a total of sixteen irrigation treatments (eight each year) were evaluated, which received different percentages of the water allocation during July, August, and September. In both years, all treatments resulted in no crop stress during the vegetative period but stress during the reproductive stages, which affected ETc, DM, yield, WUE and IWUE. Among treatments, ETc varied by 7.2 and 18.8%, yield by 17 and 33%, WUE by 12 and 22%, and IWUE by 18 and 33% in 2005 and 2006, respectively. Yield and WUE both increased linearly with ETc and with ETc/ETp (ETp = seasonal ETc with no water stress), and WUE increased linearly with yield. The yield response factor (ky) averaged 1.50 over the two seasons. Irrigation timing affected the DM of the whole plant, grain, and cob, but not that of the stover. It also affected the percentage of DM partitioned to the grain (harvest index), which increased linearly with ETc and averaged 56.2% over the two seasons, but it did not affect the percentage allocated to the cob or stover. Irrigation applied in July had the highest positive coefficient of determination (R²) with yield. This positive correlation decreased considerably for irrigation applied in August, and became negative for irrigation applied in September. The strongest positive correlation between the soil water deficit factor (Ks) and yield occurred during weeks 12-14 from crop emergence, during the "milk" and "dough" growth stages. Yield was poorly correlated with stress during weeks 15 and 16, and the correlation became negative after week 17. Dividing the 150 mm allocation about evenly among July, August and September was a good strategy, producing the highest yields in 2005, but not in 2006. Applying a larger proportion of the allocation in July was a good strategy in both years, and the opposite was true for applying a large proportion of the allocation in September. The different results obtained between years indicate that flexible irrigation scheduling techniques should be adopted, rather than relying on fixed timing strategies.
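Given the two-season average ky of 1.50 reported here, the standard FAO-style yield-response relation 1 − Ya/Yp = ky(1 − ETa/ETp) can be used to translate an ETc deficit into an expected relative yield loss. A short sketch under that assumption (the ET values are illustrative, not from the study):

```python
# Sketch: relative yield loss predicted from an ETc deficit using the
# FAO yield-response relation 1 - Ya/Yp = ky * (1 - ETa/ETp).
# ky = 1.50 is the two-season average reported in the abstract;
# the ET values below are illustrative placeholders.

KY = 1.50  # yield response factor (two-season average from the abstract)

def relative_yield_loss(eta_mm: float, etp_mm: float, ky: float = KY) -> float:
    """Fractional yield loss for a seasonal ETc under stress (ETa)
    relative to stress-free seasonal ETc (ETp)."""
    return ky * (1.0 - eta_mm / etp_mm)

# e.g. a 10% ETc deficit implies roughly a 15% yield loss when ky = 1.5
print(relative_yield_loss(eta_mm=585.0, etp_mm=650.0))  # -> 0.15
```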
Abstract:
Technology demonstration sites for remote water management in the Roma region.
Abstract:
The aim of this review is to report changes in irrigated cotton water use from research projects and on-farm practice-change programs in Australia, in relation to both plant-based and irrigation engineering disciplines. At least 80% of the Australian cotton-growing area is irrigated using gravity surface-irrigation systems. This review found that, over 23 years, cotton crops have used 6-7 ML/ha of irrigation water, depending on the amount of seasonal rain received. The seasonal evapotranspiration of surface-irrigated crops averaged 729 mm over this period. Over the past decade, water-use productivity by Australian cotton growers has improved by 40%. This has been achieved through both yield increases and more efficient water-management systems. The whole-farm irrigation efficiency index improved from 57% to 70%, and the crop water use index is >3 kg/mm·ha, high by international standards. Yield increases over the last decade can be attributed to plant-breeding advances, the adoption of genetically modified varieties, and improved crop management. There has also been increased use of irrigation scheduling tools and furrow-irrigation system optimisation evaluations, which has reduced in-field deep-drainage losses. The largest loss component of the farm water balance on cotton farms is evaporation from on-farm water storages. Some farmers are changing to alternative systems such as centre pivots and lateral-move machines, and increasing numbers of these alternatives are expected. These systems can achieve considerable labour and water savings, but they have significantly higher energy costs associated with water pumping and machine operation. Optimising the interactions between water, soils, labour, carbon emissions and energy efficiency requires more research and on-farm evaluations. Standardisation of water-use efficiency measures and improved water measurement techniques for surface irrigation are important research outcomes needed to enable valid irrigation benchmarks to be established and compared. Water-use performance is highly variable between cotton farmers, between fields and across regions, so site-specific measurement is important. The range in the presented datasets indicates potential for further improvement in water-use efficiency and productivity on Australian cotton farms.
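The review quotes two headline indices: a whole-farm irrigation efficiency of 70% and a crop water use index above 3 kg/mm·ha. A minimal sketch of how such indices are commonly computed, assuming the usual definitions (lint yield per mm of seasonal crop ET, and crop water use as a fraction of farm water diverted); the review's exact formulas are not reproduced here, and the yield figure below is a placeholder:

```python
# Sketch of the water-use indices quoted in the review.
# Definitions are assumed (common usage), not taken verbatim from the review;
# the lint yield value is a placeholder.

def crop_water_use_index(lint_yield_kg_ha: float, seasonal_et_mm: float) -> float:
    """kg of lint per ha per mm of seasonal crop evapotranspiration."""
    return lint_yield_kg_ha / seasonal_et_mm

def whole_farm_irrigation_efficiency(crop_water_use_ml: float,
                                     water_diverted_ml: float) -> float:
    """Fraction of farm water diverted that is used by the crop
    (one plausible reading of the whole-farm index; an assumption)."""
    return crop_water_use_ml / water_diverted_ml

# Placeholder example: 2300 kg lint/ha over 729 mm seasonal ET
# (729 mm is the review's long-term average ETc)
print(crop_water_use_index(2300.0, 729.0))          # ~3.2 kg/mm.ha
print(whole_farm_irrigation_efficiency(7.0, 10.0))  # 0.70, cf. the 70% quoted
```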
Abstract:
Nitrous oxide (N2O) is a potent greenhouse gas and the predominant ozone-depleting substance in the atmosphere. Agricultural nitrogenous fertiliser use is the major source of human-induced N2O emissions. A field experiment was conducted at Bundaberg from October 2012 to September 2014 to examine the impacts of legume crop (soybean) rotation as an alternative nitrogen (N) source on N2O emissions during the fallow period, and to investigate low-emission soybean residue management practices. An automatic monitoring system and manual gas sampling chambers were used to measure greenhouse gas emissions from the soil. Soybean cropping during the fallow period reduced N2O emissions compared to the bare fallow. Based on the N content of the soybean crop residues, the fertiliser N application rate was reduced by about 120 kg N/ha for the subsequent sugarcane crop. Consequently, N2O emissions during the sugarcane cropping season were significantly lower from the soybean-cropped soil than from the conventionally fertilised (145 kg N/ha) soil following bare fallow. However, tillage that incorporated the soybean crop residues into the soil promoted N2O emissions in the first two months. Spraying a nitrification inhibitor (DMPP) onto the soybean crop residues before tillage effectively prevented the N2O emission spikes. Compared to conventional tillage, practising no-till, with or without a nitrogen catch crop, in the period between soybean harvest and cane planting also reduced N2O emissions substantially. These results demonstrate that soybean rotation during the fallow period, followed by N-conservation management practices, could offer a promising N2O mitigation strategy in sugarcane farming. Further investigation is required to provide guidance on N and water management following a soybean fallow to maintain sugar productivity.
Abstract:
Dwindling water supplies for irrigation are prompting alternative management choices by irrigators. Limited irrigation, where less water is applied than full crop demand, may be a viable approach. The application of limited irrigation to corn was examined in this research. Corn was grown in crop rotations under dryland, limited irrigation, or full irrigation management from 1985 to 1999. Crop rotations included corn following corn (continuous corn), corn following wheat followed by soybean (wheat-corn-soybean), and corn following soybean (corn-soybean). Full irrigation was managed to meet crop evapotranspiration requirements (ETc). Limited irrigation was managed with a seasonal target of no more than 150 mm applied. Precipitation patterns influenced the outcomes for the measured parameters. Dryland yields varied the most, while fully irrigated yields varied the least. Limited irrigation yields were 80% to 90% of fully irrigated yields, but the limited irrigation plots received about half the applied water. Grain yields were significantly different among irrigation treatments. Yields were not significantly different among rotation treatments across all years and water treatments. For soil water parameters, more statistical differences were detected among the water management treatments than among the crop rotation treatments. Economic projections of these management practices showed that full irrigation produced the most income when water was available. Limited irrigation increased income significantly compared with dryland management.
Abstract:
Rabbit Haemorrhagic Disease Virus (RHDV) was introduced to Australia in 1995 for the control of wild rabbits. Initial outbreaks greatly reduced rabbit numbers, and the virus has continued to control rabbits to varying degrees in different parts of Australia. However, recent field evidence suggests that the virus may be becoming less effective in areas that have previously experienced repeated epizootics causing high mortality. There are also reports of rabbits returning to pre-1995 density levels. Virus and host can be expected to co-evolve: the host will develop resistance to the virus, with the virus subsequently changing to overcome that resistance. It has been 12 years since the release of RHDV, and it is an opportune time to examine where the dynamic between RHDV and rabbits currently stands. Laboratory challenge tests have indicated that resistance to RHDV has developed to different degrees in populations throughout Australia. In one population, a low dose (1:25 dilution) of the Czech strain of RHDV failed to infect a single susceptible rabbit, yet it infected from low to high percentages (up to 73%) of rabbits in other populations tested. Different selection pressures are present in these populations and will be driving the levels of resistance being seen. The mechanisms and genetics behind the development of resistance are also important, as the ongoing use of RHDV as a control tool in the management of rabbits relies on our understanding of the factors influencing the efficacy of the virus. Understanding how resistance has developed may provide clues on how best to use the virus to circumvent these mechanisms. Similarly, it will help in managing populations that have yet to develop high levels of resistance.
Abstract:
Abstract of Macbeth, G. M., Broderick, D., Buckworth, R. & Ovenden, J. R. (in press, Feb 2013). Linkage disequilibrium estimation of effective population size with immigrants from divergent populations: a case study on Spanish mackerel (Scomberomorus commerson). G3: Genes, Genomes and Genetics. Estimates of genetic effective population size (Ne) using molecular markers are a potentially useful tool for the management of species ranging from endangered to commercially harvested. However, pitfalls are predicted when the effective size is large, as estimates require large numbers of samples from wild populations for statistical validity. Our simulations showed that linkage disequilibrium estimates of Ne up to 10,000 with finite confidence limits can be achieved with sample sizes of around 5000. This was deduced from empirical allele frequencies of seven polymorphic microsatellite loci in a commercially harvested fisheries species, the narrow-barred Spanish mackerel (Scomberomorus commerson). As expected, the smallest standard deviation of Ne estimates occurred when low-frequency alleles were excluded. Additional simulations indicated that the linkage disequilibrium method was sensitive to small numbers of genotypes from cryptic species or conspecific immigrants. A correspondence analysis algorithm was developed to detect and remove outlier genotypes that could have been inadvertently sampled from cryptic species or from non-breeding immigrants from genetically separate populations. Simulations demonstrated the value of this approach with the Spanish mackerel data. When putative immigrants were removed from the empirical data, 95% of the Ne estimates from jackknife resampling were above 24,000.
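The paper estimates Ne from linkage disequilibrium among unlinked loci; a minimal sketch of the classic Hill (1981)-style LD estimator under that framework. The authors' actual implementation (bias corrections, low-frequency-allele filtering, jackknife confidence limits) is not reproduced here:

```python
# Sketch of a classic linkage-disequilibrium Ne estimator (Hill 1981 style).
# For unlinked loci under random mating, E[r^2] ~ 1/S + 1/(3*Ne), where r^2
# is the mean squared allele-frequency correlation across locus pairs and S
# is the sample size. Rearranging gives the point estimator below. This is
# a simplification of the paper's method, not its implementation.

def ld_ne_estimate(mean_r2: float, sample_size: int) -> float:
    """Point estimate of effective population size from mean r^2."""
    adjusted = mean_r2 - 1.0 / sample_size  # remove the sampling component
    if adjusted <= 0:
        return float("inf")  # LD indistinguishable from sampling noise
    return 1.0 / (3.0 * adjusted)

# Placeholder example: with S = 5000, a tiny residual LD still yields a
# finite but very large Ne, illustrating why large Ne demands big samples.
print(ld_ne_estimate(mean_r2=0.000214, sample_size=5000))  # ~23,800
```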
Abstract:
Rabbit haemorrhagic disease is a major tool for the management of introduced wild rabbits in Australia. However, new evidence suggests that rabbits may be developing resistance to the disease. Rabbits sourced from wild populations in central and south-eastern Australia, along with domestic rabbits for comparison, were experimentally challenged with a low oral dose (60 ID50) of commercially available Czech CAPM 351 virus, the original strain released in Australia. Levels of resistance to infection were generally higher than in unselected domestic rabbits and also differed between wild populations (0-73% infection rates). Resistance was lower in populations from cooler, wetter regions and also low in arid regions, with the highest resistance seen in zones of moderate rainfall. These findings suggest that the external influences of non-pathogenic calicivirus in cooler, wetter areas, and of poor recruitment in arid populations, may affect the rate at which resistance develops in Australia.
Abstract:
The root-lesion nematode, Pratylenchus thornei, can reduce wheat yields by >50%. Although this nematode has a broad host range, crop rotation can be an effective tool for its management if the host status of crops and cultivars is known. The summer crops grown in the northern grain region of Australia are poorly characterised for their resistance to P. thornei and for their role in crop sequencing to improve wheat yields. In a 4-year field experiment, we prepared plots with high or low populations of P. thornei by growing susceptible wheat or partially resistant canaryseed (Phalaris canariensis); after an 11-month, weed-free fallow, several cultivars of eight summer crops were grown. Following another 15-month, weed-free fallow, the P. thornei-intolerant wheat cv. Strzelecki was grown. Populations of P. thornei were determined to 150 cm soil depth throughout the experiment. When two partially resistant crops were grown in succession, e.g. canaryseed followed by panicum (Setaria italica), P. thornei populations were <739/kg soil and subsequent wheat yields were 3245 kg/ha. In contrast, after two susceptible crops, e.g. wheat followed by soybean, P. thornei populations were 10,850/kg soil and subsequent wheat yields were just 1383 kg/ha. Regression analysis showed a linear, negative response of wheat biomass and grain yield to increasing P. thornei populations, with a predicted loss of 77% for biomass and 62% for grain yield. The best predictor of wheat yield loss was the P. thornei population at 0-90 cm soil depth. Crop rotation can be used to reduce P. thornei populations and increase wheat yield, with the greatest gains made when two partially resistant crops are grown sequentially.
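The yield decline with increasing P. thornei populations was characterised by linear regression; a small sketch of that kind of fit. The two endpoint pairs echo figures quoted in the abstract, but the intermediate points are invented for illustration, so the fitted line is not the study's regression:

```python
# Sketch of the kind of linear regression used in the study: wheat grain
# yield regressed on P. thornei population. Endpoint pairs echo the
# abstract; intermediate points are invented placeholders.
import numpy as np

pt_population = np.array([739, 2500, 5000, 7500, 10850], dtype=float)  # per kg soil
grain_yield = np.array([3245, 2800, 2300, 1800, 1383], dtype=float)    # kg/ha

slope, intercept = np.polyfit(pt_population, grain_yield, deg=1)
print(f"yield ~ {intercept:.0f} {slope:+.3f} * population")

# Relative yield loss predicted by the fit across the population range
loss = (grain_yield[0] - (intercept + slope * pt_population[-1])) / grain_yield[0]
print(f"predicted loss at highest population: {loss:.0%}")
```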