77 results for Irrigated bean
Abstract:
We present a participatory modelling framework that integrates information from interviews and discussions with farmers and consultants with dynamic bio-economic models, to answer complex questions on the allocation of limited resources at the farm business level. Interviews and discussions with farmers were used to: describe the farm business; identify relevant research questions; identify potential solutions; and discuss and learn from the whole-farm simulations. The simulations were done using a whole-farm, multi-field configuration of APSIM (APSFarm). APSFarm results were validated against farmers' experience. Once the model was accepted by the participating farmers as a fair representation of their farm business, it was used to explore changes in the tactical or strategic management of the farm, and results were then discussed to identify feasible options for improvement. Here we describe the modelling framework and present an example of the application of integrative whole-farm system tools to answer relevant questions from an irrigated farm business case study near Dalby (151.27°E, 27.17°S), Queensland, Australia. Results indicated that even though cotton generates more farm income per hectare, a more diversified rotation with less cotton would be more profitable, with no increase in risk, than a more cotton-dominated traditional rotation. Results are discussed in terms of the benefits and constraints of developing and applying more integrative approaches to represent farm businesses and their management in participatory research projects, with the aim of designing more profitable and sustainable irrigated farming systems.
Abstract:
The aim of this review is to report changes in irrigated cotton water use arising from research projects and on-farm practice-change programs in Australia, from both plant-based and irrigation-engineering perspectives. At least 80% of the Australian cotton-growing area is irrigated using gravity surface-irrigation systems. This review found that, over 23 years, cotton crops have used 6–7 ML/ha of irrigation water, depending on the amount of seasonal rain received. The seasonal evapotranspiration of surface-irrigated crops averaged 729 mm over this period. Over the past decade, water-use productivity by Australian cotton growers has improved by 40%. This has been achieved through both yield increases and more efficient water-management systems. The whole-farm irrigation efficiency index improved from 57% to 70%, and the crop water use index is >3 kg/mm.ha, high by international standards. Yield increases over the last decade can be attributed to plant-breeding advances, the adoption of genetically modified varieties, and improved crop management. There has also been increased use of irrigation-scheduling tools and furrow-irrigation system optimisation evaluations, which has reduced in-field deep-drainage losses. The largest loss component of the farm water balance on cotton farms is evaporation from on-farm water storages. Some farmers are changing to alternative systems such as centre pivots and lateral-move machines, and increasing adoption of these alternatives is expected. These systems can achieve considerable labour and water savings, but have significantly higher energy costs associated with water pumping and machine operation. The optimisation of interactions between water, soils, labour, carbon emissions and energy efficiency requires more research and on-farm evaluations. Standardisation of water-use efficiency measures and improved water-measurement techniques for surface irrigation are important research outcomes needed to enable valid irrigation benchmarks to be established and compared. Water-use performance is highly variable between cotton farmers, between fields and across regions, so site-specific measurement is important. The range in the presented datasets indicates potential for further improvement in water-use efficiency and productivity on Australian cotton farms.
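To make the two headline indices concrete, here is a minimal worked sketch. The formulas are the standard definitions (lint produced per mm of evapotranspiration, and crop water use as a share of water diverted onto the farm); the input figures are illustrative assumptions chosen to reproduce the review's benchmark values, not data from the review itself.

```python
# Hypothetical worked example of the two water-use indices cited above.
# Input figures are assumptions, not data from the review.

def crop_water_use_index(lint_yield_kg_ha: float, et_mm: float) -> float:
    """Lint produced per mm of seasonal evapotranspiration (kg/mm.ha)."""
    return lint_yield_kg_ha / et_mm

def whole_farm_irrigation_efficiency(crop_use_ml_ha: float,
                                     diverted_ml_ha: float) -> float:
    """Percentage of water diverted onto the farm that the crop actually uses."""
    return 100.0 * crop_use_ml_ha / diverted_ml_ha

# Assumed: 2200 kg lint/ha over 729 mm seasonal ET, and 7 ML/ha diverted
# of which 4.9 ML/ha is used by the crop.
print(round(crop_water_use_index(2200, 729), 2))             # ~3.0 kg/mm.ha
print(round(whole_farm_irrigation_efficiency(4.9, 7.0), 1))  # 70.0%
```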
Abstract:
In irrigated cropping, as in any other industry, profit and risk are interdependent. An increase in profit would normally coincide with an increase in risk, which means that risk can be traded for profit. It is desirable to manage a farm so that it achieves the maximum possible profit for the desired level of risk. This paper identifies risk-efficient cropping strategies that allocate land and water between crop enterprises for a case study of an irrigated farm in southern Queensland, Australia. This is achieved by applying stochastic frontier analysis to the output of a simulation experiment. The simulation experiment varied the level of business risk by systematically changing the crop sowing rules in a bioeconomic model of the case-study farm. This model utilises the multi-field capability of the process-based Agricultural Production Systems Simulator (APSIM) and is parameterised using data collected from interviews with a collaborating farmer. We found that sowing rules that increased the farm area sown to cotton caused the greatest increase in risk-efficiency. Increasing the maize area also improved risk-efficiency, but to a lesser extent than cotton. Sowing rules that increased the area sown to wheat reduced the risk-efficiency of the farm business. Sowing rules were identified that had the potential to improve the expected farm profit by ca. $50,000 annually without significantly increasing risk. The concept of the shadow price of risk is discussed, and an expression is derived from the estimated frontier equation that quantifies the trade-off between profit and risk.
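The abstract does not state the frontier's functional form, so the following is a generic sketch only. If the estimated frontier relates expected profit $E(\pi)$ to risk $\sigma$ (e.g. the standard deviation of profit) through a quadratic form, the shadow price of risk is the slope of that frontier:

$$
E(\pi) = \beta_0 + \beta_1 \sigma + \beta_2 \sigma^2,
\qquad
\frac{\partial E(\pi)}{\partial \sigma} = \beta_1 + 2\beta_2 \sigma .
$$

Read this way, the shadow price gives the extra expected profit obtainable per unit of additional risk accepted; the coefficients $\beta_0$, $\beta_1$, $\beta_2$ are placeholders here, not the paper's estimates.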
Abstract:
With the aim of increasing peanut production in Australia, the Australian peanut industry has recently considered growing peanuts in rotation with maize at Katherine in the Northern Territory, a location with a semi-arid tropical climate and surplus irrigation capacity. We used the well-validated APSIM model to examine the potential agronomic benefits and long-term risks of this strategy under the current and warmer climates of the new region. Yield of the two crops, irrigation requirement, total soil organic carbon (SOC), nitrogen (N) losses and greenhouse gas (GHG) emissions were simulated. Sixteen climate stressors were used; these were generated using the global climate models ECHAM5, GFDL2.1, GFDL2.0 and MRIGCM232 with a median sensitivity under two Special Report on Emissions Scenarios over the 2030 and 2050 timeframes, plus the current climate (baseline) for Katherine. Effects were compared at three levels of irrigation and three levels of N fertiliser applied to maize grown in rotations of wet-season peanut and dry-season maize (WPDM), and wet-season maize and dry-season peanut (WMDP). The climate stressors projected average temperature increases of 1°C to 2.8°C in the dry (baseline 24.4°C) and wet (baseline 29.5°C) seasons for the 2030 and 2050 timeframes, respectively. Increased temperature caused a reduction in yield of both crops in both rotations. However, the overall yield advantage of WPDM increased from 41% to up to 53% compared with the industry-preferred sequence of WMDP under the worst climate projection. Increased temperature increased the irrigation requirement by up to 11% in WPDM, but caused a smaller reduction in total SOC accumulation and smaller increases in N losses and GHG emissions compared with WMDP. We conclude that although increased temperature will reduce productivity and total SOC accumulation, and increase N losses and GHG emissions, in Katherine or similar northern Australian environments the WPDM sequence should be preferred over the industry-favoured sequence because of its overall yield and sustainability advantages in warmer climates. Any constraints on irrigation water resulting from climate change could, however, reduce these advantages.
Abstract:
We present here the complete genome sequences of a novel polerovirus from Trifolium subterraneum (subterranean clover) and Cicer arietinum (chickpea) and compare these to a partial viral genome sequence obtained from Macroptilium lathyroides (phasey bean). We propose the name phasey bean mild yellows virus for this novel polerovirus.
Abstract:
As in all high-yielding farming systems, nitrogen (N) is a key component of productivity and profitability, and Australian irrigated cotton growers tend to apply more N than is required for the level of lint yield being achieved. This suggests either over-application of N or inefficient systems limiting the response of cotton to N inputs. To investigate this, four replicated trials were established in commercial fields during the 2014/15 season. The trials aimed to measure the difference in response of irrigated cotton to N application under flood and overhead irrigation systems. The treatments comprised eight upfront rates of applied N, ranging from 0 kg N/ha to a maximum of 410 kg N/ha, with three of the four trials also receiving a grower-determined in-crop application of N in the irrigation water. The two flood-irrigated systems had lower lint yields from similar levels of N input compared with one of the overhead-irrigated sites; the result from the second overhead site was affected by disease. This paper discusses the response of plant N uptake, lint yield and fertilizer N recovery to N application.
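Fertilizer N recovery in trials of this kind is often estimated by the difference method; the sketch below shows that standard formula with illustrative numbers. The abstract does not state which estimator the trials used, so treat this as an assumption.

```python
# Difference-method apparent fertilizer N recovery: a standard agronomic
# formula, assumed here for illustration; the paper's method may differ.

def apparent_n_recovery(uptake_fertilised: float, uptake_zero_n: float,
                        n_applied: float) -> float:
    """Percentage of applied N recovered in the crop (all inputs in kg N/ha)."""
    return 100.0 * (uptake_fertilised - uptake_zero_n) / n_applied

# Illustrative values only: 180 kg N/ha crop uptake at 200 kg N/ha applied,
# versus 120 kg N/ha uptake in the unfertilised (0 N) control.
print(apparent_n_recovery(180.0, 120.0, 200.0))  # 30.0%
```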
Abstract:
A new foliar disease was observed on baby lima bean (Phaseolus lunatus) in fields across western New York State, USA. The disease occurred in 10 fields with variable incidence and severity. Symptoms were initially necrotic, tan spots on leaves with red to reddish-brown irregular margins that coalesced to encompass the entire leaf and cause abscission. Pycnidia were observed within the lesions. Isolations from diseased leaves yielded several pycnidia-forming fungi, including a Didymella species. These isolates were characterized by morphology and by sequencing of multiple reference genes (internal transcribed spacer (ITS), partial actin, β-tubulin (tub2), translation elongation factor 1-α (TEF), 28S rDNA large subunit (LSU), rpb2, and calmodulin). A four-gene phylogeny (ITS, tub2, LSU, and rpb2) showed that the isolates from baby lima bean belonged to a well-supported clade containing the type culture of Didymella americana. Pathogenicity of the isolates on three commonly grown cultivars of baby lima bean was confirmed. Symptoms that developed on inoculated plants were similar to those observed on diseased plants in the field. This is the first report of D. americana on baby lima bean.
Abstract:
The hypothesis that contaminant plants growing amongst chickpea serve as Helicoverpa sinks by diverting oviposition pressure away from the main crop was tested under field conditions. Gain (recruitment) and loss (presumed mortality) of juvenile stages of Helicoverpa spp. on contaminant faba bean and wheat plants growing in chickpea plots were quantified daily over a 12-d period. The possibility of post-eclosion movement of larvae from the contaminants to the surrounding chickpea crop was examined. Estimated total loss of the census population varied from 80 to 84% across plots and rows. The loss of brown eggs (40–47%) contributed most to the overall loss estimate, followed by loss of white eggs (27–35%) and larvae (6–9%). The cumulative numbers of individuals entering the white-egg, brown-egg and larval stages over the census period ranged from 15 to 58, 10 to 48 and 1 to 6 per m row, respectively. The corresponding estimates of mean stage-specific loss, expressed as a percentage of individuals entering the stage, ranged from 52 to 57% for white eggs, 87 to 108% for brown eggs and 71 to 87% for first-instar larvae. Mean larval density on chickpea plants in close proximity to the contaminant plants did not exceed the baseline larval density on chickpea further away from the contaminants across rows and plots. The results support the hypothesis that contaminant plants in chickpea plots serve as Helicoverpa sinks by diverting egg pressure from the main crop and elevating mortality of juvenile stages. Deliberate contamination of chickpea crops with other plant species merits further investigation as a cultural pest-management strategy for Helicoverpa spp.
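Below is a minimal sketch of the stage-specific loss calculation underlying those percentages, using assumed recruitment figures (the abstract reports only per-stage ranges). Because recruitment into each stage is itself estimated, apparent losses can exceed 100%, as the 87–108% brown-egg figure shows.

```python
# Stage-specific loss as a percentage of individuals entering the stage.
# Recruitment numbers below are assumptions for illustration, not the
# study's data (the abstract gives only per-stage ranges).

def stage_loss_pct(entering: float, surviving_to_next_stage: float) -> float:
    """Loss within a stage, relative to recruitment into that stage (%)."""
    return 100.0 * (entering - surviving_to_next_stage) / entering

# Assumed recruitment per metre of row: 40 white eggs, of which 18 reach
# the brown-egg stage and 4 reach the first larval instar.
white_eggs, brown_eggs, larvae = 40, 18, 4
print(stage_loss_pct(white_eggs, brown_eggs))  # 55.0% white-egg loss
print(stage_loss_pct(brown_eggs, larvae))      # ~77.8% brown-egg loss
```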
Abstract:
Land application of piggery effluent (containing urine, faeces, water, and wasted feed) is under close scrutiny as a potential source of water-resource contamination with phosphorus (P). This paper investigates two case studies of the impact of long-term piggery effluent-P application to soil. A Natrustalf (Sodosol) at site P1 has received a net load of 3700 kg effluent P/ha over 19 years. The Haplustalf (Dermosol) selected (P2) has received a net load of 310 000 kg P/ha over 30 years. Total, bicarbonate-extractable, and soluble P forms were determined throughout the soil profiles for paired (irrigated and unirrigated) sites at P1 and P2, as were P sorption and desorption characteristics. Surface bicarbonate-extractable P (PB, 0–0.05 m depth) and dilute CaCl2-extractable molybdate-reactive P (PC) were significantly elevated by effluent irrigation (P1: PB unirrigated 23±1, irrigated 290±6; PC unirrigated 0.03±0.00, irrigated 23.9±0.2. P2: PB unirrigated 72±48, irrigated 3950±1960; PC unirrigated 0.7±0.0, irrigated 443±287 mg P/kg; mean±s.d.). Phosphorus enrichment to 1.5 m depth, detected as PB, was observed at P2. Elevated concentrations of CaCl2-extractable organic P forms (POC; estimated as non-molybdate-reactive P in centrifuged supernatants) were observed from the soil surface of P1 to a depth of 0.4 m. Despite the extent of effluent application at both sites, only P1 displayed evidence of significant accumulation of POC. The increase in surface-soil total P (0–0.05 m depth) due to effluent irrigation was much greater than laboratory P sorption (>25 times for P1; >57 times for P2) for a comparable range of final solution concentrations (desorption extracts ranged from 1–5 mg P/L for P1 and 50–80 mg P/L for P2). There was evidence of precipitation of sparingly soluble P phases in the soils of the P2 effluent application area.
Abstract:
Mounting levels of insecticide resistance within Australian Helicoverpa spp. populations have resulted in the adoption of non-chemical IPM control practices such as trap cropping with chickpea, Cicer arietinum (L.). However, a new leaf blight disease affecting chickpea in Australia has the potential to limit its use as a trap crop. This paper therefore evaluates a variety of winter-active legume crops for use as an alternative spring trap crop to chickpea, as part of an effort to improve the area-wide management strategy for Helicoverpa spp. in central Queensland's cotton production region. The densities of Helicoverpa eggs and larvae were compared over three seasons on replicated plantings of chickpea; field pea, Pisum sativum (L.); vetch, Vicia sativa (L.); and faba bean, Vicia faba (L.). Of these treatments, field pea harboured the highest densities of eggs. A partial life-table study of the fate of eggs oviposited on field pea and chickpea suggested that a large proportion of the eggs laid on field pea suffered mortality through dislodgment from the plants after oviposition. Plantings of field pea as a replacement trap crop for chickpea under commercial conditions confirmed the high attractiveness of this crop to ovipositing moths. The use of field pea as a trap crop within an area-wide management programme for Helicoverpa spp. is discussed.
Abstract:
Wheat is one of the major food crops in the world. It is Australia's largest crop and most important agricultural commodity. In Australia the crop is grown under rainfed conditions with inherently important regional environmental differences; wheat-growing areas are characterized by winter-dominant rainfall in southern and western Australia and summer rainfall in northern Australia. Maximizing yield potential across these diverse regions depends upon managing, either genetically or agronomically, those factors in the environment that limit yield. The potential of synthetic backcross lines (SBLs) to increase yield in the diverse agroecological zones of Australia was investigated. Significant yield advantages were found for many of the SBLs across diverse environments. Depending on the environment, the yield of the SBLs ranged from 8% to 30% higher than the best local check in Australia. Apart from adaptation to semi-arid, water-stressed conditions, some SBLs were also found to be significantly higher yielding under more optimal (irrigated) conditions. The four testing environments were classified into two groups, with the northern and southern environments falling in separate groups. An elite group of SBLs was identified that exhibited broad adaptation across all the diverse Australian environments included in this study. Other SBLs showed specific adaptation to either northern or southern Australia. This study showed that SBLs are likely to provide breeders with the opportunity to improve wheat yield significantly beyond what was previously possible in a number of diverse production environments.
Abstract:
In this study, 120–144 commercial varieties and breeding lines were assessed for grain-size attributes, including plump grain (>2.8 mm) and retention (>2.5 mm + >2.8 mm fractions). Grain samples were produced from replicated trials at 25 sites across four years. Climatic conditions varied between years as well as between sites. Several of the trial sites were irrigated, while the remainder were grown under dryland conditions; a number of the dryland sites suffered severe drought stress. The grain-size data were analysed for genetic (G), environmental (E) and genotype-by-environment (G×E) interaction effects. All analyses included maturity as a covariate. The genetic effect on grain size was greater than the environmental or maturity effects, despite some sites suffering terminal moisture stress. The model was used to calculate heritability values for each site used in the study; these ranged from 89 to 98% for plump grain and 88 to 96% for retention. The results demonstrated that removing sources of non-heritable variation, such as maturity and field effects, can improve genetic estimates of the retention and plump-grain fractions. By partitioning all variance components, and thereby obtaining more robust estimates of genetic differences, plant breeders can have greater confidence in selecting barley genotypes which maintain large, stable grain size across a range of environments.
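The abstract does not give the heritability estimator used; on an entry-mean basis from replicated trials, broad-sense heritability at a site is conventionally written as

$$
H^2 = \frac{\sigma_g^2}{\sigma_g^2 + \sigma_e^2 / r},
$$

where $\sigma_g^2$ is the genetic variance, $\sigma_e^2$ the residual variance and $r$ the number of replicates. Removing nuisance variation such as maturity and field effects shrinks $\sigma_e^2$, which is consistent with the high 88–98% values reported; this formula is a standard sketch, not necessarily the exact estimator in the paper.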
Abstract:
This study reports on the use of naturally occurring F-specific coliphages, as well as spiked MS-2 phage, to evaluate a land-based effluent treatment/reuse system and an effluent irrigation scheme. Both the natural phages and the spiked MS-2 phage indicated that the effluent treatment/reuse system (FILTER: Filtration and Irrigated cropping for Land Treatment and Effluent Reuse) achieved a reduction in phage levels over the treatment system of one to two log10. FILTER reduced natural F-specific phage numbers from around 10^3 to below 10^2 per 100 mL and the spiked phage from 10^5 to around 10^4 per 100 mL (incoming compared with outgoing water). In the effluent irrigation scheme, phage spiked into the holding ponds dropped from 10^6 to 10^2 per 100 mL after 168 h (with no detectable levels of natural F-specific phage found prior to spiking). Only low levels of the spiked phage (10^2 per g) could be recovered from soil irrigated with phage-spiked effluent (at 10^6 phage per 100 mL), or from fruits (around 10^2 phage per fruit) that had direct contact with soil freshly irrigated with the same phage-spiked effluent.
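A quick sketch of the log10-reduction arithmetic behind those figures, using the counts quoted in the abstract:

```python
import math

# Log10 reduction between incoming and outgoing phage counts; this is
# the metric behind the "one to two log10" reduction quoted above.

def log10_reduction(count_in: float, count_out: float) -> float:
    return math.log10(count_in) - math.log10(count_out)

# FILTER system, spiked MS-2 phage: 10^5 in, ~10^4 out (per 100 mL).
print(log10_reduction(1e5, 1e4))  # 1.0 log10
# Holding-pond die-off: 10^6 down to 10^2 per 100 mL after 168 h.
print(log10_reduction(1e6, 1e2))  # 4.0 log10
```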
Abstract:
The parasitic weed Orobanche crenata inflicts major damage on faba bean, lentil, pea and other crops in Mediterranean environments. The development of methods to control O. crenata is to a large extent hampered by the complexity of host-parasite systems. Using a model of host-parasite interactions can help to explain and understand this intricacy. This paper reports on the evaluation and application of a model simulating host-parasite competition, as affected by environment and management, that was implemented in the framework of the Agricultural Production Systems Simulator (APSIM). Model-predicted faba bean and O. crenata growth and development were evaluated against independent data. The APSIM-Fababean and APSIM-Parasite modules displayed a good capability to reproduce the effects of pedoclimatic conditions, faba bean sowing date and O. crenata infestation on host-parasite competition. The r^2 values throughout exceeded 0.84 (RMSD: 5.36 days) for phenological parameters, 0.85 (RMSD: 223.00 g m^-2) for host growth and 0.78 (RMSD: 99.82 g m^-2) for parasite growth parameters. Inaccuracies in simulated faba bean root growth that caused some bias in predicted parasite number and host yield loss may be dealt with by simulating vertical root distribution more flexibly. The model was applied in simulation experiments to determine optimum sowing windows for infected and non-infected faba bean in Mediterranean environments. Simulation results proved realistic and testified to the capability of APSIM to contribute to the development of tactical approaches to parasitic weed control.
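For reference, a minimal sketch of how the two goodness-of-fit statistics quoted above are conventionally computed, on illustrative data only. Note that published r^2 values are sometimes the squared Pearson correlation rather than the coefficient of determination, and the abstract does not say which was used here.

```python
import numpy as np

# Conventional model-evaluation statistics: coefficient of determination
# (r^2) and root mean squared deviation (RMSD) of predictions vs observations.

def r2_and_rmsd(observed: np.ndarray, predicted: np.ndarray):
    ss_res = np.sum((observed - predicted) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    rmsd = float(np.sqrt(np.mean((observed - predicted) ** 2)))
    return r2, rmsd

# Illustrative host-biomass observations and model predictions (g m^-2);
# these numbers are assumptions, not the paper's data.
obs = np.array([310.0, 540.0, 820.0, 1100.0])
pred = np.array([290.0, 600.0, 790.0, 1180.0])
print(r2_and_rmsd(obs, pred))
```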