9 results for Cloud-based systems
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Concepts of agricultural sustainability and possible roles of simulation modelling for characterising sustainability were explored by conducting, and reflecting on, a sustainability assessment of rain-fed wheat-based systems in the Middle East and North Africa region. We designed a goal-oriented, model-based framework using the cropping systems model Agricultural Production Systems sIMulator (APSIM). For the assessment, valid (rather than true or false) sustainability goals and indicators were identified for the target system. System-specific vagueness was depicted in sustainability polygons (a system property derived from highly quantitative data) and denoted using descriptive quantifiers. Diagnostic evaluations of alternative tillage practices demonstrated the utility of the framework to quantify key bio-physical and chemical constraints to sustainability. Here, we argue that sustainability is a vague, emergent system property of often wicked complexity that arises out of more fundamental elements and processes. A 'wicked concept of sustainability' acknowledges the breadth of the human experience of sustainability, which cannot be internalised in a model. To achieve socially desirable sustainability goals, our model-based approach can inform reflective evaluation processes that connect with the needs and values of agricultural decision-makers. Hence, it can help to frame meaningful discussions, from which actions might emerge.
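A sustainability polygon of this kind is typically drawn as a radar chart of normalised indicator scores, and one simple polygon-level property is its enclosed area. The sketch below illustrates that general idea only; it is not the study's own procedure, and the equal-angle layout is an assumption:

```python
import math

def polygon_area(scores):
    """Area of a radar-chart 'sustainability polygon'.

    scores: normalised indicator values (0-1), one per vertex, placed
    at equal angles around the origin. Illustrative only; the study
    itself derives its polygons from simulation output.
    """
    n = len(scores)
    angle = 2 * math.pi / n
    # Sum of triangle areas between adjacent indicator axes
    return 0.5 * math.sin(angle) * sum(
        scores[i] * scores[(i + 1) % n] for i in range(n)
    )
```

For a fixed set of indicators the area grows with the indicator scores, so it can serve as a crude single-number summary of a polygon.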
Abstract:
This paper describes a new knowledge acquisition method using a generic design environment where context-sensitive knowledge is used to build specific DSS for rural business. Although standard knowledge acquisition methods have been applied in rural business applications, uptake remains low and familiar weaknesses such as obsolescence and brittleness apply. We describe a decision support system (DSS) building environment where contextual factors relevant to the end users are directly taken into consideration. This "end user enabled design environment" (EUEDE) engages both domain experts in creating an expert knowledge base and business operators/end users (such as farmers) in using this knowledge for building their specific DSS. We document the knowledge organisation for the problem domain, namely a dairy industry application. This development involved a case-study research approach used to explore dairy operational knowledge. In this system end users can tailor their decision-making requirements using their own judgement to build specific DSSs. In a specific end user's farming context, each specific DSS provides expert suggestions to assist farmers in improving their farming practice. The paper also shows the environment's generic capability.
Abstract:
The in vivo faecal egg count reduction test (FECRT) is the most commonly used test to detect anthelmintic resistance (AR) in gastrointestinal nematodes (GIN) of ruminants in pasture based systems. However, there are several variations on the method, some more appropriate than others in specific circumstances. While in some cases labour and time can be saved by just collecting post-drench faecal worm egg counts (FEC) of treatment groups with controls, or pre- and post-drench FEC of a treatment group with no controls, there are circumstances when pre- and post-drench FEC of an untreated control group as well as from the treatment groups are necessary. Computer simulation techniques were used to determine the most appropriate of several methods for calculating AR when there is continuing larval development during the testing period, as often occurs when anthelmintic treatments against genera of GIN with high biotic potential or high re-infection rates, such as Haemonchus contortus of sheep and Cooperia punctata of cattle, are less than 100% efficacious. Three field FECRT experimental designs were investigated: (I) post-drench FEC of treatment and control groups, (II) pre- and post-drench FEC of a treatment group only and (III) pre- and post-drench FEC of treatment and control groups. To investigate the performance of methods of indicating AR for each of these designs, simulated animal FEC were generated from negative binomial distributions with subsequent sampling from binomial distributions to account for drench effect, with varying parameters for worm burden, larval development and drench resistance. Calculations of percent reductions and confidence limits were based on those of the Standing Committee for Agriculture (SCA) guidelines. For the two field methods with pre-drench FEC, confidence limits were also determined from cumulative inverse Beta distributions of FEC, for eggs per gram (epg) and the number of eggs counted at detection levels of 50 and 25.
Two rules for determining AR were also assessed: (1) %reduction (%R) < 95% and lower confidence limit < 90%; and (2) upper confidence limit < 95%. For each combination of worm burden, larval development and drench resistance parameters, 1000 simulations were run to determine the number of times the theoretical percent reduction fell within the estimated confidence limits and the number of times resistance would have been declared. When continuing larval development occurs during the testing period of the FECRT, the simulations showed that AR should be calculated from pre- and post-drench worm egg counts of an untreated control group as well as from the treatment group. If the widely used resistance rule 1 is used to assess resistance, rule 2 should also be applied, especially when %R is in the range 90-95% and resistance is suspected.
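The simulation scheme can be sketched as follows for design III (pre- and post-drench FEC of treatment and control groups). All parameter values here (group size, mean epg, dispersion, efficacy) are illustrative, and the larval-development and confidence-limit calculations of the full study are omitted:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_fecrt(n_animals=10, mean_epg=300, k=0.7,
                   drench_efficacy=0.90, n_sims=1000, epg_factor=50):
    """Design III FECRT sketch: pre- and post-drench FEC for
    treatment and control groups (hypothetical parameter values)."""
    reductions = []
    for _ in range(n_sims):
        # Pre-drench counts: negative binomial (mean mean_epg, dispersion k),
        # counted at a detection level of epg_factor eggs per gram
        p = k / (k + mean_epg / epg_factor)
        pre_t = rng.negative_binomial(k, p, n_animals) * epg_factor
        pre_c = rng.negative_binomial(k, p, n_animals) * epg_factor
        # Post-drench: binomial thinning of eggs surviving treatment
        post_t = rng.binomial((pre_t // epg_factor), 1 - drench_efficacy) * epg_factor
        post_c = pre_c  # untreated controls; no larval development modelled here
        # Control-adjusted percent reduction, in the spirit of the SCA guidelines
        r = 100 * (1 - (post_t.mean() / pre_t.mean()) /
                       (post_c.mean() / pre_c.mean()))
        reductions.append(r)
    return np.array(reductions)

reds = simulate_fecrt()
# Fraction of runs where rule 1's %R < 95 criterion alone would flag resistance
resistant = np.mean(reds < 95)
```

With 90% efficacy the mean simulated reduction sits near 90%, i.e. squarely in the 90-95% band where the abstract recommends applying rule 2 as well.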
Abstract:
Assessing the sustainability of crop and soil management practices in wheat-based rotations requires a well-tested model with the demonstrated ability to sensibly predict crop productivity and changes in the soil resource. The Agricultural Production Systems Simulator (APSIM) suite of models was parameterised and subsequently used to predict biomass production, yield, crop water and nitrogen (N) use, as well as long-term soil water and organic matter dynamics in wheat/chickpea systems at Tel Hadya, north-western Syria. The model satisfactorily simulated the productivity and water and N use of wheat and chickpea crops grown under different N and/or water supply levels in the 1998-99 and 1999-2000 experimental seasons. Analysis of soil-water dynamics showed that the 2-stage soil evaporation model in APSIM's cascading water-balance module did not sufficiently explain the actual soil drying following crop harvest under conditions where unused water remained in the soil profile. This might have been related to evaporation from soil cracks in the montmorillonitic clay soil, a process not explicitly simulated by APSIM. Soil-water dynamics in wheat-fallow and wheat-chickpea rotations (1987-98) were nevertheless well simulated when the soil water content in 0-0.45 m soil depth was set to 'air dry' at the end of the growing season each year. The model satisfactorily simulated the amounts of NO3-N in the soil, whereas it underestimated the amounts of NH4-N. Ammonium fixation might be part of the soil mineral-N dynamics at the study site because montmorillonite is the major clay mineral. This process is not simulated by APSIM's nitrogen module. APSIM was capable of predicting long-term trends (1985-98) in soil organic matter in wheat-fallow and wheat-chickpea rotations at Tel Hadya as reported in the literature.
Overall, results showed that the model is generic and mature enough to be extended to this set of environmental conditions and can therefore be applied to assess the sustainability of wheat-chickpea rotations at Tel Hadya.
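For reference, the 2-stage soil evaporation scheme in APSIM's cascading water balance follows Ritchie's model: stage 1 evaporates at the potential rate until a cumulative limit U is reached, after which stage 2 follows a square-root-of-time curve with coefficient CONA. A simplified one-day sketch, with illustrative parameter values rather than the Tel Hadya calibration:

```python
import math

def daily_soil_evaporation(eos, sumes1, sumes2, u=6.0, cona=3.5):
    """One day of Ritchie-type two-stage soil evaporation (simplified).

    eos    -- potential soil evaporation for the day (mm)
    sumes1 -- cumulative stage-1 evaporation since last wetting (mm)
    sumes2 -- cumulative stage-2 evaporation (mm)
    u      -- stage-1 limit (mm); cona -- stage-2 coefficient (mm/day^0.5)
    u and cona here are illustrative, not fitted site parameters.
    """
    if sumes1 < u:
        # Stage 1: energy-limited, evaporation proceeds at the potential rate
        es = min(eos, u - sumes1)
        sumes1 += es
        if sumes1 >= u and eos > es:
            # Transition into stage 2 within the same day
            t = (sumes2 / cona) ** 2
            extra = min(eos - es, cona * math.sqrt(t + 1) - sumes2)
            sumes2 += extra
            es += extra
    else:
        # Stage 2: supply-limited, cumulative evaporation ~ cona * sqrt(time)
        t = (sumes2 / cona) ** 2
        es = min(eos, cona * math.sqrt(t + 1) - sumes2)
        sumes2 += es
    return es, sumes1, sumes2
```

A scheme of this shape cannot by itself reproduce the extra post-harvest drying observed at Tel Hadya, which is why the study resorted to resetting the 0-0.45 m layer to 'air dry' at season's end.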
Abstract:
Point sources of wastewater pollution, including effluent from municipal sewage treatment plants and intensive livestock and processing industries, can contribute significantly to the degradation of receiving waters (Chambers et al. 1997; Productivity Commission 2004). This has led to increasingly stringent local wastewater discharge quotas (particularly regarding nitrogen, phosphorus and suspended solids), and many municipal authorities and industry managers are now faced with upgrading their existing treatment facilities in order to comply. However, with high construction, energy and maintenance expenses and increasing labour costs, traditional wastewater treatment systems are becoming an escalating financial burden for the communities and industries that operate them. This report was generated, in the first instance, for the Burdekin Shire Council to provide information on design aspects and parameters critical for developing duckweed-based wastewater treatment (DWT) in the Burdekin region. However, the information will be relevant to a range of wastewater sources throughout Queensland. This information has been collated from published literature and both overseas and local studies of pilot and full-scale DWT systems. This report also considers options to generate revenue from duckweed production (a significant feature of DWT), and provides specifications and component cost information (current at the time of publication) for a large-scale demonstration of an integrated DWT and fish production system.
Abstract:
The main outputs anticipated include enhanced knowledge of key water-nutrient dynamics in relation to soil management techniques, and a suite of improved, practical soil management options for sweet potatoes.
Abstract:
Dairy farms located in the subtropical cereal belt of Australia rely on winter and summer cereal crops, rather than pastures, for their forage base. Crops are mostly established in tilled seedbeds and the system is vulnerable to fertility decline and water erosion, particularly over summer fallows. Field studies were conducted over 5 years on contrasting soil types, a Vertosol and Sodosol, in the 650-mm annual-rainfall zone to evaluate the benefits of a modified cropping program on forage productivity and the soil-resource base. Growing forage sorghum as a double-crop with oats increased total mean annual production over that of winter sole-crop systems by 40% and 100% on the Vertosol and Sodosol sites, respectively. However, mean annual winter crop yield was halved and overall forage quality was lower. Ninety per cent of the variation in winter crop yield was attributable to fallow and in-crop rainfall. Replacing forage sorghum with the annual legume lablab reduced fertiliser nitrogen (N) requirements and increased forage N concentration, but reduced overall annual yield. Compared with sole-cropped oats, double-cropping reduced the risk of erosion by extending the duration of soil water deficits and increasing the time ground was under plant cover. When grown as a sole-crop, well fertilised forage sorghum achieved a mean annual cumulative yield of 9.64 and 6.05 t DM/ha on the Vertosol and Sodosol, respectively, being about twice that of sole-cropped oats. Forage sorghum established using zero-tillage practices and fertilised at 175 kg N/ha.crop achieved a significantly higher yield and forage N concentration than did the industry-standard forage sorghum (conventional tillage and 55 kg N/ha.crop) on the Vertosol but not on the Sodosol. On the Vertosol, mean annual yield increased from 5.65 to 9.64 t DM/ha (33 kg DM/kg N fertiliser applied above the base rate); the difference in the response between the two sites was attributed to soil type and fertiliser history.
Changing both tillage practices and N-fertiliser rate had no effect on fallow water-storage efficiency but did improve fallow ground cover. When forage sorghum, grown as a sole crop, was replaced with lablab in 3 of the 5 years, overall forage N concentration increased significantly, and on the Vertosol, yield and soil nitrate-N reserves also increased significantly relative to industry-standard sorghum. All forage systems maintained or increased the concentration of soil nitrate-N (0-1.2-m soil layer) over the course of the study. Relative to sole-crop oats, alternative forage systems were generally beneficial to the concentration of surface-soil (0-0.1 m) organic carbon, and systems that included sorghum showed most promise for increasing soil organic carbon concentration. We conclude that an emphasis on double- or summer sole-cropping rather than winter sole-cropping will advantage both farm productivity and the soil-resource base.
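The quoted N response on the Vertosol (33 kg DM/kg N fertiliser applied above the base rate) can be checked directly from the reported yields and fertiliser rates:

```python
# Yield response to N above the industry-standard rate (Vertosol site)
base_yield_kg = 5650   # 5.65 t DM/ha at the 55 kg N/ha.crop base rate
high_yield_kg = 9640   # 9.64 t DM/ha at 175 kg N/ha.crop
extra_n = 175 - 55     # kg N/ha.crop applied above the base rate

response = (high_yield_kg - base_yield_kg) / extra_n
# approximately 33 kg DM per kg fertiliser N above the base rate
```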
Abstract:
Field studies were conducted over 5 years on two dairy farms in southern Queensland to evaluate the impacts of zero-tillage, nitrogen (N) fertiliser and legumes on a winter-dominant forage system based on raingrown oats. Oats was successfully established using zero-tillage methods, with no yield penalties and potential benefits in stubble retention over the summer fallow. N fertiliser, applied at above industry-standard rates (140 vs. 55 kg/ha.crop) in the first 3 years, increased forage N concentration significantly and had residual effects on soil nitrate-N at both sites. At one site, crop yield was increased by 10 kg DM/ha per kg fertiliser N applied above industry-standard rates. The difference between sites in fertiliser response reflected contrasting soil and fertiliser history. There was no evidence that modifications to oats cropping practices (zero-tillage and increased N fertiliser) increased surface soil organic carbon (0-10 cm) in the time frame of the present study. When oats was substituted with annual legumes, there were benefits in improved forage N content of the oat crop immediately following, but legume yield was significantly inferior to oats. In contrast, the perennial legume Medicago sativa was competitive in biomass production and forage quality with oats at both sites and increased soil nitrate-N levels following termination. However, its contribution to winter forage was low at 10% of total production, compared with 40% for oats, and soil water reserves were significantly reduced at one site, which had an impact on the following oat production. The study demonstrated that productive grazed oat crops can be grown using zero tillage and that increased N fertiliser is more consistent in its effect on N concentration than on forage yield. A lucerne ley provides a strategy for raising soil nitrate-N concentration and increasing overall forage productivity, although winter forage production is reduced.
Abstract:
We trace the evolution of the representation of management in cropping and grazing systems models, from fixed annual schedules of identical actions in single paddocks toward flexible scripts of rules. Attempts to define higher-level organizing concepts in management policies, and to analyse them to identify optimal plans, have focussed on questions relating to grazing management owing to its inherent complexity. “Rule templates” assist the re-use of complex management scripts by bundling commonly-used collections of rules with an interface through which key parameters can be input by a simulation builder. Standard issues relating to parameter estimation and uncertainty apply to management sub-models and need to be addressed. Techniques for embodying farmers' expectations and plans for the future within modelling analyses need to be further developed, especially better linking planning- and rule-based approaches to farm management and analysing the ways that managers can learn.
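As a minimal illustration of the "rule template" idea (hypothetical names and parameter values, not any particular model's API), a sowing rule set might be bundled behind a few key parameters like this:

```python
from dataclasses import dataclass

@dataclass
class SowingRuleTemplate:
    """Bundle a commonly-used sowing rule set behind a few parameters.

    Sketch of a 'rule template': the simulation builder supplies only
    the parameters; the rule logic itself is re-used unchanged.
    """
    window_start: int   # day of year the sowing window opens
    window_end: int     # day of year the sowing window closes
    min_rain_mm: float  # rainfall over the trigger period to allow sowing
    rain_days: int      # length of the rainfall trigger period (days)

    def should_sow(self, day_of_year, recent_rain_mm):
        """Rule: sow if inside the window and the rainfall trigger is met."""
        in_window = self.window_start <= day_of_year <= self.window_end
        rain_ok = sum(recent_rain_mm[-self.rain_days:]) >= self.min_rain_mm
        return in_window and rain_ok

# A builder instantiates the template by supplying parameters only
wheat = SowingRuleTemplate(window_start=135, window_end=196,
                           min_rain_mm=20.0, rain_days=3)
```

The parameter estimation and uncertainty issues noted above apply to the four inputs here just as they do to any biophysical sub-model parameter.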