837 results for Decision Support
Abstract:
The present study set out to test, through field and simulation studies, the hypothesis that incorporating short-term summer legumes, particularly the annual legume lablab (Lablab purpureus cv. Highworth), into a fallow-wheat cropping system will improve the overall economic and environmental benefits in south-west Queensland. Replicated, large-plot experiments were established at five commercial properties using the growers' own machinery, and two smaller plot experiments were established at two intensively researched sites (Roma and St George). A detailed study of various other biennial and perennial summer forage legumes in rotation with wheat, as influenced by phosphorus (P) supply (10 and 40 kg P/ha), was also carried out at the two research sites. The other legumes were lucerne (Medicago sativa), butterfly pea (Clitoria ternatea) and burgundy bean (Macroptilium bracteatum). After the legumes, spring wheat (Triticum aestivum) was sown into the legume stubble. The annual lablab produced the highest forage yield, whereas germination, establishment and production of the other biennial and perennial legumes were poor, particularly in the red soil at St George. At the commercial sites, only lablab-wheat rotations were trialled, with an increased supply of P in the subsurface soil (20 kg P/ha). The lablab grown at the commercial sites produced forage yields of between 3 and 6 t/ha over 2-3 month periods, whereas the following wheat crop, with no applied fertiliser, yielded between 0.5 and 2.5 t/ha. The wheat following lablab yielded 30% less, on average, than the wheat in a fallow plot, yet the profitability of wheat following lablab was slightly higher than that of the wheat following fallow because of the greater costs associated with fallow management. The profitability of the lablab-wheat phase was determined after accounting for the input costs and the additional costs associated with fallow management and in-crop herbicide applications in a fallow-wheat system.
The economic and environmental benefits of forage lablab and wheat cropping were also assessed through simulations over a long-term climatic pattern using economic (PreCAPS) and biophysical (Agricultural Production Systems Simulator, APSIM) decision support models. Analysis of the long-term rainfall pattern (70% in summer and 30% in winter) and the simulation studies indicated that ~50% of the time a wheat crop would not be planted, or would fail to produce a profitable crop (grain yield less than 1 t/ha), because of low and unreliable winter rainfall. In contrast, forage lablab in summer would produce a profitable crop, with a forage yield of more than 3 t/ha, ~90% of the time. Only 14 wheat crops (of 26 growing seasons, i.e. 54%) were profitable, compared with 22 forage lablab crops (of 25 seasons, i.e. 88%). Opportunistic double-cropping of lablab in summer and wheat in winter is also viable and profitable in 50% of years. The simulation studies also indicated that opportunistic lablab-wheat cropping can reduce potential runoff+drainage by more than 40% in the Roma region, leading to improved economic and environmental benefits.
Abstract:
When exposed to hot (22-35°C) and dry climatic conditions in the field during the final 4-6 weeks of pod filling, peanuts (Arachis hypogaea L.) can accumulate highly carcinogenic and immuno-suppressive aflatoxins. Forecasting the risk posed by these conditions can assist in minimizing pre-harvest contamination. A model was therefore developed as part of the Agricultural Production Systems Simulator (APSIM) peanut module, which calculated an aflatoxin risk index (ARI) using four temperature response functions when fractional available soil water was <0.20 and the crop was in the last 0.40 of the pod-filling phase. ARI explained 0.95 (P ≤ 0.05) of the variation in aflatoxin contamination, which varied from 0 to c. 800 µg/kg in 17 large-scale sowings in tropical and four sowings in sub-tropical environments carried out in Australia between 13 November and 16 December 2007. ARI also explained 0.96 (P ≤ 0.01) of the variation in the proportion of aflatoxin-contaminated loads (>15 µg/kg) of peanuts in the Kingaroy region of Australia between the 1998/99 and 2007/08 seasons. Simulation of ARI using historical climatic data from 1890 to 2007 indicated a three-fold increase in its value since 1980 compared with the entire previous period. The increase was associated with increases in ambient temperature and decreases in rainfall. To facilitate routine monitoring of aflatoxin risk by growers in near real time, a web interface for the model was also developed. The ARI predicted using this interface for eight growers correlated significantly with the level of contamination in their crops (r = 0.95, P ≤ 0.01). These results suggest that the ARI simulated by the model is a reliable indicator of aflatoxin contamination that can be used in aflatoxin research as well as a decision-support tool to monitor pre-harvest aflatoxin risk in peanuts.
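As a rough illustration of how such a risk index can be accumulated, the sketch below gates a daily temperature response by the two conditions stated in the abstract (fractional available soil water < 0.20 and the crop in the last 0.40 of pod filling). The single triangular temperature response, with an assumed optimum of 28.5°C inside the 22-35°C window, is a hypothetical stand-in for the four response functions used by the actual APSIM module.

```python
def temp_response(t_mean, t_min=22.0, t_opt=28.5, t_max=35.0):
    """Illustrative 0-1 triangular temperature response (assumed form)."""
    if t_mean <= t_min or t_mean >= t_max:
        return 0.0
    if t_mean <= t_opt:
        return (t_mean - t_min) / (t_opt - t_min)
    return (t_max - t_mean) / (t_max - t_opt)

def aflatoxin_risk_index(days):
    """Accumulate daily risk only when both gating conditions hold:
    fractional available soil water < 0.20 and the crop is in the
    last 0.40 of the pod-filling phase (pod-fill fraction >= 0.60)."""
    ari = 0.0
    for d in days:  # each day: dict with t_mean, fasw, podfill_frac
        if d["fasw"] < 0.20 and d["podfill_frac"] >= 0.60:
            ari += temp_response(d["t_mean"])
    return ari
```

In this framing, a hot, dry finish to pod filling accumulates risk daily, while adequate soil water or mild temperatures contribute nothing, which is what lets the index separate contaminated from clean sowings.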
Abstract:
Ongoing pressure to minimise costs of production, growing markets for low residue and organic wool and meat, resistance to chemicals in louse populations, and the deregistration of diazinon for dipping and jetting have contributed to a move away from routine annual application of lousicides to more integrated approaches to controlling lice. Advances including improved methods for monitoring and detection of lice, an expanded range of louse control products and the availability of a web-accessible suite of decision support tools for wool growers (LiceBoss™) will aid this transition. Possibilities for the future include an on-farm detection test and non-chemical control methods. The design and extension of well-constructed resistance management programs to preserve the effectiveness of recently available new product groups should be a priority.
Abstract:
Spotted gum dominant forests occur from Cooktown in northern Queensland (Qld) to Orbost in Victoria (Boland et al. 2006). These forests are commercially very important, with spotted gum the most commonly harvested hardwood timber in Qld and one of the most important in New South Wales (NSW). Spotted gum has a wide range of end uses, from solid wood products through to power transmission poles, and generally has excellent sawing and timber qualities (Hopewell 2004). The private native forest resource in southern Qld and northern NSW is a critical component of the hardwood timber industry (Anon 2005, Timber Qld 2006), and currently half or more of the native forest timber resource harvested in northern NSW and Qld is sourced from private land. However, in many cases productivity on private lands is well below what could be achieved with appropriate silvicultural management. This project provides silvicultural management tools to assist extension staff, land owners and managers in the south-east Qld and north-east NSW regions. The intent was that this would lead to improved productivity of the private estate through implementation of appropriate management. The project also set out to implement a number of silvicultural experiments and demonstration sites to provide data on growth rates of managed and unmanaged forests, so that landholders can make informed decisions on the future management of their forests. To assist forest managers and improve the ability to predict forest productivity in the private resource, the project has developed:
• A set of spotted gum specific silvicultural guidelines for timber production on private land that cover both silvicultural treatment and harvesting. The guidelines were developed for extension officers and property owners.
• A simple decision support tool, referred to as the spotted gum productivity assessment tool (SPAT), that allows an estimation of: 1. Tree growth productivity on specific sites, based on the analysis of site and growth data collected from a large number of yield and experimental plots on Crown land across a wide range of spotted gum forest types (growth algorithms were developed from tree growth and site data and used to formulate basic economic predictors); 2. Pasture development under a range of tree stockings, and the expected livestock carrying capacity at nominated tree stockings for a particular area; 3. Above-ground tree biomass and carbon stored in trees.
• A series of experiments in spotted gum forests on private lands across the study area to quantify growth and to provide measures of the effect of silvicultural thinning and different agro-forestry regimes.
The adoption and use of these tools by farm forestry extension officers and private land holders in both field operations and training exercises will, over time, improve the commercial management of spotted gum forests for both timber and grazing. Future measurement of the experimental sites at ages five, 10 and 15 years will provide longer-term data on the effects of various stocking rates and thinning regimes and facilitate modification and improvement of these silvicultural prescriptions.
Abstract:
Aim Frail older people typically suffer several chronic diseases, receive multiple medications and are more likely to be institutionalized in residential aged care facilities. In such patients, optimizing prescribing and avoiding use of high-risk medications might prevent adverse events. The present study aimed to develop a pragmatic, easily applied algorithm for medication review to help clinicians identify and discontinue potentially inappropriate high-risk medications. Methods The literature was searched for robust evidence of the association of adverse effects related to potentially inappropriate medications in older patients to identify high-risk medications. Prior research into the cessation of potentially inappropriate medications in older patients in different settings was synthesized into a four-step algorithm for incorporation into clinical assessment protocols for patients, particularly those in residential aged care facilities. Results The algorithm comprises several steps leading to individualized prescribing recommendations: (i) identify a high-risk medication; (ii) ascertain the current indications for the medication and assess their validity; (iii) assess if the drug is providing ongoing symptomatic benefit; and (iv) consider withdrawing, altering or continuing medications. Decision support resources were developed to complement the algorithm in ensuring a systematic and patient-centered approach to medication discontinuation. These include a comprehensive list of high-risk medications and the reasons for inappropriateness, lists of alternative treatments, and suggested medication withdrawal protocols. Conclusions The algorithm captures a range of different clinical scenarios in relation to potentially inappropriate medications, and offers an evidence-based approach to identifying and, if appropriate, discontinuing such medications. Studies are required to evaluate algorithm effects on prescribing decisions and patient outcomes.
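The four-step algorithm above lends itself to a simple decision flow. The sketch below is a hypothetical Python rendering; the high-risk drug classes and the recommendation strings are illustrative placeholders, not the published decision support resources.

```python
# Illustrative (assumed) set of high-risk medication classes; the study's
# actual comprehensive list is not reproduced in the abstract.
HIGH_RISK = {"benzodiazepine", "anticholinergic", "long-acting sulfonylurea"}

def review_medication(drug_class, indication_valid, symptomatic_benefit):
    """Four-step review returning an individualized prescribing recommendation."""
    # Step (i): identify a high-risk medication
    if drug_class not in HIGH_RISK:
        return "continue"
    # Step (ii): ascertain current indications and assess their validity
    if not indication_valid:
        return "consider withdrawal"
    # Step (iii): assess whether the drug provides ongoing symptomatic benefit
    if not symptomatic_benefit:
        return "consider withdrawal or dose reduction"
    # Step (iv): otherwise weigh withdrawing, altering or continuing
    return "continue with regular reassessment"
```

The point of the flow is that discontinuation is only reached after the indication and benefit checks, keeping the process systematic and patient-centered rather than a blanket stop-list.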
Abstract:
Monitoring aflatoxin and developing improved peanut drying practices, cadmium management and web-based irrigation decision support systems.
Abstract:
The results of the pilot demonstrated that a pharmacist-delivered vaccination service is feasible in community pharmacy and is safe and effective. The accessibility of the pharmacist across the influenza season provided the opportunity for more people to be vaccinated, particularly those who had never received an influenza vaccine before. Patient satisfaction was extremely high, with nearly all patients happy to recommend the service and to return again next year. Factors critical to the success of the service were: 1. Appropriate facilities; 2. Competent pharmacists; 3. Practice and decision support tools; 4. In-store implementation support. We demonstrated in the pilot that vaccination recipients preferred a private consultation area. As the level of privacy afforded to patients increased (private room vs. booth), so did the number of patients vaccinated. We would therefore recommend that a minimum standard of a private consultation room or closed-in booth, with adequate space for multiple chairs and a work/consultation table, be considered for the provision of any vaccination service. The booth or consultation room should be used exclusively for delivering patient services and should not contain other general office equipment, nor be used as storage for stock. The pilot also demonstrated that a pharmacist-specific training program produced competent and confident vaccinators and that this program can be used to retrofit the profession with these skills. As vaccination is within the scope of pharmacist practice as defined by the Pharmacy Board of Australia, there is potential for universities to train their undergraduates in this skill and provide a pharmacist vaccination workforce in the near future. It is therefore essential to explore appropriate changes to the legislation to facilitate pharmacists' practice in this area.
Given the level of pharmacology and medicines knowledge of pharmacists, combined with their new competency of providing vaccinations through administering injections, it is reasonable to explore additional vaccines that pharmacists could administer in the community setting. At the time of writing, QPIP has already expanded into Phase 2, to explore pharmacists vaccinating for whooping cough and measles. Looking at the international experience of pharmacist-delivered vaccination, we would recommend considering expansion to other vaccinations in the future, including travel vaccinations, HPV and selected vaccinations for those under the age of 18 years. Overall, the results of the QPIP implementation have demonstrated that an appropriately trained pharmacist can safely and effectively deliver influenza vaccinations to adult patients in the community. QPIP showed the value that the accessibility of pharmacists brings to public health outcomes through improved access to vaccinations and the ability to increase immunisation rates in the general population. Over time, the expansion of pharmacist vaccination services will help to achieve more effective herd immunity for some of the many diseases that currently have suboptimal immunisation rates.
Abstract:
The forest simulator is a computerized model for predicting forest growth and future development, as well as the effects of forest harvests and treatments. The forest planning system is a decision support tool, usually comprising a forest simulator and an optimisation model, for finding the optimal forest management actions. The information produced by forest simulators and forest planning systems is used for various analytical purposes and in support of decision making. However, the quality and reliability of this information can often be questioned. Natural variation in forest growth and estimation errors in forest inventory, among other things, cause uncertainty in predictions of forest growth and development. This uncertainty, stemming from different sources, has various undesirable effects; in many cases the outcomes of decisions based on uncertain information differ from what was desired. The objective of this thesis was to study various sources of uncertainty and their effects in forest simulators and forest planning systems. The study focused on three notable sources of uncertainty: errors in forest growth predictions, errors in forest inventory data, and stochastic fluctuation of timber assortment prices. The effects of uncertainty were studied using two types of forest growth models, individual-tree-level models and stand-level models, and various error simulation methods. A new method for simulating more realistic forest inventory errors was introduced and tested. The three sources of uncertainty were also combined, and their joint effects on stand-level net present value estimates were simulated. According to the results, the various sources of uncertainty can have distinct effects in different forest growth simulators. The new forest inventory error simulation method proved to produce more realistic errors, and the analysis of the joint effects of the various sources of uncertainty provided valuable insight into uncertainty in forest simulators.
Abstract:
The Department of Forest Resource Management at the University of Helsinki carried out the SIMO project in 2004-2007 to develop a new-generation planning system for forest management. The project partners are the organisations that carry out most Finnish forest planning in government, industry and privately owned forests. The aim of this study was to identify the needs and requirements for the new forest planning system and to clarify how the partners see the targets and processes of today's forest planning. The representatives responsible for forest planning in each organisation were interviewed one by one. According to the study, the stand-based system for managing and treating forests will continue in the future. Because of variable data acquisition methods with differing accuracy and sources, and the development of single-tree interpretation, more and more forest data are collected without field work. The benefits of using more specific forest data also call for information units smaller than the tree stand. In Finland, forest planning computation is traditionally arranged in two stages. After the forest data are updated to the present situation, each stand unit's growth is simulated under different alternative treatment schedules. After simulation, optimisation selects one treatment schedule for every stand so that the management program satisfies the owner's goals in the best possible way. This arrangement will be retained in the future system. The partners' requirements to add multi-criteria problem solving, group decision support methods, and heuristic and spatial optimisation to the system make the programming work more challenging. In general, the new system is expected to be adjustable and transparent; strict documentation and free source code help to bring these expectations into effect. Growth models and treatment schedules with different source information, accuracy, methods and processing speeds are expected to work seamlessly within the system.
The ability to calibrate models regionally and to set local parameters that change over time is also required. In the future, the forest planning system will be integrated into comprehensive data management systems together with geographic, economic and work supervision information. This requires a modular implementation of the system and a simple data transmission interface between modules and other systems. No major differences in the parties' views of the system requirements were noticed in this study; rather, the interviews completed the full picture from slightly different angles. Within the organisations, forest planning is considered quite inflexible: it only draws the strategic lines and does not yet have a role in operative activity, although the need for and benefits of team-level forest planning are acknowledged. The demands and opportunities of variable forest data, new planning goals and developments in information technology are recognised, and the partner organisations want to keep pace with this development. One example is their engagement in the extensive SIMO project, which connects the whole field of forest planning in Finland.
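The simulate-then-optimise arrangement described above (simulate each stand under alternative treatment schedules, then pick one schedule per stand that best meets the owner's goals) can be sketched minimally as follows. The weighted-sum objective standing in for the owner's goals is an illustrative assumption, not part of the SIMO system.

```python
def select_schedules(stands, w_income=1.0, w_volume=0.5):
    """Pick one simulated treatment schedule per stand.

    stands: {stand_id: [{"name": ..., "income": ..., "end_volume": ...}, ...]}
    where each dict is one simulated schedule's outcome for that stand.
    Returns {stand_id: chosen schedule name}.
    """
    chosen = {}
    for stand_id, schedules in stands.items():
        # Owner's goal approximated as a weighted sum of net income and
        # standing volume at the end of the planning period (assumed form).
        best = max(
            schedules,
            key=lambda s: w_income * s["income"] + w_volume * s["end_volume"],
        )
        chosen[stand_id] = best["name"]
    return chosen
```

Real planning systems replace the weighted sum with multi-criteria, spatial or heuristic optimisation over all stands jointly, which is precisely what makes the programming work more challenging than this per-stand sketch.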
Abstract:
The aim of this thesis is to develop a fully automatic lameness detection system that operates in a milking robot. The instrumentation, measurement software, algorithms for data analysis and a neural network model for lameness detection were developed. Automatic milking has become a common practice in dairy husbandry, and in 2006 about 4,000 farms worldwide used over 6,000 milking robots. There is a worldwide movement towards fully automating every process from feeding to milking. The increase in automation is a consequence of increasing farm sizes, the demand for more efficient production and the growth of labour costs. As the level of automation increases, the time that the cattle keeper spends monitoring animals often decreases. This has created a need for systems that automatically monitor the health of farm animals. The popularity of milking robots also offers a new and unique possibility to monitor animals in a single confined space up to four times daily. Lameness is a crucial welfare issue in the modern dairy industry. Limb disorders cause serious welfare, health and economic problems, especially in loose housing of cattle. Lameness causes losses in milk production and leads to early culling of animals; these costs could be reduced with early identification and treatment. At present, only a few methods for automatically detecting lameness have been developed, and the most common methods used for lameness detection and assessment are various visual locomotion scoring systems. The problem with locomotion scoring is that it requires experience to conduct properly, it is labour-intensive as an on-farm method, and the results are subjective. A four-balance system for measuring the leg load distribution of dairy cows during milking, in order to detect lameness, was developed and set up at the University of Helsinki research farm Suitia.
The leg weights of 73 cows were successfully recorded during almost 10,000 robotic milkings over a period of 5 months. The cows were locomotion scored weekly, and the lame cows were inspected clinically for hoof lesions. Unsuccessful measurements, caused by cows standing outside the balances, were removed from the data with a special algorithm, and the mean leg loads and the number of kicks during milking were calculated. In order to develop an expert system to automatically detect lameness cases, a model was needed; a probabilistic neural network (PNN) classifier was chosen for the task. The data were divided into two parts: 5,074 measurements from 37 cows were used to train the model, and the model was evaluated on its ability to detect lameness in a validation dataset of 4,868 measurements from 36 cows. The model classified 96% of the measurements correctly as sound or lame, and 100% of the lameness cases in the validation data were identified. The proportion of measurements causing false alarms was 1.1%. The developed model has the potential to be used for on-farm decision support and in a real-time lameness monitoring system.
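For illustration, a probabilistic neural network of the kind used here classifies a new observation by placing a Gaussian kernel on every training example and choosing the class with the highest summed kernel density. The sketch below assumes a two-feature input (for example, leg-load asymmetry and kicks per milking) and an arbitrary smoothing parameter; neither reflects the study's actual feature set or tuning.

```python
import numpy as np

def pnn_classify(X_train, y_train, x, sigma=0.5):
    """Parzen-window PNN classifier: return the class whose training
    examples give the largest average Gaussian-kernel density at x."""
    X_train = np.asarray(X_train, dtype=float)
    y_train = np.asarray(y_train)
    x = np.asarray(x, dtype=float)
    scores = {}
    for label in np.unique(y_train):
        diffs = X_train[y_train == label] - x
        # Average Gaussian kernel over this class's training examples
        scores[label] = np.mean(np.exp(-np.sum(diffs**2, axis=1) / (2 * sigma**2)))
    return max(scores, key=scores.get)
```

A PNN of this form needs no iterative training, only storage of the examples, which suits incremental on-farm data collection; the trade-off is that classification cost grows with the size of the training set.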
Abstract:
Aim: Effective decisions for managing invasive species depend on feedback about the progress of eradication efforts. Panetta & Lawes developed the eradograph, an intuitive graphical tool that summarizes the temporal trajectories of delimitation and extirpation to support decision-making. We correct and extend the tool, which was affected by incompatibilities in the units used to measure these features that made the axes impossible to interpret biologically. Location: Victoria, New South Wales and Queensland, Australia. Methods: Panetta and Lawes' approach represented delimitation with estimates of changes in the area known to be infested, and extirpation with changes in the mean time since the last detection. We retain the original structure but propose different metrics that improve biological interpretability. We illustrate the methods with a hypothetical example and with real examples of the invasion and treatment of branched broomrape (Orobanche ramosa L.) and the guava rust complex (Puccinia psidii (Winter 1884)) in Australia. Results: These examples illustrate the potential of the tool to guide decisions about the effectiveness of search and control activities. Main conclusions: The eradograph is a graphical data summary tool that provides insight into the progress of eradication. Our correction and extension of the tool make it easier to interpret and provide managers with better decision support. © 2013 John Wiley & Sons Ltd.
Abstract:
Introduction Electronic medication administration record (eMAR) systems are promoted as a potential intervention to enhance medication safety in residential aged care facilities (RACFs). The purpose of this study was to conduct an in-practice evaluation of an eMAR being piloted in one Australian RACF before its roll-out, and to provide recommendations for system improvements. Methods A multidisciplinary team conducted direct observations of workflow (n=34 hours) in the RACF site and the community pharmacy. Semi-structured interviews (n=5) with RACF staff and the community pharmacist were conducted to investigate their views of the eMAR system. Data were analysed using a grounded theory approach to identify challenges associated with the design of the eMAR system. Results The current eMAR system does not offer an end-to-end solution for medication management. Many steps, including prescribing by doctors and communication with the community pharmacist, are still performed manually using paper charts and fax machines. Five major challenges associated with the design of the eMAR system were identified: limited interactivity; inadequate flexibility; problems related to information layout and semantics; the lack of relevant decision support; and system maintenance issues. We suggest recommendations to improve the design of the eMAR system and to optimise existing workflows. Discussion Immediate value can be achieved by improving system interactivity, reducing inconsistencies in data-entry design and offering dedicated organisational support to minimise connectivity issues. Longer-term benefits can be achieved by adding decision support features and establishing system interoperability requirements with stakeholder groups (e.g. community pharmacies) prior to system roll-out. In-practice evaluations of technologies like eMAR systems have great value in identifying design weaknesses that inhibit optimal system use.
Abstract:
The development of fishery indicators is a crucial undertaking as it ultimately provides evidence to stakeholders about the status of fished species such as population size and survival rates. In Queensland, as in many other parts of the world, age-abundance indicators (e.g. fish catch rate and/or age composition data) are traditionally used as the evidence basis because they provide information on species life history traits as well as on changes in fishing pressures and population sizes. Often, however, the accuracy of the information from age-abundance indicators can be limited due to missing or biased data. Consequently, improved statistical methods are required to enhance the accuracy, precision and decision-support value of age-abundance indicators.
Abstract:
Two trials were conducted in this project. One was a continuation of work started under a previous GRDC/SRDC-funded activity, 'Strategies to improve the integration of legumes into cane based farming systems'. This trial aimed to assess the impact of trash and tillage management options and nematicide application on nematodes and crop performance. Methods and results are contained in the following publication: Halpin NV, Stirling GR, Rehbein WE, Quinn B, Jakins A, Ginns SP. The impact of trash and tillage management options and nematicide application on crop performance and plant-parasitic nematode populations in a sugarcane/peanut farming system. Proc. Aust. Soc. Sugar Cane Technol. 37, 192-203. Nematicide application in the plant crop significantly reduced total numbers of plant-parasitic nematodes (PPN) but had no impact on yield. Application of nematicide to the ratoon crop significantly reduced sugar yield. The study confirmed other work demonstrating that strategies such as reduced tillage lowered total PPN populations, suggesting that the soil was more suppressive to PPN in those treatments. The second trial, a variety trial, demonstrated the limited value of nematicide application in sugarcane farming systems. This study has highlighted that growers shouldn't view nematicides as a 'cure-all' for paddocks that have historically had high PPN numbers. Nematicides have high mammalian toxicity, have the potential to contaminate groundwater (Kookana et al. 1995) and are costly. The cost of nematicide used in R1 was approximately $320-$350/ha, adding about $3.50/t of cane in a 100 t/ha crop. Our study also demonstrated that a single nematicide treatment at the application rate registered for sugarcane is not very effective in reducing populations of nematode pests. There appears to be some level of resistance to nematodes within the current suite of varieties available to the southern canelands.
For example, the soil in plots growing Q183 had 560% more root-knot nematodes per 200 mL of soil than plots growing Q245. The authors see great value in investment in a nematode screening program that could rate varieties into groups of susceptibility to both major sugarcane nematode pests. Such a rating could then be built into a decision support 'tree' or tool to better enable producers to select varieties on a paddock-by-paddock basis.
Abstract:
Stakeholder engagement is important for successful management of natural resources, both to make effective decisions and to obtain support. However, in the context of coastal management, questions remain unanswered on how to effectively link decisions made at the catchment level with objectives for marine biodiversity and fisheries productivity. Moreover, there is much uncertainty on how to best elicit community input in a rigorous manner that supports management decisions. A decision support process is described that uses the adaptive management loop as its basis to elicit management objectives, priorities and management options using two case studies in the Great Barrier Reef, Australia. The approach described is then generalised for international interest. A hierarchical engagement model of local stakeholders, regional and senior managers is used. The result is a semi-quantitative generic elicitation framework that ultimately provides a prioritised list of management options in the context of clearly articulated management objectives that has widespread application for coastal communities worldwide. The case studies show that demand for local input and regional management is high, but local influences affect the relative success of both engagement processes and uptake by managers. Differences between case study outcomes highlight the importance of discussing objectives prior to suggesting management actions, and avoiding or minimising conflicts at the early stages of the process. Strong contributors to success are a) the provision of local information to the community group, and b) the early inclusion of senior managers and influencers in the group to ensure the intellectual and time investment is not compromised at the final stages of the process. 
The project has also uncovered a conundrum: a significant gap between the way managers perceive their management actions and outcomes, and the community's perception of the effectiveness (and wisdom) of those same actions.