8 results for Risk and loss functions
in eResearch Archive - Queensland Department of Agriculture
Abstract:
The hypothesis that contaminant plants growing amongst chickpea serve as Helicoverpa sinks by diverting oviposition pressure away from the main crop was tested under field conditions. Gain (recruitment) and loss (presumed mortality) of juvenile stages of Helicoverpa spp. on contaminant faba bean and wheat plants growing in chickpea plots were quantified on a daily basis over a 12-d period. The possibility of posteclosion movement of larvae from the contaminants to the surrounding chickpea crop was examined. Estimated total loss of the census population varied from 80 to 84% across plots and rows. The loss of brown eggs (40–47%) contributed most to the overall loss estimate, followed by loss of white eggs (27–35%) and larvae (6–9%). The cumulative number of individuals entering the white and brown egg and larval stages over the census period ranged from 15 to 58, 10–48 and 1–6 per m row, respectively. The corresponding estimates of mean stage-specific loss, expressed as a percentage of individuals entering the stage, ranged from 52 to 57% for white eggs, 87–108% for brown eggs and 71–87% for first-instar larvae. Mean larval density on chickpea plants in close proximity to the contaminant plants did not exceed the baseline larval density on chickpea further away from the contaminants across rows and plots. The results support the hypothesis that contaminant plants in chickpea plots serve as Helicoverpa sinks by diverting egg pressure from the main crop and elevating mortality of juvenile stages. Deliberate contamination of chickpea crops with other plant species merits further investigation as a cultural pest management strategy for Helicoverpa spp.
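The stage-specific loss estimates above follow from a simple accounting of cumulative recruitment into successive stages: individuals that enter one stage but never appear in the next are counted as lost. The Python sketch below illustrates that arithmetic only; the recruitment numbers are hypothetical values chosen to sit within the ranges reported in the abstract, not the study's raw census data, and presumably because recruitment into each stage is estimated with error from separate daily counts, the study's estimated losses can exceed 100% (as in the 87–108% brown-egg figure).

```python
# Minimal sketch of stage-specific loss accounting: loss within a stage is
# taken as the shortfall between cumulative recruitment into that stage and
# cumulative recruitment into the next stage. Numbers are hypothetical,
# chosen from within the ranges reported in the abstract.

entering = {          # cumulative individuals entering each stage, per m of row
    "white_egg": 40,
    "brown_egg": 18,
    "larva": 2,
}
stage_order = ["white_egg", "brown_egg", "larva"]

# Loss can only be computed here for stages that have a following stage.
for stage, next_stage in zip(stage_order, stage_order[1:]):
    lost = entering[stage] - entering[next_stage]
    pct = 100 * lost / entering[stage]
    print(f"{stage}: {lost} of {entering[stage]} lost ({pct:.0f}%)")
```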
Abstract:
While the use of specialist herbivores to manage invasive plants (classical biological control) is regarded as relatively safe and cost-effective in comparison to other methods of management, the rarity of strict monophagy among insect herbivores illustrates that, like any management option, biological control is not risk-free. The challenge for classical biological control is therefore to predict risks and benefits a priori. In this study we develop a simulation model that may aid in this process. We use this model to predict the risks and benefits of introducing the chrysomelid beetle Charidotis auroguttata to manage the invasive liana Macfadyena unguis-cati in Australia. Preliminary host-specificity testing of this herbivore indicated limited feeding on a non-target plant, although the non-target was able to sustain only some transitions of the herbivore's life cycle. The model includes herbivore, target and non-target life history and incorporates spillover dynamics of herbivore populations from the target to the non-target under a variety of scenarios. Data from studies of this herbivore in the native range and under quarantine were used to parameterize the model and predict the relative risks and benefits when the target and non-target plants co-occur. Key model outputs include population dynamics on the target (apparent benefit) and non-target (apparent risk) and the fitness consequences of herbivore damage to the target (actual benefit) and non-target plant (actual risk). The model predicted that risk to the non-target became unacceptable (i.e. significant negative effects on fitness) when the ratio of target to non-target in a given patch ranged from 1:1 to 3:2. By comparing the current known distribution of the non-target and the predicted distribution of the target, we were able to identify regions in Australia where the agent may pose an unacceptable risk. By considering risk and benefit simultaneously, we highlight how such a simulation modelling approach can assist scientists and regulators in making more objective a priori decisions on the value of releasing specialist herbivores as biological control agents.
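The abstract describes a patch-level simulation of herbivore spillover from target to non-target plants; the published model is not reproduced here. Purely as an illustrative sketch of the general idea, the Python code below runs a much simpler discrete-time caricature in which a herbivore population builds up on the target and a fraction spills over to the non-target in proportion to the non-target's share of the patch, so that per-plant pressure on the non-target rises as the target:non-target ratio falls. All functional forms and parameter values are assumptions made for illustration, not those of the published model.

```python
# Illustrative discrete-time spillover caricature (not the authors' model):
# herbivores grow logistically on the target plant and a fraction spills over
# to the non-target in proportion to the non-target's share of the patch.

def simulate(target_plants, nontarget_plants, steps=50, growth_rate=0.3,
             carrying_per_plant=20.0, spillover_rate=0.1):
    """Return cumulative per-plant feeding pressure on target and non-target."""
    herbivores = 1.0                  # initial herbivore load in the patch
    damage_target = 0.0
    damage_nontarget = 0.0
    nontarget_share = nontarget_plants / (target_plants + nontarget_plants)
    capacity = carrying_per_plant * target_plants

    for _ in range(steps):
        # logistic growth, limited by the amount of target host available
        herbivores += growth_rate * herbivores * (1 - herbivores / capacity)
        spill = spillover_rate * herbivores * nontarget_share
        damage_target += (herbivores - spill) / target_plants
        damage_nontarget += spill / nontarget_plants

    return damage_target, damage_nontarget

# Compare patches with different target:non-target ratios, including the
# 1:1 and 3:2 ratios flagged as risky in the abstract.
for ratio in [(3, 1), (3, 2), (1, 1)]:
    dt, dn = simulate(*ratio)
    print(f"target:non-target {ratio[0]}:{ratio[1]} -> "
          f"per-plant pressure target={dt:.1f}, non-target={dn:.1f}")
```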
Abstract:
The global importance of grasslands is indicated by their extent; they comprise some 26% of total land area and 80% of agriculturally productive land. The majority of grasslands are located in tropical developing countries, where they are particularly important to the livelihoods of some one billion poor people. Grasslands clearly provide the feed base for grazing livestock and thus numerous high-quality foods, but such livestock also provide products such as fertilizer, transport, traction, fibre and leather. In addition, grasslands provide important services and roles, including as water catchments and biodiversity reserves, for cultural and recreational needs, and potentially as a carbon sink to alleviate greenhouse gas emissions. Inevitably, such functions may conflict with management for production of livestock products. Much of the increasing global demand for meat and milk, particularly from developing countries, will have to be supplied from grassland ecosystems, and this will present difficult challenges. Increased production of meat and milk generally requires increased intake of metabolizable energy, and thus increased voluntary intake and/or digestibility of the diets selected by grazing animals. Achieving this will require more widespread and effective application of improved management. Strategies to improve productivity include fertilizer application, grazing management, greater use of crop by-products, legumes and supplements, and manipulation of stocking rate and herbage allowance. However, it is often difficult to predict the efficiency and cost-effectiveness of such strategies, particularly in tropical developing-country production systems. Evaluation and on-going adjustment of grazing systems require appropriate and reliable assessment criteria, but these are often lacking. A number of emerging technologies may contribute to timely, low-cost acquisition of quantitative information to better understand soil-pasture-animal interactions and animal management in grassland systems. Developments in remote imaging of vegetation, global positioning technology, improved diet markers, near-infrared spectroscopy and modelling provide improved tools for knowledge-based decisions on the productivity constraints of grazing animals. Individual electronic identification of animals offers opportunities for precision management on an individual-animal basis for improved productivity. Improved outcomes in the form of livestock products, services and/or other benefits from grasslands should be possible, but clearly a diversity of solutions is needed for the vast range of environments and social circumstances of global grasslands.
Abstract:
The reliability of ants as bioindicators of ecosystem condition depends on the consistency of their response to localised habitat characteristics, which may be modified by larger-scale effects of habitat fragmentation and loss. We assessed the relative contribution of habitat fragmentation, habitat loss and within-patch habitat characteristics in determining ant assemblages in semi-arid woodland in Queensland, Australia. Species and functional group abundance were recorded using pitfall traps across 20 woodland patches in landscapes that exhibited a range of fragmentation states. Of the fragmentation measures, changes in patch area and patch edge contrast exerted the greatest influence on species assemblages, after accounting for differences in habitat loss. However, 35% of fragmentation effects on species were confounded with the effects of habitat characteristics and habitat loss. Within-patch habitat characteristics explained more than twice the amount of species variation attributable to fragmentation and four times the variation explained by habitat loss. The study indicates that within-patch habitat characteristics are the predominant drivers of ant composition. We suggest that caution should be exercised in interpreting the independent effects of habitat fragmentation and loss on ant assemblages without also considering localised habitat attributes and their joint effects.
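The analysis above partitions variation in ant assemblages among three sets of predictors (fragmentation, habitat loss and within-patch habitat) into unique and shared (confounded) fractions. As a minimal sketch of that logic, the Python code below performs variance partitioning with ordinary least-squares regressions on a single synthetic response variable; the published study would have used a multivariate, assemblage-level equivalent, and all data and variable names here are invented for illustration.

```python
# Sketch of variance partitioning among three predictor sets (fragmentation,
# habitat loss, within-patch habitat) using OLS on one synthetic response.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 20                                   # 20 woodland patches, as in the study
frag = rng.normal(size=(n, 2))           # e.g. patch area, edge contrast
loss = rng.normal(size=(n, 1))           # landscape-scale habitat loss
habitat = rng.normal(size=(n, 3))        # within-patch habitat characteristics
y = (0.5 * habitat[:, 0] + 0.2 * frag[:, 0] + 0.1 * loss[:, 0]
     + rng.normal(scale=0.5, size=n))    # synthetic "species abundance"

def r2(*blocks):
    """R-squared of an OLS fit using the given predictor blocks."""
    X = np.column_stack(blocks)
    return LinearRegression().fit(X, y).score(X, y)

full = r2(frag, loss, habitat)
unique_frag = full - r2(loss, habitat)       # variation only fragmentation explains
unique_loss = full - r2(frag, habitat)       # variation only habitat loss explains
unique_habitat = full - r2(frag, loss)       # variation only within-patch habitat explains
shared = full - (unique_frag + unique_loss + unique_habitat)

print(f"unique fragmentation: {unique_frag:.2f}")
print(f"unique habitat loss:  {unique_loss:.2f}")
print(f"unique within-patch:  {unique_habitat:.2f}")
print(f"shared (confounded):  {shared:.2f}")
```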
Abstract:
Hendra virus is a highly pathogenic novel paramyxovirus causing sporadic fatal infection in horses and humans in Australia. Species of fruit-bats (genus Pteropus), commonly known as flying-foxes, are the natural host of the virus. We undertook a survey of horse owners in the states of Queensland and New South Wales, Australia, to assess the level of adoption of recommended risk management strategies and to identify impediments to adoption. Survey questionnaires were completed by 1431 respondents from the target states, and from a spectrum of industry sectors. Hendra virus knowledge varied with sector, but was generally limited, with only 13% of respondents rating their level of knowledge as high or very high. The majority of respondents (63%) had seen their state’s Hendra virus information for horse owners, and a similar proportion found the information useful. Fifty-six percent of respondents thought it moderately, very or extremely likely that a Hendra virus case could occur in their area, yet only 37% said they would consider Hendra virus if their horse was sick. Only 13% of respondents stabled their horses overnight; a further 24% said it would be easy or very easy to do so but had not. Only 13% and 15% of respondents, respectively, had horse feed bins and water points under solid cover. Responses varied significantly with state, likely reflecting their different Hendra virus histories. The survey identified inconsistent awareness and/or adoption of available knowledge, confusion in relation to Hendra virus risk perception, with both over- and under-estimation of the true risk, and a lag in the uptake of recommended risk minimisation strategies, even when these were readily implementable. However, we also identified frustration and potential alienation among horse owners who found the recommended strategies impractical, onerous and prohibitively expensive. The insights gained from this survey have broader application to other complex risk-management scenarios.
Abstract:
In irrigated cropping, as with any other industry, profit and risk are interdependent. An increase in profit would normally coincide with an increase in risk, which means that risk can be traded for profit. It is desirable to manage a farm so that it achieves the maximum possible profit for the desired level of risk. This paper identifies risk-efficient cropping strategies that allocate land and water between crop enterprises for a case study of an irrigated farm in southern Queensland, Australia. This is achieved by applying stochastic frontier analysis to the output of a simulation experiment. The simulation experiment varied the level of business risk by systematically changing the crop sowing rules in a bioeconomic model of the case study farm. This model utilises the multi-field capability of the process-based Agricultural Production Systems Simulator (APSIM) and is parameterised using data collected from interviews with a collaborating farmer. We found that sowing rules that increased the farm area sown to cotton caused the greatest increase in risk-efficiency. Increasing maize area also improved risk-efficiency, but to a lesser extent than cotton. Sowing rules that increased the area sown to wheat reduced the risk-efficiency of the farm business. Sowing rules were identified that had the potential to improve expected farm profit by ca. $50,000 annually without significantly increasing risk. The concept of the shadow price of risk is discussed, and an expression that quantifies the trade-off between profit and risk is derived from the estimated frontier equation.
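The shadow price of risk referred to above is the slope of the estimated frontier: the marginal gain in expected profit per additional unit of risk accepted. The paper derives it from a stochastic frontier equation; as a simplified, hypothetical illustration only, the Python sketch below fits an ordinary quadratic envelope to simulated (risk, profit) pairs and differentiates it, which plays the same role. The functional form, the crude least-squares envelope fit and all numbers are assumptions, not the estimates reported in the study.

```python
# Hypothetical illustration of a profit-risk frontier and its slope, the
# "shadow price of risk". A quadratic envelope E[profit] = a + b*risk + c*risk^2
# is fitted to simulated strategy outcomes; the study itself used stochastic
# frontier analysis rather than this crude least-squares envelope.
import numpy as np

rng = np.random.default_rng(1)
risk = rng.uniform(10_000, 80_000, size=200)        # risk of each strategy ($ s.d. of profit)
profit = (150_000 + 2.0 * risk - 1.2e-5 * risk**2   # notional frontier
          - rng.uniform(0, 40_000, size=200))       # inefficiency below the frontier

# Approximate the upper envelope by keeping the best strategy in each risk band.
bands = np.digitize(risk, np.linspace(10_000, 80_000, 15))
best = [np.argmax(np.where(bands == band, profit, -np.inf))
        for band in np.unique(bands)]
c, b, a = np.polyfit(risk[best], profit[best], deg=2)  # fitted frontier coefficients

def shadow_price(r):
    """Slope d E[profit] / d risk of the fitted frontier at risk level r."""
    return 2 * c * r + b

print(f"shadow price of risk at $30,000 of risk: {shadow_price(30_000):.2f}")
```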
Abstract:
Bats of the genus Pteropus (flying-foxes) are the natural host of Hendra virus (HeV), which periodically causes fatal disease in horses and humans in Australia. The increased urban presence of flying-foxes often provokes negative community sentiment because of reduced social amenity and concerns about HeV exposure risk, and has resulted in calls for the dispersal of urban flying-fox roosts. However, it has been hypothesised that disturbance of urban roosts may result in a stress-mediated increase in HeV infection in flying-foxes, and an increased spillover risk. We sought to examine the impact of roost modification and dispersal on HeV infection dynamics and cortisol concentration dynamics in flying-foxes. The data were analysed in generalised linear mixed models using restricted maximum likelihood (REML). The difference in mean HeV prevalence in samples collected before (4.9%), during (4.7%) and after (3.4%) roost disturbance was small and non-significant (P = 0.440). Similarly, the difference in mean urine specific gravity-corrected urinary cortisol concentrations was small and non-significant (before = 22.71 ng/mL, during = 27.17 ng/mL, after = 18.39 ng/mL; P = 0.550). We did find an underlying association between cortisol concentration and season, and between cortisol concentration and region, suggesting that other (plausibly biological or environmental) variables play a role in cortisol concentration dynamics. The effect of roost disturbance on cortisol concentration approached statistical significance for region, suggesting that the relationship is not fixed, and plausibly reflecting the nature and timing of disturbance. We also found a small positive statistical association between HeV excretion status and urinary cortisol concentration. Finally, we found that the level of flying-fox distress associated with roost disturbance reflected the nature and timing of the activity, highlighting the need for a ‘best practice’ approach to dispersal or roost modification activities. The findings usefully inform public discussion and policy development in relation to Hendra virus and flying-fox management.
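The mixed-model analysis above is not specified in detail in the abstract. As a hedged sketch of what a REML-fitted mixed model for the cortisol outcome might look like, the Python code below uses statsmodels with a random intercept for roost and fixed effects for disturbance phase, season and region; all column names and data are synthetic assumptions, and the binary HeV excretion outcome would require a binomial mixed model rather than this linear one.

```python
# Hedged sketch of a linear mixed model for urinary cortisol fitted by REML,
# with roost as a random effect. All column names and the synthetic data are
# hypothetical; the published models and variables may differ.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "phase": rng.choice(["before", "during", "after"], size=n),
    "season": rng.choice(["wet", "dry"], size=n),
    "region": rng.choice(["north", "south"], size=n),
    "roost": rng.choice([f"roost_{i}" for i in range(8)], size=n),
})
# Synthetic specific gravity-corrected urinary cortisol values (ng/mL).
df["cortisol"] = 20 + 5 * (df["season"] == "wet") + rng.normal(scale=8, size=n)

# Random intercept for roost; fixed effects for disturbance phase, season, region.
model = smf.mixedlm("cortisol ~ C(phase) + C(season) + C(region)",
                    data=df, groups=df["roost"])
result = model.fit(reml=True)   # restricted maximum likelihood, as in the study
print(result.summary())
```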
Abstract:
Reforestation will have important consequences for the global challenges of mitigating climate change, arresting habitat decline and ensuring food security. We examined field-scale trade-offs between the carbon sequestration of tree plantings, their biodiversity potential and the loss of agricultural land. Extensive surveys of reforestation across temperate and tropical Australia (N = 1491 plantings) were used to determine how planting width and species mix affect carbon sequestration during early development (<15 years). Carbon accumulation per unit area increased significantly with decreasing planting width and with an increasing proportion of eucalypts (the predominant over-storey genus). The highest biodiversity potential was achieved through block plantings (width > 40 m) with about 25% of planted individuals being eucalypts. Carbon and biodiversity goals were balanced in mixed-species plantings by establishing narrow belts (width < 20 m) with a high proportion (>75%) of eucalypts, and in monocultures of mallee eucalypts by using the widest belts (ca. 6–20 m). Impacts on agriculture were minimized by planting narrow belts (ca. 4 m) of mallee eucalypt monocultures, which had the highest carbon-sequestering efficiency. A plausible scenario in which only 5% of the highly cleared areas (<30% native vegetation cover remaining) of temperate Australia are reforested showed substantial mitigation potential. Total carbon sequestration after 15 years was up to 25 Mt CO₂-e year⁻¹ when carbon and biodiversity goals were balanced, and 13 Mt CO₂-e year⁻¹ if block plantings of the highest biodiversity potential were established. Even when reforestation was restricted to marginal agricultural land (<$2000 ha⁻¹ land value; 28% of the land under agriculture in Australia), the total mitigation potential after 15 years was 17–26 Mt CO₂-e year⁻¹ using narrow belts of mallee plantings. This work provides guidance on land use to governments and planners. We show that the multiple benefits of young tree plantings can be balanced by manipulating planting width and species choice at establishment. In highly cleared areas, such plantings can sequester substantial biomass carbon while improving biodiversity and causing negligible loss of agricultural land.