40 results for extreme response bias
Abstract:
Liquid forms of phosphorus (P) have been shown to be more effective than granular P for promoting cereal growth in alkaline soils with high levels of free calcium carbonate on Eyre Peninsula, South Australia. However, the advantage of liquid over granular P forms of fertiliser has not been fully investigated across the wide range of soils used for grain production in Australia. A glasshouse pot experiment tested whether liquid P fertilisers were more effective for growing spring wheat (Triticum aestivum L.) than granular P (monoammonium phosphate) in 28 soils from across Australia with soil pH (H2O) ranging from 5.2 to 8.9. Application of liquid P resulted in greater shoot biomass, as measured after 4 weeks' growth (mid to late tillering, Feekes growth stage 2-3), than granular P in 3 of the acidic to neutral soils and in 3 alkaline soils. Shoot dry matter responses of spring wheat to applied liquid or granular P were related to soil properties to determine if any of the properties predicted superior yield responses to liquid P. The calcium carbonate content of soil was the only soil property that significantly contributed to predicting when liquid P was more effective than granular P. Five soil P test procedures (Bray, Colwell, resin, isotopically exchangeable P, and diffusive gradients in thin films (DGT)) were assessed to determine their ability to measure soil test P on subsamples of soil collected before the experiment started. These soil test values were then related to the dry matter shoot yields to assess their ability to predict wheat yield responses to P applied as liquid or granular P. All 5 soil test procedures provided a reasonable prediction of dry matter responses to applied P as either liquid or granular P, with the resin P test having a slightly greater predictive capacity on the range of soils tested.
The findings of this investigation suggest that liquid P fertilisers have some potential applications in non-calcareous soils and confirm current recommendations for the use of liquid P fertiliser to grow cereal crops in highly calcareous soils. Soil P testing procedures require local calibration against the P source that will be used to correct P deficiency.
Abstract:
Buffel grass [Pennisetum ciliare (L.) Link] has been widely introduced in the Australian rangelands as a consequence of its value for productive grazing, but tends to establish competitively in non-target areas such as remnant vegetation. In this study, we examined the influence that landscape-scale and local-scale variables had on the distribution of buffel grass in remnant poplar box (Eucalyptus populnea F. Muell.) dominant woodland fragments in the Brigalow Bioregion, Queensland. Buffel grass and the variables thought to influence its distribution in the region were measured at 60 sites, which were selected based on the amount of native woodland retained in the landscape and patch size. An information-theoretic modelling approach and hierarchical partitioning revealed that the most influential variable was the percentage of retained vegetation within a 1-km spatial extent. From this, we identified a critical threshold of ~30% retained vegetation in the landscape, above which the model predicted buffel grass was unlikely to occur in a woodland fragment. Other explanatory variables in the model were site based, and included litter cover and long-term rainfall. Given the paucity of information on the effect of buffel grass on biodiversity values, we undertook exploratory analyses to determine whether buffel grass cover influenced the distribution of grass, forb and reptile species. We detected some trends; hierarchical partitioning revealed that buffel grass cover was the most important explanatory variable describing the habitat preferences of four reptile species. However, establishing causal links - particularly between native grass and forb species and buffel grass - was problematic owing to possible confounding with grazing pressure. We conclude with a set of management recommendations aimed at reducing the spread of buffel grass into remnant woodlands.
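The kind of occurrence threshold reported above can be read directly off a fitted model. The sketch below assumes a simple logistic regression of buffel grass occurrence on percent retained vegetation; the intercept and slope are invented so that the 50% occurrence point lands at the reported ~30% figure, and are not the study's estimates.

```python
import math

# Hypothetical logistic model: logit(p) = intercept + slope * percent_retained.
# Coefficients are illustrative only, chosen so the threshold falls at 30%.
intercept, slope = 3.0, -0.1

def p_occurrence(percent_retained):
    """Predicted probability that buffel grass occurs in a woodland fragment."""
    z = intercept + slope * percent_retained
    return 1.0 / (1.0 + math.exp(-z))

# Threshold: the retained-vegetation cover at which predicted occurrence
# drops to 0.5; occurrence is unlikely above it and likely below it.
threshold = -intercept / slope
```

The direction of the effect (less retained vegetation, higher invasion probability) matches the abstract; the functional form and coefficients are assumptions.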
Abstract:
Pre-release evaluation of the efficacy of biological control agents is often not possible for many invasive species targeted for biocontrol. In such circumstances, simulating herbivory could yield significant insights into plant response to damage, thereby improving the efficiency of agent prioritisation, increasing the chances of regulating the performance of invasive plants through herbivory and minimising the potential risks posed by release of multiple herbivores. We adopted this approach to understand the weaknesses herbivores could exploit in managing the invasive liana Macfadyena unguis-cati. We simulated herbivory by damaging the leaves, stem, root and tuber of the plant, in isolation and in combination, and applied these treatments at multiple frequencies. Plant response in terms of biomass allocation showed that at least two severe defoliation treatments were required to diminish this liana's climbing habit and reduce its allocation to belowground tuber reserves. Belowground damage appears to have a negligible effect on the plant's biomass production, and tuber damage appears to trigger a compensatory response. Plant response to combinations of different types of damage did not differ significantly from that to leaf damage alone. This suggests that specialist herbivores in the leaf-feeding guild capable of removing over 50% of the leaf tissue may be desirable in the biological control of this invasive species.
Abstract:
Bos taurus indicus cattle are less susceptible to infestation with Rhipicephalus (Boophilus) microplus than Bos taurus taurus cattle but the immunological basis of this difference is not understood. We compared the dynamics of leukocyte infiltrations (T cell subsets, B cells, major histocompatibility complex (MHC) class II-expressing cells, granulocytes) in the skin near the mouthparts of larvae of R. microplus in B. t. indicus and B. t. taurus cattle. Previously naïve cattle were infested with 50,000 larvae (B. t. indicus) or 10,000 larvae (B. t. taurus) weekly for 6 weeks. One week after the last infestation all of the animals were infested with 20,000 larvae of R. microplus. Skin punch biopsies were taken from all animals on the day before the primary infestation and from sites of larval attachment on the day after the first, second, fourth and final infestations. Infiltrations with CD3+, CD4+, CD8+ and γδ T cells followed the same pattern in both breeds, showing relatively little change during the first four weekly infestations, followed by substantial increases at 7 weeks post-primary infestation. There was a tendency for more of all cell types except granulocytes to be observed in the skin of B. t. indicus cattle but the differences between the two breeds were consistently significant only for γδ T cells. Granulocyte infiltrations increased more rapidly from the day after infestation and were higher in B. t. taurus cattle than in B. t. indicus. Granulocytes and MHC class II-expressing cells infiltrated the areas closest to the mouthparts of larvae. A large volume of granulocyte antigens was seen in the gut of attached, feeding larvae.
Abstract:
Understanding plant demography and plant response to herbivory is critical to the selection of effective weed biological control agents. We adopt the metaphor of 'filters' to suggest how agent prioritisation may be improved to narrow our choices down to those likely to be most effective in achieving the desired weed management outcome. Models can serve to capture our level of knowledge (or ignorance) about our study system and we illustrate how one type of modelling approach (matrix models) may be useful in identifying the weak link in a plant life cycle by using a hypothetical and an actual weed example (Parkinsonia aculeata). Once the vulnerable stage has been identified we propose that studying plant response to herbivory (simulated and/or actual) can help identify the guilds of herbivores to which a plant is most likely to succumb. Taking only potentially effective agents through the filter of host specificity may improve the chances of releasing safe and effective agents. The methods we outline may not always lead us definitively to the successful agent(s), but such an empirical, data-driven approach will make the basis for agent selection explicit and serve as testable hypotheses once agents are released.
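The matrix-model filter can be illustrated with a minimal stage-structured example. Everything below is hypothetical: a three-stage (seed, juvenile, adult) Lefkovitch matrix with invented transition rates, not parameters for Parkinsonia aculeata. The elasticity matrix indicates which transition the population growth rate λ responds to most strongly, i.e. the weak link in the life cycle that an agent might target.

```python
import numpy as np

# Hypothetical stage-structured projection matrix (columns: seed, juvenile,
# adult). Entries are per-year transition rates and fecundity, invented
# purely for illustration.
A = np.array([
    [0.10, 0.00, 50.0],   # seed-bank survival; adult seed production
    [0.05, 0.30, 0.00],   # germination; juvenile stasis
    [0.00, 0.10, 0.90],   # maturation; adult survival
])

# Population growth rate (lambda) is the dominant eigenvalue of A.
eigvals, right = np.linalg.eig(A)
i = np.argmax(eigvals.real)
lam = eigvals.real[i]
w = right[:, i].real                 # stable stage distribution

# Left eigenvector (reproductive values) from the transpose.
eigvalsT, left = np.linalg.eig(A.T)
v = left[:, np.argmax(eigvalsT.real)].real

# Elasticities: proportional sensitivity of lambda to each transition.
# The largest entry flags the life-cycle stage most worth attacking.
sens = np.outer(v, w) / (v @ w)
elas = sens * A / lam
```

Elasticities sum to 1 by construction, so they can be compared directly across transitions; ranking them is one concrete way to operate the "vulnerable stage" filter described above.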
Abstract:
Live vaccines containing attenuated parasite strains are increasingly used to control chicken coccidiosis. In this paper, antibody responses elicited by infections with wild-type and attenuated strains of Eimeria tenella and E. necatrix were characterized by immunoblotting and ELISA with homologous and heterologous antisera. Few differences between antisera from birds infected with wild-type and attenuated strains of E. tenella were evident in immunoblots conducted with merozoite antigen preparations from both E. tenella strains; however, the reactivity of sera raised in birds infected with the wild-type strain was noticeably more intense. In ELISAs conducted with merozoite antigen preparations, antisera from birds infected with the wild-type strains of E. tenella and E. necatrix consistently produced a significantly higher (P < 0.05) antibody response than antisera from birds infected with the attenuated strains. Likewise, avidity ELISAs conducted with the E. tenella strains demonstrated that antibodies in birds infected with the wild-type strain were of significantly higher avidity (P < 0.05) than antibodies in birds infected with the attenuated strain. The differences in the antibody responses are probably due to changes in the attenuated strain as a result of selection for precocious development, and to the less severe tissue damage and inflammation of the intestine resulting from infection with the attenuated strain.
Abstract:
Runoff, soil loss, and nutrient loss were assessed on a Red Ferrosol in tropical Australia over 3 years. The experiment was conducted using bounded, 100-m² field plots cropped to peanuts, maize, or grass. A bare plot, without cover or crop, was also included as an extreme treatment. Results showed the importance of cover in reducing runoff, soil loss, and nutrient loss from these soils. Runoff ranged from 13% of incident rainfall for the conventional cultivation to 29% under bare conditions during the highest rainfall year, and was well correlated with event rainfall and rainfall energy. Soil loss ranged from 30 t/ha.year under bare conditions to <6 t/ha.year under cropping. Nutrient losses of 35 kg N and 35 kg P/ha.year under bare conditions and 17 kg N and 11 kg P/ha.year under cropping were measured. Soil carbon analyses showed a relationship with treatment runoff, suggesting that soil properties influenced the rainfall-runoff response. The cropping systems model PERFECT was calibrated using runoff, soil loss, and soil water data. Runoff and soil loss showed good agreement with observed data in the calibration, and soil water and yield had reasonable agreement. Long-term runs using historical weather data showed the episodic nature of runoff and soil loss events in this region and emphasised the need to manage land using protective measures such as conservation cropping practices. Farmers involved in related action-learning activities wished to incorporate conservation cropping findings into their systems but also needed clear production benefits to hasten practice change.
Abstract:
This section outlines the most important issues addressed in the management of the response in the two infected states, New South Wales and Queensland. Management of the response differed between the states for reasons of logistics, geography and organisational structure. Issues included the use of control centres and information centres, the problems associated with a lack of trained staff to undertake all the roles, legislative issues, controls on horse movements, the availability of resources for adequate surveillance, the challenges of communication between disparate groups, and the tracing of movements of both humans and horses.
Abstract:
The equine influenza (EI) outbreak presented many challenges that required high-level coordination and decision making, as well as the development of new approaches for satisfactory and consistent resolution. This paper outlines the elements of the national coordination arrangements, preparatory arrangements in place prior to the outbreak that facilitated national coordination, and some of the issues faced and resolved in the response.
Abstract:
Crop models for herbaceous ornamental species typically include functions for temperature and photoperiod responses, but very few incorporate vernalization, which is a requirement of many traditional crops. This study investigated the development of floriculture crop models, which describe temperature responses, plus photoperiod or vernalization requirements, using the Australian native ephemerals Brunonia australis and Calandrinia sp. A novel approach involved the use of a field crop modelling tool, DEVEL2. This optimization program estimates the parameters of selected functions within the development rate models using an iterative process that minimizes the residual sum of squares between estimated and observed days to the phenological event. Parameter profiling and jack-knifing are included in DEVEL2 to remove bias from parameter estimates and introduce rigour into the parameter selection process. Development rate of B. australis from planting to first visible floral bud (VFB) was predicted using a multiplicative approach with a curvilinear function to describe temperature responses and a broken-linear function to explain photoperiod responses. A similar model was used to describe the development rate of Calandrinia sp., except that the photoperiod function was replaced with an exponential vernalization function, which explained a facultative cold requirement and included a coefficient for determining the vernalization ceiling temperature. Temperature was the main environmental factor influencing development rate from VFB to anthesis for both species and was predicted using a linear model. The phenology models for B. australis and Calandrinia sp. described development rate from planting to VFB and from VFB to anthesis in response to temperature and photoperiod or vernalization and may assist modelling efforts for other herbaceous ornamental plants.
In addition to crop management, the vernalization function could be used to identify plant communities most at risk from predicted increases in temperature due to global warming.
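A multiplicative development-rate model of the type fitted with DEVEL2 can be sketched as follows. The functional forms (a simple quadratic curvilinear temperature response and a broken-linear photoperiod response) and every parameter value here are illustrative assumptions, not the fitted B. australis model.

```python
def temp_response(t, t_base=8.0, t_opt=24.0):
    """Curvilinear temperature response scaled to a 0-1 factor.
    Parameters (base and optimum temperatures) are invented."""
    f = 1.0 - ((t - t_opt) / (t_opt - t_base)) ** 2
    return max(0.0, f)

def photoperiod_response(p, p_base=10.0, p_opt=14.0):
    """Broken-linear photoperiod response for a facultative long-day
    plant: linear increase from p_base, capped at 1.0 beyond p_opt."""
    return min(1.0, max(0.0, (p - p_base) / (p_opt - p_base)))

def days_to_vfb(t, p, r_max=1.0 / 30.0):
    """Predicted days from planting to first visible floral bud (VFB)
    under constant temperature t (°C) and photoperiod p (h).
    Development rate = r_max * f(T) * g(P); days = 1 / rate."""
    rate = r_max * temp_response(t) * photoperiod_response(p)
    return float("inf") if rate == 0.0 else 1.0 / rate
```

Inverting the daily rate recovers days from planting to VFB, which is the quantity a tool like DEVEL2 compares against observed phenology when minimising the residual sum of squares.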
Abstract:
Fumigation of stored grain with phosphine (PH3) is used widely to control the lesser grain borer Rhyzopertha dominica. However, development of high-level resistance to phosphine in this species threatens control. Effective resistance management relies on knowledge of the expression of resistance in relation to dosage at all life stages. Therefore, we determined the mode of inheritance of phosphine resistance and the strength of the resistance phenotype at each developmental stage. We achieved this by comparing mortality and developmental delay between a strongly resistant strain (R-strain), a susceptible strain (S-strain) and their F1 progeny. Resistance was a maternally inherited, semi-dominant trait in the egg stage but was inherited as an autosomal, incompletely recessive trait in larvae and pupae. The rank order of developmental tolerance in both the sensitive and resistant strains was eggs > pupae > larvae. Comparison of published values for the response of adult R. dominica relative to our results from immature stages reveals that the adult stage of the S-strain is more sensitive to phosphine than are larvae. This situation is reversed in the R-strain, as the adult stage is much more resistant to phosphine than even the most tolerant immature stage. Phosphine resistance factors at LC50 were eggs 400×, larvae 87× and pupae 181× with respect to reference susceptible strain (S-strain) adults, indicating that tolerance conferred by a particular immature stage neither strongly nor reliably interacts with the genetic resistance element. Developmental delay relative to unfumigated control insects was observed in 93% of resistant pupae, 86% of resistant larvae and 41% of resistant eggs. Increased delay in development and the toxicity response to phosphine exposure were both incompletely recessive. We show that resistance to phosphine has pleiotropic effects and that the expression of these effects varies with genotype and throughout the life history of the insect.
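The resistance factors quoted above are ratios of LC50 values, each LC50 estimated from a log-dose mortality regression. A minimal sketch of that estimation, using invented dose-mortality data rather than the study's bioassay results:

```python
import math

# Hypothetical dose-mortality data for one strain/stage. LC50 is estimated
# by linear regression of logit(mortality) on log10(dose), the standard
# log-dose analysis behind resistance factors such as the 400x for eggs.
doses = [0.05, 0.1, 0.2, 0.4, 0.8]          # mg/L phosphine (invented)
mortality = [0.08, 0.25, 0.55, 0.82, 0.95]  # proportion killed (invented)

x = [math.log10(d) for d in doses]
y = [math.log(m / (1 - m)) for m in mortality]   # logit transform

# Ordinary least-squares fit of y on x.
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
intercept = ybar - slope * xbar

# LC50 is the dose at which logit(mortality) = 0, i.e. 50% kill.
lc50 = 10 ** (-intercept / slope)
```

A resistance factor is then the resistant strain's LC50 divided by the susceptible reference LC50, stage by stage.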
Abstract:
Propagation of subtropical eucalypts is often limited by low production of rooted cuttings in winter. This study tested whether changing the temperature of Corymbia citriodora and Eucalyptus dunnii stock plants from 28/23°C (day/night) to 18/13°C, 23/18°C or 33/28°C affected the production of cuttings by stock plants, the concentrations of Ca and other nutrients in cuttings, and the subsequent percentages of cuttings that formed roots. Optimal temperatures for shoot production were 33/28°C and 28/23°C, with lower temperatures reducing the number of harvested cuttings. Stock plant temperature regulated production of rooted cuttings, firstly by controlling shoot production and, secondly, by affecting the ensuing rooting percentage. Shoot production was the primary factor regulating rooted cutting production by C. citriodora, but both shoot production and root production were key determinants of rooted cutting production in E. dunnii. Effects of lower stock plant temperatures on rooting were not the result of reduced Ca concentration, but consistent relationships were found between adventitious root formation and B concentration. Average rooting percentages were low (1-15% for C. citriodora and 2-22% for E. dunnii) but rooted cutting production per stock plant (e.g. 25 for C. citriodora and 52 for E. dunnii over 14 weeks at 33/28°C) was sufficient to establish clonal field tests for plantation forestry.
Abstract:
There is an increasing need to understand what makes vegetation at some locations more sensitive to climate change than at others. For savanna rangelands, this requires building knowledge of how forage production in different land types will respond to climate change, and identifying how location-specific land type characteristics, climate and land management control the magnitude and direction of its responses to change. Here, a simulation analysis is used to explore how forage production in 14 land types of the north-eastern Australian rangelands responds to three climate change scenarios: +3°C, +17% rainfall; +2°C, -7% rainfall; and +3°C, -46% rainfall. Our results demonstrate that the controls on forage production responses are complex, with functional characteristics of land types interacting to determine the magnitude and direction of change. Forage production may increase by up to 60% or decrease by up to 90% in response to the extreme scenarios of change. The magnitude of these responses is dependent on whether forage production is water or nitrogen (N) limited, and how climate changes influence these limiting conditions. Forage production responds most to changes in temperature and moisture availability in land types that are water-limited, and shows the least amount of change when growth is restricted by N availability. The fertilisation effects of doubled atmospheric CO2 were found to offset declines in forage production under 2°C warming and a 7% reduction in rainfall. However, rising tree densities and declining land condition are shown to reduce potential opportunities from increases in forage production and raise the sensitivity of pastures to climate-induced water stress. Knowledge of these interactions can be applied in engaging with stakeholders to identify adaptation options.
Abstract:
Tribolium castaneum (Herbst) and Rhyzopertha dominica (F.) are common cosmopolitan pests of stored grain and grain products. We evaluated the relative attraction of T. castaneum and R. dominica to wheat, sorghum and cotton seeds in the field, near grain storage facilities and well away from storages, in southern and central Queensland using multiple trapping techniques. The results show that T. castaneum is more strongly attracted to linted cotton seed relative to wheat, whereas R. dominica did not respond to cotton seed at all and was attracted only to wheat. Significantly more adults of T. castaneum (10-15 times) were attracted to traps placed on the ground, near grain storage, than to equivalent traps that were suspended (1.5 m above the ground) nearby. These results suggest that Tribolium beetles detect and respond to resources towards the end of their dispersal flight, after which they localize resources while walking. By contrast, R. dominica was captured only in suspended traps, which suggests they fly directly onto resources as they localize them. The ability of both species to colonize and reproduce in isolated resource patches within the relatively short time of 1 month is illustrated by the returns from the traps deployed in the field (at least 1 km from the nearest stored grain), even though they caught only a few beetles. The results presented here provide novel insights into the resource location behaviours of both T. castaneum and R. dominica. In particular, the relationship of T. castaneum with non-cereal resources that are not conventionally associated with this species suggests an emphasis on these other resources in investigating the resource location behaviour of these beetles. This new perspective on the ecology of T. castaneum highlights the potential role of non-cereal resources (such as the lint on cotton seed) in the spread of grain pest infestations.
Abstract:
More than 1200 wheat and 120 barley experiments conducted in Australia to examine yield responses to applied nitrogen (N) fertiliser are contained in a national database of field crops nutrient research (BFDC National Database). The yield responses are accompanied by various pre-plant soil test data to quantify plant-available N and other indicators of soil fertility status or mineralisable N. A web application (BFDC Interrogator), developed to access the database, enables construction of calibrations between relative crop yield (RY; (Y0/Ymax) × 100) and N soil test value. In this paper we report the critical soil test values for 90% RY (CV90) and the associated critical ranges (CR90, defined as the 70% confidence interval around that CV90) derived from analysis of various subsets of these winter cereal experiments. Experimental programs were conducted throughout Australia’s main grain-production regions in different eras, starting from the 1960s in Queensland through to Victoria during the 2000s. Improved management practices adopted during the period were reflected in increasing potential yields with research era, increasing from an average Ymax of 2.2 t/ha in Queensland in the 1960s and 1970s, to 3.4 t/ha in South Australia (SA) in the 1980s, to 4.3 t/ha in New South Wales (NSW) in the 1990s, and 4.2 t/ha in Victoria in the 2000s. Various sampling depths (0.1–1.2 m) and methods of quantifying available N (nitrate-N or mineral-N) from pre-planting soil samples were used and provided useful guides to the need for supplementary N. The most regionally consistent relationships were established using nitrate-N (kg/ha) in the top 0.6 m of the soil profile, with regional and seasonal variation in CV90 largely accounted for through impacts on experimental Ymax. The CV90 for nitrate-N within the top 0.6 m of the soil profile for wheat crops increased from 36 to 110 kg nitrate-N/ha as Ymax increased over the range 1 to >5 t/ha.
Apparent variation in CV90 with seasonal moisture availability was entirely consistent with impacts on experimental Ymax. Further analyses of wheat trials with available grain protein (~45% of all experiments) established that grain yield and not grain N content was the major driver of crop N demand and CV90. Subsets of data explored the impact of crop management practices such as crop rotation or fallow length on both pre-planting profile mineral-N and CV90. Analyses showed that while management practices influenced profile mineral-N at planting and the likelihood and size of yield response to applied N fertiliser, they had no significant impact on CV90. A level of risk is involved with the use of pre-plant testing to determine the need for supplementary N application in all Australian dryland systems. In southern and western regions, where crop performance is based almost entirely on in-crop rainfall, this risk is offset by the management opportunity to split N applications during crop growth in response to changing crop yield potential. In northern cropping systems, where stored soil moisture at sowing is indicative of minimum yield potential, erratic winter rainfall increases uncertainty about actual yield potential as well as reducing the opportunity for effective in-season applications.
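The calibration step behind a CV90 can be sketched with a Mitscherlich-type response curve. The parameter values below are invented for illustration and are not the BFDC calibrations; CV90 is simply the soil nitrate-N value at which the fitted relative yield reaches 90%.

```python
import math

# Illustrative Mitscherlich relative-yield response to pre-plant nitrate-N
# (kg/ha, 0-0.6 m depth): RY = 100 * (1 - b * exp(-c * n)).
# b and c are hypothetical fitted parameters, not BFDC estimates.
b, c = 0.9, 0.025

def relative_yield(n):
    """Relative yield (%) as a function of soil nitrate-N (kg/ha)."""
    return 100.0 * (1.0 - b * math.exp(-c * n))

# CV90: invert the curve at RY = 90, i.e. solve 1 - b*exp(-c*n) = 0.9.
cv90 = -math.log((1.0 - 0.90) / b) / c
```

With these illustrative parameters the derived CV90 falls inside the 36-110 kg nitrate-N/ha range reported above; in practice the curve, and hence CV90, is refitted for each region, era and Ymax class.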