38 results for Response Elements


Relevance:

20.00%

Publisher:

Abstract:

Pre-release evaluation of the efficacy of biological control agents is often not possible for many invasive species targeted for biocontrol. In such circumstances, simulating herbivory can yield significant insights into plant response to damage, thereby improving the efficiency of agent prioritisation, increasing the chances of regulating the performance of invasive plants through herbivory, and minimising the potential risks posed by the release of multiple herbivores. We adopted this approach to identify the weaknesses herbivores could exploit in managing the invasive liana Macfadyena unguis-cati. We simulated herbivory by damaging the leaves, stem, root and tuber of the plant, in isolation and in combination, and applied these treatments at multiple frequencies. Plant response in terms of biomass allocation showed that at least two severe defoliation treatments were required to diminish this liana's climbing habit and reduce its allocation to belowground tuber reserves. Belowground damage appears to have a negligible effect on the plant's biomass production, and tuber damage appears to trigger a compensatory response. Plant response to combinations of different types of damage did not differ significantly from the response to leaf damage alone. This suggests that specialist herbivores in the leaf-feeding guild capable of removing over 50% of the leaf tissue may be desirable for the biological control of this invasive species.

Relevance:

20.00%

Publisher:

Abstract:

Bos taurus indicus cattle are less susceptible to infestation with Rhipicephalus (Boophilus) microplus than Bos taurus taurus cattle, but the immunological basis of this difference is not understood. We compared the dynamics of leukocyte infiltration (T cell subsets, B cells, major histocompatibility complex (MHC) class II-expressing cells and granulocytes) in the skin near the mouthparts of R. microplus larvae in B. t. indicus and B. t. taurus cattle. Previously naïve cattle were infested with 50,000 larvae (B. t. indicus) or 10,000 larvae (B. t. taurus) weekly for 6 weeks. One week after the last infestation, all of the animals were infested with 20,000 larvae of R. microplus. Skin punch biopsies were taken from all animals on the day before the primary infestation and from sites of larval attachment on the day after the first, second, fourth and final infestations. Infiltration by CD3+, CD4+, CD8+ and γδ T cells followed the same pattern in both breeds, showing relatively little change during the first four weekly infestations, followed by substantial increases at 7 weeks post-primary infestation. There was a tendency for more of all cell types except granulocytes to be observed in the skin of B. t. indicus cattle, but the differences between the two breeds were consistently significant only for γδ T cells. Granulocyte infiltration increased more rapidly from the day after infestation and was higher in B. t. taurus cattle than in B. t. indicus. Granulocytes and MHC class II-expressing cells infiltrated the areas closest to the larval mouthparts. Abundant granulocyte antigen was observed in the gut of attached, feeding larvae.

Relevance:

20.00%

Publisher:

Abstract:

Soft-leaf buffalo grass is increasing in popularity as an amenity turfgrass in Australia. This project was instigated to assess its adaptation and to establish management guidelines for its use across Australia's vast array of growing environments. There is an extensive selection of soft-leaf buffalo grass cultivars throughout Australia, and with the country's climates ranging from temperate in the south to tropical in the north, not all cultivars will be adapted to all regions. The project evaluated 19 buffalo grass cultivars along with other warm-season grasses including green couch, kikuyu and sweet smother grass. The soft-leaf buffalo grasses were evaluated for their growth and adaptation in a number of regions throughout Australia, including Western Australia, Victoria, the ACT, NSW and Queensland. The growth habit of the individual cultivars was examined along with their level of shade tolerance, water use, herbicide tolerance, resistance to wear, response to nitrogen applications and growth potential in highly alkaline soils.

The growth habit of the cultivars currently commercially available in Australia differs considerably, from the more robust types that spread more quickly and are thicker in appearance (Sir Walter, Kings Pride, Ned Kelly and Jabiru) to the dwarf types that are shorter and finer in appearance (AusTine and AusDwarf). The soft-leaf buffalo grass types tested do not differ in water use from old-style common buffalo grass. Thus, soft-leaf buffalo grasses, like other warm-season turfgrass species, are efficient in water use, and they recover after periods of low water availability. Individual cultivar differences were not discernible.

In high-pH (alkaline) soils, some elements essential for plant growth (e.g. iron and manganese) may be deficient, causing turfgrass to appear pale green and visually unacceptable. When 14 soft-leaf buffalo grass genotypes were grown on a highly alkaline soil (pH 7.5-7.9), cultivars differed in leaf iron, but not leaf manganese, concentrations.

Nitrogen is critical to the production of quality turf. The method of applying this essential element can be manipulated to minimise maintenance inputs (mowing) during the peak growing period (summer). By applying the greatest proportion of the turf's total nitrogen requirement in early spring, peak summer growth can be reduced, with a corresponding reduction in mowing requirements.

Soft-leaf buffalo grass cultivars are more shade and wear tolerant than other warm-season turfgrasses used by homeowners, although there are differences between individual buffalo grass varieties. The majority of types currently available would be classified as having moderate shade tolerance, and they wear reasonably well with good recovery rates. The impact of wear in a shaded environment was not tested; this needs investigation because it is a typical growing environment for many homeowners.

The use of herbicides is required to maintain quality soft-leaf buffalo grass turf. The development of softer herbicides for other turfgrasses has seen an increase in their popularity. The buffalo grass cultivars currently available showed varying levels of susceptibility to the chemicals tested; the majority of cultivars evaluated demonstrated low phytotoxicity to the herbicides chlorsulfuron (Glean) and fluroxypyr (Starane and Comet).

In general, soft-leaf buffalo grasses vary in their makeup and have demonstrated varying levels of tolerance, susceptibility and adaptation to the conditions under which they are grown. Consequently, the cultivar chosen should be the one most suited to the environment in which it is expected to perform and the management style to which it will be exposed. Future work is required to assess how the structure of the different cultivars affects their capacity to tolerate wear, varying shade levels, water use and herbicide tolerance. The development of a growth model may provide the solution.

Relevance:

20.00%

Publisher:

Abstract:

Understanding plant demography and plant response to herbivory is critical to the selection of effective weed biological control agents. We adopt the metaphor of 'filters' to suggest how agent prioritisation may be improved to narrow our choices down to those likely to be most effective in achieving the desired weed management outcome. Models can serve to capture our level of knowledge (or ignorance) about a study system, and we illustrate how one modelling approach (matrix models) may be useful in identifying the weak link in a plant life cycle, using both a hypothetical weed and an actual one (Parkinsonia aculeata). Once the vulnerable stage has been identified, we propose that studying plant response to herbivory (simulated and/or actual) can help identify the guilds of herbivores to which a plant is most likely to succumb. Taking only potentially effective agents through the filter of host specificity may improve the chances of releasing safe and effective agents. The methods we outline may not always lead definitively to the successful agent(s), but such an empirical, data-driven approach makes the basis for agent selection explicit and serves as a set of testable hypotheses once agents are released.
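
As a rough illustration of the matrix-model filter described above, the sketch below builds a stage-structured transition matrix, computes the population growth rate (the dominant eigenvalue) and derives elasticities, whose largest entry flags the life-cycle transition an agent should target. All stage names and parameter values are hypothetical and are not taken from the Parkinsonia aculeata study.

```python
# Minimal stage-structured matrix model with elasticity analysis.
# Stages: seed, seedling, juvenile, adult (hypothetical values).
import numpy as np

A = np.array([
    [0.10, 0.00, 0.00, 25.0],   # seed persistence + fecundity of adults
    [0.05, 0.20, 0.00, 0.00],   # germination; seedling stasis
    [0.00, 0.15, 0.45, 0.00],   # seedling -> juvenile; juvenile stasis
    [0.00, 0.00, 0.20, 0.90],   # juvenile -> adult; adult survival
])

wvals, wvecs = np.linalg.eig(A)
i = wvals.real.argmax()
lam = wvals[i].real                     # dominant eigenvalue = growth rate
w = np.abs(wvecs[:, i].real)            # stable stage distribution

vvals, vvecs = np.linalg.eig(A.T)
j = vvals.real.argmax()
v = np.abs(vvecs[:, j].real)            # reproductive values

S = np.outer(v, w) / (v @ w)            # sensitivities of lambda to each a_ij
E = S * A / lam                         # elasticities; largest = "weak link"
print(f"lambda = {lam:.3f}")
print("elasticity matrix:\n", np.round(E, 3))
```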

Relevance:

20.00%

Publisher:

Abstract:

Live vaccines containing attenuated parasite strains are increasingly used to control chicken coccidiosis. In this paper, antibody responses elicited by infections with wild-type and attenuated strains of Eimeria tenella and E. necatrix were characterized by immunoblotting and ELISA with homologous and heterologous antisera. Few differences between antisera from birds infected with wild-type and attenuated strains of E. tenella were evident in immunoblots conducted with merozoite antigen preparations from both E. tenella strains; however, the reactivity of sera raised in birds infected with the wild-type strain was noticeably more intense. In ELISAs conducted with merozoite antigen preparations, antisera from birds infected with the wild-type strains of E. tenella and E. necatrix consistently produced a significantly higher (P < 0.05) antibody response than antisera from birds infected with the attenuated strains. Likewise, avidity ELISAs conducted with the E. tenella strains demonstrated that antibodies in birds infected with the wild-type strain were of significantly higher avidity (P < 0.05) than antibodies in birds infected with the attenuated strain. The differences in the antibody responses are probably due to changes in the attenuated strain resulting from selection for precocious development, and to the less severe tissue damage and inflammation of the intestine resulting from infection with the attenuated strain.

Relevance:

20.00%

Publisher:

Abstract:

This section outlines the most important issues addressed in the management of the response in the two infected states, New South Wales and Queensland. There were differences in the management of the response between the states for logistical, geographic and organisational reasons. Issues included the use of control centres and information centres, the problems associated with a lack of trained staff to fill all roles, legislative issues, control of horse movements, the availability of resources for adequate surveillance, the challenges of communication between disparate groups, and tracing the movements of both humans and horses.

Relevance:

20.00%

Publisher:

Abstract:

Fumigation of stored grain with phosphine (PH3) is used widely to control the lesser grain borer, Rhyzopertha dominica. However, the development of high-level resistance to phosphine in this species threatens control. Effective resistance management relies on knowledge of the expression of resistance in relation to dosage at all life stages. We therefore determined the mode of inheritance of phosphine resistance and the strength of the resistance phenotype at each developmental stage, by comparing mortality and developmental delay between a strongly resistant strain (R-strain), a susceptible strain (S-strain) and their F1 progenies. Resistance was a maternally inherited, semi-dominant trait in the egg stage but was inherited as an autosomal, incompletely recessive trait in larvae and pupae. The rank order of developmental tolerance in both the sensitive and resistant strains was eggs > pupae > larvae. Comparison of published values for the response of adult R. dominica with our results from immature stages reveals that the adult stage of the S-strain is more sensitive to phosphine than the larvae. This situation is reversed in the R-strain, as the adult stage is much more resistant to phosphine than even the most tolerant immature stage. Phosphine resistance factors at LC50 were 400× for eggs, 87× for larvae and 181× for pupae with respect to adults of the reference susceptible strain (S-strain), indicating that the tolerance conferred by a particular immature stage neither strongly nor reliably interacts with the genetic resistance element. Developmental delay relative to unfumigated control insects was observed in 93% of resistant pupae, 86% of resistant larvae and 41% of resistant eggs. Both the increased developmental delay and the toxicity response to phosphine exposure were incompletely recessive. We show that resistance to phosphine has pleiotropic effects and that the expression of these effects varies with genotype and throughout the life history of the insect.
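
For readers unfamiliar with how figures such as 400× are obtained, the sketch below shows the standard route: fit a probit line to log-dose mortality data for each strain, read off the LC50, and express the resistant strain's LC50 as a multiple of the susceptible reference. The dose-mortality numbers are invented for illustration, and the study's own analysis may have used a dedicated probit package rather than this simplified regression.

```python
# Illustrative LC50 estimation via probit regression and resistance factor.
import numpy as np
from scipy import stats
from scipy.stats import norm

def lc50(doses_mg_l, mortality):
    """Probit regression: probit(mortality) vs log10(dose); returns LC50."""
    x = np.log10(doses_mg_l)
    y = norm.ppf(mortality)              # probit transform of proportions
    slope, intercept, *_ = stats.linregress(x, y)
    return 10 ** (-intercept / slope)    # dose where mortality = 50%

doses = np.array([0.005, 0.01, 0.05, 0.1, 0.5])      # mg/L phosphine (invented)
mort_s = np.array([0.12, 0.30, 0.75, 0.93, 0.99])    # susceptible-strain eggs
mort_r = np.array([0.01, 0.02, 0.08, 0.20, 0.55])    # resistant-strain eggs

rf = lc50(doses, mort_r) / lc50(doses, mort_s)       # resistance factor
print(f"egg-stage resistance factor ~ {rf:.0f}x")
```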

Relevance:

20.00%

Publisher:

Abstract:

Propagation of subtropical eucalypts is often limited by low production of rooted cuttings in winter. This study tested whether changing the temperature of Corymbia citriodora and Eucalyptus dunnii stock plants from 28/23°C (day/night) to 18/13°C, 23/18°C or 33/28°C affected the production of cuttings by stock plants, the concentrations of Ca and other nutrients in cuttings, and the subsequent percentages of cuttings that formed roots. Optimal temperatures for shoot production were 33/28°C and 28/23°C, with lower temperatures reducing the number of harvested cuttings. Stock plant temperature regulated production of rooted cuttings, firstly by controlling shoot production and, secondly, by affecting the ensuing rooting percentage. Shoot production was the primary factor regulating rooted cutting production by C. citriodora, but both shoot production and root production were key determinants of rooted cutting production in E. dunnii. Effects of lower stock plant temperatures on rooting were not the result of reduced Ca concentration, but consistent relationships were found between adventitious root formation and B concentration. Average rooting percentages were low (1-15% for C. citriodora and 2-22% for E. dunnii), but rooted cutting production per stock plant (e.g. 25 for C. citriodora and 52 for E. dunnii over 14 weeks at 33/28°C) was sufficient to establish clonal field tests for plantation forestry.

Relevance:

20.00%

Publisher:

Abstract:

Tribolium castaneum (Herbst) and Rhyzopertha dominica (F.) are common cosmopolitan pests of stored grain and grain products. We evaluated the relative attraction of T. castaneum and R. dominica to wheat, sorghum and cotton seeds in the field, near grain storage facilities and well away from storages, in southern and central Queensland using multiple trapping techniques. The results show that T. castaneum is more strongly attracted to linted cotton seed than to wheat, whereas R. dominica did not respond to cotton seed at all and was attracted only to wheat. Significantly more adults of T. castaneum (10-15 times) were attracted to traps placed on the ground near grain storage than to equivalent traps suspended 1.5 m above the ground nearby. These results suggest that Tribolium beetles detect and respond to resources towards the end of their dispersal flight, after which they localize resources while walking. By contrast, R. dominica was captured only in suspended traps, which suggests they fly directly onto resources as they localize them. The ability of both species to colonize and reproduce in isolated resource patches within the relatively short time of 1 month is illustrated by the returns from the traps deployed in the field (at least 1 km from the nearest stored grain), even though these traps caught only a few beetles. The results presented here provide novel insights into the resource location behaviours of both T. castaneum and R. dominica. In particular, the relationship of T. castaneum with non-cereal resources not conventionally associated with this species suggests that such resources deserve greater emphasis when investigating the resource location behaviour of these beetles. This new perspective on the ecology of T. castaneum highlights the potential role of non-cereal resources (such as the lint on cotton seed) in the spread of grain pest infestations.

Relevance:

20.00%

Publisher:

Abstract:

More than 1200 wheat and 120 barley experiments conducted in Australia to examine yield responses to applied nitrogen (N) fertiliser are contained in a national database of field crops nutrient research (the BFDC National Database). The yield responses are accompanied by various pre-plant soil test data that quantify plant-available N and other indicators of soil fertility status or mineralisable N. A web application (BFDC Interrogator), developed to access the database, enables construction of calibrations between relative crop yield (RY = (Y0/Ymax) × 100) and soil test N value. In this paper we report the critical soil test values for 90% RY (CV90) and the associated critical ranges (CR90, defined as the 70% confidence interval around that CV90) derived from analysis of various subsets of these winter cereal experiments. Experimental programs were conducted throughout Australia's main grain-production regions in different eras, starting from the 1960s in Queensland through to Victoria during the 2000s. Improved management practices adopted during the period were reflected in potential yields increasing with research era, from an average Ymax of 2.2 t/ha in Queensland in the 1960s and 1970s, to 3.4 t/ha in South Australia (SA) in the 1980s, 4.3 t/ha in New South Wales (NSW) in the 1990s, and 4.2 t/ha in Victoria in the 2000s. Various sampling depths (0.1-1.2 m) and methods of quantifying available N (nitrate-N or mineral-N) from pre-planting soil samples were used, and these provided useful guides to the need for supplementary N. The most regionally consistent relationships were established using nitrate-N (kg/ha) in the top 0.6 m of the soil profile, with regional and seasonal variation in CV90 largely accounted for through impacts on experimental Ymax. The CV90 for nitrate-N within the top 0.6 m of the soil profile for wheat crops increased from 36 to 110 kg nitrate-N/ha as Ymax increased over the range 1 to >5 t/ha. Apparent variation in CV90 with seasonal moisture availability was entirely consistent with impacts on experimental Ymax. Further analyses of the wheat trials with available grain protein data (~45% of all experiments) established that grain yield, and not grain N content, was the major driver of crop N demand and CV90. Subsets of data were used to explore the impact of crop management practices, such as crop rotation or fallow length, on both pre-planting profile mineral-N and CV90. These analyses showed that while management practices influenced profile mineral-N at planting and the likelihood and size of the yield response to applied N fertiliser, they had no significant impact on CV90. A level of risk is involved in the use of pre-plant testing to determine the need for supplementary N application in all Australian dryland systems. In southern and western regions, where crop performance is based almost entirely on in-crop rainfall, this risk is offset by the management opportunity to split N applications during crop growth in response to changing crop yield potential. In northern cropping systems, where stored soil moisture at sowing is indicative of minimum yield potential, erratic winter rainfall increases uncertainty about actual yield potential as well as reducing the opportunity for effective in-season applications.
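
A minimal sketch of the calibration step just described, under the assumption of a Mitscherlich-type response curve (the BFDC Interrogator's actual fitting procedure is not specified here): fit relative yield against pre-plant soil nitrate-N, then invert the fitted curve at RY = 90 to obtain CV90. The data points are invented.

```python
# Fit RY vs soil nitrate-N and solve for the critical value at RY = 90.
import numpy as np
from scipy.optimize import curve_fit

def mitscherlich(x, a, b):
    """RY (%) as an asymptotic function of soil nitrate-N (kg/ha)."""
    return a * (1.0 - np.exp(-b * x))

soil_n = np.array([10, 20, 40, 60, 90, 120, 160])   # kg nitrate-N/ha, 0-0.6 m
ry = np.array([35, 55, 78, 88, 95, 98, 99])         # relative yield, %

(a, b), _ = curve_fit(mitscherlich, soil_n, ry, p0=(100, 0.02))

# Invert RY = a(1 - exp(-b x)) at RY = 90 to get CV90.
cv90 = -np.log(1.0 - 90.0 / a) / b
print(f"CV90 ~ {cv90:.0f} kg nitrate-N/ha")
```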

Relevance:

20.00%

Publisher:

Abstract:

Soil testing is the most widely used tool to predict the need for fertiliser phosphorus (P) applications to crops. This study examined factors affecting critical soil P concentrations and their confidence intervals for wheat and barley grown in Australian soils, by interrogating validated data from 1777 wheat and 150 barley field treatment series now held in the BFDC National Database. To narrow the confidence intervals associated with estimated critical P concentrations, filters for yield, crop stress or low pH were applied. Once treatment series with low yield (<1 t/ha), severe crop stress or pH(CaCl2) <4.3 were screened out, critical concentrations were relatively insensitive to wheat yield (>1 t/ha). There was a clear increase in critical P concentration from the early trials, when full tillage was common, to those conducted in 1995-2011, a period of rapid shift towards the adoption of minimum tillage. For wheat, critical Colwell-P concentrations associated with 90 or 95% of maximum yield varied among Australian Soil Classification (ASC) Orders and Sub-orders: Calcarosol, Chromosol, Kandosol, Sodosol, Tenosol and Vertosol. Soil type, based on ASC Orders and Sub-orders, produced critical Colwell-P concentrations at 90% of maximum relative yield ranging from 15 mg/kg (Grey Vertosol) to 47 mg/kg (Supracalcic Calcarosol), with other soils having values in the range 19-27 mg/kg. Distinctive differences in critical P concentrations were evident among Sub-orders of Calcarosols, Chromosols, Sodosols, Tenosols and Vertosols, possibly due to differences in soil properties related to P sorption; however, insufficient data were available to develop a relationship between P buffering index (PBI) and critical P concentration. In general, there was no evidence that critical concentrations for barley would differ from those for wheat on the same soils. Significant knowledge gaps to fill to improve the relevance and reliability of soil P testing for winter cereals were: the lack of data for oats; the paucity of treatment series reflecting current cropping practices, especially minimum tillage; and inadequate metadata on soil texture, pH, growing season rainfall, gravel content and PBI. The critical concentrations determined illustrate the importance of recent experimental data and of soil type, but also provide examples of interrogation pathways into the BFDC National Database to extract locally relevant critical P concentrations for guiding P fertiliser decision-making in wheat and barley.
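
One plausible way to attach a confidence interval to a critical concentration, shown below, is to bootstrap the treatment series and refit the calibration on each resample; the percentile spread of the refitted critical values then plays the role of the critical range (analogous to the CR90 defined in the nitrogen study above). The Colwell-P data are invented, and the BFDC analysis itself may use a different interval method.

```python
# Bootstrap a 70% interval around a critical Colwell-P concentration.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

def mitscherlich(x, a, b):
    return a * (1.0 - np.exp(-b * x))

colwell_p = np.array([5, 8, 12, 18, 25, 35, 50, 70])   # mg P/kg soil (invented)
ry = np.array([40, 55, 68, 80, 89, 95, 98, 99])        # relative yield, %

def critical_value(x, y, target=90.0):
    (a, b), _ = curve_fit(mitscherlich, x, y, p0=(100, 0.05), maxfev=5000)
    if a <= target:                  # asymptote below target: no critical value
        return float("nan")
    return -np.log(1.0 - target / a) / b

boots = []
for _ in range(1000):
    idx = rng.integers(0, len(ry), len(ry))            # resample with replacement
    try:
        cv = critical_value(colwell_p[idx], ry[idx])
    except RuntimeError:                               # occasional failed fit
        continue
    if np.isfinite(cv):
        boots.append(cv)

lo, hi = np.percentile(boots, [15, 85])                # 70% interval, as for CR90
print(f"critical Colwell-P ~ {critical_value(colwell_p, ry):.0f} mg/kg "
      f"(70% CI {lo:.0f}-{hi:.0f})")
```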

Relevance:

20.00%

Publisher:

Abstract:

A recent report to the Australian Government identified concerns about Australia's capacity to respond to a medium to large outbreak of FMD. To assess the resources required, the AusSpread disease simulation model was used to develop a plausible outbreak scenario comprising 62 infected premises in five states at the time of detection, 28 days after the disease entered the first property in Victoria. Movements of infected animals and/or contaminated product and equipment led to smaller outbreaks in NSW, Queensland, South Australia and Tasmania. With unlimited staff resources, the outbreak was eradicated in 63 days with 54 infected premises and a 98% chance of eradication within 3 months. This unconstrained response was estimated to involve 2724 personnel. Because unlimited personnel was considered unrealistic, the course of the outbreak was modelled using three levels of staffing, and the probability of achieving eradication within 3 or 6 months of introduction was determined. Under the baseline staffing level, there was only a 16% probability that the outbreak would be eradicated within 3 months, and a 60% probability of eradication within 6 months. Deployment of an additional 60 personnel in the first 3 weeks of the response increased the likelihood of eradication within 3 months to 68%, and within 6 months to 100%. Deployment of further personnel incrementally increased the likelihood of timely eradication and decreased the duration and size of the outbreak. Targeted use of vaccination in high-risk areas, coupled with the baseline personnel resources, increased the probability of eradication to 74% within 3 months and 100% within 6 months; this required 25 vaccination teams commencing 12 days into the control program, increasing to 50 vaccination teams 3 weeks later. Deploying an equal number of additional personnel to surveillance and infected-premises operations was equally effective in reducing the outbreak size and duration.
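
To make the staffing-versus-probability trade-off concrete, here is a toy Monte Carlo in the spirit of (but far simpler than) AusSpread: each run is a crude branching process in which response teams clear infected premises while the remaining premises seed new ones, and the eradication probability for a staffing level is simply the fraction of runs that reach zero premises within 90 days. All parameters are invented.

```python
# Toy Monte Carlo estimate of P(eradication within 90 days) by staffing level.
import random

def run_outbreak(teams, days=90, p_spread=0.065, seed_ips=62):
    """Return True if no infected premises (IPs) remain after `days` days."""
    active = seed_ips
    for _ in range(days):
        active = max(active - teams, 0)    # each team clears one IP per day
        # each remaining IP sparks a new one with probability p_spread
        active += sum(random.random() < p_spread for _ in range(active))
        if active == 0:
            return True
    return False

for teams in (4, 5, 6):                    # three hypothetical staffing levels
    p = sum(run_outbreak(teams) for _ in range(2000)) / 2000
    print(f"{teams} teams: P(eradication within 90 d) ~ {p:.2f}")
```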

Relevance:

20.00%

Publisher:

Abstract:

A significantly increased water regime can lead to inundation of rivers, creeks and surrounding floodplains, and thus affect the temporal dynamics of both the extant vegetation and the dormant but viable soil seed bank of riparian corridors. This study documented changes in the soil seed bank along riparian corridors before and after a major flood event in January 2011 in southeast Queensland, Australia. The study site was Mooleyember Creek, a major watercourse near Roma, central Queensland, that was impacted by the extreme flood event and where baseline ecological data on riparian seed-bank populations had previously been collected in 2007, 2008 and 2009. After the flood, we collected further soil samples from the same locations in spring/summer (November-December 2011) and in early autumn (March 2012). The soils were then exposed to adequate warmth and moisture under glasshouse conditions, and emerged seedlings were identified taxonomically. Flooding increased seed-bank abundance but decreased its species richness and diversity; however, the flood's impact was smaller than year-to-year effects, though greater than seasonal variation. Seeds of trees and shrubs were few in the soil and were negatively affected by the flood, whereas those of herbaceous species and graminoids were numerous and proliferated after the flood. Seed banks of weedy and/or exotic species were no more affected by the flood than those of native and/or non-invasive species. Overall, the studied riparian zone showed evidence of a quick recovery of its seed bank over time, and can be considered resilient to an extreme flood event.
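
The abundance, richness and diversity comparisons above reduce to simple computations on seedling-emergence counts; the sketch below shows them using species richness and the Shannon index. The counts are invented stand-ins for the glasshouse germination data.

```python
# Species richness and Shannon diversity from seedling-emergence counts.
import math

def richness(counts):
    """Number of species with at least one emerged seedling."""
    return sum(1 for n in counts.values() if n > 0)

def shannon(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i)."""
    total = sum(counts.values())
    return -sum((n / total) * math.log(n / total)
                for n in counts.values() if n > 0)

pre_flood = {"Cyperus sp.": 40, "Juncus sp.": 22,
             "Acacia sp.": 6, "Eragrostis sp.": 15}
post_flood = {"Cyperus sp.": 120, "Juncus sp.": 85, "Eragrostis sp.": 60}

for label, c in (("pre-flood", pre_flood), ("post-flood", post_flood)):
    print(f"{label}: abundance={sum(c.values())}, "
          f"richness={richness(c)}, Shannon H'={shannon(c):.2f}")
```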

Relevance:

20.00%

Publisher:

Abstract:

Cat's claw creeper vine, Dolichandra unguis-cati (L.) L.G.Lohmann (formerly Macfadyena unguis-cati (L.) A.H.Gentry), a Weed of National Significance (WoNS), is a structural woody parasite that is highly invasive along sensitive riparian corridors and native forests of coastal and inland eastern Australia. As part of an evaluation of the impact of herbicide and mechanical/physical control techniques on the long-term reduction of the weed's biomass and the expected return of native flora, we set up permanent vegetation plots in patches that were: (a) infested and chemically/physically treated, (b) infested but untreated, and (c) uninfested. The treatments were established in both riparian and non-riparian habitats to document changes in the seed bank flora over a two-year post-treatment period. Response to treatment varied spatially and temporally. Following chemical and physical removal treatments, treated patches exhibited lower seed bank abundance and diversity than both untreated infested patches and patches lacking the weed, but the differences were not statistically significant. It therefore appears that spraying herbicides at the recommended rate does not undermine restoration efforts.