31 results for 100 years
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Fortunately, plants have developed highly effective mechanisms with which to defend themselves when attacked by potentially disease-causing microorganisms. Without these, they would succumb to the many pathogenic fungi, bacteria, viruses, nematodes and insect pests, and disease would prevail. These natural defence systems of plants can be deliberately activated to provide some protection against the major pathogens responsible for causing severe yield losses in agricultural and horticultural crops. This is the basis of what is known as ‘induced’ or ‘acquired’ disease resistance in plants. Although the phenomenon of induced resistance has been known amongst plant pathologists for over 100 years, its inclusion in pest and disease management programmes has been a relatively recent development, i.e. within the last 5 years. This review will discuss very briefly some of the characteristics of the induced resistance phenomenon, outline some of the advantages and limitations to its implementation and provide some examples within a postharvest pathology context. Finally, some approaches being investigated by the fruit pathology team at DPI Indooroopilly and collaborators will be outlined.
Abstract:
This paper reports on the use of APSIM - Maize for retrospective analysis of performance of a high-input, high-yielding maize crop and analysis of predicted performance of maize grown with high inputs over the long term (>100 years) for specified scenarios of environmental conditions (temperature and radiation) and agronomic inputs (sowing date, plant population, nitrogen fertiliser and irrigation) at Boort, Victoria, Australia. It uses a high-yielding (17 400 kg/ha dry grain, 20 500 kg/ha at 15% water) commercial crop grown in 2004-05 as the basis of the study. Yield for the agronomic and environmental conditions of 2004-05 was predicted accurately, giving confidence that the model could be used for the detailed analyses undertaken. The analysis showed that the yield achieved was close to that possible with the conditions and agronomic inputs of 2004-05. Sowing dates during 21 September to 26 October had little effect on predicted yield, except when combined with reduced temperature. Single-year and long-term analyses concluded that a higher plant population (11 plants/m2) is needed to optimise yield, but that slightly lower N and irrigation inputs are appropriate for the plant population used commercially (8.4 plants/m2). Also, compared with changes in agronomic inputs, increases in temperature and/or radiation had relatively minor effects, except that reduced temperature reduces predicted yield substantially. This study provides an approach for the use of models for both retrospective analysis of crop performance and assessment of long-term variability of crop yield under a wide range of agronomic and environmental conditions.
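The relationship between the two yield figures quoted above is a standard moisture-basis conversion, which can be checked with a one-line calculation (a sketch; only the 15% water content is taken from the abstract):

```python
# Convert dry grain yield to a 15% moisture (wet) basis.
dry_yield = 17400              # kg/ha, dry grain (from the abstract)
moisture = 0.15                # target moisture fraction (15% water)

wet_yield = dry_yield / (1 - moisture)
print(round(wet_yield))        # 20471, consistent with the quoted ~20 500 kg/ha
```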
Abstract:
Rabbits continued to infest Bulloo Downs in southwest Queensland even after rabbit haemorrhagic disease virus (RHDV) had effectively reduced rabbit populations to very low levels in most other arid parts of Australia. Control efforts for over 100 years have all appeared unable to stop rabbits causing damage to cattle production and native plants and animals in the area. In 2001, an experiment was established to measure the benefit of rabbit control to biodiversity and cattle production; it showed that warren ripping caused an immediate reduction in rabbit activity. Three months after ripping there were still 98% fewer rabbits in ripped plots despite these plots being exposed to invasion from surrounding populations. The cost of ripping was high because of the high density of warrens and is prohibitive for a full-scale programme. Nevertheless, ripping warrens just in the rabbit’s drought refuge (2002-2004) appears to have effectively controlled rabbits over the entire property. Following one good season, rabbits still have not recovered where the drought refuge was effectively ripped. Destroying warrens in the areas where rabbits survived droughts achieved a reduction in rabbits of over 99% compared to a similar area near Coongie Lakes in South Australia. Low rabbit numbers allowed cattle to continue to be run on the property even though the area experienced seven consecutive years with below-average rainfall. It remains to be seen whether rabbits can recover from this low population base during a run of good seasons. If rabbit numbers remain suppressed after a run of good seasons, then rabbit control by destruction of drought refuges could be repeated at Coongie Lakes and other drought refuge areas in the arid zone. Identification and treatment of areas similar to Bulloo Downs where rabbits survive drought may relieve a very large area of arid Australia from the damage caused by rabbits.
Abstract:
Despite biocontrol research spanning over 100 years, the hybrid weed commonly referred to as Lantana camara is not under adequate control. Host specificity and varietal preference of released agents, climatic suitability of a region for released agents, number of agents introduced and range or area of infestation appear to play a role in limiting biocontrol success. At least one of 41 species of mainly leaf- or flower-feeding insects has been introduced, or spread, to 41 of the 70 countries or regions where lantana occurs. Over half (26) of these species have established, achieving varying levels of herbivory and presumably some degree of control. Accurate taxonomy of the plant and adaptation of potential agents to the host plant are some of the better predictors of at least establishment success. Retrospective analysis of the hosts of introduced biocontrol agents for L. camara shows that a greater proportion of agents collected from L. camara or Lantana urticifolia established than of agents collected from other species of Lantana. Of the introduced agents that were oligophagous, 18 out of 22 established. The proportion of species establishing declined with the number of species introduced. However, there was no trend when oceanic islands were treated separately from mainland areas, and the result is likely an artefact of how introductions have changed over time. A calculated index of the degree of herbivory due to agents known to have caused some damage per country was not related to land area infested with lantana for either mainlands or oceanic islands. However, the degree of herbivory is much higher on islands than mainlands. This difference between island and mainland situations may reflect population dynamics in patchy or metapopulation landscapes. Basic systematic studies of the host remain crucial to successful biocontrol, especially of hybrid weeds like L. camara.
Potential biocontrol agents should be monophages collected from the most closely related species to the target weed or be phytophages that attack several species of lantana. Suitable agents should be released in the most ideal ecoclimatic area. Since collection of biocontrol agents has been limited to a fraction of the known number of phytophagous species available, biocontrol may be improved by targeting insects that feed on stems and roots, as well as the agents that feed on leaves and flowers.
Abstract:
Since their release over 100 years ago, camels have spread across central Australia and increased in number. Increasingly, they are being seen as a pest, with observed impacts from overgrazing and damage to infrastructure such as fences. Irregular aerial surveys since 1983 and an interview-based survey in 1966 suggest that camels have been increasing at close to their maximum rate. A comparison of three models of population growth fitted to these, albeit limited, data suggests that the Northern Territory population has indeed been growing at an annual exponential rate of r = 0.074, or 8% per year, with little evidence of a density-dependent brake. A stage-structured model using life history data from a central Australian camel population suggests that this rate approximates the theoretical maximum. Elasticity analysis indicates that adult survival is by far the biggest influence on the rate of increase and that a 9% reduction in survival from 96% is needed to stop the population growing. In contrast, at least 70% of mature females need to be sterilised to have a similar effect. In a benign environment, a population of large mammals such as camels is expected to grow exponentially until close to carrying capacity. This will frustrate control programs because, the longer culling or harvesting effort is delayed, the more animals will need to be removed each year to achieve zero growth. A population projection for 2008 suggests ~10 500 animals need to be harvested across the Northern Territory. Current harvests are well short of this. The ability of commercial harvesting to control camel populations in central Australia will depend on the value of animals, access to animals and the presence of alternative species to harvest when camels are at low density.
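The exponential rate reported above (r = 0.074) and the quoted ~8% per year can be reconciled with a short calculation; this is a sketch of the standard exponential growth model N(t) = N0·e^(rt), not the authors' fitted model, and the starting population used below is hypothetical:

```python
import math

# Annual exponential growth rate reported for the NT camel population.
r = 0.074

# The finite annual rate implied by r: e^r - 1.
annual_pct = math.exp(r) - 1
print(f"{annual_pct:.1%}")            # 7.7%, consistent with the quoted ~8% per year

# Projecting a population forward t years under unchecked exponential growth.
def project(n0, r, t):
    return n0 * math.exp(r * t)

# Doubling time at this rate: ln(2)/r, roughly 9.4 years.
doubling_time = math.log(2) / r
print(round(doubling_time, 1))        # 9.4
```

Under this model the annual removal needed for zero growth scales with the current population, which is why delaying control increases the required harvest.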
Abstract:
Context: For over 100 years, control efforts have been unable to stop rabbits causing damage to cattle production and native plants and animals on large properties in arid parts of Australia. Warren destruction by ripping has shown promise, but doubts about long-term success and the perceived expense of treating vast areas have led to this technique not being commonly used. Aims: This study measured the long-term reduction in rabbit activity and calculated the potential cost saving associated with treating just the areas where rabbits are believed to survive drought. We also considered whether ripping should be used in a full-scale rabbit control program on a property where rabbits have been exceptionally resilient to the influence of biological and other control measures. Methods: Rabbits were counted along spotlight transects before warrens were ripped and during the two years after ripping, in treated and untreated plots. Rabbit activity was recorded to determine the immediate and long-term impact of ripping, up to seven years after treatment. The costs of ripping warrens within different distances from drought refuge areas were calculated. Key results: Destroying rabbit warrens by ripping caused an immediate reduction in rabbit activity and there were still 98% fewer rabbits counted by spotlight in ripped plots five months after ripping. Seven years after ripping no active warrens were found in ripped plots, whereas 57% of warrens in unripped plots showed signs of rabbit activity. The cost of ripping only the areas where rabbits were likely to seek refuge from drought was calculated to be less than 4% of the cost of ripping all warrens on the property. Conclusions: Destroying rabbit warrens by ripping is a very effective way of reducing rabbit numbers on large properties in arid Australia. Ripping should commence in areas used by rabbits to survive drought. It is possible that no further ripping will be required.
Implications: Strategic destruction of warrens in drought refuge areas could provide an alternative to biological control for managing rabbits on large properties in the Australian arid zone.
Abstract:
Genotype-environment interactions (GEI) limit genetic gain for complex traits such as tolerance to drought. Characterization of the crop environment is an important step in understanding GEI. A modelling approach is proposed here to characterize broadly (large geographic area, long-term period) and locally (field experiment) drought-related environmental stresses, which enables breeders to analyse their experimental trials with regard to the broad population of environments that they target. Water-deficit patterns experienced by wheat crops were determined for drought-prone north-eastern Australia, using the APSIM crop model to account for the interactions of crops with their environment (e.g. feedback of plant growth on water depletion). Simulations based on more than 100 years of historical climate data were conducted for representative locations, soils, and management systems, for a check cultivar, Hartog. The three main environment types identified differed in their patterns of simulated water stress around flowering and during grain-filling. Over the entire region, the terminal drought-stress pattern was most common (50% of production environments) followed by a flowering stress (24%), although the frequencies of occurrence of the three types varied greatly across regions, years, and management. This environment classification was applied to 16 trials relevant to late-stage testing of a breeding programme. The incorporation of the independently-determined environment types in a statistical analysis assisted interpretation of the GEI for yield among the 18 representative genotypes by reducing the relative effect of GEI compared with genotypic variance, and helped to identify opportunities to improve breeding and germplasm-testing strategies for this region.
Abstract:
Puccinia psidii, the causal agent of myrtle rust, was first recorded from Latin America more than 100 years ago. It occurs on many native species of Myrtaceae in Latin America and also infects non-native plantation-grown Eucalyptus species in the region. The pathogen has gradually spread to new areas including Australia and most recently South Africa. The aim of this study was to consider the susceptibility of selected Eucalyptus genotypes, particularly those of interest to South African forestry, to infection by P. psidii. In addition, risk maps were compiled based on suitable climatic conditions and the occurrence of potential susceptible tree species. This made it possible to identify the season when P. psidii would be most likely to infect and to define the geographic areas where the rust disease would be most likely to establish in South Africa. As expected, variation in susceptibility was observed between eucalypt genotypes tested. Importantly, species commonly planted in South Africa show good potential for yielding disease-tolerant material for future planting. Myrtle rust is predicted to be more common in spring and summer. Coastal areas, as well as areas in South Africa with subtropical climates, are more conducive to outbreaks of the pathogen.
Abstract:
Heavy wheel traffic causes soil compaction, which adversely affects crop production and may persist for several years. We applied known compaction forces to entire plots annually for 5 years, and then determined the duration of the adverse effects on the properties of a Vertisol and the performance of crops under no-till dryland cropping with residue retention. For up to 5 years after a final treatment with a 10 Mg axle load on wet soil, soil shear strength at 70-100 mm and cone index at 180-360 mm were significantly (P < 0.05) higher than in a control treatment, and soil water storage and grain yield were lower. We conclude that compaction effects persisted because (1) there were insufficient wet-dry cycles to swell and shrink the entire compacted layer, (2) soil loosening by tillage was absent and (3) there were fewer earthworms in the compacted soil. Compaction of dry soil with 6 Mg had little effect at any time, indicating that by using wheel traffic only when the soil is dry, problems can be avoided. Unfortunately such a restriction is not always possible because sowing, tillage and harvest operations often need to be done when the soil is wet. A more generally applicable solution, which also ensures timely operations, is the permanent separation of wheel zones and crop zones in the field, a practice known as controlled traffic farming. Where a compacted layer already exists, even on a clay soil, management options to hasten repair should be considered, e.g. tillage, deep ripping, sowing a ley pasture or sowing crop species more effective at repairing compacted soil.
Abstract:
Swan’s Lagoon, which is 125 km south-south-west of Townsville, was purchased by the Queensland Government as a beef cattle research station in 1961. It is situated within the seasonally-dry tropical spear grass region of North Queensland. The station was expanded from 80 km2 to 340 km2 by purchase of the adjoining Expedition block in 1978. The first advisory committee formed and initiated research in 1961. The median annual rainfall of 708 mm (28 inches) is highly variable, with over 80% usually falling in December–April. Annual evaporation is 2.03 metres. The useable area (about 60% of the station) is mostly flat, with low-fertility duplex soils of which more than 50% are phosphorus deficient. Natural spear grass-based pastures predominate over the station. Swan’s Lagoon research has contributed to understanding the biology of many aspects of beef production for northern Australia. Research outcomes have provided options to deal with the region’s primary challenges of weaning rates averaging less than 60%, annual growth rates averaging as little as 100 kg, high mortality rates and high management costs. All these relate to the region’s variable and highly seasonal rainfall, challenges compounded by insect-borne viruses, ticks, buffalo fly and internal parasites. As well as the vast amount of practical beef production science produced at Swan’s Lagoon, generations of staff have been trained there to support beef producers throughout Queensland and northern Australia to increase their business efficiency. The Queensland Government has provided most of the funds for staffing and operations. Strong beef industry support is reflected in project funding from meat industry levies, managed by Meat and Livestock Australia (MLA) and its predecessors.
MLA has consistently provided the majority of operational research funding since the first grant for ‘Studies of management practices, adaption of different breeds and strains to tropical environments, and studies on tick survival and resistance’ in 1962–63. A large number of other agencies and commercial companies have also supported research.
Abstract:
Continuous cultivation and cereal cropping of southern Queensland soils previously supporting native vegetation have resulted in reduced soil nitrogen supply, and consequently decreased cereal grain yields and low grain protein. To enhance yields and protein concentrations of wheat, management practices involving N fertiliser application, with no-tillage and stubble retention, grain legumes, and legume leys were evaluated from 1987 to 1998 on a fertility-depleted Vertosol at Warra, southern Queensland. The objective of this study was to examine the effect of lucerne in a 2-year lucerne–wheat rotation for its nitrogen and disease-break benefits to subsequent grain yield and protein content of wheat as compared with continuous wheat cropping. Dry matter production and nitrogen yields of lucerne were closely correlated with the total rainfall for October–September as well as March–September rainfall. Each 100 mm of total rainfall resulted in 0.97 t/ha of dry matter and 26 kg/ha of nitrogen yield. For the March–September rainfall, the corresponding values were 1.26 t/ha of dry matter and 36 kg/ha of nitrogen yield. The latter values were 10% lower than those produced by annual medics during a similar period. Compared with wheat–wheat cropping, significant increases in total soil nitrogen were observed only in 1990, 1992 and 1994, but increases in soil mineralisable nitrogen were observed in most years following lucerne. Similarly, pre-plant nitrate nitrogen in the soil profile following lucerne was higher by 74 kg/ha (9–167 kg N/ha) than that of wheat–wheat without N fertiliser in all years except 1996. Consequently, higher wheat grain protein (7 out of 9 seasons) and grain yield (4 out of 9 seasons) were produced compared with continuous wheat. There was significant depression in grain yield in 2 (1993 and 1995) out of 9 seasons, attributed to soil moisture depletion and/or low growing season rainfall.
Consequently, the overall responses in yield were lower than those of 50 kg/ha of fertiliser nitrogen applied to wheat–wheat crops, 2-year medic–wheat or chickpea–wheat rotation, although grain protein concentrations were higher following lucerne. The incidence and severity of the soilborne disease, common root rot of wheat caused by Bipolaris sorokiniana, was generally higher in lucerne–wheat than in continuous wheat with no nitrogen fertiliser applications, since its severity was significantly correlated with plant available water at sowing. No significant incidence of crown rot or root lesion nematode was observed. Thus, productivity, which was mainly due to nitrogen accretion in this experiment, can be maintained where short duration lucerne leys are grown in rotations with wheat.
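The rainfall-to-production relationships quoted above are simple linear slopes, which can be applied directly; this is a sketch using the abstract's reported coefficients, with the example rainfall value chosen purely for illustration:

```python
# Linear rainfall-production relationships reported for lucerne at Warra.
# Slopes are from the abstract; the 500 mm season below is hypothetical.
DM_PER_100MM = 0.97   # t/ha dry matter per 100 mm of October-September rainfall
N_PER_100MM = 26      # kg/ha nitrogen yield per 100 mm of October-September rainfall

def lucerne_production(total_rain_mm):
    """Return (dry matter t/ha, N yield kg/ha) for a given Oct-Sep rainfall."""
    units = total_rain_mm / 100
    return DM_PER_100MM * units, N_PER_100MM * units

dm, n = lucerne_production(500)
print(dm, n)   # ~4.85 t/ha dry matter and 130 kg/ha nitrogen yield
```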
Abstract:
Attention is directed at land application of piggery effluent (containing urine, faeces, water, and wasted feed) as a potential source of water resource contamination with phosphorus (P). This paper summarises P-related properties of soil from 0-0.05 m depth at 11 piggery effluent application sites, in order to explore the impact that effluent application has had on the potential for run-off transport of P. The sites investigated were situated on Alfisol, Mollisol, Vertisol, and Spodosol soils in areas that received effluent for 1.5-30 years (estimated effluent-P applications of 100-310000 kg P/ha in total). Total (PT), bicarbonate extractable (PB), and soluble P forms were determined for the soil (0-0.05 m) at paired effluent and no-effluent sites, as well as texture, oxalate-extractable Fe and Al, organic carbon, and pH. All forms of soil P at 0-0.05 m depth increased with effluent application (PB at effluent sites was 1.7-15 times that at no-effluent sites) at 10 of the 11 sites. Increases in PB were strongly related to net P applications (regression analysis of log values for 7 sites with complete data sets: 82.6% of variance accounted for, P < 0.01). Effluent irrigation tended to increase the proportion of soil PT in dilute CaCl2-extractable forms (PTC: effluent average 2.0%; no-effluent average 0.6%). The proportion of PTC in non-molybdate reactive forms (centrifuged supernatant) decreased (no-effluent average, 46.4%; effluent average, 13.7%). Anaerobic lagoon effluent did not reliably acidify soil, since no consistent relationship was observed for pH with effluent application. Soil organic carbon was increased in most of the effluent areas relative to the no-effluent areas. The four effluent areas where organic carbon was reduced had undergone intensive cultivation and cropping. Current effluent management at many of the piggeries failed to maximise the potential for waste P recapture.
Ten of the case-study effluent application areas have received effluent-P in excess of crop uptake. While this may not represent a significant risk of leaching where sorption retains P, it has increased the risk of transport of P by run-off. Where such sites are close to surface water, run-off P loads should be managed.
Abstract:
A strategy comprising a winter/spring protein supplement, rumen modifier and hormonal growth promotant (Compudose 400) was used in either the first year (T1), second year (T2), or in both years (T1+2) following weaning in Brahman cross steers as a means of increasing liveweight gain up to 2.5 years of age. T2 produced the heaviest final liveweight (544.7 kg) and highest overall liveweight gain (366.7 kg), but these were not significantly different from T1 (538.6 kg; 360.9 kg) or T1+2 (528.7 kg; 349.3 kg). However, final liveweights and overall liveweight gains of T1 and T2, but not T1+2, were significantly greater than for untreated (C) steers (504.9 kg; 325.2 kg, both P < 0.05). Regardless of the strategy imposed, liveweight and liveweight gain were enhanced; however, final liveweights in each treatment were below the preferred minimum target liveweight (570-580 kg) for premium export markets. Treatment in both years gave no benefit over treatment in 1 year only. 19th Biennial Conference, 5-9 July 1992, LaTrobe University, Melbourne.
Abstract:
Fruit-piercing moths are significant pests of a range of fruit crops throughout much of the world's tropics and subtropics. Feeding damage by the adult moths is most widely reported in varieties of citrus. During 2003 and 2004, fruit-piercing moth activity was monitored regularly at night in citrus crops in northeast Australia to determine the level of maturity (based on rind colour) and soundness of fruit attacked. 'Navelina' navel and 'Washington' navel orange, grapefruit and mixed citrus crops were assessed, and fruit was rated and placed into five categories: green, colouring, ripe, overripe and damaged. There were no statistical differences in the percentage of fruit attacked in each category across crops. However, within the individual crops, significant proportions of green 'Navelina' fruit (58.7%) and green mixed citrus (57.1%) were attacked in 2004. Among all the crops assessed, 25.1% of moth feeding occurred on overripe or damaged fruit. Crops started to be attacked at least 8 weeks before picking, but in two crops there were large influxes of moths (reaching 27 and 35 moths/100 trees, respectively) immediately before harvest. Moth activity was most intense between late February and late March. Eudocima fullonia (Clerck) represented 79.1% of all moths recorded on fruit, with Eudocima materna (L.), Eudocima salaminia (Cramer) and Serrodes campana (Guen.) the only other species observed capable of inflicting primary damage. Our results suggest that growers should monitor moth activity from 8 weeks before harvest and consider remedial action if moth numbers increase substantially as the crop matures or there is a history of moth problems. The number of fruit pickings could be increased to progressively remove ripe fruit, or early harvest of the entire crop could be contemplated where late influxes of moths are known to occur.
Abstract:
Synthetic backcrossed-derived bread wheats (SBWs) from CIMMYT were grown in the Northwest of Mexico at Centro de Investigaciones Agrícolas del Noroeste (CIANO) and sites across Australia during three seasons. During three consecutive years Australia received “shipments” of different SBWs from CIMMYT for evaluation. A different set of lines was evaluated each season, as new materials became available from the CIMMYT crop enhancement program. These consisted of approximately 100 advanced lines (F7) per year. SBWs had been top and backcrossed to CIMMYT cultivars in the first two shipments and to Australian wheat cultivars in the third one. At CIANO, the SBWs were trialled under receding soil moisture conditions. We evaluated both the performance of each line across all environments and the genotype-by-environment interaction using an analysis that fits a multiplicative mixed model, adjusted for spatial field trends. Data were organised in three groups of multienvironment trials (MET) containing germplasm from shipment 1 (METShip1), 2 (METShip2), and 3 (METShip3), respectively. Large components of variance for the genotype × environment interaction were found for each MET analysis, due to the diversity of environments included and the limited replication over years (only in METShip2, lines were tested over 2 years). The average percentage of genetic variance explained by the factor analytic models with two factors was 50.3% for METShip1, 46.7% for METShip2, and 48.7% for METShip3. Yield comparison focused only on lines that were present in all locations within a METShip, or “core” SBWs. A number of core SBWs, crossed to both Australian and CIMMYT backgrounds, outperformed the local benchmark checks at sites from the northern end of the Australian wheat belt, with reduced success at more southern locations. In general, lines that succeeded in the north were different from those in the south. 
The moderate positive genetic correlation between CIANO and locations in the northern wheat growing region likely reflects similarities in average temperature during flowering, high evaporative demand, and a short flowering interval. We are currently studying attributes of this germplasm that may contribute to adaptation, with the aim of improving the selection process in both Mexico and Australia.