14 results for Lead times

in eResearch Archive - Queensland Department of Agriculture


Relevance:

60.00%

Publisher:

Abstract:

The Gascoyne-Murchison region of Western Australia experiences an arid to semi-arid climate with a highly variable temporal and spatial rainfall distribution. The region has around 39.2 million hectares available for pastoral lease and supports predominantly cattle and sheep grazing leases. In recent years a number of climate forecasting systems have become available offering rainfall probabilities with different lead times and forecast periods; however, the extent to which these systems are capable of fulfilling the requirements of the local pastoralists is still ambiguous. Issues range from ensuring forecasts are issued with sufficient lead time for key plans or decisions to be revoked or altered, to ensuring forecast language is simple and clear so as to avoid misunderstandings in interpretation. A climate research project sought to provide an objective method to determine which available forecasting systems had the greatest forecasting skill at times of the year relevant to local property management. To aid that project, the study reported here was undertaken with the overall objective of exploring local pastoralists' climate information needs. We also explored how well they understand common climate forecast terms such as 'mean', 'median' and 'probability', and how they interpret and apply forecast information to decisions. Stratified, proportional random sampling was used to derive a representative sample based on rainfall-enterprise combinations. In order to provide more time for decision-making than existing operational forecasts that are issued with zero lead time, pastoralists requested that forecasts be issued for May-July and January-March with lead times counting down from 4 to 0 months. We found forecasts of between 20 and 50 mm break-of-season or follow-up rainfall were likely to influence decisions.
Eighty percent of pastoralists demonstrated in a test question that they had a poor technical understanding of how to interpret the standard wording of a probabilistic median rainfall forecast. This is worthy of further research to investigate whether inappropriate management decisions are being made because the forecasts are being misunderstood. We found more than half the respondents regularly access and use weather and climate forecasts or outlook information from a range of sources, and almost three-quarters considered climate information or tools useful, with the preferred methods for accessing this information being email, the faxback service, the internet and the Department of Agriculture Western Australia's Pastoral Memo. Despite differences in enterprise types and rainfall seasonality across the region, we found seasonal climate forecasting needs were relatively consistent. It became clear that providing basic training and working with pastoralists to help them understand regional climatic drivers, climate terminology and jargon, and the best ways to apply the forecasts to enhance decision-making is important to improve their use of information. Consideration could also be given to engaging a range of producers to write the climate forecasts themselves, in the language they use and understand, in consultation with the scientists who prepare the forecasts.
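The misread "probabilistic median rainfall forecast" wording can be illustrated with a minimal sketch. The rainfall figures below are made up for illustration only, not data from the study; the point is that an exceedance-of-median statement is a frequency claim, not an amount claim.

```python
import statistics

# Hypothetical May-July rainfall totals (mm), for illustration only --
# not data from the study.
historical = [18, 25, 31, 40, 22, 55, 63, 12, 48, 35, 27, 70, 44, 19, 38]

median_rain = statistics.median(historical)  # half of past years fall below this

# A forecast of "a 70% chance of exceeding the median" asserts that, under
# current conditions, 7 years in 10 like this one would beat that median --
# it says nothing about how far above the median the rainfall would be.
exceed = sum(1 for r in historical if r > median_rain) / len(historical)
print(f"median = {median_rain} mm; climatological exceedance = {exceed:.0%}")
```

By construction, climatology alone gives an exceedance probability near 50%; a skilful forecast shifts that probability up or down for a particular season.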

Relevance:

20.00%

Publisher:

Abstract:

Surveys were conducted between 1997 and 2001 to investigate the incidence of overwintering Helicoverpa spp. pupae under summer crop residues on the Darling Downs, Queensland. Only Helicoverpa armigera was represented in collections of overwintering pupae. The results indicated that late-season crops of cotton, sorghum, maize, soybean, mungbean and sunflower were equally likely to have overwintering pupae under them. In the absence of tillage practices, these crops had the potential to produce similar numbers of moths/ha in the spring. There were expected differences between years in the densities of overwintering pupae and the number of emerged moths/ha. Irrigated crops produced 2.5 times more moths/ha than dryland crops. Overall survival from autumn-formed pupae to emerged moths averaged 44%, with a higher proportion of pupae under maize surviving to produce moths than each of the other crops. Parasitoids killed 44.1% of pupae, with Heteropelma scaposum representing 83.3% of all parasitoids reared from pupae. Percentage parasitism levels were lower in irrigated crops (27.6%) compared with dryland crops (40.5%). Recent changes to Helicoverpa spp. management in cotton/grain-farming systems in south-eastern Queensland, including widespread adoption of Bt cotton, and use of more effective and more selective insecticides, could lead to lower densities of overwintering pupae under late summer crops.

Relevance:

20.00%

Publisher:

Abstract:

Instantaneous natural mortality rates and a nonparametric hunting mortality function are estimated from a multiple-year tagging experiment with arbitrary, time-dependent fishing or hunting mortality. Our theory allows animals to be tagged over a range of times in each year, and to take time to mix into the population. Animals are recovered by hunting or fishing, and death events from natural causes occur but are not observed. We combine a long-standing approach based on yearly totals, described by Brownie et al. (1985, Statistical Inference from Band Recovery Data: A Handbook, Second edition, United States Fish and Wildlife Service, Washington, Resource Publication, 156), with an exact-time-of-recovery approach originated by Hearn, Sandland and Hampton (1987, Journal du Conseil International pour l'Exploration de la Mer, 43, 107-117), who modeled times at liberty without regard to time of tagging. Our model allows for exact times of release and recovery, incomplete reporting of recoveries, and potential tag shedding. We apply our methods to data on the heavily exploited southern bluefin tuna (Thunnus maccoyii).
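The estimators in this paper are more general (nonparametric hunting mortality, exact recovery times), but the underlying bookkeeping can be sketched with a constant-rate simplification: under instantaneous fishing mortality F and natural mortality M, the expected fraction of a tagged cohort recovered within a year follows the standard catch-equation split of total deaths. The rates below are illustrative, not estimates from the paper.

```python
import math

def expected_recovery_fraction(F, M, reporting_rate=1.0):
    """Expected fraction of a tagged cohort recovered within one year under
    constant instantaneous fishing (F) and natural (M) mortality.
    Total deaths are 1 - exp(-(F+M)); the share F/(F+M) is due to fishing,
    and only reported recoveries are observed."""
    Z = F + M  # total instantaneous mortality
    return reporting_rate * (F / Z) * (1.0 - math.exp(-Z))

# Illustrative per-year rates -- not estimates from the paper.
print(expected_recovery_fraction(F=0.4, M=0.2, reporting_rate=0.9))
```

Incomplete reporting and tag shedding both scale the observable recoveries down, which is why the model treats them explicitly rather than folding them into F.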

Relevance:

20.00%

Publisher:

Abstract:

The amount and timing of early wet-season rainfall are important for the management of many agricultural industries in north Australia. With this in mind, a wet-season onset date is defined based on the accumulation of rainfall to a predefined threshold, starting from 1 September, for each square of a 1° gridded analysis of daily rainfall across the region. Consistent with earlier studies, the interannual variability of the onset dates is shown to be well related to the immediately preceding July-August Southern Oscillation index (SOI). Based on this relationship, a forecast method using logistic regression is developed to predict the probability that onset will occur later than the climatological mean date. This method is expanded to also predict the probabilities that onset will be later than any of a range of threshold dates around the climatological mean. When assessed using cross-validated hindcasts, the skill of the predictions exceeds that of climatological forecasts in the majority of locations in north Australia, especially in the Top End region, Cape York, and central Queensland. At times of strong anomalies in the July-August SOI, the forecasts are reliably emphatic. Furthermore, predictions using tropical Pacific sea surface temperatures (SSTs) as the predictor are also tested. While short-lead (July-August predictor) forecasts are more skillful using the SOI, long-lead (May-June predictor) forecasts are more skillful using Pacific SSTs, indicative of the longer-term memory present in the ocean.
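A minimal sketch of the kind of model described, logistic regression of the probability of a later-than-average onset on the preceding July-August SOI, is shown below. The coefficients are illustrative assumptions, not the study's fitted values.

```python
import math

def p_late_onset(soi_julaug, intercept=0.0, slope=-0.15):
    """Probability that wet-season onset falls later than the climatological
    mean date, modelled as a logistic function of the preceding July-August
    SOI. Coefficients here are illustrative, not the study's fitted values."""
    z = intercept + slope * soi_julaug
    return 1.0 / (1.0 + math.exp(-z))

# Strongly negative SOI (El Nino-like conditions) favours a late onset,
# consistent with the SOI-onset relationship described above.
for soi in (-20.0, 0.0, 20.0):
    print(f"SOI {soi:+.0f}: P(late onset) = {p_late_onset(soi):.2f}")
```

With a strongly negative SOI the predicted probability approaches 1, and with a strongly positive SOI it approaches 0, which is the "reliably emphatic" behaviour the abstract reports at times of strong SOI anomalies.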

Relevance:

20.00%

Publisher:

Abstract:

Fibre diameter can vary dramatically along a wool staple, especially in the Mediterranean environment of southern Australia with its dry summers and abundance of green feed in spring. Other research has shown a very low phenotypic correlation between fibre diameters grown in different seasons. Many breeders use short staples to measure fibre diameter for breeding purposes and also to promote animals for sale. The effectiveness of this practice depends on the relative response to selection from measuring fibre traits on a full 12-month wool staple compared with measuring them on only part of a staple. If a high genetic correlation exists between the part record and the full record, then using part records may be acceptable to identify genetically superior animals. No information is available on the effectiveness of part records. This paper investigated whether wool growth and fibre diameter traits of Merino wool grown at different times of the year in a Mediterranean environment are genetically the same traits. The work was carried out on about 7 dyebanded wool sections per animal per year, on ewes from weaning to hogget age, in the Katanning Merino resource flocks over 6 years. Relative clean wool growth of the different sections had very low heritability estimates of less than 0.10, and the sections were phenotypically and genetically poorly correlated with 6- or 12-month wool growth. This indicates that part-record measurement of clean wool growth of these sections will be ineffective as an indirect selection criterion to improve wool growth genetically. Staple length growth, as measured by the length between dyebands, would be more effective, with heritability estimates of between 0.20 and 0.30. However, these measurements were shown to have a low genetic correlation with wool grown for 12 months, which implies that staple length measurements would be only half as efficient as the wool weight for 6 or 12 months in improving total clean wool weight.
Heritability estimates of fibre diameter, coefficient of variation of fibre diameter and fibre curvature were relatively high, and these traits were genetically and phenotypically highly correlated across sections. High positive phenotypic and genetic correlations were also found between fibre diameter, coefficient of variation of fibre diameter and fibre curvature of the different sections and the corresponding measurements for wool grown over 6 or 12 months. Coefficient of variation of fibre diameter of the sections also had a moderate negative phenotypic and genetic correlation with staple strength of wool staples grown over 6 months, indicating that coefficient of variation of fibre diameter of any section would be as good an indirect selection criterion to improve staple strength as coefficient of variation of fibre diameter for wool grown over 6 or 12 months. The results indicate that fibre diameter, coefficient of variation of fibre diameter and fibre curvature of wool grown over short periods have virtually the same heritability as those of wool grown over 12 months, and that the genetic correlation between these traits on part and on full records is very high (rg > 0.85). This indicates that fibre diameter, coefficient of variation of fibre diameter and fibre curvature on part records can be used as selection criteria to improve these traits. However, part records of greasy and clean wool growth would be much less efficient than fleece weight for wool grown over 6 or 12 months, because of the low heritability of part records and the low genetic correlation between these traits on part records and on wool grown for 12 months.
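The "half as efficient" comparison follows from standard correlated-response theory: at equal selection intensities, the efficiency of indirect selection on trait X for improving trait Y, relative to direct selection on Y, is r_g multiplied by the ratio of the square roots of the heritabilities. The parameter values below are illustrative choices within the ranges reported, not the study's estimates.

```python
import math

def relative_efficiency(h2_x, h2_y, r_g):
    """Efficiency of indirect selection on trait X to improve trait Y,
    relative to direct selection on Y at equal selection intensity:
    CR_Y / R_Y = r_g * h_X / h_Y, where h is the square root of heritability."""
    return r_g * math.sqrt(h2_x) / math.sqrt(h2_y)

# Illustrative values in the ranges reported: part-record staple-length
# growth h2 ~ 0.25, full-record wool weight h2 ~ 0.35, genetic
# correlation ~ 0.6 (assumed figures, not the study's estimates).
print(relative_efficiency(0.25, 0.35, 0.6))  # roughly half as efficient
```

With these assumed figures the ratio comes out near 0.5, matching the abstract's conclusion that part-record staple length would be about half as efficient as 6- or 12-month wool weight.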

Relevance:

20.00%

Publisher:

Abstract:

This project reviewed international research on the possible role of plants in alleviating high temperatures in our living spaces. The literature review served to identify work already carried out in the area and to highlight the gaps to be filled by experimental research. A pilot study then investigated the thermal properties of six of the most common landscaping materials. The project clearly shows that plants can play a significant role in modifying the thermal conditions of urban environments. Tall trees can shade nearby buildings and allow for reductions in cooling costs. In addition to basic shading, the dispersal of heat via the plant's natural transpiration stream has long been recognised as an important component of the urban energy balance. Urban temperatures have been shown to reach up to 7°C higher than those of nearby rural areas, illustrating the effect that replacing vegetation with built surfaces has on the thermal environment. These benefits argue against the idea of removing plants from landscapes in order to save water in times of drought. Similarly, the idea of switching to artificial turf is questionable, since artificial turf still requires watering and can reach temperatures that far exceed the safe range for players. While vegetation offers evaporative cooling, non-vegetative, impervious surfaces such as concrete do not, and can therefore cause greater surface and soil temperatures. In addition, the higher temperatures associated with these impervious surfaces can negatively affect the growth of plants in surrounding areas. Permeable surfaces, such as mulches, have better insulating properties and can prevent excessive heating of the soil. However, they can also increase reflected longwave radiation, causing plants to close their stomata and reducing the beneficial cooling effects of transpiration.
The results show that the energy balance of our surroundings is complicated and that all components of a landscape will have an impact on thermal conditions.

Relevance:

20.00%

Publisher:

Abstract:

Lead (Pb) poisoning of cattle has been relatively common in Australia and sump oil has been identified as an important cause of Pb toxicity for cattle because they seem to have a tendency to drink it. Lead-free petrol has been available in Australia since 1975, so the aim of this study was to assess the current risk to cattle from drinking used automotive oils. Sump or gear box oil was collected from 56 vehicles being serviced. The low levels of Pb found suggest that the removal of leaded petrol from the Australian market as a public health measure has benefited cattle by eliminating the risk of acute poisoning from used engine oil.

Relevance:

20.00%

Publisher:

Abstract:

This publication offers insights into important questions about the management of turf in dry times, helping producers improve product quality and avoid unnecessary losses. Can varieties help? How important are soils in conserving moisture, and how do I measure my soil's condition? How can I make the best use of available water? Can water-retaining amendments assist in establishing turf? Is recycled water a good option? It contains research results from turfgrass trials conducted by Queensland Government scientists for Queensland conditions.

Relevance:

20.00%

Publisher:

Abstract:

This project built upon the successful outcomes of a previous project (TU02005) by adding to the database of salt tolerance among warm season turfgrass cultivars, through further hydroponic screening trials. Hydroponic screening trials focussed on new cultivars or cultivars that were not possible to cover in the time available under TU02005, including: 11 new cultivars of Paspalum vaginatum; 13 cultivars of Cynodon dactylon; six cultivars of Stenotaphrum secundatum; one accession of Cynodon transvaalensis; 12 Cynodon dactylon x transvaalensis hybrids; two cultivars of Sporobolus virginicus; five cultivars of Zoysia japonica; one cultivar of Z. macrantha, one common form of Z. tenuifolia and one Z. japonica x tenuifolia hybrid. The relative salinity tolerance of different turfgrasses is quantified in terms of their growth response to increasing levels of salinity, often defined by the salt level that equates to a 50% reduction in shoot yield, or alternatively the threshold salinity. The most salt tolerant species in these trials were Sporobolus virginicus and Paspalum vaginatum, consistent with the findings from TU02005 (Loch, Poulter et al. 2006). Cynodon dactylon showed the largest range in threshold values with some cultivars highly sensitive to salt, while others were tolerant to levels approaching that of the more halophytic grasses. Coupled with the observational and anecdotal evidence of high drought tolerance, this species and other intermediately tolerant species provide options for site specific situations in which soil salinity is coupled with additional challenges such as shade and high traffic conditions. By recognising the fact that a salt tolerant grass is not the complete solution to salinity problems, this project has been able to further investigate sustainable long-term establishment and management practices that maximise the ability of the selected grass to survive and grow under a particular set of salinity and usage parameters. 
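The threshold measure described above, the salt level producing a 50% reduction in shoot yield, can be estimated by interpolating a measured growth-response curve. The sketch below uses hypothetical response data, not results from these trials.

```python
def threshold_salinity(salinities, yield_pct):
    """Linearly interpolate the salinity at which shoot yield falls to 50%
    of the unstressed control -- one common index of relative salt tolerance.
    Expects salinities in ascending order with yields declining through 50%."""
    points = list(zip(salinities, yield_pct))
    for (s0, y0), (s1, y1) in zip(points, points[1:]):
        if y0 >= 50 >= y1:
            # Interpolate between the bracketing observations.
            return s0 + (y0 - 50) * (s1 - s0) / (y0 - y1)
    return None  # response never crosses 50% over the tested range

# Hypothetical growth response (% of control shoot yield) vs salinity
# (dS/m) -- illustrative only, not data from these trials.
sal = [0, 10, 20, 30, 40]
yld = [100, 90, 70, 40, 15]
print(threshold_salinity(sal, yld))
```

A more tolerant grass shifts the whole response curve to the right, producing a higher interpolated threshold; comparing these thresholds across cultivars is what the hydroponic screening quantifies.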
Salt-tolerant turf grasses with potential for special-use situations were trialled under field conditions at three sites within the Gold Coast City Council area, while three sites established under TU02005 within the Redland City Council boundaries were monitored for continued grass survival. Several randomised block experiments within Gold Coast City were established to compare the health and longevity of seashore paspalum (Paspalum vaginatum) and Manila grass (Zoysia matrella), as well as the more tolerant cultivars of other species such as buffalo grass (Stenotaphrum secundatum) and green couch (Cynodon dactylon). Whilst scientific results were difficult to achieve in the field, where conditions cannot be controlled, these trials provided valuable observational evidence of the likely survival of these species. Alternatives to laying full sod, such as sprigging, were investigated and found to be more appropriate for areas of low traffic, as the establishment time is greater. Trials under controlled and protected conditions achieved a full cover of Paspalum vaginatum from sprigs in a 10-week time frame. Salt-affected sites are often associated with poor soil structure, so part of the research investigated techniques for alleviating the soil compaction frequently found on saline sites. Various methods of soil de-compaction were investigated on highly compacted heavy clay soil in Redland City. The heavy duplex soil of marine clay sediments required the most aggressive treatments to achieve even limited short-term effects. Interestingly, a well-constructed sports field showed a far greater and longer-term response to de-compaction operations, highlighting the importance of appropriate construction in the successful establishment and management of turfgrasses on salt-affected sites. Fertiliser trials in this project determined plant demand for nitrogen (N) at the species level.
This work produced data that can be used as a guide when fertilising, in order to produce optimal growth and quality in the major turf grass species used in public parkland. An experiment that commenced during TU02005, and was monitored further in this project, investigated six representative warm-season turfgrasses to determine the optimum maintenance requirements for fertiliser N in south-east Queensland. In doing so, we recognised that the optimum level is also related to use and intensity of use, with high-profile, well-used parks requiring higher maintenance N than low-profile parks, where maintaining botanical composition at a lower level of turf quality might be acceptable. Kikuyu (Pennisetum clandestinum) seemed to require the greatest N input (300-400 kg N/ha/year), followed by the green couch (Cynodon dactylon) cultivars ‘Wintergreen’ and ‘FloraTeX’, which required approximately 300 kg N/ha/year for optimal condition and growth. ‘Sir Walter’ (Stenotaphrum secundatum) and ‘Sea Isle 1’ (Paspalum vaginatum) had a moderate requirement of approximately 200 kg N/ha/year. ‘Aussiblue’ (Digitaria didactyla) maintained optimal growth and quality at 100-200 kg N/ha/year. A set of guidelines has been prepared to provide options ranging from the construction and establishment of new grounds through to the remediation of existing parklands by supporting the growth of endemic grasses. The guidelines describe a best-management process through which salt-affected sites should be assessed, remediated and managed. These guidelines, or Best Management Practices, will be readily available to councils. Previously, some high-salinity sites have been turfed several times over a number of years (and Council budgets) with a 100% failure record. By eliminating this budgetary waste through targeted, workable solutions, local authorities will be more amenable to investing appropriate amounts in these areas. In some cases this will lead to cost savings as well as better quality turf.
In all cases, however, improved turf quality will be of benefit to ratepayers, directly through increased local use of open space in parks and sportsfields and indirectly by attracting tourists and other visitors to the region bringing associated economic benefits. At the same time, environmental degradation and erosion of soil in bare areas will be greatly reduced.

Relevance:

20.00%

Publisher:

Abstract:

Production of a silvicultural manual for Vanuatu Whitewood.

Relevance:

20.00%

Publisher:

Abstract:

National Citrus Scion Breeding Program.

Relevance:

20.00%

Publisher:

Abstract:

Phosphine is the only economically viable fumigant for routine control of insect pests of stored food products, but its continued use is now threatened by the world-wide emergence of high-level resistance in key pest species. Phosphine has a unique mode of action relative to well-characterised contact pesticides. Similarly, the selective pressures that lead to resistance against field sprays differ dramatically from those encountered during fumigation. The consequences of these differences have not been investigated adequately. We determine the genetic basis of phosphine resistance in Rhyzopertha dominica strains collected from New South Wales and South Australia and compare this with resistance in a previously characterised strain from Queensland. The resistance levels range from 100 to 225 times the baseline response of a sensitive reference strain. Moreover, molecular and phenotypic data indicate that high-level resistance was derived independently in each of the three widely separated geographical regions. Despite the independent origins, resistance was due to two interacting genes in each instance. Furthermore, complementation analysis reveals that all three strains contain an incompletely recessive resistance allele of the autosomal rph1 resistance gene. This is particularly noteworthy, as a resistance allele at rph1 was previously proposed to be a necessary first step in the evolution of high-level resistance. Given the capacity of phosphine to disrupt a wide range of enzymes and biological processes, it is remarkable that the initial step in the selection of resistance is so similar in isolated outbreaks.

Relevance:

20.00%

Publisher:

Abstract:

Reducing crop row spacing and delaying the time of weed emergence may give crops a competitive edge over weeds. Field experiments were conducted to evaluate the effects of crop row spacing (11, 15, and 23 cm) and weed emergence time (0, 20, 35, 45, 55, and 60 days after wheat emergence; DAWE) on Galium aparine and Lepidium sativum growth and wheat yield losses. Season-long weed-free and crop-free treatments were also established to compare wheat yield and weed growth, respectively. Row spacing and weed emergence time significantly affected the growth of both weed species and wheat grain yields. For both weed species, the maximum plant height, shoot biomass, and seed production were observed in the crop-free plots, and delayed emergence decreased these variables. In weed-crop competition plots, maximum weed growth was observed when weeds emerged simultaneously with the crop in rows spaced 23 cm apart. Less growth of both weed species was observed at the narrow row spacing (11 cm) than in wider rows (15 and 23 cm). These weed species produced fewer than 5 seeds plant-1 in 11-cm wheat rows when they emerged at 60 DAWE. The presence of weeds in the crop, especially at early stages, was devastating for wheat yields; accordingly, the maximum grain yield (4.91 t ha-1) was recorded in the weed-free treatment at 11-cm row spacing. Delayed weed emergence and narrow row spacing reduced weed growth and seed production and enhanced wheat grain yield, suggesting that these strategies could contribute to weed management in wheat.

Relevance:

20.00%

Publisher:

Abstract:

Lethal control of wild dogs - that is, Dingoes (Canis lupus dingo) and Dingo/Dog (Canis lupus familiaris) hybrids - to reduce livestock predation in Australian rangelands is claimed to cause continental-scale impacts on biodiversity. Although top predator populations may recover numerically after baiting, they are predicted to be functionally different and incapable of fulfilling critical ecological roles. This study reports the impact of baiting programmes on wild dog abundance and age structures, and on the prey of wild dogs, during large-scale manipulative experiments. Wild dog relative abundance almost always decreased after baiting, but reductions were variable and short-lived unless the prior baiting programme was particularly effective or there were follow-up baiting programmes within a few months. However, age structures of wild dogs in baited and nil-treatment areas were demonstrably different, and prey populations did diverge relative to nil-treatment areas. Re-analysed observations of wild dogs preying on kangaroos from a separate study show that successful chases resulting in attacks on kangaroos occurred when mean wild dog ages were higher and mean group size was larger. It is likely that the impacts of lethal control on wild dog numbers, group sizes and age structures compromise their ability to handle large, difficult-to-catch prey. Under certain circumstances, these changes can lead to increased calf loss (Bos indicus/B. taurus genotypes) and kangaroo numbers. Rangeland beef producers could consider controlling wild dogs in high-risk periods when predation is more likely and avoid baiting at other times.