10 results for Speed Limit Signs.
in eResearch Archive - Queensland Department of Agriculture
Abstract:
In the wheatbelt of eastern Australia, rainfall shifts from winter dominated in the south (South Australia, Victoria) to summer dominated in the north (northern New South Wales, southern Queensland). The seasonality of rainfall, together with frost risk, drives the choice of cultivar and sowing date, resulting in a flowering time between October in the south and August in the north. In eastern Australia, crops are therefore exposed to contrasting climatic conditions during the critical period around flowering, which may affect yield potential and the efficiencies of water use (WUE) and radiation use (RUE). In this work we analysed empirical and simulated data to identify key climatic drivers of potential water- and radiation-use efficiency, derive a simple climatic index of environmental potential, and provide an example of how such an index could be used to quantify the spatial and temporal variability in resource-use efficiency and potential yield in eastern Australia. Around anthesis, from Horsham to Emerald, median vapour pressure deficit (VPD) increased from 0.92 to 1.28 kPa, average temperature increased from 12.9 to 15.2°C, and the fraction of diffuse radiation (FDR) decreased from 0.61 to 0.41. These spatial gradients in climatic drivers accounted for significant gradients in modelled efficiencies: median transpiration WUE (WUEB/T) increased southwards at a rate of 2.6% per degree of latitude and median RUE increased southwards at a rate of 1.1% per degree of latitude. Modelled and empirical data confirmed previously established relationships between WUEB/T and VPD, and between RUE and photosynthetically active radiation (PAR) and FDR. Our analysis also revealed a non-causal inverse relationship between VPD and radiation-use efficiency, and a previously unnoticed causal positive relationship between FDR and water-use efficiency.
Grain yield (range 1-7 t/ha) measured in field experiments across South Australia, New South Wales, and Queensland (n = 55) was unrelated to the photothermal quotient (Pq = PAR/T) around anthesis, but was significantly associated (r² = 0.41, P < 0.0001) with a newly developed climatic index: a normalised photothermal quotient (NPq = Pq · FDR/VPD). This highlights the importance of diffuse radiation and vapour pressure deficit as sources of variation in yield in eastern Australia. Specific experiments designed to uncouple VPD and FDR, and more mechanistic crop models, might be required to further disentangle the relationships between efficiencies and climatic drivers.
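The index arithmetic described above can be sketched in a few lines. This is a minimal illustration, not the study's code; the PAR value used below is an assumed placeholder, since site-level PAR is not reported here, while the VPD, temperature, and FDR values are the medians quoted for Horsham and Emerald.

```python
def photothermal_quotient(par, temp):
    """Pq = PAR / T: photosynthetically active radiation over mean temperature."""
    return par / temp

def normalised_photothermal_quotient(par, temp, fdr, vpd):
    """NPq = Pq * FDR / VPD: Pq weighted by the fraction of diffuse
    radiation (FDR) and discounted by vapour pressure deficit (VPD, kPa)."""
    return photothermal_quotient(par, temp) * fdr / vpd

# Median values reported around anthesis; PAR = 10 is an assumed placeholder.
npq_horsham = normalised_photothermal_quotient(par=10.0, temp=12.9, fdr=0.61, vpd=0.92)
npq_emerald = normalised_photothermal_quotient(par=10.0, temp=15.2, fdr=0.41, vpd=1.28)
print(npq_horsham > npq_emerald)  # True: cooler, more diffuse, less arid south scores higher
```

At equal PAR, the index rewards the cooler southern site with more diffuse radiation and lower atmospheric demand, which is the direction of the yield association reported above.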
Abstract:
We investigated the influence of rainfall patterns on the water-use efficiency of wheat in a transect between Horsham (36°S) and Emerald (23°S) in eastern Australia. Water-use efficiency was defined in terms of biomass and transpiration, WUEB/T, and grain yield and evapotranspiration, WUEY/ET. Our working hypothesis was that latitudinal trends in WUEY/ET of water-limited crops are the complex result of southward increases in WUEB/T and soil evaporation, and season-dependent trends in harvest index. Our approach included: (a) analysis of long-term records to establish latitudinal gradients in the amount, seasonality, and size structure of rainfall; and (b) modelling of wheat development, growth, yield, water budget components, and derived variables including WUEB/T and WUEY/ET. Annual median rainfall declined from around 600 mm in northern locations to 380 mm in the south. Median seasonal rain (from sowing to harvest) doubled between Emerald and Horsham, whereas median off-season rainfall (harvest to sowing) ranged from 460 mm at Emerald to 156 mm at Horsham. The contribution of small events (≤ 5 mm) to seasonal rainfall was negligible at Emerald (median 15 mm) and substantial at Horsham (105 mm). Power-law coefficients (τ), i.e. the slopes of the regression between size and number of events on a log-log scale, captured the latitudinal gradient, characterised by an increasing dominance of small events from north to south during the growing season. Median modelled WUEB/T increased from 46 kg/ha.mm at Emerald to 73 kg/ha.mm at Horsham, in response to decreasing atmospheric demand. Median modelled soil evaporation during the growing season increased from 70 mm at Emerald to 172 mm at Horsham. This was explained by the size structure of rainfall, characterised by the parameter τ, rather than by the total amount of rainfall. Median modelled harvest index ranged from 0.25 to 0.34 across locations and had a season-dependent latitudinal pattern: it was greater in northern locations in dry seasons, in association with wetter soil profiles at sowing. There was a season-dependent latitudinal pattern in modelled WUEY/ET. In drier seasons, high soil evaporation driven by a strong dominance of small events, together with a lower harvest index, overrode the putative advantage of low atmospheric demand and higher WUEB/T in southern locations, hence the significant southward decrease in WUEY/ET. In wetter seasons, when large events contribute a significant proportion of seasonal rain, higher WUEB/T in southern locations may translate into higher WUEY/ET. Linear boundary functions (French-Schultz-type models), with latitudinal gradients in their parameters (slope and x-intercept), were fitted to scatter plots of modelled yield v. evapotranspiration. The x-intercept of the model is re-interpreted in terms of rainfall size structure, and the slope, or efficiency multiplier, is described in terms of the radiation, temperature, and air humidity properties of the environment. Implications for crop management and breeding are discussed.
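The power-law characterisation of rainfall size structure described above (τ as the slope of a log-log regression of event count against event size) can be sketched as follows. This is a minimal illustration, not the study's method or data: the size-class bins, the synthetic Pareto-distributed rainfall, and the distribution parameters are all assumptions for demonstration.

```python
import numpy as np

def rainfall_tau(daily_rain_mm, bins=(1, 2, 5, 10, 20, 50)):
    """Fit log10(count) = tau * log10(size) + c over rainfall size classes;
    a more negative tau indicates a stronger dominance of small events."""
    events = [r for r in daily_rain_mm if r > 0]
    counts, edges = np.histogram(events, bins=bins)
    sizes = (edges[:-1] + edges[1:]) / 2  # class mid-points (mm)
    keep = counts > 0                     # drop empty classes before taking logs
    tau, _ = np.polyfit(np.log10(sizes[keep]), np.log10(counts[keep]), 1)
    return tau

# Synthetic seasons: a steeper Pareto tail gives many small events (southern-type
# rainfall); a flatter tail gives proportionally more large events (northern-type).
rng = np.random.default_rng(0)
south = rng.pareto(1.5, 400) + 1
north = rng.pareto(0.8, 400) + 1
print(rainfall_tau(south), rainfall_tau(north))
```

Under these assumptions the southern-type series yields the more negative τ, matching the north-to-south gradient in small-event dominance described in the abstract.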
Abstract:
When recapturing satellite-collared wild dogs that had been trapped one month previously in padded foothold traps, we noticed varying degrees of pitting on the pads of their trapped paw. Veterinary advice, based on images of the injuries, suggested that the necrosis was caused by vascular compromise. Five of six dingoes we recaptured had varying degrees of necrosis restricted to the trapped foot, ranging from single 5 mm holes to 25% sections of the toe pads missing or deformed, including loss of nails. The traps used were rubber-padded, two-coiled Victor Soft Catch #3 traps. The springs were not standard Victor springs but Beefer springs; this modification slightly increases trap speed and the jaw pressure on the trapped foot. Despite this modification, the spring pressure is still relatively mild in comparison with conventional long-spring or four-coiled wild dog traps. The five wild dogs developing necrosis were trapped in November 2006 at 5-6 months of age. Traps were checked each morning, so the dogs were unlikely to have been restrained in the trap for more than 12 hours. All dogs exhibited a small degree of paw damage at capture, presenting as a swollen paw and compression at the capture point. In contrast, eight 7-8-month-old wild dogs were captured two months later, in February. Upon their release, on the advice of a veterinarian, we massaged the trapped foot to restore blood flow and applied a bruise treatment (Heparinoid 8.33 mg/mL) to assist in restoring blood flow. These animals were subsequently recaptured several months later and showed no signs of necrosis. While post-capture foot injuries are unlikely to be an issue in conventional control programs where the animal is immediately destroyed, caution is needed when releasing accidentally captured domestic dogs or research animals captured in rubber-padded traps.
We have demonstrated that 7-8-month-old dogs can be trapped and released without any evidence of subsequent necrosis following minimal veterinary treatment. We suspect that the rubber padding on traps may increase the tourniquet effect by wrapping around the paw, and we recommend the evaluation of offset laminated steel-jaw traps as an alternative. Offset laminated steel-jaw traps have been shown to be relatively humane, producing as few foot injuries as rubber-jawed traps.
Abstract:
Numerous tests have been used to measure beef cattle temperament, but limited research has addressed the relationships between such tests, or whether temperament can be modified. One hundred and forty-four steers were given one of three human handling and yarding experiences on six occasions during a 12-month grazing period post-weaning (backgrounding): Good handling/yarding, Poor handling/yarding, and Minimal handling/yarding. At the end of this phase the cattle were lot-fed for 78 days, with no handling/yarding treatments imposed, before being transported for commercial slaughter. Temperament was assessed at the start of the experiment, during backgrounding, and during lot-feeding by flight speed (FS) and a fear-of-humans test, which measured the proximity to a stimulus person (zone average; ZA), the closest approach to the person (CA), and the amount the cattle moved around the test arena (total transitions; TT). During backgrounding, FS decreased for all treatments, and at the end of backgrounding there was no difference between them. The rate of decline, however, was greatest in the Good group and smallest in the Minimal group, with the Poor group intermediate. In contrast, ZA was affected by treatment, with a greater reduction for the Good group than the others (P = 0.012). During lot-feeding, treatment did not affect FS, but all groups showed a decrease in ZA, with the greatest change in the Poor group, the least in the Good group, and the Minimal group intermediate (P = 0.052). CA was positively correlated with ZA (r = 0.18 to 0.66) and negatively with TT (r = -0.18 to -0.66). FS was consistently correlated with TT only (r = 0.17 to 0.49). These findings suggest that FS and TT measure a similar characteristic, as do ZA and CA, but that these characteristics differ from one another, indicating that temperament is not a unitary trait but has different facets. FS and TT measure one facet that we suggest is general agitation, whilst ZA and CA measure fear of people.
Thus, the cattle became less agitated during backgrounding, but the effect was not permanently influenced by the quantity and quality of handling/yarding. However, Good handling/yarding reduced fearfulness of people. Fear of people was also reduced during lot-feeding, probably as a consequence of frequent exposure to humans in a situation that was neutral or positive for the cattle.
Abstract:
Sodium cyanide is potentially a more humane poison for controlling wild dogs than sodium fluoroacetate (1080). This study quantified the clinical signs and duration of cyanide toxicosis delivered by the M-44 ejector. The device delivered a nominal 0.88 g of sodium cyanide, which caused the animal to lose the menace reflex in a mean of 43 s; the animal was assumed to have undergone cerebral hypoxia after the last visible breath. The mean time to cerebral hypoxia was 156 s for a vertical pull and 434 s for a side pull; the difference was possibly because some cyanide may be lost in a side pull. There were three distinct phases of cyanide toxicosis: the initial phase was characterised by head shaking, panting, and salivation; the immobilisation phase by incontinence, ataxia, and loss of the righting reflex; and the cerebral hypoxia phase by a tetanic seizure. Clinical signs exhibited in more than one phase of cyanide toxicosis included retching, agonal breathing, vocalisation, vomiting, altered levels of ocular reflex, leg paddling, tonic muscular spasms, respiratory distress, and muscle fasciculations of the muzzle.
Abstract:
Australia’s northern grain-producing region is unique in that the root-lesion nematode (RLN) Pratylenchus thornei predominates; P. neglectus is also present. RLNs cause substantial yield losses, particularly in wheat, but they reproduce on numerous summer and winter crops, and each nematode species prefers different crops and varieties. This project provides growers with a range of integrated management strategies to limit RLN (i.e. identify the problem, protect uninfested fields, rotate with resistant crops to keep populations low, and choose tolerant crops to maximise yields). It also provides new information about soil-borne zoosporic fungi in the region.
Abstract:
Parthenium weed (Parthenium hysterophorus L.) is believed to reduce above- and below-ground plant species diversity and above-ground productivity in several ecosystems. We quantified the impact of this invasive weed on species diversity in an Australian grassland and assessed the resulting shifts in plant community composition following management using two traditional approaches. A baseline plant community survey, prior to management, showed that the above-ground community was dominated by P. hysterophorus and stoloniferous grasses, with a high frequency of species from the Malvaceae, Chenopodiaceae, and Amaranthaceae. In heavily invaded areas, P. hysterophorus abundance and biomass were found to correlate negatively with species diversity and native species abundance. Digitaria didactyla Willd. was present in high abundance where P. hysterophorus was not, with these two species contributing most to the dissimilarity seen between areas. The application of selective broadleaf herbicides significantly reduced P. hysterophorus biomass under ungrazed conditions, but this management had not yet resulted in an increase in species diversity. In the above-ground community, P. hysterophorus was partly replaced by the introduced grass species Cynodon dactylon (L.) Pers. one year after management began, increasing above-ground forage biomass production, while D. didactyla replaced P. hysterophorus in the below-ground community. This improvement in forage availability continued to strengthen over the course of the study, resulting in a total increase of 80% after 2 years in the ungrazed treatment, demonstrating the stress that grazing was imposing on this grassland-based agro-ecosystem and showing that grazing must be removed to obtain the best results from the chemical management approach.
Abstract:
Invasive and noxious weeds are well known as a pervasive problem, imposing significant economic burdens on all areas of agriculture. Whilst there are multiple possible pathways of weed dispersal in this industry, of particular interest to this discussion is the unintended dispersal of weed seeds within fodder. During periods of drought or following natural disasters such as wild fire or flood, there arises the urgent need for 'relief' fodder to ensure survival and recovery of livestock. In emergency situations, relief fodder may be sourced from widely dispersed geographic regions, and some of these regions may be invaded by an extensive variety of weeds that are both exotic and detrimental to the intended destination for the fodder. Pasture hay is a common source of relief fodder and it typically consists of a mixture of grassy and broadleaf species that may include noxious weeds. When required urgently, pasture hay for relief fodder can be cut, baled, and transported over long distances in a short period of time, with little opportunity for prebaling inspection. It appears that, at the present time, there has been little effort towards rapid testing of bales, post-baling, for the presence of noxious weeds, as a measure to prevent dispersal of seeds. Published studies have relied on the analysis of relatively small numbers of bales, tested to destruction, in order to reveal seed species for identification and enumeration. The development of faster, more reliable, and non-destructive sampling methods is essential to increase the fodder industry's capacity to prevent the dispersal of noxious weeds to previously unaffected locales.
Abstract:
Temperatures have increased and in-crop rainfall has decreased over recent decades in many parts of the Australian wheat cropping region. With these trends set to continue or intensify, improving crop adaptation in the face of climate change is particularly urgent in this already drought-prone cropping region. Importantly, improved performance under water limitation must be achieved while retaining yield potential in more favourable seasons. A multi-trait approach to improving wheat yield and yield stability under water limitation and heat has been instigated in northern Australia using novel phenotyping techniques and a nested association mapping (NAM) approach. An innovative laboratory technique allows rapid root-trait screening of hundreds of lines; using soil-grown seedlings, the method offers significant advantages over many other lab-based techniques. Another recently developed method allows novel stay-green traits to be quantified objectively for hundreds of genotypes in standard field trial plots. Field trials in multiple locations and seasons allow evaluation of targeted trait values and identification of superior germplasm. Traits, including yield and yield components, are measured for hundreds of NAM lines in rainfed environments under various levels of water limitation. To rapidly generate lines of interest, the University of Queensland “speed breeding” method is being employed, allowing up to 7 plant generations per annum. A NAM population of over 1000 wheat recombinant inbred lines has been progressed to the F5 generation within 18 months. Genotyping the NAM lines with the genome-wide DArTseq molecular marker system provides up to 40,000 markers, which are now being used for association mapping to validate QTL previously identified in bi-parental populations and to identify novel QTL for stay-green and root traits.
We believe that combining the latest techniques in physiology, phenotyping, genetics and breeding will increase genetic progress toward improved adaptation to water-limited environments.
Abstract:
Two trials were conducted in this project. One was a continuation of work started under a previous GRDC/SRDC-funded activity, 'Strategies to improve the integration of legumes into cane based farming systems'. This trial aimed to assess the impact of trash and tillage management options and nematicide application on nematodes and crop performance. Methods and results are contained in the following publication: Halpin NV, Stirling GR, Rehbein WE, Quinn B, Jakins A, Ginns SP. The impact of trash and tillage management options and nematicide application on crop performance and plant-parasitic nematode populations in a sugarcane/peanut farming system. Proc. Aust. Soc. Sugar Cane Technol. 37, 192-203. Nematicide application in the plant crop significantly reduced total numbers of plant-parasitic nematodes (PPN) but had no impact on yield, while application to the ratoon crop significantly reduced sugar yield. The study confirmed other work demonstrating that strategies such as reduced tillage lowered total PPN populations, suggesting that the soil was more suppressive to PPN in those treatments. The second trial, a variety trial, demonstrated the limited value of nematicide application in sugarcane farming systems. This study has highlighted that growers should not view nematicides as a 'cure-all' for paddocks that have historically had high PPN numbers. Nematicides have high mammalian toxicity, have the potential to contaminate groundwater (Kookana et al. 1995), and are costly: the nematicide used in R1 cost approximately $320-$350/ha, adding $3.50/t of cane in a 100 t/ha crop. Our study also demonstrated that a single nematicide treatment at the application rate registered for sugarcane is not very effective in reducing populations of nematode pests. There appears to be some level of resistance to nematodes within the current suite of varieties available to the southern canelands.
For example, the soil in plots growing Q183 had 560% more root-knot nematodes per 200 mL of soil than plots growing Q245. The authors see great value in investment in a nematode screening program that could rate varieties into groups of susceptibility to both major sugarcane nematode pests. Such a rating could then be built into a decision-support 'tree' or tool to better enable producers to select varieties on a paddock-by-paddock basis.
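The per-tonne cost quoted in the abstract above follows directly from the per-hectare figures; a trivial check, using the upper end of the reported range:

```python
# Worked check of the nematicide cost figures quoted above.
cost_per_ha = 350.0   # upper end of the reported $320-$350/ha range
cane_yield = 100.0    # t/ha crop used in the abstract's example
cost_per_tonne = cost_per_ha / cane_yield
print(cost_per_tonne)  # 3.5, i.e. $3.50/t of cane
```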