15 results for Crash Test Criteria

in eResearch Archive - Queensland Department of Agriculture


Relevance: 30.00%

Abstract:

More than 1200 wheat and 120 barley experiments conducted in Australia to examine yield responses to applied nitrogen (N) fertiliser are contained in a national database of field crops nutrient research (BFDC National Database). The yield responses are accompanied by various pre-plant soil test data to quantify plant-available N and other indicators of soil fertility status or mineralisable N. A web application (BFDC Interrogator), developed to access the database, enables construction of calibrations between relative crop yield (RY = (Y0/Ymax) × 100) and N soil test value. In this paper we report the critical soil test values for 90% RY (CV90) and the associated critical ranges (CR90, defined as the 70% confidence interval around that CV90) derived from analysis of various subsets of these winter cereal experiments. Experimental programs were conducted throughout Australia’s main grain-production regions in different eras, starting from the 1960s in Queensland through to Victoria during the 2000s. Improved management practices adopted over this period were reflected in potential yields that increased with research era, from an average Ymax of 2.2 t/ha in Queensland in the 1960s and 1970s, to 3.4 t/ha in South Australia (SA) in the 1980s, 4.3 t/ha in New South Wales (NSW) in the 1990s, and 4.2 t/ha in Victoria in the 2000s. Various sampling depths (0.1–1.2 m) and methods of quantifying available N (nitrate-N or mineral-N) from pre-planting soil samples were used and provided useful guides to the need for supplementary N. The most regionally consistent relationships were established using nitrate-N (kg/ha) in the top 0.6 m of the soil profile, with regional and seasonal variation in CV90 largely accounted for through impacts on experimental Ymax. The CV90 for nitrate-N within the top 0.6 m of the soil profile for wheat crops increased from 36 to 110 kg nitrate-N/ha as Ymax increased over the range 1 to >5 t/ha. Apparent variation in CV90 with seasonal moisture availability was entirely consistent with impacts on experimental Ymax. Further analyses of wheat trials with available grain protein data (~45% of all experiments) established that grain yield, and not grain N content, was the major driver of crop N demand and CV90. Subsets of data were used to explore the impact of crop management practices such as crop rotation or fallow length on both pre-planting profile mineral-N and CV90. These analyses showed that while management practices influenced profile mineral-N at planting and the likelihood and size of yield response to applied N fertiliser, they had no significant impact on CV90. A level of risk is involved with the use of pre-plant testing to determine the need for supplementary N application in all Australian dryland systems. In southern and western regions, where crop performance is based almost entirely on in-crop rainfall, this risk is offset by the management opportunity to split N applications during crop growth in response to changing crop yield potential. In northern cropping systems, where stored soil moisture at sowing is indicative of minimum yield potential, erratic winter rainfall increases uncertainty about actual yield potential as well as reducing the opportunity for effective in-season applications.
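
As a purely illustrative sketch (not the BFDC Interrogator's actual method), the snippet below shows how a relative-yield calibration of the kind described might be fitted and a CV90 derived; the Mitscherlich-type response form, the synthetic trial values and all variable names are assumptions.

```python
# Illustrative sketch only: fitting a relative-yield calibration and deriving a
# critical soil test value for 90% relative yield (CV90). The model form and the
# synthetic data are assumptions, not the BFDC Interrogator's implementation.
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical trial data: pre-plant nitrate-N (kg/ha, 0-0.6 m) and yields (t/ha)
soil_n = np.array([20, 35, 50, 70, 90, 120, 160, 200], dtype=float)
y0 = np.array([1.1, 1.8, 2.4, 3.0, 3.5, 3.9, 4.1, 4.2])   # unfertilised yield
ymax = 4.3                                                  # fertilised plateau yield

relative_yield = 100.0 * y0 / ymax                          # RY = (Y0/Ymax) x 100

def mitscherlich(x, b, c):
    """RY = 100 * (1 - b * exp(-c * x))"""
    return 100.0 * (1.0 - b * np.exp(-c * x))

(b, c), _ = curve_fit(mitscherlich, soil_n, relative_yield, p0=(1.0, 0.01))

# Solve 90 = 100 * (1 - b * exp(-c * CV90))  =>  CV90 = ln(10 * b) / c
cv90 = np.log(10.0 * b) / c
print(f"Estimated CV90: {cv90:.0f} kg nitrate-N/ha")
```

In practice a critical range such as CR90 would come from the uncertainty of the fitted parameters (for example via the covariance matrix that curve_fit also returns), which is omitted from this sketch.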

Relevance: 20.00%

Abstract:

Non-parametric difference tests such as triangle and duo-trio tests are traditionally used to establish differences or similarities between products. However, they supply the researcher with only partial answers, and further testing is often required to establish the nature, size and direction of differences. This paper looks at the advantages of the difference from control (DFC) test (also known as the degree of difference test) and discusses appropriate applications of the test. The scope and principle of the test, panel composition and analysis of results are presented with the aid of suitable examples. Two of the major uses of the DFC test are in quality control and shelf-life testing. The role the DFC test takes in these areas and the use of other tests to complement it are discussed. Controls or standards are important in both these areas, and the use of standard products, mental and written standards and blind controls is highlighted. The DFC test has applications in products where the duo-trio and triangle tests cannot be used because of the normal heterogeneity of the product. While the DFC test is a simple difference test, it can be structured to give researchers more valuable data and greater scope to make informed decisions about their product.
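
As one hedged illustration of how DFC scores are commonly analysed (the paper's own analysis may differ), the sketch below compares panel ratings for a test product against those given to a blind control; the rating scale, the data and the choice of a two-sample t-test are assumptions.

```python
# Illustrative sketch only: comparing difference-from-control (DFC) ratings for a
# test sample against ratings given to a blind control, which estimates panel noise.
import numpy as np
from scipy import stats

# Hypothetical panel ratings on a 0 (no difference) to 10 (extreme difference) scale
blind_control = np.array([1, 0, 2, 1, 1, 2, 0, 1, 2, 1])  # control presented blind
test_sample   = np.array([3, 4, 2, 5, 3, 4, 3, 2, 4, 3])  # product under evaluation

# Does the test sample differ from the control beyond the blind-control baseline?
t_stat, p_value = stats.ttest_ind(test_sample, blind_control)
effect = test_sample.mean() - blind_control.mean()

print(f"Mean difference beyond blind control: {effect:.1f} scale units (p = {p_value:.3f})")
```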

Relevance: 20.00%

Abstract:

The dwarf somaclonal variant is a major problem affecting micropropagation of the banana cultivar Williams (Musa spp. AAA; subgroup Cavendish). This problem arises from genetic changes that occur during the tissue culture process. Early identification of this problem is difficult because propagators must wait until plants are ex vitro before the dwarfism phenotype can be visualised. In this study, we have improved a SCAR-based molecular diagnostic technique, developed by Damasco et al. [Acta Hortic. 461 (1997) 157], for the early identification of dwarf off-types. We have included a positive internal control in a multiplex PCR and adapted the technique for use with small amounts of fresh in vitro leaf material as PCR template. The control product is a 500 bp fragment from 18S rRNA and is amplified in all tissues irrespective of phenotype. The use of small amounts of in vitro leaf material removes the need for genomic DNA extraction.

Relevance: 20.00%

Abstract:

The aim of this study was to compare the use of indirect haemagglutination (IHA) and gel diffusion (GD) tests for serotyping Haemophilus parasuis by the Kielstein-Rapp-Gabrielson (KRG) scheme. All 15 serovar reference strains, 72 Australian field isolates, nine Chinese field isolates, and seven isolates from seven experimentally infected pigs were evaluated with both tests. With the IHA test, 14 of the 15 reference strains were correctly serotyped, with the serovar 10 strain failing to give a titre with serovar 10 antiserum. In the GD test, 13 reference strains were correctly serotyped, with antigen from serovars 7 and 8 failing to react with any antiserum. The IHA methodology serotyped 45 of the 81 field isolates, while the GD methodology serotyped 48. For 29 isolates, the GD and IHA methods gave discordant results. It was concluded that the IHA is a good additional test for the serotyping of H. parasuis by the KRG scheme when the GD methodology fails to provide a result or shows unusual cross-reactions.

Relevance: 20.00%

Abstract:

Degradation of RNA in diagnostic specimens can cause false-negative test results and potential misdiagnosis when tests rely on the detection of specific RNA sequences. Current molecular methods of checking RNA integrity tend to be host species or group specific, necessitating libraries of primers and reaction conditions. The objective here was to develop a universal (multi-species) quality assurance tool for determining the integrity of RNA in animal tissues submitted to a laboratory for analysis. Ribosomal RNA (16S rRNA) transcribed from the mitochondrial 16S rDNA was used as template material for reverse transcription to cDNA and was amplified using the polymerase chain reaction (PCR). Because mitochondrial DNA is highly conserved, the primers used were shown to reverse transcribe and amplify RNA from every animal species tested. Deliberate degradation of the rRNA template through temperature abuse of samples resulted in no reverse transcription or amplification. Samples spiked with viruses showed that single-stranded viral RNA and rRNA in the same sample degraded at similar rates; hence, reverse transcription and PCR amplification of 16S rRNA can be used as a test of sample integrity and suitability for any analysis that requires the sample's RNA, including viral RNA. This test will be an invaluable quality assurance tool for determining RNA integrity in tissue samples, thus avoiding erroneous test results that might occur if degraded target RNA is used unknowingly as template material for reverse transcription and subsequent PCR amplification.

Relevance: 20.00%

Abstract:

Infectious coryza is an upper respiratory tract disease of chickens, with the major impact occurring in multi-age flocks. We investigated the relationship between the level of antibodies, as detected by a haemagglutination-inhibition (HI) assay, in infectious coryza-vaccinated chickens and the protection against challenge in those chickens. In one experiment, chickens given a single dose of either of two infectious coryza vaccines lacked a detectable HI response to vaccination but showed significant levels of protection 11 weeks after vaccination. In contrast, in chickens given two doses of an infectious coryza vaccine and challenged 3 weeks after the second vaccine dose, there was a strong serological response, with 36/40 birds having an HI titre of 1/20 or greater. In this trial there was an apparent relationship between titre and subsequent protection: none of the 32 chickens with a titre of 1/40 or 1/80 showed any clinical signs, and only one of this group yielded the challenge organism on culture. In contrast, three of the four vaccinated chickens with an HI titre of less than 1/5 developed the typical clinical signs of coryza and yielded the challenge organism on culture. Overall, our results suggest that HI titres cannot be regarded as a definitive predictor of vaccine efficacy. We suggest that the vaccination-challenge trial remains the gold standard for the evaluation of the immune response to infectious coryza vaccines.

Relevance: 20.00%

Abstract:

Laboratory and field data reported in the literature are confusing with regard to “adequate” protection thresholds for borate timber preservatives. The confusion is compounded by differences in termite species, timber species and test methodology. Laboratory data indicate that a borate retention of 0.5% mass/mass (m/m) boric acid equivalent (BAE) would cause >90% termite mortality and restrict mass loss in test specimens to ≤5%. Field data generally suggest that borate retentions appreciably greater than 0.5% m/m BAE are required. We report two field experiments with varying amounts of untreated feeder material in which the responses of Coptotermes acinaciformis (Froggatt) (Isoptera: Rhinotermitidae) to borate-treated radiata (Monterey) pine, Pinus radiata D. Don, were measured. The apparently conflicting laboratory and field results are explained by the presence or absence of untreated feeder material in the test environment. In the absence of untreated feeder material, wood containing 0.5% m/m BAE provided adequate protection from Coptotermes sp., whereas in the presence of untreated feeder material, higher retentions were required. Furthermore, the retentions required increased with increased amounts of susceptible material present. Some termites, for example Nasutitermes sp. and Mastotermes darwiniensis Froggatt, are borate-tolerant, and borate timber preservatives are not a viable management option for these species. The lack of uniform standards for termite test methodology and efficacy assessment criteria across the world is recognised as a difficulty with research into the performance of timber preservatives against termites. The many variables in laboratory and field assays make “prescriptive” standards difficult to recommend. The use of “performance” standards to define efficacy criteria (“adequate” protection) is discussed.

Relevance: 20.00%

Abstract:

The enemy release hypothesis predicts that native herbivores will either prefer or cause more damage to native than to introduced plant species. We tested this using preference and performance experiments in the laboratory and surveys of leaf damage caused by the magpie moth Nyctemera amica on a co-occurring native and introduced species of fireweed (Senecio) in eastern Australia. In the laboratory, ovipositing females and feeding larvae preferred the native S. pinnatifolius over the introduced S. madagascariensis. Larvae performed equally well on foliage of S. pinnatifolius and S. madagascariensis: pupal weights did not differ between insects reared on the two species, but growth rates were significantly faster on S. pinnatifolius. In the field, foliage damage was significantly greater on the native S. pinnatifolius than on the introduced S. madagascariensis. These results support the enemy release hypothesis and suggest that the failure of native consumers to switch to introduced species contributes to the invasive success of those species. Both plant species experienced reduced, rather than increased, levels of herbivory when growing in mixed populations as opposed to pure stands in the field; thus, there was no evidence of apparent competition.

Relevance: 20.00%

Abstract:

The Wambiana grazing trial started in 1997 to test and develop sustainable and profitable grazing strategies for managing rainfall variability. It is important that this trial continue, as the results are still short-term relative to rainfall cycles and significant treatment changes are still occurring. This new proposal will maintain baseline treatments but will modify others based on the trial's learnings to date. It builds on treatment differences and evidence collected over the last 12 years to deliver evidence-based guidelines and principles for sustainable and productive management. The trial also links to other projects modelling water quality, climate change, methane emissions and soil C sequestration on grazing lands.

Relevance: 20.00%

Abstract:

Fruit quality is one of the major factors limiting growth in avocado retail sales. Avocado growers are often unaware of their end-use fruit quality, since quality problems only manifest upon fruit ripening and growers receive limited feedback from the supply chain. If growers were aware of their expected fruit quality, they would be better equipped to make marketing decisions and, if necessary, to take remedial action to improve their fruit quality. Avotest is being developed as a quick and easy method of determining expected end-use fruit quality before the start of the commercial fruit harvest. The test aims to distinguish between blocks with robust fruit and those with less robust fruit. It could also be used to predict the resulting fruit quality after the implementation of new farming practices.

Relevance: 20.00%

Abstract:

Foliar oils, particularly monoterpenes, can influence the susceptibility of plants to herbivory. In plants, including eucalypts, monoterpenes are often associated with plant defence. A recent analysis revealed an increase in foliar oil content with increasing latitudinal endemism, and we tested this pattern using three eucalypt taxa comprising a latitudinal replacement cline. We also examined the relative concentrations of two monoterpenes (alpha-pinene and 1,8-cineole), for which meta-analyses also showed latitudinal variation, using hybrids of these three taxa with Corymbia torelliana. These, and pure C. torelliana, were then assessed in common-garden field plots for the abundance and distribution of herbivory by four distinct herbivore taxa. Differing feeding strategies among these herbivores allowed us to test hypotheses regarding heritability of susceptibility and relationships to alpha-pinene and 1,8-cineole. We found no support for an increase in foliar oil content with increasing latitude, nor did our analysis support predictions for consistent variation in alpha-pinene and 1,8-cineole contents with latitude. However, herbivore species showed differential responses to different taxa and monoterpene contents. For example, eriophyid mites, the most monophagous of our censused herbivores, avoided the pure species, but fed on hybrid taxa, supporting hypotheses on hybrid susceptibility. The most polyphagous herbivore (leaf blister sawfly Phylacteophaga froggatti) showed no evidence of response to plant secondary metabolites, while the distribution and abundance patterns of Paropsis atomaria showed some relationship to monoterpene yields.

Relevance: 20.00%

Abstract:

Predicting which species are likely to cause serious impacts in the future is crucial for targeting management efforts, but the characteristics of such species remain largely unconfirmed. We use data and expert opinion on tropical and subtropical grasses naturalised in Australia since European settlement to identify naturalised and high-impact species, and subsequently to test whether high-impact species are predictable. High-impact species for the three main affected sectors (environment, pastoral and agriculture) were determined by assessing evidence against pre-defined criteria. Twenty-one of the 155 naturalised species (14%) were classified as high-impact, including four that affected more than one sector. High-impact species were more likely to have faster spread rates (regions invaded per decade) and to be semi-aquatic. Spread rate was best explained by whether species had been actively spread (as pasture) and by time since naturalisation, but may not be truly explanatory, as it was tightly correlated with range size and incidence rate. Giving more weight to minimising the chance of overlooking high-impact species, a priority for biosecurity, meant a wider range of predictors was required to identify high-impact species, and the predictive power of the models was reduced. By-sector analysis of the predictors of high-impact species was limited by the relative rarity of such species, but it showed differences among sectors, including in the universal predictors (spread rate and habitat) and in life history. Furthermore, the species causing high impact to agriculture have changed in the past 10 years with changes in farming practice, highlighting the importance of context in determining impact. A rationale for invasion ecology is to improve the prediction of, and response to, future threats. Although our study identifies some universal predictors, it suggests that improved prediction will require a far greater emphasis on impact rather than invasiveness, and will need to account for the individual circumstances of affected sectors and the relative rarity of high-impact species.

Relevance: 20.00%

Abstract:

Soil testing is the most widely used tool to predict the need for fertiliser phosphorus (P) application to crops. This study examined factors affecting critical soil P concentrations and their confidence intervals for wheat and barley grown in Australian soils by interrogating validated data from 1777 wheat and 150 barley field treatment series now held in the BFDC National Database. To narrow the confidence intervals associated with estimated critical P concentrations, filters for yield, crop stress, or low pH were applied. Once treatment series with low yield (<1 t/ha), severe crop stress, or pHCaCl2 <4.3 were screened out, critical concentrations were relatively insensitive to wheat yield (>1 t/ha). There was a clear increase in critical P concentration from early trials, when full tillage was common, to those conducted in 1995–2011, a period of rapid shift towards the adoption of minimum tillage. For wheat, critical Colwell-P concentrations associated with 90 or 95% of maximum yield varied among Australian Soil Classification (ASC) Orders and Sub-orders: Calcarosol, Chromosol, Kandosol, Sodosol, Tenosol and Vertosol. Soil type, based on ASC Orders and Sub-orders, produced critical Colwell-P concentrations at 90% of maximum relative yield ranging from 15 mg/kg (Grey Vertosol) to 47 mg/kg (Supracalcic Calcarosol), with other soils having values in the range 19–27 mg/kg. Distinctive differences in critical P concentrations were evident among Sub-orders of Calcarosols, Chromosols, Sodosols, Tenosols, and Vertosols, possibly due to differences in soil properties related to P sorption. However, insufficient data were available to develop a relationship between P buffering index (PBI) and critical P concentration. In general, there was no evidence that critical concentrations for barley would differ from those for wheat on the same soils. Significant knowledge gaps to be filled to improve the relevance and reliability of soil P testing for winter cereals were: the lack of data for oats; the paucity of treatment series reflecting current cropping practices, especially minimum tillage; and inadequate metadata on soil texture, pH, growing season rainfall, gravel content, and PBI. The critical concentrations determined illustrate the importance of recent experimental data and of soil type, but also provide examples of interrogation pathways into the BFDC National Database to extract locally relevant critical P concentrations for guiding P fertiliser decision-making in wheat and barley.
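
The sketch below illustrates, with invented data and assumed column names, the kind of screening described above (removing low-yield, severely stressed or strongly acidic treatment series) before summarising critical Colwell-P by ASC Order; it is not the BFDC Interrogator's implementation.

```python
# Illustrative sketch only: filtering hypothetical treatment-series records before
# summarising critical Colwell-P concentrations by soil Order.
import pandas as pd

df = pd.DataFrame({
    "asc_order": ["Vertosol", "Vertosol", "Calcarosol", "Chromosol", "Tenosol"],
    "ymax_t_ha": [2.8, 0.7, 3.1, 2.2, 1.9],
    "ph_cacl2": [6.1, 5.8, 7.9, 4.1, 5.2],
    "severe_stress": [False, False, False, False, True],
    "critical_colwell_p": [16, 14, 45, 24, 21],   # mg/kg at 90% relative yield
})

# Screen out low-yield, low-pH and severely stressed series before summarising
usable = df[(df.ymax_t_ha >= 1.0) & (df.ph_cacl2 >= 4.3) & (~df.severe_stress)]

# Summarise critical Colwell-P by ASC Order
print(usable.groupby("asc_order")["critical_colwell_p"].mean())
```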

Relevance: 20.00%

Abstract:

Nitrogen (N) is an essential nutrient in mango, influencing both productivity and fruit quality. In Australia, tree N is traditionally assessed once a year, in the dormant pre-flowering stage, by laboratory analysis of leaf N. This single assessment is insufficient to determine tree N status at all stages of the annual phenological cycle. Development of a field-based rapid N test would allow more frequent monitoring of tree N status and improved fertiliser management. This experiment examined the accuracy and useability of several devices used in other horticultural crops to rapidly assess mango leaf N in the field: the Konica Minolta 'SPAD-502 chlorophyll meter', the Horiba 'Cardy Meter' and the Merck 'RQflex 10'. Regression and correlation analyses were used to determine the relationship between total leaf N and the measurements from the rapid test devices. The relationship between the chlorophyll index measured by the SPAD-502 meter and leaf N is highly significant at the late fruit set (R2=0.72, n=40) and post-harvest (R2=0.81, n=40) stages in the mango cultivar 'Kensington Pride', and significant (R2=0.51, n=40) at the flowering stage, indicating the device can be used to rapidly assess mango leaf N in the field. Correlation analysis indicated that the relationship between petiole sap measurements with the Cardy or Merck devices and leaf N is non-significant.
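
As an illustration only, the following sketch shows the kind of regression used to relate a chlorophyll index to total leaf N and to report an R2 value; the readings are invented and the linear model is an assumption.

```python
# Illustrative sketch only: simple linear regression of total leaf N on a
# chlorophyll index, reporting the fitted line and R2 (squared correlation).
import numpy as np
from scipy import stats

spad_index = np.array([38.2, 41.5, 44.1, 47.0, 49.3, 52.8, 55.4, 58.0])  # hypothetical SPAD-502 readings
leaf_n_pct = np.array([1.05, 1.12, 1.21, 1.30, 1.34, 1.45, 1.52, 1.58])  # hypothetical total leaf N (%)

fit = stats.linregress(spad_index, leaf_n_pct)
print(f"leaf N = {fit.slope:.3f} x SPAD + {fit.intercept:.2f}, R2 = {fit.rvalue**2:.2f}")
```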

Relevance: 20.00%

Abstract:

Objective: To establish the prevalence of anthelmintic resistance in ovine gastrointestinal nematodes in southern Queensland.

Design: An observational parasitological study using the faecal egg count reduction test.

Methods: Sheep farms (n = 20) enrolled in this study met the twin criteria of using worm testing for drench decisions and having concerns about anthelmintic efficacy. On each farm, 105 sheep were randomly allocated to one of six treatment groups or an untreated control group. Faecal samples were collected on day 0 and days 10–14 for worm egg counts and larval differentiation. Single-active and multi-combination anthelmintics, in persistent and non-persistent, oral liquid or capsule, pour-on and injectable formulations, were tested. Monepantel was not tested. Farmers also responded to a questionnaire on drenching practices.

Results: Haemonchus contortus was the predominant species. Efficacy of <95% was recorded on 85% of farms for one or more anthelmintics and on 10% of farms for six anthelmintics. No resistance was identified on three farms. The 4-way combination product was efficacious (n = 4 farms). Naphthalophos resistance was detected on one farm only. Resistance to levamisole (42% of farms), moxidectin injection (50% of farms) and the closantel/abamectin combination (67% of farms) was identified. Moxidectin oral was efficacious against Trichostrongylus colubriformis, which was predominant on only one farm. Of the farms tested, 55% ran meat breeds, 60% dosed at more than the recommended dose rate, and 70% always, mostly or when possible practised a ‘drench and move’ strategy.

Conclusion: This level of anthelmintic resistance in southern Queensland will severely compromise worm control and force increased use of monepantel.
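
For illustration, the sketch below shows the core faecal egg count reduction calculation with a 95% efficacy threshold; the egg counts are invented, and the full field method also involves confidence intervals and larval differentiation, which are omitted here.

```python
# Illustrative sketch only: faecal egg count reduction test (FECRT) efficacy,
# calculated as the percentage reduction in mean egg counts relative to the
# untreated control group, with resistance flagged below 95% efficacy.
import numpy as np

control_epg = np.array([820, 640, 1010, 760, 900])  # eggs per gram, untreated control group
treated_epg = np.array([310, 280, 450, 260, 390])   # eggs per gram, days 10-14 post-treatment

efficacy = 100.0 * (1.0 - treated_epg.mean() / control_epg.mean())
status = "resistance suspected" if efficacy < 95.0 else "effective"
print(f"Efficacy: {efficacy:.1f}% ({status})")
```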