6 results for CALIBRATION CURVE

in eResearch Archive - Queensland Department of Agriculture


Relevance:

20.00%

Abstract:

The utility of near infrared spectroscopy as a non-invasive technique for the assessment of internal eating quality parameters of mandarin fruit (Citrus reticulata cv. Imperial) was assessed. The calibration procedure for the attributes of TSS (total soluble solids) and DM (dry matter) was optimised with respect to reference sampling technique, scan averaging, spectral window, data pre-treatment (derivative treatment and scatter correction routine) and regression procedure. The recommended procedure involved sampling an equatorial position on the fruit with 1 scan per spectrum, and modified partial least squares model development on a 720–950-nm window, pre-treated as first derivative absorbance data (gap size of 4 data points) with standard normal variate and detrend scatter correction. Calibration model performance for the attributes of TSS and DM content was encouraging (typical Rc2 of >0.75 and >0.90, respectively; typical root mean squared standard error of calibration of <0.4% and <0.6%, respectively), whereas that for juiciness and total acidity was unacceptable. The robustness of the TSS and DM calibrations across new populations of fruit is documented in a companion study.
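As a concrete illustration of the recommended procedure, the sketch below applies the pre-treatment chain (gap first derivative with a 4-point gap, then standard normal variate and detrend scatter correction) to synthetic absorbance spectra over a 720–950-nm window and fits a PLS model. This is a minimal, non-authoritative sketch: scikit-learn's standard PLS stands in for the modified PLS used in the study, and all data, array shapes and names are illustrative.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def gap_first_derivative(spectra, gap=4):
    """First derivative approximated as a difference across `gap` data points."""
    return spectra[:, gap:] - spectra[:, :-gap]

def snv(spectra):
    """Standard normal variate: centre and scale each spectrum individually."""
    return (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

def detrend(spectra):
    """Subtract a fitted linear baseline from each spectrum."""
    x = np.arange(spectra.shape[1])
    out = np.empty_like(spectra)
    for i, s in enumerate(spectra):
        slope, intercept = np.polyfit(x, s, 1)
        out[i] = s - (slope * x + intercept)
    return out

# Illustrative data: 60 fruit, 230 absorbance points across 720-950 nm.
rng = np.random.default_rng(0)
absorbance = rng.random((60, 230))
tss = rng.uniform(8, 14, 60)            # synthetic TSS reference values (%)

X = detrend(snv(gap_first_derivative(absorbance, gap=4)))
model = PLSRegression(n_components=8).fit(X, tss)
rmsec = np.sqrt(np.mean((model.predict(X).ravel() - tss) ** 2))
print(f"RMSEC: {rmsec:.3f} %TSS")
```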

Relevance:

20.00%

Abstract:

The robustness of multivariate calibration models, based on near infrared spectroscopy, for the assessment of total soluble solids (TSS) and dry matter (DM) of intact mandarin fruit (Citrus reticulata cv. Imperial) was assessed. TSS calibration model performance was validated by predicting populations of fruit not represented in the original calibration population (different harvest days from a single tree, different harvest localities, different harvest seasons). Of these, calibration performance was most affected by validation across seasons (signal-to-noise statistic on root mean squared error of prediction of 3.8, compared with 20 and 13 for locality and harvest day, respectively). Procedures for selecting samples from the validation population for addition to the calibration population (‘model updating’) were considered for both TSS and DM models. Random selection from the validation group worked as well as more sophisticated selection procedures, with approximately 20 samples required. Models developed using samples at a range of temperatures were robust in validation for both TSS and DM.
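A minimal sketch of the ‘model updating’ step, under the assumption of synthetic, already pre-treated spectra: roughly 20 randomly selected samples from the new population are moved into the calibration set before refitting, and prediction error is then checked on the remaining validation samples. PLSRegression again stands in for the study's model.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)

# Synthetic pre-treated spectra and TSS references for two populations.
X_cal, y_cal = rng.random((80, 220)), rng.uniform(8, 14, 80)
X_val, y_val = rng.random((50, 220)), rng.uniform(8, 14, 50)

# Randomly move ~20 validation samples into the calibration set.
update_idx = rng.choice(len(X_val), size=20, replace=False)
keep_idx = np.setdiff1d(np.arange(len(X_val)), update_idx)

X_updated = np.vstack([X_cal, X_val[update_idx]])
y_updated = np.concatenate([y_cal, y_val[update_idx]])

# Refit and evaluate on the validation samples that were not added.
model = PLSRegression(n_components=8).fit(X_updated, y_updated)
pred = model.predict(X_val[keep_idx]).ravel()
rmsep = np.sqrt(np.mean((pred - y_val[keep_idx]) ** 2))
print(f"RMSEP on remaining validation samples: {rmsep:.3f}")
```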

Relevance:

20.00%

Abstract:

A new test for pathogenic Leptospira isolates, based on RAPD-PCR and high-resolution melt (HRM) analysis (which measures the melting temperature of amplicons in real time, using a fluorescent DNA-binding dye), has recently been developed. A characteristic profile of the amplicons can be used to define serovars or detect genotypes. Ten serovars of leptospires, from the species Leptospira interrogans (serovars Australis, Robinsoni, Hardjo, Pomona, Zanoni, Copenhageni and Szwajizak), L. borgpetersenii (serovar Arborea), L. kirschneri (serovar Cynopteri) and L. weilii (serovar Celledoni), were typed against 13 previously published RAPD primers, using a real-time cycler (the Corbett Life Science RotorGene 6000) and the optimised reagents from a commercial kit (Quantace SensiMix). RAPD-HRM at specific temperatures generated defining amplicon melt profiles for each of the tested serovars. These profiles were evaluated as difference-curve graphs generated with the RotorGene software package, using a cut-off of at least ±8 'U'. The results demonstrated that RAPD-HRM can be used to measure serovar diversity and establish identity with a high degree of stability. The characterisation of Leptospira serotypes using a DNA-based methodology is now possible. As an objective, relatively inexpensive and rapid method of serovar identification, at least for cultured isolates, RAPD-HRM assays show convincing potential.
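The difference-curve evaluation can be sketched as follows, assuming normalised melt profiles on a shared temperature ramp are already available; the study generated these graphs with the RotorGene software package, so the synthetic curves and the subtraction step here are illustrative only.

```python
import numpy as np

CUTOFF = 8.0  # minimum absolute difference ('U') to call a profile distinct

# Synthetic normalised fluorescence (0-100 'U') over a shared temperature ramp.
temps = np.linspace(75.0, 90.0, 150)
reference = 100.0 / (1.0 + np.exp(temps - 83.0))            # reference serovar
same_serovar = reference + np.random.default_rng(2).normal(0, 1.0, temps.size)
other_serovar = 100.0 / (1.0 + np.exp(temps - 85.0))        # shifted melt

for name, curve in [("same serovar", same_serovar), ("other serovar", other_serovar)]:
    diff = curve - reference                 # difference curve vs. reference
    distinct = np.max(np.abs(diff)) >= CUTOFF
    print(f"{name}: max |difference| = {np.max(np.abs(diff)):.1f} U, distinct: {distinct}")
```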

Relevance:

20.00%

Abstract:

High-resolution melt-curve analysis of random amplified polymorphic DNA (RAPD-HRM) is a novel technology that has emerged as a possible method to characterise leptospires to serovar level. RAPD-HRM has recently been used to measure intra-serovar convergence between strains of the same serovar as well as inter-serovar divergence between strains of different serovars. The results indicate that intra-serovar heterogeneity and inter-serovar homogeneity may limit the application of RAPD-HRM in routine diagnostics. They also indicate that genetic attenuation of aged, high-passage-number isolates could undermine the use of RAPD-HRM or any other molecular technology. Such genetic attenuation may account for a general decrease seen in titres of rabbit hyperimmune antibodies over time. Before RAPD-HRM can be further advanced as a routine diagnostic tool, strains more representative of the wild-type serovars of a given region need to be identified. Further, RAPD-HRM analysis of reference strains indicates that the routine renewal of reference collections, with new isolates, may be needed to maintain the genetic integrity of the collections.

Relevance:

20.00%

Abstract:

More than 1200 wheat and 120 barley experiments conducted in Australia to examine yield responses to applied nitrogen (N) fertiliser are contained in a national database of field crops nutrient research (BFDC National Database). The yield responses are accompanied by various pre-plant soil test data to quantify plant-available N and other indicators of soil fertility status or mineralisable N. A web application (BFDC Interrogator), developed to access the database, enables construction of calibrations between relative crop yield (RY = (Y0/Ymax) × 100, where Y0 is yield without applied N and Ymax is the maximum yield) and N soil test value. In this paper we report the critical soil test values for 90% RY (CV90) and the associated critical ranges (CR90, defined as the 70% confidence interval around that CV90) derived from analysis of various subsets of these winter cereal experiments. Experimental programs were conducted throughout Australia’s main grain-production regions in different eras, starting from the 1960s in Queensland through to Victoria during the 2000s. Improved management practices adopted during the period were reflected in increasing potential yields with research era, from an average Ymax of 2.2 t/ha in Queensland in the 1960s and 1970s, to 3.4 t/ha in South Australia (SA) in the 1980s, 4.3 t/ha in New South Wales (NSW) in the 1990s, and 4.2 t/ha in Victoria in the 2000s. Various sampling depths (0.1–1.2 m) and methods of quantifying available N (nitrate-N or mineral-N) from pre-planting soil samples were used and provided useful guides to the need for supplementary N. The most regionally consistent relationships were established using nitrate-N (kg/ha) in the top 0.6 m of the soil profile, with regional and seasonal variation in CV90 largely accounted for through impacts on experimental Ymax. The CV90 for nitrate-N within the top 0.6 m of the soil profile for wheat crops increased from 36 to 110 kg nitrate-N/ha as Ymax increased over the range 1 to >5 t/ha. Apparent variation in CV90 with seasonal moisture availability was entirely consistent with impacts on experimental Ymax. Further analyses of wheat trials with available grain protein (~45% of all experiments) established that grain yield, not grain N content, was the major driver of crop N demand and CV90. Subsets of data were used to explore the impact of crop management practices, such as crop rotation or fallow length, on both pre-planting profile mineral-N and CV90. These analyses showed that while management practices influenced profile mineral-N at planting and the likelihood and size of yield response to applied N fertiliser, they had no significant impact on CV90. A level of risk is involved in using pre-plant testing to determine the need for supplementary N application in all Australian dryland systems. In southern and western regions, where crop performance is based almost entirely on in-crop rainfall, this risk is offset by the management opportunity to split N applications during crop growth in response to changing crop yield potential. In northern cropping systems, where stored soil moisture at sowing is indicative of minimum yield potential, erratic winter rainfall increases uncertainty about actual yield potential and reduces the opportunity for effective in-season applications.
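One way to picture the calibration between relative yield and pre-plant nitrate-N is sketched below. It assumes a Mitscherlich-type response form, RY = a(1 − exp(−bx)), purely for illustration (the BFDC Interrogator's actual fitting method is not reproduced here), and inverts the fitted curve at 90% RY to obtain a CV90.

```python
import numpy as np
from scipy.optimize import curve_fit

def mitscherlich(x, a, b):
    """Asymptotic relative-yield response to soil nitrate-N (kg/ha)."""
    return a * (1.0 - np.exp(-b * x))

# Synthetic trial data: RY = (Y0/Ymax) * 100 versus 0-0.6 m nitrate-N.
rng = np.random.default_rng(3)
nitrate_n = rng.uniform(10, 200, 80)                  # kg nitrate-N/ha
ry = mitscherlich(nitrate_n, 100.0, 0.03)
ry = np.clip(ry + rng.normal(0, 4, nitrate_n.size), 0, 100)

(a, b), _ = curve_fit(mitscherlich, nitrate_n, ry, p0=(100.0, 0.02))

# Invert the fitted curve at 90% relative yield: x = -ln(1 - 90/a) / b.
cv90 = -np.log(1.0 - 90.0 / a) / b
print(f"Fitted a={a:.1f}, b={b:.4f}; CV90 = {cv90:.0f} kg nitrate-N/ha")
```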

Relevance:

20.00%

Abstract:

Soil testing is the most widely used tool to predict the need for fertiliser phosphorus (P) application to crops. This study examined factors affecting critical soil P concentrations and confidence intervals for wheat and barley grown in Australian soils by interrogating validated data from 1777 wheat and 150 barley field treatment series now held in the BFDC National Database. To narrow the confidence intervals associated with estimated critical P concentrations, filters for yield, crop stress, or low pH were applied. Once treatment series with low yield (<1 t/ha), severe crop stress, or pHCaCl2 <4.3 were screened out, critical concentrations were relatively insensitive to wheat yield (>1 t/ha). There was a clear increase in critical P concentration from early trials, when full tillage was common, to those conducted in 1995–2011, a period of rapid shift towards adoption of minimum tillage. For wheat, critical Colwell-P concentrations associated with 90 or 95% of maximum yield varied among Australian Soil Classification (ASC) Orders and Sub-orders: Calcarosol, Chromosol, Kandosol, Sodosol, Tenosol and Vertosol. Soil type, based on ASC Orders and Sub-orders, produced critical Colwell-P concentrations at 90% of maximum relative yield ranging from 15 mg/kg (Grey Vertosol) to 47 mg/kg (Supracalcic Calcarosols), with other soils having values in the range 19–27 mg/kg. Distinctive differences in critical P concentrations were evident among Sub-orders of Calcarosols, Chromosols, Sodosols, Tenosols, and Vertosols, possibly due to differences in soil properties related to P sorption. However, insufficient data were available to develop a relationship between P buffering index (PBI) and critical P concentration. In general, there was no evidence that critical concentrations for barley would differ from those for wheat on the same soils. Significant knowledge gaps that need to be filled to improve the relevance and reliability of soil P testing for winter cereals were: a lack of data for oats; the paucity of treatment series reflecting current cropping practices, especially minimum tillage; and inadequate metadata on soil texture, pH, growing season rainfall, gravel content, and PBI. The critical concentrations determined illustrate the importance of recent experimental data and of soil type, but also provide examples of interrogation pathways into the BFDC National Database for extracting locally relevant critical P concentrations to guide P fertiliser decision-making in wheat and barley.
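The screening filters described above translate naturally into a data-frame operation, sketched below with hypothetical column names and synthetic records; the real interrogation runs against the BFDC National Database, and a response curve would then be fitted per soil Order (as in the CV90 sketch above).

```python
import pandas as pd

# Synthetic treatment series; column names are assumptions for illustration.
trials = pd.DataFrame({
    "soil_order":       ["Vertosol", "Vertosol", "Calcarosol", "Chromosol"],
    "yield_t_ha":       [2.4, 0.8, 3.1, 2.9],
    "ph_cacl2":         [5.1, 5.0, 4.1, 5.6],
    "severe_stress":    [False, False, False, True],
    "colwell_p_mg_kg":  [18.0, 12.0, 44.0, 25.0],
    "relative_yield":   [88.0, 70.0, 93.0, 81.0],
})

# Screen out low-yield (<1 t/ha), acid (pH_CaCl2 < 4.3) and stressed series.
screened = trials[(trials.yield_t_ha >= 1.0)
                  & (trials.ph_cacl2 >= 4.3)
                  & ~trials.severe_stress]

# Grouping by ASC Order; a critical-P curve would be fitted per group.
print(screened.groupby("soil_order")[["colwell_p_mg_kg", "relative_yield"]].mean())
```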