3 results for Sequential injection analysis
in eResearch Archive - Queensland Department of Agriculture
Abstract:
The wheat grain industry is Australia's second largest agricultural export commodity. There is an increasing demand from industry for accurate, objective and near real-time crop production information. The advent of the Moderate Resolution Imaging Spectroradiometer (MODIS) satellite platform has augmented the capability of satellite-based applications to capture reflectance over large areas at acceptable pixel scale, cost and accuracy. This article investigated the use of multi-temporal MODIS enhanced vegetation index (EVI) imagery to determine crop area. The rigour of the harmonic analysis of time series (HANTS) and early-season metric approaches was assessed when extrapolating over the entire Queensland (QLD) cropping region for the 2005 and 2006 seasons. Early-season crop area estimates, made at least 4 months before harvest, produced high accuracy at pixel and regional scales, with percent errors of -8.6% and -26% for the 2005 and 2006 seasons, respectively. The HANTS approach showed high accuracy in discriminating among crops at pixel and regional scale. The errors for specific area estimates for wheat, barley and chickpea were 9.9%, -5.2% and 10.9% (for 2005) and -2.8%, -78% and 64% (for 2006), respectively. Area estimates of total winter crop, wheat, barley and chickpea gave coefficient of determination (R²) values of 0.92, 0.89, 0.82 and 0.52 when compared against actual shire-scale data. A notably high coefficient of determination (0.87) was achieved for total winter crop area estimates in August across all shires for the 2006 season. Furthermore, the HANTS approach showed high accuracy in discriminating cropping area from non-cropping area and highlighted the need for accurate and up-to-date land use maps. Extrapolating these approaches to estimate total and specific winter crop area well before flowering showed good utility across larger areas and seasons. Hence, it is envisaged that this technology might be transferable to different regions across Australia.
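As an illustration of the kind of harmonic fitting that HANTS applies to multi-temporal EVI data, the sketch below fits a mean plus sine/cosine harmonics to a 16-day composite EVI series by least squares. The number of harmonics, the composite interval and the synthetic values are assumptions made for illustration only; the full HANTS algorithm additionally rejects outliers (e.g. cloud-contaminated observations) in an iterative loop, which is omitted here.

```python
# Illustrative sketch only: a least-squares harmonic fit to a MODIS EVI time
# series, in the spirit of HANTS. Harmonic count, 16-day interval and the
# synthetic data are assumptions, not the authors' implementation.
import numpy as np

def fit_harmonics(t, evi, n_harmonics=2, period=365.0):
    """Fit a mean plus n_harmonics sine/cosine pairs to an EVI series."""
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        cols.append(np.cos(2 * np.pi * k * t / period))
        cols.append(np.sin(2 * np.pi * k * t / period))
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, evi, rcond=None)
    return coeffs, A @ coeffs  # coefficients and the smoothed, reconstructed curve

# Example: 16-day composites over one year (synthetic EVI values)
t = np.arange(0, 368, 16, dtype=float)
evi = 0.25 + 0.2 * np.sin(2 * np.pi * (t - 120) / 365) + np.random.normal(0, 0.02, t.size)
coeffs, smoothed = fit_harmonics(t, evi)
print(np.round(coeffs, 3))
```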
Abstract:
The root-lesion nematode, Pratylenchus thornei, can reduce wheat yields by >50%. Although this nematode has a broad host range, crop rotation can be an effective tool for its management if the host status of crops and cultivars is known. The summer crops grown in the northern grain region of Australia are poorly characterised for their resistance to P. thornei and for their role in crop sequencing to improve wheat yields. In a 4-year field experiment, we prepared plots with high or low populations of P. thornei by growing susceptible wheat or partially resistant canaryseed (Phalaris canariensis); after an 11-month, weed-free fallow, several cultivars of eight summer crops were grown. Following another 15-month, weed-free fallow, the P. thornei-intolerant wheat cv. Strzelecki was grown. Populations of P. thornei were determined to 150 cm soil depth throughout the experiment. When two partially resistant crops were grown in succession, e.g. canaryseed followed by panicum (Setaria italica), P. thornei populations were <739/kg soil and subsequent wheat yields were 3245 kg/ha. In contrast, after two susceptible crops, e.g. wheat followed by soybean, P. thornei populations were 10,850/kg soil and subsequent wheat yields were just 1383 kg/ha. Regression analysis showed a linear, negative response of wheat biomass and grain yield to increasing P. thornei populations, with predicted losses of 77% for biomass and 62% for grain yield. The best predictor of wheat yield loss was the P. thornei population at 0-90 cm soil depth. Crop rotation can be used to reduce P. thornei populations and increase wheat yield, with the greatest gains made after two partially resistant crops grown sequentially.
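A minimal sketch of the kind of regression described above, assuming invented data points: wheat grain yield is regressed against the pre-plant P. thornei population, and a percentage yield loss is read off the fitted line. Only the form of the calculation reflects the abstract; the slopes and losses reported there come from the field experiment itself.

```python
# Hedged sketch: linear regression of wheat grain yield on P. thornei
# population, with predicted percentage loss across the observed range.
# The data points below are invented for illustration.
import numpy as np
from scipy import stats

pt_population = np.array([500, 1500, 3000, 5000, 8000, 11000])   # P. thornei/kg soil (assumed)
grain_yield   = np.array([3200, 3000, 2650, 2300, 1800, 1400])   # kg/ha (assumed)

fit = stats.linregress(pt_population, grain_yield)
yield_at_low  = fit.intercept + fit.slope * pt_population.min()
yield_at_high = fit.intercept + fit.slope * pt_population.max()
predicted_loss = 100 * (yield_at_low - yield_at_high) / yield_at_low

print(f"slope = {fit.slope:.3f} kg/ha per nematode/kg soil, R^2 = {fit.rvalue**2:.2f}")
print(f"predicted grain yield loss across the observed range: {predicted_loss:.0f}%")
```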
Abstract:
Limitations in quality bedding material have resulted in a growing need to re-use litter during broiler farming in some countries, which can be of concern from a food-safety perspective. The aim of this study was to compare Campylobacter levels in ceca and litter across three litter treatments under commercial farming conditions. The litter treatments were (a) the use of new litter after each farming cycle; (b) an Australian partial litter re-use practice; and (c) a full litter re-use practice. The study was carried out on two farms over two years (Farm 1 from 2009–2010 and Farm 2 from 2010–2011), across three sheds (35,000 to 40,000 chickens/shed) on each farm, adopting the three litter treatments across six commercial cycles. A random sampling design was adopted to test litter and ceca for Campylobacter and Escherichia coli prior to commercial first thin-out and final pick-up. Campylobacter levels varied little across litter practices and farming cycles on each farm and were in the range of log 8.0–9.0 CFU/g in ceca and log 4.0–6.0 MPN/g in litter. Similarly, E. coli levels in ceca were ~log 7.0 CFU/g. At first thin-out and final pick-up, the statistical analysis for both litter and ceca showed that the three-way interaction (treatments by farms by times) was highly significant (P < 0.01), indicating that the patterns of Campylobacter emergence/presence across time varied between farms, cycles and pick-ups. The emergence and levels of both organisms were not influenced by litter treatments across the six farming cycles on both farms. Either C. jejuni or C. coli could be the dominant species across litter and ceca, and this could not be attributed to specific litter treatments. Irrespective of the litter treatment in place, cycle 2 on Farm 2 remained Campylobacter-free. These outcomes suggest that litter treatments did not directly influence the time of emergence or the levels of Campylobacter and E. coli during commercial farming.
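A hedged sketch of how a three-way interaction (treatment by farm by time) on log-transformed counts could be tested, along the lines described above. The column names, cell sizes and synthetic values are assumptions for illustration, not the study's actual data or model.

```python
# Illustrative sketch: fit a linear model with a treatment x farm x time
# interaction on log counts and inspect the ANOVA table. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
design = pd.MultiIndex.from_product(
    [["new", "partial_reuse", "full_reuse"], ["farm1", "farm2"], ["thin_out", "pick_up"]],
    names=["treatment", "farm", "time"],
).to_frame(index=False)
df = design.loc[design.index.repeat(10)].reset_index(drop=True)   # 10 samples per cell (assumed)
df["log_count"] = 8.5 + rng.normal(0, 0.4, len(df))               # log CFU/g, synthetic

model = smf.ols("log_count ~ C(treatment) * C(farm) * C(time)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # the C(treatment):C(farm):C(time) row tests the three-way interaction
```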