4 results for MacRitchie, Finlay
in CentAUR: Central Archive University of Reading - UK
Abstract:
Maize silage nutritive quality is routinely determined by near infrared reflectance spectroscopy (NIRS). However, little is known about the impact of sample preparation on the accuracy of the calibration to predict biological traits. A sample population of 48 maize silages representing a wide range of physiological maturities was used in a study to determine the impact of different sample preparation procedures (i.e., drying regimes; the presence or absence of residual moisture; the degree of particle comminution) on resultant NIR prediction statistics. All silages were scanned using a total of 12 combinations of sample pre-treatments. Each sample preparation combination was subjected to three multivariate regression techniques to give a total of 36 predictions per biological trait. Increased sample preparation, relative to scanning the unprocessed whole plant (WP) material, always resulted in a numerical minimisation of model statistics. However, the ability of each of the treatments to significantly minimise the model statistics differed. Particle comminution was the most important factor, oven-drying regime was intermediate, and residual moisture presence was the least important. Models to predict various biological parameters of maize silage will be improved if material is subjected to a high degree of particle comminution (i.e., having been passed through a 1 mm screen) and developed on plant material previously dried at 60 degrees C. The extra effort in terms of time and cost required to remove sample residual moisture cannot be justified.
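The factorial structure described above (12 pre-treatment combinations, each modelled with 3 regression techniques, giving 36 predictions per trait) can be sketched with a simple enumeration. The factor levels below are illustrative assumptions: the abstract gives the totals but not the exact levels of each factor, and does not name the three regression techniques.

```python
# Sketch of the factorial design implied by the abstract.
# Factor levels are assumed for illustration (3 x 2 x 2 = 12
# pre-treatments); the three regression techniques are unnamed
# in the abstract, so placeholders are used.
from itertools import product

drying_regimes = ["freeze-dried", "oven 60C", "oven 100C"]  # assumed 3 levels
residual_moisture = ["present", "removed"]                  # 2 levels
comminution = ["coarse", "1 mm screen"]                     # assumed 2 levels
regression_models = ["model A", "model B", "model C"]       # 3 unnamed techniques

pretreatments = list(product(drying_regimes, residual_moisture, comminution))
predictions = list(product(pretreatments, regression_models))

print(len(pretreatments))  # 12 pre-treatment combinations
print(len(predictions))    # 36 predictions per biological trait
```

Any combination of levels that multiplies out to 12 pre-treatments would reproduce the study's totals; the point is only that each of the 12 scanned combinations is modelled three times.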
Abstract:
Temperate-zone crops require a period of winter chilling to terminate dormancy and ensure adequate bud break the following spring. The exact chilling requirement of blackcurrant (Ribes nigrum), a commercially important crop in northern Europe, is relatively unknown. Chill unit models have been successfully utilized to determine the optimum chilling temperature of a range of crops, with one chill unit equating to 1 h exposure to the optimum temperature for chill satisfaction. Two-year-old R. nigrum plants of the cultivars 'Ben Gairn', 'Ben Hope' and 'Ben Tirran' were exposed to temperatures of -10.1 degrees C, -3.4 degrees C, 0.1 degrees C, 1.5 degrees C, 2.1 degrees C, 3.4 degrees C or 8.9 degrees C (+/- 0.7 degrees C) for durations of 0, 2, 4, 6, 8 or 10 weeks, and multiple regression analyses were used to determine the optimum temperature for chill satisfaction.
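The chill-unit definition above (one chill unit per hour spent at the optimum chilling temperature) can be sketched as a simple hourly tally. This is a minimal sketch, assuming a binary chill-unit model with a hypothetical optimum temperature and tolerance band; the abstract does not report the optimum found for R. nigrum, so the numbers below are illustrative, not the study's results.

```python
# Minimal sketch of chill-unit accumulation: 1 chill unit per hour
# spent within a band around the optimum chilling temperature.
# OPTIMUM_C and TOLERANCE_C are hypothetical values for illustration,
# not temperatures reported by the study.

OPTIMUM_C = 2.0    # assumed optimum chilling temperature (degrees C)
TOLERANCE_C = 1.5  # assumed half-width of the band that still chills

def chill_units(hourly_temps_c):
    """Count chill units: one per hourly reading inside the chilling band."""
    return sum(1 for t in hourly_temps_c
               if abs(t - OPTIMUM_C) <= TOLERANCE_C)

# Example: six hourly readings drawn from the treatment temperatures;
# four fall inside the assumed band of 0.5-3.5 degrees C.
readings = [0.9, 2.1, 3.4, 8.9, -3.4, 1.5]
print(chill_units(readings))  # 4
```

Real chill-unit models (e.g. weighted-unit formulations) assign fractional or negative units at sub- and supra-optimal temperatures; the binary version here just mirrors the one-unit-per-hour definition given in the abstract.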
Abstract:
There is concern that modern cultivars and/or agronomic practices have resulted in reduced concentrations of mineral elements essential to human nutrition in edible crops. Increased yields are often associated with reduced concentrations of mineral elements in produce, and a number of recent studies have indicated that, when grown under identical conditions, the concentrations of several mineral elements are lower in genotypes yielding more grain or shoot biomass than in older, lower-yielding genotypes. Potato is a significant crop, grown worldwide, yet few studies have investigated whether increasing yields, through agronomy or breeding, affects the concentrations of mineral elements in tubers. This article examines the hypothesis that increasing yields, either by the application of mineral fertilizers and/or by growing higher-yielding varieties, leads to decreased concentrations of mineral elements in tubers. It reports that the application of fertilizers influences tuber elemental composition in a complex manner, presumably as a consequence of soil chemistry and of interactions between mineral elements within the plant; that considerable variation exists between potato genotypes in the concentrations of mineral elements in their tubers; and that, as in other crops, higher-yielding genotypes occasionally have lower concentrations of some mineral elements in their edible tissues than lower-yielding genotypes.