46 results for whole cereal grains
Abstract:
More than 1200 wheat and 120 barley experiments conducted in Australia to examine yield responses to applied nitrogen (N) fertiliser are contained in a national database of field crops nutrient research (BFDC National Database). The yield responses are accompanied by various pre-plant soil test data used to quantify plant-available N and other indicators of soil fertility status or mineralisable N. A web application (BFDC Interrogator), developed to access the database, enables construction of calibrations between relative crop yield (RY = (Y0/Ymax) × 100) and N soil test value. In this paper we report the critical soil test values for 90% RY (CV90) and the associated critical ranges (CR90, defined as the 70% confidence interval around that CV90) derived from analysis of various subsets of these winter cereal experiments. Experimental programs were conducted throughout Australia's main grain-production regions in different eras, starting in Queensland in the 1960s and extending through to Victoria in the 2000s. Improved management practices adopted over this period were reflected in rising potential yields with research era: average Ymax increased from 2.2 t/ha in Queensland in the 1960s and 1970s, to 3.4 t/ha in South Australia (SA) in the 1980s, 4.3 t/ha in New South Wales (NSW) in the 1990s, and 4.2 t/ha in Victoria in the 2000s. Various sampling depths (0.1–1.2 m) and methods of quantifying available N (nitrate-N or mineral-N) from pre-planting soil samples were used, and these provided useful guides to the need for supplementary N. The most regionally consistent relationships were established using nitrate-N (kg/ha) in the top 0.6 m of the soil profile, with regional and seasonal variation in CV90 largely accounted for through impacts on experimental Ymax. The CV90 for nitrate-N within the top 0.6 m of the soil profile for wheat crops increased from 36 to 110 kg nitrate-N/ha as Ymax increased over the range 1 to >5 t/ha. Apparent variation in CV90 with seasonal moisture availability was entirely consistent with impacts on experimental Ymax. Further analyses of the wheat trials for which grain protein was available (~45% of all experiments) established that grain yield, and not grain N content, was the major driver of crop N demand and CV90. Subsets of data were used to explore the impact of crop management practices such as crop rotation or fallow length on both pre-planting profile mineral-N and CV90. These analyses showed that while management practices influenced profile mineral-N at planting and the likelihood and size of yield response to applied N fertiliser, they had no significant impact on CV90. Some risk is involved in using pre-plant testing to determine the need for supplementary N application in all Australian dryland systems. In southern and western regions, where crop performance depends almost entirely on in-crop rainfall, this risk is offset by the management opportunity to split N applications during crop growth in response to changing crop yield potential. In northern cropping systems, where stored soil moisture at sowing is indicative of minimum yield potential, erratic winter rainfall increases uncertainty about actual yield potential and reduces the opportunity for effective in-season applications.
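As an illustration of how a CV90 can be read off such a calibration, the sketch below fits a simple exponential (Mitscherlich-type) curve of relative yield against pre-plant nitrate-N and solves it for 90% RY. The trial values, the curve form and the fitting routine are assumptions for illustration only; the abstract does not describe the BFDC Interrogator's actual fitting procedure, and estimating CR90 would additionally require an uncertainty estimate (e.g. bootstrapping trials before refitting).

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical trial data (illustrative only): pre-plant nitrate-N in the
# top 0.6 m (kg/ha) and relative yield RY = (Y0 / Ymax) * 100 for each trial.
nitrate_n = np.array([20, 35, 50, 70, 90, 120, 160, 200], dtype=float)
rel_yield = np.array([55, 68, 78, 86, 91, 96, 98, 99], dtype=float)

# Assumed Mitscherlich-type calibration, not the database's actual model.
def calib(x, a, b):
    return a * (1.0 - np.exp(-b * x))

(a, b), _ = curve_fit(calib, nitrate_n, rel_yield, p0=(100.0, 0.02))

# CV90: soil test value at which the fitted curve reaches 90% relative yield.
cv90 = -np.log(1.0 - 90.0 / a) / b
print(f"CV90 ~ {cv90:.0f} kg nitrate-N/ha")
```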
Abstract:
Soil testing is the most widely used tool to predict the need for fertiliser phosphorus (P) application to crops. This study examined factors affecting critical soil P concentrations and their confidence intervals for wheat and barley grown in Australian soils by interrogating validated data from 1777 wheat and 150 barley field treatment series now held in the BFDC National Database. To narrow the confidence intervals associated with estimated critical P concentrations, filters for yield, crop stress, or low pH were applied. Once treatment series with low yield (<1 t/ha), severe crop stress, or pH(CaCl2) <4.3 were screened out, critical concentrations were relatively insensitive to wheat yield (>1 t/ha). There was a clear increase in critical P concentration from early trials, when full tillage was common, to those conducted in 1995–2011, a period of rapid adoption of minimum tillage. For wheat, critical Colwell-P concentrations associated with 90 or 95% of maximum yield varied among Australian Soil Classification (ASC) Orders and Sub-orders: Calcarosol, Chromosol, Kandosol, Sodosol, Tenosol and Vertosol. Critical Colwell-P concentrations at 90% of maximum relative yield ranged with soil type (ASC Orders and Sub-orders) from 15 mg/kg (Grey Vertosol) to 47 mg/kg (Supracalcic Calcarosols), with other soils having values in the range 19–27 mg/kg. Distinctive differences in critical P concentrations were evident among Sub-orders of Calcarosols, Chromosols, Sodosols, Tenosols, and Vertosols, possibly due to differences in soil properties related to P sorption. However, insufficient data were available to develop a relationship between P buffering index (PBI) and critical P concentration. In general, there was no evidence that critical concentrations for barley would differ from those for wheat on the same soils. Significant knowledge gaps limiting the relevance and reliability of soil P testing for winter cereals were: a lack of data for oats; the paucity of treatment series reflecting current cropping practices, especially minimum tillage; and inadequate metadata on soil texture, pH, growing-season rainfall, gravel content, and PBI. The critical concentrations determined illustrate the importance of recent experimental data and of soil type, but also provide examples of interrogation pathways into the BFDC National Database for extracting locally relevant critical P concentrations to guide P fertiliser decision-making in wheat and barley.
Abstract:
An understanding of the processes regulating wheat floret and grain number at higher temperatures is required to better exploit genetic variation. In this study we tested the hypothesis that, at higher temperatures, a reduction in floret fertility is associated with a decrease in soluble sugars, and that this response is exacerbated in genotypes low in water-soluble carbohydrates (WSC). Four recombinant inbred lines contrasting for stem WSC were grown at 20/10°C and an 11-h photoperiod until terminal spikelet, and then in a factorial combination of 20/10°C or 28/14°C with an 11-h or 16-h photoperiod until anthesis. Across environments, High WSC lines had more grains per spike, associated with more florets per spike. The number of fertile florets was associated with spike biomass at booting and, by extension, with glucose amount, both of which were higher in High WSC lines. At booting, High WSC lines had fixed more 13C and showed higher expression of genes involved in photosynthesis and sucrose transport, and lower expression of genes involved in sucrose degradation, than Low WSC lines. At higher temperature, the intrinsic rate of floret development before booting was slower in High WSC lines. Grain set declined with increasing intrinsic rate of floret development before booting, with an advantage for High WSC lines at 28/14°C and 16 h. Genotypic and environmental effects on floret fertility and grain set were summarised in a model.
Abstract:
A high proportion of the Australian and New Zealand dairy industry is based on a relatively simple, low-input, low-cost pasture feedbase. These factors enable this type of production system to remain internationally competitive. However, a key limitation of pasture-based dairy systems is the periodic imbalance between herd intake requirements and pasture DM production caused by strong seasonality and high inter-annual variation in feed supply. This disparity can be moderated to some degree through strategic management of the herd, by altering calving dates and stocking rates, and of the feedbase, by conserving excess forage and irrigating to flatten seasonal forage availability. Australasian dairy systems are also facing emerging market and environmental challenges, including increased competition for land and water resources, decreasing terms of trade, a changing and variable climate, and an increasing environmental focus that requires improved nutrient- and water-use efficiency and lower greenhouse gas emissions. The integration of complementary forages has long been viewed as a means to manipulate the home-grown feed supply, improve the nutritive value and DM intake of the diet, and increase the efficiency of inputs utilised. Only recently has integrating complementary forages at the whole-farm system level received the significant attention and investment required to examine its potential benefit. Recent whole-of-farm research undertaken in both Australia and New Zealand has highlighted the importance of understanding the challenges of the current feedbase and the level of complementarity between forage types required to improve profit, manage risk and/or mitigate adverse outcomes. This paper reviews the most recent systems-level research into complementary forages, discusses approaches to modelling their integration at the whole-farm level, and highlights the potential of complementary forages to address the major challenges currently facing pasture-based dairy systems.
Abstract:
Alternative sources of N are required to bolster subtropical cereal production without increasing N2O emissions from these agro-ecosystems. The reintroduction of legumes into cereal cropping systems is a possible strategy to reduce synthetic N inputs, but elevated N2O losses have sometimes been observed after the incorporation of legume residues. The magnitude of these losses is, however, highly dependent on local conditions, and very few data are available for subtropical regions. The aim of this study was to assess whether, under subtropical conditions, the N mineralised from legume residues can substantially decrease the synthetic N input required by the subsequent cereal crop and reduce overall N2O emissions during the cereal cropping phase. Using a fully automated measuring system, N2O emissions were monitored in a cereal crop (sorghum) following a legume pasture and compared with the same crop in rotation with a grass pasture. Each crop rotation included a nil and a fertilised treatment to assess the N availability of the residues. The incorporation of legumes provided enough readily available N to support crop development effectively, but the low amount of labile C left by these residues is likely to have limited denitrification and therefore N2O emissions. As a result, N2O emission intensities (kg N2O-N yield⁻¹ ha⁻¹) were considerably lower in the legume histories than in the grass history. Overall, these findings indicate that the C supplied by the crop residue can be more important than the soil NO3⁻ content in stimulating denitrification, and that introducing a legume pasture into a subtropical cereal cropping system is a sustainable practice from both environmental and agronomic perspectives.
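For context on the emission-intensity units quoted above: a yield-scaled N2O intensity is commonly computed as the cumulative N2O-N emitted over the season divided by the grain produced. The sketch below shows this arithmetic with invented numbers; the values, and the per-tonne formulation, are assumptions for illustration and are not taken from the study.

```python
# Minimal sketch of a yield-scaled N2O emission intensity calculation.
# All values are hypothetical placeholders, not results from the study.
cumulative_n2o_n = 0.35   # kg N2O-N/ha emitted over the sorghum season
grain_yield = 6.2         # t grain/ha

emission_intensity = cumulative_n2o_n / grain_yield
print(f"Emission intensity: {emission_intensity:.3f} kg N2O-N per t grain")
```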
Formulation and characterization of drug-loaded microparticles using distiller’s dried grain kafirin
Abstract:
Kafirin, a protein extracted from sorghum grain, has been formulated into microparticles and proposed for use as a delivery system because of its resistance to upper gastrointestinal digestion. However, extracting kafirin from sorghum “distiller’s dried grains with solubles” (DDGS) may be more efficient, as the carbohydrate component has already been removed by fermentation. This study investigated the properties and use of kafirin extracted from DDGS to formulate microparticles. Prednisolone, an anti-inflammatory drug that could benefit from a delayed, colon-targeted delivery system, was loaded into DDGS kafirin microparticles by phase separation using sodium chloride. Scanning electron micrographs revealed that both the empty and the prednisolone-loaded microparticles were round in shape and varied in size. Surface-binding studies indicated that prednisolone was loaded within the microparticles rather than solely bound on the surface. These findings demonstrate that DDGS kafirin can be formulated into microparticles and loaded with medication. Future studies could investigate the potential applications of DDGS kafirin microparticles as an orally administered, targeted drug-delivery system.
Abstract:
The root-lesion nematodes (RLN) Pratylenchus thornei and P. neglectus are widely distributed in Australian grain-producing regions and can reduce the yield of intolerant wheat cultivars by up to 65%, costing the industry ~AUD 123 million per year. Consequently, researchers in the northern, southern and western regions have independently developed procedures to evaluate the resistance of cereal cultivars to RLN. To compare results, each of the three laboratories phenotyped sets of 26 and 36 cereal cultivars for relative resistance/susceptibility to P. thornei and P. neglectus, respectively. The northern and southern regions also investigated the effects of planting time and experiment duration on RLN reproduction and cultivar ranking. Results show that the genetic correlation between cultivars tested using the northern and southern procedures for evaluating P. thornei resistance was 0.93. Genetic correlations between experiments using the same procedure, but with different planting times, were 0.99 for both the northern and southern procedures. The genetic correlation between cultivars tested using the northern, southern and western procedures for evaluating P. neglectus resistance ranged from 0.71 to 0.95. Genetic correlations between experiments using the same procedure but with different planting times ranged from 0.91 to 0.99. This study established that, even though experiments were conducted in different geographic locations and with different trial management practices, the diverse nematode resistance screening procedures ranked cultivars similarly. Consequently, RLN resistance data can be pooled across regions to provide national consensus ratings of cultivars.
Abstract:
Sorghum is a staple food for half a billion people and, through growth on marginal land with minimal inputs, is an important source of feed, forage and, increasingly, biofuel feedstock. Here we present information about non-cellulosic cell wall polysaccharides in a diverse set of cultivated and wild Sorghum bicolor grains. Sorghum grain contains predominantly starch (64–76%) but is relatively deficient in other polysaccharides present in wheat, oats and barley. Despite the overall low quantities, sorghum germplasm exhibited a remarkable range in polysaccharide amount and structure. Total (1,3;1,4)-β-glucan ranged from 0.06 to 0.43% (w/w), whilst internal cellotriose:cellotetraose ratios ranged from 1.8 to 2.9:1. Arabinoxylan amounts fell between 1.5 and 3.6% (w/w), and the arabinose:xylose ratio, denoting arabinoxylan structure, ranged from 0.95 to 1.35. The distribution of these and other cell wall polysaccharides varied across grain tissues, as assessed by electron microscopy. When ten genotypes were tested across five environmental sites, genotype (G) was the dominant source of variation for both (1,3;1,4)-β-glucan and arabinoxylan content (69–74%), with environment (E) responsible for 5–14%. There was a small G × E effect for both polysaccharides. This study defines the amount and spatial distribution of polysaccharides and reveals a significant genetic influence on cell wall composition in sorghum grain.
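The G, E and G × E shares quoted above come from a variance partition across genotypes and sites. As a rough illustration of how such percentages can be derived for a balanced genotype × environment trial, the sketch below computes classical two-way ANOVA sums of squares with NumPy; the simulated data, effect sizes and assumed three replicates per site are invented for illustration and do not represent the study's actual analysis.

```python
import numpy as np

# Simulated balanced dataset: glucan content (% w/w) for
# 10 genotypes x 5 sites x 3 replicate plots (replicate number assumed).
rng = np.random.default_rng(0)
n_g, n_e, n_r = 10, 5, 3
g_eff = rng.normal(0.0, 0.08, size=(n_g, 1, 1))    # genotype effects
e_eff = rng.normal(0.0, 0.02, size=(1, n_e, 1))    # environment effects
noise = rng.normal(0.0, 0.02, size=(n_g, n_e, n_r))
y = 0.25 + g_eff + e_eff + noise

grand = y.mean()
g_means = y.mean(axis=(1, 2))      # genotype means
e_means = y.mean(axis=(0, 2))      # environment (site) means
ge_means = y.mean(axis=2)          # genotype-x-environment cell means

# Two-way ANOVA sums of squares for a balanced design.
ss_g = n_e * n_r * np.sum((g_means - grand) ** 2)
ss_e = n_g * n_r * np.sum((e_means - grand) ** 2)
ss_ge = n_r * np.sum((ge_means - g_means[:, None] - e_means[None, :] + grand) ** 2)
ss_err = np.sum((y - ge_means[..., None]) ** 2)
ss_tot = ss_g + ss_e + ss_ge + ss_err

for name, ss in [("G", ss_g), ("E", ss_e), ("G x E", ss_ge), ("residual", ss_err)]:
    print(f"{name:>8}: {100 * ss / ss_tot:.1f}% of total sum of squares")
```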
Abstract:
In semi-arid subtropical areas, a number of studies of no-till (NT) farming systems have demonstrated economic, environmental and soil-quality advantages over conventional tillage (CT). However, adoption of continuous NT has contributed to the build-up of herbicide-resistant weed populations, an increased incidence of soil- and stubble-borne diseases, and stratification of nutrients and organic carbon near the soil surface. Some farmers resort to occasional strategic tillage (ST) to manage these problems, but farmers who practise strict NT are concerned that even a one-time tillage may undo the positive soil-condition benefits of NT farming systems. We reviewed the pros and cons of the use of occasional ST in NT farming systems. The impacts of occasional ST on agronomy, soil and the environment are site-specific and depend on many interacting soil, climatic and management conditions. Most studies conducted in North America and Europe suggest that introducing occasional ST into continuous NT farming systems could improve productivity and profitability in the short term; in the long term, however, the impact is negligible or may be negative. The short-term impacts on soil and the environment immediately following occasional ST include reduced protective cover, soil loss by erosion, increased runoff, loss of C and water, and reduced microbial activity, with little or no detrimental impact in the long term. A potential negative effect immediately following ST is reduced plant-available water, which may make crop sowing unreliable in variable seasons. Rainfall between the ST and sowing, or immediately after sowing, is necessary to replenish soil water lost from the seed zone. The timing of ST is therefore likely to be critical and must be balanced against optimising soil water prior to seeding. The impact of occasional ST also varies with the tillage implement used; for example, inversion tillage with a mouldboard plough has greater impacts than chisel or disc tillage. Opportunities for future research on occasional ST with the implements most commonly used in Australia's northern grains-growing region, such as tines and/or discs, are presented in the context of agronomy, soil and the environment.
Abstract:
Development of no-tillage (NT) farming has revolutionized agricultural systems by allowing growers to manage greater areas of land with reduced energy, labour and machinery inputs, while controlling erosion, improving soil health and reducing greenhouse gas emissions. However, NT farming systems have resulted in a build-up of herbicide-resistant weeds, an increased incidence of soil- and stubble-borne diseases, and enrichment of nutrients and carbon near the soil surface. Consequently, there is increased interest in the use of occasional tillage (termed strategic tillage, ST) to address such emerging constraints in otherwise-NT farming systems. Decisions about ST will depend on the specific issues present in the individual field or farm, and on the profitability and effectiveness of the available management options. This paper explores some of the issues with the implementation of ST in NT farming systems. Contrasting soil properties, the timing of the tillage and the prevailing climate exert a strong influence on the success of ST. Decisions about the timing of tillage are complex and depend on the interaction between soil water content and the purpose for which the ST is intended. The soil needs to be at the right water content before any tillage is carried out, while the objective of the ST will influence the frequency and type of tillage implement used. The use of ST in long-term NT systems will depend on factors associated with system costs and profitability, soil health and environmental impacts. For many farmers, maintaining farm profitability is a priority, so economic considerations are likely to be the primary factor dictating adoption. However, impacts on soil health and the environment, especially the risk of erosion and the loss of soil carbon, will also influence a grower's choice to adopt ST, as will the impact on soil moisture reserves in rainfed cropping systems.
Abstract:
The DAYCENT biogeochemical model was used to investigate how fertilizers coated with nitrification inhibitors and the introduction of legumes into the crop rotation can affect subtropical cereal production and N2O emissions. The model was validated using a comprehensive multi-seasonal, high-frequency dataset from two field investigations conducted on an Oxisol, the most common soil type in subtropical regions. Different N fertilizer rates were tested for each N management strategy and simulated under varying weather conditions. DAYCENT reliably predicted soil N dynamics, seasonal N2O emissions and crop production, although some discrepancies were observed in the treatments with low or no added N inputs and in the simulation of daily N2O fluxes. Simulations highlighted that the high clay content and relatively low C levels of the Oxisol analyzed in this study limit the likelihood of significant N losses via deep leaching or denitrification. The application of urea coated with a nitrification inhibitor was the most effective strategy to minimize N2O emissions; however, it did not increase yields, since the nitrification inhibitor did not substantially decrease overall N losses compared with conventional urea. Simulations indicated that replacing part of the crop N requirement with N mineralized from legume residues is the most effective strategy to reduce N2O emissions while supporting cereal productivity. The results of this study show that legumes have significant potential to enhance the sustainable and profitable intensification of subtropical cereal cropping systems in Oxisols.
Abstract:
Exotic plant pests (EPPs) threaten the production, market access and sustainability of Australian plant production systems. For the grains industry, there are over 600 identified EPPs, of which 54 are considered high priority and pose a significant threat. Despite Australia's geographical isolation and strong quarantine systems, the threat from EPPs has never been higher, given increasing levels of travel and trade, emphasising the need to improve our efforts in prevention, preparedness and surveillance for EPPs.