673 results for cereal
Abstract:
The production of adequate agricultural outputs to support the growing human population places great demands on agriculture, especially in light of ever-greater restrictions on input resources. Sorghum is a drought-adapted cereal capable of reliable production where other cereals fail, and thus represents a good candidate to address food security as agricultural inputs of water and arable land grow scarce. A long-standing issue with sorghum grain is that it has an inherently lower digestibility. Here we show that a low-frequency allele type in the starch metabolic gene, pullulanase, is associated with increased digestibility, regardless of genotypic background. We also provide evidence that the beneficial allele type is not associated with deleterious pleiotropic effects in the modern field environment. We argue that increasing the digestibility of an adapted crop is a viable way forward towards addressing food security while maximizing water and land-use efficiency.
Abstract:
Sorghum is a food and feed cereal crop adapted to heat and drought and a staple for 500 million of the world’s poorest people. Its small diploid genome and phenotypic diversity make it an ideal C4 grass model as a complement to C3 rice. Here we present high-coverage (16–45×) resequenced genomes of 44 sorghum lines representing the primary gene pool and spanning dimensions of geographic origin, end-use and taxonomic group. We also report the first resequenced genome of S. propinquum, identifying 8 M high-quality SNPs, 1.9 M indels and specific gene loss and gain events in S. bicolor. We observe strong racial structure and a complex domestication history involving at least two distinct domestication events. These assembled genomes enable the leveraging of existing cereal functional genomics data against the novel diversity available in sorghum, providing an unmatched resource for the genetic improvement of sorghum and other grass species.
Abstract:
The effect of a change of tillage and crop residue management practice on the chemical and microbiological properties of a cereal-producing red duplex soil was investigated by superimposing each of three management practices (CC: conventional cultivation, stubble burnt, crop conventionally sown; DD: direct-drilling, stubble retained, no cultivation, crop direct-drilled; SI: stubble incorporated with a single cultivation, crop conventionally sown), for a 3-year period on plots previously managed with each of the same three practices for 14 years. A change from DD to CC or SI practice resulted in a significant decline, in the top 0-5 cm of soil, in organic C, total N, electrical conductivity, NH4-N, NO3-N, soil moisture holding capacity, microbial biomass and CO2 respiration, as well as a decline in the microbial quotient (the ratio of microbial biomass C to organic C; P <0.05). In contrast, a change from SI to DD or CC practice, or a change from CC to DD or SI practice, had only negligible impact on soil chemical properties (P >0.05). However, there was a significant increase in microbial biomass and the microbial quotient in the top 0-5 cm of soil following the change from CC to DD or SI practice and with the change from SI to DD practice (P <0.05). Analysis of ester-linked fatty acid methyl esters (EL-FAMEs) extracted from the 0- to 5-cm and 5- to 10-cm layers of the soils of the various treatments detected changes in the FAME profiles following a change in tillage practice. A change from DD practice to SI or CC practice was associated with a significant decline in the ratio of fungal to bacterial fatty acids in the 0- to 5-cm soil (P <0.05). The results show that a change in tillage practice, particularly the cultivation of a previously minimum-tilled (direct-drilled) soil, will result in significant changes in soil chemical and microbiological properties within a 3-year period.
They also show that soil microbiological properties are sensitive indicators of a change in tillage practice.
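The microbial quotient used in this abstract is simply the ratio of microbial biomass C to total organic C. A minimal sketch of the calculation, using hypothetical 0-5 cm topsoil values (the numbers are illustrative, not data from the study):

```python
def microbial_quotient(microbial_biomass_c, organic_c):
    """Ratio of microbial biomass C to total organic C.

    Both arguments must be in the same units (e.g. mg C/kg soil).
    """
    return microbial_biomass_c / organic_c

# Hypothetical 0-5 cm values (mg C/kg soil), chosen only to illustrate
# the kind of decline reported after cultivating a direct-drilled soil.
dd = microbial_quotient(420.0, 15000.0)  # direct-drilled (DD)
cc = microbial_quotient(250.0, 13000.0)  # conventionally cultivated (CC)

print(f"DD: {dd:.3f}, CC: {cc:.3f}")  # DD: 0.028, CC: 0.019
```

A lower quotient under CC than DD would mirror the direction of change reported above; the quotient is useful precisely because it normalises microbial biomass by the total organic C pool.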
Abstract:
This is the first study to investigate alternative fertilisation strategies to increase cereal production while reducing greenhouse gas emissions from the most common soil type in subtropical regions. The results of this research will contribute to defining future farming practices to achieve global food security and mitigate climate change. The study established that introducing legumes in cropping systems is the most agronomically viable and environmentally sustainable fertilisation strategy. Importantly, this strategy can be widely adopted in subtropical regions since it is economically accessible, requires little know-how transfer and technology investment, and can be profitable in both low- and high-input cropping systems.
Abstract:
The influence of barley and oat grain supplements on hay dry matter intake (DMI), carcass components gain and meat quality in lambs fed a low quality basal diet was examined. Thirty-five crossbred wether lambs (9 months of age) were divided into four groups. After adaptation to a basal diet of 85% oat hay and 15% lucerne hay for one week, an initial group of 11 was slaughtered. The weights of carcass components and digesta-free empty body weight (EBW) of this group were used to estimate the weight of carcass components of the other three experimental groups at the start of the experiment. The remaining three groups were randomly assigned to pens and fed ad libitum the basal diet alone (basal), basal with 300 g air dry barley grain (barley), or basal with 300 g air dry oat grain (oat). Supplements were fed twice weekly (i.e., 900 g on Tuesday and 1200 g on Friday). After 13 weeks of feeding, animals were slaughtered and, at 24 h post-mortem, meat quality and subcutaneous fat colour were measured. Samples of longissimus muscle were collected for determination of sarcomere length and meat tenderness. Hay DMI was reduced (P<0.01) by both barley and oat supplements. Lambs fed barley or oat had a moderately higher digestibility of DM, and a higher intake of CP (P<0.05) and ME (P<0.01), than basal lambs. Final live weight of barley and oat lambs was higher (P<0.05) than basal, but this was not reflected in EBW or hot carcass weight. Lambs fed barley or oat had increases in protein (P<0.01) and water (P<0.001) in the carcass, but fat gain was not changed (P>0.05). There were no differences in eye muscle area or fat depth (total muscle and adipose tissue depth at 12th rib, 110 mm from midline; GR) among groups. The increased levels of protein and water components in the carcass of barley and oat fed lambs, associated with improved muscle production, were small and did not alter (P>0.05) any of the carcass/meat quality attributes compared to lambs fed a low quality forage diet.
Feeding barley or oat grain at 0.9–1% of live weight daily to lambs consuming poor quality hay may not substantially improve carcass quality, but may be useful in maintaining body condition of lambs through the dry season for slaughter out of season.
Abstract:
The intent of this study was to design, document and implement a Quality Management System (QMS) into a laboratory that incorporated both research and development (R&D) and routine analytical activities. In addition, it was necessary for the QMS to be easily and efficiently maintained to: (a) provide documented evidence that would validate the system's compliance with a certifiable standard, (b) fit the purpose of the laboratory, (c) accommodate prevailing government policies and standards, and (d) promote positive outcomes for the laboratory through documentation and verification of the procedures and methodologies implemented. Initially, a matrix was developed that documented the standards' requirements and the necessary steps to be made to meet those requirements. The matrix provided a check mechanism on the progression of the system's development. In addition, it was later utilised in the Quality Manual as a reference tool for the location of full procedures documented elsewhere in the system. The necessary documentation to build and monitor the system consisted of a series of manuals along with forms that provided auditable evidence of the workings of the QMS. Quality Management (QM), in one form or another, has been in existence since the early 1900s. However, the question still remains: is it a good thing or just a bugbear? Many of the older style systems failed because they were designed by non-users, fiercely regulatory, restrictive and generally deemed to be an imposition. It is now considered important to foster a sense of ownership of the system by the people who use the system. The system's design must be tailored to best fit the purpose of the operations of the facility if maximum benefits to the organisation are to be gained.
Abstract:
Cultivation and cropping of soils result in a decline in soil organic carbon and soil nitrogen, and can lead to reduced crop yields. The CENTURY model was used to simulate the effects of continuous cultivation and cereal cropping on total soil organic matter (C and N), carbon pools, nitrogen mineralisation, and crop yield from 6 locations in southern Queensland. The model was calibrated for each replicate from the original datasets, allowing comparisons for each replicate rather than site averages. The CENTURY model was able to satisfactorily predict the impact of long-term cultivation and cereal cropping on total organic carbon, but was less successful in simulating the different fractions and nitrogen mineralisation. At first, the model over-predicted the initial (pre-cropping) soil carbon and nitrogen concentration of the sites. To account for the unique shrinking and swelling characteristics of the Vertosol soils, the default annual decomposition rates of the slow and passive carbon pools were doubled, after which the model accurately predicted initial conditions. The ability of the model to predict carbon pool fractions varied, demonstrating the difficulty inherent in predicting the size of these conceptual pools. The strength of the model lies in its ability to closely predict the starting soil organic matter conditions, and to predict the impact of clearing, cultivation, fertiliser application, and continuous cropping on total soil carbon and nitrogen.
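The rate-doubling adjustment described above can be illustrated with a simple first-order decay sketch; CENTURY itself is far more elaborate, and the pool size and rate constant below are assumed values for illustration, not CENTURY defaults:

```python
import math

def pool_carbon(c0, k, years):
    """First-order pool decay: C(t) = C0 * exp(-k * t)."""
    return c0 * math.exp(-k * years)

c0 = 30.0                     # assumed slow-pool size, t C/ha
k_default = 0.02              # assumed default annual decomposition rate, 1/yr
k_vertosol = 2.0 * k_default  # doubled, as done for the shrink-swell Vertosols

# Remaining pool C after 20 years of cropping under each rate
print(round(pool_carbon(c0, k_default, 20), 2))   # 20.11
print(round(pool_carbon(c0, k_vertosol, 20), 2))  # 13.48
```

Doubling the rate constant halves the pool's turnover time, which is why the adjusted model shed the excess carbon it had been predicting for these soils.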
Abstract:
Displacement of the fungus Fusarium pseudograminearum from stubble by antagonists is a potential means of biocontrol of crown rot in cereals. The role of carbon and nitrogen nutrition in interactions between the pathogen and the antagonists Fusarium equiseti, Fusarium nygamai, Trichoderma harzianum and the non-antagonistic straw fungus Alternaria infectoria was investigated. Sole carbon source utilization patterns on Biolog plates were similar among the three Fusarium species, suggesting a possible role for competition. However, carbon niche overlap was unlikely to be important in antagonism by T. harzianum. Straw medium supplemented with sugars generally reduced the inhibitory effect of antagonists on growth of F. pseudograminearum in dual culture, indicating that availability of simple carbon sources does not limit antagonism. Adding nitrogen as urea, nitrate or ammonium to straw medium had little effect on antagonism by F. equiseti and F. nygamai, but ammonium addition removed the inhibitory effect of T. harzianum on growth of F. pseudograminearum. Displacement of F. pseudograminearum from straw by all fungi in a Petri dish assay was greater when urea or nitrate was used as a nitrogen source than with ammonium. All forms of nitrogen significantly increased displacement of F. pseudograminearum from straw under simulated field conditions when straws were either inoculated with T. harzianum or exposed to resident soil microbes. However, in 2 out of 3 experiments urea and nitrate were more effective than ammonium. The results suggest that availability of nitrogen, but not carbon, is limiting the activities of antagonists of F. pseudograminearum in straw, and the way nitrogen is applied can influence the rate of displacement and mortality of the pathogen in host residues.
Abstract:
The project will provide enough data for a reliable and robust NIRS (near-infrared spectroscopy) calibration. It will more fully develop the in vitro method to enable less costly assessment of grains in the future. It will also provide a reliable assessment of DE (digestible energy), which is the most expensive component of pig feed.
Abstract:
Disadvantages of invariable cereal cropping, concern about nutrient leaching, and the price of nitrogen (N) fertilizer have all increased during recent decades. An undersown crop, which grows together with a main crop and continues after harvest, could mitigate all of these problems. The aim of this study was to develop undersowing for Finnish conditions, so that it suits spring cereal farming as well as possible and enhances care of soil and environment, especially where control of N is concerned. In total, 17 plant species were undersown in spring cereals during field experiments between 1991 and 1999 at four sites in South and Central Finland; after selection, eight of them were studied more thoroughly. Two legumes, one grass species and one mixture of them were included in long-term trials in order to study annually repeated undersowing. Further, simultaneous broadcasting of seeds instead of separate undersowing was studied. Grain yield response, the capacity of the undersown crop to absorb soil N or fix N from the atmosphere, and the release of N were of greatest interest. Seeding rates of undersown crops and N fertilization rates during annually repeated undersowing were also studied. Italian ryegrass (Lolium multiflorum Lam., IR) absorbed soil nitrate N (NO3-N) most efficiently in autumn, and timothy (Phleum pratense L.) in spring. The capacity of other grass species to absorb N was low, or insufficient considering the negative effect on grain yield. Red clover (Trifolium pratense L.) and white clover (Trifolium repens L.) were well suited to annually repeated undersowing, supplying fixed N for cereals without a markedly increased risk of N leaching. The autumn-oriented growth rhythm of the studied legumes was optimal for undersowing, whereas the growth rhythm of grasses was less suited and varied between species. A model of an adaptive undersowing system was outlined in order to emphasize allocation of measures according to needs.
After defining the goal of undersowing, many decisions must be made. When reducing N leaching is the primary goal, a mixture of IR and timothy is advantageous. Clovers are suited to replacing N fertilization, as the positive residual effect is greater than the negative effect caused by competition. A mixture of a legume and a non-legume is a good choice when increased diversity is the main target. Seeding rate is an efficient means of adjusting competition and N effects. Broadcasting with soil-covering equipment can be used to establish an undersown crop. In addition, the timing and method of cover crop termination have an important role in the outcome. Continuous observation of the system is needed, since, for instance, conditions significantly affect growth of the undersown crop and, on the other hand, N release from crop residues may increase in the long run.
Abstract:
Variation in the reaction of cereal cultivars to crown rot caused by Fusarium spp., in particular Fusarium pseudograminearum, was identified over 50 years ago; however, the parameters and pathways of infection by F. pseudograminearum remain poorly understood. Seedlings of wheat, barley and oat genotypes that differ in susceptibility to crown rot were inoculated with a mixture of F. pseudograminearum isolates. Seedlings were harvested from 7 to 42 days after inoculation and expanded plant parts were rated for severity of visible disease symptoms. Individual leaf sheaths were placed onto nutrient media and fungal colonies emerging from the leaf sheaths were counted to estimate the degree of fungal spread within the host tissue. Significant differences in both the timing and the severity of disease symptoms were observed in the leaf sheath tissues of different host genotypes. Across all genotypes and plant parts examined, the development of visible symptoms closely correlated with the spread of the fungus into that tissue. The degree of infection of the coleoptile and sub-crown internode varied between genotypes, but was unrelated to the putative resistance of the host. In contrast, leaf sheath tissues of the susceptible barley cv. Tallon and bread wheat cv. Puseas scored higher disease ratings and consistently showed faster, earlier spread of the fungus into younger tissues than infections of the oat cv. Cleanleaf or the wheat lines 2-49 and CPI 133814. While initial infections usually spread upwards from near the base of the first leaf sheath, the pathogen did not appear to invade younger leaf sheaths only from the base, but rather spread laterally across from older leaf sheaths into younger, subtended leaf sheaths, particularly as disease progressed.
Early in the infection of each leaf sheath, disease symptoms in the partially resistant genotypes were less severe than in susceptible genotypes; however, as infected leaf sheaths aged, differences between genotypes lessened as disease symptoms approached maximum values. Hence, while visual scoring of disease symptoms on leaf sheaths is a reliable comparative measure of the degree of fungal infection, differences between genotypes in the development of disease symptoms are more reliably assessed using the most recently expanded leaf sheaths.
Abstract:
More than 1200 wheat and 120 barley experiments conducted in Australia to examine yield responses to applied nitrogen (N) fertiliser are contained in a national database of field crops nutrient research (BFDC National Database). The yield responses are accompanied by various pre-plant soil test data to quantify plant-available N and other indicators of soil fertility status or mineralisable N. A web application (BFDC Interrogator), developed to access the database, enables construction of calibrations between relative crop yield (RY = (Y0/Ymax) × 100) and N soil test value. In this paper we report the critical soil test values for 90% RY (CV90) and the associated critical ranges (CR90, defined as the 70% confidence interval around that CV90) derived from analysis of various subsets of these winter cereal experiments. Experimental programs were conducted throughout Australia’s main grain-production regions in different eras, starting from the 1960s in Queensland through to Victoria during the 2000s. Improved management practices adopted during the period were reflected in increasing potential yields with research era, increasing from an average Ymax of 2.2 t/ha in Queensland in the 1960s and 1970s, to 3.4 t/ha in South Australia (SA) in the 1980s, to 4.3 t/ha in New South Wales (NSW) in the 1990s, and 4.2 t/ha in Victoria in the 2000s. Various sampling depths (0.1–1.2 m) and methods of quantifying available N (nitrate-N or mineral-N) from pre-planting soil samples were used and provided useful guides to the need for supplementary N. The most regionally consistent relationships were established using nitrate-N (kg/ha) in the top 0.6 m of the soil profile, with regional and seasonal variation in CV90 largely accounted for through impacts on experimental Ymax. The CV90 for nitrate-N within the top 0.6 m of the soil profile for wheat crops increased from 36 to 110 kg nitrate-N/ha as Ymax increased over the range 1 to >5 t/ha.
Apparent variation in CV90 with seasonal moisture availability was entirely consistent with impacts on experimental Ymax. Further analyses of wheat trials with available grain protein (~45% of all experiments) established that grain yield and not grain N content was the major driver of crop N demand and CV90. Subsets of data explored the impact of crop management practices such as crop rotation or fallow length on both pre-planting profile mineral-N and CV90. Analyses showed that while management practices influenced profile mineral-N at planting and the likelihood and size of yield response to applied N fertiliser, they had no significant impact on CV90. A level of risk is involved with the use of pre-plant testing to determine the need for supplementary N application in all Australian dryland systems. In southern and western regions, where crop performance is based almost entirely on in-crop rainfall, this risk is offset by the management opportunity to split N applications during crop growth in response to changing crop yield potential. In northern cropping systems, where stored soil moisture at sowing is indicative of minimum yield potential, erratic winter rainfall increases uncertainty about actual yield potential as well as reducing the opportunity for effective in-season applications.
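The calibration quantities in this abstract can be sketched numerically. The relative-yield definition is taken from the abstract itself; the Mitscherlich-type response form and its parameter below are assumptions for illustration only, not the fitting approach used by BFDC Interrogator:

```python
import math

def relative_yield(y0, ymax):
    """Relative yield (%) as defined in the paper: (Y0/Ymax) x 100."""
    return 100.0 * y0 / ymax

def cv90_mitscherlich(c):
    """Soil test value at which an assumed Mitscherlich calibration
    RY = 100 * (1 - exp(-c * x)) reaches 90% relative yield."""
    return -math.log(0.10) / c  # solve 0.90 = 1 - exp(-c * x) for x

# Hypothetical numbers: unfertilised yield 3.8 t/ha vs Ymax 4.2 t/ha,
# and an assumed curvature parameter c = 0.03 per kg nitrate-N/ha.
print(round(relative_yield(3.8, 4.2), 1))  # 90.5  (%)
print(round(cv90_mitscherlich(0.03), 1))   # 76.8  (kg nitrate-N/ha)
```

The sketch shows why CV90 scales with the steepness of the fitted response: a flatter calibration (smaller c) pushes the 90% RY crossing point to a higher soil test value, consistent with the reported rise in CV90 as Ymax increases.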