13 results for Crop year
in CentAUR: Central Archive University of Reading - UK
Abstract:
This study analyzed the effect of crop year and harvesting time on the fatty acid composition of cv. Picual virgin olive oil. The study was carried out during the fruit ripening period over three crop seasons. The mean fatty acid composition of Picual oils was determined: the oils contained palmitic acid (11.9%), oleic acid (79.3%), and linoleic acid (2.95%). The content of palmitic acid and saturated fatty acids decreased during fruit ripening while oleic and linoleic acids increased; the amounts of stearic and linolenic acids also decreased. The amounts of the saturated acids, palmitic and stearic, and of the polyunsaturated acids, linoleic and linolenic, depended on the time of harvest, whereas the amount of oleic acid varied with the crop year. The differences observed between crop years in both palmitic and linoleic acid may be explained by differences in temperature during oil biosynthesis, and those in oleic acid content by the amount of summer rainfall. A significant relationship was observed between the MUFA/PUFA ratio and the oxidative stability measured by the Rancimat method.
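The reported link between the MUFA/PUFA ratio and oxidative stability can be illustrated with a small calculation. This is a hypothetical sketch, assuming MUFA is represented by oleic acid and PUFA by linoleic plus linolenic acid; the abstract does not give a linolenic value, so it is omitted below.

```python
# Hypothetical illustration of the MUFA/PUFA ratio discussed above.
# Assumption: MUFA is represented by oleic acid and PUFA by linoleic
# (plus linolenic, when a value is available).

def mufa_pufa_ratio(oleic, linoleic, linolenic=0.0):
    """Ratio of monounsaturated to polyunsaturated fatty acids (% of total)."""
    return oleic / (linoleic + linolenic)

# Mean Picual values from the abstract: oleic 79.3%, linoleic 2.95%.
# The linolenic content is not reported here, so it is omitted.
print(round(mufa_pufa_ratio(79.3, 2.95), 1))  # ~26.9
```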
Abstract:
Aims: This experiment aimed to determine whether the soil application of organic fertilizers can help the establishment of cacao and whether shade alters its response to fertilizers. Study Design: The 1.6 ha experiment was conducted over a period of one crop year (between April 2007 and March 2008) at the Cocoa Research Institute of Ghana. It involved four cacao genotypes (T 79/501, PA 150, P 30 [POS] and SCA 6), three shade levels (‘light’, ‘medium’ and ‘heavy’) and two fertilizer treatments (‘no fertilizer’, and ‘140 kg/ha of cacao pod husk ash (CPHA) plus poultry manure at 1,800 kg/ha’). The experiment was designed as a split-plot with the cacao genotypes as the main plot factor and shade × fertilizer combinations as the sub-plots. Methodology: Gliricidia sepium and plantains (Musa sapientum) were planted in different arrangements to create the three temporary shade regimes for the cacao. Data were collected on temperature and relative humidity of the shade environments, initial soil nutrients, soil moisture, leaf N, P and K+ contents, survival, photosynthesis and growth of test plants. Results: The genotypes P 30 [POS] and SCA 6 showed lower stomatal conductance under non-limiting conditions. In the rainy seasons, plants under light shade had the highest CO2 assimilation rates. However, in the dry season, plants under increased shade recorded greater photosynthetic rates (P = .03). A significant shade × fertilizer interaction (P = .001) on photosynthesis in the dry season showed that heavier shade increases the benefits that young cacao gets from fertilizer application in that season. Conversely, shade should be reduced during the wet seasons to minimize light limitation to assimilation. Conclusion: Under ideal weather conditions young cacao exhibits genetic variability in stomatal conductance. Also, to optimize plant response to fertilizer application, shade must be adjusted taking the prevailing weather conditions into account.
Abstract:
This paper presents the results of (a) eight on-farm trials over a two-year period designed to test the effectiveness of leguminous cover crops in terms of increasing maize yields in Igalaland, Nigeria, and (b) a survey designed to monitor the extent of, and reasons behind, adoption of the leguminous cover crop technology in subsequent years by farmers involved, to varying degrees, in the trial programme. Particular emphasis was placed on comparing adoption of leguminous cover crops with that of new crop varieties released by a non-governmental organization in the same area since the mid 1980s. While the leguminous cover crop technology boosted maize grain yields by 127 to 136% above an untreated control yield of between 141 and 171 kg ha(-1), the adoption rate (number of farmers adopting) was only 18%. By way of contrast, new crop varieties had a highly variable benefit in terms of yield advantage over local varieties, with the best average increase being around 20%. Adoption rates for new crop varieties, assessed as both the number of farmers growing the varieties and the number of plots planted to the varieties, were 40% on average. The paper discusses some key factors influencing adoption of the leguminous cover crop technology, including seed availability. Implications of these results for a local non-governmental organization, the Diocesan Development Services, concerned with promoting the leguminous cover crop technology are also discussed.
Abstract:
Samples of whole crop wheat (WCW, n = 134) and whole crop barley (WCB, n = 16) were collected from commercial farms in the UK over a 2-year period (2003/2004 and 2004/2005). Near infrared reflectance spectroscopy (NIRS) was compared with laboratory and in vitro digestibility measures to predict digestible organic matter in the dry matter (DOMD) and metabolisable energy (ME) contents measured in vivo using sheep. Spectral models using the mean spectra of two scans were compared with those using individual spectra (duplicate spectra). Overall, NIRS accurately predicted the concentration of chemical components in whole crop cereals apart from crude protein, ammonia-nitrogen, water-soluble carbohydrates, fermentation acids and solubility values. In addition, the spectral models had higher prediction power for in vivo DOMD and ME than chemical components or in vitro digestion methods. Overall there was a benefit from the use of duplicate spectra rather than mean spectra, and this was especially so for predicting in vivo DOMD and ME, where the sample population size was smaller. The spectral models derived dealt equally well with WCW and WCB and would be of considerable practical value, allowing rapid determination of the nutritive value of these forages before their use in diets of productive animals. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
A total of 133 samples (53 fermented unprocessed, 19 fermented processed, 62 urea-treated processed) of whole crop wheat (WCW) and 16 samples (five fermented unprocessed, six fermented processed, five urea-treated processed) of whole crop barley (WCB) were collected from commercial farms over two consecutive years (2003/2004 and 2004/2005). Disruption of the maize grains to increase starch availability was achieved at the point of harvest by processors fitted to the forage harvesters. All samples were subjected to laboratory analysis whilst 50 of the samples (24 from Year 1, 26 from Year 2, all WCW except four WCB in Year 2) were subjected to in vivo digestibility and energy value measurements using mature wether sheep. Urea-treated WCW had higher (P<0.05) pH, dry matter (DM) and crude protein contents and lower concentrations of fermentation products than fermented WCW. Starch was generally lower in fermented, unprocessed WCW and no effect of crop maturity at harvest (as indicated by DM content) on starch concentrations was seen. Urea-treated WCW had higher (P<0.05) in vivo digestible organic matter contents in the DM (DOMD) in Year 1, although this was not recorded in Year 2. There was a close relationship between the digestibility values of organic matter and gross energy, thus aiding the use of DOMD to predict metabolisable energy (ME) content. A wide range of ME values was observed (WCW, 8.7-11.8 MJ/kg DM; WCB, 7.9-11.2 MJ/kg DM) with the overall ME/DOMD ratio (ME = 0.0156 DOMD) in line with studies in other forages. There was no evidence that a separate ME/DOMD relationship was needed for WCB, which is helpful for practical application. This ratio and other parameters were affected by year of harvest (P<0.05), highlighting the influence of environmental and other undefined factors.
The variability in the composition and nutritive value of WCW and WCB highlights the need for reliable and accurate evaluation methods to be available to assess the value of these forages before they are included in diets for dairy cows. (C) 2008 Elsevier B.V. All rights reserved.
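The ME/DOMD ratio reported above lends itself to a small worked example. This is a minimal sketch, assuming DOMD is expressed in g/kg DM so that the 0.0156 multiplier yields ME in MJ/kg DM.

```python
# A minimal sketch of the ME/DOMD relationship reported above
# (ME = 0.0156 * DOMD). Assumption: DOMD is in g/kg DM, giving ME in MJ/kg DM.

def metabolisable_energy(domd_g_per_kg):
    """Predict metabolisable energy (MJ/kg DM) from in vivo DOMD (g/kg DM)."""
    return 0.0156 * domd_g_per_kg

# Example: a DOMD of 700 g/kg DM predicts roughly 10.9 MJ/kg DM,
# within the WCW range (8.7-11.8 MJ/kg DM) quoted in the abstract.
print(round(metabolisable_energy(700), 2))  # 10.92
```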
Abstract:
Grain legumes are known to increase the soil mineral nitrogen (N) content, reduce the infection pressure of soil-borne pathogens, and hence enhance subsequent cereal yields. Replicated field experiments were performed throughout W. Europe (Denmark, United Kingdom, France, Germany and Italy) to assess the effect of intercropping pea and barley on the N supply to subsequent wheat in organic cropping systems. Pea and barley were grown either as sole crops at the recommended plant density (P100 and B100, respectively) or in replacement (P50B50) or additive (P100B50) intercropping designs. In the replacement design the total relative plant density is kept constant, while the additive design uses the optimal sole crop density for pea, supplementing with 'extra' barley plants. The pea and barley crops were followed by winter wheat with and without N application. Additional experiments in Denmark and the United Kingdom included subsequent spring wheat with grass-clover as catch crops. The experiment was repeated over the three cropping seasons of 2003, 2004 and 2005. Irrespective of site and intercrop design, pea-barley intercropping improved the conversion of plant resources (water, light, nutrients) to grain N yield by 25-30%, as measured by the Land Equivalent Ratio. In terms of absolute quantities, sole cropped pea accumulated more N in the grains as compared to the additive design, followed by the replacement design and then sole cropped barley. The post-harvest soil mineral N content was unaffected by the preceding crops. Under the following winter wheat, the lowest mineral N content was generally found in early spring. Variation in soil mineral N content under the winter wheat between sites and seasons indicated a greater influence of regional climatic conditions and long-term cropping history than annual preceding crop and residue quality. Just as with the soil mineral N, the subsequent crop response to preceding crop was negligible.
Soil N balances showed generally negative values in the 2-year period, indicating depletion of N independent of preceding crop and cropping strategy. It is recommended to develop more rotational approaches to determine subsequent crop effects in organic cropping systems, since preceding crop effects, especially when including legumes, can occur over several years of cropping.
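The Land Equivalent Ratio used above can be sketched as follows; the yield figures are hypothetical and only the formula itself (the sum of intercrop-to-sole-crop yield ratios per component crop) is taken as given.

```python
# A minimal sketch of the Land Equivalent Ratio (LER) used above to express
# the 25-30% intercropping advantage. The yield figures are hypothetical;
# LER > 1 means the intercrop uses land more efficiently than sole crops.

def land_equivalent_ratio(intercrop_yields, sole_yields):
    """LER = sum over component crops of (intercrop yield / sole-crop yield)."""
    return sum(inter / sole for inter, sole in zip(intercrop_yields, sole_yields))

# Hypothetical grain N yields (kg N/ha) for pea and barley:
ler = land_equivalent_ratio(intercrop_yields=(70, 45), sole_yields=(110, 75))
print(round(ler, 2))  # 1.24, i.e. a ~24% advantage over sole cropping
```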
Abstract:
Technology involving genetic modification of crops has the potential to make a contribution to rural poverty reduction in many developing countries. Thus far, insecticide-producing 'Bt' varieties of cotton have been the main GM crops under cultivation in developing nations. Several studies have evaluated the farm-level performance of Bt varieties in comparison to conventional ones by estimating production technology, and have mostly found Bt technology to be very successful in raising output and/or reducing insecticide input. However, the production risk properties of this technology have not been studied, although they are likely to be important to risk-averse smallholders. This study investigates the output risk aspects of Bt technology using a three-year farm-level dataset on smallholder cotton production in Makhathini flats, Kwa-Zulu Natal, South Africa. Stochastic dominance and stochastic production function estimation methods are used to examine the risk properties of the two technologies. Results indicate that Bt technology increases output risk by being most effective when crop growth conditions are good, but being less effective when conditions are less favourable. However, in spite of its risk increasing effect, the mean output performance of Bt cotton is good enough to make it preferable to conventional technology even for risk-averse smallholders.
Abstract:
Runoff, sediment, total phosphorus and total dissolved phosphorus losses in overland flow were measured for two years on unbounded plots cropped with wheat and oats. Half of the field was cultivated with minimum tillage (shallow tillage with a tine cultivator) and half was conventionally ploughed. Within each cultivation treatment there were different treatment areas (TAs). In the first year of the experiment, one TA was cultivated up and down the slope, one TA was cultivated on the contour, with a beetle bank acting as a vegetative barrier partway up the slope, and one had a mixed direction cultivation treatment, with cultivation and drilling conducted up and down the slope and all subsequent operations conducted on the contour. In the second year, this mixed treatment was replaced with contour cultivation. Results showed no significant reduction in runoff, sediment losses or total phosphorus losses from minimum tillage when compared to the conventional plough treatment, but there were increased losses of total dissolved phosphorus with minimum tillage. The mixed direction cultivation treatment increased surface runoff and losses of sediment and phosphorus. Increasing surface roughness with contour cultivation reduced surface runoff compared to up and down slope cultivation in both the plough and minimum tillage treatment areas, but this trend was not significant. Sediment and phosphorus losses in the contour cultivation treatment followed a very similar pattern to runoff. Combining contour cultivation with a vegetative barrier in the form of a beetle bank to reduce slope length resulted in a non-significant reduction in surface runoff, sediment and total phosphorus when compared to up and down slope cultivation, but there was a clear trend towards reduced losses. 
However, the addition of a beetle bank did not provide a significant reduction in runoff, sediment losses or total phosphorus losses when compared to contour cultivation, suggesting only a marginal additional benefit. The economic implications for farmers of the different treatment options are investigated in order to assess their suitability for implementation at a field scale.
Abstract:
This case study on the Sifnos island, Greece, assesses the main factors controlling vegetation succession following crop abandonment and describes the vegetation dynamics of maquis and phrygana formations in relation to alternative theories of secondary succession. Field survey data were collected and analysed at community as well as species level. The results show that vegetation succession on abandoned crop fields is determined by the combined effects of grazing intensity, soil and geological characteristics and time. The analysis determines the quantitative grazing thresholds that modify the successional pathway. Light grazing leads to dominance by maquis vegetation while overgrazing leads to phryganic vegetation. The proposed model shows that vegetation succession following crop abandonment is a complex multi-factor process where the final or the stable stage of the process is not predefined but depends on the factors affecting succession. An example of the use of succession models and disturbance thresholds as a policy assessment tool is presented by evaluating the likely vegetation impacts of the recent reform of the Common Agricultural Policy on Sifnos island over a 20-30-year time horizon. (c) 2006 Elsevier B.V. All rights reserved.
Abstract:
Nineteen wheat cultivars, released from 1934 to 2000, were grown at two organic and two non-organic sites in each of 3 years. Assessments included grain yield, grain protein concentration, protein yield, disease incidence and green leaf area. The superiority of each cultivar (the sum of the squares of the differences between its mean in each environment and the mean of the best cultivar there, divided by twice the number of environments; CS) was calculated for yield, grain protein concentration and protein yield, and ranked in each environment. The yield and grain protein concentration CS were more closely correlated with cultivar release date at the non-organic sites than at organic sites. This difference may be attributed to higher yield levels with larger differences among cultivars at the non-organic sites, rather than to improved stability (i.e. similar ranks) across sites. The significant difference in the correlation of protein yield CS and cultivar age between organic and non-organic sites would support evidence that the ability to take up mineral nitrogen (N) compared to soil N has been a component of the selection conditions of more modern cultivars (released after 1989). This is supported by assessment of green leaf area (GLA), where more modern cultivars in the non-organic systems had greater late-season GLA, a trend that was not identified in organic conditions. This effect could explain the poor correlation between age and protein yield CS in organic compared to non-organic conditions where modern cultivars are selected to benefit from later nitrogen (N) availability which includes the spring nitrogen applications tailored to coincide with peak crop demand. Under organic management, N release is largely based on the breakdown of fertility-building crops incorporated (ploughed-in) in the previous autumn. 
The release of nutrients from these residues is dependent on the soil conditions, which includes temperature and microbial populations, in addition to the potential leaching effect of high winter rainfall in the UK. In organic cereal crops, early resource capture is a major advantage for maximizing the utilization of nutrients from residue breakdown. It is concluded that selection of cultivars under conditions of high agrochemical inputs selects for cultivars that yield well under maximal conditions in terms of nutrient availability and pest, disease and weed control. The selection conditions for breeding have a tendency to select cultivars which perform relatively better in non-organic compared to organic systems.
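The cultivar superiority (CS) measure defined in the abstract can be expressed directly in code. This is a minimal encoding of the stated formula; the yield data below are hypothetical.

```python
# A direct encoding of the cultivar superiority (CS) measure defined above:
# for each environment, square the gap between a cultivar's mean and the best
# cultivar's mean there, sum over environments, and divide by twice the number
# of environments. Lower CS = closer tracking of the best performer.
# The yield data below are hypothetical.

def superiority(yields):
    """yields: dict mapping cultivar -> list of means, one per environment."""
    n_env = len(next(iter(yields.values())))
    best = [max(v[j] for v in yields.values()) for j in range(n_env)]
    return {
        cultivar: sum((v[j] - best[j]) ** 2 for j in range(n_env)) / (2 * n_env)
        for cultivar, v in yields.items()
    }

cs = superiority({"A": [8.0, 6.5, 7.2], "B": [7.5, 7.0, 6.8]})
# "A" leads in two environments, "B" in one; "A" gets the lower (better) CS.
print(cs)
```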
Abstract:
This review assesses the impacts, both direct and indirect, of man-made changes to the composition of the air over a 200 year period on the severity of arable crop disease epidemics. The review focuses on two well-studied UK arable crops, wheat and oilseed rape, relating these examples to worldwide food security. In wheat, impacts of changes in concentrations of SO2 in air on two septoria diseases are discussed using data obtained from historical crop samples and unpublished experimental work. Changes in SO2 seem to alter septoria disease spectra both through direct effects on infection processes and through indirect effects on soil S status. Work on the oilseed rape diseases phoma stem canker and light leaf spot illustrates indirect impacts of increasing concentrations of greenhouse gases, mediated through climate change. It is projected that, by the 2050s, if diseases are not controlled, climate change will increase yields in Scotland but halve yields in southern England. These projections are discussed in relation to strategies for adaptation to environmental change. Since many strategies take 10–15 years to implement, it is important to take appropriate decisions soon. Furthermore, it is essential to make appropriate investment in collation of long-term data, modelling and experimental work to guide such decision-making by industry and government, as a contribution to worldwide food security.
Abstract:
Crop production is inherently sensitive to fluctuations in weather and climate and is expected to be impacted by climate change. To understand how this impact may vary across the globe, many studies have been conducted to determine the change in yield of several crops under expected changes in climate. Changes in climate are typically derived from a single General Circulation Model (GCM), or at most a few. This study examines the uncertainty introduced to a crop impact assessment when 14 GCMs are used to determine future climate. The General Large Area Model for annual crops (GLAM) was applied over a global domain to simulate the productivity of soybean and spring wheat under baseline climate conditions and under climate conditions consistent with the 2050s under the A1B SRES emissions scenario as simulated by 14 GCMs. Baseline yield simulations were evaluated against global country-level yield statistics to determine the model's ability to capture observed variability in production. The impact of climate change varied between crops, regions, and by GCM. The spread in yield projections due to GCM varied between no change and a reduction of 50%. Without adaptation, yield response was linearly related to the magnitude of local temperature change. Therefore, impacts were greatest for countries at northernmost latitudes where warming is predicted to be greatest. However, these countries also exhibited the greatest potential for adaptation to offset yield losses by shifting the crop growing season to a cooler part of the year and/or switching crop variety to take advantage of an extended growing season. The relative magnitude of impacts as simulated by each GCM was not consistent across countries and between crops. It is important, therefore, for crop impact assessments to fully account for GCM uncertainty in estimating future climates and to be explicit about assumptions regarding adaptation.
Abstract:
Low variability of crop production from year to year is desirable for many reasons, including reduced income risk and stability of supplies. Therefore, it is important to understand the nature of yield variability, whether it is changing through time, and how it varies between crops and regions. Previous studies have shown that national crop yield variability has changed in the past, with the direction and magnitude dependent on crop type and location. Whilst such studies acknowledge the importance of climate variability in determining yield variability, it has been assumed that its magnitude and its effect on crop production have not changed through time and, hence, that changes to yield variability have been due to non-climatic factors. We address this assumption by jointly examining yield and climate variability for three major crops (rice, wheat and maize) over the past 50 years. National yield time series and growing season temperature and precipitation were de-trended and related using multiple linear regression. Yield variability changed significantly in half of the crop–country combinations examined. For several crop–country combinations, changes in yield variability were related to changes in climate variability.
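The de-trending and multiple linear regression procedure described above can be sketched as follows, on synthetic data; the linear-in-time trend model and the choice of regressors are assumptions consistent with the abstract's description, not the authors' exact method.

```python
# A sketch of the analysis described above, on synthetic data: remove linear
# time trends from national yield and growing-season climate series, then
# relate the yield anomalies to climate anomalies by multiple linear
# regression. The trend model and variable choices are assumptions.
import numpy as np

def detrend(series, years):
    """Remove a linear time trend, returning the residual anomalies."""
    slope, intercept = np.polyfit(years, series, 1)
    return series - (slope * years + intercept)

rng = np.random.default_rng(0)
years = np.arange(1961, 2011, dtype=float)             # a 50-year record
temp = 0.02 * years + rng.normal(0, 0.4, years.size)   # warming trend (deg C)
precip = 500 + rng.normal(0, 50, years.size)           # stationary rainfall (mm)
# Synthetic yield: technology trend, hurt by warmth, helped by rain.
yields = 0.05 * years - 0.3 * temp + 0.002 * precip + rng.normal(0, 0.2, years.size)

y = detrend(yields, years)
X = np.column_stack([detrend(temp, years), detrend(precip, years),
                     np.ones(years.size)])
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
# coefs[0] (temperature) should come out negative, coefs[1] (precip) positive.
print(coefs[:2])
```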