32 results for cumulative sum
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Choy sum (Brassica rapa subsp. parachinensis) is a dark green leafy vegetable that contains high folate (vitamin B9) levels comparable to spinach. Folate is essential for the maintenance of human health and is obtained solely through dietary means. Analysis of the edible portion of choy sum by both microbiological assay and LC-MS/MS indicated that total folate activity did not change significantly over 3 weeks of storage at 4 degrees C. Inedible fractions consisted primarily of outer leaves, which showed signs of rotting after 14 d, and a combination of rotting and yellowing after 21 d, contributing to 20% and 40% of product removal, respectively. Following deconjugation of the folate present in choy sum to monoglutamate and diglutamate derivatives, the principal forms (vitamers) of folate detected in choy sum were 5-methyltetrahydrofolate and 5-formyltetrahydrofolate, followed by tetrahydrofolate (THF), 5,10-methenyl-THF, and 10-formyl folic acid. During storage, a significant decline in 5-formyl-THF was observed, with a slight but not significant increase in the combined 5-methyl-THF derivatives. The decline in 5-formyl-THF relative to the other folate vitamers present may indicate that 5-formyl-THF acts as a folate storage reserve, being interconverted to more metabolically active forms of folate, such as 5-methyl-THF. Although the folate vitamer profile changed over the storage period, total folate activity did not significantly change. From a human nutritional perspective this is important: while particular folate vitamers (e.g. 5-methyl-THF) are necessary for maintaining vital aspects of plant metabolism, the specific vitamer profile is less important to the human diet, as humans can absorb and interconvert multiple forms of folate. The current trial indicates that it is possible to store choy sum for up to 3 weeks at 4 degrees C without significantly affecting the total folate concentration of the edible portion. Crown Copyright (C) 2012 Published by Elsevier B.V.
All rights reserved.
Abstract:
Immediate and residual effects of two lengths of low plane of nutrition (PON) on the synthesis of milk protein and protein fractions were studied at the Mutdapilly Research Station, in south-east Queensland. Thirty-six multiparous Holstein-Friesian cows, between 46 and 102 days in milk (DIM) initially, were used in a completely randomised design experiment with three treatments. All cows were fed on a basal diet of ryegrass pasture (7.0 kg DM/cow.day), barley-sorghum concentrate mix (2.7 kg DM/cow.day) and a canola meal-mineral mix (1.3 kg DM/cow.day). To increase PON, 5.0 kg DM/cow.day supplemental maize and forage sorghum silage was added to the basal diet. The three treatments were (C) high PON (basal diet + supplemental silage); (L9) low PON (basal diet only) for a period of 9 weeks; and (L3) low PON (basal diet only) for a period of 3 weeks. The experiment comprised three periods (1) covariate – high PON, all groups (5 weeks), (2) period of low PON for either 3 weeks (L3) or 9 weeks (L9), and (3) period of high PON (all groups) to assess ability of cows to recover any production lost as a result of treatments (5 weeks). The low PON treatment periods for L3 and L9 were end-aligned so that all treatment groups began Period 3 together. Although there was a significant effect of L9 on yields of milk, protein, fat and lactose, and concentrations of true protein, whey protein and urea, these were not significantly different from L3. There were no residual effects of L3 or L9 on protein concentration or nitrogen distribution after 5 weeks of realimentation. There was no significant effect of low PON for 3 or 9 weeks on casein concentration or composition.
Abstract:
The hypothesis that contaminant plants growing amongst chickpea serve as Helicoverpa sinks by diverting oviposition pressure away from the main crop was tested under field conditions. Gain (recruitment) and loss (presumed mortality) of juvenile stages of Helicoverpa spp. on contaminant faba bean and wheat plants growing in chickpea plots were quantified on a daily basis over a 12-d period. The possibility of posteclosion movement of larvae from the contaminants to the surrounding chickpea crop was examined. Estimated total loss of the census population varied from 80 to 84% across plots and rows. The loss of brown eggs (40–47%) contributed most to the overall loss estimate, followed by loss of white eggs (27–35%) and larvae (6–9%). The cumulative number of individuals entering the white and brown egg and larval stages over the census period ranged from 15 to 58, 10–48 and 1–6 per m row, respectively. The corresponding estimates of mean stage-specific loss, expressed as a percentage of individuals entering the stage, ranged from 52 to 57% for white eggs, 87–108% for brown eggs and 71–87% for first-instar larvae. Mean larval density on chickpea plants in close proximity to the contaminant plants did not exceed the baseline larval density on chickpea further away from the contaminants across rows and plots. The results support the hypothesis that contaminant plants in chickpea plots serve as Helicoverpa sinks by diverting egg pressure from the main crop and elevating mortality of juvenile stages. Deliberate contamination of chickpea crops with other plant species merits further investigation as a cultural pest management strategy for Helicoverpa spp.
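The stage-specific loss estimates above come from comparing the number of individuals entering a stage with the number advancing out of it. A minimal life-table-style sketch of that calculation, using invented census counts rather than data from the trial:

```python
def stage_loss_pct(entering, advancing):
    """Stage-specific loss, expressed as a percentage of individuals
    entering the stage. 'entering' and 'advancing' are cumulative
    counts per m row over the census period. When the two counts are
    estimated independently, as in the study above, this ratio can
    exceed 100% (e.g. the reported 87-108% for brown eggs)."""
    return 100.0 * (entering - advancing) / entering

# Invented example: 40 white eggs entering the stage, 18 advancing
# to the brown egg stage.
print(stage_loss_pct(40, 18))  # 55.0, within the reported 52-57% range
```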
Abstract:
Piggery pond sludge (PPS) was applied, as-collected (Wet PPS) and following stockpiling for 12 months (Stockpiled PPS), to a sandy Sodosol and clay Vertosol at sites on the Darling Downs of Queensland. Laboratory measures of N availability were carried out on unamended and PPS-amended soils to investigate their value in estimating supplementary N needs of crops in Australia's northern grains region. Cumulative net N mineralised from the long-term (30 weeks) leached aerobic incubation was described by a first-order single exponential model. The mineralisation rate constant (0.057/week) was not significantly different between Control and PPS treatments or across soil types, when the amounts of initial mineral N applied in PPS treatments were excluded. Potentially mineralisable N (No) was significantly increased by the application of Wet PPS, and increased with increasing rate of application. Application of Wet PPS significantly increased the total amount of inorganic N leached compared with the Control treatments. Mineral N applied in Wet PPS contributed as much to the total mineral N status of the soil as did that which mineralised over time from organic N. Rates of CO2 evolution during 30 weeks of aerobic leached incubation indicated that the Stockpiled PPS was more stabilised (19-28% of applied organic C mineralised) than the Wet PPS (35-58% of applied organic C mineralised), due to higher lignin content in the former. Net nitrate-N produced following 12 weeks of aerobic non-leached incubation was highly correlated with net nitrate-N leached during 12 weeks of aerobic incubation (R^2 = 0.96), although it was <60% of the latter in both sandy and clayey soils. Anaerobically mineralisable N determined by waterlogged incubation of laboratory PPS-amended soil samples increased with increasing application rate of Wet PPS.
Anaerobically mineralisable N from field-moist soil was well correlated with net N mineralised during 30 weeks of aerobic leached incubation (R^2 = 0.90, sandy soil; R^2 = 0.93, clay soil). In the clay soil, the amount of mineral N produced from all the laboratory incubations was significantly correlated with field-measured nitrate-N in the soil profile (0-1.5 m depth) after 9 months of weed-free fallow following PPS application. In contrast, only anaerobic mineralisable N was significantly correlated with field nitrate-N in the sandy soil. Anaerobic incubation would, therefore, be suitable as a rapid practical test to estimate potentially mineralisable N following applications of different PPS materials in the field.
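The first-order single exponential model used above to describe cumulative net N mineralisation can be sketched as follows. The rate constant k = 0.057/week is the value reported in the abstract; the pool size No is invented for illustration:

```python
import math

def n_mineralised(n0, k, t_weeks):
    """Cumulative net N mineralised after t weeks under the
    first-order single exponential model: Nt = No * (1 - e^(-k*t))."""
    return n0 * (1.0 - math.exp(-k * t_weeks))

k = 0.057    # mineralisation rate constant (/week), as reported
n0 = 120.0   # hypothetical potentially mineralisable N pool (mg/kg)
for t in (10, 20, 30):
    print(f"week {t}: {n_mineralised(n0, k, t):.1f} mg/kg")
```

The model approaches No asymptotically, so after the 30-week incubation most, but not all, of the potentially mineralisable pool has been released.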
Abstract:
This paper aims to compare the shift in frequency distribution and skill of seasonal climate forecasting of both streamflow and rainfall in eastern Australia based on the Southern Oscillation Index (SOI) Phase system. Recent advances in seasonal forecasting of climate variables have highlighted opportunities for improving decision making in natural resources management. Forecasting of rainfall probabilities for different regions in Australia is available, but the use of similar forecasts for water resource supply has not been developed. The use of streamflow forecasts may provide better information for decision-making in irrigation supply and flow management for improved ecological outcomes. To examine the relative efficacy of seasonal forecasting of streamflow and rainfall, the shift in probability distributions and the forecast skill were evaluated using the Wilcoxon rank-sum test and the linear error in probability space (LEPS) skill score, respectively, at three river gauging stations in the Border Rivers Catchment of the Murray-Darling Basin in eastern Australia. A comparison of rainfall and streamflow distributions confirms higher statistical significance in the shift of streamflow distribution than that in rainfall distribution. Moreover, streamflow distribution showed greater skill of forecasting with 0-3 month lead time, compared to rainfall distribution.
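The Wilcoxon rank-sum comparison used above can be sketched in pure Python with the large-sample normal approximation. The flow values in the example are invented, not data from the Border Rivers gauges, and the sketch assumes no tied values:

```python
def rank_sum_z(x, y):
    """z statistic for the Wilcoxon rank-sum test comparing two
    samples, e.g. seasonal streamflows grouped by SOI phase.
    Assumes continuous data with no ties and uses the normal
    approximation to the null distribution of the rank sum W."""
    n1, n2 = len(x), len(y)
    ranks = {v: i + 1 for i, v in enumerate(sorted(list(x) + list(y)))}
    w = sum(ranks[v] for v in x)                 # rank sum of sample x
    mu = n1 * (n1 + n2 + 1) / 2.0                # mean of W under H0
    sigma = (n1 * n2 * (n1 + n2 + 1) / 12.0) ** 0.5
    return (w - mu) / sigma

# Invented seasonal flows (GL) under two SOI phases:
print(rank_sum_z([120, 95, 140, 110], [60, 85, 70, 100]))
```

A large |z| indicates a shift between the two distributions; in practice a library routine with tie handling (e.g. SciPy's rank-sum test) would be used instead of this sketch.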
Abstract:
Barramundi Lates calcarifer reared in cool water (20-22 degrees C) grow slowly and feed is used poorly compared with fish in warm water (28-32 degrees C). Two comparative slaughter growth assays were carried out with juvenile barramundi to see if increasing the digestible energy (DE) and/or the n-3 highly unsaturated fatty acid (n-3 HUFA) content of the feed would improve growth of fish raised in cool water. Increasing the DE content of the feed from 15 to 17 or 19 MJ kg(-1) while maintaining a constant protein to energy ratio in Experiment 1 brought about significant improvements in feed conversion ratio (FCR) (from 2.01 to 1.19) and daily growth coefficient (DGC; from 0.69 to 1.08%/day) for fish at 20 degrees C. For fish at 29 degrees C, improvements, while significant, were of a lesser magnitude: from 1.32 to 0.97 for FCR and from 3.24 to 3.65%/day for DGC. Increasing the absolute amount of dietary n-3 HUFA, expressed as the sum of eicosapentaenoic and docosahexaenoic fatty acids, from 0.5% to 2.0% in Experiment 2 improved DGC linearly and FCR curvilinearly for fish at 29 degrees C whereas at 20 degrees C, DGC was not affected while FCR improved slightly (from 1.83 to 1.68). Feed conversion ratio was optimized with a dietary n-3 HUFA of about 1.5%. Providing barramundi with a feed that is high in DE (viz 19 MJ kg(-1)) and a digestible protein to DE ratio of 22.5 g MJ(-1) is a practical strategy for improving the productivity of barramundi cultured in cool water whereas increasing dietary n-3 HUFA conferred very little additional benefit.
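The two performance measures reported above can be computed as below. The cube-root form of the daily growth coefficient is the conventional definition for fish, and FCR is feed consumed per unit weight gained; the abstract does not spell out either formula, and the weights in the example are invented:

```python
def dgc(w_initial, w_final, days):
    """Daily growth coefficient (%/day), the conventional cube-root
    fish growth measure: 100 * (Wf^(1/3) - Wi^(1/3)) / days."""
    return 100.0 * (w_final ** (1 / 3) - w_initial ** (1 / 3)) / days

def fcr(dry_feed_fed, wet_weight_gain):
    """Feed conversion ratio: feed consumed per unit of weight
    gained (lower is better)."""
    return dry_feed_fed / wet_weight_gain

# Invented barramundi example: 8 g to 27 g over 50 days on 30 g of feed.
print(round(dgc(8.0, 27.0, 50), 2))        # 2.0 (%/day)
print(round(fcr(30.0, 27.0 - 8.0), 2))
```

Because DGC normalises by the cube root of body weight, it allows growth comparisons across the different starting sizes and temperatures in the two experiments.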
Abstract:
Three drafts of Bos indicus cross steers (initially 178-216 kg) grazed Leucaena-grass pasture [Leucaena leucocephala subspecies glabrata cv. Cunningham with green panic (Panicum maximum cv. trichoglume)] from late winter through to autumn during three consecutive years in the Burnett region of south-east Queensland. Measured daily weight gain (DWGActual) of the steers was generally 0.7-1.1 kg/day during the summer months. Estimated intakes of metabolisable energy and dry matter (DM) were calculated from feeding standards as the intakes required by the steers to grow at the DWGActual. Diet attributes were predicted from near infrared reflectance spectroscopy spectra of faeces (F.NIRS) using established calibration equations appropriate for northern Australian forages. Inclusion of some additional reference samples from cattle consuming Leucaena diets into F.NIRS calibrations based on grass and herbaceous legume-grass pastures improved prediction of the proportion of Leucaena in the diet. Mahalanobis distance values supported the hypothesis that the F.NIRS predictions of diet crude protein concentration and DM digestibility (DMD) were acceptable. F.NIRS indicated that the percentage of Leucaena in the diet varied widely (10-99%). Diet crude protein concentration and DMD were usually high, averaging 12.4 and 62%, respectively, and were related asymptotically to the percentage of Leucaena in the diet (R^2 = 0.48 and 0.33, respectively). F.NIRS calibrations for DWG were not satisfactory for predicting this variable from an individual faecal sample, since the standard errors of prediction were 0.33-0.40 kg/day. Cumulative steer liveweight (LW) predicted from F.NIRS DWG calibrations, which had been previously developed with tropical grass and grass-herbaceous legume pastures, greatly overestimated the measured steer LW; therefore, these calibrations were not useful.
Cumulative steer LW predicted from a modified F.NIRS DWG calibration, which included data from the present study, was strongly correlated (R^2 = 0.95) with steer LW but overestimated LW by 19-31 kg after 8 months. Additional reference data are needed to develop robust F.NIRS calibrations to encompass the diversity of Leucaena pastures of northern Australia. In conclusion, the experiment demonstrated that F.NIRS could improve understanding of diet quality and nutrient intake of cattle grazing Leucaena-grass pasture, and the relationships between nutrient supply and cattle growth.
Abstract:
Despite an abundance of polyembryonic genotypes and the need for rootstocks that improve scion yield and productivity, simultaneous field testing of a wide range of mango (Mangifera indica L.) genotypes as rootstocks has not previously been reported. In this experiment, we examined the growth and yield of 'Kensington Pride' on 64 mango genotypes of diverse origin during the first four seasons of fruit production to identify those worth longer-term assessment. We also recorded morphological characteristics of seedlings of 46 of these genotypes in an attempt to relate these measures to subsequent field performance. Tree canopy development on the most vigorous rootstocks was almost double that on the least vigorous. Growth rates differed by more than 160%. Cumulative marketable yield ranged from 36 kg/tree for the lowest yielding rootstock to 181 kg/tree for the most productive. Yield efficiency also differed markedly among the 64 rootstocks with the best treatment being 3.5 times more efficient than the poorest treatment. No relationship was found between yield efficiency and tree size, suggesting it is possible to select highly efficient rootstocks of differing vigor. Two genotypes ('Brodie' and 'MYP') stood out as providing high yield efficiency with small tree size. A further two genotypes ('B' and 'Watertank') were identified as offering high yield efficiency and large tree size and should provide high early yields at traditional tree spacing. Efforts to relate the morphology of different genotype seedlings to subsequent performance as a rootstock showed that nursery performance of mango seedlings is no indication of their likely behavior as a rootstock. The economic cost of poor yields and low yield efficiencies during the early years of commercial orchard production provide a rationale for culling many of the rootstock treatments in this experiment and concentrating future assessment on the top ~20% of the 64 treatments. 
Of these, 'MYP', 'B', 'Watertank', 'Manzano', and 'Pancho' currently show the most promise.
Abstract:
Thirty-seven surface (0-0.10 or 0-0.20 m) soils covering a wide range of soil types (16 Vertosols, 6 Ferrosols, 6 Dermosols, 4 Hydrosols, 2 Kandosols, 1 Sodosol, 1 Rudosol, and 1 Chromosol) were exhaustively cropped in 2 glasshouse experiments. The test species were Panicum maximum cv. Green Panic in Experiment A and Avena sativa cv. Barcoo in Experiment B. Successive forage harvests were taken until the plants could no longer grow in most soils because of severe potassium (K) deficiency. Soil samples were taken prior to cropping and after the final harvest in both experiments, and also after the initial harvest in Experiment B. Samples were analysed for solution K, exchangeable K (Exch K), tetraphenyl borate extractable K for extraction periods of 15 min (TBK15) and 60 min (TBK60), and boiling nitric acid extractable K (Nitric K). Inter-correlations between the initial levels of the various soil K parameters indicated that the following pools were in sequential equilibrium: solution K, Exch K, fast release fixed K [estimated as (TBK15 - Exch K)], and slow release fixed K [estimated as (TBK60 - TBK15)]. Structural K [estimated as (Nitric K - TBK60)] was not correlated with any of the other pools. However, following exhaustive drawdown of soil K by cropping, structural K became correlated with solution K, suggesting dissolution of K minerals when solution K was low. The change in the various K pools following cropping was correlated with K uptake at Harvest 1 (Experiment B only) and cumulative K uptake (both experiments). The change in Exch K for 30 soils was linearly related to cumulative K uptake (r = 0.98), although on average, K uptake was 35% higher than the change in Exch K. For the remaining 7 soils, K uptake considerably exceeded the change in Exch K. However, the changes in TBK15 and TBK60 were both highly linearly correlated with K uptake across all soils (r = 0.95 and 0.98, respectively).
The slopes of the regression lines were not significantly different from unity, and the y-axis intercepts were very small. These results indicate that the plant is removing K from the TBK pool. Although the change in Exch K did not consistently equate with K uptake across all soils, initial Exch K was highly correlated with K uptake (r = 0.99) if one Vertosol was omitted. Exchangeable K is therefore a satisfactory diagnostic indicator of soil K status for the current crop. However, the change in Exch K following K uptake is soil-dependent, and many soils with large amounts of TBK relative to Exch K were able to buffer changes in Exch K. These soils tended to be Vertosols occurring on floodplains. In contrast, 5 soils (a Dermosol, a Rudosol, a Kandosol, and 2 Hydrosols) with large amounts of TBK did not buffer decreases in Exch K caused by K uptake, indicating that the TBK pool in these soils was unavailable to plants under the conditions of these experiments. It is likely that K fertiliser recommendations will need to take account of whether the soil has TBK reserves, and the availability of these reserves, when deciding rates required to raise exchangeable K status to adequate levels.
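The pool estimates described above are simple differences between successive extractions, and can be sketched as below. Units follow the inputs, and the example values are invented rather than taken from the 37 soils studied:

```python
def k_pools(solution_k, exch_k, tbk15, tbk60, nitric_k):
    """Partition soil K into the sequential pools described above:
    solution K, exchangeable K, fast-release fixed K (TBK15 - Exch K),
    slow-release fixed K (TBK60 - TBK15) and structural K
    (Nitric K - TBK60)."""
    return {
        "solution": solution_k,
        "exchangeable": exch_k,
        "fast_fixed": tbk15 - exch_k,
        "slow_fixed": tbk60 - tbk15,
        "structural": nitric_k - tbk60,
    }

# Invented Vertosol-like values, cmol(+)/kg:
pools = k_pools(0.05, 0.60, 1.30, 2.10, 5.00)
print(pools)
```

A soil with a large fast_fixed + slow_fixed (TBK) reserve relative to exchangeable K corresponds to the floodplain Vertosols above that buffered changes in Exch K under cropping.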
Abstract:
Climate variability and change are risk factors for climate-sensitive activities such as agriculture. Managing these risks requires "climate knowledge", i.e. a sound understanding of the causes and consequences of climate variability and knowledge of potential management options that are suitable in light of the climatic risks posed. Often such information about prognostic variables (e.g. yield, rainfall, run-off) is provided in probabilistic terms (e.g. via cumulative distribution functions, CDFs), whereby quantitative assessment of these alternative management options is based on such CDFs. Sound statistical approaches are needed in order to assess whether differences between such CDFs are intrinsic features of system dynamics or chance events (i.e. quantifying evidence against an appropriate null hypothesis). Statistical procedures that rely on such a hypothesis-testing framework are referred to as "inferential statistics", in contrast to descriptive statistics (e.g. mean, median, variance of population samples, skill scores). Here we report on extensions of some of the existing inferential techniques that provide more relevant and adequate information for decision making under uncertainty.
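One simple measure of whether two such CDFs differ, useful before moving to a formal inferential test, is the maximum vertical distance between the empirical CDFs (the two-sample Kolmogorov-Smirnov statistic). This is a generic illustration, not the specific technique extended in the paper, and the yield samples are invented:

```python
def ks_statistic(a, b):
    """Maximum vertical distance between the empirical CDFs of two
    samples (the two-sample Kolmogorov-Smirnov statistic).
    0 means identical empirical CDFs; 1 means fully separated."""
    def ecdf(sample, x):
        # Fraction of the sample at or below x.
        return sum(v <= x for v in sample) / len(sample)
    grid = sorted(set(a) | set(b))
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in grid)

# Invented yield samples (t/ha) under two management options:
print(ks_statistic([2.1, 2.8, 3.0, 3.6], [2.9, 3.4, 3.8, 4.2]))  # 0.5
```

Whether an observed distance reflects system dynamics or chance is exactly the hypothesis-testing question the inferential framework above addresses.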
Abstract:
A genetic solution to breech strike control is attractive, as it is potentially permanent, cumulative, would not involve increased use of chemicals and may ultimately reduce labour inputs. There appears to be significant opportunity to reduce the susceptibility of Merinos to breech strike by genetic means although it is unlikely that in the short term breeding alone will be able to confer the degree of protection provided by mulesing and tail docking. Breeding programmes that aim to replace surgical techniques of flystrike prevention could potentially: reduce breech wrinkle; increase the area of bare skin in the perineal area; reduce tail length and wool cover on and near the tail; increase shedding of breech wool; reduce susceptibility to internal parasites and diarrhoea; and increase immunological resistance to flystrike. The likely effectiveness of these approaches is reviewed and assessed here. Any breeding programme that seeks to replace surgical mulesing and tail docking will need to make sheep sufficiently resistant that the increased requirement for other strike management procedures remains within practically acceptable bounds and that levels of strike can be contained to ethically acceptable levels.
Abstract:
Marker ordering during linkage map construction is a critical component of QTL mapping research. In recent years, high-throughput genotyping methods have become widely used, and these methods may generate hundreds of markers for a single mapping population. This poses problems for linkage analysis software because the number of possible marker orders increases exponentially as the number of markers increases. In this paper, we tested the accuracy of linkage analyses on simulated recombinant inbred line data using the commonly used Map Manager QTX (Manly et al. 2001: Mammalian Genome 12, 930-932) software and RECORD (Van Os et al. 2005: Theoretical and Applied Genetics 112, 30-40). Accuracy was measured by calculating two scores: % correct marker positions, and a novel, weighted rank-based score derived from the sum of absolute values of true minus observed marker ranks, divided by the total number of markers. The accuracy of maps generated using Map Manager QTX was considerably lower than that of maps generated using RECORD. Differences in linkage maps were often observed when marker ordering was performed several times using the identical dataset. To test the effect of reducing marker numbers on the stability of marker order, we pruned marker datasets, focusing on regions consisting of tightly linked clusters of markers, which included redundant markers. Marker pruning improved the accuracy and stability of linkage maps because a single unambiguous marker order was produced that was consistent across replications of analysis. Marker pruning was also applied to a real barley mapping population and QTL analysis was performed using the different map versions produced by the different programs. While some QTLs were identified with both map versions, there were large differences in QTL mapping results. Differences included maximum LOD and R^2 values at QTL peaks and map positions, thus highlighting the importance of marker order for QTL mapping.
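The core of the rank-based score described above, the sum of absolute differences between true and observed marker ranks divided by the number of markers, can be sketched as follows. The paper's version adds a weighting not reproduced here, and the marker names are invented:

```python
def rank_displacement_score(true_order, observed_order):
    """Sum of |true rank - observed rank| over all markers, divided
    by the number of markers. 0 means the linkage software recovered
    the marker order perfectly; larger values mean a more scrambled
    map. (Unweighted sketch of the paper's weighted score.)"""
    obs_rank = {marker: i for i, marker in enumerate(observed_order)}
    n = len(true_order)
    return sum(abs(i - obs_rank[m]) for i, m in enumerate(true_order)) / n

true_map = ["m1", "m2", "m3", "m4"]
print(rank_displacement_score(true_map, ["m1", "m2", "m3", "m4"]))  # 0.0
print(rank_displacement_score(true_map, ["m2", "m1", "m4", "m3"]))  # 1.0
```

Unlike % correct positions, this score penalises a marker in proportion to how far it has moved, so a map with one badly misplaced marker scores worse than one with two adjacent markers swapped.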
Abstract:
Weed eradication programs often require 10 years or more to achieve their objective. It is important that progress is evaluated on a regular basis so that programs that are 'on track' can be distinguished from those that are unlikely to succeed. Earlier research has addressed conformity of eradication programs to the delimitation criterion. In this paper evaluation in relation to the containment and extirpation criteria is considered. Because strong evidence of containment failure (i.e. spread from infestations targeted for eradication) is difficult to obtain, it generally will not be practicable to evaluate how effective eradication programs are at containing the target species. However, chronic failure of containment will be reflected in sustained increases in cumulative infested area and thus a failure to delimit a weed invasion. Evaluating the degree of conformity to the delimitation and extirpation criteria is therefore sufficient to give an appraisal of progress towards the eradication objective. A significant step towards eradication occurs when a weed is no longer readily detectable at an infested site, signalling entry to the monitoring phase. This transition will occur more quickly if reproduction is prevented consistently. Where an invasion consists of multiple infestations, the monitoring profile (frequency distribution of time since detection) provides a summary of the overall effectiveness of the eradication program in meeting the extirpation criterion. Eradication is generally claimed when the target species has not been detected for a period equal to or greater than its seed longevity, although there is often considerable uncertainty in estimates of the latter. Recently developed methods, which take into consideration the cost of continued monitoring vs. the potential cost of damage should a weed escape owing to premature cessation of an eradication program, can assist managers to decide when to terminate weed eradication programs.
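The monitoring profile described above is simply a frequency distribution of time since last detection across infestations. A minimal sketch, with invented infestation records:

```python
from collections import Counter

def monitoring_profile(years_since_detection):
    """Frequency distribution of years since last detection across
    infestations: the 'monitoring profile' of an eradication program.
    A profile weighted toward long times since detection indicates
    good progress toward the extirpation criterion."""
    return dict(sorted(Counter(years_since_detection).items()))

# Invented records: years since the weed was last seen at each site.
print(monitoring_profile([0, 1, 1, 2, 4, 4, 4, 7]))
# {0: 1, 1: 2, 2: 1, 4: 3, 7: 1}
```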
Abstract:
The effects on yield, botanical composition and persistence of using a variable defoliation schedule as a means of optimising the quality of the tall fescue component of simple and complex temperate pasture mixtures in a subtropical environment were studied in a small plot cutting experiment at Gatton Research Station in south-east Queensland. A management schedule of 2-, 3- and 4-weekly defoliations in summer, in autumn, and in spring and winter, respectively, was imposed on 5 temperate pasture mixtures: 2 simple mixtures including tall fescue (Festuca arundinacea) and white clover (Trifolium repens); 2 mixtures including perennial ryegrass (Lolium perenne), tall fescue and white clover; and a complex mixture, which included perennial ryegrass, tall fescue, white, red (T. pratense) and Persian (T. resupinatum) clovers and chicory (Cichorium intybus). Yield from the variable cutting schedule was 9% less than with a standard 4-weekly defoliation. This loss resulted from reductions in both the clover component (13%) and cumulative grass yield (6%). There was no interaction between cutting schedule and sowing mixture, with simple and complex sowing mixtures reacting in a similar manner to both cutting schedules. The experiment also demonstrated that, in complex mixtures, the cutting schedules used failed to give balanced production from all sown components. This was especially true of the grass and white clover components of the complex mixture, as the chicory and Persian clover components dominated the mixtures, particularly in the first year. Quality measurements (made only in the final summer) suggested that variable management had achieved a quality improvement, with increases in yields of digestible crude protein (19%) and digestible dry matter (9%) of the total forage produced in early summer. The improvements in the yields of digestible crude protein and digestible dry matter of the tall fescue component in late summer were even greater (28 and 19%, respectively).
While advantages at other times of the year were expected to be smaller, the data suggested that the small loss in total yield was likely to be offset by increases in digestibility of available forage for grazing stock, especially in the critical summer period.
Abstract:
In the subtropics of Australia, the ryegrass component of irrigated perennial ryegrass (Lolium perenne) - white clover (Trifolium repens) pastures declines by approximately 40% in the summer following establishment, being replaced by summer-active C4 grasses. Tall fescue (Festuca arundinacea) is more persistent than perennial ryegrass and might resist this invasion, although tall fescue does not compete vigorously as a seedling. This series of experiments investigated the influence of ryegrass and tall fescue genotype, sowing time and sowing mixture as a means of improving tall fescue establishment and the productivity and persistence of tall fescue, ryegrass and white clover-based mixtures in a subtropical environment. Tall fescue frequency at the end of the establishment year decreased as the number of companion species sown in the mixture increased. Neither sowing mixture combinations nor sowing rates influenced overall pasture yield (of around 14 t/ha) in the establishment year but had a significant effect on botanical composition and component yields. Perennial ryegrass was less competitive than short-rotation ryegrass, increasing first-year yields of tall fescue by 40% in one experiment and by 10% in another but total yield was unaffected. The higher establishment-year yield (3.5 t/ha) allowed Dovey tall fescue to compete more successfully with the remaining pasture components than Vulcan (1.4 t/ha). Sowing 2 ryegrass cultivars in the mixture reduced tall fescue yields by 30% compared with a single ryegrass (1.6 t/ha), although tall fescue alone achieved higher yields (7.1 t/ha). Component sowing rate had little influence on composition or yield. Oversowing the ryegrass component into a 6-week-old sward of tall fescue and white clover improved tall fescue, white clover and overall yields in the establishment year by 83, 17 and 11%, respectively, but reduced ryegrass yields by 40%. The inclusion of red (T. pratense) and Persian (T. 
resupinatum) clovers and chicory (Cichorium intybus) increased first-year yields by 25% but suppressed perennial grass and clover components. Yields were generally maintained at around 12 t/ha/yr in the second and third years, with tall fescue becoming dominant in all 3 experiments. The lower tall fescue seeding rate used in the first experiment resulted in tall fescue dominance in the second year following establishment, whereas in Experiments 2 and 3 dominance occurred by the end of the first year. Invasion by the C4 grasses was relatively minor (<10%) even in the third year. As ryegrass plants died, tall fescue and, to a lesser extent, white clover increased as a proportion of the total sward. Treatment effects continued into the second, but rarely the third, year and mostly affected the yield of one of the components rather than total cumulative yield. Once tall fescue became dominant, it was difficult to re-introduce other pasture components, even following removal of foliage and moderate renovation. Severe renovation (reducing the tall fescue population by at least 30%) seems a possible option for redressing this situation.