13 results for Total factor productivity
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Seven hardwood species were tested as underplants beneath Pinus elliottii plantations on the coastal lowlands of south-east Queensland. The species tested were: Flindersia brayleyana (F. Muell.) (Queensland maple), F. australis (R. Br.) (crow's ash), Swietenia macrophylla (King) (American mahogany), Grevillea robusta (A. Cunn.) (southern silky oak), Elaeocarpus grandis (F. Muell.) (silver quandong), F. ifflaiana (F. Muell.) (Cairns hickory) and Ceratopetalum apetalum (D. Don) (coachwood). Most species (except E. grandis) established successfully but slowly. Underplants suffered 9-16% mortality during thinning of the overstorey. By 2004, when aged c. 38 years, four underplanted species (F. brayleyana, S. macrophylla, F. ifflaiana and E. grandis) had attained predominant heights of 20 m and mean diameters at breast height of 25 cm or better. The presence of underplants increased total site productivity by up to 23% and had no detrimental effect on the development of the overwood. This experiment has demonstrated that some rainforest species will survive and grow healthily as underplants in exotic pine plantations and produce small merchantable logs within a 38-year rotation. The results also indicate the importance of correct species selection if an underplanting option is to be pursued, as some species were a complete failure (notably G. robusta).
Abstract:
Continuous cultivation and cereal cropping of southern Queensland soils previously supporting native vegetation have resulted in reduced soil nitrogen supply, and consequently decreased cereal grain yields and low grain protein. To enhance yields and protein concentrations of wheat, management practices involving N fertiliser application, with no-tillage and stubble retention, grain legumes, and legume leys were evaluated from 1987 to 1998 on a fertility-depleted Vertosol at Warra, southern Queensland. The objective of this study was to examine the effect of lucerne in a 2-year lucerne–wheat rotation for its nitrogen and disease-break benefits to subsequent grain yield and protein content of wheat, compared with continuous wheat cropping. Dry matter production and nitrogen yields of lucerne were closely correlated with total rainfall for October–September as well as March–September rainfall. Each 100 mm of total rainfall resulted in 0.97 t/ha of dry matter and 26 kg/ha of nitrogen yield. For the March–September rainfall, the corresponding values were 1.26 t/ha of dry matter and 36 kg/ha of nitrogen yield. The latter values were 10% lower than those produced by annual medics during a similar period. Compared with wheat–wheat cropping, significant increases in total soil nitrogen were observed only in 1990, 1992 and 1994, but increases in soil mineralisable nitrogen were observed in most years following lucerne. Similarly, pre-plant nitrate nitrogen in the soil profile following lucerne was higher by 74 kg/ha (range 9–167 kg N/ha) than that of wheat–wheat without N fertiliser in all years except 1996. Consequently, higher wheat grain protein (7 out of 9 seasons) and grain yield (4 out of 9 seasons) were produced compared with continuous wheat. There was a significant depression in grain yield in 2 of the 9 seasons (1993 and 1995), attributed to soil moisture depletion and/or low growing-season rainfall. Consequently, the overall yield responses were lower than those of 50 kg/ha of fertiliser nitrogen applied to wheat–wheat crops, or of 2-year medic–wheat or chickpea–wheat rotations, although grain protein concentrations were higher following lucerne. The incidence and severity of the soilborne disease common root rot of wheat, caused by Bipolaris sorokiniana, were generally higher in lucerne–wheat than in continuous wheat with no nitrogen fertiliser applications, since its severity was significantly correlated with plant-available water at sowing. No significant incidence of crown rot or root lesion nematode was observed. Thus, productivity, which in this experiment was mainly due to nitrogen accretion, can be maintained where short-duration lucerne leys are grown in rotations with wheat.
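A note for readers who want to apply the rainfall coefficients reported in the abstract above: the short Python sketch below converts a seasonal rainfall total into expected lucerne dry matter and nitrogen yield. It is a minimal illustration only; the function name and the assumption of strict proportionality through the origin are ours, not the authors'.

```python
# Illustrative sketch only: scales the per-100-mm coefficients quoted in the
# abstract (0.97 t/ha DM and 26 kg/ha N per 100 mm of October-September
# rainfall; 1.26 t/ha DM and 36 kg/ha N per 100 mm of March-September
# rainfall). Strict proportionality through the origin is an assumption made
# here for illustration, not a claim from the study.

def lucerne_yield(rainfall_mm: float, period: str = "oct-sep") -> dict:
    """Estimate lucerne dry matter (t/ha) and nitrogen yield (kg/ha) from rainfall."""
    coefficients = {
        "oct-sep": {"dm_t_ha": 0.97, "n_kg_ha": 26.0},
        "mar-sep": {"dm_t_ha": 1.26, "n_kg_ha": 36.0},
    }
    c = coefficients[period]
    scale = rainfall_mm / 100.0
    return {"dry_matter_t_ha": c["dm_t_ha"] * scale,
            "nitrogen_kg_ha": c["n_kg_ha"] * scale}

# Example: 450 mm of October-September rainfall gives roughly 4.4 t/ha DM
# and about 117 kg/ha of nitrogen yield.
print(lucerne_yield(450))
```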
Abstract:
Detailed data on seagrass distribution, abundance, growth rates and community structure were collected at Orman Reefs in March 2004 to estimate the above-ground productivity and carbon assimilated by seagrass meadows. Seagrass meadows were re-examined in November 2004 for comparison at the seasonal extremes of seagrass abundance. Ten seagrass species were identified in the meadows on Orman Reefs. Extensive seagrass coverage was found in March (18,700 ha) and November (21,600 ha), with seagrass covering the majority of the intertidal reef-top areas and a large proportion of the subtidal areas examined. There were marked differences in seagrass above-ground biomass, distribution and species composition between the two surveys. Major changes between March and November included a substantial decline in biomass for intertidal meadows and an expansion in area of subtidal meadows. Changes were most likely a result of greater tidal exposure of intertidal meadows prior to November, leading to desiccation and temperature-related stress. The Orman Reef seagrass meadows had a total above-ground productivity of 259.8 t DW/day and an estimated carbon assimilation of 89.4 t C/day in March. The majority of this production came from the intertidal meadows, which accounted for 81% of the total production. Intra-annual changes in seagrass species composition, shoot density and size of meadows measured in this study were likely to have a strong influence on the total above-ground production during the year. The net estimated above-ground productivity of the Orman Reefs meadows in March 2004 (1.19 g C/m2 per day) was high compared with other tropical seagrass areas that have been studied, and also higher than that of many other marine, estuarine and terrestrial plant communities.
Abstract:
Residue retention is an important issue in evaluating the sustainability of production forestry. However, its long-term impacts have not been studied extensively, especially in sub-tropical environments. This study investigated the long-term impact of harvest residue retention on tree nutrition, growth and productivity of an F1 hybrid (Pinus elliottii var. elliottii × Pinus caribaea var. hondurensis) exotic pine plantation in sub-tropical Australia, under three harvest residue management regimes: (1) residue removal, RR0; (2) single residue retention, RR1; and (3) double residue retention, RR2. The experiment, established in 1996, is a randomised complete block design with 4 replicates. Tree growth measurements in this study were carried out at ages 2, 4, 6, 8 and 10 years, while foliar nutrient analyses were carried out at ages 2, 4, 6 and 10 years. Litter production and litter nitrogen (N) and phosphorus (P) measurements were carried out quarterly over a 15-month period between ages 9 and 10 years. Results showed that total tree growth was still greater in the residue-retained treatments than in the RR0 treatment. However, mean annual increments of diameter at breast height (MAID) and basal area (MAIB) declined significantly after age 4 years, to about 68-78% at age 10 years. Declining foliar N and P concentrations accounted for 62% (p < 0.05) of the variation in growth rates after age 4 years, and foliar N and P concentrations were either marginal or below critical concentrations. In addition, litter production and litter N and P contents were not significantly different among the treatments. This study suggests that the impact of residue retention on tree nutrition and growth rates might be limited over a longer period, and that the integration of alternative forest management practices is necessary to sustain the benefits of harvest residues until the end of the rotation.
Abstract:
Reduced supplies of nitrogen (N) in many soils of southern Queensland that were cropped exhaustively with cereals over many decades have been the focus of much research aimed at avoiding declines in the profitability and sustainability of farming systems. A 45-month period of mixed grass (purple pigeon grass, Setaria incrassata Stapf; Rhodes grass, Chloris gayana Kunth.) and legume (lucerne, Medicago sativa L.; annual medics, M. scutellata (L.) Mill. and M. truncatula Gaertn.) pasture was one of several options compared on a fertility-depleted Vertosol at Warra, southern Queensland, to improve grain yields or increase grain protein concentrations of subsequent wheat crops. The objectives of the study were to measure the productivity of a mixed grass and legume pasture grown over 45 months (cut and removed over 36 months) and its effects on the yield and protein concentrations of the following wheat crops. Pasture production (DM t/ha) and aboveground plant N yield (kg/ha) for the grass, legume (including a small amount of weeds) and total components of the pasture responded linearly to total rainfall over the duration of each of 3 pastures sown in 1986, 1987 and 1988. Averaged over the 3 pastures, each 100 mm of rainfall resulted in 0.52 t/ha of grass, 0.44 t/ha of legume and 0.97 t/ha of total pasture DM, with little variation between the 3 pastures. Aboveground plant N yield of the 3 pastures ranged from 17.2 to 20.5 kg/ha per 100 mm rainfall. Aboveground legume N in response to total rainfall was similar (10.6-13.2 kg/ha per 100 mm rainfall) across the 3 pastures in spite of very different populations of legumes and grasses at establishment. Aboveground grass N yield was 5.2-7.0 kg/ha per 100 mm rainfall. In most wheat crops following pasture, wheat yields were similar to those of unfertilised wheat, except in 1990 and 1994, when grain yields were significantly higher but similar to those of continuous wheat fertilised with 75 kg N/ha. In contrast, grain protein concentrations of most wheat crops following pasture responded positively, being substantially higher than those of unfertilised wheat but similar to those of wheat fertilised with 75 kg N/ha. Grain protein averaged over all years of assay was increased by 25-40% compared with that of unfertilised wheat. Stored water supplies after pasture were < 134 mm (< 55% of plant available water capacity); for most assay crops, water storages were 67-110 mm, an equivalent wet soil depth of only 0.3-0.45 m. Thus, the crop assays of pasture benefits were limited by low water supply to the wheat crops. Moreover, the severity of common root rot in the wheat crops was not reduced by the pasture-wheat rotation.
Abstract:
Farmlets, each of 20 cows, were established to field test five milk production systems and provide a learning platform for farmers and researchers in a subtropical environment. The systems were developed through desktop modelling and industry consultation in response to the need for substantial increases in farm milk production following deregulation of the industry. Four of the systems were based on grazing and the continued use of existing farmland resource bases, whereas the fifth comprised a feedlot and associated forage base developed as a greenfield site. The field evaluation was conducted over 4 years under more adverse environmental conditions than anticipated, with below-average rainfall and restrictions on irrigation. For the grazed systems, mean annual milk yield per cow ranged from 6330 kg/year (1.9 cows/ha) for a herd based on rain-grown tropical pastures to 7617 kg/year (3.0 cows/ha) where animals were based on temperate and tropical irrigated forages. For the feedlot herd, production of 9460 kg/cow per year (4.3 cows/ha of forage base) was achieved. For all herds, the level of production achieved required annual inputs of concentrates of approximately 3 t DM/animal and purchased conserved fodder of 0.3 to 1.5 t DM/animal. This level of supplementary feeding made a major contribution to total farm nutrient inputs, contributing 50% or more of the nitrogen, phosphorus and potassium entering the farming system, and presents challenges for the management of the manure and urine that result from the higher stocking rates enabled. Mean annual milk production for the five systems ranged from 88 to 105% of that predicted by the desktop modelling. This level of agreement for the grazed systems was achieved with minimal overall change in predicted feed inputs; however, the feedlot system required a substantial increase in inputs over those predicted. Reproductive performance for all systems was poorer than anticipated, particularly over the summer mating period. We conclude that the desktop model, developed as a rapid response to assist farmers in modifying their current farming systems, provided a reasonable prediction of inputs required and milk production. Further model development would need to consider more closely climate variability, the limitations summer temperatures place on reproductive success, and the feed requirements of feedlot herds.
Abstract:
In the subtropics of Australia, the ryegrass component of irrigated perennial ryegrass (Lolium perenne) - white clover (Trifolium repens) pastures declines by approximately 40% in the summer following establishment, being replaced by summer-active C4 grasses. Tall fescue (Festuca arundinacea) is more persistent than perennial ryegrass and might resist this invasion, although tall fescue does not compete vigorously as a seedling. This series of experiments investigated the influence of ryegrass and tall fescue genotype, sowing time and sowing mixture as a means of improving tall fescue establishment and the productivity and persistence of tall fescue, ryegrass and white clover-based mixtures in a subtropical environment. Tall fescue frequency at the end of the establishment year decreased as the number of companion species sown in the mixture increased. Sowing mixture combinations and sowing rates did not influence overall pasture yield (around 14 t/ha) in the establishment year but had a significant effect on botanical composition and component yields. Perennial ryegrass was less competitive than short-rotation ryegrass, increasing first-year yields of tall fescue by 40% in one experiment and by 10% in another, but total yield was unaffected. The higher establishment-year yield (3.5 t/ha) allowed Dovey tall fescue to compete more successfully with the remaining pasture components than Vulcan (1.4 t/ha). Sowing 2 ryegrass cultivars in the mixture reduced tall fescue yields by 30% compared with a single ryegrass (1.6 t/ha), although tall fescue alone achieved higher yields (7.1 t/ha). Component sowing rate had little influence on composition or yield. Oversowing the ryegrass component into a 6-week-old sward of tall fescue and white clover improved tall fescue, white clover and overall yields in the establishment year by 83, 17 and 11%, respectively, but reduced ryegrass yields by 40%. The inclusion of red (T. pratense) and Persian (T. resupinatum) clovers and chicory (Cichorium intybus) increased first-year yields by 25% but suppressed the perennial grass and clover components. Yields were generally maintained at around 12 t/ha/yr in the second and third years, with tall fescue becoming dominant in all 3 experiments. The lower tall fescue seeding rate used in the first experiment resulted in tall fescue dominance in the second year following establishment, whereas in Experiments 2 and 3 dominance occurred by the end of the first year. Invasion by the C4 grasses was relatively minor (<10%) even in the third year. As ryegrass plants died, tall fescue and, to a lesser extent, white clover increased as a proportion of the total sward. Treatment effects continued into the second, but rarely the third, year and mostly affected the yield of one of the components rather than total cumulative yield. Once tall fescue became dominant, it was difficult to re-introduce other pasture components, even following removal of foliage and moderate renovation. Severe renovation (reducing the tall fescue population by at least 30%) seems a possible option for redressing this situation.
Abstract:
The diet selected in autumn by steers fistulated at the oesophagus was studied in a subset of treatments in an extensive grazing study conducted in a Heteropogon contortus pasture in central Queensland between 1988 and 2001. These treatments were a factorial array of three stocking rates (4, 3 and 2 ha/steer) and three pasture types (native pasture, legume-oversown native pasture and animal diet supplement/spring-burning native pasture). Seasonal rainfall throughout this study was below the long-term mean, and mean annual pasture utilisation ranged from 30 to 61%. Steers consistently selected H. contortus, with levels decreasing from 47 to 18% of the diet as stocking rate increased from 4 ha/steer to 2 ha/steer. Stylosanthes scabra cv. Seca was always selected in legume-oversown pastures, with diet composition varying from 35 to 66%, despite its plant density increasing from 7 to 65 plants/m2 and its pasture composition from 20 to 50%. Steers also selected a diet containing Chrysopogon fallax, forbs and sedges in higher proportions than they were present in the pasture. Greater availability of the intermediate grasses Chloris divaricata and Eragrostis spp. was associated with increased stocking rates. Bothriochloa bladhii was seldom selected in the diet, especially when other palatable species were present in the pasture, despite B. bladhii often being the major contributor to total pasture yield. It was concluded that a stocking rate of 4 ha/steer will maintain the availability of H. contortus in the pasture.
Abstract:
The effect of plastic high tunnels on the performance of two strawberry (Fragaria ×ananassa) cultivars (Festival and Rubygem) and two breeding lines was studied in southeastern Queensland, Australia, over 2 years. Production in this area is affected by rain, with direct damage to the fruit and the development of fruit disease before harvest. The main objective of the study was to determine whether plants growing under tunnels had less rain damage, a lower incidence of disease, and higher yields than plants growing outdoors. Plants growing under the tunnels or outdoors had at best only small differences in leaf, crown, root, and flower and immature fruit dry weight. These responses were associated with relatively similar temperatures and relative humidities in the two growing environments. Marketable yields were 38% higher under the tunnels than outdoors in year 1, and 24% higher in year 2, mainly due to less rain damage. There were only small differences in the incidences of grey mold (Botrytis cinerea) and of small and misshapen fruit between the plants growing under the tunnels and those outdoors. There were also only small differences in postharvest quality, total soluble solids, and titratable acidity between the two environments. These results highlight the potential of plastic high tunnels for strawberry plants growing in subtropical areas that receive significant rainfall during the production season.
Abstract:
The influence of grazing management on total soil organic carbon (SOC) and soil total nitrogen (TN) in tropical grasslands is an issue of considerable ecological and economic interest. Here we have used linear mixed models to investigate the effect of grazing management on stocks of SOC and TN in the top 0.5 m of the soil profile. The study site was a long-term pasture utilization experiment, 26 years after the experiment was established for sheep grazing on native Mitchell grass (Astrebla spp.) pasture in northern Australia. The pasture utilization rates were between 0% (exclosure) and 80%, assessed visually. We found that a significant amount of TN had been lost from the top 0.1 m of the soil profile as a result of grazing, with 80% pasture utilization resulting in a loss of 84 kg/ha over the 26-year period. There was no significant effect of pasture utilization rate on TN when greater soil depths were considered. There was no significant effect of pasture utilization rate on stocks of SOC and soil particulate organic carbon (POC), or on the C:N ratio, at any depth; however, visual trends in the data suggested some agreement with the literature, whereby increased grazing pressure appeared to: (i) decrease SOC and POC stocks; and (ii) increase the C:N ratio. Overall, the statistical power of the study was limited, and future research would benefit from a more comprehensive sampling scheme. Previous studies at the site have found that a pasture utilization rate of 30% is sustainable for grazing production on Mitchell grass; however, given our results, we conclude that N inputs (possibly through management of native N2-fixing pasture legumes) should be made for the long-term maintenance of soil health, and pasture productivity, within this ecosystem.
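For readers unfamiliar with the analysis framework named in the abstract above, the Python sketch below shows the general shape of a linear mixed model for soil carbon stocks, with pasture utilization rate and soil depth as fixed effects and plot as a random grouping factor. The synthetic data, column names and model formula are assumptions for illustration; this is not the authors' actual model or dataset.

```python
# Hypothetical illustration of a linear mixed model of the kind described in
# the abstract: SOC stock ~ pasture utilization rate + soil depth, with a
# random intercept for plot. Data are synthetic; all names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
plots = np.repeat(np.arange(1, 13), 3)                    # 12 plots, 3 depth layers each
utilization = np.repeat(np.tile([0, 30, 50, 80], 3), 3)   # % pasture utilization per plot
depth = np.tile([0.1, 0.3, 0.5], 12)                      # lower bound of sampled layer (m)

# Synthetic SOC stocks (t/ha) with a weak utilization effect plus noise.
soc = 20.0 - 0.01 * utilization - 15.0 * depth + rng.normal(0.0, 1.0, plots.size)

df = pd.DataFrame({"plot": plots, "utilization": utilization,
                   "depth": depth, "soc_t_ha": soc})

# Fixed effects: utilization and depth; random intercept: plot.
model = smf.mixedlm("soc_t_ha ~ utilization + depth", data=df, groups=df["plot"])
result = model.fit()
print(result.summary())
```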
Abstract:
Tension banding castration of cattle is gaining favour because it is relatively simple to perform and is promoted by retailers of the banders as a humane castration method. Two experiments were conducted under tropical conditions using Bos indicus bulls, comparing tension banding (Band) and surgical (Surgical) castration of weaner (7–10 months old) and mature (22–25 months old) bulls with and without pain management (NSAID (ketoprofen) or saline injected intramuscularly immediately prior to castration). Welfare outcomes were assessed using a range of measures; this paper reports on some physiological, morbidity and productivity-related responses to augment the behavioural responses reported in an accompanying paper. Blood samples were taken on the day of castration (day 0) at the time of restraint (0 min) and at 30 min (weaners) or 40 min (mature bulls), 2 h and 7 h, and on days 1, 2, 3, 7, 14, 21 and 28 post-castration. Plasmas from day 0 were assayed for cortisol, creatine kinase, total protein and packed cell volume. Plasmas from the other samples were assayed for cortisol and haptoglobin (plus the 0 min sample). Liveweights were recorded approximately weekly to 6 weeks and at 2 and 3 months post-castration. Castration sites were checked at these same times to 2 months post-castration to score the extent of healing and the presence of sepsis. Cortisol concentrations (mean ± s.e. nmol/L) were significantly (P < 0.05) higher in the Band (67 ± 4.5) than in the Surgical weaners (42 ± 4.5) at 2 h post-castration, but at 24 h post-castration were greater in the Surgical (43 ± 3.2) than in the Band weaners (30 ± 3.2). The main effect of ketoprofen was on the cortisol concentrations of the mature Surgical bulls; concentrations were significantly reduced at 40 min (47 ± 7.2 vs. 71 ± 7.2 nmol/L for saline) and at 2 h post-castration (24 ± 7.2 vs. 87 ± 7.2 nmol/L for saline). Ketoprofen, however, had no effect on the Band mature bulls, with their cortisol concentrations averaging 54 ± 5.1 nmol/L at 40 min and 92 ± 5.1 nmol/L at 2 h. Cortisol concentrations were also significantly elevated in the Band (83 ± 3.0 nmol/L) compared with the Surgical mature bulls (57 ± 3.0 nmol/L) at weeks 2–4 post-castration. The timing of this elevation coincided with significantly elevated haptoglobin concentrations (mg/mL) in the Band bulls (2.97 ± 0.102 for mature bulls and 1.71 ± 0.025 for weaners, vs. 2.10 ± 0.102 and 1.45 ± 0.025 respectively for the Surgical treatment) and with evidence of slow wound healing and sepsis in both the weaner (proportion not healed at week 4: 0.81 ± 0.089 for Band, 0.13 ± 0.078 for Surgical) and mature bulls (0.81 ± 0.090 at week 4 for Band, 0.38 ± 0.104 for Surgical). Overall, liveweight gains of both age groups were not affected by castration method. The findings of acute pain, chronic inflammation and possibly chronic pain in the mature bulls at least, together with poor wound healing in the Band bulls, support the behavioural findings reported in the accompanying paper and demonstrate that tension banding produces inferior welfare outcomes for weaner and mature bulls compared with surgical castration.
Abstract:
The financial health of beef cattle enterprises in northern Australia has declined markedly over the last decade due to an escalation in production and marketing costs and a real decline in beef prices. Historically, gains in animal productivity have offset the effect of declining terms of trade on farm incomes. This raises the question of whether future productivity improvements can remain a key path for lifting enterprise profitability sufficiently to ensure that the industry remains economically viable over the longer term. The key objective of this study was to assess the production and financial implications for north Australian beef enterprises of a range of technology interventions (development scenarios), including genetic gain in cattle, nutrient supplementation, and alteration of the feed base through introduced pastures and forage crops, across a variety of natural environments. To achieve this objective, a beef systems model was developed that is capable of simulating livestock production at the enterprise level, including reproduction, growth and mortality, based on energy and protein supply from natural C4 pastures that are subject to high inter-annual climate variability. Comparisons between simulation outputs and enterprise performance data in three case study regions suggested that the simulation model (the Northern Australia Beef Systems Analyser) can adequately represent the performance of beef cattle enterprises in northern Australia. Testing of a range of development scenarios suggested that the application of individual technologies can substantially lift productivity and profitability, especially where the entire feedbase was altered through legume augmentation. The simultaneous implementation of multiple technologies that benefit different aspects of animal productivity resulted in the greatest increases in cattle productivity and enterprise profitability, with projected weaning rates increasing by 25%, liveweight gain by 40% and net profit by 150% above current baseline levels, although gains of this magnitude might not necessarily be realised in practice. While there were slight increases in total methane output from these development scenarios, methane emissions per kg of beef produced were reduced by 20% in the scenarios with higher productivity gains. Combinations of technologies or innovative practices applied in a systematic and integrated fashion thus offer scope for providing the productivity and profitability gains necessary to maintain viable beef enterprises in northern Australia into the future.
Abstract:
Two trials were conducted in this project. One was a continuation of work started under a previous GRDC/SRDC-funded activity, 'Strategies to improve the integration of legumes into cane based farming systems'. This trial aimed to assess the impact of trash and tillage management options and nematicide application on nematodes and crop performance. Methods and results are contained in the following publication: Halpin NV, Stirling GR, Rehbein WE, Quinn B, Jakins A, Ginns SP. The impact of trash and tillage management options and nematicide application on crop performance and plant-parasitic nematode populations in a sugarcane/peanut farming system. Proc. Aust. Soc. Sugar Cane Technol. 37, 192-203. Nematicide application in the plant crop significantly reduced total numbers of plant-parasitic nematodes (PPN) but had no impact on yield. Application of nematicide to the ratoon crop significantly reduced sugar yield. The study confirmed other work demonstrating that strategies such as reduced tillage reduced total PPN populations, suggesting that the soil was more suppressive to PPN in those treatments. The second trial, a variety trial, demonstrated the limited value of nematicide application in sugarcane farming systems. This study has highlighted that growers should not view nematicides as a 'cure-all' for paddocks that have historically had high PPN numbers. Nematicides have high mammalian toxicity, have the potential to contaminate groundwater (Kookana et al. 1995) and are costly. The cost of nematicide used in R1 was approximately $320-$350/ha, adding $3.50/t of cane in a 100 t/ha crop. Also, our study demonstrated that a single nematicide treatment at the application rate registered for sugarcane is not very effective in reducing populations of nematode pests. There appears to be some level of resistance to nematodes within the current suite of varieties available to the southern canelands. For example, soil in plots growing Q183 had 560% more root-knot nematodes per 200 mL of soil than plots growing Q245. The authors see great value in investment in a nematode screening program that could rate varieties into groups of susceptibility to both major sugarcane nematode pests. Such a rating could then be built into a decision support 'tree' or tool to better enable producers to select varieties on a paddock-by-paddock basis.
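As a quick check on the economics quoted above, the snippet below (illustrative only; the variable names are ours) divides the approximate per-hectare nematicide cost by the cane yield to reproduce the roughly $3.50/t of cane figure.

```python
# Reproduces the per-tonne cost quoted in the abstract: ~$320-$350/ha of
# nematicide spread over a 100 t/ha cane crop. The input values come from the
# abstract; the calculation itself is ours, for illustration.
cost_per_ha = 350.0   # $/ha, upper end of the quoted range
cane_yield = 100.0    # t cane/ha
print(f"Added cost: ${cost_per_ha / cane_yield:.2f} per tonne of cane")  # -> $3.50/t
```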