72 results for lost productivity costs
Abstract:
The modern consumer regards food safety as a non-negotiable issue – the consumer simply demands that food be safe. Yet, at the same time, the modern consumer expects food safety to be the responsibility of others – the primary producer, the processing company, the supermarket, commercial food handlers and so on. Given this environment, all food animal industries have little choice but to regard food safety as a key issue. As an example, the chicken meat industry, via the two main industry funding bodies – the Rural Industries Research and Development Corporation (Chicken Meat) and the Poultry CRC – has a comprehensive research program focused on reducing the risks of food-borne disease at all points of the food processing chain – from the farm to the processing plant. The scale of the issue for all industries can be illustrated by an analysis of campylobacteriosis – a major food-borne disease. It has been estimated that there are around 230,000 cases of campylobacteriosis in Australia per year. In 1995, each case of food-borne campylobacteriosis in the USA was estimated to cost between US$350 and US$580. Hence, a reasonably conservative estimate is that each Australian case in 2010 would result in a cost of around $500 (this includes hospital, medication and lost productivity costs). On this basis, this single food-borne agent could be costing Australian society around $115 million annually. In the light of these estimated costs for just one food-borne pathogen, it is easy to understand the importance that all food animal industries place on food safety.
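The headline figure can be checked with simple arithmetic. This short sketch uses only the abstract's own numbers (the case count and per-case cost); everything else is illustrative:

```python
# Cost-of-illness arithmetic for campylobacteriosis, using the figures
# quoted above: ~230,000 Australian cases per year at ~$500 per case
# (hospital, medication and lost productivity costs).
cases_per_year = 230_000
cost_per_case = 500  # AUD, conservative 2010 estimate

annual_cost = cases_per_year * cost_per_case
print(f"Estimated annual cost: ${annual_cost / 1e6:.0f} million")  # → $115 million
```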
Abstract:
Green bean production accounts for 2.4% of the total value of Australian vegetable production and was Australia's tenth largest vegetable crop in 2008-2009 by value. Australian green bean production is concentrated in Queensland (51%) and Tasmania (34%), where lost productivity as a direct result of insect damage is recognised as a key threat to the industry (AUSVEG, 2011). Green beans attract a wide range of insect pests, with thrips causing the most damage to the harvestable product, the pod. Thrips populations were monitored in green bean crops at the Gatton Research Facility, Lockyer Valley, South-east Queensland, Australia from 2002-2011. Field trials were conducted to identify the thrips species present, record fluctuations in abundance during the season and assess pod damage caused directly by thrips. Thirteen species of thrips were recorded on bean plantings during this time, with six dominant species collected during most of the growing season: Frankliniella occidentalis, F. schultzei, Megalurothrips usitatus, Pseudanaphothrips achaetus, Thrips imaginis and T. tabaci. Thrips numbers ranged from less than one to as high as 5.39 thrips per flower. The highest thrips incidence, recorded in October/November 2008, resulted in 10.74% unmarketable pods due to thrips damage, while the lowest number of thrips, recorded in April 2008, was associated with thrips damage to 36.65% of pods.
Abstract:
Increase the water-use efficiency and productivity of dairy and fodder enterprises, and reduce their energy and water use and costs, in order to lower the cost of milk production.
Abstract:
The aim of this review is to report changes in irrigated cotton water use from research projects and on-farm practice-change programs in Australia, in relation to both plant-based and irrigation engineering disciplines. At least 80% of the Australian cotton-growing area is irrigated using gravity surface-irrigation systems. This review found that, over 23 years, cotton crops utilise 6-7 ML/ha of irrigation water, depending on the amount of seasonal rain received. The seasonal evapotranspiration of surface-irrigated crops averaged 729 mm over this period. Over the past decade, water-use productivity by Australian cotton growers has improved by 40%. This has been achieved by both yield increases and more efficient water-management systems. The whole-farm irrigation efficiency index improved from 57% to 70%, and the crop water use index is >3 kg/mm.ha, high by international standards. Yield increases over the last decade can be attributed to plant-breeding advances, the adoption of genetically modified varieties, and improved crop management. Also, there has been increased use of irrigation scheduling tools and furrow-irrigation system optimisation evaluations. This has reduced in-field deep-drainage losses. The largest loss component of the farm water balance on cotton farms is evaporation from on-farm water storages. Some farmers are changing to alternative systems such as centre pivots and lateral-move machines, and increasing numbers of these alternatives are expected. These systems can achieve considerable labour and water savings, but have significantly higher energy costs associated with water pumping and machine operation. The optimisation of interactions between water, soils, labour, carbon emissions and energy efficiency requires more research and on-farm evaluations.
Standardisation of water-use efficiency measures and improved water measurement techniques for surface irrigation are important research outcomes to enable valid irrigation benchmarks to be established and compared. Water-use performance is highly variable between cotton farmers and farming fields and across regions. Therefore, site-specific measurement is important. The range in the presented datasets indicates potential for further improvement in water-use efficiency and productivity on Australian cotton farms.
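The two benchmarking indices discussed above can be expressed as simple ratios. This is a hedged sketch: the formulas follow common irrigation-benchmarking definitions, and the example lint yield is illustrative, not measured data from the review.

```python
# Sketch of two water-use indices mentioned in the review.
# Function names and the example yield are assumptions for illustration.

def crop_water_use_index(lint_yield_kg_per_ha: float, evapotranspiration_mm: float) -> float:
    """Lint yield per mm of seasonal crop evapotranspiration (kg/mm.ha)."""
    return lint_yield_kg_per_ha / evapotranspiration_mm

def whole_farm_irrigation_efficiency(water_used_by_crop_ml: float, water_diverted_ml: float) -> float:
    """Fraction of farm water diverted that is actually used by the crop."""
    return water_used_by_crop_ml / water_diverted_ml

# With the review's average seasonal ET of 729 mm, an illustrative lint
# yield of 2,300 kg/ha sits just above the 3 kg/mm.ha benchmark.
print(round(crop_water_use_index(2300, 729), 2))
print(whole_farm_irrigation_efficiency(70, 100))  # e.g. 70 of 100 ML → 0.7
```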
Abstract:
The financial health of beef cattle enterprises in northern Australia has declined markedly over the last decade due to an escalation in production and marketing costs and a real decline in beef prices. Historically, gains in animal productivity have offset the effect of declining terms of trade on farm incomes. This raises the question of whether future productivity improvements can remain a key path for lifting enterprise profitability sufficient to ensure that the industry remains economically viable over the longer term. The key objective of this study was to assess the production and financial implications for north Australian beef enterprises of a range of technology interventions (development scenarios), including genetic gain in cattle, nutrient supplementation, and alteration of the feed base through introduced pastures and forage crops, across a variety of natural environments. To achieve this objective a beef systems model was developed that is capable of simulating livestock production at the enterprise level, including reproduction, growth and mortality, based on energy and protein supply from natural C4 pastures that are subject to high inter-annual climate variability. Comparisons between simulation outputs and enterprise performance data in three case study regions suggested that the simulation model (the Northern Australia Beef Systems Analyser) can adequately represent the performance of beef cattle enterprises in northern Australia. Testing of a range of development scenarios suggested that the application of individual technologies can substantially lift productivity and profitability, especially where the entire feedbase was altered through legume augmentation.
The simultaneous implementation of multiple technologies that provide benefits to different aspects of animal productivity resulted in the greatest increases in cattle productivity and enterprise profitability, with projected weaning rates increasing by 25%, liveweight gain by 40% and net profit by 150% above current baseline levels, although gains of this magnitude might not necessarily be realised in practice. While there were slight increases in total methane output from these development scenarios, the methane emissions per kg of beef produced were reduced by 20% in scenarios with higher productivity gain. Combinations of technologies or innovative practices applied in a systematic and integrated fashion thus offer scope for providing the productivity and profitability gains necessary to maintain viable beef enterprises in northern Australia into the future.
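The combination of a slight rise in total methane with a 20% fall in methane per kg of beef is a ratio effect: production grows faster than emissions. The sketch below uses hypothetical baseline numbers (not the study's data) to illustrate the calculation:

```python
# Illustrative emissions-intensity arithmetic. All input values are
# hypothetical; only the ~20% intensity reduction mirrors the abstract.
baseline_beef_kg = 100_000
baseline_methane_t = 50.0

scenario_beef_kg = baseline_beef_kg * 1.30      # higher-productivity scenario
scenario_methane_t = baseline_methane_t * 1.04  # slight rise in total methane

baseline_intensity = baseline_methane_t / baseline_beef_kg
scenario_intensity = scenario_methane_t / scenario_beef_kg
reduction = 1 - scenario_intensity / baseline_intensity
print(f"Intensity reduction: {reduction:.0%}")  # → 20%
```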
Abstract:
Productivity decline in sown grass pastures is widespread in northern Australia and reduces production by approximately 50%, a farm-gate cost to industry of >$17B over the next 30 years. Buffel grass is the most widely established sown species (>75% of plantings) and has been estimated to be “dominant” on 5.8 M hectares and “common” on a further 25.9 M hectares of Queensland. Legumes are the most cost-effective mitigation option and can reclaim 30-50% of lost production. Commercial use of legumes has achieved mixed results, with notable successes but many failures. There is significant opportunity to improve commercial results from legumes using existing technologies; however, targeted research is needed to improve the reliability of legume establishment and productivity. This review recommends that the grazing industry invest in targeted R,D&E to assist industry in improving the production and sustainability of rundown pastures.
Abstract:
Continuous cultivation and cereal cropping of southern Queensland soils previously supporting native vegetation have resulted in reduced soil nitrogen supply, and consequently decreased cereal grain yields and low grain protein. To enhance yields and protein concentrations of wheat, management practices involving N fertiliser application, with no-tillage and stubble retention, grain legumes, and legume leys were evaluated from 1987 to 1998 on a fertility-depleted Vertosol at Warra, southern Queensland. The objective of this study was to examine the effect of lucerne in a 2-year lucerne–wheat rotation for its nitrogen and disease-break benefits to subsequent grain yield and protein content of wheat as compared with continuous wheat cropping. Dry matter production and nitrogen yields of lucerne were closely correlated with the total rainfall for October–September as well as March–September rainfall. Each 100 mm of total rainfall resulted in 0.97 t/ha of dry matter and 26 kg/ha of nitrogen yield. For the March–September rainfall, the corresponding values were 1.26 t/ha of dry matter and 36 kg/ha of nitrogen yield. The latter values were 10% lower than those produced by annual medics during a similar period. Compared with wheat–wheat cropping, significant increases in total soil nitrogen were observed only in 1990, 1992 and 1994 but increases in soil mineralisable nitrogen were observed in most years following lucerne. Similarly, pre-plant nitrate nitrogen in the soil profile following lucerne was higher by 74 kg/ha (9–167 kg N/ha) than that of wheat–wheat without N fertiliser in all years except 1996. Consequently, higher wheat grain protein (7 out of 9 seasons) and grain yield (4 out of 9 seasons) were produced compared with continuous wheat. There was significant depression in grain yield in 2 (1993 and 1995) out of 9 seasons attributed to soil moisture depletion and/or low growing season rainfall. 
Consequently, the overall responses in yield were lower than those of 50 kg/ha of fertiliser nitrogen applied to wheat–wheat crops, 2-year medic–wheat or chickpea–wheat rotation, although grain protein concentrations were higher following lucerne. The incidence and severity of the soilborne disease common root rot of wheat, caused by Bipolaris sorokiniana, was generally higher in lucerne–wheat than in continuous wheat with no nitrogen fertiliser applications, since its severity was significantly correlated with plant-available water at sowing. No significant incidence of crown rot or root lesion nematode was observed. Thus, productivity, which in this experiment was mainly due to nitrogen accretion, can be maintained where short-duration lucerne leys are grown in rotation with wheat.
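The rainfall-response relations reported above are linear and easy to encode. The coefficients below are the abstract's own (per 100 mm of total October–September rainfall); the function names and the 500 mm example are illustrative assumptions:

```python
# Linear rainfall-response relations for lucerne, using the abstract's
# coefficients: per 100 mm of total rainfall, 0.97 t/ha of dry matter
# and 26 kg/ha of nitrogen yield.

def lucerne_dm_t_ha(total_rainfall_mm: float) -> float:
    return 0.97 * total_rainfall_mm / 100

def lucerne_n_kg_ha(total_rainfall_mm: float) -> float:
    return 26 * total_rainfall_mm / 100

# e.g. a hypothetical 500 mm season:
print(lucerne_dm_t_ha(500), "t/ha DM;", lucerne_n_kg_ha(500), "kg/ha N")
```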
Abstract:
Soil nitrogen (N) supply in the Vertosols of southern Queensland, Australia has steadily declined as a result of long-term cereal cropping without N fertiliser application or rotations with legumes. Nitrogen-fixing legumes such as lucerne may enhance soil N supply and therefore could be used in lucerne-wheat rotations. However, lucerne leys in this subtropical environment can create a soil moisture deficit, which may persist for a number of seasons. Therefore, we evaluated the effect of varying the duration of a lucerne ley (for up to 4 years) on soil N increase, N supply to wheat, soil water changes, wheat yields and wheat protein on a fertility-depleted Vertosol in a field experiment between 1989 and 1996 at Warra (26°47'S, 150°53'E), southern Queensland. The experiment consisted of a wheat-wheat rotation, and 8 treatments of lucerne leys starting in 1989 (phase 1) or 1990 (phase 2) for 1, 2, 3 or 4 years' duration, followed by wheat cropping. Lucerne DM yield and N yield increased with increasing duration of lucerne leys. Soil N increased over time following 2 years of lucerne but there was no further significant increase after 3 or 4 years of lucerne ley. Soil nitrate concentrations increased significantly with all lucerne leys and moved progressively downward in the soil profile from 1992 to 1995. Soil water, especially at 0.9-1.2 m depth, remained significantly lower for the next 3 years after the termination of the 4-year lucerne ley than under continuous wheat. No significant increase in wheat yields was observed from 1992 to 1995, irrespective of the lucerne ley. However, wheat grain protein concentrations were significantly higher under lucerne-wheat than under wheat-wheat rotations for 3-5 years. The lucerne yield and soil water and nitrate-N concentrations were satisfactorily simulated with the APSIM model.
Although significant N accretion occurred in the soil following lucerne leys, recharge of the drier soil profile left by long-duration lucerne took up to 3 years in drier seasons. Consequently, 3- and 4-year lucerne-wheat rotations resulted in more variable wheat yields than wheat-wheat rotations in this region. The remaining challenge in using lucerne-wheat rotations is balancing the N accretion benefits against plant-available water deficits, which are most likely to occur under the highly variable rainfall conditions of this region.
Abstract:
Results from the humid tropics of Australia demonstrate that diverse plantations can achieve greater productivity than monocultures. We found that increases in both the observed species number and the effective species richness were significantly related to increased levels of productivity as measured by stand basal area or mean individual tree basal area. Four of five plantation species were more productive in mixtures with other species than in monocultures, offering, on average, a 55% increase in mean tree basal area. A general linear model suggests that species richness had a significant effect on mean individual tree basal area when environmental variables were included in the model. As monoculture plantations are currently the preferred reforestation method throughout the tropics, these results suggest that significant productivity and ecological gains could be made if multi-species plantations are more broadly pursued.
Abstract:
Detailed data on seagrass distribution, abundance, growth rates and community structure information were collected at Orman Reefs in March 2004 to estimate the above-ground productivity and carbon assimilated by seagrass meadows. Seagrass meadows were re-examined in November 2004 for comparison at the seasonal extremes of seagrass abundance. Ten seagrass species were identified in the meadows on Orman Reefs. Extensive seagrass coverage was found in March (18,700 ha) and November (21,600 ha), with seagrass covering the majority of the intertidal reef-top areas and a large proportion of the subtidal areas examined. There were marked differences in seagrass above-ground biomass, distribution and species composition between the two surveys. Major changes between March and November included a substantial decline in biomass for intertidal meadows and an expansion in area of subtidal meadows. Changes were most likely a result of greater tidal exposure of intertidal meadows prior to November leading to desiccation and temperature-related stress. The Orman Reef seagrass meadows had a total above-ground productivity of 259.8 t DW day-1 and estimated carbon assimilation of 89.4 t C day-1 in March. The majority of this production came from the intertidal meadows which accounted for 81% of the total production. Intra-annual changes in seagrass species composition, shoot density and size of meadows measured in this study were likely to have a strong influence on the total above-ground production during the year. The net estimated above-ground productivity of Orman Reefs meadows in March 2004 (1.19 g C m-2 day-1) was high compared with other tropical seagrass areas that have been studied and also higher than many other marine, estuarine and terrestrial plant communities.
Abstract:
Residue retention is an important issue in evaluating the sustainability of production forestry. However, its long-term impacts have not been studied extensively, especially in sub-tropical environments. This study investigated the long-term impact of harvest residue retention on tree nutrition, growth and productivity of a F1 hybrid (Pinus elliottii var. elliottii × Pinus caribaea var. hondurensis) exotic pine plantation in sub-tropical Australia, under three harvest residue management regimes: (1) residue removal, RR0; (2) single residue retention, RR1; and (3) double residue retention, RR2. The experiment, established in 1996, is a randomised complete block design with 4 replicates. Tree growth measurements in this study were carried out at ages 2, 4, 6, 8 and 10 years, while foliar nutrient analyses were carried out at ages 2, 4, 6 and 10 years. Litter production and litter nitrogen (N) and phosphorus (P) measurements were carried out quarterly over a 15-month period between ages 9 and 10 years. Results showed that total tree growth was still greater in residue-retained treatments compared to the RR0 treatment. However, mean annual increments of diameter at breast height (MAID) and basal area (MAIB) declined significantly after age 4 years, falling to about 68-78% of earlier values by age 10 years. Declining foliar N and P concentrations accounted for 62% (p < 0.05) of the variation in growth rates after age 4 years, and foliar N and P concentrations were either marginal or below critical concentrations. In addition, litter production, and litter N and P contents were not significantly different among the treatments. This study suggests that the impact of residue retention on tree nutrition and growth rates might be limited over a longer period, and that the integration of alternative forest management practices is necessary to sustain the benefits of harvest residues until the end of the rotation.
Abstract:
Maize (Zea mays L.) is a chill-susceptible crop cultivated in northern latitude environments. The detrimental effects of cold on growth and photosynthetic activity have long been established. However, a general overview of how important these processes are with respect to the reduction of productivity reported in the field is still lacking. In this study, a model-assisted approach was used to dissect variations in productivity under suboptimal temperatures and quantify the relative contributions of light interception (PARc) and radiation use efficiency (RUE) from emergence to flowering. A combination of architectural and light transfer models was used to calculate light interception in three field experiments with two cold-tolerant lines and at two sowing dates. Model assessment confirmed that the approach was suitable to infer light interception. Biomass production was strongly affected by early sowings. RUE was identified as the main cause of biomass reduction during cold events. Furthermore, PARc explained most of the variability observed at flowering, its relative contributions being more or less important according to the climate experienced. Cold temperatures resulted in lower PARc, mainly because final leaf length and width were significantly reduced for all leaves emerging after the first cold occurrence. These results confirm that virtual plants can be useful as fine phenotyping tools. A scheme of action of cold on leaf expansion, light interception and radiation use efficiency is discussed with a view towards helping breeders define relevant selection criteria. This paper originates from a presentation at the 5th International Workshop on Functional–Structural Plant Models, Napier, New Zealand, November 2007.
Abstract:
Reduced supplies of nitrogen (N) in many soils of southern Queensland that were cropped exhaustively with cereals over many decades have been the focus of much research to avoid declines in profitability and sustainability of farming systems. A 45-month period of mixed grass (purple pigeon grass, Setaria incrassata Stapf; Rhodes grass, Chloris gayana Kunth.) and legume (lucerne, Medicago sativa L.; annual medics, M. scutellata L. Mill. and M. truncatula Gaertn.) pasture was one of several options that were compared at a fertility-depleted Vertosol at Warra, southern Queensland, to improve grain yields or increase grain protein concentration of subsequent wheat crops. Objectives of the study were to measure the productivity of a mixed grass and legume pasture grown over 45 months (cut and removed over 36 months) and its effects on yield and protein concentrations of the following wheat crops. Pasture production (DM t/ha) and aboveground plant N yield (kg/ha) for grass, legume (including a small amount of weeds) and total components of pasture responded linearly to total rainfall over the duration of each of 3 pastures sown in 1986, 1987 and 1988. Averaged over the 3 pastures, each 100 mm of rainfall resulted in 0.52 t/ha of grass, 0.44 t/ha of legume and 0.97 t/ha of total pasture DM, there being little variation between the 3 pastures. Aboveground plant N yield of the 3 pastures ranged from 17.2 to 20.5 kg/ha per 100 mm rainfall. Aboveground legume N in response to total rainfall was similar (10.6-13.2 kg/ha per 100 mm rainfall) across the 3 pastures in spite of very different populations of legumes and grasses at establishment. Aboveground grass N yield was 5.2-7.0 kg/ha per 100 mm rainfall. In most wheat crops following pasture, wheat yields were similar to that of unfertilised wheat except in 1990 and 1994, when grain yields were significantly higher but similar to that for continuous wheat fertilised with 75 kg N/ha.
In contrast, grain protein concentrations of most wheat crops following pasture responded positively, being substantially higher than those of unfertilised wheat but similar to those of wheat fertilised with 75 kg N/ha. Grain protein averaged over all years of assay was increased by 25-40% compared with that of unfertilised wheat. Stored water supplies after pasture were < 134 mm (< 55% of plant available water capacity); for most assay crops water storages were 67-110 mm, an equivalent wet soil depth of only 0.3-0.45 m. Thus, the crop assays of pasture benefits were limited by low water supply to wheat crops. Moreover, the severity of common root rot in wheat crops was not reduced by the pasture-wheat rotation.
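As in the lucerne study above, the pasture rainfall responses are linear. The coefficients below are the abstract's own averages over the 3 pastures (t/ha DM per 100 mm rainfall); note the grass and legume components do not sum exactly to the quoted total because of rounding. The function name and 600 mm example are illustrative:

```python
# Pasture dry-matter response to rainfall, using the abstract's averaged
# coefficients (t/ha DM per 100 mm of rainfall over the pasture's duration).
DM_PER_100MM = {"grass": 0.52, "legume": 0.44, "total": 0.97}

def pasture_dm_t_ha(component: str, rainfall_mm: float) -> float:
    return DM_PER_100MM[component] * rainfall_mm / 100

# e.g. a hypothetical 600 mm of rainfall:
print(pasture_dm_t_ha("total", 600), "t/ha total pasture DM")
```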
Abstract:
Tick fever is an important disease of cattle where Rhipicephalus (Boophilus) microplus acts as a vector for the three causal organisms Babesia bovis, Babesia bigemina and Anaplasma marginale. Bos indicus cattle and their crosses are more resistant to the clinical effects of infection with B. bovis and B. bigemina than are Bos taurus cattle. Resistance is not complete, however, and herds of B. indicus-cross cattle are still at risk of babesiosis in environments where exposure to B. bovis is light in most years but occasionally high. The susceptibility of B. indicus cattle and their crosses to infection with A. marginale is similar to that of B. taurus cattle. In herds of B. indicus cattle and their crosses the infection rate of Babesia spp. and A. marginale is lowered because fewer ticks are likely to attach per day due to reduced numbers of ticks in the field (long-term effect on population, arising from high host resistance) and because a smaller proportion of ticks that do develop to feed on infected cattle will in turn be infected (due to lower parasitaemia). As a consequence, herds of B. indicus cattle are less likely than herds of B. taurus cattle to have high levels of population immunity to babesiosis or anaplasmosis. The effects of acaricide application on the probability of clinical disease due to anaplasmosis and babesiosis are unpredictable and dependent on the prevalence of infection in ticks and in cattle at the time of application. Attempting to manipulate population immunity through the toleration of specific threshold numbers of ticks with the aim of controlling tick fever is not reliable and the justification for acaricide application should be for the control of ticks rather than for tick fever. Vaccination of B. indicus cattle and their crosses is advisable in all areas where ticks exist, although vaccination against B. bigemina is probably not essential in pure B. indicus animals.
Abstract:
Plugs or containerized plants can offer several advantages over traditional bare-rooted runner plants for strawberry (Fragaria x ananassa) production. Some of these benefits include easier planting, better establishment, fewer pests and diseases, and lower water use during plant establishment, resulting in less leaching of applied fertilizers. Plugs also offer the potential for mechanical planting. In some areas of Europe and North America, plugs provide earlier production, greater productivity and larger fruit than runners. Research has also shown that the plants can be grown under short days and low temperatures to manipulate flower initiation and fruiting. Plugs are more expensive to buy than runner plants, and will only be adopted by industry if the extra costs are matched by convenience, resource conservation, increased fruiting and returns to producers. We investigated the productivity of 'Festival' and 'Sugarbaby' propagated as plugs (75 cm3 containers) and runners from Stanthorpe in southern Queensland (elevation of 872 m), and grown at Nambour on the Sunshine Coast (elevation 29 m). At planting, the plug plants weighed 0.8 ± 0.1 g DW compared with 5.3 ± 0.5 g DW for the runner plants. 'Sugarbaby' plugs were larger than 'Festival' plugs (3.3 ± 0.6 g versus 2.9 ± 0.6 g). The differences in growth at planting were maintained until the third week of July (day 94), with the plug plants weighing 17.8 ± 2.2 g, and the runner plants 21.4 ± 2.3 g. The proportion of plant dry matter allocated to the leaves increased over time from 59 to 70%, while the proportion allocated to the roots decreased from 21 to 10%. Harvest commenced after 60 days, with the plug plants yielding only 60% of the yields of the runner plants up until 8 August or day 109 (14.2 ± 1.4 g plant-1 week-1 versus 23.6 ± 1.9 g plant-1 week-1). 'Festival' (22.2 ± 2.0 g plant-1 week-1) had higher yields than 'Sugarbaby' (15.5 ± 1.5 g plant-1 week-1), even though plants of the latter were larger.
Average fruit weight was 15.6 ± 0.3 g, with no effect of cultivar, plant type or harvest time. In other words, the differences in yield between the various treatments were due to differences in fruit set. The lower yields of the plug plants probably reflect their small size at planting. Future research should determine whether plugs grown in larger cells (150 to 300 cm3, as in the USA and Europe) are more productive. Tips to be grown in larger containers should be harvested earlier than those for small cells to maximize root growth of the plug plant. This will probably extend the time required between harvesting the tips and potting them from the current four to five weeks to eight to ten weeks.
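The 60% figure quoted above can be recovered directly from the weekly yields reported for plug and runner plants:

```python
# Ratio of plug-plant to runner-plant weekly yield, from the abstract's
# reported figures (g per plant per week, up to day 109).
plug_yield = 14.2
runner_yield = 23.6

ratio = plug_yield / runner_yield
print(f"Plug yield is {ratio:.0%} of runner yield")  # → 60%
```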