16 results for Evaporative water loss
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Sheep and cattle are frequently subjected to feed and water deprivation (FWD) for about 12 h before, and then during, transport to reduce digesta load in the gastrointestinal tract. This FWD is marked by weight loss, mainly as urine and faeces in the first 24 h, continuing at a reduced rate thereafter. The weight of rumen contents falls, although water loss is to some extent masked by saliva inflow. FWD is associated with some stress, particularly when transportation is added. This is indicated by increased levels of plasma cortisol that may be partly responsible for an observed increase in the output of water and N in urine and faeces. Loss of body water induces dehydration that may induce thirst through effects on hypothalamic structures via the renin-angiotensin-aldosterone system. There are suggestions that elevated cortisol levels depress angiotensin activity and prevent sensations of thirst in dehydrated animals, but further research in this area is needed. Dehydration coupled with the discharge of Na in urine challenges the maintenance of homeostasis. In FWD, Na excretion in urine is reduced and, with the reduction in digesta load, Na is gradually returned from the digestive tract to the extracellular fluid space. Control of enteropathogenic bacteria by normal rumen microbes is weakened by FWD, and resulting infections may threaten animal health and meat safety. Recovery time is required after transport to restore full feed intake and to ensure that adequate glycogen is present in muscle pre-slaughter to maintain meat quality.
Abstract:
Macadamias, adapted to the fringes of subtropical rainforests of coastal, eastern Australia, are resilient to mild water stress. Even after prolonged drought, it is difficult to detect stress in commercial trees. Despite this, macadamia orchards in newer irrigated regions produce more consistent crops than those from traditional, rain-fed regions. Crop fluctuations in the latter tend to follow rainfall patterns. The benefit of irrigation in lower rainfall areas is undisputed, but there are many unanswered questions about the most efficient use of irrigation water. Water is used more efficiently when it is less readily available, causing partial stomatal closure that restricts transpiration more than it restricts photosynthesis. Limited research suggests that macadamias can withstand mild stress; in fact, water use efficiency can be increased by strategic deficit irrigation. However, macadamias are susceptible to stress during oil accumulation. There may be benefits in applying more water at critical times and less at others, and this may vary with cultivar. Currently, it is common for macadamia growers to apply about 20-40 L tree⁻¹ day⁻¹ of water to their orchards in winter and 70-90 L tree⁻¹ day⁻¹ in summer. Research using the Granier sap flow technique reported water use of 20-30 L tree⁻¹ day⁻¹ during winter and 40-50 L tree⁻¹ day⁻¹ in summer. The discrepancy between actual water use and farmer practice may be due to water loss via evaporation from the ground, deep drainage and/or greater transpiration due to luxury water consumption. More irrigation research is needed to develop efficient water use and to set practical limits for deficit irrigation management.
Abstract:
Small hive beetles (SHBs) are a global pest of European honeybee colonies. In the laboratory, the survival of adult SHBs was evaluated in relation to relative humidity (RH = 56, 64, 73, 82 and 96 %) and treatment with diatomaceous earth (DE) across 4 days. Low RH reduced survival. The application of DE reduced survival in addition to RH. Adults treated with corn flour (control) showed no difference in survival from untreated beetles. Scanning electron microscopy images showed no scarification of adult beetle cuticle after exposure to DE; therefore, water loss is likely facilitated through non-abrasive means such as the adsorption of cuticular lipids. The data agree with the hypothesis that DE causes mortality through water loss from treated insects. Egress, ingress, mortality and the egg-laying behaviours of beetles were observed in relation to a popular in-hive trench trap with and without the addition of DE. Traps filled with DE resulted in 100 % mortality of beetles compared with 8.6 % mortality when no DE was present. A simple method for visually determining beetle sex was used and documented.
Abstract:
In determining vase life (VL), it is often not considered that the measured VL in a particular experiment may depend greatly on both the preharvest and evaluation environmental conditions. This makes comparison between studies difficult and may lead to erroneous interpretation of results. In this review, we critically discuss the effect of the growth environment on the VL of cut roses. This effect is mainly related to changes in stomatal responsiveness, regulating water loss, whereas cut flower carbohydrate status appears less critical. When comparing cultivars, postharvest water loss and VL often show no correlation, indicating that components such as variation in tissue resistance to cavitation and/or collapse at low water potential play an important role in the incidence of water stress symptoms. The effect of the growth environment on these components remains unknown. Botrytis cinerea sporulation and infection, as well as cut rose susceptibility to the pathogen, are also affected by the growth environment, with the latter being largely unexplored. Wide variability in the choices made with respect to the experimental setup (harvest/conditioning methods, test room conditions and VL-terminating symptoms) is reported. We highlight that these decisions, though frequently overlooked, influence the outcome of the study. Specifications for each of these factors are proposed as necessary to achieve a common VL protocol. Documentation of both preharvest conditions and a number of postharvest factors, including the test room conditions, is recommended not only for assisting comparisons between studies, but also to identify factors with major effects on VL.
Abstract:
The effect of a pre-shipment hypochlorite treatment on botrytis incidence was evaluated in a large number of rose cultivars and under different long-term storage conditions. Application parameters, stability and sources of hypochlorite were investigated. Irrespective of the type of packaging and shipment conditions, roses that received a pre-shipment treatment with 100 to 150 mg/L hypochlorite showed a significantly decreased botrytis incidence compared to non-hypochlorite-treated roses. The hypochlorite treatment was generally more effective than a comparable treatment with commercial fungicides. Dipping the flower heads for approximately one second in a hypochlorite solution was more effective than spraying the heads. In a few cases, minor hypochlorite-induced damage on the petal tips was observed at higher concentrations (>200 mg/L). Apart from the effect on botrytis incidence, the treatment resulted in reduced water loss, which may have an additional beneficial effect on the eventual flower quality. It is concluded that, apart from other obvious measures to reduce botrytis incidence (prevention of high humidity at the flower heads), a pre-shipment floral dip in 100 to 150 mg/L hypochlorite from commercial household bleach is an easy and cost-effective way to reduce botrytis incidence following long-term storage/transportation of roses. © 2015, International Society for Horticultural Science. All rights reserved.
Abstract:
Postharvest treatments with nano-silver (NS) alleviate bacteria-related stem blockage of some cut flowers to extend their longevity. Gladiolus (Gladiolus hybridus) is a commercially important cut flower species. For the first time, the effects of NS pulses on cut gladiolus ‘Eerde’ spikes were investigated with the aim of reducing bacterial colonization of and biofilm formation on their stems. As compared with a deionized water (DIW) control, pulse treatments with NS at 10, 25 and 50 mg L−1 for 24 h significantly (P ≤ 0.05) prolonged the vase life of cut gladiolus spikes moved into vases containing DIW. The NS treatments enhanced floret ‘opening rate’ and ‘daily ornamental value’. Although there were no significant differences among NS treatments, a 25 mg L−1 NS pulse treatment tended to give the longest vase life and the best ‘display quality’. All NS pulse treatments significantly improved water uptake by and reduced water loss from flowering spikes, thereby delaying the loss of water balance and maintaining relative fresh weight. Fifty (50) mg L−1 NS pulse-treated cut gladiolus spikes tended to exhibit the most water uptake and highest water balance over the vase period. However, there was no significant difference between 25 and 50 mg L−1 NS pulse treatments. Observations of stem-end bacterial proliferation during the vase period on cut gladiolus spikes either with or without NS pulse treatments were performed by confocal laser scanning microscopy (CLSM) and scanning electron microscopy (SEM). Compared with the control treatment, these observations revealed that the 25 mg L−1 NS pulse treatment effectively inhibited bacterial colonization and biofilm formation on the stem-end cut surface and in the xylem vessels, respectively. In vitro culture of the bacterial microflora and analysis of biofilm architecture using CLSM revealed that NS treatment restricted bacterial biofilm formation.
After static culture for 24 h at 35 °C with 25 mg L−1 NS in the medium, no biofilm structure was evident; only limited numbers of bacterial cells and scant extracellular polysaccharide (EPS) material were observed. In contrast, mature bacterial biofilm architecture comprising abundant bacteria interwoven with EPS formed in the absence of NS.
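The water-relations metrics tracked in vase-life studies like this one reduce to simple bookkeeping: daily water balance is uptake minus loss, and relative fresh weight is expressed against the initial weight. The sketch below illustrates that arithmetic; all numeric values are hypothetical, not data from the study.

```python
# Illustrative water-balance bookkeeping for cut-flower vase-life work.
# Water balance = uptake - loss; relative fresh weight = % of initial weight.
# All numbers are made up for demonstration.

def water_balance(uptake_g, loss_g):
    """Daily water balance (g per spike): positive means net water gain."""
    return uptake_g - loss_g

def relative_fresh_weight(current_g, initial_g):
    """Fresh weight as a percentage of the initial fresh weight."""
    return 100.0 * current_g / initial_g

# One hypothetical day for a single spike
uptake, loss = 12.5, 10.0                 # g water per spike per day
print(water_balance(uptake, loss))        # 2.5 -> net gain for the day
print(relative_fresh_weight(102.5, 100.0))  # 102.5 -> % of initial weight
```

A spike "loses water balance" when this difference turns negative for sustained periods, which is the decline the NS pulse treatments delayed.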
Abstract:
Runoff and sediment loss from forest roads were monitored for a two-year period in a Pinus plantation in southeast Queensland. Two classes of road were investigated: a gravelled road, which is used as a primary daily haulage route for the logging area, and an ungravelled road, which provides the main access route for individual logging compartments and is intensively used as a haulage route only during the harvest of these areas (approximately every 30 years). Both roads were subjected to routine traffic loads and maintenance during the study. Surface runoff in response to natural rainfall was measured and samples were taken for the determination of sediment and nutrient (total nitrogen, total phosphorus, dissolved organic carbon and total iron) loads from each road. Results revealed that the mean runoff coefficient (runoff depth/rainfall depth) was consistently higher for the gravelled road plot (0.57) than for the ungravelled road plot (0.38). Total sediment loss over the two-year period was also greater from the gravelled road plot, at 5.7 t km−1 compared with 3.9 t km−1 from the ungravelled road plot. Suspended solids contributed 86% of the total sediment loss from the gravelled road, and 72% from the ungravelled road, over the two years. Nitrogen loads from the two roads were relatively constant throughout the study, averaging 5.2 and 2.9 kg km−1 from the gravelled and ungravelled road, respectively. Mean annual phosphorus loads were 0.6 kg km−1 from the gravelled road and 0.2 kg km−1 from the ungravelled road. Organic carbon and total iron loads increased in the second year of the study, which was a much wetter year, and are thought to reflect the breakdown of organic matter in roadside drains and increased sediment generation, respectively. When road and drain maintenance (grading) was performed, runoff and sediment loss increased from both road types.
Additionally, the breakdown of the gravel road base due to high traffic intensity during wet conditions resulted in the formation of deep (10 cm) ruts which increased erosion. The Water Erosion Prediction Project (WEPP):Road model was used to compare predicted to observed runoff and sediment loss from the two road classes investigated. For individual rainfall events, WEPP:Road predicted output showed strong agreement with observed values of runoff and sediment loss. WEPP:Road predictions for annual sediment loss from the entire forestry road network in the study area also showed reasonable agreement with the extrapolated observed values.
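The runoff coefficient used in this abstract is defined directly in the text as runoff depth divided by rainfall depth. A minimal sketch of that calculation, using the reported plot means as targets and a hypothetical 50 mm storm:

```python
# Runoff coefficient = runoff depth / rainfall depth (dimensionless).
# The per-event depths below are illustrative; 0.57 and 0.38 are the
# reported mean coefficients for the gravelled and ungravelled plots.

def runoff_coefficient(runoff_mm, rainfall_mm):
    """Fraction of rainfall depth leaving the plot as surface runoff."""
    if rainfall_mm <= 0:
        raise ValueError("rainfall depth must be positive")
    return runoff_mm / rainfall_mm

# Hypothetical 50 mm rainfall event
print(runoff_coefficient(28.5, 50.0))  # 0.57 -> gravelled road mean
print(runoff_coefficient(19.0, 50.0))  # 0.38 -> ungravelled road mean
```

The WEPP:Road comparisons in the study operate on the same per-event runoff and sediment quantities, predicted rather than measured.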
Abstract:
Increased sediment and nutrient losses resulting from unsustainable grazing management in the Burdekin River catchment are major threats to water quality in the Great Barrier Reef Lagoon. To test the effects of grazing management on soil and nutrient loss, five 1 ha mini-catchments were established in 1999 under different grazing strategies on a sedimentary landscape near Charters Towers. Reference samples were also collected from watercourses in the Burdekin catchment during major flow events. Soil and nutrient loss were relatively low across all grazing strategies due to a combination of good cover, low slope and low rainfall intensities. Total soil loss varied from 3 to 20 kg ha⁻¹ per event while losses of N and P ranged from 10 to 1900 g ha⁻¹ and from 1 to 71 g ha⁻¹ per event respectively. Water quality of runoff was considered moderate across all strategies with relatively low levels of total suspended sediment (range: 8-1409 mg L⁻¹), total N (range: 101-4000 µg L⁻¹) and total P (range: 14-609 µg L⁻¹). However, treatment differences are likely to emerge with time as the impacts of the different grazing strategies on land condition become more apparent. Samples collected opportunistically from rivers and creeks during flow events displayed significantly higher levels of total suspended sediment (range: 10-6010 mg L⁻¹), total N (range: 650-6350 µg L⁻¹) and total P (range: 50-1500 µg L⁻¹) than those collected at the grazing trial. These differences can largely be attributed to variation in slope, geology and cover between the grazing trial and different catchments. In particular, watercourses draining hillier, grano-diorite landscapes with low cover had markedly higher sediment and nutrient loads compared to those draining flatter, sedimentary landscapes. These preliminary data suggest that on relatively flat, sedimentary landscapes, extensive cattle grazing is compatible with achieving water quality targets, provided high levels of ground cover are maintained.
In contrast, sediment and nutrient loss under grazing on more erodible land types is cause for serious concern. Long-term empirical research and monitoring will be essential to quantify the impacts of changed land management on water quality in the spatially and temporally variable Burdekin River catchment.
Abstract:
The loss and recovery of intertidal seagrass meadows were assessed following the flood-related catastrophic loss of seagrass meadows in February 1999 in the Sandy Strait, Queensland. Region-wide recovery rates of intertidal meadows following the catastrophic disturbance were assessed by mapping seagrass abundance in the northern Great Sandy Strait region prior to and on 3 occasions after widespread loss of seagrass. Meadow-scale assessments of seagrass loss and recovery focussed on two existing Zostera capricorni monitoring meadows in the region. Mapping surveys showed that approximately 90% of intertidal seagrasses in the northern Great Sandy Strait disappeared after the February 1999 flooding of the Mary River. Full recovery of all seagrass meadows took 3 years. At the two study sites (Urangan and Wanggoolba Creek), the onset of Z. capricorni germination following the loss of seagrass occurred 14 months post-flood at Wanggoolba Creek, while at Urangan it took 20 months for germination to occur. By February 2001 (24 months post-flood) seagrass abundance at Wanggoolba Creek sites was comparable to pre-flood abundance levels, and full recovery at Urangan sites was complete in August 2001 (31 months post-flood). Reduced water quality, characterised by 2–3 fold increases in turbidity and nutrient concentrations during the 6 months following the flood, was followed by a 95% loss of seagrass meadows in the region. Reductions in available light due to increased flood-associated turbidity in February 1999 were the likely cause of seagrass loss in the Great Sandy Strait region, southern Queensland. Although seasonal cues influence the germination of Z. capricorni, the temporal variation in the onset of seed germination between sites suggests that germination following seagrass loss may be dependent on other factors (e.g. physical and chemical characteristics of sediments and water).
Elevated dissolved nitrogen concentrations during 1999 at Wanggoolba Creek suggest that this site received higher loads of sediments and nutrients from flood waters than Urangan. The germination of seeds at Wanggoolba Creek one year prior to Urangan coincided with relatively low suspended sediment concentrations in Wanggoolba Creek waters. The absence of organic-rich sediments at Urangan for many months following their removal during the 1999 flood may also have inhibited seed germination. Data from population cohort analyses and population growth rates showed that rhizome weight and rhizome elongation rates increased over time, consistent with rapid growth during increases in temperature and light availability from May to October.
Abstract:
Physiological and genetic studies of leaf growth often focus on short-term responses, leaving a gap between these responses and whole-plant models that predict biomass accumulation, transpiration and yield at crop scale. To bridge this gap, we developed a model that combines an existing model of leaf expansion in response to short-term environmental variations with a model coordinating the development of all leaves of a plant. The latter was based on: (1) rates of leaf initiation, appearance and end of elongation measured in field experiments; and (2) the hypothesis of an independence of growth between leaves. The resulting whole-plant leaf model was integrated into the generic crop model APSIM, which provided dynamic feedback of environmental conditions to the leaf model and allowed simulation of crop growth at canopy level. The model was tested in 12 field situations with contrasting temperature, evaporative demand and soil water status. In observed and simulated data, high evaporative demand reduced leaf area at the whole-plant level, and short water deficits affected only leaves developing during the stress, either visible or still hidden in the whorl. The model adequately simulated whole-plant profiles of leaf area with a single set of parameters that applied to the same hybrid in all experiments. It was also suitable to predict biomass accumulation and yield of a similar hybrid grown in different conditions. This model extends to field conditions existing knowledge of the environmental controls of leaf elongation, and can be used to simulate how their genetic controls flow through to yield.
Abstract:
The present review identifies various constraints relating to poor adoption of ley-pastures in south-west Queensland, and suggests changes in research, development and extension efforts for improved adoption. The constraints include biophysical, economic and social constraints. In terms of biophysical constraints, first, shallower soil profiles with subsoil constraints (salt and sodicity), unpredictable rainfall, drier conditions with higher soil temperature and evaporative demand in summer, and frost and subzero temperature in winter, frequently result in a failure of established, or establishing, pastures. Second, there are limited options for legumes in a ley-pasture, with the legumes currently being mostly winter-active legumes such as lucerne and medics. Winter-active legumes are ineffective in improving soil conditions in a region with summer-dominant rainfall. Third, most grain growers are reluctant to include grasses in their ley-pasture mix, which can be uneconomical for various reasons, including nitrogen immobilisation, carryover of cereal diseases and depressed yields of the following cereal crops. Fourth, a severe depletion of soil water following perennial ley-pastures (grass + legumes or lucerne) can reduce the yields of subsequent crops for several seasons, and the practice of longer fallows to increase soil water storage may be uneconomical and damaging to the environment. Economic assessments of integrating medium- to long-term ley-pastures into cropping regions are generally less attractive because of reduced capital flow, increased capital investment, economic loss associated with establishment and termination phases of ley-pastures, and lost opportunities for cropping in a favourable season. Income from livestock on ley-pastures and soil productivity gains to subsequent crops in rotation may not be comparable to cropping when grain prices are high. 
However, the economic benefits of ley-pastures may be underestimated because of unaccounted environmental benefits such as enhanced water use and reduced soil erosion from summer-dominant rainfall, and this therefore requires further investigation. In terms of social constraints, the risk of poor and unreliable establishment and persistence, uncertainties in economic and environmental benefits, the complicated process of changing from crop to ley-pastures and vice versa, and the additional labour and management requirements of livestock present growers with a socially unattractive and complex decision-making process when considering adoption of existing medium- to long-term ley-pasture technology. It is essential that research, development and extension efforts consider that new ley-pasture options, such as incorporation of a short-term summer forage legume, need to be less risky in establishment, productive in a region with prevailing biophysical constraints, economically viable, less complex and highly flexible in the change-over processes, and socially attractive to growers for adoption in south-west Queensland.
Abstract:
Runoff, soil loss, and nutrient loss were assessed on a Red Ferrosol in tropical Australia over 3 years. The experiment was conducted using bounded, 100-m² field plots cropped to peanuts, maize, or grass. A bare plot, without cover or crop, was also established as an extreme treatment. Results showed the importance of cover in reducing runoff, soil loss, and nutrient loss from these soils. Runoff ranged from 13% of incident rainfall for the conventional cultivation to 29% under bare conditions during the highest rainfall year, and was well correlated with event rainfall and rainfall energy. Soil loss ranged from 30 t/ha/year under bare conditions to <6 t/ha/year under cropping. Nutrient losses of 35 kg N and 35 kg P/ha/year under bare conditions and 17 kg N and 11 kg P/ha/year under cropping were measured. Soil carbon analyses showed a relationship with treatment runoff, suggesting that soil properties influenced the rainfall-runoff response. The cropping systems model PERFECT was calibrated using runoff, soil loss, and soil water data. Runoff and soil loss showed good agreement with observed data in the calibration, and soil water and yield had reasonable agreement. Long-term runs using historical weather data showed the episodic nature of runoff and soil loss events in this region and emphasise the need to manage land using protective measures such as conservation cropping practices. Farmers involved in related, action-learning activities wished to incorporate conservation cropping findings into their systems but also needed clear production benefits to hasten practice change.
Abstract:
There is a world-wide trend for deteriorating water quality and light levels in the coastal zone, and this has been linked to declines in seagrass abundance. Localized management of seagrass meadow health requires that water quality guidelines for meeting seagrass growth requirements are available. Tropical seagrass meadows are diverse and can be highly dynamic, and we have used this dynamism to identify light thresholds in multi-specific meadows dominated by Halodule uninervis in the northern Great Barrier Reef, Australia. Seagrass cover was measured at ~3-month intervals from 2008 to 2011 at three sites: Magnetic Island (MI), Dunk Island (DI) and Green Island (GI). Photosynthetically active radiation was continuously measured within the seagrass canopy, and three light metrics were derived. Complete seagrass loss occurred at MI and DI, and at these sites changes in seagrass cover were correlated with the three light metrics. Mean daily irradiance (I_d) above 5 and 8.4 mol m⁻² d⁻¹ was associated with gains in seagrass at MI and DI, however a significant correlation (R = 0.649, p < 0.05) only occurred at MI. The second metric, percent of days below 3 mol m⁻² d⁻¹, correlated the most strongly (MI, R = -0.714, p < 0.01; DI, R = -0.859, p < 0.001) with change in seagrass cover, with 16-18% of days below 3 mol m⁻² d⁻¹ being associated with more than 50% seagrass loss. The third metric, the number of hours of light-saturated irradiance (H_sat), was calculated using literature-derived data on saturating irradiance (E_k). H_sat correlated well with change in seagrass abundance (MI, R = 0.686, p < 0.01; DI, R = 0.704, p < 0.05), and was very consistent between the two sites: 4 H_sat was associated with increases in seagrass abundance at both sites, and less than 4 H_sat with more than 50% loss.
At the third site (GI), small seasonal losses of seagrass recovered quickly during the growth season and the light metrics did not correlate (p > 0.05) with change in percent cover, except for I_d, which was always high but nevertheless correlated with change in seagrass cover. Although distinct light thresholds were observed, the departure from threshold values was also important. For example, light levels well below the thresholds resulted in more severe loss of seagrass than those just below the threshold. Environmental managers aiming to achieve optimal seagrass growth conditions can use these threshold light metrics as guidelines; however, other environmental conditions, including seasonally varying temperature and nutrient availability, will influence seagrass responses above and below these thresholds. © 2012 Published by Elsevier Ltd.
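The three light metrics in this abstract are straightforward to compute from within-canopy PAR records. The sketch below shows one plausible formulation, assuming a series of daily PAR totals for I_d and the percent-of-days metric, and an hourly profile with a saturating irradiance E_k for H_sat; all input values and the E_k figure are illustrative, not data from the study.

```python
# Illustrative computation of the three canopy-light metrics described above.
# Input series and the E_k value are hypothetical.

def mean_daily_irradiance(daily_par):
    """I_d: mean of daily PAR totals (mol m^-2 d^-1)."""
    return sum(daily_par) / len(daily_par)

def percent_days_below(daily_par, threshold=3.0):
    """Percent of days with total PAR below the threshold (mol m^-2 d^-1)."""
    below = sum(1 for d in daily_par if d < threshold)
    return 100.0 * below / len(daily_par)

def hours_saturated(hourly_par, e_k):
    """H_sat: hours per day with irradiance at or above saturation E_k."""
    return sum(1 for h in hourly_par if h >= e_k)

daily = [6.2, 2.1, 8.4, 1.5, 5.0]              # mol m^-2 d^-1, hypothetical
hourly = [0, 50, 120, 300, 450, 400, 250, 90]  # umol m^-2 s^-1, hypothetical
print(mean_daily_irradiance(daily))   # 4.64
print(percent_days_below(daily))      # 40.0 (2 of 5 days below 3)
print(hours_saturated(hourly, 200))   # 4 hours at/above E_k = 200
```

In the study, thresholds on exactly these quantities (e.g. 16-18% of days below 3 mol m⁻² d⁻¹, or fewer than 4 H_sat) separated meadows that gained cover from those that lost more than half of it.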
Abstract:
The aim of this review is to report changes in irrigated cotton water use from research projects and on-farm practice-change programs in Australia, in relation to both plant-based and irrigation engineering disciplines. At least 80% of the Australian cotton-growing area is irrigated using gravity surface-irrigation systems. This review found that, over 23 years, cotton crops utilise 6-7 ML/ha of irrigation water, depending on the amount of seasonal rain received. The seasonal evapotranspiration of surface-irrigated crops averaged 729 mm over this period. Over the past decade, water-use productivity by Australian cotton growers has improved by 40%. This has been achieved by both yield increases and more efficient water-management systems. The whole-farm irrigation efficiency index improved from 57% to 70%, and the crop water use index is >3 kg/mm.ha, high by international standards. Yield increases over the last decade can be attributed to plant-breeding advances, the adoption of genetically modified varieties, and improved crop management. Also, there has been increased use of irrigation scheduling tools and furrow-irrigation system optimisation evaluations. This has reduced in-field deep-drainage losses. The largest loss component of the farm water balance on cotton farms is evaporation from on-farm water storages. Some farmers are changing to alternative systems such as centre pivots and lateral-move machines, and increasing numbers of these alternatives are expected. These systems can achieve considerable labour and water savings, but have significantly higher energy costs associated with water pumping and machine operation. The optimisation of interactions between water, soils, labour, carbon emissions and energy efficiency requires more research and on-farm evaluations.
Standardisation of water-use efficiency measures and improved water measurement techniques for surface irrigation are important research outcomes to enable valid irrigation benchmarks to be established and compared. Water-use performance is highly variable between cotton farmers and farming fields and across regions. Therefore, site-specific measurement is important. The range in the presented datasets indicates potential for further improvement in water-use efficiency and productivity on Australian cotton farms.
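The crop water use index benchmark cited above (>3 kg/mm.ha) is yield per unit of crop evapotranspiration. A minimal sketch of that ratio, using the reported 729 mm seasonal ET and a hypothetical lint yield:

```python
# Crop water use index (CWUI) = yield / evapotranspiration, in kg/mm.ha.
# The 729 mm ET is the reported seasonal average; the yield is hypothetical.

def crop_water_use_index(yield_kg_per_ha, et_mm):
    """Lint yield produced per mm of crop water use, per hectare."""
    return yield_kg_per_ha / et_mm

cwui = crop_water_use_index(2300.0, 729.0)  # hypothetical 2300 kg/ha lint
print(round(cwui, 2))  # 3.16, above the >3 kg/mm.ha benchmark
```

The whole-farm irrigation efficiency index mentioned alongside it is a different ratio (water delivered to the crop versus water diverted onto the farm), which is why the two figures are reported separately.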
Abstract:
Pratylenchus thornei is a major pathogen of wheat crops in the northern grain region of eastern Australia, with an estimated annual yield loss of $38 million. Damaged crops show symptoms of water and nutrient stress that suggest uptake is significantly affected. To understand the mechanisms involved in reducing water uptake and consequently plant yield, detailed measurements of water extraction and leaf area were conducted on a range of wheat cultivars with differing levels of tolerance and resistance to P. thornei.