8 results for 690202 Coastal water transport

in eResearch Archive - Queensland Department of Agriculture


Relevance: 100.00%

Abstract:

Maintaining a high rate of water uptake is crucial for maximum longevity of cut stems. Physiological gel/tylosis formation decreases water transport efficiency in the xylem. The primary mechanism of action for post-harvest Cu2+ treatments in improving cut flower and foliage longevity has been elusive. The effect of Cu2+ on wound-induced xylem vessel occlusion was investigated for Acacia holosericea A. Cunn. ex G. Don. Experiments were conducted using a Cu2+ pulse (5 h, 2.2 mM) and a Cu2+ vase solution (0.5 mM) versus a deionized water (DIW) control. Development of xylem blockage in the stem-end region 10 mm proximal to the wounded stem surface was examined over 21 days by light and transmission electron microscopy. Xylem vessels of stems stood in DIW were occluded with gels secreted into vessel lumens via pits from surrounding axial parenchyma cells. Gel secretion was initiated within 1-2 days post-wounding and gels were detected in the xylem from day 3. In contrast, Cu2+ treatments disrupted the surrounding parenchyma cells, thereby inhibiting gel secretion and keeping the vessel lumens free of occlusions. The Cu2+ treatments significantly improved water uptake by the cut stems compared with the control. © 2013 Scandinavian Plant Physiology Society.

Relevance: 40.00%

Abstract:

Runoff and sediment loss from forest roads were monitored for a two-year period in a Pinus plantation in southeast Queensland. Two classes of road were investigated: a gravelled road, which is used as a primary daily haulage route for the logging area, and an ungravelled road, which provides the main access route for individual logging compartments and is intensively used as a haulage route only during the harvest of these areas (approximately every 30 years). Both roads were subjected to routine traffic loads and maintenance during the study. Surface runoff in response to natural rainfall was measured and samples taken for the determination of sediment and nutrient (total nitrogen, total phosphorus, dissolved organic carbon and total iron) loads from each road. Results revealed that the mean runoff coefficient (runoff depth/rainfall depth) was consistently higher for the gravelled road plot (0.57) than for the ungravelled road (0.38). Total sediment loss over the two-year period was greater from the gravelled road plot (5.7 t km−1) than from the ungravelled road plot (3.9 t km−1). Suspended solids contributed 86% of the total sediment loss from the gravelled road, and 72% from the ungravelled road, over the two years. Nitrogen loads from the two roads were both relatively constant throughout the study, averaging 5.2 and 2.9 kg km−1 from the gravelled and ungravelled road, respectively. Mean annual phosphorus loads were 0.6 kg km−1 from the gravelled road and 0.2 kg km−1 from the ungravelled road. Organic carbon and total iron loads increased in the second year of the study, which was a much wetter year, and are thought to reflect the breakdown of organic matter in roadside drains and increased sediment generation, respectively. When road and drain maintenance (grading) was performed, runoff and sediment loss increased from both road types. Additionally, the breakdown of the gravel road base due to high traffic intensity during wet conditions resulted in the formation of deep (10 cm) ruts, which increased erosion. The Water Erosion Prediction Project (WEPP):Road model was used to compare predicted with observed runoff and sediment loss from the two road classes investigated. For individual rainfall events, WEPP:Road predicted output showed strong agreement with observed values of runoff and sediment loss. WEPP:Road predictions for annual sediment loss from the entire forestry road network in the study area also showed reasonable agreement with the extrapolated observed values.
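As a quick reference, the sketch below shows how the two headline metrics in this abstract are computed: the runoff coefficient (runoff depth divided by rainfall depth) and sediment loss expressed per kilometre of road. The function names and the example rainfall and runoff depths are hypothetical, not values reported in the study.

```python
# Minimal sketch (illustrative values only, not data from the study).

def runoff_coefficient(runoff_depth_mm: float, rainfall_depth_mm: float) -> float:
    """Runoff coefficient = runoff depth / rainfall depth for an event or a period."""
    return runoff_depth_mm / rainfall_depth_mm

def sediment_load_per_km(sediment_mass_t: float, road_length_km: float) -> float:
    """Express total sediment loss as tonnes per kilometre of road."""
    return sediment_mass_t / road_length_km

# Hypothetical event: 45 mm of rain producing 25.7 mm of runoff from the gravelled plot,
# which gives a runoff coefficient close to the 0.57 mean reported above.
print(round(runoff_coefficient(25.7, 45.0), 2))
```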

Relevance: 40.00%

Abstract:

Sheep and cattle are frequently subjected to feed and water deprivation (FWD) for about 12 h before, and then during, transport to reduce digesta load in the gastrointestinal tract. FWD is marked by weight loss as urine and faeces, mainly in the first 24 h but continuing at a reduced rate thereafter. The weight of rumen contents falls, although water loss is to some extent masked by saliva inflow. FWD is associated with some stress, particularly when transportation is added. This is indicated by increased levels of plasma cortisol, which may be partly responsible for an observed increase in the output of water and N in urine and faeces. Loss of body water induces dehydration, which may induce feelings of thirst through effects on hypothalamic structures via the renin-angiotensin-aldosterone system. There are suggestions that elevated cortisol levels depress angiotensin activity and prevent sensations of thirst in dehydrated animals, but further research in this area is needed. Dehydration coupled with the discharge of Na in urine challenges the maintenance of homeostasis. In FWD, Na excretion in urine is reduced and, with the reduction in digesta load, Na is gradually returned from the digestive tract to the extracellular fluid space. Control of enteropathogenic bacteria by normal rumen microbes is weakened by FWD, and the resulting infections may threaten animal health and meat safety. Recovery time is required after transport to restore full feed intake and to ensure that adequate glycogen is present in muscle pre-slaughter to maintain meat quality.

Relevance: 30.00%

Abstract:

Land application of piggery effluent (containing urine, faeces, water, and wasted feed) is under close scrutiny as a potential source of water resource contamination with phosphorus (P). This paper investigates two case studies of the impact of long-term piggery effluent-P application to soil. A Natrustalf (Sodosol) at P1 has received a net load of 3700 kg effluent P/ha over 19 years. The Haplustalf (Dermosol) selected (P2) has received a net load of 310 000 kg P/ha over 30 years. Total, bicarbonate-extractable, and soluble P forms were determined throughout the soil profiles for paired (irrigated and unirrigated) sites at P1 and P2, as well as P sorption and desorption characteristics. Surface bicarbonate-extractable P (PB, 0-0.05 m depth) and dilute CaCl2-extractable molybdate-reactive P (PC) were significantly elevated by effluent irrigation (P1: PB unirrigated 23±1, irrigated 290±6; PC unirrigated 0.03±0.00, irrigated 23.9±0.2. P2: PB unirrigated 72±48, irrigated 3950±1960; PC unirrigated 0.7±0.0, irrigated 443±287 mg P/kg; mean±s.d.). Phosphorus enrichment to 1.5 m, detected as PB, was observed at P2. Elevated concentrations of CaCl2-extractable organic P forms (POC; estimated as non-molybdate-reactive P in centrifuged supernatants) were observed from the soil surface of P1 to a depth of 0.4 m. Despite the extent of effluent application at both of these sites, only P1 displayed evidence of significant accumulation of POC. The increase in surface soil total P (0-0.05 m depth) due to effluent irrigation was much greater than laboratory P sorption (>25 times for P1; >57 times for P2) for a comparable range of final solution concentrations (desorption extracts ranged from 1-5 mg P/L for P1 and 50-80 mg P/L for P2). Precipitation of sparingly soluble P phases was evident in the soils of the P2 effluent application area.

Relevance: 30.00%

Abstract:

In Australia, communities are concerned about atrazine being detected in drinking water supplies. It is important to understand the mechanisms by which atrazine is transported from paddocks to waterways if we are to reduce movement of agricultural chemicals from the site of application. Two paddocks cropped with grain sorghum on a Black Vertosol were monitored for atrazine, potassium chloride (KCl) extractable atrazine, desethylatrazine (DEA), and desisopropylatrazine (DIA) at 4 soil depths (0-0.05, 0.05-0.10, 0.10-0.20, and 0.20-0.30 m) and in runoff water and runoff sediment. Atrazine + DEA + DIA (total atrazine) had a half-life in soil of 16-20 days, a more rapid dissipation than in many earlier reports. Atrazine extracted in dilute potassium chloride, considered available for weed control, was initially 34% of the total and had a half-life of 15-20 days until day 30, after which it dissipated rapidly with a half-life of 6 days. We conclude that, in this region, atrazine may not pose a risk for groundwater contamination, as only 0.5% of applied atrazine moved deeper than 0.20 m into the soil, where it dissipated rapidly. In runoff (including suspended sediment), atrazine concentrations were greatest during the first runoff event (57 days after application; 85 μg/L) and declined with time. After 160 days, the total atrazine lost in runoff was 0.4% of the initial application. The total atrazine concentration in runoff was strongly related to the total concentration in soil, as expected. Even after 98% of the KCl-extractable atrazine had dissipated (and no longer provided weed control), runoff concentrations still exceeded the human health guideline value of 40 μg/L. For total atrazine in soil (0-0.05 m), the coefficient of soil sorption (Kd) ranged from 1.9 to 28.4 mL/g and the soil organic carbon sorption coefficient (KOC) from 100 to 2184 mL/g, increasing with time of contact with the soil and rapid dissipation of the more soluble, available phase. Partition coefficients in runoff for total atrazine were initially 3, increasing to 32 and 51 with time, with values for DEA being half these. To minimise atrazine losses, cultural practices should be adopted that maximise rain infiltration (and thereby minimise runoff) and minimise concentrations at the soil surface.
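For readers unfamiliar with the quantities above, the following sketch shows how a first-order half-life and the Kd and KOC sorption coefficients are calculated. The rate constant, concentrations and organic carbon fraction are hypothetical values chosen only to fall near the ranges reported in the abstract.

```python
# Minimal sketch (hypothetical numbers, chosen only to sit near the reported ranges).
import math

def half_life(k: float) -> float:
    """Half-life (days) for first-order dissipation C(t) = C0 * exp(-k*t): t_half = ln(2)/k."""
    return math.log(2.0) / k

def kd(sorbed_mg_per_kg: float, solution_mg_per_l: float) -> float:
    """Soil sorption coefficient Kd (mL/g) = sorbed concentration / solution concentration."""
    return sorbed_mg_per_kg / solution_mg_per_l

def koc(kd_value: float, organic_carbon_fraction: float) -> float:
    """Organic-carbon-normalised sorption coefficient: KOC = Kd / f_oc."""
    return kd_value / organic_carbon_fraction

# A rate constant of 0.04 per day gives a half-life of about 17 days, within the reported 16-20 days.
print(round(half_life(0.04), 1))
# A Kd of 20 mL/g in a soil with 1.9% organic carbon gives a KOC of about 1050 mL/g.
print(round(koc(kd(5.0, 0.25), 0.019)))
```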

Relevance: 30.00%

Abstract:

Pond apple invades riparian and coastal environments, with water acting as the main vector for dispersal. As seeds float and can reach the ocean, a seed-tracking model driven by near-surface ocean currents was used to develop maps of potential seed dispersal. Seeds were ‘released’ in the model from sites near the mouths of major North Queensland rivers. Most seeds reached land within three months of release, settling predominantly at windward-facing locations. During calm and monsoonal conditions, seeds were generally swept in a southerly direction; however, movement turned northward during south-easterly trade winds. Seeds released in February from the Johnstone River were capable of being moved anywhere from 100 km north to 150 km south depending on prevailing conditions. Although wind-driven currents are the primary mechanism influencing seed dispersal, tidal currents, the East Australian Current, and other factors such as coastline orientation, release location and time also play an important role in determining dispersal patterns. In extreme events such as tropical cyclone Justin in 1997, north-east coast rivers could potentially transport seed over 1300 km to the Torres Strait, Papua New Guinea and beyond.
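To make the idea of a seed-tracking model concrete, the sketch below steps a floating seed forward with simple Euler advection in a near-surface current field. It is only an illustration of the particle-tracking approach; the velocity function, time step and release coordinates are invented and do not represent the model or data used in the study.

```python
from dataclasses import dataclass

@dataclass
class Seed:
    lon: float  # degrees east
    lat: float  # degrees north

def current_velocity(lon: float, lat: float, day: int) -> tuple[float, float]:
    """Placeholder near-surface current (u east, v north) in m/s.
    A real model would interpolate gridded current and wind-drift data here."""
    return (0.05, -0.15) if day < 30 else (0.05, 0.10)  # southward drift, then a northward turn

def advect(seed: Seed, days: int, dt_hours: float = 6.0) -> Seed:
    """Step the seed position forward with simple Euler integration."""
    metres_per_degree = 111_000.0      # rough conversion near the tropics
    steps_per_day = int(24 / dt_hours)
    dt_s = dt_hours * 3600.0
    for day in range(days):
        for _ in range(steps_per_day):
            u, v = current_velocity(seed.lon, seed.lat, day)
            seed.lon += u * dt_s / metres_per_degree
            seed.lat += v * dt_s / metres_per_degree
    return seed

# Hypothetical release near a North Queensland river mouth, tracked for ~3 months.
print(advect(Seed(lon=146.1, lat=-17.5), days=90))
```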

Relevance: 30.00%

Abstract:

Runoff, soil loss, and nutrient loss were assessed on a Red Ferrosol in tropical Australia over 3 years. The experiment was conducted using bounded, 100 m2 field plots cropped to peanuts, maize, or grass. A bare plot, without cover or crop, was also included as an extreme treatment. Results showed the importance of cover in reducing runoff, soil loss, and nutrient loss from these soils. Runoff ranged from 13% of incident rainfall for the conventional cultivation to 29% under bare conditions during the highest rainfall year, and was well correlated with event rainfall and rainfall energy. Soil loss ranged from 30 t/ha.year under bare conditions to <6 t/ha.year under cropping. Nutrient losses of 35 kg N and 35 kg P/ha.year under bare conditions and 17 kg N and 11 kg P/ha.year under cropping were measured. Soil carbon analyses showed a relationship with treatment runoff, suggesting that soil properties influenced the rainfall-runoff response. The cropping systems model PERFECT was calibrated using runoff, soil loss, and soil water data. Runoff and soil loss showed good agreement with observed data in the calibration, and soil water and yield showed reasonable agreement. Long-term runs using historical weather data showed the episodic nature of runoff and soil loss events in this region and emphasise the need to manage land using protective measures such as conservation cropping practices. Farmers involved in related action-learning activities wished to incorporate conservation cropping findings into their systems but also needed clear production benefits to hasten practice change.

Relevance: 30.00%

Abstract:

There is a world-wide trend for deteriorating water quality and light levels in the coastal zone, and this has been linked to declines in seagrass abundance. Localized management of seagrass meadow health requires that water quality guidelines for meeting seagrass growth requirements are available. Tropical seagrass meadows are diverse and can be highly dynamic, and we have used this dynamism to identify light thresholds in multi-specific meadows dominated by Halodule uninervis in the northern Great Barrier Reef, Australia. Seagrass cover was measured at ~3-month intervals from 2008 to 2011 at three sites: Magnetic Island (MI), Dunk Island (DI) and Green Island (GI). Photosynthetically active radiation was continuously measured within the seagrass canopy, and three light metrics were derived. Complete seagrass loss occurred at MI and DI, and at these sites changes in seagrass cover were correlated with the three light metrics. Mean daily irradiance (I_d) above 5 and 8.4 mol m−2 d−1 was associated with gains in seagrass at MI and DI respectively; however, a significant correlation (R = 0.649, p < 0.05) occurred only at MI. The second metric, percent of days below 3 mol m−2 d−1, correlated most strongly (MI, R = −0.714, p < 0.01 and DI, R = −0.859, p < 0.001) with change in seagrass cover, with 16-18% of days below 3 mol m−2 d−1 being associated with more than 50% seagrass loss. The third metric, the number of hours of light-saturated irradiance (H_sat), was calculated using literature-derived data on saturating irradiance (E_k). H_sat correlated well with change in seagrass abundance (MI, R = 0.686, p < 0.01; DI, R = 0.704, p < 0.05) and was very consistent between the two sites: 4 H_sat was associated with increases in seagrass abundance at both sites, and less than 4 H_sat with more than 50% loss. At the third site (GI), small seasonal losses of seagrass quickly recovered during the growth season and the light metrics did not correlate (p > 0.05) with change in percent cover, except for I_d, which was always high but correlated with change in seagrass cover. Although distinct light thresholds were observed, the departure from threshold values was also important; for example, light levels well below the thresholds resulted in more severe seagrass loss than those just below the threshold. Environmental managers aiming to achieve optimal seagrass growth conditions can use these threshold light metrics as guidelines; however, other environmental conditions, including seasonally varying temperature and nutrient availability, will influence seagrass responses above and below these thresholds. © 2012 Published by Elsevier Ltd.
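The three light metrics are straightforward to compute from a within-canopy PAR record; the sketch below is a minimal illustration, not the authors' code. The 3 mol m−2 d−1 threshold comes from the abstract, while the input series, the hourly layout and the E_k value are hypothetical.

```python
# Minimal sketch (illustrative data; not the study's dataset or analysis code).

def mean_daily_irradiance(daily_par: list[float]) -> float:
    """I_d: mean daily irradiance (mol photons m-2 d-1) over the monitoring interval."""
    return sum(daily_par) / len(daily_par)

def percent_days_below(daily_par: list[float], threshold: float = 3.0) -> float:
    """Percent of days with total PAR below a threshold (3 mol m-2 d-1 in the abstract)."""
    return 100.0 * sum(1 for d in daily_par if d < threshold) / len(daily_par)

def hours_saturated(hourly_par: list[float], ek: float) -> int:
    """H_sat: hours per day at or above the saturating irradiance E_k (both in umol m-2 s-1)."""
    return sum(1 for h in hourly_par if h >= ek)

# Hypothetical record: three daily PAR totals plus one day of hourly PAR, with an assumed E_k.
daily = [2.1, 4.8, 6.3]
hourly = [0, 0, 0, 0, 0, 0, 20, 80, 150, 260, 310, 330, 320, 280, 210, 120, 60, 10, 0, 0, 0, 0, 0, 0]
print(mean_daily_irradiance(daily), percent_days_below(daily), hours_saturated(hourly, ek=250.0))
```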