26 results for runoff-rainfall erosivity parameter
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Runoff and sediment loss from forest roads were monitored for a two-year period in a Pinus plantation in southeast Queensland. Two classes of road were investigated: a gravelled road, which is used as a primary daily haulage route for the logging area, and an ungravelled road, which provides the main access route for individual logging compartments and is intensively used as a haulage route only during the harvest of these areas (approximately every 30 years). Both roads were subjected to routine traffic loads and maintenance during the study. Surface runoff in response to natural rainfall was measured and samples taken for the determination of sediment and nutrient (total nitrogen, total phosphorus, dissolved organic carbon and total iron) loads from each road. Results revealed that the mean runoff coefficient (runoff depth/rainfall depth) was consistently higher from the gravelled road plot (0.57) than from the ungravelled road (0.38). Total sediment loss over the two-year period was greatest from the gravelled road plot at 5.7 t km⁻¹, compared to the ungravelled road plot with 3.9 t km⁻¹. Suspended solids contributed 86% of the total sediment loss from the gravelled road, and 72% from the ungravelled road, over the two years. Nitrogen loads from the two roads were both relatively constant throughout the study, and averaged 5.2 and 2.9 kg km⁻¹ from the gravelled and ungravelled road, respectively. Mean annual phosphorus loads were 0.6 kg km⁻¹ from the gravelled road and 0.2 kg km⁻¹ from the ungravelled road. Organic carbon and total iron loads increased in the second year of the study, which was a much wetter year; this is thought to reflect the breakdown of organic matter in roadside drains and increased sediment generation, respectively. When road and drain maintenance (grading) was performed, runoff and sediment loss increased from both road types. Additionally, the breakdown of the gravel road base due to high traffic intensity during wet conditions resulted in the formation of deep (10 cm) ruts, which increased erosion. The Water Erosion Prediction Project (WEPP):Road model was used to compare predicted with observed runoff and sediment loss from the two road classes investigated. For individual rainfall events, WEPP:Road predictions showed strong agreement with observed values of runoff and sediment loss. WEPP:Road predictions of annual sediment loss from the entire forestry road network in the study area also showed reasonable agreement with the extrapolated observed values.
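The runoff coefficient used above is simply runoff depth divided by rainfall depth. A minimal sketch of that arithmetic, using hypothetical per-event depths (the abstract reports only the two mean coefficients, 0.57 and 0.38):

```python
# Illustrative sketch, not the study's code: per-event runoff coefficients
# for two road plots. All event depths below are hypothetical.

def runoff_coefficient(runoff_mm: float, rainfall_mm: float) -> float:
    """Runoff coefficient = runoff depth / rainfall depth (dimensionless)."""
    return runoff_mm / rainfall_mm

# Hypothetical event records: (rainfall depth, runoff depth), both in mm.
gravelled = [(25.0, 14.5), (40.0, 22.0), (12.0, 7.0)]
ungravelled = [(25.0, 9.0), (40.0, 16.0), (12.0, 4.5)]

for name, events in [("gravelled", gravelled), ("ungravelled", ungravelled)]:
    coeffs = [runoff_coefficient(q, p) for p, q in events]
    print(f"{name}: mean runoff coefficient = {sum(coeffs) / len(coeffs):.2f}")
```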
Abstract:
Rainfall simulation experiments were carried out to measure runoff and soil water fluxes of suspended solids, total nitrogen, total phosphorus, dissolved organic carbon and total iron from sites in Pinus plantations on the coastal lowlands of south-eastern Queensland subjected to various operations (treatments). The operations investigated were cultivated and nil-cultivated site preparation, fertilised site preparation, clearfall harvesting and prescribed burning; these treatments were compared with an 8-y-old established plantation. Flow-weighted mean concentrations of total nitrogen and total phosphorus in surface runoff from the cultivated and nil-cultivated site-preparation, clearfall harvest, prescribed burning and 8-y-old established plantation treatments were very similar. However, both the soil water and the runoff from the fertilised site-preparation treatment contained more nitrogen (N) and phosphorus (P) than the other treatments, with 3.10 mg N L⁻¹ and 4.32 mg P L⁻¹ (4 and 20 times more) in the runoff. Dissolved organic carbon concentrations in runoff from the nil-cultivated site-preparation and prescribed burn treatments were elevated. Iron concentrations were highest in runoff from the nil-cultivated site-preparation and 8-y-old established plantation treatments. Concentrations of suspended solids in runoff were higher from the cultivated site-preparation and prescribed burn treatments, reflecting the greater disturbance of surface soil at these sites. The concentrations of all analytes were highest in initial runoff from plots, and generally decreased with time. Total nitrogen (mean 7.28, range 0.11-13.27 mg L⁻¹) and total phosphorus (mean 11.60, range 0.06-83.99 mg L⁻¹) concentrations in soil water were between 2 and 10 times greater than in surface runoff, which highlights the potential for nutrient fluxes in interflow (i.e. in the soil above the water table) through the general plantation area. Implications for forest management are discussed, along with results of larger catchment-scale studies.
Abstract:
We investigated the influence of rainfall patterns on the water-use efficiency of wheat in a transect between Horsham (36°S) and Emerald (23°S) in eastern Australia. Water-use efficiency was defined in terms of biomass and transpiration, WUEB/T, and grain yield and evapotranspiration, WUEY/ET. Our working hypothesis is that latitudinal trends in WUEY/ET of water-limited crops are the complex result of southward increasing WUEB/T and soil evaporation, and season-dependent trends in harvest index. Our approach included: (a) analysis of long-term records to establish latitudinal gradients of amount, seasonality, and size-structure of rainfall; and (b) modelling wheat development, growth, yield, water budget components, and derived variables including WUEB/T and WUEY/ET. Annual median rainfall declined from around 600 mm in northern locations to 380 mm in the south. Median seasonal rain (from sowing to harvest) doubled between Emerald and Horsham, whereas median off-season rainfall (harvest to sowing) ranged from 460 mm at Emerald to 156 mm at Horsham. The contribution of small events (≤ 5 mm) to seasonal rainfall was negligible at Emerald (median 15 mm) and substantial at Horsham (105 mm). Power law coefficients (τ), i.e. the slopes of the regression between size and number of events on a log-log scale, captured the latitudinal gradient, characterised by an increasing dominance of small events from north to south during the growing season. Median modelled WUEB/T increased from 46 kg/ha.mm at Emerald to 73 kg/ha.mm at Horsham, in response to decreasing atmospheric demand. Median modelled soil evaporation during the growing season increased from 70 mm at Emerald to 172 mm at Horsham. This was explained by the size-structure of rainfall characterised by the parameter τ, rather than by the total amount of rainfall. Median modelled harvest index ranged from 0.25 to 0.34 across locations, and had a season-dependent latitudinal pattern, i.e. it was greater in northern locations in dry seasons in association with wetter soil profiles at sowing. There was a season-dependent latitudinal pattern in modelled WUEY/ET. In drier seasons, high soil evaporation driven by a very strong dominance of small events, together with lower harvest index, overrode the putative advantage of low atmospheric demand and associated higher WUEB/T in southern locations, hence the significant southwards decrease in WUEY/ET. In wetter seasons, when large events contribute a significant proportion of seasonal rain, higher WUEB/T in southern locations may translate into high WUEY/ET. Linear boundary functions (French-Schultz type models), with latitudinal gradients in their parameters (slope and x-intercept), were fitted to scatter-plots of modelled yield v. evapotranspiration. The x-intercept of the model is re-interpreted in terms of rainfall size structure, and the slope, or efficiency multiplier, is described in terms of the radiation, temperature, and air humidity properties of the environment. Implications for crop management and breeding are discussed.
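Two of the quantities above can be made concrete with a short sketch: the power-law coefficient τ (the slope of log event count against log event size) and a French-Schultz type boundary function, yield = slope × (ET − x-intercept). All numbers in the sketch are hypothetical, not the study's:

```python
# Illustrative sketch of the two derived quantities named in the abstract.
import numpy as np

# (1) Hypothetical rainfall-event size classes (mm) and counts per season.
size_mm = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 40.0])
n_events = np.array([60.0, 35.0, 15.0, 7.0, 3.0, 1.0])
tau = np.polyfit(np.log10(size_mm), np.log10(n_events), 1)[0]
print(f"tau (log-log slope) = {tau:.2f}")  # more negative => small events dominate

# (2) French-Schultz boundary: yield = slope * (ET - x_intercept), floored at 0.
def french_schultz(et_mm, slope_kg_ha_mm=20.0, x_intercept_mm=110.0):
    """Potential grain yield (kg/ha) from seasonal evapotranspiration (mm)."""
    return max(0.0, slope_kg_ha_mm * (et_mm - x_intercept_mm))

print(f"yield at ET = 300 mm: {french_schultz(300):.0f} kg/ha")
```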
Abstract:
Surface losses of nitrogen from horticulture farms in coastal Queensland, Australia, may have the potential to cause eutrophication of sensitive coastal marine habitats nearby. A case-study of the potential extent of such losses was investigated in a coastal macadamia plantation. Nitrogen losses were quantified in 5 consecutive runoff events during the 13-month study. Irrigation did not contribute to surface flows. Runoff was generated by storms at combined intensities and durations that were 20–40 mm/h for >9 min. These intensities and durations were within expected short-term (1 year) and long-term (up to 20 years) frequencies of rainfall in the study area. Surface flow volumes were 5.3 ± 1.1% of the episodic rainfall generated by such storms. Therefore, the largest part of each rainfall event was attributed to infiltration and drainage in this farm soil (Kandosol). The estimated annual loss of total nitrogen in runoff was 0.26 kg N/ha.year, representing a minimal loading of nitrogen in surface runoff when compared to other studies. The weighted average concentrations of total sediment nitrogen (TSN) and total dissolved nitrogen (TDN) generated in the farm runoff were 2.81 ± 0.77% N and 1.11 ± 0.27 mg N/L, respectively. These concentrations were considerably greater than ambient levels in an adjoining catchment waterway. Concentrations of TSN and TDN in the waterway were 0.11 ± 0.02% N and 0.50 ± 0.09 mg N/L, respectively. The steep concentration gradient of TSN and TDN between the farm runoff and the waterway demonstrated the occurrence of nutrient loading from the farming landscapes to the waterway. The TDN levels in the stream exceeded the current specified threshold of 0.2–0.3 mg N/L for eutrophication of such a waterway. Therefore, while the estimate of annual loading of N from runoff losses was comparatively low, it was evident that the stream catchment and associated agricultural land uses were already characterised by significant nitrogen loadings that pose eutrophication risks. The reported levels of nitrogen and the proximity of such waterways (8 km) to the coastline may also have implications for the nearshore (oligotrophic) marine environment during periods of turbulent flow.
Abstract:
Runoff, soil loss, and nutrient loss were assessed on a Red Ferrosol in tropical Australia over 3 years. The experiment was conducted using bounded, 100-m² field plots cropped to peanuts, maize, or grass. A bare plot, without cover or crop, was also established as an extreme treatment. Results showed the importance of cover in reducing runoff, soil loss, and nutrient loss from these soils. Runoff ranged from 13% of incident rainfall under conventional cultivation to 29% under bare conditions during the highest rainfall year, and was well correlated with event rainfall and rainfall energy. Soil loss ranged from 30 t/ha.year under bare conditions to <6 t/ha.year under cropping. Nutrient losses of 35 kg N and 35 kg P/ha.year under bare conditions and 17 kg N and 11 kg P/ha.year under cropping were measured. Soil carbon analyses showed a relationship with treatment runoff, suggesting that soil properties influenced the rainfall-runoff response. The cropping systems model PERFECT was calibrated using runoff, soil loss, and soil water data. Runoff and soil loss showed good agreement with observed data in the calibration, and soil water and yield showed reasonable agreement. Long-term runs using historical weather data showed the episodic nature of runoff and soil loss events in this region and emphasise the need to manage land using protective measures such as conservation cropping practices. Farmers involved in related action-learning activities wished to incorporate conservation cropping findings into their systems but also needed clear production benefits to hasten practice change.
Abstract:
The off-site transport of agricultural chemicals, such as herbicides, into freshwater and marine ecosystems is a world-wide concern. The adoption of farm management practices that minimise herbicide transport in rainfall-runoff is a priority for the Australian sugarcane industry, particularly in the coastal catchments draining into the World Heritage listed Great Barrier Reef (GBR) lagoon. In this study, residual herbicide runoff and infiltration were measured using a rainfall simulator in a replicated trial on a brown Chromosol with 90–100% cane trash blanket cover in the Mackay Whitsunday region, Queensland. Management treatments included conventional 1.5 m spaced sugarcane beds with a single row of sugarcane (CONV) and 2 m spaced, controlled traffic sugarcane beds with dual sugarcane rows (0.8 m apart) (2mCT). The aim was to simulate the first rainfall event after the application of the photosynthesis-inhibiting (PSII) herbicides ametryn, atrazine, diuron and hexazinone, applied by broadcast (100% coverage, on bed and furrow) and banded (50–60% coverage, on bed only) methods. These events comprised heavy rainfall 1 day after herbicide application, considered a worst-case scenario, or rainfall 21 days after application. The 2mCT rows had significantly (P < 0.05) less runoff (38%) and lower peak runoff rates (43%) than CONV rows for a rainfall average of 93 mm at 100 mm h⁻¹ (1:20 yr Average Return Interval). Additionally, final infiltration rates were higher in 2mCT rows than CONV rows, at 72 and 52 mm h⁻¹ respectively. This resulted in load reductions of 60, 55, 47, and 48% for ametryn, atrazine, diuron and hexazinone from 2mCT rows, respectively. Herbicide losses in runoff were also reduced by 32–42% when applications were banded rather than broadcast. When rainfall was experienced 1 day after application, a large percentage of herbicides was washed off the cane trash. However, by day 21, concentrations of herbicide residues on cane trash were lower and more resistant to washoff, resulting in lower losses in runoff. Consequently, ametryn and atrazine event mean concentrations in runoff were approximately 8-fold lower at day 21 compared with day 1, whilst diuron and hexazinone were only 1.6–1.9-fold lower, suggesting longer persistence of these chemicals. Runoff collected at the end of the paddock during natural rainfall events showed treatment differences consistent with, though smaller than, those in the rainfall simulation study. Overall, it was the combination of early application, banding and controlled traffic that was most effective in reducing herbicide losses in runoff.
Abstract:
Objective: To describe the clinical signs, gross pathology, serology, bacteriology, histopathology, electron microscopy and immunohistochemistry findings associated with toxoplasmosis in four Indo-Pacific humpbacked dolphins (Sousa chinensis) that stranded in Queensland in 2000 and 2001. Design: Clinical assessment, gross necropsy, and laboratory examinations. Procedure: Necropsies were performed on four S chinensis to determine cause of death. Laboratory tests including serology, bacteriology, histopathology and transmission electron microscopy were done on the four dolphins. Immunohistochemistry was done on the brain, heart, liver, lung, spleen and adrenal gland from various dolphins to detect Toxoplasma gondii antigens. Results: Necropsies showed that all four S chinensis that stranded in Queensland in 2000 and 2001 had evidence of predatory shark attack and three were extremely emaciated. Histopathological examinations showed all four dolphins had toxoplasmosis, with tissue cysts resembling T gondii in the brain. Tachyzoite stages of T gondii were detected variously in the lungs, heart, liver, spleen and adrenal glands of the four dolphins. Electron microscopy studies and immunohistochemistry confirmed the tissue cysts were those of T gondii. All four dolphins also had intercurrent disease, including pneumonia; three had peritonitis and one had pancreatitis. Conclusion: Four S chinensis necropsied in Queensland in 2000 and 2001 were found to be infected with toxoplasmosis. It is uncertain how these dolphins became infected and further studies are needed to determine how S chinensis acquires toxoplasmosis. All four dolphins stranded after periods of heavy rainfall, and coastal freshwater runoff may be a risk factor for T gondii infection in S chinensis. This disease should be of concern to wildlife managers since S chinensis is a rare species and its numbers appear to be declining.
Abstract:
The accuracy of synoptic-based weather forecasting deteriorates rapidly after five days and is not routinely available beyond 10 days. Conversely, climate forecasts are generally not feasible for periods of less than 3 months, resulting in a weather-climate gap. The tropical atmospheric phenomenon known as the Madden-Julian Oscillation (MJO) has a return interval of 30 to 80 days that might partly fill this gap. Our near-global analysis demonstrates that the MJO is a significant phenomenon that can influence daily rainfall patterns, even at higher latitudes, via teleconnections with broadscale mean sea level pressure (MSLP) patterns. These weather states provide a mechanistic basis for an MJO-based forecasting capacity that bridges the weather-climate divide. Knowledge of these tropical and extra-tropical MJO-associated weather states can significantly improve the tactical management of climate-sensitive systems such as agriculture.
Abstract:
This paper compares the shift in frequency distribution, and the skill of seasonal forecasting, of both streamflow and rainfall in eastern Australia based on the Southern Oscillation Index (SOI) Phase system. Recent advances in seasonal forecasting of climate variables have highlighted opportunities for improving decision making in natural resources management. Forecasting of rainfall probabilities for different regions in Australia is available, but the use of similar forecasts for water resource supply has not been developed. The use of streamflow forecasts may provide better information for decision-making in irrigation supply and flow management for improved ecological outcomes. To examine the relative efficacy of seasonal forecasting of streamflow and rainfall, the shift in probability distributions and the forecast skill were evaluated using the Wilcoxon rank-sum test and the linear error in probability space (LEPS) skill score, respectively, at three river gauging stations in the Border Rivers Catchment of the Murray-Darling Basin in eastern Australia. A comparison of rainfall and streamflow distributions confirmed that the shift in the streamflow distribution was statistically more significant than the shift in the rainfall distribution. Moreover, streamflow forecasts showed greater skill at 0-3 month lead times than rainfall forecasts.
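A minimal sketch of the two evaluation tools named above, on hypothetical data: a rank-sum test for the distribution shift, and a LEPS score in the Potts et al. (1996) form. The paper may use a variant of LEPS, so treat that formula as an assumption:

```python
# Illustrative sketch, not the paper's analysis.
import numpy as np
from scipy.stats import ranksums

# Hypothetical seasonal streamflow totals (GL) under two SOI phases.
flow_phase_a = np.array([12.0, 18.5, 25.1, 9.3, 30.2, 15.8])
flow_phase_b = np.array([5.2, 8.1, 11.0, 4.3, 14.6, 7.7])
stat, p_value = ranksums(flow_phase_a, flow_phase_b)
print(f"rank-sum statistic = {stat:.2f}, p = {p_value:.3f}")

def leps(p_forecast: float, p_observed: float) -> float:
    """LEPS for one forecast/observation pair, both expressed as positions
    (cumulative probabilities, 0-1) in the climatological distribution."""
    pf, po = p_forecast, p_observed
    return 3 * (1 - abs(pf - po) + pf**2 - pf + po**2 - po) - 1

print(f"LEPS for a perfect forecast at the median = {leps(0.5, 0.5):.2f}")
```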
Abstract:
Adoption of conservation tillage practices on Red Ferrosol soils in the inland Burnett area of south-east Queensland has been shown to reduce runoff and subsequent soil erosion. However, improved infiltration resulting from these measures has not improved crop performance, and there are suggestions of increased loss of soil water via deep drainage. This paper reports soil water monitoring data collected under real and artificial rainfall events in commercial fields and long-term tillage experiments, and uses the data to explore the rate and mechanisms of deep drainage in this soil type. Soils were characterised by large drainable porosities (≥0.10 m³/m³) in all parts of the profile to depths of 1.50 m, with drainable porosity similar to available water content (AWC) at 0.25 and 0.75 m, but >60% higher than AWC at 1.50 m. Hydraulic conductivity immediately below the tilled layer, in both continuously cropped soils and those after a ley pasture phase, was shown to decline with increasing soil moisture content, although the rate of decline was much greater in continuously cropped soil. At moisture contents approaching the drained upper limit (pore water pressure = -100 cm H2O), estimates of saturated hydraulic conductivity after a ley pasture were 3-5 times greater than in continuously cropped soil, suggesting much greater rates of deep drainage in the former when soils are moist. Hydraulic tensiometers and fringe capacitance sensors monitored during real and artificial rainfall events showed evidence of soils approaching saturation in the surface layers (top 0.30-0.40 m), but there was no evidence of soil moisture exceeding the drained upper limit (i.e. pore water pressures ≤ -100 cm H2O) in deeper layers. Recovery of applied soil water within the top 1.00-1.20 m of the profile during or immediately after rainfall events declined as the starting profile moisture content increased. These effects were consistent with very rapid rates of internal drainage. Sensors deeper in the profile were unable to detect this drainage, due either to non-uniformity of conducting macropores (i.e. bypass flow) or to unsaturated conductivities in deeper layers that far exceed the saturated hydraulic conductivity of the infiltration throttle at the bottom of the cultivated layer. Large increases in unsaturated hydraulic conductivities are likely with only small increases in water content above the drained upper limit. Further studies with drainage lysimeters and large banks of hydraulic tensiometers are planned to quantify drainage risk in these soil types.
Abstract:
Partial least squares regression models on NIR spectra are often optimised (for wavelength range, mathematical pretreatment and outlier elimination) in terms of calibration statistics rather than in terms of validation performance with reference to totally independent populations.
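A minimal sketch of the validation idea flagged here: fit a PLS model on one population and evaluate it on a totally independent one. It assumes scikit-learn's PLSRegression as a stand-in for whatever chemometrics software was used; the data shapes and values are hypothetical:

```python
# Illustrative sketch of calibration vs. independent validation for PLS.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X_cal = rng.normal(size=(100, 50))   # calibration spectra (samples x wavelengths)
y_cal = X_cal[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=100)
X_ind = rng.normal(size=(40, 50))    # a totally independent population
y_ind = X_ind[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=40)

pls = PLSRegression(n_components=5).fit(X_cal, y_cal)
print(f"calibration R2 = {r2_score(y_cal, pls.predict(X_cal)):.2f}")
print(f"independent validation R2 = {r2_score(y_ind, pls.predict(X_ind)):.2f}")
```

The gap between the two R² values is the point of the sentence above: calibration statistics alone can flatter a model that does not transfer to an independent population.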
Abstract:
In Australia, communities are concerned about atrazine being detected in drinking water supplies. It is important to understand the mechanisms by which atrazine is transported from paddocks to waterways if we are to reduce movement of agricultural chemicals from the site of application. Two paddocks cropped with grain sorghum on a Black Vertosol were monitored for atrazine, potassium chloride (KCl) extractable atrazine, desethylatrazine (DEA), and desisopropylatrazine (DIA) at 4 soil depths (0-0.05, 0.05-0.10, 0.10-0.20, and 0.20-0.30 m) and in runoff water and runoff sediment. Atrazine + DEA + DIA (total atrazine) had a half-life in soil of 16-20 days, a more rapid dissipation than in many earlier reports. Atrazine extracted in dilute potassium chloride, considered available for weed control, was initially 34% of the total and had a half-life of 15-20 days until day 30, after which it dissipated rapidly with a half-life of 6 days. We conclude that, in this region, atrazine may not pose a risk for groundwater contamination, as only 0.5% of applied atrazine moved deeper than 0.20 m into the soil, where it dissipated rapidly. In runoff (including suspended sediment), atrazine concentrations were greatest (85 μg/L) during the first runoff event (57 days after application) and declined with time. After 160 days, the total atrazine lost in runoff was 0.4% of the initial application. The total atrazine concentration in runoff was strongly related to the total concentration in soil, as expected. Even after 98% of the KCl-extractable atrazine had dissipated (and no longer provided weed control), runoff concentrations still exceeded the human health guideline value of 40 μg/L. For total atrazine in soil (0-0.05 m), the coefficient of soil sorption (Kd) ranged from 1.9 to 28.4 mL/g and the soil organic carbon sorption coefficient (KOC) from 100 to 2184 mL/g, both increasing with time of contact with the soil and the rapid dissipation of the more soluble, available phase. Partition coefficients in runoff for total atrazine were initially 3, increasing to 32 and 51 with time; values for DEA were half these. To minimise atrazine losses, cultural practices should be adopted that maximise rain infiltration (thereby minimising runoff) and minimise concentrations at the soil surface.
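For readers wanting the half-life arithmetic made explicit: under first-order dissipation, C(t) = C0·e^(−kt) and the half-life is ln 2 / k. A sketch with hypothetical concentration data (the abstract reports a 16-20 day half-life; the study's own fitting method is not described here):

```python
# Worked example of first-order half-life estimation; data are hypothetical.
import numpy as np

days = np.array([0, 7, 14, 28, 56])
conc = np.array([1.00, 0.76, 0.58, 0.34, 0.11])  # fraction of applied atrazine

# Log-linear fit: ln C = ln C0 - k*t, so the slope is -k.
k = -np.polyfit(days, np.log(conc), 1)[0]
print(f"k = {k:.4f} per day, half-life = {np.log(2) / k:.1f} days")
```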
Abstract:
Rainfall variability is a challenge to sustainable and profitable cattle production in northern Australia. Strategies recommended to manage for rainfall variability, like light or variable stocking, are not widely adopted. This is due partly to the perception that sustainability and profitability are incompatible. A large, long-term grazing trial was initiated in 1997 in north Queensland, Australia, to test the effect of different grazing strategies on cattle production. These strategies are: (i) constant light stocking (LSR) at long-term carrying capacity (LTCC); (ii) constant heavy stocking (HSR) at twice LTCC; (iii) rotational wet-season spelling (R/Spell) at 1.5 LTCC; (iv) variable stocking (VAR), with stocking rates adjusted in May based on available pasture; and (v) a Southern Oscillation Index (SOI) variable strategy, with stocking rates adjusted in November, based on available pasture and SOI seasonal forecasts. Animal performance varied markedly over the 10 years for which data are presented, due to pronounced differences in rainfall and pasture availability. Nonetheless, lighter stocking at or about LTCC consistently gave the best individual liveweight gain (LWG), condition score and skeletal growth; mean LWG per annum was thus highest in the LSR (113 kg), intermediate in the R/Spell (104 kg) and lowest in the HSR (86 kg). Mean LWG was 106 kg in the VAR and 103 kg in the SOI but, in all years, the relative performance of these strategies was dependent upon the stocking rate applied. After 2 years on the trial, steers from lightly stocked strategies were 60-100 kg heavier and received appreciable carcass price premiums at the meatworks compared to those under heavy stocking. In contrast, LWG per unit area was greatest at stocking rates of about twice LTCC; mean LWG/ha was thus greatest in the HSR (21 kg/ha), but this strategy required drought feeding in four of the 10 years and was unsustainable. Although LWG/ha was lower in the LSR (mean 14 kg/ha), or in strategies that reduced stocking rates in dry years like the VAR (mean 18 kg/ha) and SOI (mean 17 kg/ha), these strategies did not require drought feeding and appeared sustainable. The R/Spell strategy (mean 16 kg/ha) was compromised by an ill-timed fire, but also performed satisfactorily. The present results provide important evidence challenging the assumption that sustainable management in a variable environment is unprofitable. Further research is required to fully quantify the long-term effects of these strategies on land condition and profitability and to extrapolate the results to breeder performance at the property level.
Abstract:
The Davis Growth Model (a dynamic steer growth model encompassing 4 fat deposition models) is currently being used by the phenotypic prediction program of the Cooperative Research Centre (CRC) for Beef Genetic Technologies to predict P8 fat (mm) in beef cattle, to assist beef producers to meet market specifications. The concepts of cellular hyperplasia and hypertrophy are integral components of the Davis Growth Model. The net synthesis of total body fat (kg) is calculated from the net energy available after accounting for energy needs for maintenance and protein synthesis. Total body fat (kg) is then partitioned into 4 fat depots (intermuscular, intramuscular, subcutaneous, and visceral). This paper reports on the parameter estimation and sensitivity analysis of the DNA (deoxyribonucleic acid) logistic growth equations and the fat deposition first-order differential equations in the Davis Growth Model, using acslXtreme (Xcellon, Huntsville, AL, USA). The DNA and fat deposition parameter coefficients were found to be important determinants of model function: the DNA parameter coefficients for days on feed >100, and the fat deposition parameter coefficients for all days on feed. The generalized NL2SOL optimization algorithm had the fastest processing time and the minimum number of objective function evaluations when estimating the 4 fat deposition parameter coefficients with 2 observed values (initial and final fat). The subcutaneous fat parameter coefficient indicated a metabolic difference between frame sizes. The results look promising, and the prototype Davis Growth Model has the potential to assist the beef industry to meet market specifications.
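A structural sketch, not the Davis Growth Model itself, of the two equation types the abstract names: logistic growth for DNA, and a first-order differential equation driving a fat depot toward an equilibrium mass. All parameter values and the equilibrium form are hypothetical:

```python
# Illustrative sketch of logistic growth coupled with first-order deposition.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, r=0.03, dna_max=1.0, k_fat=0.02, fat_eq=80.0):
    dna, fat = y
    d_dna = r * dna * (1 - dna / dna_max)  # logistic DNA accretion
    d_fat = k_fat * (fat_eq - fat)         # first-order approach to equilibrium fat mass (kg)
    return [d_dna, d_fat]

sol = solve_ivp(rhs, (0, 200), [0.1, 10.0], t_eval=np.linspace(0, 200, 5))
for t, dna, fat in zip(sol.t, sol.y[0], sol.y[1]):
    print(f"day {t:5.0f}: DNA = {dna:.2f}, fat = {fat:5.1f} kg")
```

In a sensitivity analysis like the one described, each parameter (here r, dna_max, k_fat, fat_eq) would be perturbed and the change in predicted fat examined; the abstract reports that the DNA coefficients matter most beyond 100 days on feed.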
Abstract:
Seed persistence is poorly quantified for invasive plants of subtropical and tropical environments, and Lantana camara, one of the world's worst weeds, is no exception. We investigated germination, seedling emergence, and seed survival of two lantana biotypes (pink and pink-edged red [PER]) in southeastern Queensland, Australia. Controlled experiments were undertaken in 2002 and repeated in 2004, with treatments comprising two differing environmental regimes (irrigated and natural rainfall) and sowing depths (0 and 2 cm). Seed survival and seedling emergence were significantly affected by all factors (time, biotype, environment, sowing depth, and cohort) (P < 0.001). Seed dormancy varied with treatment (environment, sowing depth, biotype, and cohort) (P < 0.001), but declined rapidly after 6 mo. Significant differential responses by the two biotypes to sowing depth and environment were detected for both seed survival and seedling emergence (P < 0.001). Seed mass was consistently lower in the PER biotype at the population level (P < 0.001), but this variation did not adequately explain the differential responses. Moreover, under natural rainfall the magnitude of the biotype effect was unlikely to result in ecologically significant differences. Seed survival after 36 mo under natural rainfall ranged from 6.8 to 21.3%. Best-fit regression analysis of the decline in seed survival over time yielded a five-parameter exponential decay model with a lower asymptote approaching −0.38: % seed survival = [(55 − (−0.38)) × e^(k × t)] + (−0.38) (R² = 88.5%; 9 df). Environmental conditions and burial significantly affected the slope parameter, or k value (P < 0.01). Seed survival projections from the model were greatest for buried seeds under natural rainfall (11 yr) and least under irrigation (3 yr). Experimental data and model projections suggest that lantana has a persistent seed bank and this should be considered in management programs, particularly those aimed at eradication.
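The reported decay form, S(t) = (S0 − a)·e^(kt) + a with a lower asymptote a, can be refitted with standard tools. A sketch using curve_fit on hypothetical data points (not the trial's):

```python
# Illustrative refit of an exponential decay with a lower asymptote.
import numpy as np
from scipy.optimize import curve_fit

def survival(t, s0, k, a):
    """Percent seed survival at time t (months); k < 0 gives decay toward a."""
    return (s0 - a) * np.exp(k * t) + a

months = np.array([0, 6, 12, 18, 24, 30, 36])
surv = np.array([55.0, 35.0, 22.0, 15.0, 11.0, 9.0, 7.0])  # hypothetical

params, _ = curve_fit(survival, months, surv, p0=(55.0, -0.1, 0.0))
s0, k, a = params
print(f"S0 = {s0:.1f}%, k = {k:.3f}/month, asymptote = {a:.2f}%")
```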