163 results for rainfall


Relevance: 10.00%

Abstract:

Approximately 130,000 ha of hardwood plantations have been established in north-eastern Australia in the last 15 years. As a result of poor taxa selection, approximately 25,000 ha have failed due to pest and disease pressure or extreme weather events (drought and cyclones). Given the predicted impacts of climate change in north-eastern Australia (reduced rainfall, increased temperatures and an increase in extreme weather conditions, particularly drought, storms and cyclones), selection of the right taxa for plantation development is even more critical: the taxon planted needs to perform well both under the environment experienced at planting and under the environment that may develop over the next 30 years as the climate changes.

Relevance: 10.00%

Abstract:

We investigated the effects of annual burning since 1952, triennial burning since 1973, fire exclusion since 1946 and infrequent wildfire (one fire in 61 years) on woody understorey vegetation in a dry sclerophyll eucalypt forest, south-eastern Queensland, Australia. We determined the influence of these treatments, and other site variables (rainfall, understorey density, topsoil C : N ratio, tree basal area, distance to watercourse and burn coverage), on plant taxa density, richness and composition. The richness of woody understorey taxa 0–1 m in height was not affected by burning treatments, but richness of woody plants 1–7.5 m in height was lower in the annually burnt treatment than in the triennially burnt treatment from 1989 to 2007. Fire frequency and other site variables together explained 34% of the variation in taxa composition (three taxon groups and 10 species), with fire treatment accounting for 33% of the explained variance and other site variables for 46%. Annual burning between 1974 and 1993 was associated with lower understorey densities, mainly due to reduced densities of eucalypts 1–7.5 m in height. Triennial burning during the same period was associated with higher densities of eucalypts 0–7.5 m in height relative to the annually burnt and unburnt treatments. Most woody taxa persisted in the frequently burnt treatments through resprouting mechanisms (e.g. lignotuberous regeneration), and fire patchiness associated with low-intensity burning was also found to be important. Persistence of plants <1 m tall demonstrates the resilience of woody taxa to repeated burning in this ecosystem, although they mainly exist in a suppressed growth state under annual burning.

Relevance: 10.00%

Abstract:

Loss of nitrogen in deep drainage from agriculture is an important issue for environmental and economic reasons, but limited field data are available for tropical crops. In this study, nitrogen (N) loads leaving the root zone of two major humid tropical crops in Australia, sugarcane and bananas, were measured. The two field sites, 57 km apart, had a similar soil type (a well drained Dermosol) and rainfall (~2700 mm year⁻¹) but contrasting crops and management. A sugarcane crop in a commercial field received 136–148 kg N ha⁻¹ year⁻¹ applied in one application each year and was monitored for 3 years (first to third ratoon crops). N treatments of 0–600 kg ha⁻¹ year⁻¹ were applied to a plant and following ratoon crop of bananas. N was applied as urea throughout the growing season in irrigation water through mini-sprinklers. Low-suction lysimeters were installed at a depth of 1 m under both crops to monitor loads of N in deep drainage. Drainage at 1 m depth in the sugarcane crops was 22–37% of rainfall. Under bananas, drainage in the row was 65% of rainfall plus irrigation for the plant crop, and 37% for the ratoon. Nitrogen leaching loads were low under sugarcane (<1–9 kg ha⁻¹ year⁻¹), possibly reflecting the N fertiliser applications being reasonably matched to crop requirements and at least 26 days between fertiliser application and deep drainage. Under bananas, there were large loads of N in deep drainage when N application rates were in excess of plant demand, even when applied fortnightly. The deep drainage loss of N attributable to N fertiliser, calculated by subtracting the loss from unfertilised plots, was 246 and 641 kg ha⁻¹ over 2 crop cycles, which was equivalent to 37 and 63% of the fertiliser application for treatments receiving 710 and 1065 kg ha⁻¹, respectively. Those rates of fertiliser application resulted in soil acidification to a depth of 0.6 m, by as much as 0.6 of a pH unit at 0.1–0.2 m depth.
The higher leaching losses from bananas indicated that they should be a priority for improved N management. Crown Copyright © 2012.
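The fertiliser-attributable loss described above is straightforward arithmetic: the drainage loss from an unfertilised plot is subtracted from that of a fertilised plot, and the remainder expressed as a share of the N applied. A minimal sketch (function names are mine; the published percentages of 37% and 63% were presumably computed from unrounded field data, so the rounded figures here land slightly lower):

```python
def fertiliser_attributable_loss(loss_fertilised, loss_unfertilised):
    """N lost in deep drainage attributable to fertiliser (kg/ha):
    loss from a fertilised plot minus loss from an unfertilised plot."""
    return loss_fertilised - loss_unfertilised

def pct_of_application(attributable_loss, n_applied):
    """Attributable loss expressed as a percentage of the N applied."""
    return 100.0 * attributable_loss / n_applied

# Figures reported in the abstract: 246 and 641 kg/ha lost (net of
# unfertilised plots) over 2 banana crop cycles receiving 710 and
# 1065 kg/ha of N fertiliser, respectively.
print(round(pct_of_application(246, 710), 1))   # ≈ 34.6
print(round(pct_of_application(641, 1065), 1))  # ≈ 60.2
```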

Relevance: 10.00%

Abstract:

Significant interactions have been demonstrated between production factors and postharvest quality of fresh fruit. Accordingly, there is an attendant need for adaptive postharvest actions to modulate preharvest effects. The most significant preharvest effects appear to be mediated through mineral nutrition influences on the physical characteristics of fruit. Examples of specific influencers include fertilisers, water availability, rootstock, and crop load effects on fruit quality attributes such as skin colour, susceptibility to diseases and physiological disorders, and fruit nutritional composition. Also, rainfall before and during harvest can markedly affect fruit susceptibility to skin blemishes, physical damage, and diseases. Knowledge of preharvest-postharvest interactions can help determine the basis for variability in postharvest performance and thereby allow refinement of postharvest practices to minimise quality loss after harvest. This knowledge can be utilised in predictive management systems. Such systems can benefit from characterisation of fruit nutritional status, particularly minerals, several months before and/or at harvest to allow informed decisions on postharvest handling and marketing options. Other examples of proactive management practices include adjusting harvesting and packing systems to account for rainfall effects before and/or during harvest. Improved understanding of preharvest-postharvest interactions is contributing to the delivery of consistently higher quality of fruit to consumers. This paper focuses on the state of knowledge for sub-tropical and tropical fruits, in particular avocado and mango.

Relevance: 10.00%

Abstract:

The off-site transport of agricultural chemicals, such as herbicides, into freshwater and marine ecosystems is a world-wide concern. The adoption of farm management practices that minimise herbicide transport in rainfall-runoff is a priority for the Australian sugarcane industry, particularly in the coastal catchments draining into the World Heritage listed Great Barrier Reef (GBR) lagoon. In this study, residual herbicide runoff and infiltration were measured using a rainfall simulator in a replicated trial on a brown Chromosol with 90–100% cane trash blanket cover in the Mackay Whitsunday region, Queensland. Management treatments included conventional 1.5 m spaced sugarcane beds with a single row of sugarcane (CONV) and 2 m spaced, controlled traffic sugarcane beds with dual sugarcane rows (0.8 m apart) (2mCT). The aim was to simulate the first rainfall event after the application of the photosynthesis inhibiting (PSII) herbicides ametryn, atrazine, diuron and hexazinone, by broadcast (100% coverage, on bed and furrow) and banding (50–60% coverage, on bed only) methods. These events included heavy rainfall 1 day after herbicide application, considered a worst case scenario, or rainfall 21 days after application. The 2mCT rows had significantly (P < 0.05) less runoff (38%) and lower peak runoff rates (43%) than CONV rows for a rainfall average of 93 mm at 100 mm h−1 (1:20 yr Average Return Interval). Additionally, final infiltration rates were higher in 2mCT rows than CONV rows, with 72 and 52 mm h−1 respectively. This resulted in load reductions of 60, 55, 47, and 48% for ametryn, atrazine, diuron and hexazinone from 2mCT rows, respectively. Herbicide losses in runoff were also reduced by 32–42% when applications were banded rather than broadcast. When rainfall was experienced 1 day after application, a large percentage of herbicides were washed off the cane trash. 
However, by day 21, concentrations of herbicide residues on cane trash were lower and more resistant to washoff, resulting in lower losses in runoff. Consequently, ametryn and atrazine event mean concentrations in runoff were approximately 8 fold lower at day 21 compared with day 1, whilst diuron and hexazinone were only 1.6–1.9 fold lower, suggesting longer persistence of these chemicals. Runoff collected at the end of the paddock in natural rainfall events indicated consistent though smaller treatment differences to the rainfall simulation study. Overall, it was the combination of early application, banding and controlled traffic that was most effective in reducing herbicide losses in runoff. Crown copyright © 2012
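Event mean concentration, used above to compare day-1 and day-21 runoff, is conventionally defined as the total chemical load in an event divided by the total runoff volume. A sketch with invented numbers (the study's actual loads and volumes are not given in the abstract):

```python
def event_mean_concentration(load_mg, runoff_volume_l):
    """Event mean concentration (mg/L): total chemical load in a
    runoff event divided by total runoff volume."""
    return load_mg / runoff_volume_l

def fold_reduction(emc_day1, emc_day21):
    """How many times lower the day-21 EMC is than the day-1 EMC."""
    return emc_day1 / emc_day21

# Illustrative (made-up) numbers producing an 8-fold drop like the
# one the abstract reports for ametryn and atrazine.
emc_1 = event_mean_concentration(load_mg=400.0, runoff_volume_l=1000.0)
emc_21 = event_mean_concentration(load_mg=50.0, runoff_volume_l=1000.0)
print(fold_reduction(emc_1, emc_21))  # 8.0
```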

Relevance: 10.00%

Abstract:

A suite of co-occurring eriophyid mite species are significant pests in subtropical Australia, causing severe discolouration, blistering, necrosis and leaf loss to one of the region's most important hardwood species, Corymbia citriodora subsp. variegata (F. Muell.) K. D. Hill & L. A. S. Johnson (Myrtaceae). In this study, we examined mite population dynamics and leaf damage over a 1-year period in a commercial plantation of C. citriodora subsp. variegata. Our aims were to link the incidence and severity of mite damage, and mite numbers, to leaf physical traits (moisture content and specific leaf weight (SLW)); to identify any seasonal changes in leaf surface occupancy (upper vs. lower lamina); and host tree canopy strata (upper, mid or lower canopy). We compared population trends with site rainfall, temperature and humidity. We also examined physical and anatomical changes in leaf tissue in response to mite infestation to characterize the plants' physiological reaction to feeding, and how this might affect photosynthesis. Our main findings included positive correlations between leaf moisture content and mite numbers, and between mite numbers and damage severity. Wet and dry leaf mass and SLW were greater for damaged tissue than undamaged tissue. Mites were distributed equally throughout the canopy and on both leaf surfaces. No relationships with climatic factors were found. Damage symptoms occurred equally and were exactly mirrored on both leaf surfaces. Mite infestation increased the overall epidermal thickness and the number and size of epidermal cells and was also associated with a rapid loss of chloroplasts from mesophyll cells beneath damage sites. The integrity of the stomatal complex was severely compromised in damaged tissues. These histological changes suggest that damage by these mites will negatively impact the photosynthetic efficiency of susceptible plantation species.

Relevance: 10.00%

Abstract:

There is an increasing need to understand what makes vegetation at some locations more sensitive to climate change than others. For savanna rangelands, this requires building knowledge of how forage production in different land types will respond to climate change, and identifying how location-specific land type characteristics, climate and land management control the magnitude and direction of its responses to change. Here, a simulation analysis is used to explore how forage production in 14 land types of the north-eastern Australian rangelands responds to three climate change scenarios: +3 °C, +17% rainfall; +2 °C, −7% rainfall; and +3 °C, −46% rainfall. Our results demonstrate that the controls on forage production responses are complex, with functional characteristics of land types interacting to determine the magnitude and direction of change. Forage production may increase by up to 60% or decrease by up to 90% in response to the extreme scenarios of change. The magnitude of these responses is dependent on whether forage production is water or nitrogen (N) limited, and how climate changes influence these limiting conditions. Forage production responds most to changes in temperature and moisture availability in land types that are water-limited, and shows the least amount of change when growth is restricted by N availability. The fertilisation effects of doubled atmospheric CO2 were found to offset declines in forage production under 2 °C warming and a 7% reduction in rainfall. However, rising tree densities and declining land condition are shown to reduce potential opportunities from increases in forage production and raise the sensitivity of pastures to climate-induced water stress. Knowledge of these interactions can be applied in engaging with stakeholders to identify adaptation options.

Relevance: 10.00%

Abstract:

Wild dogs (Canis lupus dingo and hybrids) are routinely controlled to protect beef cattle from predation yet beef producers are sometimes ambivalent as to whether wild dogs are a significant problem or not. This paper reports the loss of calves between birth and weaning in pregnancy-tested herds located on two beef cattle properties in south-central and far north Queensland for up to 4 consecutive years. Comparisons of lactation failures (identified when dams that previously tested pregnant were found non-lactating at weaning) were made between adjoining test herds grazed in places with or without annual (or twice annual) wild dog poison baiting programs. No correlation between wild dog relative abundance and lactation failures was apparent. Calf loss was frequently higher (three in 7 site-years, 11–32%) in baited areas than in non-baited areas (9% in 1 of 7 site-years). Predation loss of calves (in either area) only occurred in seasons of below-average rainfall, but was not related to herd nutrition. These data suggest that controlling wild dogs to protect calves on extensive beef cattle enterprises is unnecessary in most years because wild dogs do not routinely prey on calves. In those seasons when wild dog predation might occur, baiting can be counter-productive. Baiting appears to produce perturbations that change the way surviving or re-colonising wild dog populations select and handle prey and/or how they interact with livestock.

Relevance: 10.00%

Abstract:

Context. Irregular plagues of house mice cause high production losses in grain crops in Australia. If plagues can be forecast through broad-scale monitoring or model-based prediction, then mice can be proactively controlled by poison baiting. Aims. To predict mouse plagues in grain crops in Queensland and assess the value of broad-scale monitoring. Methods. Regular trapping of mice at the same sites on the Darling Downs in southern Queensland has been undertaken since 1974. This provides an index of abundance over time that can be related to rainfall, crop yield, winter temperature and past mouse abundance. Other sites have been trapped over a shorter time period elsewhere on the Darling Downs and in central Queensland, allowing a comparison of mouse population dynamics and cross-validation of models predicting mouse abundance. Key results. On the regularly trapped 32-km transect on the Darling Downs, damaging mouse densities occur in 50% of years and a plague in 25% of years, with no detectable increase in mean monthly mouse abundance over the past 35 years. High mouse abundance on this transect is not consistently matched by high abundance in the broader area. Annual maximum mouse abundance in autumn–winter can be predicted (R² = 57%) from spring mouse abundance and autumn–winter rainfall in the previous year. In central Queensland, mouse dynamics contrast with those on the Darling Downs and lack the distinct annual cycle, with peak abundance occurring in any month outside early spring. On average, damaging mouse densities occur in 1 in 3 years and a plague occurs in 1 in 7 years. The dynamics of mouse populations on two transects ~70 km apart were rarely synchronous. Autumn–winter rainfall can indicate mouse abundance in some seasons (R² ≈ 52%). Conclusion. Early warning of mouse plague formation in Queensland grain crops from regional models should trigger farm-based monitoring.
This can be incorporated with rainfall into a simple model predicting future abundance that will determine any need for mouse control. Implications. A model-based warning of a possible mouse plague can highlight the need for local monitoring of mouse activity, which in turn could trigger poison baiting to prevent further mouse build-up.
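The predictive model described (annual maximum autumn–winter abundance regressed on spring abundance and previous-year autumn–winter rainfall) is an ordinary least-squares problem. A toy sketch on synthetic data, since the fitted coefficients are not reported in the abstract and all numbers below are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 35  # one row per year of monitoring

# Synthetic predictors: a spring abundance index and autumn-winter
# rainfall (mm) in the previous year; the generating coefficients
# are arbitrary illustrative choices.
spring_abundance = rng.uniform(0, 50, n)
aw_rainfall = rng.uniform(100, 600, n)
peak_abundance = (0.8 * spring_abundance + 0.05 * aw_rainfall
                  + rng.normal(0, 5, n))

# Least-squares fit: peak ~ intercept + spring abundance + rainfall
X = np.column_stack([np.ones(n), spring_abundance, aw_rainfall])
coef, *_ = np.linalg.lstsq(X, peak_abundance, rcond=None)

# Coefficient of determination for the fitted model
pred = X @ coef
r2 = 1 - np.sum((peak_abundance - pred) ** 2) / np.sum(
    (peak_abundance - peak_abundance.mean()) ** 2)
print(f"R^2 = {r2:.2f}")
```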

Relevance: 10.00%

Abstract:

More than 1200 wheat and 120 barley experiments conducted in Australia to examine yield responses to applied nitrogen (N) fertiliser are contained in a national database of field crops nutrient research (BFDC National Database). The yield responses are accompanied by various pre-plant soil test data to quantify plant-available N and other indicators of soil fertility status or mineralisable N. A web application (BFDC Interrogator), developed to access the database, enables construction of calibrations between relative crop yield ((Y0/Ymax) × 100) and N soil test value. In this paper we report the critical soil test values for 90% RY (CV90) and the associated critical ranges (CR90, defined as the 70% confidence interval around that CV90) derived from analysis of various subsets of these winter cereal experiments. Experimental programs were conducted throughout Australia’s main grain-production regions in different eras, starting from the 1960s in Queensland through to Victoria during 2000s. Improved management practices adopted during the period were reflected in increasing potential yields with research era, increasing from an average Ymax of 2.2 t/ha in Queensland in the 1960s and 1970s, to 3.4 t/ha in South Australia (SA) in the 1980s, to 4.3 t/ha in New South Wales (NSW) in the 1990s, and 4.2 t/ha in Victoria in the 2000s. Various sampling depths (0.1–1.2 m) and methods of quantifying available N (nitrate-N or mineral-N) from pre-planting soil samples were used and provided useful guides to the need for supplementary N. The most regionally consistent relationships were established using nitrate-N (kg/ha) in the top 0.6 m of the soil profile, with regional and seasonal variation in CV90 largely accounted for through impacts on experimental Ymax. The CV90 for nitrate-N within the top 0.6 m of the soil profile for wheat crops increased from 36 to 110 kg nitrate-N/ha as Ymax increased over the range 1 to >5 t/ha. 
Apparent variation in CV90 with seasonal moisture availability was entirely consistent with impacts on experimental Ymax. Further analyses of wheat trials with available grain protein (~45% of all experiments) established that grain yield and not grain N content was the major driver of crop N demand and CV90. Subsets of data explored the impact of crop management practices such as crop rotation or fallow length on both pre-planting profile mineral-N and CV90. Analyses showed that while management practices influenced profile mineral-N at planting and the likelihood and size of yield response to applied N fertiliser, they had no significant impact on CV90. A level of risk is involved with the use of pre-plant testing to determine the need for supplementary N application in all Australian dryland systems. In southern and western regions, where crop performance is based almost entirely on in-crop rainfall, this risk is offset by the management opportunity to split N applications during crop growth in response to changing crop yield potential. In northern cropping systems, where stored soil moisture at sowing is indicative of minimum yield potential, erratic winter rainfall increases uncertainty about actual yield potential as well as reducing the opportunity for effective in-season applications.
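The calibration described relates relative yield, RY = (Y0/Ymax) × 100, to pre-plant soil nitrate-N. One common way to extract a critical value such as CV90 is to fit a Mitscherlich-type response and solve for the soil test value at RY = 90; a sketch under that assumption (the BFDC Interrogator's actual fitting method is not specified here, and the fitted parameters below are invented):

```python
import math

def relative_yield(y0, ymax):
    """Relative yield (%): unfertilised yield over maximum yield."""
    return 100.0 * y0 / ymax

def critical_value(a, c, target_ry=90.0):
    """Solve RY = a * (1 - exp(-c * N)) for N at the target RY.
    a: asymptotic relative yield (%); c: curvature; both would come
    from fitting trial data."""
    if target_ry >= a:
        raise ValueError("target RY not reachable under fitted asymptote")
    return -math.log(1.0 - target_ry / a) / c

# Illustrative parameters: asymptote 100%, curvature 0.03 per kg/ha.
cv90 = critical_value(a=100.0, c=0.03)
print(round(cv90, 1))  # ≈ 76.8 kg nitrate-N/ha
```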

Relevance: 10.00%

Abstract:

Soil testing is the most widely used tool to predict the need for fertiliser phosphorus (P) application to crops. This study examined factors affecting critical soil P concentrations and confidence intervals for wheat and barley grown in Australian soils by interrogating validated data from 1777 wheat and 150 barley field treatment series now held in the BFDC National Database. To narrow confidence intervals associated with estimated critical P concentrations, filters for yield, crop stress, or low pH were applied. Once treatment series with low yield (<1 t/ha), severe crop stress, or pHCaCl2 <4.3 were screened out, critical concentrations were relatively insensitive to wheat yield (>1 t/ha). There was a clear increase in critical P concentration from early trials when full tillage was common compared with those conducted in 1995–2011, which corresponds to a period of rapid shift towards adoption of minimum tillage. For wheat, critical Colwell-P concentrations associated with 90 or 95% of maximum yield varied among Australian Soil Classification (ASC) Orders and Sub-orders: Calcarosol, Chromosol, Kandosol, Sodosol, Tenosol and Vertosol. Soil type, based on ASC Orders and Sub-orders, produced critical Colwell-P concentrations at 90% of maximum relative yield from 15 mg/kg (Grey Vertosol) to 47 mg/kg (Supracalcic Calcarosols), with other soils having values in the range 19–27 mg/kg. Distinctive differences in critical P concentrations were evident among Sub-orders of Calcarosols, Chromosols, Sodosols, Tenosols, and Vertosols, possibly due to differences in soil properties related to P sorption. However, insufficient data were available to develop a relationship between P buffering index (PBI) and critical P concentration. In general, there was no evidence that critical concentrations for barley would be different from those for wheat on the same soils. 
Significant knowledge gaps to fill to improve the relevance and reliability of soil P testing for winter cereals were: lack of data for oats; the paucity of treatment series reflecting current cropping practices, especially minimum tillage; and inadequate metadata on soil texture, pH, growing season rainfall, gravel content, and PBI. The critical concentrations determined illustrate the importance of recent experimental data and of soil type, but also provide examples of interrogation pathways into the BFDC National Database to extract locally relevant critical P concentrations for guiding P fertiliser decision-making in wheat and barley.
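The soil-type dependence above lends itself to a simple lookup: the abstract reports explicit critical Colwell-P values only for two soil groups, with the remaining Orders falling in a 19–27 mg/kg band. A sketch of that lookup (the midpoint fallback is my own simplification, not a value from the study):

```python
# Critical Colwell-P (mg/kg) at 90% of maximum relative yield, as
# reported in the abstract for two soil groups.
CRITICAL_COLWELL_P = {
    "Grey Vertosol": 15,
    "Supracalcic Calcarosol": 47,
}
# Other soils had values in the 19-27 mg/kg range.
DEFAULT_RANGE = (19, 27)

def critical_p(soil):
    """Look up the critical Colwell-P for a soil classification,
    falling back to the midpoint of the general range when no
    specific value was reported."""
    return CRITICAL_COLWELL_P.get(soil, sum(DEFAULT_RANGE) / 2)

print(critical_p("Grey Vertosol"))    # 15
print(critical_p("Brown Chromosol"))  # 23.0
```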

Relevance: 10.00%

Abstract:

The results of research into the water relations and irrigation requirements of lychee are collated and reviewed. The stages of plant development are summarised, with an emphasis on factors influencing the flowering process. This is followed by reviews of plant water relations, water requirements, water productivity and, finally, irrigation systems. The lychee tree is native to the rainforests of southern China and northern Vietnam, and the main centres of production remain close to this area. In contrast, much of the research on the water relations of this crop has been conducted in South Africa, Australia and Israel where the tree is relatively new. Vegetative growth occurs in a series of flushes. Terminal inflorescences are borne on current shoot growth under cool (<15 °C), dry conditions. Trees generally do not produce fruit in the tropics at altitudes below 300 m. Poor and erratic flowering results in low and irregular fruit yields. Drought can enhance flowering in locations with dry winters. Roots can extract water from depths greater than 2 m. Diurnal trends in stomatal conductance closely match those of leaf water status. Both variables mirror changes in the saturation deficit of the air. Very little research on crop water requirements has been reported. Crop responses to irrigation are complex. In areas with low rainfall after harvest, a moderate water deficit before floral initiation can increase flowering and yield. In contrast, fruit set and yield can be reduced by a severe water deficit after flowering, and the risk of fruit splitting increased. Water productivity has not been quantified. Supplementary irrigation in South-east Asia is limited by topography and competition for water from the summer rice crop, but irrigation is practised in Israel, South Africa, Australia and some other places. Research is needed to determine the benefits of irrigation in different growing areas. Copyright © Cambridge University Press 2013.

Relevance: 10.00%

Abstract:

Previous short-term studies predict that the use of fire to manage lantana (Lantana camara) may promote its abundance. We tested this prediction by examining long-term recruitment patterns of lantana in a dry eucalypt forest in Australia from 1959 to 2007 in three fire frequency treatments: repeated annual burning, repeated triennial burning and long unburnt. The dataset was divided into two periods (1959–1972, 1974–2007) due to logging that occurred at the study site between 1972 and 1974 and the establishment of the triennial burn treatment in 1973. Our results showed that repeated burning decreased lantana regeneration under an annual burn regime in the pre- and post-logging periods and maintained low levels of regeneration in the triennial burn compartment during the post-logging period. In the absence of fire, lantana recruitment exhibited a dome-shaped response over time, with the total population peaking in 1982 before declining to 2007. In addition to fire regime, soil pH and carbon to nitrogen ratio, the density of taller conspecifics and the interaction between rainfall and fire regime were found to influence lantana regeneration change over time. The results suggest that the reported positive association between fire disturbance and abundance of lantana does not hold for all forest types and that fire should be considered as part of an integrated weed management strategy for lantana in more fire-tolerant ecosystems.

Relevance: 10.00%

Abstract:

Historical stocking methods of continuous, season-long grazing of pastures with little account of growing conditions have caused some degradation within grazed landscapes in northern Australia. Alternative stocking methods have been implemented to address this degradation and raise the productivity and profitability of the principal livestock, cattle. Because information comparing stocking methods is limited, an evaluation was undertaken to quantify the effects of stocking methods on pastures, soils and grazing capacity. The approach was to monitor existing stocking methods on nine commercial beef properties in north and south Queensland. Environments included native and exotic pastures and eucalypt (lighter soil) and brigalow (heavier soil) land types. Breeding and growing cattle were grazed under each method. The owners/managers, formally trained in pasture and grazing management, made all management decisions affecting the study sites. Three stocking methods were compared: continuous (with rest), extensive rotation and intensive rotation (commonly referred to as 'cell grazing'). There were two or three stocking methods examined on each property: in total, 21 methods (seven continuous, six extensive rotations and eight intensive rotations) were monitored over 74 paddocks between 2006 and 2009. Pasture and soil surface measurements were made in the autumns of 2006, 2007 and 2009, while the paddock grazing was analysed from property records for the period from 2006 to 2009. The first 2 years had drought conditions (average rainfall decile of 3.4) but were followed by 2 years of above-average rainfall. There were no consistent differences between stocking methods across all sites over the 4 years for herbage mass, plant species composition, total and litter cover, or landscape function analysis (LFA) indices.
There were large responses to rainfall in the last 2 years, with mean herbage mass in the autumn increasing from 1970 kg DM ha⁻¹ in 2006–07 to 3830 kg DM ha⁻¹ in 2009. Over the same period, ground and litter cover and LFA indices increased. Across all sites and 4 years, mean grazing capacity was similar for the three stocking methods. There were, however, significant differences in grazing capacity between stocking methods at four sites, but these differences were not consistent between stocking methods or sites. Both the continuous and intensive rotation methods supported the highest average annual grazing capacity at different sites. The results suggest that cattle producers can obtain similar ecological responses and carry similar numbers of livestock under any of the three stocking methods.

Relevance: 10.00%

Abstract:

Alternaria leaf blotch and fruit spot, caused by Alternaria spp., cause annual losses to the Australian apple industry. Control options are limited, mainly due to a lack of understanding of the disease cycle. Therefore, this study aimed to determine potential sources of Alternaria spp. inoculum in the orchard and examine their relative contribution throughout the production season. Leaf residue from the orchard floor, canopy leaves, twigs and buds were collected monthly from three apple orchards for two years and examined for the number of spores on their surface. In addition, the effects of climatic factors on spore production dynamics in each plant part were examined. Although all four plant parts tested contributed to the Alternaria inoculum in the orchard, significantly higher numbers of spores were obtained from leaf residue than from the other plant parts, supporting the hypothesis that overwintering of Alternaria spp. occurs mainly in leaf residue and minimally on twigs and buds. The most significant period of spore production on leaf residue occurred from dormancy until bloom, and on canopy leaves and twigs during the fruit growth stage. Temperature was the single most significant factor influencing the amount of Alternaria inoculum, while rainfall and relative humidity, in association with temperature, also influenced spore production dynamics in Australian orchards. The practical implications of this study include eradicating leaf residue from the orchard floor and sanitising the canopy after harvest to remove residual spores from the trees.