Abstract:
Surface losses of nitrogen from horticulture farms in coastal Queensland, Australia, may have the potential to eutrophy sensitive coastal marine habitats nearby. The potential extent of such losses was investigated in a case-study on a coastal macadamia plantation. Nitrogen losses were quantified in 5 consecutive runoff events during the 13-month study. Irrigation did not contribute to surface flows. Runoff was generated by storms at combined intensities and durations that were 20–40 mm/h for >9 min. These intensities and durations were within expected short-term (1 year) and long-term (up to 20 years) frequencies of rainfall in the study area. Surface flow volumes were 5.3 ± 1.1% of the episodic rainfall generated by such storms. Therefore, the largest part of each rainfall event was attributed to infiltration and drainage in this farm soil (Kandosol). The estimated annual loss of total nitrogen in runoff was 0.26 kg N/ha.year, representing a minimal loading of nitrogen in surface runoff when compared to other studies. The weighted average concentrations of total sediment nitrogen (TSN) and total dissolved nitrogen (TDN) generated in the farm runoff were 2.81 ± 0.77% N and 1.11 ± 0.27 mg N/L, respectively. These concentrations were considerably greater than ambient levels in an adjoining catchment waterway. Concentrations of TSN and TDN in the waterway were 0.11 ± 0.02% N and 0.50 ± 0.09 mg N/L, respectively. The steep concentration gradient of TSN and TDN between the farm runoff and the waterway demonstrated the occurrence of nutrient loading from the farming landscapes to the waterway. The TDN levels in the stream exceeded the current specified threshold of 0.2–0.3 mg N/L for eutrophication of such a waterway. Therefore, while the estimate of annual loading of N from runoff losses was comparatively low, it was evident that the stream catchment and associated agricultural land uses were already characterised by significant nitrogen loadings that pose eutrophication risks. The reported levels of nitrogen and the proximity of such waterways (8 km) to the coastline may also have implications for the nearshore (oligotrophic) marine environment during periods of turbulent flow.
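A note on the load arithmetic behind figures such as the 0.26 kg N/ha.year estimate: the sketch below (with hypothetical event volumes and concentrations, not the study's data) shows how event-based runoff sampling is combined into a volume-weighted mean concentration and a cumulative nitrogen load.

```python
# Minimal sketch (not from the paper): volume-weighted mean nitrogen
# concentration and total load from event-based runoff sampling.
# Event volumes (m3/ha) and TDN concentrations (mg N/L) are hypothetical.

events = [
    {"runoff_m3_per_ha": 12.0, "tdn_mg_per_L": 0.9},
    {"runoff_m3_per_ha": 30.0, "tdn_mg_per_L": 1.3},
    {"runoff_m3_per_ha": 8.0,  "tdn_mg_per_L": 1.0},
]

# 1 m3 = 1000 L and 1 mg = 1e-6 kg, so load (kg/ha) = volume * conc * 1e-3
loads = [e["runoff_m3_per_ha"] * 1000 * e["tdn_mg_per_L"] * 1e-6 for e in events]
total_volume = sum(e["runoff_m3_per_ha"] for e in events)

weighted_conc = sum(
    e["runoff_m3_per_ha"] * e["tdn_mg_per_L"] for e in events
) / total_volume                      # mg N/L, volume-weighted
total_load = sum(loads)               # kg N/ha over the monitored events

print(f"Volume-weighted TDN: {weighted_conc:.2f} mg N/L")
print(f"Total dissolved N load: {total_load:.3f} kg N/ha")
```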
Abstract:
The present study set out to test, through field and simulation studies, the hypothesis that incorporating short-term summer legumes, particularly the annual legume lablab (Lablab purpureus cv. Highworth), in a fallow-wheat cropping system will improve the overall economic and environmental benefits in south-west Queensland. Replicated, large plot experiments were established at five commercial properties using the properties' own machinery, and two smaller plot experiments were established at two intensively researched sites (Roma and St George). A detailed study of various other biennial and perennial summer forage legumes in rotation with wheat, as influenced by phosphorus (P) supply (10 and 40 kg P/ha), was also carried out at the two research sites. The other legumes were lucerne (Medicago sativa), butterfly pea (Clitoria ternatea) and burgundy bean (Macroptilium bracteatum). After the legumes, spring wheat (Triticum aestivum) was sown into the legume stubble. The annual lablab produced the highest forage yield, whereas germination, establishment and production of the other biennial and perennial legumes were poor, particularly in the red soil at St George. At the commercial sites, only lablab-wheat rotations were tested, with an increased supply of P in the subsurface soil (20 kg P/ha). The lablab grown at the commercial sites yielded between 3 and 6 t/ha of forage over 2-3 month periods, whereas the following wheat crop, with no applied fertiliser, yielded between 0.5 and 2.5 t/ha. The wheat following lablab yielded 30% less, on average, than the wheat in a fallow plot, yet the profitability of wheat following lablab was slightly higher than that of the wheat following fallow because of the greater costs associated with fallow management. The profitability of the lablab-wheat phase was determined after accounting for the input costs and the additional costs associated with fallow management and in-crop herbicide applications in a fallow-wheat system. The economic and environmental benefits of forage lablab and wheat cropping were also assessed through simulations over a long-term climatic pattern using economic (PreCAPS) and biophysical (Agricultural Production Systems Simulator, APSIM) decision support models. Analysis of the long-term rainfall pattern (70% in summer and 30% in winter) and the simulation studies indicated that ~50% of the time a wheat crop would not be planted or would fail to produce a profitable crop (grain yield less than 1 t/ha) because of low and unreliable rainfall in winter, whereas forage lablab in summer would produce a profitable crop, with a forage yield of more than 3 t/ha, ~90% of the time. Only 14 wheat crops (of 26 growing seasons, i.e. 54%) were profitable, compared with 22 forage lablab crops (of 25 seasons, i.e. 90%). An opportunistic double-cropping of lablab in summer and wheat in winter is also viable and profitable in 50% of the years. Simulation studies also indicated that opportunistic lablab-wheat cropping can reduce potential runoff+drainage by more than 40% in the Roma region, leading to improved economic and environmental benefits.
Abstract:
The variation in liveweight gain of grazing beef cattle as influenced by pasture type, season and year has important economic implications for mixed crop-livestock systems, and the ability to better predict such variation would benefit beef producers by providing a guide for decision making. To identify key determinants of liveweight change of Brahman-cross steers grazing subtropical pastures, measurements of pasture quality and quantity, and diet quality, in parallel with liveweight, were made over two consecutive grazing seasons (48 and 46 weeks, respectively) on mixed Clitoria ternatea/grass, Stylosanthes seabrana/grass and grass swards (grass being a mixture of Bothriochloa insculpta cv. Bisset, Dichanthium sericeum and Panicum maximum var. trichoglume cv. Petrie). Steers grazing the legume-based pastures had the highest growth rates and gained between 64 and 142 kg more than those grazing the grass pastures in under 12 months. Using an exponential model, green leaf mass, green leaf %, adjusted green leaf % (adjusted for inedible woody legume stems), and faecal near infrared reflectance spectroscopy predictions of diet crude protein and diet dry matter digestibility accounted for 77, 74, 80, 63 and 60%, respectively, of the variation in daily weight gain when data were pooled across pasture types and grazing seasons. The standard error of the regressions indicated that the 95% prediction intervals were large (±0.42–0.64 kg/head.day), suggesting that the derived regression relationships have limited practical application for accurately estimating growth rate. In this study, animal factors, especially compensatory growth effects, appeared to have a major influence on growth rate relative to pasture and diet attributes. It was concluded that predictions of growth rate based only on pasture or diet attributes are unlikely to be accurate or reliable. Nevertheless, key pasture attributes such as green leaf mass and green leaf % provide a robust indication of what proportion of the potential growth rate of the grazing animals can be achieved.
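To illustrate the kind of relationship reported above, the sketch below fits an asymptotic exponential model of daily liveweight gain against green leaf mass; the functional form and all data values are assumptions for illustration, not the study's model or measurements.

```python
# Minimal sketch (illustrative only): fitting an exponential model of daily
# liveweight gain against green leaf mass. Data values are invented.
import numpy as np
from scipy.optimize import curve_fit

green_leaf_mass = np.array([100, 250, 400, 600, 900, 1300, 1800])   # kg DM/ha
daily_gain = np.array([0.05, 0.30, 0.45, 0.60, 0.72, 0.80, 0.83])   # kg/head.day

# Asymptotic exponential: gain = a * (1 - exp(-b * leaf_mass))
def exp_model(x, a, b):
    return a * (1.0 - np.exp(-b * x))

params, cov = curve_fit(exp_model, green_leaf_mass, daily_gain, p0=(0.9, 0.002))
pred = exp_model(green_leaf_mass, *params)
r2 = 1 - np.sum((daily_gain - pred) ** 2) / np.sum((daily_gain - daily_gain.mean()) ** 2)
print(f"a = {params[0]:.2f} kg/day, b = {params[1]:.4f} per kg DM/ha, R^2 = {r2:.2f}")
```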
Abstract:
Faecal Egg Count Reduction Tests (FECRTs) for macrocyclic lactone (ML) and levamisole (LEV) drenches were conducted on two dairy farms in the subtropical, summer rainfall region of eastern Australia to determine whether anthelmintic failure contributed to severe gastrointestinal nematode infections observed in weaner calves. Subtropical Cooperia spp. were the dominant nematodes on both farms, although significant numbers of Haemonchus placei were also present on Farm 2. On Farm 1, moxidectin pour-on (MXD) applied at 0.5 mg/kg liveweight (LW) reduced the overall Cooperia burden by 82% (95% confidence limits, 37-95%) at day 7 post-drench. As worm burdens increased rapidly in younger animals in the control group (n = 4), levamisole was used as a salvage drench and these calves were withdrawn from the trial on animal welfare grounds after sample collection at day 7. Levamisole dosed at 6.8 mg/kg LW reduced the worm burden in these calves by 100%, 7 days after drenching. On Farm 2, MXD given at 0.5 mg/kg LW reduced the faecal worm egg count of cooperioids at day 8 by 96% (71-99%), oral ivermectin (IVM) at 0.2 mg/kg LW by 1.6% (-224 to 70%) and oral LEV at 7.1 mg/kg LW by 100%. For H. placei the reductions were 98% (85-99.7%) for MXD, 0.7% (-226 to 70%) for IVM and 100% for LEV. This is the first report in Australia of the failure of macrocyclic lactone treatments to control subtropical Cooperia spp., and of suspected failure to control H. placei, in cattle.
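The percentage reductions quoted above follow the standard FECRT calculation; the sketch below shows that calculation on hypothetical egg counts (the study's confidence limits would come from a variance or resampling method not reproduced here).

```python
# Minimal sketch (hypothetical counts): the basic FECRT percentage reduction,
# computed from arithmetic mean faecal egg counts of treated and control
# groups at the post-drench sampling.

def fecrt_reduction(treated_epg, control_epg):
    """Percent reduction = 100 * (1 - mean(treated) / mean(control))."""
    mean_t = sum(treated_epg) / len(treated_epg)
    mean_c = sum(control_epg) / len(control_epg)
    return 100.0 * (1.0 - mean_t / mean_c)

treated = [120, 60, 200, 90]     # eggs per gram, day 7 post-drench (hypothetical)
control = [850, 1200, 640, 980]  # eggs per gram, untreated controls (hypothetical)
print(f"Reduction: {fecrt_reduction(treated, control):.1f}%")
```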
Abstract:
Resistance to the root-lesion nematode Pratylenchus thornei was sought in wheat from the West Asia and North Africa (WANA) region in the Watkins Collection (148 bread and 139 durum wheat accessions) and the McIntosh Collection (59 bread and 43 durum wheat accessions). It was considered that landraces from this region, encompassing the centres of origin of wheat and where P. thornei also occurs, could be valuable sources of resistance for use in wheat breeding. Resistance was determined by number of P. thornei/kg soil after the growth of the plants in replicated glasshouse experiments. On average, durum accessions produced significantly lower numbers of P. thornei than bread wheat accessions in both the Watkins and McIntosh Collections. Selected accessions with low P. thornei numbers were re-tested and 13 bread wheat and 10 durum accessions were identified with nematode numbers not significantly different from GS50a, a partially resistant bread wheat line used as a reference standard. These resistant accessions, which originated in Iran, Iraq, Syria, Egypt, Sudan, Morocco, and Tunisia, represent a resource of resistance genes in the primary wheat gene pool, which could be used in Australian wheat breeding programs to reduce the economic loss from P. thornei.
Abstract:
Steer liveweight gains were measured in an extensive grazing study conducted in a Heteropogon contortus (black speargrass) pasture in central Queensland between 1988 and 2001. Treatments included a range of stocking rates in native pastures, legume-oversown native pasture and animal diet supplement/spring-burning pastures. Seasonal rainfall throughout this study was below the long-term mean. Mean annual pasture utilisation ranged from 13 to 61%. Annual liveweight gains per head in native pasture were highly variable among years and ranged from a low of 43 kg/steer at 2 ha/steer to a high of 182 kg/steer at 8 ha/steer. Annual liveweight gains were consistently highest at light stocking and decreased with increasing stocking rate. Annual liveweight gain per hectare increased linearly with stocking rate. These stocking rate trends were also evident in legume-oversown pastures, although both the intercept and slope of the regressions for legume-oversown pastures were higher than those for native pasture. The highest annual liveweight gain for legume-oversown pasture was 221 kg/steer at 4 ha/steer. After 13 years, the highest annual liveweight gain per unit area still occurred at the heaviest stocking rate despite deleterious changes in the pasture. Across all years, the annual liveweight advantage for legume-oversown pastures was 37 kg/steer. Compared with native pasture, changes in annual liveweight gain with burning were variable. It was concluded that cattle productivity is sustainable when stocking rates are maintained at 4 ha/steer or lighter (equivalent to a utilisation rate of around 30%). Although steer liveweight gain occurred at all stocking rates and economic returns were highest at the heaviest stocking rates, stocking rates heavier than 4 ha/steer are unsustainable because of their long-term impact on pasture productivity.
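The linear relationship between gain per hectare and stocking rate can be reconstructed from gain per head, as sketched below with illustrative values (not the study's data); stocking rates quoted as ha/steer are first inverted to steers/ha.

```python
# Minimal sketch (hypothetical numbers): linear regression of annual liveweight
# gain per hectare against stocking rate expressed as steers/ha.
import numpy as np

ha_per_steer = np.array([8.0, 6.0, 4.0, 3.0, 2.0])
gain_per_head = np.array([182, 170, 150, 130, 100])      # kg/steer.year (illustrative)

steers_per_ha = 1.0 / ha_per_steer
gain_per_ha = gain_per_head * steers_per_ha               # kg/ha.year

slope, intercept = np.polyfit(steers_per_ha, gain_per_ha, 1)
print(f"gain/ha ~ {intercept:.1f} + {slope:.1f} * (steers/ha)")
```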
Abstract:
Farmlets, each of 20 cows, were established to field test five milk production systems and provide a learning platform for farmers and researchers in a subtropical environment. The systems were developed through desktop modelling and industry consultation in response to the need for substantial increases in farm milk production following deregulation of the industry. Four of the systems were based on grazing and the continued use of existing farmland resource bases, whereas the fifth comprised a feedlot and associated forage base developed as a greenfield site. The field evaluation was conducted over 4 years under more adverse environmental conditions than anticipated, with below average rainfall and restrictions on irrigation. For the grazed systems, mean annual milk yield per cow ranged from 6330 kg/year (1.9 cows/ha) for a herd based on rain-grown tropical pastures to 7617 kg/year (3.0 cows/ha) where animals were based on temperate and tropical irrigated forages. For the feedlot herd, production of 9460 kg/cow.year (4.3 cows/ha of forage base) was achieved. For all herds, the level of production achieved required annual inputs of concentrates of ~3 t DM/animal and purchased conserved fodder of 0.3 to 1.5 t DM/animal. This level of supplementary feeding made a major contribution to total farm nutrient inputs, contributing 50% or more of the nitrogen, phosphorus and potassium entering the farming system, and presents challenges for managing the manure and urine that result from the higher stocking rates enabled. Mean annual milk production for the five systems ranged from 88 to 105% of that predicted by the desktop modelling. This level of agreement for the grazed systems was achieved with minimal overall change in predicted feed inputs; however, the feedlot system required a substantial increase in inputs over those predicted. Reproductive performance for all systems was poorer than anticipated, particularly over the summer mating period. We conclude that the desktop model, developed as a rapid response to assist farmers to modify their current farming systems, provided a reasonable prediction of inputs required and milk production. Further model development would need to consider more closely climate variability, the limitations summer temperatures place on reproductive success, and the feed requirements of feedlot herds.
Abstract:
BACKGROUND: Piperonyl butoxide (PB)-synergised natural pyrethrins (pyrethrin:PB ratio 1:4) were evaluated both as a grain protectant and as a disinfestant against four Liposcelidid psocids: Liposcelis bostrychophila Badonnel, L. entomophila (Enderlein), L. decolor (Pearman) and L. paeta Pearman. These are key storage pests in Australia that are difficult to control with the registered grain protectants and are increasingly being reported as pests of stored products in other countries. Firstly, mortality and reproduction of adults were determined in wheat freshly treated at 0.0, 0.75, 1.5, 3 and 6 mg/kg of pyrethrins + PB (1:4) at 30 ± 1°C and 70 ± 2% RH. Next, wheat treated at 0.0, 1.5, 3 and 6 mg/kg of pyrethrins + PB (1:4) was stored at 30 ± 1°C and 70 ± 2% RH, and mortality and reproduction of psocids were assessed after 0, 1.5, 3 and 4.5 months of storage. Finally, the potential of synergised pyrethrins as a disinfestant was assessed by establishing time to endpoint mortality for adult psocids exposed to wheat treated at 3 and 6 mg/kg of synergised pyrethrins after 0, 3, 6, 9 and 12 h of exposure. RESULTS: Synergised pyrethrins at 6 mg/kg provided 3 months of protection against all four Liposcelis spp., and at this rate complete adult mortality of these psocids can be achieved within 6 h of exposure. CONCLUSION: Piperonyl butoxide-synergised pyrethrins have excellent potential both as a grain protectant and as a disinfestant against Liposcelidid psocids.
Abstract:
When exposed to hot (22–35°C) and dry climatic conditions in the field during the final 4-6 weeks of pod filling, peanuts (Arachis hypogaea L.) can accumulate highly carcinogenic and immuno-suppressing aflatoxins. Forecasting the risk posed by these conditions can assist in minimizing pre-harvest contamination. A model was therefore developed as part of the Agricultural Production Systems Simulator (APSIM) peanut module, which calculated an aflatoxin risk index (ARI) using four temperature response functions when fractional available soil water was <0.20 and the crop was in the last 0.40 of the pod-filling phase. ARI explained 0.95 (P ≤ 0.05) of the variation in aflatoxin contamination, which varied from 0 to c. 800 µg/kg in 17 large-scale sowings in tropical and four sowings in sub-tropical environments carried out in Australia between 13 November and 16 December 2007. ARI also explained 0.96 (P ≤ 0.01) of the variation in the proportion of aflatoxin-contaminated loads (>15 µg/kg) of peanuts in the Kingaroy region of Australia during the period between the 1998/99 and 2007/08 seasons. Simulation of ARI using historical climatic data from 1890 to 2007 indicated a three-fold increase in its value since 1980 compared with the entire previous period. The increase was associated with increases in ambient temperature and decreases in rainfall. To facilitate routine monitoring of aflatoxin risk by growers in near real time, a web interface of the model was also developed. The ARI predicted using this interface for eight growers correlated significantly with the level of contamination in crops (r = 0.95, P ≤ 0.01). These results suggest that ARI simulated by the model is a reliable indicator of aflatoxin contamination that can be used in aflatoxin research as well as in a decision-support tool to monitor pre-harvest aflatoxin risk in peanuts.
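A minimal sketch of the type of rule described for the ARI follows; the triangular temperature response, its cardinal temperatures and the daily inputs are assumptions for illustration and are not the APSIM peanut module's actual functions.

```python
# Minimal sketch (assumed response and inputs, NOT the APSIM peanut module):
# accumulate a daily aflatoxin risk index from a temperature response whenever
# fractional available soil water is below 0.20 and the crop is in the last
# 0.40 of pod filling.

def temp_response(t_mean, t_min=22.0, t_opt=30.0, t_max=38.0):
    """Simple triangular response between assumed cardinal temperatures."""
    if t_mean <= t_min or t_mean >= t_max:
        return 0.0
    if t_mean <= t_opt:
        return (t_mean - t_min) / (t_opt - t_min)
    return (t_max - t_mean) / (t_max - t_opt)

def daily_ari_increment(t_mean, fasw, pod_fill_fraction):
    if fasw < 0.20 and pod_fill_fraction > 0.60:   # last 0.40 of pod filling
        return temp_response(t_mean)
    return 0.0

# Hypothetical late-season sequence: (mean temp, fractional ASW, pod-fill fraction)
days = [(29, 0.15, 0.65), (31, 0.12, 0.70), (33, 0.10, 0.80), (27, 0.25, 0.85)]
ari = sum(daily_ari_increment(t, w, f) for t, w, f in days)
print(f"Aflatoxin risk index (unitless): {ari:.2f}")
```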
Abstract:
Cattle grazing in arid rangelands of Australia suffer periodic extensive and serious poisoning by the plant species Pimelea trichostachya, P. simplex, and P. elongata. Pimelea poisoning (also known as St. George disease and Marree disease) has been attributed to the presence of the diterpenoid orthoester simplexin in these species. However, literature relating to previous studies is complicated by taxonomic revisions, and the presence of simplexin has not previously been verified in all currently recognized taxa capable of inducing pimelea poisoning syndrome, with no previous chemical studies of P. trichostachya (as currently classified) or P. simplex subsp. continua. We report here the isolation of simplexin from P. trichostachya and the development of a liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS) method to measure simplexin concentrations in pimelea plant material. Simplexin was quantified by positive-ion atmospheric pressure chemical ionization (APCI) LC-MS/MS with selected reaction monitoring (SRM) of the m/z 533.3 > 253.3 transition. LC-MS/MS analysis of the four poisonous taxa P. trichostachya, P. elongata, P. simplex subsp. continua, and P. simplex subsp. simplex showed similar profiles with simplexin as the major diterpenoid ester component in all four taxa accompanied by varying amounts of related orthoesters. Similar analyses of P. decora, P. haematostachya, and P. microcephala also demonstrated the presence of simplexin in these species but at far lower concentrations, consistent with the limited reports of stock poisoning associated with these species. The less common, shrubby species P. penicillaris contained simplexin at up to 55 mg/kg dry weight and would be expected to cause poisoning if animals consumed sufficient plant material.
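For readers unfamiliar with SRM quantitation, the sketch below shows the generic external-calibration step implied by such an LC-MS/MS method; peak areas, standard concentrations, extraction volume and sample mass are hypothetical, not the published method's values.

```python
# Minimal sketch (hypothetical values, not the published method): quantifying
# simplexin from SRM peak areas against an external calibration curve,
# assuming a linear detector response.
import numpy as np

std_conc = np.array([5, 25, 50, 100, 250])                   # ng/mL in standards
std_area = np.array([1.1e4, 5.3e4, 1.0e5, 2.1e5, 5.2e5])     # SRM peak areas

slope, intercept = np.polyfit(std_conc, std_area, 1)

sample_area = 1.6e5
extract_conc = (sample_area - intercept) / slope              # ng/mL in the extract

extract_volume_mL = 10.0
plant_dry_weight_g = 0.5
# ng total / (g * 1000) gives mg/kg dry weight
simplexin_mg_per_kg = extract_conc * extract_volume_mL / (1000.0 * plant_dry_weight_g)
print(f"Simplexin ~ {simplexin_mg_per_kg:.1f} mg/kg dry weight")
```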
Abstract:
Interest in cashew production in Australia has been stimulated by domestic and export market opportunities and the suitability of large areas of tropical Australia. Economic models indicate that cashew production is profitable at 2.8 t/ha nut-in-shell (NIS). Balanced plant nutrition is essential to achieve economic yields in Australia, with nitrogen (N) of particular importance because of its capacity to modify growth, affect nut yield and cause environmental degradation through soil acidification and off-site contamination. The study, on a commercial cashew plantation at Dimbulah, Australia, investigated the effect of N rate and timing on cashew growth, nut production, N leaching and soil chemical properties over five growth cycles (1995-1999). Nitrogen was applied during the main periods of vegetative (December-April) and reproductive (June-October) growth. Commercial NIS yields (up to 4.4 t/ha from individual trees) that exceeded the economic threshold of 2.8 t/ha were achieved. The yield response was mainly determined by canopy size, as mean nut weight, panicle density and nuts per panicle were largely unaffected by N treatments. Nitrogen application confined to the main period of vegetative growth (December-April) produced a seasonal growth pattern that corresponded most consistently with the highest NIS yield. This N timing also reduced late season flowering and undesirable post-November nut drop. Higher yields were not produced at N rates greater than 17 g/m2 of canopy surface area (equating to 210 kg N/ha for mature-size trees). High yields were attained when N concentrations in Mveg leaves in May-June were about 2%, but this assessment occurs at a time when it is not feasible to correct N deficiency. The Mflor leaf of the preceding November, used in conjunction with the Mveg leaf, was proposed as a diagnostic tool to guide N rate decisions. Leaching of nitrate-N and acidification of the soil profile were recorded to 0.9 m. This is an environmental and sustainability hazard, and demonstrates that improved methods of N management are required.
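The equivalence quoted above (17 g N/m2 of canopy surface area ≈ 210 kg N/ha for mature-size trees) is a straightforward scaling; the sketch below shows the conversion using an assumed canopy area and planting density, not values from the study.

```python
# Minimal sketch of the unit conversion implied above: an N rate per square
# metre of canopy surface area scaled to a per-hectare rate. Canopy area per
# tree and tree density are hypothetical.
n_rate_g_per_m2_canopy = 17.0      # g N per m2 of canopy surface area
canopy_area_per_tree_m2 = 60.0     # hypothetical mature-tree canopy surface area
trees_per_ha = 205                 # hypothetical planting density

n_per_tree_g = n_rate_g_per_m2_canopy * canopy_area_per_tree_m2
n_per_ha_kg = n_per_tree_g * trees_per_ha / 1000.0
print(f"{n_per_ha_kg:.0f} kg N/ha")   # ~209 kg N/ha with these assumptions
```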
Abstract:
Root-lesion nematode (Pratylenchus thornei) significantly reduces wheat yields in the northern Australian grain region. Canola is thought to have a 'biofumigation' potential to control nematodes; therefore, a field experiment was designed to compare canola with other winter crops or clean fallow for reducing P. thornei population densities and improving growth of P. thornei-intolerant wheat (cv. Batavia) in the following year. Immediately after harvest of the first-year crops, populations of P. thornei were lowest following the various canola cultivars or clean fallow (1957–5200 P. thornei/kg dry soil) and were highest following susceptible wheat cultivars (31,033–41,294/kg dry soil). Unexpectedly, at planting of the second-year wheat crop, nematode populations were at more uniform, lower levels (<5000/kg dry soil), irrespective of the previous season's treatment, and remained that way during the growing season, which was quite dry. Growth and grain yield of the second-year wheat crop were poorest on plots previously planted with canola or left fallow, due to poor colonisation with arbuscular mycorrhizal (AM) fungi, with the exception of canola cv. Karoo, which had high AM fungal colonisation but low wheat yields. There were significant regressions between growth and yield parameters of the second-year wheat and levels of AM fungal colonisation following the pre-crop treatments. Thus, canola appears to be a good crop for reducing P. thornei populations, but the AM fungal dependence of subsequent crops should be considered, particularly in the northern Australian grain region.
Abstract:
Poor land condition resulting from unsustainable grazing practices can reduce enterprise profitability and increase water, sediment and associated nutrient runoff from properties and catchments. This paper presents the results of a 6-year field study that used a series of hillslope flume experiments to evaluate the impact of improved grazing land management (GLM) on hillslope runoff and sediment yields. The study was carried out on a commercial grazing property in a catchment draining to the Burdekin River in northern Australia. During this study, average ground cover on hillslopes increased from ~35% to ~75%, although average biomass and litter levels were still relatively low for this landscape type (~60 increasing to 1100 kg of dry matter per hectare). Pasture recovery was greatest on the upper and middle parts of hillslopes. Areas that did not respond to the improved grazing management had <10% cover and were on the lower slopes, associated with the location of sodic soils and the initiation of gullies. Comparison of ground cover changes and soil conditions with adjacent properties suggests that grazing management, and not just improved rainfall conditions, was responsible for the improvements in ground cover in this study. The ground cover improvements resulted in progressively lower runoff coefficients for the first event in each wet season; however, runoff coefficients were not reduced at the annual time scale. Annual hillslope sediment yields declined by ~70% on two out of three hillslopes, although where bare patches (with <10% cover) were connected to gullies and streams, annual sediment yields increased in response to higher rainfall in the later years of the study. It appears that bare patches are the primary source areas for both runoff and erosion on these hillslopes. Achieving further reductions in runoff and erosion in these landscapes may require management practices that improve ground cover and biomass in bare areas, particularly when they are located adjacent to concentrated drainage lines.
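The runoff coefficients discussed above are simply event runoff depth divided by event rainfall depth; the sketch below computes them for two hypothetical first-of-season events.

```python
# Minimal sketch (hypothetical event data): event runoff coefficients of the
# kind compared across wet seasons in the study.
events = [
    {"label": "year 1, first event", "rain_mm": 45.0, "runoff_mm": 9.5},
    {"label": "year 5, first event", "rain_mm": 48.0, "runoff_mm": 3.0},
]
for e in events:
    rc = e["runoff_mm"] / e["rain_mm"]   # dimensionless runoff coefficient
    print(f"{e['label']}: runoff coefficient = {rc:.2f}")
```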
Abstract:
Sorghum ergot produces dihydroergosine (DHES) and related alkaloids, which cause hyperthermia in cattle. Proportions of infected panicles (grain heads), leaves and stems were determined in two forage sorghum crops extensively infected 2 to 4 weeks prior to sampling and the panicles were assayed for DHES. Composite samples from each crop, plus a third grain variety crop, were coarsely chopped and half of each sealed in plastic buckets for 6 weeks to simulate ensilation. The worst-infected panicles contained up to 55 mg DHES/kg, but dilution reduced average concentrations of DHES in crops to approximately 1 mg/kg, a relatively safe level for cattle. Ensilation significantly (P = 0.043) reduced mean DHES concentrations from 0.85 to 0.46 mg/kg.
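The dilution effect described above can be approximated as a weight-fraction-weighted average; in the sketch below the share of infected panicles in total crop dry matter is a hypothetical value chosen only to illustrate how a 55 mg/kg hotspot dilutes to roughly 1 mg/kg in the whole crop.

```python
# Minimal sketch of the dilution arithmetic implied above: whole-crop DHES is a
# weight-fraction-weighted average of infected panicles and the remaining
# (essentially alkaloid-free) forage. The panicle fraction is hypothetical.
dhes_in_infected_panicles = 55.0    # mg/kg (worst-infected panicles)
fraction_infected_panicles = 0.02   # hypothetical share of total crop dry matter

whole_crop_dhes = dhes_in_infected_panicles * fraction_infected_panicles
print(f"Whole-crop DHES ~ {whole_crop_dhes:.2f} mg/kg")   # ~1 mg/kg at a 2% share
```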
Abstract:
The nitrogen-driven trade-off between nitrogen utilisation efficiency (yield per unit nitrogen uptake) and water use efficiency (yield per unit evapotranspiration) is widespread and results from well-established, multiple effects of nitrogen availability on the water, carbon and nitrogen economy of crops. Here we used a crop model (APSIM) to simulate the yield, evapotranspiration, soil evaporation and nitrogen uptake of wheat, and analysed yield responses to water, nitrogen and climate using a framework analogous to the rate-duration model of determinate growth. The relationship between modelled grain yield (Y) and evapotranspiration (ET) was fitted to a linear-plateau function to derive three parameters: maximum yield (Ymax), the ET break-point at which yield reaches its maximum (ET#), and the rate of yield response in the linear phase (ΔY/ΔET). Against this framework, we tested the hypothesis that nitrogen deficit reduces maximum yield by reducing both the rate (ΔY/ΔET) and the range of yield response to evapotranspiration, i.e. ET# - Es, where Es is modelled median soil evaporation. Modelled data reproduced the nitrogen-driven trade-off between nitrogen utilisation efficiency and water use efficiency in a transect from Horsham (36°S) to Emerald (23°S) in eastern Australia. Increasing nitrogen supply from 50 to 250 kg N/ha reduced yield per unit nitrogen uptake from 29 to 12 kg grain/kg N and increased yield per unit evapotranspiration from 6 to 15 kg grain/ha.mm at Emerald. The same increment in nitrogen supply reduced yield per unit nitrogen uptake from 30 to 25 kg grain/kg N and increased yield per unit evapotranspiration from 6 to 25 kg grain/ha.mm at Horsham. Maximum yield ranged from 0.9 to 6.4 t/ha. Consistent with our working hypothesis, reductions in maximum yield with nitrogen deficit were associated with both a reduction in the rate of yield response to ET and a compression of the range of yield response to ET. Against the notion of managing crops to maximise water use efficiency in low rainfall environments, we emphasise the trade-off between water use efficiency and nitrogen utilisation efficiency, particularly under conditions of a high nitrogen-to-grain price ratio. The rate-range framework for characterising the relationship between yield and evapotranspiration is useful for capturing this trade-off, as the parameters were responsive to both nitrogen supply and climatic factors.
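As an illustration of the rate-range framework, the sketch below fits a linear-plateau function of yield against ET to synthetic data and recovers Ymax, ET# and ΔY/ΔET; it is not the study's fitting procedure or data.

```python
# Minimal sketch (synthetic data): fitting a linear-plateau function of grain
# yield vs evapotranspiration and recovering Ymax, the ET break-point (ET#)
# and the linear-phase slope dY/dET.
import numpy as np
from scipy.optimize import curve_fit

def linear_plateau(et, slope, et_break, y_max):
    """Yield rises linearly with ET up to the break-point, then plateaus at Ymax."""
    return np.where(et < et_break, y_max - slope * (et_break - et), y_max)

et = np.array([120, 160, 200, 240, 280, 320, 360, 400])          # mm
yield_t_ha = np.array([0.8, 1.6, 2.4, 3.2, 3.9, 4.3, 4.4, 4.4])  # t/ha (synthetic)

params, _ = curve_fit(linear_plateau, et, yield_t_ha, p0=(0.02, 300, 4.4))
slope, et_break, y_max = params
print(f"dY/dET = {slope*1000:.1f} kg/ha.mm, ET# = {et_break:.0f} mm, Ymax = {y_max:.1f} t/ha")
```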