182 results for season
Abstract:
One major benefit of land application of biosolids is to supply nitrogen (N) for agricultural crops, and understanding mineralisation processes is key to better N-management strategies. Field studies were conducted to investigate the mineralisation of three biosolids products (aerobic, anaerobic, and thermally dried biosolids) incorporated into four different soils at rates of 7–90 wet t/ha in subtropical Queensland. Two of these studies also examined mineralisation rates of commonly used organic amendments (composts, manures, and sugarcane mill muds). Organic N in all biosolids products mineralised very rapidly under ambient conditions in subtropical Queensland, at rates much faster than those of the other common amendments. Biosolids mineralisation ranged from 30 to 80% of applied N over periods of 3.5–18 months after application; these rates were much higher than those suggested in the biosolids land application guidelines established by the NSW EPA (15% for anaerobic and 25% for aerobic biosolids). There was no consistently significant difference in mineralisation rate between aerobic and anaerobic biosolids in our studies. When applied at similar rates of N addition, other organic amendments supplied much less N to the soil mineral N and plant N pools during the crop season. A significant proportion of the applied biosolids total N (up to 60%) was unaccounted for at the end of the observation period. High rates of N addition in calculated Nitrogen Limited Biosolids Application Rates (850–1250 kg N/ha) resulted in excessive accumulation of mineral N in the soil profile, increasing the environmental risks of leaching, runoff, and gaseous N losses.
Moreover, the rapid mineralisation of the biosolids organic N in these subtropical environments suggests that biosolids should be applied at lower rates than in temperate areas, and that care must be taken with the timing to maximise plant uptake and minimise possible leaching, runoff, or denitrification losses of mineralised N.
Abstract:
Although rust (caused by Puccinia purpurea) is a common disease in Australian grain sorghum crops, particularly late in the growing season (April onwards), its potential to reduce yield has not been quantified. Field trials were conducted in Queensland between 2003 and 2005 to evaluate the effect of sorghum rust on grain yield of two susceptible sorghum hybrids (Tx610 and Pride). Rust was managed from 28-35 days after sowing until physiological maturity by applying oxycarboxin (1 kg active ingredient/100 L of water/ha) every 10 days. When data were combined for the hybrids, yield losses ranged from 3.2% in 2003 to 13.1% in 2005, but differences in yield between the sprayed and unsprayed treatments were statistically significant (P ≤ 0.05) only in 2005. Final area under the disease progress curve (AUDPC) values reflected the yield losses in each year. The higher yield loss in 2005 can be attributed primarily to the early development of the rust epidemic and the higher inoculum levels in spreader plots at the time of planting of the trials.
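The final AUDPC values reported above are conventionally computed with the trapezoidal rule over repeated disease-severity assessments. A minimal sketch of that calculation; the assessment days and severities below are hypothetical illustrations, not the trial data:

```python
# Area under the disease progress curve (AUDPC) by the trapezoidal rule:
# AUDPC = sum_i ((y_i + y_{i+1}) / 2) * (t_{i+1} - t_i)

def audpc(times, severities):
    """times: assessment days; severities: disease severity (%) at each time."""
    return sum(
        (severities[i] + severities[i + 1]) / 2 * (times[i + 1] - times[i])
        for i in range(len(times) - 1)
    )

# Hypothetical assessments (days after sowing, % leaf area affected by rust)
days = [35, 45, 55, 65, 75]
unsprayed = [1, 5, 15, 30, 50]
sprayed = [1, 2, 4, 7, 10]

print(audpc(days, unsprayed))  # 755.0
print(audpc(days, sprayed))    # 185.0
```

A larger AUDPC summarises an earlier or more severe epidemic over the season, which is why final AUDPC tracked yield loss between years.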
Abstract:
The spot or strip application of poisoned protein bait is a lure-and-kill technique used for the management of fruit flies. Knowledge of where flies occur in the crop environment is an important part of maximizing the efficacy of this tool. Bactrocera tryoni is a polyphagous pest of horticulture for which very little is known about its distribution within crops. With particular reference to edge effects, we monitored the abundance of B. tryoni in two crops of different architecture: strawberry and apple. In strawberries, we found more flies on the crop edge early in the fruiting season, an effect that lessened gradually and eventually disappeared as the season progressed. In apple orchards, no such edge effect was observed and flies were found equally throughout the orchard. We postulated that these differences may be due to differences in crop height (tall vs. short) and/or crop canopy architecture (open and branched in apple, dense and closed in strawberry). In a field cage trial, we tested these predictions using artificial plants of different height and canopy condition. Height and canopy structure type had no significant effects on fly oviposition and protein feeding, but the ‘apple’ type canopy significantly influenced resting. We thus postulate that there was an edge effect in strawberry because the crop did not provide resting sites and flies were resting instead in vegetation around the field margins. The finding that B. tryoni shows different resting site preferences based on plant architecture offers the potential for strategic manipulation of the fly through specific border or inter-row plantings.
Abstract:
More than 1200 wheat and 120 barley experiments conducted in Australia to examine yield responses to applied nitrogen (N) fertiliser are contained in a national database of field crops nutrient research (BFDC National Database). The yield responses are accompanied by various pre-plant soil test data to quantify plant-available N and other indicators of soil fertility status or mineralisable N. A web application (BFDC Interrogator), developed to access the database, enables construction of calibrations between relative crop yield (RY = (Y0/Ymax) × 100) and N soil test value. In this paper we report the critical soil test values for 90% RY (CV90) and the associated critical ranges (CR90, defined as the 70% confidence interval around that CV90) derived from analysis of various subsets of these winter cereal experiments. Experimental programs were conducted throughout Australia’s main grain-production regions in different eras, from Queensland in the 1960s through to Victoria in the 2000s. Improved management practices adopted during the period were reflected in increasing potential yields with research era: average Ymax increased from 2.2 t/ha in Queensland in the 1960s and 1970s, to 3.4 t/ha in South Australia (SA) in the 1980s, to 4.3 t/ha in New South Wales (NSW) in the 1990s, and 4.2 t/ha in Victoria in the 2000s. Various sampling depths (0.1–1.2 m) and methods of quantifying available N (nitrate-N or mineral-N) from pre-planting soil samples were used and provided useful guides to the need for supplementary N. The most regionally consistent relationships were established using nitrate-N (kg/ha) in the top 0.6 m of the soil profile, with regional and seasonal variation in CV90 largely accounted for through impacts on experimental Ymax. The CV90 for nitrate-N within the top 0.6 m of the soil profile for wheat crops increased from 36 to 110 kg nitrate-N/ha as Ymax increased over the range 1 to >5 t/ha.
Apparent variation in CV90 with seasonal moisture availability was entirely consistent with impacts on experimental Ymax. Further analyses of wheat trials with available grain protein (~45% of all experiments) established that grain yield and not grain N content was the major driver of crop N demand and CV90. Subsets of data explored the impact of crop management practices such as crop rotation or fallow length on both pre-planting profile mineral-N and CV90. Analyses showed that while management practices influenced profile mineral-N at planting and the likelihood and size of yield response to applied N fertiliser, they had no significant impact on CV90. A level of risk is involved with the use of pre-plant testing to determine the need for supplementary N application in all Australian dryland systems. In southern and western regions, where crop performance is based almost entirely on in-crop rainfall, this risk is offset by the management opportunity to split N applications during crop growth in response to changing crop yield potential. In northern cropping systems, where stored soil moisture at sowing is indicative of minimum yield potential, erratic winter rainfall increases uncertainty about actual yield potential as well as reducing the opportunity for effective in-season applications.
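The calibration quantities above can be illustrated with a small sketch. The Mitscherlich-type response curve and its parameters below are assumptions for illustration only (the abstract does not state the curve form fitted by BFDC Interrogator); RY is as defined in the abstract:

```python
import math

# Assumed Mitscherlich-type calibration: RY = a * (1 - exp(-b * soiltest)).
# The parameters are hypothetical, not fitted BFDC values.
a, b = 100.0, 0.025  # asymptote (% RY) and curvature (per kg nitrate-N/ha)

def relative_yield(y0, ymax):
    """RY = (Y0 / Ymax) * 100, as defined in the abstract."""
    return y0 / ymax * 100.0

def cv90(a, b):
    """Soil-test value at which the fitted curve reaches 90% RY."""
    return -math.log(1.0 - 90.0 / a) / b

print(relative_yield(3.8, 4.2))  # ~90.48% RY
print(cv90(a, b))                # ~92.1 kg nitrate-N/ha
```

With these illustrative parameters the curve crosses 90% RY near 92 kg nitrate-N/ha, i.e. within the 36–110 kg/ha span of CV90 values reported for wheat.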
Abstract:
Soil testing is the most widely used tool to predict the need for fertiliser phosphorus (P) application to crops. This study examined factors affecting critical soil P concentrations and confidence intervals for wheat and barley grown in Australian soils by interrogating validated data from 1777 wheat and 150 barley field treatment series now held in the BFDC National Database. To narrow confidence intervals associated with estimated critical P concentrations, filters for yield, crop stress, or low pH were applied. Once treatment series with low yield (<1 t/ha), severe crop stress, or pHCaCl2 <4.3 were screened out, critical concentrations were relatively insensitive to wheat yield (>1 t/ha). There was a clear increase in critical P concentration from early trials, when full tillage was common, to those conducted in 1995–2011, a period of rapid shift towards adoption of minimum tillage. For wheat, critical Colwell-P concentrations associated with 90 or 95% of maximum yield varied among Australian Soil Classification (ASC) Orders and Sub-orders: Calcarosol, Chromosol, Kandosol, Sodosol, Tenosol and Vertosol. Critical Colwell-P concentrations at 90% of maximum relative yield ranged from 15 mg/kg (Grey Vertosol) to 47 mg/kg (Supracalcic Calcarosols), with other soils having values in the range 19–27 mg/kg. Distinctive differences in critical P concentrations were evident among Sub-orders of Calcarosols, Chromosols, Sodosols, Tenosols, and Vertosols, possibly due to differences in soil properties related to P sorption. However, insufficient data were available to develop a relationship between P buffering index (PBI) and critical P concentration. In general, there was no evidence that critical concentrations for barley would be different from those for wheat on the same soils.
Significant knowledge gaps that must be filled to improve the relevance and reliability of soil P testing for winter cereals include: the lack of data for oats; the paucity of treatment series reflecting current cropping practices, especially minimum tillage; and inadequate metadata on soil texture, pH, growing season rainfall, gravel content, and PBI. The critical concentrations determined illustrate the importance of recent experimental data and of soil type, but also provide examples of interrogation pathways into the BFDC National Database to extract locally relevant critical P concentrations for guiding P fertiliser decision-making in wheat and barley.
Abstract:
Data from 9296 calves born to 2078 dams over 9 years across five sites were used to investigate factors associated with calf mortality for tropically adapted breeds (Brahman and Tropical Composite) recorded in extensive production systems, using multivariate logistic regression. The average calf mortality pre-weaning was 9.5% of calves born, varying from 1.5% to 41% across all sites and years. In total, 67% of calves that died did so within a week of their birth, with cause of death most frequently recorded as unknown. The major factors significantly (P < 0.05) associated with mortality for potentially large numbers of calves included the specific production environment represented by site-year, low calf birthweight (more so than high birthweight) and horn status at branding. Almost all calf deaths post-branding (assessed from n = 8348 calves) occurred in calves that were dehorned, totalling 2.1% of dehorned calves and 15.9% of all calf deaths recorded. Breed effects on calf mortality were primarily the result of breed differences in calf birthweight and, to a lesser extent, large teat size of cows; however, differences in other breed characteristics could be important. Twin births and calves assisted at birth had a very high risk of mortality, but <1% of calves were twins and few calves were assisted at birth. Conversely, it could not be established how many calves would have benefitted from assistance at birth. Cow age group and outcome from the previous season were also associated with current calf mortality; maiden or young cows (<4 years old) had increased calf losses overall. More mature cows with a previous outcome of calf loss were also more likely to have another calf loss in the subsequent year, and this should be considered for culling decisions. Closer attention to the management of younger cows is warranted to improve calf survival.
Abstract:
Global cereal production will need to increase by 50% to 70% to feed a world population of about 9 billion by 2050. This intensification is forecast to occur mostly in subtropical regions, where warm and humid conditions can promote high N2O losses from cropped soils. To secure high crop production without exacerbating N2O emissions, new nitrogen (N) fertiliser management strategies are necessary. This one-year study evaluated the efficacy of a nitrification inhibitor (3,4-dimethylpyrazole phosphate, DMPP) and different N fertiliser rates in reducing N2O emissions in a wheat–maize rotation in subtropical Australia. Annual N2O emissions were monitored using a fully automated greenhouse gas measuring system. Four treatments were fertilised with different rates of urea: a control (40 kg N ha−1 year−1), a conventional N fertiliser rate adjusted on estimated residual soil N (120 kg N ha−1 year−1), a conventional N fertiliser rate (240 kg N ha−1 year−1), and the conventional rate (240 kg N ha−1 year−1) with the nitrification inhibitor DMPP applied at top dressing. The maize season was by far the main contributor to annual N2O emissions due to the high soil moisture and temperature conditions, as well as the elevated N rates applied. Annual N2O emissions in the four treatments amounted to 0.49, 0.84, 2.02 and 0.74 kg N2O–N ha−1 year−1, respectively, corresponding to emission factors of 0.29%, 0.39%, 0.69% and 0.16% of total N applied. Halving the annual conventional N fertiliser rate in the adjusted N treatment led to N2O emissions comparable to the DMPP treatment but extensively penalised maize yield. The application of DMPP produced a significant reduction in N2O emissions only in the maize season. The use of DMPP with urea at the conventional N rate reduced annual N2O emissions by more than 60% but did not affect crop yields.
The results of this study indicate that: (i) future strategies aimed at securing subtropical cereal production without increasing N2O emissions should focus on the fertilisation of the summer crop; (ii) adjusting conventional N fertiliser rates based on estimated residual soil N is an effective practice to reduce N2O emissions but can lead to substantial yield losses if the residual soil N is not assessed correctly; and (iii) the application of DMPP is a feasible strategy to reduce annual N2O emissions from subtropical wheat–maize rotations. However, at the N rates tested in this study DMPP urea did not increase crop yields, making it impossible to recoup the extra costs associated with this fertiliser. The findings of this study will support farmers and policy makers in defining effective fertilisation strategies that reduce N2O emissions from subtropical cereal cropping systems while maintaining high crop productivity. More research is needed to assess whether DMPP urea can allow reduced conventional N fertiliser rates, which would lower fertilisation costs and further abate fertiliser-induced N2O emissions.
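Emission factors like those quoted above are often computed by the difference method (treatment minus control emissions, divided by the added N). The sketch below applies that convention to the annual figures in the abstract; the paper's own EF calculation is not spelled out here, so these values are illustrative rather than a reproduction of the reported 0.39%, 0.69% and 0.16%:

```python
# Fertiliser-induced N2O emission factor, difference method:
# EF(%) = (E_treatment - E_control) / (N_treatment - N_control) * 100
# Annual emissions (kg N2O-N/ha/yr) and N rates (kg N/ha/yr) from the abstract.

control = {"n_rate": 40, "n2o": 0.49}
treatments = {
    "adjusted_120":          {"n_rate": 120, "n2o": 0.84},
    "conventional_240":      {"n_rate": 240, "n2o": 2.02},
    "conventional_240_dmpp": {"n_rate": 240, "n2o": 0.74},
}

def emission_factor(treatment, control):
    """Percentage of added fertiliser N emitted as N2O-N above background."""
    added_n = treatment["n_rate"] - control["n_rate"]
    return (treatment["n2o"] - control["n2o"]) / added_n * 100.0

for name, t in treatments.items():
    print(f"{name}: EF = {emission_factor(t, control):.2f}%")
```

Whatever the exact convention, the ordering is preserved: the DMPP treatment's EF is a fraction of the conventional treatment's at the same N rate, consistent with the >60% emission reduction reported.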
Abstract:
Historical stocking methods of continuous, season-long grazing of pastures with little account of growing conditions have caused some degradation within grazed landscapes in northern Australia. Alternative stocking methods have been implemented to address this degradation and raise the productivity and profitability of the principal livestock, cattle. Because information comparing stocking methods is limited, an evaluation was undertaken to quantify the effects of stocking methods on pastures, soils and grazing capacity. The approach was to monitor existing stocking methods on nine commercial beef properties in north and south Queensland. Environments included native and exotic pastures and eucalypt (lighter soil) and brigalow (heavier soil) land types. Breeding and growing cattle were grazed under each method. The owners/managers, formally trained in pasture and grazing management, made all management decisions affecting the study sites. Three stocking methods were compared: continuous (with rest), extensive rotation and intensive rotation (commonly referred to as 'cell grazing'). Two or three stocking methods were examined on each property: in total, 21 methods (seven continuous, six extensive rotations and eight intensive rotations) were monitored over 74 paddocks between 2006 and 2009. Pasture and soil surface measurements were made in the autumns of 2006, 2007 and 2009, while paddock grazing was analysed from property records for the period from 2006 to 2009. The first 2 years were drought-affected (average rainfall decile of 3.4) but were followed by 2 years of above-average rainfall. There were no consistent differences between stocking methods across all sites over the 4 years for herbage mass, plant species composition, total and litter cover, or landscape function analysis (LFA) indices.
There were large responses to rainfall in the last 2 years with mean herbage mass in the autumn increasing from 1970 kg DM ha(-1) in 2006-07 to 3830 kg DM ha(-1) in 2009. Over the same period, ground and litter cover and LFA indices increased. Across all sites and 4 years, mean grazing capacity was similar for the three stocking methods. There were, however, significant differences in grazing capacity between stocking methods at four sites but these differences were not consistent between stocking methods or sites. Both the continuous and intensive rotation methods supported the highest average annual grazing capacity at different sites. The results suggest that cattle producers can obtain similar ecological responses and carry similar numbers of livestock under any of the three stocking methods.
Abstract:
Alternaria leaf blotch and fruit spot, caused by Alternaria spp., cause annual losses to the Australian apple industry. Control options are limited, mainly due to a lack of understanding of the disease cycle. Therefore, this study aimed to determine potential sources of Alternaria spp. inoculum in the orchard and examine their relative contribution throughout the production season. Leaf residue from the orchard floor, canopy leaves, twigs and buds were collected monthly from three apple orchards for two years and examined for the number of spores on their surface. In addition, the effects of climatic factors on spore production dynamics in each plant part were examined. Although all four plant parts tested contributed to the Alternaria inoculum in the orchard, significantly higher numbers of spores were obtained from leaf residue than from the other plant parts, supporting the hypothesis that overwintering of Alternaria spp. occurred mainly in leaf residue and minimally on twigs and buds. The most significant period of spore production occurred on leaf residue from dormancy until bloom, and on canopy leaves and twigs during the fruit growth stage. Temperature was the single most significant factor influencing the amount of Alternaria inoculum, while rainfall and relative humidity, through their strong associations with temperature, also influenced spore production dynamics in Australian orchards. The practical implications of this study include the removal of leaf residue from the orchard floor and sanitation of the canopy after harvest to remove residual spores from the trees.
Abstract:
The Florida manatee, Trichechus manatus latirostris, is a hindgut-fermenting herbivore. In winter, manatees migrate to warm water overwintering sites where they undergo dietary shifts and may suffer from cold-induced stress. Given these seasonally induced changes in diet, the present study aimed to examine variation in the hindgut bacterial communities of wild manatees overwintering at Crystal River, west Florida. Faeces were sampled from 36 manatees of known sex and body size in early winter when manatees were newly arrived, and then in mid-winter and late winter when diet had probably changed and environmental stress may have increased. Concentrations of faecal cortisol metabolite, an indicator of a stress response, were measured by enzyme immunoassay. Using 454-pyrosequencing, 2027 bacterial operational taxonomic units were identified in manatee faeces following amplicon pyrosequencing of the 16S rRNA gene V3/V4 region. Classified sequences were assigned to eight previously described bacterial phyla; only 0.36% of sequences could not be classified to phylum level. Five core phyla were identified in all samples. The majority (96.8%) of sequences were classified as Firmicutes (77.3 ± 11.1% of total sequences) or Bacteroidetes (19.5 ± 10.6%). Alpha-diversity measures trended towards higher diversity of hindgut microbiota in manatees in mid-winter compared to early and late winter. Beta-diversity measures, analysed by PERMANOVA, also indicated significant differences in bacterial communities among seasons.
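Alpha-diversity from OTU tables, as compared above, is commonly summarised with the Shannon index; the abstract does not name the specific measures used, and the OTU counts below are hypothetical:

```python
import math

def shannon_index(counts):
    """Shannon alpha-diversity H' = -sum(p_i * ln p_i) over OTU counts."""
    total = sum(counts)
    props = (c / total for c in counts if c > 0)
    return -sum(p * math.log(p) for p in props)

# Hypothetical OTU count profiles, one sample per winter period
early = [500, 300, 100, 50, 50]   # dominated by two OTUs
mid = [300, 250, 200, 150, 100]   # more even profile

print(shannon_index(early))
print(shannon_index(mid))  # higher H' for the more even community
```

A mid-winter sample with a more even OTU distribution would score a higher H', matching the trend towards higher mid-winter diversity reported above.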
Abstract:
Lemurs are the most olfactory-oriented of primates, yet there is still only a basic level of understanding of what their scent marks communicate. We analyzed scent secretions from Milne-Edwards' sifakas (Propithecus edwardsi) collected in their natural habitat of Ranomafana National Park, Madagascar. We sought to test whether the scent mark could signal genetic relatedness in addition to species, sex, season, and individuality. We found correlations not only between the total olfactory fingerprint and genetic relatedness (r² = 0.38, P = 0.017) but also between relatedness and specific components of the odor, despite the complex environmental signals from differences in diet and behavior in a natural setting. To the best of our knowledge, this is the first demonstration of an association between genetic relatedness and chemical communication in a wild primate population. Furthermore, we found a variety of compounds that were specific to each sex and each sampling period. This research shows that scent marks could act as a remote signal to avoid inbreeding, optimize mating opportunities, and potentially aid kin selection. © 2013 Springer-Verlag Berlin Heidelberg.
Abstract:
Long-fallow disorder is expressed as exacerbated deficiencies of phosphorus (P) and/or zinc (Zn) in field crops growing after long periods of weed-free fallow. The hypothesis that arbuscular-mycorrhizal fungi (AMF) improve the P and Zn nutrition, and thereby biomass production and seed yield, of linseed (Linum usitatissimum) was tested in a field experiment. A factorial combination of treatments consisting of ± fumigation, ± AMF inoculation with Glomus spp., ± P and ± Zn fertilisers was used on a long-fallowed vertisol. The use of such methods allowed an absolute comparison of plants growing with and without AMF in the field for the first time in a soil disposed to long-fallow disorder. Plant biomass, height, P and Zn concentrations and contents, boll number and final seed yield were (a) least in fumigated soil with negligible AMF colonisation of the roots, (b) low initially in long-fallow soil but increased with time as AMF colonisation of the roots developed, and (c) greatest in soil inoculated with AMF cultures. The results showed for the first time in the field that inflows of both P and Zn into linseed roots were highly dependent on %AMF-colonisation (R² = 0.95 for P and 0.85 for Zn, P < 0.001) in a soil disposed to long-fallow disorder. Relative field mycorrhizal dependencies without and with P+Zn fertiliser were 85% and 86% for biomass and 68% and 52% for seed yield respectively. This research showed in the field that AMF greatly improved the P and Zn nutrition, biomass production and seed yield of linseed growing in a soil disposed to long-fallow disorder. The level of mycorrhizal colonisation of plants suffering from long-fallow disorder can increase during the growing season, resulting in improved plant growth and residual AMF inoculum in the soil, and thus it is important for growers to recognise the cause and not terminate a poor crop prematurely in order to sow another.
Other positive management options to reduce long fallows and foster AMF include adoption of conservation tillage and opportunity cropping.
Abstract:
Glyphosate resistance is a rapidly developing threat to profitability in Australian cotton farming. Resistance causes an immediate reduction in the effectiveness of in-crop weed control in glyphosate-resistant transgenic cotton and summer fallows. Although strategies for delaying glyphosate resistance and those for managing resistant populations are qualitatively similar, the longer resistance can be delayed, the longer cotton growers will have choice over which tactics to apply and when to apply them. Effective strategies to avoid, delay, and manage resistance are thus of substantial value. We used a model of glyphosate resistance dynamics to perform simulations of resistance evolution in Sonchus oleraceus (common sowthistle) and Echinochloa colona (awnless barnyard grass) under a range of resistance prevention, delaying, and management strategies. From these simulations, we identified several elements that could contribute to effective glyphosate resistance prevention and management strategies. (i) Controlling glyphosate survivors is the most robust approach to delaying or preventing resistance. High-efficacy, high-frequency survivor control almost doubled the useful lifespan of glyphosate, from 13 to 25 years, even with glyphosate alone used in summer fallows. (ii) Two non-glyphosate tactics in-crop plus two in summer fallows are the minimum intervention required for long-term delays in resistance evolution. (iii) Pre-emergence herbicides are important, but should be backed up with non-glyphosate knockdowns and strategic tillage; replacing a late-season, pre-emergence herbicide with inter-row tillage was predicted to delay glyphosate resistance by 4 years in awnless barnyard grass. (iv) Weed species’ ecological characteristics, particularly seed bank dynamics, affect the effectiveness of resistance strategies; S. oleraceus, because of its propensity to emerge year-round, was less exposed to selection with glyphosate than E. colona, resulting in an extra 5 years of glyphosate usefulness (18 v. 13 years) even in the most rapid cases of resistance evolution. Delaying tactics are thus available that can provide some or many years of continued glyphosate efficacy. If glyphosate-resistant cotton cropping is to remain profitable in Australian farming systems in the long term, however, growers must adapt to the probability that they will have to deal with summer weeds that are no longer susceptible to glyphosate. Robust resistance management systems will need to include a diversity of weed control options, used appropriately.
Abstract:
Weed management practices in cotton systems that were based on frequent cultivation, residual herbicides, and some post-emergent herbicides have changed. The ability to use glyphosate as a knockdown before planting, in shielded sprayers, and now over-the-top in glyphosate-tolerant cotton has led to a significant reduction in the use of residual herbicides and cultivation. Glyphosate is now the dominant herbicide in both crop and fallow. This reliance increases the risk of shifts to glyphosate-tolerant species and the evolution of glyphosate-resistant weeds. Four surveys were undertaken in the 2008-09 and 2010-11 seasons. Surveys were conducted at the start of the summer cropping season (November-December) and at the end of the same season (March-April). Fifty fields previously surveyed in irrigated and non-irrigated cotton systems were re-surveyed. A major species shift towards Conyza bonariensis was observed. There was also a minor increase in the prevalence of Sonchus oleraceus. Several species were still present at the end of the season, indicating either poor control and/or late-season germinations. These included C. bonariensis, S. oleraceus, Hibiscus verdcourtii and Hibiscus tridactylites, Echinochloa colona, Convolvulus sp., Ipomoea lonchophylla, Chamaesyce drummondii, Cullen sp., Amaranthus macrocarpus, and Chloris virgata. These species, with the exception of E. colona, H. verdcourtii, and H. tridactylites, have tolerance to glyphosate and therefore are likely candidates to either remain or increase in dominance in a glyphosate-based system.
Abstract:
The effect of plastic high tunnels on the performance of two strawberry (Fragaria ×ananassa) cultivars (Festival and Rubygem) and two breeding lines was studied in southeastern Queensland, Australia, over 2 years. Production in this area is affected by rain, with direct damage to the fruit and the development of fruit disease before harvest. The main objective of the study was to determine whether plants growing under tunnels had less rain damage, a lower incidence of disease, and higher yields than plants growing outdoors. Plants growing under the tunnels or outdoors had at best only small differences in leaf, crown, root, and flower and immature fruit dry weight. These responses were associated with relatively similar temperatures and relative humidities in the two growing environments. Marketable yields were 38% higher under the tunnels compared with yields outdoors in year 1, and 24% higher in year 2, mainly due to less rain damage. There were only small differences in the incidences of grey mold (Botrytis cinerea) and small and misshaped fruit in the plants growing under the tunnels and outdoors. There were also only small differences in postharvest quality, total soluble solids, and titratable acidity between the two environments. These results highlight the potential of plastic high tunnels for strawberry plants growing in subtropical areas that receive significant rainfall during the production season.