808 results for Best management practices (Pollution prevention)


Abstract:

During the post-rainy (rabi) season in India around 3 million tonnes of sorghum grain is produced from 5.7 million ha of cropping. This underpins the livelihood of about 5 million households. Severe drought is common as the crop grown in these areas relies largely on soil moisture stored during the preceding rainy season. Improvement of rabi sorghum cultivars through breeding has been slow but could be accelerated if drought scenarios in the production regions were better understood. The sorghum crop model within the APSIM (Agricultural Production Systems sIMulator) platform was used to simulate crop growth and yield and the pattern of crop water status through each season using available historical weather data. The current model reproduced credibly the observed yield variation across the production region (R2=0.73). The simulated trajectories of drought stress through each crop season were clustered into five different drought stress patterns. A majority of trajectories indicated terminal drought (43%) with various timings of onset during the crop cycle. The most severe droughts (25% of seasons) were when stress began before flowering and resulted in failure of grain production in most cases, although biomass production was not affected so severely. The frequencies of drought stress types were analyzed for selected locations throughout the rabi tract and showed different zones had different predominating stress patterns. This knowledge can help better focus the search for adaptive traits and management practices to specific stress situations and thus accelerate improvement of rabi sorghum via targeted specific adaptation. The case study presented here is applicable to other sorghum growing environments. © 2012 Elsevier B.V.
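The trajectory-clustering step described above can be sketched as follows. This is a minimal illustration, assuming simulated seasonal water-stress indices are available as one row per season; the synthetic data and the choice of k-means with k = 5 are assumptions for illustration, since the abstract does not state which clustering method was used.

```python
# Sketch: cluster simulated seasonal drought-stress trajectories into a small
# number of stress patterns, as described above. Each row is a season's
# water-stress index (1 = unstressed, 0 = fully stressed) at regular steps
# through the crop cycle; the data here are synthetic placeholders and the
# use of k-means with k = 5 is an illustrative assumption.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_seasons, n_steps = 120, 20
onset = rng.integers(5, n_steps, n_seasons)          # step at which stress begins
t = np.arange(n_steps)
trajectories = np.clip(1.0 - 0.12 * np.maximum(0, t - onset[:, None]), 0.0, 1.0)

km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(trajectories)
for label in range(5):
    share = 100.0 * np.mean(km.labels_ == label)
    print(f"stress pattern {label}: {share:.0f}% of simulated seasons")
```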

Abstract:

The off-site transport of agricultural chemicals, such as herbicides, into freshwater and marine ecosystems is a world-wide concern. The adoption of farm management practices that minimise herbicide transport in rainfall-runoff is a priority for the Australian sugarcane industry, particularly in the coastal catchments draining into the World Heritage listed Great Barrier Reef (GBR) lagoon. In this study, residual herbicide runoff and infiltration were measured using a rainfall simulator in a replicated trial on a brown Chromosol with 90–100% cane trash blanket cover in the Mackay Whitsunday region, Queensland. Management treatments included conventional 1.5 m spaced sugarcane beds with a single row of sugarcane (CONV) and 2 m spaced, controlled traffic sugarcane beds with dual sugarcane rows (0.8 m apart) (2mCT). The aim was to simulate the first rainfall event after the application of the photosynthesis-inhibiting (PSII) herbicides ametryn, atrazine, diuron and hexazinone, by broadcast (100% coverage, on bed and furrow) and banded (50–60% coverage, on bed only) methods. These events included heavy rainfall 1 day after herbicide application, considered a worst-case scenario, or rainfall 21 days after application. The 2mCT rows had significantly (P < 0.05) less runoff (38%) and lower peak runoff rates (43%) than CONV rows for a rainfall average of 93 mm at 100 mm h−1 (1:20 yr Average Return Interval). Additionally, final infiltration rates were higher in 2mCT rows than CONV rows, at 72 and 52 mm h−1, respectively. This resulted in load reductions of 60, 55, 47, and 48% for ametryn, atrazine, diuron and hexazinone from 2mCT rows, respectively. Herbicide losses in runoff were also reduced by 32–42% when applications were banded rather than broadcast. When rainfall was experienced 1 day after application, a large percentage of herbicides was washed off the cane trash. However, by day 21, concentrations of herbicide residues on cane trash were lower and more resistant to washoff, resulting in lower losses in runoff. Consequently, ametryn and atrazine event mean concentrations in runoff were approximately 8-fold lower at day 21 than at day 1, whilst diuron and hexazinone were only 1.6–1.9-fold lower, suggesting longer persistence of these chemicals. Runoff collected at the end of the paddock in natural rainfall events showed treatment differences consistent with, though smaller than, those in the rainfall simulation study. Overall, it was the combination of early application, banding and controlled traffic that was most effective in reducing herbicide losses in runoff. Crown copyright © 2012
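For readers unfamiliar with how runoff loads and the quoted percentage reductions are derived, the sketch below shows the standard calculation (event load = concentration × runoff volume). The concentrations and runoff depths are hypothetical; the study reports only the resulting reductions.

```python
# Sketch: event herbicide load and the percentage load reduction between two
# treatments. Concentrations and runoff depths are hypothetical; the study
# reports only the measured percentage reductions.
def event_load_g(conc_mg_per_L, runoff_mm, area_ha=1.0):
    """Event load (g) = concentration (mg/L) x runoff volume (L)."""
    runoff_L = runoff_mm * 1e4 * area_ha   # 1 mm of runoff over 1 ha = 10,000 L
    return conc_mg_per_L * runoff_L / 1000.0

load_conv = event_load_g(conc_mg_per_L=0.20, runoff_mm=60)   # conventional beds
load_2mct = event_load_g(conc_mg_per_L=0.16, runoff_mm=37)   # controlled traffic
reduction = 100.0 * (load_conv - load_2mct) / load_conv
print(f"load reduction: {reduction:.0f}%")   # ~51% with these illustrative numbers
```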

Abstract:

More than 1200 wheat and 120 barley experiments conducted in Australia to examine yield responses to applied nitrogen (N) fertiliser are contained in a national database of field crops nutrient research (BFDC National Database). The yield responses are accompanied by various pre-plant soil test data to quantify plant-available N and other indicators of soil fertility status or mineralisable N. A web application (BFDC Interrogator), developed to access the database, enables construction of calibrations between relative crop yield (RY = (Y0/Ymax) × 100) and N soil test value. In this paper we report the critical soil test values for 90% RY (CV90) and the associated critical ranges (CR90, defined as the 70% confidence interval around that CV90) derived from analysis of various subsets of these winter cereal experiments. Experimental programs were conducted throughout Australia’s main grain-production regions in different eras, starting from the 1960s in Queensland through to Victoria during the 2000s. Improved management practices adopted over this period were reflected in potential yields that increased with research era, from an average Ymax of 2.2 t/ha in Queensland in the 1960s and 1970s, to 3.4 t/ha in South Australia (SA) in the 1980s, 4.3 t/ha in New South Wales (NSW) in the 1990s, and 4.2 t/ha in Victoria in the 2000s. Various sampling depths (0.1–1.2 m) and methods of quantifying available N (nitrate-N or mineral-N) from pre-planting soil samples were used and provided useful guides to the need for supplementary N. The most regionally consistent relationships were established using nitrate-N (kg/ha) in the top 0.6 m of the soil profile, with regional and seasonal variation in CV90 largely accounted for through impacts on experimental Ymax. The CV90 for nitrate-N within the top 0.6 m of the soil profile for wheat crops increased from 36 to 110 kg nitrate-N/ha as Ymax increased over the range 1 to >5 t/ha. Apparent variation in CV90 with seasonal moisture availability was entirely consistent with impacts on experimental Ymax. Further analyses of wheat trials with available grain protein data (~45% of all experiments) established that grain yield, and not grain N content, was the major driver of crop N demand and CV90. Subsets of data were used to explore the impact of crop management practices such as crop rotation or fallow length on both pre-planting profile mineral-N and CV90. Analyses showed that while management practices influenced profile mineral-N at planting and the likelihood and size of yield response to applied N fertiliser, they had no significant impact on CV90. A level of risk is involved with the use of pre-plant testing to determine the need for supplementary N application in all Australian dryland systems. In southern and western regions, where crop performance is based almost entirely on in-crop rainfall, this risk is offset by the management opportunity to split N applications during crop growth in response to changing crop yield potential. In northern cropping systems, where stored soil moisture at sowing is indicative of minimum yield potential, erratic winter rainfall increases uncertainty about actual yield potential as well as reducing the opportunity for effective in-season applications.
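A minimal sketch of how a CV90 can be derived from a relative-yield calibration like the one described above. The Mitscherlich-type response function and the synthetic data points are illustrative assumptions; the abstract does not specify the response model fitted by the BFDC Interrogator.

```python
# Sketch: derive a critical soil test value for 90% relative yield (CV90) from
# a calibration of relative yield (RY) against pre-plant nitrate-N. The
# Mitscherlich response form and the data points are illustrative assumptions;
# the abstract does not state which function the BFDC Interrogator fits.
import numpy as np
from scipy.optimize import curve_fit

def mitscherlich(n, a, b):
    return a * (1.0 - np.exp(-b * n))   # RY (%) as a function of nitrate-N (kg/ha)

nitrate_n = np.array([10, 20, 40, 60, 80, 120, 160, 200], dtype=float)
rel_yield = np.array([35, 55, 78, 87, 92, 97, 99, 100], dtype=float)

(a, b), _ = curve_fit(mitscherlich, nitrate_n, rel_yield, p0=(100.0, 0.02))
cv90 = -np.log(1.0 - 90.0 / a) / b      # invert the fitted curve at RY = 90%
print(f"CV90 ~ {cv90:.0f} kg nitrate-N/ha")
```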

Abstract:

The inheritance and fitness of phosphine resistance were investigated in an Australian strain of the rice weevil, Sitophilus oryzae (L.), as well as its prevalence in eastern Australia. This type of knowledge may provide insights into the development of phosphine resistance in this species, with the potential for better management. The strain was 12.2-fold resistant at the LC50 level, based on results for adults exposed for 20 h. Data from the testing of F1 adults from the reciprocal crosses (R♀ × S♂ and S♀ × R♂) showed that resistance was autosomal and inherited as an incompletely recessive trait with a degree of dominance of -0.88. The dose-response data for the F1 × S and F1 × R test crosses and the F2 progeny were compared with the dose-response predicted assuming monogenic recessive inheritance, and the results were consistent with resistance being conferred by one major gene. There was no evidence of a fitness cost based on the frequency of susceptible phenotypes in hybridized populations that were reared for seven generations without exposure to phosphine. The lack of a fitness cost suggests that resistant alleles will tend to persist in field populations that have undergone selection even if selection pressure is removed. Discriminating dose tests on 107 population samples collected from farms from 2006 to 2010 showed that populations containing insects with the weakly resistant phenotype are common in eastern Australia, although the frequency of resistant phenotypes within samples was typically low. The prevalence of resistance is a warning that this species has been subject to considerable selection pressure and that effective resistance management practices are needed to address this problem. Crown Copyright © 2014.
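The degree-of-dominance value quoted above (-0.88) is conventionally calculated from the LC50 values of the susceptible, resistant and F1 strains, commonly using Stone's (1968) formula. The sketch below shows that calculation with hypothetical LC50 values chosen to be consistent with the reported 12.2-fold resistance factor and a dominance of about -0.88; the actual LC50 values are not given in the abstract.

```python
# Sketch: degree of dominance from LC50 values of the susceptible (S),
# resistant (R) and F1 strains, using Stone's (1968) formula. The LC50 values
# below are hypothetical but chosen to be consistent with the reported
# 12.2-fold resistance factor and a dominance of about -0.88.
import math

def degree_of_dominance(lc50_s, lc50_r, lc50_f1):
    """Returns a value from -1 (completely recessive) to +1 (completely dominant)."""
    ls, lr, lf = (math.log10(x) for x in (lc50_s, lc50_r, lc50_f1))
    return (2 * lf - lr - ls) / (lr - ls)

print(degree_of_dominance(lc50_s=0.010, lc50_r=0.122, lc50_f1=0.0116))  # ~ -0.88
```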

Abstract:

Weed management practices in cotton systems that were based on frequent cultivation, residual herbicides, and some post-emergent herbicides have changed. The ability to use glyphosate as a knockdown before planting, in shielded sprayers, and now over-the-top in glyphosate-tolerant cotton has led to a significant reduction in the use of residual herbicides and cultivation. Glyphosate is now the dominant herbicide in both crop and fallow. This reliance increases the risk of shifts to glyphosate-tolerant species and the evolution of glyphosate-resistant weeds. Four surveys were undertaken in the 2008-09 and 2010-11 seasons. Surveys were conducted at the start of the summer cropping season (November-December) and at the end of the same season (March-April). Fifty fields previously surveyed in irrigated and non-irrigated cotton systems were re-surveyed. A major species shift towards Conyza bonariensis was observed. There was also a minor increase in the prevalence of Sonchus oleraceus. Several species were still present at the end of the season, indicating poor control and/or late-season germinations. These included C. bonariensis, S. oleraceus, Hibiscus verdcourtii and Hibiscus tridactylites, Echinochloa colona, Convolvulus sp., Ipomoea lonchophylla, Chamaesyce drummondii, Cullen sp., Amaranthus macrocarpus, and Chloris virgata. These species, with the exception of E. colona, H. verdcourtii, and H. tridactylites, have tolerance to glyphosate and are therefore likely candidates either to remain or to increase in dominance in a glyphosate-based system.

Abstract:

Small, not-for-profit organisations fulfil a need in the economy that is typically not satisfied by for-profit firms. They also operate in ways that are distinct from larger organisations. While such firms employ a substantial proportion of the workforce, research addressing human resource management (HRM) practices in these settings is limited. This article used data collected from five small not-for-profit firms in Australia to examine the way one significant HRM practice – the provision and utilisation of flexible work arrangements – operates in the sector. Drawing on research from several scholarly fields, the article firstly develops a framework comprising three tensions in not-for-profits that have implications for HRM. These tensions are: (1) contradictions between an informal approach to HRM vs. a formal regulatory system; (2) employee values that favour social justice vs. external market forces; and (3) a commitment to service vs. external financial expectations. The article then empirically examines how these tensions are managed in relation to the specific case of flexible work arrangements. The study reveals that tensions around providing and accessing flexible work arrangements are managed in three ways: discretion, leadership style and distancing. These findings more broadly inform the way HRM is operationalised in this under-examined sector.

Abstract:

To break the yield ceiling of rice production, a super rice project was developed in 1996 to breed rice varieties with super-high yield. A two-year experiment was conducted to evaluate the yield and nitrogen (N)-use response of super rice to different planting methods in the single cropping season. A total of 17 rice varieties, including 13 super rice and four non-super checks (CK), were grown under three N levels [0 (N0), 150 (N150), and 225 (N225) kg ha−1] and two planting methods [transplanting (TP) and direct-seeding in wet conditions (WDS)]. Grain yield under WDS (7.69 t ha−1) was generally lower than under TP (8.58 t ha−1). However, grain yield under the different planting methods was affected by N rates as well as variety groups. In both years, there was no difference in grain yield between super and CK varieties at N150, irrespective of planting method. However, the grain yield difference was pronounced in the japonica group at N225, with average increases of 11.3% and 14.1% in super rice over CK varieties in WDS and TP, respectively. This suggests that high N input contributes to narrowing the yield gap in super rice varieties, and indicates that super rice was bred for high-fertility conditions. In the japonica group, more N was accumulated in super rice than in CK at N225, but no difference was found between super and CK varieties at N0 and N150. Similar results were found for N agronomic efficiency. The results suggest that super rice varieties have an advantage in N-use efficiency when high N is applied. The response of super rice was greater under TP than under WDS. The results also suggest the need to further improve agronomic and other management practices to achieve high yield and N-use efficiency for super rice varieties under WDS.
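N agronomic efficiency, as referred to above, is typically defined as the extra grain produced per kilogram of fertiliser N applied. A minimal sketch with hypothetical yields (the abstract reports only treatment means):

```python
# Sketch: N agronomic efficiency (extra grain per kg of fertiliser N applied).
# The yields below are hypothetical; the abstract reports only treatment means.
def agronomic_efficiency(yield_n_kg_ha, yield_n0_kg_ha, n_rate_kg_ha):
    return (yield_n_kg_ha - yield_n0_kg_ha) / n_rate_kg_ha   # kg grain per kg N

ae_n150 = agronomic_efficiency(yield_n_kg_ha=9200, yield_n0_kg_ha=6500, n_rate_kg_ha=150)
print(f"AE at N150: {ae_n150:.1f} kg grain per kg N")   # 18.0 with these numbers
```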

Abstract:

Introduction: Many prey species around the world are suffering declines due to a variety of interacting causes such as land use change, climate change, invasive species and novel disease. Recent studies on the ecological roles of top-predators have suggested that lethal top-predator control by humans (typically undertaken to protect livestock or managed game from predation) is an indirect additional cause of prey declines through trophic cascade effects. Such studies have prompted calls to prohibit lethal top-predator control with the expectation that doing so will result in widespread benefits for biodiversity at all trophic levels. However, applied experiments investigating in situ responses of prey populations to contemporary top-predator management practices are few, and none have previously been conducted on the eclectic suite of native and exotic mammalian, reptilian, avian and amphibian predator and prey taxa we simultaneously assess. We conducted a series of landscape-scale, multi-year, manipulative experiments at nine sites spanning five ecosystem types across the Australian continental rangelands to investigate the responses of sympatric prey populations to contemporary poison-baiting programs intended to control top-predators (dingoes) for livestock protection.

Results: Prey populations were almost always in similar or greater abundances in baited areas. Short-term prey responses to baiting were seldom apparent. Longer-term prey population trends fluctuated independently of baiting for every prey species at all sites, and divergence or convergence of prey population trends occurred rarely. Top-predator population trends fluctuated independently of baiting in all cases, and never diverged or converged. Mesopredator population trends likewise fluctuated independently of baiting in almost all cases, but did diverge or converge in a few instances.

Conclusions: These results demonstrate that Australian populations of prey fauna at lower trophic levels are typically unaffected by top-predator control because top-predator populations are not substantially affected by contemporary control practices, thus averting a trophic cascade. We conclude that alteration of current top-predator management practices is probably unnecessary for enhancing fauna recovery in the Australian rangelands. More generally, our results suggest that theoretical and observational studies advancing the idea that lethal control of top-predators induces trophic cascades may not be as universal as previously supposed.

Abstract:

Glyphosate-resistant Echinochloa colona (L.) Link is becoming common in non-irrigated cotton systems. Echinochloa colona is a small-seeded species whose seeds are not wind-dispersed, and it has a relatively short seed-bank life. These characteristics make resistant populations potential candidates for eradication when they are detected. A long-term systems experiment was developed to determine the feasibility of attempting to eradicate glyphosate-resistant populations in the field. To this point, the established Best Management Practice (BMP) strategy of two non-glyphosate actions in crop and in fallow has been sufficient to significantly reduce the number of plants emerging and remaining at the end of the season. Additional eradication treatments showed a slight improvement over the BMP strategy, but the difference was not significant overall. The effects of additional eradication tactics are expected to become more noticeable as the seed bank is driven down in subsequent seasons.

Abstract:

Pasture rest is a possible strategy for improving land condition in the extensive grazing lands of northern Australia. If pastures currently in poor condition could be improved, then overall animal productivity and the sustainability of grazing could be increased. The scientific literature is examined to assess the strength of the experimental evidence to support and guide the use of pasture rest, and simulation modelling is undertaken to extend this information to a broader range of resting practices, growing conditions and levels of initial pasture condition. From this, guidelines are developed that can be applied in the management of northern Australia’s grazing lands and also serve as hypotheses for further field experiments. The literature on pasture rest is diverse, but there is a paucity of data from much of northern Australia as most experiments have been conducted in southern and central parts of Queensland. Despite this, the limited experimental information and the results from modelling were used to formulate the following guidelines. Rest during the growing season gives the most rapid improvement in the proportion of perennial grasses in pastures; rest during the dormant winter period is ineffective in increasing perennial grasses in a pasture but may have other benefits. Appropriate stocking rates are essential to gain the greatest benefit from rest: if stocking rates are too high, pasture rest will not lead to improvement; if stocking rates are low, pastures will tend to improve without rest. The lower the initial percentage of perennial grasses, the more frequent the rests should be to give a major improvement within a reasonable management timeframe. Conditions during the growing season also affect responses, with the greatest improvement likely in years of good growing conditions. The duration and frequency of rest periods can be combined into a single value expressed as the proportion of time during which resting occurs; when this is done, the modelling suggests that the greater the proportion of time a pasture is rested, the greater the improvement, although this needs to be tested experimentally. These guidelines should assist land managers to use pasture resting, but the challenge remains to integrate pasture rest with other pasture and animal management practices at the whole-property scale.
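A brief illustration of the "proportion of time rested" value described above, under an assumed rotation (the specific rest schedule below is hypothetical):

```python
# Sketch: combining rest duration and frequency into the single "proportion of
# time rested" value described above. The rotation (a 4-month wet-season rest
# applied one year in three) is an illustrative assumption.
rest_months_per_cycle = 4        # length of each rest period
cycle_length_months = 36         # rest applied once every three years
proportion_rested = rest_months_per_cycle / cycle_length_months
print(f"proportion of time rested: {proportion_rested:.2f}")   # ~0.11
```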

Abstract:

The prevalence of resistance to phosphine in the rust-red flour beetle, Tribolium castaneum, from eastern Australia was investigated, as well as the potential fitness cost of this type of resistance. Discriminating dose tests on 115 population samples collected from farms from 2006 to 2010 showed that populations containing insects with the weakly resistant phenotype are common in eastern Australia (65.2% of samples), although the frequency of resistant phenotypes within samples was typically low (median of 2.3%). The population cage approach was used to investigate the possibility that carrying the alleles for weak resistance incurs a fitness cost. Hybridized populations were initiated using a resistant strain and either of two different susceptible strains. There was no evidence of a fitness cost based on the frequency of susceptible phenotypes in hybridized populations that were reared for seven generations without exposure to phosphine. This suggests that resistant alleles will tend to persist in field populations that have undergone selection even if selection pressure is removed. The prevalence of resistance is a warning that this species has been subject to considerable selection pressure and that effective resistance management practices are needed to address this problem. The resistance prevalence data also provide a basis against which to measure management success.

Abstract:

The root-lesion nematodes (RLN) Pratylenchus thornei and P. neglectus are widely distributed in Australian grain-producing regions and can reduce the yield of intolerant wheat cultivars by up to 65%, costing the industry ~123 M AUD/year. Consequently, researchers in the northern, southern and western regions have independently developed procedures to evaluate the resistance of cereal cultivars to RLN. To compare results, each of the three laboratories phenotyped sets of 26 and 36 cereal cultivars for relative resistance/susceptibility to P. thornei and P. neglectus, respectively. The northern and southern regions also investigated the effects of planting time and experiment duration on RLN reproduction and cultivar ranking. Results show that the genetic correlation between cultivars tested using the northern and southern procedures evaluating P. thornei resistance was 0.93. Genetic correlations between experiments using the same procedure, but with different planting times, were 0.99 for both the northern and southern procedures. The genetic correlations between cultivars tested using the northern, southern and western procedures evaluating P. neglectus resistance ranged from 0.71 to 0.95. Genetic correlations between experiments using the same procedure but with different planting times ranged from 0.91 to 0.99. This study established that, even though experiments were conducted in different geographic locations and with different trial management practices, the diverse nematode resistance screening procedures ranked cultivars similarly. Consequently, RLN resistance data can be pooled across regions to provide national consensus ratings of cultivars.
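As a simplified illustration of comparing cultivar rankings between two screening procedures, the sketch below uses a Pearson correlation of log-transformed cultivar means; the study itself estimated genetic correlations from mixed models, which additionally partition genetic and residual variance. The nematode densities are hypothetical.

```python
# Sketch: comparing cultivar rankings between two screening procedures using a
# Pearson correlation of log-transformed cultivar means. This is a simplified
# stand-in for the genetic correlations reported above, which were estimated
# from mixed models; the nematode densities below are hypothetical.
import numpy as np

# Final P. thornei densities (nematodes/g soil) for the same cultivars
# phenotyped with the northern and southern procedures.
northern = np.array([3.2, 8.5, 1.1, 6.7, 12.4, 2.0, 9.8, 4.5])
southern = np.array([2.8, 9.1, 1.5, 6.0, 11.2, 2.6, 10.5, 4.0])

r = np.corrcoef(np.log(northern), np.log(southern))[0, 1]
print(f"correlation of cultivar scores between procedures: {r:.2f}")
```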

Abstract:

An estimated 110 Mt of dust is eroded by wind from the Australian land surface each year, most of which originates from the arid and semi-arid rangelands. Livestock production is thought to increase the susceptibility of the rangelands to wind erosion by reducing vegetation cover and modifying surface soil stability. However, research is yet to quantify the impacts of grazing land management on the erodibility of the Australian rangelands, or to determine how these impacts vary among land types and over time. We present a simulation analysis that links a pasture growth and animal production model (GRASP) to the Australian Land Erodibility Model (AUSLEM) to evaluate the impacts of stocking rate, stocking strategy and land condition on the erodibility of four land types in western Queensland, Australia. Our results show that declining land condition, overstocking, and the use of inflexible stocking strategies have the potential to increase land erodibility and amplify accelerated soil erosion. However, land erodibility responses to grazing are complex and are influenced by land type sensitivities to different grazing strategies and by local climate characteristics. Our simulations show that land types which are more resilient to livestock grazing tend to be least susceptible to accelerated wind erosion. Increases in land erodibility are found to occur most often during climatic transitions, when vegetation cover is most sensitive to grazing pressure. However, grazing effects are limited during extreme wet and dry periods, when the influence of climate on vegetation cover is strongest. Our research provides the opportunity to estimate the effects of different land management practices across a range of land types, and provides a better understanding of the mechanisms of accelerated erosion resulting from pastoral activities. The approach could also support assessment of land erodibility at broader scales, particularly if combined with wind erosion models.

Abstract:

Assessing the impacts of climate variability on agricultural productivity at regional, national or global scale is essential for defining adaptation and mitigation strategies. In this study we explore the potential changes in spring wheat yields at Swift Current and Melfort, Canada, for different sowing windows under projected climate scenarios (i.e., the representative concentration pathways RCP4.5 and RCP8.5). First, the APSIM model was calibrated and evaluated at the study sites using data from long-term experimental field plots. Then, the impacts of changes in sowing date on final yield were assessed over the 2030-2099 period against a 1990-2009 baseline period of observed yield data, assuming that other crop management practices remained unchanged. Results showed that the performance of APSIM was quite satisfactory, with an index of agreement of 0.80, R2 of 0.54, and mean absolute error (MAE) and root mean square error (RMSE) of 529 kg/ha and 1023 kg/ha, respectively (MAE = 476 kg/ha and RMSE = 684 kg/ha in the calibration phase). Under the projected climate conditions, a general trend of yield loss was observed regardless of the sowing window, ranging from -24% to -94% depending on the site and the RCP, with noticeable losses during the 2060s and beyond (effects of increasing CO2 excluded). The smallest yield losses were obtained with the earliest possible sowing date (i.e., mid-April) under the projected future climate, suggesting that this option might be explored for mitigating possible adverse impacts of climate variability. Our findings could therefore serve as a basis for using APSIM as a decision support tool for evaluating adaptation/mitigation options under potential climate variability in Western Canada.
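The evaluation statistics quoted above (index of agreement, R2, MAE and RMSE) can be reproduced for any pair of observed and simulated yield series as sketched below. Willmott's index of agreement is assumed for "index of agreement", as is common in APSIM evaluations; the yield values shown are hypothetical.

```python
# Sketch: the evaluation statistics quoted above for simulated vs observed
# yields. Willmott's index of agreement is assumed for "index of agreement";
# the observed/simulated yields below are hypothetical.
import numpy as np

def evaluate(obs, sim):
    obs, sim = np.asarray(obs, dtype=float), np.asarray(sim, dtype=float)
    mae = np.mean(np.abs(sim - obs))
    rmse = np.sqrt(np.mean((sim - obs) ** 2))
    r2 = np.corrcoef(obs, sim)[0, 1] ** 2
    d = 1.0 - np.sum((sim - obs) ** 2) / np.sum(
        (np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2
    )
    return {"MAE": mae, "RMSE": rmse, "R2": r2, "index_of_agreement": d}

obs = [2100, 3400, 1800, 2900, 4100, 2500]   # observed yield (kg/ha)
sim = [2300, 3100, 2100, 3300, 3800, 2700]   # simulated yield (kg/ha)
print(evaluate(obs, sim))
```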

Abstract:

Despite considerable effort and a broad range of new approaches to safety management over the years, the upstream oil & gas industry has been frustrated by the sector’s stubbornly high rate of injuries and fatalities. This short communication points out, however, that the industry may be in a position to make considerable progress by applying “Big Data” analytical tools to the large volumes of safety-related data that have been collected by these organizations. Toward making this case, we examine existing safety-related information management practices in the upstream oil & gas industry, and specifically note that data in this sector often tends to be highly customized, difficult to analyze using conventional quantitative tools, and frequently ignored. We then contend that the application of new Big Data kinds of analytical techniques could potentially reveal patterns and trends that have been hidden or unknown thus far, and argue that these tools could help the upstream oil & gas sector to improve its injury and fatality statistics. Finally, we offer a research agenda toward accelerating the rate at which Big Data and new analytical capabilities could play a material role in helping the industry to improve its health and safety performance.