47 results for ALMOST P-COMPACT
Abstract:
Oreochromis mossambicus (Peters 1852) are native to the eastward-flowing rivers of central and southern Africa, but since the early 1930s they have been widely distributed around the world for aquaculture and for biological control of weeds and insects. Although O. mossambicus are no longer commonly used as an aquaculture species, the biological traits that made them a popular culture species, including tolerance of wide-ranging ecological conditions, generalist dietary requirements and rapid reproduction with maternal care, have also made them a 'model' invader. Self-sustaining populations now exist in almost every region to which they have been imported. In Australia, since their introduction in the 1970s, O. mossambicus have become established in catchments along the east and west coasts and have the potential to colonise adjacent drainages. Intentional translocations are thought to be the most significant factor in their spread in Australia. The ecological and physical tolerances and preferences, reproductive behaviour, hybridisation and high degree of plasticity in the life-history traits of O. mossambicus are reviewed. Impacts of O. mossambicus on natural ecosystems, including competitive displacement of native species, habitat alteration, predation and their role as a vector in the spread of diseases, are discussed. Potential methods for eradicating or controlling invasive populations of O. mossambicus, including physical removal, piscicides, screens, environmental management and genetic technologies, are outlined.
Abstract:
Nematode species Pratylenchus thornei and P. neglectus are the two most important root-lesion nematodes affecting wheat (Triticum aestivum L.) and other grain crops in Australia. For practical plant breeding, it is valuable to know the mode of inheritance of resistance and whether the same set of genes confers resistance to both species. We evaluated reactions to P. thornei and P. neglectus of glasshouse-inoculated plants from five doubled-haploid populations derived from five resistant synthetic hexaploid wheat lines, each crossed to the susceptible Australian wheat cultivar Janz. For each cross we determined the genetic variance, heritability and minimum number of effective resistance genes for each nematode species. Distributions of nematode numbers for both species were continuous in all doubled-haploid populations. Heritabilities were high and the resistances were controlled by 4-7 genes. There was no genetic correlation between resistance to P. thornei and resistance to P. neglectus in four of the populations, and a significant but low correlation in one. Therefore, resistances to P. thornei and P. neglectus are probably inherited quantitatively and independently in four of these synthetic hexaploid wheat populations, with the possibility of at least one genetic factor contributing to resistance to both species in one population. Parents with the greatest level of resistance will be the best donor parents for adapted cultivars, and selecting for resistance to both species in early generations will be optimal to carry resistance through successive cycles of inbreeding to produce resistant cultivars for release.
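The heritability and minimum-gene-number quantities reported above can be illustrated with a toy calculation. This is a minimal sketch assuming a Castle-Wright-style estimator for doubled-haploid (DH) populations; the paper's actual estimation procedure is not given here, and all numbers below are hypothetical.

```python
# Minimal sketch: broad-sense heritability and a Castle-Wright-style
# minimum-gene-number estimate for a doubled-haploid (DH) population.
# All numbers are illustrative assumptions, not the study's data.

def broad_sense_heritability(v_genetic, v_error):
    """H^2 = Vg / (Vg + Ve): fraction of phenotypic variance that is genetic."""
    return v_genetic / (v_genetic + v_error)

def castle_wright_dh(mean_resistant, mean_susceptible, v_genetic):
    """Castle-Wright-style minimum number of effective genes for DH lines:
    n ~ (P1 - P2)^2 / (4 * Vg)."""
    return (mean_resistant - mean_susceptible) ** 2 / (4 * v_genetic)

# Hypothetical log-transformed nematode counts (lower = more resistant)
v_g, v_e = 0.40, 0.10                      # illustrative variance components
h2 = broad_sense_heritability(v_g, v_e)    # 0.8, a "high" heritability
n_genes = castle_wright_dh(1.0, 3.0, v_g)  # (1 - 3)^2 / 1.6 = 2.5
```

Because estimators of this kind assume equal, additive gene effects, they give only a lower bound on gene number, which is why abstracts such as this one report a "minimum number of effective resistance genes".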
Abstract:
Multi-species fisheries are complex to manage, and developing an appropriate governance structure is often seriously impeded because trading off sustainability objectives at the species level, economic objectives at the fleet level and social objectives at the community scale is complex. Many of these fisheries also have a mix of information, with stock assessments available for some species and almost no information on others. The fleets themselves range from small family enterprises to large vertically integrated businesses. The Queensland trawl fishery in Australia is used as a case study for this kind of fishery. It has the added complexity that a large part of the fishery lies within a World Heritage Area, the Great Barrier Reef Marine Park, which is managed by an agency of the Australian Commonwealth Government, whereas the fishery itself is managed by the Queensland State Government. A stakeholder elicitation process was used to develop social, governance, economic and ecological objectives, and then to weight their relative importance. An expert group developed different governance strawmen (management strategies), which a group of industry stakeholders and experts assessed against the objectives using multi-criteria decision analysis techniques. One strawman clearly provided the best overall set of outcomes given the multiple objectives, but was not optimal in terms of every objective, demonstrating that even the "best" strawman may be less than perfect.
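The multi-criteria scoring step described above can be sketched as a simple weighted sum, one common MCDA aggregation technique. Whether the study used this exact aggregation is not stated; all names, weights and scores below are hypothetical.

```python
# Weighted-sum MCDA sketch: each "strawman" (candidate management strategy)
# gets a 0-1 performance score against each objective, and stakeholder
# weights combine the scores. Hypothetical weights and scores throughout.

weights = {"ecological": 0.35, "economic": 0.30, "social": 0.20, "governance": 0.15}

# Performance of each strawman against each objective (0 = worst, 1 = best)
strawmen = {
    "A": {"ecological": 0.9, "economic": 0.5, "social": 0.6, "governance": 0.7},
    "B": {"ecological": 0.6, "economic": 0.8, "social": 0.7, "governance": 0.6},
    "C": {"ecological": 0.7, "economic": 0.7, "social": 0.8, "governance": 0.8},
}

def weighted_score(scores, weights):
    """Aggregate per-objective scores into one number using the weights."""
    return sum(weights[k] * scores[k] for k in weights)

ranked = sorted(strawmen, key=lambda s: weighted_score(strawmen[s], weights), reverse=True)
best = ranked[0]  # highest overall score
```

Note that the top-ranked strawman in this toy example is not the best on every single objective ("A" scores higher ecologically, "B" economically), mirroring the abstract's observation that even the "best" strawman may be less than perfect.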
Abstract:
INTRODUCTION: Terrestrial top-predators are expected to regulate and stabilise food webs through their consumptive and non-consumptive effects on sympatric mesopredators and prey. The lethal control of top-predators has therefore been predicted to inhibit top-predator function, generate the release of mesopredators and indirectly harm native fauna through trophic cascade effects. Understanding the outcomes of lethal control on interactions within terrestrial predator guilds is important for zoologists, conservation biologists and wildlife managers. However, few studies have the capacity to test these predictions experimentally, and no such studies have previously been conducted on the eclectic suite of native and exotic, mammalian and reptilian taxa we simultaneously assess. We conducted a series of landscape-scale, multi-year, manipulative experiments at nine sites spanning five ecosystem types across the Australian continental rangelands to investigate the responses of mesopredators (red foxes, feral cats and goannas) to contemporary poison-baiting programs intended to control top-predators (dingoes) for livestock protection. RESULTS: Short-term behavioural releases of mesopredators were not apparent, and in almost all cases the three mesopredators we assessed were in similar or greater abundance in unbaited areas relative to baited areas, with mesopredator abundance trends typically either uncorrelated or positively correlated with top-predator abundance trends over time. The exotic mammals and native reptile we assessed responded similarly (poorly) to top-predator population manipulation.
This is because poison baits were taken by multiple target and non-target predators and top-predator populations quickly recovered to pre-control levels, thus reducing the overall impact of baiting on top-predators and averting a trophic cascade. CONCLUSIONS: These results are in accord with other predator manipulation experiments conducted worldwide, and suggest that Australian populations of native prey fauna at lower trophic levels are unlikely to be negatively affected by contemporary dingo control practices through the release of mesopredators. We conclude that contemporary lethal control practices used on some top-predator populations do not produce the conditions required to generate positive responses from mesopredators. Functional relationships between sympatric terrestrial predators may not be altered by exposure to spatially and temporally sporadic application of non-selective lethal control.
Abstract:
Wildfire represents a major risk to pine plantations. This risk is particularly great for young plantations (generally less than 10 m in height), where prescribed fire cannot be used to manipulate fuel biomass and where flammable grasses are abundant in the understorey. We report results from a replicated field experiment in south-east Queensland, Australia, designed to determine the effects on understorey fine fuel biomass, fuel structure and composition of two rates of glyphosate (450 g L–1) application, two extents of application (inter-row only versus inter-row and row) and one or two applications. Two herbicide applications (~9 months apart) were more effective than a once-off treatment for reducing standing biomass, grass continuity, grass height, percentage grass dry weight and the density of shrubs. In addition, the 6 L ha–1 rate of application was more effective than the 3 L ha–1 rate in periodically reducing grass continuity and shrub density in the inter-rows and in reducing standing biomass in the tree rows, and application in the inter-rows and rows significantly reduced shrub density relative to the inter-row-only application. Herbicide treatment in the inter-rows and rows is likely to be useful for managing fuels before prescribed fire in young pine plantations because such treatment minimised tree scorch height during prescribed burns. Further, herbicide treatments had no adverse effects on plantation trees, and in some cases tree growth was enhanced by treatments. However, the effectiveness of herbicide treatments in reducing the risk of tree damage or mortality under wildfire conditions remains untested.
Abstract:
Data from 9296 calves born to 2078 dams over 9 years across five sites were used to investigate factors associated with calf mortality for tropically adapted breeds (Brahman and Tropical Composite) recorded in extensive production systems, using multivariate logistic regression. The average calf mortality pre-weaning was 9.5% of calves born, varying from 1.5% to 41% across all sites and years. In total, 67% of calves that died did so within a week of their birth, with cause of death most frequently recorded as unknown. The major factors significantly (P < 0.05) associated with mortality for potentially large numbers of calves included the specific production environment represented by site-year, low calf birthweight (more so than high birthweight) and horn status at branding. Almost all calf deaths post-branding (assessed from n = 8348 calves) occurred in calves that were dehorned, totalling 2.1% of dehorned calves and 15.9% of all calf deaths recorded. Breed effects on calf mortality were primarily the result of breed differences in calf birthweight and, to a lesser extent, large teat size of cows; however, differences in other breed characteristics could be important. Twin births and calves assisted at birth had a very high risk of mortality, but <1% of calves were twins and few calves were assisted at birth. Conversely, it could not be established how many calves would have benefitted from assistance at birth. Cow age group and outcome from the previous season were also associated with current calf mortality; maiden or young cows (<4 years old) had increased calf losses overall. More mature cows with a previous outcome of calf loss were also more likely to have another calf loss in the subsequent year, and this should be considered for culling decisions. Closer attention to the management of younger cows is warranted to improve calf survival.
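The risk factors identified by multivariate logistic regression above are conventionally interpreted as odds ratios. A minimal sketch of that interpretation, using hypothetical mortality proportions rather than the study's data:

```python
# Converting group mortality proportions into an odds ratio, the quantity a
# logistic regression models. Proportions below are hypothetical examples.

def odds(p):
    """Odds of an event with probability p."""
    return p / (1 - p)

def odds_ratio(p_exposed, p_unexposed):
    """Ratio of the odds of death in an exposed vs a reference group."""
    return odds(p_exposed) / odds(p_unexposed)

# e.g. low-birthweight vs normal-birthweight calves (illustrative values)
p_low, p_normal = 0.20, 0.05
or_low_bw = odds_ratio(p_low, p_normal)   # 0.25 / (1/19) = 4.75
```

An odds ratio of 4.75 would mean the odds of death in the exposed group are 4.75 times those in the reference group; a fitted multivariate model adjusts such ratios for the other factors simultaneously.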
Abstract:
Fire is a major driver of ecosystem change and can disproportionately affect the cycling of different nutrients. Thus, a stoichiometric approach to investigate the relationships between nutrient availability and microbial resource use during decomposition is likely to provide insight into the effects of fire on ecosystem functioning. We conducted a field litter bag experiment to investigate the long-term impact of repeated fire on the stoichiometry of leaf litter C, N and P pools, and nutrient-acquiring enzyme activities during decomposition in a wet sclerophyll eucalypt forest in Queensland, Australia. Fire frequency treatments have been maintained since 1972, including burning every two years (2yrB), burning every four years (4yrB) and no burning (NB). C:N ratios in freshly fallen litter were 29-42% higher and C:P ratios were 6-25% lower for 2yrB than NB during decomposition, with correspondingly lower 2yrB N:P ratios (27-32) than for NB (34-49). Trends in litter soluble and microbial N:P ratios were similar to the overall litter N:P ratios across fire treatments. Consistent with these, the ratio of activities for N-acquiring to P-acquiring enzymes in litter was higher for 2yrB than NB while 4yrB was generally intermediate between 2yrB and NB. Decomposition rates of freshly fallen litter were significantly lower for 2yrB (72±2% mass remaining at the end of experiment) than for 4yrB (59±3%) and NB (62±3%), a difference that may be related to effects of N limitation, lower moisture content, and/or litter C quality. Results for older mixed-age litter were similar to those for freshly fallen litter although treatment differences were less pronounced. Overall, these findings show that frequent fire (2yrB) decoupled N and P cycling, as manifested in litter C:N:P stoichiometry and in microbial biomass N:P ratio and enzymatic activities. These data indicate that fire induced a transient shift to N-limited ecosystem conditions during the post-fire recovery phase. 
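The C:N:P stoichiometry above reduces to simple mass ratios of element concentrations. A minimal sketch with illustrative concentrations (not the study's measurements):

```python
# Mass-based litter stoichiometry: C:N, C:P and N:P ratios computed from
# element concentrations in mg per g dry litter. Values are illustrative.

def ratios(c, n, p):
    """Return (C:N, C:P, N:P) mass ratios from element concentrations."""
    return c / n, c / p, n / p

# e.g. 480 mg/g C, 12 mg/g N, 0.30 mg/g P (hypothetical litter sample)
cn, cp, n_p = ratios(480.0, 12.0, 0.30)
```

For these assumed concentrations the N:P ratio works out at 40, which happens to fall within the range the abstract reports for the unburnt (NB) treatment (34-49); shifts in such ratios are what signal the decoupling of N and P cycling described above.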
Abstract:
BACKGROUND: Kernel brown centres in macadamia are a defect causing internal discolouration of kernels. This study investigated the effect of maintaining high moisture content in macadamia nuts-in-shell stored at 30°C, 35°C, 40°C and 45°C on the incidence of brown centres in raw kernels. RESULTS: The incidence of brown centres in raw kernels increased with nuts-in-shell storage time and temperature when high moisture content was maintained by sealing in polyethylene bags. Almost all kernels developed the defect when kept at high moisture content for 5 days at 45°C, and 44% developed brown centres after only 2 days of storage at high moisture content at 45°C. This contrasted with only 0.76% when stored for 2 days at 45°C but allowed to dry in open-mesh bags. At storage temperatures below 45°C there were fewer brown centres, but there were still significant differences between kernels stored at high moisture content and those allowed to dry (P < 0.05). CONCLUSION: Maintenance of high moisture content during macadamia nuts-in-shell storage increases the incidence of brown centres in raw kernels, and the defect increases with time and temperature. On-farm nuts-in-shell drying and storage practices should remove moisture rapidly to reduce losses. Ideally, nuts-in-shell should not be stored at high moisture content on-farm at temperatures over 30°C. © 2013 Society of Chemical Industry
Abstract:
Cotton bunchy top virus (CBTV) and the related Cotton leafroll dwarf virus (CLRDV) have caused sporadic disease outbreaks in most cotton regions of the world. Until recently, little was known about the diversity of CBTV or its natural host range. Seven natural field hosts and one experimental host of CBTV have now been identified: cotton, Malva parviflora (marshmallow weed), Abutilon theophrasti (velvetleaf), Anoda cristata (spurred anoda), Hibiscus sabdariffa (rosella), Sida rhombifolia (Paddy's lucerne), Chamaesyce hirta (asthma plant) and Gossypium australe. These are currently the only eight known hosts of CBTV. However, the virus may have a wider host range than originally thought, including further non-Malvaceae species like asthma plant (family Euphorbiaceae). There are two distinct strains of CBTV in Australia, A and B, which have been detected in cotton from numerous locations across almost all growing regions. From 105 samples of cotton that tested positive for CBTV, 6 were infections of strain A only, 60 were strain B only and 64 were mixed infections of strains A and B. These results indicate that the symptoms of cotton bunchy top disease are closely associated with the presence of strain CBTV-B. A diagnostic assay for Cotton leafroll dwarf virus (CLRDV; cotton blue disease) is being developed and has been applied successfully for the detection of CLRDV in samples from Brazil and Thailand. This is the first confirmation of CLRDV from South-East Asia, which may pose an increased biosecurity threat to the Australian industry.
Abstract:
Introduction: Many prey species around the world are suffering declines due to a variety of interacting causes such as land-use change, climate change, invasive species and novel disease. Recent studies on the ecological roles of top-predators have suggested that lethal top-predator control by humans (typically undertaken to protect livestock or managed game from predation) is an indirect additional cause of prey declines through trophic cascade effects. Such studies have prompted calls to prohibit lethal top-predator control with the expectation that doing so will produce widespread benefits for biodiversity at all trophic levels. However, applied experiments investigating in situ responses of prey populations to contemporary top-predator management practices are few, and none has previously been conducted on the eclectic suite of native and exotic mammalian, reptilian, avian and amphibian predator and prey taxa we simultaneously assess. We conducted a series of landscape-scale, multi-year, manipulative experiments at nine sites spanning five ecosystem types across the Australian continental rangelands to investigate the responses of sympatric prey populations to contemporary poison-baiting programs intended to control top-predators (dingoes) for livestock protection. Results: Prey populations were almost always in similar or greater abundance in baited areas. Short-term prey responses to baiting were seldom apparent. Longer-term prey population trends fluctuated independently of baiting for every prey species at all sites, and divergence or convergence of prey population trends occurred rarely. Top-predator population trends fluctuated independently of baiting in all cases, and never diverged or converged. Mesopredator population trends likewise fluctuated independently of baiting in almost all cases, but did diverge or converge in a few instances.
Conclusions: These results demonstrate that Australian populations of prey fauna at lower trophic levels are typically unaffected by top-predator control because top-predator populations are not substantially affected by contemporary control practices, thus averting a trophic cascade. We conclude that alteration of current top-predator management practices is probably unnecessary for enhancing fauna recovery in the Australian rangelands. More generally, our results suggest that the trophic cascades that theoretical and observational studies predict from lethal control of top-predators may not be as universal as previously supposed.
Abstract:
Objective: To describe the influence of the dingo (Canis lupus dingo) on the past, present and future distributions of sheep in Australia. Design: The role of the dingo in the rise and fall of sheep numbers is reviewed, revised data are provided on the present distribution and density of sheep and dingoes, and historical patterns of sheep distribution are used to explore the future of rangeland sheep grazing. Results: Dingoes are a critical causal factor in the distribution of sheep at the national, regional and local levels. Dingo predation contributed substantially to the historical contraction of the sheep industry to its present-day distribution, which is almost exclusively confined to areas within fenced dingo-exclusion zones. Dingo populations and/or their influence are now present and increasing in all sheep production zones of Australia, including areas that were once 'dingo free'. Conclusions: Rangeland production of wool and sheep meat is predicted to disappear within 30-40 years if the present rate of contraction of the industry continues unabated. Understanding the influence of dingoes on sheep production may help refine disease response strategies and help predict the future distribution of sheep and their diseases.
Abstract:
Lethal control of wild dogs, that is, dingoes (Canis lupus dingo) and dingo/dog (Canis lupus familiaris) hybrids, to reduce livestock predation in Australian rangelands is claimed to cause continental-scale impacts on biodiversity. Although top-predator populations may recover numerically after baiting, they are predicted to be functionally different and incapable of fulfilling critical ecological roles. This study reports the impact of baiting programmes on wild dog abundance and age structures, and on the prey of wild dogs, during large-scale manipulative experiments. Wild dog relative abundance almost always decreased after baiting, but reductions were variable and short-lived unless the prior baiting programme was particularly effective or there were follow-up baiting programmes within a few months. However, age structures of wild dogs in baited and nil-treatment areas were demonstrably different, and prey populations did diverge relative to nil-treatment areas. Re-analysed observations of wild dogs preying on kangaroos from a separate study show that successful chases resulting in attacks on kangaroos occurred when mean wild dog ages were higher and mean group size was larger. It is likely that the impact of lethal control on wild dog numbers, group sizes and age structures compromises their ability to handle large, difficult-to-catch prey. Under certain circumstances, these changes can lead to increased calf loss (Bos indicus/B. taurus genotypes) and increased kangaroo numbers. Rangeland beef producers could consider controlling wild dogs in high-risk periods when predation is more likely and avoid baiting at other times.
Abstract:
The root-lesion nematodes (RLN) Pratylenchus thornei and P. neglectus are widely distributed in Australian grain-producing regions and can reduce the yield of intolerant wheat cultivars by up to 65%, costing the industry ~AUD 123 million/year. Consequently, researchers in the northern, southern and western regions have independently developed procedures to evaluate the resistance of cereal cultivars to RLN. To compare results, each of the three laboratories phenotyped sets of 26 and 36 cereal cultivars for relative resistance/susceptibility to P. thornei and P. neglectus, respectively. The northern and southern regions also investigated the effects of planting time and experiment duration on RLN reproduction and cultivar ranking. The genetic correlation between cultivars tested using the northern and southern procedures evaluating P. thornei resistance was 0.93. Genetic correlations between experiments using the same procedure, but with different planting times, were 0.99 for both the northern and southern procedures. The genetic correlation between cultivars tested using the northern, southern and western procedures evaluating P. neglectus resistance ranged from 0.71 to 0.95. Genetic correlations between experiments using the same procedure but with different planting times ranged from 0.91 to 0.99. This study established that, even though experiments were conducted in different geographic locations and with different trial management practices, the diverse nematode resistance screening procedures ranked cultivars similarly. Consequently, RLN resistance data can be pooled across regions to provide national consensus ratings of cultivars.
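The correlations above are genetic correlations estimated from fitted models, but the underlying idea, agreement between cultivar scores from two procedures, can be illustrated with a plain Pearson correlation over hypothetical data:

```python
from statistics import mean

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical cultivar resistance scores from two screening procedures
north = [1.0, 2.0, 3.0, 4.0, 5.0]
south = [1.1, 2.2, 2.9, 4.3, 4.8]
r = pearson(north, south)  # close to 1: the procedures rank cultivars similarly
```

A correlation near 1, like the 0.93 reported for P. thornei, is what justifies pooling resistance data across regions into consensus cultivar ratings.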
Abstract:
With the potential to accumulate substantial amounts of above-ground biomass, an irrigated cotton crop at maturity can have taken up more than 20 kg/ha of phosphorus and often more than 200 kg/ha of potassium. Despite the size of plant accumulation of P and K, recovery of applied P and K fertilisers by the crop in our field experiment program has been poor. Processing large amounts of mature cotton plant material to provide a representative sample for chemical analysis has not been without its challenges, but questions regarding where, how and when the plant acquires immobile nutrients remain. Dry matter measurements early in the growing season (squaring, first white flower) have demonstrated a 50% increase in crop biomass in response to applied P (in particular), but this represents only 20% of the total P accumulation by the plant. By first open boll (and onwards), no response in dry matter or P concentration to P application could be detected. A glasshouse study indicated P recovery (to first open boll) was greater where fertiliser was completely mixed through the profile, as opposed to a banded application, suggesting cotton prefers a more diffuse distribution. The relative effects of root morphology, mycorrhizal fungi infection, seasonal growth patterns and how irrigation is applied are areas for future investigation into how, when and where cotton acquires immobile nutrients.
Abstract:
Two field experiments were established in central Queensland, at Capella and Gindie, to investigate the immediate and residual benefits of deep-placed (20 cm) nutrients in this opportunity cropping system. The field sites had factorial combinations of P (40 kg P/ha), K (200 kg K/ha) and S (40 kg S/ha), and all plots received 100 kg N/ha. No further K or S fertilisers were added during the experiment, but some crops received starter P. The Capella site was sown to chickpea in 2012, wheat in 2013 and chickpea again in 2014. The Gindie site was sown to sorghum in 2011/12, chickpea in 2013 and sorghum in early 2015. There were responses to P alone in the first two crops at each site, and there were K responses in half of the six site-years. In year 1 (a good year) both sites showed a 20% grain yield response only to deep P. In year 2 (much drier) the effects of deep P were still evident at both sites, and the effects of K were clearly evident at Gindie. There was a suggestion of an additive P+K effect at Capella and a 50% increase for P+K at Gindie. Year 3 was dry; chickpeas at Capella showed a larger response to P+K, but the sorghum at Gindie responded only to deep K. These results indicate that responses to deep-placed P and K are durable over an opportunity cropping system, and that meeting both requirements is important to achieve yield responses.