Abstract:
Alternaria leaf blotch and fruit spot, caused by Alternaria spp., cause annual losses to the Australian apple industry. Control options are limited, mainly because the disease cycle is poorly understood. This study therefore aimed to determine potential sources of Alternaria spp. inoculum in the orchard and to examine their relative contribution throughout the production season. Leaf residue from the orchard floor, canopy leaves, twigs and buds were collected monthly from three apple orchards for two years and examined for the number of spores on their surfaces. In addition, the effects of climatic factors on spore production dynamics in each plant part were examined. Although all four plant parts tested contributed to Alternaria inoculum in the orchard, significantly higher numbers of spores were obtained from leaf residue than from the other plant parts, supporting the hypothesis that overwintering of Alternaria spp. occurs mainly in leaf residue and only minimally on twigs and buds. The most significant period of spore production on leaf residue occurred from dormancy until bloom, and on canopy leaves and twigs during the fruit growth stage. Temperature was the single most important factor influencing the amount of Alternaria inoculum, while rainfall and relative humidity showed strong associations with temperature in shaping spore production dynamics in Australian orchards. The practical implications of this study include removal of leaf residue from the orchard floor and sanitation of the canopy after harvest to remove residual spores from the trees.
Abstract:
Alternaria leaf blotch and fruit spot of apple, caused by Alternaria spp., cause annual losses to the Australian apple industry. Control with protectant fungicides is often erratic, which may be due to a poor understanding of the timing of infection and the epidemiology of the diseases. We found that Alternaria leaf blotch infection began about 20 days after bloom (DAB) and that the highest disease incidence occurred from 70 to 110 DAB. Alternaria fruit spot infection occurred at about 100 DAB in the orchard. Fruit inoculations in planta showed that there was no specific susceptible stage of fruit. Leaves and fruit in the lower canopy showed higher levels of leaf blotch and fruit spot incidence than those in the upper canopy, and the incidence of leaf blotch was higher in shoot leaves than in spur leaves. Temperature, relative humidity and rainfall affected leaf blotch and fruit spot incidence. This knowledge of the timing of infection and disease development may aid the development of more effective disease management strategies.
Abstract:
The aim of this review is to report changes in irrigated cotton water use from research projects and on-farm practice-change programs in Australia, in relation to both plant-based and irrigation engineering disciplines. At least 80% of the Australian cotton-growing area is irrigated using gravity surface-irrigation systems. This review found that, over 23 years, cotton crops utilised 6-7 ML/ha of irrigation water, depending on the amount of seasonal rain received. The seasonal evapotranspiration of surface-irrigated crops averaged 729 mm over this period. Over the past decade, water-use productivity by Australian cotton growers has improved by 40%. This has been achieved through both yield increases and more efficient water-management systems. The whole-farm irrigation efficiency index improved from 57% to 70%, and the crop water use index is >3 kg/mm.ha, high by international standards. Yield increases over the last decade can be attributed to plant-breeding advances, the adoption of genetically modified varieties, and improved crop management. There has also been increased use of irrigation scheduling tools and furrow-irrigation system optimisation evaluations, which has reduced in-field deep-drainage losses. The largest loss component of the farm water balance on cotton farms is evaporation from on-farm water storages. Some farmers are changing to alternative systems such as centre pivots and lateral-move machines, and increasing adoption of these alternatives is expected. These systems can achieve considerable labour and water savings, but have significantly higher energy costs associated with water pumping and machine operation. The optimisation of interactions between water, soils, labour, carbon emissions and energy efficiency requires more research and on-farm evaluation.
Standardisation of water-use efficiency measures and improved water measurement techniques for surface irrigation are important research outcomes that would enable valid irrigation benchmarks to be established and compared. Water-use performance is highly variable between cotton farms, between fields and across regions, so site-specific measurement is important. The range in the presented datasets indicates potential for further improvement in water-use efficiency and productivity on Australian cotton farms.
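The two headline indices in this review (whole-farm irrigation efficiency and the crop water use index) are simple ratios. As a minimal sketch, assuming plausible definitions and invented input figures (the review's exact formulations may differ):

```python
# Water-use indices as simple ratios. Function names, definitions and all
# input figures are assumptions for illustration; the review's exact
# formulations may differ.

def crop_water_use_index(lint_yield_kg_per_ha: float, et_mm: float) -> float:
    """Lint yield per mm of seasonal crop evapotranspiration (kg/mm.ha)."""
    return lint_yield_kg_per_ha / et_mm

def whole_farm_irrigation_efficiency(crop_water_use_ml: float,
                                     water_diverted_ml: float) -> float:
    """Fraction of water diverted on-farm that is actually used by the crop."""
    return crop_water_use_ml / water_diverted_ml

# Hypothetical season: 2300 kg/ha of lint from 729 mm of evapotranspiration
print(f"crop water use index: {crop_water_use_index(2300, 729):.2f} kg/mm.ha")

# Hypothetical farm: 7.0 ML/ha diverted, 4.9 ML/ha used by the crop
print(f"whole-farm efficiency: {whole_farm_irrigation_efficiency(4.9, 7.0):.0%}")
```

With these invented figures the index comes out just above 3 kg/mm.ha and the efficiency at 70%, in the same range as the benchmarks quoted above.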
Abstract:
BACKGROUND: Kernel brown centres in macadamia are a defect causing internal discolouration of kernels. This study investigates the effect of maintaining high moisture content in macadamia nuts-in-shell stored at 30°C, 35°C, 40°C and 45°C on the incidence of brown centres in raw kernels. RESULTS: Brown centres in raw kernels increased with nuts-in-shell storage time and temperature when high moisture content was maintained by sealing in polyethylene bags. Almost all kernels developed the defect when kept at high moisture content for 5 days at 45°C, and 44% developed brown centres after only 2 days of storage at high moisture content at 45°C, compared with only 0.76% when stored for 2 days at 45°C but allowed to dry in open-mesh bags. At storage temperatures below 45°C there were fewer brown centres, but there were still significant differences between nuts stored at high moisture content and those allowed to dry (P < 0.05). CONCLUSION: Maintenance of high moisture content during macadamia nuts-in-shell storage increases the incidence of brown centres in raw kernels, and the defect increases with time and temperature. On-farm nuts-in-shell drying and storage practices should remove moisture rapidly to reduce losses. Ideally, nuts-in-shell should not be stored on-farm at high moisture content at temperatures over 30°C. © 2013 Society of Chemical Industry
Abstract:
Rhipicephalus (Boophilus) microplus (Acari: Ixodidae) ticks cause economic losses for cattle industries throughout tropical and subtropical regions of the world estimated at US$2.5 billion annually. Lack of access to efficacious long-lasting vaccination regimes and increases in tick acaricide resistance have led to the investigation of targets for the development of novel tick vaccines and treatments. In vitro tick feeding has been used for many tick species to study the effect of new acaricides on the transmission of tick-borne pathogens. Few studies have reported the use of in vitro feeding for functional genomic studies using RNA interference and/or the effect of specific anti-tick antibodies. In particular, in vitro feeding reports for the cattle tick are limited because of its relatively short hypostome. Previously published methods were modified to broaden the optimal tick sizes/weights and feeding sources (including bovine and ovine serum), and to optimise commercially available anti-coagulant blood tubes and IgG concentrations for effective antibody delivery. Ticks were fed overnight and monitored for ~5-6 weeks in a humidified incubator to determine egg output and the success of larval emergence. Lithium-heparin tubes provided the most reliable anti-coagulant for bovine blood feeding compared with commercial citrated (CPDA) and EDTA tubes. Although semi-engorged ticks >30 mg fed more reliably, ticks as small as 15 mg also fed to repletion and laid viable eggs. Ticks that gained less than ~10 mg during in vitro feeding typically did not lay eggs. One mg/ml IgG from Bm86-vaccinated cattle produced a potent anti-tick effect in vitro (83% efficacy), similar to that observed in vivo. In contrast, feeding of dsRNA targeting Bm86 did not demonstrate anti-tick effects (11% efficacy) compared with the potent effects of ubiquitin dsRNA. This study optimises R. microplus in vitro feeding methods that support the development of cattle tick vaccines and treatments.
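Anti-tick efficacy figures such as the 83% quoted above are commonly expressed as a percentage reduction in viable reproductive output relative to untreated controls. The study's exact formula is not given here; the sketch below assumes a simple reduction index over egg mass and hatch rate, with invented group means:

```python
# A common way to express anti-tick efficacy in feeding/vaccination trials is
# the percentage reduction in viable reproductive output relative to controls.
# The formula and the group means below are assumptions for illustration, not
# the study's actual method or data.

def anti_tick_efficacy(treated_eggs_mg: float, treated_hatch: float,
                       control_eggs_mg: float, control_hatch: float) -> float:
    """Percent reduction in viable larval output versus the control group."""
    treated = treated_eggs_mg * treated_hatch
    control = control_eggs_mg * control_hatch
    return 100.0 * (1.0 - treated / control)

# Hypothetical group means: mg of eggs per tick and hatch fraction
efficacy = anti_tick_efficacy(30.0, 0.40, 100.0, 0.70)
print(f"anti-tick efficacy: {efficacy:.0f}%")
```

A treated group laying the same viable egg mass as the controls would score 0%, while complete reproductive suppression would score 100%.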
Abstract:
Many banana-producing regions around the world experience climate variability as a result of seasonal rainfall and temperature conditions, resulting in sub-optimal conditions for banana production. This can create periods of plant stress that impact plant growth, development and yields. Furthermore, diseases such as Fusarium wilt, caused by Fusarium oxysporum f. sp. cubense, can become more predominant following periods of environmental stress, particularly in culturally significant cultivars such as Ducasse (synonym Pisang Awak) (Musa ABB). The aim of this experiment was to determine whether expression of symptoms of Fusarium wilt in a susceptible banana cultivar could be explained by environmental conditions, and whether soil management could reduce the impact of the disease and increase production. An experiment was established in an abandoned commercial field of Ducasse bananas with a high incidence of Fusarium wilt. Vegetated ground cover was maintained around the base of banana plants and compared with plants grown in bare soil for changes in growth, production and disease symptoms. Expression of Fusarium wilt was found to be a function of water stress potential and the heat unit requirement of bananas. Vegetative ground cover around the base of the banana plants significantly reduced the severity and incidence of Fusarium wilt by 20% and altered the periods of symptom development. The growth of bananas and development of the bunch followed the accumulated heat units, with a greater number of bunched plants evident during warmer periods of the year. The weight of bunches harvested in a second crop cycle was increased when banana plants were grown in areas with vegetative ground cover, with fewer plant losses due to Fusarium wilt.
Abstract:
Two field trials were conducted with untreated coconut wood ("cocowood") of varying densities against the subterranean termites Coptotermes acinaciformis (Froggatt) and Mastotermes darwiniensis Froggatt in northern Queensland, Australia. Both trials ran for 16 weeks during the summer months. Cocowood densities ranged from 256 kg/m3 to 1003 kg/m3, and the test specimens were divided equally between the two termite trial sites. Termite pressure was high at both sites, where mean mass losses in the Scots pine sapwood feeder specimens were 100% for C. acinaciformis and 74.7% for M. darwiniensis. The effects of termite species and cocowood density were significant; container and position effects were not. Mastotermes darwiniensis fed more on the cocowood than did C. acinaciformis, despite consuming less of the Scots pine. Overall, the susceptibility of cocowood to C. acinaciformis and M. darwiniensis decreases with increasing density, but all densities (apart from a few at the high end of the range) could be considered susceptible, particularly to M. darwiniensis. Some deviations from this general trend are discussed, as well as implications for the utilisation of cocowood as a building resource.
Abstract:
Graminicolous Downy Mildew (GDM) diseases caused by the genera Peronosclerospora (13 spp.) and Sclerophthora (6 spp. and 1 variety) are poorly studied but destructive diseases of major crops such as corn, sorghum, sugarcane and other graminoids. Eight of the 13 described Peronosclerospora spp. are able to infect corn. In particular, P. philippinensis (= P. sacchari), P. maydis, P. heteropogonis, and S. rayssiae var. zeae cause major losses in corn yields in tropical Asia. In 2012 a new species, P. australiensis, was described based on isolates previously identified as P. maydis in Australia; this species is now a pathogen of major concern. Despite the strong impact of GDM diseases, there are presently no reliable molecular methods available for their detection. GDM pathogens are among the most difficult Oomycetes to identify using molecular tools, as their taxonomy is very challenging, and little genetic sequence data are available for development of molecular tools to detect GDM pathogens to species level. For example, from over 15 genes used in identification, diagnostics or phylogeny of Phytophthora, only ITS1 and cox2 show promise for use with GDM pathogens. Multiplex/multigene conventional and qPCR assays are currently under evaluation for the detection of economically important GDM spp. Scientists from the USA, Germany, Canada, Australia, and the Philippines are collaborating on the development and testing of diagnostic tools for these pathogens of concern.
Abstract:
Fruiting hybrids are reported for the first time between the genera Citrus L. and Citropsis (Engl.) Swing. & M.Kell. Conventional hybridization using the recently described species Citrus wakonai P.I.Forst. & M.W.Sm. and Citropsis gabunensis (Engl.) Swing. & M.Kell. resulted in high rates of fruit set and seed formation. Although the seeds were only half the normal size, over 90% germinated without the need for embryo rescue techniques. Plant losses were high during the first few months, but after six months the 327 surviving hybrids were potted on. These grew vigorously on their own roots, and 35 of them flowered within two years of sowing. Plants flowered continuously, but all were pollen-sterile and ovaries abscised shortly after petal fall. However, at 25 months, two newly flowering hybrids began setting fruit. The development, identification, morphology, breeding efficiency, and future implications of this unique germplasm are described.
Abstract:
Bovine Viral Diarrhoea Virus (BVDV) is widely distributed in cattle industries and causes significant economic losses worldwide annually. A limiting factor in the development of subunit vaccines for BVDV is the need to elicit both antibody and T-cell-mediated immunity while also addressing the toxicity of adjuvants. In this study, we prepared novel silica vesicles (SV) as new-generation antigen carriers and adjuvants. With a small particle size of 50 nm, thin walls (~6 nm), a large cavity (~40 nm) and a large entrance size (5.9 nm for SV-100 and 16 nm for SV-140), the SV showed high loading capacity (~250 µg/mg) and controlled release of codon-optimised E2 (oE2) protein, a major immunogenic determinant of BVDV. The in vivo functionality of the system was validated in mouse immunisation trials comparing oE2 plus Quil A (50 µg of oE2 plus 10 µg of Quil A, a conventional adjuvant) to oE2/SV-140 (50 µg of oE2 adsorbed to 250 µg of SV-140) or oE2/SV-140 together with 10 µg of Quil A. Compared with oE2 plus Quil A, which generated BVDV-specific antibody responses at a titre of 10^4, the oE2/SV-140 group induced a 10 times higher antibody response. In addition, the cell-mediated response, which is essential to recognise and eliminate invading pathogens, was also higher [1954-2628 spot-forming units (SFU)/million cells] in mice immunised with oE2/SV-140 than with oE2 plus Quil A (512-1369 SFU/million cells). Our study demonstrates that SV can be used as next-generation nanocarriers and adjuvants for enhanced veterinary vaccine delivery. © 2014 Elsevier Ltd. All rights reserved.
Abstract:
The critical crop-weed competition period in a dry-seeded rice system is an important consideration in formulating weed management strategies. Field experiments were conducted in the summer seasons of 2012 and 2013 at Punjab Agricultural University, Ludhiana, India, to determine the extent of yield loss in two rice cultivars (PR 114 and PR 115) under different periods of weed interference. Twelve weed control timings were used to identify critical periods of weed competition in dry-seeded rice. PR 114, a long-duration cultivar (145 d) with slower initial growth than PR 115 (125 d), was more prone to yield losses. In both years, 100% yield loss was observed where weeds were not controlled throughout the season. In weed-free plots, the grain yield of PR 114 was 6.39-6.80 t ha-1 and that of PR 115 was 6.49-6.87 t ha-1. Gompertz and logistic equations fitted to yield data in response to increasing periods of weed control and weed interference showed that PR 114 had longer critical periods than PR 115. The critical weed-free period required to achieve 95% of weed-free yield was longer for PR 114 than for PR 115 by 31 days in 2012 and 26 days in 2013. Weed infestation also influenced the duration of the critical periods: higher weed pressure in 2012 than in 2013 increased the duration of the critical period of crop-weed competition in that year. The identification of critical crop-weed competition periods for different cultivars will facilitate improved decision-making regarding the timing of weed control and the adoption of cultivars with high weed-suppressing ability. This will also contribute to the development of integrated weed management in dry-seeded rice systems.
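A Gompertz yield-response curve of the kind fitted above can be inverted analytically to obtain the critical weed-free period. A minimal sketch, assuming illustrative parameter values A, B and K rather than the values actually fitted in these experiments:

```python
import math

# Gompertz yield-response curve: relative yield (%) as a function of the
# weed-free period t (days). A, B, K are illustrative values, not the
# parameters fitted in the experiments described above.
A, B, K = 100.0, 3.0, 0.065  # asymptote (%), displacement, rate (1/day)

def gompertz(t: float) -> float:
    return A * math.exp(-B * math.exp(-K * t))

def critical_weed_free_period(fraction: float = 0.95) -> float:
    """Invert the curve: shortest weed-free period that reaches `fraction`
    of the weed-free (asymptotic) yield."""
    return -math.log(-math.log(fraction) / B) / K

print(f"relative yield after 20 weed-free days: {gompertz(20):.1f}%")
print(f"critical weed-free period (95% of weed-free yield): "
      f"{critical_weed_free_period(0.95):.0f} days")
```

Raising the target fraction (e.g. 99% instead of 95%) lengthens the critical period, which is why the choice of acceptable yield loss matters when comparing cultivars.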
Abstract:
Alternative sources of N are required to bolster subtropical cereal production without increasing N2O emissions from these agro-ecosystems. The reintroduction of legumes into cereal cropping systems is a possible strategy to reduce synthetic N inputs, but elevated N2O losses have sometimes been observed after the incorporation of legume residues. The magnitude of these losses is highly dependent on local conditions, however, and very few data are available for subtropical regions. The aim of this study was to assess whether, under subtropical conditions, the N mineralised from legume residues can substantially decrease the synthetic N input required by the subsequent cereal crop and reduce overall N2O emissions during the cereal cropping phase. Using a fully automated measuring system, N2O emissions were monitored in a cereal crop (sorghum) following a legume pasture and compared with the same crop in rotation with a grass pasture. Each crop rotation included a nil and a fertilised treatment to assess the N availability of the residues. The incorporation of legumes provided enough readily available N to effectively support crop development, but the low labile C left by these residues is likely to have limited denitrification and therefore N2O emissions. As a result, N2O emission intensities (kg N2O-N yield-1 ha-1) were considerably lower in the legume histories than in the grass history. Overall, these findings indicate that the C supplied by crop residues can be more important than the soil NO3- content in stimulating denitrification, and that introducing a legume pasture into a subtropical cereal cropping system is a sustainable practice from both environmental and agronomic perspectives.
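Emission intensity, as used above, normalises N2O-N losses by crop yield. A minimal sketch with invented numbers (the function name and all figures are assumptions, not the study's data):

```python
# Emission intensity: N2O-N emitted per unit of grain yield. The function
# name and all figures are invented for illustration, not the study's data.

def n2o_intensity(n2o_n_kg_per_ha: float, grain_yield_t_per_ha: float) -> float:
    """kg of N2O-N emitted per tonne of grain produced on a hectare."""
    return n2o_n_kg_per_ha / grain_yield_t_per_ha

legume_history = n2o_intensity(0.35, 7.5)  # hypothetical legume-rotation sorghum
grass_history = n2o_intensity(0.80, 6.8)   # hypothetical grass-rotation sorghum

print(f"legume history: {legume_history:.3f} kg N2O-N/t")
print(f"grass history:  {grass_history:.3f} kg N2O-N/t")
```

Normalising by yield rather than by area is what lets a rotation with modestly lower absolute emissions but higher yield show a markedly lower intensity, the comparison made in the study.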
Abstract:
Cultural practices alter patterns of crop growth and can modify the dynamics of weed-crop competition, and hence need to be investigated to develop sustainable weed management in dry-seeded rice (DSR). Studies on weed dynamics in DSR sown at different times under two tillage systems were conducted at the Agronomic Research Farm, University of Agriculture, Faisalabad, Pakistan. A commonly grown fine rice cultivar, 'Super Basmati', was sown on 15th June and 7th July of 2010 and 2011 under zero-till (ZT) and conventional tillage (CONT) and was subjected to different durations of weed competition [10, 20, 30, 40, and 50 days after sowing (DAS), and season-long competition]. Weed-free plots were maintained under each tillage system and sowing time for comparison. Grassy weeds were more abundant under ZT, while CONT had a higher relative proportion of broad-leaved weeds in terms of density and biomass. The density of sedges was 175% higher in the crop sown on 7th July than on 15th June. Delaying the sowing of DSR from mid June to the first week of July reduced weed density by 69 and 43%, but weed biomass remained unaffected. Tillage system had no effect on total weed biomass. Plots subjected to season-long weed competition contained mostly grasses, while broad-leaved weeds were not observed at harvest. In the second year of the study, the dominance of grassy weeds increased under both tillage systems and sowing times. Significantly less biomass (48%) of grassy weeds was observed under CONT than ZT in 2010; during 2011, however, this effect was non-significant. Trianthema portulacastrum and Dactyloctenium aegyptium were the dominant broad-leaved and grassy weeds, respectively. Cyperus rotundus was the dominant sedge, especially in the crop sown on 7th July. Relative yield loss (RYL) ranged from 3 to 13% and 7 to 16% when weeds were allowed to compete for only 20 DAS. Under season-long weed competition, RYL ranged from 68 to 77% in 2010 and 74 to 80% in 2011.
The 15th June sowing was effective in minimizing weed proliferation and avoiding the yield penalty associated with the 7th July sowing. The results suggest that DSR in Pakistan should preferably be sown on 15th June under CONT systems, and weeds must be controlled before 20 DAS to avoid yield losses. Successful adoption of DSR in growers' fields in Pakistan will depend on whether growers can control weeds and prevent shifts in the weed population from intractable weeds to more difficult-to-control weeds as a consequence of DSR adoption.
Abstract:
Reducing crop row spacing and delaying the time of weed emergence may give crops a competitive edge over weeds. Field experiments were conducted to evaluate the effects of crop row spacing (11, 15, and 23 cm) and weed emergence time (0, 20, 35, 45, 55, and 60 days after wheat emergence; DAWE) on Galium aparine and Lepidium sativum growth and wheat yield losses. Season-long weed-free and crop-free treatments were also established to compare wheat yield and weed growth, respectively. Row spacing and weed emergence time significantly affected the growth of both weed species and wheat grain yield. For both weed species, maximum plant height, shoot biomass, and seed production were observed in the crop-free plots, and delayed emergence decreased these variables. In weed-crop competition plots, maximum weed growth was observed when weeds emerged simultaneously with the crop in rows spaced 23 cm apart. Both weed species grew less at the narrow row spacing (11 cm) than at the wider spacings (15 and 23 cm), and produced fewer than 5 seeds plant-1 in 11-cm wheat rows when they emerged at 60 DAWE. The presence of weeds in the crop, especially at early stages, was devastating for wheat yields; accordingly, the maximum grain yield (4.91 t ha-1) was recorded in the weed-free treatment at 11-cm row spacing. Delayed weed emergence and narrow row spacing reduced weed growth and seed production and enhanced wheat grain yield, suggesting that these strategies could contribute to weed management in wheat.
Abstract:
Background: Agriculture faces enormous challenges in feeding a growing population in the face of rapidly evolving pests and pathogens. The rusts in particular are major pathogens of cereal crops with the potential to cause large reductions in yield. Improving stable disease resistance is an ongoing, major and challenging focus for many plant breeding programs because of the rapidly evolving nature of the pathogen. Sorghum is a major summer cereal crop that is also host to a rust pathogen occurring in almost all sorghum-growing areas of the world and causing direct and indirect yield losses worldwide; however, knowledge of the genetic control of rust resistance is still limited. To investigate this, QTL and association mapping methods were applied to study rust resistance in three bi-parental populations and an association mapping set of elite breeding lines in different environments. Results: In total, 64 significant or highly significant QTL and 21 suggestive rust resistance QTL were identified, representing 55 unique genomic regions. Comparisons across populations within the current study, and with rust QTL identified previously in both sorghum and maize, revealed a high degree of correspondence in QTL location. Negative phenotypic correlations were observed between rust, maturity and height, indicating a trend for both early-maturing and shorter genotypes to be more susceptible to rust. Conclusions: The significant amount of QTL co-location across traits, in addition to the consistency in the direction of QTL allele effects, provides evidence of pleiotropic QTL action across rust, height, maturity and stay-green, supporting the role of carbon stress in susceptibility to rust. Classical rust resistance QTL regions that did not co-locate with height, maturity or stay-green QTL were significantly enriched for the defence-related NBS-encoding gene family, in contrast to the lack of defence-related gene enrichment in multi-trait-effect rust resistance QTL. The distinction of disease resistance QTL hot-spots enriched with defence-related gene families from QTL that affect development and partitioning provides plant breeders with knowledge that will allow fast-tracking of varieties with both durable pathogen resistance and appropriate adaptive traits.