Abstract:
Equid herpesvirus 1 (EHV1) causes a major disease of equids worldwide, resulting in considerable losses to the horse industry. A variety of techniques, including PCR, have been used to diagnose EHV1. Some of these PCRs were used in combination with other techniques, such as restriction enzyme analysis (REA) or hybridisation, making them cumbersome for routine diagnostic testing and increasing the chances of cross-contamination. Furthermore, they involve the use of suspected carcinogens such as ethidium bromide and ultraviolet light. In this paper, we describe a real-time PCR that uses minor groove-binder (MGB) probe technology for the diagnosis of EHV1. This technique does not require post-PCR manipulations, thereby reducing the risk of cross-contamination. Most importantly, the technique is specific: it was able to differentiate EHV1 from equid herpesvirus 4 (EHV4), a closely related member of the Alphaherpesvirinae. It was not reactive with common opportunistic pathogens often involved in abortion, such as Escherichia coli, Klebsiella oxytoca, Pseudomonas aeruginosa and Enterobacter agglomerans. Similarly, it did not react with equine pathogens that also cause abortion, such as Streptococcus equi, Streptococcus equisimilis, Streptococcus zooepidemicus, Taylorella equigenitalis and Rhodococcus equi. The results obtained with this technique agreed with results from published PCR methods. The assay was sensitive enough to detect EHV1 sequences in paraffin-embedded tissues and clinical samples, and was more sensitive than virus isolation. Based on its specificity, sensitivity, ease of performance and rapidity, this test will be useful for the routine diagnosis of EHV1.
Abstract:
Sunflower rust caused by Puccinia helianthi is the most important disease of sunflower in Australia, with the potential to cause significant yield losses in susceptible hybrids. Rapid and frequent virulence changes in the rust fungus population limit the effective lifespan of commercial cultivars and impose constant pressure on breeding programs to identify and deploy new sources of resistance. This paper contains a synopsis of virulence data accumulated over 25 years, together with more recent studies of genotypic diversity and sexual recombination. We have used this synopsis, generated from both published and unpublished data, to propose the origin, evolution and distribution of new pathotypes of P. helianthi. Virulence surveys revealed that diverse pathotypes of P. helianthi evolve in wild sunflower populations, most likely because sexual recombination and subsequent selection of recombinant pathotypes occur there. Wild sunflower populations provide a continuum of genetically heterogeneous hosts on which P. helianthi can potentially complete its sexual cycle under suitable environmental conditions. Population genetics analysis of a worldwide collection of P. helianthi indicated that Australian isolates of the pathogen are more diverse than non-Australian isolates. Additionally, the presence of the same pathotype in different genotypic backgrounds supported evidence from virulence data that sexual recombination has occurred in the Australian population of P. helianthi at some time. A primary aim of the work described was to apply our knowledge of pathotype evolution to improve the resistance of sunflower to rust. Molecular markers were identified for a number of previously uncharacterised sunflower rust R-genes. These markers have been used to detect resistance genes in breeding lines and wild sunflower germplasm. A number of virulence loci that do not recombine were identified in P. helianthi. The resistance gene combinations corresponding to these virulence loci are currently being introgressed into breeding lines to generate hybrids with durable resistance to sunflower rust.
Abstract:
Tillage is defined here in a broad sense, including disturbance of the soil and crop residues, wheel traffic and sowing opportunities. In sub-tropical, semi-arid cropping areas in Australia, tillage systems have evolved from intensively tilled bare-fallow systems, with high soil losses, to reduced- and no-tillage systems. In recent years, the use of controlled traffic has also increased. These conservation tillage systems are successful in reducing water erosion of soil and sediment-bound chemicals; control of runoff of dissolved nutrients and weakly sorbed chemicals is less certain. Adoption of new practices appears to have been driven by practical and economic considerations, and the new systems proved to be more profitable only after a considerable period of research and development. However, there are still challenges. One challenge is to ensure that systems that reduce soil erosion, which may involve greater use of chemicals, do not degrade water quality in streams. Another is to ensure that systems that improve water entry do not increase drainage below the crop root zone, which would increase the risk of salinity. Better understanding of how tillage practices influence soil hydrology, runoff and erosion processes should lead to better tillage systems and enable better management of risks to water quality and soil health. Finally, the need to determine the effectiveness of in-field management practices in achieving stream water quality targets in large, multi-land-use catchments will challenge our current knowledge base and the tools available.
Abstract:
Two species of root-lesion nematode (predominantly Pratylenchus thornei but also P. neglectus) are widespread pathogens of wheat and other crops in Australia's northern grain belt, a subtropical region with deep, fertile clay soils and a summer-dominant rainfall pattern. Losses in grain yield from P. thornei can be as high as 70% for intolerant wheat cultivars. This review focuses on research which has led to the development of effective integrated management programs for these nematodes. It highlights the importance of correct identification in managing Pratylenchus species, reviews the plant breeding work done in developing tolerant and resistant cultivars, outlines the methods used to screen for tolerance and resistance, and discusses how planned crop sequencing with tolerant and partially resistant wheat cultivars, together with crops such as sorghum, sunflower, millets and canaryseed, can be used to reduce nematode populations and limit crop damage. The declining levels of soil organic matter in cropped soils are also discussed with reference to their effect on soil health and biological suppression of root-lesion nematodes.
Abstract:
The fate of nitrogen (N) applied in biosolids was investigated in a forage production system on an alluvial clay loam soil in south-eastern Queensland, Australia. Biosolids were applied in October 2002 at rates of 6, 12, 36, and 54 dry t/ha for aerobically digested biosolids (AE) and 8, 16, 48, and 72 dry t/ha for anaerobically digested biosolids (AN). Rates were based on multiples of the Nitrogen Limited Biosolids Application rate (0.5, 1, 3, and 4.5 NLBAR) for each type of biosolid. The experiment included an unfertilised control and a fertilised control that received multiple applications of synthetic fertiliser. Forage sorghum was planted 1 week after biosolids application and harvested 4 times between December 2002 and May 2003. Dry matter production was significantly greater from the biosolids-treated plots (21-27 t/ha) than from the unfertilised (16 t/ha) and fertilised (18 t/ha) controls. The harvested plant material removed an extra 148-488 kg N from the biosolids-treated plots. Partial N budgets were calculated for the 1 NLBAR and 4.5 NLBAR treatments for each biosolids type at the end of the crop season. Crop removal accounted for only 25-33% of the applied N in the 1 NLBAR treatments and as little as 8-15% with 4.5 NLBAR. Residual biosolids N was predominantly in the form of organic N (38-51% of applied biosolids N), although there was also a significant proportion (10-23%) as NO3-N, predominantly in the top 0.90 m of the soil profile. From 12 to 29% of applied N was unaccounted for, and presumed to be lost as ammonia through volatilisation and/or as gaseous nitrogen through denitrification. In-season mineralisation of organic N in biosolids was 43-59% of the applied organic N, much greater than the 15% (AN) to 25% (AE) expected on the basis of current NLBAR calculation methods. Excessive biosolids application produced little additional biomass but led to high soil mineral N concentrations that were vulnerable to multiple loss pathways. Queensland Guidelines need to account for higher rates of mineralisation and for losses via denitrification and volatilisation, and should therefore encourage lower application rates to achieve optimal plant growth and minimise the potential for detrimental impacts on the environment.
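The partial N budget described above is essentially bookkeeping: applied N is apportioned among crop removal, residual organic N, residual soil NO3-N, and an unaccounted remainder attributed to gaseous losses. A minimal sketch of that arithmetic in Python, where the function and the example figures (chosen from within the ranges reported above, with a hypothetical 500 kg N/ha application) are illustrative rather than the study's actual data:

```python
# Sketch of a partial N budget, expressing each component as a
# percentage of applied N. All inputs are illustrative placeholders.

def partial_n_budget(applied_n, crop_removal_n, residual_organic_n, residual_no3_n):
    """Return each budget component as a percentage of applied N (kg/ha)."""
    unaccounted = applied_n - (crop_removal_n + residual_organic_n + residual_no3_n)
    pct = lambda x: round(100.0 * x / applied_n, 1)
    return {
        "crop removal %": pct(crop_removal_n),
        "residual organic N %": pct(residual_organic_n),
        "residual NO3-N %": pct(residual_no3_n),
        "unaccounted (gaseous losses) %": pct(unaccounted),
    }

# A hypothetical 1 NLBAR treatment: ~30% crop removal, ~45% residual
# organic N and ~10% NO3-N leave ~15% presumed lost as gas.
print(partial_n_budget(applied_n=500, crop_removal_n=150,
                       residual_organic_n=225, residual_no3_n=50))
```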
Abstract:
Diets containing 3% sorghum ergot (16 mg alkaloids/kg, including 14 mg dihydroergosine/kg) were fed to 12 sows from 14 days post-farrowing until weaning 14 days later, and their performance was compared with that of 10 control sows. Ergot-fed sows displayed a smaller weight loss during lactation (24 kg/head vs. 29 kg/head for control sows; p > 0.05) despite lower feed consumption (61 kg/head total feed intake vs. 73 kg/head for control sows; p < 0.05). Litters of ergot-fed sows gained less weight over the 14-day period (16.6 kg/litter vs. 28.3 kg/litter for controls; p < 0.05) despite greater consumption of creep feed by piglets from the ergot-fed sows (1.9 kg/litter compared with 1.1 kg/litter for the controls; p > 0.05). Plasma prolactin in ergot-fed sows was reduced after 7 days to 4.8 μg/l compared with 15.1 μg/l in the control sows (p < 0.01), and at weaning was 4.9 μg/l compared with 8.0 μg/l in the controls (p < 0.01). Two sows fed ergot ceased lactation early, and the data on sow feed intake, body weight loss, litter weight gain and creep consumption together indirectly indicate an ergot effect on milk production.
Abstract:
The cattle tick Rhipicephalus microplus (formerly Boophilus microplus) is responsible for severe production losses to the cattle industry worldwide. It has long been known that different breeds of cattle can resist tick infestation to varying degrees; however, the mechanisms by which resistant cattle prevent heavy infestation are largely unknown. The aim of this study was to determine whether gene expression varied significantly between skin sampling sites (neck, chest and tail region), and whether changes in gene expression could be detected in samples taken at tick attachment sites (tick attached to the skin sample) compared with samples taken from non-attachment sites (no tick attached). We present here the results of an experiment examining the expression of a panel of forty-four genes in skin sections taken from Bos indicus (Brahman) cattle of known high resistance, and Bos taurus (Holstein-Friesian) cattle of known low resistance, to the cattle tick. The forty-four genes chosen for this study included genes known to be involved in several immune processes, some structural genes, and some genes previously suggested by other researchers to be of importance in tick resistance. The expression of fifteen gene transcripts increased significantly in Holstein-Friesian skin samples at tick attachment sites. The higher expression at tick attachment sites of many genes involved in innate inflammatory processes suggests that the Holstein-Friesian breed exhibits a non-directed pathological response to infestation. Of the forty-four genes analysed, no transcripts were detected in higher abundance at tick attachment sites in the Brahman cattle compared with similar samples from the Holstein-Friesian group, nor was any difference detected between attachment-site and non-attachment-site samples within the Brahman group. These results suggest that the means by which the two cattle breeds respond to tick infestation differ and warrant further investigation.
Abstract:
In semi-arid areas such as western Nebraska, interest in subsurface drip irrigation (SDI) for corn is increasing due to restricted irrigation allocations. However, quantification of crop response to nitrogen (N) applied through SDI, and of the environmental benefits of multiple in-season (IS) SDI N applications rather than a single early-season (ES) surface application, is lacking. The study was conducted in 2004, 2005, and 2006 at the University of Nebraska-Lincoln West Central Research and Extension Center in North Platte, Nebraska, comparing two N application methods (IS and ES) and three N rates (128, 186, and 278 kg N ha(-1)) in a randomized complete block design with four replications. No grain yield or biomass response was observed in 2004. In 2005 and 2006, corn grain yield and biomass production increased with increasing N rates, and the IS treatment increased grain yield, total N uptake, and gross return after N application costs (GRN) compared with the ES treatment. Chlorophyll meter readings taken at the R3 corn growth stage in 2006 showed that less N was supplied to the plant with the ES treatment than with the IS treatment. At the end of the study, soil NO3-N masses at the 0.9 to 1.8 m depth were greater under the IS treatment than under the ES treatment. These results suggest that greater losses of NO3-N below the root zone under the ES treatment may have had a negative effect on corn production. Under SDI systems, fertigating a recommended N rate at various corn growth stages can increase yields and GRN and reduce NO3-N leaching compared with concentrated early-season applications.
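GRN, as used above, is a simple margin: grain revenue minus the cost of the N applied. A hedged sketch of that comparison in Python; the prices and yields below are invented placeholders (the study's actual economic inputs are not given in the abstract):

```python
# Illustrative GRN comparison; all prices and yields are placeholders.

def grn(yield_kg_ha: float, grain_price: float,
        n_rate_kg_ha: float, n_cost_per_kg: float) -> float:
    """Gross return after N fertiliser costs, $/ha."""
    return yield_kg_ha * grain_price - n_rate_kg_ha * n_cost_per_kg

# Suppose IS fertigation yields 0.5 t/ha more grain than ES at the
# same 186 kg N/ha rate:
print(grn(11500, 0.12, 186, 0.90))  # IS: 1212.6 $/ha
print(grn(11000, 0.12, 186, 0.90))  # ES: 1152.6 $/ha
```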
Abstract:
As a first step toward better targeting the activities of a project for improving management of western flower thrips, Frankliniella occidentalis (WFT), in field-grown vegetable crops, we surveyed growers, consultants and other agribusiness personnel in two regions of Queensland. Using face-to-face interviews, we collected data on key pests and the measures used to manage them, the importance of WFT and associated viral diseases, sources of pest management information, and additional skills and knowledge needed by growers and industry. Responses were similar in the two regions. While capsicum growers in one northern Queensland district had suffered serious losses from WFT damage in 2002, in general the pest was not seen as a major problem. In cucurbit crops, the silverleaf whitefly (Bemisia tabaci biotype B) was considered the most difficult insect pest to manage. Pest control tactics were largely based on pesticides, although many respondents mentioned non-chemical methods such as good farm hygiene, control of weed hosts and regular crop monitoring, particularly when prompted. Respondents wanted to know more about pest identification, biology and damage, spray application and the best use of insecticides. Natural enemies were mentioned infrequently. Keeping up to date with available pesticide options, the availability of new chemicals, and options for a district-wide approach to managing pests emerged as key issues. Growers identified agricultural distributors, consultants, Queensland Department of Primary Industries staff, other growers and their own experience as important sources of information. Field days, workshops and seminars did not rank highly. Busy vegetable growers wanted these activities to be short and relevant, and preferred to be contacted by post and facsimile rather than email. In response to these results, we are focusing on three core, interrelated project extension strategies: (i) short workshops, seminars and farm walks to provide opportunities for discussion, training and information sharing with growers and their agribusiness advisors; (ii) communication via newsletters and information leaflets; and (iii) support for commercialisation of services.
Abstract:
Crown, stolon, and petiole rots caused by Colletotrichum gloeosporioides (C.g.) were first identified in runner beds of the Queensland Approved Runner Scheme (QARS) in February 1989, and outbreaks occurred annually from 1990 to 1994. Minor losses in subsequent fruit crops occurred from 1990 to 1993, with 50% post-establishment losses occurring on fruit farms in southeast Queensland in 1994. The objective of this work was to provide a control strategy for the disease that would give stability to the QARS. Runner-bed trials in 1993-1994 showed that Octave® (462 g/kg prochloraz as the MnCl2 complex) was highly effective in reducing both the incidence of field symptoms and the laboratory recovery of C.g. from symptomless petioles. A simple detached-petiole laboratory test for measuring fungicide efficacy in runner-bed trials and for laboratory screening of fungicides is described. Scheme protocols were changed to require that only foundation plants derived from tissue culture be allowed onto QARS sites; these plants were to be symptomless and to have tested negative for the presence of C.g. The application of Octave® at fortnightly intervals in all QARS nurseries has reduced the level of visible symptoms and the laboratory recovery of C.g. from symptomless petioles to almost zero.
Abstract:
Pratylenchus thornei and P. neglectus are two species of root-lesion nematode that cause substantial yield losses in wheat, and no commercially available wheat variety is resistant to both species. A doubled-haploid population developed from a cross between the synthetic hexaploid wheat line CPI133872 and the bread wheat Janz was used to locate and tag quantitative trait loci (QTLs) associated with resistance to both P. thornei and P. neglectus. Wheat plants were inoculated with each species of nematode in independent replicated glasshouse trials repeated over 2 years. Known locations of wheat microsatellite markers were used to construct a framework map. After an initial single-marker analysis to detect marker-trait linkages, chromosome regions associated with putative QTLs were targeted with microsatellite markers to increase map density in the regions of interest. In total, 148 wheat microsatellite markers and 21 amplified fragment length polymorphism markers were mapped. The codominant microsatellite marker Xbarc183 on the distal end of chromosome 6DS was allelic for resistance to both P. thornei and P. neglectus; the QTLs were designated QRlnt.lrc-6D.1 and QRlnn.lrc-6D.1 for the two traits, respectively. The allele inherited from CPI133872 explained 22.0-24.2% of the phenotypic variation for P. thornei resistance, and the allele inherited from Janz accounted for 11.3-14.0% of the phenotypic variation for P. neglectus resistance. Composite interval mapping identified markers flanking a second major QTL on chromosome 6DL (QRlnt.lrc-6D.2) that explained 8.3-13.4% of the phenotypic variation for P. thornei resistance. An additional major QTL associated with P. neglectus resistance was detected on chromosome 4DS (QRlnn.lrc-4D.1) and explained a further 10.3-15.4% of the phenotypic variation. The identification and tagging of nematode resistance genes with molecular markers will allow appropriate allele combinations to be selected, aiding the breeding of wheat with dual nematode resistance.
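The single-marker analysis mentioned above amounts to regressing the phenotype on the genotype at each marker and reading off the proportion of phenotypic variance explained (the R^2, reported above as roughly 8-24% per QTL). A simulated sketch in Python; the data are randomly generated for illustration, not taken from the CPI133872 x Janz doubled-haploid population:

```python
import numpy as np

# Simulated doubled-haploid lines: each line carries one of two
# parental alleles at the marker (0 = Janz, 1 = CPI133872).
rng = np.random.default_rng(0)
genotype = rng.integers(0, 2, size=200).astype(float)
# Phenotype (e.g. nematode count): the resistant allele lowers it.
phenotype = 10.0 - 3.0 * genotype + rng.normal(0.0, 4.0, size=200)

# Single-marker regression: fit phenotype ~ genotype, then compute R^2.
slope, intercept = np.polyfit(genotype, phenotype, 1)
fitted = slope * genotype + intercept
ss_res = np.sum((phenotype - fitted) ** 2)
ss_tot = np.sum((phenotype - phenotype.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"phenotypic variance explained by marker: {100 * r2:.1f}%")
```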
Abstract:
Izmir is a hardseeded, early-flowering subterranean clover of var. subterraneum (Katz. et Morley) Zohary and Heller, collected from Turkey and developed by the collaborating organisations of the National Annual Pasture Legume Improvement Program. It is a more hardseeded replacement for Nungarin, best suited to well-drained, moderately acidic soils in areas with a growing season of less than 4.5 months. Izmir seed production and regeneration densities in 3-year pasture phases were similar to those of Nungarin in 21 trials across southern Australia, but markedly greater in years following a crop or no seed set. Over all measurements, Izmir produced 10% more winter herbage and 7% more spring herbage than Nungarin. Its greater hardseededness and good seed production make it better suited to cropping rotations than Nungarin. Softening of Izmir hard seeds occurs later in the summer–autumn period than in Nungarin, giving it slightly greater protection from seed losses following false breaks to the season. Izmir is recommended for sowing in Western Australia, New South Wales, Victoria, South Australia and Queensland, and has been granted Plant Breeders Rights in Australia.
Abstract:
In Australia, communities are concerned about atrazine being detected in drinking water supplies. If movement of agricultural chemicals from the site of application is to be reduced, it is important to understand the mechanisms by which atrazine is transported from paddocks to waterways. Two paddocks cropped with grain sorghum on a Black Vertosol were monitored for atrazine, potassium chloride (KCl)-extractable atrazine, desethylatrazine (DEA), and desisopropylatrazine (DIA) at 4 soil depths (0-0.05, 0.05-0.10, 0.10-0.20, and 0.20-0.30 m) and in runoff water and runoff sediment. Atrazine + DEA + DIA (total atrazine) had a half-life in soil of 16-20 days, a more rapid dissipation than in many earlier reports. Atrazine extracted in dilute potassium chloride, considered available for weed control, was initially 34% of the total and had a half-life of 15-20 days until day 30, after which it dissipated rapidly with a half-life of 6 days. We conclude that, in this region, atrazine may not pose a risk for groundwater contamination, as only 0.5% of applied atrazine moved deeper than 0.20 m into the soil, where it dissipated rapidly. In runoff (including suspended sediment), atrazine concentrations were greatest during the first runoff event, 57 days after application (85 μg/L), and declined with time. After 160 days, the total atrazine lost in runoff was 0.4% of the initial application. As expected, the total atrazine concentration in runoff was strongly related to the total concentration in soil. Even after 98% of the KCl-extractable atrazine had dissipated (and no longer provided weed control), runoff concentrations still exceeded the human health guideline value of 40 μg/L. For total atrazine in soil (0-0.05 m), the coefficient of soil sorption (Kd) ranged from 1.9 to 28.4 mL/g and the soil organic carbon sorption coefficient (KOC) from 100 to 2184 mL/g, increasing with time of contact with the soil and with rapid dissipation of the more soluble, available phase. Partition coefficients in runoff for total atrazine were initially 3, increasing to 32 and 51 with time; values for DEA were about half these. To minimise atrazine losses, cultural practices should be adopted that maximise rain infiltration (thereby minimising runoff) and minimise atrazine concentrations at the soil surface.
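The half-lives reported above imply first-order (exponential) dissipation, C(t) = C0 * exp(-k*t) with k = ln(2)/t_half. A minimal sketch of that model in Python, assuming first-order kinetics throughout and taking 18 days (the midpoint of the 16-20 day range for total atrazine) as the half-life; the function name and initial conditions are ours, not the study's:

```python
import math

def remaining_fraction(t_days: float, half_life_days: float) -> float:
    """Fraction of the initial application remaining after t_days,
    assuming first-order decay: C(t)/C0 = exp(-k*t), k = ln2 / t_half."""
    k = math.log(2) / half_life_days
    return math.exp(-k * t_days)

# Total atrazine remaining at the first runoff event (day 57),
# assuming an 18-day half-life: about 11% of the application.
print(f"{remaining_fraction(57, 18):.1%}")
```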
Abstract:
The potential for beef producers in northern Australia's dry tropics to profitably produce 500-kg steers at 2.5 years of age, meeting the specifications of high-value markets, using a high-input management (HIM) system was examined. HIM included targeted high levels of fortified molasses supplementation, short seasonal mating and the use of growth promotants. Using herds of 300-400 females plus steer progeny at three sites, HIM was compared at a business level with prevailing best-practice, strategic low-input management (SLIM), in which there is relatively low usage of energy concentrates to supplement pasture intake. The data presented for each breeding-age cohort within each management system at each site include annual pregnancy rates (range: 14-99%), time of conception, mortalities (range: 0-10%), progeny losses between confirmed pregnancy and weaning (range: 0-29%), and weaning rates (range: 14-92%) over the 2-year observation period. Annual changes in weight and relative net worth were calculated for all breeding and non-breeding cohorts, and reasons for the outcomes are discussed. Compared with SLIM herds, both weaning weights and annual growth were >= 30 kg higher, enabling 86-100% of HIM steers to exceed 500 kg at 2.5 years of age; very few contemporary SLIM steers reached this target. HIM was most profitably applied to steers. Where HIM was able to achieve high pregnancy rates in yearlings, its application to females was also recommended. Well-managed, appropriate HIM systems increased profits by around $15/adult equivalent at prevailing beef and supplement prices. However, a 20% rise in supplement prices without a commensurate increase in values for young slaughter steers would generally eliminate this advantage. This study demonstrated the complexity of profitable application of research outcomes to a commercial business, even when component research suggests that specific strategies may increase growth and reproductive efficiency and/or be more profitable. Because of the higher level of management required, higher costs and returns, and greater susceptibility to market changes and disease, HIM systems should only be applied after SLIM systems are well developed. To increase profitability, any strategy must ultimately either increase steer growth and sale values and/or enable a shift to high pregnancy rates in yearling heifers.
Abstract:
Sporobolus pyramidalis, S. africanus, S. natalensis, S. fertilis and S. jacquemontii, known collectively as the weedy Sporobolus grasses, are exotic weeds causing serious economic losses in grazing areas along Australia's entire eastern coast. In one of the first attempts to provide biological control of a grass, the potential of a smut, Ustilago sporoboli-indici, as a biological control agent for all five weedy Sporobolus spp. found in Australia was evaluated in glasshouse studies. Application of basidiospores to 21-day-old Sporobolus seedlings and subsequent incubation in a moist chamber (26 °C, 90% RH, 48 h) resulted in infection of S. pyramidalis, S. africanus, S. natalensis and S. fertilis, but not S. jacquemontii. Host-range trials with 13 native Australian Sporobolus spp. resulted in infection of four native species. Evaluation of the damage caused by the smut on two Australian native and two weedy Sporobolus spp. showed that the total numbers of infected flowers for the four grasses were in the order S. creber > S. fertilis > S. elongatus > S. natalensis, with percentage flower infections of 21%, 14%, 12% and 3%, respectively. Significant differences (P = 0.001) were found when the numbers of infected flowers in each treatment were compared. Because the smut infected four native Sporobolus spp., it was judged insufficiently host-specific for release in Australia and was rejected as a potential biological control agent. The implications of these results are discussed.