Abstract:
Crown, stolon, and petiole rots caused by Colletotrichum gloeosporioides (C.g.) were first identified in runner beds of the Queensland Approved Runner Scheme (QARS) in February 1989. The outbreaks occurred annually from 1990 to 1994. Minor losses in subsequent fruit crops occurred from 1990 to 1993, with 50% post-establishment losses occurring on fruit farms in southeast Queensland in 1994. The objective of this work was to provide a control strategy for the disease that would give stability to the QARS. Runner-bed trials in 1993-1994 showed that Octave® (462 g/kg prochloraz as the MnCl2 complex) was highly effective in reducing the incidence of field symptoms and the laboratory recovery of C.g. from symptomless petioles. A simple detached-petiole laboratory test for measuring fungicide efficacy in runner-bed trials and for laboratory screening of fungicides is described. Scheme protocols were changed to require that only foundation plants from tissue culture were allowed onto QARS sites. These were to be symptomless and to have tested negative for the presence of C.g. The application of Octave® at fortnightly intervals in all QARS nurseries has reduced the level of visible symptoms and the laboratory recovery of C.g. from symptomless petioles to almost zero.
Abstract:
The effects of inorganic amendments (fertilisers and pesticides) on soil biota reported in the scientific literature are, to say the least, variable. Though there is clear evidence that certain products can have significant impacts, the effects can be positive or negative. This is not surprising when one considers the number of organisms and the variety of functional groups involved, the number of products and the various rates at which they can be applied, the methods of application, and the environmental differences that occur in soil at a micro scale (within centimetres) within a paddock, let alone between paddocks, farms, catchments and regions. It therefore becomes extremely difficult to draw definitive conclusions from the reported results in order to summarise the impacts of these inputs. Several research trials and review papers have been published on this subject and most similarly conclude that the implications of many of the effects are still uncertain.
Abstract:
Dairy farms in subtropical Australia use irrigated, annually sown short-term ryegrass (Lolium multiflorum) or mixtures of short-term ryegrass and white (Trifolium repens) and Persian (shaftal) (T. resupinatum) clover during the winter-spring period in all-year-round milk production systems. A series of small-plot cutting experiments was conducted in 3 dairying regions (tropical upland, north Queensland, and subtropical southeast Queensland and northern New South Wales) to determine the most effective rate and frequency of application of nitrogen (N) fertiliser. The experiments were not grazed, nor was harvested material returned to the plots, after sampling. Rates up to 100 kg N/ha.month (as urea or calcium ammonium nitrate) and up to 200 kg N/ha every 2 months (as urea) were applied to pure stands of ryegrass in 1991. In 1993 and 1994, urea, at rates up to 150 kg N/ha.month and up to 200 kg N/ha every 2 months, was applied to pure stands of ryegrass; urea, at rates up to 50 kg N/ha.month, was also applied to ryegrass-clover mixtures. The results indicate that applications of 50-85 kg N/ha.month can be recommended for short-term ryegrass pastures throughout the subtropics and tropical uplands of eastern Australia, irrespective of soil type. At this rate, dry matter yields will reach about 90% of their potential, forage nitrogen concentration will be increased, the risk of nitrate poisoning to stock will be minimal, and there will be no substantial increase in soil N. The recommended rate of N for ryegrass-clover pastures is slightly higher than that for pure ryegrass but, at these rates, the clover component will be suppressed. However, increased ryegrass yields and higher forage nitrogen concentrations will compensate for the reduced clover component. At application rates up to 100 kg N/ha.month, build-up of NO3-N and NH4-N in soil was generally restricted to the surface layers (0-20 cm) of the soil, but there was a substantial increase throughout the soil profile at 150 kg N/ha.month. The build-up of NO3-N and NH4-N was greater and was found at lower rates on the lighter soil than on the heavy clays. Generally, most of the soil N was in the NO3-N form and most was in the top 20 cm.
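A minimal sketch of the product-rate arithmetic behind these recommendations, converting a recommended N rate into the mass of urea to spread, assuming urea contains about 46% N by weight (a standard fertiliser figure, not stated in the abstract):

```python
# Sketch: convert a recommended N rate (kg N/ha per month) into the mass of
# urea product to spread, assuming urea is ~46% N by weight (standard
# fertiliser figure; an assumption, not a value from the abstract).

UREA_N_FRACTION = 0.46  # kg N per kg urea (assumed)

def urea_required(n_rate_kg_per_ha: float, n_fraction: float = UREA_N_FRACTION) -> float:
    """Return kg of urea product per hectare needed to supply n_rate_kg_per_ha of N."""
    return n_rate_kg_per_ha / n_fraction

if __name__ == "__main__":
    # The abstract recommends 50-85 kg N/ha.month for short-term ryegrass.
    for rate in (50, 85):
        print(f"{rate} kg N/ha.month = about {urea_required(rate):.0f} kg urea/ha.month")
```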
Abstract:
Strawberry lethal yellows (SLY) disease in Australia is associated with the phytoplasmas Candidatus Phytoplasma australiense and tomato big bud, and a rickettsia-like organism (RLO). Ca. P. australiense is also associated with strawberry green petal (SGP) disease. This study investigated the strength of the association of the different agents with SLY disease. We also documented the location of SLY or SGP affected plants and determined whether they were RLO or phytoplasma positive. Symptomatic strawberry plants collected from south-east Queensland (Australia) between January 2000 and October 2002 were screened by PCR for both phytoplasmas and the RLO. Two previously unreported disease symptoms, termed severe fruit distortion (SFD) and strawberry leaves from fruit (SLF), were observed during this study, but there was no clear association between these symptoms and phytoplasmas or the RLO. Only two SGP diseased plants were observed and collected, compared with 363 plants with SLY disease symptoms. Of the 363 SLY samples, 117 tested positive for the RLO, 67 tested positive for the Ca. P. australiense AGY strain and 11 plants tested positive for the Ca. P. australiense PYL variant strain. On runner production farms at Stanthorpe, Queensland, the RLO was detected in SLY diseased plants more frequently than the phytoplasmas. On fruit production farms on the Sunshine Coast, Queensland, Ca. P. australiense was detected in SLY diseased plants more frequently than the RLO.
Abstract:
In recent years, dieback of durian has become a major problem in mature orchards in the northern Queensland wet tropics region. A survey of 13 durian orchards was conducted during the dry season (July-September 2001) and the following wet season (February-April 2002), with roots and soil from the root zone of affected trees being sampled. Phytophthora palmivora was recovered from the roots of affected trees on 12 of the 13 farms in the dry season, and from all farms in the wet season. Pythium vexans was recovered from all 13 farms in both seasons. P. palmivora and P. vexans were recovered from diseased roots of 3-month-old durian seedlings cv. Monthong artificially inoculated with these organisms.
Abstract:
Manure additive products can be used to reduce odour emissions (OE) from livestock farms. The standardised evaluation of these manure additive products under specific farm conditions is important. In this study, the efficacy of a manure additive (WonderTreat™, CKLS, Inc., Hong Kong) was assessed under Australian conditions utilising a combination of laboratory and field-scale evaluation techniques. As a first step, the efficacy of the manure additive was assessed in a laboratory-scale trial using a series of uniformly managed digesters and standard odour, liquor ammonia and hydrogen sulphide concentration measurement procedures. This showed that the addition of WonderTreat™ at the 'low dose rate' (LDR) (102.6 g m-2) used during the trial significantly, but only marginally (30%; P = 0.02), reduced the OE rate (mean 13.9 OU m-2 s-1) of anaerobic pig liquor relative to an untreated control (UC) (19.9 OU m-2 s-1). However, the 'high dose rate' (HDR) (205.3 g m-2) also assessed during the trial performed similarly (19.7 OU m-2 s-1) to the UC. No statistically significant difference (at the 5% level) in the concentrations of a range of measured water quality variables was observed between the treatment and control digesters. As a second step, a field-scale assessment of the manure additive was undertaken at a commercial piggery. Two piggery manure lagoons (each with approximately 2500 m2 surface area) were included in the study; one was treated with WonderTreat™ while the other was used as a control. The efficacy of the treatment was assessed using olfactometric evaluation of odour samples collected from the surface of the pond using a dynamic wind tunnel and ancillary equipment. No statistically significant reduction in OE rate could be demonstrated (P = 0.35), partially due to the limited number of samples taken during the assessment. However, there was a numerical reduction in the average OE rate of the treatment pond (29 OU m-2 s-1 at 1 m s-1) compared to the control lagoon (38 OU m-2 s-1 at 1 m s-1).
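A small sketch, using the emission rates quoted above, of the percentage-reduction calculation and of an illustrative scale-up from an areal emission rate to a whole-lagoon emission; the whole-lagoon figures are extrapolations for illustration only, not results from the study.

```python
# Sketch: percentage reduction in odour emission (OE) rate, plus an illustrative
# scale-up from an areal rate to a whole-lagoon rate. Rates and areas are taken
# from the abstract; the whole-lagoon totals are illustrative extrapolations only.

def percent_reduction(control: float, treated: float) -> float:
    """Reduction of the treated rate relative to the control, in percent."""
    return 100.0 * (control - treated) / control

# Laboratory trial (OU m-2 s-1)
uc, ldr, hdr = 19.9, 13.9, 19.7
print(f"LDR reduction: {percent_reduction(uc, ldr):.0f}%")  # ~30%, as reported
print(f"HDR reduction: {percent_reduction(uc, hdr):.0f}%")  # ~1%, i.e. similar to the control

# Field ponds (OU m-2 s-1 at 1 m/s wind-tunnel velocity), ~2500 m2 surface each
area_m2 = 2500
for label, rate in (("treatment pond", 29), ("control lagoon", 38)):
    print(f"{label}: ~{rate * area_m2:.0f} OU/s total emission (illustrative)")
```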
Abstract:
Nearly 75% of all emerging infectious diseases (EIDs) that impact or threaten human health are zoonotic. The majority have spilled from wildlife reservoirs, either directly to humans or via domestic animals. The emergence of many can be attributed to predisposing factors such as global travel, trade, agricultural expansion, deforestation, habitat fragmentation, and urbanization; such factors increase the interface and/or the rate of contact between human, domestic animal, and wildlife populations, thereby creating increased opportunities for spillover events to occur. Infectious disease emergence can be regarded as primarily an ecological process. The epidemiological investigation of EIDs associated with wildlife requires a trans-disciplinary approach that includes an understanding of the ecology of the wildlife species, and an understanding of human behaviours that increase the risk of exposure. Investigations of the emergence of Nipah virus in Malaysia in 1999 and severe acute respiratory syndrome (SARS) in China in 2003 provide useful case studies. The emergence of Nipah virus was associated with the increased size and density of commercial pig farms and their encroachment into forested areas. The movement of pigs for sale and slaughter in turn led to the rapid spread of infection to southern peninsular Malaysia, where the high-density, largely urban pig populations facilitated transmission to humans. Identifying the factors associated with the emergence of SARS in southern China requires an understanding of the ecology of infection both in the natural reservoir and in secondary market reservoir species. A necessary extension of understanding the ecology of the reservoir is an understanding of the trade, and of the social and cultural context of wildlife consumption. Emerging infectious diseases originating from wildlife populations will continue to threaten public health. Mitigating and managing the risk requires an appreciation of the connectedness between human, livestock and wildlife health, and of the factors and processes that disrupt the balance.
Abstract:
Maize is a highly important crop in many countries around the world, both through the sale of the crop to domestic processors for the production of maize products and as a staple food on subsistence farms in developing countries. In many countries, there have been long-term research efforts to develop a suitable hardness method that could assist the maize industry in improving efficiency in processing as well as possibly providing a quality specification for maize growers, which could attract a premium. This paper focuses specifically on hardness and reviews a number of methodologies used internationally, as well as important biochemical aspects of maize that contribute to maize hardness. Numerous foods are produced from maize, and hardness has been described as having an impact on food quality. However, the basis of hardness and the measurement of hardness are very general and would apply to any use of maize from any country. From the published literature, it would appear that one of the simpler methods used to measure hardness is a grinding step followed by a sieving step, using multiple sieve sizes. This would allow the range in hardness within a sample, as well as the average particle size and/or coarse/fine ratio, to be calculated. Any of these parameters could easily be used as reference values for the development of near-infrared (NIR) spectroscopy calibrations. The development of precise NIR calibrations will provide an excellent tool for breeders, handlers, and processors to deliver specific cultivars in the case of growers and bulk loads in the case of handlers, thereby ensuring the most efficient use of maize by domestic and international processors. This paper also considers previous research describing the biochemical aspects of maize that have been related to maize hardness. Both starch and protein affect hardness, with most research focusing on the storage proteins (zeins). Both the content and composition of the zein fractions affect hardness. Genotype and growing environment influence the final protein and starch content and, to a lesser extent, composition. However, hardness is a highly heritable trait and, hence, when a desirable level of hardness is finally agreed upon, breeders will quickly be able to produce material with the hardness levels required by the industry.
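A minimal sketch of the grind-and-sieve hardness measure described above: from the mass retained on each sieve, a coarse/fine ratio and a mass-weighted mean particle size are calculated. The sieve apertures, retained masses and the 0.5 mm coarse/fine cut-off below are illustrative assumptions, not values from the review.

```python
# Sketch of the grind-and-sieve hardness measure: grind a maize sample, sieve it
# over a stack of sieves, then compute a coarse/fine ratio and a mass-weighted
# mean particle size from the mass retained on each sieve. All numbers below
# (apertures, masses, 0.5 mm cut-off) are illustrative assumptions only.

def sieve_hardness(retained_g: dict[float, float], coarse_cutoff_mm: float = 0.5):
    """retained_g maps sieve aperture (mm) to the mass (g) retained on that sieve."""
    total = sum(retained_g.values())
    coarse = sum(m for ap, m in retained_g.items() if ap >= coarse_cutoff_mm)
    fine = total - coarse
    # Mass-weighted mean of the sieve apertures as a simple particle-size index.
    mean_size = sum(ap * m for ap, m in retained_g.items()) / total
    return coarse / fine, mean_size

if __name__ == "__main__":
    sample = {2.0: 5.0, 1.0: 30.0, 0.5: 40.0, 0.25: 20.0, 0.0: 5.0}  # 0.0 = pan
    ratio, mean_size = sieve_hardness(sample)
    print(f"coarse/fine ratio: {ratio:.2f}, mean particle size: {mean_size:.2f} mm")
```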
Abstract:
The north Queensland banana industry is under pressure from government and community expectations to exhibit good environmental stewardship. The industry is situated on the high-rainfall north Queensland coast adjacent to 2 natural icons, the Great Barrier Reef to the east and World Heritage-listed rain forest areas to the west. The main environmental concern is agricultural industry pollutants harming the Great Barrier Reef. In addition to environmental issues, the banana industry also suffers financial pressure from declining margins and production loss from tropical cyclones. As part of a broader government strategy to reduce land-based pollutants affecting the Great Barrier Reef, the formation of a pilot banana producers group to address these environmental and economic pressures was facilitated. Using an integrated farming systems approach, we worked collaboratively with these producers to conduct an environmental risk assessment of their businesses and then to develop best management practices (BMPs) to address environmental concerns. We also sought input from technical experts to provide increased rigour for the environmental risk assessment and BMP development. The producers' commercial experience ensured new ideas for improved sustainable practices were constantly assessed through their profit-driven 'filter', thus ensuring economic sustainability was also considered. Relying heavily on the producers' knowledge and experience meant the agreed sustainable practices were practical, relevant and financially feasible for the average-sized banana business in the region. Expert input and review also ensured that practices were technically sound. The pilot group producers then implemented and adapted selected key practices on their farms. High-priority practices addressed by the producers group included optimizing nitrogen fertilizer management to reduce nitrification of runoff water, developing practical ground cover management to reduce soil erosion and improving integrated pest management systems to reduce pesticide use. To facilitate wider banana industry understanding and adoption of the BMPs developed by the pilot group, we conducted field days at the farms of the pilot group members. Information generated by the pilot group has had wider application to Australian horticulture, and the process has subsequently been used with the north Queensland sugar industry. Our experiences have shown that integrated farming systems methodologies are useful in addressing complex issues like environmental and economic sustainability. We have also found that individual horticulture businesses need on-going technical support for change to more sustainable practices. One-off interventions have little impact, as farm improvement is usually an on-going incremental process. A key lesson from this project has been the need to develop practical, farm-scale economic tools to clarify and demonstrate the financial impact of alternative management practices. Demonstrating continued profitability is critical to encourage widespread industry adoption of environmentally sustainable practices.
Abstract:
Citrus canker is a disease of citrus and closely related species, caused by the bacterium Xanthomonas citri subsp. citri. This disease, previously exotic to Australia, was detected on a single farm [infested premise 1 (IP1); IP is the terminology used in official biosecurity protocols to describe a locality at which an exotic plant pest has been confirmed or is presumed to exist, and IPs are numbered sequentially as they are detected] in Emerald, Queensland in July 2004. During the following 10 months the disease was detected on two other farms (IP2 and IP3) within the same area, and studies indicated the disease first occurred on IP1 and spread to IP2 and IP3. The oldest, naturally infected plant tissue observed on any of these farms indicated the disease was present on IP1 for several months before detection and established on IP2 and IP3 during the second quarter (i.e. autumn) of 2004. Transect studies on some IP1 blocks showed disease incidences ranging between 52 and 100% (trees infected). This contrasted with the very low disease incidence, less than 4% of trees within a block, on IP2 and IP3. The mechanisms proposed for disease spread within blocks include weather-assisted dispersal of the bacterium (e.g. wind-driven rain) and movement of contaminated farm equipment, in particular by pivot irrigator towers, via mechanical damage in combination with abundant water. Spread between blocks on IP2 was attributed to movement of contaminated farm equipment and/or people. The epidemiological results suggest: (i) successive surveillance rounds increase the likelihood of disease detection; (ii) surveillance sensitivity is affected by tree size; and (iii) individual destruction zones (for the purpose of eradication) could be determined using disease incidence and severity data rather than a predefined set area.
Abstract:
Surface losses of nitrogen from horticulture farms in coastal Queensland, Australia, may have the potential to eutrophy sensitive coastal marine habitats nearby. A case study of the potential extent of such losses was investigated in a coastal macadamia plantation. Nitrogen losses were quantified in 5 consecutive runoff events during the 13-month study. Irrigation did not contribute to surface flows. Runoff was generated by storms at combined intensities and durations that were 20–40 mm/h for >9 min. These intensities and durations were within expected short-term (1 year) and long-term (up to 20 years) frequencies of rainfall in the study area. Surface flow volumes were 5.3 ± 1.1% of the episodic rainfall generated by such storms. Therefore, the largest part of each rainfall event was attributed to infiltration and drainage in this farm soil (Kandosol). The estimated annual loss of total nitrogen in runoff was 0.26 kg N/ha.year, representing a minimal loading of nitrogen in surface runoff when compared with other studies. The weighted average concentrations of total sediment nitrogen (TSN) and total dissolved nitrogen (TDN) generated in the farm runoff were 2.81 ± 0.77% N and 1.11 ± 0.27 mg N/L, respectively. These concentrations were considerably greater than ambient levels in an adjoining catchment waterway. Concentrations of TSN and TDN in the waterway were 0.11 ± 0.02% N and 0.50 ± 0.09 mg N/L, respectively. The steep concentration gradient of TSN and TDN between the farm runoff and the waterway demonstrated the occurrence of nutrient loading from the farming landscapes to the waterway. The TDN levels in the stream exceeded the current specified threshold of 0.2–0.3 mg N/L for eutrophication of such a waterway. Therefore, while the estimate of annual loading of N from runoff losses was comparatively low, it was evident that the stream catchment and associated agricultural land uses were already characterised by significant nitrogen loadings that pose eutrophication risks. The reported levels of nitrogen and the proximity of such waterways (8 km) to the coastline may also have implications for the nearshore (oligotrophic) marine environment during periods of turbulent flow.
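A small sketch of the load calculation implied by these figures: dissolved-N load equals runoff depth times concentration, using the conversion that 1 mm of runoff over 1 ha is 10,000 L. The annual runoff depth used below is an assumption for illustration (the abstract reports the annual loss but not the runoff depth), and only the dissolved fraction is considered.

```python
# Sketch of the dissolved-N load calculation implied by the abstract:
#   load (kg N/ha) = runoff depth (mm) * 10,000 L per mm.ha * conc (mg N/L) / 1e6 mg per kg
# The 23 mm annual runoff depth is an assumption for illustration; the abstract
# reports the annual total-N loss (0.26 kg N/ha.year) but not the runoff depth,
# and this sketch covers only the dissolved fraction (TDN).

L_PER_MM_HA = 10_000   # 1 mm of water over 1 ha = 10 m3 = 10,000 L
MG_PER_KG = 1_000_000

def n_load_kg_per_ha(runoff_depth_mm: float, tdn_mg_per_l: float) -> float:
    return runoff_depth_mm * L_PER_MM_HA * tdn_mg_per_l / MG_PER_KG

if __name__ == "__main__":
    tdn = 1.11        # mg N/L, weighted average reported in the abstract
    runoff_mm = 23    # assumed annual runoff depth (illustrative)
    print(f"~{n_load_kg_per_ha(runoff_mm, tdn):.2f} kg dissolved N/ha.year")
```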
Abstract:
Faecal Egg Count Reduction Tests (FECRTs) for macrocyclic lactone (ML) and levamisole (LEV) drenches were conducted on two dairy farms in the subtropical, summer rainfall region of eastern Australia to determine whether anthelmintic failure contributed to the severe gastrointestinal nematode infections observed in weaner calves. Subtropical Cooperia spp. were the dominant nematodes on both farms, although significant numbers of Haemonchus placei were also present on Farm 2. On Farm 1, moxidectin pour-on (MXD) drenched at 0.5 mg kg-1 liveweight (LW) reduced the overall Cooperia burden by 82% (95% confidence limits, 37-95%) at day 7 post-drench. As worm burdens increased rapidly in younger animals in the control group (n = 4), levamisole was used as a salvage drench and these calves were withdrawn from the trial on animal welfare grounds after sample collection at day 7. Levamisole (LEV) dosed at 6.8 mg kg-1 LW reduced the worm burden in these calves by 100%, 7 days after drenching. On Farm 2, MXD given at 0.5 mg kg-1 LW reduced the faecal worm egg count of cooperioids at day 8 by 96% (71-99%), ivermectin oral (IVM) at 0.2 mg kg-1 LW by 1.6% (-224 to 70%) and LEV oral at 7.1 mg kg-1 LW by 100%. For H. placei the reductions were 98% (85-99.7%) for MXD, 0.7% (-226 to 70%) for IVM and 100% for LEV. This is the first report in Australia of the failure of macrocyclic lactone treatments to control subtropical Cooperia spp. and of suspected failure to control H. placei in cattle.
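A minimal sketch of the faecal egg count reduction calculation, FECR% = 100 x (1 - mean treated count / mean control count), with a simple bootstrap for approximate 95% confidence limits. The statistical method actually used in the study is not stated in the abstract, and the egg counts below are made-up illustrative values, not trial data.

```python
# Sketch of a faecal egg count reduction (FECR) calculation:
#   FECR% = 100 * (1 - mean(treated) / mean(control))
# with a simple bootstrap for approximate 95% confidence limits. The exact
# statistics used in the study are not given in the abstract; the egg counts
# below are made-up illustrative values.

import random

def fecr(treated: list[float], control: list[float]) -> float:
    return 100.0 * (1.0 - (sum(treated) / len(treated)) / (sum(control) / len(control)))

def fecr_ci(treated, control, n_boot=10_000, seed=1):
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_boot):
        t = [rng.choice(treated) for _ in treated]   # resample treated counts
        c = [rng.choice(control) for _ in control]   # resample control counts
        estimates.append(fecr(t, c))
    estimates.sort()
    return estimates[int(0.025 * n_boot)], estimates[int(0.975 * n_boot)]

if __name__ == "__main__":
    treated = [40, 120, 0, 260, 80]          # eggs per gram, post-drench (illustrative)
    control = [900, 1500, 700, 1200, 1100]   # eggs per gram, untreated (illustrative)
    lo, hi = fecr_ci(treated, control)
    print(f"FECR = {fecr(treated, control):.0f}% (approx. 95% CI {lo:.0f}-{hi:.0f}%)")
```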
Abstract:
This study assessed the levels of two key pathogens, Salmonella and Campylobacter, along with the indicator organism Escherichia coli in aerosols within and outside poultry sheds. The study ranged over a 3-year period on four poultry farms and consisted of six trials across the broiler production cycle of around 55 days. Weekly testing of litter and aerosols was carried out through the cycle. A key point that emerged is that the levels of airborne bacteria are linked to the levels of these bacteria in litter. This was most clearly demonstrated by E. coli. The typical levels of E. coli in litter were around 10^8 CFU/g and, as a consequence, were in the range of 10^2 to 10^4 CFU/m^3 in aerosols, both inside and outside the shed. The external levels were always lower than the internal levels. Salmonella was present only intermittently in litter and at lower levels (10^3 to 10^5 most probable number [MPN]/g) and consequently was present only intermittently and at low levels in air inside the shed (range of 0.65 to 4.4 MPN/m^3) and once outside (2.3 MPN/m^3). The Salmonella serovars isolated in litter were generally also isolated from aerosols and dust, with the serovars Chester and Sofia being dominant across these interfaces. Campylobacter was detected late in the production cycle, in litter at levels of around 10^7 MPN/g. Campylobacter was detected only once inside the shed, and then at a low level of 2.2 MPN/m^3. Thus, the public health risk from these organisms in poultry environments via the aerosol pathway is minimal.
Abstract:
In this study, nasal swabs taken from multiparous sows at weaning or from sick pigs displaying symptoms of Glasser's disease on farms in Australia [date not given] were cultured and analysed by polymerase chain reaction (PCR). Within each genotype detected on a farm, representative isolates were serotyped by the gel diffusion (GD) test or the indirect haemagglutination (IHA) test. Isolates that did not react in either test were regarded as non-typable and were termed serovar NT. Serovars 1, 5, 12, 13 and 14 were classified as highly pathogenic; serovars 2, 4 and 15 as moderately pathogenic; serovar 8 as slightly pathogenic; and serovars 3, 6, 7, 9 and 11 as non-pathogenic. Sows were inoculated 3 and 5 weeks before farrowing with the strain of Haemophilus parasuis used for controlled challenge (serovars 4, 6 and 9 from Farms 1, 2 and 4, respectively). Before farrowing, the sows were divided into control and treatment groups. Five to seven days after birth, the piglets of the treatment group were challenged with the strain from their farm that had been used to inoculate the sows. The effectiveness of the controlled exposure was evaluated by the number of piglets displaying clinical signs possibly related to infection, the number of antibiotic treatments and pig mortality. Nasal swabs were taken from sick pigs twice a week to relate clinical signs to infection. A subsample of pigs was weighed after leaving the weaning sheds. The specificity of a real-time PCR amplifying the infB gene was evaluated with 68 H. parasuis isolates and 36 strains of closely related species. A total of 239 DNA samples from tissues and fluids of 16 experimentally challenged animals were also tested with the real-time PCR, and the results were compared with culture and a conventional PCR. The farm experiments showed that none of the controlled-challenge pigs showed signs of illness due to Glasser's disease, although the treatment groups required more antibiotic treatments than the controls. A total of 556 H. parasuis isolates were genotyped, while 150 isolates were serotyped. H. parasuis was detected on 19 of 20 farms, including 2 farms with an extensive history of freedom from Glasser's disease. Isolates belonging to serovars regarded as potentially pathogenic were obtained from healthy pigs at weaning on 8 of the 10 farms with a history of Glasser's disease outbreaks. Sampling of 213 sick pigs yielded 115 isolates, 99 of which belonged to serovars that were either potentially pathogenic or of unknown pathogenicity. Only 16 isolates from these sick pigs were of a serovar known to be non-pathogenic. Healthy pigs also carried H. parasuis, even on farms free of Glasser's disease. The real-time PCR gave positive results for all 68 H. parasuis isolates and negative results for all 36 non-target bacteria. When used on the clinical material from the experimental infections, the real-time PCR produced significantly more positive results than the conventional PCR (165 compared with 86).
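A small sketch of the performance figures that follow directly from the panel results quoted above (68/68 target isolates positive, 36/36 non-target strains negative) and of the crude detection rates on the clinical material (165 and 86 positives out of 239 samples); the paired statistical comparison used in the study is not reproduced here.

```python
# Sketch: analytical sensitivity/specificity of the real-time PCR from the
# isolate panel, and crude detection rates on the 239 clinical samples.
# All counts are taken from the abstract; this is not the study's own analysis.

def sensitivity(true_pos: int, false_neg: int) -> float:
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    return true_neg / (true_neg + false_pos)

if __name__ == "__main__":
    print(f"panel sensitivity: {sensitivity(68, 0):.0%}")   # 68/68 isolates positive
    print(f"panel specificity: {specificity(36, 0):.0%}")   # 36/36 non-targets negative

    # Detection rates on clinical material from the experimental infections.
    print(f"real-time PCR:    {165 / 239:.0%} of 239 samples positive")
    print(f"conventional PCR: {86 / 239:.0%} of 239 samples positive")
```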
Abstract:
1. Litter samples were collected at the end of the production cycle from spread litter in a single shed on each of 28 farms distributed across the three eastern-seaboard states of Australia.
2. The geometric mean for Salmonella was 44 Most Probable Number (MPN)/g for the 20 positive samples (the geometric mean calculation is sketched after this list). Five samples were between 100 and 1000 MPN/g and one was at 10^5 MPN/g, indicating that a range of factors contributes to these varying loads of this organism in litter.
3. The geometric mean for Campylobacter was 30 MPN/g for the 10 positive samples, with 7 of these samples being 100 MPN/g. The low prevalence and incidence of Campylobacter were possibly due to the rapid die-off of this organism.
4. E. coli counts were markedly higher than those of the two key pathogens (geometric mean 20 x 10^5 colony forming units (cfu)/g), with overall values being more or less within the same range across all samples in the trial, suggesting a uniform contribution pattern of these organisms in litter.
5. Listeria monocytogenes was absent from all samples, and this organism appears not to be an issue in litter.
6. The dominant Salmonella serovar (70% of the isolates) was S. Sofia (a common serovar isolated from chickens in Australia) and was isolated across all regions. Other major serovars were S. Virchow and S. Chester (at 10%) and S. Bovismorbificans and S. Infantis (at 8%), with these serovars demonstrating a spatial distribution across the major regions tested.
7. There is potential to re-use litter in the environment depending on the end use and the support of relevant application practices and guidelines.
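A minimal sketch of the geometric mean calculation used for the MPN data above: the back-transformed mean of the log10-transformed positive counts. The counts in the example are illustrative values, not the survey data.

```python
# Sketch: geometric mean of positive MPN counts, i.e. the back-transformed
# mean of the log10 counts. The counts below are illustrative only, not the
# survey data summarised in the abstract.

import math

def geometric_mean(counts: list[float]) -> float:
    logs = [math.log10(c) for c in counts]          # log-transform each count
    return 10 ** (sum(logs) / len(logs))            # back-transform the mean

if __name__ == "__main__":
    salmonella_mpn_per_g = [10, 23, 43, 93, 240, 460]   # illustrative positives
    print(f"geometric mean: {geometric_mean(salmonella_mpn_per_g):.0f} MPN/g")
```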