17 results for 139-855D
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Objective: To attenuate two strains of Eimeria tenella by selecting for precocious development and evaluate the strains in characterisation trials and by field evaluation, to choose one precocious line for incorporation into an Australian live coccidiosis vaccine for poultry. Design: Two strains from non-commercial flocks were passaged through chickens while selecting for precocious development. Each strain was characterised for drug sensitivity, pathogenicity, protection against homologous and heterologous challenge, and oocyst output in replicated experiments in which the experimental unit was a cage of three birds. Oocyst output and/or body weight gain data collected over a 10 to 12 day period following final inoculation were measured. Feed conversion ratios were also calculated where possible. Results: Fifteen passages resulted in prepatent periods reduced by 24 h for the Redlands strain (from 144 h to 120 h) and 23 h for the Darryl strain (from 139 h to 116 h). Characterisation trials demonstrated that each precocious line was significantly less pathogenic than its parent strain and each effectively induced immunity that protected chickens against challenge with both the parent strain and other virulent field strains. Both lines had oocyst outputs that, although significantly reduced relative to the parent strains, remained sufficiently high for commercial vaccine production, and both showed susceptibility to coccidiostats. Conclusion: Two attenuated lines have been produced that exhibit the appropriate characteristics for use in an Australian live coccidiosis vaccine.
Abstract:
Predatory insects and spiders are key elements of integrated pest management (IPM) programmes in agricultural crops such as cotton. Management decisions in IPM programmes should be based on a reliable and efficient method for counting both predators and pests. Knowledge of the temporal constraints that influence sampling is required because arthropod abundance estimates are likely to vary over a growing season and within a day. Few studies have adequately quantified this effect using the beat sheet, a potentially important sampling method. We compared the commonly used methods of suction and visual sampling to the beat sheet, with reference to an absolute cage clamp method for determining the abundance of various arthropod taxa over 5 weeks. There were significantly more entomophagous arthropods recorded using the beat sheet and cage clamp methods than by using suction or visual sampling, and these differences became more pronounced as the plants grew. In a second trial, relative estimates of entomophagous and phytophagous arthropod abundance were made using beat sheet samples collected over a day. Beat sheet estimates of the abundance of only eight of the 43 taxa examined were found to vary significantly over a day. Beat sheet sampling is recommended in further studies of arthropod abundance in cotton, but researchers and pest management advisors should bear in mind the time-of-season and time-of-day effects.
Abstract:
The response of grasslands to disturbance varies with the nature of the disturbance and the productivity of the landscape. In highly productive grasslands, competitive exclusion often results in decreased species richness and grazing may allow more species to coexist. Once widespread, grasslands dominated by Dichanthium sericeum (Queensland bluegrass) and Astrebla spp. (Mitchell grass) occur on fertile plains but have been reduced in extent by cultivation. We tested the effects of exclusion of livestock grazing on these grasslands by comparing the floristic composition of sites in a nature reserve with an adjacent stock reserve. In addition, sites that had been cultivated within the nature reserve were compared with those where grazing but no cultivation had occurred. To partition the effects of temporal variation from spatial variation we sampled sites in three different years (1998, 2002 and 2004). Some 194 taxa were recorded at the nature reserve and surrounding stock routes. Sampling time, the occurrence of past cultivation and livestock grazing all influenced species composition. Species richness varied greatly between sampling periods relating to highly variable rainfall and water availability on heavy clay soils. Native species richness was significantly lower at previously cultivated sites (13-22 years after cultivation), but was not significantly influenced by grazing exclusion. After 8 years it appears that reintroducing disturbance in the form of livestock grazing is not necessary to maintain plant species richness in the reserve. The highly variable climate (e.g. droughts) probably plays an important role in the coexistence of species by negating competitive exclusion and allowing interstitial species to persist.
Abstract:
Weed management is one of the most important economic and agronomic issues facing farmers in Australia's grain regions. Weed species occurrence and abundance were monitored between 1997 and 2000 on 46 paddocks (sites) across 18 commercial farms located in the Northern Grain Region. The sites generally fell within 4 disjunct regions, from south to north: Liverpool Plains, Moree, Goondiwindi and Kingaroy. While high species richness was found (139 species or species groups), only 8 species occurred in all 4 regions and many (56 species) occurred at only 1 site or region. No species were observed at every site, but 7 species (Sonchus spp., Avena spp., Conyza spp., Echinochloa spp., Convolvulus erubescens, Phalaris spp. and Lactuca serriola) were recorded on more than 70% of sites. The average number of species observed within crops after treatment and before harvest was less than 13. Species richness tended to be higher in winter pulse crops, cotton and in fallows, but overall was similar at the different sampling seasons (summer v. winter). Separate species assemblages associated with the Goondiwindi and Kingaroy regions were identified by correspondence analysis, but these appeared to form no logical functional group. Species richness and density were generally low, demonstrating that farmers are managing weed populations effectively in both summer and winter cropping phases. Despite the apparent adoption of conservation tillage, an increase in opportunity cropping and the diversity of crops grown (13), there was no obvious effect of management practices on weed species richness or relative abundance. Avena spp. and Sonchus spp. were 2 of the most dominant weeds, particularly in central and southern latitudes of the region; Amaranthus spp. and Raphanus raphanistrum were the most abundant species in the northern part of the region. The ubiquity of these and other species shows that continued vigilance is required to suppress weeds as a management issue.
Abstract:
Grain feeding low bodyweight, cast-for-age (CFA) sheep from pastoral areas of eastern Australia at the end of the growing season can enable critical carcass weight grades to be achieved and thus yield better economic returns. The aim of this work was to compare growth and carcass characteristics for CFA Merino ewes consuming either simple diets based on whole sorghum grain or commercial feed pellets. The experiment also compared various sources of additional nitrogen (N) for inclusion in sorghum diets and evaluated several introductory regimes. Seventeen ewes were killed initially to provide baseline carcass data and the remaining 301 ewes were gradually introduced to the concentrate diets over 14 days before being fed concentrates and wheaten hay ad libitum for 33 or 68 days. Concentrate treatments were: (i) commercial feed pellets, (ii) sorghum mix (SM; whole sorghum grain, limestone, salt and molasses) + urea and ammonium sulfate (SMU), (iii) SMU + whole cottonseed at 286 g/kg of concentrate dry matter (DM), (iv) SM + cottonseed meal at 139 g/kg of concentrate DM, (v) SMU + virginiamycin (20 mg/kg of concentrate) for the first 21 days of feeding, and (vi) whole cottonseed gradually replaced by SMU over the first 14 days of feeding. The target carcass weight of 18 kg was achieved after only 33 days on feed for the pellets and the SM + cottonseed meal diet. All other whole grain sorghum diets required between 33 and 68 days on feed to achieve the target carcass weight. Concentrates based on whole sorghum grain generally produced significantly (P < 0.05) lower carcass weight and fat score than pellets and this may have been linked to the significantly (P < 0.05) higher faecal starch concentrations for ewes consuming sorghum-based diets (270 v. 72 g/kg DM on day 51 of feeding for sorghum-based diets and pellets, respectively). 
Source of N in whole grain sorghum rations and special introductory regimes had no significant (P > 0.05) effects on carcass weight or fat score of ewes with the exception of carcass weight for SMU + whole cottonseed being significantly lower than SM + cottonseed meal at day 33. Ewes finished on all diets produced acceptable carcasses although muscle pH was high in all ewe carcasses (average 5.8 and 5.7 at 33 and 68 days, respectively). There were no significant (P > 0.05) differences between diets in concentrate DM intake, rumen fluid pH, meat colour score, fat colour score, eye muscle area, meat pH or meat temperature.
Abstract:
Resistance to the root-lesion nematode Pratylenchus thornei was sought in wheat from the West Asia and North Africa (WANA) region in the Watkins Collection (148 bread and 139 durum wheat accessions) and the McIntosh Collection (59 bread and 43 durum wheat accessions). It was considered that landraces from this region, encompassing the centres of origin of wheat and where P. thornei also occurs, could be valuable sources of resistance for use in wheat breeding. Resistance was determined by number of P. thornei/kg soil after the growth of the plants in replicated glasshouse experiments. On average, durum accessions produced significantly lower numbers of P. thornei than bread wheat accessions in both the Watkins and McIntosh Collections. Selected accessions with low P. thornei numbers were re-tested and 13 bread wheat and 10 durum accessions were identified with nematode numbers not significantly different from GS50a, a partially resistant bread wheat line used as a reference standard. These resistant accessions, which originated in Iran, Iraq, Syria, Egypt, Sudan, Morocco, and Tunisia, represent a resource of resistance genes in the primary wheat gene pool, which could be used in Australian wheat breeding programs to reduce the economic loss from P. thornei.
Abstract:
Large geographic areas can have numerous incipient invasive plant populations that necessitate eradication. However, resources are often insufficient to address every infestation. Within the United States, weed lists (either state-level or smaller unit) generally guide the prioritization of eradication of each listed species uniformly across the focus region. This strategy has several limitations that can compromise overall effectiveness, including spending limited resources on (1) low-impact populations or (2) difficult-to-access populations, or (3) missing high-impact populations of low-priority species. Therefore, we developed a novel science-based, transparent, analytical ranking tool to prioritize weed populations, instead of species, for eradication and tested it on a group of noxious weeds in California. For outreach purposes, we named the tool WHIPPET (Weed Heuristics: Invasive Population Prioritization for Eradication Tool). Using the Analytic Hierarchy Process, which incorporated expert opinion, we developed three major criteria, four sub-criteria, and four sub-sub-criteria, taking into account both species and population characteristics. Subject matter experts weighted and scored these criteria to assess the relative impact, potential spread, and feasibility of eradication (the major criteria) for 100 total populations of 19 species. Species-wide population scores indicated that conspecific populations do not necessarily group together in the final ranked output. Thus, priority lists based solely on species-level characteristics are less effective than a blended prioritization based on both species attributes and individual population and site parameters. WHIPPET should facilitate a more efficacious decision-making process, allocating limited resources to target invasive plant infestations with the greatest predicted impacts in the region under consideration.
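The population-level ranking described above can be sketched as a weighted sum over the three major criteria. The sketch below is purely illustrative: the criterion weights, the 0-10 scoring scale, and the population names are hypothetical stand-ins for the expert-derived AHP weights and scores used by WHIPPET, not values from the study.

```python
# Hypothetical WHIPPET-style ranking: each weed *population* (not
# species) is scored on the three major criteria from the abstract,
# and a weighted sum gives its eradication priority.

# Stand-in AHP weights for the major criteria (sum to 1.0; illustrative)
WEIGHTS = {"impact": 0.40, "spread": 0.35, "feasibility": 0.25}

def priority_score(scores: dict) -> float:
    """Weighted sum of criterion scores (each on an assumed 0-10 scale)."""
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

# Two populations of the same species can score very differently,
# which is why conspecific populations need not rank together.
populations = {
    "species_A_site_1": {"impact": 8, "spread": 6, "feasibility": 9},
    "species_A_site_2": {"impact": 3, "spread": 2, "feasibility": 4},
    "species_B_site_1": {"impact": 7, "spread": 8, "feasibility": 5},
}

# Rank populations from highest to lowest priority.
ranked = sorted(populations,
                key=lambda p: priority_score(populations[p]),
                reverse=True)
```

Because the ranking operates on individual populations, a high-impact, easily accessed infestation of a "minor" species can outrank a remote infestation of a listed species, which is the behaviour the abstract contrasts with uniform species-level lists.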
Abstract:
The distribution and density of the ampullary electroreceptors in the skin of elasmobranchs are influenced by the phylogeny and ecology of a species. Sensory maps were created for 4 species of pristid sawfish. Their ampullary pores were separated into pore fields based on their innervation and cluster formation. Ventrally, ampullary pores are located in 6 areas (5 in Pristis microdon), covering the rostrum and head to the gills. Dorsally, pores are located in 4 areas (3 in P. microdon), which cover the rostrum, head and may extend slightly onto the pectoral fins. In all species, the highest number of pores is found on the dorsal and ventral sides of the rostrum. The high densities of pores along the rostrum combined with the low densities around the mouth could indicate that sawfish use their rostrum to stun their prey before ingesting it, but this hypothesis remains to be tested. The directions of ampullary canals on the ventral side of the rostrum are species specific. P. microdon possesses the highest number of ampullary pores, which indicates that, amongst the species studied, it is an electroreception specialist. Consistent with this, juvenile P. microdon inhabit low-visibility freshwater habitats.
Abstract:
Development of new agricultural industries in northern Australia is seen as a way to provide food security in the face of reduced water availability in existing regions in the south. This report aims to identify some of the possible economic consequences of developing a rice industry in the Burdekin region while there is a reduction of output in the Riverina. Annual rice production in the Riverina peaked at 1.7 M tonnes, but the long-term outlook, given climate change impacts on that region and government water buy-backs, is more likely to be less than 800,000 tonnes. Growers are highly efficient water users by international standards, but the ability to offset an anticipated reduction in water availability through further efficiency gains is limited. In recent years growers in the Riverina have diversified their farms to a greater extent and secondary production systems include beef, sheep and wheat. Production in north Queensland is in its infancy, but a potentially suitable farming system has been developed by including rice within the sugarcane system without competition, in fact contributing to the production of sugar by increasing yields and controlling weeds. The economic outcomes are estimated using a large-scale, dynamic, computable general equilibrium (CGE) model of the world economy (Tasman Global), scaled down to regional level. CGE models mimic the workings of the economy through a system of interdependent behavioural and accounting equations which are linked to an input-output database. When an economic shock or change is applied to a model, each of the markets adjusts according to the set of behavioural parameters which are underpinned by economic theory. In this study the model is driven by reducing production in the Riverina in accordance with relationships found between water availability and the production of rice and replacement by other crops, and by increasing rice production in the Burdekin.
Three scenarios were considered:
• Scenario 1: Rice is grown using the fallow period between the last ratoon crop of sugarcane and the new planting; there is no competition between rice and sugarcane.
• Scenario 2: Rice displaces sugarcane production.
• Scenario 3: Rice is grown on additional land and does not compete with sugarcane.
Two time periods were used, 2030 and 2070, which are the conventional time points to consider climate change impacts. Under scenario 1, real economic output declines in the Riverina by $45 million in 2030 and by $139 million in 2070. This is only partially offset by the increased real economic output in the Burdekin of $35 million and $131 million respectively.
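The scenario 1 figures can be combined into a simple net national effect; the Riverina and Burdekin figures below are taken directly from the abstract (millions of dollars of real economic output), and the offset arithmetic is the only thing added.

```python
# Net effect of scenario 1: the Burdekin gain only partially offsets
# the Riverina decline (figures in $ millions, from the abstract).

riverina_decline = {2030: -45, 2070: -139}
burdekin_gain = {2030: 35, 2070: 131}

net_effect = {year: riverina_decline[year] + burdekin_gain[year]
              for year in riverina_decline}
# Net decline of $10 million in 2030 and $8 million in 2070.
```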
Abstract:
This research aimed to develop and evaluate pre- and postharvest management strategies to reduce stem end rot (SER) incidence and extend the saleable life of 'Carabao' mango fruits in the Southern Philippines. Preharvest management focused on the development and improvement of a fungicide spray program, while postharvest management aimed to develop alternative interventions besides hot water treatment (HWT). In field evaluations, systemic fungicides, namely azoxystrobin (Amistar 25SC), tebuconazole (Folicur 25WP), carbendazim (Goldazim 500SC), difenoconazole (Score 250SC) and azoxystrobin + difenoconazole (Amistar Top), reduced blossom blight severity and improved fruit setting and retention, resulting in higher fruit yield, but failed to sufficiently suppress SER incidence. Based on these findings, an improved fungicide spray program was developed, taking into account the infection process of SER pathogens and fungicide resistance. Timely application of a protectant (mancozeb) and systemic fungicides (azoxystrobin, carbendazim and difenoconazole) during the most critical stages of mango flower and fruit development ensured higher harvestable fruit yield but only minimally lowered SER incidence. Control of SER was also achieved by postharvest treatment with HWT (52-55°C for 10 min), which significantly prolonged the saleable life of mango fruits. However, extended hot water treatment (EHWT; 46°C pulp temperature for 15 min), rapid heat treatment (RHT; 59°C for 30-60 sec), fungicide dips and promising biological control agents failed to satisfactorily reduce SER and prolong saleable life. In contrast, the integration of the improved spray program as a preharvest management practice with postharvest treatments such as HWT and fungicide dips (azoxystrobin, 150-175 ppm; carbendazim, 312.5 ppm; and tebuconazole, 125-156 ppm) significantly reduced disease and extended marketable life by up to 8 days.
Abstract:
Trichinella surveillance in wildlife relies on muscle digestion of large samples which are logistically difficult to store and transport in remote and tropical regions as well as labour-intensive to process. Serological methods such as enzyme-linked immunosorbent assays (ELISAs) offer rapid, cost-effective alternatives for surveillance but should be paired with additional tests because of the high false-positive rates encountered in wildlife. We investigated the utility of ELISAs coupled with Western blot (WB) in providing evidence of Trichinella exposure or infection in wild boar. Serum samples were collected from 673 wild boar from a high- and low-risk region for Trichinella introduction within mainland Australia, which is considered Trichinella-free. Sera were examined using both an 'in-house' and a commercially available indirect-ELISA that used excretory secretory (E/S) antigens. Cut-off values for positive results were determined using sera from the low-risk population. All wild boar from the high-risk region (352) and 139/321 (43.3%) of the wild boar from the low-risk region were tested by artificial digestion. Testing by Western blot using E/S antigens, and a Trichinella-specific real-time PCR was also carried out on all ELISA-positive samples. The two ELISAs correctly classified all positive controls as well as one naturally infected wild boar from Gabba Island in the Torres Strait. In both the high- and low-risk populations, the ELISA results showed substantial agreement (k-value = 0.66) that increased to very good (k-value = 0.82) when WB-positive only samples were compared. The results of testing sera collected from the Australian mainland showed the Trichinella seroprevalence was 3.5% (95% C.I. 0.0-8.0) and 2.3% (95% C.I. 0.0-5.6) using the in-house and commercial ELISA coupled with WB respectively. These estimates were significantly higher (P < 0.05) than the artificial digestion estimate of 0.0% (95% C.I. 0.0-1.1). 
Real-time PCR testing of muscle from seropositive animals did not detect Trichinella DNA in any mainland animals, but did reveal the presence of a second larvae-positive wild boar on Gabba Island, supporting its utility as an alternative, highly sensitive method in muscle examination. The serology results suggest Australian wildlife may have been exposed to Trichinella parasites. However, because of the possibility of non-specific reactions with other parasitic infections, more work using well-defined cohorts of positive and negative samples is required. Even if the specificity of the ELISAs is proven to be low, their ability to correctly classify the small number of true positive sera in this study indicates utility in screening wild boar populations for reactive sera which can be followed up with additional testing.
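The agreement statistic reported for the two ELISAs (k = 0.66, rising to 0.82 on the WB-positive subset) is Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A minimal sketch follows; the 2x2 counts in the test are invented for illustration and are not the study's data.

```python
# Cohen's kappa for agreement between two binary tests (e.g. two
# ELISAs calling each serum positive or negative). Counts come from
# the paired 2x2 table of results.

def cohens_kappa(both_pos: int, a_pos_b_neg: int,
                 a_neg_b_pos: int, both_neg: int) -> float:
    """Chance-corrected agreement between two positive/negative calls."""
    n = both_pos + a_pos_b_neg + a_neg_b_pos + both_neg
    po = (both_pos + both_neg) / n                 # observed agreement
    p_a = (both_pos + a_pos_b_neg) / n             # test A positive rate
    p_b = (both_pos + a_neg_b_pos) / n             # test B positive rate
    pe = p_a * p_b + (1 - p_a) * (1 - p_b)         # chance agreement
    return (po - pe) / (1 - pe)
```

With mostly-negative wildlife sera, raw agreement can look high simply because both tests call most samples negative; kappa discounts exactly that, which is why it is the appropriate statistic for comparing the two ELISAs here.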
Abstract:
The Queensland strawberry (Fragaria ×ananassa) breeding program in subtropical Australia aims to improve sustainable profitability for the producer. Selection must account for the relative economic importance of each trait and the genetic architecture underlying these traits in the breeding population. Our study used estimates of the influence of a trait on production costs and profitability to develop a profitability index (PI) and an economic weight (i.e., change in PI for a unit change in level of trait) for each trait. The economic weights were then combined with the breeding values for 12 plant and fruit traits on over 3000 genotypes that were represented in either the current breeding population or as progenitors in the pedigree of these individuals. The resulting linear combination (i.e., sum of economic weight × breeding value for all 12 traits) estimated the overall economic worth of each genotype as H, the aggregate economic genotype. H values were validated by comparisons among commercial cultivars and were also compared with the estimated gross margins. When the H value of ‘Festival’ was set as zero, the H values of genotypes in the pedigree ranged from –0.36 to +0.28. H was highly correlated (R2 = 0.77) with the year of selection (1945–98). The gross margins were highly linearly related (R2 > 0.98) to H values when the genotype was planted on less than 50% of available area, but the relationship was non-linear [quadratic with a maximum (R2 > 0.96)] when the planted area exceeded 50%. Additionally, with H values above zero, the variation in gross margin increased with increasing H values as the percentage of area planted to a genotype increased. High correlations among some traits allowed the omission of any one of three of the 12 traits with little or no effect on ranking (Spearman’s rank correlation 0.98 or greater). 
Thus, these traits may be dropped from the aggregate economic genotype, leading to either cost reductions in the breeding program or increased selection intensities for the same resources. H was efficient in identifying economically superior genotypes for breeding and deployment, but because of the non-linear relationship with gross margin, calculation of a gross margin for genotypes with high H is also necessary when cultivars are deployed across more than 50% of the available area.
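The aggregate economic genotype described above is a linear index: H is the sum over traits of economic weight times breeding value. A minimal sketch follows; the trait names, weights, and breeding values are hypothetical placeholders, not the 12 traits or weights estimated in the study.

```python
# Aggregate economic genotype H = sum(economic weight * breeding value).
# Weights and values below are illustrative only.

economic_weights = {"fruit_weight": 0.8, "yield": 1.2, "firmness": 0.5}

def aggregate_economic_genotype(breeding_values: dict) -> float:
    """H: linear combination of breeding values and economic weights."""
    return sum(economic_weights[t] * bv
               for t, bv in breeding_values.items())

# Breeding values are expressed relative to a reference cultivar whose
# H is set to zero (the abstract uses 'Festival' as the baseline), so a
# positive H marks a genotype economically superior to the reference.
candidate = {"fruit_weight": 0.10, "yield": -0.05, "firmness": 0.20}
h_candidate = aggregate_economic_genotype(candidate)
```

Because H is linear in the trait weights, dropping a trait whose breeding values are highly correlated with another's barely changes the ranking, which is the basis for the abstract's observation that three of the 12 traits could be omitted with rank correlations of 0.98 or greater.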
Abstract:
Fusarium wilt of strawberry, incited by Fusarium oxysporum f. sp. fragariae (Fof), is a major disease of the cultivated strawberry (Fragaria ×ananassa) worldwide. An increase in disease outbreaks of the pathogen in Western Australia and Queensland plus the search for alternative disease management strategies place emphasis on the development of resistant cultivars. In response, a partial incomplete diallel cross involving four parents was performed for use in glasshouse resistance screenings. The resulting progeny were evaluated for their susceptibility to Fof. Best-performing progeny and suitability of progenies as parents were determined using data from disease severity ratings and analyzed using a linear mixed model incorporating a pedigree to produce best linear unbiased predictions of breeding values. Variation in disease response, ranging from highly susceptible to resistant, indicates a quantitative effect. The estimate of the narrow-sense heritability was 0.49 ± 0.04 (SE), suggesting the population should be responsive to phenotypic recurrent selection. Several progeny genotypes have predicted breeding values higher than any of the parents. Knowledge of Fof resistance derived from this study can help select best parents for future crosses for the development of new strawberry cultivars with Fof resistance.
Abstract:
Several species of Phyllosticta (syn. Guignardia) have been described from orchids worldwide. A new species, Phyllosticta speewahensis, is proposed for a specimen isolated from leaf spots on a hybrid Vanda orchid in northern Queensland, Australia. Phylogenetic analysis of the nrDNA internal transcribed spacer region (ITS) and partial translation elongation factor 1-alpha (TEF1) gene sequences showed that P. speewahensis is most closely related to P. hostae. The likelihood that orchids harbour further cryptic species of endophytic and pathogenic Phyllosticta species is discussed.
Abstract:
This study compared pregnancy rates (PRs) and costs per calf born after fixed-time artificial insemination (FTAI) or AI after estrus detection (i.e., estrus detection and AI, EDAI), before and after a single PGF2α treatment in Bos indicus (Brahman-cross) heifers. On Day 0, body weight, body condition score, and presence of a CL (46% of heifers) were determined. The heifers were then alternately allocated to one of two FTAI groups (FTAI-1, n = 139; FTAI-2, n = 141) or an EDAI group (n = 273). Heifers in the FTAI groups received an intravaginal progesterone-releasing device (IPRD; 0.78 g of progesterone) and 1 mg of estradiol benzoate intramuscularly (im) on Day 0. Eight days later, the IPRD was removed and heifers received 500 μg of PGF2α and 300 IU of eCG im; 24 hours later, they received 1 mg of estradiol benzoate im and were submitted to FTAI 30 to 34 hours after that (54 to 58 hours after IPRD removal). Heifers in the FTAI-2 group started treatment 8 days after those in the FTAI-1 group. Heifers in the EDAI group were inseminated approximately 12 hours after the detection of estrus between Days 4 and 9, at which time the heifers that had not been detected in estrus received 500 μg of PGF2α im, and EDAI continued until Day 13. Heifers in the FTAI groups had a higher overall PR (proportion pregnant of the entire group) than the EDAI group (34.6% vs. 23.2%; P = 0.003); however, the conception rate (PR of heifers submitted for AI) tended to favor the estrus detection group (34.6% vs. 44.1%; P = 0.059). The cost per AI calf born was estimated to be $267.67 and $291.37 for the FTAI and EDAI groups, respectively. It was concluded that in Brahman heifers typical of those annually mated in northern Australia, FTAI compared with EDAI increases the number of heifers pregnant and reduces the cost per calf born.
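The two rates the abstract distinguishes differ only in their denominator: the overall pregnancy rate divides by the entire group, while the conception rate divides by the heifers actually submitted for AI. A minimal sketch, with invented counts (the study's raw counts are not given in the abstract):

```python
# Overall pregnancy rate vs. conception rate: same numerator,
# different denominator. Under FTAI every heifer is inseminated, so
# the two rates coincide; under EDAI only heifers detected in estrus
# are inseminated, so the conception rate can exceed the overall PR.

def overall_pregnancy_rate(pregnant: int, group_size: int) -> float:
    """Proportion pregnant of the entire group."""
    return pregnant / group_size

def conception_rate(pregnant: int, submitted_for_ai: int) -> float:
    """Proportion pregnant of heifers actually submitted for AI."""
    return pregnant / submitted_for_ai

def cost_per_calf(total_program_cost: float, calves_born: int) -> float:
    """Program cost divided by AI calves born (costs are assumed here;
    the abstract reports $267.67 for FTAI and $291.37 for EDAI)."""
    return total_program_cost / calves_born

# Hypothetical EDAI example: 273 heifers, 143 detected and inseminated,
# 63 pregnant.
overall = overall_pregnancy_rate(63, 273)
conception = conception_rate(63, 143)
```

With these invented counts the conception rate exceeds the overall PR, mirroring the abstract's finding that EDAI had the better conception rate (44.1% vs. 34.6%) despite the lower overall PR (23.2% vs. 34.6%).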