34 results for 6-55
in eResearch Archive - Queensland Department of Agriculture
Abstract:
The major cuticular hydrocarbons from the cane beetle species Antitrogus parvulus were deduced to be 4,6,8,10,16,18-hexa- and 4,6,8,10,16-pentamethyldocosanes 2 and 3, respectively. Isomers of 2,4,6,8-tetramethylundecanal 27, 36, and 37, derived from 2,4,6-trimethylphenol, were coupled with the phosphoranes 28 and 29 to furnish alkenes and, by reduction, diastereomers of 2 and 3. Chromatographic and spectroscopic comparisons confirmed 2 as either 6a or 6b and 3 as either 34a or 34b.
Abstract:
The seed-feeding jewel bug, Agonosoma trilineatum (F.), is an introduced biological control agent for bellyache bush, Jatropha gossypiifolia L. To quantify the damage potential of this agent, shadehouse experiments were conducted with individual bellyache bush plants exposed to a range of jewel bug densities (0, 6 or 24 jewel bugs/plant). The level of abortion of both immature and mature seed capsules and impacts on seed weight and seed viability were recorded in an initial short-term study. The ability of the jewel bug to survive and cause sustained damage was then investigated by measuring seed production, the survival of adults and nymph density across three 6-month cycles. The level of seed capsule abortion caused by the jewel bug was significantly affected by the maturity status of capsules and the density of insects present. Immature capsules were most susceptible and capsule abortion increased with jewel bug density. Similarly, on average, the insects reduced the viability of bellyache bush seeds by 79% and 89% at low and high densities, respectively. However, sustaining jewel bug populations for prolonged periods proved difficult. Adult survival at the end of three 6-month cycles averaged 11% and associated reductions in viable seed production ranged between 55% and 77%. These results suggest that the jewel bug has the potential to reduce the number of viable seeds entering the soil seed bank provided populations can be established and maintained at sufficiently high densities.
Abstract:
Background: Cultivated peanut or groundnut (Arachis hypogaea L.) is the fourth most important oilseed crop in the world, grown mainly in tropical, subtropical and warm temperate climates. Due to its origin through a single and recent polyploidization event, followed by successive selection during breeding efforts, cultivated groundnut has a limited genetic background. In such species, microsatellite or simple sequence repeat (SSR) markers are very informative and useful for breeding applications. The low level of polymorphism in cultivated germplasm, however, warrants a larger number of polymorphic microsatellite markers for cultivated groundnut. Results: A microsatellite-enriched library was constructed from the genotype TMV2. Sequencing of 720 putative SSR-positive clones from a total of 3,072 provided 490 SSRs. Of these SSRs, 71.2% were perfect type, 13.1% were imperfect and 15.7% were compound. Among these SSRs, the GT/CA repeat motifs were the most common (37.6%), followed by GA/CT repeat motifs (25.9%). Primer pairs could be designed for a total of 170 SSRs and were optimized initially on two genotypes. 104 (61.2%) primer pairs yielded scorable amplicons and 46 (44.2%) primers showed polymorphism among 32 cultivated groundnut genotypes. The polymorphic SSR markers detected 2 to 5 alleles with an average of 2.44 per locus. The polymorphic information content (PIC) value for these markers varied from 0.12 to 0.75 with an average of 0.46. Based on 112 alleles obtained by the 46 markers, a phenogram was constructed to understand the relationships among the 32 genotypes. The majority of the genotypes representing subspecies hypogaea were grouped together in one cluster, while the genotypes belonging to subspecies fastigiata were grouped mainly under two clusters. Conclusion: The newly developed set of 104 markers extends the repertoire of SSR markers for cultivated groundnut.
These markers showed good PIC values in cultivated germplasm and would therefore be very useful for germplasm analysis, linkage mapping, diversity studies and phylogenetic relationships in cultivated groundnut as well as related Arachis species.
Abstract:
A laboratory study was undertaken to determine the persistence and efficacy of spinosad against Rhyzopertha dominica (F.) in wheat stored for 9 months at 30 degrees C and 55 and 70% relative humidity. The aim was to investigate the potential of spinosad for protecting wheat from R. dominica during long-term storage in warm climates. Wheat was treated with spinosad at 0.1, 0.5 and 1 mg kg(-1) grain and sampled after 0, 1.5, 3, 4.5, 6, 7.5 and 9 months of storage for bioassays and residue analyses. Residues were estimated to have declined by 30% during 9 months of storage at 30 degrees C, and there was no effect of relative humidity. Spinosad applied at 0.5 or 1 mg kg(-1) was completely effective for 9 months, with 100% adult mortality after 14 days of exposure and no live F1 adults produced. Adult mortality was < 100% in some samples of wheat treated with 0.1 mg kg(-1) of spinosad, and live progeny were produced in all samples treated at this level. The results show that spinosad is likely to be an effective grain protectant against R. dominica in wheat stored in warm climates.
Abstract:
Araucaria cunninghamii (hoop pine) typically occurs as an emergent tree over subtropical and tropical rainforests, in a discontinuous distribution that extends from West Irian Jaya at about 0°30'S, through the highlands of Indonesian New Guinea and Papua New Guinea, along the east coast of Australia from 11°39'S in Queensland to 30°35'S in northern New South Wales. Plantations established in Queensland since the 1920s now total about 44 000 ha, and constitute the primary source for the continuing supply of hoop pine quality timber and pulpwood, with a sustainable harvest exceeding 440 000 m^3 y^-1. Establishment of these managed plantations allowed logging of all native forests of Araucaria species (hoop pine and bunya pine, A. bidwillii) on state-owned lands to cease in the late 1980s, and the preservation of large areas of araucarian forest types within a system of state-owned and managed reserves. The successful plantation program with this species has been strongly supported by genetic improvement activities since the late 1940s - through knowledge of provenance variation and reproductive biology, the provision of reliable sources of improved seed, and the capture of substantial genetic gains in traits of economic importance (for example growth, stem straightness, internode length and spiral grain). As such, hoop pine is one of the few tropical tree species that, for more than half a century, has been the subject of continuous genetic improvement. The history of commercialisation and genetic improvement of hoop pine provides an excellent example of the dual economic and conservation benefits that may be obtained in tropical tree species through the integration of gene conservation and genetic improvement with commercial plantation development.
This paper outlines the natural distribution and reproductive biology of hoop pine, describes the major achievements of the genetic improvement program in Queensland over the past 50+ y, summarises current understanding of the genetic variation and control of key selection traits, and outlines the means by which genetic diversity in the species is being conserved.
Abstract:
Because weed eradication programs commonly take 10 or more years to complete, there is a need to evaluate progress toward the eradication objective. We present a simple model, based on information that is readily obtainable, that assesses conformity to the delimitation and extirpation criteria for eradication. It is applied to the program currently targeting the annual parasitic weed, branched broomrape, in South Australia. The model consists of delimitation and extirpation (E) measures plotted against each other to form an 'eradograph.' Deviations from the 'ideal' eradograph plot can inform tactical responses, e.g., increases in survey and/or control effort. Infestations progress from the active phase to the monitoring phase when no plants have been detected for at least 12 mo. They revert to the active phase upon further detection of plants. We summarize this process for the invasion as a whole in a state-and-transition model. Using this model we demonstrate that the invasion is unlikely to be delimited unless the amount of newly detected infested area decreases, on average, by at least 50% per annum. As a result of control activities implemented, on average approximately 70% (range, 44 to 86%) of active infestations progressed to the monitoring phase in the year following their detection. Simulations suggest that increasing this rate of transition will not increase E to a significant extent. The rate of reversion of infestations from the monitoring phase to the active phase decreased logarithmically with time since last detection, but it is likely that lower rates of reversion would accelerate the trend toward extirpation. Program performance with respect to the delimitation criterion has been variable; performance with respect to the extirpation criterion would be improved considerably by the development and application of cost-effective methods for eliminating branched broomrape soil seed populations.
Abstract:
Taro (Colocasia esculenta) accessions were collected from 15 provinces of Papua New Guinea (PNG). The collection, totalling 859 accessions, was collated for characterization, and a core collection of 81 accessions (10%) was established on the basis of characterization data generated on 30 agro-morphological descriptors and DNA fingerprinting using seven SSR primers. The selection of accessions was based on cluster analysis of the morphological data, enabling initial selection of 20% of accessions. The 20% sample was then reduced and rationalized to 10% based on molecular data generated by SSR primers. This represents the first national core collection of any species established in PNG based on molecular markers. The core has been integrated with cores from other Pacific Island countries, contributing to a Pacific regional core collection, which is conserved in vitro in the South Pacific Regional Germplasm Centre in Fiji. The core collection is a valuable resource for food security of the South Pacific region and is currently being utilized by the breeding programmes of small Pacific Island countries to broaden the genetic base of the crop.
Abstract:
BACKGROUND: Field studies of diuron and its metabolites 3-(3,4-dichlorophenyl)-1-methylurea (DCPMU), 3,4-dichlorophenylurea (DCPU) and 3,4-dichloroaniline (DCA) were conducted in a farm soil and in stream sediments in coastal Queensland, Australia. RESULTS: During a 38 week period after a 1.6 kg ha^-1 diuron application, 70-100% of detected compounds were within 0-15 cm of the farm soil, and 3-10% reached the 30-45 cm depth. First-order degradation half-lives (t1/2) averaged 49 ± 0.9 days for the 0-15, 0-30 and 0-45 cm soil depths. Farm runoff was collected in the first 13-50 min of episodes lasting 55-90 min. Average concentrations of diuron, DCPU and DCPMU in runoff were 93, 30 and 83-825 µg L^-1 respectively. Their total loading in all runoff was >0.6% of applied diuron. Diuron and DCPMU concentrations in stream sediments were between 3-22 and 4-31 µg kg^-1 soil respectively. The DCPMU/diuron sediment ratio was >1. CONCLUSION: Retention of diuron and its metabolites in farm topsoil indicated their negligible potential for groundwater contamination. Minimal amounts of diuron and DCPMU escaped in farm runoff. This may nonetheless entail a significant loading into the wider environment at annual rates of application. The concentrations and ratio of diuron and DCPMU in stream sediments indicated that they had prolonged residence times and potential for accumulation in sediments. The higher ecotoxicity of DCPMU compared with diuron and the combined presence of both compounds in stream sediments suggest that together they would have a greater impact on sensitive aquatic species than as currently apportioned by assessments that are based upon diuron alone.
Abstract:
Reduced supplies of nitrogen (N) in many soils of southern Queensland that were cropped exhaustively with cereals over many decades have been the focus of much research to avoid declines in profitability and sustainability of farming systems. A 45-month period of mixed grass (purple pigeon grass, Setaria incrassata Stapf; Rhodes grass, Chloris gayana Kunth.) and legume (lucerne, Medicago sativa L.; annual medics, M. scutellata (L.) Mill. and M. truncatula Gaertn.) pasture was one of several options compared on a fertility-depleted Vertosol at Warra, southern Queensland, to improve grain yields or increase grain protein concentration of subsequent wheat crops. Objectives of the study were to measure the productivity of a mixed grass and legume pasture grown over 45 months (cut and removed over 36 months) and its effects on yield and protein concentrations of the following wheat crops. Pasture production (DM t/ha) and aboveground plant N yield (kg/ha) for grass, legume (including a small amount of weeds) and total components of pasture responded linearly to total rainfall over the duration of each of 3 pastures sown in 1986, 1987 and 1988. Averaged over the 3 pastures, each 100 mm of rainfall resulted in 0.52 t/ha of grass, 0.44 t/ha of legume and 0.97 t/ha of total pasture DM, there being little variation between the 3 pastures. Aboveground plant N yield of the 3 pastures ranged from 17.2 to 20.5 kg/ha per 100 mm rainfall. Aboveground legume N in response to total rainfall was similar (10.6-13.2 kg/ha per 100 mm rainfall) across the 3 pastures in spite of very different populations of legumes and grasses at establishment. Aboveground grass N yield was 5.2-7.0 kg/ha per 100 mm rainfall. In most wheat crops following pasture, wheat yields were similar to those of unfertilised wheat except in 1990 and 1994, when grain yields were significantly higher but similar to those for continuous wheat fertilised with 75 kg N/ha.
In contrast, grain protein concentrations of most wheat crops following pasture responded positively, being substantially higher than those of unfertilised wheat but similar to those of wheat fertilised with 75 kg N/ha. Grain protein averaged over all years of assay was increased by 25-40% compared with that of unfertilised wheat. Stored water supplies after pasture were < 134 mm (< 55% of plant available water capacity); for most assay crops water storages were 67-110 mm, an equivalent wet soil depth of only 0.3-0.45 m. Thus, the crop assays of pasture benefits were limited by low water supply to the wheat crops. Moreover, the severity of common root rot in wheat crops was not reduced by the pasture-wheat rotation.
Abstract:
In the wheatbelt of eastern Australia, rainfall shifts from winter dominated in the south (South Australia, Victoria) to summer dominated in the north (northern New South Wales, southern Queensland). The seasonality of rainfall, together with frost risk, drives the choice of cultivar and sowing date, resulting in a flowering time between October in the south and August in the north. In eastern Australia, crops are therefore exposed to contrasting climatic conditions during the critical period around flowering, which may affect yield potential, and the efficiency in the use of water (WUE) and radiation (RUE). In this work we analysed empirical and simulated data, to identify key climatic drivers of potential water- and radiation-use efficiency, derive a simple climatic index of environmental potentiality, and provide an example of how a simple climatic index could be used to quantify the spatial and temporal variability in resource-use efficiency and potential yield in eastern Australia. Around anthesis, from Horsham to Emerald, median vapour pressure deficit (VPD) increased from 0.92 to 1.28 kPa, average temperature increased from 12.9 to 15.2°C, and the fraction of diffuse radiation (FDR) decreased from 0.61 to 0.41. These spatial gradients in climatic drivers accounted for significant gradients in modelled efficiencies: median transpiration WUE (WUEB/T) increased southwards at a rate of 2.6% per degree latitude and median RUE increased southwards at a rate of 1.1% per degree latitude. Modelled and empirical data confirmed previously established relationships between WUEB/T and VPD, and between RUE and photosynthetically active radiation (PAR) and FDR. Our analysis also revealed a non-causal inverse relationship between VPD and radiation-use efficiency, and a previously unnoticed causal positive relationship between FDR and water-use efficiency. 
Grain yield (range 1-7 t/ha) measured in field experiments across South Australia, New South Wales, and Queensland (n = 55) was unrelated to the photothermal quotient (Pq = PAR/T) around anthesis, but was significantly associated (r^2 = 0.41, P < 0.0001) with a newly developed climatic index: a normalised photothermal quotient (NPq = Pq · FDR/VPD). This highlights the importance of diffuse radiation and vapour pressure deficit as sources of variation in yield in eastern Australia. Specific experiments designed to uncouple VPD and FDR, and more mechanistic crop models, might be required to further disentangle the relationships between efficiencies and climate drivers.
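The normalised photothermal quotient defined in this abstract follows directly from its formula, NPq = Pq · FDR/VPD with Pq = PAR/T. The sketch below is a minimal illustration of the index; the input values are placeholders loosely based on the ranges quoted in the abstract, not data from the study:

```python
def photothermal_quotient(par, temp):
    """Pq = PAR / T (photosynthetically active radiation over mean temperature)."""
    return par / temp

def normalised_photothermal_quotient(par, temp, fdr, vpd):
    """NPq = Pq * FDR / VPD, weighting Pq by the fraction of diffuse
    radiation (FDR) and normalising by vapour pressure deficit (VPD, kPa)."""
    return photothermal_quotient(par, temp) * fdr / vpd

# Illustrative values only: PAR = 10 MJ m^-2 d^-1 (assumed), with the
# Horsham-end medians quoted in the abstract (T = 12.9 C, FDR = 0.61,
# VPD = 0.92 kPa).
npq = normalised_photothermal_quotient(10.0, 12.9, 0.61, 0.92)
```

A higher FDR or lower VPD around anthesis raises NPq, consistent with the abstract's finding that diffuse radiation and vapour pressure deficit drive yield variation.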
Abstract:
Herpesviral haematopoietic necrosis is a disease of goldfish, Carassius auratus, caused by Cyprinid herpesvirus-2 (CyHV-2) infection. Quantitative PCR was carried out on tissue homogenates from healthy goldfish fingerlings, broodfish, eggs and fry sampled directly from commercial farms, from moribund fish submitted to our laboratory for disease diagnosis, and on naturally infected CyHV-2 carriers subjected to experimental stress treatments. Healthy fish from 14 of 18 farms were positive, with copy numbers ranging from tens to 10^7 copies µg^-1 DNA extracted from infected fish. Of 118 pools of broodfish tested, 42 were positive. CyHV-2 was detected in one lot of fry produced from disinfected eggs. Testing of moribund goldfish, in which we could not detect any other pathogens, produced 12 of 30 cases with 10^6-10^8 copies of CyHV-2 µg^-1 DNA extracted. Subjecting healthy CyHV-2 carriers to cold shock (22 to 10 degrees C), but not heat, ammonia or high pH, increased viral copy numbers from a mean (± SE) of 7.3 ± 11 to 394 ± 55 copies µg^-1 DNA extracted after 24 h. CyHV-2 is widespread on commercial goldfish farms and outbreaks apparently occur when healthy carriers are subjected to a sharp temperature drop followed by holding at the permissive temperature for the disease.
Abstract:
An assessment of the relative influences of management and environment on the composition of floodplain grasslands of north-western New South Wales was made using a regional vegetation survey sampling a range of land tenures (e.g. private property, travelling stock routes and nature reserves). A total of 364 taxa belonging to 55 different plant families was recorded. Partitioning of variance with redundancy analysis determined that environmental variables accounted for a greater proportion (61.3%) of the explained variance in species composition than disturbance-related variables (37.6%). Soil type (and fertility), sampling time and rainfall had a strong influence on species composition and there were also east-west variations in composition across the region. Of the disturbance-related variables, cultivation, stocking rate and flooding frequency were all influential. Total, native, forb, shrub and subshrub richness were positively correlated with increasing time since cultivation. Flood frequency was positively correlated with graminoid species richness and was negatively correlated with total and forb species richness. Site species richness was also influenced by environmental variables (e.g. soil type and rainfall). Despite the resilience of these grasslands, some forms of severe disturbance (e.g. several years of cultivation) can result in removal of some dominant perennial grasses (e.g. Astrebla spp.) and an increase in disturbance specialists. A simple heuristic transitional model is proposed that has conceptual thresholds for plant biodiversity status. This knowledge representation may be used to assist in the management of these grasslands by defining four broad levels of community richness and the drivers that change this status.
Abstract:
Seed persistence is poorly quantified for invasive plants of subtropical and tropical environments and Lantana camara, one of the world's worst weeds, is no exception. We investigated germination, seedling emergence, and seed survival of two lantana biotypes (pink and pink-edged red [PER]) in southeastern Queensland, Australia. Controlled experiments were undertaken in 2002 and repeated in 2004, with treatments comprising two differing environmental regimes (irrigated and natural rainfall) and sowing depths (0 and 2 cm). Seed survival and seedling emergence were significantly affected by all factors (time, biotype, environment, sowing depth, and cohort) (P < 0.001). Seed dormancy varied with treatment (environment, sowing depth, biotype, and cohort) (P < 0.001), but declined rapidly after 6 mo. Significant differential responses by the two biotypes to sowing depth and environment were detected for both seed survival and seedling emergence (P < 0.001). Seed mass was consistently lower in the PER biotype at the population level (P < 0.001), but this variation did not adequately explain the differential responses. Moreover, under natural rainfall the magnitude of the biotype effect was unlikely to result in ecologically significant differences. Seed survival after 36 mo under natural rainfall ranged from 6.8 to 21.3%. Best-fit regression analysis of the decline in seed survival over time yielded a five-parameter exponential decay model with a lower asymptote approaching −0.38 (% seed survival = [(55 − (−0.38)) · e^(−k·t)] + (−0.38); R^2 = 88.5%; 9 df). Environmental conditions and burial affected the slope parameter or k value significantly (P < 0.01). Seed survival projections from the model were greatest for buried seeds under natural rainfall (11 yr) and least under irrigation (3 yr). Experimental data and model projections suggest that lantana has a persistent seed bank and this should be considered in management programs, particularly those aimed at eradication.
Abstract:
The response of soybean (Glycine max) and dry bean (Phaseolus vulgaris) to feeding by Helicoverpa armigera during the pod-fill stage was studied in irrigated field cages over three seasons to determine the relationship between larval density and yield loss, and to develop economic injury levels. H. armigera intensity was calculated in Helicoverpa injury equivalent (HIE) units, where 1 HIE was the consumption of one larva from the start of the infestation period to pupation. In the dry bean experiment, yield loss occurred at a rate of 6.00 ± 1.29 g/HIE, while the rates of loss in the three soybean experiments were 4.39 ± 0.96 g/HIE, 3.70 ± 1.21 g/HIE and 2.12 ± 0.71 g/HIE. These three slopes were not statistically different (P > 0.05) and the pooled estimate of the rate of yield loss was 3.21 ± 0.55 g/HIE. The first soybean experiment also showed a split-line form of damage curve, with a rate of yield loss of 26.27 ± 2.92 g/HIE beyond 8.0 HIE and a rapid decline to zero yield. In dry bean, H. armigera feeding reduced total and undamaged pod numbers by 4.10 ± 1.18 pods/HIE and 12.88 ± 1.57 pods/HIE respectively, while undamaged seed numbers were reduced by 35.64 ± 7.25 seeds/HIE. In soybean, total pod numbers were not affected by H. armigera infestation (out to 8.23 HIE in Experiment 1) but seed numbers (in Experiments 1 and 2) and the number of seeds/pod (in all experiments) were adversely affected. Seed size increased with increases in H. armigera density in two of the three soybean experiments, indicating plant compensatory responses to H. armigera feeding. Analysis of canopy pod profiles indicated that loss of pods occurred from the top of the plant downwards, but with an increase in pod numbers close to the ground at higher pest densities as the plant attempted to compensate for damage. Based on these results, the economic injury levels for H. armigera on dry bean and soybean are approximately 0.74 and 2.31 HIE/m^2, respectively (0.67 and 2.1 HIE/row-m for 91 cm rows).
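The conversion between the per-square-metre and per-row-metre economic injury levels quoted above can be checked with a minimal sketch. The multiply-by-row-spacing interpretation (0.91 m rows) is our assumption, inferred from the paired figures in the abstract:

```python
def eil_per_row_metre(eil_per_m2, row_spacing_m=0.91):
    """Convert an economic injury level from HIE per m^2 to HIE per
    row-metre: one metre of row 'owns' a strip row_spacing_m wide."""
    return eil_per_m2 * row_spacing_m

dry_bean = eil_per_row_metre(0.74)  # ~0.67 HIE/row-m, matching the abstract
soybean = eil_per_row_metre(2.31)   # ~2.1 HIE/row-m, matching the abstract
```

The two converted values reproduce the parenthetical figures in the abstract to two significant figures, which supports the assumed interpretation.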
Abstract:
Sorghum ergot produces dihydroergosine (DHES) and related alkaloids, which cause hyperthermia in cattle. Proportions of infected panicles (grain heads), leaves and stems were determined in two forage sorghum crops extensively infected 2 to 4 weeks prior to sampling, and the panicles were assayed for DHES. Composite samples from each crop, plus a third crop of a grain variety, were coarsely chopped and half of each was sealed in plastic buckets for 6 weeks to simulate ensilation. The worst-infected panicles contained up to 55 mg DHES/kg, but dilution reduced average concentrations of DHES in the crops to approximately 1 mg/kg, a relatively safe level for cattle. Ensilation significantly (P = 0.043) reduced mean DHES concentrations from 0.85 to 0.46 mg/kg.