11 results for 13077-023

in eResearch Archive - Queensland Department of Agriculture


Relevance:

10.00%

Publisher:

Abstract:

Nitrogen (N) is the largest agricultural input in many Australian cropping systems, and applying the right amount of N in the right place at the right physiological stage is a significant challenge for wheat growers. Optimizing N uptake could reduce input costs and minimize potential off-site movement. Since N uptake is dependent on soil and plant water status, ideally N should be applied only to areas within paddocks with sufficient plant available water. To quantify N and water stress, spectral and thermal crop stress detection methods were explored using hyperspectral, multispectral and thermal remote sensing data collected at a research field site in Victoria, Australia. Wheat was grown over two seasons with two levels of water input (rainfall/irrigation) and either four levels (in 2004: 0, 17, 39 and 163 kg N/ha) or two levels (in 2005: 0 and 39 kg N/ha) of nitrogen. The Canopy Chlorophyll Content Index (CCCI) and modified Spectral Ratio planar index (mSRpi), two indices designed to measure canopy-level N, were calculated from canopy-level hyperspectral data in 2005. They accounted for 76% and 74%, respectively, of the variability in crop N status just prior to stem elongation (Zadoks 24). The Normalised Difference Red Edge (NDRE) index and the CCCI, calculated from airborne multispectral imagery, accounted for 41% and 37% of the variability in crop N status, respectively. The greater scatter in the airborne data was attributable to the difference in scale between the ground and aerial measurements (i.e., small-area plant samples against whole-plot means from imagery). Nevertheless, the analysis demonstrated that canopy-level theory can be transferred to airborne data, which could ultimately be of more use to growers. Thermal imagery showed that mean plot temperatures of rainfed treatments were 2.7 °C warmer than those of irrigated treatments (P < 0.001) at full cover.
For partially vegetated fields, the two-dimensional Crop Water Stress Index (2D CWSI) was calculated using the Vegetation Index-Temperature (VIT) trapezoid method to reduce the contribution of soil background to image temperature. Results showed that rainfed plots were consistently more stressed than irrigated plots. Future work is needed to improve the ability of the CCCI and VIT methods to detect N and water stress, and to apply both indices simultaneously at the paddock scale to test whether N can be targeted based on water status. Use of these technologies has significant potential for maximising the spatial and temporal efficiency of N applications for wheat growers. ‘Ground-breaking Stuff’: Proceedings of the 13th Australian Society of Agronomy Conference, 10-14 September 2006, Perth, Western Australia.
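The arithmetic behind the indices named above (NDRE, the CCCI's planar-domain rescaling, and the CWSI) can be sketched as follows. This is a minimal illustration only, not the study's implementation: the band reflectances, the planar-domain boundary lines and the wet/dry reference temperatures are hypothetical values chosen to show the form of the calculations.

```python
# Minimal sketch of the spectral and thermal indices discussed above.
# All numeric inputs (reflectances, boundary lines, reference
# temperatures) are hypothetical; they are not values from the study.

def ndvi(nir, red):
    """Normalised Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def ndre(nir, red_edge):
    """Normalised Difference Red Edge index."""
    return (nir - red_edge) / (nir + red_edge)

def ccci(nir, red_edge, red, lower=(0.0, 0.2), upper=(0.0, 0.8)):
    """Canopy Chlorophyll Content Index: NDRE rescaled between assumed
    lower and upper boundary lines (intercept, slope) of the NDVI-NDRE
    planar domain, so 0 ~ minimal and 1 ~ maximal canopy chlorophyll."""
    v = ndvi(nir, red)
    ndre_min = lower[0] + lower[1] * v
    ndre_max = upper[0] + upper[1] * v
    return (ndre(nir, red_edge) - ndre_min) / (ndre_max - ndre_min)

def cwsi(t_canopy, t_wet, t_dry):
    """Crop Water Stress Index: canopy temperature rescaled between a
    non-stressed (wet) and a fully stressed (dry) reference; in the VIT
    trapezoid method the references are interpolated at each pixel's
    vegetation cover to reduce the soil-background contribution."""
    return (t_canopy - t_wet) / (t_dry - t_wet)

# Illustrative pixel: NIR, red-edge and red reflectances; canopy at 30 °C
print(round(ndre(0.45, 0.30), 3))         # 0.2
print(round(ccci(0.45, 0.30, 0.05), 3))
print(round(cwsi(30.0, 25.0, 40.0), 3))   # 0.333
```

All three indices share the same rescaling form, which is why canopy-level theory transfers to airborne data once the boundary conditions are re-estimated at the new scale.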

Relevance:

10.00%

Publisher:

Abstract:

Discarding in commercially exploited fisheries has received considerable attention in the last decade, though only more recently in Australia. The Reef Line fishery (RLF) of the Great Barrier Reef (GBR) in Australia is a large-scale, multi-sector, multi-species, highly regulated hook-and-line fishery with the potential for high levels of discarding. We used a range of data sources to estimate discard rates and discard quantities for the two main target groups of the RLF, the coral trout, Plectropomus spp., and the red throat emperor, Lethrinus miniatus, and investigated possible effects on discarding of recent changes in management of the fishery. Fleet-wide estimates of total annual quantities discarded from 1989 to 2003 were 292-622 t for coral trout and 33-95 t for red throat emperor. Hypothetical scenarios of high-grading after the introduction of a total allowable commercial catch for coral trout resulted in increases in discard quantities of up to 3895 t, while even with no high-grading an estimated 421 t were discarded. Increasing the minimum size limit of red throat emperor from 35 to 38 cm also increased discards, to an estimated 103 t. We provide spatially and temporally explicit estimates of discarding for the two most important species in the GBR RLF of Australia to demonstrate the importance of accounting for regional variation in the quantification of discarding. Effects of management changes on discarding are also highlighted. This study provides a template for exploring discarding levels for other species in the RLF and elsewhere.

Relevance:

10.00%

Publisher:

Abstract:

Haemophilus parasuis is the causative agent of Glässer's disease. To date, 15 serovars of H. parasuis have been identified, with significant differences in virulence between serovars. In this study, suppression subtractive hybridization (SSH) was used to identify genetic differences between Nagasaki (H. parasuis serovar 5 reference strain, highly virulent) and SW114 (H. parasuis serovar 3 reference strain, non-virulent). A total of 191 clones were obtained from the SSH library. Using dot hybridization and PCR, 15 clones were identified as containing fragments present in the Nagasaki genome but absent from the SW114 genome. Among these 15 fragments, three (ssh1, ssh13, ssh15) encode cell surface-associated components; three (ssh2, ssh5, ssh9) are associated with metabolism and stress response; one (ssh8) is involved in the assembly of fimbriae; and one (ssh6) encodes a phage phi-105 ORF25-like protein. The remaining seven fragments encode hypothetical proteins or proteins of unknown function. Based on PCR analysis of the 15 serovar reference strains, eight fragments (ssh1, ssh2, ssh3, ssh6, ssh8, ssh10, ssh11 and ssh12) were present in three to five of the highly virulent serovars (1, 5, 10, 12, 13 and 14) and in zero to two of the moderately virulent serovars (2, 4 and 15), but were absent from the low-virulence serovar (8) and the non-virulent serovars (3, 6, 7, 9 and 11). In vivo transcription of fragments ssh1, ssh2, ssh8 and ssh12 was identified by RT-PCR in total RNA extracted from the lungs of experimentally infected pigs. This study has provided some evidence of genetic differences between H. parasuis strains of different virulence.

Relevance:

10.00%

Publisher:

Abstract:

Increased sediment and nutrient losses resulting from unsustainable grazing management in the Burdekin River catchment are major threats to water quality in the Great Barrier Reef Lagoon. To test the effects of grazing management on soil and nutrient loss, five 1 ha mini-catchments were established in 1999 under different grazing strategies on a sedimentary landscape near Charters Towers. Reference samples were also collected from watercourses in the Burdekin catchment during major flow events.

Soil and nutrient loss were relatively low across all grazing strategies due to a combination of good cover, low slope and low rainfall intensities. Total soil loss varied from 3 to 20 kg ha⁻¹ per event, while losses of N and P ranged from 10 to 1900 g ha⁻¹ and from 1 to 71 g ha⁻¹ per event, respectively. Water quality of runoff was considered moderate across all strategies, with relatively low levels of total suspended sediment (range: 8-1409 mg l⁻¹), total N (range: 101-4000 µg l⁻¹) and total P (range: 14-609 µg l⁻¹). However, treatment differences are likely to emerge with time as the impacts of the different grazing strategies on land condition become more apparent.

Samples collected opportunistically from rivers and creeks during flow events displayed significantly higher levels of total suspended sediment (range: 10-6010 mg l⁻¹), total N (range: 650-6350 µg l⁻¹) and total P (range: 50-1500 µg l⁻¹) than those collected at the grazing trial. These differences can largely be attributed to variation in slope, geology and cover between the grazing trial and the different catchments. In particular, watercourses draining hillier granodiorite landscapes with low cover had markedly higher sediment and nutrient loads than those draining flatter sedimentary landscapes.

These preliminary data suggest that on relatively flat sedimentary landscapes, extensive cattle grazing is compatible with achieving water quality targets, provided high levels of ground cover are maintained. In contrast, sediment and nutrient loss under grazing on more erodible land types is cause for serious concern. Long-term empirical research and monitoring will be essential to quantify the impacts of changed land management on water quality in the spatially and temporally variable Burdekin River catchment.

Relevance:

10.00%

Publisher:

Abstract:

Veterinarians have few tools to predict the rate of disease progression in FIV-infected cats. In contrast, in HIV infection, plasma viral RNA load and acute phase protein concentrations are commonly used as predictors of disease progression. This study evaluated these predictors in cats naturally infected with FIV. In older cats (>5 years), log10 FIV RNA load was higher in the terminal stages of disease compared to the asymptomatic stage. There was a significant association between log10 FIV RNA load and both log10 serum amyloid A concentration and age in unwell FIV-infected cats. This study suggests that viral RNA load and serum amyloid A warrant further investigation as predictors of disease status and prognosis in FIV-infected cats.

Relevance:

10.00%

Publisher:

Abstract:

This paper details Australian research that developed tools to assist fisheries managers and government agencies in engaging with the social dimension of industry and community welfare in fisheries management. These tools take the form of objectives and indicators. They highlight the social dimensions and effects of management plans and policy implementation on fishing industries and associated communities, while also taking into account the primacy of ecological imperatives. The deployment of these objectives and indicators initially provides a benchmark and, over the life of a management plan, can subsequently be used to identify trends in effects on a variety of social and economic elements that may be objectives in the management of a fishery. It is acknowledged that the degree to which factors can be monitored will depend on the resources of management agencies; however, these frameworks provide a method for effectively monitoring and measuring change in the social dimension of fisheries management.

Essentially, the work discussed in this paper provides fisheries management with the means both to track and to begin to understand the effects of government policy and management plans on the social dimension of the fishing industry and its associated communities. Such tools allow these elements to be brought, within an evidence base, into policy arrangements, and consequently make an invaluable contribution to the ability to address the resilience and sustainability of fishing industries and associated communities.

Relevance:

10.00%

Publisher:

Abstract:

RNA silencing in plants and insects provides an antiviral defense, and as a countermeasure most viruses encode RNA silencing suppressors (RSS). For the family Rhabdoviridae, no detailed functional RSS studies have been reported in either plant hosts or insect vectors. In agroinfiltrated Nicotiana benthamiana leaves, we show for the first time for a cytorhabdovirus, lettuce necrotic yellows virus (LNYV), that one of the nucleocapsid core proteins, the phosphoprotein (P), has relatively weak local RSS activity and delays systemic silencing of a GFP reporter. Analysis of GFP small RNAs indicated that the P protein did not prevent siRNA accumulation. To explore RSS activity in insects, we used a Flock House virus replicon system in Drosophila S2 cells. In contrast to the plant host, the LNYV P protein did not exhibit RSS activity in the insect cells. Taken together, our results suggest that the P protein may target plant-specific components of RNA silencing downstream of siRNA biogenesis.

Relevance:

10.00%

Publisher:

Abstract:

Divergent genetic selection for wool growth as a single trait has led to major changes in sheep physiology and metabolism, including variations in rumen microbial protein production and uptake of α-amino nitrogen in portal blood. This study was conducted to determine whether sheep with different genetic merit for wool growth exhibit distinct rumen bacterial diversity. Eighteen Merino wethers were separated into groups of contrasting genetic merit for clean fleece weight (CFW; low: WG− and high: WG+) and fed a blended oaten and lucerne chaff diet at two levels of intake (LOI; 1 or 1.5 times maintenance energy requirements) for two seven-week periods in a crossover design. Bacterial diversity in rumen fluid collected by esophageal intubation was characterized using 454 amplicon pyrosequencing of the V3/V4 regions of the 16S rRNA gene. Bacterial diversity estimated by phylogenetic distance, Chao1 and observed species did not differ significantly with CFW or LOI; however, the Shannon diversity index differed (P=0.04) between WG+ (7.67) and WG− sheep (8.02). WG+ animals had a higher (P=0.03) proportion of Bacteroidetes (71.9% vs 66.5%) and a lower (P=0.04) proportion of Firmicutes (26.6% vs 31.6%) than WG− animals. Twenty-four operational taxonomic units (OTUs), belonging to the Firmicutes and Bacteroidetes phyla, were shared among all the samples, whereas specific OTUs varied significantly in presence/abundance (P<0.05) between wool genotypes, and 50 varied (P<0.05) with LOI. It appears that genetic selection for fleece weight is associated with differences in rumen bacterial diversity that persist across different feeding levels. Moderate correlations were found between seven continuous traits, such as methane production and microbial protein production, and the presence and abundance of 17 OTUs, indicating scope for targeted modification of the microbiome to improve the energetic efficiency of rumen microbial synthesis and reduce the greenhouse gas footprint of ruminants.
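The diversity measures quoted above (the Shannon index and the phylum proportions) can be illustrated from a toy OTU count table. The OTU names, counts and phylum assignments below are hypothetical, and the logarithm base for the Shannon index is an assumption (natural log here; the base used in the study is not stated in the abstract).

```python
# Illustrative calculation of the diversity measures quoted above from
# a hypothetical OTU count table (not study data).
import math

def shannon(counts):
    """Shannon diversity index H = -sum(p_i * ln p_i) over nonzero OTU counts."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def phylum_proportions(otu_counts, otu_to_phylum):
    """Fraction of reads per phylum, the quantity compared between
    WG+ and WG- sheep above (e.g. Bacteroidetes vs Firmicutes)."""
    totals = {}
    for otu, count in otu_counts.items():
        phylum = otu_to_phylum[otu]
        totals[phylum] = totals.get(phylum, 0) + count
    grand_total = sum(totals.values())
    return {p: c / grand_total for p, c in totals.items()}

# Hypothetical three-OTU sample
counts = {"OTU_1": 50, "OTU_2": 30, "OTU_3": 20}
phyla = {"OTU_1": "Bacteroidetes", "OTU_2": "Bacteroidetes", "OTU_3": "Firmicutes"}
print(round(shannon(counts.values()), 3))   # 1.03
print(phylum_proportions(counts, phyla))
```

With hundreds of OTUs, as in a pyrosequencing dataset, the same formula yields values in the range reported above (7-8), and a lower H for WG+ sheep indicates a less even community.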

Relevance:

10.00%

Publisher:

Abstract:

Prescribed fire is one of the most widely used management tools for reducing fuel loads in managed forests. However, the long-term effects of repeated prescribed fires on soil carbon (C) and nitrogen (N) pools are poorly understood. This study aimed to investigate how different fire frequency regimes influence C and N pools in surface soils (0–10 cm). A prescribed fire field experiment established in 1972 in a wet sclerophyll forest in southeast Queensland was used in this study. The fire frequency regimes comprised long unburnt (NB), burnt every 2 years (2yrB) and burnt every 4 years (4yrB) treatments, with four replications. Compared with the NB treatment, the 2yrB treatment lowered soil total C by 44%, total N by 54%, HCl-hydrolysable C and N by 48% and 59%, KMnO4-oxidizable C by 81%, microbial biomass C and N by 42% and 33%, cumulative CO2-C by 28%, NaOCl-non-oxidizable C and N by 41% and 51%, and charcoal-C by 17%, respectively. The 4yrB and NB treatments showed no significant differences in these soil C and N pools. All soil labile, biologically active, recalcitrant and total C and N pools were positively correlated with each other and with soil moisture content, but negatively correlated with soil pH. The C:N ratios of the different C and N pools were greater in the burned treatments than in the NB treatment. This study highlights that prescribed burning at a four-year interval is a more sustainable management practice for this subtropical forest ecosystem.