16 results for Abrasion loss
in eResearch Archive - Queensland Department of Agriculture
Abstract:
The hypothesis that contaminant plants growing amongst chickpea serve as Helicoverpa sinks by diverting oviposition pressure away from the main crop was tested under field conditions. Gain (recruitment) and loss (presumed mortality) of juvenile stages of Helicoverpa spp. on contaminant faba bean and wheat plants growing in chickpea plots were quantified on a daily basis over a 12-d period. The possibility of posteclosion movement of larvae from the contaminants to the surrounding chickpea crop was examined. Estimated total loss of the census population varied from 80 to 84% across plots and rows. The loss of brown eggs (40–47%) contributed most to the overall loss estimate, followed by loss of white eggs (27–35%) and larvae (6–9%). The cumulative number of individuals entering the white and brown egg and larval stages over the census period ranged from 15 to 58, 10–48 and 1–6 per m row, respectively. The corresponding estimates of mean stage-specific loss, expressed as a percentage of individuals entering the stage, ranged from 52 to 57% for white eggs, 87–108% for brown eggs and 71–87% for first-instar larvae. Mean larval density on chickpea plants in close proximity to the contaminant plants did not exceed the baseline larval density on chickpea further away from the contaminants across rows and plots. The results support the hypothesis that contaminant plants in chickpea plots serve as Helicoverpa sinks by diverting egg pressure from the main crop and elevating mortality of juvenile stages. Deliberate contamination of chickpea crops with other plant species merits further investigation as a cultural pest management strategy for Helicoverpa spp.
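The stage-specific loss statistic used above is simply the loss within a stage expressed as a percentage of the individuals entering that stage; a minimal sketch with invented counts (not the study's data):

```python
def stage_specific_loss(entering, lost):
    """Loss within a stage as a percentage of individuals entering it.

    Estimates can exceed 100% (as reported for brown eggs, 87-108%)
    when recruitment into the stage is underestimated relative to the
    observed losses.
    """
    if entering <= 0:
        raise ValueError("need a positive number entering the stage")
    return 100.0 * lost / entering

# Invented counts per m of row, for illustration only:
print(stage_specific_loss(entering=40, lost=22))  # 55.0, within the 52-57% white-egg range
```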
Abstract:
The loss and recovery of intertidal seagrass meadows were assessed following the flood-related catastrophic loss of seagrass meadows in February 1999 in the Great Sandy Strait, Queensland. Region-wide recovery rates of intertidal meadows following the catastrophic disturbance were assessed by mapping seagrass abundance in the northern Great Sandy Strait region prior to and on 3 occasions after widespread loss of seagrass. Meadow-scale assessments of seagrass loss and recovery focussed on two existing Zostera capricorni monitoring meadows in the region. Mapping surveys showed that approximately 90% of intertidal seagrasses in the northern Great Sandy Strait disappeared after the February 1999 flooding of the Mary River. Full recovery of all seagrass meadows took 3 years. At the two study sites (Urangan and Wanggoolba Creek), the onset of Z. capricorni germination following the loss of seagrass occurred 14 months post-flood at Wanggoolba Creek, whereas at Urangan germination took 20 months. By February 2001 (24 months post-flood) seagrass abundance at Wanggoolba Creek sites was comparable to pre-flood levels, and full recovery at Urangan sites was complete in August 2001 (31 months post-flood). Reduced water quality, characterised by 2- to 3-fold increases in turbidity and nutrient concentrations during the 6 months following the flood, was followed by a 95% loss of seagrass meadows in the region. Reductions in available light due to increased flood-associated turbidity in February 1999 were the likely cause of seagrass loss in the Great Sandy Strait region, southern Queensland. Although seasonal cues influence the germination of Z. capricorni, the temporal variation in the onset of seed germination between sites suggests that germination following seagrass loss may depend on other factors (e.g. physical and chemical characteristics of sediments and water).
Elevated dissolved nitrogen concentrations during 1999 at Wanggoolba Creek suggest that this site received higher loads of sediments and nutrients from flood waters than Urangan. The germination of seeds at Wanggoolba Creek one year prior to Urangan coincides with relatively low suspended sediment concentrations in Wanggoolba Creek waters. The absence of organic-rich sediments at Urangan for many months following their removal during the 1999 flood may also have inhibited seed germination. Data from population cohort analyses and population growth rates showed that rhizome weight and rhizome elongation rates increased over time, consistent with rapid growth during increases in temperature and light availability from May to October.
Abstract:
In the seasonally dry tropics of northern Australia, breeder cows may lose up to 30% of liveweight during the dry season, when pasture is of low nutritive value. This is a major cause of low reproductive rates and high mortality. Weaning early in the dry season is effective in reducing this liveweight loss of the breeder (Holroyd et al. 1988). An experiment examined the dry-season liveweight loss of breeders for a range of weaning times and levels of nutrition. From April to October, through the dry season, 209 Bos indicus x Shorthorn cross cows, 4-6 years of age, grazed speargrass pastures in north Queensland. The cows had been joined with bulls from late January until April. Twenty-nine breeders had not suckled a calf during the previous wet season (DRY cows). In addition, 180 cows lactating in April were weaned in late April, mid July or early September. The cows were allocated by stratified randomisation, based on lactational status, stage of pregnancy and body condition, to 15 x 40 ha paddocks. Five paddocks with low-fertility soils provided LOW nutrition, while 10 paddocks with medium-fertility soils, without or with supplementation, provided MEDIUM and HIGH nutrition, respectively. The supplement consisted of molasses containing 14% urea, offered ad libitum. Liveweight was measured at intervals and conceptus-free liveweight (CF-LW) calculated. Data were analysed by AOV within groups of paddocks. Animal production for a consuming world: proceedings of the 9th Congress of the Asian-Australasian Association of Animal Production Societies [AAAP], the 23rd Biennial Conference of the Australian Society of Animal Production [ASAP] and the 17th Annual Symposium of the University of Sydney Dairy Research Foundation [DRF]. 2-7 July 2000, Sydney, Australia.
Abstract:
Runoff and sediment loss from forest roads were monitored for a two-year period in a Pinus plantation in southeast Queensland. Two classes of road were investigated: a gravelled road, which is used as a primary daily haulage route for the logging area, and an ungravelled road, which provides the main access route for individual logging compartments and is intensively used as a haulage route only during the harvest of these areas (approximately every 30 years). Both roads were subjected to routine traffic loads and maintenance during the study. Surface runoff in response to natural rainfall was measured and samples taken for the determination of sediment and nutrient (total nitrogen, total phosphorus, dissolved organic carbon and total iron) loads from each road. Results revealed that the mean runoff coefficient (runoff depth/rainfall depth) was consistently higher from the gravelled road plot (0.57) than from the ungravelled road plot (0.38). Total sediment loss over the two-year period was also greatest from the gravelled road plot, at 5.7 t km⁻¹ compared with 3.9 t km⁻¹ from the ungravelled road plot. Suspended solids contributed 86% of the total sediment loss from the gravelled road, and 72% from the ungravelled road, over the two years. Nitrogen loads from the two roads were both relatively constant throughout the study, averaging 5.2 and 2.9 kg km⁻¹ from the gravelled and ungravelled road, respectively. Mean annual phosphorus loads were 0.6 kg km⁻¹ from the gravelled road and 0.2 kg km⁻¹ from the ungravelled road. Organic carbon and total iron loads increased in the second year of the study, which was a much wetter year, and are thought to reflect the breakdown of organic matter in roadside drains and increased sediment generation, respectively. When road and drain maintenance (grading) was performed, runoff and sediment loss increased from both road types.
Additionally, the breakdown of the gravel road base due to high traffic intensity during wet conditions resulted in the formation of deep (10 cm) ruts, which increased erosion. The Water Erosion Prediction Project (WEPP):Road model was used to compare predicted with observed runoff and sediment loss from the two road classes investigated. For individual rainfall events, WEPP:Road predictions showed strong agreement with observed values of runoff and sediment loss. WEPP:Road predictions of annual sediment loss from the entire forestry road network in the study area also showed reasonable agreement with the extrapolated observed values.
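The runoff coefficient defined parenthetically above (runoff depth divided by rainfall depth) and the suspended-solids share of sediment loss are simple ratios; a minimal sketch, with invented event values rather than the study's measurements:

```python
def runoff_coefficient(runoff_mm, rainfall_mm):
    """Event runoff coefficient: runoff depth divided by rainfall depth."""
    return runoff_mm / rainfall_mm

def suspended_share(suspended_t_per_km, total_t_per_km):
    """Fraction of total sediment loss carried as suspended solids."""
    return suspended_t_per_km / total_t_per_km

# Invented event depths and loads, for illustration only:
print(runoff_coefficient(runoff_mm=28.5, rainfall_mm=50.0))  # 0.57
print(round(suspended_share(4.9, 5.7), 2))                   # 0.86
```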
Abstract:
The reliability of ants as bioindicators of ecosystem condition is dependent on the consistency of their response to localised habitat characteristics, which may be modified by larger-scale effects of habitat fragmentation and loss. We assessed the relative contribution of habitat fragmentation, habitat loss and within-patch habitat characteristics in determining ant assemblages in semi-arid woodland in Queensland, Australia. Species and functional group abundance were recorded using pitfall traps across 20 woodland patches in landscapes that exhibited a range of fragmentation states. Of fragmentation measures, changes in patch area and patch edge contrast exerted the greatest influence on species assemblages, after accounting for differences in habitat loss. However, 35% of fragmentation effects on species were confounded by the effects of habitat characteristics and habitat loss. Within-patch habitat characteristics explained more than twice the amount of species variation attributable to fragmentation and four times the variation explained by habitat loss. The study indicates that within-patch habitat characteristics are the predominant drivers of ant composition. We suggest that caution should be exercised in interpreting the independent effects of habitat fragmentation and loss on ant assemblages without jointly considering localised habitat attributes and associated joint effects.
Abstract:
Limb-loss in crustaceans can reduce moult increment and delay or advance the timing of moulting, both aspects that are likely to impact upon soft-shell crab production. Pond-reared blue swimmer crabs Portunus pelagicus were harvested and maintained in a crab shedding system. The wet weight, carapace width (CW) and the occurrence of limb-loss were assessed before stocking in the shedding system and after each of the next three moults. Many of the crabs were initially missing one or two limbs, and these did not grow as much as the crabs that were intact at the start of the trial. Despite its strong correlation with wet weight, CW changes proved to be misleading. Limb-loss reduced the %CW increment but not the per cent weight increment (where the latter is calculated from the actual pre-moult weight). Pre-moult weight explained much of the variation in post-moult weight, with crabs moulting to approximately double their weight. Limb-loss reduced 'growth' and production from the pond because it reduced pre-moult weight, but limb-loss did not alter the weight change on shedding a given weight of crabs, although some of that change now included regeneration of limbs. One can hypothesize that much of the size variation seen in pond-reared crabs may be due to accumulated effects of repeated limb-loss, rather than genetic variation.
Abstract:
Two laboratory experiments were carried out to quantify the mortality and physiological responses of juvenile blue swimmer crabs (Portunus pelagicus) after simulated gillnet entanglement, air exposure, disentanglement, and discarding. In both experiments, all but control blue swimmer crabs were entangled in 1-m(2) gillnet panels for 1 h, exposed to air for 2 min, subjected to various treatments of disentanglement ranging between the forceful removal of none, one, two, and four appendages, then "discarded" into individual experimental tanks and monitored for 10 d. In Experiment 1, mortalities were associated with the number of appendages removed and the occurrence of unsealed wounds. In Experiment 2, live blue swimmer crabs were sampled for blood at 2 min and 6, 24, and 72 h post-discarding to test for the effects of disentanglement and appendage removal on total haemocyte counts, clotting times, protein levels (by refractive index), and blood ion concentrations. Compared with blue swimmer crabs that had sealed or no wounds, those with unsealed wounds had lower total haemocyte counts, protein, and calcium concentrations and increased clotting times and magnesium and sodium levels. Induced autotomy, as opposed to the arbitrary, forceful removal of appendages, has the potential to minimize the mortality and stress of discarded, juvenile blue swimmer crabs.
Abstract:
Runoff, soil loss, and nutrient loss were assessed on a Red Ferrosol in tropical Australia over 3 years. The experiment was conducted using bounded, 100-m(2) field plots cropped to peanuts, maize, or grass. A bare plot, without cover or crop, was also included as an extreme treatment. Results showed the importance of cover in reducing runoff, soil loss, and nutrient loss from these soils. Runoff ranged from 13% of incident rainfall under conventional cultivation to 29% under bare conditions during the highest rainfall year, and was well correlated with event rainfall and rainfall energy. Soil loss ranged from 30 t/ha/year under bare conditions to <6 t/ha/year under cropping. Nutrient losses of 35 kg N and 35 kg P/ha/year under bare conditions and 17 kg N and 11 kg P/ha/year under cropping were measured. Soil carbon analyses showed a relationship with treatment runoff, suggesting that soil properties influenced the rainfall-runoff response. The cropping systems model PERFECT was calibrated using runoff, soil loss, and soil water data. Runoff and soil loss showed good agreement with observed data in the calibration, and soil water and yield showed reasonable agreement. Long-term runs using historical weather data showed the episodic nature of runoff and soil loss events in this region and emphasise the need to manage land using protective measures such as conservation cropping practices. Farmers involved in related, action-learning activities wished to incorporate conservation cropping findings into their systems but also needed clear production benefits to hasten practice change.
Abstract:
Best practice protocols for on-farm management of prawn quality to meet market requirements.
Abstract:
Increased sediment and nutrient losses resulting from unsustainable grazing management in the Burdekin River catchment are major threats to water quality in the Great Barrier Reef Lagoon. To test the effects of grazing management on soil and nutrient loss, five 1 ha mini-catchments were established in 1999 under different grazing strategies on a sedimentary landscape near Charters Towers. Reference samples were also collected from watercourses in the Burdekin catchment during major flow events. Soil and nutrient loss were relatively low across all grazing strategies due to a combination of good cover, low slope and low rainfall intensities. Total soil loss varied from 3 to 20 kg ha⁻¹ per event, while losses of N and P ranged from 10 to 1900 g ha⁻¹ and from 1 to 71 g ha⁻¹ per event, respectively. Water quality of runoff was considered moderate across all strategies, with relatively low levels of total suspended sediment (range: 8-1409 mg L⁻¹), total N (range: 101-4000 µg L⁻¹) and total P (range: 14-609 µg L⁻¹). However, treatment differences are likely to emerge with time as the impacts of the different grazing strategies on land condition become more apparent. Samples collected opportunistically from rivers and creeks during flow events displayed significantly higher levels of total suspended sediment (range: 10-6010 mg L⁻¹), total N (range: 650-6350 µg L⁻¹) and total P (range: 50-1500 µg L⁻¹) than those collected at the grazing trial. These differences can largely be attributed to variation in slope, geology and cover between the grazing trial and the different catchments. In particular, watercourses draining hillier, granodiorite landscapes with low cover had markedly higher sediment and nutrient loads than those draining flatter, sedimentary landscapes. These preliminary data suggest that on relatively flat, sedimentary landscapes, extensive cattle grazing is compatible with achieving water quality targets, provided high levels of ground cover are maintained.
In contrast, sediment and nutrient loss under grazing on more erodible land types is cause for serious concern. Long-term empirical research and monitoring will be essential to quantify the impacts of changed land management on water quality in the spatially and temporally variable Burdekin River catchment.
Abstract:
There is a world-wide trend of deteriorating water quality and light levels in the coastal zone, and this has been linked to declines in seagrass abundance. Localized management of seagrass meadow health requires that water quality guidelines for meeting seagrass growth requirements are available. Tropical seagrass meadows are diverse and can be highly dynamic, and we have used this dynamism to identify light thresholds in multi-specific meadows dominated by Halodule uninervis in the northern Great Barrier Reef, Australia. Seagrass cover was measured at approximately 3-month intervals from 2008 to 2011 at three sites: Magnetic Island (MI), Dunk Island (DI) and Green Island (GI). Photosynthetically active radiation was continuously measured within the seagrass canopy, and three light metrics were derived. Complete seagrass loss occurred at MI and DI, and at these sites changes in seagrass cover were correlated with the three light metrics. Mean daily irradiance (I_d) above 5 and 8.4 mol m⁻² d⁻¹ was associated with gains in seagrass at MI and DI, although a significant correlation (R = 0.649, p < 0.05) occurred only at MI. The second metric, the percentage of days below 3 mol m⁻² d⁻¹, correlated most strongly (MI: R = -0.714, p < 0.01; DI: R = -0.859, p < 0.001) with change in seagrass cover, with 16-18% of days below 3 mol m⁻² d⁻¹ being associated with more than 50% seagrass loss. The third metric, the number of hours of light-saturated irradiance (H_sat), was calculated using literature-derived data on saturating irradiance (E_k). H_sat correlated well (MI: R = 0.686, p < 0.01; DI: R = 0.704, p < 0.05) with change in seagrass abundance, and was very consistent between the two sites: 4 H_sat was associated with increases in seagrass abundance at both sites, and less than 4 H_sat with more than 50% loss.
At the third site (GI), small seasonal losses of seagrass recovered quickly during the growth season, and the light metrics did not correlate (p > 0.05) with change in percent cover, except for I_d, which was always high but nevertheless correlated with change in seagrass cover. Although distinct light thresholds were observed, the departure from threshold values was also important: for example, light levels well below the thresholds resulted in more severe seagrass loss than those just below the threshold. Environmental managers aiming to achieve optimal seagrass growth conditions can use these threshold light metrics as guidelines; however, other environmental conditions, including seasonally varying temperature and nutrient availability, will influence seagrass responses above and below these thresholds.
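The three light metrics described above (I_d, the percentage of days below a threshold, and H_sat) can be sketched from a PAR record; a minimal illustration with invented daily totals and an assumed E_k, not data from MI, DI or GI:

```python
from statistics import mean

def mean_daily_irradiance(daily_totals):
    """I_d: mean of daily PAR totals (mol m^-2 d^-1)."""
    return mean(daily_totals)

def percent_days_below(daily_totals, threshold=3.0):
    """Percentage of days whose PAR total falls below the threshold."""
    below = sum(1 for d in daily_totals if d < threshold)
    return 100.0 * below / len(daily_totals)

def hours_saturated(hourly_par, e_k):
    """H_sat: hours with irradiance at or above the saturating level E_k."""
    return sum(1 for h in hourly_par if h >= e_k)

# Invented 10-day record (mol m^-2 d^-1):
days = [6.2, 4.8, 2.1, 7.0, 5.5, 2.9, 8.3, 6.6, 4.0, 5.1]
print(round(mean_daily_irradiance(days), 2))  # 5.25
print(percent_days_below(days))               # 20.0 (2 of 10 days below 3)
```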
Abstract:
Reducing crop row spacing and delaying the time of weed emergence may give crops a competitive edge over weeds. Field experiments were conducted to evaluate the effects of crop row spacing (11, 15, and 23 cm) and weed emergence time (0, 20, 35, 45, 55, and 60 days after wheat emergence; DAWE) on Galium aparine and Lepidium sativum growth and wheat yield losses. Season-long weed-free and crop-free treatments were also established to compare wheat yield and weed growth, respectively. Row spacing and weed emergence time significantly affected the growth of both weed species and wheat grain yields. For both weed species, maximum plant height, shoot biomass, and seed production were observed in the crop-free plots, and delayed emergence decreased these variables. In weed-crop competition plots, maximum weed growth was observed when weeds emerged simultaneously with the crop in rows spaced 23 cm apart. Less growth of both weed species was observed at the narrow row spacing (11 cm) of wheat than at the wider spacings (15 and 23 cm). These weed species produced fewer than 5 seeds plant⁻¹ in 11-cm wheat rows when they emerged at 60 DAWE. The presence of weeds in the crop, especially at early stages, was devastating for wheat yields; accordingly, the maximum grain yield (4.91 t ha⁻¹) was recorded in the weed-free treatment at 11-cm row spacing. Delay in the time of weed emergence and narrow row spacing reduced weed growth and seed production and enhanced wheat grain yield, suggesting that these strategies could contribute to weed management in wheat.
Abstract:
Objectives: Decision support tools (DSTs) for invasive species management have had limited success in producing convincing results and meeting users' expectations. The problems could be linked to the functional form of the model that represents the dynamic relationship between the invasive species and crop yield loss in the DSTs. The objectives of this study were: a) to compile and review the models tested in field experiments and applied in DSTs; and b) to carry out an empirical evaluation of some popular models and alternatives. Design and methods: This study surveyed the literature and documented strengths and weaknesses of the functional forms of yield loss models. Some widely used models (linear, relative-yield and hyperbolic models) and two potentially useful models (the double-scaled and density-scaled models) were evaluated for a wide range of weed densities, maximum potential yield loss and maximum yield loss per weed. Results: Popular functional forms include hyperbolic, sigmoid, linear, quadratic and inverse models. Many basic models were modified to account for the effect of important factors (weather, tillage and growth stage of crop at weed emergence) influencing weed-crop interaction and to improve prediction accuracy. This limited their applicability in DSTs, as they became less generalized in nature and were often applicable to a much narrower range of conditions than would be encountered in the use of DSTs. These factors' effects could be better accounted for by other techniques. Among the models empirically assessed, the linear model is a very simple model that appears to work well at sparse weed densities but produces unrealistic behaviour at high densities. The relative-yield model exhibits expected behaviour at high densities and high levels of maximum yield loss per weed, but probably underestimates yield loss at low to intermediate densities.
The hyperbolic model demonstrated reasonable behaviour at lower weed densities, but produced biologically unreasonable behaviour at low rates of loss per weed and high yield loss at the maximum weed density. The density-scaled model is not sensitive to the yield loss at maximum weed density in terms of the number of weeds that will produce a certain proportion of that maximum yield loss. The double-scaled model appeared to produce more robust estimates of the impact of weeds under a wide range of conditions. Conclusions: Previously tested functional forms exhibit problems for use in DSTs for crop yield loss modelling. Of the models evaluated, the double-scaled model exhibits desirable qualitative behaviour under most circumstances.
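As a rough illustration of the behaviours described above, the linear form and a common rectangular-hyperbola parameterisation of the hyperbolic model can be sketched as follows; the parameter values are invented for illustration, and the double-scaled and density-scaled forms are not reproduced here:

```python
def linear_loss(density, i):
    """Linear model: percent yield loss = i * density.
    Works at sparse densities but grows without bound at high ones."""
    return i * density

def hyperbolic_loss(density, i, a):
    """Rectangular hyperbola: loss rises at roughly i percent per weed
    at low density and saturates at the asymptote a (max percent loss)."""
    return i * density / (1.0 + i * density / a)

# Illustrative parameters only: i = 2% loss per weed, a = 60% asymptote.
for d in (0, 5, 50, 500):
    print(d, round(linear_loss(d, i=2.0), 1),
          round(hyperbolic_loss(d, i=2.0, a=60.0), 1))
```

The printed pairs show the linear form passing 100% loss while the hyperbola saturates below its 60% asymptote, the contrast the review's empirical evaluation turns on.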
Abstract:
Meleagrid herpesvirus 1 (MeHV-1 or turkey herpesvirus) has been widely used as a vaccine in commercial poultry. Initially, these vaccine applications were for the prevention of Marek's disease resulting from Gallid herpesvirus 2 infections, while more recently MeHV-1 has been used as a recombinant vector for other poultry infections. The construction of herpesvirus infectious clones that permit propagation and manipulation of the viral genome in bacterial hosts has advanced the study of herpesviral genetics. The current study reports the construction of five MeHV-1 infectious clones. The in vitro properties of viruses recovered from these clones were indistinguishable from the parental MeHV-1. In contrast, the rescued MeHV-1 viruses were significantly attenuated when used in vivo. Complete sequencing of the infectious clones identified the absence of two regions of the MeHV-1 genome compared with the MeHV-1 reference sequence. These analyses determined that the rescued viruses have seven genes (UL43, UL44, UL45, UL56, HVT071, sorf3 and US2) either partially or completely deleted. In addition, single nucleotide polymorphisms were identified in all clones compared with the MeHV-1 reference sequence. As a consequence of one of the polymorphisms identified in the UL13 gene, four of the rescued viruses were predicted to encode a serine/threonine protein kinase lacking two of the three domains required for activity. Thus, four of the recovered viruses have a total of eight missing or defective genes. The implications of these findings in the context of herpesvirus biology and infectious clone construction are discussed.