Abstract:
A trial was undertaken to evaluate the effect of microwaves on seed mortality of three weed species. Seeds of rubber vine (Cryptostegia grandiflora R.Br.), parthenium (Parthenium hysterophorus L.) and bellyache bush (Jatropha gossypiifolia L.) were buried at six depths (0, 2.5, 5, 10, 20 and 40 cm) in coarse sand maintained at one of two moisture levels, oven dry or wet (field capacity), and then subjected to one of five microwave radiation durations (0, 2, 4, 8 or 16 min). Significant interactions between soil moisture level, microwave radiation duration, seed burial depth and species were detected for seed mortality of all three species. Maximum seed mortality of rubber vine (88%), parthenium (67%) and bellyache bush (94%) occurred in wet soil irradiated for 16 min. Maximum seed mortality of rubber vine and bellyache bush occurred in seeds buried at 2.5 cm depth, whereas that of parthenium occurred in seeds buried at 10 cm depth. Maximum soil temperatures of 114.1 and 87.5°C in dry and wet soil, respectively, occurred at 2.5 cm depth following 16 min of irradiation. Despite the greater soil temperatures recorded in dry soil, irradiating seeds in wet soil generally increased seed mortality 2.9-fold compared with dry soil. Moisture content of the wet soil averaged 5.7%, compared with 0.1% for dry soil. Results suggest that microwave radiation has the potential to kill seeds located in the soil seed bank. However, many factors, including weed species susceptibility, determine the effectiveness of microwave radiation on buried seeds. Microwave radiation may offer an alternative to conventional methods for rapidly depleting soil seed banks in the field, particularly in relatively wet soils that contain long-lived weed seeds.
Abstract:
Objective: To assess the effect of s-methylmethionine sulphonium chloride (SMMSC) (200 mg/kg) on the nutritional performance of pigs and its value as prevention or therapy for oesophagogastric ulcers. Design: Sixty pigs from a high health status herd with ongoing oesophagogastric ulcer problems were endoscopically assessed for the presence or absence of oesophagogastric ulcers. Forty-eight pigs were then selected and allocated, according to an initial oesophagogastric epithelial (ulcer score) classification, to replicated treatment groups in a 2 × 2 factorial design. Weight gain and feed intake were measured over 49 d, after which pigs were killed and stomachs were collected, re-examined and scored for oesophagogastric ulceration. Results: Over the 49 d there was no difference in weight gain, feed intake or backfat between pigs with and without SMMSC supplementation, whether or not they had fully developed oesophagogastric ulcers at the start of the study. In pigs with an initially low ulcer score, feeding SMMSC did not prevent further oesophagogastric ulcer development. No significant effect of SMMSC was apparent when final mean oesophagogastric ulcer scores were compared in pigs with an existing high ulcer score. However, further analysis of the changes in individual pig oesophagogastric ulcer scores during the experiment showed that the observed reduction in scores of the high ulcer score group was significantly different from that of all other groups. Conclusion: This study indicates that supplementation of pig diets with SMMSC cannot be justified unless the slight improvement in ulcer score observed can be translated into a commercial production advantage, such as a reduction in pig mortality due to oesophagogastric ulcers. The study also confirms the value of endoscopy as a tool for objective assessment of oesophagogastric health.
Abstract:
We examined the effect of surface-applied treatments on the above-ground decay resistance of the tenon of mortice-and-tenon timber joints designed to simulate joinery that is exposed to the weather. Joints made from untreated radiata pine, Douglas-fir, brush box, spotted gum and copper-chrome-arsenic (CCA) treated radiata pine were exposed to the weather for 9 y on above-ground racks at five sites throughout eastern Australia. Results indicate that (1) a poorly maintained external paint film generally accelerated decay, (2) a brush coat of water-repellent preservative inside the joints often extended serviceability (in some cases to up to seven times that of untreated joints), and (3) the protection provided by a coat of primer applied inside the joint varied and was in most cases less effective than the water-repellent preservative treatment.
Abstract:
In south-eastern Queensland, Australia, sorghum planted in early spring usually escapes attack by sorghum midge, Stenodiplosis sorghicola. Experiments were conducted to better understand the role of winter diapause in the population dynamics of this pest. Emergence patterns of adult midges from diapausing larvae on the soil surface and at various depths were investigated during spring to autumn of 1987/88–1989/90. From 1987/88 to 1989/90, 89%, 65% and 98% of adult emergence, respectively, occurred during November and December. Adult emergence from larvae diapausing on the soil surface was severely reduced by high mortality attributed to surface soil temperatures in excess of 40°C, with much of this mortality occurring between mid-September and mid-October. Emergence of adults from the soil surface was considerably delayed in the 1988/89 season compared with larvae buried at 5 or 10 cm, which had similar emergence patterns in all three seasons. In 1989/90, when a 1-cm-deep treatment was included, adult emergence from this treatment was 392% higher than from the deeper treatments. Only in one year (1989/90) did some diapausing larvae on the surface fail to emerge by the end of summer: 28.0% of the larvae on the surface remained in diapause, compared with only 0.8% of the buried larvae. We conclude that this pattern of emergence explains why spring plantings of sorghum in south-eastern Queensland usually escape sorghum midge attack.
Abstract:
BACKGROUND: In spite of the extensive use of phosphine fumigation around the world to control insects in stored grain, and the knowledge that grain sorbs phosphine, the effect of concentration on sorption has not been quantified. A laboratory study was undertaken, therefore, to investigate the effect of phosphine dose on sorption in wheat. Wheat was added to glass flasks to achieve filling ratios of 0.25-0.95, and the flasks were sealed and injected with phosphine at 0.1-1.5 mg L⁻¹ based on flask volume. Phosphine concentration was monitored for 8 days at 25°C and 55% RH. RESULTS: When sorption occurred, phosphine concentration declined with time and was approximately first order, i.e. the data fitted an exponential decay equation. Percentage sorption per day was directly proportional to filling ratio, and was negatively correlated with dose for any given filling ratio. Based on the results, a tenfold increase in dose would result in a halving of the sorption constant and the percentage daily loss. Wheat was less sorptive if it was fumigated for a second time. CONCLUSIONS: The results have implications for the use of phosphine for control of insects in stored wheat. This study shows that dose is a factor that must be considered when trying to understand the impact of sorption on phosphine concentration, and that there appears to be a limit to the capacity of wheat to sorb phosphine.
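The "approximately first order" decline described above can be illustrated with a short curve-fitting sketch. This is not the authors' analysis; the daily concentration readings, the starting dose and the fitted constants below are invented purely for illustration.

```python
# Illustrative sketch (not the study's analysis): fitting a first-order
# (exponential) decay, C(t) = C0 * exp(-k * t), to phosphine headspace
# concentration readings, the behaviour described qualitatively in the abstract.
import numpy as np
from scipy.optimize import curve_fit

def first_order_decay(t, c0, k):
    """Concentration after t days, given initial dose c0 (mg/L) and sorption constant k (1/day)."""
    return c0 * np.exp(-k * t)

# Hypothetical daily readings over 8 days (mg/L); the numbers are made up.
t_days = np.arange(0, 9)
conc = np.array([1.00, 0.93, 0.87, 0.81, 0.76, 0.71, 0.66, 0.62, 0.58])

(c0_fit, k_fit), _ = curve_fit(first_order_decay, t_days, conc, p0=(1.0, 0.05))
daily_loss_pct = (1 - np.exp(-k_fit)) * 100  # percentage of the gas lost to sorption per day

print(f"fitted dose C0 = {c0_fit:.2f} mg/L, sorption constant k = {k_fit:.3f} per day")
print(f"approximate daily loss = {daily_loss_pct:.1f}%")
```

On this model, the abstract's observation that a tenfold increase in dose roughly halves the sorption constant corresponds to a smaller fitted k, and therefore a smaller daily percentage loss, at higher doses.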
Abstract:
The reliability of ants as bioindicators of ecosystem condition depends on the consistency of their response to localised habitat characteristics, which may be modified by larger-scale effects of habitat fragmentation and loss. We assessed the relative contributions of habitat fragmentation, habitat loss and within-patch habitat characteristics in determining ant assemblages in semi-arid woodland in Queensland, Australia. Species and functional group abundances were recorded using pitfall traps across 20 woodland patches in landscapes that exhibited a range of fragmentation states. Of the fragmentation measures, changes in patch area and patch edge contrast exerted the greatest influence on species assemblages after accounting for differences in habitat loss. However, 35% of the fragmentation effects on species were confounded with the effects of habitat characteristics and habitat loss. Within-patch habitat characteristics explained more than twice the amount of species variation attributable to fragmentation and four times the variation explained by habitat loss. The study indicates that within-patch habitat characteristics are the predominant drivers of ant composition. We suggest that caution should be exercised in interpreting the independent effects of habitat fragmentation and loss on ant assemblages without also considering localised habitat attributes and the associated joint effects.
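As a rough illustration of how the independent and shared ("confounded") fractions of explained variation referred to above can be separated, the sketch below partitions variation between two predictor sets using nested regressions. It is not the authors' analysis (which involved three explanatory sets and multivariate assemblage data); the data, variable names and ordinary least squares model are all hypothetical.

```python
# Two-set variation partitioning sketch: independent and shared ("confounded")
# fractions are obtained by comparing R^2 values of nested regression models.
import numpy as np

def r2(X, y):
    """R^2 of an ordinary least squares fit of y on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return 1 - (y - X1 @ coef).var() / y.var()

rng = np.random.default_rng(0)
n = 20                                       # e.g. 20 woodland patches
frag = rng.normal(size=(n, 2))               # fragmentation measures (e.g. patch area, edge contrast)
habitat = 0.6 * frag + rng.normal(scale=0.8, size=(n, 2))  # habitat partly correlated with fragmentation
ants = frag @ [0.4, 0.2] + habitat @ [0.8, 0.5] + rng.normal(scale=0.5, size=n)  # response, e.g. an abundance index

r2_frag, r2_hab = r2(frag, ants), r2(habitat, ants)
r2_both = r2(np.hstack([frag, habitat]), ants)

independent_frag = r2_both - r2_hab          # fragmentation effect not shared with habitat
independent_hab = r2_both - r2_frag          # habitat effect not shared with fragmentation
shared = r2_frag + r2_hab - r2_both          # confounded (joint) fraction
print(f"independent: fragmentation {independent_frag:.2f}, habitat {independent_hab:.2f}; shared {shared:.2f}")
```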
Abstract:
Limb-loss in crustaceans can reduce moult increment and delay or advance the timing of moulting, both of which are likely to affect soft-shell crab production. Pond-reared blue swimmer crabs Portunus pelagicus were harvested and maintained in a crab shedding system. Wet weight, carapace width (CW) and the occurrence of limb-loss were assessed before stocking in the shedding system and after each of the next three moults. Many of the crabs were initially missing one or two limbs, and these did not grow as much as the crabs that were intact at the start of the trial. Despite its strong correlation with wet weight, CW change proved to be misleading. Limb-loss reduced the %CW increment but not the per cent weight increment (where the latter is calculated from the actual pre-moult weight). Pre-moult weight explained much of the variation in post-moult weight, with crabs moulting to approximately double their weight. Limb-loss reduced 'growth' and production from the pond because it reduced pre-moult weight, but it did not alter the weight change on shedding a given weight of crabs, although some of that change then included regeneration of limbs. One can hypothesize that much of the size variation seen in pond-reared crabs is due to the accumulated effects of repeated limb-loss, rather than genetic variation.
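A small worked example may make the contrast between the %CW increment and the per cent weight increment clearer. The weights and carapace widths below are invented, not taken from the study; they simply show how a crab regenerating limbs can double its actual pre-moult weight while showing a smaller carapace width increase.

```python
# Illustrative arithmetic only (numbers invented): why carapace width (CW)
# increments can mislead when a limb-damaged crab regenerates limbs at the moult.

def pct_increase(before, after):
    return (after - before) / before * 100

# Intact crab: pre-moult 100 g, 80 mm CW -> post-moult 200 g, 101 mm CW
print(pct_increase(100, 200), pct_increase(80, 101))  # ~100% weight gain, ~26% CW gain

# Crab missing limbs: pre-moult 90 g, 80 mm CW -> post-moult 180 g, 96 mm CW (limbs regrown)
print(pct_increase(90, 180), pct_increase(80, 96))    # still ~100% weight gain, but only 20% CW gain
```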
Abstract:
Quantifying the local crop response to irrigation is important for establishing adequate irrigation management strategies. This study evaluated the effect of irrigation applied with subsurface drip irrigation on field corn (Zea mays L.) evapotranspiration (ETc), yield, water use efficiencies (WUE = yield/ETc, and IWUE = yield/irrigation), and dry matter production in the semiarid climate of west central Nebraska. Eight treatments were imposed, with irrigation amounts ranging from 53 to 356 mm in 2005 and from 22 to 226 mm in 2006. A soil water balance approach (based on FAO-56) was used to estimate daily soil water and ETc. Treatments resulted in seasonal ETc of 580-663 mm and 466-656 mm in 2005 and 2006, respectively. Yields among treatments differed by as much as 22% in 2005 and 52% in 2006. In both seasons, irrigation significantly affected yields, which increased with irrigation up to the point where irrigation became excessive. Distinct relationships were obtained each season. Yields increased linearly with seasonal ETc (R² = 0.89) and with ETc/ETp (R² = 0.87), where ETp is ETc with no water stress. The yield response factor (ky), which relates the relative reduction in yield to the relative reduction in ETc, averaged 1.58 over the two seasons. WUE increased non-linearly with seasonal ETc and with yield. WUE was more sensitive to irrigation during the drier 2006 season than in 2005. In both seasons, IWUE decreased sharply with irrigation. Irrigation significantly affected dry matter production and its partitioning into the different plant components (grain, cob, and stover). On average, the grain accounted for the majority of the above-ground plant dry mass (≈59%), followed by the stover (≈33%) and the cob (≈8%). The dry mass of the plant and of each plant component tended to increase with seasonal ETc. The strong relationships obtained in this study between crop performance indicators and seasonal ETc demonstrate that accurate estimates of ETc on a daily and seasonal basis can be valuable for making tactical in-season irrigation management decisions and for strategic irrigation planning and management.
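For readers unfamiliar with the indicators named above, the sketch below shows how WUE, IWUE and the yield response factor ky are commonly calculated, using the standard FAO relation 1 − Ya/Yp = ky(1 − ETa/ETp). The yield, evapotranspiration and irrigation figures are hypothetical and are not the paper's data.

```python
# Minimal sketch of the crop water productivity indicators named in the abstract.
# All numeric values below are illustrative, not the study's measurements.

def wue(yield_kg_ha, etc_mm):
    return yield_kg_ha / etc_mm          # kg of grain per ha per mm of crop ET

def iwue(yield_kg_ha, irrigation_mm):
    return yield_kg_ha / irrigation_mm   # kg of grain per ha per mm of irrigation applied

def ky(ya, yp, eta, etp):
    """Yield response factor: relative yield reduction per unit relative ET reduction."""
    return (1 - ya / yp) / (1 - eta / etp)

ya, yp = 10_500, 13_000     # grain yield of a deficit treatment vs. fully watered, kg/ha
eta, etp = 580, 660         # seasonal crop ET of the same treatments, mm
irrigation = 150            # seasonal irrigation applied to the deficit treatment, mm

print(f"WUE  = {wue(ya, eta):.1f} kg/ha per mm of ETc")
print(f"IWUE = {iwue(ya, irrigation):.1f} kg/ha per mm of irrigation")
print(f"ky   = {ky(ya, yp, eta, etp):.2f}")
```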
Abstract:
The effect of fungal endophyte (Neotyphodium lolii) infection on the performance of perennial ryegrass (Lolium perenne) growing under irrigation in a subtropical environment was investigated. Seed of 4 cultivars, infected with standard (common toxic or wild-type) endophyte or the novel endophyte AR1, or free of endophyte (Nil), was sown in pure swards, which were fertilised with 50 kg N/ha per month. Seasonal and total yield, persistence, and rust susceptibility were assessed over 3 years, along with the presence of endophyte and alkaloids in plant shoots. Endophyte occurrence in tillers in both the standard and AR1 treatments was above 95% for Bronsyn and Impact throughout and rose to that level in Samson by the end of the second year. Meridian AR1 reached only 93%, while in the standard treatment the endophyte had mostly died before sowing. Nil endophyte treatments carried an average of approximately 0.6% infection throughout. Infection with the standard endophyte was associated with increased dry matter (DM) yields in all 3 years compared with no endophyte. AR1 also significantly increased yields in the second and third years. Over the full 3 years, standard and AR1 increased yields by 18% and 11%, respectively. Infection with both endophytes was associated with increased yields in all 4 seasons, the effects increasing in intensity over time. There was 27% better persistence in standard-infected plants compared with Nil at the end of the first year, increasing to 198% by the end of the experiment; for AR1 the improvements were 20% and 134%, respectively. The effect of endophyte on crown rust (Puccinia coronata) infection was inconsistent, with endophyte increasing rust damage on one occasion and reducing it on another. Cultivar differences in rust infection were greater than endophyte effects. Plants infected with the AR1 endophyte had no detectable ergovaline or lolitrem B in leaf, pseudostem, or dead tissue. In standard-infected plants, ergovaline and lolitrem B were highest in pseudostem and considerably lower in leaf. Dead tissue had very low or no detectable ergovaline but high lolitrem B concentrations. Peramine concentration was high and at similar levels in leaf and pseudostem, but not detectable in dead material, and was similar in both AR1- and standard-infected plants. Endophyte presence appeared to have a similar effect in the subtropics to that demonstrated in temperate areas, in terms of improving yields and persistence and increasing the tolerance of plants to stress factors.
Abstract:
Diets containing 3% sorghum ergot (16 mg alkaloids/kg, including 14 mg dihydroergosine/kg) were fed to 12 sows from 14 days post-farrowing until weaning 14 days later, and their performance was compared with that of 10 control sows. Ergot-fed sows showed a smaller weight loss during lactation (24 kg/head vs. 29 kg/head for control sows; p > 0.05) despite lower feed consumption (61 kg/head total feed intake vs. 73 kg/head for control sows; p < 0.05). Litters of ergot-fed sows had poorer weight gain over the 14-day period (16.6 kg/litter vs. 28.3 kg/litter for controls; p < 0.05) despite greater consumption of creep feed by the piglets of the ergot-fed sows (1.9 kg/litter compared with 1.1 kg/litter for the controls; p > 0.05). Sow plasma prolactin was reduced by ergot feeding, to 4.8 μg/l compared with 15.1 μg/l in control sows after 7 days (p < 0.01), and to 4.9 μg/l compared with 8.0 μg/l at weaning (p < 0.01). Two ergot-fed sows ceased lactation early, and the sow feed intakes, body weight losses, litter weight gains and creep consumption reported above indirectly indicate an ergot effect on milk production.
Abstract:
The Queensland Great Barrier Reef line fishery in Australia is regulated via a range of input and output controls, including minimum size limits, daily catch limits and commercial catch quotas. As a result of these measures, a substantial proportion of the catch is released or discarded. The fate of these released fish is uncertain, but hook-related mortality can potentially be decreased by using hooks that reduce the rates of injury, bleeding and deep hooking. There is also the potential to reduce the capture of non-target species through gear selectivity. A total of 1053 individual fish representing five target species and three non-target species were caught using six hook types comprising three hook patterns (non-offset circle, J and offset circle), each in two sizes (small, 4/0 or 5/0, and large, 8/0). Catch rates for each of the hook patterns and sizes varied between species, with no consistent results for target or non-target species. When data for all fish species were aggregated, there was a trend for larger hooks, J hooks and offset circle hooks to cause a greater number of injuries. Larger hooks were more likely to result in bleeding, although this trend was not statistically significant. Larger hooks were also more likely to foul-hook fish or hook fish in the eye. There was a reduction in the rates of injury and bleeding for both target and non-target species when using the smaller hook sizes. For a number of species in our study the incidence of deep hooking decreased when using non-offset circle hooks; however, these results were not consistent across all species. Our results highlight the variability in hook performance across a range of tropical demersal finfish species. The most obvious conservation benefits for both target and non-target species arise from using smaller hooks and non-offset circle hooks. Fishers should be encouraged to use these hook configurations to reduce the potential for post-release mortality of released fish.
Abstract:
We examined the microchemistry of otoliths from cohorts of a fished population of the large catadromous fish barramundi (Lates calcarifer) from the estuary of a large tropical river. Barramundi from the estuary of the large, heavily regulated Fitzroy River, north-eastern Australia, were analysed by making transects of 87Sr/86Sr isotope and trace metal/Ca ratios from the otolith core to the outer edge. First, we examined the Sr/Ca, Ba/Ca, Mg/Ca, Mn/Ca and 87Sr/86Sr isotope ratios in otoliths of barramundi tagged in either freshwater or estuarine habitats that were caught by the commercial fishery in the estuary. We used 87Sr/86Sr isotope ratios to identify periods of freshwater residency and to assess whether trace metal/Ca ratios varied between habitats. Only Sr/Ca consistently varied between known periods of estuarine or freshwater residency. The relationships between trace metal/Ca ratios and river flow, salinity and temperature were examined in fish tagged and recaptured in the estuary. We found weak and inconsistent relationships between these variables in the majority of fish. These results suggest that both individual movement history within the estuary and the scale of environmental monitoring reduced our ability to detect any patterns. Finally, we examined fish in the estuary from two dominant age cohorts (4 and 7 yr old) before and after a large flood in 2003 to ascertain whether the flood had enabled fish from freshwater habitats to migrate to the estuary. There was no difference in the proportion of fish in the estuary that had accessed freshwater after the flood. Instead, we found that larger individuals within each age cohort were more likely to have spent a period in freshwater, which highlights the need to maintain freshwater flows in rivers. About half the fish examined had accessed freshwater habitats before capture. Of these, all had spent at least their first two months in marine-salinity waters before entering freshwater, and some did not enter freshwater until four years of age. This contrasts with the results of several previous studies in other parts of the species' range, which found that access to freshwater swamps by larval barramundi was important for enhanced population productivity and recruitment.
Abstract:
To maximize the information commonly collected from otoliths, the effect of DNA extraction on otolith-based age estimation was evaluated by comparing DNA-extracted and untreated control sagittal otoliths from common coral trout (Plectropomus leopardus) for clarity and ageing discrepancies. The DNA extraction process had no significant effect, indicating that archived otoliths can be used as a source of DNA while retaining their utility for age estimation.
Abstract:
Grain produced from doubled-haploid (DH) wheat lines, developed from a hard- and a soft-grained wheat cultivar, was bulked according to Pinb (puroindoline b) genotype for an assessment of Chinese fresh noodle texture by a trained taste panel. Each DH line was designated as 'soft' or 'hard' grained based on PCR amplification of the wild-type (soft) allele or the mutant (hard) allele. Theoretically, the soft and hard grain bulks represented the respective Pinb alleles and an independent assortment of unlinked alleles from the parents, Sunco and Chuanyu 12. Grain from the parents and DH lines was grown at two locations in Queensland, Australia, and one in Sichuan, China. The grain was milled and processed for a taste panel evaluation in Chengdu, Sichuan. Results suggest the Pinb alleles had a significant effect on noodle softness and explained 30% of the variation; the 'soft' Pinb allele conferred a softer noodle texture. Location had a significant effect on noodle smoothness; wheat grain grown at Biloela, Queensland, produced a smoother noodle texture than grain grown in Sichuan. The effect of location confirms the importance of environment as a variable for this quality character. This investigation exemplifies the utility of Pinb markers for specifically altering Chinese fresh noodle texture.
Abstract:
Plant tissue culture has been used for a number of years to produce micropropagated strawberry plants for planting into runner growing beds in the Stanthorpe (Queensland) and Bothwell (Tasmania) regions. This process has allowed the rapid release of new cultivars from the LAWS (Late Autumn, Winter, Spring) breeding program into the current runner production system. Micropropagation in vitro allows plants to be produced during the autumn and winter months, when mother plants would normally be in a fruit production phase in the field in Queensland. The plants produced are of high health status when they are planted, and the subsequent arrival and build-up of various diseases in the runner fields are closely monitored. Using tissue culture for the first generation reduces the time the plants spend in the field by twelve months, reducing disease incidence. To date, any disease outbreak has been successfully managed using early detection and rapid response methods.