35 results for Insecticide mortality percentage
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Surveys were conducted between 1997 and 2001 to investigate the incidence of overwintering Helicoverpa spp. pupae under summer crop residues on the Darling Downs, Queensland. Only Helicoverpa armigera was represented in collections of overwintering pupae. The results indicated that late-season crops of cotton, sorghum, maize, soybean, mungbean and sunflower were equally likely to have overwintering pupae under them. In the absence of tillage practices, these crops had the potential to produce similar numbers of moths/ha in the spring. As expected, the densities of overwintering pupae and the numbers of emerged moths/ha differed between years. Irrigated crops produced 2.5 times more moths/ha than dryland crops. Overall survival from autumn-formed pupae to emerged moths averaged 44%, with a higher proportion of pupae under maize surviving to produce moths than under each of the other crops. Parasitoids killed 44.1% of pupae, with Heteropelma scaposum representing 83.3% of all parasitoids reared from pupae. Percentage parasitism was lower in irrigated crops (27.6%) than in dryland crops (40.5%). Recent changes to Helicoverpa spp. management in cotton/grain-farming systems in south-eastern Queensland, including widespread adoption of Bt cotton and the use of more effective and more selective insecticides, could lead to lower densities of overwintering pupae under late summer crops.
Abstract:
The first larval instar has been identified as a critical stage for population mortality in Lepidoptera, yet owing to the small body size of these larvae, the factors that contribute to mortality under field conditions are still not clear. Dispersal behaviour has been suggested as a significant but largely overlooked factor contributing to mortality in first-instar lepidopteran larvae. The impact that leaving the host plant has on the mortality rate of Helicoverpa armigera neonates was examined in field crops and laboratory trials. This study examined: (1) the effects of soil surface temperature, and the level of shade within the crop, on the mortality of neonates on the soil after dropping off from the host plant; (2) the percentage of neonates that dropped off from a host plant and landed on the soil; and (3) the effects of exposure to different soil surface temperatures on the development and mortality of neonates. The findings showed that: (1) on the soil, surface temperatures above 43°C were lethal for neonates, and exposure to these temperatures contributed greatly to the overall mortality rate observed; however, the fate of neonates on the soil varied significantly depending on canopy closure within the crop; (2) at least 15% of neonates dropped off from the host plant and landed on the soil, meaning that the proportion of neonates exposed to these conditions is not trivial; and (3) 30 min of exposure to soil surface temperatures approaching the lethal level (>43°C) had no significant negative effects on the development and mortality of larvae through to the second instar. Overall, leaving the plant through drop-off contributes to first-instar mortality in crops with open canopies; however, survival of neonates that have lost contact with a host plant is possible, and becomes more likely later in the crop growing season.
Abstract:
Typically, in bag-stack or silo fumigations the concentration of phosphine is not constant, and yet most of what is known about phosphine efficacy against grain insects comes from studies with fixed concentrations. Indeed, where changing-concentration experiments have been performed, researchers have been unable to explain observed efficacy on the basis of data from fixed concentrations. The ability to predict insect mortality in relation to changing phosphine concentrations would facilitate the development of effective fumigation protocols. In this paper, we explore the prospects for making such predictions. After reviewing published and new results, we conclude that the commonly used concentration × time (Ct) product is unreliable for this purpose. New results, for a strongly resistant strain of Rhyzopertha dominica from Australia, suggest that the relationship C^n t = k may be useful for predicting mortality of this type of insect in changing concentrations. However, in the case of a strain of Sitophilus oryzae with a type of resistance common in Australian S. oryzae, the relationship C^n t = k proved less reliable.
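One way to see why a constant-concentration Ct product can fail for changing concentrations, and how a C^n t rule generalises it, is to write both as accumulated doses over a time-varying concentration profile C(t). The integral form below is the usual convention for applying such rules to fumigations with varying concentration; it is offered as an illustrative sketch, not as the authors' stated formulation.

\[
\text{Ct product:}\quad \int_0^T C(t)\,dt = k_1,
\qquad\qquad
\text{generalised rule:}\quad \int_0^T C(t)^{\,n}\,dt = k .
\]

The Ct product is recovered when n = 1. For n < 1, long exposures at low concentration are more effective than the Ct product predicts, while for n > 1, brief high-concentration peaks dominate, which is why two fumigations with identical Ct products can give very different mortality.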
Abstract:
The hypothesis that contaminant plants growing amongst chickpea serve as Helicoverpa sinks by diverting oviposition pressure away from the main crop was tested under field conditions. Gain (recruitment) and loss (presumed mortality) of juvenile stages of Helicoverpa spp. on contaminant faba bean and wheat plants growing in chickpea plots were quantified on a daily basis over a 12-d period. The possibility of post-eclosion movement of larvae from the contaminants to the surrounding chickpea crop was examined. Estimated total loss of the census population varied from 80 to 84% across plots and rows. The loss of brown eggs (40–47%) contributed most to the overall loss estimate, followed by loss of white eggs (27–35%) and larvae (6–9%). The cumulative numbers of individuals entering the white egg, brown egg and larval stages over the census period ranged from 15 to 58, 10 to 48 and 1 to 6 per m of row, respectively. The corresponding estimates of mean stage-specific loss, expressed as a percentage of individuals entering the stage, ranged from 52 to 57% for white eggs, 87 to 108% for brown eggs and 71 to 87% for first-instar larvae. Mean larval density on chickpea plants in close proximity to the contaminant plants did not exceed the baseline larval density on chickpea further away from the contaminants across rows and plots. The results support the hypothesis that contaminant plants in chickpea plots serve as Helicoverpa sinks by diverting egg pressure from the main crop and elevating mortality of juvenile stages. Deliberate contamination of chickpea crops with other plant species merits further investigation as a cultural pest management strategy for Helicoverpa spp.
Abstract:
Adults of a phosphine-resistant strain of Sitophilus oryzae (L.) were exposed to constant phosphine concentrations of 0.0035–0.9 mg litre⁻¹ for periods of between 20 and 168 h at 25 °C, and the effects of time and concentration on mortality were quantified. Adults were also exposed to a series of treatments lasting 48, 72 or 168 h at 25 °C, during which the concentration of phosphine was varied. The aim of this study was to determine whether equations from experiments using constant concentrations could be used to predict the efficacy of changing phosphine concentrations against adults of S. oryzae. A probit plane without interaction, in which the logarithms of time (t) and concentration (C) were variables, described the effects of concentration and time on mortality in experiments with constant concentrations. A derived equation of the form C^n t = k gave excellent predictions of toxicity when applied to data from changing-concentration experiments. The results suggest that for resistant S. oryzae adults there is nothing inherently different between constant and changing concentration regimes, and that data collected from fixed concentrations can be used to develop equations for predicting mortality in fumigations in which the phosphine concentration changes. This approach could simplify the prediction of efficacy of typical fumigations in which concentrations tend to rise and then fall over a period of days.
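The step from the fitted probit plane to the derived C^n t = k relationship can be made explicit. Writing the plane without interaction as below (the coefficient symbols a, b1 and b2 are placeholders, not the fitted values from this study), any contour of constant predicted mortality corresponds to a constant C^n t product:

\[
y = a + b_1 \log_{10} C + b_2 \log_{10} t,
\qquad
y = \text{constant}
\;\Longrightarrow\;
C^{\,b_1/b_2}\, t = k,
\quad\text{i.e. } n = b_1/b_2 .
\]

In other words, n measures the relative weight of concentration versus exposure time in determining mortality, and k is fixed by the mortality level chosen.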
Abstract:
By quantifying the effects of climatic variability in the sheep grazing lands of north-western and western Queensland, the key biological rates of mortality and reproduction can be predicted for sheep. These rates are essential components of a decision support package that could prove a useful management tool for producers, especially if they can easily obtain the necessary predictors. When the sub-models of the GRAZPLAN ruminant biology process model were re-parameterised from Queensland data and an empirical equation predicting the probability of ewes mating was added, the process model predicted the probability of pregnancy well (86% of variation explained). Predicting mortality from GRAZPLAN was less successful, but an empirical equation based on the relative condition of the animal (a measure based on liveweight), pregnancy status and age explained 78% of the variation in mortalities. A crucial predictor in these models was liveweight, which is not often recorded on producer properties. Empirical models based on climatic and pasture conditions estimated from the pasture production model GRASP predicted marking and mortality rates for Mitchell grass (Astrebla sp.) pastures (81% and 63% of the variation explained, respectively). These prediction equations were tested against independent data from producer properties and the model was successfully validated for Mitchell grass communities.
Abstract:
Mounting levels of insecticide resistance within Australian Helicoverpa spp. populations have resulted in the adoption of non-chemical IPM control practices such as trap cropping with chickpea, Cicer arietinum (L.). However, a new leaf blight disease affecting chickpea in Australia has the potential to limit its use as a trap crop. This paper therefore evaluates the potential of a range of winter-active legume crops for use as an alternative spring trap crop to chickpea, as part of an effort to improve the area-wide management strategy for Helicoverpa spp. in central Queensland's cotton production region. The densities of Helicoverpa eggs and larvae were compared over three seasons on replicated plantings of chickpea, Cicer arietinum (L.), field pea, Pisum sativum (L.), vetch, Vicia sativa (L.), and faba bean, Vicia faba (L.). Of these treatments, field pea was found to harbour the highest densities of eggs. A partial life table study of the fate of eggs oviposited on field pea and chickpea suggested that a large proportion of the eggs laid on field pea suffered mortality through dislodgment from the plants after oviposition. Plantings of field pea as a replacement trap crop for chickpea under commercial conditions confirmed the high attractiveness of this crop to ovipositing moths. The use of field pea as a trap crop as part of an area-wide management programme for Helicoverpa spp. is discussed.
Abstract:
Solvent extracts of cultures of the fungus Paecilomyces varioti are toxic to the sheep blowfly, Lucilia cuprina (Wiedemann) (Diptera: Calliphoridae). Different components of the culture extracts were isolated and bioassayed against L. cuprina. The most toxic component was purified and identified from its proton magnetic resonance spectrum as viriditoxin, a known antibiotic metabolite of the fungus. The insecticidal properties of viriditoxin were then evaluated. Mean LC50 values for first-instar larvae of organophosphate-susceptible and organophosphate-resistant strains of L. cuprina were 7.5 and 8.4 ppm, respectively. Pilot implant trials in sheep demonstrated that the compound provided protection for 9-17 weeks against both strains of L. cuprina. No adverse effects on the trial sheep were detected.
Abstract:
Supplements containing urea or biuret were fed in the dry season to yearling and two-year-old pregnant heifers grazing native spear grass pastures in north Queensland. Liveweight change and survival during the dry season, and fertility in the following year, were measured. In the first experiment, during a relatively favourable dry season, supplementation significantly (P<0.01) reduced liveweight loss in yearling heifers (5 vs. 32 kg). In the following year, during a drought, supplementation significantly (P<0.01) reduced liveweight loss in yearling heifers (32 vs. 41 kg) and significantly (P<0.01) reduced mortalities in pregnant and lactating heifers (from 23.5% to 5.2%). The supplement had no significant effect on subsequent fertility in either experiment. 14th Biennial Conference.
Abstract:
The establishment of experimental populations of scarab larvae using eggs and early-instar larvae has proven difficult for many researchers. Despite this, little work has been published examining ways to optimise establishment under artificial conditions. In this experiment, we examined the effect of shade and irrigation on the establishment of Heteronyx piceus Blanchard larvae introduced into pots as eggs and as first-, second- and third-instar larvae, in order to optimise artificial infestation techniques. The most important factor affecting larval establishment was the life stage introduced. Establishment of eggs and first instars was very low, with only 21% of eggs and 11% of first-instar larvae establishing. In contrast, 82% of second-instar larvae and 84% of third-instar larvae established successfully. The addition of shade marginally improved overall survival, from 45% in unshaded pots to 53% in shaded pots; however, most of this increase was in the eggs and first instars. Irrigation did not improve survival. These results suggest that when introducing scarab larvae to field or pot experiments, second- or third-instar larvae should be used to maximise establishment. The provision of shade and supplementary irrigation is optional.
Abstract:
A study was undertaken from 2004 to 2007 to investigate factors associated with decreased efficacy of metalaxyl in managing damping-off of cucumber in Oman. A survey over six growing seasons showed that growers lost up to 14.6% of seedlings following application of metalaxyl. No resistance to metalaxyl was found among Pythium isolates. Damping-off disease in the surveyed greenhouses followed two patterns. In most (69%) greenhouses, seedling mortality occurred shortly after transplanting and decreased thereafter (Phase-I). However, a second phase of seedling mortality (Phase-II) appeared 9-14 d after transplanting in about 31% of the surveyed greenhouses. Analysis of the rate of biodegradation of metalaxyl in six greenhouses indicated a significant increase in the rate of metalaxyl biodegradation in the greenhouses that encountered Phase-II damping-off. The half-life of metalaxyl dropped from 93 d in soil that had received no previous metalaxyl treatment to 14 d in soil that had received metalaxyl for eight consecutive seasons, indicating an enhanced rate of metalaxyl biodegradation after repeated use. Multiple applications of metalaxyl helped reduce the appearance of Phase-II damping-off. This appears to be the first report of rapid biodegradation of metalaxyl in greenhouse soils and the first report of its association with the appearance of a second phase of mortality in cucumber seedlings.
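As a worked illustration of what this change in half-life implies, assume first-order degradation kinetics (the standard assumption behind reporting a soil half-life); the 28-day horizon used here is a hypothetical example, not a figure from the study.

\[
C(t) = C_0\, 2^{-t/t_{1/2}},
\qquad
\frac{C(28\,\mathrm{d})}{C_0} = 2^{-28/93} \approx 0.81 \;\;(t_{1/2} = 93\,\mathrm{d}),
\qquad
2^{-28/14} = 0.25 \;\;(t_{1/2} = 14\,\mathrm{d}).
\]

So after four weeks, roughly a quarter of an application would remain in soil with a history of repeated metalaxyl use, versus about four-fifths in previously untreated soil.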
Abstract:
A trial was undertaken to evaluate the effect of microwaves on seed mortality of three weed species. Seeds of rubber vine (Cryptostegia grandiflora R.Br.), parthenium (Parthenium hysterophorus L.) and bellyache bush (Jatropha gossypiifolia L.) were buried at six depths (0, 2.5, 5, 10, 20 and 40 cm) in coarse sand maintained at one of two moisture levels, oven dry or wet (field capacity), and then subjected to one of five microwave radiation durations (0, 2, 4, 8 and 16 min). Significant interactions between soil moisture level, microwave radiation duration, seed burial depth and species were detected for seed mortality of all three species. Maximum seed mortality of rubber vine (88%), parthenium (67%) and bellyache bush (94%) occurred in wet soil irradiated for 16 min. Maximum seed mortality of rubber vine and bellyache bush occurred in seeds buried at 2.5 cm depth, whereas that of parthenium occurred in seeds buried at 10 cm depth. Maximum soil temperatures of 114.1 and 87.5°C in dry and wet soil, respectively, occurred at 2.5 cm depth following 16 min of irradiation. Despite the greater soil temperatures recorded in dry soil, irradiating seeds in wet soil generally increased seed mortality 2.9-fold compared with dry soil. Moisture content of the wet soil averaged 5.7%, compared with 0.1% for dry soil. The results suggest that microwave radiation has the potential to kill seeds located in the soil seed bank. However, many factors, including weed species susceptibility, determine the effectiveness of microwave radiation on buried seeds. Microwave radiation may be an alternative to conventional methods for rapidly depleting soil seed banks in the field, particularly in relatively wet soils that contain long-lived weed seeds.
Abstract:
In south-eastern Queensland, Australia, sorghum planted in early spring usually escapes attack by sorghum midge, Stenodiplosis sorghicola. Experiments were conducted to better understand the role of winter diapause in the population dynamics of this pest. Emergence patterns of adult midge from diapausing larvae on the soil surface and at various depths were investigated during spring to autumn of 1987/88–1989/90. From 1987/88 to 1989/90, 89%, 65% and 98% of adult emergence, respectively, occurred during November and December. Adult emergence from larvae diapausing on the soil surface was severely reduced by high mortality attributed to surface soil temperatures in excess of 40°C, with much of this mortality occurring between mid-September and mid-October. Emergence of adults from the soil surface was considerably delayed in the 1988/89 season compared with that from larvae buried at 5 or 10 cm, which had similar emergence patterns in all three seasons. In 1989/90, when a 1-cm-deep treatment was included, adult emergence from this treatment was 392% greater than from the deeper treatments. Only in one year (1989/90) did some diapausing larvae on the surface fail to emerge by the end of summer: 28.0% of the larvae on the surface remained in diapause, whereas only 0.8% of the buried larvae did so. We conclude that this pattern of emergence explains why spring plantings of sorghum in south-eastern Queensland usually escape sorghum midge attack.
Abstract:
Resistance to cyfluthrin in broiler farm populations of the lesser mealworm, Alphitobius diaperinus (Panzer) (Coleoptera: Tenebrionidae), in eastern Australia was suspected to have contributed to recent control failures. In 2000-2001, beetles from 11 broiler farms were tested for resistance by comparing them with an insecticide-susceptible reference population using topical application. Resistance was detected in almost all beetle populations (up to 22 times the susceptible level at the LC50), especially in southeastern Queensland, where more cyfluthrin applications had been made. Two populations from outside southeastern Queensland were found to be susceptible. Dose-mortality data generated from the reference population over a range of cyfluthrin concentrations showed that 0.0007% cyfluthrin, approximately the LC99.9, could be used as a convenient dose to discriminate between susceptible and resistant populations. Using this discriminating concentration, the susceptibilities of 18 field populations were determined from 2001 to 2005. Of these, 11 did not exhibit complete mortality at the discriminating concentration (mortality range 2.8-97.7%), and in general, cyfluthrin resistance was directly related to the number of cyfluthrin applications. As in the full study, populations outside southeastern Queensland were found to have lower levels of resistance or were susceptible. One population from an intensively farmed broiler area in southeastern Queensland exhibited low mortality despite having no known exposure to cyfluthrin. Comparisons of LC50 values for three broiler populations and a susceptible population, collected in 2000 and 2001 and recollected in 2004 and 2005, indicated that the values for the three broiler populations had increased over this time. The continued use of cyfluthrin for control of A. diaperinus in eastern Australia is currently under consideration.
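The discriminating concentration follows from the standard probit dose-mortality model fitted to the susceptible reference population; the form below is a generic sketch, with α and β standing in for the fitted intercept and slope rather than values from the paper.

\[
\Phi^{-1}(p) = \alpha + \beta \log_{10}(\text{dose}),
\qquad
\mathrm{LC}_{p} = 10^{\,(\Phi^{-1}(p)-\alpha)/\beta},
\]

so the LC99.9 corresponds to \(\Phi^{-1}(0.999) \approx 3.09\). Any population showing appreciable survival at this dose can then be flagged as resistant without estimating a full dose-mortality line for it.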
Abstract:
An adaptive conjoint analysis was used to evaluate stakeholders' opinions of welfare indicators for ship-transported sheep and cattle, both onboard and in pre-export depots. In consultations with two nominees from each identified stakeholder group (government officials, animal welfare representatives, animal scientists, stockpersons, producers/pre-export depot operators, exporters/ship owners and veterinarians), 18 potential indicators were identified. Three levels were assigned to each using industry statistics and expert opinion, representing those observed on the best and worst 5% of voyages and an intermediate value. A computer-based questionnaire was completed by 135 stakeholders (48% of those invited). All indicators were ranked by respondents in the assigned order, except fodder intake, for which providing the amount necessary to maintain bodyweight was rated better than over- or underfeeding, and time in the pre-export assembly depot, for which 5 days was rated better than 0 or 10 days. The respective Importance Values (a relative rating given by the respondent) for each indicator were, in order of declining importance: mortality (8.6%), clinical disease incidence (8.2%), respiration rate (6.8%), space allowance (6.2%), ammonia levels (6.1%), weight change (6.0%), wet bulb temperature (6.0%), time in assembly depot (5.4%), percentage of animals in hospital pen (5.4%), fodder intake (5.2%), stress-related metabolites (5.0%), percentage of feeding trough utilised (5.0%), injuries (4.8%), percentage of animals able to access food troughs at any one time (4.8%), percentage of animals lying down (4.7%), cortisol concentration (4.5%), noise (3.9%) and photoperiod (3.4%). The different stakeholder groups were relatively consistent in their ranking of the indicators, with all groups nominating the same top two and at least five of the top seven indicators. Some of the top indicators, in particular mortality, disease incidence and temperature, are already recorded in the Australian industry, but the study identified potential new welfare indicators for exported livestock, such as space allowance and ammonia concentration, which could be used to improve welfare standards if validated by scientific data. The top indicators would also be useful worldwide for countries engaging in long-distance sea transport of livestock.
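For context on how Importance Values are typically derived in adaptive conjoint analysis (this is the usual convention, not necessarily the exact computation used in this study): each indicator's importance is the range of its part-worth utilities expressed as a percentage of the summed ranges across all indicators.

\[
\mathrm{Importance}_j
= \frac{\max_k u_{jk} - \min_k u_{jk}}
       {\sum_{j'} \left( \max_k u_{j'k} - \min_k u_{j'k} \right)} \times 100\%,
\]

where \(u_{jk}\) is a respondent's utility for level k of indicator j. With 18 indicators, an even spread would give about 5.6% each, which is why values of 8.6% and 3.4% mark the extremes here.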