35 results for timing of application
Abstract:
To reduce the number of musters and handling costs, calves in extensive cattle herds in northern Australia are processed (vaccinated, ear-marked, de-horned, branded and, for males, castrated) shortly after they are weaned. As stress has adverse effects on health and growth, and weaning is a stressful time for calves, this experiment asked whether calf health, welfare and performance were improved when calves had a period with their mothers after processing, before they were weaned.
Abstract:
Pseudocercospora macadamiae is an important pathogen of macadamia in Australia, causing a disease known as husk spot. Growers strive to control the disease with a number of carbendazim and copper treatments. The aim of this study was to determine the macadamia fruit developmental stage at which fungicide application is most effective against husk spot, and whether copper-only applications at the full-size fruit developmental stage toward the end of the season contributed to effective disease control. Fungicides were applied to macadamia trees at four developmental stages in three orchards in two consecutive production seasons. The effects of the treatments on disease incidence and severity were quantified using area under the disease progress curve (AUDPC) and logistic regression models. Although disease incidence varied between cultivars, incidence and severity on cv. A16 showed consistent differences between the treatments. The most significant reduction in husk spot incidence occurred when spraying commenced at the match-head-sized fruit developmental stage. All treatments significantly reduced husk spot incidence and severity compared with the untreated controls, and a significant positive linear relationship (R2 = 73%) between AUDPC and severity showed that the timing of the first fungicide application is important for effective disease control. Application of fungicide at the full-size fruit stage reduced disease incidence but had no impact on premature fruit drop.
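As a quick illustration of the AUDPC measure used in this abstract, the sketch below computes the area under a disease progress curve by the standard trapezoidal rule; the assessment days and incidence values are hypothetical and are not data from the study.

```python
# Minimal sketch: area under the disease progress curve (AUDPC) via the
# trapezoidal rule. Assessment days and incidence values are illustrative only.

def audpc(days, values):
    """Trapezoidal AUDPC for disease values recorded on the given days."""
    area = 0.0
    for (t0, y0), (t1, y1) in zip(zip(days, values), zip(days[1:], values[1:])):
        area += (y0 + y1) / 2.0 * (t1 - t0)
    return area

if __name__ == "__main__":
    days = [0, 14, 28, 42, 56]        # days after first assessment (hypothetical)
    incidence = [0, 5, 12, 30, 45]    # % fruit with husk spot (hypothetical)
    print(f"AUDPC = {audpc(days, incidence):.1f} %-days")
```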
Abstract:
Pseudocercospora macadamiae causes husk spot of macadamia. Husk spot control would be improved by verifying the stages in fruit development susceptible to infection, and by determining some of the climatic conditions likely to lead to high disease pressure periods in the field. Our results showed that the percentage germination of conidia and the growth of germ tubes and mycelia of P. macadamiae were greatest at 26 degrees C, with better conidia germination associated with high relative humidity and free water. The exposure of match-head-sized and pea-sized fruit stages to natural P. macadamiae inoculum in the field led to 2.5-fold increases in husk spot incidence, and up to 8.5-fold increases in premature abscission, compared with unexposed fruit. Exposure of fruit stages later than match-head-sized and pea-sized fruit generally caused no further increases in disease incidence or premature abscission. Climatic conditions were found to have a strong influence on the behaviour of P. macadamiae, the host, oil accumulation, and the subsequent impact of husk spot on premature abscission. Our findings suggest that fungicide application should target fruit at the match-head-sized stage of development in order to best reduce yield losses, particularly in seasons where oil accumulation in fruit is prolonged and climatic conditions are optimal for P. macadamiae.
Abstract:
Water regulations have decreased irrigation water supplies in Nebraska and some other areas of the Great Plains of the USA. When available water is not enough to meet crop water requirements during the entire growing cycle, it becomes critical to know the proper irrigation timing that would maximize yields and profits. This study evaluated the effect of the timing of a deficit-irrigation allocation (150 mm) on crop evapotranspiration (ETc), yield, water use efficiency (WUE = yield/ETc), irrigation water use efficiency (IWUE = yield/irrigation), and dry mass (DM) of corn (Zea mays L.) irrigated with subsurface drip irrigation in the semiarid climate of North Platte, NE. During 2005 and 2006, a total of sixteen irrigation treatments (eight each year) were evaluated, which received different percentages of the water allocation during July, August, and September. During both years, all treatments resulted in no crop stress during the vegetative period but stress during the reproductive stages, which affected ETc, DM, yield, WUE and IWUE. Among treatments, ETc varied by 7.2 and 18.8%; yield by 17 and 33%; WUE by 12 and 22%; and IWUE by 18 and 33% in 2005 and 2006, respectively. Yield and WUE both increased linearly with ETc and with ETc/ETp (ETp = seasonal ETc with no water stress), and WUE increased linearly with yield. The yield response factor (ky) averaged 1.50 over the two seasons. Irrigation timing affected the DM of the plant, grain, and cob, but not that of the stover. It also affected the percentage of DM partitioned to the grain (harvest index), which increased linearly with ETc and averaged 56.2% over the two seasons, but did not affect the percentage allocated to the cob or stover. Irrigation applied in July had the highest positive coefficient of determination (R2) with yield. This high positive correlation decreased considerably for irrigation applied in August, and became negative for irrigation applied in September. The best positive correlation between the soil water deficit factor (Ks) and yield occurred during weeks 12-14 from crop emergence, during the "milk" and "dough" growth stages. Yield was poorly correlated with stress during weeks 15 and 16, and the correlation became negative after week 17. Dividing the 150 mm allocation about evenly among July, August and September was a good strategy, resulting in the highest yields in 2005, but not in 2006. Applying a larger proportion of the allocation in July was a good strategy during both years, whereas applying a large proportion of the allocation in September was not. The different results obtained between years indicate that flexible irrigation scheduling techniques should be adopted, rather than relying on fixed timing strategies.
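To illustrate the water-productivity terms defined in this abstract (WUE = yield/ETc, IWUE = yield/irrigation) and a yield response factor in its usual FAO-33 form, (1 - Ya/Yp) = ky (1 - ETa/ETp), the sketch below evaluates them for hypothetical numbers; none of the values or function names are from the study.

```python
# Minimal sketch of the water-productivity terms named in the abstract.
# All numbers are hypothetical, for illustration only.

def wue(yield_kg_ha, etc_mm):
    return yield_kg_ha / etc_mm          # kg ha-1 per mm of crop ET

def iwue(yield_kg_ha, irrigation_mm):
    return yield_kg_ha / irrigation_mm   # kg ha-1 per mm of irrigation

def ky(ya, yp, eta, etp):
    """Yield response factor: relative yield loss per unit relative ET deficit (FAO-33 form)."""
    return (1 - ya / yp) / (1 - eta / etp)

if __name__ == "__main__":
    print(f"WUE  = {wue(11000, 600):.1f} kg/ha per mm")
    print(f"IWUE = {iwue(11000, 150):.1f} kg/ha per mm")
    print(f"ky   = {ky(ya=11000, yp=13000, eta=600, etp=680):.2f}")
```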
Abstract:
This paper reports a field study undertaken to determine whether foliar applications of the herbicides fluroxypyr (150 mL 100 L-1 a.i.) and metsulfuron-methyl (12 g 100 L-1 a.i.) were capable of reducing the germination and viability of Chromolaena odorata (L.) R.M.King & H.Rob. (Siam weed) seeds at three different stages of maturity. After foliar application of fluroxypyr, germination of mature seeds was reduced by 88%, and that of intermediate and immature seeds by 100%, compared to the control. Fluroxypyr also reduced the viability of mature, intermediate and immature seeds by 79, 89 and 67% respectively, compared to the control. Metsulfuron-methyl reduced germination of intermediate and immature seeds by 53 and 99% respectively, compared to the control. Viability was also reduced by 74 and 96% respectively, compared to the control. Mature seeds were not affected by metsulfuron-methyl, as germination and viability increased by 2% and 1% respectively compared to the control. These results show that these herbicides are capable of reducing the amount of viable seed entering the seed bank. However, depending on the treatment and the stage of seed development, a percentage of seeds on the plants will remain viable and contribute to the seed bank. This information is of value to Siam weed eradication teams, as plants are most easily located, and subsequently treated, at the time of flowering. Knowledge of the impact of control methods on seeds at various stages of development will help determine the most suitable chemical control option for a given situation.
Abstract:
Alternaria leaf blotch and fruit spot of apple, caused by Alternaria spp., cause annual losses to the Australian apple industry. Erratic control using protectant fungicides is often experienced and may be due to a lack of understanding of the timing of infection and the epidemiology of the diseases. We found that Alternaria leaf blotch infection began about 20 days after bloom (DAB) and the highest disease incidence occurred from 70 to 110 DAB. Alternaria fruit spot infection occurred about 100 DAB in the orchard. Fruit inoculations in planta showed that there was no specific susceptible stage of fruit. Leaves and fruit in the lower canopy of trees showed higher levels of leaf blotch and fruit spot incidence than those in the upper canopy, and the incidence of leaf blotch in shoot leaves was higher than in spur leaves. Temperature, relative humidity, and rainfall affected leaf blotch and fruit spot incidence. This knowledge of the timing of infection and disease development may aid in the development of more effective disease management strategies.
Abstract:
Piggery pond sludge (PPS) was applied, as-collected (Wet PPS) and following stockpiling for 12 months (Stockpiled PPS), to a sandy Sodosol and a clay Vertosol at sites on the Darling Downs of Queensland. Laboratory measures of N availability were carried out on unamended and PPS-amended soils to investigate their value in estimating supplementary N needs of crops in Australia's northern grains region. Cumulative net N mineralised from the long-term (30 weeks) leached aerobic incubation was described by a first-order single exponential model. The mineralisation rate constant (0.057/week) was not significantly different between Control and PPS treatments or across soil types, when the amounts of initial mineral N applied in PPS treatments were excluded. Potentially mineralisable N (No) was significantly increased by the application of Wet PPS, and increased with increasing rate of application. Application of Wet PPS significantly increased the total amount of inorganic N leached compared with the Control treatments. Mineral N applied in Wet PPS contributed as much to the total mineral N status of the soil as did that which mineralised over time from organic N. Rates of CO2 evolution during 30 weeks of aerobic leached incubation indicated that the Stockpiled PPS was more stabilised (19-28% of applied organic C mineralised) than the Wet PPS (35-58% of applied organic C mineralised), due to a higher lignin content in the former. Net nitrate-N produced following 12 weeks of aerobic non-leached incubation was highly correlated with net nitrate-N leached during 12 weeks of aerobic incubation (R2 = 0.96), although it was <60% of the latter in both sandy and clayey soils. Anaerobically mineralisable N determined by waterlogged incubation of laboratory PPS-amended soil samples increased with increasing application rate of Wet PPS. Anaerobically mineralisable N from field-moist soil was well correlated with net N mineralised during 30 weeks of aerobic leached incubation (R2 = 0.90, sandy soil; R2 = 0.93, clay soil). In the clay soil, the amount of mineral N produced from all the laboratory incubations was significantly correlated with field-measured nitrate-N in the soil profile (0-1.5 m depth) after 9 months of weed-free fallow following PPS application. In contrast, only anaerobically mineralisable N was significantly correlated with field nitrate-N in the sandy soil. Anaerobic incubation would, therefore, be suitable as a rapid practical test to estimate potentially mineralisable N following applications of different PPS materials in the field.
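To illustrate the first-order single exponential model referred to in this abstract, the sketch below evaluates cumulative net N mineralised as Nt = No(1 - exp(-k t)), using the reported rate constant k = 0.057/week; the value chosen for potentially mineralisable N (No) is hypothetical.

```python
# Minimal sketch of the first-order single exponential mineralisation model,
# Nt = No * (1 - exp(-k * t)), with k = 0.057 per week from the abstract.
# The No value below is hypothetical, not a result from the study.

import math

def n_mineralised(n0_mg_kg, k_per_week, t_weeks):
    """Cumulative net N mineralised after t_weeks (same units as n0_mg_kg)."""
    return n0_mg_kg * (1 - math.exp(-k_per_week * t_weeks))

if __name__ == "__main__":
    k = 0.057      # per week (reported rate constant)
    n0 = 120.0     # mg N/kg soil, hypothetical potentially mineralisable N
    for t in (6, 12, 30):
        print(f"week {t:2d}: {n_mineralised(n0, k, t):6.1f} mg N/kg")
```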
Abstract:
Phosphonate fungicides are used widely in the control of diseases caused by Phytophthora cinnamomi Rands. For the most part phosphonate is seen as safe to use on crops, with phytotoxicity rare. However, recent research has shown that phosphonate has detrimental effects on the floral biology of some indigenous Australian plants. Since phosphonate fungicides are regularly used for the control of Phytophthora root rot in avocados, research was carried out to study the translocation of phosphonate fungicide in 'Hass' trees and any effects on their floral biology. Field-grown trees were sprayed with 0, 0.06 or 0.12 M mono-dipotassium phosphonate (pH 7.2) at summer flush maturity, floral bud break or anthesis. Following treatment, phosphonic acid concentrations were determined in leaves, roots, inflorescence rachises and flowers, and in vitro pollen germination and pollen tube growth were studied. Phosphonic acid concentration in the roots and floral parts was related to their sink strength at the respective times of application, with the concentration in roots highest (36.9 mg g-1) after treatment at summer flush maturity and in flowers (234.7 mg g-1) after treatment during early anthesis. Phosphonate at >0.03 M was found to be significantly phytotoxic to in vitro pollen germination and pollen tube growth. However, this rate gave a concentration far in excess of that measured in plant tissues following standard commercial applications of mono-dipotassium phosphonate fungicide. There was a small effect on pollen germination and pollen tube growth when 0.06 and 0.12 M mono-dipotassium phosphonate was applied during early anthesis. However, under favourable pollination and fruit set conditions this is not expected to have a commercial impact on tree yield. There may, nevertheless, be detrimental commercial implications from phosphonate sprays at early anthesis if unfavourable climatic conditions for pollination and fruit set subsequently occur. A commercial implication from this study is that phosphonic acid root concentrations can be elevated and maintained with strategic foliar applications of phosphonate fungicide timed to coincide with peaks in root sink strength. These occur at the end of the spring and summer flushes, when shoot growth is relatively quiescent. Additional foliar applications may be advantageous under high disease-pressure situations but, where possible, should be timed to minimize overlap with other significant growth events in the tree, such as rapid inflorescence and fruit development and major vegetative flushing.
Abstract:
The Juvenile Hormone analogue s-methoprene is used to protect stored grain from pests such as the lesser grain borer, Rhyzopertha dominica (F.). The possibility that uneven application influences s-methoprene efficacy against this species was investigated in the laboratory. Adults of methoprene-susceptible strains were exposed for 14 days to wheat treated at doses of up to 0.6 mg kg-1, or to mixtures of treated and untreated wheat giving equivalent average doses. Adult mortality after exposure to treated wheat was negligible in all cases (3.3%) and there was no significant effect of either average dose or evenness of application. In contrast, the number of adult progeny depended on both the average dose and evenness of application. Average doses of 0.3 and 0.6 mg kg-1 reduced the number of live F1 adults by 99-100% relative to the untreated wheat and no effect of evenness of application was detected. At lower doses, however, efficacy tended to decrease with increasing unevenness of application. When adults from the parental generation were transferred to untreated wheat for another 14 days neither the average dose nor evenness of application in the wheat from which they came had any significant effect on reproduction of these adults. This study demonstrates that uneven application can reduce the efficacy of s-methoprene against R. dominica, but that this is unlikely to influence the performance of s-methoprene against susceptible populations at target doses likely to be used in practice (e.g. 0.6 mg kg-1 in Australia). However, the possibility that uneven application leads to underdosing and selects for resistance should be investigated.
Abstract:
Current understanding is that high planting density has the potential to suppress weeds and that crop-weed interactions can be exploited by adjusting fertilizer rates. We hypothesized that (a) high planting density can be used to suppress Rottboellia cochinchinensis growth and (b) rice competitiveness against this weed can be enhanced by increasing nitrogen (N) rates. We tested these hypotheses by growing R. cochinchinensis alone and in competition with four rice planting densities (0, 100, 200, and 400 plants m-2) at four N rates (0, 50, 100, and 150 kg ha-1). At 56 days after sowing (DAS), R. cochinchinensis plant height decreased by 27-50 %, tiller number by 55-76 %, leaf number by 68-84 %, leaf area by 70-83 %, leaf biomass by 26-90 %, and inflorescence biomass by 60-84 %, with rice densities ranging from 100 to 400 plants m-2. All these parameters increased with an increase in N rate. Without the addition of N, R. cochinchinensis plants were 174 % taller than rice, whereas with added N they were 233 % taller. Added N favored more weed biomass production relative to rice. R. cochinchinensis grew taller than rice (at all N rates) to avoid shade, which suggests that it is a "shade-avoiding" plant. R. cochinchinensis showed the ability to reduce the effect of rice interference through increased leaf weight ratio and specific stem length, and a decreased root-shoot weight ratio. This weed is more responsive to N fertilizer than rice. Therefore, farmers should give special consideration to the application timing of N fertilizer when more N-responsive weeds are present in their field. Results suggest that the growth and seed production of R. cochinchinensis can be decreased considerably by increasing rice density to 400 plants m-2. There is a need to integrate different weed control measures to achieve complete control of this noxious weed.
Abstract:
Tension banding castration of cattle is gaining favour because it is relatively simple to perform and is promoted by retailers of the banders as a humane castration method. Two experiments were conducted under tropical conditions using Bos indicus bulls, comparing tension banding (Band) and surgical (Surgical) castration of weaner (7–10 months old) and mature (22–25 months old) bulls, with and without pain management (an NSAID (ketoprofen) or saline injected intramuscularly immediately prior to castration). Welfare outcomes were assessed using a range of measures; this paper reports on some physiological, morbidity and productivity-related responses to augment the behavioural responses reported in an accompanying paper. Blood samples were taken on the day of castration (day 0) at the time of restraint (0 min) and at 30 min (weaners) or 40 min (mature bulls), 2 h, and 7 h; and on days 1, 2, 3, 7, 14, 21 and 28 post-castration. Plasmas from day 0 were assayed for cortisol, creatine kinase, total protein and packed cell volume. Plasmas from the other samples (plus the 0 min sample) were assayed for cortisol and haptoglobin. Liveweights were recorded approximately weekly to 6 weeks and at 2 and 3 months post-castration. Castration sites were checked at these same times to 2 months post-castration to score the extent of healing and presence of sepsis. Cortisol concentrations (mean ± s.e. nmol/L) were significantly (P < 0.05) higher in the Band (67 ± 4.5) compared with Surgical weaners (42 ± 4.5) at 2 h post-castration, but at 24 h post-castration were greater in the Surgical (43 ± 3.2) compared with the Band weaners (30 ± 3.2). The main effect of ketoprofen was on the cortisol concentrations of the mature Surgical bulls; concentrations were significantly reduced at 40 min (47 ± 7.2 vs. 71 ± 7.2 nmol/L for saline) and 2 h post-castration (24 ± 7.2 vs. 87 ± 7.2 nmol/L for saline). Ketoprofen, however, had no effect on the Band mature bulls, with their cortisol concentrations averaging 54 ± 5.1 nmol/L at 40 min and 92 ± 5.1 nmol/L at 2 h. Cortisol concentrations were also significantly elevated in the Band (83 ± 3.0 nmol/L) compared with Surgical mature bulls (57 ± 3.0 nmol/L) at weeks 2–4 post-castration. The timing of this elevation coincided with significantly elevated haptoglobin concentrations (mg/mL) in the Band bulls (2.97 ± 0.102 for mature bulls and 1.71 ± 0.025 for weaners, vs. 2.10 ± 0.102 and 1.45 ± 0.025 respectively for the Surgical treatment) and evidence of slow wound healing and sepsis in both the weaner (0.81 ± 0.089 not healed at week 4 for Band, 0.13 ± 0.078 for Surgical) and mature bulls (0.81 ± 0.090 at week 4 for Band, 0.38 ± 0.104 for Surgical). Overall, liveweight gains of both age groups were not affected by castration method. The findings of acute pain, chronic inflammation and possibly chronic pain in the mature bulls at least, together with poor wound healing in the Band bulls, support the behavioural findings reported in the accompanying paper and demonstrate that tension banding produces inferior welfare outcomes for weaner and mature bulls compared with surgical castration.
Abstract:
There is limited understanding about how insect movement patterns are influenced by landscape features, and how landscapes can be managed to suppress pest phytophage populations in crops. Theory suggests that the relative timing of pest and natural enemy arrival in crops may influence pest suppression. However, there is a lack of data to substantiate this claim. We investigate the movement patterns of insects from native vegetation (NV) and discuss the implications of these patterns for pest control services. Using bi-directional interception traps we quantified the number of insects crossing an NV/crop ecotone relative to a control crop/crop interface in two agricultural regions early in the growing season. We used these data to infer patterns of movement and net flux. At the community level, insect movement patterns were influenced by ecotone in two out of three year-by-region combinations. At the functional-group level, pests and parasitoids showed similar movement patterns from NV very soon after crop emergence. However, movement across the control interface increased towards the end of the early-season sampling period. Predators consistently moved more often from NV into crops than vice versa, even after crop emergence. Not all species showed a significant response to ecotone; however, when a response was detected, these species showed similar patterns between the two regions. Our results highlight the importance of NV for the recruitment of natural enemies for early-season crop immigration, which may be important for pest suppression. However, NV was also associated with crop immigration by some pest species. Hence, NV offers both opportunities and risks for pest management. The development of targeted NV management may reduce the risk of crop immigration by pests, but not by natural enemies.
Abstract:
BACKGROUND: Field studies of diuron and its metabolites 3-(3,4-dichlorophenyl)-1-methylurea (DCPMU), 3,4-dichlorophenylurea (DCPU) and 3,4-dichloroaniline (DCA) were conducted in a farm soil and in stream sediments in coastal Queensland, Australia. RESULTS: During a 38 week period after a 1.6 kg ha^-1 diuron application, 70-100% of detected compounds were within the top 0-15 cm of the farm soil, and 3-10% reached the 30-45 cm depth. The first-order degradation half-life (t1/2) averaged 49 ± 0.9 days for the 0-15, 0-30 and 0-45 cm soil depths. Farm runoff was collected in the first 13-50 min of episodes lasting 55-90 min. Average concentrations of diuron, DCPU and DCPMU in runoff were 93, 30 and 83-825 µg L^-1 respectively. Their total loading in all runoff was >0.6% of applied diuron. Diuron and DCPMU concentrations in stream sediments were 3-22 and 4-31 µg kg^-1 soil respectively. The DCPMU/diuron sediment ratio was >1. CONCLUSION: Retention of diuron and its metabolites in farm topsoil indicated their negligible potential for groundwater contamination. Minimal amounts of diuron and DCPMU escaped in farm runoff. This may nevertheless entail a significant loading into the wider environment at the amounts applied annually. The concentrations and ratio of diuron and DCPMU in stream sediments indicated that they had prolonged residence times and potential for accumulation in sediments. The higher ecotoxicity of DCPMU compared with diuron and the combined presence of both compounds in stream sediments suggest that together they would have a greater impact on sensitive aquatic species than as currently apportioned by assessments based upon diuron alone.
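To illustrate the first-order kinetics reported in this abstract, the sketch below converts the ~49 day half-life into a decay constant, k = ln 2 / t1/2, and estimates the fraction of applied diuron remaining over time; this is illustrative arithmetic only, not a reproduction of the study's data.

```python
# Minimal sketch relating a first-order half-life (t1/2 ~ 49 days) to the
# decay constant k = ln(2)/t1/2 and the residual fraction C(t)/C0 = exp(-k*t).

import math

def fraction_remaining(t_days, half_life_days):
    k = math.log(2) / half_life_days      # first-order decay constant, per day
    return math.exp(-k * t_days)

if __name__ == "__main__":
    t_half = 49.0                          # days, from the abstract
    for weeks in (4, 12, 38):
        frac = fraction_remaining(weeks * 7, t_half)
        print(f"after {weeks:2d} weeks: {frac * 100:5.1f}% of applied diuron remains")
```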
Abstract:
Many arthropod predators and parasitoids exhibit either stage-specific or lifetime omnivory, in that they include extra-floral nectar, floral nectar, honeydew or pollen in their immature and/or adult diet. Access to these plant-derived foods can enhance pest suppression by increasing both the individual fitness and local density of natural enemies. Commercial products such as Amino-Feed®, Envirofeast®, and Pred-Feed® can be applied to crops to act as artificial plant-derived foods. In laboratory and glasshouse experiments we examined the influence of the carbohydrate- and protein-rich Amino-Feed UV® or Amino-Feed, respectively, on the fitness of the predatory nabid bug Nabis kinbergii Reuter (Hemiptera: Nabidae) and the bollworm pupal parasitoid Ichneumon promissorius (Erichson) (Hymenoptera: Ichneumonidae). Under the chosen conditions, the provision of either wet or dry residues of Amino-Feed UV had no discernible effect on the immediate or longer-term survival and immature development times of N. kinbergii. In contrast, the provision of honey, Amino-Feed plus extrafloral nectar, and extrafloral nectar alone had a marked effect on the longevity of I. promissorius, indicating that they were limited by at least carbohydrates as an energy source, but probably not protein. Compared with a water-only diet, the provision of Amino-Feed plus extrafloral nectar increased the longevity of males and females of I. promissorius by 3.0- and 2.4-fold, respectively. Not only did female parasitoids live longer when provided food, but the total number of eggs laid and the timing of deposition were affected by diet under the chosen conditions. Notably, females in the water and honey treatments deposited greater numbers of eggs earlier in the trial, but this trend was not sustained over their lifetimes. Egg numbers in these treatments subsequently fell below the levels achieved by females in the Amino-Feed plus extrafloral nectar and cotton extrafloral nectar only treatments. Furthermore, there were times when the inclusion of Amino-Feed was beneficial compared with cotton extrafloral nectar alone. Artificial food supplements and plant-derived foods are worthy of further investigation because they have the potential to improve the ecosystem service of biological pest control in targeted agroecosystems by providing natural enemies with an alternative source of nutrition, particularly during periods of prey/host scarcity.
Abstract:
Mortality of calves born to provisioned mothers is identified in the literature as an issue of concern in dolphin provisioning programs. Wild dolphin provisioning at Tangalooma, Moreton Island, Australia, has been occurring since 1992. Each evening, up to eight dolphins are provided with fish in a regulated provisioning program. In this paper, calf survival at the Tangalooma provisioning program is reported and contrasted with that from wild populations and from a similar provisioning program at Monkey Mia, Western Australia. At Tangalooma, the calf survival rate is 100%, including both orphaned and first-born calves, both of which are expected to have relatively low survival rates. Possible explanations for the high calf survival rate are explored. These include site attributes such as isolated location and high water quality, aspects of foraging ecology likely to benefit calves of provisioned mothers, and the management regime used in the provisioning program (e.g., duration and timing of provisioning; quality of provisioned fish).