Abstract:
The genetics of heifer performance in tropical 'wet' and 'dry' seasons, and relationships with steer performance, were studied in Brahman (BRAH) and Tropical Composite (TCOMP) (50% Bos indicus, African Sanga or other tropically adapted Bos taurus; 50% non-tropically adapted Bos taurus) cattle of northern Australia. Data were from 2159 heifers (1027 BRAH, 1132 TCOMP), representing 54 BRAH and 51 TCOMP sires. Heifers were assessed after post-weaning 'wet' (ENDWET) and 'dry' (ENDDRY) seasons. Steers were assessed post-weaning, at feedlot entry, over a 70-day feed test, and after approximately 120-day finishing. Measures studied in both heifers and steers were liveweight (LWT), scanned rump fat, rib fat and M. longissimus area (SEMA), body condition score (CS), hip height (HH), serum insulin-like growth factor-I concentration (IGF-I), and average daily gains (ADG). Additional steer measures were scanned intra-muscular fat%, flight time, and daily (DFI) and residual feed intake (RFI). Uni- and bivariate analyses were conducted for combined genotypes and for individual genotypes. Genotype means were predicted for a subset of data involving 34 BRAH and 26 TCOMP sires. A meta-analysis of genetic correlation estimates examined how these were related to the difference between measurement environments for specific traits. There were genotype differences at the level of means, variances and genetic correlations. BRAH heifers were significantly (P < 0.05) faster-growing in the 'wet' season, slower-growing in the 'dry' season, lighter at ENDDRY, and taller and fatter with greater CS and IGF-I at both ENDWET and ENDDRY. Heritabilities were generally in the 20 to 60% range for both genotypes. Phenotypic and genetic variances, and genetic correlations, were commonly lower for BRAH. Differences were often explained by the long period of tropical adaptation of B. indicus. Genetic correlations were high between corresponding measures at ENDWET and ENDDRY, positive between fat and muscle measures in TCOMP but negative in BRAH (mean of 13 estimates 0.50 and -0.19, respectively), and approximately zero between steer feedlot ADG and heifer ADG in BRAH. Numerous genetic correlations between heifers and steers differed substantially from unity, especially in BRAH, suggesting there may be scope to select differently in the sexes where that would aid the differing roles of heifers and steers in production. Genetic correlations declined as measurement environments became more different, the rates of decline (environment sensitivity) sometimes differing with genotype. Similar measures (LWT, HH and ADG; IGF-I at ENDWET in TCOMP) were genetically correlated with steer DFI in heifers as in steers. Heifer SEMA was genetically correlated with steer feedlot RFI in BRAH (0.75 ± 0.27 at ENDWET, 0.66 ± 0.24 at ENDDRY). Selection to reduce steer RFI would reduce SEMA in BRAH heifers but otherwise have only small effects on heifers before their first joining.
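For orientation, the two quantities this abstract turns on have standard quantitative-genetics definitions (textbook forms, not the study's specific mixed-model equations):

```latex
% Heritability: the fraction of phenotypic variance that is additive genetic.
h^2 = \frac{\sigma^2_A}{\sigma^2_P}
% Genetic correlation between traits x and y (e.g. a heifer measure and a
% steer measure): the correlation of their additive genetic effects.
r_g = \frac{\operatorname{Cov}_A(x,y)}{\sigma_{A_x}\,\sigma_{A_y}}
```

On this scale, heritabilities of 20 to 60% mean that 20-60% of phenotypic variance was additive genetic, and a heifer-steer genetic correlation well below unity indicates partly different genetic control in the two sexes, which is what creates the scope for sex-specific selection noted above.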
Abstract:
It has been reported that high-density planting of sugarcane can improve cane and sugar yield by promoting rapid canopy closure and increasing radiation interception earlier in crop growth. It is widely known that control of adverse soil biota through fumigation (which removes soil biological constraints and improves soil health) can improve cane and sugar yield. Whether the responses to high-density planting and improved soil health are additive or interactive has important implications for the sugarcane production system. Field experiments established at Bundaberg and Mackay, Queensland, Australia, involved all combinations of two row spacings (0.5 and 1.5 m), two planting densities (27 000 and 81 000 two-eyed setts/ha), and two soil fumigation treatments (fumigated and non-fumigated). The Bundaberg experiment had two cultivars (Q124, Q155), was fully irrigated, and was harvested 15 months after planting. The Mackay experiment had one cultivar (Q117), was grown under rainfed conditions, and was harvested 10 months after planting. High-density planting (81 000 setts/ha in 0.5-m rows) did not produce any more cane or sugar yield at harvest than low-density planting (27 000 setts/ha in 1.5-m rows), regardless of location, crop duration (15 v. 10 months), water supply (irrigated v. rainfed), or soil health (fumigated v. non-fumigated). Conversely, soil fumigation generally increased cane and sugar yields regardless of site, row spacing, and planting density. In the Bundaberg experiment there was a large fumigation × cultivar × density interaction (P < 0.01). Cultivar Q155 responded positively to higher planting density in non-fumigated soil but not in fumigated soil, while Q124 showed a negative response to higher planting density in non-fumigated soil but no response in fumigated soil. In the Mackay experiment, Q117 showed a non-significant trend of increasing yield in response to increasing planting density in non-fumigated soil, similar to the Q155 response in non-fumigated soil at Bundaberg. The similarity in yield across the range of row spacings and planting densities within experiments was largely due to compensation between stalk number and stalk weight, particularly when fumigation was used to address soil health. Further, the different cultivars (Q124 and Q155 at Bundaberg, Q117 at Mackay) exhibited differing physiological responses to the fumigation, row spacing, and planting density treatments, including the rate of tiller initiation and subsequent loss, changes in stalk weight, and propensity to lodging. These responses suggest that there may be potential for selecting cultivars suited to different planting configurations.
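A point worth checking by arithmetic (not a claim made in the abstract): the two planting configurations differ in area density but imply the same within-row sett density, so the density contrast was achieved entirely through row spacing. A minimal sketch, using only the figures quoted above:

```python
# Within-row sett density implied by the two planting configurations
# (27 000 setts/ha in 1.5-m rows vs 81 000 setts/ha in 0.5-m rows).

def setts_per_row_metre(setts_per_ha: float, row_spacing_m: float) -> float:
    """Convert an area-based planting density to setts per metre of row."""
    row_metres_per_ha = 10_000 / row_spacing_m  # 1 ha = 10 000 m^2
    return setts_per_ha / row_metres_per_ha

print(setts_per_row_metre(81_000, 0.5))  # high density: 4.05 setts/row-m
print(setts_per_row_metre(27_000, 1.5))  # low density:  4.05 setts/row-m
```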
Abstract:
Numerous tests have been used to measure beef cattle temperament, but limited research has addressed the relationships between such tests and whether temperament can be modified. One hundred and forty-four steers were given one of three human handling and yarding experiences on six occasions during a 12-month grazing period post-weaning (backgrounding): Good handling/yarding, Poor handling/yarding and Minimal handling/yarding. At the end of this phase the cattle were lot-fed for 78 days, with no handling/yarding treatments imposed, before being transported for commercial slaughter. Temperament was assessed at the start of the experiment and during backgrounding and lot-feeding by flight speed (FS) and a fear-of-humans test, which measured the proximity to a stimulus person (zone average; ZA), the closest approach to the person (CA) and the amount the cattle moved around the test arena (total transitions; TT). During backgrounding, FS decreased for all treatments, and at the end of backgrounding there was no difference between them. The rate of decline, however, was greatest in the Good group and smallest in the Minimal group, with the Poor group intermediate. In contrast, ZA was affected by treatment, with a greater reduction for the Good group than the others (P = 0.012). During lot-feeding, treatment did not affect FS, but all groups showed a decrease in ZA, with the greatest change in the Poor group and the least in the Good group, with the Minimal group intermediate (P = 0.052). CA was positively correlated with ZA (r = 0.18 to 0.66) and negatively with TT (r = -0.18 to -0.66). FS was consistently correlated with TT only (r = 0.17 to 0.49). These findings suggest that FS and TT measure a similar characteristic, as do ZA and CA, but that these characteristics differ from one another, indicating that temperament is not a unitary trait but has different facets. FS and TT measure one facet that we suggest is general agitation, whilst ZA and CA measure fear of people. Thus, the cattle became less agitated during backgrounding, but the effect was not permanently influenced by the quantity and quality of handling/yarding. However, Good handling/yarding reduced fearfulness of people. Fear of people was also reduced during lot-feeding, probably as a consequence of frequent exposure to humans in a situation that was neutral or positive for the cattle.
Abstract:
Avian haemophili demonstrating in vitro satellitic growth, also referred to as the V-factor or NAD requirement, have mainly been classified with Avibacterium paragallinarum (Haemophilus paragallinarum), Avibacterium avium (Pasteurella avium), Avibacterium volantium (Pasteurella volantium) and Avibacterium sp. A (Pasteurella species A). The aim of the present study was to assess the taxonomic position of 18 V-factor-requiring isolates of unclassified Haemophilus-like organisms isolated from galliform, anseriform, columbiform and gruiform birds as well as kestrels and psittacine birds including budgerigars, by conventional phenotypic tests and 16S rRNA gene sequencing. All isolates shared phenotypic characteristics which allowed classification with Pasteurellaceae. Haemolysis of bovine red blood cells was negative. Haemin (X-factor) was not required for growth. Maximum-likelihood phylogenetic analysis including bootstrap analysis showed that six isolates were related to the avian 16S rRNA group and were classified as Avibacterium according to 16S rRNA sequence analysis. Surprisingly, the other 12 isolates were unrelated to Avibacterium. Two isolates were unrelated to any of the known 16S rRNA groups of Pasteurellaceae. Two isolates were related to Volucribacter of the avian 16S rRNA group. Seven isolates belonged to the Testudinis 16S rRNA group; of these, two were closely related to taxa 14 and 32 of Bisgaard, four were found to form a genus-like group distantly related to taxon 40, and one isolate remained distantly related to other members of the Testudinis group. One isolate was closely related to taxon 26 (a member of Actinobacillus sensu stricto). The study documented major genetic diversity among V-factor-requiring avian isolates beyond the traditional interpretation that they belong only to Avibacterium, underlining the limited value of satellitic growth for identification of avian members of Pasteurellaceae. Our study also emphasized that these organisms will never be isolated without the use of special media satisfying the V-factor requirement.
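For readers unfamiliar with the bootstrap step in such analyses, the sketch below shows the core of nonparametric bootstrapping of an alignment: columns are resampled with replacement and a tree is inferred from each pseudo-replicate. This is illustrative only; the study's software and settings are not stated beyond maximum likelihood, and infer_ml_tree is a hypothetical placeholder, not a real API.

```python
import random

def bootstrap_alignment(alignment: dict[str, str], seed: int = 0) -> dict[str, str]:
    """Resample alignment columns with replacement (one bootstrap pseudo-replicate)."""
    rng = random.Random(seed)
    length = len(next(iter(alignment.values())))
    cols = [rng.randrange(length) for _ in range(length)]  # columns, with replacement
    return {name: "".join(seq[i] for i in cols) for name, seq in alignment.items()}

# Usage sketch: infer a tree from each of, say, 100 pseudo-replicates with an
# ML tool of choice; the fraction of trees recovering a grouping is its
# bootstrap support.
# trees = [infer_ml_tree(bootstrap_alignment(aln, seed=s)) for s in range(100)]
```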
Abstract:
The response of soybean (Glycine max) and dry bean (Phaseolus vulgaris) to feeding by Helicoverpa armigera during the pod-fill stage was studied in irrigated field cages over three seasons to determine the relationship between larval density and yield loss, and to develop economic injury levels. H. armigera intensity was calculated in Helicoverpa injury equivalent (HIE) units, where 1 HIE was the consumption of one larva from the start of the infestation period to pupation. In the dry bean experiment, yield loss occurred at a rate of 6.00 ± 1.29 g/HIE, while the rates of loss in the three soybean experiments were 4.39 ± 0.96 g/HIE, 3.70 ± 1.21 g/HIE and 2.12 ± 0.71 g/HIE. These three slopes were not statistically different (P > 0.05) and the pooled estimate of the rate of yield loss was 3.21 ± 0.55 g/HIE. The first soybean experiment also showed a split-line form of damage curve, with a rate of yield loss of 26.27 ± 2.92 g/HIE beyond 8.0 HIE and a rapid decline to zero yield. In dry bean, H. armigera feeding reduced total and undamaged pod numbers by 4.10 ± 1.18 pods/HIE and 12.88 ± 1.57 pods/HIE respectively, while undamaged seed numbers were reduced by 35.64 ± 7.25 seeds/HIE. In soybean, total pod numbers were not affected by H. armigera infestation (up to 8.23 HIE in Experiment 1) but seed numbers (in Experiments 1 and 2) and the number of seeds/pod (in all experiments) were adversely affected. Seed size increased with increases in H. armigera density in two of the three soybean experiments, indicating plant compensatory responses to H. armigera feeding. Analysis of canopy pod profiles indicated that loss of pods occurred from the top of the plant downwards, but with an increase in pod numbers close to the ground at higher pest densities as the plant attempted to compensate for damage. Based on these results, the economic injury levels for H. armigera on dry bean and soybean are approximately 0.74 and 2.31 HIE/m², respectively (0.67 and 2.1 HIE/row-m for 91 cm rows).
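The per-area and per-row-metre injury levels quoted above are consistent with a simple unit conversion: one metre of a 91 cm row occupies 0.91 m². A minimal check using only the abstract's figures:

```python
# HIE/m^2 = (HIE/row-m) / row spacing in m, for 91 cm rows.
ROW_SPACING_M = 0.91

def per_m2(hie_per_row_m: float) -> float:
    return hie_per_row_m / ROW_SPACING_M

print(round(per_m2(0.67), 2))  # dry bean: 0.74 HIE/m^2
print(round(per_m2(2.10), 2))  # soybean:  2.31 HIE/m^2
```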
Abstract:
The response of vegetative soybean (Glycine max) to Helicoverpa armigera feeding was studied in irrigated field cages over three years in eastern Australia to determine the relationship between larval density and yield loss, and to develop economic injury levels. Rather than using artificial defoliation techniques, plants were infested with either eggs or larvae of H. armigera, and larvae were allowed to feed until death or pupation. Larvae were counted and sized regularly, and infestation intensity was calculated in Helicoverpa injury equivalent (HIE) units, where 1 HIE was the consumption of one larva from the start of the infestation period to pupation. In the two experiments where yield loss occurred, the upper threshold for zero yield loss was 7.51 ± 0.21 HIEs and 6.43 ± 1.08 HIEs, respectively. In the third experiment, infestation intensity was lower and no loss of seed yield was detected up to 7.0 HIEs. The rate of yield loss/HIE beyond the zero-yield-loss threshold varied between Experiments 1 and 2 (-9.44 ± 0.80 g and -23.17 ± 3.18 g, respectively). H. armigera infestation also affected plant height and various yield components (including pod and seed numbers and seeds/pod) but did not affect seed size in any experiment. Leaf area loss averaged 841 and 1025 cm²/larva in the two experiments, compared with 214 and 302 cm²/larva for cohort larvae feeding on detached leaves at the same time, making clear that artificial defoliation techniques are unsuitable for determining H. armigera economic injury levels on vegetative soybean. Analysis of canopy leaf area and pod profiles indicated that leaf and pod loss occurred from the top of the plant downwards. However, there was an increase in pod numbers closer to the ground at higher pest densities as the plant attempted to compensate for damage. Defoliation at the damage threshold was 18.6 and 28.0% in Experiments 1 and 2, indicating that yield loss from H. armigera feeding occurred at much lower levels of defoliation than previously indicated by artificial defoliation studies. Based on these results, the economic injury level for H. armigera on vegetative soybean is approximately 7.3 HIEs/row-metre in 91 cm rows or 8.0 HIEs/m².
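The damage relationship described above, no yield loss up to a threshold and then a linear decline per HIE, is a split-line (broken-stick) model. A minimal sketch using Experiment 1's threshold and slope; the undamaged yield Y0 is a made-up placeholder, since the abstract does not report it:

```python
THRESHOLD_HIE = 7.51   # zero-loss threshold, HIEs (Experiment 1)
LOSS_PER_HIE = 9.44    # g of yield lost per HIE beyond the threshold
Y0 = 500.0             # hypothetical undamaged yield (g); placeholder only

def predicted_yield(hie: float) -> float:
    """Yield (g) under the split-line damage model."""
    return Y0 - max(0.0, hie - THRESHOLD_HIE) * LOSS_PER_HIE

print(predicted_yield(5.0))   # below threshold: 500.0 (no loss)
print(predicted_yield(10.0))  # above threshold: ~476.5
```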
Abstract:
Weed biocontrol relies on host-specificity testing, usually carried out under quarantine conditions, to predict the future host range of candidate control agents. The predictive power of host testing can be scrutinised directly with Aconophora compressa, previously released against the weed Lantana camara L. (lantana), because its ecology in its new range (Australia) is known and includes the unanticipated use of several host species. Glasshouse-based predictions of field host use from experiments designed a posteriori can therefore be compared against known field host use. Adult survival, reproductive output and egg maturation were quantified. Adult survival did not differ statistically across the four verbenaceous hosts used in Australia. Oviposition was significantly highest on fiddlewood (Citharexylum spinosum L.), followed by lantana, on which oviposition was significantly higher than on two varieties of Duranta erecta ("geisha girl" and "Sheena's gold"; all Verbenaceae). Oviposition rates across the Duranta varieties were not significantly different from each other but were significantly higher than on the two non-verbenaceous hosts (Jacaranda mimosifolia D. Don: Bignoniaceae (jacaranda) and Myoporum acuminatum R. Br.: Myoporaceae (Myoporum)). Production of adult A. compressa was modelled across the hosts tested. The only major discrepancy between model output and the agent's relative abundance across hosts in the field was that densities on lantana in the field were much lower than predicted by the model. The adults may, therefore, not locate lantana under field conditions, and/or adults may find lantana but leave after laying relatively few eggs. Fiddlewood is the only primary host plant of A. compressa in Australia, whereas lantana and the others are used secondarily or incidentally. The distinction between primary, secondary and incidental hosts of a herbivore species helps to predict the intensity and regularity of host use by that herbivore. Populations of the primary host plants of a released biological control agent are most likely to be consistently impacted by the herbivore, whereas secondary and incidental host plant species are unlikely to be impacted consistently. As a consequence, potential biocontrol agents should be released only against hosts to which they have been shown to be primarily adapted.
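The abstract does not give the form of the adult-production model, so the sketch below is only one plausible minimal structure (host-specific survival multiplied by fecundity and egg-to-adult success), and every number in it is a hypothetical placeholder:

```python
def adults_produced(adult_survival: float,
                    eggs_per_female: float,
                    egg_to_adult: float) -> float:
    """Expected adult offspring per female on a given host (illustrative form)."""
    return adult_survival * eggs_per_female * egg_to_adult

hosts = {  # all values hypothetical, for illustration only
    "fiddlewood": (0.9, 400, 0.5),
    "lantana":    (0.9, 150, 0.4),
}
for host, params in hosts.items():
    print(host, adults_produced(*params))
```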
Abstract:
Three experiments were conducted on the use of water-retaining amendments under newly laid turf mats. The work focused on the first 12 weeks of establishment. In soils that already possessed a good water-holding capacity, water-retaining amendments did not provide any benefit. On a sand-based profile, a rooting depth of 200 mm was achieved with soil amendment products within three weeks of laying turf. The relative performance of the products differed at each three-weekly measurement interval. Polyacrylamide gels gave superior results when the crystals were incorporated into the soil profile; they were not suitable for broadcasting at the soil/sod interface. Finer grades of crystals were less likely to undergo excessive expansion after heavy rainfall than medium-grade crystals. Turf establishment was more responsive to products at higher application rates; however, these higher rates may cause surface stability problems.
Abstract:
Soil water repellency occurs widely in horticultural and agricultural soils when they are very dry. The gradual accumulation and breakdown of surface organic matter over time produces wax-like organic acids, which coat soil particles and prevent uniform entry of water into the soil. Water repellency is usually managed by regular surfactant applications. Surfactants, literally, are surface active agents (SURFace ACTive AgeNTS). Their mode of action is to reduce the surface tension of water, allowing it to penetrate and wet the soil more easily and completely. This practice improves water use efficiency (by requiring less water to wet the soil and by capturing rainfall and irrigation more effectively and rapidly). It also reduces nutrient losses through runoff, erosion or leaching; such nutrients have the potential to pollute the surrounding environment and water courses. This project investigated potential improvements to standard practices (product combination and scheduling) for surfactant use to overcome localised dry spots on water repellent soils and thus improve turf quality and water use efficiency. Weather conditions for the duration of the trial prevented the identification of improved practices in terms of combination and scheduling. However, the findings support previous research showing that soil surfactants decreased the time for water to infiltrate dry soil samples taken from a previously severely hydrophobic site. Data will continue to be collected from this trial site on a private contractual basis, with the hope that improvements to standard practices will be observed during the drier winter months when moisture availability is a limiting factor for turfgrass growth and quality.
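The infiltration-time measurements mentioned above are typically rated with the water droplet penetration time (WDPT) test. A small sketch using the class boundaries in common use in the water-repellency literature (standard values, not figures from this project):

```python
def wdpt_class(seconds: float) -> str:
    """Classify soil water repellency from water droplet penetration time."""
    if seconds < 5:
        return "wettable"
    if seconds <= 60:
        return "slightly water repellent"
    if seconds <= 600:
        return "strongly water repellent"
    if seconds <= 3600:
        return "severely water repellent"
    return "extremely water repellent"

print(wdpt_class(3))     # wettable
print(wdpt_class(1200))  # severely water repellent
```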
Abstract:
Rooted cutting propagation is widely used in conjunction with clonal selection for maximising tree yield, quality and uniformity. Some eucalypt species are deployed as rooted cuttings but many are considered too difficult to root. This study examined the effects of the auxin indole-3-butyric acid (IBA) on photoinhibition, root formation, mortality, and root and shoot development in cuttings of Corymbia torelliana, C. citriodora and their hybrids. IBA had little or no effect on photoinhibition but it had strong, dose-dependent effects on root formation and mortality. IBA frequently increased the number of primary roots per rooted cutting but it did not increase total root weight, length, surface area or volume, possibly because the highest dose (8 g IBA/kg powder) caused leaf abscission and sometimes reduced leaf area (by 55-79%) or shoot dry weight (by 40-58%). An intermediate dose (3 g IBA/kg powder) most consistently improved root formation with little or no effect on mortality or shoot development. Across the F1 hybrid families this treatment increased the number of rooted cuttings by 72-121% and more than doubled the number of primary roots per rooted cutting (from 1.1-1.7 roots to 3.5-4.1 roots). This simple treatment will facilitate commercial multiplication of superior individuals or selected families of C. torelliana × C. citriodora through a vegetative propagation system.
Abstract:
Spotted gum dominant forests occur from Cooktown in northern Queensland (Qld) to Orbost in Victoria (Boland et al. 2006). These forests are commercially very important: spotted gum is the most commonly harvested hardwood timber in Qld and one of the most important in New South Wales (NSW). Spotted gum has a wide range of end uses, from solid wood products through to power transmission poles, and generally has excellent sawing and timber qualities (Hopewell 2004). The private native forest resource in southern Qld and northern NSW is a critical component of the hardwood timber industry (Anon 2005, Timber Qld 2006), and currently half or more of the native forest timber resource harvested in northern NSW and Qld is sourced from private land. However, in many cases productivity on private lands is well below what could be achieved with appropriate silvicultural management. This project provides silvicultural management tools to assist extension staff, land owners and managers in the south-east Qld and north-east NSW regions, with the intent of improving the productivity of the private estate through implementation of appropriate management. The project also set out to implement a number of silvicultural experiments and demonstration sites to provide data on growth rates of managed and unmanaged forests, so that landholders can make informed decisions on the future management of their forests. To assist forest managers and improve the ability to predict forest productivity in the private resource, the project has developed:
• A set of spotted gum-specific silvicultural guidelines for timber production on private land that cover both silvicultural treatment and harvesting. The guidelines were developed for extension officers and property owners.
• A simple decision support tool, referred to as the spotted gum productivity assessment tool (SPAT), that allows estimation of:
1. Tree growth productivity on specific sites. Estimation is based on the analysis of site and growth data collected from a large number of yield and experimental plots on Crown land across a wide range of spotted gum forest types. Growth algorithms were developed using tree growth and site data, and the algorithms were used to formulate basic economic predictors.
2. Pasture development under a range of tree stockings, and the expected livestock carrying capacity at nominated tree stockings for a particular area.
3. Above-ground tree biomass and carbon stored in trees (a sketch of the usual allometric approach follows this summary).
• A series of experiments in spotted gum forests on private lands across the study area to quantify growth and to provide measures of the effect of silvicultural thinning and different agro-forestry regimes.
The adoption and use of these tools by farm forestry extension officers and private land holders, in both field operations and training exercises, will over time improve the commercial management of spotted gum forests for both timber and grazing. Future measurement of the experimental sites at ages five, 10 and 15 years will provide longer term data on the effects of various stocking rates and thinning regimes and facilitate modification and improvement of these silvicultural prescriptions.
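As flagged in item 3 above, decision tools of this kind usually estimate above-ground biomass from stem diameter with a power-law allometric equation and convert to carbon with a fixed fraction. The sketch below illustrates that general approach only; the coefficients and the 0.5 carbon fraction are generic placeholders, not SPAT's actual equations:

```python
import math

def agb_kg(dbh_cm: float, a: float = -2.0, b: float = 2.4) -> float:
    """Above-ground biomass (kg) via ln(AGB) = a + b*ln(DBH); a, b are placeholders."""
    return math.exp(a + b * math.log(dbh_cm))

def carbon_kg(dbh_cm: float, carbon_fraction: float = 0.5) -> float:
    """Carbon stored, assuming roughly half of dry biomass is carbon."""
    return carbon_fraction * agb_kg(dbh_cm)

print(round(agb_kg(30.0), 1), round(carbon_kg(30.0), 1))  # ~474.9 kg AGB, ~237.4 kg C
```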
Abstract:
This paper quantifies gaseous N losses due to ammonia volatilisation and denitrification under controlled conditions at 30°C and 75% to 150% of field capacity (FC). Biosolids were mixed with two contrasting soils from subtropical Australia at a rate designed to meet crop N requirements for irrigated cotton or maize (i.e., equivalent to 180 kg N ha⁻¹). In the first experiment, aerobically (AE) and anaerobically (AN) digested biosolids were mixed into a heavy Vertosol and then incubated for 105 days. Ammonia volatilisation over 72 days accounted for less than 4% of the applied NH₄-N, but 24% (AN) to 29% (AE) of the total applied biosolids' N was lost through denitrification in 105 days. In the second experiment, AN biosolids with and without added polyacrylamide polymer were mixed with either a heavy Vertosol or a lighter Red Ferrosol and then incubated for 98 days. N loss was higher from the Vertosol (16-29% of total N applied) than from the Red Ferrosol (7-10% of total N applied), and addition of polymer to the biosolids increased N loss from 7% to 10% in the Red Ferrosol and from 16% to 29% in the Vertosol. A major product of the denitrification process was N₂ gas, accounting for >90% of the emitted N gases in both experiments. Our findings demonstrate that denitrification could be a major pathway of gaseous N losses under warm and moist conditions.
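For scale, the denitrification percentages above translate into substantial absolute losses at the application rate used. The figures below come straight from the abstract; nothing else is assumed:

```python
APPLIED_N = 180.0  # kg N/ha, the biosolids application rate used

for label, fraction in [("AN biosolids", 0.24), ("AE biosolids", 0.29)]:
    print(f"{label}: {APPLIED_N * fraction:.0f} kg N/ha denitrified in 105 days")
# AN biosolids: 43 kg N/ha ...   AE biosolids: 52 kg N/ha ...
```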
Abstract:
Lantana camara is a recognised weed of worldwide significance due to its extensive distribution and its impacts on primary industries and nature conservation. However, quantitative data on the impact of the weed on soil ecosystem properties are scarce, especially in south-east Australia, despite the pervasive presence of the weed along its coastal and inland regions. Consequently, mineral soils for physicochemical analyses were collected beneath and away from L. camara infestations at four sites west of Brisbane, south-east Australia. These sites (a hoop pine plantation, a cattle farm, and two eucalyptus forests, one with occasional grazing and one with a fire regime) vary in landscape and land-use type. Significant site effects were observed more frequently than effects of invasion status. Nonetheless, after controlling for site differences, ~50% of the 23 soil traits examined differed significantly between infested and non-infested soils. Moisture, pH, Ca, total and organic C, and total N (but not exchangeable N in the form of NO₃⁻) were significantly elevated, while sodium, chloride, copper, iron, sulfur and manganese, many of which can be toxic to plant growth if present at excess levels, were present at lower levels in soils supporting L. camara compared with soils lacking the weed. These results indicate that L. camara can improve soil fertility and influence nutrient cycling, making the substratum ideal for its own growth, which might explain the ability of the weed to outcompete other species, especially native ones.
Abstract:
Runoff, soil loss, and nutrient loss were assessed on a Red Ferrosol in tropical Australia over 3 years. The experiment was conducted using bounded, 100-m² field plots cropped to peanuts, maize, or grass. A bare plot, without cover or crop, was also included as an extreme treatment. Results showed the importance of cover in reducing runoff, soil loss, and nutrient loss from these soils. Runoff ranged from 13% of incident rainfall under conventional cultivation to 29% under bare conditions during the highest rainfall year, and was well correlated with event rainfall and rainfall energy. Soil loss ranged from 30 t/ha.year under bare conditions to <6 t/ha.year under cropping. Nutrient losses of 35 kg N and 35 kg P/ha.year under bare conditions and 17 kg N and 11 kg P/ha.year under cropping were measured. Soil carbon analyses showed a relationship with treatment runoff, suggesting that soil properties influenced the rainfall-runoff response. The cropping systems model PERFECT was calibrated using runoff, soil loss, and soil water data. Runoff and soil loss showed good agreement with observed data in the calibration, and soil water and yield showed reasonable agreement. Long-term runs using historical weather data showed the episodic nature of runoff and soil loss events in this region and emphasise the need to manage land using protective measures such as conservation cropping practices. Farmers involved in related action-learning activities wished to incorporate conservation cropping findings into their systems but also needed clear production benefits to hasten practice change.
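The runoff fractions above act as simple runoff coefficients: runoff depth is the coefficient times incident rainfall. A minimal sketch; the 1000 mm rainfall figure is a hypothetical illustration, not a value from the study:

```python
def runoff_mm(coefficient: float, rainfall_mm: float = 1000.0) -> float:
    """Runoff depth (mm) = runoff coefficient x incident rainfall (mm)."""
    return coefficient * rainfall_mm

print(runoff_mm(0.13))  # conventional cultivation: 130 mm of a 1000 mm season
print(runoff_mm(0.29))  # bare plot:                290 mm
```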
Abstract:
The impact of three cropping histories (sugarcane, maize and soybean) and two tillage practices (conventional tillage and direct drill) on plant-parasitic and free-living nematodes in the following sugarcane crop was examined in a field trial at Bundaberg. Soybean reduced populations of lesion nematode (Pratylenchus zeae) and root-knot nematode (Meloidogyne javanica) in comparison with previous crops of sugarcane or maize, but increased populations of spiral nematode (Helicotylenchus dihystera) and maintained populations of dagger nematode (Xiphinema elongatum). However, the effect of soybean on P. zeae and M. javanica was no longer apparent 15 weeks after planting sugarcane, while later in the season, populations of these nematodes following soybean were as high as or higher than following maize or sugarcane. Populations of P. zeae were initially reduced by cultivation but, due to strong resurgence, tended to be higher in conventionally tilled than direct drill plots at the end of the plant crop. Even greater tillage effects were observed with M. javanica and X. elongatum, as populations of these nematodes were significantly higher in conventionally tilled than direct drill plots late in the season. Populations of free-living nematodes in the upper 10 cm of soil were initially highest following soybean, but after 15, 35 and 59 weeks were lower than after sugarcane and contained fewer omnivorous and predatory nematodes. Conventional tillage increased populations of free-living nematodes in soil in comparison with direct drill and was also detrimental to omnivorous and predatory nematodes. These results suggest that crop rotation and tillage not only affect plant-parasitic nematodes directly, but also have indirect effects by impacting the natural enemies that regulate nematode populations. More than 2 million nematodes/m² were often present in crop residues on the surface of direct drill plots. Bacterial-feeding nematodes predominated in residues early in the decomposition process but fungal-feeding nematodes predominated after 15 weeks, indicating that fungi become an increasingly important component of the detritus food web as decomposition proceeds, and that the rate of nutrient cycling decreases with time. Correlations between total numbers of free-living nematodes and mineral N concentrations in crop residues and surface soil suggested that the free-living nematode community may provide an indication of the rate of mineralisation of N from organic matter.