916 results for forage maize


Relevance: 10.00%

Abstract:

An experiment using herds of approximately 20 cows (farmlets) assessed the effects of high stocking rates on production and profitability of feeding systems based on dryland and irrigated perennial ryegrass-based pastures in a Mediterranean environment in South Australia over 4 years. A target level of milk production of 7000 L/cow.year was set, based on predicted intakes of 2.7 t DM/cow.year as concentrates, pasture intakes from 1.5 to 2.7 t/cow.year and purchased fodder. In years 1 and 2, up to 1.5 t DM/cow.year of purchased fodder was used and in years 3 and 4 the amounts were increased if necessary to enable levels of milk production per cow to be maintained at target levels. Cows in dryland farmlets calved in March to May inclusive and were stocked at 2.5, 2.9, 3.3, 3.6 and 4.1 cows/ha, while those in irrigated farmlets calved in August to October inclusive and were stocked at 4.1, 5.2, 6.3 and 7.4 cows/ha. In the first 2 years, when inputs of purchased fodder were limited, milk production per cow was reduced with higher stocking rates (P < 0.01), but in years 3 and 4 there were no differences. Mean production was 7149 kg/cow.year in years 1 and 2, and 8162 kg/cow.year in years 3 and 4. Production per hectare was very closely related to stocking rate in all years (P < 0.01), increasing from 18 to 34 t milk/ha.year for dryland farmlets (1300 to 2200 kg milk solids/ha) and from 30 to 60 t milk/ha.year for irrigated farmlets (2200 to 4100 kg milk solids/ha). Almost all of these increases were attributed to the increases in grain and purchased fodder inputs associated with the increases in stocking rate. Net pasture accumulation rates and pasture harvest were generally not altered with stocking rate, though as stocking rate increased there was a change to more of the pasture being grazed and less conserved in both dryland and irrigated farmlets. Total pasture harvest averaged approximately 8 and 14 t DM/ha.year for dryland and irrigated pastures, respectively. An exception was at the highest stocking rate under irrigation, where pugging during winter was associated with a 14% reduction in annual pasture growth. There were several indications that these high stocking rates may not be sustainable without substantial changes in management practice. There were large and positive nutrient balances and associated increases in soil mineral content (P < 0.01), especially for phosphorus and nitrate nitrogen, with both stocking rate and succeeding years. Levels under irrigation were considerably higher (up to 90 and 240 mg/kg of soil for nitrate nitrogen and phosphorus, respectively) than under dryland pastures (60 and 140 mg/kg, respectively). Soil organic carbon levels did not change with stocking rate, indicating a high level of utilisation of forage grown. Weed ingress was also high (to 22% DM) in all treatments and especially in heavily stocked irrigated pastures during winter. It was concluded the higher stocking rates used exceeded those that are feasible for Mediterranean pastures in this environment and upper levels of stocking are suggested to be 2.5 cows/ha for dryland pastures and 5.2 cows/ha for irrigated pastures. To sustain these suggested stocking rates will require further development of management practices to avoid large increases in soil minerals and weed invasion of pastures.
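
A quick arithmetic check of the reported per-hectare figures, assuming (as an illustration only, not the authors' stated method of calculation) that annual milk output per hectare is simply stocking rate multiplied by per-cow yield:

```python
# Rough check of the reported per-hectare production range for the dryland farmlets.
# Stocking rates and per-cow yields are taken from the abstract above.

def milk_t_per_ha(stocking_rate_cows_per_ha: float, yield_kg_per_cow: float) -> float:
    """Annual milk production in tonnes per hectare."""
    return stocking_rate_cows_per_ha * yield_kg_per_cow / 1000.0

print(milk_t_per_ha(2.5, 7149))  # ~17.9 t/ha.year, close to the reported lower bound of 18
print(milk_t_per_ha(4.1, 8162))  # ~33.5 t/ha.year, close to the reported upper bound of 34
```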

Relevance: 10.00%

Abstract:

Methane emissions from ruminant livestock represent a loss of carbon during feed conversion, which has implications for both animal productivity and the environment because this gas is considered to be one of the more potent greenhouse gases contributing to global warming. Many strategies to reduce emissions are targeting the methanogens that inhabit the rumen, but such an approach can only be successful if it targets all the major groups of ruminant methanogens. Therefore, a thorough knowledge of the diversity of these microbes in different breeds of cattle and sheep, as well as in response to different diets, is required. A study was undertaken using the molecular techniques denaturing gradient gel electrophoresis, DNA cloning and DNA sequence analysis to define the extent of diversity among methanogens in ruminants, particularly Bos indicus cross cattle, on differing forages in Queensland. It was found that the diversity of methanogens in forage-fed cattle in Queensland was greater than in grain-fed cattle, but there was little variability in methanogen community composition between cattle fed different forages. The species that dominate the rumen microbial communities of B. indicus cross cattle are from the genus Methanobrevibacter, although rumen fluid-inoculated digesters fed Leucaena leucocephala leaf were populated with Methanosphaera-like strains, with the Methanobrevibacter-like strains displaced. If ruminant methane emissions are to be reduced, antimethanogen bioactives that target both broad groups of ruminant methanogens are most likely to be needed, as part of an integrated suite of approaches that redirect rumen fermentation towards other, more useful end products.

Relevance: 10.00%

Abstract:

The authors studied the impact of a forage legume, butterfly pea, on rubber vine at the early establishment phase under seven planting combinations at three nitrogen fertiliser levels. In pure stands, both species increased their shoot and root dry weight yield in response to nitrogen, but rubber vine exhibited the greater response. In mixed stands, rubber vine and butterfly pea did not compete with each other at any nitrogen level. An over-yielding response occurred in all mixture combinations in terms of both shoot and root yields. Total shoot and root mass of mixed stands significantly out-yielded their highest-yielding pure stands by 8% and 27% respectively, suggesting that butterfly pea not only failed to reduce shoot and root growth of rubber vine, but actually improved its growth performance. Consequently, the introduction of butterfly pea to suppress rubber vine is not warranted.
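
The over-yielding comparison can be illustrated with a short sketch; the biomass values below are hypothetical, since the abstract reports only the percentage differences, not the underlying weights:

```python
# Hypothetical shoot biomass values (g per plot) illustrating an over-yielding comparison:
# the mixed stand is judged against the better of the two pure stands.
pure_rubber_vine = 120.0
pure_butterfly_pea = 95.0
mixed_stand = 130.0

best_pure = max(pure_rubber_vine, pure_butterfly_pea)
over_yield_pct = (mixed_stand - best_pure) / best_pure * 100.0
print(round(over_yield_pct, 1))  # 8.3, the same order as the 8% reported for shoot mass
```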

Relevance: 10.00%

Abstract:

For pasture growth in the semi-arid tropics of north-east Australia, where up to 80% of annual rainfall occurs between December and March, the timing and distribution of rainfall events is often more important than the total amount. In particular, the timing of the 'green break of the season' (GBOS) at the end of the dry season, when new pasture growth becomes available as forage and a live-weight gain is measured in cattle, affects several important management decisions that prevent overgrazing and pasture degradation. Currently, beef producers in the region use a GBOS rule based on rainfall (e.g. 40 mm of rain over three days by 1 December) to define the event and make their management decisions. A survey of 16 beef producers in north-east Queensland shows that three-quarters of respondents use a rainfall amount that occurs in only half or less than half of all years at their location. In addition, only half the producers expect the GBOS to occur within two weeks of the median date calculated by the CSIRO plant growth days model GRIM. This result suggests that in the producer rules, either the rainfall quantity or the period of time over which the rain is expected is unrealistic. Despite only 37% of beef producers indicating that they use a southern oscillation index (SOI) forecast in their decisions, cross-validated LEPS (linear error in probability space) analyses showed that both the average 3-month July-September SOI and the 2-month August-September SOI have significant forecast skill in predicting the probability of both the amount of wet season rainfall and the timing of the GBOS. The communication and implementation of a rigorous and realistic definition of the GBOS, and the likely impacts of anthropogenic climate change on the region, are discussed in the context of the sustainable management of northern Australian rangelands.
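
A minimal sketch of how a rainfall-based green-break rule of this kind might be applied to a daily rainfall record. It is illustrative only, neither the producers' rule as they implement it nor the GRIM model, and the dates and rainfall amounts are hypothetical:

```python
# Illustrative application of a rainfall-based green-break rule: find the first day,
# from a chosen start of the season, on which the trailing 3-day rainfall total reaches
# a threshold (e.g. 40 mm).
from datetime import date, timedelta

def green_break(daily_rain_mm: dict, start: date, threshold_mm: float = 40.0,
                window_days: int = 3):
    """Return the first date whose trailing `window_days` rainfall total meets the threshold."""
    for day in sorted(daily_rain_mm):
        if day < start:
            continue
        window_total = sum(daily_rain_mm.get(day - timedelta(days=n), 0.0)
                           for n in range(window_days))
        if window_total >= threshold_mm:
            return day
    return None  # no green break detected in the record

# Hypothetical daily rainfall record (mm) for one spring.
rain = {date(2023, 11, 20): 5.0, date(2023, 11, 27): 18.0,
        date(2023, 11, 28): 15.0, date(2023, 11, 29): 12.0}
print(green_break(rain, start=date(2023, 9, 1)))  # 2023-11-29 (45 mm over three days)
```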

Relevance: 10.00%

Abstract:

The project assembled basic information to allow effective management and manipulation of native pastures in the southern Maranoa region of Queensland. This involved a range of plant studies, including a grazing trial, to quantify the costs of poor pasture composition. While the results focus on perennial grasses, we recognise the important dietary role played by broad-leaved herbs. The plant manipulation studies focussed on ways to change the proportions of plants in a grazed pasture, e.g. by recruitment or accelerated mortality of existing plants. As most perennial grasses have a wide range of potential flowering times outside of mid-winter, rainfall exerts the major influence on flowering and seedset; exceptions are black speargrass, rough speargrass and golden beardgrass, which flower only for a restricted period each year. This simplifies potential control options through reducing seedset. Data from field growth studies of four pasture grasses have been used to refine the State's pasture production model GRASP. We also provide detailed data on the forage value of many native species at different growth stages. Wiregrass dominance in pastures on a sandy red earth reduced wool value by only 5-10% at Roma in 1994/95, when winters were very dry and grass seed problems were minimal.

Relevance: 10.00%

Abstract:

Grazing is a major land use in Australia's rangelands. The 'safe' livestock carrying capacity (LCC) required to maintain resource condition is strongly dependent on climate. We reviewed: the approaches for quantifying LCC; current trends in climate and their effect on components of the grazing system; implications of the 'best estimates' of climate change projections for LCC; the agreement and disagreement between the current trends and projections; and the adequacy of current models of forage production in simulating the impact of climate change. We report the results of a sensitivity study of climate change impacts on forage production across the rangelands, and we discuss the more general issues facing grazing enterprises associated with climate change, such as 'known uncertainties' and adaptation responses (e.g. use of climate risk assessment). We found that the method of quantifying LCC from a combination of estimates (simulations) of long-term (>30 years) forage production and successful grazier experience has been well tested across northern Australian rangelands with different climatic regions. This methodology provides a sound base for the assessment of climate change impacts, even though there are many identified gaps in knowledge. The evaluation of current trends indicated substantial differences in the trends of annual rainfall (and simulated forage production) across Australian rangelands, with general increases in most of western Australian rangelands (including northern regions of the Northern Territory) and decreases in eastern Australian rangelands and south-western Western Australia. Some of the projected changes in rainfall and temperature appear small compared with year-to-year variability. Nevertheless, the impacts on rangeland production systems are expected to be important in terms of required managerial and enterprise adaptations. Some important aspects of climate systems science remain unresolved, and we suggest that a risk-averse approach to rangeland management, based on the 'best estimate' projections, in combination with appropriate responses to short-term (1-5 years) climate variability, would reduce the risk of resource degradation. Climate change projections - including changes in rainfall, temperature, carbon dioxide and other climatic variables - if realised, are likely to affect forage and animal production, and ecosystem functioning. The major known uncertainties in quantifying climate change impacts are: (i) carbon dioxide effects on forage production, quality, nutrient cycling and competition between life forms (e.g. grass, shrubs and trees); and (ii) the future role of woody plants, including effects of fire, climatic extremes and management for carbon storage. In a simple example of simulating climate change impacts on forage production, we found that increased temperature (3°C) was likely to result in a decrease in forage production for most rangeland locations (e.g. -21% calculated as an unweighted average across 90 locations). The increase in temperature exacerbated or reduced the effects of a 10% decrease/increase in rainfall respectively (-33% or -9%). Estimates of the beneficial effects of increased CO2 (from 350 to 650 ppm) on forage production and water use efficiency indicated enhanced forage production (+26%). The increase was approximately equivalent to the decline in forage production associated with a 3°C temperature increase. The large magnitude of these opposing effects emphasised the importance of the uncertainties in quantifying the impacts of these components of climate change. We anticipate decreases in LCC given that the 'best estimate' of climate change across the rangelands is for a decline (or little change) in rainfall and an increase in temperature. As a consequence, we suggest that public policy should have regard for the implications for livestock enterprises, regional communities, potential resource damage, animal welfare and human distress. However, the capability to quantify these warnings is yet to be developed and this important task remains a challenge for rangeland and climate systems science.
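
A back-of-envelope check, assuming the temperature and CO2 effects combine multiplicatively (an assumption, since the abstract does not state how they combine), shows why the reported +26% CO2 benefit approximately offsets the -21% effect of a 3°C warming:

```python
# Assumes the two reported effects on forage production combine multiplicatively.
temperature_effect = 1.0 - 0.21   # -21% forage production from +3 C warming
co2_effect = 1.0 + 0.26           # +26% from CO2 rising from 350 to 650 ppm
print(round(temperature_effect * co2_effect, 3))  # ~0.995, i.e. the two effects nearly cancel
```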

Relevance: 10.00%

Abstract:

Background: Sorghum genome mapping based on DNA markers began in the early 1990s and numerous genetic linkage maps of sorghum have been published in the last decade, based initially on RFLP markers, with more recent maps including AFLPs and SSRs and, very recently, Diversity Array Technology (DArT) markers. It is essential to integrate the rapidly growing body of genetic linkage data produced through DArT with the multiple genetic linkage maps for sorghum generated through other marker technologies. Here, we report on the colinearity of six independent sorghum component maps and on the integration of these component maps into a single reference resource that contains commonly utilized SSRs, AFLPs, and high-throughput DArT markers. Results: The six component maps were constructed using the MultiPoint software. The lengths of the resulting maps varied between 910 and 1528 cM. The order of the 498 markers that segregated in more than one population was highly consistent between the six individual mapping data sets. The framework consensus map was constructed using a "Neighbours" approach and contained 251 integrated bridge markers on the 10 sorghum chromosomes, spanning 1355.4 cM with an average density of one marker every 5.4 cM; these bridge markers were used for the projection of the remaining markers. In total, the sorghum consensus map comprised 1997 markers mapped to 2029 unique loci (1190 DArT loci and 839 other loci) spanning 1603.5 cM, with an average marker density of 1 marker/0.79 cM. In addition, 35 multicopy markers were identified. On average, each chromosome on the consensus map contained 203 markers, of which 58.6% were DArT markers. Non-random patterns of DNA marker distribution were observed, with some clear marker-dense regions and some marker-rare regions. Conclusion: The final consensus map has allowed us to map a larger number of markers than possible in any individual map, to obtain a more complete coverage of the sorghum genome and to fill a number of gaps on individual maps. In addition to overall general consistency of marker order across individual component maps, good agreement in overall distances between common marker pairs across the component maps used in this study was determined, using a difference ratio calculation. The obtained consensus map can be used as a reference resource for genetic studies in different genetic backgrounds, in addition to providing a framework for transferring genetic information between different marker technologies and for integrating DArT markers with other genomic resources. DArT markers represent an affordable, high-throughput marker system with great utility in molecular breeding programs, especially in crops such as sorghum where SNP arrays are not publicly available.
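
The reported average marker densities follow directly from the figures quoted above, as a simple arithmetic check:

```python
# Arithmetic check of the reported average marker densities on the sorghum consensus map.
framework_length_cM, framework_markers = 1355.4, 251
consensus_length_cM, consensus_loci = 1603.5, 2029
print(round(framework_length_cM / framework_markers, 1))  # 5.4 cM per bridge marker, as reported
print(round(consensus_length_cM / consensus_loci, 2))     # 0.79 cM per mapped locus, as reported
```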

Relevance: 10.00%

Abstract:

Water regulations have decreased irrigation water supplies in Nebraska and some other areas of the Great Plains of the USA. When available water is not enough to meet crop water requirements during the entire growing cycle, it becomes critical to know the proper irrigation timing that would maximize yields and profits. This study evaluated the effect of timing of a deficit-irrigation allocation (150 mm) on crop evapotranspiration (ETc), yield, water use efficiency (WUE = yield/ETc), irrigation water use efficiency (IWUE = yield/irrigation), and dry mass (DM) of corn (Zea mays L.) irrigated with subsurface drip irrigation in the semiarid climate of North Platte, NE. During 2005 and 2006, a total of sixteen irrigation treatments (eight each year) were evaluated, which received different percentages of the water allocation during July, August, and September. During both years, all treatments resulted in no crop stress during the vegetative period and stress during the reproductive stages, which affected ETc, DM, yield, WUE and IWUE. Among treatments, ETc varied by 7.2 and 18.8%, yield by 17 and 33%, WUE by 12 and 22%, and IWUE by 18 and 33% in 2005 and 2006, respectively. Yield and WUE both increased linearly with ETc and with ETc/ETp (ETp = seasonal ETc with no water stress), and WUE increased linearly with yield. The yield response factor (ky) averaged 1.50 over the two seasons. Irrigation timing affected the DM of the plant, grain, and cob, but not that of the stover. It also affected the percent of DM partitioned to the grain (harvest index), which increased linearly with ETc and averaged 56.2% over the two seasons, but did not affect the percent allocated to the cob or stover. Irrigation applied in July had the highest positive coefficient of determination (R2) with yield. This high positive correlation decreased considerably for irrigation applied in August, and became negative for irrigation applied in September. The best positive correlation between the soil water deficit factor (Ks) and yield occurred during weeks 12-14 from crop emergence, during the "milk" and "dough" growth stages. Yield was poorly correlated to stress during weeks 15 and 16, and the correlation became negative after week 17. Dividing the 150 mm allocation about evenly among July, August and September was a good strategy, resulting in the highest yields in 2005, but not in 2006. Applying a larger proportion of the allocation in July was a good strategy during both years, while applying a large proportion of the allocation in September gave the opposite result. The different results obtained between years indicate that flexible irrigation scheduling techniques should be adopted, rather than relying on fixed timing strategies.
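
The yield response factor is conventionally used in the FAO-style relationship 1 - Ya/Ym = ky(1 - ETa/ETp). Treating the reported ky of about 1.50 in that standard form (an assumption here, not a statement of the paper's exact formulation) gives a feel for the yield penalty of an ET deficit:

```python
# FAO-style yield response relationship: 1 - Ya/Ym = ky * (1 - ETa/ETp).
# ky = 1.50 comes from the abstract; applying it in this standard form is an assumption.

def relative_yield(et_ratio: float, ky: float = 1.50) -> float:
    """Relative yield Ya/Ym for a given ratio of actual to stress-free seasonal ET."""
    return 1.0 - ky * (1.0 - et_ratio)

print(round(relative_yield(0.90), 2))  # 0.85: a 10% ET deficit costs ~15% of yield
print(round(relative_yield(0.80), 2))  # 0.70: a 20% ET deficit costs ~30% of yield
```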

Relevance: 10.00%

Abstract:

Root-lesion nematodes (Pratylenchus thornei Sher and Allen and P. neglectus (Rensch) Filipijev and Schuurmans Stekhoven) cause substantial yield loss to wheat crops in the northern grain region of Australia. Resistance to P. thornei for use in wheat breeding programs was sought among synthetic hexaploid wheats (2n = 6x = 42, AABBDD) produced through hybridisations of Triticum turgidum L. subsp. durum (Desf.) Husn (2n = 4x = 28, AABB) with Aegilops tauschii Coss. (2n = 2x = 14, DD). Resistance was determined for the synthetic hexaploid wheats and their durum and Ae. tauschii parents from the numbers of nematodes in the roots of plants grown for 16 weeks in pots of pasteurised soil inoculated with P. thornei. Fifty-nine (32%) of 186 accessions of synthetic hexaploid wheats had lower numbers of nematodes than Gatcher Selection 50a (GS50a), a partially resistant bread wheat. Greater frequencies of partial resistance were present in the durum parents (72% of 39 lines having lower nematode numbers than GS50a) and in the Ae. tauschii parents (55% of 53 lines). The 59 synthetic hexaploids were re-tested in a second experiment along with their parents. In a third experiment, 11 resistant synthetic hexaploid wheats and their F1 hybrids with Janz, a susceptible bread wheat, were tested and the F1s were found to give nematode counts intermediate between the respective two parents. Synthetic hexaploid wheats with higher levels of resistance resulted from hybridisations where both the durum and Ae. tauschii parents were partially resistant, rather than where only one parent was partially resistant. These results suggest that resistance to P. thornei in synthetic hexaploid wheats is polygenic, with resistances located both in the D genome from Ae. tauschii and in the A and/or B genomes from durum. Five synthetic hexaploid wheats were selected for further study on the basis of (1) a high level of resistance to P. thornei of the synthetic hexaploid wheats and of both their durum and Ae. tauschii parents, (2) being representative of both Australian and CIMMYT (International Maize and Wheat Improvement Centre) durums, and (3) being representative of the morphological subspecies and varieties of Ae. tauschii. These 5 synthetic hexaploid wheats were also shown to be resistant to P. neglectus, whereas GS50a and 2 P. thornei-resistant derivatives were quite susceptible. Results of P. thornei resistance of F1s and F2s from a half diallel of these 5 synthetic hexaploid wheats, GS50a, and Janz from another study indicate polygenic additive resistance and better general combining ability for the synthetic hexaploid wheats than for GS50a. Published molecular marker studies on a doubled haploid population between the synthetic hexaploid wheat with best general combining ability (CPI133872) and Janz have shown quantitative trait loci for resistance located in all 3 genomes. Synthetic hexaploid wheats offer a convenient way of introgressing new resistances to P. thornei and P. neglectus from both durum and Ae. tauschii into commercial bread wheats.

Relevance: 10.00%

Abstract:

Despite recognition that non-native plant species represent a substantial risk to natural systems, there is currently no compilation of weeds that impact on the biodiversity of the rangelands within Australia. Using published and expert knowledge, this paper presents a list of 622 non-native naturalised species known to occur within the rangelands. Of these, 160 species (26%) are considered a current threat to rangeland biodiversity. Most of these plant species have been deliberately introduced for forage or other commercial use (e.g. nursery trade). Among growth forms, shrubs and perennial grasses comprise over 50% of species that pose the greatest risk to rangeland biodiversity. We identify regions within the rangelands containing both high biodiversity values and a high proportion of weeds and recommend these areas as priorities for weed management. Finally, we examine the resources available for weed detection and identification, since detecting weeds in the early stages of invasion is the most cost-effective method of reducing further impact.

Relevance: 10.00%

Abstract:

Stay-green, an important trait for grain yield of sorghum grown under water limitation, has been associated with a high leaf nitrogen content at the start of grain filling. This study quantifies the N demand of leaves and stems and explores effects of N stress on the N balance of vegetative plant parts of three sorghum hybrids differing in potential crop height. The hybrids were grown under well-watered conditions at three levels of N supply. Vertical profiles of biomass and N% of leaves and stems, together with leaf size and number, and specific leaf nitrogen (SLN), were measured at regular intervals. The hybrids had similar minimum but different critical and maximum SLN, associated with differences in leaf size and N partitioning, the latter associated with differences in plant height. N demand of expanding new leaves was represented by critical SLN, and structural stem N demand by minimum stem N%. The fraction of N partitioned to leaf blades increased under N stress. A framework for N dynamics of leaves and stems is developed that captures effects of N stress and genotype on N partitioning and on critical and maximum SLN.
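
A minimal sketch of the kind of N-demand bookkeeping described in the framework, with leaf demand driven by the critical SLN and stem structural demand by a minimum stem N concentration; the parameter values below are hypothetical, not taken from the paper:

```python
# Sketch of leaf/stem N-demand accounting: new leaf area demands N at the critical SLN,
# new stem biomass demands N at the minimum (structural) stem N concentration.

def leaf_n_demand(new_leaf_area_m2: float, critical_sln_g_per_m2: float) -> float:
    """N (g/plant) required to expand new leaf area at the critical specific leaf nitrogen."""
    return new_leaf_area_m2 * critical_sln_g_per_m2

def stem_structural_n_demand(new_stem_dm_g: float, minimum_stem_n_fraction: float) -> float:
    """N (g/plant) required to build new stem biomass at the minimum stem N concentration."""
    return new_stem_dm_g * minimum_stem_n_fraction

# Hypothetical daily growth increments for one plant.
print(round(leaf_n_demand(0.02, 1.2), 3))               # 0.024 g N for 0.02 m2 of leaf at SLN 1.2 g/m2
print(round(stem_structural_n_demand(5.0, 0.005), 3))   # 0.025 g N for 5 g of stem at 0.5% N
```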

Relevance: 10.00%

Abstract:

Background and Aims: The evolution of resistance to herbicides is a substantial problem in contemporary agriculture. Solutions to this problem generally consist of practices to control the resistant population once it evolves, and/or preventative measures instituted before populations become resistant. Herbicide resistance evolves in populations over years or decades, so predicting the effectiveness of preventative strategies in particular relies on computational modelling approaches. While models of herbicide resistance already exist, none deals with the complex regional variability in the northern Australian sub-tropical grains farming region. For this reason, a new computer model was developed. Methods: The model consists of an age- and stage-structured population model of weeds, with an existing crop model used to simulate plant growth and competition, and extensions to the crop model added to simulate seed bank ecology and population genetics factors. Using awnless barnyard grass (Echinochloa colona) as a test case, the model was used to investigate the likely rate of evolution under conditions expected to produce high selection pressure. Key Results: Simulating continuous summer fallows with glyphosate used as the only means of weed control resulted in predicted resistant weed populations after approx. 15 years. Validation of the model against the paddock history for the first real-world glyphosate-resistant awnless barnyard grass population shows that the model predicted resistance evolution to within a few years of the real situation. Conclusions: This validation work shows that empirical validation of herbicide resistance models is problematic. However, the model simulates the complexities of sub-tropical grains farming in Australia well, and can be used to investigate, generate and improve glyphosate resistance prevention strategies.
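
A deliberately simplified sketch of the selection process such a model captures; the published model is far richer, coupling an age- and stage-structured weed population to a crop growth model, seed bank ecology and population genetics, whereas the rates below are purely illustrative:

```python
# Highly simplified selection sketch: a deterministic seed bank with susceptible (S) and
# resistant (R) genotypes under continuous glyphosate-only fallow control.

def years_until_resistance_dominates(years: int = 30,
                                     seeds_s: float = 1e8,   # susceptible seeds in the bank
                                     seeds_r: float = 1.0):  # initially rare resistant seeds
    germination = 0.3             # fraction of the seed bank emerging each season
    carryover = 0.8               # survival of ungerminated seed in the soil to next season
    kill_s, kill_r = 0.99, 0.10   # glyphosate kill rate on susceptible vs resistant plants
    fecundity = 100.0             # viable seeds returned per surviving plant
    for year in range(1, years + 1):
        survivors_s = seeds_s * germination * (1.0 - kill_s)
        survivors_r = seeds_r * germination * (1.0 - kill_r)
        seeds_s = seeds_s * (1.0 - germination) * carryover + survivors_s * fecundity
        seeds_r = seeds_r * (1.0 - germination) * carryover + survivors_r * fecundity
        if seeds_r / (seeds_s + seeds_r) > 0.5:
            return year
    return None

print(years_until_resistance_dominates())  # ~6 years under these strongly selective rates
```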

Relevance: 10.00%

Abstract:

The present review identifies various constraints relating to poor adoption of ley-pastures in south-west Queensland, and suggests changes in research, development and extension efforts for improved adoption. These constraints are biophysical, economic and social. In terms of biophysical constraints, first, shallower soil profiles with subsoil constraints (salt and sodicity), unpredictable rainfall, drier conditions with higher soil temperature and evaporative demand in summer, and frost and subzero temperatures in winter, frequently result in a failure of established, or establishing, pastures. Second, there are limited options for legumes in a ley-pasture, with the legumes currently being mostly winter-active legumes such as lucerne and medics. Winter-active legumes are ineffective in improving soil conditions in a region with summer-dominant rainfall. Third, most grain growers are reluctant to include grasses in their ley-pasture mix, which can be uneconomical for various reasons, including nitrogen immobilisation, carryover of cereal diseases and depressed yields of the following cereal crops. Fourth, a severe depletion of soil water following perennial ley-pastures (grass + legumes or lucerne) can reduce the yields of subsequent crops for several seasons, and the practice of longer fallows to increase soil water storage may be uneconomical and damaging to the environment. Economic assessments of integrating medium- to long-term ley-pastures into cropping regions are generally less attractive because of reduced capital flow, increased capital investment, economic loss associated with establishment and termination phases of ley-pastures, and lost opportunities for cropping in a favourable season. Income from livestock on ley-pastures and soil productivity gains to subsequent crops in rotation may not be comparable to cropping when grain prices are high. However, the economic benefits of ley-pastures may be underestimated because of unaccounted environmental benefits such as enhanced water use and reduced soil erosion from summer-dominant rainfall, and this therefore requires further investigation. In terms of social constraints, the risk of poor and unreliable establishment and persistence, uncertainties in economic and environmental benefits, the complicated process of changing from crop to ley-pastures and vice versa, and the additional labour and management requirements of livestock, present growers with socially unattractive and complex decision-making processes when considering adoption of an existing medium- to long-term ley-pasture technology. It is essential that research, development and extension efforts consider that new ley-pasture options, such as incorporation of a short-term summer forage legume, need to be less risky in establishment, productive in a region with prevailing biophysical constraints, economically viable, less complex and highly flexible in the change-over processes, and socially attractive to growers for adoption in south-west Queensland.

Relevance: 10.00%

Abstract:

The response of soybean (Glycine max) and dry bean (Phaseolus vulgaris) to feeding by Helicoverpa armigera during the pod-fill stage was studied in irrigated field cages over three seasons to determine the relationship between larval density and yield loss, and to develop economic injury levels. H. armigera intensity was calculated in Helicoverpa injury equivalent (HIE) units, where 1 HIE was the consumption of one larva from the start of the infestation period to pupation. In the dry bean experiment, yield loss occurred at a rate of 6.00 ± 1.29 g/HIE, while the rates of loss in the three soybean experiments were 4.39 ± 0.96 g/HIE, 3.70 ± 1.21 g/HIE and 2.12 ± 0.71 g/HIE. These three slopes were not statistically different (P > 0.05) and the pooled estimate of the rate of yield loss was 3.21 ± 0.55 g/HIE. The first soybean experiment also showed a split-line form of damage curve, with a rate of yield loss of 26.27 ± 2.92 g/HIE beyond 8.0 HIE and a rapid decline to zero yield. In dry bean, H. armigera feeding reduced total and undamaged pod numbers by 4.10 ± 1.18 pods/HIE and 12.88 ± 1.57 pods/HIE respectively, while undamaged seed numbers were reduced by 35.64 ± 7.25 seeds/HIE. In soybean, total pod numbers were not affected by H. armigera infestation (out to 8.23 HIE in Experiment 1) but seed numbers (in Experiments 1 and 2) and the number of seeds/pod (in all experiments) were adversely affected. Seed size increased with increases in H. armigera density in two of the three soybean experiments, indicating plant compensatory responses to H. armigera feeding. Analysis of canopy pod profiles indicated that loss of pods occurred from the top of the plant downwards, but with an increase in pod numbers close to the ground at higher pest densities as the plant attempted to compensate for damage. Based on these results, the economic injury levels for H. armigera on dry bean and soybean are approximately 0.74 and 2.31 HIE/m2, respectively (0.67 and 2.1 HIE/row-m for 91 cm rows).
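
A simple unit-conversion check of the reported economic injury levels, converting HIE per square metre to HIE per row-metre at the stated 91 cm row spacing:

```python
# EIL in HIE per square metre -> HIE per row-metre of crop at 0.91 m row spacing.
row_spacing_m = 0.91
for crop, eil_per_m2 in [("dry bean", 0.74), ("soybean", 2.31)]:
    print(crop, round(eil_per_m2 * row_spacing_m, 2))  # ~0.67 and ~2.10 HIE/row-m, as reported
```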