99 results for Irrigated lands
Abstract:
Two field experiments using maize (Pioneer 31H50) and three watering regimes [(i) irrigated for the whole crop cycle, (ii) irrigated until anthesis or (iii) not irrigated at all (experiment 1), and fully irrigated versus rain-grown for the whole crop cycle (experiment 2)] were conducted at Gatton, Australia, during the 2003-04 season. Data on crop ontogeny, leaf, sheath and internode lengths, leaf width, and senescence were collected at 1- to 3-day intervals. A glasshouse experiment during 2003 quantified the responses of leaf shape and leaf presentation to various levels of water stress. Data from experiment 1 were used to modify and parameterise an architectural model of maize (ADEL-Maize) to incorporate the impact of water stress on maize canopy characteristics. The modified model produced accurate fitted values of final leaf area and plant height for experiment 1, but leaf area values during canopy development were lower than the observed data. Crop duration was reasonably well fitted, and differences between the fully irrigated and rain-grown crops were accurately predicted. Final representations of maize crop canopies were realistic. Possible explanations for the low leaf area values are provided. The model requires further development using data from the glasshouse study before being validated against data from experiment 2 and other independent data; it will then be used to extend functionality in architectural models of maize. With further research and development, the model should be particularly useful in examining the response of maize production to water stress, including improved prediction of total biomass and grain yield. This will facilitate improved simulation of plant growth and development processes, allowing investigation of genotype by environment interactions under conditions of suboptimal water supply.
Abstract:
Surveys were conducted between 1997 and 2001 to investigate the incidence of overwintering Helicoverpa spp. pupae under summer crop residues on the Darling Downs, Queensland. Only Helicoverpa armigera was represented in collections of overwintering pupae. The results indicated that late-season crops of cotton, sorghum, maize, soybean, mungbean and sunflower were equally likely to have overwintering pupae under them. In the absence of tillage practices, these crops had the potential to produce similar numbers of moths/ha in the spring. As expected, there were differences between years in the densities of overwintering pupae and the number of emerged moths/ha. Irrigated crops produced 2.5 times more moths/ha than dryland crops. Overall survival from autumn-formed pupae to emerged moths averaged 44%, with a higher proportion of pupae under maize surviving to produce moths than under each of the other crops. Parasitoids killed 44.1% of pupae, with Heteropelma scaposum representing 83.3% of all parasitoids reared from pupae. Percentage parasitism was lower in irrigated crops (27.6%) than in dryland crops (40.5%). Recent changes to Helicoverpa spp. management in cotton/grain-farming systems in south-eastern Queensland, including widespread adoption of Bt cotton and use of more effective and more selective insecticides, could lead to lower densities of overwintering pupae under late summer crops.
Abstract:
The effects of recycled water (effluent) on 8 tropical grasses growing in 100-L bags of sand were studied in Murrumba Downs, just north of Brisbane in southern Queensland (27.4°S, 153.1°E). The species used were: Axonopus compressus (broad-leaf carpetgrass), Cynodon dactylon (bermudagrass 'Winter Green') and C. dactylon x C. transvaalensis hybrid ('Tifgreen'), Digitaria didactyla (Queensland blue couch), Paspalum notatum (bahiagrass '38824'), Stenotaphrum secundatum (buffalograss 'Palmetto'), Eremochloa ophiuroides (centipedegrass 'Centec') and Zoysia japonica (zoysiagrass 'ZT-11'). From May 2002 to June 2003, control plots were irrigated with potable water and fertilised monthly. Plots irrigated with effluent received no fertiliser from May to August 2002 (deficient phase), complete fertilisers at control rates from September to December 2002 (recovery phase) and nitrogen (N) only at control rates from January to June 2003 (supplementary phase). In October 2002, the average shoot weight of plants from the effluent plots was 4% of that from potable plots, with centipedegrass less affected than the other species (relative growth of 20%). Shoot N concentrations declined by 40% in the effluent plots from May to August 2002 (1.8 ± 0.1%) along with phosphorus (P, 0.46 ± 0.02%), potassium (K, 1.6 ± 0.2%), sulfur (S, 0.28 ± 0.02%) and manganese (Mn, 19 ± 2 mg/kg) concentrations. Only the N and Mn concentrations were below the optimum for grasses. The grasses grew satisfactorily when irrigated with effluent if it was supplemented with N. Between January and June 2003 the average weight of shoots from the effluent plots was 116% of the weight of shoots from the control plots. Shoot nutrient concentrations were also similar in the 2 regimes at this time. The recycled water supplied 23% of the N required for maximum shoot growth, 80-100% of the P and K, and 500-880% of the S, calcium and magnesium. The use of recycled water represents savings in irrigation and fertiliser costs, and reductions in the discharge of N and P to local waterways. Effluent is currently about 50% of the cost of potable water with a saving of about AU$8000/ha.year for a typical sporting field.
Abstract:
Dwindling water supplies for irrigation are prompting alternative management choices by irrigators. Limited irrigation, where less water is applied than full crop demand, may be a viable approach. Application of limited irrigation to corn was examined in this research. Corn was grown in crop rotations with dryland, limited irrigation, or full irrigation management from 1985 to 1999. Crop rotations included corn following corn (continuous corn), corn following wheat and followed by soybean (wheat-corn-soybean), and corn following soybean (corn-soybean). Full irrigation was managed to meet crop evapotranspiration requirements (ETc). Limited irrigation was managed with a seasonal target of no more than 150 mm applied. Precipitation patterns influenced the outcomes of measured parameters. Dryland yields had the most variation, while fully irrigated yields varied the least. Limited irrigation yields were 80% to 90% of fully irrigated yields, but the limited irrigation plots received about half the applied water. Grain yields were significantly different among irrigation treatments. Yields were not significantly different among rotation treatments for all years and water treatments. For soil water parameters, more statistical differences were detected among the water management treatments than among the crop rotation treatments. Economic projections of these management practices showed that full irrigation produced the most income if water was available. Limited irrigation increased income significantly over dryland management.
Abstract:
The effect of defoliation on Amarillo (Arachis pintoi cv. Amarillo) was studied in a glasshouse and in mixed swards with 2 tropical grasses. In the glasshouse, Amarillo plants grown in pots were subjected to a 30/20°C or 25/15°C temperature regime and to defoliation at 10-, 20- or 30-day intervals for 60 days. Two field plot studies were conducted on Amarillo with either irrigated kikuyu (Pennisetum clandestinum) in autumn and spring or dryland Pioneer rhodes grass (Chloris gayana) over summer and autumn. Treatments imposed were 3 defoliation intervals (7, 14 and 28 days) and 2 residual heights (5 and 10 cm for kikuyu; 3 and 10 cm for rhodes grass) with extra treatments (56 days to 3 cm for both grasses and 21 days to 5 cm for kikuyu). Defoliation interval had no significant effect on accumulated Amarillo leaf dry matter (DM) at either temperature regime. At the higher temperature, frequent defoliation reduced root dry weight (DW) and increased crude protein (CP) but had no effect on stolon DW or in vitro organic matter digestibility (OMD). On the other hand, at the lower temperature, frequent defoliation reduced stolon DW and increased OMD but had no effect on root DW or CP. Irrespective of temperature and defoliation, water-soluble carbohydrate levels were higher in stolons than in roots (4.70 vs 3.65%), whereas for starch the reverse occurred (5.37 vs 9.44%). Defoliating the Amarillo-kikuyu sward once at 56 days to 3 cm produced the highest DM yield in autumn and spring (582 and 7121 kg/ha DM, respectively), although the Amarillo component and OMD were substantially reduced. Highest DM yields (1726 kg/ha) were also achieved in the Amarillo-rhodes grass sward when defoliated every 56 days to 3 cm, although the Amarillo component was unaffected. In a mixed sward with either kikuyu or rhodes grass, the Amarillo component in the sward was maintained up to a 28-day defoliation interval and was higher when more severely defoliated. The results show that Amarillo can tolerate frequent defoliation and that it can co-exist with tropical grasses of differing growth habits, provided the Amarillo-tropical grass sward is subject to frequent and severe defoliation.
Abstract:
In the subtropics of Australia, irrigated temperate species are the key to reliable cool season feed on dairy farms. Persistence of perennial species is a major limitation to achieving reliable production from irrigated areas, and yearly sowings of annual ryegrasses have replaced them as the most productive cool season forage production system in the subtropics. This series of experiments evaluated the yield, and resistance to rust damage, of commercially available cultivars and breeders' lines of annually sown ryegrasses (Lolium multiflorum, L. rigidum, L. x boucheanum and L. perenne) in pure, nitrogen-fertilised swards under irrigation in the subtropics over a 22-year period. Barberia and Aristocrat 2 were the most adapted cultivars for subtropical conditions, producing high yields (119 and 114% of mean yield, respectively) and demonstrating the least rust damage. Newer selections from New Zealand, South African, United States of America and European breeding programs are performing better under subtropical conditions than older cultivars, particularly if a component of the selection process has been conducted in that environment. Cultivars such as Passerei Plus, Crusader, Hulk, Status and Warrior are examples of this process, producing between 105 and 115% of mean yield. Yields of annual ryegrass cultivars, which have been or still are available for sale in Australia, ranged from 14 to 30 t/ha DM, depending on cultivar, site and seasonal conditions. Yields were lower at the site with inferior soil structure and drainage. Up to 50% of yield was produced in the 3 winter months. There was a trend towards improved yields and better tolerance of crown rust from experimental lines in the subtropics, as breeders strive for wider adaptation. Around 70% of the variation in total yield of annual ryegrass, and 50 and 60% of the variation in winter and spring yield, respectively, were significantly explained by cultivar, site and climatic variables in autumn, winter and spring. While the level of rust damage had no effect on total or seasonal yields, it affected the amount of green leaf available in spring. Under subtropical conditions, winter, spring and overall (autumn to mid-summer) temperatures influenced the development of rust, which, along with cultivar, accounted for 46% of the variation in rust damage. Cultivars showed a range of adaptation, with some performing well only under adverse conditions, some being well adapted to all conditions and some performing well only under favourable conditions. Cultivars with high winter yields were most suited to subtropical conditions and included Aristocrat 2 (now released as CM 108), Barberia, Warrior, Crusader, Status, Passerei Plus and Hulk. Short growing season types such as Winter Star and T Rex performed well in winter but achieved lower total production, and long season cultivars such as Flanker rarely achieved their potential because of unfavourable conditions in late summer.
Abstract:
It has been reported that high-density planting of sugarcane can improve cane and sugar yield by promoting rapid canopy closure and increasing radiation interception earlier in crop growth. It is widely known that the control of adverse soil biota through fumigation (which removes soil biological constraints and improves soil health) can improve cane and sugar yield. Whether the responses to high-density planting and improved soil health are additive or interactive has important implications for the sugarcane production system. Field experiments established at Bundaberg and Mackay, Queensland, Australia, involved all combinations of two row spacings (0.5 and 1.5 m), two planting densities (27 000 and 81 000 two-eyed setts/ha), and two soil fumigation treatments (fumigated and non-fumigated). The Bundaberg experiment had two cultivars (Q124, Q155), was fully irrigated, and was harvested 15 months after planting. The Mackay experiment had one cultivar (Q117), was grown under rainfed conditions, and was harvested 10 months after planting. High-density planting (81 000 setts/ha in 0.5-m rows) did not produce any more cane or sugar yield at harvest than low-density planting (27 000 setts/ha in 1.5-m rows) regardless of location, crop duration (15 v. 10 months), water supply (irrigated v. rainfed), or soil health (fumigated v. non-fumigated). Conversely, soil fumigation generally increased cane and sugar yields regardless of site, row spacing, and planting density. In the Bundaberg experiment there was a large fumigation x cultivar x density interaction (P<0.01). Cultivar Q155 responded positively to higher planting density in non-fumigated soil but not in fumigated soil, while Q124 showed a negative response to higher planting density in non-fumigated soil but no response in fumigated soil. In the Mackay experiment, Q117 showed a non-significant trend of increasing yield in response to increasing planting density in non-fumigated soil, similar to the Q155 response in non-fumigated soil at Bundaberg. The similarity in yield across the range of row spacings and planting densities within experiments was largely due to compensation between stalk number and stalk weight, particularly when fumigation was used to address soil health. Further, the different cultivars (Q124 and Q155 at Bundaberg and Q117 at Mackay) exhibited differing physiological responses to the fumigation, row spacing, and planting density treatments. These included the rate of tiller initiation and subsequent loss, changes in stalk weight, and propensity to lodging. These responses suggest that there may be potential for selecting cultivars suited to different planting configurations.
Abstract:
Water regulations have decreased irrigation water supplies in Nebraska and some other areas of the US Great Plains. When available water is not enough to meet crop water requirements during the entire growing cycle, it becomes critical to know the proper irrigation timing that would maximize yields and profits. This study evaluated the effect of the timing of a deficit-irrigation allocation (150 mm) on crop evapotranspiration (ETc), yield, water use efficiency (WUE = yield/ETc), irrigation water use efficiency (IWUE = yield/irrigation), and dry mass (DM) of corn (Zea mays L.) irrigated with subsurface drip irrigation in the semiarid climate of North Platte, NE. During 2005 and 2006, a total of sixteen irrigation treatments (eight each year) were evaluated, which received different percentages of the water allocation during July, August, and September. During both years, all treatments resulted in no crop stress during the vegetative period and stress during the reproductive stages, which affected ETc, DM, yield, WUE and IWUE. Among treatments, ETc varied by 7.2 and 18.8%, yield by 17 and 33%, WUE by 12 and 22%, and IWUE by 18 and 33% in 2005 and 2006, respectively. Yield and WUE both increased linearly with ETc and with ETc/ETp (ETp = seasonal ETc with no water stress), and WUE increased linearly with yield. The yield response factor (ky) averaged 1.50 over the two seasons. Irrigation timing affected the DM of the plant, grain, and cob, but not that of the stover. It also affected the percentage of DM partitioned to the grain (harvest index), which increased linearly with ETc and averaged 56.2% over the two seasons, but did not affect the percentage allocated to the cob or stover. Irrigation applied in July had the highest positive coefficient of determination (R2) with yield. This high positive correlation decreased considerably for irrigation applied in August, and became negative for irrigation applied in September. The best positive correlation between the soil water deficit factor (Ks) and yield occurred during weeks 12-14 from crop emergence, during the "milk" and "dough" growth stages. Yield was poorly correlated with stress during weeks 15 and 16, and the correlation became negative after week 17. Dividing the 150 mm allocation about evenly among July, August and September was a good strategy, resulting in the highest yields in 2005, but not in 2006. Applying a larger proportion of the allocation in July was a good strategy in both years, whereas applying a large proportion of the allocation in September was not. The different results obtained between years indicate that flexible irrigation scheduling techniques should be adopted, rather than relying on fixed timing strategies.
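As a rough illustration of the indices referred to above, the following minimal Python sketch applies the standard definitions WUE = yield/ETc and IWUE = yield/irrigation, together with the FAO-33 form of the yield response factor ky. The function names and the numbers in the example are hypothetical, not values taken from this study.

```python
# Minimal sketch (not the authors' code) of the water-productivity indices above.

def wue(grain_yield_kg_ha: float, etc_mm: float) -> float:
    """Water use efficiency: grain yield per unit of seasonal crop ET (kg/ha per mm)."""
    return grain_yield_kg_ha / etc_mm

def iwue(grain_yield_kg_ha: float, irrigation_mm: float) -> float:
    """Irrigation water use efficiency: grain yield per unit of irrigation (kg/ha per mm)."""
    return grain_yield_kg_ha / irrigation_mm

def yield_response_factor(ya: float, ym: float, eta: float, etp: float) -> float:
    """FAO-33 ky: relative yield decline divided by relative ET deficit,
    i.e. (1 - Ya/Ym) / (1 - ETa/ETp)."""
    return (1.0 - ya / ym) / (1.0 - eta / etp)

# Illustrative deficit-irrigated treatment versus a fully watered reference.
print(wue(11000, 550), iwue(11000, 150), yield_response_factor(11000, 13000, 550, 650))
```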
Abstract:
Seed persistence is poorly quantified for invasive plants of subtropical and tropical environments, and Lantana camara, one of the world's worst weeds, is no exception. We investigated germination, seedling emergence, and seed survival of two lantana biotypes (pink and pink-edged red [PER]) in southeastern Queensland, Australia. Controlled experiments were undertaken in 2002 and repeated in 2004, with treatments comprising two differing environmental regimes (irrigated and natural rainfall) and sowing depths (0 and 2 cm). Seed survival and seedling emergence were significantly affected by all factors (time, biotype, environment, sowing depth, and cohort) (P < 0.001). Seed dormancy varied with treatment (environment, sowing depth, biotype, and cohort) (P < 0.001), but declined rapidly after 6 mo. Significant differential responses by the two biotypes to sowing depth and environment were detected for both seed survival and seedling emergence (P < 0.001). Seed mass was consistently lower in the PER biotype at the population level (P < 0.001), but this variation did not adequately explain the differential responses. Moreover, under natural rainfall the magnitude of the biotype effect was unlikely to result in ecologically significant differences. Seed survival after 36 mo under natural rainfall ranged from 6.8 to 21.3%. Best-fit regression analysis of the decline in seed survival over time yielded a five-parameter exponential decay model with a lower asymptote approaching −0.38 (% seed survival = [(55 − (−0.38)) × e^(k × t)] + (−0.38); R2 = 88.5%; 9 df). Environmental conditions and burial significantly affected the slope parameter, or k value (P < 0.01). Seed survival projections from the model were greatest for buried seeds under natural rainfall (11 yr) and least under irrigation (3 yr). Experimental data and model projections suggest that lantana has a persistent seed bank, and this should be considered in management programs, particularly those aimed at eradication.
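The decay expression above describes survival falling from roughly 55% toward a lower asymptote of −0.38% as time t increases; the short sketch below simply evaluates that form. The k value used here is purely illustrative, since the study fitted k separately for each environment and burial treatment.

```python
import math

def seed_survival_pct(t_months: float, k: float, s0: float = 55.0, asymptote: float = -0.38) -> float:
    """Percent seed survival at time t, using the decay form quoted in the abstract:
    survival = (s0 - asymptote) * exp(k * t) + asymptote, with k < 0 giving decay."""
    return (s0 - asymptote) * math.exp(k * t_months) + asymptote

# Illustrative k only; the fitted k differs between irrigated/rainfall and surface/buried seed.
for t in (0, 6, 12, 24, 36):
    print(t, round(seed_survival_pct(t, k=-0.05), 1))
```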
Abstract:
The response of soybean (Glycine max) and dry bean (Phaseolus vulgaris) to feeding by Helicoverpa armigera during the pod-fill stage was studied in irrigated field cages over three seasons to determine the relationship between larval density and yield loss, and to develop economic injury levels. H. armigera intensity was calculated in Helicoverpa injury equivalent (HIE) units, where 1 HIE was the consumption of one larva from the start of the infestation period to pupation. In the dry bean experiment, yield loss occurred at a rate of 6.00 ± 1.29 g/HIE, while the rates of loss in the three soybean experiments were 4.39 ± 0.96 g/HIE, 3.70 ± 1.21 g/HIE and 2.12 ± 0.71 g/HIE. These three slopes were not statistically different (P > 0.05) and the pooled estimate of the rate of yield loss was 3.21 ± 0.55 g/HIE. The first soybean experiment also showed a split-line form of damage curve, with a rate of yield loss of 26.27 ± 2.92 g/HIE beyond 8.0 HIE and a rapid decline to zero yield. In dry bean, H. armigera feeding reduced total and undamaged pod numbers by 4.10 ± 1.18 pods/HIE and 12.88 ± 1.57 pods/HIE respectively, while undamaged seed numbers were reduced by 35.64 ± 7.25 seeds/HIE. In soybean, total pod numbers were not affected by H. armigera infestation (out to 8.23 HIE in Experiment 1) but seed numbers (in Experiments 1 and 2) and the number of seeds/pod (in all experiments) were adversely affected. Seed size increased with increases in H. armigera density in two of the three soybean experiments, indicating plant compensatory responses to H. armigera feeding. Analysis of canopy pod profiles indicated that loss of pods occurred from the top of the plant downwards, but with an increase in pod numbers close to the ground at higher pest densities as the plant attempted to compensate for damage. Based on these results, the economic injury levels for H. armigera on dry bean and soybean are approximately 0.74 and 2.31 HIE/m2, respectively (0.67 and 2.1 HIE/row-m for 91 cm rows).
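The per-row-metre injury levels quoted in parentheses follow from the per-square-metre values and the 91 cm row spacing (one metre of row represents a strip 0.91 m wide); a small sketch of that conversion is below, with function names that are ours rather than the authors'.

```python
def hie_per_row_metre(hie_per_m2: float, row_spacing_m: float = 0.91) -> float:
    """Convert an injury level per square metre to per metre of row:
    each metre of row accounts for an area of 1 m x row_spacing_m."""
    return hie_per_m2 * row_spacing_m

# Reproduces the conversions quoted for 91 cm rows.
print(round(hie_per_row_metre(0.74), 2))  # dry bean: ~0.67 HIE/row-m
print(round(hie_per_row_metre(2.31), 2))  # soybean: ~2.10 HIE/row-m
```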
Abstract:
The response of vegetative soybean (Glycine max) to Helicoverpa armigera feeding was studied in irrigated field cages over three years in eastern Australia to determine the relationship between larval density and yield loss, and to develop economic injury levels. Rather than using artificial defoliation techniques, plants were infested with either eggs or larvae of H. armigera, and larvae allowed to feed until death or pupation. Larvae were counted and sized regularly and infestation intensity was calculated in Helicoverpa injury equivalent (HIE) units, where 1 HIE was the consumption of one larva from the start of the infestation period to pupation. In the two experiments where yield loss occurred, the upper threshold for zero yield loss was 7.51 ± 0.21 HIEs and 6.43 ± 1.08 HIEs respectively. In the third experiment, infestation intensity was lower and no loss of seed yield was detected up to 7.0 HIEs. The rate of yield loss/HIE beyond the zero yield loss threshold varied between Experiments 1 and 2 (-9.44 ± 0.80 g and -23.17 ± 3.18 g, respectively). H. armigera infestation also affected plant height and various yield components (including pod and seed numbers and seeds/pod) but did not affect seed size in any experiment. Leaf area loss of plants averaged 841 and 1025 cm2/larva in the two experiments compared to 214 and 302 cm2/larva for cohort larvae feeding on detached leaves at the same time, making clear that artificial defoliation techniques are unsuitable for determining H. armigera economic injury levels on vegetative soybean. Analysis of canopy leaf area and pod profiles indicated that leaf and pod loss occurred from the top of the plant downwards. However, there was an increase in pod numbers closer to the ground at higher pest densities as the plant attempted to compensate for damage. Defoliation at the damage threshold was 18.6 and 28.0% in Experiments 1 and 2, indicating that yield loss from H. armigera feeding occurred at much lower levels of defoliation than previously indicated by artificial defoliation studies. Based on these results, the economic injury level for H. armigera on vegetative soybean is approximately 7.3 HIEs/row-metre in 91 cm rows or 8.0 HIEs/m2.
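The yield response described above is a split-line (broken-stick) relationship: no seed yield loss up to a damage threshold, then a linear loss per HIE beyond it. The sketch below encodes that shape using the Experiment 1 threshold and slope as illustrative parameters; treat it as our reconstruction of the relationship, not the authors' fitted model.

```python
def seed_yield_loss_g(hie: float, threshold: float = 7.51, loss_per_hie_g: float = 9.44) -> float:
    """Split-line damage model: zero loss below the threshold,
    then loss_per_hie_g grams of seed yield for every HIE beyond it."""
    return max(0.0, hie - threshold) * loss_per_hie_g

# Below the threshold there is no loss; above it, loss accrues linearly.
for h in (4.0, 7.51, 9.0, 12.0):
    print(h, round(seed_yield_loss_g(h), 1))
```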
Abstract:
Farmlets, each of 20 cows, were established to field test five milk production systems and provide a learning platform for farmers and researchers in a subtropical environment. The systems were developed through desktop modelling and industry consultation in response to the need for substantial increases in farm milk production following deregulation of the industry. Four of the systems were based on grazing and the continued use of existing farmland resource bases, whereas the fifth comprised a feedlot and associated forage base developed as a greenfield site. The field evaluation was conducted over 4 years under more adverse environmental conditions than anticipated, with below-average rainfall and restrictions on irrigation. For the grazed systems, mean annual milk yield per cow ranged from 6330 kg/year (1.9 cows/ha) for a herd based on rain-grown tropical pastures to 7617 kg/year (3.0 cows/ha) where animals were based on temperate and tropical irrigated forages. For the feedlot herd, production of 9460 kg/cow.year (4.3 cows/ha of forage base) was achieved. For all herds, the level of production achieved required annual inputs of concentrates of approximately 3 t DM/animal and purchased conserved fodder of 0.3 to 1.5 t DM/animal. This level of supplementary feeding made a major contribution to total farm nutrient inputs, contributing 50% or more of the nitrogen, phosphorus and potassium entering the farming system, and presents challenges for managing the manure and urine that result from the higher stocking rates enabled. Mean annual milk production for the five systems ranged from 88 to 105% of that predicted by the desktop modelling. This level of agreement for the grazed systems was achieved with minimal overall change in predicted feed inputs; however, the feedlot system required a substantial increase in inputs over those predicted. Reproductive performance for all systems was poorer than anticipated, particularly over the summer mating period. We conclude that the desktop model, developed as a rapid response to assist farmers to modify their current farming systems, provided a reasonable prediction of inputs required and milk production. Further model development would need to consider more closely climate variability, the limitations summer temperatures place on reproductive success and the feed requirements of feedlot herds.
Abstract:
It is essential to provide experimental evidence and reliable predictions of the effects of water stress on crop production in drier, less predictable environments. A field experiment undertaken in southeast Queensland, Australia, with three water regimes (fully irrigated, rainfed, and irrigated until late canopy expansion followed by rainfed) was used to compare the effects of water stress on crop production in two maize (Zea mays L.) cultivars (Pioneer 34N43 and Pioneer 31H50). Water stress affected growth and yield more in Pioneer 34N43 than in Pioneer 31H50. The crop model APSIM-Maize, after being calibrated for the two cultivars, was used to simulate maize growth and development under water stress. The predictions of leaf area index (LAI) dynamics, biomass growth and grain yield under the rainfed and irrigated-then-rainfed treatments were reasonable, indicating that the stress indices used by APSIM-Maize produced appropriate adjustments to crop growth and development in response to water stress. This study shows that Pioneer 31H50 is less sensitive to water stress, and thus a preferred cultivar for dryland conditions, and that it is feasible to provide sound predictions and risk assessment for crop production in drier, more variable conditions using the APSIM-Maize model.
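APSIM-Maize represents water stress through supply/demand stress indices that scale back simulated processes such as leaf expansion and photosynthesis. The sketch below is only a conceptual illustration of that idea, not APSIM source code or its exact formulation.

```python
def water_deficit_factor(soil_water_supply_mm: float, crop_water_demand_mm: float) -> float:
    """Conceptual supply/demand stress index: 1 means no stress; values below 1
    would scale back simulated growth processes in proportion to the deficit."""
    if crop_water_demand_mm <= 0.0:
        return 1.0
    return min(1.0, soil_water_supply_mm / crop_water_demand_mm)

print(water_deficit_factor(3.5, 6.0))  # water-limited day -> stress factor < 1
print(water_deficit_factor(7.0, 6.0))  # well-watered day -> capped at 1.0
```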
Abstract:
We compared daily net radiation (Rn) estimates from 19 methods with the ASCE-EWRI Rn estimates in two climates: Clay Center, Nebraska (sub-humid) and Davis, California (semi-arid) for the calendar year. The performances of all 20 methods, including the ASCE-EWRI Rn method, were then evaluated against Rn data measured over a non-stressed maize canopy during two growing seasons in 2005 and 2006 at Clay Center. Methods differ in terms of inputs, structure, and equation intricacy. Most methods differ in estimating the cloudiness factor, emissivity (e), and calculating net longwave radiation (Rnl). All methods use albedo (a) of 0.23 for a reference grass/alfalfa surface. When comparing the performance of all 20 Rn methods with measured Rn, we hypothesized that the a values for grass/alfalfa and non-stressed maize canopy were similar enough to only cause minor differences in Rn and grass- and alfalfa-reference evapotranspiration (ETo and ETr) estimates. The measured seasonal average a for the maize canopy was 0.19 in both years. Using a = 0.19 instead of a = 0.23 resulted in 6% overestimation of Rn. Using a = 0.19 instead of a = 0.23 for ETo and ETr estimations, the 6% difference in Rn translated to only 4% and 3% differences in ETo and ETr, respectively, supporting the validity of our hypothesis. Most methods had good correlations with the ASCE-EWRI Rn (r2 > 0.95). The root mean square difference (RMSD) was less than 2 MJ m-2 d-1 between 12 methods and the ASCE-EWRI Rn at Clay Center and between 14 methods and the ASCE-EWRI Rn at Davis. The performance of some methods showed variations between the two climates. In general, r2 values were higher for the semi-arid climate than for the sub-humid climate. Methods that use dynamic e as a function of mean air temperature performed better in both climates than those that calculate e using actual vapor pressure. The ASCE-EWRI-estimated Rn values had one of the best agreements with the measured Rn (r2 = 0.93, RMSD = 1.44 MJ m-2 d-1), and estimates were within 7% of the measured Rn. The Rn estimates from six methods, including the ASCE-EWRI, were not significantly different from measured Rn. Most methods underestimated measured Rn by 6% to 23%. Some of the differences between measured and estimated Rn were attributed to the poor estimation of Rnl. We conducted sensitivity analyses to evaluate the effect of Rnl on Rn, ETo, and ETr. The Rnl effect on Rn was linear and strong, but its effect on ETo and ETr was subsidiary. Results suggest that the Rn data measured over green vegetation (e.g., irrigated maize canopy) can be an alternative Rn data source for ET estimations when measured Rn data over the reference surface are not available. In the absence of measured Rn, another alternative would be using one of the Rn models that we analyzed when all the input variables are not available to solve the ASCE-EWRI Rn equation. Our results can be used to provide practical information on which method to select based on data availability for reliable estimates of daily Rn in climates similar to Clay Center and Davis.
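For readers wanting the general shape of the calculation discussed above, the sketch below condenses the FAO-56/ASCE-EWRI standardized daily net radiation form: net shortwave (1 − a)Rs minus a net longwave term built from air temperature, humidity (actual vapour pressure ea) and cloudiness (Rs/Rso). It is a simplified illustration, not the full ASCE-EWRI procedure, and the example inputs are hypothetical.

```python
SIGMA = 4.903e-9  # Stefan-Boltzmann constant, MJ K^-4 m^-2 d^-1 (daily form)

def net_radiation(rs, rso, tmax_c, tmin_c, ea_kpa, albedo=0.23):
    """Daily net radiation (MJ m-2 d-1) in the standardized FAO-56/ASCE-EWRI form:
    Rn = (1 - albedo)*Rs - Rnl, where Rnl combines a temperature term,
    an emissivity term (0.34 - 0.14*sqrt(ea)) and a cloudiness term (1.35*Rs/Rso - 0.35)."""
    tmax_k, tmin_k = tmax_c + 273.16, tmin_c + 273.16
    rnl = (SIGMA * (tmax_k**4 + tmin_k**4) / 2.0
           * (0.34 - 0.14 * ea_kpa**0.5)
           * (1.35 * min(rs / rso, 1.0) - 0.35))
    return (1.0 - albedo) * rs - rnl

# Hypothetical mid-summer day; pass albedo=0.19 to mimic the maize-canopy value reported above.
print(round(net_radiation(rs=25.0, rso=30.0, tmax_c=32.0, tmin_c=18.0, ea_kpa=1.8), 2))
```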
Abstract:
Drought during the pre-flowering stage can increase the yield of peanut. There is limited information on genotypic variation for tolerance to and recovery from pre-flowering drought (PFD) and, more importantly, the physiological traits underlying genotypic variation. The objectives of this study were to determine the effects of moisture stress during the pre-flowering phase on pod yield and to understand some of the physiological responses underlying genotypic variation in response to and recovery from PFD. Glasshouse and field experiments were conducted at Khon Kaen University, Thailand. The glasshouse experiment was a randomized complete block design consisting of two watering regimes, i.e. a fully irrigated control and 1/3 available soil water from emergence to 40 days after emergence followed by adequate water supply, and 12 peanut genotypes. The field experiment was a split-plot design with the two watering regimes as main plots and the 12 peanut genotypes as sub-plots. Measurements of N2 fixation and leaf area (LA) were made in both experiments. In addition, root growth was measured in the glasshouse experiment. Imposition of PFD followed by recovery resulted in an average increase in yield of 24% (range 10% to 57%) and 12% (range 2% to 51%) in the field and glasshouse experiments, respectively. Significant genotypic variation for N2 fixation, LA and root growth was also observed after recovery. The study revealed that recovery growth following release of PFD had a stronger influence on final yield than tolerance to water deficits during the PFD. A combination of N2 fixation, LA and root growth accounted for a major portion of the genotypic variation in yield (r = 0.68-0.93), suggesting that these traits could be used as selection criteria for identifying genotypes with rapid recovery from PFD. A combined analysis of the glasshouse and field experiments showed that LA and N2 fixation during the recovery period had low genotype x environment interaction, indicating potential for using these traits for selecting genotypes in peanut improvement programs.