979 results for Seasons.
Abstract:
Physical and chemical properties of sap and sap concentrations of constitutive alk(en)ylresorcinols were determined in several varieties of mango grown in different locations in Queensland, Australia, over two consecutive cropping seasons. Sap weight from individual fruit, sap pH, percentage of non-aqueous sap and concentrations of constitutive alk(en)ylresorcinols (5-n-heptadecenylresorcinol and 5-n-pentadecylresorcinol) in sap varied significantly among the varieties. 'Calypso', 'Keitt', 'Kensington Pride' and 'Celebration' had the greatest proportion of non-aqueous sap, whereas 'Nam Doc Mai' had the least. The highest concentrations of 5-n-heptadecenylresorcinol were found in the sap of 'Kensington Pride', and the lowest in 'Honey Gold' and 'Nam Doc Mai'. The highest concentrations of 5-n-pentadecylresorcinol were found in the sap of 'Calypso' and 'Celebration', and the lowest levels were in 'Honey Gold' and 'Nam Doc Mai'. There was a direct relationship between the percentage of non-aqueous sap and the concentrations of alk(en)ylresorcinols (r² = 0.77 for 5-n-heptadecenylresorcinol, and r² = 0.87 for 5-n-pentadecylresorcinol). The alk(en)ylresorcinols were distributed mainly in the upper non-aqueous phase of 'Kensington Pride' sap. Growing location also had significant effects on the composition of mango sap, but the effects appeared to be related to differences in maturity. Sap removal is necessary to prevent sapburn, but considerable quantities of alk(en)ylresorcinols that assist in protecting the harvested fruit from anthracnose disease are also removed.
Abstract:
A total of 2115 heifers from two tropical genotypes (1007 Brahman and 1108 Tropical Composite) raised in four locations in northern Australia were ovarian-scanned every 4-6 weeks to determine the age at the first-observed corpus luteum (CL), and this was used to define the age at puberty for each heifer. Other traits recorded at each ovarian scanning were liveweight, fat depths and body condition score. Reproductive tract size was measured close to the start of the first joining period. Results showed significant effects of location and birth month on the age at first CL and associated puberty traits. Genotypes did not differ significantly for the age or weight at first CL; however, Brahman were fatter at first CL and had a smaller reproductive tract size than Tropical Composite. Genetic analyses estimated the age at first CL to be moderately to highly heritable for Brahman (0.57) and Tropical Composite (0.52). The associated traits were also moderately heritable, except for reproductive tract size in Brahman (0.03) and, for Tropical Composite, the presence of an observed CL on the scanning day closest to the start of joining (0.07). Genetic correlations among puberty traits were mostly moderate to high and generally larger in magnitude for Brahman than for Tropical Composite. Genetic correlations between the age at CL and heifer- and steer-production traits showed important genotype differences. For Tropical Composite, the age at CL was negatively correlated with the heifer growth rate in their first postweaning wet season (-0.40) and carcass marbling score (-0.49), but was positively correlated with carcass P8 fat depth (0.43). For Brahman, the age at CL was moderately negatively genetically correlated with heifer measures of bodyweight, fatness, body condition score and IGF-I, in both their first postweaning wet and second dry seasons, but was positively correlated with the dry-season growth rate. For Brahman, genetic correlations between the age at CL and steer traits showed possible antagonisms with feedlot residual feed intake (-0.60) and meat colour (0.73). Selection can be used to change the heifer age at puberty in both genotypes, with few major antagonisms with steer- and heifer-production traits.
Abstract:
Water regulations have decreased irrigation water supplies in Nebraska and some other areas of the USA Great Plains. When the available water is not enough to meet crop water requirements over the entire growing cycle, it becomes critical to know the irrigation timing that maximizes yields and profits. This study evaluated the effect of timing of a deficit-irrigation allocation (150 mm) on crop evapotranspiration (ETc), yield, water use efficiency (WUE = yield/ETc), irrigation water use efficiency (IWUE = yield/irrigation), and dry mass (DM) of corn (Zea mays L.) irrigated with subsurface drip irrigation in the semiarid climate of North Platte, NE. During 2005 and 2006, a total of sixteen irrigation treatments (eight each year) were evaluated, which received different percentages of the water allocation during July, August, and September. In both years, all treatments resulted in no crop stress during the vegetative period and stress during the reproductive stages, which affected ETc, DM, yield, WUE and IWUE. Among treatments, ETc varied by 7.2 and 18.8%; yield by 17 and 33%; WUE by 12 and 22%; and IWUE by 18 and 33% in 2005 and 2006, respectively. Yield and WUE both increased linearly with ETc and with ETc/ETp (ETp = seasonal ETc with no water stress), and WUE increased linearly with yield. The yield response factor (ky) averaged 1.50 over the two seasons. Irrigation timing affected the DM of the plant, grain, and cob, but not that of the stover. It also affected the percentage of DM partitioned to the grain (harvest index), which increased linearly with ETc and averaged 56.2% over the two seasons, but it did not affect the percentage allocated to the cob or stover. Irrigation applied in July had the highest positive coefficient of determination (R²) with yield. This positive correlation decreased considerably for irrigation applied in August, and became negative for irrigation applied in September. The strongest positive correlation between the soil water deficit factor (Ks) and yield occurred during weeks 12-14 from crop emergence, during the "milk" and "dough" growth stages. Yield was poorly correlated with stress during weeks 15 and 16, and the correlation became negative after week 17. Dividing the 150 mm allocation about evenly among July, August and September was a good strategy that produced the highest yields in 2005, but not in 2006. Applying a larger proportion of the allocation in July was a good strategy in both years, whereas applying a large proportion of the allocation in September was not. The contrasting results between years indicate that flexible irrigation scheduling techniques should be adopted, rather than relying on fixed timing strategies.
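The water-productivity terms and the yield response factor reported above follow standard definitions, and the short sketch below shows how they are computed. It is not the study's code; the yield, ETc and irrigation figures in the example are hypothetical placeholders.

```python
# Illustrative sketch (not the study's code): water use efficiency, irrigation
# water use efficiency, and the yield response factor ky from its usual
# definition (1 - Ya/Yp) = ky * (1 - ETa/ETp). All input numbers are hypothetical.

def wue(grain_yield: float, etc: float) -> float:
    """Water use efficiency: yield per unit of seasonal crop ET (e.g. kg/ha per mm)."""
    return grain_yield / etc

def iwue(grain_yield: float, irrigation: float) -> float:
    """Irrigation water use efficiency: yield per unit of irrigation applied."""
    return grain_yield / irrigation

def yield_response_factor(ya: float, yp: float, eta: float, etp: float) -> float:
    """ky: relative yield decline per unit of relative ET deficit."""
    return (1.0 - ya / yp) / (1.0 - eta / etp)

# Hypothetical deficit-irrigated treatment (150 mm allocation) vs. a fully
# watered reference season
print(wue(grain_yield=11000, etc=550))          # kg/ha per mm of ETc
print(iwue(grain_yield=11000, irrigation=150))  # kg/ha per mm of irrigation
print(yield_response_factor(ya=11000, yp=13000, eta=550, etp=650))
```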
Abstract:
Bemisia tabaci biotype B, commonly known as the silverleaf whitefly (SLW), is an alien species that invaded Australia in the mid-1990s. This paper reports on the invasion ecology of SLW and the factors that are likely to have contributed to the first outbreak of this major pest in an Australian cotton cropping system. Population dynamics of SLW within whitefly-susceptible crop (cotton and cucurbit) and non-crop vegetation (sowthistle, Sonchus spp.) components of the cropping system were investigated over four consecutive growing seasons (September-June), 2001/02-2004/05, in the Emerald Irrigation Area (EIA) of Queensland, Australia. Based on fixed geo-referenced sampling sites, variation in the spatial and temporal abundance of SLW within each system component was quantified to provide baseline data for the development of ecologically sustainable pest management strategies. Parasitism of large (3rd and 4th instar) SLW nymphs by native aphelinid wasps was quantified to determine the potential for natural control of SLW populations. Following the initial outbreak in 2001/02, SLW abundance declined and stabilised over the next three seasons. The population dynamics of SLW are characterised by inter-seasonal population cycling between the non-crop (weed) and cotton components of the EIA cropping system. Cotton was the largest sink for, and source of, SLW during the study period. Over-wintering populations dispersed from weed host plant sources to cotton in spring, followed by a reverse dispersal in late summer and autumn to broad-leaved crops and weeds. A basic spatial source-sink analysis showed that SLW adult and nymph densities throughout spring were higher in cotton fields closer to over-wintering weed sources than in fields further away. Cucurbit fields were not significant sources of SLW and did not appear to contribute significantly to the regional population dynamics of the pest. Substantial parasitism of nymphal stages throughout the study period indicates that native parasitoid species and other natural enemies are important sources of SLW mortality in Australian cotton production systems. Weather conditions and the use of broad-spectrum insecticides for pest control are implicated in the initial outbreak and ongoing pest status of SLW in the region.
Abstract:
The present study set out to test, through field and simulation studies, the hypothesis that incorporating short-term summer legumes, particularly the annual legume lablab (Lablab purpureus cv. Highworth), into a fallow-wheat cropping system will improve the overall economic and environmental benefits in south-west Queensland. Replicated, large-plot experiments were established at five commercial properties using the growers' own machinery, and two smaller plot experiments were established at two intensively researched sites (Roma and St George). A detailed study on various other biennial and perennial summer forage legumes in rotation with wheat, as influenced by phosphorus (P) supply (10 and 40 kg P/ha), was also carried out at the two research sites. The other legumes were lucerne (Medicago sativa), butterfly pea (Clitoria ternatea) and burgundy bean (Macroptilium bracteatum). After the legumes, spring wheat (Triticum aestivum) was sown into the legume stubble. The annual lablab produced the highest forage yield, whereas germination, establishment and production of the other biennial and perennial legumes were poor, particularly in the red soil at St George. At the commercial sites, only lablab-wheat rotations were tested, with an increased supply of P in the subsurface soil (20 kg P/ha). The lablab grown at the commercial sites yielded between 3 and 6 t/ha of forage over 2-3 month periods, whereas the following wheat crop, with no applied fertiliser, yielded between 0.5 and 2.5 t/ha. The wheat following lablab yielded 30% less, on average, than the wheat in a fallow plot, yet the profitability of wheat following lablab was slightly higher than that of the wheat following fallow because of the greater costs associated with fallow management. The profitability of the lablab-wheat phase was determined after accounting for the input costs and the additional costs associated with fallow management and in-crop herbicide applications in a fallow-wheat system. The economic and environmental benefits of forage lablab and wheat cropping were also assessed through simulations over a long-term climatic pattern, using economic (PreCAPS) and biophysical (Agricultural Production Systems Simulator, APSIM) decision support models. Analysis of the long-term rainfall pattern (70% in summer and 30% in winter) and the simulation studies indicated that ~50% of the time a wheat crop would not be planted, or would fail to produce a profitable crop (grain yield less than 1 t/ha), because of low and unreliable winter rainfall. In contrast, forage lablab in summer would produce a profitable crop, with a forage yield of more than 3 t/ha, ~90% of the time. Only 14 wheat crops (of 26 growing seasons, i.e. 54%) were profitable, compared with 22 forage lablab crops (of 25 seasons, i.e. 90%). Opportunistic double-cropping of lablab in summer and wheat in winter is also viable and profitable in 50% of the years. The simulation studies also indicated that opportunistic lablab-wheat cropping can reduce the potential runoff+drainage by more than 40% in the Roma region, leading to improved economic and environmental benefits.
Abstract:
The present review identifies various constraints relating to the poor adoption of ley-pastures in south-west Queensland, and suggests changes in research, development and extension efforts for improved adoption. The constraints are biophysical, economic and social. In terms of biophysical constraints, first, shallow soil profiles with subsoil constraints (salt and sodicity), unpredictable rainfall, drier conditions with higher soil temperature and evaporative demand in summer, and frost and subzero temperatures in winter, frequently result in the failure of established, or establishing, pastures. Second, there are limited options for legumes in a ley-pasture, the legumes currently being mostly winter-active species such as lucerne and medics. Winter-active legumes are ineffective in improving soil conditions in a region with summer-dominant rainfall. Third, most grain growers are reluctant to include grasses in their ley-pasture mix, which can be uneconomical for various reasons, including nitrogen immobilisation, carryover of cereal diseases and depressed yields of the following cereal crops. Fourth, severe depletion of soil water following perennial ley-pastures (grass + legumes or lucerne) can reduce the yields of subsequent crops for several seasons, and the practice of longer fallows to increase soil water storage may be uneconomical and damaging to the environment. Economic assessments of integrating medium- to long-term ley-pastures into cropping regions are generally unfavourable because of reduced capital flow, increased capital investment, economic losses associated with the establishment and termination phases of ley-pastures, and lost opportunities for cropping in a favourable season. Income from livestock on ley-pastures and soil productivity gains to subsequent crops in rotation may not be comparable to cropping when grain prices are high. However, the economic benefits of ley-pastures may be underestimated because of unaccounted environmental benefits, such as enhanced water use and reduced soil erosion from summer-dominant rainfall, and this therefore requires further investigation. In terms of social constraints, the risk of poor and unreliable establishment and persistence, uncertainties in economic and environmental benefits, the complicated process of changing from crop to ley-pasture and vice versa, and the additional labour and management requirements of livestock, present growers with a socially unattractive and complex decision-making process when considering adoption of existing medium- to long-term ley-pasture technology. Research, development and extension efforts should recognise that new ley-pasture options, such as the incorporation of a short-term summer forage legume, need to be less risky to establish, productive under the region's prevailing biophysical constraints, economically viable, less complex and highly flexible in the change-over processes, and socially attractive to growers for adoption in south-west Queensland.
Abstract:
The response of soybean (Glycine max) and dry bean (Phaseolus vulgaris) to feeding by Helicoverpa armigera during the pod-fill stage was studied in irrigated field cages over three seasons to determine the relationship between larval density and yield loss, and to develop economic injury levels. H. armigera intensity was calculated in Helicoverpa injury equivalent (HIE) units, where 1 HIE was the consumption of one larva from the start of the infestation period to pupation. In the dry bean experiment, yield loss occurred at a rate of 6.00 ± 1.29 g/HIE, while the rates of loss in the three soybean experiments were 4.39 ± 0.96 g/HIE, 3.70 ± 1.21 g/HIE and 2.12 ± 0.71 g/HIE. These three slopes were not statistically different (P > 0.05) and the pooled estimate of the rate of yield loss was 3.21 ± 0.55 g/HIE. The first soybean experiment also showed a split-line damage curve, with a rate of yield loss of 26.27 ± 2.92 g/HIE beyond 8.0 HIE and a rapid decline to zero yield. In dry bean, H. armigera feeding reduced total and undamaged pod numbers by 4.10 ± 1.18 pods/HIE and 12.88 ± 1.57 pods/HIE, respectively, while undamaged seed numbers were reduced by 35.64 ± 7.25 seeds/HIE. In soybean, total pod numbers were not affected by H. armigera infestation (out to 8.23 HIE in Experiment 1), but seed numbers (in Experiments 1 and 2) and the number of seeds per pod (in all experiments) were adversely affected. Seed size increased with H. armigera density in two of the three soybean experiments, indicating plant compensatory responses to H. armigera feeding. Analysis of canopy pod profiles indicated that loss of pods occurred from the top of the plant downwards, but with an increase in pod numbers close to the ground at higher pest densities as the plant attempted to compensate for damage. Based on these results, the economic injury levels for H. armigera on dry bean and soybean are approximately 0.74 and 2.31 HIE/m², respectively (0.67 and 2.1 HIE/row-m for 91 cm rows).
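As a rough illustration of how a per-HIE loss rate can feed into an economic injury level, the sketch below applies the generic EIL = C / (V × D) relationship. The loss rate is the pooled soybean estimate quoted above, but the control cost and crop value are hypothetical placeholders, and treating the loss rate as per square metre is an assumption made only for this example; it is not the authors' published calculation.

```python
# Illustrative sketch (hypothetical cost and price figures, not the authors'):
# generic economic injury level EIL = C / (V * D), where C is the control cost
# ($/ha), V is the crop value ($/kg) and D is yield loss per pest unit
# (kg/ha per HIE/m^2).

def eil_hie_per_m2(control_cost_per_ha: float,
                   crop_value_per_kg: float,
                   loss_g_per_hie_per_m2: float) -> float:
    # 1 g/m^2 of yield loss corresponds to 10 kg/ha
    loss_kg_ha_per_hie = loss_g_per_hie_per_m2 * 10.0
    return control_cost_per_ha / (crop_value_per_kg * loss_kg_ha_per_hie)

# Pooled soybean loss rate of 3.21 g/HIE, with assumed control cost and grain price
print(eil_hie_per_m2(control_cost_per_ha=30.0,
                     crop_value_per_kg=0.45,
                     loss_g_per_hie_per_m2=3.21))
```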
Abstract:
The variation in liveweight gain of grazing beef cattle as influenced by pasture type, season and year has important economic implications for mixed crop-livestock systems, and the ability to better predict such variation would benefit beef producers by providing a guide for decision making. To identify key determinants of liveweight change in Brahman-cross steers grazing subtropical pastures, measurements of pasture quality and quantity, and of diet quality, were made in parallel with liveweight over two consecutive grazing seasons (48 and 46 weeks, respectively) on mixed Clitoria ternatea/grass, Stylosanthes seabrana/grass and grass swards (the grass being a mixture of Bothriochloa insculpta cv. Bisset, Dichanthium sericeum and Panicum maximum var. trichoglume cv. Petrie). Steers grazing the legume-based pastures had the highest growth rates and gained between 64 and 142 kg more than those grazing the grass pastures in under 12 months. Using an exponential model, green leaf mass, green leaf %, adjusted green leaf % (adjusted for inedible woody legume stems), faecal near-infrared reflectance spectroscopy predictions of diet crude protein, and diet dry matter digestibility accounted for 77, 74, 80, 63 and 60%, respectively, of the variation in daily weight gain when data were pooled across pasture types and grazing seasons. The standard error of the regressions indicated that 95% prediction intervals were large (±0.42-0.64 kg/head.day), suggesting that the derived regression relationships have limited practical application for accurately estimating growth rate. In this study, animal factors, especially compensatory growth effects, appeared to have a major influence on growth rate relative to pasture and diet attributes. It was concluded that predictions of growth rate based only on pasture or diet attributes are unlikely to be accurate or reliable. Nevertheless, key pasture attributes such as green leaf mass and green leaf % provide a robust indication of what proportion of the potential growth rate of the grazing animals can be achieved.
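The kind of single-predictor exponential regression described above can be sketched as follows. The asymptotic functional form, the starting values and the data points are assumptions for illustration only; they are not the fitted model or data from the study.

```python
# Illustrative sketch: fitting an asymptotic exponential model of daily liveweight
# gain (LWG) against green leaf mass and reporting the variance explained.
# The functional form and all observations below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def lwg_model(green_leaf_mass, a, b, c):
    """LWG (kg/head.day) rising towards an asymptote as green leaf mass increases."""
    return a - b * np.exp(-c * green_leaf_mass)

green_leaf = np.array([100, 300, 600, 1000, 1500, 2200])   # kg DM/ha (hypothetical)
lwg = np.array([0.05, 0.35, 0.60, 0.75, 0.85, 0.90])       # kg/head.day (hypothetical)

params, _ = curve_fit(lwg_model, green_leaf, lwg, p0=[1.0, 1.0, 0.002])
predicted = lwg_model(green_leaf, *params)
r_squared = 1 - np.sum((lwg - predicted) ** 2) / np.sum((lwg - lwg.mean()) ** 2)
print(params, r_squared)
```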
Abstract:
Varying the spatial distribution of applied nitrogen (N) fertilizer to match crop demand has been shown to increase profits in Australia. Better matching the timing of N inputs to plant requirements has been shown to improve nitrogen use efficiency and crop yields, and could reduce nitrous oxide emissions from broadacre grains. Farmers in the wheat production area of south-eastern Australia are increasingly splitting N application, with the second application timed at stem elongation (Zadoks 30). Spectral indices have shown the ability to detect crop canopy N status, but a robust method using a consistent calibration that functions across seasons has been lacking. One spectral index, the canopy chlorophyll content index (CCCI), designed to detect canopy N using three wavebands along the "red edge" of the spectrum, was combined with the canopy nitrogen index (CNI), which was developed to normalize for crop biomass and correct for the N dilution effect of crop canopies. The CCCI-CNI approach was applied in a 3-year study to develop a single calibration derived from a wheat crop sown in research plots near Horsham, Victoria, Australia. The index was able to predict canopy N (g m⁻²) from Zadoks 14-37 with an r² of 0.97 and an RMSE of 0.65 g N m⁻² when dry-weight biomass per unit area was also considered. We suggest that remote estimates of N use N per unit area as the metric, and that referring directly to canopy %N is not appropriate for estimating plant N concentration without first accounting for the N dilution effect. This approach provides a link to crop development rather than creating a purely numerical relationship. The sole biophysical input, biomass, is challenging to quantify robustly via spectral methods. Combining remote sensing with crop modelling could provide a robust method for estimating biomass, and therefore a method to estimate canopy N remotely. Future research will explore this, together with the use of active and passive sensor technologies, for precision farming and targeted N management.
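A simplified sketch of a CCCI-style calculation is given below. In the published index the NDRE bounds vary with canopy cover, whereas here fixed bounds are used; the band centres (near 720 and 790 nm) and the reflectance values are assumptions made purely for illustration.

```python
# Simplified, illustrative CCCI-style calculation. Fixed NDRE bounds and the band
# centres are assumptions; the published method scales NDRE between bounds that
# depend on canopy cover.

def ndre(nir: float, red_edge: float) -> float:
    """Normalized difference red edge index from NIR (~790 nm) and red edge (~720 nm)."""
    return (nir - red_edge) / (nir + red_edge)

def ccci(nir: float, red_edge: float,
         ndre_min: float = 0.1, ndre_max: float = 0.5) -> float:
    """Canopy chlorophyll content index: NDRE rescaled between lower/upper bounds."""
    return (ndre(nir, red_edge) - ndre_min) / (ndre_max - ndre_min)

# Hypothetical canopy reflectances (as fractions)
print(ccci(nir=0.45, red_edge=0.30))
```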
Abstract:
Vertebrates play a major role in dispersing the seeds of fleshy-fruited alien plants. However, we know little about how the traits of alien fleshy fruits compare with those of indigenous fleshy fruits, and how these differences might contribute to invasion success. In this study, we characterised up to 38 fruit morphology, pulp nutrient and phenology traits of an assemblage of 34 vertebrate-dispersed alien species in south-eastern Queensland, Australia. Most alien fruits were small (81% < 15 mm in mean width), and had watery fruit pulps that were high in sugars and low in nitrogen and lipids. When compared with indigenous species, alien fruits had significantly smaller seeds. Further, alien fruit pulps contained more sugar and more variable (and probably greater) nitrogen per unit pulp wet weight, and alien species tended to have longer fruiting seasons than indigenous species. Our analyses suggest that fruit traits could be important in determining invasiveness and could be used to improve pre- and post-border weed risk assessment.
Abstract:
When exposed to hot (22-35 degrees C) and dry climatic conditions in the field during the final 4-6 weeks of pod filling, peanuts (Arachis hypogaea L.) can accumulate highly carcinogenic and immuno-suppressing aflatoxins. Forecasting the risk posed by these conditions can assist in minimizing pre-harvest contamination. A model was therefore developed as part of the Agricultural Production Systems Simulator (APSIM) peanut module, which calculated an aflatoxin risk index (ARI) using four temperature response functions when fractional available soil water was <0.20 and the crop was in the last 0.40 of the pod-filling phase. ARI explained 0.95 (P ≤ 0.05) of the variation in aflatoxin contamination, which varied from 0 to c. 800 µg/kg in 17 large-scale sowings in tropical and four sowings in sub-tropical environments carried out in Australia between 13 November and 16 December 2007. ARI also explained 0.96 (P ≤ 0.01) of the variation in the proportion of aflatoxin-contaminated loads (>15 µg/kg) of peanuts in the Kingaroy region of Australia between the 1998/99 and 2007/08 seasons. Simulation of ARI using historical climatic data from 1890 to 2007 indicated a three-fold increase in its value since 1980 compared with the entire previous period. The increase was associated with increases in ambient temperature and decreases in rainfall. To facilitate routine monitoring of aflatoxin risk by growers in near real time, a web interface for the model was also developed. The ARI predicted using this interface for eight growers correlated significantly with the level of contamination in their crops (r = 0.95, P ≤ 0.01). These results suggest that the ARI simulated by the model is a reliable indicator of aflatoxin contamination that can be used in aflatoxin research as well as in decision support to monitor pre-harvest aflatoxin risk in peanuts.
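The abstract states when risk is accumulated (low available soil water during the last 0.40 of pod fill) but not the four temperature response functions themselves, so the sketch below is only a schematic of that gating logic: a single triangular temperature response whose cardinal temperatures, and the daily inputs, are assumptions chosen for illustration; it is not the APSIM peanut module code.

```python
# Heavily simplified, illustrative ARI-style accumulation. The temperature
# response, its cardinal temperatures and the example days are assumptions.
def temperature_response(t_mean: float, t_min: float = 22.0,
                         t_opt: float = 30.0, t_max: float = 35.0) -> float:
    """0-1 weighting that peaks at an assumed optimum temperature for risk."""
    if t_mean <= t_min or t_mean >= t_max:
        return 0.0
    if t_mean <= t_opt:
        return (t_mean - t_min) / (t_opt - t_min)
    return (t_max - t_mean) / (t_max - t_opt)

def aflatoxin_risk_index(daily: list[dict]) -> float:
    """Accumulate risk only when fractional available soil water is <0.20 and
    the crop has completed more than 0.60 of the pod-filling phase."""
    ari = 0.0
    for d in daily:
        if d["fasw"] < 0.20 and d["pod_fill_fraction"] > 0.60:
            ari += temperature_response(d["t_mean"])
    return ari

# Hypothetical late pod-fill days
days = [{"t_mean": 31.0, "fasw": 0.15, "pod_fill_fraction": 0.7},
        {"t_mean": 28.0, "fasw": 0.25, "pod_fill_fraction": 0.8},
        {"t_mean": 33.0, "fasw": 0.10, "pod_fill_fraction": 0.9}]
print(aflatoxin_risk_index(days))
```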
Abstract:
Pseudocercospora macadamiae causes husk spot of macadamia. Husk spot control would be improved by verifying the stages of fruit development that are susceptible to infection, and by determining some of the climatic conditions likely to lead to periods of high disease pressure in the field. Our results showed that the percentage of conidia germinating and the growth of germ tubes and mycelia of P. macadamiae were greatest at 26 degrees C, with better conidia germination associated with high relative humidity and free water. Exposure of match-head-sized and pea-sized fruit stages to natural P. macadamiae inoculum in the field led to 2-5-fold increases in husk spot incidence, and up to 8.5-fold increases in premature abscission, compared with unexposed fruit. Exposure of fruit stages later than match-head-sized and pea-sized fruit generally caused no further increases in disease incidence or premature abscission. Climatic conditions were found to have a strong influence on the behaviour of P. macadamiae, the host, oil accumulation, and the subsequent impact of husk spot on premature abscission. Our findings suggest that fungicide application should target fruit at the match-head-sized stage of development in order to best reduce yield losses, particularly in seasons where oil accumulation in fruit is prolonged and climatic conditions are optimal for P. macadamiae.
Abstract:
In grassland reserves, managed disturbance is often necessary to maintain plant species diversity. We carried out experiments to determine the impact of fire, kangaroo grazing, mowing and disc ploughing on grassland species richness and composition in a nature reserve in semi-arid eastern Australia. Vegetation response was influenced by winter-spring drought after establishment of the experiments, but moderate rainfall followed in late summer-autumn. Species composition varied greatly between sampling times, and the variability due to rainfall differences between seasons and years was greater than the effects of fire, kangaroo grazing, mowing or disc ploughing. In the fire experiment, species richness and composition recovered more rapidly after spring than autumn burning. Species richness and composition were similar to control sites within 12 months of burning and mowing, suggesting that removal of the dominant grass canopy is unnecessary to enhance plant diversity. Two fires (separated by 3 years) and post-fire kangaroo grazing had only minor influence on species richness and composition. Even disc ploughing caused only a small reduction in native richness. The minor impact of ploughing was explained by the small areas that were ploughed, the once-off nature of the treatment, and the high degree of natural movement and cracking in these shrink-swell soils. Recovery of the composition and richness of these grasslands was rapid because of the high proportion of perennial species that resprout vegetatively after fire and mowing. There appears to be little conservation benefit from fire, mowing or ploughing ungrazed areas, as we could identify no native plant species dependent on frequent disturbance for persistence in this grassland community. However, the ability of the Astrebla- and Dichanthium-dominated grasslands to recover quickly after disturbance, given favourable seasonal conditions, suggests that they are well adapted to natural disturbances (e.g. droughts, fire, flooding and native grazing).
Abstract:
We compared daily net radiation (Rn) estimates from 19 methods with the ASCE-EWRI Rn estimates in two climates, Clay Center, Nebraska (sub-humid) and Davis, California (semi-arid), for the calendar year. The performance of all 20 methods, including the ASCE-EWRI Rn method, was then evaluated against Rn data measured over a non-stressed maize canopy during two growing seasons, 2005 and 2006, at Clay Center. Methods differ in terms of inputs, structure, and equation intricacy. Most methods differ in estimating the cloudiness factor and emissivity (ε), and in calculating net longwave radiation (Rnl). All methods use an albedo (α) of 0.23 for a reference grass/alfalfa surface. When comparing the performance of all 20 Rn methods with measured Rn, we hypothesized that the α values for grass/alfalfa and a non-stressed maize canopy were similar enough to cause only minor differences in Rn and in grass- and alfalfa-reference evapotranspiration (ETo and ETr) estimates. The measured seasonal average α for the maize canopy was 0.19 in both years. Using α = 0.19 instead of α = 0.23 resulted in a 6% overestimation of Rn. For the ETo and ETr estimations, this 6% difference in Rn translated to only 4% and 3% differences in ETo and ETr, respectively, supporting the validity of our hypothesis. Most methods had good correlations with the ASCE-EWRI Rn (r² > 0.95). The root mean square difference (RMSD) was less than 2 MJ m⁻² d⁻¹ between 12 methods and the ASCE-EWRI Rn at Clay Center, and between 14 methods and the ASCE-EWRI Rn at Davis. The performance of some methods varied between the two climates. In general, r² values were higher for the semi-arid climate than for the sub-humid climate. Methods that use a dynamic ε as a function of mean air temperature performed better in both climates than those that calculate ε from actual vapor pressure. The ASCE-EWRI-estimated Rn values had one of the best agreements with the measured Rn (r² = 0.93, RMSD = 1.44 MJ m⁻² d⁻¹), and the estimates were within 7% of the measured Rn. The Rn estimates from six methods, including the ASCE-EWRI, were not significantly different from measured Rn. Most methods underestimated measured Rn by 6% to 23%. Some of the differences between measured and estimated Rn were attributed to poor estimation of Rnl. We conducted sensitivity analyses to evaluate the effect of Rnl on Rn, ETo, and ETr. The Rnl effect on Rn was linear and strong, but its effect on ETo and ETr was subsidiary. The results suggest that Rn data measured over green vegetation (e.g., an irrigated maize canopy) can be an alternative Rn data source for ET estimation when measured Rn data over the reference surface are not available. In the absence of measured Rn, another alternative would be to use one of the Rn models that we analyzed when not all the input variables are available to solve the ASCE-EWRI Rn equation. Our results can provide practical information on which method to select, based on data availability, for reliable estimates of daily Rn in climates similar to those of Clay Center and Davis.
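For context, the sketch below shows the general form of a standardized daily net radiation calculation of the kind compared in this study: net shortwave from albedo, and net longwave from air temperature, vapour pressure and relative shortwave radiation. It is a simplified illustration rather than any specific method from the paper; clear-sky radiation is taken as an input instead of being computed from site data, and the weather values are hypothetical.

```python
import math

# Sketch of an FAO-56/ASCE-style daily net radiation calculation:
# Rn = (1 - albedo) * Rs - Rnl. Clear-sky radiation Rso is supplied directly
# here rather than derived from elevation and sun angle; inputs are hypothetical.

SIGMA = 4.903e-9  # Stefan-Boltzmann constant, MJ K^-4 m^-2 d^-1

def net_radiation(rs: float, rso: float, tmax_c: float, tmin_c: float,
                  ea_kpa: float, albedo: float = 0.23) -> float:
    """Daily net radiation (MJ m^-2 d^-1)."""
    rns = (1.0 - albedo) * rs                              # net shortwave
    tmax_k4 = (tmax_c + 273.16) ** 4
    tmin_k4 = (tmin_c + 273.16) ** 4
    emissivity_term = 0.34 - 0.14 * math.sqrt(ea_kpa)      # net emissivity
    cloudiness = 1.35 * min(rs / rso, 1.0) - 0.35          # cloudiness factor
    rnl = SIGMA * (tmax_k4 + tmin_k4) / 2.0 * emissivity_term * cloudiness
    return rns - rnl

# Hypothetical mid-summer day: compare the grass-reference albedo (0.23)
# with the measured maize-canopy albedo (0.19) reported above
print(net_radiation(rs=28.0, rso=30.0, tmax_c=32.0, tmin_c=18.0, ea_kpa=1.8))
print(net_radiation(rs=28.0, rso=30.0, tmax_c=32.0, tmin_c=18.0, ea_kpa=1.8,
                    albedo=0.19))
```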
Abstract:
A better understanding of the seed-bank dynamics of Echinochloa colona, Urochloa panicoides and Hibiscus trionum, major crop weeds in sub-tropical Australia, was needed to improve weed control. Emergence patterns and seed persistence were investigated, with viable seeds sown at different depths in large in-ground pots. Seedlings of all species emerged between October and March, when mean soil temperatures were 21-23 degrees C. However, E. colona emerged as a series of flushes, predominantly in the first year, with most seedlings emerging from 0-2 cm. Urochloa panicoides emerged mostly as a single large flush in the first two years, with most seedlings emerging from 5 cm. Hibiscus trionum emerged as a series of flushes over three seasons, initially with the majority from 5 cm and then from 0-2 cm in the later seasons. Longevity of the grass seeds was short, with <5% remaining after burial at 0-2 cm for 24 months. In contrast, 38% of H. trionum seeds remained viable after the same period. Persistence of all species increased significantly with burial depth. These data highlight that management strategies need to be tailored for each species, particularly in relation to the need for monitoring, application times for control tactics, the impact of tillage, and the time needed to reduce the seed-bank to low numbers.