6 results for Yam
in University of Queensland eSpace - Australia
Abstract:
Recent multidisciplinary investigations document an independent emergence of agriculture at Kuk Swamp in the highlands of Papua New Guinea. In this paper we report preliminary usewear analysis and details of prehistoric use of stone tools for processing starchy food and other plants at Kuk Swamp. Morphological diagnostics for starch granules are reported for two potentially significant economic species, taro (Colocasia esculenta) and yam (Dioscorea sp.), following comparisons between prehistoric and botanical reference specimens. Usewear and residue analyses of starch granules indicate that both these species were processed on the wetland margin during the early and mid-Holocene. We argue that processing of taro and yam commenced by at least 10,200 calibrated years before present (cal BP), although the taro and yam starch granules do not permit us to distinguish between wild and cultivated forms. From at least 6950 to 6440 cal BP the processing of taro, yam and other plants indicates that they are likely to have been integrated into cultivation practices on the wetland edge.
Abstract:
This paper describes the first systematic study of nutritional deficiencies of greater yam (Dioscorea alata). Yam plants (cv. 'Mahoa'a') were propagated from tuber discs and grown in nutrient solution, with nutrients supplied following a modified programmed nutrient-addition method. After an establishment period of four weeks, deficiencies of nitrogen (N), phosphorus (P), potassium (K), calcium (Ca), magnesium (Mg), sulfur (S), iron (Fe), boron (B), manganese (Mn), copper (Cu), zinc (Zn), and molybdenum (Mo) were induced by omitting the relevant nutrient from the solution. Foliar symptoms were recorded photographically. Notably, deficiencies of the mobile macronutrients failed to induce senescence of oldest leaves, while vine growth and younger leaves were affected. Leaf blades of the main stem were sampled in sequence and analyzed chemically, providing the distribution of each nutrient from youngest to oldest leaves in both adequately supplied and deficient plants. The nutrient-concentration profiles, together with the visible symptoms, indicated that little remobilization of mobile macronutrients had occurred. For both macro- and micronutrients, young leaves gave the best separation of nutrient concentrations between well-nourished and deficient plants.
Abstract:
We present optical, near-IR, and radio follow-up of 16 Swift bursts, including our discovery of nine afterglows and a redshift determination for three. These observations, supplemented by data from the literature, provide an afterglow recovery rate of 52% in the optical/near-IR, much higher than in previous missions (BeppoSAX, HETE-2, INTEGRAL, and IPN). The optical/near-IR afterglows of Swift events are on average 1.8 mag fainter at t = 12 hr than those of previous missions. The X-ray afterglows are similarly fainter than those of pre-Swift bursts. In the radio the limiting factor is the VLA threshold, and the detection rate for Swift bursts is similar to that for past missions. The redshift distribution of pre-Swift bursts peaked at z ~ 1, whereas the six Swift bursts with measured redshifts are distributed evenly between 0.7 and 3.2. From these results we conclude that (1) the pre-Swift distributions were biased in favor of bright events and low-redshift events, (2) the higher sensitivity and accurate positions of Swift result in a better representation of the true burst redshift and brightness distributions (which are higher and dimmer, respectively), and (3) ~10% of the bursts are optically dark, as a result of a high redshift and/or dust extinction. We remark that the apparent lack of low-redshift, low-luminosity Swift bursts and the lower event rate than prelaunch estimates (90 vs. 150 per year) are the result of a threshold that is similar to that of BATSE. In view of these inferences, afterglow observers may find it advisable to make significant changes in follow-up strategies of Swift events. The faintness of the afterglows means that large telescopes should be employed as soon as the burst is localized. Sensitive observations in RIz and near-IR bands will be needed to discriminate between a typical z ~ 2 burst with modest extinction and a high-redshift event. Radio observations will be profitable for a small fraction (~10%) of events. Finally, we suggest that a search for bright host galaxies in untriggered BAT localizations may increase the chance of finding nearby low-luminosity GRBs.
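The quoted 1.8 mag average deficit of Swift afterglows translates into a flux ratio via the standard Pogson relation (the relation is textbook astronomy, not something derived in this abstract; the function name is my own):

```python
def flux_ratio(delta_mag: float) -> float:
    """Pogson relation: a magnitude difference dm corresponds
    to a flux ratio of 10**(-0.4 * dm)."""
    return 10 ** (-0.4 * delta_mag)

# Swift afterglows are on average 1.8 mag fainter at t = 12 hr,
# i.e. they deliver only about 19% of the flux of pre-Swift afterglows.
print(f"{flux_ratio(1.8):.3f}")  # ~0.191
```

This factor-of-five drop in flux is why the abstract argues for larger telescopes and faster response in follow-up campaigns.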
Abstract:
A study was carried out on a previously eroded Oxic Paleustalf in Ibadan, southwestern Nigeria to determine the extent of soil degradation under mound tillage with some herbaceous legumes and residue management methods. A series of factorial experiments was carried out on 12 existing runoff plots. The study commenced in 1996 after a 5-year natural fallow. Mound tillage was practised from 1997 to 1999. The legumes - Vigna unguiculata (cowpea), Mucuna pruriens and Pueraria phaseoloides - were intercropped with maize in 1996 and 1998, while yam was planted alone in 1997 and 1999. This paper covers 1997-1999. At the end of each year, residues were either burned or mulched on respective plots. Soil loss, runoff, variations in mound height, bulk density, soil water retention and sorptivity were measured. Cumulative runoff was similar among interactions of legume and residue management in 1997 (57-151 mm) and 1999 (206-397 mm). However, in 1998, cumulative runoff of 95 mm observed for Mucuna-burned residue was significantly greater than the 46 mm observed for cowpea-burned residue and the 39-51 mm observed for mulched residues of cowpea, Mucuna and Pueraria. Cumulative soil loss of 7.6 Mg ha(-1) observed for Mucuna-burned residue in 1997 was significantly greater than those for Pueraria-mulched (0.9 Mg ha(-1)) and Mucuna-mulched (1.4 Mg ha(-1)) residues, whereas in 1999 it was similar to soil loss from cowpea treatments and Pueraria-burned residue (2.3-5.3 Mg ha(-1)). There were no significant differences in soil loss in 1998 (1-3.2 Mg ha(-1)), whereas Mucuna-burned residue had a greater soil loss (28.6 Mg ha(-1)) than mulched cowpea (6.9 Mg ha(-1)) and Pueraria (5.4 Mg ha(-1)). Mound heights (23 cm average) decreased non-linearly with cumulative rainfall.
A cumulative rainfall of 500 mm removed 0.3-2.3 cm of soil from mounds in 1997, 3.5-6.9 cm in 1998 and 2.3-4.6 cm in 1999, indicating that the amount of soil detached from mounds (though not all of it transported off the plots) was far higher than the observed soil loss in each year. Soil water retention was improved at potentials ranging from -1 to -1500 kPa by Mucuna-mulched residue compared to the various burned-residue treatments. Also, mound sorptivity at -1 cm water head (14.3 cm h(-1/2)) was higher than furrow sorptivity (8.5 cm h(-1/2)), indicating differences in hydraulic characteristics between mound and furrow. Pueraria-mulched residue on mounds had the highest sorptivity of 17.24 cm h(-1/2), whereas the lowest value of 6.96 cm h(-1/2) was observed in the furrow of Mucuna-burned residue. Pueraria phaseoloides was considered the best option for soil conservation on the previously eroded soil cultivated with mound tillage. (c) 2005 Elsevier B.V. All rights reserved.
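The sorptivity values reported above (in cm h(-1/2)) can be interpreted through the leading term of Philip's infiltration equation, I(t) ≈ S·√t, which dominates at early times before gravity flow takes over. This relation is standard soil physics rather than something stated in the abstract, and the function name below is my own; a minimal sketch comparing the mound and furrow figures:

```python
import math

def early_infiltration(sorptivity: float, t_hours: float) -> float:
    """Early-time Philip approximation: cumulative infiltration
    I(t) ~= S * sqrt(t), with S in cm h**(-1/2) and t in hours."""
    return sorptivity * math.sqrt(t_hours)

# Values reported in the abstract at -1 cm water head,
# evaluated over the first 15 minutes (0.25 h):
mound = early_infiltration(14.3, 0.25)   # 7.15 cm
furrow = early_infiltration(8.5, 0.25)   # 4.25 cm
print(mound, furrow)
```

On this reading, mounds absorb water substantially faster than furrows early in a storm, consistent with the abstract's point about contrasting hydraulic characteristics.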