951 results for soil total digestion
Abstract:
A comparison of the DNase I digestion products of the 32P-5’-end-labeled pachytene nucleosome core particles (containing histones H2A, TH2A, X2, H2B, TH2B, H3, and H4) and liver nucleosome core particles (containing somatic histones H2A, H2B, H3, and H4) revealed that the cleavage sites that are 30, 40, and 110 nucleotides away from the 5’-end are significantly more accessible in the pachytene core particles than in the liver core particles. These cleavage sites correspond to the region wherein H2B interacts with the nucleosome core DNA. These results, therefore, suggest that the histone-DNA interaction at these sites in the pachytene core particles is weaker, possibly because of the presence of the histone variant TH2B interacting at similar topological positions in the nucleosome core as its somatic counterpart H2B. Such a loosened structure may also be maintained even in the native pachytene chromatin, since micrococcal nuclease digestion of pachytene nuclei resulted in a higher ratio of subnucleosomes (SN4 + SN?) to mononucleosomes than that observed in liver chromatin.
Abstract:
It has been reported that high-density planting of sugarcane can improve cane and sugar yield through promoting rapid canopy closure and increasing radiation interception earlier in crop growth. It is widely known that the control of adverse soil biota through fumigation (which removes soil biological constraints and improves soil health) can improve cane and sugar yield. Whether the responses to high-density planting and improved soil health are additive or interactive has important implications for the sugarcane production system. Field experiments established at Bundaberg and Mackay, Queensland, Australia, involved all combinations of two row spacings (0.5 and 1.5 m), two planting densities (27 000 and 81 000 two-eyed setts/ha), and two soil fumigation treatments (fumigated and non-fumigated). The Bundaberg experiment had two cultivars (Q124, Q155), was fully irrigated, and was harvested 15 months after planting. The Mackay experiment had one cultivar (Q117), was grown under rainfed conditions, and was harvested 10 months after planting. High-density planting (81 000 setts/ha in 0.5-m rows) did not produce any more cane or sugar yield at harvest than low-density planting (27 000 setts/ha in 1.5-m rows) regardless of location, crop duration (15 v. 10 months), water supply (irrigated v. rainfed), or soil health (fumigated v. non-fumigated). Conversely, soil fumigation generally increased cane and sugar yields regardless of site, row spacing, and planting density. In the Bundaberg experiment there was a large fumigation × cultivar × density interaction (P < 0.01). Cultivar Q155 responded positively to higher planting density in non-fumigated soil but not in fumigated soil, while Q124 showed a negative response to higher planting density in non-fumigated soil but no response in fumigated soil. In the Mackay experiment, Q117 showed a non-significant trend of increasing yield in response to increasing planting density in non-fumigated soil, similar to the Q155 response in non-fumigated soil at Bundaberg. The similarity in yield across the range of row spacings and planting densities within experiments was largely due to compensation between stalk number and stalk weight, particularly when fumigation was used to address soil health. Further, the different cultivars (Q124 and Q155 at Bundaberg and Q117 at Mackay) exhibited differing physiological responses to the fumigation, row spacing, and planting density treatments. These included the rate of tiller initiation and subsequent loss, changes in stalk weight, and propensity to lodging. These responses suggest that there may be potential for selecting cultivars suited to different planting configurations.
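As an aside on the arithmetic of these configurations (an illustrative calculation, not taken from the paper), the two contrasted treatments imply the same within-row sett density; the short Python sketch below makes the conversion explicit.

```python
# Illustrative arithmetic (not from the paper): within-row sett density
# implied by the two planting configurations contrasted in the abstract.

HECTARE_M2 = 10_000  # square metres per hectare

def setts_per_metre_of_row(setts_per_ha: float, row_spacing_m: float) -> float:
    """Setts per metre of row = setts/ha divided by metres of row per ha."""
    row_metres_per_ha = HECTARE_M2 / row_spacing_m
    return setts_per_ha / row_metres_per_ha

# High-density treatment: 81 000 setts/ha in 0.5 m rows
# Low-density treatment:  27 000 setts/ha in 1.5 m rows
for setts, spacing in [(81_000, 0.5), (27_000, 1.5)]:
    print(f"{setts:>6} setts/ha @ {spacing} m rows -> "
          f"{setts_per_metre_of_row(setts, spacing):.2f} setts/m of row")
# Both print 4.05 setts/m: the high-density treatment triples the number of
# rows per hectare while keeping the within-row sett spacing constant.
```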
Abstract:
Water regulations have decreased irrigation water supplies in Nebraska and some other areas of the USA Great Plains. When available water is not enough to meet crop water requirements during the entire growing cycle, it becomes critical to know the proper irrigation timing that would maximize yields and profits. This study evaluated the effect of timing of a deficit-irrigation allocation (150 mm) on crop evapotranspiration (ETc), yield, water use efficiency (WUE = yield/ETc), irrigation water use efficiency (IWUE = yield/irrigation), and dry mass (DM) of corn (Zea mays L.) irrigated with subsurface drip irrigation in the semiarid climate of North Platte, NE. During 2005 and 2006, a total of sixteen irrigation treatments (eight each year) were evaluated, which received different percentages of the water allocation during July, August, and September. During both years, all treatments resulted in no crop stress during the vegetative period but in stress during the reproductive stages, which affected ETc, DM, yield, WUE and IWUE. Among treatments, ETc varied by 7.2 and 18.8%; yield by 17 and 33%; WUE by 12 and 22%; and IWUE by 18 and 33% in 2005 and 2006, respectively. Yield and WUE both increased linearly with ETc and with ETc/ETp (ETp = seasonal ETc with no water stress), and WUE increased linearly with yield. The yield response factor (ky) averaged 1.50 over the two seasons. Irrigation timing affected the DM of the plant, grain, and cob, but not that of the stover. It also affected the percentage of DM partitioned to the grain (harvest index), which increased linearly with ETc and averaged 56.2% over the two seasons, but did not affect the percentage allocated to the cob or stover. Irrigation applied in July had the highest positive coefficient of determination (R2) with yield. This high positive correlation decreased considerably for irrigation applied in August, and became negative for irrigation applied in September. The best positive correlation between the soil water deficit factor (Ks) and yield occurred during weeks 12-14 from crop emergence, during the "milk" and "dough" growth stages. Yield was poorly correlated with stress during weeks 15 and 16, and the correlation became negative after week 17. Dividing the 150 mm allocation about evenly among July, August and September was a good strategy, resulting in the highest yields in 2005, but not in 2006. Applying a larger proportion of the allocation in July was a good strategy during both years, whereas applying a large proportion of the allocation in September produced the opposite result. The different results obtained between years indicate that flexible irrigation scheduling techniques should be adopted, rather than relying on fixed timing strategies.
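A minimal sketch of the indices defined in this abstract (WUE, IWUE and the yield response factor), assuming ky follows the conventional FAO-33 form relating relative yield loss to relative evapotranspiration deficit; the input values below are placeholders, not the study's data.

```python
# Minimal sketch of the indices defined in the abstract. The yield response
# factor uses the conventional FAO-33 form, which the abstract's ky is
# assumed to follow. Numbers below are placeholders, not the study's data.

def wue(yield_kg_ha: float, etc_mm: float) -> float:
    """Water use efficiency: yield per unit of crop evapotranspiration."""
    return yield_kg_ha / etc_mm

def iwue(yield_kg_ha: float, irrigation_mm: float) -> float:
    """Irrigation water use efficiency: yield per unit of irrigation applied."""
    return yield_kg_ha / irrigation_mm

def yield_response_factor(y_actual, y_potential, etc_actual, etc_potential):
    """ky such that (1 - Ya/Yp) = ky * (1 - ETc/ETp)."""
    return (1 - y_actual / y_potential) / (1 - etc_actual / etc_potential)

# Placeholder example: a deficit-irrigated plot vs. a fully watered reference.
print(wue(11_000, 600))                                  # kg/ha per mm of ETc
print(iwue(11_000, 150))                                 # kg/ha per mm of irrigation
print(yield_response_factor(11_000, 13_000, 600, 680))   # ~1.3 for these numbers
```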
Abstract:
To investigate the effects of soil type on seed persistence in a manner that controlled for location and climate variables, three weed species—Gomphocarpus physocarpus (swan plant), Avena sterilis ssp. ludoviciana (wild oat) and Ligustrum lucidum (broadleaf privet)—were buried for 21 months in three contrasting soils at a single location. Soil type had a significant effect on seed persistence and seedling vigour; however, soil water content and temperature also varied between the soils because of differences in their physical and chemical properties, and warmer, wetter conditions favoured shorter persistence. A laboratory-based test was developed to accelerate the rate of seed ageing within soils, using controlled superoptimal temperature and moisture conditions (the soil-specific accelerated ageing test, SSAAT). The SSAAT demonstrated that soil type per se did not influence seed longevity. Moreover, the order in which seeds aged was the same whether aged in the field or in the SSAAT, with L. lucidum being the shortest-lived and A. sterilis the longest-lived of the three species.
Abstract:
This two-year study examined the impacts of feral pig diggings on five ecological indicators: seedling survival, surface litter, subsurface plant biomass, earthworm biomass and soil moisture content. Twelve recovery exclosures were established in two habitats (characterised as wet or dry on the basis of soil moisture) by fencing off areas of previous pig diggings. A total of 0.59 ha was excluded from further pig diggings and compared with 1.18 ha of unfenced control areas. Overall, seedling numbers increased by 7% within the protected exclosures and decreased by 37% within the unprotected controls over the two-year study period. A significant temporal interaction was found in the dry habitat, with seedling survival increasing with increasing time of protection from diggings. Feral pig diggings had no significant effect on surface litter biomass, subsurface plant biomass, earthworm biomass or soil moisture content.
Abstract:
Pratylenchus thornei is widespread throughout the wheat-growing regions of Australia and overseas and can cause yield losses of up to 70% in some intolerant cultivars. The most effective forms of management of P. thornei populations are crop rotation and plant breeding. No wheat accessions have been identified as completely resistant to P. thornei; therefore, breeding programs have used moderately resistant parents. The objective of the present research was to evaluate 274 Iranian landrace wheats for resistance to P. thornei and identify accessions with resistance superior to the current best resistance source (GS50a). Plants were grown in P. thornei-inoculated soil under controlled conditions in a glasshouse pot experiment for 16 weeks. Ninety-two accessions found to be resistant or moderately resistant were retested in a second experiment. From combined analysis of these experiments, 34 accessions were identified as resistant, with reproduction factors (final population per kg soil/initial inoculum rate per kg soil) ≤ 1. In total, 25 accessions were more resistant than GS50a, with AUS28470 significantly (P < 0.05) more resistant. The resistant Iranian landraces identified in the present study are a valuable untapped genetic pool offering improved levels of P. thornei resistance over current parents in Australian wheat-breeding programs.
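The resistance criterion quoted in the abstract reduces to a simple ratio; the sketch below applies it to hypothetical counts (the accession labels and numbers are invented for illustration only).

```python
# Sketch of the resistance criterion described in the abstract: the
# reproduction factor (RF) is the final P. thornei population per kg of soil
# divided by the initial inoculum rate per kg of soil, and accessions with
# RF <= 1 were classed as resistant. All values below are hypothetical.

def reproduction_factor(final_per_kg_soil: float, initial_per_kg_soil: float) -> float:
    """RF = final population density / initial inoculum density."""
    return final_per_kg_soil / initial_per_kg_soil

def is_resistant(rf: float) -> bool:
    """Resistant if the nematode population did not increase (RF <= 1)."""
    return rf <= 1.0

for accession, final_count in [("hypothetical-A", 800), ("hypothetical-B", 4_500)]:
    rf = reproduction_factor(final_count, initial_per_kg_soil=1_000)
    print(accession, round(rf, 2), "resistant" if is_resistant(rf) else "susceptible")
```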
Abstract:
An assessment of the relative influences of management and environment on the composition of floodplain grasslands of north-western New South Wales was made using a regional vegetation survey sampling a range of land tenures (e.g. private property, travelling stock routes and nature reserves). A total of 364 taxa belonging to 55 different plant families was recorded. Partitioning of variance with redundancy analysis determined that environmental variables accounted for a greater proportion (61.3%) of the explained variance in species composition than disturbance-related variables (37.6%). Soil type (and fertility), sampling time and rainfall had a strong influence on species composition and there were also east-west variations in composition across the region. Of the disturbance-related variables, cultivation, stocking rate and flooding frequency were all influential. Total, native, forb, shrub and subshrub richness were positively correlated with increasing time since cultivation. Flood frequency was positively correlated with graminoid species richness and was negatively correlated with total and forb species richness. Site species richness was also influenced by environmental variables (e.g. soil type and rainfall). Despite the resilience of these grasslands, some forms of severe disturbance (e.g. several years of cultivation) can result in removal of some dominant perennial grasses (e.g. Astrebla spp.) and an increase in disturbance specialists. A simple heuristic transitional model is proposed that has conceptual thresholds for plant biodiversity status. This knowledge representation may be used to assist in the management of these grasslands by defining four broad levels of community richness and the drivers that change this status.
Abstract:
Parthenium hysterophorus L. (Asteraceae) is a weed of national significance in Australia. Among the several arthropod agents introduced into Australia to control populations of P. hysterophorus biologically, Epiblema strenuana Walker (Lepidoptera: Tortricidae) is the most widespread and abundant agent. By intercepting the normal transport mechanisms of P. hysterophorus, the larvae of E. strenuana drain nutrients, other metabolic products, and energy, and place the host plant under intense metabolic stress. In this study, determinations of total non-structural carbohydrate (TNC) levels and carbon and nitrogen isotope ratios of fixed products in different parts of the plant tissue, including the gall, were made to establish the function of the gall as a sink for nutrients. Values of δ13C and δ15N in galls were significantly different from those in proximal and distal stems, whereas differences in TNC levels were not significant, when measured across the total population of P. hysterophorus regardless of plant age. However, carbon, nitrogen, and TNC signatures differed significantly when assayed in different developmental stages of P. hysterophorus. Carbon isotope ratios in galls were consistently more negative than those of the compared plant organs. Nitrogen isotope ratios in galls, on the contrary, were either similar to or less negative than those of the compared plant organs, especially within a single host-plant stage population (i.e., either rosette, preflowering, or flowering stage). TNC levels varied among the compared plant populations. The stem distal to the gall functioned more efficiently as a nodal channel than the stem proximal to the gall, especially in the translocation of nitrogenous nutrients. Our findings indicate that the gall induced by E. strenuana functions as a sink for the assayed nutrients, although some variations have been observed in the patterns of nutrient mobilization. By creating a sink for the nutrients in the gall, E. strenuana is able to place the overall plant metabolism under stress, and this ability indicates that E. strenuana has the necessary potential for use as a biological-control agent.
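For readers interpreting the "more negative" isotope values, the sketch below shows the conventional delta notation behind δ13C and δ15N (a standard definition, not taken from this paper); the tissue ratios used are hypothetical.

```python
# Conventional delta notation behind the abstract's δ13C and δ15N values
# (standard definition, not taken from the paper): delta, in per mil, compares
# a sample's heavy/light isotope ratio with that of a standard (VPDB for
# carbon, atmospheric N2 for nitrogen). A more negative δ13C means the tissue
# is more depleted in 13C.

def delta_per_mil(r_sample: float, r_standard: float) -> float:
    """delta (per mil) = (R_sample / R_standard - 1) * 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0

R_VPDB_13C = 0.0111802  # widely used 13C/12C ratio of the VPDB standard

# Hypothetical tissue ratios: a gall slightly more depleted in 13C than stem.
print(delta_per_mil(0.010878, R_VPDB_13C))   # ~ -27 per mil (hypothetical gall)
print(delta_per_mil(0.010911, R_VPDB_13C))   # ~ -24 per mil (hypothetical stem)
```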
Abstract:
Surface losses of nitrogen from horticulture farms in coastal Queensland, Australia, may have the potential to cause eutrophication of sensitive coastal marine habitats nearby. A case-study of the potential extent of such losses was investigated in a coastal macadamia plantation. Nitrogen losses were quantified in 5 consecutive runoff events during the 13-month study. Irrigation did not contribute to surface flows. Runoff was generated by storms at combined intensities and durations that were 20–40 mm/h for >9 min. These intensities and durations were within expected short-term (1 year) and long-term (up to 20 years) frequencies of rainfall in the study area. Surface flow volumes were 5.3 ± 1.1% of the episodic rainfall generated by such storms. Therefore, the largest part of each rainfall event was attributed to infiltration and drainage in this farm soil (Kandosol). The estimated annual loss of total nitrogen in runoff was 0.26 kg N/ha.year, representing a minimal loading of nitrogen in surface runoff when compared to other studies. The weighted average concentrations of total sediment nitrogen (TSN) and total dissolved nitrogen (TDN) generated in the farm runoff were 2.81 ± 0.77% N and 1.11 ± 0.27 mg N/L, respectively. These concentrations were considerably greater than ambient levels in an adjoining catchment waterway. Concentrations of TSN and TDN in the waterway were 0.11 ± 0.02% N and 0.50 ± 0.09 mg N/L, respectively. The steep concentration gradient of TSN and TDN between the farm runoff and the waterway demonstrated the occurrence of nutrient loading from the farming landscapes to the waterway. The TDN levels in the stream exceeded the current specified threshold of 0.2–0.3 mg N/L for eutrophication of such a waterway. Therefore, while the estimate of annual loading of N from runoff losses was comparatively low, it was evident that the stream catchment and associated agricultural land uses were already characterised by significant nitrogen loadings that pose eutrophication risks. The reported levels of nitrogen and the proximity of such waterways (8 km) to the coastline may also have implications for the nearshore (oligotrophic) marine environment during periods of turbulent flow.
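A generic back-of-envelope version of the dissolved-nitrogen load calculation implied by these figures (not the paper's actual method, and ignoring sediment-bound N); the annual rainfall used in the example is hypothetical.

```python
# Generic back-of-envelope load calculation (not the paper's method): a
# dissolved-nitrogen load in runoff can be estimated from runoff depth and a
# flow-weighted concentration. 1 mm of runoff over 1 ha is 10 000 L, so
# load (kg/ha) = concentration (mg/L) * runoff depth (mm) * 0.01.
# Sediment-bound N (reported in the abstract as % N of sediment) would also
# require the exported sediment mass, which is not modelled here.

def dissolved_n_load_kg_per_ha(conc_mg_per_l: float, runoff_depth_mm: float) -> float:
    """kg N/ha carried in runoff of the given depth and concentration."""
    litres_per_ha = runoff_depth_mm * 10_000      # 1 mm over 1 ha = 10 000 L
    return conc_mg_per_l * litres_per_ha / 1e6    # mg -> kg

# Hypothetical year: 1000 mm of rain, 5.3% of which leaves as surface flow,
# at the abstract's flow-weighted TDN concentration of 1.11 mg N/L.
runoff_mm = 1000 * 0.053
print(round(dissolved_n_load_kg_per_ha(1.11, runoff_mm), 2), "kg N/ha")  # ~0.59
```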
Abstract:
Background and purpose — Osseointegrated implants are an alternative for prosthetic attachment in individuals with amputation who are unable to wear a socket. However, the load transmitted through the osseointegrated fixation to the residual tibia and knee joint can be unbearable for those with transtibial amputation and knee arthritis. We report on the feasibility of combining total knee replacement (TKR) with an osseointegrated implant for prosthetic attachment. Patients and methods — We retrospectively reviewed all 4 cases (aged 38–77 years) of transtibial amputations managed with osseointegration and TKR in 2012–2014. The below-the-knee prosthesis was connected to the tibial base plate of a TKR, enabling the tibial residuum and knee joint to act as weight-sharing structures. A 2-stage procedure involved connecting a standard hinged TKR to custom-made implants and creation of a skin-implant interface. Clinical outcomes were assessed at baseline and after 1–3 years of follow-up using standard measures of health-related quality of life, ambulation, and activity level including the questionnaire for transfemoral amputees (Q-TFA) and the 6-minute walk test. Results — There were no major complications, and there was 1 case of superficial infection. All patients showed improved clinical outcomes, with a Q-TFA improvement range of 29–52 and a 6-minute walk test improvement range of 37–84 meters. Interpretation — It is possible to combine TKR with osseointegrated implants.
Abstract:
Three experiments were conducted on the use of water-retaining amendments under newly laid turf mats. The work focused on the first 12 weeks of establishment. In soils that already possessed a good water-holding capacity, water-retaining amendments did not provide any benefit. On a sand-based profile, a rooting depth of 200 mm was achieved with soil amendment products within three weeks of laying turf. Most products differed in their performance relative to each other at each three-weekly measurement interval. Polyacrylamide gels gave superior results when the crystals were incorporated into the soil profile. They were not suitable for broadcasting at the soil/sod interface. Finer grades of crystals were less likely to be subject to excessive expansion than medium-grade crystals after heavy rainfall. Turf establishment was more responsive to products at higher application rates; however, these higher rates may result in surface stability problems.
Abstract:
Soft-leaf buffalo grass is increasing in popularity as an amenity turfgrass in Australia. This project was instigated to assess the adaptation of, and establish management guidelines for, its use in Australia's vast array of growing environments. There is an extensive selection of soft-leaf buffalo grass cultivars throughout Australia and, with the country's climates ranging from temperate in the south to tropical in the north, not all cultivars are going to be adapted to all regions. The project evaluated 19 buffalo grass cultivars along with other warm-season grasses including green couch, kikuyu and sweet smother grass. The soft-leaf buffalo grasses were evaluated for their growth and adaptation in a number of regions throughout Australia including Western Australia, Victoria, ACT, NSW and Queensland. The growth habit of the individual cultivars was examined along with their level of shade tolerance, water use, herbicide tolerance, resistance to wear, response to nitrogen applications and growth potential in highly alkaline (high-pH) soils. The growth habit of the various cultivars currently commercially available in Australia differs considerably, from the more robust types that spread more quickly and are thicker in appearance (Sir Walter, Kings Pride, Ned Kelly and Jabiru) to the dwarf types that are shorter and thinner in appearance (AusTine and AusDwarf). The soft-leaf buffalo grass types tested do not differ in water use when compared to old-style common buffalo grass. Thus, soft-leaf buffalo grasses, like other warm-season turfgrass species, are efficient in water use. These grasses also recover after periods of low water availability. Individual cultivar differences were not discernible. In high-pH soils (i.e. on the alkaline side) some elements essential for plant growth (e.g. iron and manganese) may be deficient, causing turfgrass to appear pale green and visually unacceptable. When 14 soft-leaf buffalo grass genotypes were grown on a highly alkaline soil (pH 7.5-7.9), cultivars differed in leaf iron, but not in leaf manganese, concentrations. Nitrogen is critical to the production of quality turf. The methods for applying this essential element can be manipulated to minimise the maintenance inputs (mowing) during the peak growing period (summer). By applying the greatest proportion of the turf's total nitrogen requirements in early spring, peak summer growth can be reduced, resulting in a corresponding reduction in mowing requirements. Soft-leaf buffalo grass cultivars are more shade and wear tolerant than other warm-season turfgrasses being used by homeowners. There are, however, differences between the individual buffalo grass varieties. The majority of types currently available would be classified as having moderate levels of shade tolerance, and they wear reasonably well with good recovery rates. The impact of wear in a shaded environment was not tested, and there is a need to investigate this, as it is a typical growing environment for many homeowners. The use of herbicides is required to maintain quality soft-leaf buffalo grass turf. The development of softer herbicides for other turfgrasses has seen an increase in their popularity. The buffalo grass cultivars currently available have shown varying levels of susceptibility to the chemicals tested. The majority of the cultivars evaluated have demonstrated low levels of phytotoxicity to the herbicides chlorsulfuron (Glean) and fluroxypyr (Starane and Comet).
In general, soft-leaf buffalo grasses vary in their makeup and have demonstrated varying levels of tolerance, susceptibility and adaptation to the conditions they are grown under. Consequently, there is a need to choose the cultivar most suited to the environment it is expected to perform in and the management style it will be exposed to. Future work is required to assess how the structure of the different cultivars affects their capacity to tolerate wear and varying shade levels, as well as their water use and herbicide tolerance. The development of a growth model may provide the solution.
Abstract:
Soil water repellency occurs widely in horticultural and agricultural soils when very dry. The gradual accumulation and breakdown of surface organic matter over time produces wax-like organic acids, which coat soil particles preventing uniform entry of water into the soil. Water repellency is usually managed by regular surfactant applications. Surfactants, literally, are surface active agents (SURFace ACTive AgeNTS). Their mode of action is to reduce the surface tension of water, allowing it to penetrate and wet the soil more easily and completely. This practice improves water use efficiency (by requiring less water to wet the soil and by capturing rainfall and irrigation more effectively and rapidly). It also reduces nutrient losses through run-off erosion or leaching. These nutrients have the potential to pollute the surrounding environment and water courses. This project investigated potential improvements to standard practices (product combination and scheduling) for surfactant use to overcome localised dry spots on water repellent soils and thus improve turf quality and water use efficiency. Weather conditions for the duration of the trial prevented the identification of improved practices in terms of combination and scheduling. However, the findings support previous research that the use of soil surfactants decreased the time for water to infiltrate dry soil samples taken from a previously severely hydrophobic site. Data will be continually collected from this trial site on a private contractual basis, with the hope that improvements to standard practices will be observed during the drier winter months when moisture availability is a limiting factor for turfgrass growth and quality.
Abstract:
Wear resistance and recovery of 8 Bermudagrass (Cynodon dactylon (L.) Pers.) and hybrid Bermudagrass (C. dactylon × C. transvaalensis Burtt Davy) cultivars grown on a sand-based soil profile near Brisbane, Australia, were assessed in 4 wear trials conducted over a two-year period. Wear was applied on a 7-day or a 14-day schedule by a modified Brinkman Traffic Simulator for 6-14 weeks at a time, either during winter-early spring or during summer-early autumn. The more frequent wear under the 7-day treatment was more damaging to the turf than the 14-day wear treatment, particularly during winter, when the turf's capacity for recovery from wear was severely restricted. There were substantial differences in wear tolerance among the 8 cultivars investigated, and the wear tolerance rankings of some cultivars changed between years. Wear tolerance was associated with high shoot density, a dense stolon mat strongly rooted to the ground surface, high cell wall strength as indicated by high total cell wall content, and high levels of lignin and neutral detergent fiber. Wear tolerance was also affected by turf age, planting sod quality, and wet weather. Resistance to wear and recovery from wear are both important components of wear tolerance, but the relative importance of their contributions to overall wear tolerance varies seasonally with turf growth rate.