Abstract:
Maize (Zea mays L.) is a chill-susceptible crop cultivated in northern latitude environments. The detrimental effects of cold on growth and photosynthetic activity have long been established. However, a general overview of how important these processes are with respect to the reduction of productivity reported in the field is still lacking. In this study, a model-assisted approach was used to dissect variations in productivity under suboptimal temperatures and to quantify the relative contributions of light interception (PARc) and radiation use efficiency (RUE) from emergence to flowering. A combination of architectural and light transfer models was used to calculate light interception in three field experiments with two cold-tolerant lines and at two sowing dates. Model assessment confirmed that the approach was suitable to infer light interception. Biomass production was strongly affected by early sowings. RUE was identified as the main cause of biomass reduction during cold events. Furthermore, PARc explained most of the variability observed at flowering, its relative contribution varying with the climate experienced. Cold temperatures resulted in lower PARc, mainly because final leaf length and width were significantly reduced for all leaves emerging after the first cold occurrence. These results confirm that virtual plants can be useful as fine phenotyping tools. A scheme of action of cold on leaf expansion, light interception and radiation use efficiency is discussed with a view towards helping breeders define relevant selection criteria. This paper originates from a presentation at the 5th International Workshop on Functional–Structural Plant Models, Napier, New Zealand, November 2007.
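The decomposition described above can be sketched numerically: over a period, biomass is the running sum of daily intercepted PAR (PARc) multiplied by radiation use efficiency (RUE), so cold can reduce productivity through either factor. A minimal sketch, with all numbers illustrative assumptions rather than values from the study:

```python
# Biomass accumulation decomposed into light interception (PARc) and
# radiation use efficiency (RUE). All values are hypothetical.

def biomass(parc_daily, rue):
    """Cumulative biomass (g/m2): sum of intercepted PAR (MJ/m2/day) x RUE (g/MJ)."""
    return sum(p * rue for p in parc_daily)

# Two scenarios with the same incident radiation over 30 days:
parc_warm = [8.0] * 30         # MJ/m2/day intercepted under optimal temperature
parc_cold = [6.0] * 30         # smaller leaves after a cold event intercept less
rue_warm, rue_cold = 3.0, 2.1  # g/MJ; RUE falls during cold events

print(biomass(parc_warm, rue_warm))            # 720.0
print(round(biomass(parc_cold, rue_cold), 1))  # 378.0
```

Comparing the two runs illustrates the abstract's point: reduced leaf expansion lowers the PARc term for every subsequent day, while cold events depress the RUE multiplier directly.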
Abstract:
Spotted gum-dominant forests occur from Cooktown in northern Queensland (Qld) to Orbost in Victoria (Boland et al. 2006), and these forests are commercially very important: spotted gum is the most commonly harvested hardwood timber in Qld and one of the most important in New South Wales (NSW). Spotted gum has a wide range of end uses, from solid wood products through to power transmission poles, and generally has excellent sawing and timber qualities (Hopewell 2004). The private native forest resource in southern Qld and northern NSW is a critical component of the hardwood timber industry (Anon 2005, Timber Qld 2006), and currently half or more of the native forest timber resource harvested in northern NSW and Qld is sourced from private land. However, in many cases productivity on private lands is well below what could be achieved with appropriate silvicultural management. This project provides silvicultural management tools to assist extension staff, land owners and managers in the south-east Qld and north-east NSW regions, with the intent of improving the productivity of the private estate through implementation of appropriate management. The project also implemented a number of silvicultural experiments and demonstration sites to provide data on growth rates of managed and unmanaged forests, so that landholders can make informed decisions on the future management of their forests. To assist forest managers and improve the ability to predict forest productivity in the private resource, the project has developed:
• A set of spotted gum-specific silvicultural guidelines for timber production on private land that cover both silvicultural treatment and harvesting. The guidelines were developed for extension officers and property owners.
• A simple decision support tool, referred to as the spotted gum productivity assessment tool (SPAT), that allows an estimation of:
1. Tree growth productivity on specific sites. Estimation is based on the analysis of site and growth data collected from a large number of yield and experimental plots on Crown land across a wide range of spotted gum forest types. Growth algorithms were developed using tree growth and site data, and the algorithms were used to formulate basic economic predictors.
2. Pasture development under a range of tree stockings, and the expected livestock carrying capacity at nominated tree stockings for a particular area.
3. Above-ground tree biomass and carbon stored in trees.
• A series of experiments in spotted gum forests on private lands across the study area to quantify growth and to provide measures of the effect of silvicultural thinning and different agro-forestry regimes.
The adoption and use of these tools by farm forestry extension officers and private land holders, in both field operations and in training exercises, will over time improve the commercial management of spotted gum forests for both timber and grazing. Future measurement of the experimental sites at ages 5, 10 and 15 years will provide longer term data on the effects of various stocking rates and thinning regimes and facilitate modification and improvement of these silvicultural prescriptions.
Abstract:
The aim of this review is to report changes in irrigated cotton water use from research projects and on-farm practice-change programs in Australia, in relation to both plant-based and irrigation engineering disciplines. At least 80% of the Australian cotton-growing area is irrigated using gravity surface-irrigation systems. This review found that, over 23 years, cotton crops utilise 6-7 ML/ha of irrigation water, depending on the amount of seasonal rain received. The seasonal evapotranspiration of surface-irrigated crops averaged 729 mm over this period. Over the past decade, water-use productivity by Australian cotton growers has improved by 40%. This has been achieved by both yield increases and more efficient water-management systems. The whole-farm irrigation efficiency index improved from 57% to 70%, and the crop water use index is >3 kg/mm.ha, high by international standards. Yield increases over the last decade can be attributed to plant-breeding advances, the adoption of genetically modified varieties, and improved crop management. Also, there has been increased use of irrigation scheduling tools and furrow-irrigation system optimisation evaluations. This has reduced in-field deep-drainage losses. The largest loss component of the farm water balance on cotton farms is evaporation from on-farm water storages. Some farmers are changing to alternative systems such as centre pivots and lateral-move machines, and increasing numbers of these alternatives are expected. These systems can achieve considerable labour and water savings, but have significantly higher energy costs associated with water pumping and machine operation. The optimisation of interactions between water, soils, labour, carbon emissions and energy efficiency requires more research and on-farm evaluations.
Standardisation of water-use efficiency measures and improved water measurement techniques for surface irrigation are important research outcomes to enable valid irrigation benchmarks to be established and compared. Water-use performance is highly variable among cotton farms, fields and regions, so site-specific measurement is important. The range in the presented datasets indicates potential for further improvement in water-use efficiency and productivity on Australian cotton farms.
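The two indices quoted in this review can be made concrete with a short sketch. The function names and input figures below are illustrative assumptions, not values prescribed by the review; note that 1 ML/ha of applied water corresponds to 100 mm of depth:

```python
# Two common cotton water-use indices, computed from illustrative inputs.

def whole_farm_irrigation_efficiency(crop_water_use_ml, water_diverted_ml):
    """Fraction of water diverted on-farm that is used by the crop."""
    return crop_water_use_ml / water_diverted_ml

def crop_water_use_index(lint_yield_kg_ha, seasonal_et_mm):
    """Lint yield per mm of seasonal evapotranspiration (kg/mm.ha)."""
    return lint_yield_kg_ha / seasonal_et_mm

print(round(whole_farm_irrigation_efficiency(7.0, 10.0), 2))  # 0.7
print(round(crop_water_use_index(2200, 729), 2))              # 3.02
```

With the review's average seasonal ET of 729 mm, a hypothetical lint yield of 2200 kg/ha gives an index just above 3 kg/mm.ha, consistent with the benchmark quoted above.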
Abstract:
The cDNAs coding for the brain GnRHs (AY373449-51), pituitary GH, SL and PRL, and liver IGFs (AY427954-5) were isolated. Partial cDNA sequences of the brain (Cyp19b) and gonadal (Cyp19a) aromatases have also been obtained. These tools will be used to study the endocrine regulation of puberty in the grey mullet.
Abstract:
Continuous cultivation and cereal cropping of southern Queensland soils previously supporting native vegetation have resulted in reduced soil nitrogen supply, and consequently decreased cereal grain yields and low grain protein. To enhance yields and protein concentrations of wheat, management practices involving N fertiliser application, with no-tillage and stubble retention, grain legumes, and legume leys were evaluated from 1987 to 1998 on a fertility-depleted Vertosol at Warra, southern Queensland. The objective of this study was to examine the effect of lucerne in a 2-year lucerne–wheat rotation for its nitrogen and disease-break benefits to subsequent grain yield and protein content of wheat as compared with continuous wheat cropping. Dry matter production and nitrogen yields of lucerne were closely correlated with the total rainfall for October–September as well as March–September rainfall. Each 100 mm of total rainfall resulted in 0.97 t/ha of dry matter and 26 kg/ha of nitrogen yield. For the March–September rainfall, the corresponding values were 1.26 t/ha of dry matter and 36 kg/ha of nitrogen yield. The latter values were 10% lower than those produced by annual medics during a similar period. Compared with wheat–wheat cropping, significant increases in total soil nitrogen were observed only in 1990, 1992 and 1994 but increases in soil mineralisable nitrogen were observed in most years following lucerne. Similarly, pre-plant nitrate nitrogen in the soil profile following lucerne was higher by 74 kg/ha (9–167 kg N/ha) than that of wheat–wheat without N fertiliser in all years except 1996. Consequently, higher wheat grain protein (7 out of 9 seasons) and grain yield (4 out of 9 seasons) were produced compared with continuous wheat. There was significant depression in grain yield in 2 (1993 and 1995) out of 9 seasons, attributed to soil moisture depletion and/or low growing season rainfall.
Consequently, the overall responses in yield were lower than those of 50 kg/ha of fertiliser nitrogen applied to wheat–wheat crops, 2-year medic–wheat or chickpea–wheat rotation, although grain protein concentrations were higher following lucerne. The incidence and severity of the soilborne disease, common root rot of wheat caused by Bipolaris sorokiniana, was generally higher in lucerne–wheat than in continuous wheat with no nitrogen fertiliser applications, since its severity was significantly correlated with plant available water at sowing. No significant incidence of crown rot or root lesion nematode was observed. Thus, productivity, which was mainly due to nitrogen accretion in this experiment, can be maintained where short duration lucerne leys are grown in rotations with wheat.
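The linear rainfall responses reported for lucerne above translate directly into a back-of-envelope calculator. The coefficients are those given in the abstract (per 100 mm of rainfall); the 500 mm rainfall input is an illustrative assumption:

```python
# Lucerne dry matter and N yield from total rainfall, using the reported
# linear coefficients. Default values are the October-September figures.

def lucerne_production(rainfall_mm, dm_per_100mm=0.97, n_per_100mm=26.0):
    """Return (dry matter t/ha, N yield kg/ha) for a rainfall total (mm)."""
    units = rainfall_mm / 100.0
    return units * dm_per_100mm, units * n_per_100mm

dm, n = lucerne_production(500)    # hypothetical 500 mm October-September total
print(round(dm, 2), round(n, 1))   # 4.85 130.0

# March-September coefficients from the same abstract:
dm2, n2 = lucerne_production(500, 1.26, 36.0)
print(round(dm2, 2), round(n2, 1))  # 6.3 180.0
```

The second call shows why the March–September regression matters: for the same rainfall total, the in-season coefficients predict noticeably more dry matter and nitrogen accumulation.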
Abstract:
Soil nitrogen (N) supply in the Vertosols of southern Queensland, Australia has steadily declined as a result of long-term cereal cropping without N fertiliser application or rotations with legumes. Nitrogen-fixing legumes such as lucerne may enhance soil N supply and therefore could be used in lucerne-wheat rotations. However, lucerne leys in this subtropical environment can create a soil moisture deficit, which may persist for a number of seasons. Therefore, we evaluated the effect of varying the duration of a lucerne ley (for up to 4 years) on soil N increase, N supply to wheat, soil water changes, wheat yields and wheat protein on a fertility-depleted Vertosol in a field experiment between 1989 and 1996 at Warra (26°47'S, 150°53'E), southern Queensland. The experiment consisted of a wheat-wheat rotation, and 8 treatments of lucerne leys starting in 1989 (phase 1) or 1990 (phase 2) for 1, 2, 3 or 4 years duration, followed by wheat cropping. Lucerne DM yield and N yield increased with increasing duration of lucerne leys. Soil N increased over time following 2 years of lucerne but there was no further significant increase after 3 or 4 years of lucerne ley. Soil nitrate concentrations increased significantly with all lucerne leys and moved progressively downward in the soil profile from 1992 to 1995. Soil water, especially at 0.9-1.2 m depth, remained significantly lower for the next 3 years after the termination of the 4-year lucerne ley than under continuous wheat. No significant increase in wheat yields was observed from 1992 to 1995, irrespective of the lucerne ley. However, wheat grain protein concentrations were significantly higher under lucerne-wheat than under wheat-wheat rotations for 3-5 years. The lucerne yield and soil water and nitrate-N concentrations were satisfactorily simulated with the APSIM model.
Although significant N accretion occurred in the soil following lucerne leys, recharge of the dried soil profile following long-duration lucerne took up to 3 years in drier seasons. Consequently, 3- and 4-year lucerne-wheat rotations resulted in more variable wheat yields than wheat-wheat rotations in this region. The remaining challenge in using lucerne-wheat rotations is balancing the N accretion benefits with plant-available water deficits, which are most likely to occur in the highly variable rainfall conditions of this region.
Abstract:
Australia’s rangelands are the extensive arid and semi-arid grazing lands that cover approximately 70% of the Australian continent. They are characterised by low and generally variable rainfall, low productivity and a sparse population. They support a number of industries including mining and tourism, but pastoralism is the primary land use. In some areas, the rangelands have a history of biological decline (Noble 1997), with erosion, loss of perennial native grasses and incursion of woody vegetation commonly reported in the scientific and lay literature. Despite our historic awareness of these trends, the establishment of systems to measure and monitor degradation has presented numerous problems. The size and inaccessibility of Australia’s rangelands often militate against the development of extensive monitoring programs. Likewise, the difficulty of securing ongoing commitment from government agencies to fund rangeland monitoring activities has led to either abandonment or a scaled-down approach in some instances (Graetz et al. 1986; Holm 1993). While a multiplicity of monitoring schemes has been developed for landholders at the property scale, and some have received promising initial uptake, relatively few have been maintained for more than a few years on any property without at least some agency support (Pickup et al. 1998). Ironically, such property-level monitoring tools can contribute significantly to local decisions about stock, infrastructure and sustainability. Research in recent decades has shown the value of satellites for monitoring change in rangelands (Wallace et al. 2004), especially in terms of tree and ground cover. While steadily improving, use of satellite data as a monitoring tool has been limited by the cost of the imagery, and the equipment and expertise needed to extract useful information from it.
A project now under way in the northern rangelands of Australia is attempting to circumvent many of these problems through a monitoring system that allows property managers to use long-term satellite image sequences to quickly and inexpensively track changes in land cover on their properties.
Abstract:
Results from the humid tropics of Australia demonstrate that diverse plantations can achieve greater productivity than monocultures. We found that increases in both the observed species number and the effective species richness were significantly related to increased levels of productivity as measured by stand basal area or mean individual tree basal area. Four of five plantation species were more productive in mixtures with other species than in monocultures, offering, on average, a 55% increase in mean tree basal area. A general linear model suggests that species richness had a significant effect on mean individual tree basal area when environmental variables were included in the model. As monoculture plantations are currently the preferred reforestation method throughout the tropics, these results suggest that significant productivity and ecological gains could be made if multi-species plantations are more broadly pursued.
Abstract:
Two examples of GIS-based multiple-criteria evaluations of plantation forests are presented. These desktop assessments use available topographical, geological and pedological information to establish the risk of occurrence of certain environmentally detrimental processes. The first case study is concerned with the risk that chemical additives (i.e. simazine) applied within the forestry landscape may reach the drainage system. The second case study assesses the vulnerability of forested areas to landslides. The subject of the first multiple-criteria evaluation (MCE) was a 4 km2 logging area, which had been recently site-prepared for a Pinus plantation. The criteria considered relevant to the assessment were proximity to creeks, slope, soil depth to the restrictive layer (i.e. potential depth to a perched water table) and soil erodibility (based on clay content). The output of the MCE was in accordance with field observations, showing that this approach has the potential to provide management support by highlighting areas vulnerable to waterlogging, which in turn can trigger overland flow and export of pollutants to the local stream network. The subject of the second evaluation was an Araucaria plantation which is prone to landslips during heavy rain. The parameters included in the assessment were drainage system, the slope of the terrain and geological features such as rocks and structures. A good correlation between the MCE results and field observations was found, suggesting that this GIS approach is useful for the assessment of natural hazards. Multiple-criteria evaluations are highly flexible as they can be designed in either vector or raster format, depending on the type of available data.
Although tested on specific areas, the MCEs presented here can be easily used elsewhere and assist both management intervention and the protection of the adjacent environment by assessing the vulnerability of the forest landscape to either introduced chemicals or natural hazards.
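The raster form of an MCE described above is, at its core, a weighted overlay: each criterion layer is standardised to a common 0-1 scale and the layers are combined cell by cell with weights. The abstract does not publish its scoring scheme, so the criteria, weights and 3x3 grids below are purely illustrative assumptions in the spirit of the first case study (cells close to creeks, on steeper slopes and with shallow restrictive layers score as more vulnerable):

```python
# Minimal raster-style multiple-criteria evaluation (MCE) sketch.
# All criteria, weights and grid values are hypothetical.

def standardise(grid, worst, best):
    """Rescale raw criterion values to 0-1 risk scores (1 = most vulnerable)."""
    return [[(v - best) / (worst - best) for v in row] for row in grid]

def weighted_overlay(layers, weights):
    """Weighted sum of standardised criterion layers, cell by cell."""
    rows, cols = len(layers[0]), len(layers[0][0])
    return [[sum(w * layer[r][c] for layer, w in zip(layers, weights))
             for c in range(cols)] for r in range(rows)]

# Raw criteria: distance to creek (m), slope (degrees), soil depth (m).
dist  = standardise([[10, 50, 200], [5, 80, 150], [20, 120, 300]], worst=0, best=300)
slope = standardise([[2, 5, 10], [1, 8, 15], [3, 12, 20]], worst=20, best=0)
depth = standardise([[0.2, 0.5, 1.0], [0.1, 0.6, 0.9], [0.3, 0.8, 1.2]],
                    worst=0.1, best=1.2)

risk = weighted_overlay([dist, slope, depth], weights=[0.4, 0.3, 0.3])
print(max(max(row) for row in risk) <= 1.0)  # True (scores stay within 0-1)
```

Because the weights sum to one and every standardised layer lies in 0-1, the composite risk surface is directly comparable across cells, which is what lets an MCE output be checked against field observations as the abstract describes.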
Abstract:
Detailed data on seagrass distribution, abundance, growth rates and community structure information were collected at Orman Reefs in March 2004 to estimate the above-ground productivity and carbon assimilated by seagrass meadows. Seagrass meadows were re-examined in November 2004 for comparison at the seasonal extremes of seagrass abundance. Ten seagrass species were identified in the meadows on Orman Reefs. Extensive seagrass coverage was found in March (18,700 ha) and November (21,600 ha), with seagrass covering the majority of the intertidal reef-top areas and a large proportion of the subtidal areas examined. There were marked differences in seagrass above-ground biomass, distribution and species composition between the two surveys. Major changes between March and November included a substantial decline in biomass for intertidal meadows and an expansion in area of subtidal meadows. Changes were most likely a result of greater tidal exposure of intertidal meadows prior to November leading to desiccation and temperature-related stress. The Orman Reef seagrass meadows had a total above-ground productivity of 259.8 t DW day-1 and estimated carbon assimilation of 89.4 t C day-1 in March. The majority of this production came from the intertidal meadows which accounted for 81% of the total production. Intra-annual changes in seagrass species composition, shoot density and size of meadows measured in this study were likely to have a strong influence on the total above-ground production during the year. The net estimated above-ground productivity of Orman Reefs meadows in March 2004 (1.19 g C m-2 day-1) was high compared with other tropical seagrass areas that have been studied and also higher than many other marine, estuarine and terrestrial plant communities.
Abstract:
Phosphine is the primary fumigant used to protect the majority of the world's grain and a variety of other stored commodities from insect pests. Phosphine is playing an increasingly important role in the protection of commodities for two primary reasons. Firstly, use of the alternative fumigant, methyl bromide, has been sharply curtailed and is tightly regulated due to its role in ozone depletion, and secondly, consumers are becoming increasingly intolerant of contact pesticides. Niche alternatives to phosphine exist, but they suffer from a range of factors that limit their use, including: 1) limited commercial adoption due to expense or slow mode of action; 2) poor efficacy due to low toxicity, rapid sorption, limited volatility or high density; 3) public health concerns due to toxicity to handlers or nearby residents, as well as risk of explosion; 4) poor consumer acceptance due to toxic residues or smell. These same factors limit the prospects of quickly identifying and deploying a new fumigant. Given that resistance toward phosphine is increasing among insect pests, improved monitoring and management of resistance is a priority. Knowledge of the mode of action of phosphine as well as the mechanisms of resistance may also greatly reduce the effort and expense of identifying synergists or novel replacement compounds.
Abstract:
Residue retention is an important issue in evaluating the sustainability of production forestry. However, its long-term impacts have not been studied extensively, especially in sub-tropical environments. This study investigated the long-term impact of harvest residue retention on tree nutrition, growth and productivity of an F1 hybrid (Pinus elliottii var. elliottii × Pinus caribaea var. hondurensis) exotic pine plantation in sub-tropical Australia, under three harvest residue management regimes: (1) residue removal, RR0; (2) single residue retention, RR1; and (3) double residue retention, RR2. The experiment, established in 1996, is a randomised complete block design with 4 replicates. Tree growth measurements in this study were carried out at ages 2, 4, 6, 8 and 10 years, while foliar nutrient analyses were carried out at ages 2, 4, 6 and 10 years. Litter production and litter nitrogen (N) and phosphorus (P) measurements were carried out quarterly over a 15-month period between ages 9 and 10 years. Results showed that total tree growth was still greater in residue-retained treatments compared with the RR0 treatment. However, mean annual increments of diameter at breast height (MAID) and basal area (MAIB) declined significantly after age 4 years to about 68-78% at age 10 years. Declining foliar N and P concentrations accounted for 62% (p < 0.05) of the variation of growth rates after age 4 years, and foliar N and P concentrations were either marginal or below critical concentrations. In addition, litter production, and litter N and P contents were not significantly different among the treatments. This study suggests that the impact of residue retention on tree nutrition and growth rates might be limited over a longer period, and that the integration of alternative forest management practices is necessary to sustain the benefits of harvest residues until the end of the rotation.
Abstract:
Reduced supplies of nitrogen (N) in many soils of southern Queensland that were cropped exhaustively with cereals over many decades have been the focus of much research to avoid declines in profitability and sustainability of farming systems. A 45-month period of mixed grass (purple pigeon grass, Setaria incrassata Stapf; Rhodes grass, Chloris gayana Kunth.) and legume (lucerne, Medicago sativa L.; annual medics, M. scutellata L. Mill. and M. truncatula Gaertn.) pasture was one of several options that were compared on a fertility-depleted Vertosol at Warra, southern Queensland, to improve grain yields or increase grain protein concentration of subsequent wheat crops. Objectives of the study were to measure the productivity of a mixed grass and legume pasture grown over 45 months (cut and removed over 36 months) and its effects on yield and protein concentrations of the following wheat crops. Pasture production (DM t/ha) and aboveground plant N yield (kg/ha) for grass, legume (including a small amount of weeds) and total components of pasture responded linearly to total rainfall over the duration of each of 3 pastures sown in 1986, 1987 and 1988. Averaged over the 3 pastures, each 100 mm of rainfall resulted in 0.52 t/ha of grass, 0.44 t/ha of legume and 0.97 t/ha of total pasture DM, there being little variation between the 3 pastures. Aboveground plant N yield of the 3 pastures ranged from 17.2 to 20.5 kg/ha per 100 mm rainfall. Aboveground legume N in response to total rainfall was similar (10.6-13.2 kg/ha per 100 mm rainfall) across the 3 pastures in spite of very different populations of legumes and grasses at establishment. Aboveground grass N yield was 5.2-7.0 kg/ha per 100 mm rainfall. In most wheat crops following pasture, wheat yields were similar to that of unfertilised wheat except in 1990 and 1994, when grain yields were significantly higher but similar to that for continuous wheat fertilised with 75 kg N/ha.
In contrast, grain protein concentrations of most wheat crops following pasture responded positively, being substantially higher than unfertilised wheat but similar to that of wheat fertilised with 75 kg N/ha. Grain protein averaged over all years of assay was increased by 25-40% compared with that of unfertilised wheat. Stored water supplies after pasture were <134 mm (<55% of plant available water capacity); for most assay crops water storages were 67-110 mm, an equivalent wet soil depth of only 0.3-0.45 m. Thus, the crop assays of pasture benefits were limited by low water supply to wheat crops. Moreover, the severity of common root rot in wheat crops was not reduced by pasture-wheat rotation.
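The per-100 mm rainfall coefficients reported for this mixed pasture lend themselves to the same kind of simple calculation as the lucerne results. The coefficients below are the abstract's averaged values; the 600 mm rainfall total is an illustrative assumption:

```python
# Pasture dry matter from total rainfall, using the reported averaged
# coefficients (t DM/ha per 100 mm rainfall). Rainfall input is hypothetical.

DM_PER_100MM = {"grass": 0.52, "legume": 0.44, "total": 0.97}

def pasture_dm(rainfall_mm):
    """Dry matter (t/ha) of each pasture component for a rainfall total (mm)."""
    return {k: round(v * rainfall_mm / 100.0, 2) for k, v in DM_PER_100MM.items()}

print(pasture_dm(600))  # {'grass': 3.12, 'legume': 2.64, 'total': 5.82}
```

Note how the grass and legume components sum (to within rounding) to the total coefficient, reflecting the near-equal grass and legume contributions the abstract reports.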
Abstract:
Tick fever is an important disease of cattle where Rhipicephalus (Boophilus) microplus acts as a vector for the three causal organisms Babesia bovis, Babesia bigemina and Anaplasma marginale. Bos indicus cattle and their crosses are more resistant to the clinical effects of infection with B. bovis and B. bigemina than are Bos taurus cattle. Resistance is not complete, however, and herds of B. indicus-cross cattle are still at risk of babesiosis in environments where exposure to B. bovis is light in most years but occasionally high. The susceptibility of B. indicus cattle and their crosses to infection with A. marginale is similar to that of B. taurus cattle. In herds of B. indicus cattle and their crosses the infection rate of Babesia spp. and A. marginale is lowered because fewer ticks are likely to attach per day due to reduced numbers of ticks in the field (long-term effect on population, arising from high host resistance) and because a smaller proportion of ticks that do develop to feed on infected cattle will in turn be infected (due to lower parasitaemia). As a consequence, herds of B. indicus cattle are less likely than herds of B. taurus cattle to have high levels of population immunity to babesiosis or anaplasmosis. The effects of acaricide application on the probability of clinical disease due to anaplasmosis and babesiosis are unpredictable and dependent on the prevalence of infection in ticks and in cattle at the time of application. Attempting to manipulate population immunity through the toleration of specific threshold numbers of ticks with the aim of controlling tick fever is not reliable and the justification for acaricide application should be for the control of ticks rather than for tick fever. Vaccination of B. indicus cattle and their crosses is advisable in all areas where ticks exist, although vaccination against B. bigemina is probably not essential in pure B. indicus animals.
Abstract:
Dairy farms in subtropical Australia use irrigated, annually sown short-term ryegrass (Lolium multiflorum) or mixtures of short-term ryegrass and white (Trifolium repens) and Persian (shaftal) (T. resupinatum) clover during the winter-spring period in all-year-round milk production systems. A series of small plot cutting experiments was conducted in 3 dairying regions (tropical upland, north Queensland, and subtropical southeast Queensland and northern New South Wales) to determine the most effective rate and frequency of application of nitrogen (N) fertiliser. The experiments were not grazed, nor was harvested material returned to the plots, after sampling. Rates up to 100 kg N/ha.month (as urea or calcium ammonium nitrate) and up to 200 kg N/ha every 2 months (as urea) were applied to pure stands of ryegrass in 1991. In 1993 and 1994, urea, at rates up to 150 kg N/ha.month and to 200 kg N/ha every 2 months, was applied to pure stands of ryegrass; urea, at rates up to 50 kg N/ha.month, was also applied to ryegrass-clover mixtures. The results indicate that applications of 50-85 kg N/ha.month can be recommended for short-term ryegrass pastures throughout the subtropics and tropical uplands of eastern Australia, irrespective of soil type. At this rate, dry matter yields will reach about 90% of their potential, forage nitrogen concentration will be increased, there is minimal risk to stock from nitrate poisoning and there will be no substantial increase in soil N. The rate of N for ryegrass-clover pastures is slightly higher than for pure ryegrass but, at these rates, the clover component will be suppressed. However, increased ryegrass yields and higher forage nitrogen concentrations will compensate for the reduced clover component. 
At application rates up to 100 kg N/ha.month, build-up of NO3-N and NH4-N in soil was generally restricted to the surface layers (0-20 cm) of the soil, but there was a substantial increase throughout the soil profile at 150 kg N/ha.month. The build-up of NO3-N and NH4-N was greater, and was found at lower application rates, on the lighter soil compared with heavy clays. Generally, most of the soil N was in the NO3-N form and most was in the top 20 cm.