13 results for Block triangulation with additional parameters
in eResearch Archive - Queensland Department of Agriculture
Abstract:
This project built upon the successful outcomes of a previous project (TU02005) by adding to the database of salt tolerance among warm-season turfgrass cultivars through further hydroponic screening trials. The hydroponic screening trials focussed on new cultivars, or cultivars that could not be covered in the time available under TU02005, including: 11 new cultivars of Paspalum vaginatum; 13 cultivars of Cynodon dactylon; six cultivars of Stenotaphrum secundatum; one accession of Cynodon transvaalensis; 12 Cynodon dactylon x transvaalensis hybrids; two cultivars of Sporobolus virginicus; five cultivars of Zoysia japonica; one cultivar of Z. macrantha; one common form of Z. tenuifolia; and one Z. japonica x tenuifolia hybrid. The relative salinity tolerance of different turfgrasses is quantified in terms of their growth response to increasing levels of salinity, often defined by the salt level that equates to a 50% reduction in shoot yield, or alternatively by the threshold salinity. The most salt-tolerant species in these trials were Sporobolus virginicus and Paspalum vaginatum, consistent with the findings from TU02005 (Loch, Poulter et al. 2006). Cynodon dactylon showed the largest range in threshold values, with some cultivars highly sensitive to salt while others were tolerant to levels approaching those of the more halophytic grasses. Coupled with observational and anecdotal evidence of high drought tolerance, this species and other intermediately tolerant species provide options for site-specific situations in which soil salinity is coupled with additional challenges such as shade and high-traffic conditions. Recognising that a salt-tolerant grass is not the complete solution to salinity problems, this project further investigated sustainable long-term establishment and management practices that maximise the ability of the selected grass to survive and grow under a particular set of salinity and usage parameters. Salt-tolerant turfgrasses with potential for special-use situations were trialled under field conditions at three sites within the Gold Coast City Council, while three sites established under TU02005 within the Redland City Council boundaries were monitored for continued grass survival. Several randomised block experiments within Gold Coast City were established to compare the health and longevity of seashore paspalum (Paspalum vaginatum) and Manila grass (Zoysia matrella), as well as the more tolerant cultivars of other species such as buffalo grass (Stenotaphrum secundatum) and green couch (Cynodon dactylon). While scientific results were difficult to achieve in the field, where conditions cannot be controlled, these trials provided valuable observational evidence of the likely survival of these species. Alternatives to laying full sod, such as sprigging, were investigated and found to be more appropriate for low-traffic areas because establishment takes longer. Trials under controlled and protected conditions achieved full cover of Paspalum vaginatum from sprigs within a 10-week time frame. Salt-affected sites are often associated with poor soil structure, so part of the research investigated techniques for alleviating the soil compaction frequently found on saline sites. Various methods of soil de-compaction were investigated on highly compacted heavy clay soil in Redland City.
It was found that the heavy duplex soil of marine clay sediments required the most aggressive of treatments to achieve even limited short-term effects. Interestingly, a well-constructed sports field showed a far greater and longer-term response to de-compaction operations, highlighting the importance of appropriate construction in the successful establishment and management of turfgrasses on salt-affected sites. Fertiliser trials in this project determined plant demand for nitrogen (N) to species level, producing data that can be used as a guide when fertilising in order to achieve optimal growth and quality in the major turfgrass species used in public parkland. An experiment commenced during TU02005, and monitored further in this project, investigated six representative warm-season turfgrasses to determine the optimum maintenance requirements for fertiliser N in south-east Queensland. In doing so, we recognised that the optimum level is also related to use and intensity of use, with high-profile, well-used parks requiring higher maintenance N than low-profile parks where maintaining botanical composition at a lower level of turf quality might be acceptable. Kikuyu (Pennisetum clandestinum) appeared to require the greatest N input (300-400 kg N/ha/year), followed by the green couch (Cynodon dactylon) cultivars ‘Wintergreen’ and ‘FloraTeX’, which required approximately 300 kg N/ha/year for optimal condition and growth. ‘Sir Walter’ (Stenotaphrum secundatum) and ‘Sea Isle 1’ (Paspalum vaginatum) had a moderate requirement of approximately 200 kg N/ha/year, while ‘Aussiblue’ (Digitaria didactyla) maintained optimal growth and quality at 100-200 kg N/ha/year. A set of guidelines has been prepared to provide various options, from the construction and establishment of new grounds through to the remediation of existing parklands by supporting the growth of endemic grasses. They describe a best-management process through which salt-affected sites should be assessed, remediated and managed. These guidelines, or Best Management Practices, will be readily available to councils. Previously, some high-salinity sites have been turfed several times over a number of years (and Council budgets) with a 100% failure rate. By eliminating this budgetary waste through targeted, workable solutions, local authorities will be more amenable to investing appropriate amounts in these areas. In some cases this will lead to cost savings as well as better-quality turf. In all cases, however, improved turf quality will benefit ratepayers, directly through increased local use of open space in parks and sportsfields, and indirectly by attracting tourists and other visitors to the region, bringing associated economic benefits. At the same time, environmental degradation and erosion of soil in bare areas will be greatly reduced.
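The threshold and 50%-reduction measures used in this abstract correspond to the classic threshold-slope (Maas-Hoffman) model of salinity response. The sketch below shows, under that assumption and with entirely hypothetical data (the project's measurements are not reproduced here), how both parameters could be estimated from a cultivar's growth response curve.

```python
# Hypothetical illustration of the threshold-slope (Maas-Hoffman) model of
# salinity response; data values are invented, not the project's results.
import numpy as np
from scipy.optimize import curve_fit

def maas_hoffman(ec, threshold, slope):
    """Relative shoot yield (%): flat at 100 up to the salinity threshold,
    then declining linearly with electrical conductivity (EC, dS/m)."""
    return np.where(ec <= threshold, 100.0, 100.0 - slope * (ec - threshold))

ec = np.array([0.0, 5, 10, 15, 20, 25, 30])            # salinity levels (dS/m)
shoot_yield = np.array([100, 99, 91, 75, 62, 48, 35])  # relative yield (%)

(threshold, slope), _ = curve_fit(maas_hoffman, ec, shoot_yield, p0=[5.0, 2.0])
ec50 = threshold + 50.0 / slope  # salt level giving a 50% reduction in shoot yield
print(f"threshold salinity = {threshold:.1f} dS/m, EC50 = {ec50:.1f} dS/m")
```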
Abstract:
Trichinella surveillance in wildlife relies on muscle digestion of large samples, which are logistically difficult to store and transport in remote and tropical regions, as well as labour-intensive to process. Serological methods such as enzyme-linked immunosorbent assays (ELISAs) offer rapid, cost-effective alternatives for surveillance, but should be paired with additional tests because of the high false-positive rates encountered in wildlife. We investigated the utility of ELISAs coupled with Western blot (WB) in providing evidence of Trichinella exposure or infection in wild boar. Serum samples were collected from 673 wild boar from a high-risk and a low-risk region for Trichinella introduction within mainland Australia, which is considered Trichinella-free. Sera were examined using both an 'in-house' and a commercially available indirect ELISA, each using excretory/secretory (E/S) antigens. Cut-off values for positive results were determined using sera from the low-risk population. All wild boar from the high-risk region (352) and 139/321 (43.3%) of the wild boar from the low-risk region were tested by artificial digestion. Testing by Western blot using E/S antigens and a Trichinella-specific real-time PCR was also carried out on all ELISA-positive samples. The two ELISAs correctly classified all positive controls, as well as one naturally infected wild boar from Gabba Island in the Torres Strait. In both the high- and low-risk populations, the ELISA results showed substantial agreement (kappa = 0.66), which increased to very good (kappa = 0.82) when only WB-positive samples were compared. Testing of sera collected from the Australian mainland showed a Trichinella seroprevalence of 3.5% (95% C.I. 0.0-8.0) and 2.3% (95% C.I. 0.0-5.6) using the in-house and commercial ELISA coupled with WB, respectively. These estimates were significantly higher (P < 0.05) than the artificial digestion estimate of 0.0% (95% C.I. 0.0-1.1). Real-time PCR testing of muscle from seropositive animals did not detect Trichinella DNA in any mainland animals, but did reveal the presence of a second larvae-positive wild boar on Gabba Island, supporting its utility as an alternative, highly sensitive method of muscle examination. The serology results suggest Australian wildlife may have been exposed to Trichinella parasites. However, because of the possibility of non-specific reactions with other parasitic infections, more work using well-defined cohorts of positive and negative samples is required. Even if the specificity of the ELISAs proves to be low, their ability to correctly classify the small number of true positive sera in this study indicates utility in screening wild boar populations for reactive sera, which can be followed up with additional testing.
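The "substantial" and "very good" agreement figures quoted above are Cohen's kappa statistics. As a minimal sketch of how such a value is computed from the paired positive/negative calls of two tests; the 2x2 counts below are hypothetical, not the study's data.

```python
# Cohen's kappa for agreement between two diagnostic tests.
# The 2x2 counts are hypothetical, for illustration only.
import numpy as np

def cohens_kappa(table):
    """Kappa for a square agreement table (rows: test A calls, cols: test B calls)."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    observed = np.trace(table) / n                                   # observed agreement
    expected = (table.sum(axis=1) * table.sum(axis=0)).sum() / n**2  # chance agreement
    return (observed - expected) / (1.0 - expected)

# Rows/cols: [positive, negative] calls by the in-house and commercial ELISA
print(f"kappa = {cohens_kappa([[12, 3], [4, 300]]):.2f}")
```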
Abstract:
A spatially explicit multi-competitor coexistence model was developed for meta-populations of prawns (shrimp) occupying habitat patches across the Great Barrier Reef, where dispersal was localised and dispersal rates varied between species. Prawns were modelled as individuals moving to and from patches or cells according to pre-set decision rules. The landscape was simulated as a matrix of cells, with each cell having a spatially explicit survival index for each species. Mixed-species prawn assemblages moved over this simplified spatially explicit landscape. A low level of chronic random environmental disturbance (cyclone and tropical storm damage) was assumed, with additional acute, spatially confined disturbance due to commercial trawling, modelled as an increase in mortality affecting inter-specific competition. The general form of the results was for increased disturbance to favour good-colonising "generalist" species at the expense of good-competitor "specialists". Increasing fishing mortality (local patch extinctions) combined with poor colonising ability resulted in low equilibrium abundance for even the best competitor, while in the same circumstances the poorest competitor but best coloniser could have the highest equilibrium abundance. This mimics the switch from high-value to lower-value prawn species as trawl effort increases, reflected in historic catch and effort logbook data and reported anecdotally by the north Queensland trawl fleet. To match the observed distribution and behaviour of prawn assemblages, a combination of inter-species competition, a spatially explicit landscape and a defined pattern of disturbance (trawling) was required. Modelling this combination could simulate not only general trends in the spatial distribution of each prawn species, but also the localised concentrations observed in the survey data.
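As a rough sketch of the kind of patch-occupancy simulation described above, the toy model below places a good competitor and a good coloniser on a grid of cells, applies chronic background mortality everywhere plus extra "trawling" mortality in one half of the grid, and lets empty cells be recolonised from neighbouring cells. All rates and the grid size are illustrative assumptions, not the study's parameters.

```python
# Toy competition-colonisation model on a cell grid with disturbance.
# All parameter values are illustrative, not those of the study.
import numpy as np

rng = np.random.default_rng(0)
EMPTY, COMPETITOR, COLONISER = 0, 1, 2
grid = rng.choice([COMPETITOR, COLONISER], size=(50, 50))

colonise = np.array([0.0, 0.10, 0.40])  # colonisation rate, indexed by cell state
disturb, trawl = 0.02, 0.05             # chronic vs trawling-induced mortality

for _ in range(200):
    # Mortality: chronic disturbance everywhere, trawling in the left half only
    dead = rng.random(grid.shape) < disturb
    dead[:, :25] |= rng.random((50, 25)) < trawl
    grid[dead] = EMPTY
    # Localised dispersal: an empty cell may be colonised from a random neighbour
    neighbour = np.roll(grid, tuple(rng.integers(-1, 2, size=2)), axis=(0, 1))
    empty = grid == EMPTY
    wins = rng.random(grid.shape) < colonise[neighbour]
    grid[empty & wins] = neighbour[empty & wins]
    # Competitive displacement: the stronger competitor can take coloniser cells
    taken = (grid == COLONISER) & (neighbour == COMPETITOR) & (rng.random(grid.shape) < 0.05)
    grid[taken] = COMPETITOR

print({"competitor": int((grid == COMPETITOR).sum()),
       "coloniser": int((grid == COLONISER).sum())})
```

Raising `trawl` shifts equilibrium abundance toward the coloniser, mirroring the reported switch from good competitors to good colonisers under heavier trawl effort.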
Abstract:
Models that implement the bio-physical components of agro-ecosystems are ideally suited for exploring sustainability issues in cropping systems. Sustainability may be represented as a number of objectives to be maximised or minimised. However, the full decision space of these objectives is usually very large and simplifications are necessary to safeguard computational feasibility. Different optimisation approaches have been proposed in the literature, usually based on mathematical programming techniques. Here, we present a search approach based on a multiobjective evaluation technique within an evolutionary algorithm (EA), linked to the APSIM cropping systems model. A simple case study addressing crop choice and sowing rules in North-East Australian cropping systems is used to illustrate the methodology. Sustainability of these systems is evaluated in terms of economic performance and resource use. Due to the limited size of this sample problem, the quality of the EA optimisation can be assessed by comparison to the full problem domain. Results demonstrate that the EA procedure, parameterised with generic parameters from the literature, converges to a useable solution set within a reasonable amount of time. Frontier "peels", or Pareto-optimal solutions, as described by the multiobjective evaluation procedure provide useful information for discussion on trade-offs between conflicting objectives.
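A minimal sketch of the Pareto "peeling" idea follows: candidate options are scored on two objectives (here, maximising gross margin and minimising water use, the latter negated so both axes are maximised) and successive non-dominated fronts are stripped off. The scores are hypothetical and APSIM itself is not called; the names are illustrative only.

```python
# Successive Pareto fronts ("peels") over hypothetical two-objective scores.
import numpy as np

def pareto_front(points):
    """Indices of points not dominated by any other point (maximising both axes)."""
    return [i for i, p in enumerate(points)
            if not any(np.all(q >= p) and np.any(q > p) for q in points)]

# Columns: gross margin ($/ha), negated water use (-ML/ha) for candidate rules
scores = np.array([[520, -310], [480, -250], [470, -260],
                   [390, -180], [380, -200], [300, -150]])
peel = 1
while len(scores):
    front = pareto_front(scores)
    print(f"peel {peel}: {scores[front].tolist()}")
    scores = np.delete(scores, front, axis=0)
    peel += 1
```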
Abstract:
Trials were conducted in southern Queensland, Australia between March and May of 2003, 2004 and 2005 to study patterns of hourly and daily release of the secondary conidia of Claviceps africana and their relationships with weather parameters. Conidia were trapped for at least one hour on most (> 90%) days in 2003 and 2004, but on only 55% of days in 2005. Both the highest daily concentration of conidia and the highest number of hours per day when conidia were trapped were recorded 1-3 days after rainfall events. Although the pattern of conidial release differed from day to day, the highest hourly conidial concentrations occurred between 10.00 hours and 17.00 hours on 73% of all days in the three trials. Hours when conidia were trapped were characterized by higher median values of temperature, wind speed and vapour pressure deficit, lower relative humidity, and leaf wetness values of 0% than hours when no conidia were recorded. The results indicate that fungicides need to be applied to the highly ergot-susceptible male-sterile (A-) lines of sorghum in hybrid seed production blocks and breeders' nurseries as soon as possible after rainfall events to minimize ergot severity.
Abstract:
Residue retention is an important issue in evaluating the sustainability of production forestry. However, its long-term impacts have not been studied extensively, especially in sub-tropical environments. This study investigated the long-term impact of harvest residue retention on tree nutrition, growth and productivity of an F1 hybrid (Pinus elliottii var. elliottii × Pinus caribaea var. hondurensis) exotic pine plantation in sub-tropical Australia under three harvest residue management regimes: (1) residue removal, RR0; (2) single residue retention, RR1; and (3) double residue retention, RR2. The experiment, established in 1996, is a randomised complete block design with four replicates. Tree growth measurements were carried out at ages 2, 4, 6, 8 and 10 years, while foliar nutrient analyses were carried out at ages 2, 4, 6 and 10 years. Litter production and litter nitrogen (N) and phosphorus (P) measurements were carried out quarterly over a 15-month period between ages 9 and 10 years. Results showed that total tree growth was still greater in the residue-retained treatments than in the RR0 treatment. However, mean annual increments of diameter at breast height (MAID) and basal area (MAIB) declined significantly after age 4 years, to about 68-78% at age 10 years. Declining foliar N and P concentrations accounted for 62% (p < 0.05) of the variation in growth rates after age 4 years, and foliar N and P concentrations were either marginal or below critical concentrations. In addition, litter production and litter N and P contents were not significantly different among the treatments. This study suggests that the impact of residue retention on tree nutrition and growth rates might be limited over a longer period, and that the integration of alternative forest management practices is necessary to sustain the benefits of harvest residues until the end of the rotation.
Abstract:
In semi-arid areas such as western Nebraska, interest in subsurface drip irrigation (SDI) for corn is increasing due to restricted irrigation allocations. However, quantification of crop response to nitrogen (N) applied through SDI, and of the environmental benefits of multiple in-season (IS) SDI N applications rather than a single early-season (ES) surface application, is lacking. The study was conducted in 2004, 2005 and 2006 at the University of Nebraska-Lincoln West Central Research and Extension Center in North Platte, Nebraska, comparing two N application methods (IS and ES) and three N rates (128, 186 and 278 kg N ha⁻¹) using a randomized complete block design with four replications. No grain yield or biomass response was observed in 2004. In 2005 and 2006, corn grain yield and biomass production increased with increasing N rates, and the IS treatment increased grain yield, total N uptake, and gross return after N application costs (GRN) compared to the ES treatment. Chlorophyll meter readings taken at the R3 corn growth stage in 2006 showed that less N was supplied to the plant with ES than with the IS treatment. At the end of the study, soil NO3-N masses at the 0.9 to 1.8 m depth were greater under the IS treatment than under the ES treatment. Results suggested that greater losses of NO3-N below the root zone under the ES treatment may have had a negative effect on corn production. Under SDI systems, fertigating a recommended N rate at various corn growth stages can increase yields and GRN, and reduce NO3-N leaching in soils, compared to concentrated early-season applications.
Abstract:
Cat’s claw creeper, Macfadyena unguis-cati (L.) Gentry (Bignoniaceae), is a major environmental weed of riparian areas, rainforest communities and remnant natural vegetation in coastal Queensland and New South Wales, Australia. In densely infested areas it smothers standing vegetation, including large trees, and causes canopy collapse. Quantitative data on the ecology of this invasive vine are generally lacking. The present study examines the underground tuber traits of M. unguis-cati and explores their links with aboveground parameters at five infested sites spanning both riparian and inland vegetation. Tubers were abundant in terms of density (~1000 per m²), although small in size and low in level of interconnectivity. M. unguis-cati also exhibits multiple stems per plant. Of all traits screened, the link between stand (stem density) and tuber density was the most significant, and yielded a promising bivariate relationship for the purposes of estimating, predicting and managing what lies beneath the soil surface of a given M. unguis-cati infestation site. The study also suggests that new recruitment is primarily from seeds, not from vegetative propagation as previously thought. The results highlight the need for future biological-control efforts to focus on introducing specialist seed- and pod-feeding insects to reduce seed output.
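The stand-tuber relationship reported above lends itself to a simple bivariate regression for predicting below-ground tuber load from stem counts. The sketch below uses invented paired counts purely to illustrate the fitting step; the study's actual data and fitted coefficients are not shown here.

```python
# Simple linear regression of tuber density on stem density.
# Paired counts are hypothetical, for illustration only.
import numpy as np

stems = np.array([120, 250, 410, 560, 700])    # stems per m^2 (hypothetical)
tubers = np.array([190, 420, 650, 880, 1100])  # tubers per m^2 (hypothetical)

slope, intercept = np.polyfit(stems, tubers, 1)
print(f"tubers/m^2 ~ {slope:.2f} * stems + {intercept:.0f}")
print(f"predicted tuber density at 500 stems/m^2: {slope * 500 + intercept:.0f}")
```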
Abstract:
The Horticulture Australia funded project, Management Guidelines for Warm-Season Grasses in Australia (TU05001), has allowed a detailed greens grass study to take place and enabled researchers and superintendents to work together to collect meaningful data on a range of Cynodon dactylon (L.) Pers. x Cynodon transvaalensis Burtt-Davy (Cynodon hybrid) and Paspalum vaginatum O. Swartz (seashore paspalum) cultivars suitable for golf or lawn bowls use. The end result provides superintendents and greenkeepers with additional knowledge to accompany their skills in managing or upgrading their greens to produce a denser, smoother and faster putting or bowls surface. However, neither turfgrass selection nor a finely tuned management program will overcome unrealistic expectations (especially in relation to usage), poor growing environments, or limitations due to improper construction techniques.
Abstract:
Converting from an existing irrigation system is often seen as high risk by the landowner. The significant financial investment, and the long period over which the investment runs, is further complicated by the uncertainty associated with long-term input costs (such as energy), crop production, and continually evolving natural resource management rules and policy. Irrigation plays a pivotal part in the Burdekin sugarcane farming system. At present, furrow irrigation is by far the most common form due to its ease of use, relatively low operating cost and the well-established infrastructure currently in place. The Mulgrave Area Farmer Integrated Action (MAFIA) grower group, located near Clare in the lower Burdekin region, identified the need to learn about sustainable farming systems with a focus on environmental, social and economic implications. In early 2007, Hesp Farming established a site to investigate the use of overhead irrigation as an alternative to furrow irrigation and its integration with new farming system practices, including Green Cane Trash Blanketing (GCTB). Although significant environmental and social benefits exist, the preliminary investment analysis indicates that the Overhead Low Pressure (OHLP) irrigation system is not adding financial value to the Hesp Farming business. A combination of high capital costs and other offsetting factors resulted in the benefits not being fully realised. A different outcome is achieved if Hesp Farming is able to realise value on the water saved, with both OHLP irrigation systems displaying a positive NPV. This case study provides a framework for further investigating the economics of OHLP irrigation in sugarcane, and it is anticipated that with additional data a more definitive outcome will be developed in the future.
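The investment comparison above turns on a standard net present value (NPV) calculation. The sketch below shows the mechanics with entirely hypothetical cash flows and discount rate (the case study's actual figures are not reproduced): without valuing the water saved the NPV can be negative, while crediting a value for saved water can turn it positive.

```python
# NPV mechanics for an irrigation conversion; all figures are hypothetical.
def npv(rate, cashflows):
    """Net present value of yearly cash flows, with year 0 first."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

capital = -450_000       # up-front cost of the OHLP system (hypothetical)
annual_net = 38_000      # yearly operating benefit vs furrow (hypothetical)
water_value = 25_000     # value of water saved, if it can be realised

base = [capital] + [annual_net] * 15
with_water = [capital] + [annual_net + water_value] * 15

print(f"NPV without water value: {npv(0.07, base):,.0f}")   # negative here
print(f"NPV with water value:    {npv(0.07, with_water):,.0f}")  # positive here
```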
Abstract:
Three experiments were conducted to determine liveweight (W) gain and feed and water intake of weaned Bali cattle offered a range of feed types. In each experiment, 18 weaned entire male Bali cattle were allocated to three treatment groups in a completely randomised block design, with six replicates (animals) per treatment. The dietary treatments were: Experiment 1, native grass fed ad libitum, native grass supplemented with rice bran at 10 g dry matter (DM)/kg W.day, and native grass supplemented with a mixture of rice bran and copra meal in equal proportions fed at 10 g DM/kg W.day; Experiment 2, elephant grass hay fed ad libitum, elephant grass supplemented with gliricidia at 10 g DM/kg W.day, and gliricidia fed ad libitum; and Experiment 3, corn stover fed ad libitum, corn stover supplemented with gliricidia at 10 g DM/kg W.day, and corn stover supplemented with rice bran/copra meal in equal amounts (w/w) at 10 g DM/kg W.day. Each experiment was 10 weeks in duration, consisting of a 2-week preliminary period for adaptation to the diets and an 8-week experimental period for the measurement of W change, feed and water intake, and digestibility of the diet. Growth rates of 6-12-month-old entire male Bali cattle fed a range of local diets ranged from 0.10 to 0.40 kg/day. The lowest growth rates occurred when the cattle were given the basal diets of native grass (0.104 kg/day), elephant grass (0.174 kg/day) and corn stover (0.232 kg/day). With the addition of supplements such as rice bran, rice bran/copra meal or gliricidia to these basal diets, liveweight gains increased to between 0.225 and 0.402 kg/day. Forage DM intake was reduced with these supplements by an average of 22.6%, while total DM intake was increased by an average of 10.5%. The growth rate on gliricidia alone was 0.269 kg/day, and feed DM intake was 28.0 g/kg W.day. Water intake was not affected by supplement type or intake. In conclusion, the inclusion of small quantities of locally available, high-quality feed supplements provides small-holder farmers with the potential to increase growth rates of Bali calves from 0.1-0.2 kg/day, under prevailing feeding scenarios, to over 0.4 kg/day.
Abstract:
The in vivo faecal egg count reduction test (FECRT) is the most commonly used test to detect anthelmintic resistance (AR) in gastrointestinal nematodes (GIN) of ruminants in pasture-based systems. However, there are several variations on the method, some more appropriate than others in specific circumstances. While in some cases labour and time can be saved by collecting only post-drench faecal worm egg counts (FEC) of treatment groups with controls, or pre- and post-drench FEC of a treatment group with no controls, there are circumstances when pre- and post-drench FEC of an untreated control group as well as the treatment groups are necessary. Computer simulation techniques were used to determine the most appropriate of several methods for calculating AR when there is continuing larval development during the testing period, as often occurs when anthelmintic treatments against genera of GIN with high biotic potential or high re-infection rates, such as Haemonchus contortus of sheep and Cooperia punctata of cattle, are less than 100% efficacious. Three field FECRT experimental designs were investigated: (I) post-drench FEC of treatment and control groups; (II) pre- and post-drench FEC of a treatment group only; and (III) pre- and post-drench FEC of treatment and control groups. To investigate the performance of methods of indicating AR for each of these designs, simulated animal FEC were generated from negative binomial distributions, with subsequent sampling from binomial distributions to account for the drench effect, with varying parameters for worm burden, larval development and drench resistance. Calculations of percent reductions and confidence limits were based on those of the Standing Committee for Agriculture (SCA) guidelines. For the two field methods with pre-drench FEC, confidence limits were also determined from cumulative inverse Beta distributions of FEC, for eggs per gram (epg) and the number of eggs counted, at detection levels of 50 and 25. Two rules for determining AR were also assessed: (1) percent reduction (%R) < 95% and lower confidence limit < 90%; and (2) upper confidence limit < 95%. For each combination of worm burden, larval development and drench resistance parameters, 1000 simulations were run to determine the number of times the theoretical percent reduction fell within the estimated confidence limits and the number of times resistance would have been declared. When continuing larval development occurs during the testing period of the FECRT, the simulations showed that AR should be calculated from pre- and post-drench worm egg counts of an untreated control group as well as the treatment group. If the widely used rule 1 is used to assess resistance, rule 2 should also be applied, especially when %R is in the range of 90-95% and resistance is suspected.
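As a minimal sketch of the simulation approach described above (with illustrative parameters, not the paper's), the code below draws animal egg counts from a negative binomial distribution, applies the drench effect as binomial thinning, adds continuing larval development in both groups, and compares the uncorrected (design II) percent reduction with the control-corrected (design III) estimate.

```python
# FECRT simulation sketch: negative binomial egg counts, binomial drench
# effect, continuing larval development. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n_animals, mean_epg, k = 10, 400, 0.7  # k: negative binomial dispersion
efficacy = 0.90                        # true proportion of worms removed
growth = 1.3                           # continuing larval development factor

def fec(n):
    # NumPy's negative binomial, parameterised so the mean equals mean_epg
    return rng.negative_binomial(k, k / (k + mean_epg), size=n)

pre_treat, pre_ctrl = fec(n_animals), fec(n_animals)
survivors = rng.binomial(pre_treat, 1 - efficacy)  # drench effect
post_treat = rng.poisson(survivors * growth)       # plus larval development
post_ctrl = rng.poisson(pre_ctrl * growth)

r_uncorrected = 100 * (1 - post_treat.mean() / pre_treat.mean())  # design II
r_corrected = 100 * (1 - (post_treat.mean() / pre_treat.mean())
                       / (post_ctrl.mean() / pre_ctrl.mean()))    # design III
print(f"%R without controls: {r_uncorrected:.1f}, with controls: {r_corrected:.1f}")
```

In this toy setup the control-corrected estimate recovers the true 90% efficacy, while the uncorrected estimate is biased downward by the continuing development, illustrating why an untreated control group is needed.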
Abstract:
Climate change and ongoing water policy reforms will likely contribute to on-farm and regional structural adjustment in Australia. This paper gathers empirical evidence of farm-level structural adjustments and integrates these with a regional equilibrium model to investigate the sectoral and regional impacts of climate change and recent water use policy on the rice industry. We find strong evidence of adjustments to the farming system, enabled by existing diversity in on-farm production. A further loss of water, with additional pressures to adopt less intensive and larger-scale farming, will however reduce the net number of farm businesses, which may affect regional rice production. The results from a regional CGE model show impacts on the regional economy over and above the direct cost of the environmental water, although a net reduction in real economic output and real income is partially offset by gains in the rest of Australia through the reallocation of resources. There is some interest within the industry, and from potential new corporate entrants, in relocating some rice production to the north. However, strong government support would be crucial to implementing such a relocation.