17 results for Two-year programs
Abstract:
Continuous cultivation and cereal cropping of southern Queensland soils previously supporting native vegetation have reduced soil nitrogen supply, and consequently decreased cereal grain yields and lowered grain protein. To enhance yields and protein concentrations of wheat, management practices involving N fertiliser application with no-tillage and stubble retention, grain legumes, and legume leys were evaluated from 1987 to 1998 on a fertility-depleted Vertosol at Warra, southern Queensland. The objective of this study was to examine the effect of lucerne in a 2-year lucerne–wheat rotation, for its nitrogen and disease-break benefits, on subsequent grain yield and protein content of wheat compared with continuous wheat cropping. Dry matter production and nitrogen yields of lucerne were closely correlated with total October–September rainfall as well as with March–September rainfall. Each 100 mm of total rainfall produced 0.97 t/ha of dry matter and 26 kg/ha of nitrogen yield; for March–September rainfall, the corresponding values were 1.26 t/ha of dry matter and 36 kg/ha of nitrogen yield. The latter values were 10% lower than those produced by annual medics over a similar period. Compared with wheat–wheat cropping, significant increases in total soil nitrogen were observed only in 1990, 1992 and 1994, but increases in soil mineralisable nitrogen were observed in most years following lucerne. Similarly, pre-plant nitrate nitrogen in the soil profile following lucerne was higher by an average of 74 kg/ha (range 9–167 kg N/ha) than that of wheat–wheat without N fertiliser in all years except 1996. Consequently, higher wheat grain protein (7 out of 9 seasons) and grain yield (4 out of 9 seasons) were produced compared with continuous wheat. Grain yield was significantly depressed in 2 out of 9 seasons (1993 and 1995), attributed to soil moisture depletion and/or low growing-season rainfall.
Consequently, the overall yield responses were lower than those from 50 kg/ha of fertiliser nitrogen applied to wheat–wheat crops, or from 2-year medic–wheat or chickpea–wheat rotations, although grain protein concentrations were higher following lucerne. The incidence and severity of common root rot of wheat, a soilborne disease caused by Bipolaris sorokiniana, were generally higher in lucerne–wheat than in continuous wheat with no nitrogen fertiliser applied, since its severity was significantly correlated with plant-available water at sowing. No significant incidence of crown rot or root lesion nematode was observed. Thus, productivity, which in this experiment was mainly due to nitrogen accretion, can be maintained where short-duration lucerne leys are grown in rotation with wheat.
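The rainfall-to-yield coefficients reported above are simple linear relations, and a short sketch makes the arithmetic concrete. This is illustrative only: the coefficients come from the abstract, while the zero intercept and the 500 mm rainfall example are assumptions.

```python
# Linear rainfall-to-yield relations for the lucerne ley, as reported in the
# abstract. A zero intercept is assumed purely for illustration.

DM_PER_100MM_TOTAL = 0.97    # t/ha dry matter per 100 mm Oct-Sep rainfall
N_PER_100MM_TOTAL = 26.0     # kg/ha nitrogen per 100 mm Oct-Sep rainfall
DM_PER_100MM_MAR_SEP = 1.26  # t/ha per 100 mm Mar-Sep rainfall
N_PER_100MM_MAR_SEP = 36.0   # kg/ha per 100 mm Mar-Sep rainfall

def lucerne_yield(rain_mm, dm_coef, n_coef):
    """Return (dry matter t/ha, nitrogen kg/ha) for a given rainfall total."""
    units = rain_mm / 100.0
    return units * dm_coef, units * n_coef

# Hypothetical example: 500 mm of total Oct-Sep rainfall.
dm, n = lucerne_yield(500.0, DM_PER_100MM_TOTAL, N_PER_100MM_TOTAL)
```

For 500 mm, the relations give about 4.85 t/ha of dry matter and 130 kg/ha of nitrogen.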
Abstract:
Supplements containing urea or biuret were fed in the dry season to yearling and two-year-old pregnant heifers grazing native spear grass pastures in north Queensland. Liveweight change and survival during the dry season, and fertility in the following year, were measured. In the first experiment, during a relatively favourable dry season, supplementation significantly (P<0.01) reduced liveweight loss in yearling heifers (5 vs. 32 kg). In the following year, during a drought, the supplement significantly (P<0.01) reduced liveweight loss in yearling heifers (32 vs. 41 kg) and significantly (P<0.01) reduced mortalities (23.5% vs. 5.2%) in pregnant and lactating heifers. The supplement had no significant effect on subsequent fertility in either experiment. 14th Biennial Conference.
Abstract:
Data on catch sizes, catch rates, length-frequency and age composition from the Australian east coast tailor fishery are analysed with three different population dynamics models: a surplus production model, an age-structured model, and a model in which the population is structured by both age and length. The population is found to be very heavily exploited, with its ability to reproduce dependent on the fishery’s incomplete selectivity of one-year-old fish. Estimates of recent harvest rates (the proportion of fish available to the fishery that are actually caught in a single year) exceed 80%, and it is estimated that only 30–50% of one-year-old fish are available to the fishery. Results from the age-length-structured model indicate that both exploitable biomass (the total mass of fish selected by the fishery) and egg production have fallen to about half the levels that prevailed in the 1970s, and to about 40% of virgin levels. Two-year-old fish appear to have become smaller over the history of the fishery. This is assumed to be due to increased fishing pressure combined with non-selectivity of small one-year-old fish, whereby the one-year-old fish that survive fishing are small and grow into small two-year-old fish the following year. An alternative hypothesis is that the stock has undergone a genetic change towards smaller fish; the true explanation is unknown. The instantaneous natural mortality rate of tailor is hypothesised to be higher than previously thought, with values between 0.8 and 1.3 yr⁻¹ consistent with the models. These values apply only to tailor up to about three years of age, and a lower value may apply to older fish. The analysis finds no evidence that fishing pressure has yet affected recruitment.
If a recruitment downturn were to occur, however, under current management and fishing pressure there is a strong chance that the fishery would need a complete closure for several years to recover, and even then recovery would be uncertain. It is therefore highly desirable to better protect the spawning stock. The major recommendations are:
• an increase in the minimum size limit from 30 cm to 40 cm, to allow most one-year-old fish to spawn; and
• an experiment on discard mortality, to gauge the proportion of fish between 30 cm and 40 cm that are likely to survive being caught and released by recreational line fishers (the dominant component of the fishery, currently harvesting roughly 1000 t p.a. versus about 200 t p.a. from the commercial fishery).
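The interaction between incomplete selectivity and the high harvest rate can be sketched as simple arithmetic. The availability and harvest-rate figures below are the abstract's estimates; the functions are an illustration only, ignoring natural mortality.

```python
# Fraction of a one-year-old cohort removed by fishing in a year: only the
# fish available to the gear (30-50% per the abstract) are exposed to the
# >80% harvest rate. Natural mortality is ignored for simplicity.

def fraction_caught(availability, harvest_rate):
    """Fraction of the whole cohort caught in one year."""
    return availability * harvest_rate

def fraction_escaping(availability, harvest_rate):
    """Unavailable fish plus available fish that were not caught."""
    return 1.0 - fraction_caught(availability, harvest_rate)

# With 40% availability and an 85% harvest rate, roughly two-thirds of the
# cohort escapes fishing that year - the stock's reproductive buffer.
escaping = fraction_escaping(0.40, 0.85)
```

This makes plain why the abstract stresses that reproduction depends on incomplete selectivity: if availability rose toward 100%, almost the entire cohort would be removed each year.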
Abstract:
Twelve strains of Pseudomonas pseudomallei were isolated from the soil and water of a sheep paddock over a two-year period. The organism was recovered from the clay layer of the soil profile as well as from water that seeps into this layer during the "wet" season. Five isolates were obtained before the commencement of the "wet" season; environmental factors appear to play an important role in the survival of Ps. pseudomallei during the "dry" season. Isolation rates were lower than those reported by workers in southeast Asia and Iran.
Abstract:
Runoff and sediment loss from forest roads were monitored for a two-year period in a Pinus plantation in southeast Queensland. Two classes of road were investigated: a gravelled road, which is used as a primary daily haulage route for the logging area, and an ungravelled road, which provides the main access route for individual logging compartments and is intensively used as a haulage route only during the harvest of these areas (approximately every 30 years). Both roads were subjected to routine traffic loads and maintenance during the study. Surface runoff in response to natural rainfall was measured, and samples were taken to determine sediment and nutrient (total nitrogen, total phosphorus, dissolved organic carbon and total iron) loads from each road. The mean runoff coefficient (runoff depth/rainfall depth) was consistently higher from the gravelled road plot (0.57) than from the ungravelled road (0.38). Total sediment loss over the two-year period was also greater from the gravelled road plot, at 5.7 t km⁻¹, than from the ungravelled road plot, at 3.9 t km⁻¹. Suspended solids contributed 86% of the total sediment loss from the gravelled road and 72% from the ungravelled road over the two years. Nitrogen loads from the two roads were both relatively constant throughout the study, averaging 5.2 and 2.9 kg km⁻¹ from the gravelled and ungravelled road, respectively. Mean annual phosphorus loads were 0.6 kg km⁻¹ from the gravelled road and 0.2 kg km⁻¹ from the ungravelled road. Organic carbon and total iron loads increased in the second year of the study, which was a much wetter year; these increases are thought to reflect the breakdown of organic matter in roadside drains and increased sediment generation, respectively. When road and drain maintenance (grading) was performed, runoff and sediment loss increased from both road types.
Additionally, the breakdown of the gravel road base due to high traffic intensity during wet conditions resulted in the formation of deep (10 cm) ruts, which increased erosion. The Water Erosion Prediction Project (WEPP):Road model was used to compare predicted with observed runoff and sediment loss from the two road classes investigated. For individual rainfall events, WEPP:Road predictions showed strong agreement with observed runoff and sediment loss. WEPP:Road predictions of annual sediment loss from the entire forestry road network in the study area also showed reasonable agreement with the extrapolated observed values.
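The runoff coefficient used in the study is defined in the abstract as runoff depth divided by rainfall depth. A minimal sketch follows; the depths are hypothetical (the 0.57 and 0.38 means reported above are measured values, not derived from these numbers).

```python
# Runoff coefficient: dimensionless ratio of runoff depth to rainfall depth
# for a plot or event, as defined in the abstract.

def runoff_coefficient(runoff_depth_mm, rainfall_depth_mm):
    """Return runoff depth / rainfall depth; both in the same units (mm)."""
    if rainfall_depth_mm <= 0:
        raise ValueError("rainfall depth must be positive")
    return runoff_depth_mm / rainfall_depth_mm

# Hypothetical event: 28.5 mm of runoff from a 50 mm storm.
coef = runoff_coefficient(28.5, 50.0)
```

A coefficient near 0.57 (as for the gravelled road) means more than half the rain falling on the road surface leaves as overland flow.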
Abstract:
This two-year study examined the impacts of feral pig diggings on five ecological indicators: seedling survival, surface litter, subsurface plant biomass, earthworm biomass and soil moisture content. Twelve recovery exclosures were established in two habitats (characterised by wet and dry soil moisture) by fencing off areas of previous pig diggings. A total of 0.59 ha was excluded from further pig diggings and compared with 1.18 ha of unfenced control areas. Overall, seedling numbers increased 7% within the protected exclosures and decreased 37% within the unprotected controls over the two-year study period. A significant temporal interaction was found in the dry habitat, with seedling survival increasing with increasing time of protection from diggings. Feral pig diggings had no significant effect on surface litter biomass, subsurface plant biomass, earthworm biomass or soil moisture content.
Abstract:
Premature or abnormal softening of persimmon fruit within 3–7 days after harvest is a major physiological problem of non-astringent persimmon cultivars grown in subtropical regions of Australia. Up to 30% of consignments may soften rapidly, frequently overnight, often resulting in flesh that is very soft, completely translucent and impossible to handle. The incidence of premature soft fruit can vary with season and production location. To study this problem, we conducted surveys of fruit harvested from five environmentally diverse regions of Australia over a two-year period. We found wide variation in the rates of both premature and normal softening, with differences of up to 37 days between orchards in the time for 50% of fruit to soften. The rate of fruit softening was exacerbated by lower calcium concentrations at fruit set, shorter fruit development periods and heavier rainfall during the fruit development period. The implications of our findings for orchard management and for export and domestic marketing strategies are discussed.
Abstract:
Wear resistance and recovery of 8 bermudagrass (Cynodon dactylon (L.) Pers.) and hybrid bermudagrass (C. dactylon x C. transvaalensis Burtt-Davy) cultivars grown on a sand-based soil profile near Brisbane, Australia, were assessed in 4 wear trials conducted over a two-year period. Wear was applied on a 7-day or a 14-day schedule by a modified Brinkman Traffic Simulator for 6–14 weeks at a time, either during winter–early spring or during summer–early autumn. The more frequent wear under the 7-day treatment was more damaging to the turf than the 14-day treatment, particularly during winter, when the turf's capacity for recovery from wear was severely restricted. There were substantial differences in wear tolerance among the 8 cultivars investigated, and the wear tolerance rankings of some cultivars changed between years. Wear tolerance was associated with high shoot density, a dense stolon mat strongly rooted to the ground surface, and high cell wall strength as indicated by high total cell wall content and high levels of lignin and neutral detergent fiber. Wear tolerance was also affected by turf age, planting sod quality, and wet weather. Resistance to wear and recovery from wear are both important components of wear tolerance, but the relative importance of their contributions to overall wear tolerance varies seasonally with turf growth rate.
Abstract:
South African citrus thrips (Scirtothrips aurantii) has established adventitiously in Australia. Although it is a major horticultural pest in Africa, it has been advocated as a possible biological control agent against Bryophyllum delagoense Eckl. & Zeyh. (Crassulaceae). To evaluate the biocontrol potential of S. aurantii, a two-year field study was conducted on the western Darling Downs of southern Queensland. Imidacloprid insecticide was applied to two quadrats at each of 18 field sites to assess, in the absence of S. aurantii, the persistence of individual plants and to quantify propagule production and recruitment by this declared weed. A third quadrat was left as a control to be infested naturally by S. aurantii. When released from thrips herbivory in the field, plants grew significantly more, flowered more, and were significantly more fecund than plants in the quadrats with S. aurantii. These increases in growth and fecundity translated into significantly increased plant numbers but not increased recruitment. Recruitment actually declined in experimental quadrats, through the indirect effects of releasing plants from herbivory. Field sampling also revealed that S. aurantii may be sensitive to seasonal climatic fluctuations. These and other local climatic influences may limit the biological control potential of the insect.
Abstract:
Parthenium hysterophorus L. is a weed of global significance that has become a major weed in Australia and many other parts of the world. A combined approach to the management of parthenium weed, using biological control and plant suppression, was tested under field conditions over a two-year period in southern central Queensland. Six suppressive plant species, selected for the suppressive ability they demonstrated in earlier glasshouse studies, worked synergistically with the biological control agents present in the field (Epiblema strenuana Walker, Zygogramma bicolorata Pallister, Listronotus setosipennis Hustache and Puccinia abrupta var. partheniicola) to reduce the growth (above-ground biomass) of parthenium weed by 60–86% and 47–91% in Years 1 and 2, respectively. The biomass of the suppressive plants was between 6% and 23% greater when biological control agents were present than when they had been excluded. This shows that parthenium weed can be managed more effectively by combining the current biological control strategy with selected sown suppressive plant species, both in Australia and elsewhere.
Abstract:
The aim of this study was to investigate the effects of treating Bos indicus heifers with different combinations of intravaginal progesterone-releasing devices (IPRD), oestradiol benzoate (ODB), PGF2α and eCG on follicle stimulating hormone (FSH) secretion and dominant follicle (DF) growth. Two-year-old Brahman (BN; n=30) and Brahman-cross (BNX; n=34) heifers were randomly allocated to three IPRD treatments or a control: (i) standard-dose IPRD [CM 1.56 g; 1.56 g progesterone (P4); n=17]; (ii) half-dose IPRD (CM 0.78 g; 0.78 g P4; n=15); (iii) half-dose IPRD + 300 IU eCG at IPRD removal (CM 0.78 g+G; n=14); and (iv) non-IPRD control (2× PGF2α; n=18), 500 µg cloprostenol on Days -16 and -2. IPRD-treated heifers received 250 µg PGF2α at IPRD insertion (Day -10) and IPRD removal (Day -2), and 1 mg ODB on Days -10 and -1. Follicular dynamics were monitored daily by trans-rectal ultrasonography from Day -10 to Day 1. Blood samples for determination of P4 were collected daily, and samples for FSH determination were collected at 12 h intervals from Day -9 to Day -2. A significant surge in FSH concentrations was observed in the 2× PGF2α treatment 12 h before and 48 h after follicular wave emergence, but not in the IPRD-treated heifers. Estimated mean total plasma P4 concentration during the 8 days of IPRD insertion was greater (P<0.001) in the CM 1.56 g-treated heifers than in the CM 0.78 g-treated heifers (18.38 vs. 11.09 ng/ml). A treatment-by-genotype interaction (P=0.036) was observed in the mean plasma P4 concentration of heifers with no CL during IPRD insertion, whereby BN heifers in the CM 1.56 g treatment had greater plasma P4 than BNX heifers on Days -9, -7, -6, -5 and -4. However, there was no genotype effect in the CM 0.78 g ± G or the 2× PGF2α treatments.
Treatment had no effect on DF growth to ovulation from either the day of wave emergence (P=0.378) or the day of IPRD removal (P=0.780). This study demonstrates that FSH secretion in B. indicus heifers treated with a combination of IPRDs and ODB to synchronise ovulation was suppressed during the period of IPRD insertion, but no significant effect on growth of the DF was observed.
Abstract:
The primary objective of this study was to investigate the impact of animal-level factors, including energy balance and environmental/management stress, on the ovarian function of Bos indicus heifers treated to synchronize ovulation. Two-year-old Brahman (BN; n=30) and BN-cross (n=34) heifers were randomly allocated to three intravaginal progesterone-releasing device (IPRD) treatment groups or a control group: (i) standard-dose IPRD [Cue-Mate® (CM) 1.56 g; n=17]; (ii) half-dose IPRD [0.78 g progesterone (P4); CM 0.78 g; n=15]; (iii) half-dose IPRD + 300 IU equine chorionic gonadotrophin at IPRD removal (CM 0.78 g + G; n=14); and (iv) a non-IPRD control, 2× PGF2α [500 µg prostaglandin F2α (PGF2α)] on Days -16 and -2 (n=18). IPRD-treated heifers received 250 µg PGF2α at IPRD insertion (Day -10) and IPRD removal (Day -2), and 1 mg oestradiol benzoate on Days -10 and -1. Heifers were managed in a small feedlot and fed a defined ration. Ovarian function was evaluated by ultrasonography and plasma P4 throughout the synchronized and return cycles. Energy balance was evaluated using plasma insulin-like growth factor 1 (IGF-I) and glucose concentrations. The impact of environmental stressors was evaluated using plasma cortisol concentration. Heifers with normal ovarian function had significantly higher IGF-I concentrations at the commencement of the experiment (p = 0.008) and significantly higher plasma glucose concentrations at Day -2 (p = 0.040) and Day 4 (p = 0.043) than heifers with abnormal ovarian function. There was no difference between the mean pre-ovulatory cortisol concentrations of heifers that ovulated and those that did not. However, heifers that ovulated had higher cortisol concentrations at Days 4 (p = 0.056) and 6 (p = 0.026) after ovulation than heifers that did not ovulate.
Abstract:
To break the yield ceiling of rice production, a super rice project was launched in 1996 to breed rice varieties with super-high yield. A two-year experiment was conducted to evaluate the yield and nitrogen (N)-use response of super rice to different planting methods in the single cropping season. A total of 17 rice varieties, 13 super rice and four non-super checks (CK), were grown under three N levels [0 (N0), 150 (N150) and 225 (N225) kg ha⁻¹] and two planting methods [transplanting (TP) and direct-seeding in wet conditions (WDS)]. Grain yield under WDS (7.69 t ha⁻¹) was generally lower than under TP (8.58 t ha⁻¹). However, grain yield under the different planting methods was affected by N rate as well as by variety group. In both years there was no difference in grain yield between super and CK varieties at N150, irrespective of planting method. At N225, however, the yield difference in the japonica group was dramatic: super rice out-yielded CK varieties by an average of 11.3% under WDS and 14.1% under TP. This suggests that high N input helps narrow the yield gap for super rice varieties, and indicates that super rice was bred for high-fertility conditions. In the japonica group, more N was accumulated in super rice than in CK at N225, but no difference was found between super and CK varieties at N0 and N150. Similar results were found for N agronomic efficiency. These results suggest that super rice varieties have an N-use efficiency advantage when high N is applied, and that the response of super rice was greater under TP than under WDS. Agronomic and other management practices need further improvement to achieve high yield and N-use efficiency for super rice varieties under WDS.
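The yield-advantage percentages quoted above (11.3% and 14.1%) are plain relative increases over the CK mean; a small sketch of that calculation, using hypothetical yields:

```python
# Percent yield advantage of super rice over the check (CK) mean, as used in
# the abstract. The yield values below are hypothetical.

def pct_yield_advantage(super_t_ha, ck_t_ha):
    """Relative increase (%) of super rice yield over the CK mean yield."""
    return 100.0 * (super_t_ha - ck_t_ha) / ck_t_ha

# Hypothetical example: 9.0 t/ha (super) vs. 8.0 t/ha (CK).
advantage = pct_yield_advantage(9.0, 8.0)
```

With these illustrative yields the advantage is 12.5%, comparable in magnitude to the japonica-group differences reported at N225.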
Abstract:
Cat’s claw creeper vine, Dolichandra unguis-cati (L.) L.G.Lohmann (formerly Macfadyena unguis-cati (L.) A.H.Gentry), a Weed of National Significance (WoNS), is a structural woody parasite that is highly invasive along sensitive riparian corridors and in native forests of coastal and inland eastern Australia. As part of an evaluation of the impact of herbicide and mechanical/physical control techniques on long-term reduction of the weed's biomass and the expected return of native flora, we set up permanent vegetation plots in (a) infested and now chemically/physically treated, (b) infested but untreated, and (c) un-infested patches. The treatments were established in both riparian and non-riparian habitats to document changes in the seed bank flora over a two-year post-treatment period. Response to treatment varied spatially and temporally. Following chemical and physical removal treatments, treated patches exhibited lower seed bank abundance and diversity than both untreated infested patches and patches lacking the weed, but the differences were not statistically significant. It therefore appears that spraying herbicide at the recommended rate does not undermine restoration efforts.
Abstract:
Background: The development of a horse vaccine against Hendra virus has been hailed as a good example of a One Health approach to the control of human disease. Although there is little doubt that this is true, it is clear from the underwhelming uptake of the vaccine by horse owners to date (approximately 10%) that realisation of a One Health approach requires more than just a scientific solution. As emerging infectious diseases may often be linked to the development and implementation of novel vaccines, this presentation will discuss factors influencing their uptake, using Hendra virus in Australia as a case study. Methods: This presentation will draw on data collected from the Horse owners and Hendra virus: A Longitudinal cohort study To Evaluate Risk (HHALTER) study, a mixed-methods research study comprising a two-year survey-based longitudinal cohort study and a qualitative interview study with horse owners in Australia. The HHALTER study has investigated and tracked changes in a broad range of issues around early uptake of vaccination, horse owner uptake of other recommended disease risk mitigation strategies, and attitudes to government policy and disease response. Interviews provide further insights into attitudes towards risk and decision-making in relation to vaccine uptake. A combination of quantitative and qualitative data analyses will be reported. Results: Data collected from more than 1100 horse owners shortly after vaccine introduction indicated that vaccine uptake and intention to vaccinate were associated with a number of risk perception factors and financial cost factors. In addition, concerns about side effects and about veterinarians refusing to treat unvaccinated horses were linked to uptake.
Across the study period, vaccine uptake in the study cohort increased to more than 50%; however, concerns around side effects, equine performance and breeding impacts, delays to full vaccine approval, and attempts by horse associations and event organisers to mandate vaccination have all affected acceptance. Conclusion: Despite being provided with a safe and effective vaccine for Hendra virus that can protect horses and break the virus's transmission cycle to humans, Australian horse owners have been reluctant to commit to it. General issues pertinent to novel vaccines, combined with challenges in the implementation of this vaccine, have led to mistrust and misconception among some horse owners. Moreover, factors such as cost, booster dose schedules, complexities around perceived risk, and ulterior motives attributed to veterinarians have only served to polarise attitudes to vaccine acceptance.