9 results for "post 25 glorious years"

in eResearch Archive - Queensland Department of Agriculture


Relevance: 100.00%

Abstract:

Context. Irregular plagues of house mice cause high production losses in grain crops in Australia. If plagues can be forecast through broad-scale monitoring or model-based prediction, then mice can be proactively controlled by poison baiting. Aims. To predict mouse plagues in grain crops in Queensland and assess the value of broad-scale monitoring. Methods. Regular trapping of mice at the same sites on the Darling Downs in southern Queensland has been undertaken since 1974. This provides an index of abundance over time that can be related to rainfall, crop yield, winter temperature and past mouse abundance. Other sites have been trapped over a shorter time period elsewhere on the Darling Downs and in central Queensland, allowing a comparison of mouse population dynamics and cross-validation of models predicting mouse abundance. Key results. On the regularly trapped 32-km transect on the Darling Downs, damaging mouse densities occur in 50% of years and a plague in 25% of years, with no detectable increase in mean monthly mouse abundance over the past 35 years. High mouse abundance on this transect is not consistently matched by high abundance in the broader area. Annual maximum mouse abundance in autumn–winter can be predicted (R2 = 57%) from spring mouse abundance and autumn–winter rainfall in the previous year. In central Queensland, mouse dynamics contrast with those on the Darling Downs and lack the distinct annual cycle, with peak abundance occurring in any month outside early spring. On average, damaging mouse densities occur in 1 in 3 years and a plague occurs in 1 in 7 years. The dynamics of mouse populations on two transects ~70 km apart were rarely synchronous. Autumn–winter rainfall can indicate mouse abundance in some seasons (R2 = ~52%). Conclusion. Early warning of mouse plague formation in Queensland grain crops from regional models should trigger farm-based monitoring. This can be incorporated with rainfall into a simple model predicting future abundance that will determine any need for mouse control. Implications. A model-based warning of a possible mouse plague can highlight the need for local monitoring of mouse activity, which in turn could trigger poison baiting to prevent further mouse build-up.
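The predictive relationship described under Key results (annual peak abundance from spring abundance and prior autumn–winter rainfall, R2 = 57%) can be sketched as an ordinary least-squares fit with two predictors. This is a minimal illustration with invented numbers, not the authors' fitted model or data:

```python
import numpy as np

# Hypothetical trap-index and rainfall values, invented for illustration only.
spring_index = np.array([2.0, 5.0, 1.0, 8.0, 3.0, 6.0])   # spring abundance index
rainfall_mm = np.array([120, 200, 90, 260, 150, 210])     # prior autumn-winter rainfall
peak_index = np.array([5.0, 14.0, 2.0, 25.0, 8.0, 17.0])  # autumn-winter peak abundance

# Fit peak = b0 + b1*spring + b2*rainfall by ordinary least squares.
X = np.column_stack([np.ones_like(spring_index), spring_index, rainfall_mm])
coef, *_ = np.linalg.lstsq(X, peak_index, rcond=None)

# In-sample R^2, the statistic reported in the abstract.
predicted = X @ coef
ss_res = np.sum((peak_index - predicted) ** 2)
ss_tot = np.sum((peak_index - peak_index.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
```

In practice such a model would be fitted to the long-term transect series and cross-validated against the shorter-run sites, as the abstract describes.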

Relevance: 100.00%

Abstract:

Sown pasture rundown and declining soil fertility for forage crops are too serious to ignore, with losses in beef production of up to 50% across Queensland. The feasibility of using strategic applications of nitrogen (N) fertiliser to address these losses was assessed by analysing a series of scenarios using data drawn from published studies, local fertiliser trials and expert opinion. While N fertiliser can dramatically increase productivity (growth, feed quality and beef production gains of over 200% in some scenarios), the estimated economic benefits, derived from paddock-level enterprise budgets for a fattening operation, were much more modest. In the best-performing sown grass scenarios, average gross margins were doubled or tripled at the assumed fertiliser response rates, and internal rates of return of up to 11% were achieved. Using fertiliser on forage sorghum or oats was a much less attractive option and, under the paddock-level analysis and assumptions used, forages struggled to be profitable even on fertile sites with no fertiliser input. The economics of nitrogen fertilising on grass pasture were sensitive to the assumed response rates in both pasture growth and liveweight gain. Consequently, targeted research is proposed to re-assess the responses used in this analysis, which are largely based on research conducted 25-40 years ago when soils were generally more fertile and pastures less rundown.
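The internal rate of return quoted above (up to 11%) is the discount rate at which the net present value of an investment's cash flows reaches zero. A minimal sketch with invented cash flows, not figures from the study:

```python
# Toy IRR calculation for a fertiliser investment: year-0 outlay followed by
# annual returns. All cash flow values are invented for illustration.
cashflows = [-100.0, 20.0, 25.0, 30.0, 30.0, 30.0]

def npv(rate, flows):
    """Net present value of a cash flow series at a given discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

# Bisection search for the rate where NPV crosses zero (NPV falls as rate rises).
lo, hi = 0.0, 1.0
for _ in range(100):
    mid = (lo + hi) / 2
    if npv(mid, cashflows) > 0:
        lo = mid
    else:
        hi = mid
irr = (lo + hi) / 2
```

An enterprise-budget analysis like the one described would compare such a rate against the grazier's cost of capital before recommending fertiliser use.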

Relevance: 30.00%

Abstract:

In the dry tropics of northern Australia, heifers are generally weaned mid-year at about six months of age and experience two dry seasons and a wet season prior to first mating at 2 years of age, when only 60% are likely to conceive (Entwistle 1983). Pre-mating liveweight (PMLW) explains much of the variation in conception rate, but year effects explain further variation (Rudder et al. 1985).

Relevance: 30.00%

Abstract:

A restricted maximum likelihood analysis applied to an animal model showed no significant differences (P > 0.05) in pH value of the longissimus dorsi measured at 24 h post-mortem (pH24) between high and low lines of Large White pigs selected over 4 years for post-weaning growth rate on restricted feeding. Genetic and phenotypic correlations between pH24 and production and carcass traits were estimated using all performance testing records combined with the pH24 measurements (5.05-7.02) on slaughtered animals. The estimate of heritability for pH24 was moderate (0.29 ± 0.18). Genetic correlations between pH24 and production or carcass composition traits, except for ultrasonic backfat (UBF), were not significantly different from zero. UBF had a moderate, positive genetic correlation with pH24 (0.24 ± 0.33). These estimates of genetic correlations affirmed that selection for increased growth rate on restricted feeding is likely to result in limited changes in pH24 and pork quality since the selection does not put a high emphasis on reduced fatness.

Relevance: 30.00%

Abstract:

When recapturing satellite-collared wild dogs that had been trapped one month previously in padded foothold traps, we noticed varying degrees of pitting on the pads of their trapped paw. Veterinary advice, based on images taken of the injuries, suggests that the necrosis was caused by vascular compromise. Five of six dingoes we recaptured had varying degrees of necrosis restricted only to the trapped foot, ranging from single 5 mm holes to 25% sections of the toe pads missing or deformed, including loss of nails. The traps used were rubber-padded, two-coiled, Victor Soft Catch #3 traps. The springs were not standard Victor springs but Beefer springs; this modification slightly increases trap speed and the jaw pressure on the trapped foot. Despite this modification the spring pressure is still relatively mild in comparison to conventional long-spring or four-coiled wild dog traps. The five wild dogs developing necrosis were trapped in November 2006 at 5-6 months of age. Traps were checked each morning, so the dogs were unlikely to have been restrained in the trap for more than 12 hours. All dogs exhibited a small degree of paw damage at capture, which presented as a swollen paw and compression at the capture point. In contrast, eight wild dogs, 7-8 months old, were captured two months later in February. Upon their release, on advice from a veterinarian, we massaged the trapped foot to restore blood flow into the foot and applied a bruise treatment (Heparinoid 8.33 mg/ml) to assist in restoring blood flow. These animals were subsequently recaptured several months later and showed no signs of necrosis. While post-capture foot injuries are unlikely to be an issue in conventional control programs where the animal is immediately destroyed, caution needs to be used when releasing accidentally captured domestic dogs or research animals captured in rubber-padded traps. We have demonstrated that 7-8-month-old dogs can be trapped and released without any evidence of subsequent necrosis following minimal veterinary treatment. We suspect that the rubber padding on traps may increase the tourniquet effect by wrapping around the paw, and recommend the evaluation of offset laminated steel jaw traps as an alternative. Offset laminated steel jaw traps have been shown to be relatively humane, producing as few foot injuries as rubber-jawed traps.

Relevance: 30.00%

Abstract:

During the post-rainy (rabi) season in India around 3 million tonnes of sorghum grain is produced from 5.7 million ha of cropping. This underpins the livelihood of about 5 million households. Severe drought is common as the crop grown in these areas relies largely on soil moisture stored during the preceding rainy season. Improvement of rabi sorghum cultivars through breeding has been slow but could be accelerated if drought scenarios in the production regions were better understood. The sorghum crop model within the APSIM (Agricultural Production Systems sIMulator) platform was used to simulate crop growth and yield and the pattern of crop water status through each season using available historical weather data. The current model reproduced credibly the observed yield variation across the production region (R2=0.73). The simulated trajectories of drought stress through each crop season were clustered into five different drought stress patterns. A majority of trajectories indicated terminal drought (43%) with various timings of onset during the crop cycle. The most severe droughts (25% of seasons) were when stress began before flowering and resulted in failure of grain production in most cases, although biomass production was not affected so severely. The frequencies of drought stress types were analyzed for selected locations throughout the rabi tract and showed different zones had different predominating stress patterns. This knowledge can help better focus the search for adaptive traits and management practices to specific stress situations and thus accelerate improvement of rabi sorghum via targeted specific adaptation. The case study presented here is applicable to other sorghum growing environments. © 2012 Elsevier B.V.
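The grouping of simulated water-stress trajectories into characteristic drought patterns can be illustrated by assigning each season's trajectory to its nearest prototype pattern. This is a toy sketch with invented prototype shapes, not APSIM output or the study's clustering method:

```python
import numpy as np

# Stress index sampled at 10 points through the season: 1 = no stress, 0 = severe.
t = np.linspace(0, 1, 10)

# Hypothetical prototype drought patterns (the study identified five).
prototypes = {
    "no_stress": np.ones(10),
    "terminal_late": np.clip(1 - np.maximum(t - 0.6, 0) / 0.4, 0, 1),
    "terminal_early": np.clip(1 - np.maximum(t - 0.3, 0) / 0.4, 0, 1),
}

def classify(trajectory):
    # Assign a trajectory to the nearest prototype by Euclidean distance.
    return min(prototypes, key=lambda k: np.linalg.norm(prototypes[k] - trajectory))

# A season where stress sets in just after mid-season.
season = np.clip(1 - np.maximum(t - 0.55, 0) / 0.45, 0, 1)
label = classify(season)
```

Tallying such labels by location, as the abstract describes, yields the frequency of each stress type across the rabi tract.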

Relevance: 30.00%

Abstract:

Data from 9296 calves born to 2078 dams over 9 years across five sites were used to investigate factors associated with calf mortality for tropically adapted breeds (Brahman and Tropical Composite) recorded in extensive production systems, using multivariate logistic regression. The average calf mortality pre-weaning was 9.5% of calves born, varying from 1.5% to 41% across all sites and years. In total, 67% of calves that died did so within a week of their birth, with cause of death most frequently recorded as unknown. The major factors significantly (P < 0.05) associated with mortality for potentially large numbers of calves included the specific production environment represented by site-year, low calf birthweight (more so than high birthweight) and horn status at branding. Almost all calf deaths post-branding (assessed from n = 8348 calves) occurred in calves that were dehorned, totalling 2.1% of dehorned calves and 15.9% of all calf deaths recorded. Breed effects on calf mortality were primarily the result of breed differences in calf birthweight and, to a lesser extent, large teat size of cows; however, differences in other breed characteristics could be important. Twin births and calves assisted at birth had a very high risk of mortality, but <1% of calves were twins and few calves were assisted at birth. Conversely, it could not be established how many calves would have benefitted from assistance at birth. Cow age group and outcome from the previous season were also associated with current calf mortality; maiden or young cows (<4 years old) had increased calf losses overall. More mature cows with a previous outcome of calf loss were also more likely to have another calf loss in the subsequent year, and this should be considered for culling decisions. Closer attention to the management of younger cows is warranted to improve calf survival.
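The multivariate logistic regression described above models the probability of calf mortality as a function of risk factors. A minimal single-predictor sketch with invented data (the study's actual model included site-year, horn status, dam age and other factors):

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented data: standardised birthweight deviation for 500 calves, where
# lower birthweight raises mortality risk (true coefficients are assumptions).
bw_dev = rng.normal(0, 1, 500)
p_true = 1 / (1 + np.exp(-(-2.0 - 1.5 * bw_dev)))
died = rng.random(500) < p_true

# Fit intercept and slope by gradient ascent on the logistic log-likelihood.
b0, b1 = 0.0, 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(b0 + b1 * bw_dev)))
    b0 += 0.1 * np.mean(died - p)          # score for the intercept
    b1 += 0.1 * np.mean((died - p) * bw_dev)  # score for the slope
```

The fitted slope is negative, reproducing the direction of the assumed effect; a multivariate fit simply extends the design to more columns.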

Relevance: 30.00%

Abstract:

Fire is a major driver of ecosystem change and can disproportionately affect the cycling of different nutrients. Thus, a stoichiometric approach to investigate the relationships between nutrient availability and microbial resource use during decomposition is likely to provide insight into the effects of fire on ecosystem functioning. We conducted a field litter bag experiment to investigate the long-term impact of repeated fire on the stoichiometry of leaf litter C, N and P pools, and nutrient-acquiring enzyme activities during decomposition in a wet sclerophyll eucalypt forest in Queensland, Australia. Fire frequency treatments have been maintained since 1972, including burning every two years (2yrB), burning every four years (4yrB) and no burning (NB). C:N ratios in freshly fallen litter were 29-42% higher and C:P ratios were 6-25% lower for 2yrB than NB during decomposition, with correspondingly lower 2yrB N:P ratios (27-32) than for NB (34-49). Trends in litter soluble and microbial N:P ratios were similar to the overall litter N:P ratios across fire treatments. Consistent with these trends, the ratio of activities for N-acquiring to P-acquiring enzymes in litter was higher for 2yrB than NB, while 4yrB was generally intermediate between 2yrB and NB. Decomposition rates of freshly fallen litter were significantly lower for 2yrB (72±2% mass remaining at the end of the experiment) than for 4yrB (59±3%) and NB (62±3%), a difference that may be related to effects of N limitation, lower moisture content, and/or litter C quality. Results for older mixed-age litter were similar to those for freshly fallen litter, although treatment differences were less pronounced. Overall, these findings show that frequent fire (2yrB) decoupled N and P cycling, as manifested in litter C:N:P stoichiometry and in microbial biomass N:P ratio and enzymatic activities. These data indicate that fire induced a transient shift to N-limited ecosystem conditions during the post-fire recovery phase.

Relevance: 30.00%

Abstract:

A rare opportunity to test hypotheses about potential fishery benefits of large-scale closures was initiated in July 2004 when an additional 28.4% of the 348 000 km2 Great Barrier Reef (GBR) region of Queensland, Australia was closed to all fishing. Advice to the Australian and Queensland governments that supported this initiative predicted these additional closures would generate minimal (10%) initial reductions in both catch and landed value within the GBR area, with recovery of catches becoming apparent after three years. To test these predictions, commercial fisheries data from the GBR area and from the two adjacent (non-GBR) areas of Queensland were compared for the periods immediately before and after the closures were implemented. The observed means for total annual catch and value within the GBR declined from pre-closure (2000–2003) levels of 12 780 Mg and Australian $160 million, to initial post-closure (2005–2008) levels of 8143 Mg and $102 million; decreases of 35% and 36% respectively. Because the reference areas in the non-GBR had minimal changes in catch and value, the beyond-BACI (before, after, control, impact) analyses estimated initial net reductions within the GBR of 35% for both total catch and value. There was no evidence of recovery in total catch levels or any comparative improvement in catch rates within the GBR nine years after implementation. These results are not consistent with the advice to governments that the closures would have minimal initial impacts and rapidly generate benefits to fisheries in the GBR through increased juvenile recruitment and adult spillovers. Instead, the absence of evidence of recovery in catches to date currently supports an alternative hypothesis that where there is already effective fisheries management, the closing of areas to all fishing will generate reductions in overall catches similar to the percentage of the fished area that is closed.
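The raw reductions reported above follow directly from the pre- and post-closure means given in the abstract; the beyond-BACI analysis then nets out the (minimal) changes in the non-GBR reference areas. A quick arithmetic check using only the figures quoted in the abstract:

```python
# Pre-closure (2000-2003) and initial post-closure (2005-2008) annual means
# for the GBR area, as reported in the abstract.
pre_catch, post_catch = 12780.0, 8143.0   # total catch, Mg per year
pre_value, post_value = 160.0, 102.0      # landed value, AUD million per year

# Raw proportional declines (the BACI design adjusts these against the
# reference areas, which here showed minimal change).
catch_drop = (pre_catch - post_catch) / pre_catch
value_drop = (pre_value - post_value) / pre_value
```

Both raw drops come out near 36%, consistent with the 35-36% declines and the 35% BACI-estimated net reductions reported, and well above the 10% initial impact predicted in the advice to governments.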