28 results for SUMMER MORTALITY
Abstract:
Bemisia tabaci biotype B, commonly known as the silverleaf whitefly (SLW), is an alien species that invaded Australia in the mid-1990s. This paper reports on the invasion ecology of SLW and the factors that are likely to have contributed to the first outbreak of this major pest in an Australian cotton cropping system. Population dynamics of SLW within whitefly-susceptible crop (cotton and cucurbit) and non-crop vegetation (sowthistle, Sonchus spp.) components of the cropping system were investigated over four consecutive growing seasons (September-June), 2001/02-2004/05, in the Emerald Irrigation Area (EIA) of Queensland, Australia. Based on fixed geo-referenced sampling sites, variation in the spatial and temporal abundance of SLW within each system component was quantified to provide baseline data for the development of ecologically sustainable pest management strategies. Parasitism of large (3rd and 4th instar) SLW nymphs by native aphelinid wasps was quantified to determine the potential for natural control of SLW populations. Following the initial outbreak in 2001/02, SLW abundance declined and stabilised over the next three seasons. The population dynamics of SLW are characterised by inter-seasonal population cycling between the non-crop (weed) and cotton components of the EIA cropping system. Cotton was the largest sink for, and source of, SLW during the study period. Over-wintering populations dispersed from weed host plants to cotton in spring, followed by a reverse dispersal in late summer and autumn to broad-leaved crops and weeds. A basic spatial source-sink analysis showed that SLW adult and nymph densities were higher throughout spring in cotton fields closer to over-wintering weed sources than in fields further away. Cucurbit fields were not significant sources of SLW and did not appear to contribute significantly to the regional population dynamics of the pest. Substantial parasitism of nymphal stages throughout the study period indicates that native parasitoid species and other natural enemies are important sources of SLW mortality in Australian cotton production systems. Weather conditions and the use of broad-spectrum insecticides for pest control are implicated in the initial outbreak and ongoing pest status of SLW in the region.
Abstract:
The present study set out to test, through field and simulation studies, the hypothesis that incorporating short-term summer legumes, particularly the annual legume lablab (Lablab purpureus cv. Highworth), into a fallow-wheat cropping system will improve overall economic and environmental benefits in south-west Queensland. Replicated large-plot experiments were established at five commercial properties using the properties' own machinery, and two smaller plot experiments were established at two intensively researched sites (Roma and St George). A detailed study of various other biennial and perennial summer forage legumes in rotation with wheat, as influenced by phosphorus (P) supply (10 and 40 kg P/ha), was also carried out at the two research sites. The other legumes were lucerne (Medicago sativa), butterfly pea (Clitoria ternatea) and burgundy bean (Macroptilium bracteatum). After the legumes, spring wheat (Triticum aestivum) was sown into the legume stubble. The annual lablab produced the highest forage yield, whereas germination, establishment and production of the other biennial and perennial legumes were poor, particularly in the red soil at St George. At the commercial sites, only lablab-wheat rotations were tested, with an increased supply of P in the subsurface soil (20 kg P/ha). The lablab grown at the commercial sites yielded between 3 and 6 t/ha of forage over 2-3 month periods, whereas the following wheat crop, with no applied fertiliser, yielded between 0.5 and 2.5 t/ha. The wheat following lablab yielded 30% less, on average, than the wheat following fallow, yet its profitability was slightly higher because of the greater costs associated with fallow management. The profitability of the lablab-wheat phase was determined after accounting for input costs and the additional costs of fallow management and in-crop herbicide applications in a fallow-wheat system. The economic and environmental benefits of forage lablab and wheat cropping were also assessed through simulations over a long-term climatic pattern using economic (PreCAPS) and biophysical (Agricultural Production Systems sIMulator, APSIM) decision support models. Analysis of the long-term rainfall pattern (70% in summer and 30% in winter) and the simulation studies indicated that ~50% of the time a wheat crop would not be planted, or would fail to produce a profitable crop (grain yield less than 1 t/ha), because of low and unreliable winter rainfall. In contrast, forage lablab in summer would produce a profitable crop (forage yield of more than 3 t/ha) ~90% of the time. Only 14 wheat crops (of 26 growing seasons, i.e. 54%) were profitable, compared with 22 forage lablab crops (of 25 seasons, i.e. 90%). Opportunistic double-cropping of lablab in summer and wheat in winter is also viable, and profitable in 50% of years. Simulation studies also indicated that opportunistic lablab-wheat cropping can reduce potential runoff + drainage by more than 40% in the Roma region, leading to improved economic and environmental benefits.
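A minimal sketch of the profitability-frequency calculation described above, using the thresholds quoted in the abstract (grain yield of at least 1 t/ha for wheat, forage yield of at least 3 t/ha for lablab). The yield series are illustrative placeholders, not PreCAPS or APSIM output.

```python
# Classify simulated seasons as "profitable" using the abstract's thresholds.
# Yield lists are hypothetical stand-ins for simulated seasonal output (t/ha);
# a zero entry marks a season in which the crop was not planted.
wheat_yields = [0.4, 1.8, 0.0, 2.1, 0.9, 1.5]
lablab_yields = [3.2, 4.5, 2.8, 5.1, 3.9, 4.2]

def profitable_fraction(yields, threshold_t_ha):
    """Fraction of planted seasons at or above the profitability threshold."""
    grown = [y for y in yields if y > 0]
    hits = sum(1 for y in grown if y >= threshold_t_ha)
    return hits / len(grown)

print(f"wheat profitable in {profitable_fraction(wheat_yields, 1.0):.0%} of planted seasons")
print(f"lablab profitable in {profitable_fraction(lablab_yields, 3.0):.0%} of planted seasons")
```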
Abstract:
Bellyache bush (Jatropha gossypiifolia L.) is an invasive weed that has the potential to greatly reduce biodiversity and pasture productivity in northern Australia's rangelands. This paper reports an approach to developing best-practice options for controlling medium to dense infestations of bellyache bush using combinations of control methods. The efficacy of five single treatments, comprising foliar spraying, slashing, stick raking, burning and doing nothing (control), was compared against 15 combinations of these treatments over 4 successive years. Treatments were evaluated using several attributes, including plant mortality, changes in population demographics, seedling recruitment, pasture yield and cost of treatment. Foliar spraying once each year for 4 years proved the most cost-effective control strategy, with no bellyache bush plants recorded at the end of the study. Single applications of slashing, stick raking and, to a lesser extent, burning, when followed up with foliar spraying, also significantly reduced densities of bellyache bush and changed the population from a growing one to a declining one. Total experimental cost estimates over the 4 years for treatments in which burning, stick raking, foliar spraying and slashing were followed with foliar spraying were AU$408, AU$584, AU$802 and AU$789/ha, respectively. The maximum pasture yield of 5.4 t/ha occurred with repeated foliar spraying. This study recommends treatment combinations using foliar spraying, either alone or as a follow-up to slashing, stick raking or burning, as best-practice options after consideration of the level of control, changes in pasture yield and cost-effectiveness.
Abstract:
In 2001 a scoping study (Phase I) was commissioned to determine and prioritise the weed issues of cropping systems with dryland cotton. The main findings were that the weed flora was diverse, cropping systems complex, and weeds had a major financial and economic impact. Phase II, 'Best weed management strategies for dryland cropping systems with cotton', focused on improved management of the key weeds: bladder ketmia, sowthistle, fleabane, barnyard grass and liverseed grass. In Phase III, 'Improving management of summer weeds in dryland cropping systems with cotton', more information on the seed-bank dynamics of key weeds was gained in six pot and field studies. The studies found that these characteristics differed between species, and even between climates in the case of bladder ketmia. Species such as sowthistle, fleabane and barnyard grass emerged predominantly from the surface soil. Sweet summer grass was also in this category, but also had a significant proportion emerging from 5 cm depth. Bladder ketmia in central Queensland emerged mainly from the top 2 cm, whereas in southern Queensland it emerged mainly from 5 cm. Liverseed grass had its highest emergence from 5 cm below the surface. In all cases the persistence of seed increased with increasing soil depth. Fleabane was also found to be sensitive to soil type, with no seedlings emerging in the self-mulching black vertisol soil. A strategic tillage trial showed that burial of fleabane seed, using a disc or chisel plough, to a depth of greater than 2 cm can significantly reduce subsequent fleabane emergence. In contrast, tillage increased barnyard grass emergence and tended to decrease persistence. This research showed that weed management plans cannot be applied as a blanket across all weed species; rather, they need to be targeted at each main weed species. This project has also increased knowledge of how to manage fleabane, drawing on eight experiments: one in wheat, two in sorghum, one in cotton and three in fallow on the double knock tactic. For summer crops, the best option is to apply a highly effective fallow treatment prior to sowing the crop. For winter crops, the strategy is the integration of competitive crops and a residual herbicide, followed by a knockdown to control survivors. This project further explored the usefulness of the double knock tactic for weed control and preventing seed set. Two field experiments and one pot experiment showed that this tactic was highly effective for fleabane control. Paraquat products provided good control when they followed glyphosate. When 2,4-D was added in a tank mix with glyphosate and followed by paraquat products, 99-100% control was achieved in all cases. The ideal follow-up time for paraquat products after glyphosate was 5-7 days. The preferred follow-up times for 2,4-D after glyphosate were the same day or one day later. The pot trial, which compared a population from a cropping field with previous glyphosate exposure and a population from a non-cropping area with no previous glyphosate exposure, showed that previous herbicide exposure affected the response of fleabane to herbicidal control measures. The web-based brochure on managing fleabane has been updated. Knowledge on the management of summer grasses and safe use of residual herbicides was derived from eight field and pot experiments. Residual grass and broadleaf weed control was excellent with atrazine pre-plant and at-planting treatments, provided rain was received within a short interval after application.
Highly effective fallow treatments (cultivation and double knock) not only gave excellent grass control in the fallow but also gave very good control in the following cotton. In the five re-cropping experiments, there were no adverse impacts on cotton from atrazine, metolachlor, metsulfuron and chlorsulfuron residues following use in previous sorghum, wheat and fallows. However, imazapic residues did reduce cotton growth. The development of strategies to reduce the heavy reliance on glyphosate in our cropping systems, and therefore minimise the risk of glyphosate resistance developing, was a key factor in the research undertaken. This work included identifying suitable tactics for summer grass control, such as the double knock of glyphosate followed by paraquat, and tillage. Research on fleabane also concentrated on minimising emergence through tillage and applying the double knock tactic. Our studies have shown that these strategies can be used to prevent seed set with the goal of driving down the seed bank. Utilisation of the strategies will also reduce the reliance on glyphosate, and therefore the risk of glyphosate resistance developing in our cropping systems. Information from this research, including ecological and management data collected from an additional eight paddock-monitoring sites, was also incorporated into the Weeds CRC seed-bank model "Weed Seed Wizard", which will be able to predict the impact of different management options on weed populations in cotton and grain farming systems. Extensive communication activities were undertaken throughout this project to ensure adoption of the new strategies for improved weed management and reduced risk of glyphosate resistance.
Abstract:
To undertake a scoping study to identify the major issues in weed management in dryland cropping systems with cotton.
Abstract:
Dairy farms located in the subtropical cereal belt of Australia rely on winter and summer cereal crops, rather than pastures, for their forage base. Crops are mostly established in tilled seedbeds, and the system is vulnerable to fertility decline and water erosion, particularly over summer fallows. Field studies were conducted over 5 years on contrasting soil types, a Vertosol and a Sodosol, in the 650-mm annual-rainfall zone to evaluate the benefits of a modified cropping program on forage productivity and the soil-resource base. Growing forage sorghum as a double-crop with oats increased total mean annual production over that of winter sole-crop systems by 40% and 100% on the Vertosol and Sodosol sites, respectively. However, mean annual winter crop yield was halved and overall forage quality was lower. Ninety per cent of the variation in winter crop yield was attributable to fallow and in-crop rainfall. Replacing forage sorghum with the annual legume lablab reduced fertiliser nitrogen (N) requirements and increased forage N concentration, but reduced overall annual yield. Compared with sole-cropped oats, double-cropping reduced the risk of erosion by extending the duration of soil water deficits and increasing the time ground was under plant cover. When grown as a sole crop, well fertilised forage sorghum achieved a mean annual cumulative yield of 9.64 and 6.05 t DM/ha on the Vertosol and Sodosol, respectively, about twice that of sole-cropped oats. Forage sorghum established using zero-tillage practices and fertilised at 175 kg N/ha per crop achieved a significantly higher yield and forage N concentration than did the industry-standard forage sorghum (conventional tillage and 55 kg N/ha per crop) on the Vertosol, but not on the Sodosol. On the Vertosol, mean annual yield increased from 5.65 to 9.64 t DM/ha (33 kg DM/kg N fertiliser applied above the base rate); the difference in response between the two sites was attributed to soil type and fertiliser history. Changing both tillage practices and N-fertiliser rate had no effect on fallow water-storage efficiency but did improve fallow ground cover. When forage sorghum grown as a sole crop was replaced with lablab in 3 of the 5 years, overall forage N concentration increased significantly, and on the Vertosol, yield and soil nitrate-N reserves also increased significantly relative to industry-standard sorghum. All forage systems maintained or increased the concentration of soil nitrate-N (0-1.2-m soil layer) over the course of the study. Relative to sole-cropped oats, the alternative forage systems were generally beneficial to the concentration of surface-soil (0-0.1 m) organic carbon, and systems that included sorghum showed most promise for increasing soil organic carbon concentration. We conclude that an emphasis on double- or summer sole-cropping rather than winter sole-cropping will advantage both farm productivity and the soil-resource base.
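The fertiliser-response figure quoted above (33 kg DM/kg N) can be reproduced directly from the yields and N rates reported in the abstract:

```python
# Vertosol mean annual yield rose from 5.65 to 9.64 t DM/ha when the N rate
# rose from the industry-standard 55 to 175 kg N/ha per crop.
yield_gain_kg = (9.64 - 5.65) * 1000   # t DM/ha -> kg DM/ha
extra_n_kg = 175 - 55                  # kg N/ha applied above the base rate
print(yield_gain_kg / extra_n_kg)      # ~33 kg DM per kg N, as reported
```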
Abstract:
In current simulation packages for the management of extensive beef-cattle enterprises, the relationships for the key biological rates (namely conception and mortality) are quite rudimentary. To better estimate these relationships, cohort-level data covering 17 100 cow-years from six sites across northern Australia were collated and analysed. Further validation data, from 7200 cow-years, were then used to test these relationships. Analytical problems included incomplete and non-standardised data, considerable levels of correlation among the 'independent' variables, and the close similarity of alternate possible models. In addition to formal statistical analyses of these data, the theoretical equations for predicting mortality and conception rates in the current simulation models were reviewed, and then reparameterised and recalibrated where appropriate. The final models explained up to 80% of the variation in the data. These are now proposed as more accurate and useful models to be used in the prediction of biological rates in simulation studies for northern Australia.
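A hedged sketch of the kind of recalibration described: a logistic relationship between an environmental covariate and annual conception rate, fitted to cohort-level counts by maximum likelihood. The covariate ("pasture_index"), the functional form and the data are assumptions for illustration; the abstract does not specify the reparameterised equations.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical cohort-level data: a pasture covariate, cows per cohort,
# and the number of conceptions observed in each cohort.
pasture_index = np.array([0.2, 0.4, 0.6, 0.8, 1.0])
cows = np.array([400, 350, 500, 450, 300])
conceived = np.array([120, 160, 310, 330, 255])

def neg_log_lik(params):
    a, b = params
    p = 1.0 / (1.0 + np.exp(-(a + b * pasture_index)))  # logistic conception rate
    # Binomial log-likelihood summed over cohorts
    return -np.sum(conceived * np.log(p) + (cows - conceived) * np.log(1 - p))

fit = minimize(neg_log_lik, x0=[0.0, 1.0])
print("intercept, slope:", fit.x)
```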
Abstract:
Data from 9296 calves born to 2078 dams over 9 years across five sites were used to investigate factors associated with calf mortality for tropically adapted breeds (Brahman and Tropical Composite) recorded in extensive production systems, using multivariate logistic regression. The average calf mortality pre-weaning was 9.5% of calves born, varying from 1.5% to 41% across all sites and years. In total, 67% of calves that died did so within a week of their birth, with cause of death most frequently recorded as unknown. The major factors significantly (P < 0.05) associated with mortality for potentially large numbers of calves included the specific production environment represented by site-year, low calf birthweight (more so than high birthweight) and horn status at branding. Almost all calf deaths post-branding (assessed from n = 8348 calves) occurred in calves that were dehorned, totalling 2.1% of dehorned calves and 15.9% of all calf deaths recorded. Breed effects on calf mortality were primarily the result of breed differences in calf birthweight and, to a lesser extent, large teat size of cows; however, differences in other breed characteristics could be important. Twin births and calves assisted at birth had a very high risk of mortality, but <1% of calves were twins and few calves were assisted at birth. Conversely, it could not be established how many calves would have benefitted from assistance at birth. Cow age group and outcome from the previous season were also associated with current calf mortality; maiden or young cows (<4 years old) had increased calf losses overall. More mature cows with a previous outcome of calf loss were also more likely to have another calf loss in the subsequent year, and this should be considered for culling decisions. Closer attention to the management of younger cows is warranted to improve calf survival.
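As a hedged illustration of the risk comparisons behind findings such as "twin births had a very high risk of mortality", the sketch below computes an odds ratio from a 2 x 2 table; the counts are hypothetical, not the study's data.

```python
# Hypothetical 2 x 2 table: mortality outcomes for twin vs singleton calves.
died_twin, alive_twin = 30, 50
died_single, alive_single = 850, 8366

odds_twin = died_twin / alive_twin
odds_single = died_single / alive_single
print("odds ratio:", odds_twin / odds_single)  # >1 means twins at higher risk
```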
Abstract:
The root-lesion nematode, Pratylenchus thornei, can reduce wheat yields by >50%. Although this nematode has a broad host range, crop rotation can be an effective tool for its management if the host status of crops and cultivars is known. The summer crops grown in the northern grain region of Australia are poorly characterised for their resistance to P. thornei and their role in crop sequencing to improve wheat yields. In a 4-year field experiment, we prepared plots with high or low populations of P. thornei by growing susceptible wheat or partially resistant canaryseed (Phalaris canariensis); after an 11-month, weed-free fallow, several cultivars of eight summer crops were grown. Following another 15-month, weed-free fallow, P. thornei-intolerant wheat cv. Strzelecki was grown. Populations of P. thornei were determined to 150 cm soil depth throughout the experiment. When two partially resistant crops were grown in succession, e.g. canaryseed followed by panicum (Setaria italica), P. thornei populations were <739/kg soil and subsequent wheat yields were 3245 kg/ha. In contrast, after two susceptible crops, e.g. wheat followed by soybean, P. thornei populations were 10 850/kg soil and subsequent wheat yields were just 1383 kg/ha. Regression analysis showed a linear, negative response of wheat biomass and grain yield with increasing P. thornei populations and a predicted loss of 77% for biomass and 62% for grain yield. The best predictor of wheat yield loss was P. thornei populations at 0-90 cm soil depth. Crop rotation can be used to reduce P. thornei populations and increase wheat yield, with greatest gains being made following two partially resistant crops grown sequentially.
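A minimal sketch of the negative linear yield response described above, anchored on the two treatment outcomes the abstract reports (739 P. thornei/kg soil giving 3245 kg/ha of wheat, and 10 850/kg giving 1383 kg/ha). A real analysis would regress on all plot-level data rather than two points.

```python
import numpy as np

pop = np.array([739, 10850])        # P. thornei per kg soil (from the abstract)
yield_kg = np.array([3245, 1383])   # wheat grain yield, kg/ha (from the abstract)

# Fit a straight line through the two reported outcomes.
slope, intercept = np.polyfit(pop, yield_kg, 1)
loss_pct = 100 * (1 - yield_kg.min() / intercept)
print(f"yield declines ~{abs(slope) * 1000:.0f} kg/ha per 1000 nematodes/kg soil")
print(f"predicted loss at the highest population: ~{loss_pct:.0f}%")
```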
Abstract:
A disease outbreak investigation was conducted in western Queensland to investigate a rare suspected outbreak of pyrrolizidine alkaloid (PA) toxicosis in horses. Thirty-five of 132 horses depastured on five properties on the Mitchell grass plains of western Queensland died in the first six months of 2010. Clinicopathological findings were consistent with PA toxicosis. A local variety of Crotalaria medicaginea was the only hepatotoxic plant found growing on the affected properties. Pathology reports and the departure and arrival dates of two brood mares provided evidence of a pre-wet-season exposure period. All five affected properties experienced a very dry spring and early summer preceded by a large summer wet season. The outbreak was characterised as a point epidemic, with a sudden peak of deaths in March followed by mortalities steadily declining until the end of June. The estimated morbidity rate (serum GGT > 50 IU/L) was 76%. Average crude mortality was 27%, but was higher in young horses (67%) and brood mares (44%). Logistic regression analysis showed that young horses, brood mares and horses grazing denuded pastures in December were most strongly associated with dying, whereas those fed hay and/or grain-based supplements were less likely to die. This is the first detailed study of an outbreak of PA toxicosis in central western Queensland and the first to provide evidence that environmental determinants were associated with mortality, that the critical exposure period was towards the end of the dry season, that supplementary feeding is protective, and that denuded pastures and the horses' physiological protein requirement are risk factors.
Abstract:
This is a retrospective study of 38 cases of infection by Babesia macropus, associated with a syndrome of anaemia and debility in hand-reared or free-ranging juvenile eastern grey kangaroos (Macropus giganteus) from coastal New South Wales and south-eastern Queensland between 1995 and 2013. Infection with B. macropus is recorded for the first time in agile wallabies (Macropus agilis) from far north Queensland. Animals in which B. macropus infection was considered to be the primary cause of morbidity had marked anaemia, lethargy and neurological signs, and often died. In these cases, parasitised erythrocytes were few or undetectable in peripheral blood samples but were sequestered in large numbers within small vessels of visceral organs, particularly the kidney and brain, in association with distinctive clusters of extraerythrocytic organisms. Initial identification of this piroplasm in peripheral blood smears, tissue impression smears and histological sections was confirmed using transmission electron microscopy and molecular analysis. Samples of kidney, brain or blood were tested using PCR and DNA sequencing of the 18S ribosomal RNA and heat shock protein 70 genes, using primers specific for piroplasms. The piroplasm detected in these samples had 100% sequence identity in the 18S rRNA region with the recently described Babesia macropus from two eastern grey kangaroos from New South Wales and Queensland, and a high degree of similarity to an unnamed Babesia sp. recently detected in three woylies (Bettongia penicillata ogilbyi) in Western Australia.
Abstract:
Fisheries management agencies around the world collect age data for the purpose of assessing the status of the natural resources in their jurisdiction. Estimates of mortality rates are key information for assessing the sustainability of fish stock exploitation. Contrary to medical research or manufacturing, where survival analysis is routinely applied to estimate failure rates, survival analysis has seldom been applied in fisheries stock assessment, despite the similar purposes of these fields of applied statistics. In this paper, we developed hazard functions to model the dynamics of an exploited fish population. These functions were used to estimate all parameters necessary for stock assessment (including natural and fishing mortality rates as well as gear selectivity) by maximum likelihood, using age data from a sample of the catch. This novel application of survival analysis to fisheries stock assessment was tested by Monte Carlo simulation to verify that it provided unbiased estimates of the relevant quantities. The method was applied to data from the Queensland (Australia) sea mullet (Mugil cephalus) commercial fishery collected between 2007 and 2014. It provided, for the first time, an estimate of the natural mortality affecting this stock: 0.22 ± 0.08 year⁻¹.
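A hedged sketch of the survival-analysis idea: the total hazard at age a is Z(a) = M + F*s(a) (natural mortality plus gear-selected fishing mortality), and the age composition of the catch is proportional to F*s(a) times survival to age a, so M, F and selectivity can be recovered from an age sample by maximum likelihood. The logistic selectivity form, parameter values and simulated ages are assumptions for illustration, not the paper's specification.

```python
import numpy as np
from scipy.optimize import minimize

ages = np.arange(1, 16)  # age classes observed in the catch sample

def catch_age_probs(M, F, a50, delta):
    s = 1.0 / (1.0 + np.exp(-(ages - a50) / delta))       # logistic gear selectivity
    Z = M + F * s                                          # total hazard by age class
    survival = np.exp(-np.concatenate(([0.0], np.cumsum(Z[:-1]))))  # to start of class
    p = F * s * survival                                   # relative catch at age
    return p / p.sum()

# Simulate an age sample from known parameters, then try to recover them.
rng = np.random.default_rng(1)
true_p = catch_age_probs(M=0.22, F=0.35, a50=3.0, delta=0.5)
counts = rng.multinomial(2000, true_p)

def neg_log_lik(params):
    M, F, a50, delta = params
    if M <= 0 or F <= 0 or delta <= 0:
        return np.inf
    return -np.sum(counts * np.log(catch_age_probs(M, F, a50, delta)))

fit = minimize(neg_log_lik, x0=[0.3, 0.3, 2.5, 1.0], method="Nelder-Mead")
print("M, F, a50, delta:", np.round(fit.x, 3))
```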
Abstract:
It is common to model the dynamics of fisheries using natural and fishing mortality rates estimated independently in two separate analyses. Fishing mortality is routinely estimated from widely available logbook data, whereas estimates of natural mortality have often required more specific, less frequently available data. However, in the case of the fishery for brown tiger prawn (Penaeus esculentus) in Moreton Bay, both fishing and natural mortality rates have been estimated from logbook data. The present work extended the fishing mortality model to incorporate an eco-physiological response of tiger prawns to temperature, and allowed recruitment timing to vary from year to year. These ecological characteristics of the dynamics of this fishery were ignored in the separate model that estimated natural mortality. Therefore, we propose to estimate both natural and fishing mortality rates within a single model using a consistent set of hypotheses. This approach was applied to Moreton Bay brown tiger prawn data collected between 1990 and 2010. Natural mortality was estimated by maximum likelihood to be 0.032 ± 0.002 week⁻¹, approximately 30% lower than the fixed value used in previous models of this fishery (0.045 week⁻¹).
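A hedged sketch of how natural and fishing mortality can sit in a single weekly model, with an assumed exponential temperature effect on catchability and the standard Baranov catch equation; apart from the abstract's M = 0.032 week⁻¹, all parameter values and the temperature/effort series are illustrative, not the Moreton Bay estimates.

```python
import numpy as np

M = 0.032                     # natural mortality, week^-1 (abstract's estimate)
q0, gamma = 1e-4, 0.05        # hypothetical catchability and temperature coefficient
temps = np.array([22.0, 24.0, 26.0, 27.0, 25.0])   # deg C, illustrative weekly series
effort = np.array([300, 320, 280, 310, 290])       # boat-days, illustrative

N = 1e6                       # recruits at the start of the season (illustrative)
for T, E in zip(temps, effort):
    F = q0 * np.exp(gamma * (T - 20.0)) * E             # temperature-scaled fishing mortality
    catch = N * (F / (F + M)) * (1 - np.exp(-(F + M)))  # Baranov catch equation
    N *= np.exp(-(F + M))                               # survivors to the next week
    print(f"T={T}C F={F:.4f} catch={catch:.0f} N={N:.0f}")
```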