984 results for multiculturalism in Australia
Abstract:
The leaf-tying moth Hypocosmia pyrochroma Jones (Lepidoptera: Pyralidae), a native of subtropical South America, has been introduced as a biological control agent for cat’s claw creeper, Dolichandra unguis-cati (L.) Lohman (Bignoniaceae), in Australia and South Africa. So far there has been no evidence of its field establishment in either country. A narrow temperature tolerance is a potential limiting factor for the establishment of weed biological control insects in novel habitats. In this study, we evaluated the effect of seven constant temperatures (12–40 °C) on the survival and development of H. pyrochroma in temperature-controlled cabinets. Temperatures between 20 and 30 °C were the most favorable for adult survival, oviposition, egg hatching, and larval and pupal development. Adult survival (12–40 °C) and egg development (15–35 °C) showed tolerance for wider temperature ranges than oviposition, and larval and pupal development, which were all negatively affected by both high (>30 °C) and low (<20 °C) temperatures. The degree-day (DD) requirement to complete a generation was estimated as 877 above a threshold temperature of 12 °C. Based on DD requirements and an obligatory winter diapause of pupae from mid-autumn to mid-spring, the potential number of generations (egg to adult) the leaf-tying moth can complete in a year in Australia or South Africa ranges from one to three. A climate-matching model predicted that the inland regions of both Australia and South Africa are less favorable for H. pyrochroma than the coastal areas. The study suggested that H. pyrochroma is more likely to establish in the coastal areas of Australia, where most of the cat’s claw creeper infestations occur, than in South Africa, where most of the cat’s claw creeper infestations are inland.
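The degree-day bookkeeping described in this abstract can be sketched in a few lines. This is a hedged illustration: the 12 °C threshold and the 877 DD per generation are taken from the abstract, but the season length and daily temperatures below are invented for the example.

```python
# Degree-day sketch for H. pyrochroma development.
# The threshold (12 °C) and DD requirement (877) are from the study;
# the temperature series is hypothetical.
THRESHOLD_C = 12.0        # lower developmental threshold temperature
DD_PER_GENERATION = 877   # degree-days needed to complete one generation

def degree_days(daily_mean_temps):
    """Accumulate degree-days above the threshold (simple average method)."""
    return sum(max(0.0, t - THRESHOLD_C) for t in daily_mean_temps)

def generations(daily_mean_temps):
    """Whole generations completable with the accumulated heat units."""
    return int(degree_days(daily_mean_temps) // DD_PER_GENERATION)

# Illustrative: a ~180-day active season (outside the obligatory pupal
# winter diapause from mid-autumn to mid-spring) averaging 24 °C.
season = [24.0] * 180
print(degree_days(season))   # 2160.0 accumulated degree-days
print(generations(season))   # 2 full generations, within the 1-3 range reported
```

A cooler or shorter season drops the count toward one generation; a warmer coastal season pushes it toward three, consistent with the one-to-three range the abstract reports.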
Abstract:
The aim of this investigation was to determine the persistence of biofilm-associated antibiotic resistance developed by methicillin-sensitive Staphylococcus aureus (MSSA) of different capsular types during biofilm formation. Because of the superiority of the tissue culture plate (TCP) method over the Congo Red Agar (CRA) method for measuring biofilm formation, the TCP method was used to determine the persistence of the antibiotic resistance developed by the isolates in biofilms. The antibiotic resistance was found to persist for 3–4 weeks after propagation as planktonic subcultures. Interestingly, some strains even developed resistance to vancomycin and/or teicoplanin. However, no association of either biofilm formation or persistent antibiotic resistance with the major capsular phenotype was observed. These observations highlight the potential significance of (a) determining the antibiograms of S. aureus subcultured from biofilms developed in vitro using the TCP method, as well as from planktonic cultures, for formulation of an optimal therapeutic strategy, and (b) continuing to identify predominant non-capsular antigens contributing to biofilm formation, regardless of capsular phenotype, for the development of an effective, potentially broad-spectrum vaccine for prevention of bovine mastitis caused by S. aureus.
Abstract:
More than 1200 wheat and 120 barley experiments conducted in Australia to examine yield responses to applied nitrogen (N) fertiliser are contained in a national database of field crops nutrient research (BFDC National Database). The yield responses are accompanied by various pre-plant soil test data to quantify plant-available N and other indicators of soil fertility status or mineralisable N. A web application (BFDC Interrogator), developed to access the database, enables construction of calibrations between relative crop yield (RY = (Y0/Ymax) × 100) and N soil test value. In this paper we report the critical soil test values for 90% RY (CV90) and the associated critical ranges (CR90, defined as the 70% confidence interval around that CV90) derived from analysis of various subsets of these winter cereal experiments. Experimental programs were conducted throughout Australia’s main grain-production regions in different eras, starting from the 1960s in Queensland through to Victoria during the 2000s. Improved management practices adopted during the period were reflected in increasing potential yields with research era: average Ymax increased from 2.2 t/ha in Queensland in the 1960s and 1970s, to 3.4 t/ha in South Australia (SA) in the 1980s, to 4.3 t/ha in New South Wales (NSW) in the 1990s, and 4.2 t/ha in Victoria in the 2000s. Various sampling depths (0.1–1.2 m) and methods of quantifying available N (nitrate-N or mineral-N) from pre-planting soil samples were used and provided useful guides to the need for supplementary N. The most regionally consistent relationships were established using nitrate-N (kg/ha) in the top 0.6 m of the soil profile, with regional and seasonal variation in CV90 largely accounted for through impacts on experimental Ymax. The CV90 for nitrate-N within the top 0.6 m of the soil profile for wheat crops increased from 36 to 110 kg nitrate-N/ha as Ymax increased over the range 1 to >5 t/ha.
Apparent variation in CV90 with seasonal moisture availability was entirely consistent with impacts on experimental Ymax. Further analyses of wheat trials with available grain protein (~45% of all experiments) established that grain yield and not grain N content was the major driver of crop N demand and CV90. Subsets of data explored the impact of crop management practices such as crop rotation or fallow length on both pre-planting profile mineral-N and CV90. Analyses showed that while management practices influenced profile mineral-N at planting and the likelihood and size of yield response to applied N fertiliser, they had no significant impact on CV90. A level of risk is involved with the use of pre-plant testing to determine the need for supplementary N application in all Australian dryland systems. In southern and western regions, where crop performance is based almost entirely on in-crop rainfall, this risk is offset by the management opportunity to split N applications during crop growth in response to changing crop yield potential. In northern cropping systems, where stored soil moisture at sowing is indicative of minimum yield potential, erratic winter rainfall increases uncertainty about actual yield potential as well as reducing the opportunity for effective in-season applications.
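The critical value CV90 is the soil test value at which relative yield reaches 90%. A minimal sketch of reading it off a calibration curve by linear interpolation follows; the calibration points are invented for illustration, not BFDC data, and the database's actual fitted response functions are more sophisticated than a piecewise-linear curve.

```python
# Hypothetical calibration points: (nitrate-N in top 0.6 m, kg/ha; relative yield %).
# These numbers are illustrative only, not from the BFDC National Database.
calibration = [(10, 55.0), (30, 78.0), (60, 88.0), (90, 93.0), (120, 96.0)]

def critical_value(points, target_ry=90.0):
    """Linearly interpolate the soil test value where relative yield hits target_ry."""
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if y0 <= target_ry <= y1:
            return x0 + (x1 - x0) * (target_ry - y0) / (y1 - y0)
    return None  # target outside the observed response range

print(critical_value(calibration))  # soil test value at 90% RY for this curve
```

For the points above, 90% RY falls between the 60 and 90 kg/ha observations, giving a CV90 of 72 kg nitrate-N/ha, which sits inside the 36-110 kg/ha span the abstract reports across Ymax levels.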
Abstract:
Soil testing is the most widely used tool to predict the need for fertiliser phosphorus (P) application to crops. This study examined factors affecting critical soil P concentrations and confidence intervals for wheat and barley grown in Australian soils by interrogating validated data from 1777 wheat and 150 barley field treatment series now held in the BFDC National Database. To narrow the confidence intervals associated with estimated critical P concentrations, filters for yield, crop stress, or low pH were applied. Once treatment series with low yield (<1 t/ha), severe crop stress, or pHCaCl2 <4.3 were screened out, critical concentrations were relatively insensitive to wheat yield (>1 t/ha). There was a clear increase in critical P concentration from early trials, when full tillage was common, to those conducted in 1995–2011, a period of rapid shift towards adoption of minimum tillage. For wheat, critical Colwell-P concentrations associated with 90 or 95% of maximum yield varied among Australian Soil Classification (ASC) Orders and Sub-orders: Calcarosol, Chromosol, Kandosol, Sodosol, Tenosol and Vertosol. Critical Colwell-P concentrations at 90% of maximum relative yield ranged from 15 mg/kg (Grey Vertosol) to 47 mg/kg (Supracalcic Calcarosol), with other soils having values in the range 19–27 mg/kg. Distinctive differences in critical P concentrations were evident among Sub-orders of Calcarosols, Chromosols, Sodosols, Tenosols, and Vertosols, possibly due to differences in soil properties related to P sorption. However, insufficient data were available to develop a relationship between P buffering index (PBI) and critical P concentration. In general, there was no evidence that critical concentrations for barley would differ from those for wheat on the same soils.
Significant knowledge gaps that need to be filled to improve the relevance and reliability of soil P testing for winter cereals include: the lack of data for oats; the paucity of treatment series reflecting current cropping practices, especially minimum tillage; and inadequate metadata on soil texture, pH, growing season rainfall, gravel content, and PBI. The critical concentrations determined illustrate the importance of recent experimental data and of soil type, but also provide examples of interrogation pathways into the BFDC National Database to extract locally relevant critical P concentrations for guiding P fertiliser decision-making in wheat and barley.
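The soil-type-specific critical values reported in this abstract lend themselves to a simple lookup. The sketch below uses only the figures quoted above; the fallback range for unlisted soils is the 19-27 mg/kg span the abstract gives for "other soils", and the structure itself is a hypothetical simplification of the database interrogation.

```python
# Critical Colwell-P (mg/kg) at 90% of maximum relative yield, from the values
# reported in the abstract; other Sub-orders are omitted for brevity.
CRITICAL_COLWELL_P = {
    "Grey Vertosol": 15,
    "Supracalcic Calcarosol": 47,
}
DEFAULT_RANGE = (19, 27)  # most other soils reportedly fell in this range

def critical_p(soil_type):
    """Return the critical Colwell-P, or the typical range when not listed."""
    return CRITICAL_COLWELL_P.get(soil_type, DEFAULT_RANGE)

print(critical_p("Grey Vertosol"))    # 15
print(critical_p("Brown Chromosol"))  # falls back to the (19, 27) range
```

A real interrogation would also filter on the yield, stress, and pH criteria the abstract describes before estimating a critical value for a soil class.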
Abstract:
Globally, wild or feral pigs Sus scrofa are a widespread and important pest. Mitigation of their impacts requires a sound understanding of those impacts and the benefits and limitations of different management approaches. Here, we review published and unpublished studies to provide a synopsis of contemporary understanding of wild pig impacts and management in Australia, and to identify important shortcomings. Wild pigs can have important impacts on biodiversity values, ecosystem functioning and agricultural production. However, many of these impacts remain poorly described, and therefore, difficult to manage effectively. Many impacts are highly variable, and innovative experimental and analytical approaches may be necessary to elucidate them. Most contemporary management programmes use lethal techniques to attempt to reduce pig densities, but it is often unclear how effective they are at reducing damage. We conclude that greater integration of experimental approaches into wild pig management programmes is necessary to improve our understanding of wild pig impacts, and our ability to manage those impacts effectively and efficiently.
Abstract:
Persistent organic pollutants (POPs), including polybrominated diphenyl ethers (PBDEs), organochlorine pesticides (OCPs), and polychlorinated biphenyls (PCBs), persist in the environment, bioaccumulate, and pose a risk of adverse human health effects. Exposure assessments based on modeling of existing intake data typically underestimate the concentrations of these chemicals in infants. This study aimed to determine concentrations of POPs in infant foods, assess exposure via dietary intake, and compare this to historical exposure. Fruit purees, meat and vegetables, dairy desserts, cereals and jelly foods (n = 33) purchased in 2013 in Brisbane, Australia were analyzed. Concentrations ranged up to 95 pg/g fw for OCPs and PCBs and up to 32 pg/g fw for PBDEs, with most analytes below the limit of detection. Daily intake depends on the type and quantity of foods consumed: consumption of a 140 g meal would result in intakes ranging from 0 up to 4.2, 4.4 and 13.3 ng/day for OCPs, PBDEs and PCBs, respectively. PBDEs were detected in 3/33 samples, OCPs in 9/33 samples and PCBs in 13/33 samples. These results indicate that, for infants in Australia, dietary intake (in contrast to dust and breast milk) contributes only a minor component of total exposure.
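The meal-intake figures are a direct unit conversion from food concentration to mass ingested. A minimal sketch, using the 95 pg/g maximum concentration and the 140 g meal size from the abstract:

```python
# Daily intake from a single meal: intake (ng) = concentration (pg/g) x mass (g) / 1000.
MEAL_G = 140  # meal size used in the study's intake scenario

def intake_ng(conc_pg_per_g, meal_g=MEAL_G):
    """Convert a food concentration (pg/g fresh weight) to ng ingested per meal."""
    return conc_pg_per_g * meal_g / 1000.0

# The 95 pg/g fw maximum reproduces the 13.3 ng/day upper PCB intake figure.
print(round(intake_ng(95), 1))  # 13.3
```

With most analytes below the limit of detection, intake for a given meal can equally be zero, which is why the abstract reports ranges starting at 0 ng/day.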
Abstract:
Australia has an aging population and workforce, and policy makers and organizations increasingly encourage older workers to remain on the job longer, even beyond traditional retirement age. After a brief review of important demographic and political developments, we introduce the 8 articles included in this special issue on work, aging, and retirement in Australia. The articles include an overview of the Australian retirement income system, 6 articles reporting quantitative analyses of cross-sectional and longitudinal data from large samples of Australian workers and retirees, and a qualitative study that analyzes interviews with human resource managers. Overall, the articles demonstrate that research on work, aging, and retirement in Australia is flourishing, sophisticated, and diverse in both content and methodology. We close with a brief review of topics and research questions related to work, aging, and retirement that remain to be addressed in the Australian context in future research.
Abstract:
The roles and epidemiological features of tick-borne protozoans in wildlife are not well elucidated. Babesia spp. are documented in many domestic animals, including cattle, horses, pigs, dogs and cats. Three cases affecting eastern grey kangaroos are described. The kangaroos exhibited neurological signs, depression and marked anaemia, and microscopic examination of blood smears revealed intraerythrocytic piroplasms. One to seven intraerythrocytic spherical, oval, pyriform and irregularly shaped parasites consistent with Babesia spp. were seen in the blood smears, and the percentage of infected erythrocytes was estimated at approximately 7% in each case. Data suggest that the tick vector for this kangaroo Babesia sp. is a Haemaphysalis species. In Case 2, ultrastructural examination of erythrocytes in the renal capillaries showed parasites resembling Babesia spp., with 18 of 33 erythrocytes infected. DNA sequencing of the amplified 18S rDNA confirmed that the observed intraerythrocytic piroplasms belong to the genus Babesia. The phylogenetic position of this new kangaroo Babesia sp. (de novo Babesia macropus), as a sister species to the new Australian woylie Babesia sp., suggests a close affinity to the described Afro-Eurasian species Babesia orientalis and Babesia occultans, and perhaps a common ancestor for the Babesia of kangaroos. © 2012 Australian Society for Parasitology.
Abstract:
Prochloraz, as Sportak at 450 g a.i./L, is registered for the control of postharvest diseases of papaya in Australia. A project in far north Queensland in 2011 examined the use patterns of postharvest treatments, measured prochloraz concentrations in treatment dips and sprays, and evaluated the efficacy of prochloraz at 0, 20, 40, 55 and 70 ml/100 L, fludioxonil as Scholar at 260 ml/100 L, and azoxystrobin as Amistar at 50 ml/100 L. Packing-shed use of Sportak varied, with recycled and stored solutions showing depletion of the active ingredient. Measured prochloraz in solution was highly pH dependent, with nominal solution concentrations recovered only when the pH was less than 3.0. In the fungicide efficacy trial, Sportak at the label rate of 55 ml/100 L provided more effective disease control than fludioxonil or azoxystrobin. The trial also suggested that fruit from older trees had a higher disease incidence than fruit from young trees.
Abstract:
A recent report to the Australian Government identified concerns about Australia's capacity to respond to a medium to large outbreak of foot-and-mouth disease (FMD). To assess the resources required, the AusSpread disease simulation model was used to develop a plausible outbreak scenario that included 62 infected premises in five different states at the time of detection, 28 days after the disease entered the first property in Victoria. Movements of infected animals and/or contaminated product/equipment led to smaller outbreaks in NSW, Queensland, South Australia and Tasmania. With unlimited staff resources, the outbreak was eradicated in 63 days with 54 infected premises and a 98% chance of eradication within 3 months. This unconstrained response was estimated to involve 2724 personnel. Because unlimited personnel were considered unrealistic, the course of the outbreak was modelled under three levels of staffing, and the probability of achieving eradication within 3 or 6 months of introduction was determined. Under the baseline staffing level, there was only a 16% probability that the outbreak would be eradicated within 3 months, and a 60% probability of eradication in 6 months. Deployment of an additional 60 personnel in the first 3 weeks of the response increased the likelihood of eradication in 3 months to 68%, and in 6 months to 100%. Deployment of further personnel incrementally increased the likelihood of timely eradication and decreased the duration and size of the outbreak. Targeted use of vaccination in high-risk areas, coupled with the baseline personnel resources, increased the probability of eradication to 74% in 3 months and 100% in 6 months. This required 25 vaccination teams commencing 12 days into the control program, increasing to 50 vaccination teams 3 weeks later. Deploying an equal number of additional personnel to surveillance and infected-premises operations was equally effective in reducing the outbreak size and duration.
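Stochastic models such as AusSpread report an eradication probability as the share of simulation runs that achieve eradication by a deadline. A toy sketch of that summary step, with invented run durations rather than AusSpread output:

```python
# Toy illustration of reading an eradication probability off stochastic model
# output: the fraction of simulated outbreak durations within a deadline.
# The durations below are invented for the example.
def eradication_probability(durations_days, deadline_days):
    """Share of simulation runs eradicated on or before the deadline."""
    return sum(d <= deadline_days for d in durations_days) / len(durations_days)

runs = [63, 85, 120, 70, 95, 60, 150, 88, 72, 110]  # hypothetical run durations
print(eradication_probability(runs, 90))   # 0.6, i.e. 60% within ~3 months
```

Comparing such fractions across staffing or vaccination scenarios, run over many iterations, is how the probabilities quoted in the abstract would be contrasted.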