942 results for "Emerging trends"
Abstract:
The aim of this review is to report changes in irrigated cotton water use from research projects and on-farm practice-change programs in Australia, in relation to both plant-based and irrigation engineering disciplines. At least 80% of the Australian cotton-growing area is irrigated using gravity surface-irrigation systems. This review found that, over 23 years, cotton crops utilise 6–7 ML/ha of irrigation water, depending on the amount of seasonal rain received. The seasonal evapotranspiration of surface-irrigated crops averaged 729 mm over this period. Over the past decade, water-use productivity by Australian cotton growers has improved by 40%. This has been achieved by both yield increases and more efficient water-management systems. The whole-farm irrigation efficiency index improved from 57% to 70%, and the crop water use index is >3 kg/mm.ha, high by international standards. Yield increases over the last decade can be attributed to plant-breeding advances, the adoption of genetically modified varieties, and improved crop management. Also, there has been increased use of irrigation scheduling tools and furrow-irrigation system optimisation evaluations. This has reduced in-field deep-drainage losses. The largest loss component of the farm water balance on cotton farms is evaporation from on-farm water storages. Some farmers are changing to alternative systems such as centre pivots and lateral-move machines, and increasing numbers of these alternatives are expected. These systems can achieve considerable labour and water savings, but have significantly higher energy costs associated with water pumping and machine operation. The optimisation of interactions between water, soils, labour, carbon emissions and energy efficiency requires more research and on-farm evaluations.
Standardisation of water-use efficiency measures and improved water measurement techniques for surface irrigation are important research outcomes to enable valid irrigation benchmarks to be established and compared. Water-use performance is highly variable between cotton farmers and farming fields and across regions. Therefore, site-specific measurement is important. The range in the presented datasets indicates potential for further improvement in water-use efficiency and productivity on Australian cotton farms.
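The two headline indices in this abstract are simple ratios, which is why standardised definitions matter for benchmarking. A minimal sketch of how they are computed, using hypothetical farm figures rather than data from the review:

```python
# Two water-use efficiency indices described in the abstract.
# All input figures below are hypothetical examples, not data from the review.

def whole_farm_irrigation_efficiency(water_used_by_crop_ml, water_diverted_ml):
    """Fraction of water diverted onto the farm that the crop actually uses."""
    return water_used_by_crop_ml / water_diverted_ml

def crop_water_use_index(lint_yield_kg_per_ha, evapotranspiration_mm):
    """Lint yield per unit of seasonal crop evapotranspiration (kg/mm.ha)."""
    return lint_yield_kg_per_ha / evapotranspiration_mm

# Hypothetical farm: crops use 700 ML of the 1000 ML diverted onto the farm.
efficiency = whole_farm_irrigation_efficiency(700, 1000)  # 0.70, i.e. 70%

# Hypothetical field: 2300 kg/ha of lint from 729 mm of seasonal ET.
cwui = crop_water_use_index(2300, 729)  # ~3.15 kg/mm.ha, i.e. >3
```

Comparing such ratios across farms is only valid when the numerator and denominator are measured the same way at each site, which is the standardisation point the abstract makes.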
Abstract:
Background: Population pharmacokinetic models combined with multiple sets of age–concentration biomonitoring data facilitate back-calculation of chemical uptake rates from biomonitoring data. Objectives: We back-calculated uptake rates of PBDEs for the Australian population from multiple biomonitoring surveys (top-down) and compared them with uptake rates calculated from dietary intake estimates of PBDEs and PBDE concentrations in dust (bottom-up). Methods: Using three sets of PBDE elimination half-lives, we applied a population pharmacokinetic model to the PBDE biomonitoring data measured between 2002–2003 and 2010–2011 to derive the top-down uptake rates of four key PBDE congeners for six age groups. For the bottom-up approach, we used PBDE concentrations measured around 2005. Results: Top-down uptake rates of Σ4BDE (the sum of BDEs 47, 99, 100, and 153) varied from 7.9 to 19 ng/kg/day for toddlers and from 1.2 to 3.0 ng/kg/day for adults; in most cases, they were higher than the bottom-up uptake rates for all age groups. The discrepancy was largest for toddlers, with factors of up to 7–15 depending on the congener. Despite the different elimination half-lives of the four congeners, the age–concentration trends showed no increase in concentration with age and were similar for all congeners. Conclusions: In the bottom-up approach, PBDE uptake is underestimated; currently known pathways are not sufficient to explain measured PBDE concentrations, especially in young children. Although PBDE exposure of toddlers has declined in recent years, pre- and postnatal exposure to PBDEs has remained almost constant because the mothers' PBDE body burden has not yet decreased substantially.
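The top-down back-calculation rests on a pharmacokinetic mass balance. The study used a full population pharmacokinetic model with age-dependent parameters; the sketch below is a much simpler steady-state, one-compartment version, and the concentration, body lipid fraction and half-life are illustrative assumptions, not values from the paper:

```python
import math

def steady_state_uptake(c_lipid_ng_per_g, lipid_fraction, half_life_days):
    """Back-calculate a daily uptake rate (ng/kg body weight/day) from a
    lipid-normalised concentration, assuming a one-compartment model at
    steady state: uptake = elimination_rate * body_burden_per_kg."""
    k = math.log(2) / half_life_days  # first-order elimination rate, 1/day
    # ng/g lipid -> ng per kg body weight (1000 g lipid-equivalent per kg bw,
    # scaled by the fraction of body weight that is lipid)
    burden_per_kg_bw = c_lipid_ng_per_g * lipid_fraction * 1000.0
    return k * burden_per_kg_bw

# Illustrative adult: 10 ng/g lipid, 25% body lipid, ~3-year half-life.
uptake = steady_state_uptake(10.0, 0.25, 3 * 365)  # ~1.6 ng/kg/day
```

The illustrative result lands in the 1.2–3.0 ng/kg/day adult range reported in the abstract, showing how measured concentrations and congener half-lives jointly determine the inferred uptake.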
Abstract:
* Plant response to drought is complex, such that traits adapted to one drought type can confer a disadvantage in another. Understanding which type(s) of drought to target is of prime importance for crop improvement.
* Modelling was used to quantify seasonal drought patterns for a check variety across the Australian wheatbelt, using 123 yr of weather data for representative locations and managements. Two other genotypes were used to simulate the impact of maturity on drought pattern.
* Four major environment types summarized the variability in drought pattern over time and space. Severe stress beginning before flowering was common (44% of occurrences), with (24%) or without (20%) relief during grain filling. High variability occurred from year to year, differing with geographical region. With few exceptions, all four environment types occurred in most seasons, for each location, management system and genotype.
* Applications of such environment characterization are proposed to assist breeding and research to focus on germplasm, traits and genes of interest for target environments. The method was applied at a continental scale to highly variable environments and could be extended to other crops, to other drought-prone regions around the world, and to quantify potential changes in drought patterns under future climates.
Abstract:
The Cotton Catchment Communities Cooperative Research Centre began during a period of rapid uptake of Bollgard II® cotton, which contains genes to express two Bt proteins that control the primary pests of cotton in Australia, Helicoverpa armigera and H. punctigera. The dramatic uptake of this technology presumably resulted in strong selection pressure for resistance in Helicoverpa spp. against the Bt proteins. The discovery of higher than expected levels of resistance in both species against one of the proteins in Bollgard II® cotton (Cry2Ab) led to significant re-evaluation of the resistance management plan developed for this technology, which was a core area of research for the Cotton CRC. The uptake of Bollgard II® cotton also led to a substantial decline in pesticide applications against Helicoverpa spp. (from 10–14 to 0–3 applications per season). The low spray environment allowed some pests not controlled by the Bt proteins to emerge as more significant pests, especially sucking species such as Creontiades dilutus and Nezara viridula. A range of other minor pests have also sporadically arisen as problems. Lack of knowledge and experience with these pests created uncertainty and encouraged insecticide use, which threatened to undermine the gains made with Bollgard II® cotton. Here we chronicle the achievements of the Cotton CRC in providing the industry with new knowledge and management strategies for these pests.
Abstract:
Consumer-driven food trends are nothing new. “Organics”, gluten-free, and more recently buying “local” have all captured consumers, encouraging supermarkets around the globe and in Australia to respond. But the next emerging European food trend that may have the biggest impact on what we buy each week is “ugly food”.
Abstract:
Viruses that originate in bats may be the most notorious emerging zoonoses that spill over from wildlife into domestic animals and humans. Understanding how these infections filter through ecological systems to cause disease in humans is of profound importance to public health. Transmission of viruses from bats to humans requires a hierarchy of enabling conditions that connect the distribution of reservoir hosts, viral infection within these hosts, and exposure and susceptibility of recipient hosts. For many emerging bat viruses, spillover also requires viral shedding from bats, and survival of the virus in the environment. Focusing on Hendra virus, but also addressing Nipah virus, Ebola virus, Marburg virus and coronaviruses, we delineate this cross-species spillover dynamic from the within-host processes that drive virus excretion to land-use changes that increase interaction among species. We describe how land-use changes may affect co-occurrence and contact between bats and recipient hosts. Two hypotheses may explain temporal and spatial pulses of virus shedding in bat populations: episodic shedding from persistently infected bats or transient epidemics that occur as virus is transmitted among bat populations. Management of livestock also may affect the probability of exposure and disease. Interventions to decrease the probability of virus spillover can be implemented at multiple levels from targeting the reservoir host to managing recipient host exposure and susceptibility.
Abstract:
Wastewater analysis was used to examine prevalence and temporal trends in the use of two cathinones, methylone and mephedrone, in an urban population (>200,000 people) in South East Queensland, Australia. Wastewater samples were collected from the inlet of the sewage treatment plant that serviced the catchment from 2011 to 2013. Liquid chromatography coupled with tandem mass spectrometry was used to measure mephedrone and methylone in the wastewater samples using direct injection. Mephedrone was not detected in any samples, while methylone was detected in 45% of the samples. Daily mass loads of methylone were normalized to the population and used to evaluate methylone use in the catchment. Methylone mass loads peaked in 2012, but there was no clear temporal trend over the monitoring period. The prevalence of methylone use in the catchment was associated with the use of MDMA, the more popular analogue of methylone, as indicated by other complementary sources. Methylone use was stable in the study catchment during the monitoring period, whereas mephedrone use had been declining since its peak in 2010. More research is needed on the pharmacokinetics of emerging illicit drugs to improve the applicability of wastewater analysis in monitoring their use in the population.
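Population-normalised daily mass loads of this kind are computed from the measured concentration, the plant's daily inflow volume, and the catchment population. A minimal sketch with hypothetical values, not measurements from the study:

```python
def population_normalised_load(conc_ng_per_l, flow_l_per_day, population):
    """Daily drug mass load entering the treatment plant, expressed per
    1000 inhabitants (mg/day/1000 people) so catchments can be compared."""
    load_mg_per_day = conc_ng_per_l * flow_l_per_day / 1e6  # ng -> mg
    return load_mg_per_day / population * 1000.0

# Hypothetical day: 5 ng/L methylone in 50 ML of inflow, 200,000 residents.
load = population_normalised_load(5.0, 50_000_000, 200_000)  # 1.25 mg/day/1000 people
```

Converting such loads into consumption estimates additionally requires the excretion fraction of the parent drug, which is exactly the pharmacokinetic gap for emerging drugs that the abstract's closing sentence flags.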
Abstract:
Small firms are always vulnerable to complex technological change that may render their existing business model obsolete. This paper emphasises the need to understand how the Internet's ubiquitous World Wide Web is impacting on their operating environments. Consideration of evolutionary theory and the absorptive capacity construct provides the foundation for a discussion of how learning and discovery take place within individuals, firms and the environments they interact with. Small firms, we argue, face difficulties identifying which routines and competencies are best aligned with the seemingly invisible dominant designs that support the pursuit of new enterprise in web-impacted environments. We argue that such difficulties largely relate to an inability to acquire external knowledge and a subsequent reliance on existing internal selection processes that may reinforce the known at the expense of the unknown. The paper concludes with consideration of how managers can overcome the expected difficulties through the development of internal routines that support the continual search, evaluation and acquisition of specific external knowledge.
Abstract:
Radiant spring frosts occurring during reproductive developmental stages can result in catastrophic yield loss for wheat producers. To better understand the spatial and temporal variability of frost, the occurrence and impact of frost events on rain-fed wheat production was estimated across the Australian wheatbelt for 1957–2013 using a 0.05° gridded weather data set. Simulated yield outcomes at 60 key locations were compared with those for virtual genotypes with different levels of frost tolerance. Over the last six decades, more frost events, a later last frost day, and a significant increase in frost impact on yield were found in certain regions of the Australian wheatbelt, in particular in the South-East and West. Increasing trends in frost-related yield losses were simulated in regions where no significant trend of frost occurrence was observed, due to higher mean temperatures accelerating crop development and causing sensitive post-heading stages to occur earlier, during the frost risk period. Simulations indicated that with frost-tolerant lines the mean national yield could be improved by up to 20% through (i) reduced frost damage (~10% improvement) and (ii) the ability to use earlier sowing dates (adding a further 10% improvement). In the simulations, genotypes with an improved frost tolerance to temperatures 1 °C lower than the current 0 °C reference provided substantial benefit in most cropping regions, while greater tolerance (to 3 °C lower temperatures) brought further benefits in the East. The results indicate that breeding for improved reproductive frost tolerance should remain a priority for the Australian wheat industry, despite warming climates.
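The frost statistics behind such an analysis reduce to scanning a daily minimum-temperature series against a threshold. A sketch using the abstract's 0 °C reference and an invented one-week series (real analyses would run over full seasons of gridded data):

```python
def frost_events(daily_tmin_c, threshold_c=0.0):
    """Count frost days: days whose minimum temperature is at or below
    the threshold (the abstract uses a 0 degC reference)."""
    return sum(1 for t in daily_tmin_c if t <= threshold_c)

def last_frost_index(daily_tmin_c, threshold_c=0.0):
    """Index of the last frost day in the series, or None if frost-free.
    A later last frost day extends the risk period for post-heading stages."""
    last = None
    for i, t in enumerate(daily_tmin_c):
        if t <= threshold_c:
            last = i
    return last

# Invented week of daily minima (degC), not study data.
season = [3.1, -0.5, 1.2, -1.8, 0.0, 4.6, 2.2]
n_frosts = frost_events(season)       # 3 days at or below 0 degC
last_day = last_frost_index(season)   # index 4 (the 0.0 degC day)
```

Lowering `threshold_c` by 1 °C mimics the improved-tolerance genotypes in the simulations: fewer days then count as damaging frosts.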
Abstract:
Wheat is at peak quality soon after harvest. Subsequently, diverse biota use wheat as a resource in storage, including insects and mycotoxin-producing fungi. Transportation networks for stored grain are crucial to food security and provide a model system for an analysis of the population structure, evolution, and dispersal of biota in networks. We evaluated the structure of rail networks for grain transport in the United States and Eastern Australia to identify the shortest paths for the anthropogenic dispersal of pests and mycotoxins, as well as the major sources, sinks, and bridges for movement. We found important differences in the risk profile in these two countries and identified priority control points for sampling, detection, and management. An understanding of these key locations and roles within the network is a new type of basic research result in postharvest science and will provide insights for the integrated pest management of high-risk subpopulations, such as pesticide-resistant insect pests.
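Shortest paths through a rail network of this kind can be found with a plain breadth-first search over an adjacency list. The toy network and node names below are hypothetical, not locations from the study:

```python
from collections import deque

# Toy grain rail network as an adjacency list; node names are invented.
rail = {
    "FarmA": ["Silo1"],
    "Silo1": ["FarmA", "Hub", "Silo2"],
    "Silo2": ["Silo1", "Hub"],
    "Hub":   ["Silo1", "Silo2", "Port"],
    "Port":  ["Hub"],
}

def shortest_path(graph, start, goal):
    """Breadth-first search: the path with the fewest rail links between
    two nodes, i.e. the most direct anthropogenic dispersal route a pest
    or mycotoxin-contaminated load could take."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # goal unreachable from start

route = shortest_path(rail, "FarmA", "Port")  # FarmA -> Silo1 -> Hub -> Port
```

In this toy graph every route from "FarmA" to "Port" passes through "Hub", which is the kind of bridge node the study flags as a priority control point for sampling and management.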
Abstract:
Background: The development of a horse vaccine against Hendra virus has been hailed as a good example of a One Health approach to the control of human disease. Although there is little doubt that this is true, it is clear from the underwhelming uptake of the vaccine by horse owners to date (approximately 10%) that realisation of a One Health approach requires more than just a scientific solution. As emerging infectious diseases may often be linked to the development and implementation of novel vaccines, this presentation will discuss factors influencing their uptake, using Hendra virus in Australia as a case study. Methods: This presentation will draw on data collected from the Horse owners and Hendra virus: A Longitudinal cohort study To Evaluate Risk (HHALTER) study. The HHALTER study is a mixed methods research study comprising a two-year survey-based longitudinal cohort study and a qualitative interview study with horse owners in Australia. The HHALTER study has investigated and tracked changes in a broad range of issues around early uptake of vaccination, horse owner uptake of other recommended disease risk mitigation strategies, and attitudes to government policy and disease response. Interviews provide further insights into attitudes towards risk and decision-making in relation to vaccine uptake. A combination of quantitative and qualitative data analysis will be reported. Results: Data collected from more than 1100 horse owners shortly after vaccine introduction indicated that vaccine uptake and intention to vaccinate were associated with a number of risk perception factors and financial cost factors. In addition, concerns about side effects and veterinarians refusing to treat unvaccinated horses were linked to uptake.
Across the study period, vaccine uptake in the study cohort increased to more than 50%; however, concerns around side effects, equine performance and breeding impacts, delays to full vaccine approvals, and attempts to mandate vaccination by horse associations and event organisers have all affected acceptance. Conclusion: Despite being provided with a safe and effective vaccine for Hendra virus that can protect horses and break the transmission cycle of the virus to humans, Australian horse owners have been reluctant to commit to it. General issues pertinent to novel vaccines, combined with challenges in the implementation of the vaccine, have led to mistrust and misconception among some horse owners. Moreover, factors such as cost, booster dose schedules, complexities around perceived risk, and ulterior motives attributed to veterinarians have only served to polarise attitudes to vaccine acceptance.
Abstract:
This project describes how Streptococcus agalactiae can be transmitted experimentally in Queensland grouper. This research also furthers understanding of the relatedness between Australian S. agalactiae strains from animals and humans. Additionally, it has produced diagnostic tools for Australian State Veterinary Laboratories and Universities, which will assist in State and National aquatic animal disease detection, surveillance, disease monitoring and reporting.
Abstract:
Vertebrate fauna was studied over 10 years following revegetation of a Eucalyptus tereticornis ecosystem on former agricultural land. We compared four vegetation types: remnant forest, plantings of a mix of native tree species on cleared land, natural regeneration of partially cleared land after livestock removal, and cleared pasture land with scattered paddock trees managed for livestock production. Pasture differed significantly from remnant in both bird and nonbird fauna. Although 10 years of ecosystem restoration is relatively short term in the restoration process, in this time bird assemblages in plantings and natural regeneration had diverged significantly from pasture, but still differed significantly from remnant. After 10 years, 70 and 66% of the total vertebrate species found in remnant had been recorded in plantings and natural regeneration, respectively. Although the fauna assemblages within plantings and natural regeneration were tracking toward those of remnant, significant differences in fauna between plantings and natural regeneration indicated community development along different restoration pathways. Because natural regeneration contained more mature trees (dbh > 30 cm), native shrub species, and coarse woody debris than plantings from the beginning of the study, these features possibly encouraged different fauna to the revegetation areas from the outset. The ability of plantings and natural regeneration to transition to the remnant state will be governed by a number of factors that were significant in the analyses, including shrub cover, herbaceous biomass, tree hollows, time since fire, and landscape condition. Both active and passive restoration produced significant change from the cleared state in the short term.