292 results for Total reducing sugars


Relevance: 20.00%

Abstract:

In January 2011, Brisbane, Australia, experienced a major river flooding event. We aimed to investigate its effects on air quality and assess the role of prompt cleaning activities in reducing the airborne exposure risk. A comprehensive, multi-parameter indoor and outdoor measurement campaign was conducted in 41 residential houses, 2 and 6 months after the flood. The median indoor air concentrations of supermicrometer particle number (PN), PM10, fungi and bacteria 2 months after the flood were comparable to those previously measured in Brisbane. These were 2.88 p cm-3, 15 µg m-3, 804 cfu m-3 and 177 cfu m-3 for flood-affected houses (AFH), and 2.74 p cm-3, 15 µg m-3, 547 cfu m-3 and 167 cfu m-3 for non-affected houses (NFH), respectively. The I/O (indoor/outdoor) ratios of these pollutants were 1.08, 1.38, 0.74 and 1.76 for AFH and 1.03, 1.32, 0.83 and 2.17 for NFH, respectively. The average total elemental load (including transition metals) in indoor dust was 2296 ± 1328 µg m-2 for AFH and 1454 ± 678 µg m-2 for NFH. In general, the differences between AFH and NFH were not statistically significant, implying the absence of a measurable effect on air quality from the flood. We postulate that this was due to the very swift and effective cleaning of the flooded houses by 60,000 volunteers. Among the various cleaning methods, the use of both detergent and bleach was the most efficient at controlling indoor bacteria. All cleaning methods were equally effective for indoor fungi. This study provides quantitative evidence of the significant impact of immediate post-flood cleaning on mitigating the effects of flooding on indoor bioaerosol contamination and other pollutants.
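The I/O ratios reported above are simple quotients of paired indoor and outdoor concentrations. A minimal sketch, where the outdoor fungi median of 1086 cfu m-3 is an assumption back-calculated from the reported indoor value and ratio:

```python
def io_ratio(indoor, outdoor):
    """Indoor/outdoor concentration ratio for a pollutant.

    Values below 1 suggest outdoor-dominated sources; values above 1
    suggest indoor sources or accumulation.
    """
    if outdoor == 0:
        raise ValueError("outdoor concentration must be non-zero")
    return indoor / outdoor

# Median indoor fungi of 804 cfu/m^3 against an assumed outdoor median
# of 1086 cfu/m^3 reproduces the reported AFH I/O ratio of ~0.74.
fungi_ratio = io_ratio(804, 1086)
```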

Relevance: 20.00%

Abstract:

Level crossing risk continues to be a significant safety concern for rail operations around the world. Over the last decade or so, a third of railway-related fatalities in Australia occurred as a direct result of collisions between road and rail vehicles. Importantly, nearly half of these collisions occurred at railway level crossings with no active protection, such as flashing lights or boom barriers. Current practice is to upgrade level crossings that have no active protection. However, the total number of level crossings across Australia exceeds 23,500, and targeting the proportion of these that are considered high risk (e.g. public crossings with passive controls) would cost in excess of AU$3.25 billion based on equipment, installation and commissioning costs of warning devices that are currently type approved. Low-cost level crossing warning devices provide a potentially effective control for reducing risk; however, over the last decade, significant barriers and legal issues in both Australia and the US have forestalled their adoption. These devices are designed to have significantly lower lifecycle costs than traditional warning devices. They often make use of alternative technologies for train detection, wireless connectivity and solar energy supply. This paper describes the barriers that have been encountered in the adoption of these devices in Australia, including the challenges associated with: (1) determining requisite safety levels for such devices; (2) legal issues relating to the duty of care obligations of railway operators; and (3) issues of tort liability around the use of less than fail-safe equipment.
This paper provides an overview of a comprehensive safety justification that was developed as part of a project funded by a collaborative rail research initiative established by the Australian government, and describes the conceptual framework and processes being used to justify adoption. The paper summarises key points from peer review and discusses prospective barriers that may need to be overcome for future adoption. A successful outcome from this process would result in the development of a guideline for decision-making, providing a precedent for adopting low-cost level crossing warning devices in other parts of the world. The framework described in this paper is also relevant to the review and adoption of analogous technologies in rail and other safety-critical industries.
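The AU$3.25 billion figure above is a straightforward scale-up of per-crossing costs. A back-of-envelope sketch; the number of high-risk passive crossings (6,500) and the per-crossing installed cost (AU$500,000) are assumptions chosen only to be consistent with the quoted total:

```python
def total_upgrade_cost(n_crossings, unit_cost):
    """Total cost of fitting active warning devices to n_crossings,
    where unit_cost covers equipment, installation and commissioning."""
    return n_crossings * unit_cost

# Assumed: ~6,500 high-risk passive crossings at ~AU$500,000 each,
# consistent with the 'in excess of AU$3.25 billion' estimate.
cost = total_upgrade_cost(6_500, 500_000)
```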

Relevance: 20.00%

Abstract:

The occurrence of extreme water level events along low-lying, highly populated and/or developed coastlines can lead to devastating impacts on coastal infrastructure. It is therefore very important that the probabilities of extreme water levels are accurately evaluated to inform flood and coastal management and future planning. The aim of this study was to provide estimates of present-day extreme total water level exceedance probabilities around the whole coastline of Australia, arising from combinations of mean sea level, astronomical tide and storm surges generated by both extra-tropical and tropical storms, but exclusive of surface gravity waves. The study was undertaken in two main stages. In the first stage, a high-resolution (~10 km along the coast) depth-averaged hydrodynamic model was configured for the whole coastline of Australia using the Danish Hydraulic Institute's MIKE 21 modelling suite. The model was forced with astronomical tidal levels, derived from the TPXO7.2 global tidal model, and meteorological fields, from the US National Centers for Environmental Prediction's global reanalysis, to generate a 61-year (1949 to 2009) hindcast of water levels. This model output was validated against measurements from 30 tide gauge sites around Australia with long records. At each of the model grid points located around the coast, time series of annual maxima and the several highest water levels for each year were derived from the multi-decadal water level hindcast and fitted to extreme value distributions to estimate exceedance probabilities. Stage 1 provided a reliable estimate of the present-day total water level exceedance probabilities around southern Australia, which is mainly impacted by extra-tropical storms.
However, because the meteorological fields used to force the hydrodynamic model only weakly include the effects of tropical cyclones, the resultant water level exceedance probabilities were underestimated around western, northern and north-eastern Australia at higher return periods. Even if the resolution of the meteorological forcing were adequate to represent tropical cyclone-induced surges, multi-decadal periods yield insufficient instances of tropical cyclones to enable the use of traditional extreme value extrapolation techniques. Therefore, in the second stage of the study, a statistical model of tropical cyclone tracks and central pressures was developed using historic observations. This model was then used to generate synthetic events representing 10,000 years of cyclone activity for the Australian region, with characteristics based on the observed tropical cyclones over the last ~40 years. Wind and pressure fields, derived from these synthetic events using analytical profile models, were used to drive the hydrodynamic model to predict the associated storm surge response. A random time period during the tropical cyclone season was chosen, and astronomical tidal forcing for this period was included to account for non-linear interactions between the tidal and surge components. For each model grid point around the coast, annual maximum total levels for these synthetic events were calculated and used to estimate exceedance probabilities. The exceedance probabilities from stages 1 and 2 were then combined to provide a single estimate of present-day extreme water level probabilities around the whole coastline of Australia.
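The Stage 1 extreme-value step (fitting distributions to annual maxima and reading off exceedance levels) can be sketched as follows. This is an illustration only, with synthetic annual maxima standing in for the 61-year hindcast; the study does not state which extreme value family it used, so a GEV fit is an assumption:

```python
import numpy as np
from scipy import stats

# Synthetic 61-year series of annual maximum water levels (metres);
# the real study derived these from a hydrodynamic hindcast.
rng = np.random.default_rng(0)
annual_maxima = stats.genextreme.rvs(c=-0.1, loc=1.5, scale=0.2,
                                     size=61, random_state=rng)

# Fit GEV parameters by maximum likelihood.
c, loc, scale = stats.genextreme.fit(annual_maxima)

def return_level(c, loc, scale, return_period_years):
    """Water level exceeded on average once per return period."""
    return stats.genextreme.isf(1.0 / return_period_years, c, loc, scale)

level_100yr = return_level(c, loc, scale, 100)
```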

Relevance: 20.00%

Abstract:

Most pan stages in Australian factories use only five or six batch pans for high grade massecuite production and operate these in a fairly rigid repeating production schedule. Commonly, some of the pans are of large dropping capacity, e.g. 150 to 240 t. Because of the relatively small number and large sizes of the pans, steam consumption varies widely through the schedule, often by ±30% about the mean value. Large fluctuations in steam consumption have implications for the steam generation and condensate management of the factory, and for the evaporators when bleed vapour is used. The objectives of a project to develop a supervisory control system for a pan stage include (a) reducing the average steam consumption and (b) reducing the variation in steam consumption. The operation of each of the high grade pans within the schedule at Macknade Mill was analysed to determine the idle (or buffer) time, time allocations for essential but unproductive operations (e.g. pan turn-round, charging, slow ramping up of steam rates on pan start), and productive time, i.e. the time during boil-on of liquor and molasses feed. Empirical models were developed for each high grade pan on the stage to define the interdependence of the production rate and the evaporation rate for the different phases of each pan’s cycle. The data were analysed in a spreadsheet model to try to reduce and smooth the total steam consumption. This paper reports on the methodology developed in the model and the results of the investigations for the pan stage at Macknade Mill. It was found that the operation of the schedule severely restricted the ability to reduce the average steam consumption and smooth the steam flows. While longer cycle times provide increased flexibility, the steam consumption profile was changed only slightly.
The ability to cut massecuite on the run among pans, or the use of a high grade seed vessel, would assist in reducing the average steam consumption and the magnitude of the variations in steam flow.
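The scheduling question above can be illustrated with a toy model: total steam demand is the sum of the individual pan profiles, and shifting a pan's start time within the repeating schedule changes the peak and spread of that total. The profiles below are invented; the actual study used empirical pan-by-pan evaporation models:

```python
import numpy as np

def total_steam(profiles, offsets, period):
    """Sum per-pan steam profiles over one schedule period, each pan
    shifted by its start offset (in schedule steps)."""
    total = np.zeros(period)
    for profile, offset in zip(profiles, offsets):
        total += np.roll(np.resize(profile, period), offset)
    return total

period = 12  # schedule steps in one repeating cycle (illustrative)
# Invented steam-demand profile for one pan cycle (t/h per step).
pan_a = np.array([40, 40, 30, 20, 10, 0, 0, 0, 0, 0, 0, 0], dtype=float)
pan_b = pan_a.copy()

aligned = total_steam([pan_a, pan_b], [0, 0], period)    # pans in phase
staggered = total_steam([pan_a, pan_b], [0, 6], period)  # starts offset
```

Staggering the two identical pans leaves the total steam used per cycle unchanged but halves the peak demand, which is the kind of smoothing the supervisory control system targets.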

Relevance: 20.00%

Abstract:

Objective To summarise how costs and health benefits will change with the adoption of total laparoscopic hysterectomy compared to total abdominal hysterectomy for the treatment of early stage endometrial cancer. Design Cost-effectiveness modelling using information from a randomised controlled trial. Participants Two hypothetical modelled cohorts of 1000 individuals undergoing total laparoscopic hysterectomy and total abdominal hysterectomy. Outcome measures Surgery costs; hospital bed days used; total healthcare costs; quality-adjusted life years; and net monetary benefits. Results For 1000 individuals receiving total laparoscopic hysterectomy, surgery costs were $509 575 higher, 3548 fewer hospital bed days were used, and total health services costs were reduced by $3 746 221. There were 39.13 more quality-adjusted life years for a 5 year period following surgery. Conclusions The adoption of total laparoscopic hysterectomy is almost certainly a good decision for health services policy makers. There is a 100% probability that it will be cost saving to health services, an 86.8% probability that it will increase health benefits and a 99.5% chance that it returns net monetary benefits greater than zero.
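The net-monetary-benefit bookkeeping behind a conclusion like this is: NMB = willingness-to-pay × incremental QALYs − incremental cost. A minimal sketch using the per-1000-patient deltas from the abstract; the $50,000/QALY willingness-to-pay threshold is an assumption, not a figure from the study:

```python
def net_monetary_benefit(delta_qalys, delta_cost, wtp_per_qaly):
    """NMB > 0 means the intervention is worth adopting at this
    willingness-to-pay (WTP) threshold."""
    return wtp_per_qaly * delta_qalys - delta_cost

delta_qalys = 39.13        # extra QALYs over 5 years, per 1000 patients
delta_cost = -3_746_221.0  # total health service costs fell (a saving)
# WTP of $50,000/QALY is an assumed threshold for illustration.
nmb = net_monetary_benefit(delta_qalys, delta_cost, wtp_per_qaly=50_000)
```

Because the intervention both saves money and adds QALYs, the NMB is positive at any non-negative willingness-to-pay.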

Relevance: 20.00%

Abstract:

The world of construction is changing, and so too are the expectations of stakeholders regarding strategies for adapting existing resources (people, equipment and finances), processes and tools to the evolving needs of the industry. Building Information Modelling (BIM) is a data-rich, digital approach for representing building information required for design and construction. BIM tools play a crucial role in, and are instrumental to, current approaches by industry stakeholders aimed at harnessing the power of a single information repository for improved project delivery and maintenance. Yet building specifications, which document information on material quality and workmanship requirements, remain distinctly separate from the model information typically represented in BIM models. BIM adoption for building design, construction and maintenance is an industry-wide strategy aimed at addressing such concerns about information fragmentation. However, to effectively reduce inefficiencies due to fragmentation, BIM models require the crucial building information contained in specifications. This paper profiles some specification tools which have been used in industry as a means of bridging the BIM-specifications divide. We analyse the distinction between current attempts at integrating BIM and specifications and our approach, which utilises rich specification information embedded within objects in a product library as a method for improving the quality of information contained in BIM objects at various levels of model development.

Relevance: 20.00%

Abstract:

Traffic incidents are key contributors to non-recurrent congestion, potentially generating significant delay. It is important to understand the factors that influence the duration of incidents so that effective mitigation strategies can be implemented. To identify and quantify the effects of influential factors, a methodology for studying total incident duration based on historical data from an ‘integrated database’ is proposed. Incident duration models are developed using a selected freeway segment in the South East Queensland (Australia) network. The models include incident detection and recovery time as components of incident duration. A hazard-based duration modelling approach is applied to model incident duration as a function of a variety of factors that influence traffic incident duration. Parametric accelerated failure time survival models are developed to capture heterogeneity as a function of explanatory variables, with both fixed and random parameter specifications. The analysis reveals that factors affecting incident duration include incident characteristics (severity, type, injury, medical requirements, etc.), infrastructure characteristics (roadway shoulder availability), time of day, and traffic characteristics. The results indicate that event type durations are distinctly different, thus requiring different responses to effectively clear them. Furthermore, the results highlight the presence of unobserved incident duration heterogeneity as captured by the random parameter models, suggesting that additional factors need to be considered in future modelling efforts.
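The core of a hazard-based duration model is a parametric distribution over clearance times, from which survival (and hazard) functions are read. A minimal sketch with synthetic durations and a Weibull fit; the paper's full accelerated failure time models additionally condition on covariates, which this omits:

```python
import numpy as np
from scipy import stats

# Synthetic incident clearance times in minutes (illustrative only).
rng = np.random.default_rng(1)
durations = stats.weibull_min.rvs(c=1.4, scale=35.0, size=500,
                                  random_state=rng)

# Fit Weibull shape and scale by maximum likelihood (location pinned at 0).
shape, loc, scale = stats.weibull_min.fit(durations, floc=0)

def prob_still_active(t, shape, scale):
    """Survival function: probability an incident lasts longer than t."""
    return stats.weibull_min.sf(t, shape, scale=scale)

p30 = prob_still_active(30.0, shape, scale)
```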

Relevance: 20.00%

Abstract:

The purpose of this study is to discover the significant factors causing the bubble defect on the outsoles manufactured by the Case Company. The bubble defect occurs approximately 1.5 per cent of the time, or in 36 pairs per day. To understand this problem, experimental studies were undertaken to identify the factors, such as injector temperature and mould temperature, that affect the production of waste. The work presented in this paper comprises a review of the relevant literature on the Six Sigma DMAIC improvement process, quality control tools, and the design of experiments. After experimentation following the Six Sigma process, the defect occurred in approximately 0.5 per cent of the products, or in 12 pairs per day; this decreased the production cost from 6,120 AUD per month to 2,040 AUD per month. This research aimed to reduce the amount of waste in men’s flat outsoles; hence, the outcome of the research presented in this paper can be used as a guide for applying the appropriate process to each type of outsole.
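The reported figures are internally consistent, which is easy to check: a daily production volume of 2,400 pairs (inferred from 36 / 0.015, not stated in the abstract) links the defect rates to the daily counts, and the cost falls by the same factor of three:

```python
def defective_pairs(daily_production, defect_rate):
    """Expected defective pairs per day at a given defect rate."""
    return daily_production * defect_rate

daily_production = 2_400  # pairs/day, inferred from 36 / 0.015
before = defective_pairs(daily_production, 0.015)  # ~36 pairs/day
after = defective_pairs(daily_production, 0.005)   # ~12 pairs/day
monthly_saving = 6_120 - 2_040                     # AUD per month
```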

Relevance: 20.00%

Abstract:

Nitrous oxide emissions were monitored at three sites over a 2-year period in irrigated cotton fields in Khorezm, Uzbekistan, a region located in the arid deserts of the Aral Sea Basin. The fields were managed using different fertilizer management strategies and irrigation water regimes. N2O emissions varied widely between years, within 1 year throughout the vegetation season, and between the sites. The amount of irrigation water applied, the amount and type of N fertilizer used, and topsoil temperature had the greatest effect on these emissions. Very high N2O emissions of up to 3000 μg N2O-N m−2 h−1 were measured in the periods following N-fertilizer application in combination with irrigation events. These “emission pulses” accounted for 80–95% of the total N2O emissions between April and September and varied from 0.9 to 6.5 kg N2O-N ha−1. Emission factors (EF), uncorrected for background emission, ranged from 0.4% to 2.6% of total N applied, corresponding to an average EF of 1.48% of applied N fertilizer lost as N2O-N. This is in line with the default global average value of 1.25% of applied N used in calculations of N2O emissions by the Intergovernmental Panel on Climate Change. During the emission pulses, which were triggered by high soil moisture and high availability of mineral N, a clear diurnal pattern of N2O emissions was observed, driven by daily changes in topsoil temperature. For these periods, air sampling from 8:00 to 10:00 and from 18:00 to 20:00 was found to best represent the mean daily N2O flux rates. The wet topsoil conditions caused by irrigation favored the production of N2O from NO3− fertilizers, but not from NH4+ fertilizers, indicating that denitrification was the main process causing N2O emissions. It is therefore argued that there is scope for reducing N2O emission from irrigated cotton production, for example through the exclusive use of NH4+ fertilizers.
Advanced application and irrigation techniques such as subsurface fertilizer application, drip irrigation and fertigation may also minimize N2O emission from this regionally dominant agro-ecosystem.
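The emission factor used above is the fraction of applied fertilizer N lost as N2O-N, expressed as a percentage. A sketch; the 250 kg N/ha application rate is an assumption chosen only so that the example reproduces the reported 1.48% average:

```python
def emission_factor(n2o_n_emitted_kg_ha, n_applied_kg_ha):
    """EF (%) = N2O-N emitted / fertilizer N applied x 100,
    uncorrected for background emission."""
    return 100.0 * n2o_n_emitted_kg_ha / n_applied_kg_ha

# Assumed: 3.7 kg N2O-N/ha lost from 250 kg N/ha applied -> EF ~1.48%,
# matching the reported average.
ef = emission_factor(3.7, 250.0)
```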

Relevance: 20.00%

Abstract:

Land use and agricultural practices can make important contributions to the global source strength of atmospheric nitrous oxide (N2O) and methane (CH4). However, knowledge of gas fluxes from irrigated agriculture is very limited. From April 2005 to October 2006, a study was conducted in the Aral Sea Basin, Uzbekistan, to quantify and compare emissions of N2O and CH4 in various annual and perennial land-use systems: irrigated cotton, winter wheat and rice crops, a poplar plantation and a natural Tugai (floodplain) forest. In the annual systems, average N2O emissions ranged from 10 to 150 μg N2O-N m−2 h−1, with the highest N2O emissions in the cotton fields, covering a similar range to previous studies of irrigated cropping systems. Emission factors (uncorrected for background emission), used to determine the fertilizer-induced N2O emission as a percentage of N fertilizer applied, ranged from 0.2% to 2.6%. Seasonal variations in N2O emissions were principally controlled by fertilization and irrigation management. Pulses of N2O emissions occurred after concomitant N-fertilizer application and irrigation. The unfertilized poplar plantation showed high N2O emissions over the entire study period (30 μg N2O-N m−2 h−1), whereas only negligible fluxes of N2O (<2 μg N2O-N m−2 h−1) occurred in the Tugai. Significant CH4 fluxes were determined only from the flooded rice field; these fluxes were low, with mean flux rates of 32 mg CH4 m−2 day−1 and a low seasonal total of 35.2 kg CH4 ha−1. The global warming potential (GWP) of the N2O and CH4 fluxes was highest under rice and cotton, with seasonal values between 500 and 3000 kg CO2 eq. ha−1. The biennial cotton–wheat–rice crop rotation commonly practiced in the region would average a GWP of 2500 kg CO2 eq. ha−1 yr−1.
The analyses point out opportunities for reducing the GWP of these irrigated agricultural systems by (i) optimization of fertilization and irrigation practices and (ii) conversion of annual cropping systems into perennial forest plantations, especially on less profitable, marginal lands.
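Combining N2O and CH4 fluxes into a single GWP figure means weighting each gas by its CO2-equivalence factor. A sketch; the 100-year factors used here (298 for N2O, 25 for CH4, per IPCC AR4) are assumptions, since the abstract does not state which factors the study applied:

```python
N2O_GWP = 298  # kg CO2 eq. per kg N2O (assumed, IPCC AR4, 100-year)
CH4_GWP = 25   # kg CO2 eq. per kg CH4 (assumed, IPCC AR4, 100-year)

def co2_equivalent(n2o_kg_ha, ch4_kg_ha):
    """Combined GWP of seasonal N2O and CH4 fluxes, kg CO2 eq. per ha."""
    return n2o_kg_ha * N2O_GWP + ch4_kg_ha * CH4_GWP

# Example: the rice field's seasonal CH4 total of 35.2 kg/ha alone
# contributes roughly 880 kg CO2 eq./ha under these factors.
rice_ch4_gwp = co2_equivalent(0.0, 35.2)
```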

Relevance: 20.00%

Abstract:

Agriculture is responsible for a significant proportion of total anthropogenic greenhouse gas emissions (perhaps 18% globally), and therefore has the potential to contribute to efforts to reduce emissions as a means of minimising the risk of dangerous climate change. The largest contributions to emissions are attributed to ruminant methane production and to nitrous oxide from animal waste and fertilised soils. Further, livestock, including ruminants, are an important component of global and Australian food production, and there is a growing demand for animal protein sources. At the same time as governments and the community strengthen objectives to reduce greenhouse gas emissions, there are growing concerns about global food security. This paper provides an overview of a number of options for reducing methane and nitrous oxide emissions from ruminant production systems in Australia while maintaining productivity, to contribute to both objectives. Options include strategies for feed modification, animal breeding and herd management, rumen manipulation, and animal waste and fertiliser management. Using currently available strategies, some reductions in emissions can be achieved, but practical, commercially available techniques for significant reductions in methane emissions, particularly from extensive livestock production systems, will require greater time and resource investment. Decreases in the emissions intensity of these ruminant systems (i.e., the amount of emissions per unit of product, such as meat) have already been achieved. However, the technology has not yet been developed for eliminating production of methane from the rumen of cattle and sheep digesting the cellulose- and lignin-rich grasses that make up a large part of the diet of animals grazing natural pastures, particularly in arid and semi-arid grazing lands.
Nevertheless, the abatement that can be achieved will contribute significantly towards reaching greenhouse gas emissions reduction targets and research will achieve further advances.
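The distinction between total emissions and emissions intensity drawn above is simple but worth making concrete. All figures in this sketch are invented for illustration:

```python
def emissions_intensity(total_emissions_kg_co2e, product_kg):
    """kg CO2-equivalent emitted per kg of product (e.g. meat)."""
    return total_emissions_kg_co2e / product_kg

# A herd that cuts total emissions modestly while lifting output
# achieves a proportionally larger drop in emissions intensity.
before = emissions_intensity(20_000.0, 1_000.0)  # 20 kg CO2e per kg
after = emissions_intensity(18_000.0, 1_200.0)   # 15 kg CO2e per kg
```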

Relevance: 20.00%

Abstract:

Objectives The purpose of this study was to determine the relative benefit of nap and active rest breaks for reducing driver sleepiness. Methods Participants were 20 healthy young adults (20-25 years), including 8 males and 12 females. A counterbalanced within-subjects design was used such that each participant completed both conditions on separate occasions, a week apart. The effects of the countermeasures were evaluated by established physiological (EEG theta and alpha absolute power), subjective (Karolinska Sleepiness Scale), and driving performance measures (Hazard Perception Task). Participants woke at 5 am and undertook a simulated driving task for two hours; each participant then had either a 15-minute nap opportunity or a 15-minute active rest break that included 10 minutes of brisk walking, followed by another hour of simulated driving. Results The nap break reduced EEG theta and alpha absolute power and eventually reduced subjective sleepiness levels. In contrast, the active rest break did not reduce EEG theta and alpha absolute power, and these power levels eventually increased. An immediate reduction of subjective sleepiness was observed after the active rest break, with subjective sleepiness increasing during the final hour of simulated driving. No difference was found between the two breaks for hazard perception performance. Conclusions Only the nap break produced a significant reduction in physiological sleepiness. The immediate reductions of subjective sleepiness following the active rest break could leave drivers with erroneous perceptions of their sleepiness, particularly as physiological sleepiness continued to increase after the break.

Relevance: 20.00%

Abstract:

To the Editor—In a recent review article in Infection Control and Hospital Epidemiology, Umscheid et al1 summarized published data on incidence rates of catheter-associated bloodstream infection (CABSI), catheter-associated urinary tract infection (CAUTI), surgical site infection (SSI), and ventilator- associated pneumonia (VAP); estimated how many cases are preventable; and calculated the savings in hospital costs and lives that would result from preventing all preventable cases. Providing these estimates to policy makers, political leaders, and health officials helps to galvanize their support for infection prevention programs. Our concern is that important limitations of the published studies on which Umscheid and colleagues built their findings are incompletely addressed in this review. More attention needs to be drawn to the techniques applied to generate these estimates...

Relevance: 20.00%

Abstract:

This project is led by scientists in conservation decision appraisal and brings together a group of experts working across the Lake Eyre Basin (LEB). The LEB covers a sixth of Australia, with an array of globally significant natural values that are threatened by invasive plants, among other things. Managers at various levels are investing in attempts to control, contain and eradicate these invasive plant species, under severe time and resource limitations. To date there has been no basin-wide assessment of which weed management strategies and locations provide the best investments for maximising outcomes for biodiversity per unit cost. Further, there has been no assessment of the extent of ecosystem intactness that may be lost without effective invasive plant species management strategies. Given that there are insufficient resources to manage all invasive plant species everywhere, this information has the potential to improve current investment decisions. Here, we provide a prioritisation of invasive plant management strategies in the LEB. Prioritisation was based on cost-effectiveness for biodiversity benefits. We identify the key invasive plant species to target to protect ecosystem intactness across the bioregions of the LEB, the level of investment required, and the likely reduction in invasive species dominance gained per dollar spent on each strategy. Our focus is on strategies that are technically and socially feasible and that reduce the likelihood that high-impact invasive plant species will dominate native ecosystems and thereby change their form and function. The outputs of this work are designed to help guide decision-making and further planning and investment in weed management for the Basin. Experts in weed management, policy-making, community engagement, biodiversity and natural values of the Basin attended a workshop and agreed upon 12 strategies to manage invasive plants.
The strategies focused primarily on 10 weeds which were considered to have a high potential for broad, significant impacts on natural ecosystems in the next 50 years and for which feasible management strategies could be defined. Each strategy consisted of one or more supporting actions, many of which were spatially linked to IBRA (Interim Biogeographic Regionalisation for Australia) bioregions. The first strategy was an over-arching recommendation for improved mapping, information sharing, education and extension efforts in order to facilitate the more specific weed management strategies. The 10 more specific weed management strategies targeted the control and/or eradication of the following high-impact exotic plants: mesquite, parkinsonia, rubber vine, bellyache bush, cacti, mother of millions, chinee apple, athel pine and prickly acacia, as well as a separate strategy for eradicating all invasive plants from one key threatened ecological community, the Great Artesian Basin (GAB) dependent mound springs. Experts estimated the expected biodiversity benefit of each strategy as the reduction in the area that an invasive plant species is likely to dominate over a 50-year period, where dominance was defined as more than 30% coverage at a site. Costs were estimated in present-day terms over 50 years, largely during follow-up discussions after the workshop. Cost-effectiveness was then calculated for each strategy in each bioregion by dividing the average expected benefit by the average annual cost. Overall, the total cost of implementing the 12 invasive plant strategies over the next 50 years was estimated at $1.7 billion. It was estimated that implementation of these strategies would reduce invasive plant dominance by 17 million ha (a potential 32% reduction), roughly 14% of the LEB. If only Weeds of National Significance (WONS) were targeted, the total cost was estimated to be $113 million over the next 50 years.
Eradicating all invasive plant species from the Great Artesian Basin mound springs threatened ecological community was estimated to cost $2.3 million over the next 50 years. Prevention and awareness programs were another key strategy targeted across the Basin, estimated at $17.5 million in total over 50 years. The strategy of controlling, eradicating and containing buffel grass was the most expensive, at over $1.5 billion over 50 years; this strategy was estimated to result in a reduction in buffel grass dominance of a million ha in areas where the species is identified as an environmental problem. Buffel grass has been deliberately planted across the Basin for pasture production and is by far the most widely distributed exotic species. Its management is contentious, having economic value to many graziers while posing serious threats to biodiversity and to sites of high cultural and conservation interest. The strategy for containing and locally eradicating buffel grass was a challenge to cost based on expert knowledge, possibly because of the dual nature of this species as a valued pastoral grass and environmental weed. Based on our conversations with experts, it appears that control and eradication programs for this species in conservation areas are growing rapidly, and information on the most cost-effective strategies for this species will continue to develop over time. The top five most cost-effective strategies for the entire LEB were for the management of: 1) parkinsonia, 2) chinee apple, 3) mesquite, 4) rubber vine and 5) bellyache bush. Chinee apple and mother of millions are not WONS and have comparatively small populations within the semi-arid bioregions of Queensland. Experts felt that there was an opportunity to eradicate these species before they had the chance to develop into high-impact species within the LEB. Prickly acacia was estimated to have one of the highest benefits, but the costs of this strategy were also high, so it ranked 7th overall.
The buffel grass strategy was ranked lowest (10th) in terms of cost-effectiveness. The top five most cost-effective strategies within and across the bioregions were the management of: 1) parkinsonia in the Channel Country, 2) parkinsonia in the Desert Uplands, 3) mesquite in the Mitchell Grass Downs, 4) parkinsonia in the Mitchell Grass Downs, and 5) mother of millions in the Desert Uplands. Although actions for several invasive plant species like parkinsonia and prickly acacia were concentrated in the Queensland part of the LEB, the actions involved investing in containment zones to prevent the spread of these species into other states. In the NT and SA bioregions of the LEB, the management of athel pine, parkinsonia and cacti were the main strategies. While outside the scientific research goals of the study, this work highlighted a number of important incidental findings that led us to make the following recommendations for future research and implementation of weed management in the Basin:
• Ongoing stakeholder engagement, extension and participation are required to ensure this prioritisation effort has a positive impact on on-ground decision-making and planning.
• Short-term funding for weed management was identified as a major reason for the failure of current efforts; future funding therefore needs to be secure and ongoing.
• Improved mapping and information sharing are essential to implement effective weed management.
• Due to uncertainties in the outcomes and impacts of management options, strategies should be implemented as part of an adaptive management program.
The information provided in this report can be used to guide investment in controlling high-impact invasive plant species for the benefit of biodiversity conservation. We do not present a final prioritisation of invasive plant strategies for the LEB, and we have not addressed the cultural, socio-economic or spatial components necessary for an implementation plan.
Cost-effectiveness depends on the objectives used; in our case we used the intactness of ecosystems as a surrogate for expected biodiversity benefits, measured by the extent that each invasive plant species is likely to dominate in a bioregion. When other relevant factors for implementation are considered the priorities may change and some actions may not be appropriate in some locations. We present the costs, ecological benefits and cost-effectiveness of preventing, containing, reducing and eradicating the dominance of high impact invasive plants through realistic management actions over the next 50 years. In doing so, we are able to estimate the size of the weed management problem in the LEB and provide expert-based estimates of the likely outcomes and benefits of implementing weed management strategies. The priorities resulting from this work provide a prospectus for guiding further investment in management and in improving information availability.
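The prioritisation metric described above (average expected benefit divided by average annual cost, then ranked) can be sketched directly. The strategy figures below are invented for illustration and are not the study's estimates; only the ranking logic mirrors the report:

```python
def cost_effectiveness(benefit_ha, annual_cost):
    """Hectares of avoided weed dominance per dollar of annual cost."""
    return benefit_ha / annual_cost

# (expected benefit in ha over 50 years, average annual cost in $).
# Figures are hypothetical, not the report's estimates.
strategies = {
    "parkinsonia": (3_000_000, 1_500_000),
    "prickly acacia": (5_000_000, 20_000_000),
    "buffel grass": (1_000_000, 30_000_000),
}

# Rank strategies from most to least cost-effective.
ranked = sorted(strategies,
                key=lambda s: cost_effectiveness(*strategies[s]),
                reverse=True)
```

Note how this reproduces a pattern from the report: a strategy with a large benefit but very high cost (here the hypothetical prickly acacia entry) ranks below a cheaper strategy with a smaller absolute benefit.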

Relevance: 20.00%

Abstract:

While economic theory acknowledges that some features of technology (e.g., indivisibilities, economies of scale and specialization) can fundamentally violate the traditional convexity assumption, almost all empirical studies accept the convexity property on faith. In this contribution, we apply two alternative flexible production technologies to measure total factor productivity growth and test the significance of the convexity axiom using a nonparametric test of closeness between unknown distributions. Based on unique field-level data on the petroleum industry, the empirical results reveal significant differences, indicating that this production technology is most likely non-convex. Furthermore, we also show the impact of convexity on answers to traditional convergence questions in the productivity growth literature.
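The convex versus non-convex frontier distinction can be made concrete in the simplest single-input, single-output case. This sketch contrasts a convex constant-returns-to-scale (DEA-style) frontier with a non-convex free disposal hull (FDH) frontier; it is a toy illustration with invented data, not the flexible technologies or the nonparametric test used in the paper:

```python
def dea_crs_efficiency(x, y, inputs, outputs):
    """Output efficiency against the convex CRS frontier: under constant
    returns to scale the frontier is the best observed output/input ratio."""
    best_ratio = max(yo / xo for xo, yo in zip(inputs, outputs))
    return (y / x) / best_ratio

def fdh_efficiency(x, y, inputs, outputs):
    """Output efficiency against the free-disposal-hull frontier: a unit
    is only compared with observed units using no more input."""
    dominating = [yo for xo, yo in zip(inputs, outputs) if xo <= x]
    return y / max(dominating)

# Hypothetical observations: (input, output) pairs for three producers.
inputs = [1.0, 2.0, 4.0]
outputs = [1.0, 3.0, 4.0]

# The (4, 4) producer is FDH-efficient (no observed unit dominates it)
# but inefficient against the convex frontier spanned by the (2, 3) unit.
crs = dea_crs_efficiency(4.0, 4.0, inputs, outputs)
fdh = fdh_efficiency(4.0, 4.0, inputs, outputs)
```

The gap between the two scores for the same producer is exactly the kind of divergence whose statistical significance the paper tests: if convexity is an artefact of the method rather than the technology, convex frontiers systematically understate efficiency.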