859 results for cost and benefit


Relevance: 90.00%

Abstract:

This research shows that gross pollutant traps (GPTs) continue to play an important role in preventing visible street waste—gross pollutants—from contaminating the environment. The demand for these GPTs calls for stringent quality control, and this research provides a foundation for rigorously examining the devices. A novel and comprehensive testing approach to examine a dry sump GPT was developed. The GPT is designed with internal screens to capture gross pollutants—organic matter and anthropogenic litter. This device has not been previously investigated. Apart from the review of GPTs and gross pollutant data, the testing approach includes four additional aspects: field work and an historical overview of street waste/stormwater pollution; calibration of equipment; hydrodynamic studies; and gross pollutant capture/retention investigations. This work is the first comprehensive investigation of its kind and provides valuable practical information for the current research and any future work pertaining to the operation of GPTs and the management of street waste in the urban environment. Gross pollutant traps—including patented and registered designs developed by industry—have specific internal configurations and hydrodynamic separation characteristics which demand individual testing and performance assessments. Stormwater devices are usually evaluated by environmental protection agencies (EPAs), professional bodies and water research centres. In the USA, the American Society of Civil Engineers (ASCE) and the Environmental Water Resource Institute (EWRI) are examples of professional and research organisations actively involved in these evaluation/verification programs. These programs rely largely on field evaluations alone, which are limited in scope, mainly for cost and logistical reasons. In Australia, evaluation/verification programs for new devices in the stormwater industry are not well established. The current limitations in the evaluation methodologies of GPTs have been addressed in this research by establishing a new testing approach. This approach uses a combination of physical and theoretical models to examine in detail the hydrodynamic and capture/retention characteristics of the GPT. The physical model consisted of a 50% scale model GPT rig with screen blockages varying from 0 to 100%. This rig was placed in a 20 m flume, and various inlet and outflow operating conditions were modelled on observations made during the field monitoring of GPTs. Due to infrequent cleaning, the retaining screens inside the GPTs were often observed to be blocked with organic matter. Blocked screens can radically change the hydrodynamic and gross pollutant capture/retention characteristics of a GPT, as this research shows. This research involved the use of equipment, such as acoustic Doppler velocimeters (ADVs) and dye concentration (Komori) probes, which were deployed for the first time in a dry sump GPT. Hence, it was necessary to rigorously evaluate the capability and performance of these devices, particularly in the case of the custom-made Komori probes, about which little was known. The evaluation revealed that the Komori probes have a frequency response of up to 100 Hz, which is dependent upon fluid velocities, and that this was adequate to measure the relevant fluctuations of dye introduced into the GPT flow domain. This evaluation established the methodologies for the hydrodynamic measurements and the gross pollutant capture/retention experiments.
The hydrodynamic measurements consisted of point-based acoustic Doppler velocimeter (ADV) measurements, flow field particle image velocimetry (PIV) capture, head loss experiments and computational fluid dynamics (CFD) simulation. The gross pollutant capture/retention experiments included the use of anthropogenic litter components, tracer dye and custom-modified artificial gross pollutants. Anthropogenic litter was limited to tin cans, bottle caps and plastic bags, while the artificial pollutants consisted of 40 mm spheres with four different buoyancies. The hydrodynamic results led to the definition of global and local flow features. The gross pollutant capture/retention results showed that when the internal retaining screens are fully blocked, the capture/retention performance of the GPT rapidly deteriorates. The overall results showed that the GPT will operate efficiently until at least 70% of the screens are blocked, particularly at high flow rates. This important finding indicates that cleaning operations could be planned around the point at which GPT capture/retention performance deteriorates. At lower flow rates, the capture/retention performance trends were reversed. There is little difference in the poor capture/retention performance between a fully blocked GPT and a partially filled or empty GPT with 100% screen blockages. The results also revealed that the GPT is designed with an efficient high-flow bypass system to avoid upstream blockages. The capture/retention performance of the GPT at medium to high inlet flow rates is close to maximum efficiency (100%). With regard to the design appraisal of the GPT, a raised inlet offers better capture/retention performance, particularly at lower flow rates. Further design appraisals of the GPT are recommended.
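The abstract does not state the scaling law used to relate the 50% scale rig to the full-size GPT; the following is a sketch assuming Froude similarity, the usual basis for free-surface hydraulic models. With length scale ratio $\lambda = L_m/L_p = 0.5$, matching the Froude number $Fr = V/\sqrt{gL}$ between model ($m$) and prototype ($p$) gives

$$\frac{V_m}{V_p} = \lambda^{1/2} \approx 0.71, \qquad \frac{Q_m}{Q_p} = \lambda^{5/2} \approx 0.18, \qquad \frac{t_m}{t_p} = \lambda^{1/2} \approx 0.71,$$

so a prototype inflow of, say, $1\ \mathrm{m^3\,s^{-1}}$ would be reproduced in the flume at roughly $0.18\ \mathrm{m^3\,s^{-1}}$.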

Relevance: 90.00%

Abstract:

Catheter associated urinary tract infections (CAUTI) are a worldwide problem that may lead to increased patient morbidity, cost and mortality.1-3 The literature is divided on whether there are real effects from CAUTI on length of stay or mortality. Platt4 found the costs and mortality risks to be large, yet Graves et al found the opposite.5 A review of the published estimates of the extra length of stay showed results between zero and 30 days.6 The differences in estimates may have been caused by the different epidemiological methods applied. Accurately estimating the effects of CAUTI is difficult because it is a time-dependent exposure. This means that standard statistical techniques, such as matched case-control studies, tend to overestimate the increased hospital stay and mortality risk due to infection. The aim of the study was to estimate excess length of stay and mortality in an intensive care unit (ICU) due to a CAUTI, using a statistical model that accounts for the timing of infection. Data collected from ICU units in lower- and middle-income countries were used for this analysis.7,8 There has been little research for these settings, hence the need for this paper.
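The abstract does not name the model used; one standard way to respect the timing of infection is a Cox regression with infection status as a time-varying covariate, so that person-time before the CAUTI is attributed to the unexposed state. A minimal sketch using the lifelines library (the data layout and column names are hypothetical):

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Long-format ICU data: one row per patient-interval. The 'cauti' flag
# switches from 0 to 1 on the day infection is acquired, so the model
# cannot attribute pre-infection stay to the infection (the bias that
# inflates matched case-control estimates).
df = pd.DataFrame({
    "id":    [1, 1, 2, 3],
    "start": [0, 5, 0, 0],      # interval start (ICU day)
    "stop":  [5, 12, 8, 3],     # interval end (ICU day)
    "cauti": [0, 1, 0, 0],      # time-varying exposure indicator
    "event": [0, 1, 0, 1],      # death observed at interval end
})

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event",
        start_col="start", stop_col="stop")
ctv.print_summary()  # hazard ratio for 'cauti' estimates the mortality effect
```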

Relevance: 90.00%

Abstract:

The alliance project delivery method is used for approximately one third of all Australian government infrastructure projects, representing $8-$10 billion per annum. Despite its widespread use, little is known about the differences between estimated project cost and actual cost over the project lifecycle. This paper presents the findings of research into 14 Australian government alliance case studies, investigating the observed cost uplift over each project’s lifecycle. I find that significant cost uplift is likely and that this uplift is greater than that afflicting traditional delivery methods. Furthermore, most of the cost uplift occurs at a different place in the project lifecycle, namely between Business Case and Contractual Commitment.
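Cost uplift here is the percentage growth between cost estimates at successive lifecycle milestones; a minimal sketch of the calculation (the figures are invented for illustration, not drawn from the 14 case studies):

```python
# Hypothetical milestone costs for one alliance project, in A$ million.
business_case = 80.0
contractual_commitment = 104.0
actual_outturn = 110.0

def uplift(earlier: float, later: float) -> float:
    """Percentage cost growth between two milestone estimates."""
    return (later - earlier) / earlier * 100.0

# The paper locates most uplift before contract signature:
print(uplift(business_case, contractual_commitment))   # 30.0 (%)
print(uplift(contractual_commitment, actual_outturn))  # ~5.8 (%)
```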

Relevance: 90.00%

Abstract:

Optimal scheduling of voltage regulators (VRs), fixed and switched capacitors and the voltage on the customer side of the transformer (VCT), along with the optimal allocation of VRs and capacitors, is performed using a hybrid optimisation method based on discrete particle swarm optimisation and a genetic algorithm. Direct optimisation of the tap position is not appropriate since, in general, the high voltage (HV) side voltage is not known. Therefore, the tap setting can be determined from the optimal VCT once the HV side voltage is known. The objective function is composed of the distribution line loss cost, the peak power loss cost and the capacitors' and VRs' capital, operation and maintenance costs. The constraints are limits on bus voltage and feeder current, along with VR taps. The bus voltage should be maintained within the standard level and the feeder current should not exceed the feeder-rated current. The taps adjust the output voltage of the VRs to between 90 and 110% of their input voltages. For validation of the proposed method, the 18-bus IEEE system is used. The results are compared with prior publications to illustrate the benefit of the employed technique. The results also show that the lowest cost planning for the voltage profile is achieved if a combination of capacitors, VRs and VCTs is considered.
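The objective combines energy-loss, peak-loss and equipment costs under voltage, current and tap constraints; a minimal sketch of how such a fitness function might be evaluated inside the hybrid DPSO/GA loop (the cost coefficients, data layout and penalty scheme are assumptions, not taken from the paper):

```python
from dataclasses import dataclass

@dataclass
class Device:
    capital: float   # annualised capital cost, $
    o_and_m: float   # annual operation & maintenance cost, $

def annual_cost(loss_kw, level_hours, devices, violations,
                c_energy=0.06, c_peak=120.0) -> float:
    """Planning objective sketched from the abstract: line-loss energy cost
    + peak power loss cost + capacitor/VR capital and O&M costs.
    loss_kw[i] is the network loss at load level i (from a power-flow run);
    violations counts breaches of the bus-voltage, feeder-current and
    VR-tap (90-110% of input voltage) limits."""
    energy_cost = c_energy * sum(p * h for p, h in zip(loss_kw, level_hours))
    peak_cost = c_peak * max(loss_kw)
    equip_cost = sum(d.capital + d.o_and_m for d in devices)
    penalty = 1e6 * violations   # soft-constraint handling for DPSO/GA particles
    return energy_cost + peak_cost + equip_cost + penalty

# Example: three load levels, two installed devices, no limit violations.
print(annual_cost([45.0, 80.0, 120.0], [5000, 2920, 840],
                  [Device(1200, 150), Device(900, 100)], 0))
```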

Relevance: 90.00%

Abstract:

Automation technology can provide construction firms with a number of competitive advantages. Technology strategy guides a firm's approach to all technology, including automation. Engineering management educators, researchers, and construction industry professionals need an improved understanding of how technology affects results, and of how to better target investments to improve competitive performance. A more formal approach to the concept of technology strategy can benefit the construction manager in his efforts to remain competitive in increasingly hostile markets. This paper recommends consideration of five specific dimensions of technology strategy within the overall parameters of market conditions, firm capabilities and goals, and stage of technology evolution. Examples of the application of this framework in the formulation of technology strategy are provided for CAD applications, co-ordinated positioning technology and advanced falsework and formwork mechanisation to support construction field operations. Results from this continuing line of research can assist managers in making complex and difficult decisions regarding the reengineering of construction processes and the use of new construction technology, and can benefit future researchers by providing new tools for analysis. By managing technology to best suit the existing capabilities of their firm, and by addressing the market forces, engineering managers can better face the increasingly competitive environment in which they operate.

Relevance: 90.00%

Abstract:

The concept of constructability integrates the individual project functions by bringing valuable and timely construction input into the planning and design development stages. It results in significant savings in the cost and time needed to finalise infrastructure projects. However, the available constructability principles, developed by CII Australia (1993), do not cover the Operation and Maintenance (O&M) phases of projects, whilst the major cost and time in multifaceted infrastructure projects are spent in the post-occupancy stages. This paper discusses the need to extend the constructability concept by examining current O&M issues in the provision of multifaceted building projects. It highlights known O&M problems and shortcomings of building projects, as well as their causes, organised into categories. This initial categorisation is an efficient starting point for testing probable O&M issues in various cases of complex infrastructure building projects, and serves as a benchmark for developing an extended constructability model that considers the whole project life cycle rather than a specific phase. It is anticipated that the development of an extended constructability model can significantly reduce the rework, mistakes, extra cost and wasted time during the delivery stages of multifaceted building projects.

Relevance: 90.00%

Abstract:

Estimating and predicting the degradation processes of engineering assets is crucial for reducing cost and ensuring the productivity of enterprises. Assisted by modern condition monitoring (CM) technologies, most asset degradation processes can be revealed by various degradation indicators extracted from CM data. Maintenance strategies developed using these degradation indicators (i.e. condition-based maintenance) are more cost-effective, because unnecessary maintenance activities are avoided when an asset is still in a good health state. A practical difficulty in condition-based maintenance (CBM) is that degradation indicators extracted from CM data can, in most situations, only partially reveal asset health states. Underestimating this uncertainty in the relationships between degradation indicators and health states can cause excessive false alarms or failures without pre-alarms. The state space model provides an efficient approach to describing a degradation process using indicators that only partially reveal health states. However, existing state space models that describe asset degradation processes largely depend on assumptions such as discrete time, discrete state, linearity and Gaussianity. The discrete time assumption requires that failures and inspections only happen at fixed intervals. The discrete state assumption entails discretising continuous degradation indicators, which requires expert knowledge and often introduces additional errors. The linear and Gaussian assumptions are not consistent with the nonlinear and irreversible degradation processes in most engineering assets. This research proposes a Gamma-based state space model, free of the discrete time, discrete state, linear and Gaussian assumptions, to model partially observable degradation processes. Monte Carlo-based algorithms are developed to estimate model parameters and asset remaining useful lives. In addition, this research also develops a continuous state partially observable semi-Markov decision process (POSMDP) to model a degradation process that follows the Gamma-based state space model under various maintenance strategies. Optimal maintenance strategies are obtained by solving the POSMDP. Simulation studies in MATLAB were performed; case studies using data from an accelerated life test of a gearbox and from the liquefied natural gas industry were also conducted. The results show that the proposed Monte Carlo-based EM algorithm can estimate model parameters accurately. The results also show that the proposed Gamma-based state space model fits better than linear and Gaussian state space models when used to process the monotonically increasing degradation data from the accelerated life test of the gearbox. Furthermore, both the simulation studies and the case studies show that the prediction algorithm based on the Gamma-based state space model can identify the mean value and confidence interval of asset remaining useful lives accurately. In addition, the simulation study shows that the proposed maintenance strategy optimisation method based on the POSMDP is more flexible than one that assumes a predetermined strategy structure and uses renewal theory. Moreover, the simulation study also shows that the proposed maintenance optimisation method can obtain more cost-effective strategies than a recently published maintenance strategy optimisation method, by optimising the next maintenance activity and the waiting time until the next maintenance activity simultaneously.
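The thesis's own algorithms are not specified in the abstract; the following is a minimal sketch of the modelling idea, assuming a Gamma-increment latent degradation path and a lognormal partial observation, with a basic particle filter standing in for the Monte Carlo state estimation (all parameter values invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# Latent degradation grows by Gamma-distributed increments: monotone and
# non-negative, unlike a linear-Gaussian state space model. The CM
# indicator only partially reveals the state, via multiplicative noise.
ALPHA, BETA, SIGMA = 2.0, 0.5, 0.1   # shape, scale, observation noise
T, N = 50, 2000                      # time steps, particles

x = np.cumsum(rng.gamma(ALPHA, BETA, size=T))   # true degradation path
y = x * rng.lognormal(0.0, SIGMA, size=T)       # partially observed indicator

def particle_filter(obs_series):
    """Monte Carlo state estimation: propagate particles with Gamma
    increments, weight by the lognormal observation likelihood, resample."""
    particles = np.zeros(N)
    estimates = []
    for obs in obs_series:
        particles = particles + rng.gamma(ALPHA, BETA, size=N)
        w = np.exp(-0.5 * ((np.log(obs) - np.log(particles)) / SIGMA) ** 2)
        w /= w.sum()
        estimates.append(np.sum(w * particles))
        particles = particles[rng.choice(N, size=N, p=w)]   # resample
    return np.array(estimates)

print(np.abs(particle_filter(y) - x).mean())   # mean state-tracking error
```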

Relevance: 90.00%

Abstract:

A significant proportion of the cost of software development is due to software testing and maintenance. This is in part the result of the inevitable imperfections due to human error, a lack of quality during the design and coding of software, and the increasing need to reduce faults to improve customer satisfaction in a competitive marketplace. Given the cost and importance of removing errors, improvements in fault detection and removal can be of significant benefit. The earlier in the development process faults can be found, the less it costs to correct them and the less likely other faults are to develop. This research aims to make the testing process more efficient and effective by identifying those software modules most likely to contain faults, allowing testing efforts to be carefully targeted. This is done with machine learning algorithms, which use examples of fault prone and not fault prone modules to develop predictive models of quality. In order to learn the numerical mapping between module and classification, a module is represented in terms of software metrics. A difficulty in this sort of problem is sourcing software engineering data of adequate quality. In this work, data is obtained from two sources: the NASA Metrics Data Program and the open source Eclipse project. Feature selection is applied before learning, and a number of different feature selection methods are compared to find which work best. Two machine learning algorithms are applied to the data, Naive Bayes and the Support Vector Machine, and the predictive results are compared to those of previous efforts and found to be superior on selected data sets and comparable on others. In addition, a new classification method is proposed, Rank Sum, in which a ranking abstraction is laid over bin densities for each class, and a classification is determined based on the sum of ranks over features. A novel extension of this method is also described, based on an observed polarising of points by class when Rank Sum is applied to training data to convert it into 2D rank sum space. SVM is applied to this transformed data to produce models whose parameters can be set according to trade-off curves to obtain a particular performance trade-off.
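The Rank Sum method is specific to the thesis, but the surrounding pipeline is standard; a minimal sketch of that pipeline, with synthetic data standing in for the NASA MDP and Eclipse sets and with one assumed feature selection method (a univariate F-test filter):

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for module-level software metrics (e.g. LOC,
# cyclomatic complexity) labelled fault-prone / not fault-prone; the
# thesis used NASA Metrics Data Program and Eclipse data instead.
X, y = make_classification(n_samples=400, n_features=20, n_informative=6,
                           weights=[0.8, 0.2], random_state=0)

for name, clf in [("Naive Bayes", GaussianNB()),
                  ("SVM", SVC(kernel="rbf", class_weight="balanced"))]:
    # Filter feature selection before learning, as described in the abstract.
    pipe = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=8), clf)
    score = cross_val_score(pipe, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: AUC = {score:.3f}")
```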

Relevance: 90.00%

Abstract:

This paper describes an effective method for signal authentication and spoofing detection for civilian GNSS receivers using the GPS L1 C/A and the Galileo E1-B Safety of Life service. The paper discusses various spoofing attack profiles and how the proposed method is able to detect these attacks. This method is relatively low-cost and can be suitable for numerous mass-market applications. This paper is the subject of a pending patent.

Relevance: 90.00%

Abstract:

While a number of factors have been highlighted in the innovation adoption literature, little is known about whether different factors are related to innovation adoption in differently-sized firms. We used preliminary case studies of small, medium and large firms to ground our hypotheses, which were then tested using a survey of 94 firms. We found that external stakeholder pressure and non-financial readiness were related to innovation adoption in SMEs; but that for large firms, adoption was related to the opportunity to innovate. It may be that the difficulties of adopting innovations, including both the financial cost and the effort involved, are too great for SMEs to overcome unless there is either a compelling need (external pressure) or enough in-house capability (non-financial readiness). This suggests that SMEs are more likely to have innovation “pushed” onto them while large firms are more likely to “pull” innovations when they have the opportunity.

Relevance: 90.00%

Abstract:

The Queensland University of Technology (QUT) allows the presentation of a thesis for the Degree of Doctor of Philosophy in the format of published or submitted papers, where such papers have been published, accepted or submitted during the period of candidature. This thesis is composed of seven published/submitted papers and one poster presentation, of which five have been published and the other two are under review. This project is financially supported by the QUTPRA Grant. The twenty-first century started with the resurrection of lignocellulosic biomass as a potential substitute for petrochemicals. Petrochemicals, which enjoyed sustained economic growth during the past century, have begun to reach, or have already reached, their peak. The world energy situation is complicated by political uncertainty and by the environmental impact associated with petrochemical import and usage. In particular, greenhouse gases and toxic emissions produced by petrochemicals have been implicated as a significant cause of climate change. Lignocellulosic biomass (e.g. sugarcane biomass and bagasse), which potentially enjoys a more abundant, widely distributed and cost-effective resource base, can play an indispensable role in the paradigm transition from a fossil-based to a carbohydrate-based economy. Poly(3-hydroxybutyrate), PHB, has attracted much commercial interest as a biodegradable plastic material because some of its physical properties are similar to those of polypropylene (PP), even though the two polymers have quite different chemical structures. PHB exhibits a high degree of crystallinity, has a high melting point of approximately 180°C, and most importantly, unlike PP, PHB is rapidly biodegradable. Two major factors currently inhibit the widespread use of PHB: its high cost and its poor mechanical properties. The production costs of PHB are significantly higher than for plastics produced from petrochemical resources (e.g. PP costs $US1 kg-1, whereas PHB costs $US8 kg-1), and its stiff and brittle nature makes processing difficult and impedes its ability to handle high impact. Lignin, cellulose and hemicellulose are the three main components of every lignocellulosic biomass. Lignin is a natural polymer occurring in the plant cell wall and, after cellulose, is the most abundant polymer in nature. It is extracted mainly as a by-product in the pulp and paper industry. Although lignin is traditionally burnt in industry for energy, it has many value-adding properties. Lignin, which to date has not been fully exploited, is an amorphous polymer with hydrophobic behaviour. These properties make it a good candidate for blending with PHB; technically, blending can be a viable route to reducing price and enhancing production properties. Theoretically, lignin and PHB affect each other's physicochemical properties when they become miscible in a composite. A comprehensive study of the structural, thermal, rheological and environmental properties of lignin/PHB blends, together with neat lignin and PHB, is the targeted scope of this thesis. An introduction to this research, including a description of the research problem, a literature review and an account of the research progress linking the research papers, is presented in Chapter 1. In this research, lignin was obtained from bagasse through extraction with sodium hydroxide. A novel two-step pH precipitation procedure was used to recover soda lignin with a purity of 96.3 wt% from the black liquor (i.e. the spent sodium hydroxide solution).
The precipitation process is presented in Chapter 2. A sequential solvent extraction process was used to fractionate the soda lignin into three fractions. These fractions, together with the soda lignin, were characterised to determine elemental composition, purity, carbohydrate content, molecular weight and functional group content. The thermal properties of the lignins were also determined. The results are presented and discussed in Chapter 2. On the basis of the type and quantity of functional groups, attempts were made to identify potential applications for each of the individual lignins. As an addendum to the general section on the development of composite materials of lignin, which includes Chapters 1 and 2, studies on the kinetics of bagasse thermal degradation are presented in Appendix 1. The work showed that distinct stages of mass loss depend on residual sucrose. As the development of value-added products from lignin will improve the economics of cellulosic ethanol, a review of lignin applications, which includes lignin/PHB composites, is presented in Appendix 2. Chapters 3, 4 and 5 are dedicated to investigations of the properties of soda lignin/PHB composites. Chapter 3 reports on the thermal stability and miscibility of the blends. Although the addition of soda lignin shifts the onset of PHB decomposition to lower temperatures, the lignin/PHB blends are thermally more stable over a wider temperature range. The results from the thermal study also indicated that blends containing up to 40 wt% soda lignin were miscible. The Tg data for these blends fitted the Gordon-Taylor and Kwei models well. Fourier transform infrared spectroscopy (FT-IR) evaluation showed that the miscibility of the blends was due to specific hydrogen bonding (and similar interactions) between the reactive phenolic hydroxyl groups of lignin and the carbonyl group of PHB. The thermophysical and rheological properties of soda lignin/PHB blends are presented in Chapter 4. In this chapter, the kinetics of thermal degradation of the blends is studied using thermogravimetric analysis (TGA). This preliminary investigation is limited to the processing temperature range of blend manufacturing. Of significance in the study is the drop in the apparent activation energy, Ea, from 112 kJ mol-1 for pure PHB to half that value for the blends. This means that the addition of lignin to PHB reduces the thermal stability of PHB, and that the comparatively reduced weight loss observed in the TGA data is associated with the slower rate of lignin degradation in the composite. The Tg of PHB, as well as its melting temperature, melting enthalpy and crystallinity, decreases with increasing lignin content. Results from the rheological investigation showed that at low lignin content (≤30 wt%) lignin acts as a plasticiser for PHB, while at high lignin content it acts as a filler. Chapter 5 is dedicated to the environmental study of soda lignin/PHB blends. The biodegradability of the lignin/PHB blends is compared to that of PHB using the standard soil burial test. To obtain acceptable biodegradation data, samples were buried for 12 months under controlled conditions. Gravimetric analysis, TGA, optical microscopy, scanning electron microscopy (SEM), differential scanning calorimetry (DSC), FT-IR and X-ray photoelectron spectroscopy (XPS) were used in the study.
The results clearly demonstrated that lignin retards the biodegradation of PHB, and that the miscible blends were more resistant to degradation than the immiscible blends. To examine the relationship between the structure of lignin and the properties of the blends, a methanol-soluble lignin, which contains about one-third of the phenolic hydroxyl content of its parent soda lignin (the lignin used in preparing the blends for the work reported in Chapters 3 and 4), was blended with PHB and the properties of the blends investigated. The results are reported in Chapter 6. At up to 40 wt% methanol-soluble lignin, the experimental data fitted the Gordon-Taylor and Kwei models, similar to the results obtained for the soda lignin-based blends. However, the values obtained for the interaction parameters of the methanol-soluble lignin blends were slightly lower than those of the soda lignin blends, indicating a weaker association between methanol-soluble lignin and PHB. FT-IR data confirmed that hydrogen bonding is the main interactive force between the reactive functional groups of lignin and the carbonyl group of PHB. In summary, the structural differences between the two lignins did not manifest themselves in the properties of their blends.
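For reference, the two miscibility models to which the Tg data were fitted have the standard forms (here $w_1, w_2$ are the weight fractions of the two components, $T_{g1}, T_{g2}$ their glass transition temperatures, and $k$ and $q$ fitting parameters, $q$ usually being read as the strength of specific interactions such as the hydrogen bonding identified by FT-IR):

$$T_g = \frac{w_1 T_{g1} + k\,w_2 T_{g2}}{w_1 + k\,w_2} \quad \text{(Gordon-Taylor)}, \qquad T_g = \frac{w_1 T_{g1} + k\,w_2 T_{g2}}{w_1 + k\,w_2} + q\,w_1 w_2 \quad \text{(Kwei)}.$$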

Relevance: 90.00%

Abstract:

This paper focuses on information sharing with key suppliers and seeks to explore the factors that might influence its extent and depth. We also investigate how information sharing affects a company's performance with regard to resource usage, output and flexibility. Drawing on transaction cost and contingency theories, several factors, namely environmental uncertainty, demand uncertainty, dependency and the product life cycle stage, are proposed to explain the level of information shared with key suppliers. We develop a model in which information sharing mediates between the (contingent) factors and company performance. A mail survey was used to collect data from Finnish and Swedish companies. Partial Least Squares analysis was performed separately for each country (n=119, n=102). There was consistent evidence that environmental uncertainty, demand uncertainty and supplier/buyer dependency had explanatory power, whereas no significance was found for the relationship between product life cycle stage and information sharing. The results also confirm previous studies by providing support for a positive relationship between information sharing and performance, with output performance found to be the most strongly related.

Relevance: 90.00%

Abstract:

Objective: To determine whether remote monitoring (structured telephone support or telemonitoring) without regular clinic or home visits improves outcomes for patients with chronic heart failure. Data sources: 15 electronic databases, hand searches of previous studies, and contact with authors and experts. Data extraction: Two investigators independently screened the results. Review methods: Published randomised controlled trials comparing remote monitoring programmes with usual care in patients with chronic heart failure managed within the community. Results: 14 randomised controlled trials (4264 patients) of remote monitoring met the inclusion criteria: four evaluated telemonitoring, nine evaluated structured telephone support, and one evaluated both. Remote monitoring programmes reduced the rates of admission to hospital for chronic heart failure by 21% (95% confidence interval 11% to 31%) and all-cause mortality by 20% (8% to 31%). Of the six trials evaluating health-related quality of life, three reported significant benefits with remote monitoring; of the four studies examining healthcare costs with structured telephone support, three reported reduced cost and one no effect. Conclusion: Programmes for chronic heart failure that include remote monitoring have a positive effect on clinical outcomes in community-dwelling patients with chronic heart failure.

Relevance: 90.00%

Abstract:

A time-resolved inverse spatially offset Raman spectrometer was constructed for depth profiling of Raman-active substances in both laboratory and field environments. The system's operating principles and performance are discussed, along with its advantages relative to a traditional continuous-wave spatially offset Raman spectrometer. The developed spectrometer uses a combination of space- and time-resolved detection to obtain high-quality Raman spectra from substances hidden behind coloured opaque surface layers, such as plastics and garments, with a single measurement. The time-gated spatially offset Raman spectrometer was successfully used to detect concealed explosives and drug precursors under incandescent and fluorescent background light as well as under daylight. The average screening time was 50 s per measurement. The excitation energy requirements were relatively low (20 mW), which makes the probe safe for screening hazardous substances. The unit has been designed with nanosecond laser excitation and gated detection, making it of lower cost and complexity than previous picosecond-based systems, to provide a functional platform for in-line or in-field sensing of chemical substances.

Relevance: 90.00%

Abstract:

Inspection of solder joints has been a critical process in the electronic manufacturing industry to reduce manufacturing cost, improve yield, and ensure product quality and reliability. This paper proposes the use of the Log-Gabor filter bank, the Discrete Wavelet Transform and the Discrete Cosine Transform for feature extraction from solder joint images on Printed Circuit Boards (PCBs). A distance based on the Mahalanobis Cosine metric is also presented for classification of five different types of solder joints. In the experiments, this methodology achieved high accuracy and well-generalised performance. It can be an effective method to reduce cost and improve quality in the production of PCBs in the manufacturing industry.
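The paper's exact feature pipeline is not reproduced here; the following is a minimal sketch of one branch (2-D DCT features) and of the Mahalanobis Cosine idea, i.e. cosine distance measured in a whitened feature space, with a nearest-class-mean decision rule assumed for illustration:

```python
import numpy as np
from scipy.fft import dctn

def dct_features(img: np.ndarray, k: int = 4) -> np.ndarray:
    """Low-frequency 2-D DCT coefficients as a compact joint descriptor."""
    return dctn(img, norm="ortho")[:k, :k].ravel()

def whiten(X: np.ndarray):
    """Mahalanobis whitening: rotate and scale so the training features are
    uncorrelated with unit variance; cosine distance in this space is the
    Mahalanobis Cosine metric."""
    mu = X.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(X - mu, rowvar=False))
    W = vecs / np.sqrt(vals + 1e-9)   # column j scaled by 1/sqrt(eigenvalue j)
    return mu, W

def mahcos_distance(a, b, mu, W):
    u, v = (a - mu) @ W, (b - mu) @ W
    return 1.0 - (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

def classify(feat, class_means, mu, W):
    """Assign the class whose mean feature vector is nearest under the metric."""
    return min(class_means, key=lambda c: mahcos_distance(feat, class_means[c], mu, W))

# Toy demo with synthetic 32x32 "joint images" in place of real PCB crops.
rng = np.random.default_rng(0)
imgs = rng.random((40, 32, 32))
X = np.array([dct_features(i) for i in imgs])
mu, W = whiten(X)
means = {"good": X[:20].mean(axis=0), "bridged": X[20:].mean(axis=0)}
print(classify(X[0], means, mu, W))
```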