214 results for "Cost estimation of environmental protection"
Abstract:
Interferometry is a sensitive technique for recording tear film surface irregularities in a noninvasive manner. At the same time, the technique is hindered by natural eye movements, which introduce measurement noise. Estimating tear film surface quality from interferograms can be reduced to a spatial-average-localized weighted estimate of the first harmonic of the interference fringes. However, previously reported estimation techniques proved to perform poorly in cases where the fringe pattern was significantly disturbed. This can occur when measuring tear film surface quality on a contact lens on the eye or in a dry eye. We present a new estimation technique for extracting the first harmonic from the interference fringes that combines traditional spectral estimation techniques with morphological image processing techniques. The proposed technique proves to be more robust to changes in interference fringes caused by natural eye movements and the degree of dryness of the contact lens and corneal surfaces than its predecessors, resulting in tear film surface quality estimates that are less noisy.
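As a rough illustration of the spectral side of such an estimator (the morphological-processing stage and the authors' exact formulation are not shown), the first harmonic of a fringe intensity profile can be located as the dominant peak of its power spectrum. All names and the synthetic profile below are illustrative:

```python
import numpy as np

def first_harmonic_strength(fringes):
    """Estimate the dominant (first-harmonic) fringe frequency and its
    relative spectral power from a 1-D intensity profile."""
    x = fringes - fringes.mean()          # remove the DC component
    spec = np.abs(np.fft.rfft(x)) ** 2    # one-sided power spectrum
    k = int(np.argmax(spec[1:]) + 1)      # skip the zero-frequency bin
    freq = np.fft.rfftfreq(len(x))[k]     # cycles per sample
    return freq, spec[k] / spec[1:].sum()

# Synthetic fringe profile: regular fringes give one sharp spectral peak;
# disturbed fringes would spread power across many bins.
n = 256
profile = 1.0 + 0.5 * np.cos(2 * np.pi * 0.125 * np.arange(n))
freq, strength = first_harmonic_strength(profile)
```

A noisy or broken fringe pattern lowers `strength`, which is one way such a quality estimate can flag disturbed measurements.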
Abstract:
This chapter looks at issues of non-stationarity in determining when a transient has occurred and when it is possible to fit a linear model to a non-linear response. The first issue is associated with the detection of loss of damping of power system modes. When some control device such as an SVC fails, the operator needs to know whether the damping of key power system oscillation modes has deteriorated significantly. This question is posed here as an alarm detection problem rather than an identification problem in order to get a fast detection of a change. The second issue concerns when a significant disturbance has occurred and the operator is seeking to characterize the system oscillation. The disturbance is initially large, giving a nonlinear response; it then decays and can become smaller than the noise level of normal customer load changes. The difficulty is one of determining when a linear response can be reliably identified between the non-linear phase and the large-noise phase of the signal. The solution proposed in this chapter uses "Time-Frequency" analysis tools to assist the extraction of the linear model.
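The chapter's time-frequency tools are not reproduced here, but the quantity being monitored, modal damping, can be illustrated with a classical logarithmic-decrement estimate on a clean ringdown signal. This is a simplified stand-in for the chapter's detector, valid only for a single well-separated mode:

```python
import numpy as np

def log_decrement_damping(peaks):
    """Estimate the damping ratio of a decaying oscillation from
    successive positive peak amplitudes via the logarithmic decrement."""
    peaks = np.asarray(peaks, dtype=float)
    delta = np.mean(np.log(peaks[:-1] / peaks[1:]))   # average log decrement
    return delta / np.sqrt(4 * np.pi ** 2 + delta ** 2)

# Synthetic ringdown with damping ratio 0.05: successive peaks shrink by
# a constant factor exp(2*pi*zeta / sqrt(1 - zeta**2)).
zeta_true = 0.05
ratio = np.exp(2 * np.pi * zeta_true / np.sqrt(1 - zeta_true ** 2))
peaks = [1.0 / ratio ** k for k in range(6)]
zeta_hat = log_decrement_damping(peaks)
```

An alarm scheme of the kind described could compare such an estimate against a threshold; the chapter's point is that, with noise and non-linearity, the window in which this estimate is trustworthy must itself be detected.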
Abstract:
We investigated the relative importance of vision and proprioception in estimating target and hand locations in a dynamic environment. Subjects performed a position estimation task in which a target moved horizontally on a screen at a constant velocity and then disappeared. They were asked to estimate the position of the invisible target under two conditions: passively observing and manually tracking. The tracking trials included three visual conditions with a cursor representing the hand position: always visible, disappearing simultaneously with target disappearance, and always invisible. The target’s invisible displacement was systematically underestimated during passive observation. In active conditions, tracking with the visible cursor significantly decreased the extent of underestimation. Tracking of the invisible target became much more accurate under this condition and was not affected by cursor disappearance. In a second experiment, subjects were asked to judge the position of their unseen hand instead of the target during tracking movements. Invisible hand displacements were also underestimated when compared with the actual displacement. Continuous or brief presentation of the cursor reduced the extent of underestimation. These results suggest that vision–proprioception interactions are critical for representing exact target–hand spatial relationships, and that such sensorimotor representation of hand kinematics serves a cognitive function in predicting target position. We propose a hypothesis that the central nervous system can utilize information derived from proprioception and/or efference copy for sensorimotor prediction of dynamic target and hand positions, but that effective use of this information for conscious estimation requires that it be presented in a form that corresponds to that used for the estimations.
Abstract:
The purpose of this study was to evaluate the comparative cost of treating alcohol dependence with either cognitive behavioral therapy (CBT) alone or CBT combined with naltrexone (CBT+naltrexone). Two hundred ninety-eight outpatients consecutively treated for alcohol dependence participated in this study. One hundred seven (36%) patients received adjunctive pharmacotherapy (CBT+naltrexone). The Drug Abuse Treatment Cost Analysis Program was used to estimate treatment costs. Adjunctive pharmacotherapy (CBT+naltrexone) introduced an additional treatment cost and was 54% more expensive than CBT alone. When treatment abstinence rates (36.1% CBT; 62.6% CBT+naltrexone) were applied to cost-effectiveness ratios, CBT+naltrexone demonstrated an advantage over CBT alone. There were no differences between groups on a preference-based health measure (SF-6D). In this treatment center, to achieve 100 abstainers over a 12-week program, 280 patients require CBT compared with 160 for CBT+naltrexone. The dominant choice was CBT+naltrexone based on modest economic advantages and significant efficiencies in the numbers needed to treat.
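The numbers-needed-to-treat comparison is simple arithmetic and can be sketched directly from the abstinence rates in the abstract. Exact division gives roughly 278 and 160 patients; the abstract reports 280, presumably a rounding convention of the paper. The relative-cost figure assumes total cost scales linearly with patients treated:

```python
import math

def patients_needed(target_abstainers, abstinence_rate):
    """Patients that must be treated to yield a target number of
    abstainers, rounded up to whole patients."""
    return math.ceil(target_abstainers / abstinence_rate)

# Abstinence rates reported in the abstract
n_cbt = patients_needed(100, 0.361)     # CBT alone
n_combo = patients_needed(100, 0.626)   # CBT + naltrexone

# Cost per abstainer relative to CBT alone: CBT+naltrexone costs 1.54x
# per patient but has a higher abstinence rate.
rel_cost_per_abstainer = 1.54 * 0.361 / 0.626
```

A relative cost per abstainer below 1 is what makes CBT+naltrexone the dominant choice despite its higher per-patient cost.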
Abstract:
This paper presents a methodology for estimation of average travel time on signalized urban networks by integrating cumulative plots and probe data. This integration aims to reduce the relative deviations in the cumulative plots due to midlink sources and sinks. During undersaturated traffic conditions, the concept of a virtual probe is introduced, and therefore, accurate travel time can be obtained when a real probe is unavailable. For oversaturated traffic conditions, only one probe per travel time estimation interval—360 s or 3% of vehicles traversing the link as a probe—has the potential to provide accurate travel time.
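A minimal sketch of the cumulative-plots idea, without the paper's probe-based correction for midlink sources and sinks: the travel time of the N-th vehicle is the horizontal distance between the upstream and downstream cumulative count curves at count N. The link and demand values below are entirely synthetic:

```python
import numpy as np

def travel_times_by_count(t, up, down, counts):
    """Travel times read off cumulative plots: for each cumulative count N,
    the horizontal distance between the upstream and downstream curves."""
    t_enter = np.interp(counts, up, t)   # when vehicle N passed upstream
    t_exit = np.interp(counts, down, t)  # when vehicle N passed downstream
    return t_exit - t_enter

# Synthetic link: constant 0.2 veh/s demand, every vehicle takes 30 s.
t = np.arange(0.0, 361.0, 10.0)             # one 360 s estimation interval
up = 0.2 * t                                # upstream cumulative count
down = 0.2 * np.clip(t - 30.0, 0.0, None)   # downstream curve, 30 s later
tt = travel_times_by_count(t, up, down, np.linspace(1.0, down[-1], 25))
```

In practice the two detector curves drift apart because of midlink sinks and sources; anchoring them with probe (or, in undersaturation, virtual probe) observations is the paper's contribution.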
Abstract:
Australian climate, soils and agricultural management practices are significantly different from those of the northern hemisphere nations. Consequently, experimental data on greenhouse gas production from European and North American agricultural soils and its interpretation are unlikely to be directly applicable to Australian systems.
Abstract:
This research discusses some of the issues encountered while developing a set of WGEN parameters for Chile and advice for others interested in developing WGEN parameters for arid climates. The WGEN program is a commonly used and a valuable research tool; however, it has specific limitations in arid climates that need careful consideration. These limitations are analysed in the context of generating a set of WGEN parameters for Chile. Fourteen to 26 years of precipitation data are used to calculate precipitation parameters for 18 locations in Chile, and 3–8 years of temperature and solar radiation data are analysed to generate parameters for seven of these locations. Results indicate that weather generation parameters in arid regions are sensitive to erroneous or missing precipitation data. Research shows that the WGEN-estimated gamma distribution shape parameter (α) for daily precipitation in arid zones will tend to cluster around discrete values of 0 or 1, masking the high sensitivity of these parameters to additional data. Rather than focus on the length in years when assessing the adequacy of a data record for estimation of precipitation parameters, researchers should focus on the number of wet days in dry months in a data set. Analysis of the WGEN routines for the estimation of temperature and solar radiation parameters indicates that errors can occur when individual ‘months’ have fewer than two wet days in the data set. Recommendations are provided to improve methods for estimation of WGEN parameters in arid climates.
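As an illustration of gamma shape-parameter estimation from wet-day precipitation amounts: WGEN's internal routine may differ, so Thom's approximate maximum-likelihood estimator is used here as a stand-in. With very few wet days the log-moment statistic `a` is poorly determined, which is the instability the abstract describes:

```python
import numpy as np

def gamma_shape_thom(wet_day_amounts):
    """Approximate maximum-likelihood gamma shape parameter (Thom's
    estimator) for daily precipitation amounts on wet days."""
    x = np.asarray(wet_day_amounts, dtype=float)
    a = np.log(x.mean()) - np.log(x).mean()   # unstable when x is short
    return (1.0 + np.sqrt(1.0 + 4.0 * a / 3.0)) / (4.0 * a)

# Synthetic wet-day record drawn from a known gamma distribution.
rng = np.random.default_rng(0)
sample = rng.gamma(shape=0.8, scale=5.0, size=5000)
alpha_hat = gamma_shape_thom(sample)
```

Rerunning this with `size=10` instead of 5000 shows how wildly the estimate swings, supporting the recommendation to count wet days in dry months rather than record length in years.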
Abstract:
It is possible to estimate the depth of focus (DOF) of the eye directly from wavefront measurements using various retinal image quality metrics (IQMs). In such methods, DOF is defined as the range of defocus error that degrades the retinal image quality calculated from IQMs to a certain level of the maximum value. Although different retinal image quality metrics are used, two arbitrary threshold levels are currently adopted: 50% and 80%. There has been limited study of the relationship between these threshold levels and the actual measured DOF. We measured the subjective DOF in a group of 17 normal subjects and used the through-focus augmented visual Strehl ratio based on the optical transfer function (VSOTF), derived from their wavefront aberrations, as the IQM. For each subject, a VSOTF threshold level was derived that would match the subjectively measured DOF. A significant correlation was found between the subject's estimated threshold level and the HOA RMS (Pearson's r=0.88, p<0.001). This linear correlation can be used to estimate the threshold level for each individual subject, leading to a method for estimating an individual's DOF from a single measurement of their wavefront aberrations.
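The metric-threshold definition of DOF can be sketched directly. A hypothetical Gaussian through-focus curve stands in below for the VSOTF computed from real wavefront data; only the thresholding logic is the point:

```python
import numpy as np

def depth_of_focus(defocus, metric, threshold):
    """Width of the defocus interval over which a through-focus image-quality
    metric stays at or above `threshold` times its peak value."""
    ok = metric >= threshold * metric.max()
    return defocus[ok].max() - defocus[ok].min()

# Hypothetical through-focus curve: Gaussian fall-off of image quality.
defocus = np.linspace(-2.0, 2.0, 401)        # dioptres
metric = np.exp(-(defocus / 0.6) ** 2)
dof_50 = depth_of_focus(defocus, metric, 0.5)
dof_80 = depth_of_focus(defocus, metric, 0.8)
```

The stricter 80% criterion necessarily yields a narrower DOF than the 50% one; the paper's contribution is choosing the threshold per subject from HOA RMS rather than fixing it arbitrarily.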
Abstract:
Background: A bundled approach to central venous catheter care is currently being promoted as an effective way of preventing catheter-related bloodstream infection (CR-BSI). Consumables used in the bundled approach are relatively inexpensive, which may lead to the conclusion that the bundle is cost-effective. However, this fails to consider the nontrivial costs of the monitoring and education activities required to implement the bundle, or that alternative strategies are available to prevent CR-BSI. We evaluated the cost-effectiveness of a bundle to prevent CR-BSI in Australian intensive care patients.

Methods and Findings: A Markov decision model was used to evaluate the cost-effectiveness of the bundle relative to remaining with current practice (a non-bundled approach to catheter care and uncoated catheters), or use of antimicrobial catheters. We assumed the bundle reduced the relative risk of CR-BSI to 0.34. Given uncertainty about the cost of the bundle, threshold analyses were used to determine the maximum cost at which the bundle remained cost-effective relative to the other approaches to infection control. Sensitivity analyses explored how this threshold alters under different assumptions about the economic value placed on bed-days and health benefits gained by preventing infection. If clinicians are prepared to use antimicrobial catheters, the bundle is cost-effective if national 18-month implementation costs are below $1.1 million. If antimicrobial catheters are not an option, the bundle must cost less than $4.3 million. If decision makers are only interested in obtaining cash-savings for the unit, and place no economic value on either the bed-days or the health benefits gained through preventing infection, these cost thresholds are reduced by two-thirds.

Conclusions: A catheter care bundle has the potential to be cost-effective in the Australian intensive care setting. Rather than anticipating cash-savings from this intervention, decision makers must be prepared to invest resources in infection control to see efficiency improvements.
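At its simplest, the threshold logic equates the maximum justifiable implementation cost with the economic value of the infections averted. The sketch below uses entirely hypothetical inputs (patient volume, baseline risk, and dollar value per averted case are invented); only the relative risk of 0.34 comes from the abstract, and the paper's actual Markov model is far richer:

```python
def max_justifiable_cost(n_patients, baseline_risk, relative_risk,
                         value_per_case_averted):
    """Largest implementation cost at which an intervention remains
    cost-effective: cases averted times the value of averting one."""
    cases_averted = n_patients * baseline_risk * (1.0 - relative_risk)
    return cases_averted * value_per_case_averted

# Hypothetical inputs (NOT the paper's figures): 100,000 catheter episodes,
# 1% baseline CR-BSI risk, bundle relative risk 0.34 (from the abstract),
# $10,000 of bed-day and health benefit per infection averted.
threshold = max_justifiable_cost(100_000, 0.01, 0.34, 10_000)
```

Setting `value_per_case_averted` to the cash-only savings, as in the paper's sensitivity analysis, shrinks the threshold accordingly.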
Abstract:
We estimate the parameters of a stochastic process model for a macroparasite population within a host using approximate Bayesian computation (ABC). The immunity of the host is an unobserved model variable and only mature macroparasites at sacrifice of the host are counted. With very limited data, process rates are inferred reasonably precisely. Modeling involves a three variable Markov process for which the observed data likelihood is computationally intractable. ABC methods are particularly useful when the likelihood is analytically or computationally intractable. The ABC algorithm we present is based on sequential Monte Carlo, is adaptive in nature, and overcomes some drawbacks of previous approaches to ABC. The algorithm is validated on a test example involving simulated data from an autologistic model before being used to infer parameters of the Markov process model for experimental data. The fitted model explains the observed extra-binomial variation in terms of a zero-one immunity variable, which has a short-lived presence in the host.
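The algorithm described is an adaptive sequential Monte Carlo ABC; the basic rejection-ABC idea it builds on can be sketched as follows, on a toy Poisson problem rather than the macroparasite model:

```python
import numpy as np

def abc_rejection(observed, simulate, prior_sample, distance,
                  n_draws=10_000, eps=0.5, rng=None):
    """Basic ABC rejection sampler: keep prior draws whose simulated data
    fall within `eps` of the observation under `distance`. (The chapter's
    algorithm is an adaptive sequential Monte Carlo ABC; plain rejection
    is shown here for clarity.)"""
    rng = np.random.default_rng(0) if rng is None else rng
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        if distance(simulate(theta, rng), observed) < eps:
            accepted.append(theta)
    return np.array(accepted)

# Toy example: infer a Poisson rate from one summary (the sample mean).
rng = np.random.default_rng(1)
obs = rng.poisson(4.0, size=50).mean()
post = abc_rejection(
    observed=obs,
    simulate=lambda lam, r: r.poisson(lam, size=50).mean(),
    prior_sample=lambda r: r.uniform(0.0, 10.0),
    distance=lambda a, b: abs(a - b),
)
```

Rejection ABC wastes most draws when the posterior is concentrated; the sequential, adaptive scheme in the chapter addresses exactly that inefficiency.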
Abstract:
With rising environmental concern, the reduction of critical aircraft emissions, including carbon dioxide (CO2) and nitrogen oxides (NOx), is one of the most important aeronautical problems. Possible approaches include designing new wing and aircraft shapes, more efficient engines, and so on. This paper instead provides a set of acceptable flight plans as a first step, short of replacing current aircraft. It investigates green aircraft design optimisation in terms of aircraft range, mission fuel weight (CO2) and NOx using advanced evolutionary algorithms coupled to flight optimisation system software. Two multi-objective design optimisations are conducted to find the best set of flight plans for current aircraft, considering discretised altitudes and Mach numbers, without redesigning the aircraft shape or engine type. The first optimisation maximises aircraft range while minimising NOx at constant mission fuel weight. The second minimises mission fuel weight and NOx at fixed aircraft range. Numerical results show that the method is able to capture a set of useful trade-offs that reduce NOx and CO2 (minimum mission fuel weight).
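The evolutionary optimiser itself is not reproduced here, but the selection core of any such multi-objective search is nondominated filtering. A minimal sketch on hypothetical (mission fuel weight, NOx) pairs for candidate flight plans:

```python
def pareto_front(points):
    """Return the non-dominated subset for minimisation objectives.
    A point dominates another if it is no worse in every objective
    and strictly better in at least one."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            all(q[k] <= p[k] for k in range(len(p))) and q != p
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical trade-off: (mission fuel weight, NOx) per candidate plan.
plans = [(100.0, 9.0), (90.0, 12.0), (95.0, 10.0),
         (110.0, 8.0), (100.0, 11.0)]
front = pareto_front(plans)
```

The surviving points are exactly the "set of useful trade-offs" the abstract refers to: no plan on the front can be improved in one objective without worsening the other.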
Abstract:
In today's information society, electronic tools such as computer networks for rapid data transfer and composite databases for information storage and management are critical to effective environmental management. In particular, environmental policies and programs at the federal, state, and local levels need a large volume of up-to-date information on the quality of water, air, and soil in order to conserve and protect natural resources and to support meteorology. The utilization of information and communication technologies (ICTs) is therefore crucial to preserving and improving the quality of life. Tasks in the field of environmental protection often require a range of environmental and technical information to support complex, shared decision making in a multidisciplinary team environment. In this regard, e-government provides the foundation of a transformative ICT initiative that can lead to better environmental governance, better services, and increased public participation in the environmental decision-making process.
Abstract:
This paper presents a method for measuring the in-bucket payload volume on a dragline excavator for the purpose of estimating the material's bulk density in real time. Knowledge of the payload's bulk density can provide feedback to mine planning and scheduling to improve blasting and therefore provide a more uniform bulk density across the excavation site. This allows a single optimal bucket size to be used for maximum overburden removal per dig, in turn reducing costs and emissions in dragline operation and maintenance. The proposed solution uses a range-bearing laser to locate and scan full buckets between the lift and dump stages of the dragline cycle. The bucket is segmented from the scene using cluster analysis, and the pose of the bucket is calculated using the Iterative Closest Point (ICP) algorithm. Payload points are identified using a known model and subsequently converted into a height grid for volume estimation. Results from both scaled and full-scale implementations show that this method can achieve an accuracy above 95%.
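The scanning, segmentation, and ICP registration stages are beyond a short sketch, but the final step, volume from a height grid, is straightforward. The grid size, resolution, and fill value below are illustrative only:

```python
import numpy as np

def payload_volume(height_grid, cell_size):
    """Payload volume from a regular height grid: sum of per-cell heights
    (metres) times the area of each grid cell (square metres). NaN cells
    (no laser returns) contribute nothing."""
    return float(np.nansum(height_grid) * cell_size ** 2)

# Illustrative example: a 4 m x 4 m bucket opening gridded at 0.5 m
# resolution, with a uniform 1.2 m of material above the reference plane.
grid = np.full((8, 8), 1.2)
volume = payload_volume(grid, 0.5)
```

Dividing a measured payload weight by this volume then gives the real-time bulk density estimate the paper is after.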