983 results for Expected cost


Relevance: 100.00%

Abstract:

This work presents a demand side response (DSR) model which assists small electricity consumers who are exposed to the market price, through an aggregator, to proactively mitigate price and peak impacts on the electrical system. The proposed model allows consumers to manage air-conditioning as a function of possible price spikes. The main contribution of this research is to demonstrate how consumers can minimise the total expected cost by optimising air-conditioning to account for occurrences of a price spike in the electricity market. The model investigates how a pre-cooling method can be used to minimise energy costs when there is a substantial risk of an electricity price spike. It was tested with Queensland electricity market data from the Australian Energy Market Operator and Brisbane temperature data from the Bureau of Statistics, for hot weekdays in the period 2011 to 2012.
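
The expected-cost trade-off behind pre-cooling can be sketched in a few lines. The prices, energy quantities, spike probability and the 10% pre-cooling penalty below are illustrative assumptions, not figures from the study:

```python
def expected_cost(p_spike, e_normal, e_spike, price_normal, price_spike):
    """Expected cost of a plan that draws e_normal kWh at the normal price
    and e_spike kWh during the spike-risk window."""
    no_spike = (e_normal + e_spike) * price_normal
    spike = e_normal * price_normal + e_spike * price_spike
    return (1 - p_spike) * no_spike + p_spike * spike

p = 0.2  # assumed probability of a price spike in the peak window
# Strategy A: no pre-cooling; 3 kWh of cooling load falls in the spike-risk window.
cost_a = expected_cost(p, e_normal=1.0, e_spike=3.0,
                       price_normal=0.05, price_spike=2.50)
# Strategy B: pre-cool, shifting 2 of those kWh to the cheap period,
# with an assumed 10% energy penalty for cooling earlier than needed.
cost_b = expected_cost(p, e_normal=1.0 + 2.0 * 1.1, e_spike=1.0,
                       price_normal=0.05, price_spike=2.50)
print(f"no pre-cool: ${cost_a:.2f}, pre-cool: ${cost_b:.2f}")
```

Under these invented numbers pre-cooling more than halves the expected cost; the decision flips as the spike probability or the pre-cooling penalty changes.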

Relevance: 70.00%

Abstract:

A statistical approach is used in the design of a battery-supercapacitor energy storage system for a wind farm. The design exploits the technical merits of the two energy storage media, in terms of the differences in their specific power and energy densities and their ability to accommodate different rates of change in the charging/discharging powers. By treating the input wind power as random and using a proposed coordinated power-flow control strategy for the battery and the supercapacitor, the approach evaluates the energy storage capacities, the corresponding expected life-cycle cost/year of the storage media, and the expected cost/year of unmet power dispatch. A computational procedure is then developed for the design of a least-cost/year hybrid energy storage system to realize wind power dispatch at a specified confidence level.
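
A minimal sketch of the coordinated power-flow idea: a trailing moving average of a (here synthetic) wind-power trace is routed to the battery (slow, energy-dense), and the fast residual to the supercapacitor (power-dense). The signal and window length are invented for illustration:

```python
import math

def split_power(power, window):
    """Split a power series into a slow component (battery) and a fast
    residual (supercapacitor) using a trailing moving average."""
    slow, fast = [], []
    for i in range(len(power)):
        lo = max(0, i - window + 1)
        avg = sum(power[lo:i + 1]) / (i + 1 - lo)  # trailing moving average
        slow.append(avg)             # battery handles the slowly varying part
        fast.append(power[i] - avg)  # supercapacitor absorbs the rapid residual
    return slow, fast

# Synthetic fluctuating wind power (MW): a slow trend plus fast ripple.
p = [5 + 2 * math.sin(t / 20) + 0.5 * math.sin(t) for t in range(200)]
batt, cap = split_power(p, window=20)
# The two components reconstruct the original signal exactly.
print(max(abs(b + c - x) for b, c, x in zip(batt, cap, p)))
```

The split is lossless by construction; capacity sizing would then be done on the statistics of each component, as the abstract describes.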

Relevance: 70.00%

Abstract:

Campaigners are increasingly using online social networking platforms to promote products, ideas and information. A popular method of promoting a product or even an idea is to incentivize individuals to evangelize it vigorously by providing them with referral rewards in the form of discounts, cashbacks, or social recognition. Due to budget constraints on scarce resources such as money and manpower, it may not be possible to provide incentives for the entire population, so incentives need to be allocated judiciously to appropriate individuals to ensure the highest possible outreach size. We do so by formulating and solving an optimization problem using percolation theory. In particular, we compute the set of individuals to be provided incentives so as to minimize the expected cost while ensuring a given outreach size. We also solve the converse problem of computing the set of individuals to be incentivized so as to maximize the outreach size for a given cost budget. The optimization problem turns out to be nontrivial; it involves quantities that must be computed by numerically solving a fixed-point equation. Our primary contribution is to show that, for a fairly general cost structure, these optimization problems can be solved by solving a simple linear program. We believe that our approach of using percolation theory to formulate an optimization problem is the first of its kind. (C) 2016 Elsevier B.V. All rights reserved.
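
The fixed-point computation the abstract alludes to can be illustrated on the simplest case. Assuming an Erdos-Renyi-style network with Poisson degree distribution of mean c (an assumption for illustration, not the paper's general setting), the probability u that a randomly followed edge fails to reach the giant component solves u = exp(c(u - 1)), and the expected outreach fraction is 1 - u:

```python
import math

def outreach_fraction(c, tol=1e-12):
    """Giant-component (outreach) fraction for a Poisson degree distribution
    with mean degree c, via fixed-point iteration on u = exp(c*(u - 1))."""
    u = 0.0
    while True:
        nxt = math.exp(c * (u - 1))
        if abs(nxt - u) < tol:
            return 1 - nxt
        u = nxt

print(outreach_fraction(2.0))  # ~0.7968 for mean degree 2
```

Below the percolation threshold (c < 1) the iteration converges to u = 1, i.e. zero outreach, which is why incentive allocation must push the effective connectivity past that threshold.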

Relevance: 70.00%

Abstract:

DESIGN: We will address our research objectives by searching the published and unpublished literature and conducting an evidence synthesis of i) studies of the effectiveness of psychosocial interventions provided for children and adolescents who have suffered maltreatment, ii) economic evaluations of these interventions, and iii) studies of their acceptability to children, adolescents and their carers. SEARCH STRATEGY: Evidence will be identified via electronic databases for health and allied health literature, social sciences and social welfare, education and other evidence-based depositories, and economic databases. We will identify material generated by user-led, voluntary-sector enquiry by searching the internet and browsing the websites of relevant UK government departments and charities. Additionally, studies will be identified via the bibliographies of retrieved articles/reviews, targeted author searches, and forward citation searching. We will also use our extensive professional networks, and our planned consultations with key stakeholders and our study steering committee. Databases will be searched from inception to time of search. REVIEW STRATEGY: Inclusion criteria: 1) Infants, children or adolescents who have experienced maltreatment between the ages of 0 and 17 years. 2) All psychosocial interventions available for maltreated children and adolescents, by any provider and in any setting, aiming to address the sequelae of any form of maltreatment, including fabricated illness. 3) For the synthesis of evidence of effectiveness: all controlled studies in which psychosocial interventions are compared with no-treatment, treatment-as-usual, waitlist or other-treated controls. For the synthesis of evidence of acceptability we will include any design that asks participants for their views or provides data on non-participation. For decision-analytic modelling we may include uncontrolled studies. Primary and secondary outcomes will be confirmed in consultation with stakeholders.
Provisional primary outcomes are i) psychological distress/mental health (particularly PTSD, depression and anxiety, and self-harm); ii) behaviour; iii) social functioning; iv) cognitive/academic attainment; v) quality of life; and vi) costs. After studies that meet the inclusion criteria have been identified (independently by two reviewers), data will be extracted and risk of bias (RoB) assessed (independently by two reviewers) using the Cochrane Collaboration RoB Tool (effectiveness), quality hierarchies of data sources for economic analyses (cost-effectiveness), and the CASP tool for qualitative research (acceptability). Where interventions are similar and appropriate data are available (or can be obtained), evidence synthesis will be performed to pool the results. Where possible, we will explore the extent to which age, maltreatment history (including whether intra- or extra-familial), time since maltreatment, care setting (family or out-of-home care, including foster and residential care), care history, and characteristics of the intervention (type, setting, provider, duration) moderate the effects of psychosocial interventions. A synthesis of acceptability data will be undertaken using a narrative approach. A decision-analytic model will be constructed to compare the expected cost-effectiveness of the different types of intervention identified in the systematic review. We will also conduct a value-of-information analysis if the data permit. EXPECTED OUTPUTS: A synthesis of the effectiveness and cost-effectiveness of psychosocial interventions for maltreated children (taking into account age, maltreatment profile and setting) and their acceptability to key stakeholders.

Relevance: 60.00%

Abstract:

Due to the limitations of current condition monitoring technologies, estimates of asset health states may contain some uncertainty. A maintenance strategy that ignores this uncertainty can cause additional costs or downtime. The partially observable Markov decision process (POMDP) is a commonly used approach for deriving optimal maintenance strategies when asset health inspections are imperfect. However, existing applications of the POMDP to maintenance decision-making largely adopt discrete-time and discrete-state assumptions. The discrete-time assumption requires that health state transitions and maintenance activities happen only at discrete epochs, which cannot model the failure time accurately and is not cost-effective. The discrete health state assumption, on the other hand, may not be elaborate enough to improve the effectiveness of maintenance. To address these limitations, this paper proposes a continuous-state partially observable semi-Markov decision process (POSMDP). An algorithm that combines a Monte Carlo-based density projection method with policy iteration is developed to solve the POSMDP. Different types of maintenance activities (i.e., inspections, replacement, and imperfect maintenance) are considered. The next maintenance action and the corresponding waiting duration are optimized jointly with respect to the long-run expected cost per unit time and availability. Simulation studies show that the proposed maintenance optimization approach is more cost-effective than maintenance strategies derived by two other approximate methods when regular inspection intervals are adopted. They also show that the maintenance cost can be further reduced by developing maintenance strategies with state-dependent maintenance intervals using the POSMDP. In addition, the proposed POSMDP shows the ability to adopt a cost-effective strategy structure when multiple types of maintenance activities are involved.
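
A toy two-state version of the imperfect-inspection problem (not the paper's continuous-state POSMDP): maintain a belief that the asset is degraded, update it by Bayes' rule after each noisy inspection, and pick the action with the lower one-step expected cost. All costs and probabilities are invented:

```python
def best_action(b, c_pm=10.0, c_fail=100.0, p_fail_degraded=0.6):
    """Choose the action with the lower one-step expected cost, given the
    belief b = P(asset is degraded)."""
    cost_wait = b * p_fail_degraded * c_fail  # risk a failure if degraded
    cost_pm = c_pm                            # do preventive maintenance now
    return "maintain" if cost_pm < cost_wait else "wait"

def update_belief(b, signal_bad,
                  p_bad_given_degraded=0.8, p_bad_given_healthy=0.1):
    """Bayes update of P(degraded) after an imperfect inspection signal."""
    like_d = p_bad_given_degraded if signal_bad else 1 - p_bad_given_degraded
    like_h = p_bad_given_healthy if signal_bad else 1 - p_bad_given_healthy
    return b * like_d / (b * like_d + (1 - b) * like_h)

b = update_belief(0.2, signal_bad=True)  # belief jumps after one bad reading
print(b, best_action(b))
```

A full POMDP/POSMDP solution would optimize over the whole belief trajectory rather than one step, but the belief-update-then-act structure is the same.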

Relevance: 60.00%

Abstract:

Preventive Maintenance (PM) is often applied to improve the reliability of production lines. A Split System Approach (SSA) based methodology is presented to assist in making optimal PM decisions for serial production lines. The methodology treats a production line as a complex series system with multiple (imperfect) PM actions over multiple intervals. The conditional and overall reliability of the entire production line over these multiple PM intervals are hierarchically calculated using SSA, and provide a foundation for cost analysis. Both risk-related cost and maintenance-related cost are factored into the methodology as either deterministic or random variables. This SSA based methodology enables Asset Management (AM) decisions to be optimised considering a variety of factors including failure probability, failure cost, maintenance cost, PM performance, and the type of PM strategy. The application of this new methodology and an evaluation of the effects of these factors on PM decisions are demonstrated using an example. The results of this work show that the performance of a PM strategy can be measured by its Total Expected Cost Index (TECI). The optimal PM interval is dependent on TECI, PM performance and types of PM strategies. These factors are interrelated. Generally, it was found that a trade-off between reliability and the number of PM actions needs to be made so that one can minimise Total Expected Cost (TEC) for asset maintenance.
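
The TEC-minimisation idea can be illustrated with the textbook age-replacement cost-rate model rather than the SSA itself: for Weibull-distributed failures, the long-run expected cost per unit time at PM interval T is (planned cost x survival + failure cost x failure probability) divided by the expected cycle length, scanned over candidate intervals. All parameters are illustrative assumptions:

```python
import math

def cost_rate(T, cp, cf, k, lam, steps=2000):
    """Long-run expected cost per unit time of age replacement at interval T:
    (cp*R(T) + cf*F(T)) / E[min(failure time, T)], failures ~ Weibull(k, lam)."""
    R = lambda t: math.exp(-((t / lam) ** k))
    h = T / steps
    # E[min(X, T)] = integral of R(t) from 0 to T (trapezoidal rule)
    area = sum((R(i * h) + R((i + 1) * h)) / 2 * h for i in range(steps))
    return (cp * R(T) + cf * (1 - R(T))) / area

# cp = 1 (planned PM), cf = 10 (failure), wear-out shape k = 2, scale lam = 1
best_T = min((t / 10 for t in range(1, 51)),
             key=lambda T: cost_rate(T, cp=1, cf=10, k=2, lam=1))
print(best_T)
```

As the abstract notes, the optimal interval trades reliability against the number of PM actions: a larger failure-to-PM cost ratio pushes the optimal interval down.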

Relevance: 60.00%

Abstract:

This project provides a costed and appraised set of management strategies for mitigating threats to species of conservation significance in the Pilbara IBRA bioregion of Western Australia (hereafter 'the Pilbara'). Conservation-significant species are those either listed under federal or state legislation or international agreements, or considered likely to become threatened in the next 20 years. Here we report on 17 technically and socially feasible management strategies, drawn from the collective experience and knowledge of 49 experts and stakeholders in the ecology and management of the Pilbara region. We determine the relative ecological cost-effectiveness of each strategy, calculated as the expected benefit of management to the persistence of 53 key threatened native fauna and flora species divided by the expected cost of management. Finally, we provide decision support to assist prioritisation of the strategies on the basis of ecological cost-effectiveness.
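
The core prioritisation step reduces to ranking strategies by expected benefit divided by expected cost; a minimal sketch with invented strategy names and numbers:

```python
# Hypothetical strategies with assumed expected benefits and costs ($M).
strategies = [
    ("feral cat control", {"benefit": 18.0, "cost": 4.5}),
    ("weed management",   {"benefit": 9.0,  "cost": 1.5}),
    ("fire management",   {"benefit": 12.0, "cost": 6.0}),
]
# Rank by ecological cost-effectiveness (benefit per unit cost), best first.
ranked = sorted(strategies,
                key=lambda s: s[1]["benefit"] / s[1]["cost"], reverse=True)
for name, v in ranked:
    print(f"{name}: CE = {v['benefit'] / v['cost']:.2f}")
```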

Relevance: 60.00%

Abstract:

The notion of being sure that you have completely eradicated an invasive species is fanciful because of imperfect detection and persistent seed banks. Eradication is commonly declared either on an ad hoc basis, on notions of seed bank longevity, or on setting arbitrary thresholds of 1% or 5% confidence that the species is not present. Rather than declaring eradication at some arbitrary level of confidence, we take an economic approach in which we stop looking when the expected costs outweigh the expected benefits. We develop theory that determines the number of years of absent surveys required to minimize the net expected cost. Given detection of a species is imperfect, the optimal stopping time is a trade-off between the cost of continued surveying and the cost of escape and damage if eradication is declared too soon. A simple rule of thumb compares well to the exact optimal solution using stochastic dynamic programming. Application of the approach to the eradication programme of Helenium amarum reveals that the actual stopping time was a precautionary one given the ranges for each parameter. © 2006 Blackwell Publishing Ltd/CNRS.
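
The stopping rule can be sketched as a myopic rule of thumb: after each absent survey, update the probability the species is still present by Bayes' rule, and stop once the annual survey cost exceeds the expected escape damage. All parameter values are invented, not those of the Helenium amarum programme:

```python
def years_to_survey(p_present, p_detect, survey_cost, damage, max_years=100):
    """Number of consecutive absent surveys after which stopping first becomes
    cheaper than continuing (a myopic rule of thumb, not the exact SDP)."""
    for year in range(max_years):
        # Bayes: probability the species is still present after another
        # survey that fails to detect it.
        miss = p_present * (1 - p_detect)
        p_present = miss / (miss + (1 - p_present))
        if survey_cost > p_present * damage:  # survey costs more than the risk
            return year + 1
    return max_years

print(years_to_survey(p_present=0.5, p_detect=0.6,
                      survey_cost=1.0, damage=100.0))
```

As the abstract argues, the stopping time lengthens with the cost of a premature declaration: raising the assumed damage tenfold adds further survey years.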

Relevance: 60.00%

Abstract:

This paper presents stylized models for conducting performance analysis of a manufacturing supply chain network (SCN) in a stochastic setting with batch ordering. We use queueing models to capture the behavior of the SCN, coupled with an inventory optimization model that can be used for designing inventory policies. In the first case, we model one manufacturer with one warehouse, which supplies to various retailers. We determine the optimal inventory level at the warehouse that minimizes the total expected cost of carrying inventory, the backorder cost associated with serving orders in the backlog queue, and the ordering cost. In the second model we impose a service level constraint in terms of fill rate (the probability that an order is filled from stock at the warehouse), assuming that customers do not balk from the system. We present several numerical examples to illustrate the model and its various features. In the third case, we extend the model to a three-echelon inventory model which explicitly considers the logistics process.
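
The warehouse trade-off has the flavour of a discrete newsvendor problem; a sketch that picks the base-stock level minimising expected holding plus backorder cost for Poisson lead-time demand (the costs and demand mean are assumptions, and the ordering-cost term is omitted):

```python
import math

def poisson_pmf(k, mean):
    return math.exp(-mean) * mean ** k / math.factorial(k)

def expected_cost(s, mean, h, b, k_max=60):
    """E[h*(s-D)+ + b*(D-s)+] for demand D ~ Poisson(mean); the tail beyond
    k_max is negligible for the means used here."""
    cost = 0.0
    for k in range(k_max):
        p = poisson_pmf(k, mean)
        cost += p * (h * (s - k) if k <= s else b * (k - s))
    return cost

mean, h, b = 10.0, 1.0, 9.0  # assumed demand mean, holding and backorder costs
best_s = min(range(40), key=lambda s: expected_cost(s, mean, h, b))
print(best_s)
```

The minimiser matches the classical critical-ratio rule: the smallest s whose Poisson CDF reaches b/(b + h) = 0.9.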

Relevance: 60.00%

Abstract:

- Objective To compare health service cost and length of stay between a traditional and an accelerated diagnostic approach to assessing acute coronary syndromes (ACS) among patients who presented to the emergency department (ED) of a large tertiary hospital in Australia. - Design, setting and participants This historically controlled study analysed data collected from two independent patient cohorts presenting to the ED with potential ACS. The first cohort of 938 patients was recruited in 2008–2010 and assessed using the traditional diagnostic approach detailed in the national guideline. The second cohort of 921 patients was recruited in 2011–2013 and assessed with the accelerated diagnostic approach known as the Brisbane protocol. The Brisbane protocol applied early serial troponin testing at 0 and 2 h after presentation to the ED, compared with 0 and 6 h testing in the traditional assessment process. The Brisbane protocol also defined a low-risk group of patients in whom no objective testing was performed. A decision tree model was used to compare the expected cost and length of stay in hospital between the two approaches. Probabilistic sensitivity analysis was used to account for model uncertainty. - Results Compared with the traditional diagnostic approach, the Brisbane protocol was associated with a reduced expected cost of $1229 (95% CI −$1266 to $5122) and a reduced expected length of stay of 26 h (95% CI −14 to 136 h). The Brisbane protocol allowed physicians to discharge a higher proportion of low-risk and intermediate-risk patients from the ED within 4 h (72% vs 51%). Results from sensitivity analysis suggested the Brisbane protocol had a high chance of being cost-saving and time-saving. - Conclusions This study provides some evidence of cost savings from a decision to adopt the Brisbane protocol. Benefits would arise for the hospital and for patients and their families.
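
The decision-tree comparison boils down to expected values over branch probabilities. A stripped-down sketch that reuses the 72% vs 51% discharge rates above as branch probabilities but invents the per-branch costs (the study's actual tree and cost inputs are richer):

```python
def expected_value(branches):
    """Expected value of a chance node: sum of probability * outcome."""
    assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9
    return sum(p * v for p, v in branches)

# (probability, cost in $) branches for each diagnostic pathway.
traditional = expected_value([(0.51, 2000.0),   # discharged after 6 h testing
                              (0.49, 9000.0)])  # admitted for further work-up
accelerated = expected_value([(0.72, 1500.0),   # discharged after 2 h testing
                              (0.28, 9000.0)])  # admitted
print(traditional - accelerated)  # expected saving per patient
```

Probabilistic sensitivity analysis, as used in the study, would redraw these probabilities and costs from distributions and report how often the saving stays positive.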

Relevance: 60.00%

Abstract:

This study examines intraday and weekend volatility on the German DAX. Intraday volatility is partitioned into smaller intervals and compared to a whole day's volatility. The estimated intraday variance is U-shaped, and the weekend variance is estimated at 19% of a normal trading day. The patterns in intraday and weekend volatility are used to develop an extension to the Black and Scholes formula with a new time basis. Calendar or trading days are commonly used for measuring time in option pricing. The Continuous Time using Discrete Approximations (CTDA) model developed in this study uses a measure of time with smaller intervals, approaching continuous time. The model accounts for the lapse of time during trading only. Arbitrage pricing suggests that the option price equals the expected cost of hedging volatility during the option's remaining life. In this model, time is allowed to lapse as volatility occurs on an intraday basis. The measure of time in CTDA is modified to correct for non-constant volatility and to account for these patterns in volatility.
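
The time-basis modification can be sketched against the standard Black and Scholes call formula: replace calendar time to expiry with an effective trading time in which a weekend contributes only 0.19 of a trading day (the variance estimate above). The option inputs, day counts and the simple weekend weighting are illustrative assumptions, not the CTDA model itself:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

def bs_call(S, K, r, sigma, T):
    """Standard Black and Scholes European call price, T in years."""
    d1 = (log(S / K) + (r + sigma ** 2 / 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

def effective_years(trading_days, weekends, weekend_weight=0.19):
    """Effective trading time: each weekend counts as 0.19 of a trading day."""
    return (trading_days + weekends * weekend_weight) / 252

S, K, r, sigma = 100.0, 100.0, 0.02, 0.25
calendar = bs_call(S, K, r, sigma, 30 / 365)               # naive calendar basis
trading = bs_call(S, K, r, sigma, effective_years(22, 4))  # ~one month traded
print(calendar, trading)
```

The point of the comparison is only that the two time measures yield different prices for the same contract; CTDA refines this further to intraday intervals with a U-shaped variance profile.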

Relevance: 60.00%

Abstract:

A "plan diagram" is a pictorial enumeration of the execution plan choices of a database query optimizer over the relational selectivity space. We have shown recently that, for industrial-strength database engines, these diagrams are often remarkably complex and dense, with a large number of plans covering the space. However, they can often be reduced to much simpler pictures, featuring significantly fewer plans, without materially affecting the query processing quality. Plan reduction has useful implications for the design and usage of query optimizers, including quantifying redundancy in the plan search space, enhancing usability of parametric query optimization, identifying error-resistant and least-expected-cost plans, and minimizing the overheads of multi-plan approaches. We investigate here the plan reduction issue from theoretical, statistical and empirical perspectives. Our analysis shows that optimal plan reduction, w.r.t. minimizing the number of plans, is an NP-hard problem in general, and remains so even for a storage-constrained variant. We then present a greedy reduction algorithm with tight and optimal performance guarantees, whose complexity scales linearly with the number of plans in the diagram for a given resolution. Next, we devise fast estimators for locating the best tradeoff between the reduction in plan cardinality and the impact on query processing quality. Finally, extensive experimentation with a suite of multi-dimensional TPC-H-based query templates on industrial-strength optimizers demonstrates that complex plan diagrams easily reduce to "anorexic" (small absolute number of plans) levels incurring only marginal increases in the estimated query processing costs.
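
A toy version of greedy plan reduction (not the paper's algorithm with its performance guarantees): retire a plan whenever every point in the diagram still has a surviving plan whose estimated cost is within a (1 + lam) factor of that point's optimal cost. The three-plan, two-point diagram is invented:

```python
def reduce_plans(costs, lam):
    """costs[plan][point] = estimated cost of `plan` at `point` (every plan
    costed at every point). Greedily retires plans while each point keeps a
    plan within a (1 + lam) factor of its optimal cost."""
    points = next(iter(costs.values())).keys()
    opt = {pt: min(c[pt] for c in costs.values()) for pt in points}
    survivors = set(costs)
    for plan in sorted(costs):          # try to retire each plan in turn
        others = survivors - {plan}
        if others and all(
                min(costs[o][pt] for o in others) <= (1 + lam) * opt[pt]
                for pt in points):
            survivors = others
    return survivors

diagram = {
    "A": {(1, 1): 10, (1, 2): 50},
    "B": {(1, 1): 11, (1, 2): 12},
    "C": {(1, 1): 30, (1, 2): 11},
}
print(sorted(reduce_plans(diagram, lam=0.2)))
```

With a 20% cost-increase threshold the diagram collapses to a single plan; with lam = 0 only the plan that is nowhere optimal can be retired.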

Relevance: 60.00%

Abstract:

How is climate change affecting our coastal environment? How can coastal communities adapt to sea level rise and increased storm risk? These questions have garnered tremendous interest from scientists and policy makers alike, as the dynamic coastal environment is particularly vulnerable to the impacts of climate change. Over half the world's population lives and works in a coastal zone less than 120 miles wide, and is thereby continuously affected by changes in the coastal environment [6]. Housing markets are directly influenced by the physical processes that govern coastal systems. Beach towns like Oak Island in North Carolina (NC) face severe erosion, and the tax-assessed value of one coastal property fell by 93% in 2007 [9]. With almost ninety percent of the sandy beaches in the US facing moderate to severe erosion [8], coastal communities often intervene to stabilize the shoreline and hold back the sea in order to protect coastal property and infrastructure. Beach nourishment, the process of rebuilding a beach by periodically replacing an eroding section with sand dredged from another location, is a policy for erosion control in many parts of the US Atlantic and Pacific coasts [3]. Beach nourishment projects in the United States are primarily federally funded and implemented by the Army Corps of Engineers (ACE) after a benefit-cost analysis. Benefits from beach nourishment include reduction in storm damage and recreational benefits from a wider beach. Costs include the expected cost of construction, the present value of periodic maintenance, and any external cost such as the environmental cost associated with a nourishment project (NOAA). Federal appropriations for nourishment totaled $787 million from 1995 to 2002 [10]. Human interventions to stabilize shorelines and physical coastal dynamics are strongly coupled.
The value of the beach, in the form of storm protection and recreation amenities, is at least partly capitalized into property values. These beach values ultimately influence the benefit-cost analysis in support of shoreline stabilization policy, which, in turn, affects the shoreline dynamics. This paper explores the policy implications of this circularity. With a better understanding of the physical-economic feedbacks, policy makers can more effectively design climate change adaptation strategies. (PDF contains 4 pages)