916 results for market monitoring costs
Abstract:
This work illustrates the influence of wind forecast errors on system costs, wind curtailment and generator dispatch in a system with high wind penetration. Realistic wind forecasts of different specified accuracy levels are created using an auto-regressive moving average model and these are then used in the creation of day-ahead unit commitment schedules. The schedules are generated for a model of the 2020 Irish electricity system with 33% wind penetration using both stochastic and deterministic approaches. Improvements in wind forecast accuracy are demonstrated to deliver: (i) clear savings in total system costs for deterministic and, to a lesser extent, stochastic scheduling; (ii) a decrease in the level of wind curtailment, with close agreement between stochastic and deterministic scheduling; and (iii) a decrease in the dispatch of open cycle gas turbine generation, evident with deterministic, and to a lesser extent, with stochastic scheduling.
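The ARMA approach described above can be sketched in a few lines. This is a minimal illustration, not the paper's calibrated model: the AR and MA coefficients and the target error standard deviation (here 5% of installed capacity) are invented for the example, and the series is simply rescaled to hit the specified accuracy level.

```python
import random

random.seed(0)

def arma_forecast_errors(n_hours, phi=0.9, theta=0.3, target_std=0.05):
    """Simulate an ARMA(1,1) series and rescale it to a target standard
    deviation, representing day-ahead wind forecast errors as a fraction
    of installed capacity. phi/theta/target_std are illustrative values."""
    eps = [random.gauss(0.0, 1.0) for _ in range(n_hours)]
    e = [0.0] * n_hours
    for t in range(1, n_hours):
        e[t] = phi * e[t - 1] + eps[t] + theta * eps[t - 1]
    # Rescale so the simulated errors match the specified accuracy level.
    mean = sum(e) / n_hours
    std = (sum((v - mean) ** 2 for v in e) / n_hours) ** 0.5
    return [v * target_std / std for v in e]

# One year of hourly forecast errors; a synthetic "forecast" would then be
# the actual wind output plus these errors.
errors = arma_forecast_errors(24 * 365)
```

Varying `target_std` gives the families of forecasts of different specified accuracy that feed the unit commitment schedules.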
Abstract:
Policy makers and analysts are often faced with situations where it is unclear whether market-based instruments hold real promise of reducing costs, relative to conventional uniform standards. We develop analytic expressions that can be employed with modest amounts of information to estimate the potential cost savings associated with market-based policies, with an application to the environmental policy realm. These simple formulae can identify instruments that merit more detailed investigation. We illustrate the use of these results with an application to nitrogen oxides control by electric utilities in the United States.
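The intuition behind such cost-savings formulae is that a market-based instrument equalizes marginal abatement costs across sources, while a uniform standard does not. A minimal numerical sketch with two sources and linear marginal abatement costs (all numbers invented for illustration, not taken from the NOx application):

```python
# Two sources with linear marginal abatement costs MAC_i(q) = a_i * q must
# jointly abate Q units; abatement cost is the area under MAC: a_i * q**2 / 2.
a1, a2, Q = 2.0, 6.0, 100.0

# Uniform standard: each source abates Q/2, regardless of cost differences.
cost_uniform = a1 * (Q / 2) ** 2 / 2 + a2 * (Q / 2) ** 2 / 2

# Market-based instrument: abatement reallocates until marginal costs are
# equal, a1*q1 = a2*q2 with q1 + q2 = Q  =>  q1 = Q * a2 / (a1 + a2).
q1 = Q * a2 / (a1 + a2)
q2 = Q - q1
cost_market = a1 * q1 ** 2 / 2 + a2 * q2 ** 2 / 2

savings = cost_uniform - cost_market   # potential gain from the instrument
```

The size of `savings` relative to `cost_uniform` depends only on how dissimilar the cost slopes are, which is why modest amounts of information (estimates of the `a_i`) can flag which instruments merit detailed investigation.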
Abstract:
To maintain a strict balance between demand and supply in the US power systems, the Independent System Operators (ISOs) schedule power plants and determine electricity prices using a market clearing model. For each time period and power plant, this model determines the times of startup and shutdown, the amount of power production, and the provisioning of spinning and non-spinning power generation reserves. Such a deterministic optimization model takes as input the characteristics of all the generating units, such as their installed generation capacity, ramp rates, minimum up and down time requirements, and marginal production costs, as well as the forecast of intermittent energy such as wind and solar, along with the minimum reserve requirement of the whole system. This reserve requirement is determined based on the likelihood of outages on the supply side and on the levels of forecast error in demand and intermittent generation. With increased installed capacity of intermittent renewable energy, determining the appropriate level of reserve requirements has become harder. Stochastic market clearing models have been proposed as an alternative to deterministic market clearing models. Rather than using fixed reserve targets as an input, stochastic market clearing models take different scenarios of wind power into consideration and determine reserve schedules as output. Using a scaled version of the power generation system of PJM, a regional transmission organization (RTO) that coordinates the movement of wholesale electricity in all or parts of 13 states and the District of Columbia, and wind scenarios generated from BPA (Bonneville Power Administration) data, this paper compares the performance of stochastic and deterministic models in market clearing.
The two models are compared in their ability to contribute to the affordability, reliability and sustainability of the electricity system, measured in terms of total operational costs, load shedding and air emissions. The process of building the models and running tests indicates that a fair comparison is difficult to obtain, due to the multi-dimensional performance metrics considered here and the difficulty of setting up the parameters of the models in a way that does not advantage or disadvantage one modeling framework. Along these lines, this study explores the effect that model assumptions such as reserve requirements, value of lost load (VOLL) and wind spillage costs have on the comparison of the performance of stochastic versus deterministic market clearing models.
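The core difference between the two clearing approaches can be shown with a toy two-stage example. This is an illustrative sketch, not the PJM/BPA model of the paper: one slow unit must be committed day-ahead (with a no-load cost), a fast peaker covers any real-time shortfall, and all costs, capacities and wind scenarios are invented numbers.

```python
# Day-ahead commitment of a slow unit: 5 $/MW no-load cost, 20 $/MWh energy.
# Real-time shortfalls are met by a fast peaker at 100 $/MWh.
load = 100.0
wind_scenarios = [(0.3, 60.0), (0.4, 40.0), (0.3, 20.0)]  # (probability, MW)
point_forecast = sum(p * w for p, w in wind_scenarios)    # expected wind, 40 MW

def expected_cost(committed):
    """Expected real-time cost for a given day-ahead commitment level (MW)."""
    cost = 5.0 * committed  # no-load cost of keeping capacity committed
    for p, wind in wind_scenarios:
        residual = load - wind
        cost += p * (20.0 * min(committed, residual)      # slow unit output
                     + 100.0 * max(residual - committed, 0.0))  # peaker fills gap
    return cost

# Deterministic clearing trusts the point forecast; stochastic clearing
# minimises expected cost over the wind scenarios directly.
b_det = load - point_forecast
b_sto = min(range(0, 101), key=expected_cost)

# Hedging against the low-wind scenario means the stochastic schedule's
# expected cost can never exceed the deterministic one's.
assert expected_cost(b_sto) <= expected_cost(b_det)
```

Here the stochastic schedule commits extra slow capacity as implicit reserve; in the deterministic model that reserve must instead be imposed as a fixed target, which is exactly the parameter-setting difficulty the abstract describes.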
Much Ado About Nothing: The Limitation of Liability and the Market for 19th century Irish Bank Stock
Abstract:
Limited liability is widely believed to be a prerequisite for the emergence of an active and liquid securities market because the transactions costs associated with trading ownership of unlimited liability firms are viewed as prohibitive. In this article, we examine the trading of shares in an Irish bank, which limited its liability in 1883. Using this bank’s archives, we assemble a time series of trading data, which we test for structural breaks. Our results suggest that the move to limited liability had a negligible impact upon the trading of this bank’s shares.
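A structural break test of the kind used above can be sketched with a Chow-type F statistic for a single mean shift at a known event date (here the 1883 move to limited liability). The data below are synthetic stand-ins, not the bank's archival series:

```python
import random

random.seed(1)

# Hypothetical weekly trade counts: 100 observations before and after a
# candidate break date, with a deliberate level shift for illustration.
before = [random.gauss(50.0, 5.0) for _ in range(100)]
after = [random.gauss(80.0, 5.0) for _ in range(100)]
y = before + after

def rss(seg):
    """Residual sum of squares around the segment mean."""
    m = sum(seg) / len(seg)
    return sum((v - m) ** 2 for v in seg)

def chow_stat(y, k):
    """Chow-type F statistic for a mean shift at index k: compares one
    common mean (restricted) against separate means before/after (one
    restriction, n - 2 residual degrees of freedom)."""
    restricted = rss(y)
    unrestricted = rss(y[:k]) + rss(y[k:])
    return (restricted - unrestricted) / (unrestricted / (len(y) - 2))

f = chow_stat(y, 100)   # large F => reject "no break" at the event date
```

A negligible impact of limited liability, as the article finds, would correspond to an F statistic near its null distribution rather than the large value this synthetic shift produces.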
Abstract:
We present a simple framework in which both the exchange rate disconnect and forward bias puzzles are simultaneously resolved. The flexible-price two-country monetary model is extended to include a consumption externality with habit persistence. Habit persistence is modeled using Campbell–Cochrane preferences with ‘deep’ habits along the lines of the work of Ravn, Schmitt-Grohé and Uribe. By deep habits, we mean habits defined over goods rather than countries. The model is simulated using the artificial economy methodology. It offers a neo-classical explanation of the Meese–Rogoff puzzle and mimics the failure of fundamentals to explain nominal exchange rates in a linear setting. Finally, the model naturally generates the negative slope in the standard forward market regression.
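For reference, the standard deep-habits formulation of Ravn, Schmitt-Grohé and Uribe takes the following form (a sketch of the usual specification, which may differ in detail from the one used in this paper):

```latex
x_t = \left[\int_0^1 \left(c_{it} - \theta\, s_{i,t-1}\right)^{1-\frac{1}{\eta}} di\right]^{\frac{1}{1-\frac{1}{\eta}}},
\qquad
s_{it} = \rho\, s_{i,t-1} + (1-\rho)\, c_{it},
```

where habits $s_{it}$ are formed over individual goods $i$ (hence "deep") rather than over aggregate consumption, $\theta$ governs habit strength, $\rho$ habit persistence, and $\eta$ the intratemporal elasticity of substitution.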
Abstract:
In this paper we provide a detailed profile and analysis of the regional risk capital market in Scotland, using an innovative methodology and specially developed databases which cover risk capital investment in young companies in the periods 2000–04 and 2005–07. This identifies the investment activity of all actors in the market and provides estimates of the total flow of risk capital investment into early-stage Scottish companies over the period. The paper concludes by drawing out the implications for policy makers (providing a more robust evidence base for the development, implementation and monitoring of policy) and for academic researchers (on the methodologies for estimating market scale and efficiency).
Abstract:
Wind energy has been identified as key to the European Union’s 2050 low carbon economy. However, as wind is a variable resource and stochastic by nature, it is difficult to plan and schedule the power system under varying wind power generation. This paper investigates the impacts of offshore wind power forecast error on the operation and management of a pool-based electricity market in 2050. The impact of the magnitude and variance of the offshore wind power forecast error on system generation costs, emission costs, dispatch-down of wind, number of start-ups and system marginal price is analysed. The main findings of this research are that the magnitude of the offshore wind power forecast error has the largest impact on system generation costs and dispatch-down of wind, but the variance of the offshore wind power forecast error has the biggest impact on emissions costs and system marginal price. Overall, offshore wind power forecast error variance results in a system marginal price increase of 9.6% in 2050.
Abstract:
This paper studies disinflationary shocks in a non-linear New Keynesian model with search and matching frictions and moral hazard in the labor markets. Our focus is on understanding the wage formation process as well as welfare costs of disinflations in the presence of such labor market frictions.
The presence of imperfect information in labor markets imposes a lower bound on worker surplus that varies endogenously. Consequently, equilibrium can take two forms, depending on whether the no-shirking condition is binding or not. We also evaluate both regimes from a welfare perspective when the economy is subject to a perfectly credible disinflationary shock.
Abstract:
Polymer extrusion is regarded as an energy-intensive production process, and the real-time monitoring of both energy consumption and melt quality has become necessary to meet new carbon regulations and survive in the highly competitive plastics market. The use of a power meter is a simple and easy way to monitor energy, but the cost can sometimes be high. On the other hand, viscosity is regarded as one of the key indicators of melt quality in the polymer extrusion process. Unfortunately, viscosity cannot be measured directly using current sensory technology. The employment of on-line, in-line or off-line rheometers is sometimes useful, but these instruments either involve signal delay or cause flow restrictions to the extrusion process, which is obviously not suitable for real-time monitoring and control in practice. In this paper, simple and accurate real-time energy monitoring methods are developed. This is achieved by looking inside the controller, and using control variables to calculate the power consumption. For viscosity monitoring, a ‘soft-sensor’ approach based on an RBF neural network model is developed. The model is obtained through a two-stage selection and differential evolution, enabling compact and accurate solutions for viscosity monitoring. The proposed monitoring methods were tested and validated on a Killion KTS-100 extruder, and the experimental results show high accuracy compared with traditional monitoring approaches.
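The ‘soft-sensor’ idea — estimating an unmeasurable quality variable from easy-to-measure process signals via an RBF network — can be sketched as follows. This is a minimal illustration: the single synthetic input, the surrogate "viscosity" curve, and the fixed grid of centres are all invented, and the paper's two-stage selection with differential evolution for choosing the network structure is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic process input (stand-in for a measured extruder variable) and a
# smooth surrogate target standing in for melt viscosity.
x = rng.uniform(-1.0, 1.0, size=200)
y = np.sin(3.0 * x)

centres = np.linspace(-1.0, 1.0, 12)   # fixed Gaussian centres (illustrative)
width = 0.25

def hidden(x):
    """Gaussian hidden-layer activations, one column per centre."""
    return np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2 * width ** 2))

# With centres and width fixed, the output weights of an RBF network are a
# linear least-squares problem on the hidden-layer activations.
H = hidden(x)
w, *_ = np.linalg.lstsq(H, y, rcond=None)
rmse = float(np.sqrt(np.mean((H @ w - y) ** 2)))
```

The appeal for real-time monitoring is that, once trained, a prediction is just one matrix-vector product per sample, with none of the signal delay or flow restriction of in-line rheometers.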
Abstract:
This paper investigates the impacts of offshore wind power forecast error on the operation and management of a pool-based electricity market in 2050. The impact of offshore wind power forecast errors of up to 2000 MW on system generation costs, emission costs, dispatch-down of wind, number of start-ups and system marginal price is analysed. The main finding of this research is an increase in system marginal prices of approximately 1% for every percentage point rise in the offshore wind power forecast error, regardless of the sign of the average forecast error. If offshore wind power generates less than forecast (−13%), generation costs and system marginal prices increase by 10%. However, if offshore wind power generates more than forecast (4%), generation costs decrease, yet system marginal prices increase by 3%. The dispatch-down of large quantities of wind power highlights the need for flexible interconnector capacity. From a system operator's perspective, it is more beneficial, when scheduling wind ahead of the trading period, to forecast less wind than will be generated.
Abstract:
BACKGROUND: Age-related macular degeneration is the most common cause of sight impairment in the UK. In neovascular age-related macular degeneration (nAMD), vision worsens rapidly (over weeks) due to abnormal blood vessels developing that leak fluid and blood at the macula.
OBJECTIVES: To determine the optimal role of optical coherence tomography (OCT) in diagnosing people newly presenting with suspected nAMD and monitoring those previously diagnosed with the disease.
DATA SOURCES: Databases searched: MEDLINE (1946 to March 2013), MEDLINE In-Process & Other Non-Indexed Citations (March 2013), EMBASE (1988 to March 2013), Biosciences Information Service (1995 to March 2013), Science Citation Index (1995 to March 2013), The Cochrane Library (Issue 2 2013), Database of Abstracts of Reviews of Effects (inception to March 2013), Medion (inception to March 2013), Health Technology Assessment database (inception to March 2013).
REVIEW METHODS: Types of studies: direct/indirect studies reporting diagnostic outcomes.
INDEX TEST: time domain optical coherence tomography (TD-OCT) or spectral domain optical coherence tomography (SD-OCT).
COMPARATORS: clinical evaluation, visual acuity, Amsler grid, colour fundus photographs, infrared reflectance, red-free images/blue reflectance, fundus autofluorescence imaging, indocyanine green angiography, preferential hyperacuity perimetry, microperimetry. Reference standard: fundus fluorescein angiography (FFA). Risk of bias was assessed using quality assessment of diagnostic accuracy studies, version 2. Meta-analysis models were fitted using hierarchical summary receiver operating characteristic curves. A Markov model was developed (65-year-old cohort, nAMD prevalence 70%), with nine strategies for diagnosis and/or monitoring, and cost-utility analysis conducted. NHS and Personal Social Services perspective was adopted. Costs (2011/12 prices) and quality-adjusted life-years (QALYs) were discounted (3.5%). Deterministic and probabilistic sensitivity analyses were performed.
RESULTS: In pooled estimates of diagnostic studies (all TD-OCT), sensitivity and specificity [95% confidence interval (CI)] were 88% (46% to 98%) and 78% (64% to 88%), respectively. For monitoring, the pooled sensitivity and specificity (95% CI) were 85% (72% to 93%) and 48% (30% to 67%), respectively. The FFA for diagnosis and nurse-technician-led monitoring strategy had the lowest cost (£39,769; QALYs 10.473) and dominated all others except FFA for diagnosis and ophthalmologist-led monitoring (£44,649; QALYs 10.575; incremental cost-effectiveness ratio £47,768). The least costly strategy had a 46.4% probability of being cost-effective at £30,000 willingness-to-pay threshold.
LIMITATIONS: Very few studies provided sufficient information for inclusion in meta-analyses. Only a few studies reported other tests; for some tests no studies were identified. The modelling was hampered by a lack of data on the diagnostic accuracy of strategies involving several tests.
CONCLUSIONS: Based on a small body of evidence of variable quality, OCT had high sensitivity and moderate specificity for diagnosis, and relatively high sensitivity but low specificity for monitoring. Strategies involving OCT alone for diagnosis and/or monitoring were unlikely to be cost-effective. Further research is required on (i) the performance of SD-OCT compared with FFA, especially for monitoring but also for diagnosis; (ii) the performance of strategies involving combinations/sequences of tests, for diagnosis and monitoring; (iii) the likelihood of active and inactive nAMD becoming inactive or active respectively; and (iv) assessment of treatment-associated utility weights (e.g. decrements), through a preference-based study.
STUDY REGISTRATION: This study is registered as PROSPERO CRD42012001930.
FUNDING: The National Institute for Health Research Health Technology Assessment programme.
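The mechanics of the Markov cost-utility model described above can be sketched in miniature. The 70% starting prevalence of nAMD and the 3.5% discount rate come from the abstract; the three-state structure, transition probabilities, annual costs and utility weights below are illustrative placeholders, not the report's calibrated inputs.

```python
# Three-state annual-cycle Markov cohort sketch: active nAMD, inactive nAMD,
# dead (absorbing). All transition/cost/utility numbers are invented.
P = {
    "active":   {"active": 0.70, "inactive": 0.25, "dead": 0.05},
    "inactive": {"active": 0.15, "inactive": 0.80, "dead": 0.05},
    "dead":     {"active": 0.00, "inactive": 0.00, "dead": 1.00},
}
cost = {"active": 4000.0, "inactive": 800.0, "dead": 0.0}      # GBP per year
utility = {"active": 0.65, "inactive": 0.80, "dead": 0.0}      # QALY weights

x = {"active": 0.70, "inactive": 0.30, "dead": 0.0}  # 70% nAMD prevalence
total_cost = total_qaly = 0.0
for t in range(30):                      # 30 annual cycles
    d = 1.035 ** t                       # 3.5% discounting, as in the report
    total_cost += sum(x[s] * cost[s] for s in x) / d
    total_qaly += sum(x[s] * utility[s] for s in x) / d
    x = {s2: sum(x[s1] * P[s1][s2] for s1 in x) for s2 in x}
```

Running such a model once per strategy yields the cost/QALY pairs from which incremental cost-effectiveness ratios like the £47,768 figure above are computed.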
Abstract:
Bioresorbable polymers such as PLA have an important role to play in the development of temporary implantable medical devices with significant benefits over traditional therapies. However, development of new devices is hindered by high manufacturing costs associated with difficulties in processing the material. A major problem is the lack of insight on material degradation during processing. In this work, a method of quantifying degradation of PLA using IR spectroscopy coupled with computational chemistry and chemometric modeling is examined. It is shown that the method can predict the quantity of degradation products in solid-state samples with reasonably good accuracy, indicating the potential to adapt the method to developing an on-line sensor for monitoring PLA degradation in real-time during processing.
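The calibration step of such a spectroscopic method can be illustrated with a classical least squares projection. This is a deliberately simplified stand-in for the paper's chemometric model: the synthetic spectra, the Gaussian band at 1750 cm⁻¹ (a carbonyl-like peak), and the noise level are all invented, and with real spectra a multivariate method such as PLS would be the usual choice.

```python
import math, random

random.seed(3)

# Synthetic IR spectra: a degradation-product band scaled by concentration
# plus instrument noise, over a 1500-1800 cm^-1 window.
wavenumbers = range(1500, 1801)
band = [math.exp(-((w - 1750) ** 2) / (2 * 15.0 ** 2)) for w in wavenumbers]
bb = sum(b * b for b in band)            # squared norm of the pure band

true_conc = [random.uniform(0.0, 1.0) for _ in range(50)]
spectra = [[c * b + random.gauss(0.0, 0.01) for b in band] for c in true_conc]

# Classical least squares: project each spectrum onto the known band shape
# to recover the degradation-product concentration.
pred = [sum(s_i * b_i for s_i, b_i in zip(s, band)) / bb for s in spectra]
rmse = math.sqrt(sum((c - p) ** 2
                     for c, p in zip(true_conc, pred)) / len(pred))
```

Because each prediction is a single dot product, an approach of this family is cheap enough to run continuously, which is what makes the on-line degradation sensor envisaged in the abstract plausible.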
Abstract:
OBJECTIVE: To assess the efficiency of alternative monitoring services for people with ocular hypertension (OHT), a glaucoma risk factor.
DESIGN: Discrete event simulation model comparing five alternative care pathways: treatment at OHT diagnosis with minimal monitoring; biennial monitoring (primary and secondary care) with treatment if baseline predicted 5-year glaucoma risk is ≥6%; monitoring and treatment aligned to National Institute for Health and Care Excellence (NICE) glaucoma guidance (conservative and intensive).
SETTING: UK health services perspective.
PARTICIPANTS: Simulated cohort of 10 000 adults with OHT (mean intraocular pressure (IOP) 24.9 mm Hg (SD 2.4)).
MAIN OUTCOME MEASURES: Costs, glaucoma detected, quality-adjusted life years (QALYs).
RESULTS: Treating at diagnosis was the least costly and least effective in avoiding glaucoma and progression. Intensive monitoring following NICE guidance was the most costly and effective. However, considering a wider cost-utility perspective, biennial monitoring was less costly and provided more QALYs than NICE pathways, but was unlikely to be cost-effective compared with treating at diagnosis (£86 717 per additional QALY gained). The findings were robust to risk thresholds for initiating monitoring but were sensitive to treatment threshold, National Health Service costs and treatment adherence.
CONCLUSIONS: For confirmed OHT, glaucoma monitoring more frequently than every 2 years is unlikely to be efficient. Primary treatment and minimal monitoring (assessing treatment responsiveness (IOP)) could be considered; however, further data to refine glaucoma risk prediction models and to value patient preferences for treatment are needed. Consideration of innovative and affordable service redesign focused on treatment responsiveness rather than more glaucoma testing is recommended.
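The skeleton of a discrete event simulation like the one above can be written with an event queue. The biennial visit interval, 10 000-patient cohort and the ≥6% 5-year risk figure come from the abstract (a 1.2% annual hazard gives roughly a 6% 5-year risk); the exponential conversion times, the 20-year horizon and the detection logic are simplifying assumptions for illustration only.

```python
import heapq, random

random.seed(4)

ANNUAL_RATE = 0.012      # ~6% 5-year glaucoma risk, as an annual hazard
VISIT_EVERY = 2.0        # biennial monitoring, in years
HORIZON = 20.0           # simulation horizon, in years (illustrative)
N = 10_000               # simulated cohort size

events = []              # priority queue of (time, patient_id, kind)
converted_at = {}        # converted but not yet detected
delays = []              # years from conversion to detection

for i in range(N):
    heapq.heappush(events, (random.expovariate(ANNUAL_RATE), i, "convert"))
    heapq.heappush(events, (VISIT_EVERY, i, "visit"))

while events:
    t, i, kind = heapq.heappop(events)
    if t > HORIZON:
        break                         # queue is time-ordered: nothing left
    if kind == "convert":
        converted_at[i] = t
    elif i in converted_at:           # monitoring visit detects conversion
        delays.append(t - converted_at.pop(i))
    else:                             # negative visit: book the next one
        heapq.heappush(events, (t + VISIT_EVERY, i, "visit"))
```

Attaching costs to visits and QALY decrements to undetected disease-years, and swapping in the alternative care pathways, turns this skeleton into the kind of pathway comparison the study reports.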
Abstract:
Energy-using Products (EuPs) contribute significantly to the United Kingdom’s CO2 emissions, both in the domestic and non-domestic sectors. Policies that encourage the use of more energy efficient products (such as minimum performance standards, energy labelling, enhanced capital allowances, etc.) can therefore generate significant reductions in overall energy consumption and hence, CO2 emissions. While these policies can impose costs on the producers and consumers of these products in the short run, the process of product innovation may reduce the magnitude of these costs over time. If this is the case, then it is important that the impacts of innovation are taken into account in policy impact assessments. Previous studies have found considerable evidence of experience curve effects for EuP categories (e.g. refrigerators, televisions, etc.), with learning rates of around 20% for both average unit costs and average prices; similar to those found for energy supply technologies. Moreover, the decline in production costs has been accompanied by a significant improvement in the energy efficiency of EuPs. Building on these findings and the results of an empirical analysis of UK sales data for a range of product categories, this paper sets out an analytic framework for assessing the impact of EuP policy interventions on consumers and producers which takes explicit account of the product innovation process. The impact of the product innovation process can be seen in the continuous evolution of the energy class profiles of EuP categories over time; with higher energy classes (e.g. A, A+, etc.) entering the market and increasing their market share, while lower classes (e.g. E, F, etc.) lose share and then leave the market. Furthermore, the average prices of individual energy classes have declined over their respective lives, while new classes have typically entered the market at successively lower “launch prices”. 
Based on two underlying assumptions regarding the shapes of the “lifecycle profiles” for the relative sales and the relative average mark-ups of individual energy classes, a simple simulation model is developed that can replicate the observed market dynamics in terms of the evolution of market shares and average prices. The model is used to assess the effect of two alternative EuP policy interventions – a minimum energy performance standard and an energy-labelling scheme – on the average unit cost trajectory and the average price trajectory of a typical EuP category, and hence the financial impacts on producers and consumers.
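The experience-curve effect cited above has a compact closed form: with a learning rate of 20%, average unit cost falls by 20% with each doubling of cumulative production. A one-function sketch (the 20% rate is quoted in the text; the initial cost and volume are illustrative):

```python
import math

def unit_cost(cum_units, c0=100.0, first=1000.0, learning_rate=0.20):
    """Experience curve: cost = c0 * (cumulative / first) ** b, where the
    progress exponent b = log2(1 - learning_rate) is negative, so each
    doubling of cumulative output multiplies cost by (1 - learning_rate)."""
    b = math.log2(1.0 - learning_rate)
    return c0 * (cum_units / first) ** b

assert abs(unit_cost(2000.0) - 80.0) < 1e-6   # one doubling  -> 80% of cost
assert abs(unit_cost(4000.0) - 64.0) < 1e-6   # two doublings -> 64%
```

In a policy impact assessment, feeding the simulated sales trajectory of each energy class through such a curve is one way to let the product innovation process reduce the estimated compliance costs over time.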