888 results for Congestion pricing
Abstract:
Query processing over the Internet involving autonomous data sources is a major task in data integration. It requires estimating the costs of candidate queries in order to select the one with the minimum cost. In this context, the cost of a query is affected by three factors: network congestion, server contention state, and the complexity of the query. In this paper, we study the effects of both network congestion and server contention state on the cost of a query. We refer to these two factors together as system contention states. We present a new approach to determining the system contention states by clustering the costs of a sample query. For each system contention state, we construct two cost formulas, for unary and join queries respectively, using multiple regression. When a new query is submitted, its system contention state is first estimated using either the time slides method or the statistical method. The cost of the query is then calculated using the corresponding cost formulas. The estimated cost is further adjusted to improve its accuracy. Our experiments show that these methods produce quite accurate cost estimates for queries submitted to remote data sources over the Internet.
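The pipeline this abstract describes can be sketched in two steps: cluster the observed costs of a probe query to identify contention states, then fit a per-state linear cost formula by least squares. This is a minimal illustration, not the paper's actual algorithm; the function names, the 1-D k-means choice, and the single-predictor cost model are assumptions.

```python
# Sketch: derive "system contention states" by clustering probe-query costs
# (1-D k-means), then fit a per-state cost formula cost = a + b * x by
# ordinary least squares. Names and model form are illustrative only.
from statistics import mean

def kmeans_1d(values, k, iters=100):
    """Cluster scalar costs into k contention states; returns centers, clusters."""
    vs = sorted(values)
    centers = [vs[i * (len(vs) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            clusters[min(range(k), key=lambda i: abs(v - centers[i]))].append(v)
        centers = [mean(c) if c else centers[i] for i, c in enumerate(clusters)]
    return centers, clusters

def fit_linear(xs, ys):
    """Least-squares fit of cost = a + b * x for one contention state."""
    mx, my = mean(xs), mean(ys)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b  # intercept a, slope b

# Probe-query costs observed under two very different load levels:
sample_costs = [1.0, 1.1, 0.9, 5.0, 5.2, 4.8]
centers, _ = kmeans_1d(sample_costs, k=2)
```

A new query would then be assigned to the nearest center and costed with that state's fitted formula.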
Abstract:
A new configurable architecture is presented that offers multiple levels of video playback by accommodating variable levels of network utilization and bandwidth. By utilizing scalable MPEG-4 encoding at the network edge and specific video delivery protocols, media streaming components are merged to fully optimize video playback for IPv6 networks, thus improving QoS. This is achieved by introducing “programmable network functionality” (PNF), which splits layered video transmission and distributes it evenly over the available bandwidth, reducing the packet loss and delay caused by out-of-profile DiffServ classes. An FPGA design is presented which improves performance, e.g. link utilization and end-to-end delay, and which during congestion improves on-time delivery of video frames by up to 80% compared with current “static” DiffServ.
Abstract:
A key issue in the design of next-generation Internet routers and switches will be the provision of traffic manager (TM) functionality in the datapaths of their high-speed switching fabrics. A new architecture that allows dynamic deployment of different TM functions is presented. By considering the processing requirements of operations such as policing and congestion control, queuing, shaping, and scheduling, a solution has been derived that is scalable and offers a consistent programmable interface. Programmability is achieved using a function computation unit which determines the action (e.g. drop, queue, remark, forward) based on the packet attribute information, together with a memory storage part. Results of a Xilinx Virtex-5 FPGA reference design are presented.
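The "function computation unit" idea above amounts to a programmable match-action step: packet attributes are evaluated against deployed rules and mapped to a TM action. A minimal software sketch follows; the attribute names and rule format are assumptions for illustration, not the paper's hardware design.

```python
# Illustrative match-action sketch of a programmable TM function unit:
# rules map packet attributes to an action (drop, queue, remark, forward).
RULES = [
    # (predicate, action) pairs, evaluated in deployment order.
    (lambda p: p["queue_depth"] > 0.9, "drop"),    # congestion: police hard
    (lambda p: p["rate_excess"], "remark"),        # out of profile: downgrade class
    (lambda p: p["class"] == "EF", "forward"),     # expedited traffic bypasses queue
]

def compute_action(pkt, rules=RULES, default="queue"):
    """Return the first matching action for this packet's attributes."""
    for predicate, action in rules:
        if predicate(pkt):
            return action
    return default
```

Swapping the rule list at runtime is what "dynamic deployment of different TM functions" would correspond to in this sketch.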
Abstract:
Using a unique high-frequency data-set on a comprehensive sample of Greek blue-chip stocks, spanning September 2003 through March 2006, this note assesses the extent and role of commonality in returns, order flows (OFs), and liquidity. It also formally models aggregate equity returns in terms of aggregate equity OF, in an effort to clarify OF's importance in explaining returns for the Athens Exchange market. Almost a quarter of the daily return variation in the FTSE/ATHEX20 index is explained by its own aggregate OF. In a second step, using principal components and canonical correlation analyses, we document substantial common movements in returns, OFs, and liquidity, both on a market-wide basis and on an individual-security basis. These results emphasize that asset pricing and liquidity cannot be analyzed in isolation from each other.
Abstract:
This paper adds to the growing literature on market competition involving remanufacturers by analyzing a model in which the remanufacturer and the manufacturer collaborate with each other in the same channel. The paper investigates a single-period deterministic model, which keeps the analysis simple so as to obtain sharper insights. The results characterize the optimal remanufacturing and pricing strategies for the remanufacturer and the manufacturer in the collaborative model.
Abstract:
The increasing risks and costs of new product development require firms to collaborate with their supply chain partners in product management. In this paper, a supply chain model is proposed with one risk-neutral supplier and one risk-averse manufacturer. The manufacturer has an opportunity to enhance demand by developing a new product, but both the actual demand for the new product and the supplier’s wholesale price are uncertain. The supplier has an incentive to share the risks of new product development via an advance commitment to the wholesale price for its own profit maximization. The effects of the manufacturer’s risk sensitivity on the players’ optimal strategies are analyzed, and the trade-off between innovation incentives and pricing flexibility is investigated from the perspective of the supplier. The results highlight the significant role of risk sensitivity in collaborative new product development: the manufacturer’s innovation level and retail price are always decreasing in the risk sensitivity, and the supplier prefers commitment to the wholesale price only when the risk sensitivity is below a certain threshold.
Abstract:
This article examines the impact of pension deficits on default risk as measured by the premia on corporate credit default swaps (CDS). We find highly significant evidence that unfunded pension liabilities raise one- and five-year CDS premia. However, this relation is not homogeneous across countries, with the U.S. CDS market leading its European counterparts in the pricing of defined-benefit pension risk.
Abstract:
Increasingly, infrastructure providers are supplying the cloud marketplace with storage and on-demand compute resources to host cloud applications. From an application user's point of view, it is desirable to identify the most appropriate set of available resources on which to execute an application. Resource choice can be complex and may involve comparing available hardware specifications, operating systems, value-added services such as network configuration or data replication, and operating costs such as hosting cost and data throughput. Providers' cost models often change, and new commodity cost models, such as spot pricing, have been introduced to offer significant savings. In this paper, a software abstraction layer is used to discover infrastructure resources for a particular application, across multiple providers, using a two-phase constraints-based approach. In the first phase, a set of possible infrastructure resources is identified for a given application. In the second phase, a heuristic is used to select the most appropriate resources from the initial set. For some applications a cost-based heuristic is most appropriate; for others a performance-based heuristic may be used. A financial services application and a high-performance computing application are used to illustrate the execution of the proposed resource discovery mechanism. The experimental results show that the proposed model can dynamically select an appropriate set of resources that match the application's requirements.
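The two-phase approach described above can be sketched as a hard-constraint filter followed by a pluggable ranking heuristic. The resource fields, provider names, and heuristics below are invented for illustration; they are not the paper's actual data model or API.

```python
# Sketch of two-phase resource discovery: phase 1 filters on hard constraints,
# phase 2 ranks survivors with a cost- or performance-based heuristic.
# All field names and catalogue entries are hypothetical.
RESOURCES = [
    {"name": "prov-a.small",  "cores": 2,  "ram_gb": 4,  "cost_hr": 0.10, "score": 40},
    {"name": "prov-a.large",  "cores": 16, "ram_gb": 64, "cost_hr": 1.20, "score": 95},
    {"name": "prov-b.medium", "cores": 8,  "ram_gb": 32, "cost_hr": 0.45, "score": 70},
]

def phase1_filter(resources, min_cores, min_ram_gb):
    """Phase 1: keep only resources meeting the application's hard constraints."""
    return [r for r in resources
            if r["cores"] >= min_cores and r["ram_gb"] >= min_ram_gb]

def phase2_select(candidates, heuristic="cost"):
    """Phase 2: pick one resource by the chosen heuristic."""
    if heuristic == "cost":
        return min(candidates, key=lambda r: r["cost_hr"])
    return max(candidates, key=lambda r: r["score"])

candidates = phase1_filter(RESOURCES, min_cores=4, min_ram_gb=16)
cheap = phase2_select(candidates, "cost")         # e.g. a cost-sensitive workload
fast = phase2_select(candidates, "performance")   # e.g. an HPC workload
```

This mirrors the paper's observation that the filter is shared while the heuristic varies per application class.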
Abstract:
This paper proposes a new non-parametric method for estimating model-free, time-varying liquidity betas which builds on realized covariance and volatility theory. Working under a liquidity-adjusted CAPM framework, we provide evidence that liquidity risk is a priced factor in the Greek stock market, mainly arising from the covariation of individual liquidity with local market liquidity; the level of liquidity, however, appears to be an irrelevant variable in asset pricing. Our findings support the notion that liquidity shocks transmitted across securities can cause market-wide effects and can have important implications for portfolio diversification strategies.
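The core object here, a realized beta, can be illustrated in a few lines: the day-t beta is the realized covariance of the asset's and the market's intraday increments divided by the market's realized variance. This is a minimal sketch of the general realized-beta construction, not the paper's specific liquidity-adjusted estimator.

```python
# Minimal sketch of a model-free realized beta from intraday increments:
# beta_t = sum_i(a_i * m_i) / sum_i(m_i ** 2), with a_i the asset's and
# m_i the market's intraday (liquidity or return) innovations on day t.
def realized_beta(asset, market):
    """Realized covariance over realized variance for one day's observations."""
    cov = sum(a * m for a, m in zip(asset, market))
    var = sum(m * m for m in market)
    return cov / var
```

Computed day by day over rolling windows, this yields the time-varying betas the abstract refers to, with no parametric model imposed.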
Abstract:
Introduction Product standardisation involves promoting the prescribing of pre-selected products within a particular category across a healthcare region and is designed to improve patient safety by promoting continuity of medicine use across the primary/secondary care interface, in addition to cost containment without compromising clinical care (i.e. maintaining safety and efficacy). Objectives To examine the impact of product standardisation on the prescribing of compound alginate preparations within primary care in Northern Ireland. Methods Data were obtained on alginate prescribing from the Northern Ireland Central Services Agency (Prescription Pricing Branch), covering a period of 43 months. Two standardisation promotion interventions were carried out at months 18 and 33. In addition to conventional statistical analyses, a simple interrupted time series analysis approach, using graphical interpretation, was used to facilitate interpretation of the data. Results There was a significant increase in the prescribed share of the preferred alginate product in each of the four health boards in Northern Ireland and a decrease in the cost per Defined Daily Dose for alginate liquid preparations overall. Compliance with the standardisation policy was, however, incomplete and was influenced to a marked degree by the activities of the pharmaceutical industry. The overall economic impact of the prescribing changes during the study was small (3.1%). Conclusion The findings suggested that product standardisation significantly influenced the prescribing pattern for compound alginate liquid preparations within primary care across Northern Ireland.
Abstract:
The REsearch on a CRuiser Enabled Air Transport Environment (RECREATE) project considers the introduction and airworthiness of cruiser-feeder operations for civil aircraft. Cruiser-feeder operations are investigated as a promising pioneering idea for the air transport of the future. The soundness of the concept can be understood by taking air-to-air refueling operations as an example. For this example, a comprehensive estimate of the benefits can be made, which shows a fuel burn and CO2 emission reduction potential of 31% for a typical 6000 nautical mile flight with a payload of 250 passengers. This reduction potential is large by any standard. The top-level objective of the RECREATE project is to demonstrate at a preliminary design level that cruiser-feeder operations (as a concept to reduce fuel burn and CO2 emission levels) can be shown to comply with the airworthiness requirements for civil aircraft. The underlying Scientific and Technological (S&T) objectives are to determine and study airworthy operational concepts for cruiser-feeder operations, and to derive and quantify benefits in terms of CO2 emission reduction as well as other benefits.
The objective of Work Package (WP) 3 is to substantiate the assumed benefits of cruiser/feeder operations through refined analysis and simulation. In this report, an initial benefits evaluation of the initial RECREATE cruiser/feeder concepts is presented. The benefits analysis is conducted in delta mode, i.e. by comparison with a baseline system. Since comparing different aircraft and air transport systems is never a trivial task, appropriate measures and metrics are defined and selected first. Non-dimensional parameters are defined and values for the baseline system are derived.
The impact of cruiser/feeder operations such as air-to-air refueling is studied with respect to fuel burn (or carbon dioxide), noise, and congestion. For this purpose, traffic simulations have been conducted.
Cruiser/feeder operations will also have an impact on dispatch reliability. An initial assessment of this effect has been made and is reported.
Finally, a considerable effort has been made to create the infrastructure for economic delta analysis of the cruiser/feeder concept of operation. First results of the cost analysis have been obtained.
Abstract:
With the increase in construction in dense urban environments, the delays associated with managing the material supply chain to site are called into question. Purpose: The aim of this investigation is to gain the perspective of construction contractors operating in a dense urban environment and to identify the resulting strategies adopted to reduce delays in the delivery of materials to site. Methodology: This is achieved through a comprehensive literature review on the subject in conjunction with industry interviews with construction professionals, identifying various management issues and corresponding strategies for reducing delays in the delivery of materials to site. Findings: The key issue which emerges is the lack of space for unloading bays, while the corresponding key strategy is to schedule deliveries outside peak congestion times. Practical Implication: With confined-site construction evident throughout the industry and the noted importance of an effective supply chain, the findings herein further assist on-site management in the daily task of ensuring the effective delivery and off-loading of materials in a complex and hazardous environment. Originality/Value: This research aids on-site management of confined site environments in the coordination of the material supply chain to site.
Abstract:
Experiments were undertaken to characterize a noninvasive, chronic model of nasal congestion in which nasal patency is measured using acoustic rhinometry. Compound 48/80 was administered intranasally to elicit nasal congestion in five beagle dogs, either by syringe (0.5 ml) in thiopental sodium-anesthetized animals or as a mist (0.25 ml) in the same animals in the conscious state. Effects of mast cell degranulation on nasal cavity volume as well as on minimal cross-sectional area (A(min)) and intranasal distance to A(min) (D(min)) were studied. Compound 48/80 caused a dose-related decrease in nasal cavity volume and A(min), together with a variable increase in D(min). Maximal responses were seen at 90-120 min. Compound 48/80 was less effective in producing nasal congestion in conscious animals, which also had significantly larger basal nasal cavity volumes. These results demonstrate the utility of acoustic rhinometry for measuring parameters of nasal patency in dogs and suggest that this model may prove useful in studies of the actions of decongestant drugs.
Abstract:
We consider the problem of self-healing in reconfigurable networks, e.g., peer-to-peer and wireless mesh networks. For such networks under repeated attack by an omniscient adversary, we propose a fully distributed algorithm, Xheal, that maintains good expansion and spectral properties of the network while keeping the network connected. Moreover, Xheal does this while allowing only low stretch and degree increase per node. The algorithm heals global properties like expansion and stretch while making only local changes and using only local information. We also provide bounds on the second smallest eigenvalue of the Laplacian, which captures key properties such as mixing time, conductance, and congestion in routing. Xheal has low amortized latency and bandwidth requirements. Our work improves over the self-healing algorithms Forgiving tree [PODC 2008] and Forgiving graph [PODC 2009] in that we are able to give guarantees on degree and stretch while at the same time preserving the expansion and spectral properties of the network.
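The "second smallest eigenvalue of the Laplacian" that these guarantees refer to is the algebraic connectivity, and it can be computed directly for small graphs. The sketch below is a generic illustration of that quantity (not Xheal itself); the graph examples and helper name are my own.

```python
# Sketch: the algebraic connectivity lambda_2, i.e. the second smallest
# eigenvalue of the graph Laplacian L = D - A, which governs mixing time,
# conductance, and routing congestion as the abstract notes.
import numpy as np

def algebraic_connectivity(edges, n):
    """Return lambda_2 of the Laplacian of an undirected graph on n nodes."""
    A = np.zeros((n, n))
    for u, v in edges:
        A[u, v] = A[v, u] = 1.0
    L = np.diag(A.sum(axis=1)) - A
    return np.sort(np.linalg.eigvalsh(L))[1]

# A path graph is poorly connected (small lambda_2); a complete graph is not.
path = [(i, i + 1) for i in range(3)]                       # 0-1-2-3
complete = [(i, j) for i in range(4) for j in range(i + 1, 4)]
```

A self-healing algorithm that keeps lambda_2 bounded below after deletions is, in this sense, preserving the network's spectral health.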
Abstract:
Tackling food-related health conditions is becoming one of the most pressing issues in the policy agendas of western liberal democratic governments. In this article, I intend to illustrate what the liberal philosopher John Stuart Mill would have said about legislation on unhealthy food, focusing especially on the arguments advanced by Mill in his classic essay On Liberty ([1859] 2006). Mill is normally considered the archetype of liberal anti-paternalism, and his ideas are often invoked by those who oppose state paternalism, including those who reject legislation that restricts the consumption of unhealthy food. Furthermore, his views have been applied to related policy areas such as alcohol minimum pricing (Saunders 2013) and genetically modified food (Holtug 2001). My analysis proceeds as follows. First, I show that Mill’s account warrants some restrictions on food advertising and justifies various forms of food labelling. Second, I assess whether and to what extent Mill’s ‘harm principle’ justifies social and legal non-paternalistic penalties against unhealthy eaters who are guilty of other-regarding harm. Finally, I show that Mill’s account warrants taxing unhealthy foods, thus restricting the freedom of both responsible and irresponsible eaters and de facto justifying what I call ‘secondary paternalism’.