970 results for Cost allocation
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Suppliers of water and energy are frequently natural monopolies, with their pricing regulated by governmental agencies. Pricing schemes are evaluated by the efficiency of the resource allocation they lead to, the capacity of the utilities to recover their costs, and the distributional effects of the policies, in particular their impact on the poor. One pricing approach has been average cost pricing, which guarantees cost recovery and allows utilities to provide their product at relatively low prices. However, average cost pricing leads to economically inefficient consumption levels when sources of water and energy are limited and increasing the supply is costly. An alternative approach is increasing block rates (hereafter, IBR or tiered pricing), where individuals pay a low rate for an initial consumption block and a higher rate as they increase use beyond that block. Figure 1 shows an example IBR rate structure for residential water use. Under those rates, a household would be charged $0.46 per hundred gallons for consumption below 21,000 gallons per month and $0.71 per hundred gallons for consumption above that level.
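A minimal sketch of how a bill is computed under such a two-block tariff, using the rates quoted above (the function name and the handling of the block boundary are illustrative, not from the paper):

```python
def water_bill(gallons: float) -> float:
    """Monthly charge under the two-block IBR tariff quoted above:
    $0.46 per hundred gallons up to 21,000 gallons, $0.71 beyond."""
    block, low_rate, high_rate = 21_000, 0.46, 0.71
    below = min(gallons, block)
    above = max(gallons - block, 0)
    return (below * low_rate + above * high_rate) / 100

# 25,000 gallons: 210 * 0.46 + 40 * 0.71 = $125.00
print(f"${water_bill(25_000):.2f}")
```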
Abstract:
A prevalent claim is that we are in a knowledge economy. By knowledge economy we generally mean a “knowledge-based economy”, one in which knowledge and technologies are used to produce economic benefits. Knowledge is thus both the tool and the raw material (people’s skills) for producing some kind of product or service. In this kind of environment, economic organization is undergoing several changes. For example, authority relations are less important, legal and ownership-based definitions of the boundaries of the firm are becoming irrelevant, and there are only a few constraints on the set of coordination mechanisms. What characterises a knowledge economy, then, is the growing importance of human capital in productive processes (Foss, 2005) and the increasing knowledge intensity of jobs (Hodgson, 1999). Economic processes are also highly intertwined with social processes: they are likely to be informal and reciprocal rather than formal and negotiated. Another important point is the division of labour: as economic activity becomes mainly intellectual and requires the integration of specific and idiosyncratic skills, the task of dividing the job and assigning it to the most appropriate individuals becomes arduous, a “supervisory problem” (Hodgson, 1999) emerges, and traditional hierarchical control may become increasingly ineffective. Not only does the specificity of know-how make it awkward to monitor the execution of tasks; more importantly, top-down integration of skills may be difficult because ‘the nominal supervisors will not know the best way of doing the job – or even the precise purpose of the specialist job itself – and the worker will know better’ (Hodgson, 1999). We therefore expect the organization of the economic activity of specialists to be, at least partially, self-organized. The aim of this thesis is to bridge studies from computer science, in particular from Peer-to-Peer (P2P) networks, to organization theories. We think that the P2P paradigm fits well with organization problems in all those situations in which a central authority is not possible. We believe that P2P networks show a number of characteristics similar to firms working in a knowledge-based economy, and hence that the methodology used for studying P2P networks can be applied to organization studies. We think P2P networks have three main characteristics in common with firms involved in the knowledge economy:
- Decentralization: in a pure P2P system every peer is an equal participant; there is no central authority governing the actions of the individual peers.
- Cost of ownership: P2P computing implies shared ownership, reducing the cost of owning the systems and the content, and the cost of maintaining them.
- Self-organization: the process by which global order emerges within a system without another system dictating this order.
These characteristics are also present in the kind of firm we address, and that is why we have carried over the techniques we adopted in our computer science studies (Marcozzi et al., 2005; Hales et al., 2007) to management science.
Abstract:
The past decade has seen the energy consumption of servers and Internet Data Centers (IDCs) skyrocket. A recent survey estimated that worldwide spending on servers and cooling has risen above $30 billion and is likely to exceed spending on new server hardware. The rapid rise in energy consumption has posed a serious threat to both energy resources and the environment, which makes green computing not only worthwhile but also necessary. This dissertation tackles the challenges of reducing both the energy consumption of server systems and the costs for Online Service Providers (OSPs). Two distinct subsystems account for most of an IDC’s power: the server system, which accounts for 56% of the total power consumption, and the cooling and humidification systems, which account for about 30%. The server system dominates the energy consumption of an IDC, and its power draw can vary drastically with data center utilization. In this dissertation, we propose three models to achieve energy efficiency in web server clusters: an energy-proportional model, an optimal server allocation and frequency adjustment strategy, and a constrained Markov model. The proposed models combine Dynamic Voltage/Frequency Scaling (DV/FS) and Vary-On, Vary-Off (VOVF) mechanisms that work together for greater energy savings, and corresponding strategies are proposed to deal with the transition overheads. We further extend server energy management to the IDC’s cost management, helping OSPs to conserve energy, manage their own electricity costs, and lower carbon emissions. We have developed an optimal energy-aware load dispatching strategy that periodically maps more requests to the locations with lower electricity prices. A carbon emission limit is imposed, and the volatility of the carbon offset market is also considered. Two energy-efficient strategies are applied to the server system and the cooling system, respectively. With the rapid development of cloud services, we also carry out research to reduce server energy in cloud computing environments. In this work, we propose a new live virtual machine (VM) placement scheme that can effectively map VMs to Physical Machines (PMs) with substantial energy savings in a heterogeneous server cluster. A VM/PM mapping probability matrix is constructed, in which each VM request is assigned a probability of running on each PM. The matrix takes into account resource limitations, VM operation overheads, server reliability, and energy efficiency. The evolution of Internet Data Centers and the increasing demands of web services raise great challenges for improving the energy efficiency of IDCs. We also identify several potential areas for future research in each chapter.
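The abstract does not give the construction of the VM/PM mapping probability matrix; as a rough sketch of the idea under invented data (the efficiency scores, capacities, and demands below are all hypothetical), one might weight feasible PMs by their energy efficiency:

```python
import numpy as np

# Hypothetical inputs: per-PM performance-per-watt and free CPU capacity,
# and per-VM CPU demand. All numbers are invented for illustration.
pm_perf_per_watt = np.array([2.0, 1.5, 1.0])
pm_free_cpu = np.array([8, 4, 16])
vm_cpu_demand = np.array([2, 4, 1, 8])

def mapping_matrix(demand, free, perf):
    """VM/PM probability matrix: row i gives the probability of placing
    VM request i on each PM, zero where the PM lacks capacity."""
    P = np.zeros((len(demand), len(free)))
    for i, d in enumerate(demand):
        weights = np.where(free >= d, perf, 0.0)  # respect resource limits
        if weights.sum() > 0:
            P[i] = weights / weights.sum()        # favour greener PMs
    return P

print(mapping_matrix(vm_cpu_demand, pm_free_cpu, pm_perf_per_watt))
```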
Abstract:
A patient classification system was developed integrating a patient acuity instrument with a computerized nursing distribution method based on a linear programming model. The system was designed for real-time measurement of patient acuity (workload) and allocation of nursing personnel to optimize the utilization of resources.

The acuity instrument was a prototype tool with eight categories of patients defined by patient severity and nursing intensity parameters. From this tool, the demand for nursing care was defined in patient points, with one point equal to one hour of RN time. Validity and reliability of the instrument were determined as follows: (1) content validity by a panel of expert nurses; (2) predictive validity through a paired t-test analysis of preshift and postshift categorization of patients; (3) initial reliability by a one-month pilot of the instrument in a practice setting; and (4) interrater reliability by the Kappa statistic.

The nursing distribution system was a linear programming model using a branch-and-bound technique for obtaining integer solutions. The objective function was to minimize the total number of nursing personnel used by optimally assigning the staff to meet the acuity needs of the units. A penalty weight was used as a coefficient of the objective function variables to define priorities for allocation of staff.

The demand constraints were requirements to meet the total acuity points needed for each unit and to have a minimum number of RNs on each unit. Supply constraints were: (1) the total availability of each type of staff and the value of that staff member, where value was determined relative to that type of staff's ability to perform the job function of an RN (e.g., value for eight hours of an RN = 8 points, an LVN = 6 points); and (2) the number of personnel available for floating between units.

The capability of the model to assign staff quantitatively and qualitatively equal to the manual method was established by a thirty-day comparison. Sensitivity testing demonstrated appropriate adjustment of the optimal solution to changes in penalty coefficients in the objective function and to acuity totals in the demand constraints.

Further investigation of the model documented correct adjustment of assignments in response to staff value changes, and cost minimization by the addition of a dollar coefficient to the objective function.
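A minimal sketch of this kind of staffing model, assuming invented units, demands, availabilities, and penalty weights (only the RN = 8 / LVN = 6 point values come from the abstract); SciPy's milp solves the integer program by branch and bound:

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Two staff types (RN, LVN) and two units; x[t, u] = staff of type t on unit u.
values = np.array([8, 6])        # points per 8-h shift (from the abstract)
penalty = np.array([1.0, 0.8])   # illustrative allocation priorities
demand = np.array([40, 30])      # acuity points required per unit (invented)
min_rn = np.array([2, 1])        # minimum RNs per unit (invented)
avail = np.array([6, 8])         # staff available by type (invented)

n_t, n_u = 2, 2
c = np.repeat(penalty, n_u)      # minimize penalty-weighted headcount

A_dem = np.zeros((n_u, n_t * n_u))  # sum_t values[t] * x[t,u] >= demand[u]
A_rn = np.zeros((n_u, n_t * n_u))   # x[RN,u] >= min_rn[u]
A_av = np.zeros((n_t, n_t * n_u))   # sum_u x[t,u] <= avail[t]
for u in range(n_u):
    for t in range(n_t):
        A_dem[u, t * n_u + u] = values[t]
        A_av[t, t * n_u + u] = 1
    A_rn[u, u] = 1                  # RN is type index 0

res = milp(c=c,
           constraints=[LinearConstraint(A_dem, lb=demand),
                        LinearConstraint(A_rn, lb=min_rn),
                        LinearConstraint(A_av, ub=avail)],
           integrality=np.ones(n_t * n_u),
           bounds=Bounds(0, np.inf))
print(res.x.reshape(n_t, n_u), res.fun)
```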
Abstract:
This study compares the procurement cost-minimizing and productive efficiency performance of the auction mechanism used by independent system operators (ISOs) in wholesale electricity auction markets in the U.S. with that of a proposed alternative. The current practice allocates energy contracts as if the auction featured a discriminatory final payment method when, in fact, the markets are uniform price auctions. The proposed alternative explicitly accounts for the market clearing price during the allocation phase. We find that the proposed alternative largely outperforms the current practice on the basis of procurement costs in the context of simple auction markets featuring both day-ahead and real-time auctions and that the procurement cost advantage of the alternative is complete when we simulate the effects of increased competition. We also find that a trade-off between the objectives of procurement cost minimization and productive efficiency emerges in our simple auction markets and persists in the face of increased competition.
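The abstract contrasts two settlement rules; as a toy sketch (offer data invented, and a simplified one-shot market rather than the day-ahead/real-time structure studied), uniform-price settlement pays every accepted offer the marginal offer's price, while discriminatory settlement pays each offer its own bid:

```python
def clear(offers, demand):
    """Accept the cheapest offers until demand is met; return the accepted
    (price, quantity) pairs. offers: list of (price $/MWh, quantity MWh)."""
    accepted, remaining = [], demand
    for price, qty in sorted(offers):
        take = min(qty, remaining)
        if take:
            accepted.append((price, take))
            remaining -= take
        if not remaining:
            break
    return accepted

offers = [(20, 50), (35, 50), (50, 50)]                  # invented supply offers
accepted = clear(offers, demand=80)
uniform = accepted[-1][0] * sum(q for _, q in accepted)  # all paid marginal price
pay_as_bid = sum(p * q for p, q in accepted)             # each paid its own bid
print(uniform, pay_as_bid)                               # 2800 vs 2050
```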
Abstract:
This study of the wholesale electricity market compares the efficiency performance of the auction mechanism currently in place in U.S. markets with the performance of a proposed mechanism. The analysis highlights the importance of considering strategic behavior when comparing different institutional systems. We find that in concentrated markets, neither auction mechanism can guarantee an efficient allocation. The advantage of the current mechanism increases with increased price competition if market demand is perfectly inelastic. However, if market demand has some responsiveness to price, the superiority of the current auction with respect to efficiency is less clear-cut. We present a case where the proposed auction outperforms the current mechanism on efficiency even when all offers reflect true production costs. We also find that a market designer might face a choice problem with a trade-off between lower electricity cost and productive efficiency. Some implications for social welfare are discussed as well.
Abstract:
We propose a nonparametric model for global cost minimization as a framework for the optimal allocation of a firm's output target across multiple locations, taking account of differences in input prices and technologies across locations. This should be useful for firms planning production sites within a country and for foreign direct investment decisions by multinational firms. Two illustrative examples are included. The first considers the production location decision of a manufacturing firm across a number of adjacent US states. The second considers the optimal allocation of US and Canadian automobile manufacturers across the two countries.
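The paper's model is nonparametric, which the abstract does not detail; as a simplified parametric stand-in (cost curves and the output target below are invented), the core idea of splitting an output target across locations to minimize total cost looks like this, with marginal costs equalized across active locations at the optimum:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative location cost functions c_i(q) = a_i*q + b_i*q**2, standing in
# for location-specific input prices and technologies (coefficients invented).
a = np.array([10.0, 12.0, 9.0])
b = np.array([0.05, 0.02, 0.08])
Q = 100.0                                    # firm-wide output target

total_cost = lambda q: float(np.sum(a * q + b * q ** 2))
res = minimize(total_cost, x0=np.full(3, Q / 3), method="SLSQP",
               bounds=[(0, None)] * 3,
               constraints={"type": "eq", "fun": lambda q: q.sum() - Q})
print(res.x.round(2), round(res.fun, 2))
# At the optimum, marginal costs a_i + 2*b_i*q_i coincide across locations.
print((a + 2 * b * res.x).round(2))
```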
Abstract:
The research project extends economic theory to the health care field, evaluating the influence of demand and supply variables on medical care inflation. The research tests a model linking the demographic and socioeconomic characteristics of the population, its community case mix and technology, the prices of goods and services other than medical care, the way its medical services are delivered, and the health care resources available to its population to different utilization patterns which, consequently, lead to variations in health care prices among metropolitan areas. The research considers the relationship between changes in community characteristics and resources and medical care inflation.

The rapidly increasing costs of medical care have been of great concern to the general public, the medical profession, and political bodies. Research into and analysis of the main factors responsible for the rate of increase of medical care prices are necessary in order to devise appropriate solutions. An understanding of the relationship between community characteristics and resources and medical care costs in metropolitan areas potentially offers guidance in individual plan and national policy development.

The research considers 145 factors measuring the community milieu (demographic, social, educational, economic, illness level, prices of goods and services other than medical care, hospital supply, physician resources, and technological factors). Through bivariate correlation analysis, the number of variables was reduced to a set of one to four variables for each cost equation. Two approaches were identified to track inflation in the health care industry: one measures costs of production, which reflect both price and volume increases; the other measures price increases. One general and four specific measures were developed to represent the two approaches. The general measure considers the increase in medical care prices as a whole, and the specific measures deal with hospital costs and physicians' fees. The relationships among changes in community characteristics and resources and health care inflation were analyzed using bivariate correlation and regression analysis. It was concluded that changes in community characteristics and resources are predictive of hospital cost and physicians' fee inflation, but not of increases in medical care prices overall. These findings provide guidance for the formulation of public policy that could alter the trend of medical care inflation and for the allocation of limited Federal funds.
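As a compact sketch of that two-step procedure under simulated data (the 145-variable count mirrors the abstract; the sample size, true coefficients, and noise are invented):

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated panel: 145 candidate community variables for 50 metropolitan
# areas and the change in one medical-care cost measure.
X = rng.normal(size=(50, 145))
y = 2.0 * X[:, 3] - 1.5 * X[:, 40] + rng.normal(scale=0.5, size=50)

# Step 1: bivariate screening - retain the 4 variables most correlated with y.
r = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
keep = np.argsort(r)[-4:]

# Step 2: regress the cost measure on the retained variables.
A = np.c_[np.ones(len(y)), X[:, keep]]
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("retained columns:", keep, "coefficients:", coef.round(2))
```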
Abstract:
This paper tackles the optimization of applications in multi-provider hybrid cloud scenarios from an economic point of view. In these scenarios, the great majority of solutions offer the automatic allocation of resources across different cloud providers based on their current prices. Our approach, however, introduces a novel solution by making maximum use of divide and rule. This paper describes a methodology for creating cost-aware cloud applications that can be broken down into the three most important components of cloud infrastructures: computation, network, and storage. A real videoconference system has been modified in order to evaluate this idea with both theoretical and empirical experiments. This system has become a widely used tool in several national and European projects for e-learning and collaboration purposes.
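A minimal sketch of this divide-and-rule idea under invented prices and usage figures: each component is priced separately and placed with whichever provider is currently cheapest for it, which can beat any single-provider allocation.

```python
# Invented hourly unit prices per provider and per component.
prices = {
    "provider_a": {"compute": 0.12, "network": 0.02, "storage": 0.05},
    "provider_b": {"compute": 0.10, "network": 0.04, "storage": 0.04},
}
usage = {"compute": 300, "network": 1200, "storage": 800}  # units consumed

# Place each component with the provider that prices it lowest.
allocation = {c: min(prices, key=lambda p: prices[p][c]) for c in usage}
total = sum(usage[c] * prices[allocation[c]][c] for c in usage)
print(allocation)       # compute/storage -> provider_b, network -> provider_a
print(round(total, 2))  # 86.0, vs 100.0 (all on a) or 110.0 (all on b)
```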
Abstract:
The Swinfen Charitable Trust has used email for some years as a low-cost telemedicine medium to provide consultant support for doctors in developing countries. A scalable, automatic message-routing system was constructed which automates many of the tasks involved in message handling. During the first 12 months of its use, 1510 messages were processed automatically. There were 128 referrals from 18 hospitals in nine countries. Of these 128 queries, 89 (70%) were replied to within 72 h; the median delay was 1.1 days. The 39 unanswered queries were sent to backup specialists for reply and 36 of them (92%) were replied to within 72 h. In the remaining three cases, a second-line (backup) specialist was required. The referrals were handled by 54 volunteer specialists from a panel of over 70. Two system operators, located 10 time zones apart, managed the system. The median time from receipt of a new referral to its allocation to a specialist was 0.2 days (interquartile range, IQR, 0.1-0.8). The median interval between receipt of a new referral and first reply was 2.6 days (IQR 0.8-5.9). Automatic message handling solves many of the problems of manual email telemedicine systems and represents a potentially scalable way of doing low-cost telemedicine in the developing world.
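The abstract describes the routing workflow only at a high level; a minimal sketch of the 72-hour escalation rule it implies (all identifiers below are invented):

```python
from datetime import datetime, timedelta

REPLY_DEADLINE = timedelta(hours=72)   # escalation window from the abstract

def allocate(referral_id, first_line, backup, received):
    """Allocate a referral to a first-line specialist and record when it
    should be escalated if no reply has arrived."""
    return {"id": referral_id, "assignee": first_line, "backup": backup,
            "escalate_after": received + REPLY_DEADLINE}

def escalate_if_overdue(ticket, replied, now):
    """Re-route the query to the backup specialist after 72 h without a reply."""
    if not replied and now > ticket["escalate_after"]:
        ticket["assignee"] = ticket["backup"]
    return ticket

t = allocate("q128", "specialist_a", "specialist_b", datetime(2003, 5, 1))
print(escalate_if_overdue(t, replied=False, now=datetime(2003, 5, 5))["assignee"])
# -> specialist_b
```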
Abstract:
Email has been used for some years as a low-cost telemedicine medium to provide support for developing countries. However, all operations have been relatively small scale and fairly labour intensive to administer. A scalable, automatic message-routing system was constructed which automates many of the tasks. During a four-month study period in 2002, 485 messages were processed automatically. There were 31 referrals from eight hospitals in three countries. These referrals were handled by 25 volunteer specialists from a panel of 42. Two system operators, located 10 time zones apart, managed the system. The median time from receipt of a new referral to its allocation to a specialist was 1.0 days (interquartile range 0.7-2.4). The median interval between allocation and first reply was 0.7 days (interquartile range 0.3-2.3). Automatic message handling solves many of the problems of manual email telemedicine systems and represents a potentially scalable way of doing low-cost telemedicine in the developing world.
Abstract:
This paper describes an attempt to evaluate cost efficiency in UK university central administration. The funding councils of higher education institutions have progressively evolved elaborate systems for measuring university performance in teaching quality and research. Indeed, funding of universities is linked to their performance in research. The allocation of resources between academic and administrative activities, on the other hand, has so far not been subject to scrutiny. Yet, expenditure on administration is typically some 30% of that allocated to academic activities. This paper sets up a data envelopment analysis (DEA) framework to identify practices leading to cost-efficient central administrative services in UK universities. The problems in defining the unit of assessment and the relationship between the inputs and the outputs are clearly demonstrated.
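The abstract does not include the DEA formulation; as a generic sketch of the standard input-oriented CCR envelopment model (data invented), each unit's efficiency is the smallest factor theta by which its inputs could be scaled down while a combination of peer units still matches its outputs:

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative data: 4 administrations, 2 inputs (admin spend, staff),
# 1 output (students served). All numbers are invented.
X = np.array([[30.0, 12.0], [25.0, 15.0], [40.0, 10.0], [20.0, 20.0]])  # inputs
Y = np.array([[1000.0], [900.0], [1100.0], [800.0]])                    # outputs

def ccr_efficiency(k):
    """Input-oriented CCR efficiency of unit k: minimize theta subject to
    a nonnegative combination of peers using <= theta * inputs of k
    while producing >= outputs of k."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]          # variables: [theta, lambda_1..n]
    A_in = np.c_[-X[k], X.T]             # sum_j lam_j * x_ij <= theta * x_ik
    A_out = np.c_[np.zeros(s), -Y.T]     # sum_j lam_j * y_rj >= y_rk
    res = linprog(c, A_ub=np.r_[A_in, A_out], b_ub=np.r_[np.zeros(m), -Y[k]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

for k in range(4):
    print(f"unit {k}: efficiency {ccr_efficiency(k):.3f}")
```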
Abstract:
Since 1988, quasi-markets have been introduced into many areas of social policy in the UK; the NHS internal market is one example. Markets operate by price signals, and the NHS internal market, if it is to operate efficiently, requires purchasers and providers to respond to price signals. The research hypothesis is that cost accounting methods can be developed to enable healthcare contracts to be priced on a cost basis in a manner that facilitates the achievement of economic efficiency in the NHS internal market. Surveys of hospitals in 1991 and 1994 established the cost methods adopted in deriving the prices for healthcare contracts in the first year of the market and three years on. An in-depth view of the costing-for-pricing process was gained through case studies. Hospitals had inadequate cost information on which to price healthcare contracts at the inception of the internal market: prices did not reflect the relative performance of healthcare providers sufficiently closely to enable the market's espoused efficiency aims to be achieved. Price variations were often due to differing costing approaches rather than efficiency. Furthermore, price comparisons were often meaningless because of inadequate definition of the services (products). In April 1993, the NHS Executive issued guidance on costing for contracting to all NHS providers in an attempt to improve the validity of price comparisons between alternative providers. The case studies and the 1994 survey show that although price comparison has improved, considerable problems remain. Consistency is not assured, and the problem of adequate product definition is still to be solved. Moreover, the case studies clearly highlight the mismatch of rigid, full-cost pricing rules with both the financial management considerations at local level and the emerging internal market(s). Incentives exist to cost-shift, and healthcare prices can easily be manipulated. In the search for a new health policy paradigm to replace traditional bureaucratic provision, cost-based pricing cannot be used to ensure a more efficient allocation of healthcare resources.
Abstract:
Fibre overlay is a cost-effective technique to alleviate wavelength blocking in some links of a wavelength-routed optical network by increasing the number of wavelengths in those links. In this letter, we investigate the effects of overlaying fibre in an all-optical network (AON) based on the GÉANT2 topology. The constraint-based routing and wavelength assignment (CB-RWA) algorithm identifies where cost-efficient upgrades should be implemented. Through numerical examples, we demonstrate that the network capacity improves by 25 per cent when fibre is overlaid on 10 per cent of the links, and by 12 per cent when hop-reduction links comprising 2 per cent of the links are provided. For the upgraded network, we also show the impact of dynamic traffic allocation on the blocking probability.
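The CB-RWA algorithm itself is not described in the abstract; as a generic illustration of the wavelength-continuity constraint behind blocking (link names and occupancy invented), a first-fit assignment must find one wavelength free on every link of the path, and overlaying fibre on a congested link effectively raises its wavelength count W:

```python
def first_fit(path_links, in_use, W):
    """Return the first wavelength index free on every link of the path
    (wavelength-continuity constraint), or None if the request is blocked.
    in_use maps each link to the set of occupied wavelength indices."""
    for w in range(W):
        if all(w not in in_use[link] for link in path_links):
            for link in path_links:
                in_use[link].add(w)            # reserve the lightpath
            return w
    return None                                # blocked: no common free wavelength

in_use = {"a-b": {0, 1}, "b-c": {1, 2}}        # invented occupancy
print(first_fit(["a-b", "b-c"], in_use, W=3))  # None: blocked at W=3
print(first_fit(["a-b", "b-c"], in_use, W=4))  # 3: extra wavelength relieves it
```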