196 results for decoupled net present value
Abstract:
The Person Trade-Off (PTO) is a methodology aimed at measuring the social value of health states. The remaining methodologies measure individual utility and would be less appropriate for resource allocation decisions. However, few studies have been conducted to test the validity of the method. We present a pilot study with this objective. The study is based on the results of interviews with 30 undergraduate students in Economics. We judge the validity of PTO answers by their adequacy to three hypotheses of rationality. First, we show that, given certain rationality assumptions, PTO answers should be predictable from answers to Standard Gamble questions. This first hypothesis is not verified. The second hypothesis is that PTO answers should not vary across different frames of equivalent PTO questions. This second hypothesis is also not verified. Our third hypothesis is that PTO values should predict social preferences for allocating resources between patients. This hypothesis is verified. The evidence on the validity of the method is therefore conflicting.
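As a rough illustration of the first hypothesis above (this benchmark is our own simplifying assumption, not necessarily the exact prediction tested in the study): under a purely utilitarian reading, curing $N$ patients of a chronic state whose Standard Gamble utility is $u$ produces the same aggregate health gain as saving the lives of $M$ patients when $N(1-u) = M$, so the PTO answer would be predicted as $N = M/(1-u)$.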
Abstract:
We address the problem of scheduling a multiclass $M/M/m$ queue with Bernoulli feedback on $m$ parallel servers to minimize time-average linear holding costs. We analyze the performance of a heuristic priority-index rule, which extends Klimov's optimal solution to the single-server case: servers preemptively select customers with larger Klimov indices. We present closed-form suboptimality bounds (approximate optimality) for Klimov's rule, which imply that its suboptimality gap is uniformly bounded above with respect to (i) external arrival rates, as long as they stay within system capacity; and (ii) the number of servers. It follows that its relative suboptimality gap vanishes in a heavy-traffic limit, as external arrival rates approach system capacity (heavy-traffic optimality). We obtain simpler expressions for the special no-feedback case, where the heuristic reduces to the classical $c \mu$ rule. Our analysis is based on comparing the expected cost of Klimov's rule to the value of a strong linear programming (LP) relaxation of the system's region of achievable performance of mean queue lengths. In order to obtain this relaxation, we derive and exploit a new set of work decomposition laws for the parallel-server system. We further report on the results of a computational study on the quality of the $c \mu$ rule for parallel scheduling.
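For reference, in the no-feedback special case mentioned above the heuristic reduces to the classical $c \mu$ rule, which ranks customer classes by the product of holding cost rate and service rate. A minimal sketch follows; the class names, cost rates and service rates are illustrative assumptions, not values from the paper:

```python
# Illustrative c-mu priority rule for the no-feedback case: each class i has
# a holding cost rate c[i] and a service rate mu[i]; whenever a server frees
# up, it preemptively serves the waiting class with the largest c[i] * mu[i].
classes = {
    "A": {"c": 3.0, "mu": 1.0},   # expensive to hold, slow to serve
    "B": {"c": 1.0, "mu": 4.0},   # cheap to hold, fast to serve
    "C": {"c": 2.0, "mu": 0.5},
}

def cmu_index(params):
    """Priority index of a class under the c-mu rule."""
    return params["c"] * params["mu"]

priority_order = sorted(classes, key=lambda k: cmu_index(classes[k]), reverse=True)
print(priority_order)   # -> ['B', 'A', 'C'] with the example numbers above
```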
Abstract:
In today's highly competitive and global marketplace, the pressure on organizations to find new ways to create and deliver value to customers grows ever stronger. In the last two decades, logistics and the supply chain have moved to center stage. There has been a growing recognition that it is through effective management of the logistics function and the supply chain that the goals of cost reduction and service enhancement can be achieved. The key to success in Supply Chain Management (SCM) requires heavy emphasis on integration of activities, cooperation, coordination and information sharing throughout the entire supply chain, from suppliers to customers. To be able to respond to the challenge of integration, sophisticated decision support systems are needed, based on powerful mathematical models and solution techniques, together with advances in information and communication technologies. Industry and academia have become increasingly interested in SCM so as to respond to the problems and issues posed by changes in logistics and the supply chain. We present a brief discussion of the important issues in SCM. We then argue that metaheuristics can play an important role in solving the complex supply-chain problems that arise from the importance of designing and managing the entire supply chain as a single entity. We will focus especially, though not exclusively, on Iterated Local Search, Tabu Search and Scatter Search as methods with great potential for solving SCM-related problems. We briefly present some successful applications.
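As a rough sketch of one of the metaheuristics named above, the following is a generic Iterated Local Search skeleton; the toy objective, the hill-climbing routine and the perturbation are placeholder assumptions, not tied to any particular supply-chain model:

```python
import random

def iterated_local_search(init, objective, local_search, perturb, iters=100):
    """Generic ILS skeleton: repeatedly perturb the best solution found so
    far, re-optimize locally, and keep the candidate if it improves."""
    best = local_search(init)
    for _ in range(iters):
        candidate = local_search(perturb(best))
        if objective(candidate) < objective(best):   # minimization
            best = candidate
    return best

# Toy usage: minimize a bumpy 1-D function with several local minima.
f = lambda x: (x - 3) ** 2 + 2 * abs(round(x) - x)

def hill_climb(x, step=0.01):
    # Fixed-step descent; it stops in whatever local minimum it reaches.
    while f(x - step) < f(x) or f(x + step) < f(x):
        x = x - step if f(x - step) < f(x + step) else x + step
    return x

best = iterated_local_search(0.0, f, hill_climb, lambda x: x + random.uniform(-2, 2))
print(round(best, 2))   # should end up near the global minimum at x = 3
```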
Abstract:
In 1990 a new Spanish 'Plan General de Contabilidad' (PGC) implemented the requirements of the EU 4th and 7th Directives in Spain. Included in the PGC is the requirement, derived from the 4th Directive, that accounts should present a 'true and fair view', in Spanish 'imagen fiel'. Where the term has been used in English-speaking jurisdictions it has proved to have a variety of shades of meaning, and to have had strikingly different impact in different countries. Within the European Union the term has been seen as a 'Trojan horse', inserted into the 4th Directive to inject an Anglo-Saxon approach of flexibility and judgement-dependent accounting into a Continental European accounting tradition of detailed prescription and uniformity. In this paper we report on a survey of the views and experience of Spanish auditors relating to 'imagen fiel'. Specifically, we: 1) review the English-language literature on 'true and fair view' to identify the key areas of controversy; 2) consider the significance of the 'true and fair view' within the EU 4th Directive; and 3) report on the experience of Spanish auditors in working with this concept, their views on the value of the term, and their experience in use of the true and fair view 'override'.
Abstract:
We present a new unifying framework for investigating throughput-WIP (Work-in-Process) optimal control problems in queueing systems, based on reformulating them as linear programming (LP) problems with special structure: We show that if a throughput-WIP performance pair in a stochastic system satisfies the Threshold Property we introduce in this paper, then we can reformulate the problem of optimizing a linear objective of throughput-WIP performance as a (semi-infinite) LP problem over a polygon with special structure (a threshold polygon). The strong structural properties of such polygons explain the optimality of threshold policies for optimizing linear performance objectives: their vertices correspond to the performance pairs of threshold policies. We analyze in this framework the versatile input-output queueing intensity control model introduced by Chen and Yao (1990), obtaining a variety of new results, including (a) an exact reformulation of the control problem as an LP problem over a threshold polygon; (b) an analytical characterization of the Min WIP function (giving the minimum WIP level required to attain a target throughput level); (c) an LP Value Decomposition Theorem that relates the objective value under an arbitrary policy to that of a given threshold policy (thus revealing the LP interpretation of Chen and Yao's optimality conditions); (d) diminishing returns and invariance properties of throughput-WIP performance, which underlie threshold optimality; (e) a unified treatment of the time-discounted and time-average cases.
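As a toy illustration of the kind of throughput-WIP pairs that threshold policies generate (this is a plain M/M/1 admission threshold, not the Chen-Yao intensity control model itself, and the rates are made-up):

```python
# Threshold policy sketch: arrivals to an M/M/1 queue are admitted only while
# WIP is below a threshold K.  For each K we compute the exact stationary
# throughput and mean WIP from the birth-death stationary distribution.
lam, mu = 0.8, 1.0                      # assumed arrival and service rates
rho = lam / mu

def threshold_performance(K):
    """Stationary (throughput, mean WIP) of an M/M/1 queue with admission threshold K."""
    weights = [rho ** n for n in range(K + 1)]
    norm = sum(weights)
    pi = [w / norm for w in weights]
    throughput = lam * (1 - pi[K])      # rate of admitted (hence served) jobs
    mean_wip = sum(n * p for n, p in enumerate(pi))
    return throughput, mean_wip

for K in (1, 2, 5, 10):
    tp, wip = threshold_performance(K)
    print(f"K={K:2d}  throughput={tp:.3f}  mean WIP={wip:.3f}")
```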
Abstract:
The paper develops a method to solve higher-dimensional stochastic control problems in continuous time. A finite-difference-type approximation scheme is used on a coarse grid of low-discrepancy points, while the value function at intermediate points is obtained by regression. The stability properties of the method are discussed, and applications are given to test problems of up to 10 dimensions. Accurate solutions to these problems can be obtained on a personal computer.
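A minimal sketch of the two ingredients named above, low-discrepancy grid points plus regression between them; this is not the paper's scheme, and the Sobol grid, the quadratic surrogate and the toy value function are our own illustrative assumptions:

```python
import numpy as np
from scipy.stats import qmc   # Sobol sequences for the low-discrepancy grid

dim = 5
sampler = qmc.Sobol(d=dim, scramble=True, seed=0)
grid = sampler.random_base2(m=8)                 # 256 low-discrepancy points in [0,1]^5

toy_value = lambda x: np.exp(-np.sum(x ** 2, axis=-1))   # stand-in for values on the grid
values = toy_value(grid)

# Regression surrogate with per-coordinate quadratic features [1, x_i, x_i^2],
# used to evaluate the approximated value function at intermediate points.
features = lambda x: np.hstack([np.ones((len(x), 1)), x, x ** 2])
coeffs, *_ = np.linalg.lstsq(features(grid), values, rcond=None)

query = np.random.default_rng(1).random((3, dim))   # intermediate (non-grid) points
print(features(query) @ coeffs)                     # regression estimates
print(toy_value(query))                             # toy ground truth, for comparison
```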
Abstract:
Most methods for small-area estimation are based on composite estimators derived from design- or model-based methods. A composite estimator is a linear combination of a direct and an indirect estimator with weights that usually depend on unknown parameters which need to be estimated. Although model-based small-area estimators are usually based on random-effects models, the assumption of fixed effects is at face value more appropriate. Model-based estimators are justified by the assumption of random (interchangeable) area effects; in practice, however, areas are not interchangeable. In the present paper we empirically assess the quality of several small-area estimators in the setting in which the area effects are treated as fixed. We consider two settings: one that draws samples from a theoretical population, and another that draws samples from an empirical population of a labor force register maintained by the National Institute of Social Security (NISS) of Catalonia. We distinguish two types of composite estimators: a) those that use weights that involve area-specific estimates of bias and variance; and b) those that use weights that involve a common variance and a common squared bias estimate for all the areas. We assess their precision and discuss alternatives to optimizing composite estimation in applications.
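For reference, a composite estimator of the kind described above is a weighted combination of the direct and the indirect estimate. A minimal sketch with made-up numbers, using the textbook MSE-minimizing weight for uncorrelated components (the weighting schemes actually compared in the study may differ):

```python
def composite_estimate(direct, indirect, mse_direct, mse_indirect):
    """Composite small-area estimate: weighted combination of a direct and an
    indirect estimator, with the MSE-minimizing weight for uncorrelated parts."""
    w = mse_indirect / (mse_direct + mse_indirect)   # weight on the direct estimator
    return w * direct + (1 - w) * indirect, w

# Made-up example: a noisy direct survey estimate and a stabler indirect one.
est, w = composite_estimate(direct=120.0, indirect=105.0,
                            mse_direct=64.0, mse_indirect=16.0)
print(f"weight on direct = {w:.2f}, composite estimate = {est:.1f}")   # 0.20, 108.0
```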
Abstract:
This paper proposes an argument that explains incumbency advantage without resorting to the collective irresponsibility of legislatures. For that purpose, we exploit the informational value of incumbency: incumbency provides voters with information about governing politicians that is not available for challengers. Because there are many reasons for high reelection rates other than incumbency status, we propose a measure of incumbency advantage that improves upon the use of pure reelection success. We also study the relationship between incumbency advantage and ideological and selection biases. An important implication of our analysis is that the literature linking incumbency and legislature irresponsibility most likely provides an overestimation of the latter.
Abstract:
How much information does an auctioneer want bidders to have in a private value environment? We address this question using a novel approach to ordering information structures, based on the property that in private value settings more information leads to a more disperse distribution of buyers' updated expected valuations. We define the class of precision criteria following this approach and different notions of dispersion, and relate them to existing criteria of informativeness. Using supermodular precision, we obtain three results: (1) a more precise information structure yields a more efficient allocation; (2) the auctioneer provides less than the efficient level of information, since more information increases bidders' informational rents; (3) there is a strategic complementarity between information and competition, so that both the socially efficient and the auctioneer's optimal choice of precision increase with the number of bidders, and both converge as the number of bidders goes to infinity.
Abstract:
This paper examines the value of connections between German industry and the Nazi movement in early 1933. Drawing on previously unused contemporary sources about management and supervisory board composition and stock returns, we find that one out of seven firms, and a large proportion of the biggest companies, had substantive links with the National Socialist German Workers' Party. Firms supporting the Nazi movement experienced unusually high returns, outperforming unconnected ones by 5% to 8% between January and March 1933. These results are not driven by sectoral composition and are robust to alternative estimators and definitions of affiliation.
Abstract:
The generalization of simple (two-variable) correspondence analysis to more than two categorical variables, commonly referred to as multiple correspondence analysis, is neither obvious nor well-defined. We present two alternative ways of generalizing correspondence analysis, one based on the quantification of the variables and intercorrelation relationships, and the other based on the geometric ideas of simple correspondence analysis. We propose a version of multiple correspondence analysis, with adjusted principal inertias, as the method of choice for the geometric definition, since it contains simple correspondence analysis as an exact special case, which is not the situation of the standard generalizations. We also clarify the issue of supplementary point representation and the properties of joint correspondence analysis, a method that visualizes all two-way relationships between the variables. The methodology is illustrated using data on attitudes to science from the International Social Survey Program on Environment in 1993.
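For context, the adjusted principal inertias mentioned above are, in Greenacre's formulation (we assume this is the adjustment meant here), computed from the principal inertias $\lambda_k$ of the indicator-matrix analysis of $Q$ variables as $\lambda_k^{adj} = [Q/(Q-1)]^2 (\lambda_k - 1/Q)^2$ for the dimensions with $\lambda_k > 1/Q$, the remaining dimensions being discarded.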
Abstract:
We studied the decision-making process in the Dictator Game and showed that decisions are the result of a two-step process. In a first step, decision makers generate an automatic, intuitive proposal. Given sufficient motivation and cognitive resources, they adjust this in a second, more deliberate phase. In line with the social intuitionist model, we show that one's Social Value Orientation determines intuitive choice tendencies in the first step, and that this effect is mediated by the dictator's perceived interpersonal closeness with the receiver. Self-interested concerns subsequently lead to a reduction of donation size in step 2. Finally, we show that increasing interpersonal closeness can promote pro-social decision-making.
Abstract:
In cost-effectiveness analysis (CEA) it is usually assumed that a QALY is of equal value to everybody, irrespective of the patient's age. However, it is possible that society assigns different social values to a QALY according to who gets it. In this paper we discuss the possibility of weighting health benefits for age in CEA. We also examine the possibility that age-related preferences depend on the size of the health gain. An experiment was performed to test these hypotheses. The results suggest that the patient's age is a relevant factor when assessing health gains.
Abstract:
This paper presents findings from a study investigating a firm's ethical practices along the value chain. In so doing we attempt to better understand potential relationships between a firm's ethical stance towards its customers and those of its suppliers within a supply chain, and to identify particular sectoral and cultural influences that might impinge on this. Drawing upon a database comprising 667 industrial firms from 27 different countries, we found that ethical practices begin with the firm's relationship with its customers, the characteristics of which then influence the ethical stance with the firm's suppliers within the supply chain. Importantly, market structure, along with some key cultural characteristics, was also found to exert significant influence on the implementation of ethical policies in these firms.
Abstract:
A simple variant of trait group selection, employing predators as the mechanism underlying group selection, supports contingent reproductive suicide as altruism (i.e., behavior lowering personal fitness while augmenting that of another) without kin assortment. The contingent suicidal type may either saturate the population or be polymorphic with a type avoiding suicide, depending on parameters. In addition to contingent suicide, this randomly assorting morph may also exhibit continuously expressed strong altruism (sensu Wilson 1979) usually thought restricted to kin selection. The model will not, however, support a sterile worker caste as such, where sterility occurs before life history events associated with effective altruism; reproductive suicide must remain fundamentally contingent (facultative sensu West-Eberhard 1987; Myles 1988) under random assortment. The continuously expressed strong altruism supported by the model may be reinterpreted as the probability of arbitrarily committing reproductive suicide, without benefit for another; such arbitrary suicide (a "load" on "adaptive" suicide) is viable only under a more restricted parameter space relative to the necessarily concomitant adaptive contingent suicide.