132 results for Individually rational utility set
Abstract:
The empirical evidence testing the validity of the rational partisan theory (RPT) has been mixed. In this article, we argue that the inclusion of other macroeconomic policies and the presence of an independent central bank can partly explain this inconclusiveness. This article expands Alesina's (1987) RPT model to include an extra policy and an independent central bank. With these extensions, the implications of RPT are altered significantly. In particular, when the central bank is more concerned about output than public spending (an assumption made by many papers in this literature), the direct relationship between inflation and output derived in Alesina (1987) never holds. Keywords: central bank, conservativeness, political uncertainty. JEL Classification: E58, E63.
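As a point of reference for the assumption that the central bank is "more concerned about output than public spending", a standard illustrative formalization (ours, not the paper's own model) gives the bank a quadratic loss with separate weights on inflation, output and public spending:

```latex
% Illustrative central-bank loss; \lambda_y > \lambda_g encodes a bank that
% weighs output stabilization more heavily than public spending.
L^{CB} = (\pi_t - \pi^*)^2 + \lambda_y\,(y_t - \bar{y})^2 + \lambda_g\,(g_t - \bar{g})^2,
\qquad \lambda_y > \lambda_g
```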
Abstract:
When one wishes to implement public policies, one must first compare the different possible actions and evaluate them to assess their social attractiveness. Recently, the concept of well-being has been proposed as a multidimensional proxy for measuring societal prosperity and progress; a key research question is then how to measure and evaluate this plurality of dimensions for policy decisions. This paper defends a thesis articulated in the following points: 1. Different metrics are linked to different objectives and values. Using only one measurement unit (on the grounds of the so-called commensurability principle) to incorporate a plurality of dimensions, objectives and values necessarily implies reductionism. 2. Point 1 can be proven as a matter of formal logic by drawing on Geach's work in moral philosophy. This theoretical demonstration is an original contribution of this article. Here the distinction between predicative and attributive adjectives is formalised and definitions are provided. Predicative adjectives are further distinguished into absolute and relative ones. The new concepts of set commensurability and rod commensurability are also introduced. 3. The existence of a plurality of social actors with an interest in the policy being assessed means that social decisions involve multiple types of values, of which economic efficiency is only one. It is therefore misleading to base social decisions on that one value alone. 4. Weak comparability of values, which is grounded in incommensurability, is shown to be the main methodological foundation of policy evaluation in the framework of well-being economics. Incommensurability does not imply incomparability; on the contrary, incommensurability is the only rational way to compare societal options under a plurality of policy objectives. 5. Weak comparability can be implemented by using multi-criteria evaluation, a formal framework for applied consequentialism under incommensurability. Social Multi-Criteria Evaluation, in particular, allows considering technical and social incommensurability simultaneously.
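To make point 4 concrete, the following minimal sketch (our illustration, with invented criteria and scores) compares two policy options criterion by criterion, with no conversion to a common measurement unit; incommensurability does not prevent the comparison.

```python
# Weak comparability, sketched: compare options criterion by criterion,
# without commensurating everything into a single unit of measurement.

def dominates(a, b, directions):
    """True if option `a` is at least as good as `b` on every criterion
    and strictly better on at least one; `directions` states whether each
    criterion is to be maximised ("max") or minimised ("min")."""
    at_least_as_good = all(
        (x >= y) if d == "max" else (x <= y)
        for x, y, d in zip(a, b, directions)
    )
    strictly_better = any(
        (x > y) if d == "max" else (x < y)
        for x, y, d in zip(a, b, directions)
    )
    return at_least_as_good and strictly_better

# Hypothetical options scored on (GDP growth %, CO2 emissions, life expectancy).
directions = ("max", "min", "max")
policy_a = (2.1, 480, 81.2)
policy_b = (1.8, 510, 80.9)
print(dominates(policy_a, policy_b, directions))  # True, with no common unit
```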
Abstract:
Background: To enhance our understanding of complex biological systems such as diseases, we need to put all of the available data into context and use it to detect relations, patterns and rules that allow predictive hypotheses to be defined. Life science has become a data-rich science, with information about the behaviour of millions of entities such as genes, chemical compounds, diseases, cell types and organs, organised in many different databases and/or spread throughout the literature. Existing knowledge such as genotype-phenotype relations or signal transduction pathways must be semantically integrated and dynamically organised into structured networks that are connected with clinical and experimental data. Different approaches to this challenge exist, but so far none has proven entirely satisfactory. Results: To address this challenge we previously developed a generic knowledge management framework, BioXM™, which allows the dynamic, graphic generation of domain-specific knowledge representation models based on specific objects and their relations, supporting annotations and ontologies. Here we demonstrate the utility of BioXM for knowledge management in systems biology as part of the EU FP6 BioBridge project on translational approaches to chronic diseases. From clinical and experimental data, text-mining results and public databases we generate a chronic obstructive pulmonary disease (COPD) knowledge base and demonstrate its use by mining specific molecular networks together with integrated clinical and experimental data. Conclusions: We generate the first semantically integrated, COPD-specific public knowledge base and find that, for the integration of clinical and experimental data with pre-existing knowledge, the configuration-based set-up enabled by BioXM reduced implementation time and effort compared to similar systems implemented as classical software development projects. The knowledge base enables the retrieval of sub-networks including protein-protein interaction, pathway, gene-disease and gene-compound data, which are used for subsequent data analysis, modelling and simulation. Pre-structured queries and reports enhance usability; establishing their use in everyday clinical settings requires further simplification with a browser-based interface, which is currently under development.
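As a rough illustration of the sub-network retrieval described in the conclusions (entity names and the graph library are our stand-ins; BioXM's actual query interface differs), the knowledge base can be pictured as a typed graph from which a disease-centred sub-network is extracted:

```python
# Hypothetical sketch of typed-graph sub-network retrieval, not BioXM's API.
import networkx as nx

# Toy knowledge graph: nodes are entities, edges carry a relation type.
kb = nx.Graph()
kb.add_edge("TNF", "IL6", relation="protein-protein interaction")
kb.add_edge("TNF", "COPD", relation="gene-disease")
kb.add_edge("IL6", "COPD", relation="gene-disease")
kb.add_edge("TNF", "thalidomide", relation="gene-compound")

# Retrieve the sub-network of everything directly linked to COPD.
neighbours = list(kb.neighbors("COPD"))
subnet = kb.subgraph(neighbours + ["COPD"])
for u, v, data in subnet.edges(data=True):
    print(u, "--", data["relation"], "--", v)
```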
Abstract:
The recent availability of the chicken genome sequence poses the question of whether there are human protein-coding genes conserved in chicken that are currently not included in the human gene catalog. Here, we show, using comparative gene finding followed by experimental verification of exon pairs by RT–PCR, that the addition to the multi-exonic subset of this catalog could be as little as 0.2%, suggesting that we may be closing in on the human gene set. Our protocol, however, has two shortcomings: (i) the bioinformatic screening of the predicted genes, applied to filter out false positives, cannot handle intronless genes; and (ii) the experimental verification could fail to identify expression at a specific developmental time. This highlights the importance of developing methods that could provide a reliable estimate of the number of these two types of genes.
Abstract:
It has long been standard in agency theory to search for incentive-compatible mechanisms on the assumption that people care only about their own material wealth. However, this assumption is clearly refuted by numerous experiments, and we feel that it may be useful to consider nonpecuniary utility in mechanism design and contract theory. Accordingly, we devise an experiment to explore optimal contracts in an adverse-selection context. A principal proposes one of three contract menus, each of which offers a choice of two incentive-compatible contracts, to two agents whose types are unknown to the principal. The agents know the set of possible menus, and choose either to accept one of the two contracts offered in the proposed menu or to reject the menu altogether; a rejection by either agent leads to lower (and equal) reservation payoffs for all parties. While all three possible menus favor the principal, they do so to varying degrees. We observe numerous rejections of the more lopsided menus, and behavior approaches an equilibrium in which one of the more equitable contract menus (which one depends on the reservation payoffs) is proposed and agents accept a contract, selecting actions according to their types. Behavior is largely consistent with all recent models of social preferences, strongly suggesting there is value in considering nonpecuniary utility in agency theory.
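For reference, the incentive-compatibility property behind each two-contract menu is the standard adverse-selection condition (textbook notation, not the paper's): each agent type must weakly prefer the contract designed for it, so truthful self-selection is optimal whichever menu is proposed.

```latex
% Incentive compatibility for a menu {c_\theta}: each type \theta prefers
% its own contract to any other contract in the menu.
u_\theta(c_\theta) \;\ge\; u_\theta(c_{\theta'})
\quad \text{for all types } \theta, \theta'
```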
Abstract:
The paper explores the consequences that relying on different behavioral assumptions in training managers may have on their future performance. We argue that training with an emphasis on the standard assumptions used in economics (rationality and self-interest) leads future managers to rely excessively on rational and explicit safeguarding, crowding out instinctive contractual heuristics and signaling a 'bad' type to potential partners. In contrast, the assumptions about human behavior used in management theories, because of their diverse, implicit and even contradictory nature, do not conflict with the innate set of cooperative tools and may provide a good training ground for such tools. We present tentative confirmatory evidence by examining how the weight given to behavioral assumptions in the core courses of the top 100 business schools influences the average salaries of their MBA graduates. Controlling for the average quality of their students and other school characteristics, average salaries are significantly higher for schools whose core MBA courses contain a higher proportion of management courses, as opposed to courses based on economics or technical disciplines.
Abstract:
We start with a generalization of the well-known three-door problem: the n-door problem. The solution of this new problem leads us to a beautiful representation system for real numbers in (0,1] as alternating series, known in the literature as Pierce expansions. A closer look at Pierce expansions takes us to some metrical properties of sets defined through the Pierce expansions of their elements. Finally, these metrical properties enable us to present 'strange' sets, similar to the classical Cantor set.
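For readers meeting Pierce expansions here for the first time, this is the standard construction (not specific to the paper): each x in (0,1] is written as the alternating series x = 1/a_1 - 1/(a_1 a_2) + 1/(a_1 a_2 a_3) - ... with strictly increasing digits a_1 < a_2 < ..., generated by a_n = floor(1/x_n) and x_{n+1} = 1 - a_n x_n.

```python
# Minimal sketch of the Pierce-expansion digit recursion:
#   a_n = floor(1 / x_n),   x_{n+1} = 1 - a_n * x_n.
from fractions import Fraction

def pierce_digits(x, max_terms=10):
    """First Pierce-expansion digits of a rational x in (0, 1]."""
    digits = []
    while x > 0 and len(digits) < max_terms:
        a = int(1 / x)        # floor, since x > 0
        digits.append(a)
        x = 1 - a * x         # remainder used for the next digit
    return digits

print(pierce_digits(Fraction(3, 10)))  # [3, 10], since 3/10 = 1/3 - 1/30
```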
Abstract:
This paper explores biases in the elicitation of utilities under risk and the contribution that generalizations of expected utility can make to the resolution of these biases. We used five methods to measure utilities under risk and found clear violations of expected utility. Of the theories studied, prospect theory was most consistent with our data. The main improvement of prospect theory over expected utility was in comparisons between a riskless and a risky prospect (riskless-risk methods). We observed no improvement over expected utility in comparisons between two risky prospects (risk-risk methods). One explanation for why we found no improvement of prospect theory over expected utility in risk-risk methods may be that there was less overweighting of small probabilities in our study than has commonly been observed.
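For context on the probability-overweighting remark, the usual operationalisation is Tversky and Kahneman's (1992) value and weighting functions; the sketch below uses their published parameter estimates, which need not coincide with the fits obtained in this paper.

```python
# Standard prospect-theory building blocks (Tversky & Kahneman, 1992).

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Value function: concave for gains, convex and steeper for losses."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** beta)

def weight(p, gamma=0.61):
    """Probability weighting: overweights small p, underweights large p."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# A simple risky prospect: win 100 with probability 0.1.
print(weight(0.1))               # ~0.186 > 0.1: the small probability is overweighted
print(weight(0.1) * value(100))  # prospect-theory evaluation of the gamble
```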
Abstract:
This paper formalizes, in a fully rational model, the popular idea that politicians perceive an electoral cost in adopting costly reforms with future benefits, and reconciles it with the evidence that reformist governments are not punished by voters. To do so, it proposes a model of elections where political ability is ex ante unknown and investment in reforms is unobservable. On the one hand, elections improve accountability and allow voters to retain well-performing incumbents. On the other, politicians undertake too few reforms in an attempt to signal high ability and increase their reappointment probability. Although in a rational expectations equilibrium voters cannot be fooled, and hence reelection does not depend on reforms, the strategy of underinvesting in reforms is nonetheless sustained by out-of-equilibrium beliefs. Contrary to the conventional wisdom, uncertainty makes reforms more politically viable and may, under some conditions, increase social welfare. The model is then used to study how political rewards can be set so as to maximize social welfare, and the desirability of imposing a one-term limit on governments. The predictions of this theory are consistent with a number of empirical regularities on the determinants of reforms and reelection. They are also consistent with a new stylized fact documented in this paper: economic uncertainty is associated with more reforms in a panel of 20 OECD countries.
Abstract:
Models incorporating more realistic customer behavior, such as customers choosing from an offer set, have recently become popular in assortment optimization and revenue management. The dynamic program for these models is intractable and is approximated by a deterministic linear program, called the CDLP, which has an exponential number of columns. When there are products that are being considered for purchase by more than one customer segment, the CDLP is difficult to solve, since column generation is known to be NP-hard. However, recent research indicates that a formulation based on segments with cuts imposing consistency (SDCP+) is tractable and approximates the CDLP value very closely. In this paper we investigate the structure of the consideration sets that makes the two formulations exactly equal. We show that if the segment consideration sets follow a tree structure, then CDLP = SDCP+. We give a counterexample to show that cycles can induce a gap between the CDLP and the SDCP+ relaxation. We derive two classes of valid inequalities, called flow and synchronization inequalities, to further improve SDCP+, based on cycles in the consideration set structure. We give a numerical study showing the performance of these cycle-based cuts.
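The tree-structure condition can be checked mechanically: create one node per segment, add an edge whenever two segments' consideration sets overlap, and test the resulting graph for cycles. The sketch below is our simplification of that condition (the paper's formal definition may be finer-grained).

```python
# Build the overlap graph of segment consideration sets and test whether it
# is a forest; per the abstract, a tree structure implies CDLP = SDCP+.
import networkx as nx

consideration_sets = {
    "segment1": {"productA", "productB"},
    "segment2": {"productB", "productC"},
    "segment3": {"productC", "productD"},
}

overlap = nx.Graph()
overlap.add_nodes_from(consideration_sets)
segments = list(consideration_sets)
for i, s in enumerate(segments):
    for t in segments[i + 1:]:
        if consideration_sets[s] & consideration_sets[t]:
            overlap.add_edge(s, t)

print(nx.is_forest(overlap))  # True here: no cycles, so no CDLP/SDCP+ gap
```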
Abstract:
By identifying types whose low-order beliefs up to level l_i about the state of nature coincide, we obtain quotient type spaces that are typically smaller than the original ones, preserve basic topological properties, and allow standard equilibrium analysis even under bounded reasoning. Our Bayesian Nash (l_i, l_{-i})-equilibria capture players' inability to distinguish types belonging to the same equivalence class. The case with uncertainty about the vector of levels (l_i, l_{-i}) is also analyzed. Two examples illustrate the constructions.
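In standard hierarchies-of-beliefs notation (ours, not necessarily the paper's), the identification step pools two types exactly when their belief hierarchies agree up to level l_i:

```latex
% h_i^k(t_i) denotes the k-th order belief induced by type t_i; the quotient
% type space collects the equivalence classes of \sim_{l_i}.
t_i \sim_{l_i} t_i' \iff h_i^k(t_i) = h_i^k(t_i') \quad \text{for all } k \le l_i
```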
Abstract:
Through an experiment, we investigate how the level of rationality relates to concerns for equality and efficiency. Subjects perform dictator games and a guessing game. More rational subjects are not more frequently of the self-regarding type. When the comparison is performed within the same degree of rationality, self-regarding subjects show more strategic sophistication than other subjects.
Abstract:
In today's competitive markets, the importance of good scheduling strategies in manufacturing companies leads to the need to develop efficient methods for solving complex scheduling problems. In this paper, we study two production scheduling problems with sequence-dependent setup times. Setup times are one of the most common complications in scheduling problems and are usually associated with cleaning operations and changing tools and shapes in machines. The first problem considered is single-machine scheduling with release dates, sequence-dependent setup times and delivery times, where the performance measure is the maximum lateness. The second problem is a job-shop scheduling problem with sequence-dependent setup times where the objective is to minimize the makespan. We present several priority dispatching rules for both problems, followed by a study of their performance. Finally, conclusions and directions for future research are presented.
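As an illustration of what a priority dispatching rule for the first problem might look like (a minimal sketch with invented data, not the rules studied in the paper): whenever the machine becomes free, schedule the released job with the longest delivery time, then report max_j (C_j + q_j), the usual maximum-lateness measure for models with release and delivery times.

```python
# Single machine, release dates r[j], processing times p[j], delivery times
# q[j], sequence-dependent setup times s[i][j]; priority = longest delivery
# time among released jobs (a Jackson-style dispatching rule).

def dispatch(r, p, q, s):
    unscheduled = set(range(len(p)))
    time, last, lmax = 0, None, 0
    while unscheduled:
        released = [j for j in unscheduled if r[j] <= time]
        if not released:                       # idle until the next release
            time = min(r[j] for j in unscheduled)
            continue
        j = max(released, key=lambda k: q[k])  # pick longest delivery time
        setup = s[last][j] if last is not None else 0
        time += setup + p[j]                   # completion time C_j
        lmax = max(lmax, time + q[j])          # track max_j (C_j + q_j)
        unscheduled.remove(j)
        last = j
    return lmax

# Three hypothetical jobs; s[i][j] is the setup incurred when j follows i.
r, p, q = [0, 1, 2], [3, 2, 2], [5, 8, 1]
s = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]
print(dispatch(r, p, q, s))  # 14 for this instance
```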
Abstract:
Researchers have used stylized facts on asset prices and trading volume in stock markets (in particular, the mean reversion of asset returns and the correlations between trading volume, price changes and price levels) to support theories where agents are not rational expected-utility maximizers. This paper shows that this empirical evidence is in fact consistent with a standard infinite-horizon, perfect-information, expected-utility economy in which some agents face leverage constraints similar to those found in today's financial markets. In addition, and in sharp contrast to the theories above, we explain some qualitative differences that are observed in the price-volume relation on stock and on futures markets. We consider a continuous-time economy where agents maximize the integral of their discounted utility from consumption under both budget and leverage constraints. Building on the work by Vila and Zariphopoulou (1997), we find a closed-form solution, up to a negative constant, for the equilibrium prices and demands in the region of the state space where the constraint is non-binding. We show that, in equilibrium, stock holdings volatility, as well as its ratio to stock price volatility, is an increasing function of the stock price, and we interpret this finding in terms of the price-volume relation.
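In standard continuous-time notation (ours, not necessarily the paper's), each agent's problem described in the abstract reads:

```latex
% Discounted utility from consumption, maximised subject to wealth dynamics
% (the budget constraint) and a cap on stock holdings (the leverage
% constraint); \theta_t is the stock holding and k bounds leverage.
\max_{c,\,\theta}\; \mathbb{E}\!\left[\int_0^{\infty} e^{-\rho t}\, u(c_t)\, dt\right]
\quad \text{s.t.} \quad
dW_t = \bigl(r W_t + \theta_t(\mu - r) - c_t\bigr)\,dt + \theta_t \sigma\, dZ_t,
\qquad \theta_t \le k\, W_t
```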