858 results for Implicit Utility Maximizing Weights


Relevance:

100.00%

Publisher:

Abstract:

We aim to provide a review of the stochastic discount factor bounds usually applied to diagnose asset pricing models. In particular, we discuss the bounds used to analyze the disaster model of Barro (2006); we focus on this model because the bounds applied to study the performance of disaster models usually follow Barro's (2006) approach. We first present the entropy bounds that provide a diagnosis of the disaster model under analysis, namely the methods of Almeida and Garcia (2012, 2016) and Ghosh et al. (2016). We then discuss how their results for the disaster model relate to one another, and present findings from similar methodologies that provide different evidence on the performance of the framework developed by Barro (2006).
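For orientation, an editor's sketch (not part of the abstract): these diagnostics are restrictions that any admissible stochastic discount factor m must satisfy. Two standard forms, which the cited entropy methods generalize, are

\frac{\sigma(m)}{E[m]} \;\ge\; \frac{|E[R^e]|}{\sigma(R^e)} \quad \text{(Hansen--Jagannathan bound)}

L(m) \;\equiv\; \log E[m] - E[\log m] \;\ge\; E[\log R - \log R_f] \quad \text{(entropy bound)}

where R^e is any excess return, R any gross return, and R_f the risk-free rate. A disaster model such as Barro (2006) is diagnosed by checking whether its implied m satisfies these inequalities at observed return moments.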

Relevance:

100.00%

Publisher:

Abstract:

In an economy whose primitives are exactly those in Mirrlees (1971), we investigate the efficiency of labor income tax schedules derived under the equal sacrifice principle. Starting from a given level of government revenue, we use Werning's (2007b) approach to assess whether there is an alternative tax schedule to the one derived under the equal sacrifice principle that raises more revenue while delivering less utility to no one. For our preferred parametrizations of the problem, we find that inefficiency arises only at very high levels of income. We also show how the multipliers of the Pareto problem may be extracted from the data and used to find the implicit marginal social weights associated with each level of income.
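As a minimal formal statement (assuming a CRRA specification for illustration; the paper's preferred parametrization may differ), a tax schedule T satisfies equal sacrifice when every taxpayer gives up the same utility:

u(y) - u\big(y - T(y)\big) = s \quad \text{for all } y, \qquad u(c) = \frac{c^{1-\sigma}}{1-\sigma},

so that, given the sacrifice level s and the curvature \sigma, the schedule T(y) is implicitly pinned down at every income level.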

Relevance:

100.00%

Publisher:

Abstract:

Every time another corporate scandal captures media headlines, the 'bad apple vs. bad barrel' discussion starts anew. Yet this debate overlooks the influence of the broader societal context on organizational behavior. In this article, we argue that misbehaviors of organizations (the 'barrels') and their members (the 'apples') cannot be addressed properly without a clear understanding of their broader context (the 'larder'). Whereas previously a strong societal framework dampened the practical application of the Homo economicus concept (business actors as perfectly rational and egocentric utility-maximizing agents without any moral concern), specialization, individualization, and globalization have led to a business world disembedded from broader societal norms. This emancipated business world promotes a literal interpretation of Homo economicus among business organizations and their members. Consequently, we argue that the first step toward 'healthier' apples and barrels is to sanitize the larder, that is, to adapt the framework in which organizations and their members evolve.

Relevance:

100.00%

Publisher:

Abstract:

The presence of the informal sector is one of the main characteristics of the labor market in developing countries such as Colombia. This issue has been widely studied in recent years because of its great impact on the economy, and because the functioning of the labor market, wages, and prices differ from those of developed countries. Responsible monetary and fiscal policy must take these specific features into account.

Relevance:

100.00%

Publisher:

Abstract:

We investigate the amount of residual demand in a market consisting of only one consumer and two producers. Since there is only one consumer, we cannot really speak of a rationing rule, but we can ask whether a known rationing rule reflects the consumer's utility-maximizing behavior. We show that, if the consumer has a Cobb-Douglas utility function, then the amount purchased by the consumer from the high-price firm lies between the values determined by the efficient rationing rule and the random rationing rule. We further show that, if the consumer has a quasilinear utility function, then in the economically interesting case his residual demand function equals the residual demand function under efficient rationing.
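For reference, the two benchmark rules in their standard textbook form (notation is ours, not the paper's): with aggregate demand D(p), low-price firm capacity k_1, and prices p_1 < p_2, the residual demand facing the high-price firm is

D_E(p_2) = \max\{D(p_2) - k_1,\, 0\} \quad \text{(efficient rationing)}

D_R(p_2) = D(p_2)\Big(1 - \frac{k_1}{D(p_1)}\Big) \quad \text{(random, or proportional, rationing)}

The Cobb-Douglas result above says the single consumer's purchases from the high-price firm fall between these two values.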

Relevance:

100.00%

Publisher:

Abstract:

The objective of this work is to use some stylized facts of the "Great Recession", specifically the drastic fall in the level of bank capitalization, to analyze the relationship between financial cycles and real cycles, as well as the effectiveness of unconventional monetary policy and macroprudential policies. To this end, the first chapter develops a microfoundation of banking based on a Costly State Verification model, which is subsequently embedded in different specifications of DSGE models. The results show that: (i) financial cycles and business cycles can be related through the deterioration of bank capital; (ii) macroprudential and unconventional policies are effective in moderating business cycles, but are costly in terms of resources and inflation.
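For context, an editor's sketch of the standard costly-state-verification contract in the Bernanke-Gertler-Gilchrist tradition (the thesis's banking specification may differ): with idiosyncratic return \omega \sim F, monitoring cost share \mu, default threshold \bar\omega, return on capital R^k, capital purchases qK, net worth N, and risk-free rate R, the lender's participation constraint reads

\big[\Gamma(\bar\omega) - \mu G(\bar\omega)\big]\, R^k qK = R\,(qK - N), \qquad \Gamma(\bar\omega) = \int_0^{\bar\omega} \omega\, dF(\omega) + \bar\omega\,[1 - F(\bar\omega)], \quad G(\bar\omega) = \int_0^{\bar\omega} \omega\, dF(\omega).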

Relevance:

40.00%

Publisher:

Abstract:

Filter degeneracy is the main obstacle to implementing particle filters in non-linear, high-dimensional models. A new scheme, the implicit equal-weights particle filter (IEWPF), is introduced. In this scheme, samples are drawn implicitly from proposal densities with a different covariance for each particle, such that all particle weights are equal by construction. We test and explore the properties of the new scheme using a simple 1,000-dimensional linear model and the 1,000-dimensional non-linear Lorenz96 model, and compare the performance of the scheme to a Local Ensemble Kalman Filter. The experiments show that the new scheme can easily be implemented in high-dimensional systems and is never degenerate, with good convergence properties in both systems.
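Schematically, and with hypothetical notation (a simplified sketch, not the authors' full derivation): each particle i draws \xi_i \sim N(0, I) and is placed implicitly at

x_i = x_i^a + \alpha_i^{1/2}\, P^{1/2}\, \xi_i,

where x_i^a is the mode of particle i's proposal density, P its covariance, and the scalar \alpha_i solves a one-dimensional equation that forces particle i's weight to equal a preset common target, so equal weights hold by construction.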

Relevance:

40.00%

Publisher:

Abstract:

High molecular weight, semicrystalline thermoplastic poly(ester urethanes), TPEUs, were prepared from a vegetable oil-based diisocyanate (HPMDI), aliphatic diol chain extenders, and a poly(ethylene adipate) macrodiol using one-shot, pre-polymer, and multi-stage polyaddition methods. The optimized polymerization reaction achieved ultra-high molecular weight TPEUs (>2 million, as determined by GPC) in a short time, indicating very high HPMDI-diol reactivity. TPEUs with very well controlled hard segment (HS) and soft segment (SS) blocks were prepared and characterized with DSC, TGA, tensile analysis, and WAXD in order to reveal structure-property relationships. A confinement effect that imparts elastomeric properties to otherwise thermoplastic TPEUs was revealed. The extent of confinement was found to vary predictably with structure, indicating that one can custom-engineer tougher polyurethane elastomers by "tuning" soft segment crystallinity with a suitable HS block structure. Generally, the HPMDI-based TPEUs exhibited thermal stability and mechanical properties comparable to entirely petroleum-based TPEUs.

Relevance:

30.00%

Publisher:

Abstract:

Consider a voting procedure where countries, states, or districts comprising a union each elect representatives who then participate in later votes at the union level on their behalf. The countries, states, or districts may vary in their populations and composition. If we wish to maximize the total expected utility of all agents in the union, how should we weight the votes of the representatives of the different countries, states, or districts at the union level? We provide a simple characterization of the efficient voting rule in terms of the weights assigned to different districts and the voting threshold (how large a qualified majority is needed to induce change versus the status quo). Next, in the context of a model of the correlation structure of agents' preferences, we analyze how voting weights relate to the population size of a country. We then analyze the voting weights in the Council of the European Union under the Nice Treaty and the recently proposed constitution, and contrast them under different versions of our model.
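Two textbook benchmark cases help fix ideas (an editor's note; the paper's characterization is more general): with preferences independent within and across districts, expected-utility maximization points to square-root weights, while perfect within-district correlation calls for proportional weights,

w_j \propto \sqrt{n_j} \quad \text{(independence)}, \qquad w_j \propto n_j \quad \text{(perfect within-district correlation)},

where n_j is the population of district j.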

Relevance:

30.00%

Publisher:

Abstract:

We characterize the value function of maximizing the total discounted utility of dividend payments for a compound Poisson insurance risk model when strictly positive transaction costs are included, leading to an impulse control problem. We illustrate that well-known simple strategies can be optimal in the case of exponential claim amounts. Finally, we develop a numerical procedure to deal with general claim amount distributions.
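Heuristically, and assuming linear dividend utility net of a fixed cost K for illustration (the paper's setup may be more general): for premium rate c, claim intensity \lambda, claim distribution F, and discount rate \delta, the value function satisfies the quasi-variational inequality

\max\Big\{\, c\,V'(x) + \lambda \int_0^{x} V(x - y)\, dF(y) - (\lambda + \delta)\, V(x),\ \sup_{K < l \le x} \big[V(x - l) + l - K\big] - V(x) \Big\} = 0,

where the first term is the no-action generator and the second compares the current value with the best single lump-sum dividend payment.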

Relevance:

30.00%

Publisher:

Abstract:

Please consult the paper edition of this thesis; it is available on the 5th Floor of the Library at Call Number: Z 9999 P65 D53 2007.

Relevance:

30.00%

Publisher:

Abstract:

Particle filters are fully non-linear data assimilation techniques that aim to represent the probability distribution of the model state given the observations (the posterior) by a number of particles. In high-dimensional geophysical applications, the number of particles required by the sequential importance resampling (SIR) particle filter to capture the high-probability region of the posterior is too large to make it usable. However, particle filters can be formulated using proposal densities, which gives greater freedom in how particles are sampled and allows for a much smaller number of particles. Here a particle filter is presented which uses the proposal density to ensure that all particles end up in the high-probability region of the posterior probability density function. This gives rise to the possibility of non-linear data assimilation in high-dimensional systems. The particle filter formulation is compared to the optimal proposal density particle filter and the implicit particle filter, both of which also utilise a proposal density. We show that when observations are available every time step, both schemes will be degenerate when the number of independent observations is large, unlike the new scheme. The sensitivity of the new scheme to its parameter values is explored theoretically and demonstrated using the Lorenz (1963) model.
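As a concrete illustration, a sketch of one assimilation step with the optimal proposal density p(x_n | x_{n-1}, y_n) on a scalar linear-Gaussian toy model (the model and all names are hypothetical, not the paper's setup):

import numpy as np

def optimal_proposal_step(particles, weights, y, a=0.9, Q=1.0, R=0.5, rng=None):
    """One particle filter step with the optimal proposal for the model
    x_n = a*x_{n-1} + N(0, Q),  y_n = x_n + N(0, R)."""
    rng = rng or np.random.default_rng()
    var = 1.0 / (1.0 / Q + 1.0 / R)            # proposal variance
    mean = var * (a * particles / Q + y / R)   # proposal mean, per particle
    new_particles = mean + np.sqrt(var) * rng.standard_normal(len(particles))
    # Incremental weight p(y | x_{n-1}) does not depend on the new positions:
    incr = np.exp(-0.5 * (y - a * particles) ** 2 / (Q + R)) / np.sqrt(2 * np.pi * (Q + R))
    new_weights = weights * incr
    return new_particles, new_weights / new_weights.sum()

Because the incremental weight depends only on each particle's previous position, no particle is penalized for where it lands after sampling, which is the property the proposal-density schemes discussed above exploit.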

Relevance:

30.00%

Publisher:

Abstract:

Brazil is a country where the poorest 50% appropriate roughly 10% of aggregate income, while the richest 10% hold almost 50% of it. The corollary of this high degree of inequality is that if a person is concerned only with maximizing the level of GDP, the implicit social welfare function being adopted devotes a substantial part of its weight to the well-being of 10% of the population. In other words, Brazil's concentration of income creates an anomaly within the representative-agent perspective implicit in macroeconomic analysis, whereby people are worth what they earn. Poverty analysis inverts this weighting structure, stipulating zero weight for the non-poor segment of society and assigning individuals weights that increase with their unmet needs. This project studies the connections between recent Brazilian macroeconomic developments and the evolution of poverty. The analysis is divided into three parts. The first part describes the evolution of Brazilian poverty and its main macroeconomic determinants during the last 15 years. The second part takes advantage of the changes in poverty and inequality measured during the 1993-96 period to study their main macroeconomic determinants; given the major importance of the Real Plan, special attention is paid to the impacts of disinflation on the level and distribution of income and to the possible synergy between these two dimensions of poverty determination. The third part decomposes the changes in various poverty indices across groups defined by household-head characteristics (i.e., sex, years of schooling, race, class of worker, sector of activity, region, population density). This decomposition is then taken one step further, disentangling the changes in these different poverty cells in terms of their respective changes in per capita income inequality. This poverty profile helps to map the different sources of change in poverty in the historical analysis and provides internal consistency for the counterfactual exercises.
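The abstract does not name the poverty indices; a standard example (an editor's sketch, not necessarily the project's choice) is the Foster-Greer-Thorbecke family with a population-share decomposition across household-head groups:

P_\alpha = \frac{1}{n} \sum_{y_i < z} \left(\frac{z - y_i}{z}\right)^{\alpha}, \qquad \Delta P = \sum_k s_k\, \Delta P_k + \sum_k P_k\, \Delta s_k + \text{interaction},

where z is the poverty line, s_k is group k's population share, and P_k its poverty index.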

Relevance:

30.00%

Publisher:

Abstract:

In multi-attribute utility theory, it is often not easy to elicit precise values for the scaling weights representing the relative importance of criteria. A very widespread approach is to gather incomplete information. A recent approach for dealing with such situations is to use information about each alternative's intensity of dominance, known as dominance measuring methods. Different dominance measuring methods have been proposed, and simulation studies have been carried out to compare these methods with each other and with other approaches, but only when ordinal information about weights is available. In this paper, we use Monte Carlo simulation techniques to analyse the performance of such methods and to adapt them to deal with weight intervals, weights fitting independent normal probability distributions, or weights represented by fuzzy numbers. Moreover, dominance measuring method performance is also compared with a widely used methodology dealing with incomplete information on weights, stochastic multicriteria acceptability analysis (SMAA). SMAA is based on exploring the weight space to describe the evaluations that would make each alternative the preferred one.
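As a minimal sketch of the pairwise dominance computation under weight intervals (the function and data below are hypothetical), the minimum weighted utility difference between two alternatives is a small linear program:

import numpy as np
from scipy.optimize import linprog

def dominance_value(u_i, u_j, w_lower, w_upper):
    """Minimize sum_k w_k*(u_ik - u_jk) over weights with
    w_lower <= w <= w_upper and sum(w) == 1; a non-negative
    optimum means alternative i pairwise dominates j."""
    c = np.asarray(u_i, float) - np.asarray(u_j, float)  # linprog minimizes c @ w
    n = len(c)
    res = linprog(c, A_eq=np.ones((1, n)), b_eq=[1.0],
                  bounds=list(zip(w_lower, w_upper)))
    return res.fun

# Example: three criteria, each weight confined to [0.2, 0.5]
d = dominance_value([0.7, 0.5, 0.9], [0.6, 0.6, 0.4], [0.2] * 3, [0.5] * 3)

Dominance measuring methods then aggregate these pairwise values into a single score per alternative and rank by that score.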

Relevance:

30.00%

Publisher:

Abstract:

We propose a new method for ranking alternatives in multicriteria decision-making problems when there is imprecision concerning the alternative performances, component utility functions, and weights. We assume the decision maker's preferences are represented by an additive multiattribute utility function, in which weights can be modeled by independent normal variables, fuzzy numbers, value intervals, or an ordinal relation. The approaches are based on dominance measures or on exploring the weight space in order to describe which ratings would make each alternative the preferred one. On the one hand, the approaches based on dominance measures compute the minimum utility difference among pairs of alternatives and then compute a measure by which to rank the alternatives. On the other hand, the approaches based on exploring the weight space compute confidence factors describing the reliability of the analysis. These methods are compared using Monte Carlo simulation.
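A minimal Monte Carlo sketch of the SMAA rank-acceptability computation, assuming weights uniformly distributed on the unit simplex (names and data are hypothetical):

import numpy as np

def smaa_acceptability(utilities, n_samples=10_000, rng=None):
    """utilities: (m alternatives x k criteria) matrix of scaled utilities.
    Returns acc, where acc[i, r] estimates the probability that
    alternative i attains rank r (0 = best) under random weights."""
    rng = rng or np.random.default_rng()
    m, k = utilities.shape
    acc = np.zeros((m, m))
    for _ in range(n_samples):
        w = rng.exponential(size=k)   # normalized exponentials are
        w /= w.sum()                  # uniform on the simplex
        order = np.argsort(-(utilities @ w))   # best first
        for rank, alt in enumerate(order):
            acc[alt, rank] += 1
    return acc / n_samples

acc = smaa_acceptability(np.array([[0.7, 0.5], [0.6, 0.8], [0.9, 0.2]]))
# acc[i, 0] estimates the probability that alternative i is ranked first.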