952 results for Gelfand-Shilov Theorem
Abstract:
We define a subgame perfect Nash equilibrium under Knightian uncertainty for two players by means of a recursive backward-induction procedure. We prove an extension of the Zermelo-von Neumann-Kuhn Theorem for games of perfect information, i.e., that the recursive procedure generates a Nash equilibrium under uncertainty (Dow and Werlang (1994)) of the whole game. We apply the notion to two well-known games: the chain store and the centipede. On the one hand, we show that subgame perfection under Knightian uncertainty explains the chain-store paradox in a one-shot version. On the other hand, we show that subgame perfection under uncertainty does not account for the leaving behavior observed in the centipede game. This is in contrast to Dow, Orioli and Werlang (1996), where we explain the experiments of McKelvey and Palfrey (1992) by means of Nash equilibria under uncertainty (but not subgame perfect). Finally, we show that there may be nontrivial subgame perfect equilibria under uncertainty in more complex extensive-form games, as in the case of the finitely repeated prisoner's dilemma, which accounts for cooperation in early stages of the game.
Abstract:
This paper proves the existence and uniqueness of a fixed point for local contractions without assuming the family of contraction coefficients to be uniformly bounded away from 1. More importantly, it shows how this fixed-point result can be applied to study the existence and uniqueness of solutions to some recursive equations that arise in economic dynamics.
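The abstract above concerns a generalization of the classical fixed-point machinery. As a hedged illustration of only the standard special case (a uniform contraction, which the paper's result relaxes), the usual iteration scheme can be sketched as:

```python
# Sketch: classical Banach fixed-point iteration (uniform contraction case).
# The paper's local-contraction setting is more general; this illustrates
# only the textbook special case.
import math

def fixed_point(f, x0, tol=1e-12, max_iter=10_000):
    """Iterate x_{n+1} = f(x_n) until successive iterates are within tol."""
    x = x0
    for _ in range(max_iter):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    raise RuntimeError("iteration did not converge")

# Example: f(x) = cos(x) is a contraction on [0, 1]; the fixed point
# solves x = cos(x) (the Dottie number, approximately 0.739085).
root = fixed_point(math.cos, 0.5)
print(round(root, 6))
```

The contraction property guarantees both that the iteration converges and that the limit is the unique fixed point.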
Abstract:
Automation and robotization have increasingly become topics of discussion in hundreds of Brazilian industries, where the clear, identified trend is toward substantial investment in process and product improvement through these technologies, with a focus, wherever possible, on the domestic production of equipment. This work aims to evaluate the model proposed by Paul Kennedy (1993) regarding the trend toward automation and robotization in world industry, examining his study in the context of an emerging economy such as Brazil's. To that end, we surveyed companies in Brazil across different industrial segments, as well as the state of the art in automation and robotics technology applied to industrial processes, and we propose a model different from the one originally conceived by Kennedy. Kennedy's analysis rested on the principle that discrete mathematics calls the "law of the excluded middle": according to him, Brazil would today be experiencing a gradual migration of industries to rich countries. Brazil is an example of an industrialized, emerging-economy country that invests heavily in automated processes yet is not classified among those rich countries. Based on the research carried out, we present a new model in which emerging countries such as Brazil have access to state-of-the-art automation and robotics technology and apply it in their industrial processes.
Abstract:
The main objective of this paper is to propose a novel setup that allows estimating separately the welfare costs of the uncertainty stemming from business-cycle fluctuations and from economic-growth variation, when the two types of shocks associated with them (respectively, transitory and permanent shocks) hit consumption simultaneously. Separating these welfare costs requires dealing with degenerate bivariate distributions. Lévy's Continuity Theorem and the Disintegration Theorem allow us to adequately define the one-dimensional limiting marginal distributions. Under Normality, we show that the parameters of the original marginal distributions are not affected, providing the means for calculating separately the welfare costs of business-cycle fluctuations and of economic-growth variation. Our empirical results show that, if we consider only transitory shocks, the welfare cost of business cycles is much smaller than previously thought. Indeed, we found it to be negative: -0.03% of per-capita consumption! On the other hand, we found that the welfare cost of economic-growth variation is relatively large. Our estimate for reasonable preference-parameter values shows that it is 0.71% of consumption, or US$208.98 per person, per year.
Abstract:
This work estimates an empirical model relating advertising expenditure to firm revenue, intended to serve as a decision-making tool, through a case study of the telecommunications industry. According to 2008 IBGE data, the communication (advertising) industry in Brazil accounts for 4% of GDP, generating revenues of around 115 billion reais; its 113 thousand firms create 711 thousand jobs, employ 866 thousand people, and pay 11.8 billion in wages and payroll charges. Nevertheless, most marketing managers report having no instruments to measure the impact of their actions on company results. The empirical model is estimated on monthly data for Embratel's domestic long-distance call services from January 2009 to December 2011. This information is rarely available and could be used only under a confidentiality agreement. Using cointegration techniques, we computed the long-run elasticity of revenue with respect to advertising expenditure and to price, as well as the corresponding speeds of adjustment to short-run deviations. The results suggest that revenue responds positively to changes in advertising expenditure, although the percentage is relatively low; using the Dorfman-Steiner theorem, we find that the optimal ratio of advertising expenditure to revenue would be approximately 20%, subject to the model's limitations.
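The Dorfman-Steiner condition invoked in the abstract states that, at the optimum, the advertising-to-revenue ratio equals the ratio of the advertising elasticity of demand to the (absolute) price elasticity. A minimal sketch, with hypothetical elasticity values that are not the paper's estimates:

```python
# Sketch of the Dorfman-Steiner condition: optimal advertising spending
# as a share of revenue equals e_A / |e_P|, where e_A is the advertising
# elasticity and e_P the price elasticity of demand.
# The elasticity values below are illustrative assumptions.

def dorfman_steiner_ratio(adv_elasticity, price_elasticity):
    """Optimal advertising-to-revenue ratio: e_A / |e_P|."""
    return adv_elasticity / abs(price_elasticity)

ratio = dorfman_steiner_ratio(adv_elasticity=0.30, price_elasticity=-1.5)
print(f"optimal advertising share of revenue: {ratio:.0%}")
```

With these hypothetical elasticities the condition yields a 20% share, the same order of magnitude the abstract reports.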
Abstract:
We construct a model in which a first mover decides on its location before it knows the identity of the second mover; joint location results in a negative externality. Contracts are inherently incomplete, since the first mover's initial decision cannot be specified. We analyze several kinds of rights, including damages, injunctions, and rights to exclude (arising from covenants or land ownership). There are cases in which allocating any of these basic rights to the first mover (i.e., first-party rights) is dominated by second-party rights, and cases in which the reverse is true. A Coasian result (efficiency regardless of the rights allocation) holds only under a limited set of conditions. As corollaries of a theorem ranking the basic rights regimes, a number of results emerge contradicting conventional wisdom, including the relative inefficiency of concentrated land ownership and the relevance of the generator's identity. We conclude with a mechanism and a new rights regime that each yield the first best in all cases.
Abstract:
One property (called action-consistency) that is implicit in the common prior assumption (CPA) is identified and shown to be the driving force of the use of the CPA in a class of well-known results. In particular, we show that Aumann's (1987) Bayesian characterization of correlated equilibrium, Aumann and Brandenburger's (1995) epistemic conditions for Nash equilibrium, and Milgrom and Stokey's (1982) no-trade theorem are all valid without the CPA but with action-consistency. Moreover, since we show that action-consistency is much less restrictive than the CPA, the above results are more general than previously thought, and insulated from controversies around the CPA.
Abstract:
This paper studies a model of a sequential auction where bidders are allowed to acquire further information about their valuations of the object in the middle of the auction. It is shown that, in any equilibrium where the distribution of the final price is atomless, a bidder's best response has a simple characterization. In particular, the optimal information-acquisition point is the same regardless of the other bidders' actions. This makes it natural to focus on symmetric, undominated equilibria, as in the Vickrey auction. An existence theorem for such a class of equilibria is presented. The paper also presents some results and numerical simulations that compare this sequential auction with the one-shot auction. Sequential auctions typically yield more expected revenue for the seller than their one-shot counterparts. So the possibility of mid-auction information acquisition can provide an explanation for why sequential procedures are more often adopted.
Abstract:
This paper revisits Modern Portfolio Theory and derives eleven properties of Efficient Allocations and Portfolios in the presence of leverage. With different degrees of leverage, an Efficient Portfolio is a linear combination of two portfolios that lie on different efficient frontiers, which allows for an attractive reinterpretation of the Separation Theorem. In particular, a change in the investor's risk-return preferences will leave the allocation between the Minimum Risk and Risk Portfolios completely unaltered, but will change the magnitudes of the tactical risk allocations within the Risk Portfolio. The paper also discusses the role of diversification in an Efficient Portfolio, emphasizing its more tactical, rather than strategic, character.
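The two-fund separation idea the abstract reinterprets can be sketched in a few lines: an efficient allocation is a preference-weighted combination of a minimum-risk portfolio and a risk portfolio, and only the mixing weight depends on preferences. The weights below are hypothetical, not the paper's:

```python
# Sketch of two-fund separation: every efficient portfolio is a linear
# combination of a minimum-risk portfolio and a risk portfolio; investor
# preferences move only the mixing weight alpha.
# All portfolio weights here are illustrative assumptions.
min_risk = [0.7, 0.3, 0.0]   # hypothetical minimum-risk portfolio weights
risk     = [0.2, 0.3, 0.5]   # hypothetical risk portfolio weights
alpha = 0.6                  # preference-driven fraction in the risk portfolio

efficient = [(1 - alpha) * m + alpha * r for m, r in zip(min_risk, risk)]
print([round(w, 2) for w in efficient])  # combined weights still sum to 1
```

Changing alpha traces out the whole efficient set without ever recomputing the two underlying portfolios, which is the practical content of the Separation Theorem.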
Abstract:
This thesis presents general methods in non-Gaussian analysis in infinite-dimensional spaces. As main applications, we study Poisson and compound Poisson spaces. Given a probability measure μ on a co-nuclear space, we develop an abstract theory based on generalized Appell systems, which are bi-orthogonal. We study its properties as well as the generated Gelfand triples. As an example, we consider the important case of Poisson measures. The product and Wick calculus are developed in this context. We provide formulas for the change of the generalized Appell system under a transformation of the measure. The L² structures for the Poisson, compound Poisson, and Gamma measures are elaborated. We exhibit the chaos decomposition using the Fock isomorphism, and we obtain the representation of the creation and annihilation operators. We construct two types of differential geometry on the configuration space over a differentiable manifold. These two geometries are related through the Dirichlet forms for Poisson measures as well as for their perturbations. Finally, we construct the internal geometry on the compound configuration space, in particular the intrinsic gradient, the divergence, and the Laplace-Beltrami operator. As a result, we may define the Dirichlet forms associated with a diffusion process. Consequently, we obtain the representation of the Lie algebra of vector fields with compact support. All these results extend directly to the marked Poisson spaces.
Abstract:
This work is divided in two parts. In the first part we develop the theory of discrete nonautonomous dynamical systems. In particular, we investigate skew-product dynamical systems, periodicity, stability, center manifolds, and bifurcation. In the second part we present some concrete models used in ecology/biology and economics. In addition to developing the mathematical theory of these models, we use simulations to construct graphs that illustrate and describe their dynamics. One of the main contributions of this dissertation is the study of the stability of some concrete nonlinear maps using center manifold theory. The second contribution is the study of bifurcation, and in particular the construction of bifurcation diagrams in the parameter space of the autonomous Ricker competition model. Since the dynamics of the Ricker competition model are similar to those of the logistic competition model, we believe that there exists a certain class of two-dimensional maps to which our results generalize. Finally, using Brouwer's fixed-point theorem and the construction of a compact, invariant, convex subset of the space, we present a proof of the existence of a positive periodic solution of the nonautonomous Ricker competition model.
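The autonomous Ricker competition map studied in the abstract can be simulated in a few lines. A minimal sketch, with illustrative parameter values (not the dissertation's) chosen so the coexistence fixed point is attracting:

```python
# Sketch of the autonomous Ricker competition model:
#   x' = x * exp(r1 - x - a*y),   y' = y * exp(r2 - y - b*x).
# With r1 = r2 = 1.5 and a = b = 0.5 the coexistence fixed point (1, 1)
# solves r1 = x + a*y, r2 = y + b*x and is locally stable.
# Parameter values are illustrative assumptions only.
import math

def ricker_competition(x, y, r1=1.5, r2=1.5, a=0.5, b=0.5):
    return (x * math.exp(r1 - x - a * y),
            y * math.exp(r2 - y - b * x))

x, y = 0.5, 0.7            # arbitrary positive initial populations
for _ in range(500):       # iterate until the orbit settles
    x, y = ricker_competition(x, y)
print(round(x, 4), round(y, 4))
```

Sweeping the parameters r1, r2 instead of fixing them is how bifurcation diagrams like those described in the abstract are produced.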
Abstract:
Trigonometry, the branch of mathematics devoted to the study of triangles, developed from practical needs, especially those of astronomy, surveying, and navigation. Johann Müller, known as Regiomontanus (1436-1476), a mathematician and astronomer of the fifteenth century, played an important role in the development of this science. His work De Triangulis Omnimodis Libri Quinque, written around 1464 and published posthumously in 1533, presents the first systematic European exposition of plane and spherical trigonometry, in a treatment independent of astronomy. In this study we present a description, translation, and analysis of some aspects of this important work in the history of trigonometry. The translation was carried out using Barnabas Hughes's 1967 edition, Regiomontanus on Triangles, which contains the original work in Latin together with an English translation. For most of our Portuguese translation we relied on the English version, but doubtful utterances, statements, and figures were checked against the original Latin. In this work, trigonometry is treated as a branch of mathematics subordinated to geometry, that is, directed toward the study of triangles. Regiomontanus provides a large number of theorems, including the original trigonometric formula for the area of a triangle. He uses algebra to solve geometric problems and, most notably, presents the first practical theorem for the law of cosines in spherical trigonometry. This study thus illustrates part of the development of trigonometry in the fifteenth century, especially with regard to concepts such as sine and cosine (versed sine). The work discussed here is of paramount importance for research in the history of mathematics, more specifically in the historical analysis and critique of literary sources and in the study of the work of a particular mathematician.
Abstract:
Predictive control has received plenty of attention in recent decades, because the need to understand, analyze, predict, and control real systems has grown quickly with technological and industrial progress. The objective of this thesis is to contribute to the development and implementation of nonlinear predictive controllers based on the Hammerstein model, as well as to evaluate their properties. In the development of the nonlinear predictive controller, the time-step linearization method is used and a compensation term is introduced in order to improve the controller's performance. The main motivation of this thesis is the study and guarantee of stability for the nonlinear predictive controller based on the Hammerstein model; for this, the concepts of sector conditions and the Popov Theorem were used. Simulation results with models from the literature show that the proposed approaches achieve good control performance and guarantee system stability.
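The Hammerstein structure underlying the controller above is a static nonlinearity followed by linear dynamics. A hedged sketch of one simulation step, using an assumed polynomial nonlinearity and first-order linear block (not the thesis's plant):

```python
# Hedged sketch of a Hammerstein model: a static nonlinear block f(u)
# feeding a linear dynamic block. Both the quadratic nonlinearity and
# the first-order filter coefficients are illustrative assumptions.

def hammerstein_step(y_prev, u, a=0.8, b=0.2):
    """One simulation step of the cascaded model."""
    v = u + 0.5 * u**2           # static nonlinear block (assumed f)
    return a * y_prev + b * v    # linear block: y_k = a*y_{k-1} + b*v_k

y = 0.0
for _ in range(100):             # drive with a constant input u = 1
    y = hammerstein_step(y, u=1.0)
print(round(y, 4))               # settles at b*f(1)/(1-a) = 1.5
```

It is this separation (all the nonlinearity confined to a static map) that makes the time-step linearization mentioned in the abstract tractable: at each step the controller only relinearizes f around the current input.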