929 results for Bayes theorem


Relevance: 10.00%

Abstract:

We construct a model in which a first mover decides on its location before it knows the identity of the second mover; joint location results in a negative externality. Contracts are inherently incomplete since the first mover's initial decision cannot be specified. We analyze several kinds of rights, including damages, injunctions, and rights to exclude (arising from covenants or land ownership). There are cases in which allocating any of these basic rights to the first mover (i.e., first-party rights) is dominated by second-party rights, and cases in which the reverse is true. A Coasian result (efficiency regardless of the rights allocation) holds only under a limited set of conditions. As corollaries of a theorem ranking the basic rights regimes, a number of results emerge contradicting conventional wisdom, including the relative inefficiency of concentrated land ownership and the relevance of the generator's identity. We conclude with a mechanism and a new rights regime that each yield the first best in all cases.

Relevance: 10.00%

Abstract:

We define a subgame perfect Nash equilibrium under Knightian uncertainty for two players, by means of a recursive backward induction procedure. We prove an extension of the Zermelo-von Neumann-Kuhn Theorem for games of perfect information, i.e., that the recursive procedure generates a Nash equilibrium under uncertainty (Dow and Werlang (1994)) of the whole game. We apply the notion to two well-known games: the chain store and the centipede. On the one hand, we show that subgame perfection under Knightian uncertainty explains the chain store paradox in a one-shot version. On the other hand, we show that subgame perfection under uncertainty does not account for the leaving behavior observed in the centipede game. This is in contrast to Dow, Orioli and Werlang (1996), where we explain by means of Nash equilibria under uncertainty (but not subgame perfect) the experiments of McKelvey and Palfrey (1992). Finally, we show that there may be nontrivial subgame perfect equilibria under uncertainty in more complex extensive form games, as in the case of the finitely repeated prisoner's dilemma, which accounts for cooperation in early stages of the game.
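The recursive backward-induction procedure this abstract extends to Knightian uncertainty can be sketched in its standard expected-utility form. The node encoding and the four-stage centipede payoffs below are illustrative assumptions, not the paper's exact model.

```python
# Backward induction on a perfect-information game tree (standard version;
# the Knightian-uncertainty extension in the paper replaces the payoff
# comparison with one over non-additive beliefs).

def backward_induction(node):
    """Return (payoff_tuple, plan) for a perfect-information game tree.

    A node is either ('leaf', (payoff_p1, payoff_p2)) or
    ('move', player, {action: child_node}).
    """
    if node[0] == 'leaf':
        return node[1], []
    _, player, actions = node
    best_action, best_value, best_plan = None, None, None
    for action, child in actions.items():
        value, plan = backward_induction(child)
        # The mover picks the action maximizing her own continuation payoff.
        if best_value is None or value[player] > best_value[player]:
            best_action, best_value, best_plan = action, value, plan
    return best_value, [best_action] + best_plan

# A hypothetical 4-stage centipede game: Take ends the game, Pass continues.
centipede = ('move', 0, {
    'take': ('leaf', (1, 0)),
    'pass': ('move', 1, {
        'take': ('leaf', (0, 2)),
        'pass': ('move', 0, {
            'take': ('leaf', (3, 1)),
            'pass': ('leaf', (2, 4)),
        }),
    }),
})

value, plan = backward_induction(centipede)
# Backward induction predicts taking immediately at the root.
```

As the abstract notes, this classical recursion predicts immediate exit in the centipede game; the paper's contribution is showing what survives of that logic under uncertainty.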

Relevance: 10.00%

Abstract:

One property (called action-consistency) that is implicit in the common prior assumption (CPA) is identified and shown to be the driving force of the use of the CPA in a class of well-known results. In particular, we show that Aumann (1987)’s Bayesian characterization of correlated equilibrium, Aumann and Brandenburger (1995)’s epistemic conditions for Nash equilibrium, and Milgrom and Stokey (1982)’s no-trade theorem are all valid without the CPA but with action-consistency. Moreover, since we show that action-consistency is much less restrictive than the CPA, the above results are more general than previously thought, and insulated from controversies around the CPA.

Relevance: 10.00%

Abstract:

This paper studies a model of a sequential auction where bidders are allowed to acquire further information about their valuations of the object in the middle of the auction. It is shown that, in any equilibrium where the distribution of the final price is atomless, a bidder's best response has a simple characterization. In particular, the optimal information acquisition point is the same, regardless of the other bidders' actions. This makes it natural to focus on symmetric, undominated equilibria, as in the Vickrey auction. An existence theorem for such a class of equilibria is presented. The paper also presents some results and numerical simulations that compare this sequential auction with the one-shot auction. Sequential auctions typically yield more expected revenue for the seller than their one-shot counterparts. So the possibility of mid-auction information acquisition can provide an explanation for why sequential procedures are more often adopted.

Relevance: 10.00%

Abstract:

Fraud detection models are used to identify whether a transaction is legitimate or fraudulent based on registration and transactional information. The technique proposed in the study presented in this dissertation is Bayesian Networks (BN); its results were compared with Logistic Regression (LR), a technique widely used in the market. The Bayesian networks evaluated were Bayesian classifiers with the Naive Bayes structure. The network structures were obtained from real data provided by a financial institution. The database was split into development and validation samples by ten-fold cross-validation. Naive Bayes classifiers were chosen for their simplicity and efficiency. Model performance was evaluated using the confusion matrix and the area under the ROC curve. The analyses revealed slightly better performance for logistic regression compared with the Bayesian classifiers. Logistic regression was chosen as the most suitable model because it performed better in predicting fraudulent operations with respect to the confusion matrix. Based on the area under the ROC curve, logistic regression showed greater ability to discriminate the operations classified correctly from those that were not.
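The Naive Bayes classification and confusion-matrix evaluation the dissertation describes can be sketched with a minimal Gaussian Naive Bayes on toy data. The "transaction" features and labels below are invented for illustration; the actual study used bank data, ten-fold cross-validation, and the area under the ROC curve.

```python
# A minimal Gaussian Naive Bayes classifier plus confusion matrix, sketching
# the kind of fraud/legitimate comparison described above (toy data only).
import math
from collections import defaultdict

def fit_gnb(X, y):
    """Per-class feature means, variances, and class priors."""
    stats = {}
    for c in set(y):
        rows = [x for x, label in zip(X, y) if label == c]
        n = len(rows)
        means = [sum(col) / n for col in zip(*rows)]
        variances = [sum((v - m) ** 2 for v in col) / n + 1e-9
                     for col, m in zip(zip(*rows), means)]
        stats[c] = (means, variances, n / len(y))
    return stats

def predict_gnb(stats, x):
    def log_posterior(c):
        means, variances, prior = stats[c]
        s = math.log(prior)
        for v, m, var in zip(x, means, variances):
            s += -0.5 * math.log(2 * math.pi * var) - (v - m) ** 2 / (2 * var)
        return s
    return max(stats, key=log_posterior)

def confusion_matrix(y_true, y_pred):
    counts = defaultdict(int)
    for t, p in zip(y_true, y_pred):
        counts[(t, p)] += 1
    return dict(counts)

# Hypothetical "transactions": (amount, hour of day); label 1 = fraudulent.
X = [(10, 14), (12, 15), (11, 13), (300, 3), (280, 2), (320, 4)]
y = [0, 0, 0, 1, 1, 1]
model = fit_gnb(X, y)
preds = [predict_gnb(model, x) for x in X]
```

The "naive" assumption is the per-feature independence visible in the summed log-likelihood; its simplicity and efficiency are exactly why the dissertation selected this structure.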

Relevance: 10.00%

Abstract:

This paper revisits Modern Portfolio Theory and derives eleven properties of Efficient Allocations and Portfolios in the presence of leverage. With different degrees of leverage, an Efficient Portfolio is a linear combination of two portfolios that lie on different efficient frontiers, which allows for an attractive reinterpretation of the Separation Theorem. In particular, a change in the investor's risk-return preferences will leave the allocation between the Minimum Risk and Risk Portfolios completely unaltered, but will change the magnitudes of the tactical risk allocations within the Risk Portfolio. The paper also discusses the role of diversification in an Efficient Portfolio, emphasizing its more tactical, rather than strategic, character.
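The mean-variance machinery behind the Minimum Risk portfolio can be sketched in the two-asset case, where the minimum-variance weights have a closed form. The variances, covariance, and expected returns below are hypothetical numbers, not the paper's.

```python
# Two-asset minimum-variance portfolio: a minimal sketch of the mean-variance
# calculations underlying the Separation Theorem discussed above.
def min_variance_weights(var1, var2, cov12):
    """Weight on asset 1 that minimizes portfolio variance (fully invested)."""
    w1 = (var2 - cov12) / (var1 + var2 - 2 * cov12)
    return w1, 1 - w1

def portfolio_stats(w1, w2, mu1, mu2, var1, var2, cov12):
    """Portfolio mean and variance for weights (w1, w2)."""
    mean = w1 * mu1 + w2 * mu2
    var = w1**2 * var1 + w2**2 * var2 + 2 * w1 * w2 * cov12
    return mean, var

# Illustrative inputs: variances 0.04 and 0.09, covariance 0.006,
# expected returns 8% and 12%.
w1, w2 = min_variance_weights(0.04, 0.09, 0.006)
mean, var = portfolio_stats(w1, w2, 0.08, 0.12, 0.04, 0.09, 0.006)
# Diversification pushes portfolio variance below either asset's own variance.
```

Any other efficient portfolio is then a combination of this Minimum Risk portfolio and a Risk Portfolio, which is the separation structure the paper reinterprets under leverage.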

Relevance: 10.00%

Abstract:

This work is divided in two parts. In the first part we develop the theory of discrete nonautonomous dynamical systems. In particular, we investigate skew-product dynamical systems, periodicity, stability, center manifolds, and bifurcation. In the second part we present some concrete models that are used in ecology/biology and economics. In addition to developing the mathematical theory of these models, we use simulations to construct graphs that illustrate and describe the dynamics of the models. One of the main contributions of this dissertation is the study of the stability of some concrete nonlinear maps using center manifold theory. The second contribution is the study of bifurcation, and in particular the construction of bifurcation diagrams in the parameter space of the autonomous Ricker competition model. Since the dynamics of the Ricker competition model is similar to that of the logistic competition model, we believe that there exists a certain class of two-dimensional maps to which our results generalize. Finally, using Brouwer's fixed point theorem and the construction of a compact, invariant, and convex subset of the space, we present a proof of the existence of a positive periodic solution of the nonautonomous Ricker competition model.
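The autonomous Ricker competition model mentioned above can be iterated in a few lines. The parameter values below are illustrative assumptions chosen so that the orbit settles on a coexistence fixed point; the dissertation studies the model's stability and bifurcations over the whole parameter space.

```python
# A sketch of the autonomous Ricker competition map:
#   x' = x * exp(r1 - x - a*y),  y' = y * exp(r2 - y - b*x).
import math

def ricker_step(x, y, r1, r2, a, b):
    """One iteration of the two-species Ricker competition map."""
    return x * math.exp(r1 - x - a * y), y * math.exp(r2 - y - b * x)

def iterate(x, y, params, n):
    for _ in range(n):
        x, y = ricker_step(x, y, *params)
    return x, y

# With weak symmetric competition (a = b = 0.5, r1 = r2 = 1) the coexistence
# fixed point solves r1 = x + a*y, r2 = y + b*x, giving x* = y* = 2/3,
# and it is locally stable, so the orbit converges to it.
params = (1.0, 1.0, 0.5, 0.5)
x, y = iterate(0.3, 0.7, params, 500)
```

Sweeping the growth parameters r1, r2 instead of fixing them is what produces the bifurcation diagrams the dissertation constructs, as the fixed point loses stability for larger growth rates.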

Relevance: 10.00%

Abstract:

Hebb proposed that synapses between neurons that fire synchronously are strengthened, forming cell assemblies and phase sequences. The former, on a shorter scale, are ensembles of synchronized cells that function transiently as a closed processing system; the latter, on a larger scale, correspond to the sequential activation of cell assemblies able to represent percepts and behaviors. Nowadays, the recording of large neuronal populations allows for the detection of multiple cell assemblies. Within Hebb's theory, the next logical step is the analysis of phase sequences. Here we detected phase sequences as consecutive assembly activation patterns, and then analyzed their graph attributes in relation to behavior. We investigated action potentials recorded from the adult rat hippocampus and neocortex before, during and after novel object exploration (experimental periods). Within assembly graphs, each assembly corresponded to a node, and each edge corresponded to the temporal sequence of consecutive node activations. The sum of all assembly activations was proportional to firing rates, but the activity of individual assemblies was not. Assembly repertoire was stable across experimental periods, suggesting that novel experience does not create new assemblies in the adult rat. Assembly graph attributes, on the other hand, varied significantly across behavioral states and experimental periods, and were separable enough to correctly classify experimental periods (Naïve Bayes classifier; maximum AUROCs ranging from 0.55 to 0.99) and behavioral states (waking, slow wave sleep, and rapid eye movement sleep; maximum AUROCs ranging from 0.64 to 0.98). Our findings agree with Hebb's view that assemblies correspond to primitive building blocks of representation, nearly unchanged in the adult, while phase sequences are labile across behavioral states and change after novel experience. The results are compatible with a role for phase sequences in behavior and cognition.
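The assembly-graph construction described above (each assembly a node, each temporal sequence of consecutive activations an edge) can be sketched directly. The activation sequence below is hypothetical; the study derived these sequences from detected assemblies in hippocampal and neocortical recordings.

```python
# Build a directed assembly graph from a sequence of assembly activations:
# nodes are assemblies, edges count how often one assembly directly follows
# another (a minimal sketch of the phase-sequence graphs analyzed above).
from collections import defaultdict

def assembly_graph(activations):
    """Edge weights: number of times assembly b immediately follows a."""
    edges = defaultdict(int)
    for a, b in zip(activations, activations[1:]):
        edges[(a, b)] += 1
    return dict(edges)

def out_degree(edges, node):
    """Number of distinct assemblies that follow `node`."""
    return sum(1 for (a, _b) in edges if a == node)

# Hypothetical sequence of consecutive assembly activations.
seq = ['A', 'B', 'C', 'A', 'B', 'D']
graph = assembly_graph(seq)
```

Graph attributes computed on such structures (degrees, path statistics, and so on) are the features that, per the abstract, separated behavioral states and experimental periods even though the assembly repertoire itself stayed stable.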

Relevance: 10.00%

Abstract:

Trigonometry, the branch of mathematics concerned with the study of triangles, developed from practical needs, especially those of astronomy, surveying, and navigation. Johann Müller, known as Regiomontanus (1436-1476), a mathematician and astronomer of the fifteenth century, played an important role in the development of this science. His work De Triangulis Omnimodis Libri Quinque, written around 1464 and published posthumously in 1533, presents the first systematic European exposition of plane and spherical trigonometry treated independently of astronomy. In this study we present a description, translation, and analysis of some aspects of this important work in the history of trigonometry. The translation was based on Barnabas Hughes's 1967 edition, Regiomontanus on Triangles, which contains the original Latin alongside an English translation. For most of our Portuguese translation we used the English version, but doubtful passages, statements, and figures were checked against the original Latin. In this work, trigonometry is treated as a branch of mathematics subordinate to geometry, that is, directed toward the study of triangles. Regiomontanus provides a large number of theorems, including the original trigonometric formula for the area of a triangle. He uses algebra to solve geometric problems and, notably, presents the first practical theorem for the law of cosines in spherical trigonometry. This study thus illustrates part of the development of trigonometry in the fifteenth century, especially with regard to concepts such as sine and cosine (versed sine). The work discussed above is of paramount importance for research in the history of mathematics, specifically in the historical analysis and critique of literary sources and in the study of the work of a particular mathematician.
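The spherical law of cosines that Regiomontanus's treatise anticipates relates two sides and their included angle to the opposite side: cos(c) = cos(a)cos(b) + sin(a)sin(b)cos(C). A quick numerical sketch (the octant example is our own illustration, not from the treatise):

```python
# Spherical law of cosines: solve for the side c opposite angle C, given
# sides a and b on the unit sphere (all quantities in radians).
import math

def spherical_side(a, b, C):
    """Side c from cos(c) = cos(a)*cos(b) + sin(a)*sin(b)*cos(C)."""
    cos_c = math.cos(a) * math.cos(b) + math.sin(a) * math.sin(b) * math.cos(C)
    return math.acos(cos_c)

# An octant of the sphere: two sides of 90 degrees meeting at a right angle
# give a third side of 90 degrees.
c = spherical_side(math.pi / 2, math.pi / 2, math.pi / 2)
```

For very small triangles the formula reduces to the planar law of cosines, which is one way to see spherical trigonometry as the more general theory the treatise systematizes.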

Relevance: 10.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 10.00%

Abstract:

Portfolio theory is a field of study devoted to investigating investors' decision-making over resources. The purpose of this process is to reduce risk through diversification and thus guarantee a return. Nevertheless, the classical Mean-Variance (MV) model has been criticized regarding its parameters: the use of variance and covariance is sensitive to the market and to parameter estimation. To reduce estimation errors, Bayesian models offer more flexibility in modeling, being able to incorporate quantitative and qualitative information about market behavior. Observing this, the present study formulates a new matrix model using Bayesian inference to replace the covariance in the MV model, called MCB (Bayesian Covariance model). To evaluate the model, hypotheses were analyzed using the ex post facto method and sensitivity analysis. The benchmarks used as reference were: (1) the classical Mean-Variance model, (2) the Bovespa market index, and (3) 94 investment funds. Returns earned over May 2002 to December 2009 demonstrated the superiority of the MCB over the classical MV model and the Bovespa index, though it takes slightly more diversifiable risk than MV. A robustness analysis over the time horizon found returns near the Bovespa index while taking less risk than the market. Finally, with respect to Mao's index, the model showed satisfactory return and risk, especially over longer horizons. Some considerations are made, as well as suggestions for further work.
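One simple way to realize the idea of replacing a raw sample covariance with a Bayesian estimate is shrinkage toward a prior covariance. The blend below is a generic conjugate-style sketch with invented numbers; the actual MCB construction in the dissertation may differ.

```python
# Hedged sketch of Bayesian covariance estimation: blend a sample covariance
# with a prior covariance, weight on the prior reflecting prior confidence.
def shrink_covariance(sample_cov, prior_cov, weight):
    """Element-wise blend: weight * prior + (1 - weight) * sample."""
    n = len(sample_cov)
    return [[weight * prior_cov[i][j] + (1 - weight) * sample_cov[i][j]
             for j in range(n)] for i in range(n)]

# Illustrative 2x2 matrices: a noisy sample estimate and a diagonal prior.
sample = [[0.04, 0.01], [0.01, 0.09]]
prior = [[0.05, 0.00], [0.00, 0.05]]
posterior = shrink_covariance(sample, prior, 0.5)
# The blended matrix damps the sample covariances toward the prior,
# which is the error-reduction effect motivating the MCB approach.
```

Feeding such a stabilized matrix into the MV optimizer in place of the raw sample covariance is the substitution the abstract describes.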

Relevance: 10.00%

Abstract:

Predictive control has received plenty of attention in recent decades, because the need to understand, analyze, predict, and control real systems has grown quickly with technological and industrial progress. The objective of this thesis is to contribute to the development and implementation of nonlinear predictive controllers based on the Hammerstein model, as well as to evaluate their properties. In the controller development, the time-step linearization method is used and a compensation term is introduced in order to improve performance. The main motivation of this thesis is the study and guarantee of stability for the nonlinear predictive controller based on the Hammerstein model, using the concepts of sectors and the Popov theorem. Simulation results with models from the literature show that the proposed approaches control with good performance and guarantee system stability.
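The Hammerstein structure the controller is built on is a static nonlinearity feeding a linear dynamic block. The polynomial nonlinearity and first-order dynamics below are illustrative assumptions, not the thesis's plant models.

```python
# A minimal Hammerstein model: static nonlinearity f(u) followed by linear
# first-order dynamics (illustrative parameters a = 0.8, b = 0.2).
def hammerstein_step(y_prev, u, a=0.8, b=0.2):
    """y[k] = a*y[k-1] + b*f(u[k-1]), with f(u) = u + 0.5*u**2."""
    v = u + 0.5 * u ** 2           # static nonlinearity
    return a * y_prev + b * v      # linear dynamic block

def simulate(inputs, y0=0.0):
    y, out = y0, []
    for u in inputs:
        y = hammerstein_step(y, u)
        out.append(y)
    return out

# Step response: with constant u = 1, f(u) = 1.5, so the steady state is
# b * f(u) / (1 - a) = 0.2 * 1.5 / 0.2 = 1.5.
ys = simulate([1.0] * 200)
```

Because the nonlinearity is static, a predictive controller can linearize it around the current operating point at each time step, which is the time-step linearization idea the thesis develops, with a compensation term absorbing the linearization error.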

Relevance: 10.00%

Abstract:

Support Vector Machines (SVM) have attracted increasing attention in the machine learning area, particularly in classification and pattern recognition. However, in some cases it is not easy to determine accurately the class to which a given pattern belongs. This thesis involves the construction of an interval pattern classifier using SVM in association with interval theory, in order to model the separation of a pattern set into distinct classes with precision, aiming at an optimized separation capable of treating imprecision contained in the initial data and generated during computational processing. The SVM is a linear machine; to allow it to solve real-world (usually nonlinear) problems, the pattern set, known as the input set, must be transformed from a nonlinear to a linear problem. Kernel machines are responsible for this mapping. To create the interval extension of SVM, for both linear and nonlinear problems, it was necessary to define an interval kernel and to extend Mercer's theorem (which characterizes a kernel function) to interval functions.
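The interval-kernel idea can be sketched by propagating interval arithmetic through a linear kernel k(x, z) = <x, z>. The minimal interval class and the input intervals below are our own illustration, not the thesis's formal construction.

```python
# Interval arithmetic through a linear kernel: each feature is an interval
# [lo, hi] capturing imprecision, and the kernel value is itself an interval.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # Interval product: take min/max over the four endpoint products.
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

def interval_linear_kernel(x, z):
    """Interval dot product of two interval vectors."""
    total = Interval(0.0, 0.0)
    for xi, zi in zip(x, z):
        total = total + xi * zi
    return total

# Hypothetical two-feature patterns with measurement imprecision.
x = [Interval(1.0, 1.2), Interval(-0.5, -0.3)]
z = [Interval(2.0, 2.0), Interval(1.0, 1.0)]
k = interval_linear_kernel(x, z)
# k brackets every kernel value attainable by point inputs inside x and z.
```

Extending Mercer's theorem to such interval-valued functions is what guarantees that an interval kernel still induces a valid feature-space mapping for the SVM.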

Relevance: 10.00%

Abstract:

Since equipment maintenance is the major cost factor in industrial plants, the development of fault-prediction techniques is very important. Three-phase induction motors are key electrical equipment in industrial applications, mainly because of their low cost and robustness; however, they are not protected from fault types such as shorted windings and broken bars. Several acquisition, processing, and signal-analysis approaches are applied to improve their diagnosis; the more efficient techniques use current sensors and current signature analysis. In this dissertation, starting from these sensors, signal analysis is performed through Park's vector, which provides good visualization capability. Since fault data acquisition is an arduous task, a methodology for database construction is developed. Park's transform into the stationary reference frame is applied in modeling the solution of the machine's differential equations. Fault detection requires a detailed analysis of the variables and their influences, which makes diagnosis more complex. Pattern-recognition tasks allow systems to be generated automatically, based on patterns and data concepts, in most cases undetectable by specialists, supporting decision tasks. Classification algorithms with diverse learning paradigms, namely k-Nearest Neighbors, neural networks, decision trees, and Naïve Bayes, are used for recognition of machine fault patterns. Multi-classifier systems are used to reduce classification errors; the homogeneous algorithms Bagging and Boosting and the heterogeneous algorithms Vote, Stacking, and StackingC were inspected. The results show the effectiveness of the constructed model for fault modeling, as well as the possibility of using multi-classifier algorithms for fault classification.
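The Park's vector computation used for current signature analysis maps the three phase currents to a two-dimensional (d, q) vector; for a healthy, balanced machine the vector traces a circle of constant magnitude, and faults distort that circle. A minimal numerical sketch (the 50 Hz balanced waveform is an assumed test signal):

```python
# Park's vector of three phase currents in the stationary reference frame:
#   id = sqrt(2/3)*ia - ib/sqrt(6) - ic/sqrt(6),  iq = (ib - ic)/sqrt(2).
import math

def parks_vector(ia, ib, ic):
    """Stationary-frame Park's vector (id, iq) of three phase currents."""
    id_ = math.sqrt(2 / 3) * ia - ib / math.sqrt(6) - ic / math.sqrt(6)
    iq = (ib - ic) / math.sqrt(2)
    return id_, iq

def balanced_phases(t, amp=1.0, w=2 * math.pi * 50):
    """Healthy, balanced three-phase currents (120-degree shifts)."""
    return (amp * math.cos(w * t),
            amp * math.cos(w * t - 2 * math.pi / 3),
            amp * math.cos(w * t + 2 * math.pi / 3))

# For balanced currents the Park's vector magnitude is constant:
# sqrt(3/2) * amp, so the (d, q) trajectory is a circle.
mags = [math.hypot(*parks_vector(*balanced_phases(t / 1000)))
        for t in range(20)]
```

Deviations of the (d, q) trajectory from this circle are the visual fault signatures the dissertation's classifiers (k-NN, neural networks, decision trees, Naïve Bayes, and their ensembles) are trained to recognize.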