231 results for Priors


Relevance:

10.00%

Publisher:

Abstract:

An analytic solution to the multi-target Bayes recursion known as the δ-Generalized Labeled Multi-Bernoulli (δ-GLMB) filter was recently proposed by Vo and Vo in ["Labeled Random Finite Sets and Multi-Object Conjugate Priors," IEEE Trans. Signal Process., vol. 61, no. 13, pp. 3460-3475, 2013]. As a sequel to that paper, the present paper details efficient implementations of the δ-GLMB multi-target tracking filter. Each iteration of this filter involves an update operation and a prediction operation, both of which result in weighted sums of multi-target exponentials with an intractably large number of terms. To truncate these sums, the ranked assignment and K-shortest path algorithms are used in the update and prediction steps, respectively, to determine the most significant terms without exhaustively computing all of them. In addition, using tools derived from the same framework, such as probability hypothesis density filtering, we present inexpensive (relative to the δ-GLMB filter) look-ahead strategies to reduce the number of computations. A characterization of the L1-error in the multi-target density arising from the truncation is also presented.
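
To make the truncation step concrete, here is a minimal Python sketch (not the paper's ranked-assignment or K-shortest-path machinery, just the generic idea those algorithms serve): given the weights of the hypotheses in one of these sums, keep only the K most significant terms, renormalize, and report the L1 mass that was discarded. The function name and the toy weights are illustrative only.

    import heapq

    def truncate_hypotheses(weights, K):
        # Keep the K highest-weight terms of a weighted sum and renormalize.
        # This is only the generic truncation step; the delta-GLMB filter
        # obtains the significant terms without enumerating them all, via
        # ranked assignment (update) and K-shortest path (prediction).
        top = heapq.nlargest(K, range(len(weights)), key=lambda i: weights[i])
        kept = sum(weights[i] for i in top)
        l1_error = sum(weights) - kept          # total weight of the discarded terms
        normalized = {i: weights[i] / kept for i in top}
        return normalized, l1_error

    # toy hypothesis weights; keep the three most significant terms
    weights = [0.35, 0.25, 0.15, 0.12, 0.08, 0.05]
    significant, err = truncate_hypotheses(weights, K=3)
    print(significant, err)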

Relevance:

10.00%

Publisher:

Abstract:

In this paper we apply the theory of decision making with expected utility and non-additive priors to the choice of an optimal portfolio. This theory describes the behavior of a rational agent who is averse to pure 'uncertainty' (as well as, possibly, to 'risk'). We study the agent's optimal allocation of wealth between a safe and an uncertain asset. We show that there is a range of prices at which the agent neither buys nor sells short the uncertain asset. In contrast, the standard theory of expected utility predicts that there is exactly one such price. We also provide a definition of an increase in uncertainty aversion and show that it causes this range of prices to widen.
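
As a toy illustration of the no-trade interval, under the common multiple-priors (maxmin) reading of aversion to uncertainty and with made-up payoffs, priors and a risk-neutral agent:

    # Toy illustration (assumed payoffs and priors): with a set of priors, a
    # risk-neutral uncertainty-averse agent buys only below the lowest
    # expected payoff and sells short only above the highest, so an entire
    # interval of prices supports zero holdings, unlike the single price
    # predicted by standard expected utility.
    payoffs = [0.0, 1.0]                 # uncertain asset pays 0 or 1
    priors = [(0.4, 0.6), (0.6, 0.4)]    # two candidate probability measures

    expected = [sum(p * x for p, x in zip(prior, payoffs)) for prior in priors]
    buy_below, sell_above = min(expected), max(expected)
    print(f"no-trade price range: [{buy_below:.2f}, {sell_above:.2f}]")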

Relevance:

10.00%

Publisher:

Abstract:

We transform a non-cooperative game into a Bayesian decision problem for each player, in which the uncertainty faced by a player consists of the strategy choices of the other players, the priors of the other players over the choices of their opponents, the priors over priors, and so on. We provide a complete characterization of the relationship between the extent of knowledge about the rationality of players and their ability to successfully eliminate strategies which are not best responses. This paper therefore provides the informational foundations of iteratively undominated strategies and rationalizable strategic behavior (Bernheim (1984) and Pearce (1984)). Moreover, sufficient conditions are also found for Nash equilibrium behavior. We also provide Aumann's (1985) results on correlated equilibria.
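
A rough finite-game analogue of the elimination process, sketched in Python: iterated deletion of pure strategies strictly dominated by other pure strategies in a two-player game. This is only a crude stand-in for the paper's belief-hierarchy construction; the function name and payoff matrices are illustrative.

    def iterated_elimination(payoff1, payoff2):
        # Iteratively delete pure strategies strictly dominated by another
        # pure strategy in a finite two-player game with payoff matrices
        # payoff1[i][j] (row player) and payoff2[i][j] (column player).
        rows = list(range(len(payoff1)))
        cols = list(range(len(payoff1[0])))
        changed = True
        while changed:
            changed = False
            for i in list(rows):
                if any(all(payoff1[k][j] > payoff1[i][j] for j in cols)
                       for k in rows if k != i):
                    rows.remove(i)
                    changed = True
            for j in list(cols):
                if any(all(payoff2[i][k] > payoff2[i][j] for i in rows)
                       for k in cols if k != j):
                    cols.remove(j)
                    changed = True
        return rows, cols

    # Prisoner's dilemma: cooperation (index 0) is strictly dominated for both.
    p1 = [[3, 0], [5, 1]]
    p2 = [[3, 5], [0, 1]]
    print(iterated_elimination(p1, p2))  # ([1], [1])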

Relevance:

10.00%

Publisher:

Abstract:

Kalai and Lehrer (1993a, b) have recently shown that, for infinitely repeated games, a coordination assumption on beliefs and optimal strategies ensures convergence to Nash equilibrium. In this paper, we show that for repeated games with a long (but finite) horizon, their condition does not imply approximate Nash equilibrium play. Recently, Kalai and Lehrer (1993a, b) proved that a coordination assumption on beliefs and optimal strategies ensures that players of an infinitely repeated game eventually play ε-close to an ε-Nash equilibrium. Their coordination assumption requires that if players believe that a certain set of outcomes has positive probability, then this set of outcomes must, in fact, have positive probability. This coordination assumption is called absolute continuity. For finitely repeated games, absolute continuity is a quite innocuous assumption that just ensures that players can revise their priors by Bayes' law. However, for infinitely repeated games, absolute continuity is a stronger requirement, because it also refers to events that can never be observed in finite time.
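
A toy sketch of the Bayes'-law revision that absolute continuity licenses in the finite-horizon case: a player updates a prior over two hypothetical opponent strategies after each observed action, and the update breaks down exactly when an observed event had prior probability zero. All names and probabilities are made up.

    # A player's prior over two hypothetical opponent strategies is revised by
    # Bayes' law after each observed action; the revision is well defined only
    # if the observed history had positive prior probability.
    prior = {"always_cooperate": 0.5, "noisy_cooperator": 0.5}
    prob_C = {"always_cooperate": 1.0, "noisy_cooperator": 0.7}  # assumed per-period P(play C)

    observed = ["C", "C", "D"]
    posterior = dict(prior)
    for action in observed:
        for s in posterior:
            likelihood = prob_C[s] if action == "C" else 1.0 - prob_C[s]
            posterior[s] *= likelihood
        total = sum(posterior.values())
        if total == 0:   # the observed event had prior probability zero
            raise ValueError("Bayes' law undefined: absolute continuity fails")
        posterior = {s: w / total for s, w in posterior.items()}
    print(posterior)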

Relevance:

10.00%

Publisher:

Abstract:

Kajii and Ui (2008) recently proposed to characterize interim efficient allocations in an exchange economy under asymmetric information when uncertainty is represented by multiple posteriors. When agents have Bewley's incomplete preferences, Kajii and Ui (2008) proposed a necessary and sufficient condition on the sets of posteriors. However, when agents have Gilboa-Schmeidler's MaxMin expected utility preferences, they only propose a sufficient condition. The objective of this paper is to complete Kajii and Ui's work by proposing a necessary and sufficient condition for interim efficiency for various models of ambiguity aversion, and in particular MaxMin expected utility. Our proof is based on a direct application of results by Rigotti, Shannon and Strzalecki (2008).

Relevance:

10.00%

Publisher:

Abstract:

This study aims to identify the grounds that lead some municipal councils (Câmaras Municipais) in Pernambuco not to follow the prior opinion of the State Court of Accounts (Tribunal de Contas do Estado, TCE) when judging the mayors' annual accounts. Given that a significant share of the judgments of mayors' annual accounts carried out by the municipal councils has not followed the recommendations issued in the TCE's prior opinions, we first identified the theoretical approaches to the structure of the relationship between the executive and the legislature that help explain the political dimension of the municipal councils in these judgments. We also highlighted the technical-administrative aspect made possible by the legislative procedure for judging the accounts, which is subject to analysis and to a request for judicial review by the Public Prosecutor's Office with a view to annulment if it does not present formally recorded legal reasoning that meets the constitutional requirements. The study was carried out through bibliographic, documentary and field research, using semi-structured interviews with members of municipal councils in Pernambuco. Qualitative content analysis was chosen for analyzing the data. The results show that the grounds leading some municipal councils in Pernambuco not to follow the Court of Accounts' prior opinion in the judgment of the mayors' annual accounts are not formally evidenced in the relevant legislative proceedings: the official documents examined neither meet the legal requirements for reasoned decisions nor explain why the TCE's prior opinion was not followed. The opinions of the interviewed council members suggest that such grounds are political rather than technical in nature, explained by the preponderance of the executive over the municipal legislative branch.

Relevance:

10.00%

Publisher:

Abstract:

The first essay, "Determinants of Credit Expansion in Brazil", analyzes the determinants of credit using an extensive bank-level panel dataset. The Brazilian economy experienced a major boost in leverage in the first decade of the 2000s as a result of factors ranging from macroeconomic stability to the abundant liquidity in international financial markets before 2008, together with a set of deliberate decisions taken by President Lula's administration to expand credit, boost consumption and gain political support from the lower social strata. Our investigation finds that credit expansion relied on the reduction of the monetary policy rate, that international financial markets are an important source of funds, and that payroll-guaranteed credit and investment-grade status positively affected credit supply. We were not able to confirm the importance of financial inclusion efforts. The importance of financial sector soundness indicators for credit conditions should not be underestimated. These results raise questions about the sustainability of this expansion process and about financial stability in the future. The second essay, "Public Credit, Monetary Policy and Financial Stability", discusses the role of public credit. The supply of public credit in Brazil successfully served to relaunch the economy after the Lehman Brothers demise. It was later transformed into a driver of economic growth as well as a regulation device to force private banks to reduce interest rates. We argue that the use of public funds to finance economic growth has three important drawbacks: it generates inflation, induces higher loan rates and may induce financial instability. An additional effect is the prevention of market-based credit solutions. This study contributes to the understanding of the costs and benefits of credit as a fiscal policy tool. The third essay, "Bayesian Forecasting of Interest Rates: Do Priors Matter?", discusses the choice of priors when forecasting short-term interest rates. Central banks that commit to an inflation-targeting monetary regime are bound to respond to spikes in inflation expectations and to a widening output gap in a clear and transparent way by abiding by a Taylor rule. There are various reports of central banks being more responsive to inflationary than to deflationary shocks, rendering the monetary policy response non-linear. Moreover, there is no guarantee that the coefficients remain stable over time, and central banks may switch to a dual-target regime that considers deviations of both inflation and the output gap. The estimation of a Taylor rule may therefore have to consider a non-linear model with time-varying parameters. This paper uses Bayesian forecasting methods to predict short-term interest rates. We take two different approaches: from a theoretical perspective we focus on an augmented version of the Taylor rule and include the Real Exchange Rate, the Credit-to-GDP ratio and the Net Public Debt-to-GDP ratio. We also take an "atheoretic" approach based on the Expectations Theory of the Term Structure to model short-term interest rates. The selection of priors is particularly relevant for predictive accuracy, yet, ideally, forecasting models should require as little a priori expert insight as possible. We present recent developments in prior selection; in particular, we propose the use of hierarchical hyper-g priors for better forecasting in a framework that can easily be extended to other key macroeconomic indicators.
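
As a minimal sketch of the kind of Bayesian shrinkage forecast the third essay refers to, the following uses a fixed-g Zellner g-prior for a linear interest-rate equation (a hyper-g prior would instead place a hyperprior on g and average over it); the regressors, data and value of g are made up for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    # Made-up regressors standing in for Taylor-rule inputs (inflation gap,
    # output gap, real exchange rate, credit/GDP, net public debt/GDP).
    X = rng.normal(size=(120, 5))
    beta_true = np.array([1.5, 0.5, -0.2, 0.3, 0.1])
    y = X @ beta_true + rng.normal(scale=0.5, size=120)

    # Zellner g-prior: beta ~ N(0, g * sigma^2 * (X'X)^{-1}), under which the
    # posterior mean shrinks the OLS estimate by g / (1 + g). A hyper-g prior
    # would place a hyperprior on g and average over it.
    g = 100.0
    beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
    beta_post = (g / (1.0 + g)) * beta_ols

    x_next = rng.normal(size=5)        # next period's observed regressors
    forecast = x_next @ beta_post      # point forecast of the short rate
    print(forecast)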

Relevance:

10.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

10.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

10.00%

Publisher:

Abstract:

Several statistical models can be used for assessing genotype × environment interaction (GEI) and studying genotypic stability. The objectives of this research were to show how (i) to use Bayesian methodology for computing Shukla's phenotypic stability variance and (ii) to incorporate prior information on the parameters for better estimation. Potato [Solanum tuberosum subsp. andigenum (Juz. & Bukasov) Hawkes], wheat (Triticum aestivum L.), and maize (Zea mays L.) multi-environment trials (MET) were used to illustrate the application of the Bayes paradigm. The potato trial included 15 genotypes, but prior information for just three genotypes was used. The wheat trial used prior information on all 10 genotypes included in the trial, whereas for the maize trial, noninformative priors for the nine genotypes were used. Concerning the posterior distribution of the genotypic means, the maize MET with 20 sites gave less dispersed posterior distributions of the genotypic means than the other METs, which included fewer environments. The Bayesian approach also allows the use of other statistical strategies, such as the truncated normal distribution (used in this study): when analyzing grain yield, a lower bound of zero and an upper bound set by the researcher's experience can be used. The Bayesian paradigm offers plant breeders the possibility of computing the probability of a genotype being the best performer. The results of this study show that although some genotypes may have a very low probability of being the best in all sites, they have a relatively good chance of being among the five highest yielding genotypes.
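
A small sketch of the last point: computing, from (assumed) posterior draws of genotypic means, the probability that each genotype is the best performer and the probability that it ranks among the top few. In practice the draws would come from the fitted Bayesian model rather than being simulated as here.

    import numpy as np

    rng = np.random.default_rng(1)

    # Assumed posterior draws of genotypic means (rows: MCMC draws, columns:
    # genotypes); in practice these come from the fitted Bayesian model.
    draws = rng.normal(loc=[5.0, 5.2, 4.8, 5.1, 4.9], scale=0.3, size=(4000, 5))

    p_best = (draws.argmax(axis=1)[:, None] == np.arange(5)).mean(axis=0)
    ranks = (-draws).argsort(axis=1).argsort(axis=1)   # 0 = highest yield
    p_top3 = (ranks < 3).mean(axis=0)                  # among the three best here
    print("P(best):", p_best)
    print("P(top 3):", p_top3)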

Relevance:

10.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

10.00%

Publisher:

Abstract:

In Bayesian inference it is often desirable to have a posterior density that reflects mainly the information from the sample data. To achieve this purpose it is important to employ prior densities which add little information to the sample. Many such prior densities are available in the literature, for example Jeffreys (1967), Lindley (1956, 1961), Hartigan (1964), Bernardo (1979), Zellner (1984), and Tibshirani (1989). In the present article, we compare the posterior densities of the reliability function R(t) of a Weibull distribution obtained using the Jeffreys, maximal data information (Zellner, 1984), Tibshirani, and reference priors.
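
For intuition, a minimal sketch of the posterior of the reliability function under the Jeffreys prior, restricted to the exponential special case (a Weibull with known shape equal to one), where the posterior of the rate is available in closed form; the failure times are made up.

    import numpy as np

    rng = np.random.default_rng(2)

    # Made-up failure times; exponential special case (Weibull with known shape
    # equal to 1), where the Jeffreys prior on the rate lambda is proportional
    # to 1/lambda and the posterior is Gamma(n, rate = sum of the failure times).
    times = np.array([1.2, 0.7, 2.5, 1.9, 0.4, 3.1, 1.1, 0.9])
    n, S = len(times), times.sum()

    lam = rng.gamma(shape=n, scale=1.0 / S, size=10_000)   # posterior draws of the rate
    t0 = 1.0
    R_t0 = np.exp(-lam * t0)                               # posterior draws of R(t0)
    print(R_t0.mean(), np.quantile(R_t0, [0.025, 0.975]))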

Relevance:

10.00%

Publisher:

Abstract:

In the context of Bayesian statistical analysis, elicitation is the process of formulating a prior density f(.) about one or more uncertain quantities to represent a person's knowledge and beliefs. Several different methods of eliciting prior distributions for one unknown parameter have been proposed. However, there are relatively few methods for specifying a multivariate prior distribution, and most are applicable only to specific classes of problems and/or based on restrictive conditions, such as independence of variables. Moreover, many of these procedures require the elicitation of variances and correlations, and sometimes of hyperparameters, which are difficult for experts to specify in practice. Garthwaite et al. (2005) discuss the different methods proposed in the literature and the difficulties of eliciting multivariate prior distributions. We describe a flexible method of eliciting multivariate prior distributions that is applicable to a wide class of practical problems. Our approach does not assume a parametric form for the unknown prior density f(.); instead we use nonparametric Bayesian inference, modelling f(.) by a Gaussian process prior distribution. The expert is then asked to specify certain summaries of his/her distribution, such as the mean, mode, marginal quantiles and a small number of joint probabilities. The analyst receives that information, treating it as a data set D with which to update his/her prior beliefs and obtain the posterior distribution for f(.). Theoretical properties of the joint and marginal priors are derived, and numerical illustrations demonstrating our approach are given.
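
A minimal sketch of the updating step, assuming that hypothetical elicited values of f at a few points are treated as the data set D and that f is given a zero-mean Gaussian process prior with a squared-exponential covariance; the real method elicits richer summaries (mode, quantiles, joint probabilities) and handles normalization, which this toy omits.

    import numpy as np

    def rbf(a, b, length=1.0, var=1.0):
        # squared-exponential covariance between two sets of input points
        d = a[:, None] - b[None, :]
        return var * np.exp(-0.5 * (d / length) ** 2)

    # Hypothetical elicited summaries: the expert's assessed values of the
    # unknown density f at a handful of points, treated as the data set D.
    x_obs = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
    f_obs = np.array([0.05, 0.25, 0.40, 0.25, 0.05])
    noise = 1e-4

    x_grid = np.linspace(-3, 3, 61)
    K = rbf(x_obs, x_obs) + noise * np.eye(len(x_obs))
    K_s = rbf(x_grid, x_obs)
    post_mean = K_s @ np.linalg.solve(K, f_obs)   # GP posterior mean for f
    print(post_mean[::10])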

Relevance:

10.00%

Publisher:

Abstract:

The exponential-logarithmic distribution is a new lifetime distribution with decreasing failure rate and interesting applications in the biological and engineering sciences; a Bayesian analysis of its parameters is therefore desirable. Bayesian estimation requires the selection of prior distributions for all parameters of the model. In this case, researchers usually seek to choose a prior that carries little information about the parameters, allowing the data to be very informative relative to the prior. Assuming some noninformative prior distributions, we present a Bayesian analysis using Markov chain Monte Carlo (MCMC) methods. The Jeffreys prior is derived for the parameters of the exponential-logarithmic distribution and compared with other common priors such as the beta, gamma, and uniform distributions. In this article, we also show through a simulation study that the maximum likelihood estimate may not exist except under restrictive conditions. In addition, the posterior density is sometimes bimodal when an improper prior density is used.
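
A minimal sketch of such an MCMC analysis: random-walk Metropolis for the two parameters of the exponential-logarithmic density (in the Tahmasbi-Rezaei parameterization, assumed here) under flat priors; the data, priors and tuning constants are illustrative only.

    import numpy as np

    rng = np.random.default_rng(3)

    def log_post(p, beta, x):
        # log-posterior under flat priors p ~ U(0, 1), beta ~ U(0, 100)
        # (an assumed noninformative choice), with the exponential-logarithmic
        # density f(x) = beta*(1-p)*exp(-beta*x) / (-log(p) * (1 - (1-p)*exp(-beta*x)))
        if not (0.0 < p < 1.0 and 0.0 < beta < 100.0):
            return -np.inf
        u = (1.0 - p) * np.exp(-beta * x)
        return np.sum(np.log(beta) + np.log(1.0 - p) - beta * x
                      - np.log(-np.log(p)) - np.log(1.0 - u))

    x = rng.exponential(scale=1.0, size=50)   # made-up lifetime data
    p, beta = 0.5, 1.0
    current = log_post(p, beta, x)
    samples = []
    for _ in range(20_000):                   # random-walk Metropolis
        p_new, b_new = p + 0.05 * rng.normal(), beta + 0.10 * rng.normal()
        proposal = log_post(p_new, b_new, x)
        if np.log(rng.uniform()) < proposal - current:
            p, beta, current = p_new, b_new, proposal
        samples.append((p, beta))
    post = np.array(samples[5_000:])          # drop burn-in
    print(post.mean(axis=0))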

Relevance:

10.00%

Publisher:

Abstract:

Graduate Program in Applied and Computational Mathematics - FCT