932 results for Random-set theory
Resumo:
Prediction of random effects is an important problem with expanding applications. In the simplest context, the problem corresponds to prediction of the latent value (the mean) of a realized cluster selected via two-stage sampling. Recently, Stanek and Singer [Predicting random effects from finite population clustered samples with response error. J. Amer. Statist. Assoc. 99, 119-130] developed best linear unbiased predictors (BLUP) under a finite population mixed model that outperform BLUPs from mixed models and superpopulation models. Their setup, however, does not allow for unequally sized clusters. To overcome this drawback, we consider an expanded finite population mixed model based on a larger set of random variables that span a higher dimensional space than those typically applied to such problems. We show that BLUPs for linear combinations of the realized cluster means derived under such a model have considerably smaller mean squared error (MSE) than those obtained from mixed models, superpopulation models, and finite population mixed models. We motivate our general approach by an example developed for two-stage cluster sampling and show that it faithfully captures the stochastic aspects of sampling in the problem. We also consider simulation studies to illustrate the increased accuracy of the BLUP obtained under the expanded finite population mixed model. (C) 2007 Elsevier B.V. All rights reserved.
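The shrinkage idea behind predicting a realized cluster mean can be sketched in a few lines. The snippet below is the textbook mixed-model BLUP (shrink each sample cluster mean toward the grand mean, with a weight that depends on the cluster's sample size), not the expanded finite population BLUP the abstract derives; the variance components and data are hypothetical.

```python
import numpy as np

def blup_cluster_means(y_by_cluster, sigma2_b, sigma2_e):
    """Shrink each sample cluster mean toward the grand mean.

    Predictor of the i-th realized cluster mean under the simple mixed model
        y_ij = mu + b_i + e_ij,  b_i ~ N(0, sigma2_b), e_ij ~ N(0, sigma2_e).
    The shrinkage weight depends on the cluster sample size n_i, so
    unequally sized clusters are handled naturally.
    """
    grand_mean = np.mean(np.concatenate(y_by_cluster))
    preds = []
    for y in y_by_cluster:
        n_i = len(y)
        w = sigma2_b / (sigma2_b + sigma2_e / n_i)  # shrinkage weight in (0, 1)
        preds.append(grand_mean + w * (np.mean(y) - grand_mean))
    return np.array(preds)

# Hypothetical clusters of unequal size
clusters = [np.array([5.0, 6.0, 7.0]), np.array([1.0, 2.0])]
print(blup_cluster_means(clusters, sigma2_b=1.0, sigma2_e=1.0))
```

Larger clusters are shrunk less (weight 0.75 for the 3-observation cluster vs. 2/3 for the 2-observation one), which is exactly why unequal cluster sizes matter for this class of predictors.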
Resumo:
A geodesic in a graph G is a shortest path between two vertices of G. For a specific function e(n) of n, we define an almost geodesic cycle C in G to be a cycle in which for every two vertices u and v in C, the distance d_G(u, v) is at least d_C(u, v) - e(n). Let ω(n) be any function tending to infinity with n. We consider a random d-regular graph on n vertices. We show that almost all pairs of vertices belong to an almost geodesic cycle C with e(n) = log_{d-1} log_{d-1} n + ω(n) and |C| = 2 log_{d-1} n + O(ω(n)). Along the way, we obtain results on near-geodesic paths. We also give the limiting distribution of the number of geodesics between two random vertices in this random graph. (C) 2010 Wiley Periodicals, Inc. J Graph Theory 66: 115-136, 2011
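The graph distance d_G above is simply BFS shortest-path length. A minimal self-contained illustration (the graph here is a hypothetical 8-cycle rather than a random d-regular graph; on the cycle itself d_C(u, v) = min(|u - v|, n - |u - v|)):

```python
from collections import deque

def bfs_distance(adj, u, v):
    """Length of a geodesic (shortest path) from u to v;
    adj maps each vertex to its list of neighbors."""
    dist = {u: 0}
    queue = deque([u])
    while queue:
        w = queue.popleft()
        if w == v:
            return dist[w]
        for x in adj[w]:
            if x not in dist:
                dist[x] = dist[w] + 1
                queue.append(x)
    return float("inf")  # v unreachable from u

# An 8-cycle as adjacency lists
n = 8
adj = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
print(bfs_distance(adj, 0, 3))  # 3
```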
Resumo:
Let X be a compact Hausdorff space, Y be a connected topological manifold, f : X -> Y be a map between closed manifolds, and a ∈ Y. The vanishing of the Nielsen root number N(f; a) implies that f is homotopic to a root-free map h, i.e., h ≃ f and h^{-1}(a) = ∅. In this paper, we prove an equivariant analog of this result for G-maps between G-spaces, where G is a finite group. (C) 2010 Elsevier B.V. All rights reserved.
Resumo:
This thesis consists of four empirically oriented papers on central bank independence (CBI) reforms. Paper [1] investigates why politicians around the world have chosen to give up power to independent central banks, thereby reducing their ability to control the economy. A new dataset, covering the possible occurrence of CBI reforms in 132 countries during 1980-2005, was collected. Politicians in non-OECD countries were more likely to delegate power to independent central banks if their country had been characterized by high variability in inflation and if they faced a high probability of being replaced. No such effects were found for OECD countries. Paper [2], using a difference-in-difference approach, studies whether CBI reform matters for inflation performance. The analysis is based on a dataset covering the possible occurrence of CBI reforms in 132 countries during the period 1980-2005. CBI reform is found to have contributed to bringing down inflation in high-inflation countries, but it seems unrelated to inflation performance in low-inflation countries. Paper [3] investigates whether CBI reforms are important in reducing inflation and maintaining price stability, using a random-effects random-coefficients model to account for heterogeneity in the effects of CBI reforms on inflation. CBI reforms are found to have reduced inflation on average by 3.31 percent, but the effect is only present when countries with historically high inflation rates are included in the sample. Countries with more modest inflation rates have achieved low inflation without institutional reforms that grant central banks more independence, thus undermining the time-inconsistency case for CBI.
There is furthermore no evidence that CBI reforms have contributed to lower inflation variability. Paper [4] studies the relationship between CBI and a suggested trade-off between price variability and output variability, using data on CBI levels and data on the implementation dates of CBI reforms. The results question the existence of such a trade-off, but indicate that there may still be potential gains in stabilization policy from CBI reforms.
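The difference-in-difference estimator underlying Paper [2] compares the inflation change of reforming countries with that of non-reforming countries. A minimal sketch with made-up pre/post inflation averages (the numbers are purely illustrative, not from the thesis):

```python
def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Difference-in-difference: change in the treated group
    net of the change in the control group."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical average inflation (%) before/after CBI reform
effect = did_estimate(treat_pre=25.0, treat_post=8.0,
                      ctrl_pre=20.0, ctrl_post=15.0)
print(effect)  # -12.0
```

The control group's change (-5 points) nets out the common time trend, so only the extra -12 points is attributed to the reform.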
Resumo:
Background: The sensitivity to microenvironmental changes varies among animals and may be under genetic control. It is essential to take this element into account when aiming at breeding robust farm animals. Here, linear mixed models with genetic effects in the residual variance part of the model can be used. Such models have previously been fitted using EM and MCMC algorithms. Results: We propose the use of double hierarchical generalized linear models (DHGLM), where the squared residuals are assumed to be gamma distributed and the residual variance is fitted using a generalized linear model. The algorithm iterates between two sets of mixed model equations, one on the level of observations and one on the level of variances. The method was validated using simulations and also by re-analyzing a data set on pig litter size that was previously analyzed using a Bayesian approach. The pig litter size data contained 10,060 records from 4,149 sows. The DHGLM was implemented using the ASReml software and the algorithm converged within three minutes on a Linux server. The estimates were similar to those previously obtained using Bayesian methodology, especially the variance components in the residual variance part of the model. Conclusions: We have shown that variance components in the residual variance part of a linear mixed model can be estimated using a DHGLM approach. The method enables analyses of animal models with large numbers of observations. An important future development of the DHGLM methodology is to include the genetic correlation between the random effects in the mean and residual variance parts of the model as a parameter of the DHGLM.
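The alternation the abstract describes (a mean model fitted on the observations, a variance model fitted on the squared residuals) can be sketched on simulated data. This toy numpy version uses a log-linear regression on log squared residuals as a crude stand-in for the gamma-GLM step, and omits the random effects and the ASReml machinery entirely; the data, coefficients, and correction constant are my own illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated heteroscedastic data: Var(e_i) = exp(0.3 + 1.5 * x_i)
n = 4000
x = rng.uniform(0, 1, n)
X = np.column_stack([np.ones(n), x])
sd = np.exp(0.5 * (0.3 + 1.5 * x))
y = X @ np.array([1.0, 2.0]) + rng.normal(0, sd)

beta = np.zeros(2)   # mean-model coefficients
gamma = np.zeros(2)  # log-variance coefficients
for _ in range(10):
    # 1) mean model: weighted least squares, weights = 1 / fitted variance
    w = np.exp(-X @ gamma)
    XtW = X.T * w
    beta = np.linalg.solve(XtW @ X, XtW @ y)
    # 2) variance model: regress log squared residuals on X;
    #    +1.2704 corrects the mean of log(chi^2_1) (rough gamma-GLM surrogate)
    r2 = (y - X @ beta) ** 2
    z = np.log(r2 + 1e-12) + 1.2704
    gamma = np.linalg.solve(X.T @ X, X.T @ z)

print(beta, gamma)  # roughly (1, 2) and (0.3, 1.5)
```

Each pass refits the mean with weights implied by the current variance model, then refits the variance model on the new residuals, mirroring the two sets of mixed model equations in the DHGLM.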
Resumo:
The paper investigates which of Shannon’s measures (entropy, conditional entropy, mutual information) is the right one for the task of quantifying information flow in a programming language. We examine earlier relevant contributions from Denning, McLean and Gray and we propose and motivate a specific quantitative definition of information flow. We prove results relating equivalence relations, interference of program variables, independence of random variables and the flow of confidential information. Finally, we show how, in our setting, Shannon’s Perfect Secrecy theorem provides a sufficient condition to determine whether a program leaks confidential information.
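As a concrete instance of quantifying flow with mutual information (my own toy example, not taken from the paper): a program that outputs the low bit of a uniformly distributed 2-bit secret leaks exactly I(S; O) = 1 bit, while a constant program leaks 0.

```python
from collections import Counter
from math import log2

def mutual_information(pairs):
    """I(S; O) in bits, from an exhaustive list of equally likely
    (secret, output) pairs."""
    n = len(pairs)
    p_so = Counter(pairs)                 # joint distribution
    p_s = Counter(s for s, _ in pairs)    # marginal over secrets
    p_o = Counter(o for _, o in pairs)    # marginal over outputs
    return sum((c / n) * log2((c / n) / ((p_s[s] / n) * (p_o[o] / n)))
               for (s, o), c in p_so.items())

# Program under analysis: output the low bit of a uniform 2-bit secret
pairs = [(s, s & 1) for s in range(4)]
print(mutual_information(pairs))  # 1.0
```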
Resumo:
Research has shown that belief in an afterlife, a form of symbolic immortality, can alleviate the negative emotions associated with one’s mortality (Dechesne et al., 2003). We found this aspect of terror management theory (TMT) particularly interesting, but lacking substantial research. Therefore, we set out to determine whether belief in an afterlife could diminish the effects of mortality salience. As far as we know, our study is the first to use a pre-screening process to determine participants’ prior beliefs. One prediction is that those who believe in an afterlife will be less affected by mortality salience.
Resumo:
Point pattern matching in Euclidean Spaces is one of the fundamental problems in Pattern Recognition, having applications ranging from Computer Vision to Computational Chemistry. Whenever two complex patterns are encoded by two sets of points identifying their key features, their comparison can be seen as a point pattern matching problem. This work proposes a single approach to both exact and inexact point set matching in Euclidean Spaces of arbitrary dimension. In the case of exact matching, the method is guaranteed to find an optimal solution. For inexact matching (when noise is involved), experimental results confirm the validity of the approach. We start by regarding point pattern matching as a weighted graph matching problem. We then formulate the weighted graph matching problem as one of Bayesian inference in a probabilistic graphical model. By exploiting the existence of fundamental constraints in patterns embedded in Euclidean Spaces, we prove that for exact point set matching a simple graphical model is equivalent to the full model. It is possible to show that exact probabilistic inference in this simple model has polynomial time complexity with respect to the number of elements in the patterns to be matched. This gives rise to a technique that for exact matching provably finds a global optimum in polynomial time for any dimensionality of the underlying Euclidean Space. Computational experiments comparing this technique with well-known probabilistic relaxation labeling show significant performance improvement for inexact matching. The proposed approach is significantly more robust under augmentation of the sizes of the involved patterns. In the absence of noise, the results are always perfect.
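The Euclidean constraints the abstract exploits can be illustrated with a much simpler (and much weaker) idea: a rigid motion preserves all pairwise distances, so in the noise-free case each point can often be paired by its sorted profile of distances to the other points. This is only a toy illustration of the distance-invariance property, not the graphical-model algorithm of the paper; the point sets below are randomly generated.

```python
import numpy as np

def match_by_distance_profile(A, B):
    """Pair each point of A with the point of B whose sorted
    distance profile is closest; exact when the profiles are
    distinct and B is a rigid transform of a permutation of A."""
    def profiles(P):
        D = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=2)
        return np.sort(D, axis=1)
    pa, pb = profiles(A), profiles(B)
    # cost[i, j] = mismatch between profile of A_i and profile of B_j
    cost = np.linalg.norm(pa[:, None, :] - pb[None, :, :], axis=2)
    return cost.argmin(axis=1)

rng = np.random.default_rng(1)
A = rng.uniform(size=(6, 2))
perm = rng.permutation(6)
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
B = A[perm] @ R.T + np.array([3.0, -1.0])  # rotated, translated, permuted copy
match = match_by_distance_profile(A, B)
print(perm[match])  # should recover the identity pairing 0..5
```

Unlike the paper's method, this profile heuristic degrades quickly under noise; it is exact only because the transform here is rigid.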
Resumo:
This article focuses on the generic and abstract monetary analysis developed by Marx at the beginning of Capital. More precisely, it aims to assess to what extent, if any, some aspects of Marx's analysis of the rather contradictory role played by money in the process of simple commodity circulation support an overdeterminist interpretation of the dialectical method he employed. Drawing on the concept of overdetermination, introduced into the Marxian literature mainly by Louis Althusser, the article concludes that the real-monetary nexus prevailing in simple commodity circulation can be conceived as an overdetermined nexus, that is, a nexus characterized by embodying a regime of mutual constitutivity.
Resumo:
The difficulty of characterizing non-stationary allocations or equilibria is one of the main reasons for the use of concepts and assumptions that trivialize the dynamics of the economy. This difficulty is especially critical in Monetary Theory, in which the dimensionality of the problem is high even for very simple models. In this context, this paper reports the computational strategy used to implement the recursive method proposed by Monteiro and Cavalcanti (2006), which allows computing the optimal (possibly non-stationary) sequence of money distributions in an extension of the model proposed by Kiyotaki and Wright (1989). Three aspects of this computation are emphasized: (i) the computational implementation of the planner's problem involves choosing continuous and discrete variables that maximize a nonlinear function and satisfy nonlinear constraints; (ii) the objective function of this problem is not concave and the constraints are not convex; and (iii) the set of admissible choices is not known a priori. The goal is to document the difficulties involved, the solutions proposed, and the methods and resources available for the numerical implementation of the characterization of efficient monetary dynamics under the random-matching assumption.
Resumo:
We develop portfolio choice theory taking into consideration the first p moments of the underlying assets' distribution. A rigorous characterization of the opportunity set and of the efficient portfolio frontier is given, as well as of the solutions to the problem with a general utility function and short sales allowed. The extension of classical mean-variance properties, like two-fund separation, is also investigated. A general CAPM is derived, based on the theoretical foundations built, and its empirical consequences and testing are discussed.
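For the classical mean-variance special case (p = 2), the frontier portfolios have the familiar closed form w ∝ Σ⁻¹(μ − r_f·1). A minimal numpy sketch with made-up asset moments (all numbers are hypothetical, and this is the standard two-asset tangency portfolio, not the general p-moment solution of the paper):

```python
import numpy as np

def tangency_weights(mu, Sigma, rf):
    """Tangency (maximum Sharpe ratio) portfolio on the
    mean-variance frontier, normalized to sum to one."""
    w = np.linalg.solve(Sigma, mu - rf)  # solve Sigma w = mu - rf
    return w / w.sum()

mu = np.array([0.08, 0.12])            # hypothetical expected returns
Sigma = np.array([[0.04, 0.01],
                  [0.01, 0.09]])       # hypothetical covariance matrix
w = tangency_weights(mu, Sigma, rf=0.02)
print(w)
```

Two-fund separation says every frontier portfolio is a mix of this tangency portfolio and the riskless asset, which is why a single weight vector suffices.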
Resumo:
Multiproduct retailers facing similar costs and serving the same public commonly announce different weekly specials. These promotional prices also seem to evolve randomly over the weeks. Here, weekly specials are viewed as the strategic outcome of an oligopolistic price competition among multiproduct retail stores facing nonconvex costs. Existence of an equilibrium in mixed strategies is proven. Identical stores serving the same public will never charge the same price vector with probability one (cross-store price dispersion). Mixed strategies can generate random price dispersion over time in the repeated version of the model.
Resumo:
This Master of Science thesis deals with customer satisfaction and loyalty, focusing on a private higher education institution in the city of Belém, Brazil. The literature review covers customer satisfaction and loyalty concepts and theory, models of quality management systems, and methodologies for measuring customer satisfaction. The research was a survey with a random stratified sample of 329 undergraduate students of Business Administration at the Faculdade do Pará, in the morning and night periods. The data analysis was done through descriptive statistics and multiple regression analysis. The main findings are that the model was satisfactory and that the main factors affecting satisfaction with the school were Best Professor Didactics (beta = 0.297), Course Contents (beta = 0.280), Clerks' Sympathy (beta = 0.201), and Number of Students in the Classroom (beta = 0.187), with an adjusted R2 = 0.47. The main factors affecting school loyalty, with an adjusted R2 = 0.43, were School Image (beta = 0.383), Affective Commitment (beta = 0.255), and Satisfaction with Professors (beta = 0.218). The findings also suggest that there may be differences between the full set of students and those who complained about something.
Resumo:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)