86 results for GENERALIZED DISTRIBUTION
Abstract:
A new graph-based construction of generalized low-density codes (GLD-Tanner) with binary BCH constituents is described. The proposed family of GLD codes is optimal on block erasure channels and quasi-optimal on block fading channels, where optimality is considered in the outage probability sense. A classical GLD code for ergodic channels (e.g., the AWGN channel, the i.i.d. Rayleigh fading channel, and the i.i.d. binary erasure channel) is built by connecting bit nodes and subcode nodes via a unique random edge permutation. In the proposed construction of full-diversity GLD codes (referred to as root GLD), bit nodes are divided into 4 classes, subcodes are divided into 2 classes, and both sides of the Tanner graph are linked via 4 random edge permutations. The study focuses on non-ergodic channels with two states and can easily be extended to channels with 3 or more states.
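For readers unfamiliar with GLD constructions, a minimal sketch of the "unique random edge permutation" idea mentioned above is given here; it is not the authors' root-GLD construction, and the parameters (degree-2 bit nodes, length-15 BCH subcodes) are illustrative assumptions only.

import random

def gld_edge_permutation(num_bits, bit_degree, subcode_length, seed=0):
    """Connect bit nodes to subcode positions via one random edge permutation.

    Each bit node emits `bit_degree` edge sockets; each subcode node absorbs
    `subcode_length` sockets. A single random permutation of the sockets
    defines a classical (ergodic-channel) GLD Tanner graph.
    """
    num_edges = num_bits * bit_degree
    assert num_edges % subcode_length == 0, "edges must fill subcodes exactly"
    rng = random.Random(seed)
    sockets = list(range(num_edges))
    rng.shuffle(sockets)                      # the random edge permutation
    edges = []                                # (bit_node, subcode_node, position)
    for e, s in enumerate(sockets):
        bit_node = e // bit_degree
        subcode_node = s // subcode_length
        position = s % subcode_length
        edges.append((bit_node, subcode_node, position))
    return edges

# Toy example: 15 bit nodes of degree 2 feeding two BCH-like subcodes of length 15.
print(gld_edge_permutation(num_bits=15, bit_degree=2, subcode_length=15)[:5])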
Abstract:
Individuals' life chances in the future will very much depend on how we invest in our children now. An optimal human capital model would combine a high mean with minimal variance of skills. It is well-established that early childhood learning is key to adult success. The impact of social origins on child outcomes remains strong, and the new role of women poses additional challenges to our conventional nurturing approach to child development. This paper focuses on skill development in the early years, examining how we might best combine family inputs and public policy to invest optimally in our future human capital. I emphasize three issues: one, the uneven capacity of parents to invest in children; two, the impact of mothers' employment on child outcomes; and three, the potential benefits of early pre-school programmes. I conclude that mothers' intra-family bargaining power is decisive for family investments and that universal child care is key if our goal is to arrive at a strong mean with minimal variance.
Abstract:
The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified, within the algebraic structure of A2(P), with their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way, quite elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combinations of statistical information such as Bayesian updating, the combination of likelihoods, and robust M-estimation functions are simple additions/perturbations in A2(Pprior), and weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turn out to have a particularly easy interpretation in terms of A2(P): regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimating functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence and a scale-free understanding of unbiased reasoning.
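As a point of reference for the compositional-data notions the abstract invokes, here is a small sketch of the centered log-ratio transform and the Aitchison distance on the simplex (standard definitions, not the A2(P) construction itself):

import numpy as np

def clr(x):
    """Centered log-ratio transform of a composition with positive parts."""
    logx = np.log(x)
    return logx - logx.mean()

def aitchison_distance(x, y):
    """Aitchison distance = Euclidean distance between the clr images."""
    return np.linalg.norm(clr(np.asarray(x, float)) - clr(np.asarray(y, float)))

x = [0.2, 0.3, 0.5]
y = [0.1, 0.4, 0.5]
print(aitchison_distance(x, y))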
Abstract:
We estimate the world distribution of income by integrating individual income distributions for 125 countries between 1970 and 1998. We estimate poverty rates and headcounts by integrating the density function below the $1/day and $2/day poverty lines. We find that poverty rates decline substantially over the last twenty years. We compute poverty headcounts and find that the number of one-dollar poor declined by 235 million between 1976 and 1998. The number of $2/day poor declined by 450 million over the same period. We analyze poverty across different regions and countries. Asia is a great success, especially after 1980. Latin America reduced poverty substantially in the 1970s, but progress stopped in the 1980s and 1990s. The worst performer was Africa, where poverty rates increased substantially over the last thirty years: the number of $1/day poor in Africa increased by 175 million between 1970 and 1998, and the number of $2/day poor increased by 227 million. Africa hosted 11% of the world's poor in 1960; it hosted 66% of them in 1998. We estimate nine indexes of income inequality implied by our world distribution of income. All of them show substantial reductions in global income inequality during the 1980s and 1990s.
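The headcount computation described above can be illustrated with a toy sketch; the lognormal income density and every number below are hypothetical and are not the paper's estimates.

import numpy as np
from scipy import stats

def poverty_rate(mean_log_income, sd_log_income, poverty_line_annual):
    """Share of people below the poverty line under a lognormal income density."""
    dist = stats.lognorm(s=sd_log_income, scale=np.exp(mean_log_income))
    return dist.cdf(poverty_line_annual)     # integral of the density below the line

# Hypothetical country: mean log annual income 7.0, dispersion 1.0, 50 million people.
line_1_dollar = 1.0 * 365                    # $1/day expressed as annual income
line_2_dollar = 2.0 * 365
rate1 = poverty_rate(7.0, 1.0, line_1_dollar)
rate2 = poverty_rate(7.0, 1.0, line_2_dollar)
population = 50_000_000
print(f"$1/day headcount: {rate1 * population:,.0f}")
print(f"$2/day headcount: {rate2 * population:,.0f}")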
Abstract:
The problems arising in commercial distribution are complex and involve several players and decision levels. One important decision is related to the design of the routes to distribute the products in an efficient and inexpensive way. This article deals with a complex vehicle routing problem that can be seen as a new extension of the basic vehicle routing problem. The proposed model is a multi-objective combinatorial optimization problem that considers three objectives and multiple periods, which models real distribution problems more closely. The first objective is cost minimization, the second is balancing work levels, and the third is a marketing objective. An application of the model to a small example, with 5 clients and 3 days, is presented. The results show the complexity of solving multi-objective combinatorial optimization problems and the conflicts among the several distribution management objectives.
Abstract:
In this paper we propose a subsampling estimator for the distribution of statistics diverging at either known or unknown rates when the underlying time series is strictly stationary and strong mixing. Based on our results, we provide a detailed discussion of how to estimate extreme order statistics with dependent data and present two applications to assessing financial market risk. Our method performs well in estimating Value at Risk and provides a superior alternative to Hill's estimator in operationalizing Safety First portfolio selection.
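A generic subsampling sketch of the kind described (overlapping blocks of a stationary series, recentering at the full-sample statistic, scaling by a user-supplied rate) might look as follows; the AR(1) data, the 1% quantile statistic, and the sqrt(b) rate are illustrative assumptions, not the paper's specification.

import numpy as np

def subsample_distribution(x, statistic, block_len, rate):
    """Subsampling estimate of the law of rate(n) * (statistic(X_1..n) - theta).

    Overlapping blocks of consecutive observations preserve the dependence
    structure of a strictly stationary, strongly mixing series.
    """
    x = np.asarray(x, float)
    t_full = statistic(x)
    blocks = np.lib.stride_tricks.sliding_window_view(x, block_len)
    t_blocks = np.array([statistic(b) for b in blocks])
    return np.sort(rate(block_len) * (t_blocks - t_full))   # empirical distribution points

# Toy illustration: a 1% empirical quantile (a Value-at-Risk-type statistic) of an AR(1) series.
rng = np.random.default_rng(0)
e = rng.standard_normal(5000)
x = np.empty_like(e)
x[0] = e[0]
for t in range(1, len(e)):
    x[t] = 0.5 * x[t - 1] + e[t]

var_stat = lambda s: np.quantile(s, 0.01)
draws = subsample_distribution(x, var_stat, block_len=200, rate=lambda b: np.sqrt(b))
print(np.quantile(draws, [0.05, 0.95]))   # endpoints of the subsampling distribution, before rescaling by rate(n)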
Abstract:
We study the earnings structure and the equilibrium assignment of workers when workers exert intra-firm spillovers on each other. We allow for arbitrary spillovers provided output depends on some aggregate index of workers' skill. Despite the possibility of increasing returns to skills, equilibrium typically exists. We show that equilibrium will typically be segregated: the skill space can be partitioned into a set of segments, and any firm hires from only one segment. Next, we apply the model to analyze the effect of information technology on segmentation and the distribution of income. There are two types of human capital, productivity and creativity, i.e. the ability to produce ideas that may be duplicated over a network. Under plausible assumptions, inequality rises and then falls when network size increases, and the poorest workers cannot lose. We also analyze the impact of an improvement in worker quality and of an increased international mobility of ideas.
Abstract:
Do the contests with the largest prizes attract the most able contestants? To what extent do contestants avoid competition? In this paper, we show, theoretically and empirically, that the distribution of abilities plays a crucial role in determining contest choice. Sorting exists only when the proportion of high-ability contestants is sufficiently small. As this proportion increases, contestants shy away from competition and sorting decreases, such that reverse sorting becomes a possibility. We test our theoretical predictions using a large panel data set containing contest choice over three decades. We use exogenous variation in the participation of highly able competitors to provide empirical evidence for the relationship among prizes, competition, and sorting.
Abstract:
This paper studies the dynamics of the distribution of wealth in a general equilibrium framework. It considers an overlapping generations model with production and altruistic preferences in which individuals face an uncertain lifetime and annuity markets do not exist. The paper focuses on the role that accidental bequests, voluntary bequests, and non-negativity constraints on bequests play in the dynamics of the distribution of wealth. It is proved that the equilibrium interest rate is lower than the one that satisfies the modified golden rule. In this economy, a social security system not only plays an insurance role but also prevents capital overaccumulation. In fact, this paper shows that a pay-as-you-go social security system decentralizes the social planner's solution as a competitive equilibrium.
Abstract:
The algebraic equality is proved between Jennrich's (1970) asymptotic $X^2$ test for equality of correlation matrices and a Wald test statistic derived from Neudecker and Wesselman's (1990) expression for the asymptotic variance matrix of the sample correlation matrix.
Abstract:
Asymptotic chi-squared test statistics for testing the equality of moment vectors are developed. The test statistics proposed are generalized Wald test statistics that specialize to different settings by inserting an appropriate asymptotic variance matrix of sample moments. Scaled test statistics are also considered for dealing with situations of non-iid sampling. The specialization is carried out for testing the equality of multinomial populations, and the equality of variance and correlation matrices for both normal and non-normal data. When testing the equality of correlation matrices, a scaled version of the normal-theory chi-squared statistic is proven to be an asymptotically exact chi-squared statistic in the case of elliptical data.
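For concreteness, a minimal sketch of a generalized Wald statistic for the equality of two moment vectors from independent samples is shown below; the paper's settings (multinomial populations, correlation matrices, scaled statistics) are more general than this toy case.

import numpy as np
from scipy import stats

def wald_equality_test(m1, V1, n1, m2, V2, n2):
    """Generalized Wald statistic for H0: mu_1 = mu_2 from two independent samples.

    m_i : estimated moment vector; V_i : asymptotic variance matrix of sqrt(n_i)*(m_i - mu_i).
    Under H0 the statistic is asymptotically chi-squared with dim(m) degrees of freedom.
    """
    d = np.asarray(m1, float) - np.asarray(m2, float)
    omega = np.asarray(V1) / n1 + np.asarray(V2) / n2
    w = float(d @ np.linalg.solve(omega, d))
    p = stats.chi2.sf(w, df=len(d))
    return w, p

# Toy example: testing equality of two 3-dimensional mean vectors.
rng = np.random.default_rng(1)
x1, x2 = rng.normal(size=(200, 3)), rng.normal(size=(250, 3))
m1, m2 = x1.mean(0), x2.mean(0)
V1, V2 = np.cov(x1, rowvar=False), np.cov(x2, rowvar=False)
print(wald_equality_test(m1, V1, 200, m2, V2, 250))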
Abstract:
This paper shows that the distribution of observed consumption is not a good proxy for the distribution of heterogeneous consumers when the current tariff is an increasing block tariff. We use a two-step method to recover the "true" distribution of consumers. First, we estimate the demand function induced by the current tariff. Second, using the demand system, we specify the distribution of consumers as a function of observed consumption to recover the true distribution. Finally, we design a new two-part tariff, which allows us to evaluate the equity of the existence of an increasing block tariff.
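A highly stylized sketch of the two-step idea, under an assumed constant-elasticity demand form and an invented three-block tariff (and ignoring bunching at the kink points), could read:

import numpy as np

# Hypothetical increasing block tariff: marginal price by block (thresholds in units consumed).
thresholds = [0.0, 50.0, 100.0]        # lower bound of each block
prices = [1.0, 1.5, 2.0]               # marginal price within each block

def marginal_price(q):
    block = sum(q >= t for t in thresholds) - 1
    return prices[block]

# Step 1 (assumed demand form): q = theta * p**(-eps), with the elasticity eps estimated beforehand.
eps = 0.6

def recover_type(q_observed):
    """Step 2: invert the assumed demand function at the observed block's marginal price."""
    p = marginal_price(q_observed)
    return q_observed * p ** eps

observed_q = np.array([20.0, 60.0, 75.0, 130.0])
types = np.array([recover_type(q) for q in observed_q])
print(types)          # recovered consumer heterogeneity behind observed consumption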
Abstract:
In the mid-1980s, many European countries introduced fixed-term contracts. Since then, their labor markets have become more dynamic. This paper studies the implications of such reforms for the duration distribution of unemployment, with particular emphasis on changes in duration dependence. I estimate a parametric duration model using cross-sectional data drawn from the Spanish Labor Force Survey from 1980 to 1994 to analyze the chances of leaving unemployment before and after the introduction of fixed-term contracts. I find that duration dependence has increased since the reform. Semi-parametric estimation of the model also shows that, for long spells, the probability of leaving unemployment has decreased since the reform.
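As background, duration dependence in a parametric duration model is governed by the shape of the hazard; a toy Weibull maximum-likelihood sketch with made-up spell data (not the Spanish Labor Force Survey) is given below.

import numpy as np
from scipy.optimize import minimize

def weibull_neg_loglik(params, durations, exited):
    """Negative log-likelihood of a Weibull duration model with right censoring.

    hazard h(t) = lam * p * t**(p-1); p < 1 means negative duration dependence
    (the exit rate from unemployment falls as the spell lengthens).
    """
    lam, p = np.exp(params)                       # keep both parameters positive
    log_h = np.log(lam) + np.log(p) + (p - 1) * np.log(durations)
    cum_h = lam * durations ** p
    return -np.sum(exited * log_h - cum_h)        # censored spells contribute -cum_h only

# Hypothetical data: months unemployed and an exit indicator (0 = still unemployed, censored).
durations = np.array([2.0, 5.0, 7.0, 12.0, 18.0, 24.0])
exited = np.array([1, 1, 1, 0, 1, 0])

fit = minimize(weibull_neg_loglik, x0=np.zeros(2), args=(durations, exited))
lam_hat, p_hat = np.exp(fit.x)
print(f"shape p = {p_hat:.2f}  (p < 1 -> negative duration dependence)")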
Abstract:
The problems arising in the logistics of commercial distribution are complex and involve several players and decision levels. One important decision is related to the design of the routes to distribute the products in an efficient and inexpensive way. This article explores three different distribution strategies: the first strategy corresponds to the classical vehicle routing problem; the second is a master-route strategy with daily adaptations; and the third is a strategy that takes cross-functional planning into account through a multi-objective model with two objectives. All strategies are analyzed in a multi-period scenario. A metaheuristic based on Iterated Local Search is used to solve the models related to each strategy. A computational experiment is performed to evaluate the three strategies with respect to the two objectives. The cross-functional planning strategy leads to solutions that put into practice the coordination between functional areas and better meet business objectives.
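A bare-bones Iterated Local Search skeleton, applied here to a single illustrative tour rather than the article's multi-period routing models, might look like this:

import random

def iterated_local_search(initial, cost, local_search, perturb, iterations=100, seed=0):
    """Generic ILS: reach a local optimum, perturb it, re-optimize, accept if better."""
    rng = random.Random(seed)
    best = local_search(initial)
    for _ in range(iterations):
        candidate = local_search(perturb(best, rng))
        if cost(candidate) < cost(best):
            best = candidate
    return best

# Toy usage on a single-route tour (a stand-in for one routing sub-problem); depot is node 0.
clients = list(range(1, 9))
coords = {c: (random.Random(c).random(), random.Random(c + 99).random()) for c in [0] + clients}

def cost(tour):
    stops = [0] + tour + [0]
    return sum(((coords[a][0] - coords[b][0]) ** 2 + (coords[a][1] - coords[b][1]) ** 2) ** 0.5
               for a, b in zip(stops, stops[1:]))

def local_search(tour):                       # first-improvement 2-opt
    tour = tour[:]
    improved = True
    while improved:
        improved = False
        for i in range(len(tour) - 1):
            for j in range(i + 1, len(tour)):
                new = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
                if cost(new) < cost(tour):
                    tour, improved = new, True
    return tour

def perturb(tour, rng):                       # random segment relocation to escape local optima
    a, b = sorted(rng.sample(range(1, len(tour)), 2))
    return tour[:a] + tour[b:] + tour[a:b]

print(cost(iterated_local_search(clients, cost, local_search, perturb)))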