42 results for Latent Dirichlet Allocation
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
We consider the allocation of a finite number of indivisible objects to the same number of agents according to an exogenously given queue. We assume that the agents collaborate in order to achieve an efficient outcome for society. We allow for side-payments and provide a method for obtaining stable outcomes.
Abstract:
We study a simple model of assigning indivisible objects (e.g., houses, jobs, offices, etc.) to agents. Each agent receives at most one object and monetary compensations are not possible. We completely describe all rules satisfying efficiency and resource-monotonicity. The characterized rules assign the objects in a sequence of steps such that at each step there is either a dictator or two agents "trade" objects from their hierarchically specified "endowments."
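Rules of this dictator-or-trade kind can be illustrated, in the pure-dictator case, by serial dictatorship under a fixed hierarchy. The Python sketch below shows only that one ingredient and is not the paper's full characterization; the agents, queue, and preference lists are invented.

def serial_dictatorship(queue, preferences, objects):
    """Assign agents, in queue order, their best remaining object."""
    available = set(objects)
    assignment = {}
    for agent in queue:
        # The agent at the head of the queue acts as a dictator.
        for obj in preferences[agent]:
            if obj in available:
                assignment[agent] = obj
                available.remove(obj)
                break
    return assignment

# Hypothetical example: three agents, three objects.
preferences = {"a1": ["house", "job", "office"],
               "a2": ["house", "office", "job"],
               "a3": ["job", "house", "office"]}
print(serial_dictatorship(["a1", "a2", "a3"], preferences,
                          ["house", "job", "office"]))
# -> {'a1': 'house', 'a2': 'office', 'a3': 'job'}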
Abstract:
How should an equity-motivated policy-maker allocate public capital (infrastructure) across regions? Should it aim at reducing interregional differences in per capita output, or at maximizing total output? This normative question is examined in a model where the policy-maker is exclusively concerned about personal inequality and has access to two policy instruments: (i) a personal tax-transfer system (taxation is distortionary), and (ii) the regional allocation of public investment. I show that the case for public investment as a significant instrument for interpersonal redistribution is rather weak. In the most favorable case, when the tax code is constrained to be uniform across regions, it is optimal to distort the allocation of public investment in favor of the poor regions, but only to a limited extent. The reason is that poor individuals are relatively more sensitive to public transfers, which are maximized by allocating public investment efficiently. If the tax code can vary across regions, then the optimal policy may involve an allocation of public investment distorted in favor of the rich regions.
Abstract:
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Since conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. Monte Carlo results show that the estimator performs well in comparison to other estimators that have been proposed for estimation of general DLV models.
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
We prove existence theorems for the Dirichlet problem for hypersurfaces of constant special Lagrangian curvature in Hadamard manifolds. The first results are obtained using the continuity method and approximation and then refined using two iterations of the Perron method. The a priori estimates used in the continuity method are valid in any ambient manifold.
Abstract:
We study the lysis timing of a bacteriophage population by means of a continuously infection-age-structured population dynamics model. The features of the model are the infection process of bacteria, the natural death process, and the lysis process, which consists of the replication of bacteriophage viruses inside bacteria and their destruction. We consider that the length of the lysis timing (or latent period) is distributed according to a general probability distribution function. We carry out an optimization procedure and find the latent period corresponding to the maximal fitness (i.e., maximal growth rate) of the bacteriophage population.
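As a hedged illustration of the optimization the abstract describes, one can simplify heavily relative to the paper's general distribution: assume a deterministic latent period L, a linear burst size B(L) = m(L - e) after an eclipse phase of length e, and a growth rate r given by the Euler-Lotka-type condition B(L)e^{-rL} = 1, so r(L) = ln B(L) / L. A short Python sketch with invented parameter values:

import numpy as np

# Invented parameters: burst-size slope m (phage per hour of maturation)
# and eclipse phase e (hours before the first phage is assembled).
m, e = 10.0, 0.5

def growth_rate(L):
    """r solving B(L) * exp(-r * L) = 1 with burst size B(L) = m * (L - e)."""
    B = m * (L - e)
    return np.log(B) / L if B > 1.0 else -np.inf

Ls = np.linspace(0.6, 5.0, 1000)          # candidate latent periods (hours)
rs = np.array([growth_rate(L) for L in Ls])
print(f"optimal latent period ~ {Ls[rs.argmax()]:.2f} h, "
      f"max growth rate ~ {rs.max():.3f} per h")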
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Because conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. It is shown that as the number of simulations diverges, the estimator is consistent and a higher-order expansion reveals the stochastic difference between the infeasible GMM estimator based on the same moment conditions and the simulated version. In particular, we show how to adjust standard errors to account for the simulations. Monte Carlo results show how the estimator may be applied to a range of dynamic latent variable (DLV) models, and that it performs well in comparison to several other estimators that have been proposed for DLV models.
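A minimal Python sketch of the core idea, under assumptions not taken from the paper: a toy latent AR(1) model, a Gaussian kernel, a single conditional-moment pair, and a grid search in place of a proper GMM step. All function names and parameter values are illustrative.

import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n):
    """Toy dynamic latent variable model (invented for illustration):
    latent AR(1) h_t = theta * h_{t-1} + eta_t, observed y_t = h_t + eps_t."""
    h = np.zeros(n)
    for t in range(1, n):
        h[t] = theta * h[t - 1] + rng.standard_normal()
    return h + 0.5 * rng.standard_normal(n)

def kernel_conditional_mean(x_sim, y_sim, x_eval, bw):
    """Nadaraya-Watson estimate of E[y | x] from a long simulation."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_sim[None, :]) / bw) ** 2)
    return (w * y_sim).sum(axis=1) / w.sum(axis=1)

def criterion(theta, y_obs, n_sim=10_000, bw=0.3):
    """Match E[y_t | y_{t-1}] between data and simulation: the residual
    y_t - m_theta(y_{t-1}) should be orthogonal to functions of y_{t-1}."""
    y_sim = simulate(theta, n_sim)
    m_hat = kernel_conditional_mean(y_sim[:-1], y_sim[1:], y_obs[:-1], bw)
    g = y_obs[1:] - m_hat
    return g.mean() ** 2 + (g * y_obs[:-1]).mean() ** 2

y_obs = simulate(0.8, 500)
grid = np.linspace(0.5, 0.95, 10)
print("estimate:", grid[np.argmin([criterion(t, y_obs) for t in grid])])

Note that the simulation is conditioned on nothing: the kernel smoother recovers the conditional mean from unconditional draws, which is exactly why simulability under the conditioning information is not required.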
Abstract:
This paper focuses on one of the methods for bandwidth allocation in an ATM network: the convolution approach. The convolution approach permits an accurate study of the system load in statistical terms by accumulated calculations, since probabilistic results of the bandwidth allocation can be obtained. Nevertheless, the convolution approach has a high cost in terms of calculation and storage requirements. This aspect makes real-time calculations difficult, so many authors do not consider this approach. With the aim of reducing the cost, we propose to use the multinomial distribution function: the enhanced convolution approach (ECA). This permits direct computation of the associated probabilities of the instantaneous bandwidth requirements and makes a simple deconvolution process possible. The ECA is used in connection acceptance control, and some results are presented.
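As a hedged sketch of the plain convolution step (not the ECA's multinomial shortcut): with heterogeneous on/off sources, the pmf of aggregate instantaneous demand is built by successive convolutions, the overflow probability drives connection acceptance, and removing a departing connection is a simple deconvolution. The connection mix, capacity, and threshold below are invented.

import numpy as np

def convolve_source(pmf, p_on, rate):
    """Add one on/off source (active with probability p_on, demanding
    `rate` bandwidth units) to the pmf of aggregate instantaneous demand."""
    out = (1.0 - p_on) * np.concatenate([pmf, np.zeros(rate)])
    out[rate:] += p_on * pmf
    return out

def deconvolve_source(pmf, p_on, rate):
    """Remove one source: invert pmf[k] = (1-p) q[k] + p q[k-rate]."""
    q = np.empty(len(pmf) - rate)
    for k in range(len(q)):
        prev = q[k - rate] if k >= rate else 0.0
        q[k] = (pmf[k] - p_on * prev) / (1.0 - p_on)
    return q

# Invented connection mix: (activity probability, peak rate in units).
sources = [(0.3, 2), (0.5, 1), (0.1, 4)]
pmf = np.array([1.0])                  # no connections: demand 0 w.p. 1
for p, r in sources:
    pmf = convolve_source(pmf, p, r)

capacity, eps = 5, 1e-2
overflow = pmf[capacity + 1:].sum()    # P(demand > capacity)
print(f"accept: {overflow < eps} (overflow probability {overflow:.4f})")

pmf = deconvolve_source(pmf, 0.5, 1)   # a connection tears down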
Abstract:
Globalization involves several facility location problems that need to be handled at large scale. Location Allocation (LA) is a combinatorial problem in which the distances among points in the data space matter. More precisely, taking advantage of this distance property of the domain, we exploit the capability of clustering techniques to partition the data space in order to convert an initial large LA problem into several simpler LA problems. In particular, our motivating problem involves a huge geographical area that can be partitioned under overall conditions. We present different types of clustering techniques and then perform a cluster analysis over our dataset in order to partition it. After that, we solve the LA problem by applying a simulated annealing algorithm to the clustered and non-clustered data in order to work out how profitable the clustering is and which of the presented methods is the most suitable.
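A minimal sketch, with invented data, of the divide-and-conquer scheme the abstract outlines: k-means partitions the demand points, and an independent simulated-annealing search then places one facility (a 1-median) inside each cluster.

import numpy as np

rng = np.random.default_rng(1)

def kmeans(points, k, iters=50):
    """Plain k-means, used only to partition the demand points."""
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(points[:, None] - centers[None, :], axis=2)
        labels = d.argmin(axis=1)
        centers = np.array([points[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return labels

def anneal_1median(points, steps=2000, T=1.0, cooling=0.995):
    """Simulated annealing over candidate sites (the points themselves),
    minimizing total distance from one facility to all demand points."""
    cost = lambda i: np.linalg.norm(points - points[i], axis=1).sum()
    cur = rng.integers(len(points))
    cur_cost = cost(cur)
    for _ in range(steps):
        cand = rng.integers(len(points))
        cand_cost = cost(cand)
        if (cand_cost < cur_cost
                or rng.random() < np.exp((cur_cost - cand_cost) / T)):
            cur, cur_cost = cand, cand_cost
        T *= cooling
    return points[cur], cur_cost

pts = rng.random((300, 2))             # invented demand points
labels = kmeans(pts, 4)
for j in range(4):
    site, c = anneal_1median(pts[labels == j])
    print(f"cluster {j}: facility {np.round(site, 3)}, total distance {c:.1f}")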
Abstract:
The mutual information of independent parallel Gaussian-noise channels is maximized, under an average power constraint, by independent Gaussian inputs whose power is allocated according to the waterfilling policy. In practice, discrete signalling constellations with limited peak-to-average ratios (m-PSK, m-QAM, etc.) are used in lieu of the ideal Gaussian signals. This paper gives the power allocation policy that maximizes the mutual information over parallel channels with arbitrary input distributions. Such a policy admits a graphical interpretation, referred to as mercury/waterfilling, which generalizes the waterfilling solution and allows retaining some of its intuition. The relationship between the mutual information of Gaussian channels and the nonlinear minimum mean-square error proves key to solving the power allocation problem.
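For reference, the Gaussian baseline that mercury/waterfilling generalizes can be sketched in a few lines: p_i = max(0, mu - 1/g_i), with the water level mu set by bisection so the powers exhaust the budget. The channel gains and power budget below are illustrative; the mercury/waterfilling policy itself additionally requires the MMSE function of each input constellation and is not reproduced here.

import numpy as np

def waterfilling(gains, total_power, tol=1e-10):
    """Gaussian-input waterfilling: p_i = max(0, mu - 1/g_i), with the
    water level mu found by bisection so that sum(p_i) = total_power."""
    inv = 1.0 / np.asarray(gains, dtype=float)
    lo, hi = inv.min(), inv.max() + total_power
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - inv, 0.0).sum() > total_power:
            hi = mu
        else:
            lo = mu
    return np.maximum(0.5 * (lo + hi) - inv, 0.0)

# Illustrative parallel channels: gains (SNR per unit power) and a budget.
gains = np.array([2.0, 1.0, 0.25])
p = waterfilling(gains, total_power=3.0)
print(np.round(p, 3), "sum =", p.sum().round(3))
# Mutual information achieved with Gaussian inputs on real-valued channels:
print("rate =", (0.5 * np.log2(1 + gains * p)).sum().round(3))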
Abstract:
This paper formulates power allocation policies that maximize the region of mutual informations achievable in multiuser downlink OFDM channels. Arbitrary partitioning of the available tones among users and arbitrary modulation formats, possibly different for every user, are considered. Two distinct policies are derived, respectively for slow fading channels tracked instantaneously by the transmitter and for fast fading channels known only statistically thereby. With instantaneous channel tracking, the solution adopts the form of a multiuser mercury/waterfilling procedure that generalizes the single-user mercury/waterfilling introduced in [1, 2]. With only statistical channel information, in contrast, the mercury/waterfilling interpretation is lost. For both policies, a number of limiting regimes are explored and illustrative examples are provided.
Abstract:
This study examines parents' time investment in their children, distinguishing between developmental and non-developmental care. Our analyses centre on three influential determinants: educational background, marital homogamy, and spouses' relative bargaining power. We find that the emphasis on quality care time is correlated with parents' education, and that marital homogamy reduces couple specialization, but only among the highly educated. In line with earlier research, we identify gendered parental behaviour. The presence of boys is an important condition for fathers' time dedication, but primarily among lower-educated fathers. To the extent that parental stimulation is decisive for child outcomes, our findings suggest the persistence of important inequalities. This emerges through our special attention to behavioural differences across the educational distribution among households.
Abstract:
Using data from the Spanish household budget survey, we investigate life-cycle effects on several product expenditures. A latent-variable model approach is adopted to evaluate the impact of income on expenditures, controlling for the number of members in the family. Two latent factors underlying repeated measures of monetary and non-monetary income are used as explanatory variables in the expenditure regression equations, thus avoiding possible bias associated with measurement error in income. The proposed methodology also handles the case in which product expenditures exhibit a pattern of infrequent purchases. Multiple-group analysis is used to assess the variation of key parameters of the model across various household life-cycle typologies. The analysis discloses significant life-cycle effects on the mean levels of expenditures; it also detects significant life-cycle effects on the way expenditures are affected by income and family size. Asymptotically robust methods are used to account for possible non-normality of the data.
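The identification logic behind using two income measures can be illustrated with a simpler instrumental-variables analogue (not the paper's latent-factor estimator): regressing expenditure on one noisy measure is attenuated, while instrumenting it with the second measure, whose error is assumed independent, recovers the effect of latent income. Simulated data, not the Spanish survey:

import numpy as np

rng = np.random.default_rng(2)
n = 5_000

# Latent income and two noisy measures (invented stand-ins for the
# repeated monetary / non-monetary income measures).
income = rng.normal(0.0, 1.0, n)
meas1 = income + rng.normal(0.0, 0.8, n)    # regressor
meas2 = income + rng.normal(0.0, 0.8, n)    # instrument
expend = 2.0 * income + rng.normal(0.0, 1.0, n)

# Naive OLS on the noisy measure is attenuated toward zero.
beta_ols = np.cov(meas1, expend)[0, 1] / np.var(meas1, ddof=1)

# Instrumenting with the second measure (independent errors) removes the bias.
beta_iv = np.cov(meas2, expend)[0, 1] / np.cov(meas2, meas1)[0, 1]

print(f"true 2.0 | OLS {beta_ols:.2f} | IV {beta_iv:.2f}")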