373 results for Income -- Distribution -- Mathematical models
Abstract:
Theory of compositional data analysis is often focused on the composition only. However, in practical applications we often treat a composition together with covariables of some other scale. This contribution systematically gathers and develops statistical tools for this situation. For instance, for the graphical display of the dependence of a composition on a categorical variable, a colored set of ternary diagrams might be a good idea for a first look at the data, but it will quickly hide important aspects if the composition has many parts or takes extreme values. On the other hand, colored scatterplots of ilr components may not be very instructive for the analyst if the conventional, black-box ilr is used.

Thinking in terms of the Euclidean structure of the simplex, we suggest setting up appropriate projections which, on the one hand, show the compositional geometry and, on the other hand, are still comprehensible to a non-expert analyst and readable for all locations and scales of the data. This is done, for example, by defining special balance displays with carefully selected axes. Following this idea, we need to ask systematically how to display, explore, describe, and test the relation to complementary or explanatory data of categorical, real, ratio or again compositional scales.

This contribution shows that it is sufficient to use some basic concepts and very few advanced tools from multivariate statistics (principal components, multivariate linear models, trellis or parallel plots, etc.) to build appropriate procedures for all these combinations of scales. This has some fundamental implications for their software implementation, and for how they might be taught to analysts who are not already experts in multivariate analysis.
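The balance displays mentioned above rest on expressing a composition in ilr (balance) coordinates. As a purely illustrative aid (not code from the contribution itself), the following sketch computes two balance coordinates for a 3-part composition under one assumed sequential binary partition; the function names and the example composition are hypothetical.

```python
import numpy as np

def closure(x):
    """Rescale a vector of positive parts so they sum to 1 (a composition)."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def ilr_balances_3part(x):
    """Two balance (ilr) coordinates of a 3-part composition [x1, x2, x3],
    assuming the sequential binary partition {x1 | x2}, then {x1, x2 | x3}."""
    x1, x2, x3 = closure(x)
    b1 = np.sqrt(1 / 2) * np.log(x1 / x2)                # x1 against x2
    b2 = np.sqrt(2 / 3) * np.log(np.sqrt(x1 * x2) / x3)  # geometric mean of {x1, x2} against x3
    return np.array([b1, b2])

# Hypothetical 3-part composition (e.g. sand/silt/clay percentages)
print(ilr_balances_3part([30, 50, 20]))
```

A scatterplot of such balances against a covariable, with the partition chosen so the axes are interpretable, is one way to realise the kind of balance display the abstract refers to.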
Abstract:
In order to explain the speed of Vesicular Stomatitis Virus (VSV) infections, we develop a simple model that improves previous approaches to the propagation of virus infections. For VSV infections, we find that the delay time elapsed between the adsorption of a viral particle into a cell and the release of its progeny has a very important effect. Moreover, this delay time makes the adsorption rate essentially irrelevant for predicting VSV infection speeds. Numerical simulations are in agreement with the analytical results. Our model satisfactorily explains the experimentally measured speeds of VSV infections.
Abstract:
A version of Matheron's discrete Gaussian model is applied to cell composition data. The examples are for map patterns of felsic metavolcanics in two different areas. Q-Q plots of the model for cell values representing the proportion of 10 km x 10 km cell area underlain by this rock type are approximately linear, and the line of best fit can be used to estimate the parameters of the model. It is also shown that felsic metavolcanics in the Abitibi area of the Canadian Shield can be modeled as a fractal.
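As an illustration of the line-of-best-fit idea described above (not the paper's own procedure), the sketch below fits a straight line to a normal Q-Q plot; under an assumed Gaussian model for the (possibly transformed) cell values, the intercept and slope estimate location and scale. The data and the logit transform are hypothetical stand-ins.

```python
import numpy as np
from scipy import stats

def qq_line_fit(values):
    """Fit a straight line to the normal Q-Q plot of the data; under an
    assumed Gaussian model the intercept and slope estimate mean and
    standard deviation."""
    v = np.sort(np.asarray(values, dtype=float))
    n = len(v)
    probs = (np.arange(1, n + 1) - 0.5) / n      # plotting positions
    theo = stats.norm.ppf(probs)                 # theoretical normal quantiles
    slope, intercept = np.polyfit(theo, v, 1)    # line of best fit
    return intercept, slope

# Hypothetical cell proportions, logit-transformed before the Q-Q fit
rng = np.random.default_rng(0)
props = rng.beta(2, 8, size=200)                 # stand-in for cell proportions
mu_hat, sigma_hat = qq_line_fit(np.log(props / (1 - props)))
print(mu_hat, sigma_hat)
```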
Abstract:
There are two principal chemical concepts that are important for studying the natural environment. The first one is thermodynamics, which describes whether a system is at equilibrium or can spontaneously change by chemical reactions. The second main concept is how fast chemical reactions take place (kinetics, or the rate of chemical change) whenever they start. In this work we examine a natural system in which both thermodynamic and kinetic factors are important in determining the abundance of NH4+, NO2- and NO3- in surface waters. Samples were collected in the Arno Basin (Tuscany, Italy), a system in which natural and anthropic effects both contribute to strongly modify the chemical composition of water. Thermodynamic modelling based on the reduction-oxidation reactions involving the passage NH4+ -> NO2- -> NO3- under equilibrium conditions allowed us to determine the Eh redox potential values able to characterise the state of each sample and, consequently, of the fluid environment from which it was drawn. Just as pH expresses the concentration of H+ in solution, redox potential is used to express the tendency of an environment to receive or supply electrons. In this context, oxic environments, such as those of river systems, are said to have a high redox potential because O2 is available as an electron acceptor.

Principles of thermodynamics and chemical kinetics often yield a model that does not completely describe the reality of natural systems. Chemical reactions may indeed fail to achieve equilibrium because the products escape from the site of the reaction, or because the reactions involved in the transformation are very slow, so that non-equilibrium conditions persist for long periods. Moreover, reaction rates can be sensitive to poorly understood catalytic effects or to surface effects, while variables such as concentration (a large number of chemical species can coexist and interact concurrently), temperature and pressure can have large gradients in natural systems. Taking this into account, data from 91 water samples have been modelled using statistical methodologies for compositional data. The application of log-contrast analysis yielded statistical parameters that can be correlated with the calculated Eh values. In this way, natural conditions in which chemical equilibrium is hypothesised, as well as underlying fast reactions, are compared with those described by a stochastic approach.
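For the log-contrast analysis mentioned above, a minimal sketch is given below: a log-contrast is any linear combination of log-parts whose coefficients sum to zero, so it does not depend on the units or closure of the composition. The coefficients and concentrations used here are purely illustrative, not values from the Arno Basin data.

```python
import numpy as np

def log_contrast(parts, coeffs):
    """Evaluate a log-contrast sum_i a_i * ln(x_i), where the coefficients
    a_i sum to zero (which makes the result scale-invariant)."""
    a = np.asarray(coeffs, dtype=float)
    assert abs(a.sum()) < 1e-12, "log-contrast coefficients must sum to zero"
    return np.log(np.asarray(parts, dtype=float)) @ a

# Illustrative contrast: reduced NH4+ against the oxidised species NO2-, NO3-
conc = {"NH4": 0.12, "NO2": 0.03, "NO3": 0.85}   # hypothetical molar fractions
lc = log_contrast([conc["NH4"], conc["NO2"], conc["NO3"]], [1.0, -0.5, -0.5])
print(lc)   # such a value could then be correlated with the sample's Eh
```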
Abstract:
The composition of the labour force is an important economic factor for a country. Often the changes in the proportions of different groups are of interest. In this paper we study a monthly compositional time series from the Swedish Labour Force Survey from 1994 to 2005. Three models are studied: the ILR-transformed series, the ILR transformation of the compositionally differenced series of order 1, and the ILR transformation of the compositionally differenced series of order 12. For each of the three models a VAR model is fitted based on the data from 1994-2003. We predict the time series 15 steps ahead and calculate 95% prediction regions. The predictions of the three models are compared with actual values using MAD and MSE, and the prediction regions are compared graphically in a ternary time series plot. We conclude that the first, and simplest, model possesses the best predictive power of the three.
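A minimal sketch of this type of workflow is shown below, assuming a 3-part labour-force composition, a fixed VAR lag order, and synthetic data in place of the Swedish Labour Force Survey series; accuracy is measured in ilr space here, whereas one could equally back-transform the forecasts to compositions first.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

def ilr_3part(comp):
    """ilr coordinates of 3-part compositions (one composition per row)."""
    x1, x2, x3 = comp[:, 0], comp[:, 1], comp[:, 2]
    z1 = np.sqrt(1 / 2) * np.log(x1 / x2)
    z2 = np.sqrt(2 / 3) * np.log(np.sqrt(x1 * x2) / x3)
    return np.column_stack([z1, z2])

# Synthetic monthly shares (e.g. employed / unemployed / not in labour force)
rng = np.random.default_rng(1)
shares = rng.dirichlet([70, 5, 25], size=144)        # 12 years of monthly data
z = pd.DataFrame(ilr_3part(shares), columns=["z1", "z2"])

train, test = z.iloc[:-15], z.iloc[-15:]             # hold out the last 15 months
results = VAR(train).fit(2)                          # VAR(2) on the ilr coordinates
fcast = results.forecast(train.values[-results.k_ar:], steps=15)

mad = np.mean(np.abs(fcast - test.values))           # mean absolute deviation
mse = np.mean((fcast - test.values) ** 2)            # mean squared error
print(mad, mse)
```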
Abstract:
Viruses rapidly evolve, and HIV in particular is known to be one of the fastest evolving human viruses. It is now commonly accepted that viral evolution is the cause of the intriguing dynamics exhibited during HIV infections and of the ultimate success of the virus in its struggle with the immune system. To study viral evolution, we use a simple mathematical model of the within-host dynamics of HIV which incorporates random mutations. In this model, we assume a continuous distribution of viral strains in a one-dimensional phenotype space where random mutations are modelled by diffusion. Numerical simulations show that random mutations combined with competition result in evolution towards higher Darwinian fitness: a stable traveling wave of evolution, moving towards higher levels of fitness, is formed in the phenotype space.
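The following sketch illustrates the kind of model described: a density of viral strains on a one-dimensional phenotype (fitness) axis, with random mutation represented by diffusion and competition acting through a shared carrying capacity. All parameter values, the fitness profile, and the discretisation are illustrative assumptions, not those of the paper.

```python
import numpy as np

# Density n(x, t) of viral strains on a 1-D phenotype axis x.
# Mutation -> diffusion with coefficient D; replication -> growth at rate r(x),
# limited by competition through the total population (shared carrying capacity K).
L, nx = 10.0, 200
dx = L / nx
dt, steps = 1e-3, 20000
x = np.linspace(0, L, nx)

D = 0.01                                   # mutation (diffusion) coefficient
r = 1.0 + 0.3 * x                          # fitness increasing along the axis
K = 1.0                                    # carrying capacity

n = np.exp(-((x - 1.0) ** 2) / 0.05)       # initial strain concentrated at low fitness

for _ in range(steps):
    lap = (np.roll(n, 1) - 2 * n + np.roll(n, -1)) / dx**2     # discrete Laplacian
    n = n + dt * (D * lap + r * n * (1.0 - dx * n.sum() / K))  # mutation + selection
    n = np.clip(n, 0.0, None)

print("dominant phenotype:", x[np.argmax(n)])   # the peak drifts towards higher fitness
```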
Abstract:
This project introduces the theory of statistical design of experiments and then looks in more depth at full factorial designs. These designs apply to systems in which one wishes to study the influence of k factors on a response variable. Full factorial designs are those in which the k factors can take several levels and all possible combinations among them are considered. This project focuses more specifically on 2^k full factorial designs, where the number 2 indicates that each factor takes 2 different levels. The theory of these designs is explained with the help of several examples, from a factorial design studying the influence of 2 factors (2^2) up to one studying the influence of 4 factors (2^4). Some methods are also introduced to help find mathematical models that fit the system, as well as some optimization methodologies such as response surface methodology or the simplex method, in order to make the most of the available resources. Once all these concepts have been introduced, a study and optimization is carried out of a chemical reaction consisting of the removal of copper from a solution, for the later use of that solution in industry for the extraction of gold and silver. The second application case is the study and optimization of the biodiesel production process. In both cases a full two-level factorial design is applied, but a different optimization methodology is used in each. Since this project is purely focused on the design of experiments and the treatment of the data obtained, the experimentation was not carried out by us; the corresponding information was obtained from academic articles by different universities that performed the relevant studies.
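As a small illustration of the coded 2^k design and the fitted linear models mentioned above (an assumed example, not the project's actual copper or biodiesel data), the sketch below builds the design matrix of a 2^2 full factorial design and fits a main-effects plus interaction model by least squares.

```python
import itertools
import numpy as np

def full_factorial_2k(k):
    """Design matrix of a 2^k full factorial design in coded units (-1 / +1):
    one row per run, one column per factor."""
    return np.array(list(itertools.product([-1, 1], repeat=k)))

X = full_factorial_2k(2)                              # runs: (-1,-1), (-1,+1), (+1,-1), (+1,+1)
y = np.array([12.0, 15.0, 14.0, 20.0])                # hypothetical responses, one per run
A, B = X[:, 0], X[:, 1]

M = np.column_stack([np.ones(len(y)), A, B, A * B])   # model: y = b0 + b1*A + b2*B + b12*A*B
coef, *_ = np.linalg.lstsq(M, y, rcond=None)
print(coef)   # regression coefficients; the classical factorial "effects" are twice these
```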
Abstract:
See the abstract at the beginning of the document in the attached file.
Abstract:
In this paper we axiomatize the strong constrained egalitarian solution (Dutta and Ray, 1991) over the class of weak superadditive games using constrained egalitarianism, order-consistency, and converse order-consistency. JEL classification: C71, C78. Keywords: Cooperative TU-game, strong constrained egalitarian solution, axiomatization.
Abstract:
In this note, we consider claims problems with indivisible goods. Specifically, by applying recursively the P-rights lower bound (Jiménez-Gómez and Marco-Gil (2008)), we ensure the fulfillment of Weak Order Preservation, considered by many authors as a minimal requirement of fairness. Moreover, we retrieve the Discrete Constrained Equal Losses and the Discrete Constrained Equal Awards rules (Herrero and Martínez (2008)). Finally, by the recursive double imposition of a lower and an upper bound, we obtain the average between them. Keywords: Claims problems, Indivisibilities, Order Preservation, Constrained Egalitarian rules, Midpoint. JEL classification: C71, D63, D71.
Abstract:
How should scholarships be distributed among (public) higher education students? We pose this situation as a redistribution problem. Following the approach developed in Fleurbaey (1994) and Bossert (1995), redistribution should be based on the notion of solidarity, and it re-allocates resources taking into account only agents' relevant characteristics. We also follow Luttens (2010a), who considers that compensation of relevant characteristics must be based on a lower bound on what every individual deserves. In doing so, we use the so-called fair bound (Moulin (2002)) to define an egalitarian redistribution mechanism and characterize it in terms of non-negativity, priority in lower bound and solidarity. Finally, we apply our approach to the scholarship redistribution problem. Keywords: Redistribution mechanism, Lower bounds, Scholarship, Solidarity. JEL classification: C71, D63, D71.
Abstract:
This paper studies the limits of discrete time repeated games with public monitoring. We solve and characterize the Abreu, Milgrom and Pearce (1991) problem. We find that for the "bad" ("good") news model the lower (higher) magnitude events suggest cooperation, i.e., zero punishment probability, while the higher (lower) magnitude events suggest defection, i.e., punishment with probability one. Public correlation is used to connect these two sets of signals and to make enforceability bind. The dynamic and limit behavior of the punishment probabilities for variations in ... (the discount rate) and ... (the time interval) are characterized, as well as the limit payoffs for all these scenarios (we also introduce uncertainty in the time domain). The obtained ... limits are, to the best of my knowledge, new. The obtained ... limits coincide with Fudenberg and Levine (2007) and Fudenberg and Olszewski (2011), with the exception that we clearly state the precise informational conditions that cause the limit to converge from above, to converge from below, or to degenerate. JEL: C73, D82, D86. KEYWORDS: Repeated Games, Frequent Monitoring, Random Public Monitoring, Moral Hazard, Stochastic Processes.
Abstract:
It can be assumed that the composition of Mercury's thin gas envelope (exosphere) is related to the composition of the planet's crustal materials. If this relationship is true, then inferences regarding the bulk chemistry of the planet might be made from a thorough exospheric study. The most vexing of all unsolved problems is the uncertainty in the source of each component. Historically, it has been believed that H and He come primarily from the solar wind, while Na and K originate from volatilized materials partitioned between Mercury's crust and meteoritic impactors. The processes that eject atoms and molecules into the exosphere of Mercury are generally considered to be thermal vaporization, photon-stimulated desorption (PSD), impact vaporization, and ion sputtering. Each of these processes has its own temporal and spatial dependence. The exosphere is strongly influenced by Mercury's highly elliptical orbit and rapid orbital speed. As a consequence, the surface undergoes large fluctuations in temperature and experiences differences of insolation with longitude. We will discuss these processes but focus more on the expected surface composition and on solar wind particle sputtering, which releases material such as Ca and other elements from the surface minerals, and discuss the relevance of composition modelling.
Abstract:
Tropical cyclones are the most destructive climatological phenomena. The PDI and other indices defined here allow the energy they carry to be estimated. These indices follow a power law over part of their range. In this work, the corresponding power laws are fitted for the indices of tropical cyclones in the North Atlantic and Northeast Pacific over the periods 1988-2010 and 2001-2010 respectively, using data recorded by NOAA.
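A minimal sketch of fitting the power-law part of such an index is given below, using a maximum-likelihood (Hill-type) estimate of the exponent above a chosen threshold; the threshold and the synthetic values are illustrative assumptions and do not reproduce the NOAA records analysed in the work.

```python
import numpy as np

def power_law_exponent(values, xmin):
    """Maximum-likelihood (Hill-type) estimate of the exponent alpha in
    p(x) ~ x**(-alpha), using only the part of the sample with x >= xmin."""
    x = np.asarray(values, dtype=float)
    tail = x[x >= xmin]
    alpha = 1.0 + len(tail) / np.sum(np.log(tail / xmin))
    return alpha, len(tail)

# Illustrative use with synthetic, heavy-tailed PDI-like values (not NOAA data)
rng = np.random.default_rng(2)
pdi = (rng.pareto(1.0, size=500) + 1.0) * 1e9
print(power_law_exponent(pdi, xmin=2e9))
```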
Abstract:
We put together the different conceptual issues involved in measuring inequality of opportunity, discuss how these concepts have been translated into computable measures, and point out the problems and choices researchers face when implementing these measures. Our analysis identifies and suggests several new possibilities to measure inequality of opportunity. The approaches are illustrated with a selective survey of the empirical literature on income inequality of opportunity.