995 results for gradually truncated log-normal
Abstract:
We study the statistical distribution of firm size for USA and Brazilian publicly traded firms through the Zipf plot technique. Sales are used to measure firm size. The Brazilian firm size distribution is given by a log-normal distribution without any adjustable parameter, although different log-normal parameters are needed for the largest firms in the distribution, which are mostly foreign firms. For USA firms, the log-normal distribution has to be gradually truncated after a certain critical value. Therefore, the original hypothesis of proportional effect proposed by Gibrat is valid, with some modification for very large firms. We also consider the possible mechanisms behind this distribution. (c) 2006 Published by Elsevier B.V.
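As a minimal illustration of the Zipf plot technique used here, the Python sketch below ranks synthetic log-normal "sales" and plots rank against size on log-log axes; all names and parameter values are illustrative, not taken from the paper. A pure Pareto law would appear as a straight line, a log-normal bends, and a gradual truncation shows up as the largest firms falling below the log-normal curve.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
sales = rng.lognormal(mean=4.0, sigma=1.2, size=500)  # synthetic "sales"

sizes = np.sort(sales)[::-1]          # largest firm first
ranks = np.arange(1, sizes.size + 1)  # rank 1 = largest

plt.loglog(sizes, ranks, '.')
plt.xlabel('firm size (sales)')
plt.ylabel('rank')
plt.show()
```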
Abstract:
Traditionally, it is assumed that the population size of cities in a country follows a Pareto distribution. This assumption is typically supported by finding evidence of Zipf's Law. Recent studies question this finding, highlighting that, while the Pareto distribution may fit reasonably well when the data is truncated at the upper tail, i.e. for the largest cities of a country, the log-normal distribution may apply when all cities are considered. Moreover, conclusions may be sensitive to the choice of a particular truncation threshold, an issue overlooked in the literature so far. In this paper, then, we reassess the city size distribution in relation to its sensitivity to the choice of truncation point. In particular, we look at US Census data and apply a recursive-truncation approach to estimate Zipf's Law and a non-parametric alternative test where we consider each possible truncation point of the distribution of all cities. Results confirm the sensitivity of conclusions to the truncation point. Moreover, repeating the analysis over simulated data confirms the difficulty of distinguishing a Pareto tail from the tail of a log-normal and, in turn, of identifying the city size distribution as a false or a weak Pareto law.
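For readers who want the flavour of a recursive-truncation exercise, here is a hedged Python sketch (not the authors' code): for each candidate truncation point, re-estimate the Pareto exponent by maximum likelihood on the retained tail. With a genuine Pareto tail the estimates are stable in k; a drifting estimate is the signature of a log-normal body masquerading as a Pareto tail. The data here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
pop = rng.lognormal(mean=9.0, sigma=1.5, size=2000)  # synthetic "city sizes"

x = np.sort(pop)
for k in (10, 50, 100, 500, 1000):  # keep the k largest cities
    tail = x[-k:]
    xmin = tail[0]
    # maximum-likelihood (Hill) estimate of the Pareto exponent
    alpha = 1.0 + k / np.sum(np.log(tail / xmin))
    print(f'k={k:5d}  xmin={xmin:12.1f}  alpha={alpha:.3f}')
```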
Abstract:
There is increasing concern about soil enrichment with K+ and subsequent potential losses following long-term application of poor quality water to agricultural land. Different models are increasingly being used for predicting or analyzing water flow and chemical transport in soils and groundwater. The convective-dispersive equation (CDE) and the convective log-normal transfer function (CLT) models were fitted to the potassium (K+) leaching data. The CDE and CLT models produced equivalent goodness of fit. Simulated breakthrough curves for a range of CaCl2 concentrations, based on the parameters estimated at 15 mmol l⁻¹ CaCl2, showed the peak occurring earlier and at higher K+ concentration as the CaCl2 concentration used in the leaching experiments decreased. In another approach, the parameters estimated from the 15 mmol l⁻¹ CaCl2 solution were used for all other CaCl2 concentrations, and the best value of the retardation factor (R) was optimised for each data set; this gave better predictions. With decreasing CaCl2 concentration, the optimised R must exceed the measured value (except for 10 mmol l⁻¹ CaCl2) if the parameters estimated at 15 mmol l⁻¹ CaCl2 are used. Both models suffer from the fact that they need to be calibrated against a data set, and some of their parameters are not measurable and cannot be determined independently.
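The CLT model represents solute travel times at a given depth with a log-normal density. The sketch below fits that density to a hypothetical breakthrough curve with scipy; the data points and starting values are invented for illustration, and the actual calibration in the study may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

def clt_pdf(t, mu, sigma):
    """Log-normal travel-time density used by the CLT model."""
    return np.exp(-(np.log(t) - mu) ** 2 / (2 * sigma ** 2)) / (
        t * sigma * np.sqrt(2 * np.pi))

# hypothetical breakthrough data: relative K+ concentration vs. pore volume
t_obs = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.5, 2.0, 3.0])
c_obs = np.array([0.01, 0.10, 0.35, 0.55, 0.60, 0.40, 0.20, 0.05])

(mu, sigma), _ = curve_fit(clt_pdf, t_obs, c_obs, p0=(0.0, 0.5))
print(f'mu = {mu:.3f}, sigma = {sigma:.3f}')
# a retardation factor R rescales travel times, i.e. shifts mu by ln(R),
# which is the knob re-optimised for each CaCl2 concentration above
```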
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Power-law distributions have been observed in various economic and physical systems. Levy flights have infinite variance, which discourages a physical approach. We introduce a class of stochastic processes, the gradually truncated Levy flight, in which large steps of a Levy flight are gradually eliminated. It has finite variance, and the system can be analyzed in a closed form. We applied the present method to explain the distribution of a particular economic index. The present method can be applied to describe time series in a variety of fields, e.g. turbulent flow, anomalous diffusion, polymers, etc. (C) 1999 Elsevier B.V. All rights reserved.
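A minimal sketch of how such a process can be simulated, assuming (as in this line of work) that steps beyond a critical size l_c are retained with probability exp(-((|x| - l_c)/k)^beta); the parameter values are illustrative, and the rejection scheme is one convenient implementation, not necessarily the paper's.

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(2)
alpha = 1.5                      # Levy stability index (illustrative)
l_c, k, beta_c = 5.0, 3.0, 2.0   # cutoff parameters (illustrative)

steps = levy_stable.rvs(alpha, 0.0, size=20_000, random_state=rng)
excess = np.maximum(np.abs(steps) - l_c, 0.0)
# accept each oversized step with a gradually decreasing probability;
# steps below l_c are kept with probability exp(0) = 1
keep = rng.random(steps.size) < np.exp(-(excess / k) ** beta_c)
x = steps[keep]
print('sample variance (now finite):', x.var())
```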
Abstract:
Tsallis postulated a generalized form of entropy, giving rise to a new statistics now known as Tsallis statistics. In the present work, we compare Tsallis statistics with the gradually truncated Levy flight, and discuss the distribution of an economic index, the Standard and Poor's 500, using the values of standard deviation calculated by our model. We find that both statistics give almost the same distribution. Thus we feel that gradual truncation of the Levy distribution after a certain critical step size, when describing complex systems, is a requirement of generalized thermodynamics or something similar. The gradually truncated Levy flight is based on physical considerations and brings a better physical picture of the dynamics of the whole system; Tsallis statistics gives it theoretical support. Both statistics together can be utilized for the development of a more exact portfolio theory or to better understand the complexities of human and financial behavior. A comparison of both statistics is made. (C) 2002 Published by Elsevier B.V.
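For concreteness, the q-Gaussian at the heart of Tsallis statistics can be written down and evaluated directly; the sketch below uses illustrative q and beta values rather than the ones fitted to the S&P 500 in the paper.

```python
import numpy as np

def q_gaussian(x, q, beta):
    """Unnormalised Tsallis q-Gaussian: [1 - (1 - q) beta x^2]_+^(1/(1-q))."""
    base = 1.0 - (1.0 - q) * beta * x ** 2
    out = np.zeros_like(x, dtype=float)
    ok = base > 0
    out[ok] = base[ok] ** (1.0 / (1.0 - q))
    return out

x = np.linspace(-10.0, 10.0, 2001)
p = q_gaussian(x, q=1.4, beta=1.0)
p /= p.sum() * (x[1] - x[0])  # normalise numerically
# for 1 < q < 3 the tails decay as a power law, which is what makes the
# q-Gaussian comparable to a (gradually truncated) Levy shape
```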
Abstract:
Power-law distributions, i.e. Levy flights, have been observed in various economic, biological, and physical systems in the high-frequency regime. These distributions can be successfully explained via the gradually truncated Levy flight (GTLF). In general, these systems converge to a Gaussian distribution in the low-frequency regime. In the present work, we develop a model for the physical basis of the cut-off length in the GTLF and its variation with respect to the time interval between successive observations. We observe that the GTLF automatically approaches a Gaussian distribution in the low-frequency regime. We applied the present method to analyze time series in some physical and financial systems, and the agreement between the experimental results and the theoretical curves is excellent. The present method can be applied to analyze time series in a variety of fields, which in turn provides a basis for the development of further microscopic models for the system. © 2000 Elsevier Science B.V. All rights reserved.
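The low-frequency Gaussian limit follows from the central limit theorem once the variance is finite, and is easy to check numerically. The sketch below uses Student-t steps as a convenient finite-variance, heavy-tailed stand-in for GTLF steps; all values are illustrative.

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(3)
# heavy-tailed but finite-variance steps as a stand-in for GTLF steps
steps = rng.standard_t(df=5, size=1_000_000)

for n in (1, 10, 100, 1000):
    agg = steps.reshape(-1, n).sum(axis=1)  # n-step (lower-frequency) sums
    print(f'n={n:5d}  excess kurtosis = {kurtosis(agg):7.3f}')
# excess kurtosis falls towards the Gaussian value of 0 as n grows
```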
Abstract:
The discovery that the epsilon 4 allele of the apolipoprotein E (apoE) gene is a putative risk factor for Alzheimer disease (AD) in the general population has highlighted the role of genetic influences in this extremely common and disabling illness. It has long been recognized that another genetic abnormality, trisomy 21 (Down syndrome), is associated with early and severe development of AD neuropathological lesions. It remains a challenge, however, to understand how these facts relate to the pathological changes in the brains of AD patients. We used computerized image analysis to examine the size distribution of one of the characteristic neuropathological lesions in AD, deposits of A beta peptide in senile plaques (SPs). Surprisingly, we find that a log-normal distribution fits the SP size distribution quite well, motivating a porous model of SP morphogenesis. We then analyzed SP size distribution curves in genotypically defined subgroups of AD patients. The data demonstrate that both apoE epsilon 4/AD and trisomy 21/AD lead to increased amyloid deposition, but by apparently different mechanisms. The size distribution curve is shifted toward larger plaques in trisomy 21/AD, probably reflecting increased A beta production. In apoE epsilon 4/AD, the size distribution is unchanged but the number of SP is increased compared to apoE epsilon 3, suggesting increased probability of SP initiation. These results demonstrate that subgroups of AD patients defined on the basis of molecular characteristics have quantitatively different neuropathological phenotypes.
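The subgroup comparison boils down to fitting a log-normal to each group's plaque sizes and asking whether the curve shifts (larger plaques) or only the count grows (more plaques). A hedged sketch with synthetic sizes, not the study's image-analysis pipeline:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# synthetic stand-ins for measured SP sizes in two genotype groups
sp_apoe4 = rng.lognormal(mean=3.0, sigma=0.5, size=400)  # more plaques
sp_tri21 = rng.lognormal(mean=3.4, sigma=0.5, size=250)  # larger plaques

for name, sizes in [('apoE e4/AD', sp_apoe4), ('trisomy 21/AD', sp_tri21)]:
    shape, loc, scale = stats.lognorm.fit(sizes, floc=0)
    mu, sigma = np.log(scale), shape
    print(f'{name}: n = {len(sizes)}, mu = {mu:.2f}, sigma = {sigma:.2f}')
# a shift in mu with similar counts points to increased growth; an
# unchanged mu with more plaques points to increased initiation
```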
Abstract:
The size frequency distributions of diffuse, primitive and classic β-amyloid (Aβ) deposits were studied in single sections of cortical tissue from patients with Alzheimer's disease (AD) and Down's syndrome (DS) and compared with those predicted by the log-normal model. In a sample of brain regions, these size distributions were compared with those obtained by serial reconstruction through the tissue, and the data were used to adjust the size distributions obtained in single sections. The adjusted size distributions of the diffuse, primitive and classic deposits deviated significantly from a log-normal model in AD and DS, the greatest deviations from the model being observed in AD. More Aβ deposits were observed close to the mean, and fewer in the larger size classes, than predicted by the model. Hence, the growth of Aβ deposits in AD and DS does not strictly follow the log-normal model, deposits growing to within a more restricted size range than predicted. However, Aβ deposits grow to a larger size in DS compared with AD, which may reflect differences in the mechanism of Aβ formation.
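One standard way to test such deviations, sketched below with synthetic data, is a Kolmogorov-Smirnov test of the log-transformed sizes against a fitted normal; the paper's exact test may differ.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# synthetic deposit sizes clustered near the mean, thin upper tail
log_sizes = rng.normal(loc=3.0, scale=0.3, size=300).clip(2.2, 3.8)

z = (log_sizes - log_sizes.mean()) / log_sizes.std(ddof=1)
stat, p = stats.kstest(z, 'norm')
print(f'KS statistic = {stat:.3f}, p = {p:.4f}')
# a small p rejects log-normality; strictly, Lilliefors' correction
# applies when mu and sigma are estimated from the same data
```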
Abstract:
The size frequency distributions of diffuse, primitive and cored senile plaques (SP) were studied in single sections of the temporal lobe from 10 patients with Alzheimer's disease (AD). The size distribution curves were unimodal and positively skewed. The size distribution curve of the diffuse plaques was shifted towards larger plaques, while those of the neuritic and cored plaques were shifted towards smaller plaques. The neuritic/diffuse plaque ratio was maximal in the 11–30 micron size class and the cored/diffuse plaque ratio in the 21–30 micron size class. The size distribution curves of the three types of plaque deviated significantly from a log-normal distribution. Distributions expressed on a logarithmic scale were 'leptokurtic', i.e. with an excess of observations near the mean. These results suggest that SP in AD grow to within a more restricted size range than predicted from a log-normal model. In addition, there appear to be differences in the patterns of growth of diffuse, primitive and cored plaques. If neuritic and cored plaques develop from earlier diffuse plaques, then smaller diffuse plaques are more likely to be converted to mature plaques.
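'Leptokurtic on a logarithmic scale' has a direct numerical reading: the excess kurtosis of log(size) is positive, whereas a true log-normal gives zero. A small sketch, with Laplace-distributed log sizes standing in for the plaque data:

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(6)
# synthetic plaque diameters (microns): log sizes follow a Laplace law,
# which is leptokurtic, mimicking the restricted size range reported
diam = np.exp(rng.laplace(loc=np.log(25.0), scale=0.25, size=500))

excess = kurtosis(np.log(diam))  # 0 for a true log-normal
print(f'excess kurtosis of log sizes = {excess:.2f}')  # Laplace gives ~3
```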
Abstract:
In many of the Statnotes described in this series, the statistical tests assume the data are a random sample from a normal distribution. These Statnotes include most of the familiar statistical tests, such as the 't' test, analysis of variance (ANOVA), and Pearson's correlation coefficient ('r'). Nevertheless, many variables exhibit a more or less 'skewed' distribution. A skewed distribution is asymmetrical, the mean being displaced relative to the mode, towards higher values (positive skew) or lower values (negative skew). If the mean of the distribution is low, the degree of variation large, and values can only be positive, a positively skewed distribution is usually the result. Many distributions have a potentially low mean and high variance, including the abundance of bacterial species on plants, the latent period of an infectious disease, and the sensitivity of certain fungi to fungicides. These positively skewed distributions are often fitted successfully by a variant of the normal distribution called the log-normal distribution. This Statnote describes fitting the log-normal distribution with reference to two scenarios: (1) the frequency distribution of bacterial numbers isolated from cloths in a domestic environment and (2) the sizes of lichenised 'areolae' growing on the hypothallus of Rhizocarpon geographicum (L.) DC.
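The fitting procedure the Statnote describes amounts to estimating mu and sigma from the log-transformed data and comparing observed with expected class frequencies. A minimal Python sketch, with synthetic counts standing in for scenario (1):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
counts = rng.lognormal(mean=5.0, sigma=1.0, size=120)  # synthetic CFU counts

log_x = np.log(counts)
mu, sigma = log_x.mean(), log_x.std(ddof=1)  # estimates on the log scale
print(f'mu = {mu:.2f}, sigma = {sigma:.2f}')

# observed vs expected frequencies in log-spaced bins (chi-square style)
bins = np.logspace(np.log10(counts.min()), np.log10(counts.max()), 8)
observed, _ = np.histogram(counts, bins=bins)
expected = np.diff(stats.lognorm.cdf(bins, s=sigma, scale=np.exp(mu))) * counts.size
print(observed, expected.round(1))
```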
Abstract:
The main objective of this paper is to study a logarithmic extension of the bimodal skew normal model introduced by Elal-Olivero et al. [1]. The model can then be seen as an alternative to the log-normal model typically used for fitting positive data. We study some basic properties, such as the distribution function and moments, and discuss maximum likelihood parameter estimation. We report results of an application to a real data set on nickel concentration in soil samples. Model comparison with several alternatives indicates that the proposed model presents the best fit, so it can be quite useful in real applications for chemical data on substance concentrations. Copyright (C) 2011 John Wiley & Sons, Ltd.
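The model-comparison step can be mimicked generically: fit each candidate to the positive data by maximum likelihood and rank by AIC. The sketch below compares the log-normal against two common alternatives on an invented data set; it does not include the bimodal skew-normal extension itself, whose density is given in the paper.

```python
import numpy as np
from scipy import stats

# illustrative concentrations (mg/kg); the paper's nickel data are not reproduced
ni = np.array([2.1, 3.4, 1.8, 5.2, 2.9, 4.1, 1.2, 6.3, 3.7, 2.5])

candidates = {'log-normal': stats.lognorm, 'gamma': stats.gamma,
              'Weibull': stats.weibull_min}
for name, dist in candidates.items():
    params = dist.fit(ni, floc=0)         # loc pinned at 0 for positive data
    ll = dist.logpdf(ni, *params).sum()
    aic = 2 * (len(params) - 1) - 2 * ll  # loc is fixed, so one less free parameter
    print(f'{name:10s} AIC = {aic:.2f}')
```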
Abstract:
The study of organism movement is essential for understanding how ecosystems function. In the case of exploited marine ecosystems, this leads to an interest in the spatial strategies of fishers. One of the most widely used approaches for modelling the movement of top predators is the Lévy random walk. A random walk is a mathematical model composed of random displacements. In the Lévy case, the displacement lengths follow a Lévy stable law. Also in this case, the lengths, as they tend to infinity (in practice, when they are large, for example relative to the median or the third quartile), follow a power law characteristic of the type of Lévy walk (Cauchy, Brownian or strictly Lévy). In practice, besides the fact that this property is used in the converse direction without theoretical foundation, distribution tails, a notion that is itself imprecise, are modelled by power laws without any discussion of the sensitivity of the results to the definition of the tail, or of the relevance of the goodness-of-fit tests and model selection criteria. In this work, which concerns the observed movements of three Peruvian anchovy fishing vessels, several models of distribution tails (log-normal, exponential, truncated exponential, power law and truncated power law) were compared, together with two possible definitions of the tail (from the median to infinity, or from the third quartile to infinity). In terms of the statistical criteria and tests used, the truncated laws (exponential and power) emerged as the best. They also capture the fact that, in practice, vessels do not exceed a certain displacement length. Model choice proved sensitive to the choice of the start of the tail: for the same vessel, the choice of one truncated model or the other depends on the interval of values of the variable over which the model is fitted. Finally, we discuss the ecological implications of these results.
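A comparable analysis can be run with the third-party powerlaw package (Alstott et al.); the sketch below, on synthetic move lengths, compares a power-law tail against log-normal, exponential and truncated power-law alternatives for both tail definitions. This illustrates the methodology only, and is not the thesis's code.

```python
import numpy as np
import powerlaw  # pip install powerlaw

rng = np.random.default_rng(8)
steps = rng.lognormal(mean=1.0, sigma=0.8, size=5000)  # synthetic move lengths

for q in (50, 75):  # tail defined from the median or the third quartile
    xmin = np.percentile(steps, q)
    fit = powerlaw.Fit(steps, xmin=xmin)
    for alt in ('lognormal', 'exponential', 'truncated_power_law'):
        R, p = fit.distribution_compare('power_law', alt)
        print(f'xmin = Q{q}: power_law vs {alt}: R = {R:.2f}, p = {p:.3f}')
# R < 0 favours the alternative; if the winning model flips between the
# two xmin choices, the tail definition is driving the conclusion
```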
Abstract:
In this paper, the generalized log-gamma regression model is modified to allow for the possibility that long-term survivors are present in the data. This modification leads to a generalized log-gamma regression model with a cure rate, encompassing, as special cases, the log-exponential, log-Weibull and log-normal regression models with a cure rate typically used to model such data. The models attempt to simultaneously estimate the effects of explanatory variables on the acceleration or deceleration of the timing of a given event and on the surviving fraction, that is, the proportion of the population for which the event never occurs. The normal curvatures of local influence are derived under some usual perturbation schemes, and two martingale-type residuals are proposed to assess departures from the generalized log-gamma error assumption as well as to detect outlying observations. Finally, a data set from the medical area is analyzed.
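The cure-rate idea can be stated in one line: the population survival is S_pop(t) = pi + (1 - pi) S(t), mixing a cured fraction pi with susceptibles whose survival is S(t). The sketch below fits that mixture by maximum likelihood on synthetic censored data, using a log-normal S(t) as a stand-in for the generalized log-gamma special cases; covariates and residual diagnostics are omitted.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

rng = np.random.default_rng(9)
# synthetic right-censored data: 30% cured, log-normal event times for the rest
cured = rng.random(300) < 0.3
t_event = rng.lognormal(mean=1.0, sigma=0.6, size=300)
t_cens = rng.uniform(1.0, 15.0, size=300)
time = np.where(cured, t_cens, np.minimum(t_event, t_cens))
event = (~cured) & (t_event <= t_cens)

def negloglik(theta):
    pi = 1.0 / (1.0 + np.exp(-theta[0]))   # cure fraction, logit-parameterised
    mu, sigma = theta[1], np.exp(theta[2])
    f = stats.lognorm.pdf(time, s=sigma, scale=np.exp(mu))
    S = stats.lognorm.sf(time, s=sigma, scale=np.exp(mu))
    # events contribute (1-pi)f(t); censored cases contribute pi + (1-pi)S(t)
    lik = np.where(event, (1.0 - pi) * f, pi + (1.0 - pi) * S)
    return -np.sum(np.log(lik))

res = minimize(negloglik, x0=[0.0, 0.5, 0.0], method='Nelder-Mead')
print(f'estimated cure fraction: {1.0 / (1.0 + np.exp(-res.x[0])):.2f}')
```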