920 results for Log-normal distribution


Relevance: 100.00%

Abstract:

Using a modified deprivation (or poverty) function, we theoretically study how poverty changes with the 'global' mean and variance of the income distribution, using Indian survey data. We show that when income obeys a log-normal distribution, a rising mean income generally indicates a reduction in poverty, while an increase in the variance of the income distribution increases poverty. This optimistic view for a developing economy, however, is no longer tenable once the poverty index is found to follow a Pareto distribution. Here, although a rising mean income still indicates a reduction in poverty, the presence of an inflexion point in the poverty function creates a critical value of the variance below which poverty decreases with increasing variance, while beyond this value poverty undergoes a steep increase followed by a decrease as the variance rises further. Identifying this inflexion point as the poverty line, we show that the Pareto poverty function satisfies all three standard axioms of a poverty index [N.C. Kakwani, Econometrica 43 (1980) 437; A.K. Sen, Econometrica 44 (1976) 219], whereas the log-normal distribution falls short of this requisite. Following these results, we make quantitative predictions to correlate a developing economy with a developed one.
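
The paper's modified deprivation function is not reproduced in the abstract; as a minimal sketch of the log-normal claim, the ordinary headcount ratio can be computed in closed form (the poverty line z and the mean/variance values below are illustrative, not from the paper):

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def headcount_poverty(m, v, z):
    """Fraction of the population below poverty line z when income is
    log-normal with arithmetic mean m and variance v."""
    sigma2 = math.log(1.0 + v / m**2)   # variance of log-income
    mu = math.log(m) - 0.5 * sigma2     # mean of log-income
    return normal_cdf((math.log(z) - mu) / math.sqrt(sigma2))

# Rising mean income reduces poverty; rising variance increases it:
p_base = headcount_poverty(m=2.0, v=1.0, z=1.0)
p_richer = headcount_poverty(m=3.0, v=1.0, z=1.0)
p_unequal = headcount_poverty(m=2.0, v=3.0, z=1.0)
```

With these values the headcount falls as the mean rises and grows as the variance rises, consistent with the abstract's statement for the log-normal case.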

Relevance: 100.00%

Abstract:

2000 Mathematics Subject Classification: 62H10.

Relevance: 100.00%

Abstract:

Let (X, Y) be a bivariate normal random vector whose components represent the responses to Treatment 1 and Treatment 2. Statistical inference about the bivariate normal distribution parameters when both treatment samples contain missing data is considered. Assuming the correlation coefficient ρ of the bivariate population is known, the maximum likelihood estimators (MLEs) of the population means and variance (ξ, η, and σ²) are obtained. Inferences about these parameters are presented. Procedures for constructing a confidence interval for the difference of population means ξ – η and for testing hypotheses about ξ – η are established. The performances of the new estimators and testing procedure are compared numerically with the method proposed in Looney and Jones (2003) on the basis of extensive Monte Carlo simulation. Simulation studies indicate that the testing procedure proposed in this thesis has higher power.
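
The missing-data setting can be sketched as follows; this is only a naive all-available-data estimator of ξ – η on simulated data (not the thesis's MLE), with all parameter values illustrative:

```python
import random, math

random.seed(42)
rho, xi, eta, sigma = 0.6, 1.0, 0.5, 1.0

def draw_pair():
    """One draw from a bivariate normal with means (xi, eta),
    common variance sigma^2 and correlation rho."""
    x = random.gauss(0.0, 1.0)
    y = rho * x + math.sqrt(1.0 - rho**2) * random.gauss(0.0, 1.0)
    return xi + sigma * x, eta + sigma * y

# complete pairs plus extra observations missing the other component
pairs = [draw_pair() for _ in range(5000)]
x_only = [draw_pair()[0] for _ in range(2000)]
y_only = [draw_pair()[1] for _ in range(2000)]

# naive estimate of xi - eta using all available observations
xs = [p[0] for p in pairs] + x_only
ys = [p[1] for p in pairs] + y_only
diff_hat = sum(xs) / len(xs) - sum(ys) / len(ys)
```

The thesis's MLE additionally exploits the known ρ to borrow strength across the two samples, which is what yields the higher testing power reported above.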

Relevance: 100.00%

Abstract:

The multivariate normal distribution is commonly encountered in many fields, and missing values are a frequent issue in practice. The purpose of this research was to estimate the parameters of the three-dimensional normal distribution with permutation-symmetric covariance, with complete data and with all possible patterns of incomplete data. In this study, MLEs under missing data were derived, and the properties of the MLEs as well as their sampling distributions were obtained. A Monte Carlo simulation study was used to evaluate the performance of the considered estimators for both cases, when ρ was known and when it was unknown. All results indicated that, compared with estimators obtained by omitting observations with missing data, the estimators derived in this article performed better. Furthermore, when ρ was unknown, using an estimate of ρ led to the same conclusion.
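
A minimal sketch of the permutation-symmetric (equicorrelated) covariance structure implied above, with σ² and ρ chosen purely for illustration: the matrix is Σ = σ²[(1−ρ)I + ρJ], positive definite for −1/2 < ρ < 1, with eigenvalue σ²(1+2ρ) on (1,1,1) and σ²(1−ρ) on contrast vectors:

```python
sigma2, rho = 2.0, 0.3

# 3x3 permutation-symmetric covariance: equal variances, equal correlations
Sigma = [[sigma2 if i == j else sigma2 * rho for j in range(3)]
         for i in range(3)]

def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(3)) for i in range(3)]

# eigenvalue sigma2*(1 + 2*rho) on (1,1,1); sigma2*(1 - rho) on contrasts
lam1 = sigma2 * (1 + 2 * rho)
lam2 = sigma2 * (1 - rho)
ok1 = all(abs(a - lam1) < 1e-12 for a in matvec(Sigma, [1, 1, 1]))
ok2 = all(abs(a - b) < 1e-12 for a, b in
          zip(matvec(Sigma, [1, -1, 0]), [lam2, -lam2, 0.0]))
```

This low-dimensional parameterisation (one variance, one correlation) is what makes MLEs tractable under every missing-data pattern.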

Relevance: 100.00%

Abstract:

Although transit travel time variability is essential for understanding the deterioration of reliability, optimising transit schedules, and modelling route choice, it has not attracted enough attention in the literature. This paper proposes public-transport-oriented definitions of travel time variability and explores the distributions of public transport travel time using Transit Signal Priority data. First, definitions of public transport travel time variability are established by extending the common definitions of variability in the literature and by using route and service data of public transport vehicles. Second, the paper explores the distribution of public transport travel time. A new approach is proposed for analysing the distributions involving all transit vehicles as well as vehicles from a specific route. The log-normal distribution is revealed as the descriptor of public transport travel time from the same route and service. The methods described in this study could be of interest to both traffic managers and transit operators for planning and managing transit systems.
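
Fitting a log-normal to route-level travel times can be sketched as follows (the route mean of 12 minutes and log-standard deviation of 0.25 are illustrative, not from the paper):

```python
import math, random

def fit_lognormal(times):
    """MLE of log-normal parameters: mean and std of the log travel times."""
    logs = [math.log(t) for t in times]
    mu = sum(logs) / len(logs)
    var = sum((x - mu) ** 2 for x in logs) / len(logs)
    return mu, math.sqrt(var)

random.seed(1)
# synthetic travel times (minutes) for one hypothetical route/service
times = [random.lognormvariate(math.log(12.0), 0.25) for _ in range(4000)]
mu_hat, sigma_hat = fit_lognormal(times)
```

The fitted σ of the log travel times is itself a natural variability measure for a route/service, in the spirit of the definitions proposed above.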

Relevance: 100.00%

Abstract:

The unconfined aquifer of the Continental Terminal in Niger was investigated by magnetic resonance sounding (MRS) and by 14 pumping tests in order to improve calibration of MRS outputs at the field scale. The reliability of the standard relationship used for estimating aquifer transmissivity by MRS was checked; the parametric factor was found to be estimable with an uncertainty of ≤150% from a single point of calibration. The MRS water content (θMRS) was shown to be positively correlated with the specific yield (Sy), and θMRS always displayed higher values than Sy. A conceptual model was subsequently developed, based on estimated changes of the total porosity, Sy, and the specific retention Sr as a function of the median grain size. The resulting relationship between θMRS and Sy showed a reasonably good fit with the experimental dataset, considering the inherent heterogeneity of the aquifer matrix (residual error ≈60%). Interpreted in terms of aquifer parameters, the MRS data suggest a log-normal distribution of the permeability and a one-sided Gaussian distribution of Sy. These results demonstrate the efficiency of the MRS method for fast and low-cost prospection of hydraulic parameters over large unconfined aquifers.

Relevance: 100.00%

Abstract:

As part of an international network of large plots to study tropical vegetation dynamics on a long-term basis, a 50-hectare permanent plot was set up during 1988-89 in the deciduous forests of Mudumalai, southern India. Within this plot, 25,929 living woody plants (71 species) above 1 cm DBH (diameter at breast height) were identified, measured, tagged and mapped. Species abundances followed the characteristic log-normal distribution. The four most abundant species (Kydia calycina, Lagerstroemia microcarpa, Terminalia crenulata and Helicteres isora) constituted nearly 56% of total stems, while seven species were represented by only one individual each in the plot. Variance/mean ratios of density showed most species to have clumped distributions. The population declined overall by 14% during the first two years, largely due to elephant- and fire-mediated damage to Kydia calycina and Helicteres isora. In this article we discuss the need for large plots to study vegetation dynamics.
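
The variance/mean (index of dispersion) criterion used above to detect clumping can be sketched as follows (the quadrat counts are illustrative):

```python
def dispersion_index(counts):
    """Variance/mean ratio of quadrat counts: approximately 1 for a
    random (Poisson) pattern, >1 for clumped, <1 for regular."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    return var / mean

# hypothetical stems-per-quadrat counts for two contrasting species
clumped = [0, 0, 0, 12, 0, 0, 11, 0, 0, 0]
regular = [2, 3, 2, 3, 2, 2, 3, 2, 3, 2]
```

A species whose stems concentrate in a few quadrats yields a ratio well above 1, which is how most Mudumalai species were classified as clumped.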

Relevance: 100.00%

Abstract:

We present a method to integrate environmental time series into stock assessment models and to test the significance of correlations between population processes and the environmental time series. Parameters that relate the environmental time series to population processes are included in the stock assessment model, and likelihood ratio tests are used to determine whether these parameters improve the fit to the data significantly. Two approaches are considered to integrate the environmental relationship. In the environmental model, the population dynamics process (e.g. recruitment) is proportional to the environmental variable, whereas in the environmental model with process error it is proportional to the environmental variable but is allowed an additional temporal variation (process error) constrained by a log-normal distribution. The methods are tested using simulation analysis and compared with the traditional method of correlating model estimates with environmental variables outside the estimation procedure. In the traditional method, the estimates of recruitment were provided by a model that allowed recruitment only a temporal variation constrained by a log-normal distribution. We illustrate the methods by applying them to test the statistical significance of the correlation between sea-surface temperature (SST) and recruitment to the snapper (Pagrus auratus) stock in the Hauraki Gulf–Bay of Plenty, New Zealand. Simulation analyses indicated that the integrated approach with additional process error is superior to the traditional method. The results suggest that, for this snapper stock, recruitment is positively correlated with SST at the time of spawning.
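
The likelihood ratio test step can be sketched as follows; for one extra parameter the statistic is compared against a chi-square distribution with 1 degree of freedom (the log-likelihood values in the usage lines are illustrative):

```python
import math

def chi2_cdf_1df(x):
    """CDF of the chi-square distribution with 1 degree of freedom."""
    return math.erf(math.sqrt(x / 2.0))

def lr_test(loglik_null, loglik_env, alpha=0.05):
    """Does adding the environmental relationship (one extra parameter)
    significantly improve the fit? Returns (statistic, p-value, verdict)."""
    stat = 2.0 * (loglik_env - loglik_null)
    p_value = 1.0 - chi2_cdf_1df(stat)
    return stat, p_value, p_value < alpha

# illustrative fit: model with the SST parameter vs. without it
stat, p_value, significant = lr_test(-120.3, -116.9)
```

Testing the correlation inside the estimation procedure, as here, is what distinguishes the integrated approach from the traditional post-hoc correlation of model outputs.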

Relevance: 100.00%

Abstract:

Noise and vibration from underground railways are a major source of disturbance to inhabitants near subways. To help designers meet noise and vibration limits, numerical models are used to understand vibration propagation from these underground railways. However, the models commonly assume that the ground is homogeneous and neglect local variability in the soil properties. Such simplifying assumptions add a level of uncertainty to the predictions which is not well understood. The goal of the current paper is to quantify the effect of soil inhomogeneity on surface vibration. The thin-layer method (TLM) is suggested as an efficient and accurate means of simulating vibration from underground railways in arbitrarily layered half-spaces. Stochastic variability of the soil's elastic modulus is introduced using a Karhunen–Loève (KL) expansion; the modulus is assumed to have a log-normal distribution and a modified exponential covariance kernel. The effect of horizontal soil variability is investigated by comparing the stochastic results for soils varied only in the vertical direction with soils with 2D variability. Results suggest that local soil inhomogeneity can significantly affect surface velocity predictions; 90 percent confidence intervals averaging 8 dB, with peak values up to 12 dB, are computed. This is a significant source of uncertainty and should be considered when using predictions from models that assume homogeneous soil properties. Furthermore, the effect of horizontal variability of the elastic modulus on the confidence interval appears to be negligible. This suggests that only vertical variation needs to be taken into account when modelling ground vibration from underground railways.
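
Generating a log-normal random modulus profile can be sketched as follows; this minimal 1-D illustration uses a plain exponential kernel and a Cholesky factorisation rather than the paper's KL expansion with a modified exponential kernel, and the grid size, correlation length, and modulus statistics are all assumed values:

```python
import math, random

def exp_cov(n, dz, corr_len, var):
    """Exponential covariance matrix on a regular grid of n depths."""
    return [[var * math.exp(-abs(i - j) * dz / corr_len)
             for j in range(n)] for i in range(n)]

def cholesky(C):
    """Lower-triangular L with L L^T = C (C symmetric positive definite)."""
    n = len(C)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(C[i][i] - s)
            else:
                L[i][j] = (C[i][j] - s) / L[j][j]
    return L

def lognormal_field(n=20, dz=1.0, corr_len=5.0, mean_log=3.0, var_log=0.09):
    """One realisation of a log-normal elastic modulus profile:
    exp(mean + spatially correlated Gaussian fluctuation)."""
    L = cholesky(exp_cov(n, dz, corr_len, var_log))
    z = [random.gauss(0.0, 1.0) for _ in range(n)]
    g = [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(n)]
    return [math.exp(mean_log + gi) for gi in g]
```

Exponentiating the Gaussian field guarantees a positive modulus at every depth, which is the main reason the log-normal assumption is used for material properties.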

Relevance: 100.00%

Abstract:

Statistically planar turbulent partially premixed flames for different initial intensities of decaying turbulence have been simulated for global equivalence ratios of 0.7 and 1.0 using three-dimensional, simplified-chemistry direct numerical simulations (DNS). The simulation parameters are chosen such that the flames represent combustion in the thin reaction zones regime. A random bimodal distribution of equivalence ratio is introduced in the unburned gas ahead of the flame to account for mixture inhomogeneity. The results suggest that the probability density function (PDF) of the mixture fraction gradient magnitude |∇ξ| (i.e., P(|∇ξ|)) can be reasonably approximated by a log-normal distribution. However, this presumed PDF captures only the qualitative behaviour of the PDF of the reaction progress variable gradient magnitude |∇c| (i.e., P(|∇c|)). It has been found that a bivariate log-normal distribution does not sufficiently capture the quantitative behaviour of the joint PDF of |∇ξ| and |∇c| (i.e., P(|∇ξ|, |∇c|)), and the agreement with the DNS data is poor in certain regions of the flame brush, particularly toward the burned gas side. Moreover, |∇ξ| and |∇c| show appreciable correlation toward the burned gas side of the flame brush. These findings are corroborated further using DNS data of a lifted jet flame to study the flame-geometry dependence of these statistics.
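
A presumed log-normal PDF of the kind tested above is usually constructed by matching the first two moments of the gradient-magnitude sample; a minimal sketch with illustrative moment values:

```python
import math

def presumed_lognormal_pdf(m, v):
    """Presumed-PDF closure: log-normal matched to the first two moments
    (mean m, variance v) of a gradient-magnitude sample."""
    s2 = math.log(1.0 + v / m**2)
    mu = math.log(m) - 0.5 * s2
    def pdf(x):
        return math.exp(-(math.log(x) - mu) ** 2 / (2.0 * s2)) / (
            x * math.sqrt(2.0 * math.pi * s2))
    return pdf

pdf = presumed_lognormal_pdf(m=1.0, v=0.5)
# crude midpoint-rule check that the density integrates to ~1
area = sum(pdf((i + 0.5) * 0.01) * 0.01 for i in range(3000))
```

In an actual closure, m and v would come from transported or modelled moments of the scalar gradient; the abstract's point is that this construction works reasonably for the mixture fraction gradient but not, quantitatively, for the joint statistics.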

Relevance: 100.00%

Abstract:

In regression analysis of counts, a lack of simple and efficient algorithms for posterior computation has made Bayesian approaches appear unattractive and thus underdeveloped. We propose a lognormal and gamma mixed negative binomial (NB) regression model for counts and present efficient closed-form Bayesian inference; unlike conventional Poisson models, the proposed approach has two free parameters to accommodate two different kinds of random effects, and allows the incorporation of prior information, such as sparsity in the regression coefficients. By placing a gamma prior on the NB dispersion parameter r and connecting a log-normal prior with the logit of the NB probability parameter p, efficient Gibbs sampling and variational Bayes inference are both developed. The closed-form updates are obtained by exploiting conditional conjugacy via both a compound Poisson representation and a Pólya-Gamma data augmentation approach. The proposed Bayesian inference can be implemented routinely, while being easily generalizable to more complex settings involving multivariate dependence structures. The algorithms are illustrated using real examples.
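
Only the gamma-mixed (compound) representation of the NB that underlies such models is sketched here, not the Gibbs/VB machinery; under the parameterisation mean = r(1−p)/p, a Poisson draw with a gamma-distributed rate is marginally NB (r and p values are illustrative):

```python
import math, random

def poisson(lam):
    """Poisson draw via Knuth's multiplication method."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

random.seed(7)
r, p_nb = 2.0, 0.5  # NB dispersion r and probability parameter p

# NB(r, p) as a gamma-mixed Poisson: rate ~ Gamma(shape=r, scale=(1-p)/p)
draws = [poisson(random.gammavariate(r, (1 - p_nb) / p_nb))
         for _ in range(20000)]
mean = sum(draws) / len(draws)  # theoretical mean: r*(1-p)/p = 2.0
```

Making the mixing rate itself random (the "two free parameters" above) is what lets the model absorb two distinct kinds of overdispersion.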

Relevance: 100.00%

Abstract:

The kinetics of the recovery of the photoinduced transient bleaching of colloidal CdS in the presence of different electron acceptors are examined. In the presence of the zwitterionic viologen N,N'-dipropyl-2,2'-bipyridinium disulphonate, excitation of colloidal CdS at different flash intensities generates a series of decay profiles which superimpose when normalized. The shape of the decay curves is as predicted by a first-order activation-controlled model for a log-normal distribution of particle sizes. In contrast, variation of the flash intensity in the presence of a second viologen, N,N'-dipropyl-4,4'-bipyridinium sulphonate, generates normalized decay traces which broaden with increasing flash intensity. This behaviour is predicted by a zero-order diffusion-controlled model for a log-normal distribution of particle radii. The photoreduction of a number of other oxidants sensitized by colloidal CdS is examined, and the shapes of the decay kinetics are interpreted via either the first- or zero-order kinetic model. The rate constants and activation energies derived using these models are consistent with the values expected for an activation- or diffusion-controlled reaction.
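
The ensemble-averaging idea behind the first-order model can be sketched by averaging exponential decays over a log-normal distribution of rate constants; here the rate constants are drawn log-normally directly as a stand-in for the underlying particle-size distribution (the size-to-rate mapping is not specified in the abstract, and all parameter values are illustrative):

```python
import math, random

random.seed(3)
# log-normal ensemble of first-order rate constants (arbitrary units)
ks = [random.lognormvariate(0.0, 0.5) for _ in range(5000)]

def ensemble_decay(t):
    """Ensemble-averaged first-order decay, normalized to 1 at t = 0."""
    return sum(math.exp(-k * t) for k in ks) / len(ks)

curve = [ensemble_decay(0.2 * i) for i in range(20)]
```

Because every particle decays exponentially, the normalized ensemble curve is independent of the initial amplitude, which is why first-order profiles at different flash intensities superimpose when normalized.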

Relevance: 100.00%

Abstract:

The exposure to dust and polynuclear aromatic hydrocarbons (PAH) of 15 truck drivers from Geneva, Switzerland, was measured. The drivers were divided into "long-distance" and "local" drivers and into smokers and nonsmokers, and were compared with a control group of 6 office workers, also divided into smokers and nonsmokers. Dust was measured on one workday both by a direct-reading instrument and by sampling. The local drivers showed higher exposure to dust (0.3 mg/m3) and PAH than the long-distance drivers (0.1 mg/m3), who showed no difference from the control group. This observation may be due to the fact that the local drivers spend more time in more polluted areas, such as streets with heavy traffic and construction sites, than do the long-distance drivers. Smoking does not influence the exposure to dust and PAH of professional truck drivers, as measured in this study, probably because the ventilation rate of the truck cabins is relatively high even on cold days (11-15 r/h). The distribution of dust concentrations was shown in some cases to be quite different from the expected log-normal distribution. The contribution of diesel exhaust to these exposures could not be estimated since no specific tracer was used. However, the relatively low level of dust exposure does not support the hypothesis that present-day levels of diesel exhaust particulates play a significant role in the excess occurrence of lung cancer observed in professional truck drivers.

Relevance: 100.00%

Abstract:

An extensive experimental and simulation study is carried out on conventional magnetorheological fluids formulated by dispersing mixtures of carbonyl iron particles of different sizes in Newtonian carriers. Apparent yield stress data are reported for a wide range of polydispersity indexes (PDI), from PDI = 1.63 to PDI = 3.31, which for a log-normal distribution corresponds to a wide range of standard deviations of the particle size distribution. These results demonstrate that the effect of polydispersity is negligible in this range in spite of the very different microstructures exhibited. Experimental data in the magnetic saturation regime are in good quantitative agreement with particle-level simulations under the assumption of dipolar magnetostatic forces. The insensitivity of the yield stresses to the polydispersity can be understood from the interplay between the particle cluster size distribution and the packing density of particles inside the clusters.

Relevance: 100.00%

Abstract:

A retrospective assessment of exposure to benzene was carried out for a nested case control study of lympho-haematopoietic cancers, including leukaemia, in the Australian petroleum industry. Each job or task in the industry was assigned a Base Estimate (BE) of exposure derived from task-based personal exposure assessments carried out by the company occupational hygienists. The BEs corresponded to the estimated arithmetic mean exposure to benzene for each job or task and were used in a deterministic algorithm to estimate the exposure of subjects in the study. Nearly all of the data sets underlying the BEs were found to contain some values below the limit of detection (LOD) of the sampling and analytical methods and some were very heavily censored; up to 95% of the data were below the LOD in some data sets. It was necessary, therefore, to use a method of calculating the arithmetic mean exposures that took into account the censored data. Three different methods were employed in an attempt to select the most appropriate method for the particular data in the study. A common method is to replace the missing (censored) values with half the detection limit. This method has been recommended for data sets where much of the data are below the limit of detection or where the data are highly skewed; with a geometric standard deviation of 3 or more. Another method, involving replacing the censored data with the limit of detection divided by the square root of 2, has been recommended when relatively few data are below the detection limit or where data are not highly skewed. A third method that was examined is Cohen's method. This involves mathematical extrapolation of the left-hand tail of the distribution, based on the distribution of the uncensored data, and calculation of the maximum likelihood estimate of the arithmetic mean. When these three methods were applied to the data in this study it was found that the first two simple methods give similar results in most cases. 
Cohen's method, on the other hand, gave results that were generally, but not always, higher than the simpler methods, and in some cases gave extremely high and even implausible estimates of the mean. It appears that if the data deviate substantially from a simple log-normal distribution, particularly if high outliers are present, then Cohen's method produces erratic and unreliable estimates. After examining these results, together with the distributions and proportions of censored data, it was decided that the half-limit-of-detection method was the most suitable in this particular study.
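
The two substitution methods compared above can be sketched as follows (the benzene readings and LOD are illustrative; None marks a censored, below-LOD reading):

```python
import math

def mean_substitute(values, lod, divisor):
    """Arithmetic mean after replacing censored readings (None, i.e.
    below the limit of detection) with lod/divisor."""
    filled = [v if v is not None else lod / divisor for v in values]
    return sum(filled) / len(filled)

# hypothetical benzene measurements (ppm); LOD = 0.1
data = [0.32, None, 0.15, None, None, 0.51, 0.22, None]
m_half = mean_substitute(data, lod=0.1, divisor=2.0)           # LOD/2
m_sqrt2 = mean_substitute(data, lod=0.1, divisor=math.sqrt(2))  # LOD/sqrt(2)
```

As the study found, the two substitutions give similar means when only a modest fraction of the data is censored; Cohen's maximum likelihood extrapolation of the censored tail is the step that becomes unstable when the data are not close to log-normal.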