945 results for Probability Distribution Function
Abstract:
The two-parameter Birnbaum-Saunders distribution has been used successfully to model fatigue failure times. Although censoring is typical in reliability and survival studies, little work has been published on the analysis of censored data for this distribution. In this paper, we address the issue of performing testing inference on the two parameters of the Birnbaum-Saunders distribution under type-II right censored samples. The likelihood ratio statistic and a recently proposed statistic, the gradient statistic, provide a convenient framework for statistical inference in this setting, since they do not require one to obtain, estimate, or invert an information matrix, which is an advantage in problems involving censored data. An extensive Monte Carlo simulation study is carried out to investigate and compare the finite-sample performance of the likelihood ratio and gradient tests. Our numerical results show evidence that the gradient test should be preferred. Further, we also consider the generalized Birnbaum-Saunders distribution under type-II right censored samples and present some Monte Carlo simulations for testing the parameters in this class of models using the likelihood ratio and gradient tests. Three empirical applications are presented.
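The gradient statistic referred to above is attractive precisely because it needs only the score evaluated at the null and the unrestricted estimate. Below is a minimal numerical sketch of both statistics for a complete (uncensored) Birnbaum-Saunders sample, testing H0: (alpha, beta) = (alpha0, beta0) jointly; the sampling scheme, parameter values, and use of numerical derivatives are illustrative assumptions, not the paper's censored-data implementation.

```python
import numpy as np
from scipy.optimize import minimize, approx_fprime

rng = np.random.default_rng(1)

def bs_loglik(theta, t):
    """Birnbaum-Saunders log-likelihood for a complete sample t."""
    a, b = theta
    if a <= 0 or b <= 0:
        return -np.inf
    z2 = (t / b + b / t - 2.0) / a**2          # squared standardizing transform
    return np.sum(np.log(t**-1.5 * (t + b)) - np.log(2 * a * np.sqrt(b))
                  - 0.5 * np.log(2 * np.pi) - 0.5 * z2)

# simulate via the normal representation T = b*(a*Z/2 + sqrt((a*Z/2)^2 + 1))^2
a_true, b_true, n = 0.5, 2.0, 100
w = a_true * rng.standard_normal(n) / 2.0
t = b_true * (w + np.sqrt(w**2 + 1.0))**2

theta0 = np.array([0.5, 2.0])                  # null hypothesis value
fit = minimize(lambda th: -bs_loglik(th, t), theta0, method='Nelder-Mead')
theta_hat = fit.x                              # unrestricted MLE

lr = 2.0 * (bs_loglik(theta_hat, t) - bs_loglik(theta0, t))    # likelihood ratio
score0 = approx_fprime(theta0, lambda th: bs_loglik(th, t), 1e-6)
grad_stat = score0 @ (theta_hat - theta0)      # gradient statistic U(theta0)'(hat - null)
print(lr, grad_stat)                           # both ~ chi-square(2) under H0
```

Neither statistic requires forming or inverting the information matrix, which is the practical advantage the abstract highlights for censored data.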
Abstract:
In this paper we present an extension of the generalized Birnbaum-Saunders distribution family introduced in [Diaz-Garcia, J.A., Leiva-Sanchez, V., 2005. A new family of life distributions based on the elliptically contoured distributions. Journal of Statistical Planning and Inference 128 (2), 445-457] with a view to making it even more flexible in terms of its kurtosis coefficient. Properties involving moments and asymmetry and kurtosis indexes are studied for some special members of this family, such as the slash Birnbaum-Saunders and slash-t Birnbaum-Saunders. Simulation studies for some particular cases and a real data analysis are also reported, illustrating the usefulness of the extension considered.
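In the generalized family of Diaz-Garcia and Leiva-Sanchez, the standard normal variate in the usual Birnbaum-Saunders representation is replaced by a symmetric variate; a slash Birnbaum-Saunders draw can then be sketched by plugging in the heavy-tailed slash variate Z/U^(1/q). A hedged sketch only: the tail parameter q and all values below are assumptions for illustration.

```python
import numpy as np

def rslash_bs(n, alpha, beta, q, rng=None):
    """Sample a slash Birnbaum-Saunders distribution: the N(0,1) variate in the
    BS representation is replaced by the slash variate Z / U**(1/q)."""
    rng = rng or np.random.default_rng()
    z = rng.standard_normal(n)
    u = rng.uniform(size=n)
    s = z / u**(1.0 / q)                  # slash variate; smaller q -> heavier tails
    w = alpha * s / 2.0
    return beta * (w + np.sqrt(w**2 + 1.0))**2

sample = rslash_bs(10_000, alpha=0.5, beta=1.0, q=2.0)
```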
Abstract:
In this article, we consider local influence analysis for the skew-normal linear mixed model (SN-LMM). As the observed-data log-likelihood associated with the SN-LMM is intractable, Cook's well-known approach cannot be applied to obtain measures of local influence. Instead, we develop local influence measures following the approach of Zhu and Lee (2001). This approach is based on the use of an EM-type algorithm and is measurement invariant under reparametrizations. Four specific perturbation schemes are discussed. Results obtained for a simulated data set and a real data set are reported, illustrating the usefulness of the proposed methodology.
Abstract:
In this article, we study a new class of nonnegative distributions generated by symmetric distributions around zero. For the special case of the distribution generated using the normal distribution, properties such as the moment generating function, stochastic representation, reliability connections, and inference aspects using the methods of moments and maximum likelihood are studied. Moreover, a real data set is analyzed, illustrating that good fits can result.
Abstract:
In this paper we introduce the Weibull power series (WPS) class of distributions, which is obtained by compounding Weibull and power series distributions, where the compounding procedure follows the same approach previously carried out by Adamidis and Loukas (1998). This new class of distributions has as a particular case the two-parameter exponential power series (EPS) class of distributions (Chahkandi and Ganjali 2009), which contains several lifetime models such as the exponential geometric (Adamidis and Loukas 1998), exponential Poisson (Kus 2007) and exponential logarithmic (Tahmasbi and Rezaei 2008) distributions. The hazard function of our class can be increasing, decreasing and upside-down bathtub shaped, among others, while the hazard function of an EPS distribution is only decreasing. We obtain several properties of the WPS distributions, such as moments, order statistics, estimation by maximum likelihood and inference for a large sample. Furthermore, the EM algorithm is also used to determine the maximum likelihood estimates of the parameters, and we discuss maximum entropy characterizations under suitable constraints. Special distributions are studied in some detail. Applications to two real data sets are given to show the flexibility and potentiality of the new class of distributions.
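One concrete member of the WPS class is the Weibull-geometric distribution: the lifetime is the minimum of N independent Weibull variables, with N drawn from a geometric law (a power series distribution). A minimal sampling sketch under that construction follows; parameter values are illustrative assumptions.

```python
import numpy as np

def rweibull_geometric(size, shape, scale, p, rng=None):
    """Weibull-geometric draws: T = min(X_1, ..., X_N) with
    X_i ~ Weibull(shape, scale) and N ~ geometric(p) on {1, 2, ...}."""
    rng = rng or np.random.default_rng()
    out = np.empty(size)
    for i in range(size):
        n = rng.geometric(p)                          # number of competing risks
        out[i] = scale * rng.weibull(shape, n).min()  # minimum of n Weibull lifetimes
    return out

t = rweibull_geometric(5_000, shape=0.8, scale=1.0, p=0.3)
```

With shape < 1 the compounded hazard can take non-monotone forms, which is the added flexibility over the EPS subclass noted in the abstract.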
Abstract:
There are several versions of the lognormal distribution in the statistical literature; one is based on the exponential transformation of the generalized normal (GN) distribution. This paper presents the Bayesian analysis of the generalized lognormal distribution (logGN), considering independent non-informative Jeffreys priors for the parameters, as well as the procedure for implementing the Gibbs sampler to obtain the posterior distributions of the parameters. The results are used to analyze failure-time models with right-censored and uncensored data. The proposed method is illustrated using actual computer failure-time data.
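Closed-form full conditionals for the logGN parameters are not obvious, so the sketch below uses random-walk Metropolis updates for each parameter in turn (Metropolis-within-Gibbs) rather than the paper's exact Gibbs steps; the vague priors, proposal scales, and synthetic data are all assumptions.

```python
import numpy as np
from scipy.special import gammaln

def gn_loglik(y, mu, sigma, s):
    """Generalized-normal log-likelihood for y = log(failure times)."""
    return np.sum(np.log(s) - np.log(2.0 * sigma) - gammaln(1.0 / s)
                  - (np.abs(y - mu) / sigma)**s)

def log_post(y, mu, sigma, s):
    if sigma <= 0.0 or s <= 0.0:
        return -np.inf
    # vague priors, flat on (mu, log sigma, log s) -- an assumption, not Jeffreys exactly
    return gn_loglik(y, mu, sigma, s) - np.log(sigma) - np.log(s)

def sample_posterior(y, n_iter=5000, step=0.1, seed=0):
    rng = np.random.default_rng(seed)
    cur = [np.median(y), y.std(), 2.0]          # crude starting values
    chain = np.empty((n_iter, 3))
    for it in range(n_iter):
        for j in range(3):                      # update mu, sigma, s one at a time
            cand = cur.copy()
            cand[j] += step * rng.standard_normal()
            if log_post(y, *cand) - log_post(y, *cur) > np.log(rng.uniform()):
                cur = cand
        chain[it] = cur
    return chain

rng0 = np.random.default_rng(1)
y = np.log(rng0.lognormal(mean=1.0, sigma=0.5, size=50))   # synthetic "failure times"
chain = sample_posterior(y)
```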
Abstract:
Solutions to combinatorial optimization problems, such as problems of locating facilities, frequently rely on heuristics to minimize the objective function. The optimum is sought iteratively and a criterion is needed to decide when the procedure (almost) attains it. Pre-setting the number of iterations dominates in OR applications, which implies that the quality of the solution cannot be ascertained. A small, almost dormant, branch of the literature suggests using statistical principles to estimate the minimum and its bounds as a tool to decide upon stopping and to evaluate the quality of the solution. In this paper we examine the functioning of statistical bounds obtained from four different estimators by using simulated annealing on p-median test problems taken from Beasley's OR-library. We find the Weibull estimator and the 2nd-order Jackknife estimator preferable, and the required sample size to be about 10, much less than the current recommendation. However, reliable statistical bounds are found to depend critically on a sample of high-quality heuristic solutions, and we give a simple statistic useful for checking the quality. We end the paper with an illustration of using statistical bounds in a problem of locating some 70 distribution centres of the Swedish Post in one Swedish region.
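The Weibull estimator mentioned here exploits the fact that the best objective values from repeated heuristic runs behave approximately like a Weibull sample whose location parameter is the unknown minimum. A sketch using a three-parameter Weibull fit; the data are synthetic placeholders, not the paper's p-median results.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# stand-in for objective values from repeated simulated-annealing runs
vals = 1000.0 + 25.0 * rng.weibull(1.5, size=10)

# fit a three-parameter Weibull; the fitted location parameter is the point
# estimate of the global minimum, bounded above by the best solution found
shape, loc, scale = stats.weibull_min.fit(vals)
print('estimated minimum:', loc, 'best solution found:', vals.min())
```

The gap between the estimated minimum and the best observed value is what the paper's statistical bounds formalize as a stopping and quality check.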
Abstract:
The aim of this paper is to evaluate the performance of two divergent methods for delineating commuting regions, also called labour market areas, in a situation where the base spatial units differ greatly in size as a result of an irregular population distribution. Commuting patterns in Sweden have been analyzed with geographical information system technology by delineating commuting regions using two regionalization methods. One, a rule-based method, uses one-way commuting flows to delineate local labour market areas in a top-down procedure based on the selection of predefined employment centres. The other method, the interaction-based Intramax analysis, uses two-way flows in a bottom-up procedure based on numerical taxonomy principles. A comparison of these methods exposes a number of strengths and weaknesses. For both methods, the same data source has been used. The performance of both methods has been evaluated for the country as a whole using resident employed population, self-containment levels and job ratios as criteria. A more detailed evaluation has been done in the Goteborg metropolitan area by comparing regional patterns with the commuting fields of a number of urban centres in this area. It is concluded that both methods could benefit from the inclusion of additional control measures to identify improper allocations of municipalities.
Abstract:
Drinking water distribution networks are exposed to the risk of malicious or accidental contamination. Several levels of response are conceivable. One of them consists of installing a sensor network to monitor the system in real time. Once contamination has been detected, it is also important to take appropriate counter-measures. In the SMaRT-OnlineWDN project, this relies on modeling to predict both hydraulics and water quality. Online use of the model makes identification of the contaminant source and simulation of the contaminated area possible. The objective of this paper is to present SMaRT-OnlineWDN experience and research results for hydraulic state estimation with a sampling frequency of a few minutes. A least squares problem with bound constraints is formulated to adjust demand class coefficients to best fit the observed values at a given time. The criterion is a Huber function to limit the influence of outliers. A Tikhonov regularization is introduced to take prior information on the parameter vector into account. The Levenberg-Marquardt algorithm, which uses derivative information to limit the number of iterations, is then applied. Confidence intervals for the state prediction are also given. The results are presented and discussed for real networks in France and Germany.
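The estimation pipeline described (bound-constrained least squares, Huber criterion, Tikhonov term, derivative-based solver) maps almost directly onto scipy.optimize.least_squares. Note that the classical Levenberg-Marquardt method there supports neither bounds nor robust losses, so this sketch substitutes the trust-region-reflective solver as the bounded analogue; the toy linear "hydraulic model" and all numbers are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(3)
A = rng.uniform(0.5, 1.5, size=(40, 5))       # toy sensitivity of measurements to demands
c_true = np.array([1.0, 0.8, 1.2, 0.9, 1.1])  # "true" demand class coefficients
obs = A @ c_true + 0.05 * rng.standard_normal(40)
obs[::10] += 1.0                              # a few outliers for the Huber loss to absorb

c_prior, lam = np.ones(5), 0.1                # Tikhonov prior value and weight

def residuals(c):
    misfit = A @ c - obs                      # model-vs-measurement residuals
    reg = np.sqrt(lam) * (c - c_prior)        # regularization pulls toward the prior
    return np.concatenate([misfit, reg])

res = least_squares(residuals, x0=c_prior, bounds=(0.0, 2.0),
                    loss='huber', f_scale=0.1)
print(res.x)
```

Stacking the regularization term as extra residuals is the standard way to express a Tikhonov penalty inside a least-squares solver.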
Abstract:
Hydrological loss is a vital component in many hydrological models, which are used in forecasting floods and evaluating water resources for both surface and subsurface flows. Due to the complex and random nature of the rainfall-runoff process, hydrological losses are not yet fully understood. Consequently, practitioners often use representative values of the losses for design applications such as rainfall-runoff modelling, which has led to inaccurate quantification of water quantities in the resulting applications. The existing hydrological loss models must be revisited, and modellers should be encouraged to utilise other available data sets. This study is based on three unregulated catchments situated in the Mt. Lofty Ranges of South Australia (SA). The paper focuses on conceptual models relating initial loss (IL), continuing loss (CL) and proportional loss (PL) to rainfall characteristics (total rainfall (TR) and storm duration (D)) and antecedent wetness (AW) conditions. The paper introduces two methods that can be implemented to estimate IL as a function of TR, D and AW. The IL distribution patterns and parameters for the study catchments are determined using multivariate analysis and descriptive statistics. The possibility of generalising the methods, and the limitations of doing so, are also discussed. This study will yield improvements to existing loss models and will encourage practitioners to utilise multiple data sets to estimate losses, instead of using hypothetical or representative values to generalise real situations.
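As a reference point for the loss concepts discussed, here is a minimal initial loss / continuing loss (IL/CL) rainfall-excess sketch; the threshold and rate values are illustrative, not calibrated values for the Mt. Lofty Ranges catchments.

```python
import numpy as np

def il_cl_excess(rain, il=15.0, cl=2.5):
    """Rainfall excess (mm per step) under an IL/CL loss model: no runoff until
    cumulative rain exceeds il; afterwards cl mm is removed from each step."""
    excess = np.zeros(len(rain))
    cum = 0.0
    for i, r in enumerate(rain):
        eff = max(0.0, cum + r - il) - max(0.0, cum - il)  # rain surviving the IL
        cum += r
        excess[i] = max(0.0, eff - cl)                     # continuing loss per step
    return excess

storm = np.array([2.0, 8.0, 12.0, 6.0, 3.0])   # hypothetical storm hyetograph (mm)
print(il_cl_excess(storm))
```

The paper's contribution is to replace fixed values such as il and cl above with functions of TR, D and AW estimated from data.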
Abstract:
This paper investigates the income inequality generated by a job-search process when different cohorts of homogeneous workers are allowed to have different degrees of impatience. Using the fact that the average wage under the invariant Markovian distribution is a decreasing function of the discount factor (Cysne (2004, 2006)), I show that the Lorenz curve and the between-cohort Gini coefficient of income inequality can be easily derived in this case. An example with arbitrary measures regarding the wage offers and the distribution of time preferences among cohorts provides some insights into how much income inequality can be generated, and into how it varies as a function of the probability of unemployment and of the probability that the worker does not find a job offer each period.
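For a discrete invariant wage distribution, the Lorenz curve and Gini coefficient the abstract mentions reduce to cumulative sums over sorted wages. A generic sketch; the wage grid and probabilities are placeholders, not the paper's Markovian distribution.

```python
import numpy as np

def lorenz_gini(w, p):
    """Lorenz curve points and Gini coefficient for wages w with probabilities p."""
    order = np.argsort(w)
    w, p = np.asarray(w, float)[order], np.asarray(p, float)[order]
    x = np.concatenate([[0.0], np.cumsum(p)])                      # population share
    y = np.concatenate([[0.0], np.cumsum(w * p)]) / np.sum(w * p)  # income share
    gini = 1.0 - np.sum((x[1:] - x[:-1]) * (y[1:] + y[:-1]))       # trapezoid rule
    return x, y, gini

_, _, g = lorenz_gini(w=[1.0, 2.0, 4.0, 8.0], p=[0.4, 0.3, 0.2, 0.1])
print(g)
```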
Abstract:
This paper investigates the income inequality generated by a job-search process when different cohorts of homogeneous workers are allowed to have different degrees of impatience. Using the fact that the average wage under the invariant Markovian distribution is a decreasing function of the time preference (Cysne (2004)), I show that the Lorenz curve and the between-cohort Gini coefficient of income inequality can be easily derived in this case. An example with arbitrary measures regarding the wage offers and the distribution of time preferences among cohorts provides some quantitative insights into how much income inequality can be generated, and into how it varies as a function of the probability of unemployment and of the probability that the worker does not find a job offer each period.
Abstract:
This paper analyzes the distribution of money holdings in a commodity money search-based model with intermediation. Introducing heterogeneity of costs into the Kiyotaki and Wright (1989) model, Cavalcanti and Puzzello (2010) give rise to a non-degenerate distribution of money. We extend this model further by introducing intermediation in the trading process. We show that the distribution of money matters for savings decisions. This gives rise to a fixed point problem for the saving function that makes finding the optimal solution difficult. Through some examples, we show that this friction shrinks the distribution of money. In contrast to the Cavalcanti and Puzzello (2010) model, the optimal solution may not award the entire surplus to the consumer. At the end of the paper, we present a strong result: for a sufficiently large number of intermediaries, the distribution of money is degenerate.
Abstract:
According to Diamond (1977), one of the reasons for the existence of social security systems is that they function as an income redistribution mechanism. There is an extensive literature testing whether social security systems produce the desired results in developed countries (mainly the U.S.A.). Nevertheless, there is no obvious consensus about this social security property, and there is little evidence for developing countries. In this article, we test this property for the Brazilian Social Security System. In addition, we also look at another question which has not been answered in the previous literature: is the trend of social security systems increasingly progressive or regressive? We conclude that the changes in Brazilian Social Security legislation reduced inequality between 1987 and 1996, but only for the elderly; for the other age groups, the trend is stable. Results for the period between 1996 and 2006 reveal that the Brazilian system is neutral for all cohorts. Therefore, we find that social security systems are not an effective mechanism for income redistribution, as predicted by previous studies.
Abstract:
This paper analyzes a general equilibrium model with two periods, several households and a government that has to finance some expenditures in the first period. Households may have some private information, either about their type (adverse selection) or about some action level chosen in the first period that affects the probability of certain states of nature in the second period (moral hazard). Trade of financial assets is intermediated by a finite collection of banks. Banks' objective functions are determined in equilibrium by shareholders. Due to private information, it may be optimal for the banks to introduce constraints on the set of available portfolios for each household, as well as household-specific asset prices. In particular, households may face distinct interest rates for holding the risk-free asset. The government finances its expenditures either by taxing households in the first period or by issuing bonds in the first period and taxing households in the second period. Taxes may be state-dependent. Suppose government policies are neutral: i) government policies do not affect the distribution of wealth across households; and ii) if the government decides to tax a household in the second period, there is a portfolio available to the banks that generates the same payoff in each state of nature as the household's taxes. Then, Ricardian equivalence holds if and only if an appropriate boundary condition is satisfied. Moreover, at every free-entry equilibrium the boundary condition is satisfied, and thus Ricardian equivalence holds. These results do not require any particular assumption on the banks' objective function. In particular, we do not assume banks to be risk neutral.