374 results for Weibull-jakauma
Abstract:
The England and Wales precipitation (EWP) dataset is a homogeneous time series of daily accumulations from 1931 to 2014, composed from rain gauge observations spanning the region. The daily regional-average precipitation statistics are shown to be well described by a Weibull distribution, which is used to define extremes in terms of percentiles. Computed trends in annual and seasonal precipitation are sensitive to the period chosen, due to large variability on interannual and decadal timescales. Atmospheric circulation patterns associated with seasonal precipitation variability are identified. These patterns project onto known leading modes of variability, all of which involve displacements of the jet stream and storm track over the eastern Atlantic. The intensity of daily precipitation for each calendar season is investigated by partitioning all observations into eight intensity categories contributing equally to the total precipitation in the dataset. Contrary to previous results based on shorter periods, no significant trends in the most intense categories are found between 1931 and 2014. The regional-average precipitation is found to share statistical properties common to the majority of individual stations across England and Wales used in previous studies. Statistics of the EWP data are examined for multi-day accumulations up to 10 days, which are more relevant for river flooding. Four recent years (2000, 2007, 2008 and 2012) have a greater number of extreme events in the 3- and 5-day accumulations than any previous year in the record. It is the duration of precipitation events in these years that is remarkable, rather than the magnitude of the daily accumulations.
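A minimal sketch of the approach this abstract describes, fitting a Weibull distribution to daily totals and reading off a percentile-based extreme threshold; the data below are synthetic stand-ins, not the EWP series:

```python
# Fit a Weibull distribution to daily precipitation totals and define an
# "extreme" threshold as a high percentile, as the abstract describes.
# Synthetic data; floc=0 pins the location at zero since totals are non-negative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rain = rng.weibull(0.8, size=10_000) * 5.0   # stand-in for wet-day totals (mm)

shape, loc, scale = stats.weibull_min.fit(rain, floc=0)
p99 = stats.weibull_min.ppf(0.99, shape, loc=loc, scale=scale)
print(f"shape={shape:.2f}, scale={scale:.2f} mm, 99th percentile={p99:.1f} mm")
```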
Abstract:
In this paper, we consider some non-homogeneous Poisson models to estimate the probability that an air quality standard is exceeded a given number of times in a time interval of interest. We assume that the number of exceedances occurs according to a non-homogeneous Poisson process (NHPP). This Poisson process has rate function lambda(t), t >= 0, which depends on some parameters that must be estimated. We take into account two cases of rate functions: the Weibull and the Goel-Okumoto. We consider models with and without change-points. When the presence of change-points is assumed, we may have one, two or three change-points, depending on the data set. The parameters of the rate functions are estimated using a Gibbs sampling algorithm. Results are applied to ozone data provided by the Mexico City monitoring network. In the first instance, we assume that no change-points are present. Depending on the fit of the model, we then assume the presence of either one, two or three change-points. Copyright (C) 2009 John Wiley & Sons, Ltd.
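For orientation, the count probabilities such NHPP models produce follow directly from the mean function m(t): the number of exceedances in [0, T] is Poisson with mean m(T). A sketch under illustrative parameter values, not estimates from the ozone data:

```python
# Exceedance-count probabilities for an NHPP: N(T) ~ Poisson(m(T)).
# Weibull mean function: m(t) = (t/sigma)**beta.
# Goel-Okumoto mean function: m(t) = a * (1 - exp(-b*t)).
import math

def m_weibull(t, sigma, beta):
    return (t / sigma) ** beta

def m_goel_okumoto(t, a, b):
    return a * (1.0 - math.exp(-b * t))

def prob_k_exceedances(k, mean):      # Poisson probability of exactly k events
    return math.exp(-mean) * mean**k / math.factorial(k)

T = 180.0                             # days; all parameters illustrative
for name, mean in [("Weibull", m_weibull(T, sigma=30.0, beta=0.9)),
                   ("Goel-Okumoto", m_goel_okumoto(T, a=8.0, b=0.01))]:
    print(name, [round(prob_k_exceedances(k, mean), 3) for k in range(3)])
```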
Abstract:
In this paper we introduce a parametric model for handling lifetime data where an early lifetime can be related either to infant-mortality failure or to wear processes, but we do not know which risk is responsible for the failure. The maximum likelihood approach and the sampling-based approach are used to obtain the inferences of interest. Some special cases of the proposed model are studied via Monte Carlo methods for the size and power of hypothesis tests. To illustrate the proposed methodology, we present an example based on a real data set.
Abstract:
In this paper, we introduce a Bayesian analysis for multivariate survival data in the presence of a covariate vector and censored observations. Different "frailties" or latent variables are considered to capture the correlation among the survival times for the same individual. We assume Weibull or generalized gamma distributions for the right-censored lifetime data. We develop the Bayesian analysis using Markov chain Monte Carlo (MCMC) methods.
Abstract:
In this paper, we consider the problem of estimating the number of times an air quality standard is exceeded in a given period of time. A non-homogeneous Poisson model is proposed to analyse this issue. The rate at which the Poisson events occur is given by a rate function lambda(t), t >= 0. This rate function also depends on some parameters that need to be estimated. Two forms of lambda(t), t >= 0 are considered: one of the Weibull form and the other of the exponentiated-Weibull form. Parameter estimation is carried out using a Bayesian formulation based on the Gibbs sampling algorithm. The prior distributions for the parameters are assigned in two stages. In the first stage, non-informative prior distributions are considered. Using the information provided by the first stage, more informative prior distributions are used in the second. The theoretical development is applied to data provided by the monitoring network of Mexico City. The rate function that best fits the data varies according to the region of the city and/or the threshold that is considered. In some cases the best fit is the Weibull form and in other cases the best option is the exponentiated-Weibull. Copyright (C) 2007 John Wiley & Sons, Ltd.
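The paper's estimation uses a Gibbs sampler with the two-stage priors described above; as a hedged stand-in, the sketch below samples the Weibull-rate parameters with random-walk Metropolis and flat priors on the log scale, using simulated exceedance dates rather than the Mexico City data:

```python
# NHPP log-likelihood for event times t_1..t_n in [0, T]:
#   sum_i log lambda(t_i) - m(T),
# with lambda(t) = (beta/sigma)*(t/sigma)**(beta-1) and m(T) = (T/sigma)**beta.
# Metropolis is a stand-in here, not the authors' Gibbs sampler.
import numpy as np

rng = np.random.default_rng(1)
T = 365.0
times = np.sort(rng.uniform(0, T, size=40))       # stand-in exceedance dates

def log_post(log_beta, log_sigma):
    beta, sigma = np.exp(log_beta), np.exp(log_sigma)
    loglam = np.log(beta / sigma) + (beta - 1) * np.log(times / sigma)
    return loglam.sum() - (T / sigma) ** beta     # flat priors on the log scale

theta = np.array([0.0, np.log(30.0)])             # (log beta, log sigma)
samples = []
for _ in range(5000):
    prop = theta + rng.normal(0, 0.1, size=2)     # random-walk proposal
    if np.log(rng.uniform()) < log_post(*prop) - log_post(*theta):
        theta = prop
    samples.append(np.exp(theta))
print("posterior means (beta, sigma):", np.mean(samples[1000:], axis=0))
```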
Abstract:
In this paper we propose a new lifetime distribution which can handle bathtub-shaped, unimodal, increasing and decreasing hazard rate functions. The model has three parameters and generalizes the exponential power distribution proposed by Smith and Bain (1975) with the inclusion of an additional shape parameter. The maximum likelihood estimation procedure is discussed. A small-scale simulation study examines the performance of the likelihood ratio statistics under small and moderate sample sizes. Three real datasets illustrate the methodology. (C) 2010 Elsevier B.V. All rights reserved.
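For context, a sketch of the baseline being generalized, the Smith and Bain (1975) exponential power distribution, whose hazard is bathtub-shaped for shape values below one; the paper's three-parameter extension is not reproduced here:

```python
# Smith-Bain exponential power distribution: F(t) = 1 - exp(1 - exp((t/alpha)**beta)),
# so h(t) = (beta/alpha) * (t/alpha)**(beta-1) * exp((t/alpha)**beta),
# which is bathtub-shaped for beta < 1.
import numpy as np

def hazard_ep(t, alpha, beta):
    z = (t / alpha) ** beta
    return (beta / alpha) * (t / alpha) ** (beta - 1) * np.exp(z)

t = np.linspace(0.05, 3.0, 7)
print(np.round(hazard_ep(t, alpha=2.0, beta=0.5), 3))  # falls, then rises again
```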
Abstract:
Kumaraswamy [A generalized probability density function for double-bounded random processes, J. Hydrol. 46 (1980), pp. 79-88] introduced a distribution for double-bounded random processes with hydrological applications. For the first time, based on this distribution, we describe a new family of generalized distributions (denoted with the prefix `Kw`) that extends the normal, Weibull, gamma, Gumbel and inverse Gaussian distributions, among several other well-known distributions. Some special distributions in the new family, such as the Kw-normal, Kw-Weibull, Kw-gamma, Kw-Gumbel and Kw-inverse Gaussian distributions, are discussed. We express the ordinary moments of any Kw generalized distribution as linear functions of probability weighted moments (PWMs) of the parent distribution. We also obtain the ordinary moments of order statistics as functions of PWMs of the baseline distribution. We use the method of maximum likelihood to fit the distributions in the new class and illustrate the potential of the new model with an application to real data.
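The Kw-G construction itself is compact: given a baseline CDF G with density g, F(x) = 1 - (1 - G(x)^a)^b and f(x) = a b g(x) G(x)^(a-1) (1 - G(x)^a)^(b-1). A sketch of the Kw-Weibull member, using scipy's Weibull as the baseline:

```python
# Kw-G family: plug any baseline distribution G into the Kumaraswamy CDF.
from scipy import stats

def kw_cdf(x, a, b, base):
    G = base.cdf(x)
    return 1.0 - (1.0 - G**a) ** b

def kw_pdf(x, a, b, base):
    G, g = base.cdf(x), base.pdf(x)
    return a * b * g * G**(a - 1) * (1.0 - G**a) ** (b - 1)

base = stats.weibull_min(c=1.5, scale=2.0)   # baseline G: a Weibull distribution
print(kw_cdf(1.0, a=2.0, b=3.0, base=base),
      kw_pdf(1.0, a=2.0, b=3.0, base=base))  # Kw-Weibull CDF and pdf at x = 1
```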
Abstract:
In this paper we deal with a Bayesian analysis for right-censored survival data suitable for populations with a cure rate. We consider a cure rate model based on the negative binomial distribution, encompassing as a special case the promotion time cure model. Bayesian analysis is based on Markov chain Monte Carlo (MCMC) methods. We also present some discussion on model selection and an illustration with a real dataset.
Abstract:
In this paper, the generalized log-gamma regression model is modified to allow the possibility that long-term survivors may be present in the data. This modification leads to a generalized log-gamma regression model with a cure rate, encompassing, as special cases, the log-exponential, log-Weibull and log-normal regression models with a cure rate typically used to model such data. The models attempt to simultaneously estimate the effects of explanatory variables on the acceleration or deceleration of the timing of a given event and on the surviving fraction, that is, the proportion of the population for which the event never occurs. The normal curvatures of local influence are derived under some usual perturbation schemes, and two martingale-type residuals are proposed to assess departures from the generalized log-gamma error assumption as well as to detect outlying observations. Finally, a data set from the medical area is analyzed.
Abstract:
We discuss the estimation of the expected value of quality-adjusted survival, based on multistate models. We generalize earlier work by allowing the sojourn times in health states to be non-identically distributed, for a given vector of covariates. Approaches based on semiparametric and parametric (exponential and Weibull distributions) methodologies are considered. A simulation study is conducted to evaluate the performance of the proposed estimator, and the jackknife resampling method is used to estimate its variance. An application to a real data set is also included.
Abstract:
This thesis concerns the performance evaluation of peer-to-peer networks, where we used different peer-distribution techniques: the Weibull, Lognormal and Pareto distribution processes. We then used a network simulator to evaluate the performance of these three distribution techniques. During the last decade the Internet has expanded into a world-wide network connecting millions of hosts and users and providing services for everyone. Many emerging applications are bandwidth-intensive in nature; the sizes of downloaded files, including music and videos, can be huge, from ten megabits to many gigabits. The efficient use of network resources is thus crucial for the survivability of the Internet. Traffic engineering (TE) covers a range of mechanisms for optimizing operational networks from the traffic perspective. The time scale in traffic engineering varies from short-term network control to network planning over a longer time period. In this thesis work we considered the peer-distribution technique in order to study the peer arrival and service process under the three different techniques, and we calculated congestion parameters such as the blocking time of each peer before it enters the service process, the waiting time of a peer while another peer is being served in the service block, and the delay time of each peer. We then calculated the average of each process and plotted graphs in Matlab to analyse the results.
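An illustrative reconstruction of the experiment's shape (not the thesis code): draw peer inter-arrival times from the three distributions and push them through a single-server FIFO queue via the Lindley recursion to compare mean waiting times. All parameters are arbitrary:

```python
# Compare mean waiting times under Weibull, Lognormal and Pareto
# inter-arrival processes using the Lindley recursion
#   W_{n+1} = max(0, W_n + S_n - A_{n+1}).
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
service = rng.exponential(0.7, size=n)            # common service times

arrivals = {
    "Weibull":   rng.weibull(1.5, size=n),
    "Lognormal": rng.lognormal(mean=-0.5, sigma=0.8, size=n),
    "Pareto":    rng.pareto(2.5, size=n) + 0.3,   # shifted so the mean is finite
}

for name, inter in arrivals.items():
    w, total = 0.0, 0.0
    for a, s in zip(inter, service):
        w = max(0.0, w + s - a)                   # wait of the next peer
        total += w
    print(f"{name:9s} mean inter-arrival={inter.mean():.2f}  mean wait={total/n:.2f}")
```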
Abstract:
Quadratic assignment problems (QAPs) are commonly solved by heuristic methods, where the optimum is sought iteratively. Heuristics are known to provide good solutions, but the quality of the solutions, i.e., the confidence interval of the solution, is unknown. This paper uses statistical optimum estimation techniques (SOETs) to assess the quality of genetic algorithm solutions for QAPs. We examine the functioning of different SOETs regarding bias, coverage rate and interval length, and then we compare the SOET lower bound with deterministic ones. The commonly used deterministic bounds are confined to only a few algorithms. We show that the Jackknife estimators perform better than the Weibull estimators, and when the number of heuristic solutions is as large as 100, higher-order JK-estimators perform better than lower-order ones. Compared with the deterministic bounds, the SOET lower bound performs significantly better than most deterministic lower bounds and is comparable with the best deterministic ones.
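For concreteness, the order-k Jackknife point estimator of the minimum used in SOETs combines the k+1 best solution values with alternating binomial weights; the exact weight form below is our reading of the literature, so treat it as an assumption:

```python
# Order-k Jackknife estimate of the minimum from sorted values y(1) <= y(2) <= ...:
#   sum_{i=1..k+1} (-1)**(i-1) * C(k+1, i) * y(i)
# k=1 gives 2*y(1) - y(2); k=2 gives 3*y(1) - 3*y(2) + y(3).
# The binomial-weight form is assumed, not taken from the paper.
from math import comb

def jackknife_min(values, k=1):
    y = sorted(values)
    return sum((-1) ** (i - 1) * comb(k + 1, i) * y[i - 1] for i in range(1, k + 2))

solutions = [103.2, 104.0, 104.1, 105.7, 106.3]   # hypothetical GA objective values
print(jackknife_min(solutions, k=1), jackknife_min(solutions, k=2))
```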
Abstract:
Solutions to combinatorial optimization problems, such as problems of locating facilities, frequently rely on heuristics to minimize the objective function. The optimum is sought iteratively and a criterion is needed to decide when the procedure (almost) attains it. Pre-setting the number of iterations dominates in OR applications, which implies that the quality of the solution cannot be ascertained. A small, almost dormant, branch of the literature suggests using statistical principles to estimate the minimum and its bounds as a tool for deciding upon stopping and for evaluating the quality of the solution. In this paper we examine the functioning of statistical bounds obtained from four different estimators by using simulated annealing on p-median test problems taken from Beasley's OR-library. We find the Weibull estimator and the 2nd-order Jackknife estimator preferable, and the required sample size to be about 10, much less than the current recommendation. However, reliable statistical bounds are found to depend critically on a sample of heuristic solutions of high quality, and we give a simple statistic useful for checking the quality. We end the paper with an illustration of using statistical bounds in a problem of locating some 70 distribution centers of the Swedish Post in one Swedish region.
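A minimal sketch of the Weibull estimator idea discussed above, a common SOET variant rather than necessarily the paper's exact implementation: treat the heuristic solution values as a sample from a three-parameter Weibull and take the fitted location parameter as the point estimate of the unknown optimum (sample size 10, as in the paper's finding):

```python
# Weibull SOET sketch: the fitted location parameter estimates the optimum
# of a minimization problem. Illustrative numbers only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
true_opt = 1200.0
values = true_opt + 15.0 * rng.weibull(2.0, size=10)   # 10 heuristic runs

c, loc, scale = stats.weibull_min.fit(values)          # free location parameter
print(f"best found={values.min():.1f}  Weibull loc estimate={loc:.1f}")
```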
Abstract:
This dissertation addresses the estimation of a product's failure probabilities over the warranty period. The data sources for this estimation are the number of products sold and the number of failures detected in each month. Two non-parametric methodologies for this analysis are presented and validated. The complete-data methodology requires knowledge of the sale and failure dates of each product. The incomplete-data methodology requires only the monthly totals of sales and failures over the warranty period. In both cases, smoothing of the estimated failure probabilities is also implemented, using parametric Weibull or Lognormal distributions. The two techniques are implemented in a spreadsheet and applied to the analysis of simulated data. The performance of each methodology is evaluated with data of different characteristics, resulting in recommendations for choosing and applying the most suitable methodology in each case.
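A sketch of the smoothing step described above, assumed rather than taken from the dissertation's spreadsheets: form the cumulative failure fraction from hypothetical monthly estimates and fit a Weibull CDF to it by least squares:

```python
# Smooth nonparametric monthly failure probabilities with a Weibull CDF fit.
import numpy as np
from scipy.optimize import curve_fit

months = np.arange(1, 13)
p_hat = np.array([.004, .006, .005, .007, .008, .008,
                  .009, .010, .009, .011, .012, .011])  # hypothetical estimates
F_hat = np.cumsum(p_hat)                                # empirical cumulative failures

def weibull_cdf(t, beta, eta):
    return 1.0 - np.exp(-(t / eta) ** beta)

(beta, eta), _ = curve_fit(weibull_cdf, months, F_hat, p0=(1.0, 100.0))
smoothed = np.diff(weibull_cdf(months, beta, eta), prepend=0.0)  # smoothed monthly probs
print(f"beta={beta:.2f}, eta={eta:.0f} months")
```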
The supply of perishable goods: determination of the curves and estimates for the Carioca retail market
Abstract:
In this article we develop an optimization model for determining supply curves for perishable products. The information available to the producer (seller) about demand is summarized by a probability distribution, which allows the producer to determine the supply that maximizes expected gain. The objective function jointly accounts for the losses arising from excess stock and for part of the opportunity cost of insufficient stock. This formalization includes the (accounting) profit function as a particular case. The theoretical model is applied to absolutely continuous demands whose distributions admit an explicit inverse, such as the truncated exponential, Pareto, Weibull and uniform. Empirical estimates are obtained for the supply of tomatoes, chayotes and bell peppers in the Carioca retail market (July/94 to Nov/00). The results confirm the rational hypotheses of the theoretical model. The price elasticities of supply (retail) and demand (wholesale) are estimated, as well as the value of the "contractual" supply in cases where part of the cost of leftover stock is recovered through clearance sales. Graphical analysis of the supply curves and demand densities suggests the presence of significant market power in the marketing of bell peppers. Two immediate extensions of the formal model are developed. The first incorporates the existence of market power in the retail market. The second introduces a simultaneous-game structure in an oligopolistic market where each producer chooses the curve that maximizes its conditional expected profit, given that competitors' supplies equal their demands. Optimal supply curves are obtained at the Nash equilibrium. Comparisons are made with the optimal curves obtained under autarky.
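A hedged sketch of the optimization underlying the article: with unit underage cost cu (the opportunity cost of insufficient stock) and overage cost co (the loss on excess stock), the expected-gain-maximizing supply is the newsvendor quantile q* = F^{-1}(cu/(cu+co)), which has a closed form for the distributions the article lists. Parameters below are illustrative:

```python
# Newsvendor quantile q* = F^{-1}(cu/(cu+co)) with explicit inverse CDFs.
import math

cu, co = 0.8, 0.5
r = cu / (cu + co)                                   # critical ratio

q_uniform = 100 + r * (300 - 100)                    # demand ~ Uniform(100, 300)
q_weibull = 200 * (-math.log(1 - r)) ** (1 / 1.5)    # Weibull(beta=1.5, eta=200)
q_pareto  = 150 * (1 - r) ** (-1 / 2.5)              # Pareto(x_m=150, alpha=2.5)
print(round(q_uniform), round(q_weibull), round(q_pareto))
```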