971 results for Generalized extreme value distribution


Relevance: 100.00%

Abstract:

The Normal Quantile Transform (NQT) has been used in many hydrological and meteorological applications in order to make the Cumulative Distribution Function (CDF) of observed, simulated and forecast river discharge, water level or precipitation data Gaussian. It is also at the heart of the meta-Gaussian model for assessing the total predictive uncertainty of the Hydrological Uncertainty Processor (HUP) developed by Krzysztofowicz. In the field of geostatistics this transformation is better known as the Normal-Score Transform. In this paper some possible problems caused by small sample sizes when applying the NQT in flood forecasting systems will be discussed, and a novel way to solve the problem will be outlined by combining extreme value analysis and non-parametric regression methods. The method will be illustrated by examples of hydrological stream-flow forecasts.
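The core of the NQT is a rank-based mapping of the sample onto standard-normal quantiles. A minimal sketch, assuming Weibull plotting positions i/(n+1) and illustrative variable names (not the paper's exact implementation):

```python
from statistics import NormalDist

def normal_quantile_transform(values):
    """Map each observation to a standard-normal quantile via its rank.

    The plotting position i/(n+1) keeps probabilities strictly inside (0, 1),
    so the inverse normal CDF is always defined.
    """
    n = len(values)
    # 1-based rank of each value within the sample
    order = sorted(range(n), key=lambda i: values[i])
    ranks = [0] * n
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    nd = NormalDist()  # standard normal
    return [nd.inv_cdf(r / (n + 1)) for r in ranks]

flows = [3.1, 12.4, 7.8, 30.2, 5.5]  # hypothetical discharges, m^3/s
z = normal_quantile_transform(flows)
```

The largest flow maps to the largest Gaussian value, and the middle rank maps to zero; the small-sample issue the paper addresses arises because the transform cannot extrapolate beyond the quantile of the largest observed rank.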

Relevance: 100.00%

Abstract:

Spatially dense observations of gust speeds are necessary for various applications, but their availability is limited in space and time. This work presents an approach to help overcome this problem. The main objective is the generation of synthetic wind gust velocities. With this aim, theoretical wind and gust distributions are estimated from 10 years of hourly observations collected at 123 synoptic weather stations provided by the German Weather Service. As pre-processing, an exposure correction is applied to measurements of the mean wind velocity to reduce the influence of local urban and topographic effects. The wind gust model is built as a transfer function between distribution parameters of wind and gust velocities. The aim of this procedure is to estimate the parameters of gusts at stations where only wind speed data are available. These parameters can be used to generate synthetic gusts, which can improve the accuracy of return periods at test sites with a lack of observations. The second objective is to determine return periods much longer than the nominal length of the original time series by considering extreme value statistics. Estimates for both local maximum return periods and average return periods for single historical events are provided. The comparison of maximum and average return periods shows that even storms with short average return periods may lead to local wind gusts with return periods of several decades. Despite uncertainties caused by the short length of the observational records, the method leads to consistent results, enabling a wide range of possible applications.
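Return levels of the kind discussed above can be sketched with a Gumbel fit (the zero-shape special case of the GEV) by the method of moments; both the gust data and the Gumbel simplification are illustrative, not taken from the study:

```python
import math

def gumbel_return_level(annual_maxima, T):
    """Return level for return period T (years) from a Gumbel fit.

    Method-of-moments estimates: scale = s*sqrt(6)/pi,
    location = mean - gamma*scale (gamma = Euler-Mascheroni constant).
    The T-year level is the (1 - 1/T) quantile of the fitted Gumbel.
    """
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    scale = math.sqrt(6 * var) / math.pi
    loc = mean - 0.5772156649 * scale
    return loc - scale * math.log(-math.log(1 - 1 / T))

gusts = [24.1, 28.3, 31.0, 22.5, 35.2, 27.8, 29.4, 26.0, 33.1, 25.7]  # m/s
level_50 = gumbel_return_level(gusts, 50)
```

Because the quantile function is increasing in T, longer return periods always give higher return levels, which is the extrapolation beyond the record length that the paper relies on.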

Relevance: 100.00%

Abstract:

In this paper, the generalized log-gamma regression model is modified to allow the possibility that long-term survivors may be present in the data. This modification leads to a generalized log-gamma regression model with a cure rate, encompassing, as special cases, the log-exponential, log-Weibull and log-normal regression models with a cure rate typically used to model such data. The models attempt to simultaneously estimate the effects of explanatory variables on the acceleration or deceleration of the timing of a given event and on the surviving fraction, that is, the proportion of the population for which the event never occurs. The normal curvatures of local influence are derived under some usual perturbation schemes, and two martingale-type residuals are proposed to assess departures from the generalized log-gamma error assumption as well as to detect outlying observations. Finally, a data set from the medical area is analyzed.

Relevance: 100.00%

Abstract:

The Birnbaum-Saunders (BS) model is a positively skewed statistical distribution that has received great attention in recent decades. A generalized version of this model was derived based on symmetrical distributions on the real line, named the generalized BS (GBS) distribution. The R package named gbs was developed to analyze data from GBS models. This package contains probabilistic and reliability indicators and random number generators from GBS distributions. Parameter estimates for censored and uncensored data can also be obtained by means of likelihood methods from the gbs package. Goodness-of-fit and diagnostic methods were also implemented in this package in order to check the suitability of the GBS models. In this article, the capabilities and features of the gbs package are illustrated by using simulated and real data sets. Shape and reliability analyses for GBS models are presented. A simulation study for evaluating the quality and sensitivity of the estimation method developed in the package is provided and discussed.

Relevance: 100.00%

Abstract:

The two-parameter Birnbaum-Saunders distribution has been used successfully to model fatigue failure times. Although censoring is typical in reliability and survival studies, little work has been published on the analysis of censored data for this distribution. In this paper, we address the issue of performing testing inference on the two parameters of the Birnbaum-Saunders distribution under type-II right censored samples. The likelihood ratio statistic and a recently proposed statistic, the gradient statistic, provide a convenient framework for statistical inference in such a case, since they do not require obtaining, estimating or inverting an information matrix, which is an advantage in problems involving censored data. An extensive Monte Carlo simulation study is carried out in order to investigate and compare the finite sample performance of the likelihood ratio and the gradient tests. Our numerical results show evidence that the gradient test should be preferred. Further, we also consider the generalized Birnbaum-Saunders distribution under type-II right censored samples and present some Monte Carlo simulations for testing the parameters in this class of models using the likelihood ratio and gradient tests. Three empirical applications are presented.
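The likelihood ratio statistic compared in the abstract has a simple generic form; a sketch for the one-restricted-parameter case, with hypothetical log-likelihood values (for chi-square with 1 degree of freedom, the survival function reduces to erfc(sqrt(LR/2))):

```python
import math

def lr_test_pvalue(loglik_full, loglik_restricted):
    """Likelihood ratio test with one restricted parameter.

    LR = 2 * (l_full - l_restricted); under H0, LR ~ chi-square(1),
    whose survival function equals erfc(sqrt(LR / 2)).
    """
    lr = 2.0 * (loglik_full - loglik_restricted)
    pvalue = math.erfc(math.sqrt(lr / 2.0))
    return lr, pvalue

# hypothetical maximized log-likelihoods, not from the paper
lr, p = lr_test_pvalue(-102.3, -105.1)
```

The gradient statistic favored by the paper shares this asymptotic chi-square reference distribution but is built from the score and the parameter estimates instead of the two maximized log-likelihoods.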

Relevance: 100.00%

Abstract:

Solutions to combinatorial optimization problems, such as problems of locating facilities, frequently rely on heuristics to minimize the objective function. The optimum is sought iteratively and a criterion is needed to decide when the procedure (almost) attains it. Pre-setting the number of iterations dominates in OR applications, which implies that the quality of the solution cannot be ascertained. A small, almost dormant, branch of the literature suggests using statistical principles to estimate the minimum and its bounds as a tool to decide upon stopping and evaluating the quality of the solution. In this paper we examine the functioning of statistical bounds obtained from four different estimators by using simulated annealing on p-median test problems taken from Beasley's OR-library. We find the Weibull estimator and the 2nd-order Jackknife estimator preferable, and the required sample size to be about 10, much less than the current recommendation. However, reliable statistical bounds are found to depend critically on a sample of heuristic solutions of high quality, and we give a simple statistic useful for checking the quality. We end the paper with an illustration of using statistical bounds in a problem of locating some 70 distribution centers of the Swedish Post in one Swedish region.
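One common form of the jackknife extrapolation of the minimum, from a sample of heuristic objective values, can be sketched as follows; the objective values are invented for illustration and this is not necessarily the exact estimator variant used in the paper:

```python
def jackknife_min(solutions, order=2):
    """Extrapolate below the best heuristic value using order statistics.

    1st order: 2*y(1) - y(2)
    2nd order: 3*y(1) - 3*y(2) + y(3)
    where y(1) <= y(2) <= y(3) are the smallest objective values found.
    """
    y = sorted(solutions)
    if order == 1:
        return 2 * y[0] - y[1]
    return 3 * y[0] - 3 * y[1] + y[2]

# hypothetical objective values from repeated simulated annealing runs
objective_values = [1042, 1038, 1051, 1040, 1046, 1039, 1045, 1037, 1044, 1043]
bound = jackknife_min(objective_values)
```

The estimate never exceeds the best value found, so the gap between the incumbent solution and the estimated minimum can serve as the stopping criterion the paper discusses.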

Relevance: 100.00%

Abstract:

Solutions to combinatorial optimization, such as p-median problems of locating facilities, frequently rely on heuristics to minimize the objective function. The minimum is sought iteratively and a criterion is needed to decide when the procedure (almost) attains it. However, pre-setting the number of iterations dominates in OR applications, which implies that the quality of the solution cannot be ascertained. A small branch of the literature suggests using statistical principles to estimate the minimum and use the estimate for either stopping or evaluating the quality of the solution. In this paper we use test problems taken from Beasley's OR-library and apply simulated annealing to these p-median problems. We do this for the purpose of comparing suggested methods of minimum estimation and, eventually, providing a recommendation for practitioners. The paper ends with an illustration: a problem of locating some 70 distribution centers of the Swedish Post in a region.

Relevance: 100.00%

Abstract:

This paper proposes a spatial-temporal downscaling approach to constructing the intensity-duration-frequency (IDF) relations at a local site in the context of climate change and variability. More specifically, the proposed approach is based on a combination of a spatial downscaling method to link large-scale climate variables given by General Circulation Model (GCM) simulations with daily extreme precipitations at a site and a temporal downscaling procedure to describe the relationships between daily and sub-daily extreme precipitations based on the scaling General Extreme Value (GEV) distribution. The feasibility and accuracy of the suggested method were assessed using rainfall data available at eight stations in Quebec (Canada) for the 1961-2000 period and climate simulations under four different climate change scenarios provided by the Canadian (CGCM3) and UK (HadCM3) GCM models. Results of this application have indicated that it is feasible to link sub-daily extreme rainfalls at a local site with large-scale GCM-based daily climate predictors for the construction of the IDF relations for present (1961-1990) and future (2020s, 2050s, and 2080s) periods at a given site under different climate change scenarios. In addition, it was found that annual maximum rainfalls downscaled from the HadCM3 displayed a smaller change in the future, while those values estimated from the CGCM3 indicated a large increasing trend for future periods. This result demonstrates the presence of high uncertainty in climate simulations provided by different GCMs. In summary, the proposed spatial-temporal downscaling method provides an essential tool for the estimation of extreme rainfalls that are required for various climate-related impact assessment studies for a given region.

Relevance: 100.00%

Abstract:

Market risk exposure plays a key role in financial institutions' risk management. A possible measure of this exposure is to evaluate the losses likely to be incurred when the price of the portfolio's assets declines using Value-at-Risk (VaR) estimates, one of the most prominent measures of financial downside market risk. This paper suggests an evolving possibilistic fuzzy modeling (ePFM) approach for VaR estimation. The approach is based on an extension of the possibilistic fuzzy c-means clustering and functional fuzzy rule-based modeling, which employs memberships and typicalities to update clusters and creates new clusters based on a statistical-control distance-based criterion. ePFM also uses a utility measure to evaluate the quality of the current cluster structure. Computational experiments consider data of the main global equity market indexes of the United States, London, Germany, Spain and Brazil from January 2000 to December 2012 for VaR estimation using ePFM, traditional VaR benchmarks such as Historical Simulation, GARCH, EWMA, and Extreme Value Theory, and state-of-the-art evolving approaches. The results show that ePFM is a potential candidate for VaR modeling, with better performance than alternative approaches.
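For context, the Historical Simulation benchmark mentioned above amounts to an empirical quantile of losses; a minimal sketch with invented returns (not data from the study):

```python
def historical_var(returns, alpha=0.99):
    """Historical-simulation VaR: the alpha-quantile of the loss
    distribution (losses = negated returns), reported as a positive number.

    Uses the upper empirical quantile, i.e. an order statistic of the
    sorted losses; real implementations may interpolate between ranks.
    """
    losses = sorted(-r for r in returns)
    idx = min(int(alpha * len(losses)), len(losses) - 1)
    return losses[idx]

# hypothetical daily returns
daily_returns = [0.004, -0.012, 0.007, -0.031, 0.002, -0.008,
                 0.011, -0.019, 0.005, -0.002]
var_99 = historical_var(daily_returns)
```

The EVT benchmark instead fits a tail distribution to the largest losses, which allows quantiles beyond the worst observed loss; ePFM replaces both with an adaptive fuzzy rule-based estimate.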

Relevance: 100.00%

Abstract:

In this paper, we propose a random intercept Poisson model in which the random effect is assumed to follow a generalized log-gamma (GLG) distribution. This random effect accommodates (or captures) the overdispersion in the counts and induces within-cluster correlation. We derive the first two moments for the marginal distribution as well as the intraclass correlation. Even though numerical integration methods are, in general, required for deriving the marginal models, we obtain the multivariate negative binomial model from a particular parameter setting of the hierarchical model. An iterative process is derived for obtaining the maximum likelihood estimates of the parameters in the multivariate negative binomial model. Residual analysis is proposed and two applications with real data are given for illustration.
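The overdispersion induced by a random intercept can be seen directly from the law of total variance; a sketch using a gamma random effect (the Poisson-gamma, i.e. negative binomial, case rather than the paper's GLG, chosen because its moments are closed-form):

```python
def poisson_gamma_moments(shape_k, scale_theta):
    """Marginal moments of Y where lambda ~ Gamma(k, theta) and
    Y | lambda ~ Poisson(lambda).

    Law of total variance:
        E[Y]   = E[lambda]               = k * theta
        Var[Y] = E[lambda] + Var[lambda] = k * theta + k * theta**2
    so Var[Y] > E[Y]: the mixture is overdispersed relative to Poisson.
    """
    mean_lambda = shape_k * scale_theta
    var_lambda = shape_k * scale_theta ** 2
    return mean_lambda, mean_lambda + var_lambda

mean_y, var_y = poisson_gamma_moments(2.0, 1.5)
```

The paper's GLG random effect plays the same role as the gamma here, but generally requires numerical integration for the marginal moments, as the abstract notes.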

Relevance: 100.00%

Abstract:

From November 2004 to December 2007, size-segregated aerosol samples were collected year-round at Dome C (East Antarctica) by using PM10 and PM2.5 samplers and multi-stage impactors. The data set obtained from the chemical analysis provided the longest and most time-resolved record of sea spray aerosol (sea salt Na+) in inner Antarctica. Sea spray showed a sharp seasonal pattern. The highest values measured in winter (Apr-Nov) were about ten times larger than in summer (Dec-Mar). For the first time, a size-distribution seasonal pattern was also shown: in winter, sea spray particles are mainly submicrometric, while their summer size-mode is around 1-2 µm. Meteorological analysis on a synoptic scale allowed the definition of the atmospheric conditions transporting sea spray to Dome C. An extreme-value approach, along with specific environment-based criteria, was taken to yield stronger fingerprints linking atmospheric circulation (means and anomalies) to extreme sea spray events. Air mass back-trajectory analyses for some high sea spray events allowed the identification of two major air mass pathways, reflecting different size distributions: micrometric fractions for transport from the closer Indian-Pacific sector, and sub-micrometric particles for longer trajectories over the Antarctic Plateau. The seasonal pattern of the SO4^2-/Na+ ratio enabled the identification of a few events depleted in sulphate with respect to the seawater composition. By using the methanesulphonic acid (MSA) profile to evaluate the biogenic SO4^2- contribution, a more reliable sea salt sulphate was calculated. In this way, a few events (mainly in April and September) were identified, probably originating from the "frost flower" source. A comparison with daily-collected superficial snow samples revealed that there is a temporal shift between aerosol and snow sea spray trends. This feature could imply more complex deposition processes of sea spray, involving a significant contribution of wet and diamond dust deposition, but further work has to be carried out to rule out the effect of wind re-distribution and to achieve greater statistical significance.

Relevance: 100.00%

Abstract:

This study aims to examine the international value distribution structure among major East Asian economies and the US. The mainstream trade theory explains the gains from trade; however, the global value chain (GVC) approach emphasises the uneven benefits of globalization among trading partners. The present study is mainly based on this view, examining which economy gains the most and which the least from the East Asian production networks. Two key industries, i.e., electronics and automobiles, are our principal focus. An input-output method is employed to trace the creation and flows of value-added within the region. A striking fact is that some ASEAN economies see increasingly reduced shares of value-added, which is captured by developed countries, particularly Japan. Policy implications are discussed in the final section.
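The input-output accounting behind such value-added tracing rests on the Leontief inverse, x = (I - A)^(-1) f; a two-sector sketch with hypothetical coefficients (the numbers are illustrative, not from the study):

```python
def value_added_from_demand(A, va_coeff, final_demand):
    """Two-sector Leontief model.

    Total output x solves x = A x + f, i.e. x = (I - A)^(-1) f;
    value added by sector is then va_coeff[i] * x[i].
    The 2x2 inverse is computed in closed form.
    """
    a, b = A[0]
    c, d = A[1]
    # entries of I - A
    p, q, r, s = 1 - a, -b, -c, 1 - d
    det = p * s - q * r
    f1, f2 = final_demand
    x1 = (s * f1 - q * f2) / det
    x2 = (-r * f1 + p * f2) / det
    return [va_coeff[0] * x1, va_coeff[1] * x2]

A = [[0.2, 0.3],   # hypothetical technical coefficients
     [0.1, 0.4]]
va = value_added_from_demand(A, va_coeff=[0.5, 0.3], final_demand=[100.0, 200.0])
```

Multi-country studies of the kind described above use the same machinery with a block matrix A spanning countries and industries, so value-added can be attributed to the economy where it originates rather than where the final good is exported.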

Relevance: 100.00%

Abstract:

Tropospheric scintillation can become a significant impairment in satellite communication systems, especially in those with low fade margin. Moreover, fast amplitude fluctuations due to scintillation are even larger when rain is present on the propagation path. Few studies of scintillation during rain have been reported and the statistical characterization is still not totally clear. This paper presents experimental results on the relationship between scintillation and rain attenuation obtained from slant-path attenuation measurements at 50 GHz. The study is focused on the probability density function (PDF) of various scintillation parameters. It is shown that scintillation intensity, measured as the standard deviation of the amplitude fluctuations, increases with rain attenuation; in the range 1-10 dB this relationship can be expressed by power-law or linear equations. The PDFs of scintillation intensity conditioned on a given rain attenuation level are lognormal, while the overall long-term PDF is well fitted by a generalized extreme value (GEV) distribution. The short-term PDFs of amplitude conditioned on a given intensity are normal, although skewness effects are observed for the strongest intensities. A procedure is given to derive numerically the overall PDF of scintillation amplitude using a combination of conditional PDFs and local statistics of rain attenuation.

Relevance: 100.00%

Abstract:

2000 Mathematics Subject Classification: 62G32, 62G20.

Relevance: 100.00%

Abstract:

2010 Mathematics Subject Classification: 62F10, 62F12.