954 results for Extreme Quantile


Relevance:

70.00%

Publisher:

Abstract:

In quantitative risk analysis, the problem of estimating small threshold exceedance probabilities and extreme quantiles arises ubiquitously in bio-surveillance, economics, natural disaster insurance, quality control schemes, and related fields. A useful way to assess extreme events is to estimate the probabilities of exceeding large threshold values, and the extreme quantiles, judged relevant by interested authorities. Such information about extremes serves as essential guidance to those authorities in decision-making processes. However, in this context the data are usually skewed, and the rarity of exceedances of large thresholds implies large fluctuations in the distribution's upper tail, precisely where accuracy is most desired. Extreme Value Theory (EVT) is a branch of statistics that characterizes the behavior of the upper or lower tails of probability distributions. However, existing EVT methods for estimating small threshold exceedance probabilities and extreme quantiles often show poor predictive performance when the underlying sample is not large enough or does not contain values in the distribution's tail. In this dissertation, we are concerned with an out-of-sample semiparametric (SP) method for the estimation of small threshold exceedance probabilities and extreme quantiles. The proposed SP method for interval estimation calls for the fusion, or integration, of a given data sample with external computer-generated independent samples. Since more data are used, real as well as artificial, under certain conditions the method produces relatively short yet reliable confidence intervals for small exceedance probabilities and extreme quantiles.
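
The point estimates underlying such intervals typically come from a standard EVT step: fit a generalized Pareto distribution (GPD) to excesses over a high threshold and extrapolate into the tail. The sketch below illustrates that generic peaks-over-threshold step only, not the dissertation's semiparametric fusion method; the simulated sample and the 95th-percentile threshold are illustrative assumptions.

```python
# Peaks-over-threshold sketch: small exceedance probabilities and extreme
# quantiles via a GPD fit to threshold excesses. Illustrative only.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
x = rng.pareto(3.0, size=2000) + 1.0       # heavy-tailed toy sample

u = np.quantile(x, 0.95)                   # high threshold (assumed choice)
exc = x[x > u] - u                         # excesses over the threshold
n, k = len(x), len(exc)

# Fit a generalized Pareto distribution to the excesses (location fixed at 0).
xi, _, beta = genpareto.fit(exc, floc=0.0)

def exceed_prob(z):
    """P(X > z) for z > u via the POT tail approximation."""
    return (k / n) * genpareto.sf(z - u, xi, loc=0.0, scale=beta)

def extreme_quantile(p):
    """Quantile x_p with P(X > x_p) = p, for p < k/n."""
    return u + genpareto.isf(p * n / k, xi, loc=0.0, scale=beta)

print(exceed_prob(extreme_quantile(1e-3)))  # ≈ 1e-3 by construction
```

The two functions are exact inverses of each other, which is a quick sanity check on any POT implementation.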

Relevance:

60.00%

Publisher:

Abstract:

A method to estimate an extreme quantile that requires no distributional assumptions is presented. The approach is based on transformed kernel estimation of the cumulative distribution function (cdf). The proposed method consists of a double transformation kernel estimation. We derive optimal bandwidth selection methods that have a direct expression for the smoothing parameter, and the bandwidth adapts to the quantile level of interest. The procedure is useful for large data sets and improves quantile estimation for heavy-tailed distributions compared to other methods. Implementation is straightforward and R programs are available.
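
The core building block is a smoothed, kernel-based estimate of the cdf that can be inverted to read off quantiles. The sketch below shows that single-transformation-free building block only; the paper's double transformation and optimal bandwidth formulas are not reproduced, and the Gaussian kernel with a rule-of-thumb bandwidth is an illustrative assumption.

```python
# Kernel estimation of a cdf, inverted numerically to estimate a quantile.
# A minimal sketch, not the paper's double-transformation estimator.
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=500)       # skewed toy sample

h = 1.06 * x.std(ddof=1) * len(x) ** (-1 / 5)  # rule-of-thumb bandwidth

def kernel_cdf(t):
    """Smoothed empirical cdf: average of Gaussian kernel cdfs."""
    return norm.cdf((t - x) / h).mean()

def kernel_quantile(p, lo=0.0, hi=100.0):
    """Invert the smoothed cdf by root finding on [lo, hi]."""
    return brentq(lambda t: kernel_cdf(t) - p, lo, hi)

q95 = kernel_quantile(0.95)                    # smoothed 95% quantile
```

Smoothing the cdf rather than the density avoids an extra integration step when the target is a quantile.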

Relevance:

40.00%

Publisher:

Abstract:

Testing whether or not data could have been generated by a family of extreme value copulas is difficult. We generalize a test and prove that it can be applied whatever the alternative hypothesis. We also study the effect of using different extreme value copulas in the context of risk estimation. To measure the risk we use a quantile. Our results are motivated by a bivariate sample of losses from a real database of auto insurance claims. Methods are implemented in R.

Relevance:

30.00%

Publisher:

Abstract:

Although stock prices fluctuate, the variations are relatively small and are frequently assumed to be normally distributed on a large time scale. But sometimes these fluctuations become decisive, especially when unforeseen large drops in asset prices are observed that could result in huge losses or even in market crashes. The evidence shows that such events happen far more often than would be expected under the common assumption of normally distributed financial returns. It is therefore crucial to model the distribution tails properly so as to be able to predict the frequency and magnitude of extreme stock price returns. In this paper we follow the approach suggested by McNeil and Frey (2000) and combine GARCH-type models with Extreme Value Theory (EVT) to estimate the tails of three financial index returns, the DJI, FTSE 100 and NIKKEI 225, representing three important financial areas of the world. Our results indicate that EVT-based conditional quantile estimates are much more accurate than those from conventional AR-GARCH models assuming normal or Student's t-distributed innovations when doing out-of-sample estimation (for in-sample estimation, this is so for the right tail of the distribution of returns).
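
The two-step logic (filter returns to standardized residuals, then apply EVT to the residual tail) can be sketched as follows. This is a stylized illustration, not the paper's estimator: an EWMA volatility filter with a RiskMetrics-style decay stands in for a full GARCH fit, the returns are simulated, and the 95th-percentile threshold is an assumed choice.

```python
# Stylized conditional tail quantile in the spirit of McNeil and Frey (2000):
# volatility-filter the returns, then fit a GPD to the standardized losses.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(2)
r = rng.standard_t(df=5, size=3000) * 0.01   # toy daily returns

lam = 0.94                                   # RiskMetrics-style decay (assumed)
var_t = np.empty_like(r)
var_t[0] = r.var()
for t in range(1, len(r)):
    var_t[t] = lam * var_t[t - 1] + (1 - lam) * r[t - 1] ** 2
sigma = np.sqrt(var_t)
z = r / sigma                                # standardized residuals

# GPD fit to the lower-tail losses of z (peaks over threshold).
losses = -z
u = np.quantile(losses, 0.95)
exc = losses[losses > u] - u
xi, _, beta = genpareto.fit(exc, floc=0.0)
k, n = len(exc), len(losses)

p = 0.01                                     # 1% conditional quantile level
z_p = u + genpareto.isf(p * n / k, xi, loc=0.0, scale=beta)

# Next-day conditional return quantile: scale the residual quantile by the
# one-step-ahead volatility forecast.
sigma_next = np.sqrt(lam * var_t[-1] + (1 - lam) * r[-1] ** 2)
q_next = -sigma_next * z_p
```

Replacing the EWMA filter with a fitted AR-GARCH model recovers the structure of the conditional approach the abstract compares against unconditional alternatives.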

Relevance:

30.00%

Publisher:

Abstract:

The Normal Quantile Transform (NQT) has been used in many hydrological and meteorological applications in order to make the cumulative distribution function (CDF) of observed, simulated and forecast river discharge, water level or precipitation data Gaussian. It is also at the heart of the meta-Gaussian model for assessing the total predictive uncertainty in the Hydrological Uncertainty Processor (HUP) developed by Krzysztofowicz. In the field of geostatistics this transformation is better known as the Normal-Score Transform. In this paper some possible problems caused by small sample sizes when applying the NQT in flood forecasting systems will be discussed, and a novel way to solve them will be outlined by combining extreme value analysis and non-parametric regression methods. The method will be illustrated by examples of hydrological stream-flow forecasts.
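
In its basic form the NQT maps each observation to an empirical plotting position and then through the standard-normal inverse cdf. The sketch below shows that basic transform only; the Weibull plotting position i/(n+1) is one common convention (an assumption here), and the paper's extreme-value extension for small samples is not shown.

```python
# Minimal Normal Quantile Transform (Normal-Score Transform): rank-based
# mapping of a sample to an approximately standard-normal scale.
import numpy as np
from scipy.stats import norm, rankdata

def nqt(x):
    """Map a sample to approximately N(0, 1) via empirical plotting positions."""
    p = rankdata(x) / (len(x) + 1)   # Weibull positions i/(n+1); ties averaged
    return norm.ppf(p)

skewed = np.random.default_rng(4).exponential(scale=2.0, size=200)
z = nqt(skewed)                      # rank-preserving, roughly Gaussian
```

Because the transform is rank-based, it preserves the ordering of the data exactly, which is why small-sample behavior in the extremes (the largest and smallest ranks) is where the difficulties discussed in the paper arise.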

Relevance:

30.00%

Publisher:

Abstract:

Regional flood frequency techniques are commonly used to estimate flood quantiles when flood data is unavailable or the record length at an individual gauging station is insufficient for reliable analyses. These methods compensate for limited or unavailable data by pooling data from nearby gauged sites. This requires the delineation of hydrologically homogeneous regions in which the flood regime is sufficiently similar to allow the spatial transfer of information. It is generally accepted that hydrologic similarity results from similar physiographic characteristics, and thus these characteristics can be used to delineate regions and classify ungauged sites. However, as currently practiced, the delineation is highly subjective and dependent on the similarity measures and classification techniques employed. A standardized procedure for delineation of hydrologically homogeneous regions is presented herein. Key aspects are a new statistical metric to identify physically discordant sites, and the identification of an appropriate set of physically based measures of extreme hydrological similarity. A combination of multivariate statistical techniques applied to multiple flood statistics and basin characteristics for gauging stations in the Southeastern U.S. revealed that basin slope, elevation, and soil drainage largely determine the extreme hydrological behavior of a watershed. Use of these characteristics as similarity measures in the standardized approach for region delineation yields regions which are more homogeneous and more efficient for quantile estimation at ungauged sites than those delineated using alternative physically-based procedures typically employed in practice. The proposed methods and key physical characteristics are also shown to be efficient for region delineation and quantile development in alternative areas composed of watersheds with statistically different physical composition. 
In addition, the use of aggregated values of key watershed characteristics was found to be sufficient for the regionalization of flood data; the added time and computational effort required to derive spatially distributed watershed variables does not increase the accuracy of quantile estimators for ungauged sites. This dissertation also presents a methodology by which flood quantile estimates in Haiti can be derived using relationships developed for data rich regions of the U.S. As currently practiced, regional flood frequency techniques can only be applied within the predefined area used for model development. However, results presented herein demonstrate that the regional flood distribution can successfully be extrapolated to areas of similar physical composition located beyond the extent of that used for model development provided differences in precipitation are accounted for and the site in question can be appropriately classified within a delineated region.

Relevance:

20.00%

Publisher:

Abstract:

We report the observation of multiple harmonic generation in electric dipole spin resonance in an InAs nanowire double quantum dot. The harmonics display a remarkable detuning dependence: near the interdot charge transition as many as eight harmonics are observed, while at large detunings we only observe the fundamental spin resonance condition. The detuning dependence indicates that the observed harmonics may be due to Landau-Zener transition dynamics at anticrossings in the energy level spectrum.

Relevance:

20.00%

Publisher:

Abstract:

This study aimed to check for significant differences in perceived quality of life, specifically aspects of a physical nature, between volunteers who are more physically active and those who are less physically active in a university community. The sample consisted of 1,966 volunteers from a university community in Brazil. To assess physical activity levels, volunteers answered the International Physical Activity Questionnaire (IPAQ), and to analyse perceived quality of life they answered the WHOQOL-bref. The sample was classified into three groups according to level of physical activity, taking into account the metabolic equivalent index (MET) over a full week; for comparison, the first and third tertiles were considered, namely the groups of more and less active students. The results indicated that individuals who engaged in more physical activity had a more positive perception of quality of life than those who were less active, in physical aspects related to the ability to work, energy for day-to-day activities, and locomotion.

Relevance:

20.00%

Publisher:

Abstract:

In the limit state design (LSD) method each design criterion is formally stated and assessed using a performance function. The performance function defines the relationship between the design parameters and the design criterion. In practice, LSD involves factoring up loads and factoring down calculated strengths and material parameters. This provides a convenient way to carry out routine probability-based design. The factors are statistically calculated to produce a design with an acceptably low probability of failure. Hence the ultimate load and the design material properties are mathematical concepts that have no physical interpretation; they may be physically impossible. Similarly, the appropriate analysis model is also defined by the performance function and may not describe the real behaviour at the perceived physical equivalent limit condition. These points must be understood to avoid confusion in the discussion and application of partial factor LSD methods.
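
The factoring described above reduces to a simple arithmetic check on the performance function g = R_d - E_d. The toy example below illustrates that check; the factor values (1.5 on the load, 1.25 dividing the resistance) are illustrative assumptions, not taken from any particular design code.

```python
# Toy partial-factor limit state check: factor the load up, factor the
# resistance down, and verify g = R_d - E_d >= 0. Factor values are assumed.
def limit_state_ok(load, resistance, gamma_load=1.5, gamma_res=1.25):
    """Return True when the factored design satisfies the limit state."""
    design_load = gamma_load * load              # factored-up action E_d
    design_resistance = resistance / gamma_res   # factored-down resistance R_d
    return design_resistance - design_load >= 0.0

print(limit_state_ok(100.0, 200.0))  # 200/1.25 = 160 >= 150 -> True
print(limit_state_ok(100.0, 180.0))  # 180/1.25 = 144 <  150 -> False
```

Note that the factored quantities 150 and 160 need not correspond to any physically realizable load or strength, which is exactly the abstract's point.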

Relevance:

20.00%

Publisher:

Abstract:

The Antarctic nemertean worm, Parborlasia corrugatus, exhibits gigantism, reaching at least 100 g, yet lacks any specialised respiratory organs; the diffusion of oxygen into this worm occurs cutaneously. We examined the metabolic rate of P. corrugatus at -1°C in response to decreasing ambient PO2. As the PO2 of the water decreased, so did the metabolic rate of P. corrugatus, indicating that this nemertean worm is an extreme example of an oxyconformer. When the water PO2 decreased below about 120 mmHg, the normally short, round worms became elongated and extremely flattened. This behavioural mechanism would allow for an increase in the surface area of the skin, thereby facilitating the diffusion of oxygen.

Relevance:

20.00%

Publisher:

Abstract:

In this paper an approach to extreme event control in wastewater treatment plant operation by use of automatic supervisory control is discussed. The framework presented is based on the fact that different operational conditions manifest themselves as clusters in a multivariate measurement space. These clusters are identified and linked to specific corresponding events by use of principal component analysis and fuzzy c-means clustering. A reduced system model is assigned to each type of extreme event and used to calculate appropriate local controller set points. In earlier work we have shown that this approach is applicable to wastewater treatment control using look-up tables to determine current set points. In this work we focus on the automatic determination of appropriate set points by use of steady-state and dynamic predictions. The performance of a relatively simple steady-state supervisory controller is compared with that of a model predictive supervisory controller. A look-up table approach is also included in the comparison, as it provides a simple and robust alternative to the steady-state and model predictive controllers. The methodology is illustrated in a simulation study.
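
The cluster-identification step can be sketched with a bare-bones fuzzy c-means implementation on synthetic two-dimensional "measurement" data. This shows only the clustering algorithm named in the abstract; the PCA projection, the event labelling, and the set-point logic are omitted, and the data, fuzzifier m = 2, and iteration count are illustrative assumptions.

```python
# Minimal fuzzy c-means: soft memberships U (n x c) and cluster centers,
# alternating the standard center and membership updates.
import numpy as np

def fuzzy_cmeans(X, c=2, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))       # random soft memberships
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]  # weighted means
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)                      # guard exact hits
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)      # membership update
    return centers, U

# Two synthetic operating regimes in a 2-D measurement space.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0.0, 0.3, (100, 2)),
               rng.normal(3.0, 0.3, (100, 2))])
centers, U = fuzzy_cmeans(X, c=2)
```

In the supervisory-control setting, a new measurement vector's membership row would indicate which operational regime (and hence which reduced model) currently applies.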

Relevance:

20.00%

Publisher:

Abstract:

We present measurements and numerical simulation of a-Si:H p-i-n detectors with a wide range of intrinsic layer thickness between 2 and 10 μm. Such a large active layer thickness is required in applications like elementary particle detectors or X-ray detectors. For large thickness and depending on the applied bias, we observe a sharp peak in the spectral response in the red region near 700 nm. Simulation results obtained with the program ASCA are in agreement with the measurement and permit the explanation of the experimental data. In thick samples holes recombine or are trapped before reaching the contacts, and the conduction mechanism is fully electron dominated. As a consequence, the peak position in the spectral response is located near the optical band gap of the a-Si:H i-layer. (C) 2009 Elsevier B.V. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

Doctoral thesis, Marine Sciences, specialty in Marine Biology, 18 December 2015, Universidade dos Açores.

Relevance:

20.00%

Publisher:

Abstract:

An individual experiences double coverage when he benefits from more than one health insurance plan at the same time. This paper examines the impact of such supplementary insurance on the demand for health care services. Its novelty is that, within the context of count data modelling and without imposing restrictive parametric assumptions, the analysis is carried out for different points of the conditional distribution, not only for its mean location. Results indicate that moral hazard is present across the whole outcome distribution for both public and private second layers of health insurance coverage, but with greater magnitude in the latter group. By looking at different points we unveil that double coverage effects are smaller for high levels of usage. We use data for Portugal, taking advantage of particular features of the public and private protection schemes on top of the statutory National Health Service. By exploring the last Portuguese Health Survey, we were able to evaluate their impacts on the consumption of doctor visits.