911 results for extreme value theory


Relevance:

80.00%

Publisher:

Abstract:

This paper explores the theoretical basis of the value of water resources from an economic perspective, arguing that neither the labor theory of value nor the Western utility theory of value alone can explain the value of water resources; a four-quadrant model analysis shows that the two theories must be combined. Utility value is the starting point for research on the value of water resources, while, as demand grows and large amounts of labor are invested, the role of labor value becomes increasingly evident. Throughout this interactive process, the level of economic and technological development is an important factor influencing the value of water resources.

Relevance:

80.00%

Publisher:

Abstract:

A comparison study was carried out between a wireless sensor node with a flip-chip-mounted bare die and a reference board with a BGA-packaged transceiver chip. The main focus is the return loss (S-parameter S11) at the antenna connector, which depends strongly on the impedance mismatch. Modeling of the different interconnect technologies, substrate properties, and passive components was performed to simulate the system in the Ansoft Designer software. Statistical methods, such as standard deviation and regression, were applied to the RF performance analysis to assess the impact of the different parameters on the return loss. An extreme value search, building on this analysis, can then provide the parameter values for the minimum return loss. Measurements agreed well with the analysis and simulation and showed a large improvement in the return loss, from -5 dB to -25 dB, for the target wireless sensor node.
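
As a hedged illustration of the regression-then-search step (not the authors' actual workflow; the design parameter, data values, and bounds below are invented placeholders):

    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical sweep of one interconnect parameter (e.g. a bond length in mm)
    # against simulated return loss S11 in dB; the numbers are illustrative only.
    x = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
    s11 = np.array([-8.0, -14.0, -22.0, -17.0, -9.0])

    # Quadratic regression as a simple response-surface model of S11(x).
    model = np.poly1d(np.polyfit(x, s11, deg=2))

    # Extreme value search: the parameter value minimizing the modeled S11.
    objective = lambda v: float(model(v[0]))
    res = minimize(objective, x0=[1.5], bounds=[(0.5, 2.5)])
    print(f"best parameter ~ {res.x[0]:.2f} mm, predicted S11 ~ {model(res.x[0]):.1f} dB")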

Relevance:

80.00%

Publisher:

Abstract:

Trends in sample extremes are of interest in many contexts, an example being environmental statistics. Parametric models are often used to model trends in such data, but they may not be suitable for exploratory data analysis. This paper outlines a semiparametric approach to smoothing sample extremes, based on local polynomial fitting of the generalized extreme value distribution and related models. The uncertainty of the fits is assessed using resampling methods. The methods are applied to data on extreme temperatures and on record times for the women's 3000 m race.
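
The local-polynomial machinery is specific to the paper, but its building block, a stationary GEV fit to block maxima, can be sketched as follows (synthetic data, illustrative only):

    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(0)
    # Synthetic block maxima: yearly maxima of 365 standard-normal "daily" values.
    maxima = rng.standard_normal((50, 365)).max(axis=1)

    # Maximum-likelihood GEV fit; note SciPy's shape parameter c corresponds
    # to -xi in the usual extreme value parameterization.
    c, loc, scale = genextreme.fit(maxima)
    print(f"shape={c:.3f}, location={loc:.3f}, scale={scale:.3f}")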

Relevance:

80.00%

Publisher:

Abstract:

The aim of this paper is to demonstrate that, even if Marx's solution to the transformation problem must be modified, his basic conclusions remain valid.

Relevance:

80.00%

Publisher:

Abstract:

This study is about the stability of random sums and extremes. The difficulty of finding exact sampling distributions creates considerable problems in computing probabilities concerning sums that involve a large number of terms. Apart from the sum, the functions of sample observations that are of natural interest are the extremes, that is, the minimum and the maximum of the observations. Extreme value distributions also arise in problems such as the study of size effects on material strength, the reliability of parallel and series systems made up of a large number of components, record values, and the assessment of air pollution levels. The theories of sums and extremes are mutually connected. For instance, in establishing the asymptotic normality of sums, it is assumed that at least the variance of the population is finite; in such cases the contribution of the extremes to the sum of independent and identically distributed (i.i.d.) random variables is negligible.
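
The connection alluded to here can be made concrete by placing the two classical limit results side by side; this is standard theory, not specific to the thesis. For i.i.d. random variables X_1, ..., X_n with partial sum S_n and maximum M_n:

    \[
    \frac{S_n - n\mu}{\sigma\sqrt{n}} \;\xrightarrow{d}\; \mathcal{N}(0,1)
    \qquad (\text{finite variance } \sigma^2),
    \]
    \[
    P\!\left(\frac{M_n - b_n}{a_n} \le x\right) \;\longrightarrow\;
    G(x) = \exp\!\left\{-(1+\xi x)^{-1/\xi}\right\}, \qquad 1+\xi x > 0,
    \]

for suitable normalizing constants a_n > 0 and b_n, where G is the generalized extreme value distribution (Gumbel, Fréchet, or Weibull according as xi = 0, interpreted as the limit exp(-e^{-x}), xi > 0, or xi < 0).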

Relevance:

80.00%

Publisher:

Abstract:

In this thesis, the concept of the reversed lack of memory property and its generalizations is studied. We generalize this property to operations other than addition; in particular, an associative binary operator "*" is considered. The univariate reversed lack of memory property is generalized using this binary operator, and a class of probability distributions that includes the Type 3 extreme value, power function, reflected Weibull, and negative Pareto distributions is characterized (Asha and Rejeesh (2009)). We also define the almost reversed lack of memory property and consider distributions with a reversed periodic hazard rate under the binary operation. Further, we give a bivariate extension of the generalized reversed lack of memory property and characterize a class of bivariate distributions that includes the characterized extension (CE) model of Roy (2002a) as well as the bivariate reflected Weibull and power function distributions. We prove the equivalence of local proportionality of the reversed hazard rate and the generalized reversed lack of memory property.

The study of uncertainty is a subject of interest common to reliability, survival analysis, actuarial science, economics, business, and many other fields. In many realistic situations, however, uncertainty is not necessarily related to the future but can also refer to the past. Recently, Di Crescenzo and Longobardi (2009) introduced a new measure of information called dynamic cumulative entropy. Dynamic cumulative entropy is suitable for measuring information when uncertainty is related to the past; it is a dual concept of the cumulative residual entropy, which relates to the uncertainty of the future lifetime of a system. We redefine this measure on the whole real line and study its properties. We also discuss the implications of the generalized reversed lack of memory property for dynamic cumulative entropy and past entropy.

Finally, we extend the idea of the reversed lack of memory property to the discrete setup. Here we investigate the discrete class of distributions characterized by the discrete reversed lack of memory property. The concept is extended to the bivariate case, and bivariate distributions characterized by this property are presented. The implications of this property for the discrete reversed hazard rate, mean past life, and discrete past entropy are also investigated.
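
One plausible way to state the generalized property (the thesis' exact formulation may differ) is the functional equation

    \[
    F(x * t) \;=\; F(x)\,F(t) \qquad \text{for all admissible } x, t,
    \]

where F is the distribution function and * the associative binary operator. With * as addition on (-infinity, 0] this characterizes F(x) = e^{lambda x}; with * as multiplication on (0, 1] it characterizes the power function law F(x) = x^theta; a suitable * recovers the Type 3 extreme value law F(x) = exp(-(-x)^alpha), x <= 0.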

Relevance:

80.00%

Publisher:

Abstract:

The classical methods of analysing time series by the Box-Jenkins approach assume that the observed series fluctuates around changing levels with constant variance; that is, the time series is assumed to be homoscedastic. However, financial time series exhibit heteroscedasticity in the sense that they possess a non-constant conditional variance given the past observations. The analysis of financial time series therefore requires the modelling of such variances, which may depend on some time-dependent factors or on their own past values. This has led to the introduction of several classes of models to study the behaviour of financial time series; see Taylor (1986), Tsay (2005), and Rachev et al. (2007). The class of models used to describe the evolution of conditional variances is referred to as stochastic volatility models. The stochastic models available to analyse the conditional variances are based on either normal or log-normal distributions. One of the objectives of the present study is to explore the possibility of employing some non-Gaussian distributions to model the volatility sequences and then to study the behaviour of the resulting return series. This leads to the related problem of statistical inference, which is the main contribution of the thesis.
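
For reference, the canonical log-normal stochastic volatility model, the kind of baseline specification such a study departs from (not necessarily the thesis' exact model), reads

    \[
    r_t = \sigma_t\,\varepsilon_t, \qquad
    \log\sigma_t^2 = \alpha + \beta\,\log\sigma_{t-1}^2 + \eta_t, \qquad
    \varepsilon_t \sim N(0,1), \quad \eta_t \sim N(0,\sigma_\eta^2),
    \]

with the epsilon and eta sequences independent; replacing the Gaussian volatility innovation eta_t with a non-Gaussian one is the type of modification described above.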

Relevance:

80.00%

Publisher:

Abstract:

Numerous studies have demonstrated the effect of probable climate change on the hydrosphere's different subsystems. In the 21st century a global and regional redistribution of water has to be expected, and it is very likely that extreme weather phenomena will occur more frequently; from a global perspective, the flood situation will worsen. In contrast to these findings, the classical approach of flood frequency analysis provides terms like "mean flood recurrence interval". For this analysis to be valid, however, the distribution parameters must be stationary, which implies that the flood frequencies are constant in time. Newer approaches employ extreme value distributions with time-dependent parameters, but this implies discarding the traditional terminology that has been used to date in engineering hydrology. On the regional scale, climate change affects the hydrosphere in various ways, so the question is whether, in central Europe, the classical approach of flood frequency analysis is no longer usable and whether the traditional terminology should be renewed.

In the present case study, hydro-meteorological time series of the Fulda catchment area (6930 km²), upstream of the gauging station Bonaforth, are analyzed for the period 1960 to 2100. First, a distributed catchment area model (SWAT2005) is built, calibrated, and finally validated. The Edertal reservoir is also regulated by a feedback control of the catchment's output in case of low water. Because of this intricacy, a special modeling strategy was necessary: the study area is divided into three SWAT basin models, and an additional physically based reservoir model is developed. To further improve the streamflow predictions of the SWAT model, a correction by an artificial neural network (ANN) was tested successfully, which opens a new way to improve hydrological models. With this extension, the calibration and validation of the SWAT model for the Fulda catchment area are improved significantly. After calibrating the model against observed 20th-century streamflow, the SWAT model is driven by high-resolution climate data of the regional model REMO, using the IPCC scenarios A1B, A2, and B1, to generate future runoff time series for the 21st century for the various sub-basins in the study area.

In a second step, flood time series HQ(a) are derived from the 21st-century runoff time series (scenarios A1B, A2, and B1). These flood projections are then extensively tested with regard to stationarity, homogeneity, and statistical independence. All these tests indicate that the SWAT-predicted 21st-century trends in the flood regime are not significant: within the projected period, the members of the flood time series prove to be stationary and independent events. Hence the classical stationary approach of flood frequency analysis can still be used within the Fulda catchment area, notwithstanding the fact that some regional climate change has been predicted under the IPCC scenarios. It should be noted, however, that the present results are not transferable to other catchment areas. Finally, a new method is presented that enables the calculation of extreme flood statistics even if the flood time series is non-stationary and exhibits short- and long-term persistence. This method, called Flood Series Maximum Analysis here, enables the calculation of maximum design floods for a given risk or safety level and time period.
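
As an illustration of the stationarity-testing step, a hedged sketch of a Mann-Kendall-type trend test on a synthetic annual flood series (the series and significance level are placeholders, not the study's data or full test battery):

    import numpy as np
    from scipy.stats import kendalltau

    rng = np.random.default_rng(42)
    # Synthetic annual maximum floods HQ(a), in m^3/s, for 100 projected years.
    years = np.arange(2001, 2101)
    hq = rng.gumbel(loc=300.0, scale=80.0, size=years.size)

    # Mann-Kendall-style trend test: Kendall's tau between time and the series.
    tau, p_value = kendalltau(years, hq)
    print(f"tau={tau:.3f}, p={p_value:.3f}")
    if p_value > 0.05:
        print("No significant trend: stationary flood frequency analysis remains defensible.")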

Relevance:

80.00%

Publisher:

Abstract:

Some studies have indicated that students' motivation influences participation and school results, standing out among a set of other variables. The usefulness that students attribute to school subjects may be one of the motivational constructs that determine their liking for classroom activities, hypothetically influencing their commitment to learning and, thereby, their school results. The three studies presented here examine the relations between the attributed usefulness of school subjects, including the future time perspective, and liking for the subjects, commitment to learning, and student results. Within this framework, which draws on expectancy-value theory and the concept of future time perspective associated with the concept of instrumentality, a questionnaire was designed and applied to test the hypotheses. The analysis of the questionnaires shows a tendency for students to name at least one subject as the one they like most and consider most useful, to justify the subject they indicate as most useful with reasons related to their future life prospects, and to obtain high average results in those same subjects.

Relevance:

80.00%

Publisher:

Abstract:

In the Essence project, a 17-member ensemble simulation of climate change in response to the SRES A1b scenario was carried out with the ECHAM5/MPI-OM climate model. The relatively large ensemble size makes it possible to investigate changes in extreme values of climate variables accurately. Here we focus on the annual maximum 2-m temperature, fit a generalized extreme value (GEV) distribution to the simulated values, and investigate the development of the parameters of this distribution. Over most land areas both the location and the scale parameter increase; consequently, the 100-year return values increase faster than the average temperatures. A comparison of simulated 100-year return values for the present climate with observations (station data and reanalysis) shows that the ECHAM5/MPI-OM model, like other models, overestimates extreme temperature values. Even after correcting for this bias, the model still shows values in excess of 50 °C in Australia, India, the Middle East, North Africa, the Sahel, and equatorial and subtropical South America at the end of the century.
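
The return-value computation underlying such results is standard: the m-year return value is the (1 - 1/m) quantile of the fitted GEV. A minimal sketch with SciPy, using placeholder data rather than ECHAM5/MPI-OM output:

    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(1)
    # Placeholder annual maximum 2-m temperatures (°C), illustrative only.
    annual_max = rng.gumbel(loc=38.0, scale=2.0, size=45)

    c, loc, scale = genextreme.fit(annual_max)
    # 100-year return value: the level exceeded on average once per 100 years.
    rv100 = genextreme.ppf(1 - 1/100, c, loc=loc, scale=scale)
    print(f"100-year return value ~ {rv100:.1f} °C")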

Relevance:

80.00%

Publisher:

Abstract:

A statistical methodology is proposed and tested for the analysis of extreme values of atmospheric wave activity at mid-latitudes. The adopted methods are the classical block-maximum and peaks-over-threshold approaches, respectively based on the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD). Time series of the 'Wave Activity Index' (WAI) and the 'Baroclinic Activity Index' (BAI) are computed from simulations of the general circulation model ECHAM4.6, which is run under perpetual January conditions. Both the GEV and the GPD analyses indicate that the extremes of WAI and BAI are Weibull distributed, which corresponds to distributions with an upper bound. However, a remarkably large variability is found in the tails of such distributions: distinct simulations carried out under the same experimental setup provide appreciably different estimates of the 200-yr WAI return level. The consequences of this phenomenon for applications of the methodology to climate change studies are discussed. The atmospheric configurations characteristic of the maxima and minima of WAI and BAI are also examined.
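
A minimal peaks-over-threshold sketch (synthetic data; the 95% threshold and the series itself are illustrative assumptions, not taken from the study):

    import numpy as np
    from scipy.stats import genpareto

    rng = np.random.default_rng(2)
    # Placeholder daily index values standing in for the WAI series.
    wai = rng.gamma(shape=2.0, scale=1.0, size=5000)

    # Peaks-over-threshold: keep exceedances above a high quantile, fit a GPD
    # to the excesses with the location fixed at zero.
    u = np.quantile(wai, 0.95)
    excesses = wai[wai > u] - u
    c, loc, scale = genpareto.fit(excesses, floc=0.0)
    print(f"GPD shape={c:.3f}  (negative shape -> bounded upper tail, Weibull domain)")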

Relevance:

80.00%

Publisher:

Abstract:

Starting from the classical Saltzman two-dimensional convection equations, we derive, via a severe spectral truncation, a minimal 10-ODE system that includes the thermal effect of viscous dissipation. Neglecting this process leads to a dynamical system that includes a decoupled generalized Lorenz system. Taking the process into account breaks an important symmetry and couples the dynamics of fast and slow variables, with ensuing modifications to the structural properties of the attractor and of the spectral features. When the relevant nondimensional number (the Eckert number Ec) is different from zero, an additional time scale of O(Ec^{-1}) is introduced into the system, as shown by standard multiscale analysis and made clear by several numerical results. Moreover, the system is ergodic and hyperbolic, the slow variables feature long-term memory with 1/f^{3/2} power spectra, and the fast variables feature amplitude modulation. Increasing the strength of the thermal-viscous feedback has a stabilizing effect, as both the metric entropy and the Kaplan-Yorke attractor dimension decrease monotonically with Ec. The analyzed system features very rich dynamics: it overcomes some of the limitations of the Lorenz system and might have prototypical value for relevant processes in complex-system dynamics, such as the interaction between slow and fast variables, the presence of long-term memory, and the associated extreme value statistics. This analysis shows how neglecting the coupling of slow and fast variables solely on the basis of scale analysis can be catastrophic: it leads to spurious invariances that affect essential dynamical properties (ergodicity, hyperbolicity) and causes the model to lose the ability to describe intrinsically multiscale processes.

Relevance:

80.00%

Publisher:

Abstract:

An analysis of the climate of precipitation extremes as simulated by six European regional climate models (RCMs) is undertaken in order to describe/quantify future changes and to examine/interpret differences between models. Each model has adopted boundary conditions from the same ensemble of global climate model integrations for present (1961–1990) and future (2071–2100) climate under the Intergovernmental Panel on Climate Change A2 emission scenario. The main diagnostics are multiyear return values of daily precipitation totals estimated from extreme value analysis. An evaluation of the RCMs against observations in the Alpine region shows that model biases for extremes are comparable to or even smaller than those for wet day intensity and mean precipitation. In winter, precipitation extremes tend to increase north of about 45°N, while there is an insignificant change or a decrease to the south. In northern Europe the 20-year return value of future climate corresponds to the 40- to 100-year return value of present climate. There is a good agreement between the RCMs, and the simulated change is similar to a scaling of present-day extremes by the change in average events. In contrast, there are large model differences in summer when RCM formulation contributes significantly to scenario uncertainty. The model differences are well explained by differences in the precipitation frequency and intensity process, but in all models, extremes increase more or decrease less than would be expected from the scaling of present-day extremes. There is evidence for a component of the change that affects extremes specifically and is consistent between models despite the large variation in the total response.

Relevance:

80.00%

Publisher:

Abstract:

This paper proposes a method for describing the distribution of observed temperatures on any day of the year such that the distribution, and the summary statistics of interest derived from it, vary smoothly through the year. The method removes the noise inherent in calculating summary statistics directly from the data, easing comparisons of distributions and summary statistics between different periods. The method is demonstrated using daily effective temperatures (DET) derived from observations of temperature and wind speed at De Bilt, the Netherlands. Distributions and summary statistics are obtained for 1985 to 2009 and compared with the period 1904–1984. A two-stage process first obtains the parameters of a theoretical probability distribution, in this case the generalized extreme value (GEV) distribution, which describes the distribution of DET on any day of the year; second, linear models describe the seasonal variation in the parameters. The model predictions provide parameters of the GEV distribution, and therefore summary statistics, that vary smoothly through the year. There is evidence of an increasing mean temperature, a decrease in the variability of temperatures mainly in the winter, and a more positive skew (more warm days) in the summer. In the winter the 2% point, the value below which 2% of observations are expected to fall, has risen by 1.2 °C; in the summer the 98% point has risen by 0.8 °C. Medians have risen by 1.1 °C and 0.9 °C in winter and summer, respectively. The method can be used to describe distributions of future climate projections and other climate variables. Further extensions to the methodology are suggested.
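
A hedged sketch of the two-stage idea, with synthetic daily data and a simple annual-harmonic linear model standing in for the paper's actual regression structure:

    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(3)
    days = np.arange(1, 366)
    n_years = 40

    # Stage 1: fit a GEV to each day-of-year across years (synthetic DET-like
    # data with an annual cycle in the location parameter; ~365 quick MLE fits).
    loc_hat = np.empty(days.size)
    for i, d in enumerate(days):
        sample = 10 + 8 * np.sin(2 * np.pi * d / 365) + rng.gumbel(0, 2, size=n_years)
        c, loc, scale = genextreme.fit(sample)
        loc_hat[i] = loc

    # Stage 2: a linear model with annual harmonics smooths the noisy daily
    # parameter estimates into a curve that varies smoothly through the year.
    X = np.column_stack([np.ones(days.size),
                         np.sin(2 * np.pi * days / 365),
                         np.cos(2 * np.pi * days / 365)])
    beta, *_ = np.linalg.lstsq(X, loc_hat, rcond=None)
    loc_smooth = X @ beta  # smoothly varying GEV location parameter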

Relevance:

80.00%

Publisher:

Abstract:

The normal quantile transform (NQT) has been used in many hydrological and meteorological applications in order to make the cumulative distribution function (CDF) of observed, simulated, and forecast river discharge, water level, or precipitation data Gaussian. It is also at the heart of the meta-Gaussian model for assessing the total predictive uncertainty of the Hydrological Uncertainty Processor (HUP) developed by Krzysztofowicz. In the field of geostatistics this transformation is better known as the normal-score transform. This paper discusses some possible problems caused by small sample sizes when applying the NQT in flood forecasting systems and outlines a novel way to solve them by combining extreme value analysis and non-parametric regression methods. The method is illustrated by examples of hydrological streamflow forecasts.
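
A minimal sketch of the NQT itself, using the common r/(n+1) plotting position (an assumption; the paper may use a different position formula):

    import numpy as np
    from scipy.stats import norm, rankdata

    def nqt(x):
        """Normal quantile (normal-score) transform: map empirical ranks
        to standard-normal quantiles via the plotting position r/(n+1)."""
        ranks = rankdata(x)
        return norm.ppf(ranks / (len(x) + 1))

    rng = np.random.default_rng(4)
    discharge = rng.lognormal(mean=3.0, sigma=0.8, size=200)  # placeholder flows
    z = nqt(discharge)  # approximately N(0, 1) scores

The small-sample problem the paper addresses arises at the edges of this mapping: with n observations, the transform never produces scores beyond about +/- norm.ppf(n/(n+1)), so extrapolating the tails requires additional modelling, such as the extreme value and non-parametric regression methods mentioned above.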