911 results for extreme value theory


Relevance: 100.00%

Publisher:

Abstract:

Contents:
1. Role of multi-criteria decision making in natural resource management / Gamini Herath and Tony Prato
2. Analysis of forest policy using multi-attribute value theory / Jayanath Ananda and Gamini Herath
3. Comparing riparian revegetation policy options using the analytic hierarchy process / M. E. Qureshi and S. R. Harrison
4. Managing environmental and health risks from a lead and zinc smelter: an application of deliberative multi-criteria evaluation / Wendy Proctor, Chris McQuade and Anne Dekker
5. Multiple attribute evaluation of management alternatives for the Missouri River System / Tony Prato
6. Multi-criteria decision analysis for integrated watershed management / Zeyuan Qiu
7. Fuzzy multiple attribute evaluation of agricultural systems / Leonie A. Marks and Elizabeth G. Dunn
8. Multi-criteria decision support for energy supply assessment / Bram Noble
9. Seaport development in Vietnam: evaluation using the analytic hierarchy process / Tran Phuong Dong and David M. Chapman
10. Valuing wetland aquatic resources using the analytic hierarchy process / Premachandra Wattage and Simon Mardle
11. Multiple attribute evaluation for national park management / Tony Prato
12. The future of MCDA in natural resource management: some generalizations / Gamini Herath and Tony Prato.


Relevance: 100.00%

Publisher:

Abstract:

Any attempt to model an economy requires foundational assumptions about the relations between prices, values and the distribution of wealth. These assumptions exert a profound influence over the results of any model. Unfortunately, there are few areas in economics as vexed as the theory of value. I argue in this paper that the fundamental problem with past theories of value is that it is simply not possible to model the determination of value, the formation of prices and the distribution of income in a real economy with analytic mathematical models. All such attempts leave out crucial processes or make unrealistic assumptions which significantly affect the results.

There have been two primary approaches to the theory of value. The first, associated with classical economists such as Ricardo and Marx, comprises substance theories of value, which view value as a substance inherent in an object and conserved in exchange. For Marxists, the value of a commodity derives solely from the value of the labour power used to produce it, and therefore any profit is due to the exploitation of the workers. The labour theory of value has been discredited because of its assumption that labour was the only 'factor' that contributed to the creation of value, and because of its fundamentally circular argument. Neoclassical theorists argued that price was identical with value and was determined purely by the interaction of supply and demand; value, then, was completely subjective. Returns to labour (wages) and capital (profits) were determined solely by their marginal contribution to production, so that each factor received its just reward by definition. Problems with the neoclassical approach include assumptions concerning representative agents, perfect competition, perfect and costless information and contract enforcement, complete markets for credit and risk, aggregate production functions and infinite, smooth substitution between factors, distribution according to marginal products, firms always on the production possibility frontier, firms' pricing decisions, the neglect of money and credit, and perfectly rational agents with infinite computational capacity.

Two critical areas stand out. First, the underappreciated Sonnenschein-Mantel-Debreu results showed that the foundational assumptions of the Walrasian general-equilibrium model imply arbitrary excess demand functions and therefore arbitrary equilibrium price sets. Second, in real economies there is no equilibrium, only continuous change. Equilibrium is never reached because of constant changes in preferences and tastes; technological and organisational innovations; discoveries of new resources and new markets; inaccurate and evolving expectations of businesses, consumers, governments and speculators; changing demand for credit; the entry and exit of firms; the birth, learning and death of citizens; changes in laws and government policies; imperfect information; generalized increasing returns to scale; random acts of impulse; weather and climate events; changes in disease patterns, and so on.

The problem is not the use of mathematical modelling, but the kind of mathematical modelling used. Agent-based models (ABMs), object-oriented programming and greatly increased computer power, however, are opening up a new frontier. Here a dynamic bargaining ABM is outlined as a basis for an alternative theory of value. A large but finite number of heterogeneous commodities and agents with differing degrees of market power are set in a spatial network. Returns to buyers and sellers are decided at each step in the value chain, and in each factor market, through the process of bargaining. Market power and its potential abuse against the poor and vulnerable are fundamental to how the bargaining dynamics play out. Ethics therefore lie at the very heart of economic analysis, the determination of prices and the distribution of wealth. The neoclassicals are right, then, that price is the enumeration of value at a particular time and place, but wrong to downplay the critical roles of bargaining, power and ethics in determining those same prices.
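The abstract only outlines the bargaining ABM, so the sketch below is a deliberately minimal illustration of the general idea, assuming a power-weighted split of a fixed trade surplus between randomly matched agents; the `Agent` class, the `bargain` rule and all parameters are hypothetical and not the paper's specification.

```python
import random

# Toy bargaining ABM sketch (illustrative only): heterogeneous agents with
# differing market power split the surplus from each trade. The power-weighted
# split rule and all parameters are assumptions, not the paper's model.

random.seed(1)

class Agent:
    def __init__(self, power):
        self.power = power      # bargaining power in (0, 1), hypothetical
        self.wealth = 100.0     # starting endowment

def bargain(seller, buyer, surplus=10.0):
    """Split a fixed trade surplus in proportion to relative bargaining power."""
    share = seller.power / (seller.power + buyer.power)
    seller.wealth += surplus * share
    buyer.wealth += surplus * (1.0 - share)

agents = [Agent(power=random.uniform(0.1, 1.0)) for _ in range(200)]

for step in range(1000):
    s, b = random.sample(agents, 2)   # randomly matched trading pair
    bargain(s, b)

wealth = sorted(a.wealth for a in agents)
print("poorest decile mean wealth:", sum(wealth[:20]) / 20)
print("richest decile mean wealth:", sum(wealth[-20:]) / 20)
```

Even this toy version makes the abstract's point visible: agents with more bargaining power accumulate a disproportionate share of the surplus over repeated encounters.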

Relevance: 100.00%

Publisher:

Abstract:

Climate change has resulted in substantial variations in annual extreme rainfall quantiles across different durations and return periods. Predicting future changes in extreme rainfall quantiles is essential for water resources design, assessment and decision-making purposes. Current predictions of future rainfall extremes, however, exhibit large uncertainties. According to extreme value theory, rainfall extremes are random variables, and their estimated quantiles at different return periods are themselves uncertain; therefore there are uncertainties even under current climate conditions. Regarding future conditions, large-scale knowledge is obtained from global climate models forced with particular emission scenarios. There are widely known deficiencies in climate models, particularly with respect to precipitation projections, and the limitations of emission scenarios in representing future global change are also recognised. Apart from these large-scale uncertainties, downscaling methods add further uncertainty to estimates of future extreme rainfall when they convert large-scale projections to the local scale. The aim of this research is to address these uncertainties in future projections of extreme rainfall for different durations and return periods. We combined three emission scenarios with two global climate models and used LARS-WG, a well-known weather generator, to stochastically downscale daily climate-model projections for the city of Saskatoon, Canada, out to 2100. The downscaled projections were further disaggregated to hourly resolution using our new stochastic, non-parametric rainfall disaggregator. Extreme rainfall quantiles can then be identified for different durations (1-, 2-, 4-, 6-, 12-, 18- and 24-hour) and return periods (2-, 10-, 25-, 50- and 100-year) using the Generalized Extreme Value (GEV) distribution. By providing multiple realizations of future rainfall, we attempt to measure the extent of total predictive uncertainty contributed by climate models, emission scenarios, and the downscaling/disaggregation procedures. The results show different proportions of these contributors for different durations and return periods.
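The quantile step at the end of this workflow is a standard GEV fit to a series of annual maxima. The sketch below shows that step on synthetic data; the series, parameter values and the use of scipy's maximum-likelihood fit are illustrative assumptions rather than the study's exact estimation procedure.

```python
import numpy as np
from scipy.stats import genextreme

# Sketch: fit a GEV distribution to annual maximum rainfall depths and read
# off return-level quantiles. The data here are synthetic; the study applies
# this to downscaled and disaggregated series for each duration.

rng = np.random.default_rng(0)
annual_max_mm = genextreme.rvs(c=-0.1, loc=40.0, scale=12.0, size=60,
                               random_state=rng)   # toy 60-year record

shape, loc, scale = genextreme.fit(annual_max_mm)  # maximum-likelihood fit

return_periods = np.array([2, 10, 25, 50, 100])
# T-year return level = quantile with non-exceedance probability 1 - 1/T
return_levels = genextreme.ppf(1.0 - 1.0 / return_periods, shape,
                               loc=loc, scale=scale)

for T, q in zip(return_periods, return_levels):
    print(f"{T:>3}-year rainfall depth (toy 24-hour series): {q:.1f} mm")
```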

Relevance: 100.00%

Publisher:

Abstract:

The aim of this paper is to analyze extremal events using Generalized Pareto Distributions (GPD), considering explicitly the uncertainty about the threshold. Current practice determines this quantity empirically and proceeds by estimating the GPD parameters from the data beyond it, discarding all the information available below the threshold. We introduce a mixture model that combines a parametric form for the center and a GPD for the tail of the distribution, and uses all observations for inference about the unknown parameters of both components, the threshold included. Prior distributions for the parameters are obtained indirectly through elicitation of expert quantiles. Posterior inference is available through Markov Chain Monte Carlo (MCMC) methods. Simulations are carried out to analyze the performance of the proposed model under a wide range of scenarios that approximate realistic situations found in the literature. We also apply the proposed model to a real dataset, the Nasdaq 100, a financial market index that presents many extreme events. Important issues such as predictive analysis and model selection are considered, along with possible modeling extensions.
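For contrast with the proposed mixture model, the fixed-threshold "current practice" the paper criticises can be sketched as a standard peaks-over-threshold fit; the toy Student-t loss series, the 95th-percentile threshold and the closing tail-quantile formula are assumptions for illustration, not the paper's Bayesian MCMC procedure.

```python
import numpy as np
from scipy.stats import genpareto, t

# Peaks-over-threshold sketch: choose a threshold empirically, fit a GPD to
# the exceedances, and estimate a high quantile. The paper instead treats the
# threshold as an unknown parameter of a centre/GPD-tail mixture fitted by
# MCMC; that machinery is not reproduced here.

rng = np.random.default_rng(42)
losses = t.rvs(df=4, size=5000, random_state=rng) * 0.01   # toy daily losses

u = np.quantile(losses, 0.95)             # fixed empirical threshold
exceedances = losses[losses > u] - u

xi, _, beta = genpareto.fit(exceedances, floc=0.0)   # shape xi, scale beta

# Tail quantile at probability p via the usual POT formula:
# q_p = u + (beta / xi) * (((n / n_u) * (1 - p)) ** (-xi) - 1)
p = 0.99
n, n_u = len(losses), len(exceedances)
q_p = u + (beta / xi) * (((n / n_u) * (1.0 - p)) ** (-xi) - 1.0)

print(f"threshold u = {u:.4f}, GPD xi = {xi:.3f}, 99% tail quantile = {q_p:.4f}")
```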

Relevance: 100.00%

Publisher:

Abstract:

Market risk exposure plays a key role in financial institutions' risk management. A possible measure of this exposure is to evaluate the losses likely to be incurred when the prices of the portfolio's assets decline, using Value-at-Risk (VaR) estimates, one of the most prominent measures of financial downside market risk. This paper suggests an evolving possibilistic fuzzy modeling (ePFM) approach for VaR estimation. The approach is based on an extension of possibilistic fuzzy c-means clustering and functional fuzzy rule-based modeling, which employs memberships and typicalities to update clusters and creates new clusters based on a statistical-control, distance-based criterion. ePFM also uses a utility measure to evaluate the quality of the current cluster structure. Computational experiments consider data from the main global equity market indexes of the United States, London, Germany, Spain and Brazil from January 2000 to December 2012 for VaR estimation using ePFM, traditional VaR benchmarks such as Historical Simulation, GARCH, EWMA and Extreme Value Theory, and state-of-the-art evolving approaches. The results show that ePFM is a strong candidate for VaR modeling, with better performance than the alternative approaches.
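The ePFM model itself is not reproduced here, but two of the benchmark estimators it is compared against, Historical Simulation and EWMA, take only a few lines; the toy return series, the 99% confidence level and the 0.94 decay factor are assumptions.

```python
import numpy as np
from scipy.stats import norm

# Sketch of two classical VaR benchmarks: historical simulation and
# EWMA (RiskMetrics-style). The return series is synthetic.

rng = np.random.default_rng(7)
returns = rng.standard_t(df=5, size=2500) * 0.01   # toy daily index returns

alpha = 0.99

# Historical-simulation VaR: empirical quantile of the loss distribution.
var_hs = np.quantile(-returns, alpha)

# EWMA VaR: exponentially weighted volatility times a Gaussian quantile.
lam = 0.94
sigma2 = returns[0] ** 2
for r in returns[1:]:
    sigma2 = lam * sigma2 + (1.0 - lam) * r ** 2
var_ewma = norm.ppf(alpha) * np.sqrt(sigma2)

print(f"1-day 99% VaR, historical simulation: {var_hs:.4f}")
print(f"1-day 99% VaR, EWMA:                  {var_ewma:.4f}")
```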

Relevance: 100.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: Primary 62F35; Secondary 62P99

Relevance: 100.00%

Publisher:

Abstract:

We present a numerical study of the statistical properties of a partially coherent, quasi-CW, high-Q-cavity Raman fiber laser. The statistical properties differ between radiation generated at the centre of the spectrum and at the spectral wings. It is found that rare extreme events are generated at the far spectral wings within a single pass only. The mechanism of extreme-event generation is turbulent-like four-wave mixing of numerous longitudinal generation modes. A similar mechanism of extreme-wave appearance during laser generation could be important in other types of fiber lasers. © 2012 Copyright Society of Photo-Optical Instrumentation Engineers (SPIE).
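As a rough illustration of why radiation built from numerous longitudinal modes shows rare intensity spikes, the toy sketch below superposes many random-phase modes and inspects the resulting intensity statistics; it is a purely linear caricature and does not model the turbulent-like four-wave mixing or the cavity dynamics studied in the paper.

```python
import numpy as np

# Toy sketch of partially coherent quasi-CW radiation: a superposition of many
# longitudinal modes with independent random phases has intensity fluctuations
# with an exponential-like tail, so rare high-intensity spikes occur even in a
# linear field. The spectral envelope width and mode count are assumptions.

rng = np.random.default_rng(3)

n_modes = 4096
freqs = np.fft.fftfreq(n_modes)             # mode index grid
envelope = np.exp(-(freqs / 0.1) ** 2)      # Gaussian spectral envelope
phases = rng.uniform(0.0, 2.0 * np.pi, n_modes)

spectrum = envelope * np.exp(1j * phases)
field = np.fft.ifft(spectrum)               # time-domain mode superposition
intensity = np.abs(field) ** 2
intensity /= intensity.mean()

print("peak intensity / mean intensity:", round(float(intensity.max()), 1))
print("fraction of samples above 5x mean:", float(np.mean(intensity > 5.0)))
```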

Relevance: 100.00%

Publisher:

Abstract:

It is found that rare extreme events are generated in a Raman fiber laser. The mechanism of extreme-event generation is turbulent-like four-wave mixing of numerous longitudinal generation modes. © 2012 OSA.

Relevance: 100.00%

Publisher:

Abstract:

We present a numerical study of the statistical properties of a partially coherent, quasi-CW, high-Q-cavity Raman fiber laser. The statistical properties differ between radiation generated over one and over many cavity passes. It is found that rare extreme events are generated at the far wings of the spectrum. The mechanism of extreme-event generation is turbulent-like four-wave mixing of numerous longitudinal generation modes. © 2011 Optical Society of America.