974 results for Percolation probability
Abstract:
2000 Mathematics Subject Classification: 60J65.
Abstract:
In this note we discuss upper and lower bounds for the ruin probability in an insurance model with very heavy-tailed claims and interarrival times.
Abstract:
2000 Mathematics Subject Classification: 33C90, 62E99
Abstract:
2000 Mathematics Subject Classification: 62G07, 62L20.
Abstract:
The three concepts mentioned in the title occupy a central place in economic theory. Their relationship primarily tests the limits of what economics can know. What do we know about economic decisions? On what information are decisions based? Can economic decisions be placed on a "scientific" footing? Everything has been said about the question of uncertainty since it appeared in the 1920s. The question has been examined philosophically and mathematically. Its countless theoretical and practical aspects have been discussed. Why, then, return to the topic yet again? The answer is quite simple: because the question is genuinely fundamental in every respect and is relevant at all times. It is said that in Roman triumphal processions a slave always rode on the victor's chariot, continually reminding the leader, intoxicated by triumph, not to forget that he too was only a man. Economic decision-makers must likewise be reminded again and again that economic decisions are made under uncertainty. There is a very tight limit to how far economic processes can be understood and controlled. That limit is set by the inherent uncertainty of the processes. One must keep whispering in the ears of economic decision-makers: they too are only human, and their knowledge is therefore very limited. In "bold" decisions the outcome is uncertain, but error can be taken as certain. / === / In the article the author presents some remarks on the application of probability theory in financial decision making. From a mathematical point of view, the risk-neutral measures used in finance are a version of the separating hyperplanes used in optimization theory and in general equilibrium theory. They are therefore probabilities only in a formal sense. Interpreting them as probabilities is a misleading analogy that leads to wrong decisions.
Abstract:
There is a long-running debate (going back to Keynes) about how to interpret the concept of probability in economics, in business decisions, and in finance. Iván Bélyácz suggested that the Black–Scholes–Merton analysis of financial derivatives contributes to this risk vs. uncertainty debate. This article tries to interpret that suggestion from the viewpoint of traded options, real options, the Arrow–Debreu model, the Heath–Jarrow–Morton model, and the insurance business. The article suggests making a clear distinction, and using different names, ● when the frequentist approach and statistics are relevant, ● when we merely use consistent relative weights during no-arbitrage pricing, and these weights are only interpreted as probabilities, ● when we simply lack the necessary information, and there is a basic uncertainty in the business decision-making process. The paper also suggests making a sharp distinction between financial derivatives used for market risk management and credit-risk-type derivatives (CDOs, CDSs, etc.) in the re-regulation process of the financial markets.
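To make the middle case concrete, a minimal one-period binomial sketch (the numbers and function names are illustrative, not taken from the article) shows how the risk-neutral weight q is pinned down by prices and the riskless rate alone, so that it is a probability only in the formal sense discussed above:

```python
# Hypothetical one-period binomial illustration; u, d, r and the option below are made up.
def risk_neutral_weight(u: float, d: float, r: float) -> float:
    """No-arbitrage weight q (requires d < 1 + r < u); fixed by prices, not by frequencies."""
    return (1.0 + r - d) / (u - d)

def call_price(s0: float, k: float, u: float, d: float, r: float) -> float:
    """One-period call price as a q-weighted, discounted payoff."""
    q = risk_neutral_weight(u, d, r)
    up, down = max(s0 * u - k, 0.0), max(s0 * d - k, 0.0)
    return (q * up + (1.0 - q) * down) / (1.0 + r)

# q depends only on u, d and r; the real-world chance of an up-move never enters.
print(risk_neutral_weight(u=1.2, d=0.9, r=0.05))            # 0.5
print(call_price(s0=100.0, k=100.0, u=1.2, d=0.9, r=0.05))  # about 9.52
```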
Abstract:
In this study, I determined the identity, taxonomic placement, and distribution of digenetic trematodes parasitizing the snails Pomacea paludosa and Planorbella duryi at Pa-hay-okee, Everglades National Park. I also characterized temporal and geographic variation in the probability of parasite infection for these snails based on two years of sampling. Although studies indicate that digenean parasites may have important effects both on individual species and on the structure of communities, there have been no studies of digenean parasitism on snails within the Everglades ecosystem. For example, the endangered Everglade Snail Kite, a specialist that feeds almost exclusively on Pomacea paludosa and is known to be a definitive host of digenean parasites, may suffer direct and indirect effects from consumption of parasitized apple snails. Therefore, information on the diversity and abundance of parasites harbored in snail populations in the Everglades should be of considerable interest for the management and conservation of wildlife. Juvenile digeneans (cercariae) representing 20 species were isolated from these two snails, a fourfold increase in the number of species known. Species were characterized based on morphological, morphometric, and sequence data (18S rDNA, COI, and ITS). Species richness of shed cercariae was greater for P. duryi than for P. paludosa, with 13 and 7 species respectively. These species represented 14 families. P. paludosa and P. duryi had no digenean species in common. The probability of digenean infection was higher for P. duryi than for P. paludosa, and adults showed a greater risk of infection than juveniles for both snails. Planorbella duryi showed variation in the probability of infection between sampling sites and hydrological seasons. The number of unique combinations of multi-species infections was greatest among P. duryi individuals, while the overall percentage of multi-species infections was greatest in P. paludosa. Analyses of six frequently observed multiple infections from P. duryi suggest the presence of negative interactions, positive interactions, and neutral associations between larval digeneans. These results should contribute to an understanding of the factors controlling the abundance and distribution of key species in the Everglades ecosystem and may in particular help in management and recovery planning for the Everglade Snail Kite.
Abstract:
Various physical systems have dynamics that can be modeled by percolation processes. Percolation is used to study issues ranging from fluid diffusion through disordered media to the fragmentation of a computer network caused by hacker attacks. A common feature of all of these systems is the presence of two non-coexistent regimes associated with certain properties of the system. For example, a disordered medium may or may not allow the fluid to flow, depending on its porosity. The change from one regime to the other characterizes the percolation phase transition. The standard way of analyzing this transition uses the order parameter, a variable related to some characteristic of the system that is zero in one of the regimes and nonzero in the other. The proposal introduced in this thesis is that this phase transition can be investigated without explicit use of the order parameter, through the Shannon entropy instead. This entropy is a measure of the degree of uncertainty in the information content of a probability distribution. The proposal is evaluated in the context of cluster formation in random graphs, and we apply the method to both classical (Erdős–Rényi) percolation and explosive percolation. It is based on computing the entropy contained in the cluster-size probability distribution, and the results show that the critical point of the transition is related to the derivatives of the entropy. Furthermore, the difference between the smooth and abrupt character of the classical and explosive percolation transitions, respectively, is reinforced by the observation that the entropy has a maximum at the critical point of the classical transition, while no such correspondence occurs for explosive percolation.
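As an illustration of the approach described in this abstract, the sketch below (an assumption-laden reconstruction, not the thesis code) samples Erdős–Rényi graphs G(n, p) around the percolation threshold and computes the Shannon entropy of the cluster-size distribution, here taken as the distribution of the size of the component containing a uniformly chosen node; the graph size and mean degrees are illustrative choices:

```python
# Minimal sketch: Shannon entropy of the cluster-size distribution in Erdos-Renyi percolation.
import math
import random

def er_cluster_sizes(n: int, p: float) -> list[int]:
    """Sample G(n, p) and return the connected-component sizes (union-find with path halving)."""
    parent = list(range(n))
    def find(x: int) -> int:
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for i in range(n):
        for j in range(i + 1, n):
            if random.random() < p:
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj
    counts: dict[int, int] = {}
    for i in range(n):
        counts[find(i)] = counts.get(find(i), 0) + 1
    return list(counts.values())

def cluster_size_entropy(sizes: list[int]) -> float:
    """Entropy of p(s) = fraction of nodes lying in components of size s."""
    n = sum(sizes)
    mass: dict[int, int] = {}
    for s in sizes:
        mass[s] = mass.get(s, 0) + s
    return -sum((m / n) * math.log(m / n) for m in mass.values())

n = 1000
for c in (0.5, 1.0, 1.5, 2.0):                 # mean degree; the ER transition sits at c = 1
    h = cluster_size_entropy(er_cluster_sizes(n, c / n))
    print(f"mean degree {c:.1f}: entropy ~ {h:.3f}")
```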
Abstract:
7 pages, 6 figures
Field data, numerical simulations and probability analyses to assess lava flow hazards at Mount Etna
Abstract:
Improving lava flow hazard assessment is one of the most important and challenging fields of volcanology, and it has an immediate and practical impact on society. Here, we present a methodology for the quantitative assessment of lava flow hazards based on a combination of field data, numerical simulations and probability analyses. With the extensive data available on historic eruptions of Mt. Etna, going back over 2000 years, it has been possible to construct two hazard maps, one for flank eruptions and the other for summit eruptions, allowing a quantitative analysis of the most likely future courses of lava flows. The effective use of hazard maps of Etna may help minimize the damage from volcanic eruptions through correct land use in a densely urbanized area with a population of almost one million people. Although this study was conducted on Mt. Etna, the approach is designed to be applicable to other volcanic areas.
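As a schematic illustration of how eruption statistics and flow simulations can be combined into a single hazard value (the vent names and numbers below are placeholders, not results from the paper), the inundation probability at a map cell can be written as a sum over candidate vents of the eruption probability of each vent times the simulated probability that a flow from that vent reaches the cell:

```python
# Hedged sketch with placeholder figures: combining eruption and inundation probabilities.
p_eruption = {"vent_A": 0.05, "vent_B": 0.02, "vent_C": 0.01}   # probability of eruption per vent
p_reach    = {"vent_A": 0.30, "vent_B": 0.60, "vent_C": 0.05}   # simulated P(flow reaches cell | vent)

hazard = sum(p_eruption[v] * p_reach[v] for v in p_eruption)
print(f"probability of lava inundation at this cell: {hazard:.4f}")   # 0.0275
```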
Abstract:
In this study, the authors propose simple methods to evaluate the achievable rates and outage probability of a cognitive radio (CR) link, taking into account imperfect spectrum sensing. In the considered system, the CR transmitter and receiver correlatively sense and dynamically exploit the spectrum pool via dynamic frequency hopping. Under imperfect spectrum sensing, false alarms and missed detections occur, causing impulsive interference that arises from collisions due to the simultaneous spectrum access of primary and cognitive users. This makes it very challenging to evaluate the achievable rates. By first examining the static link, where the channel is assumed to be constant over time, they show that the achievable rate using a Gaussian input can be calculated accurately through a simple series representation. In the second part of the study, they extend the calculation of the achievable rate to wireless fading environments. To take the effect of fading into account, they introduce a piecewise-linear curve-fitting method to approximate the instantaneous achievable rate curve as a combination of linear segments. It is then demonstrated that the ergodic achievable rate in fast fading and the outage probability in slow fading can be calculated to any given accuracy level.
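A small numerical sketch of the piecewise-linear idea (an illustration under a Rayleigh-fading assumption, not the authors' exact formulation): the instantaneous rate curve R(g) = log2(1 + SNR·g) over the channel power gain g is replaced by a few linear segments, and the ergodic rate is then estimated by averaging the fit over fading samples:

```python
# Hedged sketch: piecewise-linear approximation of the rate curve, averaged over fading.
import numpy as np

def piecewise_linear_fit(x, y, knots):
    """Slopes and intercepts of linear segments joining the curve values at the knots."""
    yk = np.interp(knots, x, y)
    slopes = np.diff(yk) / np.diff(knots)
    intercepts = yk[:-1] - slopes * knots[:-1]
    return slopes, intercepts

def evaluate_piecewise(g, knots, slopes, intercepts):
    """Evaluate the fitted segments at the channel gains g."""
    idx = np.clip(np.searchsorted(knots, g) - 1, 0, len(slopes) - 1)
    return slopes[idx] * g + intercepts[idx]

snr = 10.0                                     # linear SNR, illustrative value
g = np.linspace(0.0, 10.0, 2001)               # channel power-gain grid
rate = np.log2(1.0 + snr * g)                  # instantaneous achievable rate
knots = np.linspace(0.0, 10.0, 9)              # 8 linear segments
slopes, intercepts = piecewise_linear_fit(g, rate, knots)

rng = np.random.default_rng(0)
gains = rng.exponential(1.0, 100_000)          # Rayleigh fading -> exponential power gain
gains = np.minimum(gains, knots[-1])           # keep samples inside the fitted range
approx = evaluate_piecewise(gains, knots, slopes, intercepts).mean()
exact = np.log2(1.0 + snr * gains).mean()
print(f"ergodic rate: exact ~ {exact:.3f}, piecewise-linear ~ {approx:.3f} bits/s/Hz")
```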
Abstract:
The application of custom classification techniques and posterior probability modeling (PPM) using Worldview-2 multispectral imagery to archaeological field survey is presented in this paper. Research is focused on the identification of Neolithic felsite stone tool workshops in the North Mavine region of the Shetland Islands in Northern Scotland. Sample data from known workshops surveyed using differential GPS are used alongside known non-sites to train a linear discriminant analysis (LDA) classifier based on a combination of datasets including Worldview-2 bands, band difference ratios (BDR) and topographical derivatives. Principal components analysis is further used to test and reduce dimensionality caused by redundant datasets. Probability models were generated by LDA using principal components and tested with sites identified through geological field survey. Testing demonstrates the prospective ability of this technique, with significance between 0.05 and 0.01 and gain statistics between 0.90 and 0.94, higher than those obtained using maximum likelihood and random forest classifiers. Results suggest that this approach is best suited to relatively homogeneous site types, and performs better with correlated data sources. Finally, by combining posterior probability models and least-cost analysis, a survey least-cost efficacy model is generated, showing the utility of such approaches to archaeological field survey.
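For readers unfamiliar with the workflow, the sketch below outlines the PCA-plus-LDA posterior probability step in scikit-learn; the feature stack, sample sizes, component count and threshold are placeholders and the data are synthetic, so this is only an illustration of the approach, not the authors' implementation:

```python
# Hedged sketch: PCA to collapse correlated predictors, LDA for posterior probability modelling.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)

# Placeholder training data: rows = surveyed locations, columns = Worldview-2 bands,
# band-difference ratios and topographic derivatives (synthetic stand-ins here).
X_train = rng.normal(size=(120, 15))
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)   # 1 = known workshop, 0 = non-site

model = make_pipeline(
    PCA(n_components=6),                 # drop redundant, correlated dimensions
    LinearDiscriminantAnalysis(),
)
model.fit(X_train, y_train)

# Score every cell of the study area with the same 15 predictors and keep the posterior
# probability of the "site" class; high-probability cells would be targeted in field survey.
X_cells = rng.normal(size=(10_000, 15))
p_site = model.predict_proba(X_cells)[:, 1]
print(f"cells with P(site) > 0.8: {(p_site > 0.8).sum()}")
```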