6 results for probability density function
in AMS Tesi di Laurea - Alm@DL - Università di Bologna
Abstract:
In my work I derive closed-form pricing formulas for volatility-based options by suitably approximating the risk-neutral density function of the volatility process. I exploit and adapt the idea behind popular techniques already employed in the context of equity options, such as Edgeworth and Gram-Charlier expansions: the density of the underlying process is approximated as a sum of particular polynomials weighted by a kernel, which is typically a Gaussian distribution. I propose instead a Gamma kernel, adapting the methodology to the context of volatility options. Closed-form pricing formulas for VIX vanilla options are derived, and their accuracy is tested for the Heston (1993) model as well as for the jump-diffusion SVJJ model proposed by Duffie et al. (2000).
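The kernel-expansion idea above can be illustrated with a small numerical sketch, assuming (for illustration only) a unit-scale Gamma kernel with generalized Laguerre polynomials as the matching orthogonal family; the target density and the payoff below are stand-ins, not the thesis's models:

```python
import numpy as np
from scipy.special import eval_genlaguerre, gammaln
from scipy.integrate import quad
from scipy.stats import gamma as gamma_dist

def laguerre_coeffs(target_pdf, alpha, n_max):
    """Expansion coefficients of target_pdf in a Gamma(alpha+1) kernel times
    generalized Laguerre polynomials: f(x) ~ w(x) * sum_n c_n L_n^(alpha)(x)."""
    coeffs = []
    for n in range(n_max + 1):
        # orthogonality normalization: n! * Gamma(alpha+1) / Gamma(n+alpha+1)
        norm = np.exp(gammaln(n + 1) + gammaln(alpha + 1) - gammaln(n + alpha + 1))
        moment, _ = quad(lambda x: target_pdf(x) * eval_genlaguerre(n, alpha, x),
                         0.0, np.inf, limit=200)
        coeffs.append(norm * moment)
    return np.array(coeffs)

def approx_pdf(x, alpha, coeffs):
    kernel = gamma_dist.pdf(x, a=alpha + 1.0)          # Gamma kernel, scale 1
    series = sum(c * eval_genlaguerre(n, alpha, x) for n, c in enumerate(coeffs))
    return kernel * series

# Illustration: approximate a Gamma(3.5) density with a Gamma(3) kernel (alpha = 2)
alpha = 2.0
target = lambda x: gamma_dist.pdf(x, a=3.5)
c = laguerre_coeffs(target, alpha, n_max=6)

def call_price(K):
    # "price" a vanilla payoff against the approximated density (zero rates)
    val, _ = quad(lambda x: max(x - K, 0.0) * approx_pdf(x, alpha, c),
                  K, np.inf, limit=200)
    return val
```

By construction the leading coefficient equals the total probability mass, and the expansion matches the target's low-order moments, which is what makes closed-form option prices possible once the payoff integrals against the kernel are known analytically.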
Abstract:
The scalar Schrödinger equation models the probability density for a particle to be found at a point x, given a potential V(x) forming a well with respect to a fixed energy level E_0. Formally, two real turning points a, b exist such that V(a) = V(b) = E_0, with V(x) < E_0 in (a, b) and V(x) > E_0 for x < a and x > b. Following the work of D. Yafaev and performing a WKB approximation, we obtain solutions defined on specific intervals. The aim of the first part of the thesis is to find a condition on E, belonging to a neighbourhood of E_0, under which E is an eigenvalue of the Schrödinger operator, thereby obtaining global, linearly dependent solutions in L2. In quantum mechanics this condition is known as the Bohr-Sommerfeld quantization. In the second part we define a Schrödinger operator with two potential wells and study the quantization conditions on E required for a global solution in L2×L2, with respect to the mutual position of the potentials: the wells can be disjoint, can intersect, can be included one in the other, or can intersect in a single point. For these cases we refer to the works of A. Martinez, S. Fujiié, T. Watanabe, and S. Ashida.
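A minimal numerical sketch of a Bohr-Sommerfeld condition of the kind mentioned above, using a harmonic well V(x) = x² as a stand-in potential (units 2m = ħ = 1; both the potential and the normalization are assumptions for illustration, not the thesis's setting):

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

def action(E):
    """Classical action over one cycle, 2 * integral_a^b sqrt(E - V(x)) dx,
    for the harmonic well V(x) = x**2 with turning points a, b = -/+ sqrt(E)."""
    a, b = -np.sqrt(E), np.sqrt(E)
    I, _ = quad(lambda x: np.sqrt(max(E - x**2, 0.0)), a, b)
    return 2.0 * I

# Bohr-Sommerfeld condition: action(E_n) = (n + 1/2) * 2*pi, n = 0, 1, 2, ...
levels = [brentq(lambda E, n=n: action(E) - (n + 0.5) * 2.0 * np.pi, 1e-9, 100.0)
          for n in range(4)]
# for the harmonic well the condition happens to reproduce the exact
# eigenvalues E_n = 2n + 1 of -psi'' + x**2 psi = E psi
```

For a general single well the same recipe applies with numerically determined turning points a, b; the harmonic case is chosen here only because its answer is known in closed form.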
Abstract:
Particle concentration is a principal factor affecting the erosion rate of solid surfaces under particle impact, for example at pipe bends in pneumatic conveyors; it is well known that a reduction in the specific erosion rate occurs at high particle concentrations, a phenomenon referred to as the "shielding effect". The cause of shielding is believed to be an increased likelihood of inter-particle collisions, the high collision probability between incoming and rebounding particles reducing the frequency and severity of particle impacts on the target surface. In this study, the effects of particle concentration on erosion of a mild steel bend surface have been investigated in detail using three different particulate materials on an industrial-scale pneumatic conveying test rig. The materials were chosen so that two had the same particle density but very different particle sizes, whereas two had very similar particle sizes but very different particle densities. Experimental results confirm the shielding effect at high particle concentration and show that particle density has a far more significant influence than particle size on the magnitude of the shielding effect. A new method of correcting for the change in erosivity of the particles under repeated handling, so as to take this factor out of the data, has been established and appears to be successful. Moreover, a novel empirical model of the shielding effect has been used, in terms of an erosion resistance that appears to decrease linearly as the particle concentration decreases. With the model it is possible to find the specific erosion rate as the particle concentration tends to zero and, conversely, to predict how the specific erosion rate changes at finite particle concentrations; this is critical for predicting component life from erosion tester results, as the variation of the shielding effect with concentration differs between these two scenarios.
In addition, a previously unreported phenomenon has been recorded: a particulate material whose erosivity steadily increased during repeated impacts.
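The linear erosion-resistance model described above can be sketched as follows; the function names and the synthetic data are illustrative assumptions, not the study's measurements:

```python
import numpy as np

def fit_shielding(conc, rate):
    """Fit the hypothetical linear model R(c) = R0 + k*c, where the erosion
    resistance R is taken as the reciprocal of the specific erosion rate."""
    R = 1.0 / np.asarray(rate, dtype=float)
    k, R0 = np.polyfit(np.asarray(conc, dtype=float), R, 1)
    return R0, k

def erosion_rate_at(conc, R0, k):
    """Specific erosion rate predicted at a given particle concentration;
    the limit conc -> 0 gives the unshielded rate 1/R0."""
    return 1.0 / (R0 + k * conc)

# synthetic data exactly consistent with the model (illustrative numbers only)
conc = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
rate = 1.0 / (2.0 + 0.5 * conc)
R0, k = fit_shielding(conc, rate)
unshielded = erosion_rate_at(0.0, R0, k)
```

This captures the two uses named in the abstract: extrapolating tester data to zero concentration, and predicting the shielded rate at finite concentrations.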
Abstract:
Finding the optimal location for placing a dam on a river is usually a complicated process, and one that generally forces thousands of people to flee their homes because these will be inundated during the filling of the reservoir. Dams may also attract people to the surrounding area after their construction. The goal of this research is to test this attractiveness by comparing growth rates of population density in surrounding areas after dam construction with those of the period preceding construction. To this aim, 1859 dams across the United States of America and high-resolution population distributions from 1790 to 2010 are examined. Grouping dams by their main purpose, water supply dams are found to be, as expected, the most attractive to people, with the largest growth in population density. Irrigation dams come next, followed by hydroelectricity, flood control, navigation, and finally recreation dams. Fishery dams and dams for other uses suffered a decrease in population in the years after their construction. The regions with the greatest population growth were found approximately 40-45 km from the dam and at distances greater than 90 km, whereas the regions with the greatest population decline, or only a modest gain, were located within 10-15 km of the dam.
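The pre/post-construction comparison of growth rates can be sketched as follows; the log-growth definition and the synthetic decadal series are assumptions for illustration, not the study's data:

```python
import numpy as np

def growth_rate(years, pop):
    """Average annual logarithmic growth of population density over a period."""
    years, pop = np.asarray(years, float), np.asarray(pop, float)
    return (np.log(pop[-1]) - np.log(pop[0])) / (years[-1] - years[0])

def attractiveness(years, pop, build_year):
    """Post- minus pre-construction growth rate (positive = dam attracts people)."""
    years, pop = np.asarray(years, float), np.asarray(pop, float)
    pre, post = years <= build_year, years >= build_year
    return growth_rate(years[post], pop[post]) - growth_rate(years[pre], pop[pre])

# synthetic decadal series: 1% annual growth before 1950, 3% after (illustrative)
years = np.arange(1900, 2001, 10)
pop = np.where(years <= 1950,
               np.exp(0.01 * (years - 1900)),
               np.exp(0.01 * 50 + 0.03 * (years - 1950)))
delta = attractiveness(years, pop, build_year=1950)
```

Averaging this difference over dams grouped by purpose (and by distance band) would give the kind of ranking reported in the abstract.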
Abstract:
Alpha oscillatory activity has long been associated with perceptual and cognitive processes related to attention control. The aim of this study is to explore the task-dependent role of alpha frequency in a lateralized visuo-spatial detection task. Specifically, the thesis focuses on consolidating what the scientific literature knows about the role of alpha frequency in perceptual accuracy, and on deepening the understanding of what determines trial-by-trial fluctuations of alpha parameters and how these fluctuations influence overall task performance. The hypotheses, confirmed empirically, were that different implicit strategies are deployed depending on the task context, in order to maximize performance through an optimal distribution of resources (namely alpha frequency, which is positively associated with performance): "lateralization" of attentional resources towards one hemifield should be associated with a higher alpha frequency difference between the contralateral and ipsilateral hemispheres, whereas "distribution" of attentional resources across hemifields should be associated with a lower alpha frequency difference between hemispheres. These strategies, adopted by participants according to their brain capabilities, proved adaptive or maladaptive depending on the task: "distribution" of attentional resources appeared to be the best strategy when the probability distribution between hemifields was balanced (the neutral condition task), while "lateralization" appeared more effective when that probability was biased towards one hemifield (the biased condition task).
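One plausible way to quantify the hemispheric alpha frequency difference discussed above, sketched with a standard Welch spectral estimate; the index name and the synthetic signals are assumptions for illustration, not the thesis's exact analysis pipeline:

```python
import numpy as np
from scipy.signal import welch

def peak_alpha_frequency(x, fs, band=(8.0, 13.0)):
    """Peak frequency of the Welch power spectrum within the alpha band."""
    f, pxx = welch(x, fs=fs, nperseg=int(4 * fs))
    in_band = (f >= band[0]) & (f <= band[1])
    return f[in_band][np.argmax(pxx[in_band])]

def alpha_lateralization(x_contra, x_ipsi, fs):
    """Contralateral minus ipsilateral alpha peak frequency (hypothetical index)."""
    return peak_alpha_frequency(x_contra, fs) - peak_alpha_frequency(x_ipsi, fs)

# synthetic signals: pure 10.5 Hz and 10 Hz oscillations (illustrative only)
fs = 250.0
t = np.arange(0.0, 20.0, 1.0 / fs)
contra = np.sin(2 * np.pi * 10.5 * t)
ipsi = np.sin(2 * np.pi * 10.0 * t)
```

Under the "lateralization" hypothesis this index would be larger on biased-condition trials than on neutral-condition ones.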
Abstract:
Dwarf galaxies often experience gravitational interactions with more massive companions. These interactions can deform galaxies, switch star formation on or off, or give rise to mass loss phenomena. In this thesis we propose to study, through N-body simulations, the stellar mass loss suffered by the dwarf spheroidal galaxy (dSph) Fornax orbiting in the Milky Way gravitational potential. This is a key phenomenon for the mass budget problem: the Fornax globular clusters together have a stellar mass comparable to that of Fornax itself. If we look at the stellar populations they are made of and apply the scenarios of stellar population formation, we find that they must originally have been >= 5 times more massive. They must therefore have lost or ejected stars through dynamical interactions. However, as presented in Larsen et al. (2012), field stars alone are not sufficient to explain this scenario. We may assume that some of those stars fell into Fornax and were later stripped by the Milky Way. To study this solution we built several illustrative single-component simulations with a tabulated density model, using the P07ecc orbit studied by Battaglia et al. (2015). To divide the single component into stellar and dark matter components we defined a posteriori a probability function P(E), where E is the initial energy of the particles, associating each particle with a fraction of stellar and of dark matter mass. In this way we built stellar density profiles without repeating the simulations. We applied the method to Fornax, using the density profile tables obtained in Pascale et al. (2018) as observational constraints and to build the model. The results confirm those previously obtained with less flexible models by Battaglia et al. (2015): they show a stellar mass loss < 4% within 1.6 kpc and negligible within 3 kpc, too small to solve the mass budget problem.
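The a posteriori energy-based tagging described above can be sketched as follows; the logistic form of P(E) and the toy particle data are assumptions for illustration, not the thesis's fitted function:

```python
import numpy as np

def stellar_fraction(E, E0, sigma):
    """Hypothetical P(E): probability that a particle of initial energy E is
    stellar; stars preferentially occupy the most bound (lowest-E) orbits."""
    return 1.0 / (1.0 + np.exp((E - E0) / sigma))

def stellar_density_profile(r, m, p_star, edges):
    """Weight each particle's mass by P(E) and bin in spherical shells."""
    mass, _ = np.histogram(r, bins=edges, weights=m * p_star)
    shell_vol = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
    return mass / shell_vol

# toy "simulation": particles with made-up radii and energies (illustrative)
rng = np.random.default_rng(1)
r = rng.uniform(0.05, 3.0, 5000)
E = -1.0 / r + 0.1 * rng.standard_normal(5000)   # more bound at small radii
p = stellar_fraction(E, E0=-1.0, sigma=0.3)
edges = np.linspace(0.0, 3.0, 13)
rho_star = stellar_density_profile(r, np.full(5000, 1.0), p, edges)
```

Because the split is applied after the fact, different choices of P(E) yield different stellar profiles from a single simulation, which is the flexibility the abstract highlights.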