5 results for multifractional Brownian motion

at Université de Lausanne, Switzerland


Relevance:

80.00%

Publisher:

Abstract:

In the traditional actuarial risk model, if the surplus is negative, the company is ruined and has to go out of business. In this paper we distinguish between ruin (negative surplus) and bankruptcy (going out of business), where the probability of bankruptcy is a function of the level of negative surplus. The idea for this notion of bankruptcy comes from the observation that in some industries, companies can continue doing business even though they are technically ruined. Assuming that dividends can only be paid with a certain probability at each point in time, we derive closed-form formulas for the expected discounted dividends until bankruptcy under a barrier strategy. Subsequently, the optimal barrier is determined, and several explicit identities for the optimal value are found. The surplus process of the company is modeled by a Wiener process (Brownian motion).
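The paper derives closed-form formulas; the sketch below is only a rough Monte Carlo illustration of the quantity being valued. The parameter values, the linear bankruptcy-rate function, and the Poisson timing of dividend opportunities are illustrative assumptions, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def discounted_dividends(x0=2.0, barrier=5.0, mu=0.5, sigma=1.0,
                         delta=0.03, gamma_div=1.0, omega=2.0,
                         dt=0.01, t_max=30.0):
    """One simulated path of discounted dividends until bankruptcy.

    The surplus follows a Brownian motion with drift.  Whenever it exceeds the
    barrier, the excess can be paid as a dividend, but only with probability
    gamma_div*dt per step (dividend dates arriving at Poisson times).  While
    the surplus is negative, bankruptcy is declared with probability
    omega*|surplus|*dt per step (an illustrative bankruptcy-rate function).
    """
    x, t, total = x0, 0.0, 0.0
    sqdt = np.sqrt(dt)
    while t < t_max:
        x += mu * dt + sigma * sqdt * rng.standard_normal()
        t += dt
        if x > barrier and rng.random() < gamma_div * dt:   # dividend opportunity
            total += np.exp(-delta * t) * (x - barrier)     # pay out the excess
            x = barrier
        if x < 0 and rng.random() < omega * (-x) * dt:      # bankruptcy check
            break
    return total

est = np.mean([discounted_dividends() for _ in range(1000)])
print(f"Monte Carlo estimate of discounted dividends: {est:.3f}")
```

In the paper itself the closed-form formulas replace such a simulation and also pin down the optimal barrier analytically.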

Relevance:

80.00%

Publisher:

Abstract:

The paper is motivated by the valuation problem of guaranteed minimum death benefits in various equity-linked products. At the time of death, a benefit payment is due. It may depend not only on the price of a stock or stock fund at that time, but also on prior prices. The problem is to calculate the expected discounted value of the benefit payment. Because the distribution of the time of death can be approximated by a combination of exponential distributions, it suffices to solve the problem for an exponentially distributed time of death. The stock price process is assumed to be the exponential of a Brownian motion plus an independent compound Poisson process whose upward and downward jumps are modeled by combinations (or mixtures) of exponential distributions. Results for exponential stopping of a Lévy process are used to derive a series of closed-form formulas for call, put, lookback, and barrier options, dynamic fund protection, and dynamic withdrawal benefit with guarantee. We also discuss how barrier options can be used to model lapses and surrenders.
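As a rough illustration of the valuation problem (not the paper's closed-form approach), the sketch below simulates the described stock price model (the exponential of a Brownian motion plus a compound Poisson process with exponential up and down jumps), stops it at an independent exponentially distributed time of death, and estimates the discounted value of a call-type benefit by Monte Carlo. All parameter values and the function name gmdb_call_mc are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def gmdb_call_mc(s0=100.0, k=100.0, mu=0.04, sigma=0.2,
                 lam=0.5, p_up=0.4, eta_up=10.0, eta_dn=8.0,
                 death_rate=0.05, delta=0.03, n_paths=50_000):
    """Monte Carlo value of a call-type death benefit max(S_T - K, 0),
    discounted at rate delta, with T ~ Exp(death_rate) independent of S.

    log S_t = log s0 + mu*t + sigma*W_t + compound Poisson jumps whose sizes
    are +Exp(eta_up) with probability p_up and -Exp(eta_dn) otherwise.
    """
    t = rng.exponential(1.0 / death_rate, n_paths)           # time of death
    diffusion = mu * t + sigma * np.sqrt(t) * rng.standard_normal(n_paths)
    n_jumps = rng.poisson(lam * t)                           # jumps before death
    jump_sum = np.zeros(n_paths)
    for i, n in enumerate(n_jumps):
        if n:
            up = rng.random(n) < p_up
            sizes = np.where(up,
                             rng.exponential(1.0 / eta_up, n),
                             -rng.exponential(1.0 / eta_dn, n))
            jump_sum[i] = sizes.sum()
    s_t = s0 * np.exp(diffusion + jump_sum)
    payoff = np.maximum(s_t - k, 0.0)
    return np.mean(np.exp(-delta * t) * payoff)

print(f"MC value of the death benefit: {gmdb_call_mc():.3f}")
```

The exponential stopping time is what makes the closed-form results in the paper possible; the simulation only serves to make the setup concrete.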

Relevance:

80.00%

Publisher:

Abstract:

Purpose - The purpose of this paper is to analyze what transaction costs are acceptable to customers in different investments. In this study, two life insurance contracts, a mutual fund and a risk-free investment are considered as alternative investment forms. The first two products under scrutiny are a life insurance investment with a point-to-point capital guarantee and a participating contract with an annual interest rate guarantee and participation in the insurer's surplus. The policyholder assesses the various investment opportunities using different utility measures. For selected types of risk profiles, the utility position and the investor's preference for the various investments are assessed. Based on this analysis, the authors study which cost levels can make all of the products equally rewarding for the investor. Design/methodology/approach - The analysis rests on risk-neutral valuation, calibration with empirical data, utility and performance measurement, geometric Brownian motion for the underlying dynamics, and numerical examples obtained via Monte Carlo simulation. Findings - In a first step, the financial performance of the various saving opportunities is studied under different assumptions about the investor's utility measurement. In a second step, the authors calculate the level of transaction costs allowed in the various products so that all of the investment opportunities are equally rewarding from the investor's point of view. A comparison of these results with transaction costs that are common in the market shows that insurance companies must be careful about the level of transaction costs they pass on to their customers if they are to provide attractive payoff distributions. Originality/value - To the best of the authors' knowledge, their research question - i.e. which transaction costs for life insurance products would be acceptable from the customer's point of view - has not been studied in the context described above.
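The break-even-cost idea can be sketched with a small experiment: simulate terminal payoffs under geometric Brownian motion, evaluate them with a utility function, and bisect for the cost level at which the guaranteed product and a cost-free fund investment are equally attractive. The parameters, the CRRA utility choice, and the simple premium-floor guarantee below are assumptions for illustration, not the paper's calibration or contract designs.

```python
import numpy as np

rng = np.random.default_rng(2)
Z = rng.standard_normal(200_000)        # common random numbers for all evaluations

def expected_utility(cost, guarantee, premium=100.0, T=10.0,
                     mu=0.06, sigma=0.2, gamma=2.0):
    """Expected CRRA utility (risk aversion gamma) of the terminal payoff when a
    proportional up-front cost is deducted from the premium.  With guarantee=True
    the payoff is floored at the premium (point-to-point capital guarantee).
    """
    invested = premium * (1.0 - cost)
    s_T = invested * np.exp((mu - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
    payoff = np.maximum(s_T, premium) if guarantee else s_T
    return np.mean(payoff ** (1.0 - gamma) / (1.0 - gamma))

# Bisect for the cost level at which the guaranteed product is exactly as
# rewarding as a cost-free fund investment for this investor.
target = expected_utility(0.0, guarantee=False)
lo, hi = 0.0, 0.5
for _ in range(40):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if expected_utility(mid, guarantee=True) > target else (lo, mid)
print(f"Break-even transaction cost: {0.5 * (lo + hi):.2%}")
```

Using common random numbers keeps the utility monotone in the cost level, so the bisection is well defined; the paper's actual comparison additionally uses risk-neutral valuation to price the guarantees consistently.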

Relevance:

20.00%

Publisher:

Abstract:

Purpose: To investigate the accuracy of four clinical instruments in the detection of glaucomatous damage. Methods: 102 eyes of 55 test subjects (mean age = 66.5 years, range 39 to 89) underwent Heidelberg Retinal Tomography (HRT III; disc area < 2.43) and standard automated perimetry (SAP) with Octopus (Dynamic strategy), Pulsar (TOP strategy) and the Moorfields Motion Displacement Test (MDT, ESTA strategy). Eyes were separated into three groups: 1) Healthy (H): IOP < 21 mmHg and healthy discs on clinical examination (39 subjects, 78 eyes); 2) Glaucoma suspect (GS): suspicious discs on clinical examination (12 subjects, 15 eyes); 3) Glaucoma (G): progressive structural or functional loss (14 subjects, 20 eyes). Clinical diagnostic precision was examined using the cut-off associated with the p < 5% normative limit of the MD (Octopus/Pulsar), PTD (MDT) and MRA (HRT) analyses. Sensitivity, specificity and accuracy were calculated for each instrument. Results: See table. Conclusions: Despite the advantage of defining glaucoma suspects using clinical optic disc examination, the HRT did not yield significantly higher accuracy than the functional measures. HRT, MDT and Octopus SAP yielded higher accuracy than Pulsar perimetry, although the results did not reach statistical significance. Further studies are required to investigate the structure-function correlations between these instruments.
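For reference, the per-instrument sensitivity, specificity and accuracy follow from a simple cut-off classification against the clinical labels. The sketch below shows that computation on toy data; the values, the >= cutoff convention and the variable names are illustrative, not the study's measurements or normative limits.

```python
import numpy as np

def diagnostic_metrics(values, is_glaucoma, cutoff):
    """Sensitivity, specificity and accuracy for a test in which values at or
    beyond the cutoff are classified as glaucomatous."""
    values = np.asarray(values, dtype=float)
    is_glaucoma = np.asarray(is_glaucoma, dtype=bool)
    positive = values >= cutoff
    tp = np.sum(positive & is_glaucoma)      # true positives
    fn = np.sum(~positive & is_glaucoma)     # false negatives
    tn = np.sum(~positive & ~is_glaucoma)    # true negatives
    fp = np.sum(positive & ~is_glaucoma)     # false positives
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / len(values)
    return sensitivity, specificity, accuracy

# Toy example: hypothetical mean-defect values (dB) and clinical labels
md = [0.5, 1.2, 3.8, 6.1, 0.9, 4.4]
glaucoma = [False, False, True, True, False, True]
print(diagnostic_metrics(md, glaucoma, cutoff=2.0))
```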

Relevance:

20.00%

Publisher:

Abstract:

The enhanced functional sensitivity offered by ultra-high field imaging may significantly benefit simultaneous EEG-fMRI studies, but the concurrent increases in artifact contamination can strongly compromise EEG data quality. In the present study, we focus on EEG artifacts created by head motion in the static B0 field. A novel approach for motion artifact detection is proposed, based on a simple modification of a commercial EEG cap, in which four electrodes are non-permanently adapted to record only magnetic induction effects. Simultaneous EEG-fMRI data were acquired with this setup, at 7T, from healthy volunteers undergoing a reversing-checkerboard visual stimulation paradigm. Data analysis assisted by the motion sensors revealed that, after gradient artifact correction, EEG signal variance was largely dominated by pulse artifacts (81-93%), but contributions from spontaneous motion (4-13%) were still comparable to or even larger than those of actual neuronal activity (3-9%). Multiple approaches were tested to determine the most effective procedure for denoising EEG data incorporating motion sensor information. Optimal results were obtained by applying an initial pulse artifact correction step (AAS-based), followed by motion artifact correction (based on the motion sensors) and ICA denoising. On average, motion artifact correction (after AAS) yielded a 61% reduction in signal power and a 62% increase in VEP trial-by-trial consistency. Combined with ICA, these improvements rose to a 74% power reduction and an 86% increase in trial consistency. Overall, the improvements achieved were well appreciable at single-subject and single-trial levels, and set an encouraging quality mark for simultaneous EEG-fMRI at ultra-high field.
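The motion-sensor-based correction step can be viewed as a regression problem: the artifact-only electrodes provide reference signals that are fitted to, and subtracted from, each EEG channel. The sketch below illustrates this with ordinary least squares on synthetic data; it is a simplified stand-in for the study's pipeline, which combined AAS pulse-artifact correction, sensor-based motion correction and ICA denoising.

```python
import numpy as np

def regress_out_motion(eeg, motion):
    """Remove motion-induced artifacts by ordinary least-squares regression
    of the motion-sensor channels out of each EEG channel.

    eeg    : (n_channels, n_samples) EEG after gradient/pulse correction
    motion : (n_sensors,  n_samples) signals from the artifact-only electrodes
    Returns the cleaned EEG.
    """
    # Design matrix: motion channels plus an intercept column
    X = np.vstack([motion, np.ones(motion.shape[1])]).T   # (n_samples, n_sensors+1)
    beta, *_ = np.linalg.lstsq(X, eeg.T, rcond=None)      # fit all channels at once
    return eeg - (X @ beta).T                             # subtract the fitted artifact

# Synthetic demo: two EEG channels contaminated by one motion reference signal
rng = np.random.default_rng(3)
motion = rng.standard_normal((1, 5000))
neural = 0.1 * rng.standard_normal((2, 5000))
eeg = neural + np.array([[0.8], [1.5]]) @ motion          # artifact leaks into the EEG
cleaned = regress_out_motion(eeg, motion)
print("residual artifact correlation:",
      round(np.corrcoef(cleaned[0], motion[0])[0, 1], 4))
```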