87 results for Digital dividend


Relevance: 20.00%

Abstract:

We characterize the value function of maximizing the total discounted utility of dividend payments for a compound Poisson insurance risk model when strictly positive transaction costs are included, leading to an impulse control problem. We illustrate that well-known simple strategies can be optimal in the case of exponential claim amounts. Finally, we develop a numerical procedure to deal with general claim amount distributions.

Relevance: 20.00%

Abstract:

We introduce a label-free technology based on digital holographic microscopy (DHM) with applicability for screening by imaging, and we demonstrate its capability for cytotoxicity assessment using living mammalian cells. For this first application compatible with high-content screening, we automated a digital holographic microscope for image acquisition of cells using commercially available 96-well plates. Data generated through both label-free DHM imaging and fluorescence-based methods were in good agreement for cell viability identification, and a Z'-factor close to 0.9 was determined, validating the robustness of the DHM assay for phenotypic screening. Further, an excellent correlation was obtained between experimental cytotoxicity dose-response curves and known IC values for different toxic compounds. For comparable results, DHM has the major advantages of being label free and close to an order of magnitude faster than automated standard fluorescence microscopy.
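The Z'-factor quoted above is a standard assay-quality metric computed from positive- and negative-control readouts. A minimal sketch; all control values below are invented for illustration:

```python
import statistics

def z_prime(pos, neg):
    """Z'-factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
    Values above ~0.5 are usually taken to indicate an excellent assay."""
    sd_p, sd_n = statistics.stdev(pos), statistics.stdev(neg)
    return 1 - 3 * (sd_p + sd_n) / abs(statistics.mean(pos) - statistics.mean(neg))

# Hypothetical per-well readouts for viable (positive) and dead (negative) controls
viable = [100.0, 102.0, 98.0, 101.0]
dead = [10.0, 11.0, 9.0, 10.0]
print(round(z_prime(viable, dead), 3))  # close to 0.9, as in the study above
```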

Relevance: 20.00%

Abstract:

The present study was carried out to check whether classic osteometric parameters can be determined from 3D reconstructions of MSCT (multislice computed tomography) scans acquired in the context of the Virtopsy project. To this end, four isolated and macerated skulls were examined by six examiners. First, the skulls were conventionally (manually) measured using 32 internationally accepted linear measurements. Then the skulls were scanned by MSCT with slice thicknesses of 1.25 mm and 0.63 mm, and the measurements were determined virtually on the digital 3D reconstructions of the skulls. The results of the traditional and the digital measurements were compared for each examiner to identify variations. Furthermore, several parameters were measured on the cranium and postcranium during an autopsy and compared to the values that had been measured on a 3D reconstruction from a previously acquired postmortem MSCT scan. The results indicate that equivalent osteometric values can be obtained from digital 3D reconstructions of MSCT scans using a slice thickness of 1.25 mm and from conventional manual examinations. The measurements taken from a corpse during an autopsy could also be validated with the methods used for the digital 3D reconstructions in the context of the Virtopsy project. Future aims are the assessment and biostatistical evaluation, with respect to sex, age and stature, of all data sets stored in the Virtopsy project so far, as well as of future data sets. Furthermore, the definition of new parameters, measurable only with the aid of MSCT data, would be conceivable.
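On a digital 3D reconstruction, each linear osteometric measurement reduces to the Euclidean distance between two landmark coordinates. A minimal sketch, with hypothetical landmark coordinates in millimetres:

```python
import math

def measurement(p, q):
    """Linear osteometric measurement as the Euclidean distance between two
    3D landmark coordinates (units follow the CT voxel spacing, e.g. mm)."""
    return math.dist(p, q)

# Hypothetical landmark coordinates picked on a 3D skull reconstruction (mm)
glabella = (0.0, 95.0, 30.0)
opisthocranion = (0.0, -85.0, 25.0)
# maximum cranial length (g-op) for these invented points
print(round(measurement(glabella, opisthocranion), 1))
```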

Relevance: 20.00%

Abstract:

In this review, we summarize how the new concept of digital optics applied to the field of holographic microscopy has allowed the development of a reliable and flexible digital holographic quantitative phase microscopy (DH-QPM) technique at the nanoscale, particularly suitable for cell imaging. Particular emphasis is placed on the original biological information provided by the quantitative phase signal. We present the most relevant DH-QPM applications in the field of cell biology, including automated cell counts, recognition, classification, three-dimensional tracking, discrimination between physiological and pathophysiological states, and the study of cell membrane fluctuations at the nanoscale. In the last part, original results show how DH-QPM can address two important issues in the field of neurobiology, namely multiple-site optical recording of neuronal activity and noninvasive visualization of dendritic spine dynamics, the latter resulting from a full digital holographic microscopy tomographic approach.
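The quantitative phase signal discussed above can be converted into biophysically meaningful quantities; a classic example is cell dry mass, via the optical path difference and the specific refraction increment. A sketch under stated assumptions (the wavelength, alpha value, and phase values are illustrative, not taken from the review):

```python
import math

def dry_mass_pg(phase_rad, pixel_area_um2, wavelength_um=0.532, alpha_um3_per_pg=0.19):
    """Cell dry mass from a quantitative phase map.
    OPD per pixel = phase * lambda / (2*pi); surface mass density = OPD / alpha,
    with alpha the specific refraction increment (~0.19 um^3/pg, i.e. 0.19 ml/g).
    phase_rad: iterable of per-pixel phase values (radians) over the cell."""
    total_opd_um = sum(phase_rad) * wavelength_um / (2 * math.pi)
    return total_opd_um * pixel_area_um2 / alpha_um3_per_pg

# Illustrative numbers: a flat 2*pi-radian phase over 100 pixels of 0.2 um^2 each
mass = dry_mass_pg([2 * math.pi] * 100, pixel_area_um2=0.2)
print(round(mass, 1))  # dry mass in picograms
```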

Relevance: 20.00%

Abstract:

Assessment of image quality for digital x-ray mammography systems used in European screening programs relies mainly on contrast-detail CDMAM phantom scoring and requires the acquisition and analysis of many images in order to reduce variability in threshold detectability. Part II of this study proposes an alternative method based on the detectability index (d') calculated for a non-prewhitened model observer with an eye filter (NPWE). The detectability index was calculated from the normalized noise power spectrum and image contrast, both measured from an image of a 5 cm poly(methyl methacrylate) phantom containing a 0.2 mm thick aluminium square, and the pre-sampling modulation transfer function. This was performed as a function of air kerma at the detector for 11 different digital mammography systems. These calculated d' values were compared against threshold gold thickness (T) results measured with the CDMAM test object and against derived theoretical relationships. A simple relationship was found between T and d', as a function of detector air kerma; a linear relationship was found between d' and contrast-to-noise ratio. The values of threshold thickness used to specify acceptable performance in the European Guidelines for 0.10 and 0.25 mm diameter discs were equivalent to calculated threshold detectability indices of 1.05 and 6.30, respectively. The NPWE method is a validated alternative to CDMAM scoring for use in the image quality specification, quality control and optimization of digital x-ray systems for screening mammography.
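The detectability index described above can be sketched numerically. The following is an illustrative implementation of the NPWE d' formula on a radial frequency grid; all inputs (signal spectrum, MTF, eye filter, NNPS) are synthetic placeholders with assumed shapes and parameters, not measured data:

```python
import numpy as np

def npwe_dprime(f, S, mtf, eye, nnps):
    """Detectability index d' of the non-prewhitened model observer with an
    eye filter (NPWE), on a radial frequency grid f (2D integrals in polar
    coordinates, hence the 2*pi*f factor):
        d'^2 = [int S^2 MTF^2 E^2 2*pi*f df]^2 / int S^2 MTF^2 E^4 NNPS 2*pi*f df"""
    df = f[1] - f[0]
    num = np.sum((S * mtf * eye) ** 2 * 2 * np.pi * f) * df
    den = np.sum((S * mtf) ** 2 * eye ** 4 * nnps * 2 * np.pi * f) * df
    return np.sqrt(num ** 2 / den)

# Synthetic inputs (all shapes and parameters assumed, for illustration only)
f = np.linspace(0.0, 10.0, 2000)            # spatial frequency, cycles/mm
S = 0.1 * np.exp(-(f / 1.5) ** 2)           # signal spectrum with contrast folded in
mtf = np.exp(-f / 3.0)                      # presampling MTF
eye = f ** 1.3 * np.exp(-0.35 * f ** 2)     # a typical eye-filter shape
nnps = np.full_like(f, 1e-5)                # white normalized noise power spectrum
d = npwe_dprime(f, S, mtf, eye, nnps)
```

Note how halving the noise doubles d': the numerator is independent of the NNPS, which enters only linearly in the denominator.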

Relevance: 20.00%

Abstract:

PURPOSE: To assess the technical feasibility of multi-detector row computed tomographic (CT) angiography in the assessment of peripheral arterial bypass grafts and to evaluate its accuracy and reliability in the detection of graft-related complications, including graft stenosis, aneurysmal changes, and arteriovenous fistulas. MATERIALS AND METHODS: Four-channel multi-detector row CT angiography was performed in 65 consecutive patients with 85 peripheral arterial bypass grafts. Each bypass graft was divided into three segments (proximal anastomosis, course of the graft body, and distal anastomosis), resulting in 255 segments. Two readers evaluated all CT angiograms with regard to image quality and the presence of bypass graft-related abnormalities, including graft stenosis, aneurysmal changes, and arteriovenous fistulas. The results were compared with the McNemar test with Bonferroni correction. CT attenuation values were recorded at five different locations from the inflow artery to the outflow artery of the bypass graft. These findings were compared with the findings at duplex ultrasonography (US) in 65 patients and the findings at conventional digital subtraction angiography (DSA) in 27 patients. RESULTS: Image quality was rated as good or excellent in 250 (98%) and in 252 (99%) of 255 bypass segments, respectively. There was excellent agreement both between readers and between CT angiography and duplex US in the detection of graft stenosis, aneurysmal changes, and arteriovenous fistulas (kappa = 0.86-0.99). CT angiography and duplex US were compared with conventional DSA, and there was no statistically significant difference (P >.25) in sensitivity or specificity between CT angiography and duplex US for both readers for detection of hemodynamically significant bypass stenosis or occlusion, aneurysmal changes, or arteriovenous fistulas. Mean CT attenuation values ranged from 232 HU in the inflow artery to 281 HU in the outflow artery of the bypass graft.
CONCLUSION: Multi-detector row CT angiography may be an accurate and reliable technique after duplex US in the assessment of peripheral arterial bypass grafts and detection of graft-related complications, including stenosis, aneurysmal changes, and arteriovenous fistulas.
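The inter-reader agreement above is quantified with Cohen's kappa. A minimal sketch of how kappa is computed from two readers' categorical ratings (the rating lists below are invented, not data from the study):

```python
def cohen_kappa(a, b):
    """Cohen's kappa for two readers' categorical ratings of the same items:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    cats = set(a) | set(b)
    pe = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)  # chance agreement
    return (po - pe) / (1 - pe)

# Invented per-segment ratings (1 = stenosis present, 0 = absent) for two readers
reader1 = [0, 0, 1, 1, 1, 0, 1, 0, 0, 1]
reader2 = [0, 0, 1, 1, 0, 0, 1, 0, 0, 1]
print(round(cohen_kappa(reader1, reader2), 2))  # 0.8
```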

Relevance: 20.00%

Abstract:

Turtle Mountain in Alberta, Canada, has become an important field laboratory for testing different techniques related to the characterization and monitoring of large slope mass movements, as the stability of large portions of the eastern face of the mountain is still questionable. In order to better quantify the potentially unstable volumes, the most probable failure mechanisms, and the potential consequences, structural analysis and runout modeling were performed. The structural features of the eastern face were investigated using a high-resolution digital elevation model (HRDEM). According to displacement datasets and structural observations, potential failure mechanisms affecting different portions of the mountain have been assessed. The volumes of the different potentially unstable blocks have been calculated using the Sloping Local Base Level (SLBL) method. Based on the volume estimation, two- and three-dimensional dynamic runout analyses have been performed. Calibration of this analysis is based on the experience from the adjacent Frank Slide and other similar rock avalanches. The results will be used to improve the contingency plans within the hazard area.
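The SLBL method mentioned above iteratively lowers a surface beneath the topography to approximate a basal failure surface, from which an unstable volume can be estimated. A one-dimensional sketch under simple assumptions (the tolerance and elevation profile are invented; the actual method operates on a 2D DEM):

```python
def slbl_1d(z, tolerance=0.1, iterations=5000):
    """1D Sloping Local Base Level: repeatedly replace each interior node by
    the mean of its two neighbours minus a tolerance, whenever that value is
    lower than the current one. The endpoints stay fixed, and the result
    approximates a curved basal surface under the topographic profile."""
    s = list(z)
    for _ in range(iterations):
        changed = False
        for i in range(1, len(s) - 1):
            target = 0.5 * (s[i - 1] + s[i + 1]) - tolerance
            if target < s[i]:
                s[i] = target
                changed = True
        if not changed:
            break
    return s

# Hypothetical elevation profile (m) across an unstable area
profile = [100.0, 110.0, 130.0, 135.0, 125.0, 105.0, 95.0]
base = slbl_1d(profile)
# cross-sectional area between topography and basal surface (m^2 per m width)
area = sum(zi - bi for zi, bi in zip(profile, base))
```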

Relevance: 20.00%

Abstract:

Synthesis report: This thesis consists of three essays on optimal dividend strategies; each essay corresponds to a chapter. The first two essays were written in collaboration with Professors Hans Ulrich Gerber and Elias S. W. Shiu and have been published; see Gerber et al. (2006b) and Gerber et al. (2008). The third essay was written in collaboration with Professor Hans Ulrich Gerber. The problem of optimal dividend strategies goes back to de Finetti (1957) and can be stated as follows: given the surplus of a company, determine the optimal strategy for distributing dividends. The criterion used is to maximize the sum of the discounted dividends paid to shareholders until the company's ruin. Since de Finetti (1957), the problem has taken several forms and has been solved for different models. In the classical model of ruin theory, it was solved by Gerber (1969) and, more recently and with a different approach, by Azcue and Muler (2005) and Schmidli (2008). In the classical model there is a continuous, constant inflow of money, while the outflows are random: they follow a jump process, namely a compound Poisson process. A good example of such a model is the surplus of an insurance company, whose inflows and outflows are the premiums and the claims, respectively. The first graph of Figure 1 illustrates such an example. In this thesis only barrier strategies are considered, that is, whenever the surplus exceeds the barrier level b, the excess is distributed to shareholders as dividends. The second graph of Figure 1 shows the same surplus example when a barrier of level b is introduced, and the third graph of that figure shows the cumulative dividends.
Chapter 1: "Maximizing dividends without bankruptcy". In this first essay, the optimal barriers are computed for different claim amount distributions according to two criteria: I) the optimal barrier is computed using the usual criterion, which maximizes the expected discounted dividends until ruin; II) the optimal barrier is computed using a second criterion, which maximizes the expected difference between the discounted dividends until ruin and the deficit at ruin. This essay is inspired by Dickson and Waters (2004), whose idea is to make shareholders bear the deficit at ruin. This is all the more relevant for an insurance company, whose ruin must be avoided. In the example of Figure 1, the deficit at ruin is denoted by R. Numerical examples allow us to compare the optimal barrier levels in situations I and II. This idea of adding a penalty at ruin was generalized in Gerber et al. (2006a). Chapter 2: "Methods for estimating the optimal dividend barrier and the probability of ruin". In this second essay, since in practice one never has full information about the claim amount distribution, it is assumed that only the first moments of that distribution are known. The essay develops and examines methods for approximating, in this situation, the optimal barrier level under the usual criterion (case I above). The "de Vylder" and "diffusion" approximations are explained and examined; some of these approximations use the first two, three, or four moments. Numerical examples allow us to compare the approximations of the optimal barrier level, not only with the exact values but also with one another.
Chapter 3: "Optimal dividends with incomplete information". In this third and last essay, we return to methods for approximating the optimal barrier level when only the first moments of the jump amount distribution are known. This time, the dual model is considered. As in the classical model, there is a continuous flow in one direction and a jump process in the other; unlike the classical model, the gains follow a compound Poisson process while the losses are constant and continuous; see Figure 2. Such a model would suit a pension fund or a company specializing in discoveries or inventions. Both the "de Vylder" and "diffusion" approximations and the new "gamma" and "gamma process" approximations are explained and analyzed. The new approximations appear to give better results in some cases.
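The barrier strategy described in the thesis lends itself to a simple Monte Carlo illustration: simulate the classical compound Poisson surplus, pay out everything above the barrier b as dividends, and discount until ruin. The sketch below uses exponential claims and invented parameters; it illustrates the model only, not the thesis's analytical methods:

```python
import math
import random

def discounted_dividends(b, premium=1.2, lam=1.0, mean_claim=1.0,
                         delta=0.05, x0=0.0, horizon=200.0, rng=None):
    """One simulated path of the classical compound Poisson surplus under a
    barrier strategy at level b: premiums flow in continuously at rate
    `premium`, exponential claims of mean `mean_claim` arrive as a Poisson
    process of rate `lam`, and any surplus above b is paid out immediately as
    dividends. Returns the dividends discounted at force of interest `delta`,
    accumulated until ruin (or until `horizon`)."""
    rng = rng or random.Random()
    t, x, paid = 0.0, x0, 0.0
    while t < horizon:
        w = rng.expovariate(lam)                 # waiting time to next claim
        if x + premium * w > b:
            # surplus reaches the barrier before the next claim: from t_hit to
            # t_next the premium income is paid out as a dividend stream
            t_hit = t + max(0.0, (b - x) / premium)
            t_next = t + w
            paid += premium * (math.exp(-delta * t_hit) - math.exp(-delta * t_next)) / delta
            x = b
        else:
            x += premium * w
        t += w
        x -= rng.expovariate(1.0 / mean_claim)   # exponential claim amount
        if x < 0:
            break                                # ruin: dividend payments stop
    return paid

# Monte Carlo estimate of the expected discounted dividends V(x0; b)
rng = random.Random(42)
est = sum(discounted_dividends(b=3.0, rng=rng) for _ in range(2000)) / 2000
```

Repeating the estimate over a grid of barrier levels b and picking the maximizer gives a crude numerical counterpart to the optimal barrier discussed above.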

Relevance: 20.00%

Abstract:

We present a method to automatically segment red blood cells (RBCs) visualized by digital holographic microscopy (DHM), based on the marker-controlled watershed algorithm. Quantitative phase images of RBCs can be obtained by off-axis DHM and provide important information about each RBC, including size, shape, volume, hemoglobin content, etc. The most important step in marker-controlled watershed segmentation is the accurate localization of internal and external markers. Here, we first obtain the binary image via the Otsu algorithm. Then we apply morphological operations to the binary image to obtain the internal markers. Next, we apply the distance transform algorithm combined with the watershed algorithm to generate external markers based on the internal markers. Finally, combining the internal and external markers, we modify the original gradient image and apply the watershed algorithm. By appropriately identifying the internal and external markers, the problems of oversegmentation and undersegmentation are avoided. Furthermore, the internal and external parts of the RBC phase image can also be segmented by the marker-controlled watershed combined with our marker-identification method. Our experimental results show that the proposed method achieves good performance in segmenting RBCs and could thus be helpful when combined with an automated classification of RBCs.
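The pipeline described above (Otsu binarization, morphological operations for internal markers, watershed on a gradient image) can be sketched with standard tools. The sketch below is a simplified variant using scipy and scikit-image on a synthetic two-cell image: it uses the binary mask in place of explicit external markers, and all parameters and structuring-element sizes are assumptions:

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import threshold_otsu
from skimage.morphology import binary_erosion, disk
from skimage.segmentation import watershed

def segment_cells(phase):
    """Simplified marker-controlled watershed: Otsu binarization, erosion to
    get one internal marker per cell, then watershed on a gradient image,
    with the binary mask playing the role of the external constraint."""
    binary = phase > threshold_otsu(phase)
    internal = binary_erosion(binary, disk(3))      # internal markers
    markers, n_cells = ndi.label(internal)
    gradient = ndi.morphological_gradient(phase, size=3)
    labels = watershed(gradient, markers, mask=binary)
    return labels, n_cells

# Synthetic "phase image": two bright discs on a dark background
yy, xx = np.mgrid[0:64, 0:64]
phase = ((xx - 20) ** 2 + (yy - 32) ** 2 < 81).astype(float)
phase += ((xx - 44) ** 2 + (yy - 32) ** 2 < 81).astype(float)
labels, n_cells = segment_cells(phase)
print(n_cells)  # 2
```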