839 results for random utility


Relevance: 20.00%

Abstract:

We describe several simulation algorithms that yield random probability distributions with given values of risk measures. In the case of vanilla risk measures, the algorithms combine and transform random cumulative distribution functions or random Lorenz curves obtained by simulating rather general random probability distributions on the unit interval. A new algorithm based on simulating an array of weighted barycentres is suggested for generating random probability distributions with a given value of a spectral risk measure.
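
The abstract does not spell out the algorithms, so the following is only a loosely related Python sketch, not the paper's method: it simulates a rather general random discrete distribution on [0, 1] (Dirichlet weights on a grid, an assumption of mine) and pushes it through an increasing map so that its Value-at-Risk at level alpha hits a prescribed target, exploiting the equivariance of quantiles under increasing transformations.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_distribution(n_atoms=200):
    """A rather general random distribution on [0, 1]: atoms on a fixed
    grid with Dirichlet-distributed random weights (illustrative choice)."""
    atoms = np.linspace(0.0, 1.0, n_atoms)
    weights = rng.dirichlet(np.ones(n_atoms))
    return atoms, weights

def value_at_risk(atoms, weights, alpha):
    """alpha-quantile (VaR) of a discrete distribution."""
    cdf = np.cumsum(weights)
    return atoms[np.searchsorted(cdf, alpha)]

def with_target_var(atoms, weights, alpha, target):
    """Push the atoms through the increasing piecewise-linear map of [0, 1]
    fixing 0 and 1 and sending the current alpha-quantile q to `target`.
    Quantiles are equivariant under increasing maps, so the transformed
    distribution has VaR_alpha equal to `target` (assumes 0 < q < 1)."""
    q = value_at_risk(atoms, weights, alpha)
    below = atoms <= q
    new_atoms = np.where(below,
                         atoms * target / q,
                         target + (atoms - q) * (1.0 - target) / (1.0 - q))
    return new_atoms, weights

atoms, weights = random_distribution()
new_atoms, weights = with_target_var(atoms, weights, alpha=0.95, target=0.8)
print(value_at_risk(new_atoms, weights, 0.95))  # 0.8, up to grid resolution
```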

Relevance: 20.00%

Abstract:

The first section of this chapter starts with the Buffon problem, one of the oldest in stochastic geometry, and then continues with the definition of measures on the space of lines. The second section defines random closed sets and the related measurability issues, explains how to characterize the distribution of a random closed set by means of its capacity functional, and introduces the concept of a selection. Based on this concept, the third section defines the expectation and proves its convexifying effect, which is related to the Lyapunov theorem for ranges of vector-valued measures. The strong law of large numbers for Minkowski sums of random sets is then proved and the corresponding limit theorem formulated. The chapter concludes with a discussion of the union scheme for random closed sets and a characterization of the corresponding stable laws.
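
A small numerical illustration (mine, not the chapter's) of the convexifying effect and the strong law of large numbers for Minkowski sums, in the simplest one-dimensional setting: Minkowski averages of non-convex two-point random sets {0, U_i} fill out the selection (Aumann) expectation [0, E U].

```python
import numpy as np

rng = np.random.default_rng(1)

def minkowski_average(sets):
    """All points of the Minkowski average (X_1 + ... + X_n) / n
    for finite subsets X_i of the real line."""
    pts = np.array([0.0])          # identity element for Minkowski addition
    for s in sets:
        pts = np.unique(np.add.outer(pts, s).ravel())
    return pts / len(sets)

# X_i = {0, U_i} are non-convex two-point sets with E U = 1; by the
# convexifying effect, the Minkowski averages converge in Hausdorff
# distance to the selection expectation, the interval [0, 1].
for n in (2, 8, 16):
    us = rng.uniform(0.5, 1.5, size=n)
    avg = minkowski_average([np.array([0.0, u]) for u in us])
    print(n, avg.min(), avg.max(), np.diff(avg).max())  # gaps shrink with n
```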

Relevance: 20.00%

Abstract:

Stochastic models for three-dimensional particles have many applications in the applied sciences. Lévy-based particle models are a flexible approach to particle modelling: the structure of the random particles is given by a kernel smoothing of a Lévy basis. The models are easy to simulate, but statistical inference procedures have not yet received much attention in the literature. The kernel is not always identifiable, and we suggest one approach to remedying this problem. We propose a method for drawing inference about the kernel from data of the kind often used in local stereology and study the performance of our approach in a simulation study.
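
As a hedged illustration of kernel smoothing of a Lévy basis, here is a minimal two-dimensional sketch; the paper concerns three-dimensional particles, and the compound-Poisson special case, kernel shape, and all parameters below are my assumptions. The radial function of a star-shaped particle is obtained by smoothing Poisson-distributed atoms with gamma weights.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_particle(n_grid=360, rate=20.0, width=0.3):
    """Radial function R(theta) = r0 + sum_j w_j * k(theta - phi_j):
    kernel smoothing of a compound-Poisson Lévy basis on the circle
    (one simple special case of a Lévy basis)."""
    theta = np.linspace(0.0, 2.0 * np.pi, n_grid, endpoint=False)
    n_atoms = rng.poisson(rate)
    phi = rng.uniform(0.0, 2.0 * np.pi, n_atoms)            # atom locations
    w = rng.gamma(shape=2.0, scale=0.1, size=n_atoms)       # atom weights
    d = np.angle(np.exp(1j * (theta[:, None] - phi[None, :])))  # wrapped angle
    k = np.exp(-0.5 * (d / width) ** 2)                     # Gaussian kernel
    return theta, 1.0 + k @ w       # base radius plus Lévy perturbation

theta, r = simulate_particle()
x, y = r * np.cos(theta), r * np.sin(theta)  # boundary of the 2-D particle
```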

Relevance: 20.00%

Abstract:

We prove large deviation results for sums of heavy-tailed random elements in rather general convex cones, i.e. semigroups equipped with a rescaling operation by positive real numbers. In contrast to previous results for the cone of convex sets, our technique does not rely on embedding the cone in a linear space. Examples include the cone of convex sets with Minkowski addition, the positive half-line with the maximum operation, and the family of square-integrable functions with arithmetic addition and argument rescaling.
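
For the positive half-line with the maximum operation, heavy-tailed large-deviation behaviour reduces to the classical "single big jump" asymptotics, which a short Monte Carlo check can illustrate; this is a sketch of that standard fact, not of the paper's general cone results.

```python
import numpy as np

rng = np.random.default_rng(3)

# Cone: the positive half-line with "addition" = maximum and ordinary
# rescaling.  For regularly varying tails P(X > t) = t**(-a), the
# single-big-jump heuristic gives P(max_i X_i / n > x) ~ n * P(X > n*x).
a, n, x, trials = 2.0, 50, 3.0, 200_000
X = rng.pareto(a, size=(trials, n)) + 1.0   # P(X > t) = t**(-a) for t >= 1
empirical = np.mean(X.max(axis=1) / n > x)
asymptotic = n * (n * x) ** (-a)
print(empirical, asymptotic)                # the two should be close
```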

Relevance: 20.00%

Abstract:

BACKGROUND: Contact force (CF) is an important determinant of lesion formation in atrial endocardial radiofrequency ablation, but there are minimal published data on CF and ventricular lesion formation. We studied the impact of CF on lesion formation in an ovine model, both endocardially and epicardially.

METHODS AND RESULTS: Twenty sheep received 160 epicardial and 160 endocardial ventricular radiofrequency applications via percutaneous access, using either a 3.5-mm irrigated-tip catheter (Thermocool, Biosense-Webster; n=160) or a 3.5-mm irrigated-tip catheter with CF assessment (Tacticath, Endosense; n=160). Power was delivered at 30 W for 60 seconds, either when catheter/tissue contact was felt to be good or, with Tacticath, when CF>10 g. After completion of all lesions, acute dimensions were measured at pathology. Identifiable lesion formation from radiofrequency application improved with the aid of CF information, from 78% to 98% on the endocardium (P<0.001) and from 90% to 100% on the epicardium (P=0.02). The mean total force was greater on the endocardium (39±18 g versus 21±14 g for the epicardium; P<0.001), mainly because of axial force. Despite the force-time integral being greater endocardially, epicardial lesions were larger (231±182 mm³ versus 209±131 mm³; P=0.02), probably because of the absence of the heat-sink effect of circulating blood, and covered a greater area (41±27 mm² versus 29±17 mm²; P=0.03) because of catheter orientation.

CONCLUSIONS: In the absence of CF feedback, 22% of endocardial radiofrequency applications thought to have good contact did not result in lesion formation. Epicardial ablation is associated with larger lesions.

Relevance: 20.00%

Abstract:

This investigation attempts to answer the question of why more and more parents chose the Gymnasium for their children's secondary school education in post-war West Germany. Drawing on the theory of subjective expected utility, the crucial mechanisms of parental educational decisions are identified. From this perspective it is assumed that increasing educational motivation, coupled with changes in the subjective evaluation of the costs and benefits of education, was an important condition for increasing participation in upper secondary schools; these changes were, in turn, themselves a result of educational expansion. Empirical analyses for three time periods in the 1960s, 1970s, and 1980s confirm these assumptions to a large degree. Additionally, empirical evidence suggests that, in addition to the intentions of parents and the educational career of their children, structural moments of educational expansion and their own inertia played an important role in pupils' transitions from one educational level to the next. Finally, evidence was found that persistent class-specific educational inequality stems from a constant balance in the relative cost-benefit advantages between social classes, as well as from an increasing difference in primary origin effects between social classes in the realization of their educational choices.

Relevance: 20.00%

Abstract:

North temperate fish in post-glacial lakes are textbook examples of rapid parallel adaptive radiation into multiple trophic specialists within individual lakes. Speciation has repeatedly proceeded along the benthic-limnetic habitat axis, and benthic-limnetic sister species diverge in the number of gill rakers. Yet the utility of different numbers of gill rakers for consuming benthic versus limnetic food has only rarely been experimentally demonstrated. We bred and raised families of a benthic-limnetic species pair of whitefish under common garden conditions to test (i) whether these species show heritable differentiation in feeding efficiency on zooplankton, and (ii) whether variation in feeding efficiency is predicted by variation in gill raker number. We used zooplankton of three different size classes to investigate the prey-size dependency of divergence in feeding efficiency and the effect strength of variation in the number of gill rakers. Our results show strong interspecific differences in feeding efficiency, largest when fish were tested with the smallest zooplankton. Importantly, feeding efficiency is significantly positively correlated with the number of gill rakers when using small zooplankton, even when species identity is statistically controlled for. Our results support the hypothesis that a larger number of gill rakers is of adaptive significance for feeding on zooplankton and provide one of the first experimental demonstrations of the trait utility of gill raker number when fish feed on zooplankton. These results are consistent with the suggested importance of feeding adaptation driven by divergent selection during the adaptive radiation of fish in post-glacial lakes.
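
The key statistical step, correlating feeding efficiency with gill raker number while controlling for species identity, can be sketched as an ordinary least-squares regression with a species factor. The data below are synthetic and the column names and effect sizes are my assumptions, not the study's measurements.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)

# Synthetic stand-in data for illustration only.
n = 120
species = rng.choice(["benthic", "limnetic"], size=n)
rakers = np.where(species == "limnetic", 25.0, 20.0) + rng.normal(0, 1.5, n)
efficiency = 0.3 * rakers + 1.0 * (species == "limnetic") + rng.normal(0, 1, n)
df = pd.DataFrame({"efficiency": efficiency, "rakers": rakers,
                   "species": species})

# Feeding efficiency regressed on gill raker number while statistically
# controlling for species identity.
fit = smf.ols("efficiency ~ rakers + C(species)", data=df).fit()
print(fit.params["rakers"], fit.pvalues["rakers"])
```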

Relevance: 20.00%

Abstract:

In this paper, we propose a fully automatic, robust approach for segmenting the proximal femur in conventional X-ray images. Our method is based on hierarchical landmark detection by random forest regression: the detection results for 22 global landmarks are used for spatial normalization, and the detection results for 59 local landmarks serve as the image cue for instantiating a statistical shape model of the proximal femur. At both levels we use multi-resolution HoG (Histogram of Oriented Gradients) features, which achieve better accuracy and robustness. The efficacy of the method is demonstrated in experiments on 150 clinical X-ray images: it achieved an average point-to-curve error of 2.0 mm and was robust to low image contrast, noise, and occlusions caused by implants.
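
A minimal sketch of the two named ingredients, multi-resolution HoG features and random forest regression, using scikit-image and scikit-learn. The function names, scales, and HoG parameters are my assumptions, and all patches are assumed to share one size (at least 64x64 pixels, so descriptors align at the coarsest scale); this is not the paper's implementation.

```python
import numpy as np
from skimage.feature import hog
from skimage.transform import rescale
from sklearn.ensemble import RandomForestRegressor

def multires_hog(image, scales=(1.0, 0.5, 0.25)):
    """Concatenate HoG descriptors of the same patch at several resolutions."""
    return np.concatenate([
        hog(rescale(image, s, anti_aliasing=True),
            orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
        for s in scales
    ])

def train_landmark_regressor(patches, offsets, n_trees=100):
    """Random forest regression from patch appearance to the 2-D
    displacement of a landmark from the patch centre; `patches` is a list
    of equally sized image patches, `offsets` an (n, 2) array."""
    X = np.stack([multires_hog(p) for p in patches])
    return RandomForestRegressor(n_estimators=n_trees).fit(X, offsets)
```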

Relevance: 20.00%

Abstract:

Knowledge of landmarks and contours in anteroposterior (AP) pelvis X-rays is invaluable for computer-aided diagnosis, hip surgery planning, and image-guided interventions. This paper presents a fully automatic and robust approach for landmarking and segmentation of both the pelvis and the femur in a conventional AP X-ray, based on random forest regression and hierarchical sparse shape composition. Experiments on 436 clinical AP pelvis X-rays show that our approach achieves an average point-to-curve error of around 1.3 mm for the femur and 2.2 mm for the pelvis, both with success rates of around 98%. Compared to existing methods, our approach exhibits better performance in both robustness and accuracy.

Relevance: 20.00%

Abstract:

Perceptual learning is a training-induced improvement in performance. Mechanisms underlying the perceptual learning of depth discrimination in dynamic random-dot stereograms were examined by assessing stereothresholds as a function of decorrelation. The inflection point of the decorrelation function was defined as the level of decorrelation corresponding to 1.4 times the threshold at 0% decorrelation. In general, stereothresholds increased with increasing decorrelation. Following training, stereothresholds and their standard errors of measurement decreased systematically at all tested decorrelation values: post-training decorrelation functions were reduced by a multiplicative constant (approximately 5), exhibiting changes in stereothresholds without changes in the inflection points. Disparity energy model simulations indicate that a post-training reduction in neuronal noise is sufficient to account for the perceptual learning effects. In two subjects, the learning effects were retained over a period of six months, which may have application for training stereo-deficient subjects.
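
The inflection-point definition quoted above can be made concrete with a short numerical sketch; the functional form and parameters of the threshold-versus-decorrelation curve below are my assumptions for illustration, not the study's fits.

```python
from scipy.optimize import brentq

# Hypothetical threshold-vs-decorrelation curve (illustrative parameters).
def stereothreshold(d, t0=20.0, d0=40.0, k=2.0):
    """Stereothreshold (arcsec) at d percent decorrelation."""
    return t0 * (1.0 + (d / d0) ** k)

# Inflection point as defined in the abstract: the decorrelation level at
# which the threshold reaches 1.4 times its value at 0% decorrelation.
t0 = stereothreshold(0.0)
d_star = brentq(lambda d: stereothreshold(d) - 1.4 * t0, 0.0, 100.0)
print(d_star)   # here d0 * 0.4 ** (1 / k), about 25.3

# Note: a pure multiplicative reduction threshold -> threshold / 5 rescales
# t0 but leaves d_star unchanged, matching the reported learning effect.
```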

Relevance: 20.00%

Abstract:

The use of group-randomized trials is particularly widespread in the evaluation of health care, educational, and screening strategies. Group-randomized trials represent a subset of a larger class of designs, often labeled nested, hierarchical, or multilevel, characterized by the randomization of intact social units or groups rather than individuals. Applying random effects models to group-randomized trials requires specifying the fixed and random components of the model, and the underlying assumption is usually that the random components are normally distributed. This research determines whether the Type I error rate and power are affected when the normality assumption for the random component representing the group effect is violated.

In this study, simulated data are used to examine the Type I error rate, power, bias, and mean squared error of the estimates of the fixed effect and the observed intraclass correlation coefficient (ICC) when the random component representing the group effect has a distribution with non-normal characteristics, such as heavy tails or severe skewness. The simulated data are generated with various characteristics (e.g., number of schools per condition, number of students per school, and several within-school ICCs) observed in most small, school-based, group-randomized trials. The analysis is carried out using SAS PROC MIXED, Version 6.12, with the random effects specified in a RANDOM statement and restricted maximum likelihood (REML) estimation. The results for the non-normally distributed data are compared with results obtained from the analysis of data with similar design characteristics but normally distributed random effects.

The results suggest that violation of the normality assumption for the group component by a skewed or heavy-tailed distribution does not influence the estimation of the fixed effect, the Type I error rate, or power. Negative biases were detected when estimating the sample ICC and increased dramatically in magnitude as the true ICC increased. These biases were less pronounced when the true ICC was within the range observed in most group-randomized trials (i.e., 0.00 to 0.05). The normally distributed group effect also resulted in biased ICC estimates when the true ICC was greater than 0.05; this may, however, be a result of higher correlation within the data.
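
The simulation design can be sketched in Python, with statsmodels' MixedLM standing in for SAS PROC MIXED; the sample sizes, the alternating assignment of schools to conditions, and the shifted-gamma choice for the skewed group effect are illustrative assumptions of mine.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)

def simulate_null_trial(n_schools=10, n_students=50, icc=0.05, skewed=True):
    """One null-hypothesis dataset: schools assigned to two conditions, with
    a group-level random effect that is either normal or skewed (shifted
    gamma), matched to the variance implied by the ICC."""
    tau2 = icc / (1.0 - icc)                 # group variance, residual var 1
    if skewed:
        shape = 2.0                          # strongly skewed gamma
        u = (rng.gamma(shape, 1.0, n_schools) - shape) * np.sqrt(tau2 / shape)
    else:
        u = rng.normal(0.0, np.sqrt(tau2), n_schools)
    school = np.repeat(np.arange(n_schools), n_students)
    cond = (school % 2).astype(float)        # alternate schools by condition
    y = u[school] + rng.normal(0.0, 1.0, school.size)   # no true effect
    return pd.DataFrame({"y": y, "cond": cond, "school": school})

df = simulate_null_trial()
fit = smf.mixedlm("y ~ cond", df, groups=df["school"]).fit(reml=True)
print(fit.pvalues["cond"])   # over many replicates, ~5% fall below 0.05
```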

Relevance: 20.00%

Abstract:

In recent years, the econometrics literature has shown a growing interest in the study of partially identified models, in which the object of economic and statistical interest is a set rather than a point. The characterization of this set and the development of consistent estimators and inference procedures with desirable properties are the main goals of partial identification analysis. This review introduces the fundamental tools of the theory of random sets, which brings together elements of topology, convex geometry, and probability theory to develop a coherent mathematical framework for analyzing random elements whose realizations are sets. It then elucidates how these tools have been fruitfully applied in econometrics to achieve the goals of partial identification analysis.
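
A canonical example of this program is mean regression with interval-observed outcomes, where the identified set for E[Y] is the selection (Aumann) expectation of a random interval, namely [E y_lo, E y_hi]. A minimal sample analogue, on simulated data of my own, looks like this:

```python
import numpy as np

rng = np.random.default_rng(6)

# Interval-observed outcome: Y is only known to lie in [y_lo, y_hi].  The
# identified set for E[Y] is the selection expectation of the random
# interval; the sample means estimate its endpoints.
y_lo = rng.normal(0.0, 1.0, 1_000)
y_hi = y_lo + rng.uniform(0.0, 2.0, 1_000)
print(y_lo.mean(), y_hi.mean())   # estimated lower and upper bounds for E[Y]
```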