76 results for Random lasers
Abstract:
The use of glasses doped with PbS nanocrystals as intracavity saturable absorbers for passive Q-switching and mode locking of c-cut Nd:Gd0.7Y0.3VO4, Nd:YVO4, and Nd:GdVO4 lasers is investigated. Q-switching yields pulses as short as 35 ns with an average output power of 435 mW at repetition rates of 6–12 kHz for pump powers of 5–6 W. Mode locking through a combination of PbS nanocrystals and a Kerr lens results in 1.4 ps pulses with an average output power of 255 mW at a repetition rate of 100 MHz.
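For a sense of scale, the quoted Q-switching figures imply per-pulse energies and peak powers; a back-of-the-envelope estimate, assuming all pulses carry equal energy:

```python
# Pulse energy E = P_avg / f_rep, peak power P_peak ~ E / t_pulse,
# using the abstract's figures: 435 mW average, 6-12 kHz, 35 ns pulses.
avg_power_w = 0.435

pulse_energy_lo = avg_power_w / 12e3   # energy at 12 kHz, ~36 uJ
pulse_energy_hi = avg_power_w / 6e3    # energy at 6 kHz,  ~73 uJ
peak_power_lo = pulse_energy_lo / 35e-9  # peak power of a 35 ns pulse, ~1 kW
```

So even at the high end of the repetition-rate range, each pulse carries tens of microjoules with kilowatt-level peak power.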
Abstract:
A physical random number generator based on the intrinsic randomness of quantum mechanics is described. The random events are realized by single photons choosing between the two outputs of a beamsplitter. We present a simple device that minimizes the impact of the photon counters' noise, dead time, and afterpulses.
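A classical simulation of the core idea (not the paper's device): each photon arriving at an ideal 50/50 beamsplitter is transmitted or reflected with equal probability, and each outcome yields one raw bit. Here the quantum choice is stood in for by a pseudorandom draw:

```python
import random

def beamsplitter_bit(u):
    """One raw bit: a single photon 'chooses' between the two
    beamsplitter outputs (transmitted -> 1, reflected -> 0)."""
    return 1 if u < 0.5 else 0

def generate_bits(n, seed=None):
    """Generate n raw bits from simulated photon detections."""
    rng = random.Random(seed)
    return [beamsplitter_bit(rng.random()) for _ in range(n)]

bits = generate_bits(10_000, seed=42)
bias = sum(bits) / len(bits)  # should hover near 0.5 for an ideal splitter
```

A real device must additionally correct for detector imbalance and the counter artifacts the abstract mentions, e.g. by post-processing the raw bit stream.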
Abstract:
In all European Union countries, chemical residues are required to be routinely monitored in meat. Good farming and veterinary practice can prevent the contamination of meat with pharmaceutical substances, resulting in a low detection rate of drug residues through random sampling. An alternative approach is to target-monitor farms suspected of treating their animals with antimicrobials. The objective of this project was to assess, using a stochastic model, the efficiency of these two sampling strategies. The model integrated data on Swiss livestock as well as expert opinion and results from studies conducted in Switzerland. Risk-based sampling showed an increase in detection efficiency of up to 100%, depending on the prevalence of contaminated herds. Sensitivity analysis of this model showed the importance of the accuracy of prior assumptions for conducting risk-based sampling. The resources gained by changing from random to risk-based sampling should be transferred to improving the quality of prior information.
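The mechanism behind the reported gain can be illustrated with a minimal Monte Carlo sketch (hypothetical prevalences, not the Swiss figures): targeting suspect farms raises the contamination prevalence within the sampled population, which raises the expected number of detections per sampling round.

```python
import random

def detections(sample_size, prevalence, trials=2000, seed=1):
    """Average number of contaminated herds detected per sampling
    round, estimated by Monte Carlo over `trials` rounds."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        total += sum(rng.random() < prevalence for _ in range(sample_size))
    return total / trials

# Hypothetical inputs: overall prevalence of contaminated herds 1%;
# risk-based targeting doubles the prevalence among sampled farms to 2%.
random_strategy = detections(500, 0.01)
risk_based = detections(500, 0.02)
gain = risk_based / random_strategy - 1  # relative gain in detection efficiency
```

Doubling the within-sample prevalence roughly doubles detections, i.e. a gain near 100%, matching the order of magnitude the abstract reports; the actual model also accounts for prior uncertainty on the prevalence.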
Abstract:
We describe several simulation algorithms that yield random probability distributions with given values of risk measures. In the case of vanilla risk measures, the algorithms combine and transform random cumulative distribution functions or random Lorenz curves obtained by simulating rather general random probability distributions on the unit interval. A new algorithm based on simulating a weighted array of barycentres is suggested to generate random probability distributions with a given value of the spectral risk measure.
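As a minimal sketch of the building block involved (not the paper's algorithms): one standard way to simulate a random discrete probability distribution on the unit interval, and hence a random CDF, is to draw sorted uniform atom locations and flat-Dirichlet weights.

```python
import random

def random_cdf(n_atoms, seed=None):
    """Sample a random discrete distribution on [0, 1]: atom locations
    are sorted uniforms; weights are normalized exponentials, which is
    equivalent to a draw from a flat Dirichlet distribution."""
    rng = random.Random(seed)
    locs = sorted(rng.random() for _ in range(n_atoms))
    raw = [rng.expovariate(1.0) for _ in range(n_atoms)]
    total = sum(raw)
    weights = [w / total for w in raw]
    cdf, acc = [], 0.0
    for w in weights:
        acc += w
        cdf.append(acc)          # cumulative probability at each atom
    return locs, cdf

locs, cdf = random_cdf(5, seed=7)
```

Transforming or constraining such random CDFs so that a chosen risk measure takes a prescribed value is the substance of the algorithms described above.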
Abstract:
The first section of this chapter starts with the Buffon problem, which is one of the oldest in stochastic geometry, and then continues with the definition of measures on the space of lines. The second section defines random closed sets and related measurability issues, explains how to characterize distributions of random closed sets by means of capacity functionals and introduces the concept of a selection. Based on this concept, the third section starts with the definition of the expectation and proves its convexifying effect that is related to the Lyapunov theorem for ranges of vector-valued measures. Finally, the strong law of large numbers for Minkowski sums of random sets is proved and the corresponding limit theorem is formulated. The chapter is concluded by a discussion of the union-scheme for random closed sets and a characterization of the corresponding stable laws.
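One common formulation of the strong law of large numbers for Minkowski sums mentioned above (due to Artstein and Vitale): for i.i.d. integrably bounded random compact sets $X_1, X_2, \dots$,

```latex
d_H\!\left(\frac{1}{n}\bigl(X_1 \oplus \cdots \oplus X_n\bigr),\; \mathbb{E}X_1\right) \longrightarrow 0 \quad \text{a.s.},
```

where $\oplus$ denotes Minkowski addition, $d_H$ the Hausdorff distance, and $\mathbb{E}X_1$ the (necessarily convex) selection expectation — the convexifying effect proved in the third section is exactly why the limit is convex even when the summands are not.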
Abstract:
Stochastic models for three-dimensional particles have many applications in the applied sciences. Lévy-based particle models are a flexible approach to particle modelling. The structure of the random particles is given by a kernel smoothing of a Lévy basis. The models are easy to simulate, but statistical inference procedures have not yet received much attention in the literature. The kernel is not always identifiable, and we suggest one approach to remedy this problem. We propose a method to draw inference about the kernel from data often used in local stereology and study the performance of our approach in a simulation study.
Abstract:
We prove large deviation results for sums of heavy-tailed random elements in rather general convex cones, that is, semigroups equipped with a rescaling operation by positive real numbers. In contrast to previous results for the cone of convex sets, our technique does not rely on embedding the cones in linear spaces. Examples include the cone of convex sets with Minkowski addition, the positive half-line with the maximum operation, and the family of square-integrable functions with arithmetic addition and argument rescaling.
Abstract:
In this paper, we propose a fully automatic, robust approach for segmenting the proximal femur in conventional X-ray images. Our method is based on hierarchical landmark detection by random forest regression: the detection results of 22 global landmarks are used for spatial normalization, while the detection results of 59 local landmarks serve as the image cue for instantiating a statistical shape model of the proximal femur. To detect landmarks at both levels, we use multi-resolution HoG (Histogram of Oriented Gradients) features, which improve both accuracy and robustness. The efficacy of the method is demonstrated by experiments on 150 clinical X-ray images. The method achieved an average point-to-curve error of 2.0 mm and was robust to low image contrast, noise, and occlusions caused by implants.
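The regression-forest voting idea at the heart of such landmark detectors can be illustrated with a toy example (a deliberately simplified 1-D stand-in, not the paper's pipeline: single-split trees on a scalar feature instead of deep trees on multi-resolution HoG descriptors):

```python
import random

def fit_stump(xs, ys, rng):
    """One-split regression tree: pick a random threshold on the
    feature, predict the mean target value on each side of it."""
    t = rng.uniform(min(xs), max(xs))
    left = [y for x, y in zip(xs, ys) if x < t] or ys
    right = [y for x, y in zip(xs, ys) if x >= t] or ys
    return t, sum(left) / len(left), sum(right) / len(right)

def fit_forest(xs, ys, n_trees=50, seed=0):
    """Train an ensemble of randomized stumps."""
    rng = random.Random(seed)
    return [fit_stump(xs, ys, rng) for _ in range(n_trees)]

def predict(forest, x):
    """Average the votes of all trees: the regression-forest estimate."""
    votes = [(lo if x < t else hi) for t, lo, hi in forest]
    return sum(votes) / len(votes)

# Toy data: a landmark coordinate that grows linearly with the feature.
xs = [i / 9 for i in range(10)]
ys = [2 * x for x in xs]
forest = fit_forest(xs, ys)
pred = predict(forest, 0.8)
```

In the actual detectors, each tree votes for a landmark displacement from an image patch, and the aggregated votes give a robust position estimate.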
Abstract:
Knowledge of landmarks and contours in anteroposterior (AP) pelvis X-rays is invaluable for computer-aided diagnosis, hip surgery planning, and image-guided interventions. This paper presents a fully automatic and robust approach for landmarking and segmentation of both the pelvis and the femur in a conventional AP X-ray. Our approach is based on random forest regression and hierarchical sparse shape composition. Experiments on 436 clinical AP pelvis X-rays show that our approach achieves an average point-to-curve error of around 1.3 mm for the femur and 2.2 mm for the pelvis, both with success rates of around 98%. Compared to existing methods, our approach exhibits better performance in both robustness and accuracy.
Abstract:
Refractive losses in laser-produced plasmas used as gain media are caused by electron density gradients and limit the energy transport range. The pump pulse is deflected from the high-gain region, and the short-wavelength laser signal also steers away, causing loss of collimation. A Hohlraum used as a target makes the plasma homogeneous and can mitigate refractive losses through wave-guiding. A computational study combining a hydrodynamics code and an atomic physics code is presented, including ray-tracing modeling based on the eikonal theory of the trajectory equation. The study presents gain calculations based on population inversion produced by free-electron collisions exciting bound electrons into metastable levels of the 3d⁹4d¹ (J = 0) → 3d⁹4p¹ (J = 1) transition of Ni-like Sn. Furthermore, the results suggest that the Hohlraum dramatically enhances the conversion efficiency of collisionally excited x-ray lasing in Ni-like Sn.
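The ray-tracing step presumably rests on the standard ray (trajectory) equation of the eikonal approximation of geometric optics:

```latex
\frac{d}{ds}\!\left(n(\mathbf{r})\,\frac{d\mathbf{r}}{ds}\right) = \nabla n(\mathbf{r}),
```

where $s$ is arc length along the ray and $n$ the refractive index; for a plasma, $n = \sqrt{1 - n_e/n_c}$ with electron density $n_e$ and critical density $n_c$. The equation makes the loss mechanism explicit: gradients in $n_e$ produce $\nabla n \neq 0$ and bend rays out of the gain region, which is exactly what a homogeneous Hohlraum plasma suppresses.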
Abstract:
In recent years, the econometrics literature has shown a growing interest in the study of partially identified models, in which the object of economic and statistical interest is a set rather than a point. The characterization of this set and the development of consistent estimators and inference procedures for it with desirable properties are the main goals of partial identification analysis. This review introduces the fundamental tools of the theory of random sets, which brings together elements of topology, convex geometry, and probability theory to develop a coherent mathematical framework to analyze random elements whose realizations are sets. It then elucidates how these tools have been fruitfully applied in econometrics to reach the goals of partial identification analysis.