20 results for RAY-TRACING ALGORITHM

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

90.00%

Publisher:

Abstract:

A computational study of line-focus generation was performed using a self-written ray-tracing code and compared to experimental data. Two line-focusing geometries were compared: exploiting the sagittal astigmatism of a tilted spherical mirror, or using the spherical aberration of an off-axis-illuminated spherical mirror. Line focusing by means of astigmatism or spherical aberration showed identical results, as expected from the equivalence of the two frames of reference. The variation of the incidence angle on the target affects the line-focus length, which in turn affects the amplification length; as long as the irradiance stays above the amplification threshold, a longer line focus is advantageous. The amplification threshold depends on operating parameters and plasma-column conditions, and the present study addresses four possible cases.
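The astigmatism-based geometry can be illustrated with the textbook focal-length formulas for a tilted spherical mirror: the tangential and sagittal foci separate as the incidence angle grows, and that separation sets the attainable line-focus length. This is a minimal sketch with made-up numbers, not the authors' ray-tracing code:

```python
import math

def astigmatic_foci(R, theta):
    """Tangential and sagittal focal lengths of a spherical mirror of
    radius R illuminated at incidence angle theta (radians).
    Standard textbook relations: f_t = (R/2) cos(theta), f_s = R/(2 cos(theta))."""
    f_t = 0.5 * R * math.cos(theta)
    f_s = 0.5 * R / math.cos(theta)
    return f_t, f_s

# The focal separation grows with incidence angle, lengthening the line focus.
for deg in (10, 30, 45):
    th = math.radians(deg)
    ft, fs = astigmatic_foci(1.0, th)   # R = 1 m, arbitrary
    print(deg, "deg -> focal separation", round(fs - ft, 4), "m")
```

At normal incidence the two foci coincide and the line focus collapses to a point focus.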

Relevance:

90.00%

Publisher:

Abstract:

Refractive losses in laser-produced plasmas used as gain media are caused by electron density gradients and limit the energy transport range. The pump pulse is thus deflected from the high-gain region, and the short-wavelength laser signal also steers away, causing loss of collimation. A Hohlraum used as a target makes the plasma homogeneous and can mitigate refractive losses by means of wave-guiding. A computational study combining a hydrodynamics code and an atomic physics code is presented, which includes ray-tracing modeling based on the eikonal theory of the trajectory equation. This study presents gain calculations based on population inversion produced by free-electron collisions exciting bound electrons into metastable levels in the 3d⁹4d¹ (J = 0) → 3d⁹4p¹ (J = 1) transition of Ni-like Sn. Furthermore, the Hohlraum results suggest a dramatic enhancement of the conversion efficiency of collisionally excited x-ray lasing for Ni-like Sn.
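The eikonal ray-tracing idea can be sketched by integrating the trajectory equation d/ds (n dr/ds) = ∇n with the plasma refractive index n = sqrt(1 − ne/nc). The linear density profile, launch conditions, and step sizes below are invented for illustration; this is not the code used in the study:

```python
import numpy as np

def trace_ray(ne0_over_nc, grad_ne, r0, d0, ds=1e-3, steps=2000):
    """Integrate d/ds (n dr/ds) = grad(n) for a 2D ray in a plasma with a
    linear density profile ne/nc = ne0_over_nc + grad_ne . r (toy model)."""
    def n_of(r):
        return np.sqrt(max(1e-6, 1.0 - (ne0_over_nc + grad_ne @ r)))
    r = np.array(r0, float)
    t = np.array(d0, float) / np.linalg.norm(d0)   # unit ray direction
    path = [r.copy()]
    eps = 1e-6
    for _ in range(steps):
        n = n_of(r)
        # numerical gradient of the refractive index
        gn = np.array([(n_of(r + np.eye(2)[i] * eps) - n) / eps for i in range(2)])
        p = n * t + gn * ds        # advance the ray momentum p = n*t
        t = p / np.linalg.norm(p)
        r = r + t * ds
        path.append(r.copy())
    return np.array(path)

# A ray launched along +x through a density ramp in +y refracts toward
# lower density (-y), i.e. away from the high-gain region.
path = trace_ray(0.2, np.array([0.0, 0.1]), [0.0, 0.0], [1.0, 0.0])
print(path[-1])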

Relevance:

90.00%

Publisher:

Abstract:

We introduce gradient-domain rendering for Monte Carlo image synthesis. While previous gradient-domain Metropolis Light Transport sought to distribute more samples in areas of high gradients, we show, in contrast, that estimating image gradients is also possible using standard (non-Metropolis) Monte Carlo algorithms, and furthermore, that even without changing the sample distribution, this often leads to significant error reduction. This broadens the applicability of gradient rendering considerably. To gain insight into the conditions under which gradient-domain sampling is beneficial, we present a frequency analysis that compares Monte Carlo sampling of gradients followed by Poisson reconstruction to traditional Monte Carlo sampling. Finally, we describe Gradient-Domain Path Tracing (G-PT), a relatively simple modification of the standard path tracing algorithm that can yield far superior results.
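The core idea — that Poisson-reconstructing independently sampled gradients can beat plain Monte Carlo estimation — can be demonstrated on a 1D toy signal. The noise levels and the screened-Poisson weighting below are invented for illustration; the paper operates on rendered images and path-space gradients:

```python
import numpy as np

def poisson_reconstruct_1d(primal, grad, alpha=0.2):
    """Screened-Poisson reconstruction of a 1D signal from noisy primal
    samples and noisy finite-difference gradients.
    Minimizes alpha*|x - primal|^2 + |D x - grad|^2 in closed form."""
    n = len(primal)
    D = np.diff(np.eye(n), axis=0)        # forward-difference operator, (n-1) x n
    A = alpha * np.eye(n) + D.T @ D
    b = alpha * primal + D.T @ grad
    return np.linalg.solve(A, b)

rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0, np.pi, 64))
primal = truth + rng.normal(0, 0.3, 64)          # noisy "path-traced" estimate
grad = np.diff(truth) + rng.normal(0, 0.02, 63)  # gradient estimates are less noisy
recon = poisson_reconstruct_1d(primal, grad)
# Reconstruction error is markedly lower than the primal-only error:
print(float(np.mean((primal - truth) ** 2)), float(np.mean((recon - truth) ** 2)))
```

The benefit hinges on the gradient estimates being less noisy than the primal ones, which is exactly the regime the paper's frequency analysis characterizes.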

Relevance:

80.00%

Publisher:

Abstract:

In this paper, we report on an optical tolerance analysis of the submillimeter atmospheric multi-beam limb sounder STEAMR. Physical optics and ray-tracing methods were used to quantify and separate errors in beam pointing and distortion due to reflector misalignment and primary reflector surface deformations. Simulations were performed concurrently with the manufacturing of a multi-beam demonstrator of the relay optical system, which shapes and images the beams to their corresponding receiver feed horns. Results from Monte Carlo simulations show that the inserts used for reflector mounting should be positioned with an overall accuracy better than 100 μm (~1/10 wavelength). Analyses of primary reflector surface deformations show that a deviation of magnitude 100 μm can be tolerated before deployment, whereas the corresponding variations should be less than 30 μm during operation. The most sensitive optical elements in terms of misalignments are found near the focal plane. This localized sensitivity is attributed to the off-axis nature of the beams at this location. Post-assembly mechanical measurements of the reflectors in the demonstrator show that alignment better than 50 μm could be obtained.
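A Monte Carlo tolerance analysis of this kind can be sketched as follows: draw random mounting errors, propagate each through a (here drastically simplified) pointing model, and read off a percentile of the resulting error distribution. The lever arm, mount span, and small-angle model are invented placeholders, not STEAMR values:

```python
import numpy as np

def pointing_error(insert_err_um, lever_mm=500.0, mount_span_mm=100.0):
    """Beam pointing offset (mm at the focal plane) caused by a mirror mount
    insert displaced by insert_err_um micrometres across the mount span.
    Small-angle toy model: a reflected beam deviates by twice the mirror tilt."""
    tilt_rad = (insert_err_um * 1e-3) / mount_span_mm
    return 2.0 * tilt_rad * lever_mm

rng = np.random.default_rng(1)
# Monte Carlo: sample insert position errors at the 100 um (1 sigma) level
errors_um = np.abs(rng.normal(0.0, 100.0, 10_000))
offsets = pointing_error(errors_um)
print(round(float(np.percentile(offsets, 95)), 2), "mm (95th percentile)")
```

In the real analysis each draw would perturb all reflectors at once and the beams would be re-propagated with physical optics; the statistical skeleton is the same.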

Relevance:

80.00%

Publisher:

Abstract:

The visible reflectance spectrum of many Solar System bodies changes with changing viewing geometry for reasons not fully understood. It is often observed to redden (increasing spectral slope) with increasing solar phase angle, an effect known as phase reddening. Only once, in an observation of the martian surface by the Viking 1 lander, was reddening observed up to a certain phase angle with bluing beyond, making the reflectance ratio as a function of phase angle shaped like an arch. However, in laboratory experiments this arch-shape is frequently encountered. To investigate this, we measured the bidirectional reflectance of particulate samples of several common rock types in the 400–1000 nm wavelength range and performed ray-tracing simulations. We confirm the occurrence of the arch for surfaces that are forward scattering, i.e. are composed of semi-transparent particles and are smooth on the scale of the particles, and for which the reflectance increases from the lower to the higher wavelength in the reflectance ratio. The arch shape is reproduced by the simulations, which assume a smooth surface. However, surface roughness on the scale of the particles, such as the Hapke and van Horn (Hapke, B., van Horn, H. [1963]. J. Geophys. Res. 68, 4545–4570) fairy castles that can spontaneously form when sprinkling a fine powder, leads to monotonic reddening. A further consequence of this form of microscopic roughness (being indistinct without the use of a microscope) is a flattening of the disk function at visible wavelengths, i.e. Lommel–Seeliger-type scattering. The experiments further reveal monotonic reddening for reflectance ratios at near-IR wavelengths. The simulations fail to reproduce this particular reddening, and we suspect that it results from roughness on the surface of the particles. 
Given that the regolith of atmosphereless Solar System bodies is composed of small particles, our results indicate that the prevalence of monotonic reddening and Lommel–Seeliger-type scattering for these bodies results from microscopic roughness, both in the form of structures built by the particles and roughness on the surface of the particles themselves. It follows from the singular Viking 1 observation that the surface in front of the lander was composed of semi-transparent particles, and was smooth on the scale of the particle size.
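The reflectance ratio whose phase-angle behavior defines reddening, bluing, and the arch is simply the reflectance at a longer wavelength divided by that at a shorter one. A minimal sketch with synthetic stand-in spectra (the wavelength pair and slopes are arbitrary choices, not the measured data):

```python
import numpy as np

def reflectance_ratio(refl, wavelengths, lo=550.0, hi=700.0):
    """Ratio of reflectance at two wavelengths (hi/lo). An increase of this
    ratio with solar phase angle is reddening; a decrease is bluing."""
    r_lo, r_hi = np.interp([lo, hi], wavelengths, refl)
    return r_hi / r_lo

# Toy spectra at two phase angles: a steeper red slope at the larger
# phase angle makes the ratio grow, i.e. phase reddening.
wl = np.linspace(400, 1000, 61)
spec_20deg = 0.20 + 1e-4 * (wl - 400)
spec_80deg = 0.20 + 2e-4 * (wl - 400)
print(reflectance_ratio(spec_20deg, wl), reflectance_ratio(spec_80deg, wl))
```

The arch shape corresponds to this ratio first rising and then falling as the phase angle keeps increasing.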

Relevance:

80.00%

Publisher:

Abstract:

Aims. The OSIRIS camera onboard the Rosetta spacecraft obtained close-up views of the dust coma of comet 67P. The jet structures can be used to trace their source regions and to examine the possible effect of gas-surface interaction. Methods. We analyzed the wide-angle images obtained in the special dust observation sequences between August and September 2014. The jet features detected in different images were compared to study their time variability. The locations of the potential source regions of some of the jets were identified by ray tracing. We used a ring-masking technique to calculate the brightness distribution of the dust jets along the projected distance. Results. The jets detected between August and September 2014 mostly originated in the Hapi region. Morphological changes appeared over a timescale of several days in September. The brightness slope of the dust jets is much steeper than that of the background coma. This might be related to the sublimation or fragmentation of the emitted dust grains. Interaction of the expanding gas flow with the cliff walls on both sides of Hapi could lead to erosion and down-fall of material onto the nucleus surface.
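The ring-masking technique amounts to averaging image brightness over concentric annuli around the nucleus to obtain brightness versus projected distance. A minimal sketch on a synthetic coma (the 1/ρ brightness law and image size are invented; this is not the authors' pipeline):

```python
import numpy as np

def radial_profile(img, center, dr=1.0):
    """Azimuthally averaged brightness vs. projected distance, computed by
    masking concentric rings of width dr around `center` (x, y)."""
    yy, xx = np.indices(img.shape)
    r = np.hypot(xx - center[0], yy - center[1])
    nbins = int(r.max() / dr) + 1
    prof = np.full(nbins, np.nan)
    for i in range(nbins):
        ring = (r >= i * dr) & (r < (i + 1) * dr)
        if ring.any():
            prof[i] = img[ring].mean()
    return prof

# Synthetic coma: brightness falling off as 1/rho from the nucleus at (50, 50)
yy, xx = np.indices((101, 101))
rho = np.hypot(xx - 50, yy - 50) + 1.0
prof = radial_profile(1.0 / rho, (50, 50), dr=5.0)
print(prof[:6])   # profile decreases outward
```

Fitting a power law to such a profile gives the brightness slope that the abstract compares between jets and background coma.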

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a new approach for reconstructing a patient-specific shape model and internal relative intensity distribution of the proximal femur from a limited number (e.g., 2) of calibrated C-arm images or X-ray radiographs. Our approach uses independent shape and appearance models that are learned from a set of training data to encode the a priori information about the proximal femur. An intensity-based non-rigid 2D-3D registration algorithm is then proposed to deformably fit the learned models to the input images. The fitting is conducted iteratively by minimizing the dissimilarity between the input images and the associated digitally reconstructed radiographs of the learned models together with regularization terms encoding the strain energy of the forward deformation and the smoothness of the inverse deformation. Comprehensive experiments conducted on images of cadaveric femurs and on clinical datasets demonstrate the efficacy of the present approach.
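The fitting step can be caricatured with a linear statistical shape model fitted iteratively to a 2D projection: minimize the dissimilarity between the projected model and the target, plus a small regularizer. The shape model, the projection (dropping one axis as a stand-in for a DRR), and the ridge term below are drastic simplifications of the described intensity-based 2D-3D registration:

```python
import numpy as np

# Toy statistical shape model: 3D points = mean + deformation modes @ coeffs
npts = 20
mean_shape = np.zeros((npts, 3))
modes = np.zeros((npts, 3, 2))
modes[:, 0, 0] = 1.0                       # mode 0: uniform x displacement
modes[:, 1, 1] = 1.0                       # mode 1: uniform y displacement
P = np.array([[1.0, 0, 0], [0, 1.0, 0]])   # projection: drop z, like a DRR

def project(coeffs):
    return (mean_shape + modes @ coeffs) @ P.T

target = project(np.array([0.8, -0.5]))    # stands in for the input radiograph

# Iterative fit: gradient descent on the dissimilarity + small ridge regularizer
c = np.zeros(2)
for _ in range(100):
    resid = project(c) - target
    grad = np.array([np.sum(resid * (modes[..., k] @ P.T)) for k in range(2)])
    c -= 0.04 * (grad + 1e-3 * c)
print(np.round(c, 3))   # recovers approximately [0.8, -0.5]
```

The real method compares image intensities rather than point positions and regularizes with strain energy and inverse-deformation smoothness, but the iterate-until-the-projection-matches loop is the same shape.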

Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION: Guidelines for the treatment of patients in severe hypothermia, and particularly in hypothermic cardiac arrest, recommend rewarming using extracorporeal circulation (ECC). However, guidelines for the further in-hospital diagnostic and therapeutic approach of these patients, who often suffer from additional injuries (especially avalanche casualties), are lacking. The lack of such algorithms may relevantly delay treatment and put patients at further risk. Together with a multidisciplinary team, the Emergency Department at the University Hospital in Bern, a level I trauma centre, created an algorithm for the in-hospital treatment of patients with hypothermic cardiac arrest. This algorithm primarily focuses on the decision-making process for the administration of ECC. THE BERNESE HYPOTHERMIA ALGORITHM: The major difference between the traditional approach, where all hypothermic patients are primarily admitted to the emergency centre, and our new algorithm is that hypothermic cardiac arrest patients without obvious signs of severe trauma are taken to the operating theatre without delay. Subsequently, the interdisciplinary team decides whether to rewarm the patient using ECC based on a standard clinical trauma assessment, serum potassium levels, core body temperature, sonographic examinations of the abdomen, pleural space and pericardium, as well as a pelvic X-ray, if needed. During ECC, sonography is repeated, and haemodynamic function as well as haemoglobin levels are regularly monitored. Standard radiological investigations according to the local multiple-trauma protocol are performed only after ECC. Transfer to the intensive care unit, where mild therapeutic hypothermia is maintained for another 12 h, should not be delayed by additional X-rays for minor injuries. DISCUSSION: The presented algorithm is intended to facilitate in-hospital decision-making and shorten the door-to-reperfusion time for patients with hypothermic cardiac arrest.
It was the result of intensive collaboration between different specialties and highlights the importance of high-quality teamwork for rare cases of severe accidental hypothermia. Information derived from the new International Hypothermia Registry will help to answer open questions and further optimize the algorithm.

Relevance:

30.00%

Publisher:

Abstract:

Despite numerous studies of nitrogen cycling in forest ecosystems, many uncertainties remain, especially regarding longer-term nitrogen accumulation. To contribute to filling this gap, the dynamic process-based model TRACE, which can simulate 15N tracer redistribution in forest ecosystems, was used to study N cycling processes in a mountain spruce forest at the northern edge of the Alps in Switzerland (Alptal, SZ). Most modeling analyses of N cycling and C-N interactions have very limited ability to determine whether the process interactions are captured correctly. Because the interactions in such a system are complex, it is possible to get the whole-system C and N cycling right in a model without really knowing whether the way the model combines fine-scale interactions to derive whole-system cycling is correct. With the possibility to simulate 15N tracer redistribution in ecosystem compartments, TRACE offers a very powerful tool for validating the fine-scale processes captured by the model. We first adapted the model to the new site (Alptal, Switzerland; long-term low-dose N-amendment experiment) by including a new algorithm for preferential water flow and by parameterizing differences in drivers such as climate, N deposition and initial site conditions. After calibrating key rates such as NPP and SOM turnover, we simulated patterns of 15N redistribution for comparison against 15N field observations from a large-scale labeling experiment. The comparison of the 15N field data with the modeled redistribution of the tracer in the soil horizons and vegetation compartments shows that the majority of fine-scale processes are captured satisfactorily. In particular, the model is able to reproduce the fact that the largest part of the N deposition is immobilized in the soil.
The discrepancies in 15N recovery in the LF and M soil horizons can be explained by the application method of the tracer and by the retention of the applied tracer by the well-developed moss layer, which is not considered in the model. Discrepancies in the dynamics of foliage and litterfall 15N recovery were also observed and are related to the longevity of the needles in our mountain forest. As a next step, we will use the final Alptal version of the model to calculate the effects of climate change (temperature, CO2) and N deposition on ecosystem C sequestration in this regionally representative Norway spruce (Picea abies) stand.

Relevance:

30.00%

Publisher:

Abstract:

A previously presented algorithm for the reconstruction of bremsstrahlung spectra from transmission data has been implemented in MATHEMATICA. Vectorial algebra was used to solve the matrix system A * F = T. The new implementation was tested by reconstructing photon spectra from transmission data acquired in narrow-beam conditions, for nominal energies of 6, 15, and 25 MV. The results were in excellent agreement with the original calculations. Our implementation has the advantage of being based on a well-tested mathematical kernel. Furthermore, it offers a comfortable user interface.
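The structure of the reconstruction problem A * F = T can be illustrated with a small least-squares solve: each row of A holds the transmission factors exp(-mu_i * d_j) of one absorber thickness across the energy bins, and F is the fluence spectrum to recover. The attenuation coefficients and thicknesses below are invented, and NumPy stands in for the MATHEMATICA kernel:

```python
import numpy as np

# Energy-bin attenuation coefficients (1/cm) and absorber thicknesses (cm)
# -- illustrative values only.
mu = np.array([0.2, 0.1, 0.05])
d = np.array([0.0, 1.0, 2.0, 4.0, 8.0])

A = np.exp(-np.outer(d, mu))             # transmission matrix, one row per thickness
F_true = np.array([0.5, 0.3, 0.2])       # normalized fluence per energy bin
T = A @ F_true                           # simulated transmission measurements

# Least-squares solve of the (overdetermined) system A F = T
F, *_ = np.linalg.lstsq(A, T, rcond=None)
print(np.round(F, 3))                    # recovers [0.5, 0.3, 0.2]
```

With noisy measured transmission data the system is ill-conditioned and typically needs regularization, which is where the original algorithm's specifics matter.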

Relevance:

30.00%

Publisher:

Abstract:

When patients enter our emergency room with suspected multiple injuries, Statscan provides a full-body anterior and lateral image for initial diagnosis, and then zooms in on specific smaller areas for a more detailed evaluation. In order to examine the possible role of Statscan in the management of multiply injured patients, we implemented a modified ATLS® algorithm, in which X-rays of the C-spine, chest and pelvis were replaced by a single total-body a.p./lat. radiograph. Between 15 October 2006 and 1 February 2007, 143 trauma patients (mean ISS 15+/-14 (3-75)) were included. We compared the time in the resuscitation room with that of 650 patients (mean ISS 14+/-14 (3-75)) treated between 1 January 2002 and 1 January 2004 according to the conventional ATLS protocol. The total-body scanning time was 3.5 min (3-6 min) compared to 25.7 min (8-48 min) for conventional X-rays. The total ER time was unchanged: 28.7 min (13-58 min) compared to 29.1 min (15-65 min) using conventional plain radiography. In 116/143 patients, additional CT scans were necessary. In 98/116, full-body trauma CT scans were performed. In 18/116 patients, selective CT scans were ordered based on Statscan findings. In 43/143, additional conventional X-rays had to be performed, mainly due to inadequate a.p. views of fractured bones. All radiographs were transmitted over the hospital network (Picture Archiving and Communication System, PACS) for immediate simultaneous viewing at different places. The rapid availability of images for interpretation, owing to their digital nature, and the reduced need for repeat exposures because of faulty radiography are also felt to be strengths.

Relevance:

30.00%

Publisher:

Abstract:

The problem of re-sampling spatially distributed data organized into regular or irregular grids to finer or coarser resolution is a common task in data processing. This procedure is known as 'gridding' or 're-binning'. Depending on the quantity the data represent, the gridding algorithm has to meet different requirements. For example, histogrammed physical quantities such as mass or energy have to be re-binned in order to conserve the overall integral. Moreover, if the quantity is positive definite, negative sampling values should be avoided. The gridding process requires a re-distribution of the original data set to a user-requested grid according to a distribution function. The distribution function can be determined on the basis of the given data by interpolation methods. In general, accurate interpolation with respect to multiple boundary conditions of heavily fluctuating data requires polynomial interpolation functions of second or even higher order. However, this may result in unrealistic deviations (overshoots or undershoots) of the interpolation function from the data. Accordingly, the re-sampled data may overestimate or underestimate the given data by a significant amount. The gridding algorithm presented in this work was developed to overcome these problems. Instead of a straightforward interpolation of the given data using high-order polynomials, a parametrized Hermitian interpolation curve is used to approximate the integrated data set. A single parameter is provided by which the user can control the behavior of the interpolation function, i.e. the amount of overshoot and undershoot. Furthermore, it is shown how the algorithm can be extended to multidimensional grids. The algorithm was compared to commonly used gridding algorithms using linear and cubic interpolation functions. It is shown that such interpolation functions may overestimate or underestimate the source data by about 10-20%, while the new algorithm can be tuned to significantly reduce these interpolation errors. The accuracy of the new algorithm was tested on a series of X-ray CT images (head and neck, lung, pelvis). The new algorithm significantly improves the accuracy of the sampled images in terms of the mean square error and a quality index introduced by Wang and Bovik (2002 IEEE Signal Process. Lett. 9 81-4).
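The integral-conserving re-binning idea can be sketched by interpolating the cumulative integral of the histogram and differencing it on the new bin edges. Linear interpolation of the cumulative stands in here for the paper's parametrized Hermitian curve (it is trivially monotone, so no negative bins appear for positive data), and the bin values are made up:

```python
import numpy as np

def rebin_conservative(edges_in, values, edges_out):
    """Re-bin histogrammed data by interpolating its cumulative integral and
    differencing on the new bin edges. This conserves the overall integral;
    a monotone interpolant of the cumulative also avoids negative bins."""
    cum = np.concatenate(([0.0], np.cumsum(values)))   # cumulative at input edges
    cum_out = np.interp(edges_out, edges_in, cum)      # interpolate the integral
    return np.diff(cum_out)                            # difference -> new bins

edges_in = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
values = np.array([4.0, 1.0, 3.0, 2.0])     # e.g. energy deposited per bin
edges_out = np.linspace(0.0, 4.0, 9)        # re-bin onto a finer grid
out = rebin_conservative(edges_in, values, edges_out)
print(out, out.sum())                       # total equals values.sum()
```

Swapping the linear interpolant for a tunable Hermitian one is exactly where the paper's single overshoot-control parameter enters.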

Relevance:

30.00%

Publisher:

Abstract:

The alveolated structure of the pulmonary acinus plays a vital role in gas exchange function. Three-dimensional (3D) analysis of the parenchymal region is fundamental to understanding this structure-function relationship, but only a limited number of attempts have been made in the past because of technical limitations. In this study, we developed a new image processing methodology based on finite element (FE) analysis for accurate 3D structural reconstruction of the gas exchange regions of the lung. Stereologically well-characterized rat lung samples (Pediatr Res 53: 72-80, 2003) were imaged using high-resolution synchrotron-radiation-based X-ray tomographic microscopy. A stack of 1,024 images (each slice: 1024 × 1024 pixels) with a resolution of 1.4 μm³ per voxel was generated. For the development of the FE algorithm, regions of interest (ROIs) containing approximately 7.5 million voxels were extracted as a working subunit. 3D FEs were created by overlaying the voxel map using a grid-based hexahedral algorithm. A proper threshold value for appropriate segmentation was iteratively determined so that the calculated volume density of tissue matched the stereologically determined value (Pediatr Res 53: 72-80, 2003). The resulting 3D FEs are ready to be used for 3D structural analysis as well as for subsequent FE computational analyses such as fluid dynamics and skeletonization.
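The iterative threshold determination can be sketched as a bisection on the segmented tissue volume fraction: raise or lower the gray-value threshold until the fraction of voxels above it matches the stereological target. The synthetic volume and target fraction below are invented for illustration:

```python
import numpy as np

def find_threshold(volume, target_tissue_fraction, tol=1e-3):
    """Bisect a gray-value threshold so that the fraction of voxels classified
    as tissue (>= threshold) matches a stereologically determined target."""
    lo, hi = float(volume.min()), float(volume.max())
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        frac = float(np.mean(volume >= mid))
        if frac > target_tissue_fraction:
            lo = mid        # too many voxels counted as tissue: raise threshold
        else:
            hi = mid        # too few: lower threshold
    return 0.5 * (lo + hi)

rng = np.random.default_rng(3)
vox = rng.random((32, 32, 32))          # synthetic gray-value volume
thr = find_threshold(vox, 0.25)
print(float(np.mean(vox >= thr)))       # close to the target of 0.25
```

In the described methodology the target fraction comes from the stereological measurements, making the segmentation consistent with the independent reference data.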