4 results for Re-sampled Uniform

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

90.00%

Publisher:

Abstract:

Purpose: Development of an interpolation algorithm for re-sampling spatially distributed CT data with the following features: global and local integral conservation, avoidance of negative interpolation values for positively defined datasets, and the ability to control re-sampling artifacts. Method and Materials: The interpolation can be separated into two steps: first, the discrete CT data have to be continuously distributed by an analytic function considering the boundary conditions. Generally, this function is determined by piecewise interpolation. Instead of using linear or high-order polynomial interpolations, which do not fulfill all of the above-mentioned features, a special form of Hermitian curve interpolation is used to solve the interpolation problem with respect to the required boundary conditions. A single parameter is determined by which the behavior of the interpolation function is controlled. Second, the interpolated data have to be re-distributed with respect to the requested grid. Results: The new algorithm was compared with commonly used interpolation functions based on linear and second-order polynomials. It is demonstrated that these interpolation functions may over- or underestimate the source data by about 10%-20%, while the parameter of the new algorithm can be adjusted to significantly reduce these interpolation errors. Finally, the performance and accuracy of the algorithm were tested by re-gridding a series of X-ray CT images. Conclusion: Inaccurate sampling values may occur due to the lack of integral conservation. Re-sampling algorithms using high-order polynomial interpolation functions may result in significant artifacts in the re-sampled data. Such artifacts can be avoided by using the new algorithm based on Hermitian curve interpolation.
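The key idea in the abstract (interpolate the integrated data, then re-difference on the target grid) guarantees integral conservation by construction. A minimal sketch of that principle, using linear interpolation of the cumulative integral as a stand-in for the paper's parametrized Hermitian curve (the function name and signature are illustrative, not from the paper):

```python
import numpy as np

def rebin_conservative(edges_src, values, edges_dst):
    """Integral-conserving re-binning of histogrammed data.

    Builds the cumulative integral of the source histogram,
    interpolates it at the target bin edges, and differences the
    result. Any monotone interpolant of the cumulative curve
    conserves the total integral; the paper replaces the linear
    interpolant used here with a parametrized Hermitian curve to
    also control overshoot.
    """
    widths = np.diff(edges_src)
    # cumulative integral at each source bin edge, starting at 0
    cum = np.concatenate(([0.0], np.cumsum(values * widths)))
    # evaluate the cumulative curve on the requested grid
    cum_dst = np.interp(edges_dst, edges_src, cum)
    # bin averages on the target grid
    return np.diff(cum_dst) / np.diff(edges_dst)
```

Because only the cumulative curve is interpolated, the total integral on the new grid equals the original one exactly, whatever the target bin edges are.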

Relevance:

80.00%

Publisher:

Abstract:

The aim of this study was to analyse the cerebral venous outflow in relation to the arterial inflow during a Valsalva manoeuvre (VM). In 19 healthy volunteers (mean age 24.1 +/- 2.6 years), the middle cerebral artery (MCA) and the straight sinus (SRS) were insonated by transcranial Doppler sonography. Simultaneously, the arterial blood pressure was recorded using a photoplethysmographic method. Two VMs of 10 s duration were performed per participant. Tracings of the variables were then transformed to equidistantly re-sampled data. The phases of the VM were analysed with regard to the increase of the flow velocities and the latency to the peak. The typical four phases of the VM were also found in the SRS signal. The relative flow velocity (FV) increase was significantly higher in the SRS than in the MCA for all phases, particularly phase IV (p < 0.01). Comparison of the time latency of the VM phases of the MCA and SRS showed a significant difference only for phase I (p < 0.01). In particular, there was no significant difference for phase IV (15.8 +/- 0.29 vs. 16.0 +/- 0.28 s). Alterations in venous outflow in phase I are best explained by a cross-sectional change of the lumen of the SRS, while phases II and III are compatible with a Starling resistor. However, the significantly larger venous than arterial overshoot in phase IV may be explained by active regulation of the venous tone.
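The step "tracings were transformed to equidistantly re-sampled data" can be sketched as follows; the abstract does not state which interpolation scheme was used, so linear interpolation and the sampling rate are assumptions for illustration:

```python
import numpy as np

def resample_equidistant(t, x, fs):
    """Re-sample an irregularly timed tracing (e.g. a Doppler flow
    velocity signal) onto an equidistant time grid at rate fs [Hz],
    using linear interpolation between the original samples."""
    t = np.asarray(t, float)
    x = np.asarray(x, float)
    t_uniform = np.arange(t[0], t[-1], 1.0 / fs)
    return t_uniform, np.interp(t_uniform, t, x)
```

On the uniform grid, latencies between corresponding phase peaks of the MCA and SRS signals can then be compared sample by sample.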

Relevance:

80.00%

Publisher:

Abstract:

The problem of re-sampling spatially distributed data organized into regular or irregular grids to finer or coarser resolution is a common task in data processing. This procedure is known as 'gridding' or 're-binning'. Depending on the quantity the data represents, the gridding-algorithm has to meet different requirements. For example, histogrammed physical quantities such as mass or energy have to be re-binned in order to conserve the overall integral. Moreover, if the quantity is positive definite, negative sampling values should be avoided. The gridding process requires a re-distribution of the original data set to a user-requested grid according to a distribution function. The distribution function can be determined on the basis of the given data by interpolation methods. In general, accurate interpolation with respect to multiple boundary conditions of heavily fluctuating data requires polynomial interpolation functions of second or even higher order. However, this may result in unrealistic deviations (overshoots or undershoots) of the interpolation function from the data. Accordingly, the re-sampled data may overestimate or underestimate the given data by a significant amount. The gridding-algorithm presented in this work was developed in order to overcome these problems. Instead of a straightforward interpolation of the given data using high-order polynomials, a parametrized Hermitian interpolation curve was used to approximate the integrated data set. A single parameter is determined by which the user can control the behavior of the interpolation function, i.e. the amount of overshoot and undershoot. Furthermore, it is shown how the algorithm can be extended to multidimensional grids. The algorithm was compared to commonly used gridding-algorithms using linear and cubic interpolation functions. It is shown that such interpolation functions may overestimate or underestimate the source data by about 10-20%, while the new algorithm can be tuned to significantly reduce these interpolation errors. The accuracy of the new algorithm was tested on a series of X-ray CT images (head and neck, lung, pelvis). The new algorithm significantly improves the accuracy of the sampled images in terms of the mean square error and a quality index introduced by Wang and Bovik (2002 IEEE Signal Process. Lett. 9 81-4).
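The abstract describes a parametrized Hermitian interpolation curve whose single parameter controls overshoot and undershoot. The paper's exact parametrization is not given; a minimal sketch of the idea, using the standard cubic Hermite basis with a hypothetical tangent-scaling parameter c:

```python
import numpy as np

def hermite_segment(y0, y1, m0, m1, t, c=1.0):
    """Evaluate a cubic Hermite curve on one unit interval,
    t in [0, 1], with endpoint values y0, y1 and tangents m0, m1.

    The factor c is an illustrative stand-in for the single control
    parameter described in the abstract: it scales the tangents, so
    c = 0 forces zero slopes at the endpoints and the segment stays
    within [min(y0, y1), max(y0, y1)] (no overshoot or undershoot),
    while c = 1 gives the standard cubic Hermite interpolant.
    """
    h00 = 2 * t**3 - 3 * t**2 + 1   # basis for y0
    h10 = t**3 - 2 * t**2 + t       # basis for tangent m0
    h01 = -2 * t**3 + 3 * t**2      # basis for y1
    h11 = t**3 - t**2               # basis for tangent m1
    return h00 * y0 + c * h10 * m0 + h01 * y1 + c * h11 * m1
```

Applied to the cumulative (integrated) data set, damping the tangents trades smoothness for boundedness, which is the trade-off the abstract's single parameter exposes to the user.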

Relevance:

30.00%

Publisher:

Abstract:

ABSTRACT: BACKGROUND: Translocation of nanoparticles (NP) from the pulmonary airways into other pulmonary compartments or the systemic circulation is controversially discussed in the literature. In a previous study it was shown that titanium dioxide (TiO2) NP were "distributed in four lung compartments (air-filled spaces, epithelium/endothelium, connective tissue, capillary lumen) in correlation with compartment size". It was concluded that particles can move freely between these tissue compartments. To analyze whether the distribution of TiO2 NP in the lungs is truly random or shows preferential targeting, we applied a newly developed method for comparing NP distributions. METHODS: Rat lungs exposed to an aerosol containing TiO2 NP were prepared for light and electron microscopy at 1 h and at 24 h after exposure. Numbers of TiO2 NP associated with each compartment were counted using energy-filtering transmission electron microscopy. Compartment size was estimated by unbiased stereology from systematically sampled light micrographs. Numbers of particles were related to compartment size using a relative deposition index and chi-squared analysis. RESULTS: Nanoparticle distribution within the four compartments was not random at 1 h or at 24 h after exposure. At 1 h the connective tissue was the preferential target of the particles. At 24 h the NP were preferentially located in the capillary lumen. CONCLUSION: We conclude that TiO2 NP do not move freely between pulmonary tissue compartments, although they can pass from one compartment to another with relative ease. The residence time of NP in each tissue compartment of the respiratory system depends on the compartment and the time after exposure. It is suggested that a small fraction of TiO2 NP are rapidly transported from the airway lumen to the connective tissue and subsequently released into the systemic circulation.
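The statistical step in the methods (relating particle counts to compartment size via a relative deposition index and chi-squared analysis) can be sketched as follows; the function name and the example numbers are illustrative, not the study's data:

```python
import numpy as np

def deposition_analysis(counts, sizes):
    """Relative deposition index (RDI) and chi-squared statistic.

    For each compartment, the expected count under a random
    distribution is proportional to compartment size. RDI is the
    observed count divided by this expected count, so RDI close to 1
    indicates random deposition, RDI > 1 preferential targeting.
    The chi-squared statistic summarizes the deviation of the whole
    observed distribution from the random expectation.
    """
    counts = np.asarray(counts, float)
    sizes = np.asarray(sizes, float)
    expected = counts.sum() * sizes / sizes.sum()
    rdi = counts / expected
    chi2 = np.sum((counts - expected) ** 2 / expected)
    return rdi, chi2
```

Comparing the chi-squared statistic against the critical value for the appropriate degrees of freedom (number of compartments minus one) then tests whether the deposition pattern is compatible with a random distribution.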