964 results for Gaussian curve


Relevance: 20.00%

Abstract:

Purpose: Development of an interpolation algorithm for re-sampling spatially distributed CT data with the following features: global and local integral conservation, avoidance of negative interpolation values for positively defined datasets, and the ability to control re-sampling artifacts. Method and Materials: The interpolation can be separated into two steps. First, the discrete CT data have to be continuously distributed by an analytic function that respects the boundary conditions. Generally, this function is determined by piecewise interpolation. Instead of linear or high-order polynomial interpolations, which do not fulfill all of the above features, a special form of Hermitian curve interpolation is used to solve the interpolation problem with respect to the required boundary conditions. A single parameter is determined, by which the behavior of the interpolation function is controlled. Second, the interpolated data have to be re-distributed onto the requested grid. Results: The new algorithm was compared with commonly used interpolation functions based on linear and second-order polynomials. It is demonstrated that these interpolation functions may over- or underestimate the source data by about 10%–20%, while the parameter of the new algorithm can be adjusted to significantly reduce these interpolation errors. Finally, the performance and accuracy of the algorithm were tested by re-gridding a series of X-ray CT images. Conclusion: Inaccurate sampling values may occur due to the lack of integral conservation. Re-sampling algorithms using high-order polynomial interpolation functions may produce significant artifacts in the re-sampled data. Such artifacts can be avoided by using the new algorithm based on Hermitian curve interpolation.
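The abstract does not spell out its parameter-controlled Hermitian scheme, but the overshoot problem it targets can be illustrated with a related shape-preserving Hermite interpolant. A minimal Python sketch, assuming SciPy: a step-like, non-negative profile is interpolated with an unconstrained cubic spline (which rings around the jump and goes negative) and with PCHIP, a monotone cubic Hermite method that keeps the interpolant within the range of the data.

```python
import numpy as np
from scipy.interpolate import CubicSpline, PchipInterpolator

# Step-like, non-negative sample profile (a stand-in for a row of CT values).
x = np.linspace(0.0, 1.0, 9)
y = np.where(x < 0.5, 0.0, 1.0)

xf = np.linspace(0.0, 1.0, 401)
cubic = CubicSpline(x, y)(xf)        # unconstrained piecewise cubics
pchip = PchipInterpolator(x, y)(xf)  # shape-preserving Hermite cubics

# The cubic spline rings near the jump and dips below zero; the monotone
# Hermite interpolant stays within the range of the non-negative data.
print(cubic.min(), pchip.min())
```

Note that PCHIP does not conserve integrals, which is the additional constraint the abstract's algorithm imposes; the sketch only demonstrates the positivity and overshoot issue that motivates the Hermitian approach.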

Relevance: 20.00%

Abstract:

More than 3000 years ago, men began quenching and tempering tools to improve their physical properties. The ancient people found that iron was easier to shape and form in a heated condition. Charcoal was used as the fuel, and when the shaping process was completed, the smiths cooled the piece in the most obvious way, quenching in water. Quite unintentionally, these people stumbled on the process for improving the properties of iron, and the art of blacksmithing began.

Relevance: 20.00%

Abstract:

Fossil pollen data from stratigraphic cores are irregularly spaced in time due to non-linear age-depth relations. Moreover, their marginal distributions may vary over time. We address these features in a nonparametric regression model with errors that are monotone transformations of a latent continuous-time Gaussian process Z(T). Although Z(T) is unobserved, due to monotonicity and under suitable regularity conditions, it can be recovered, facilitating further computations such as estimation of the long-memory parameter and the Hermite coefficients. The estimation of Z(T) itself involves estimation of the marginal distribution function of the regression errors. These issues are considered in proposing a plug-in algorithm for optimal bandwidth selection and construction of confidence bands for the trend function. Some high-resolution time series of pollen records from Lago di Origlio in Switzerland, which go back ca. 20,000 years, are used to illustrate the methods.
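The paper's bandwidth selection and confidence bands are not reproduced here, but the core recovery step can be sketched: when the errors are a strictly monotone transform of a latent Gaussian series, the empirical CDF of the errors followed by the Gaussian quantile function recovers the latent series up to estimation error. A minimal Python illustration, with an AR(1) series standing in for Z(T) on a discrete grid and a hypothetical transform G:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Latent standard-Gaussian surrogate: a stationary AR(1) series with
# unit-variance marginals (stands in for Z(T) on a discrete time grid).
n = 5000
phi = 0.9
z = np.empty(n)
z[0] = rng.standard_normal()
for i in range(1, n):
    z[i] = phi * z[i - 1] + np.sqrt(1 - phi**2) * rng.standard_normal()

# Observed errors: an (unknown to the analyst) monotone transform of Z.
e = np.expm1(z)  # G(z) = exp(z) - 1, strictly increasing

# Recovery: empirical CDF of the errors, then the Gaussian quantile
# transform. Monotone G preserves ranks, so the dependence structure
# of the latent series carries over to the recovered one.
ranks = np.argsort(np.argsort(e))
F_hat = (ranks + 0.5) / n            # keep values strictly inside (0, 1)
z_hat = norm.ppf(F_hat)

print(np.corrcoef(z, z_hat)[0, 1])   # close to 1 for large n
```

This is only the marginal-transform step; the paper additionally handles irregular time spacing, long memory, and trend estimation, which are beyond this sketch.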

Relevance: 20.00%

Abstract:

Several methods based on Kriging have recently been proposed for calculating a probability of failure involving costly-to-evaluate functions. A closely related problem is to estimate the set of inputs leading to a response exceeding a given threshold. Estimating such a level set itself, and not merely its volume, and quantifying the associated estimation uncertainty are not straightforward. Here we use notions from random set theory to obtain an estimate of the level set, together with a quantification of estimation uncertainty. We give explicit formulae in the Gaussian process set-up and provide a consistency result. We then illustrate how space-filling versus adaptive design strategies may sequentially reduce level set estimation uncertainty.
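The abstract's random-set estimate and its explicit formulae are not reproduced here, but a common pointwise surrogate is easy to sketch: under a Gaussian process posterior with mean m(x) and standard deviation s(x), the excursion probability P(f(x) > T) = Φ((m(x) − T)/s(x)) gives both a plug-in level-set estimate and a picture of where the estimate is uncertain. A minimal NumPy/SciPy sketch on a hypothetical 1-D test function:

```python
import numpy as np
from scipy.stats import norm

def rbf(a, b, ell=0.4):
    # Squared-exponential covariance with unit prior variance.
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return np.exp(-0.5 * (d / ell) ** 2)

def f(x):  # toy "expensive" response (illustrative only)
    return np.sin(3.0 * x) + 0.5 * x

T = 0.8                                   # threshold defining {x: f(x) > T}
x_tr = np.linspace(0.0, 3.0, 8)           # small space-filling design
y_tr = f(x_tr)

# Noise-free GP regression (simple kriging with zero mean).
K = rbf(x_tr, x_tr) + 1e-8 * np.eye(len(x_tr))   # jitter for stability
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_tr))

x_g = np.linspace(0.0, 3.0, 400)
K_s = rbf(x_g, x_tr)
mean = K_s @ alpha
v = np.linalg.solve(L, K_s.T)
var = np.clip(1.0 - np.sum(v**2, axis=0), 1e-12, None)

# Pointwise excursion probability under the posterior.
p_exceed = norm.cdf((mean - T) / np.sqrt(var))

estimate = p_exceed > 0.5                            # plug-in level set
uncertain = (p_exceed > 0.05) & (p_exceed < 0.95)    # ambiguous region
print(estimate.sum(), uncertain.sum())
```

An adaptive design in the spirit of the abstract would add evaluations where `uncertain` is True; the random-set formulation additionally quantifies uncertainty about the set as a whole rather than pointwise.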

Relevance: 20.00%

Abstract:

In the context of expensive numerical experiments, a promising solution for alleviating the computational costs is to use partially converged simulations instead of exact solutions. The gain in computational time comes at the price of precision in the response. This work addresses the issue of fitting a Gaussian process model to partially converged simulation data for further use in prediction. The main challenge lies in adequately approximating the error due to partial convergence, which is correlated in both the design-variable and time directions. Here, we propose fitting a Gaussian process in the joint space of design parameters and computational time. The model is constructed by building a nonstationary covariance kernel that accurately reflects the actual structure of the error. Practical solutions are proposed for the parameter estimation issues associated with the proposed model. The method is applied to a computational fluid dynamics test case and shows significant improvement in prediction compared to a classical kriging model.
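The paper's actual kernel is not given in the abstract, but the idea of a covariance that is nonstationary in computational time can be sketched: a stationary term for the converged response plus an error term whose amplitude decays as the simulation converges. A minimal NumPy sketch, with all length scales, amplitudes, and the exponential decay rate chosen purely for illustration:

```python
import numpy as np

def kernel(X1, T1, X2, T2,
           ell_x=0.4, ell_t=5.0, s_conv=1.0, s_err=0.5, rate=0.3):
    """Covariance over joint points (design x, compute time t)."""
    dx = X1.reshape(-1, 1) - X2.reshape(1, -1)
    dt = T1.reshape(-1, 1) - T2.reshape(1, -1)
    # Stationary RBF term for the fully converged response.
    k_conv = s_conv**2 * np.exp(-0.5 * (dx / ell_x) ** 2)
    # Error amplitude sigma(t) = s_err * exp(-rate * t): the variance
    # shrinks with time, making the kernel nonstationary in t. The error
    # is correlated in both the design and time directions via RBFs.
    amp = (s_err * np.exp(-rate * T1)).reshape(-1, 1) * \
          (s_err * np.exp(-rate * T2)).reshape(1, -1)
    k_err = amp * np.exp(-0.5 * (dx / ell_x) ** 2) \
                * np.exp(-0.5 * (dt / ell_t) ** 2)
    return k_conv + k_err

X = np.array([0.1, 0.5, 0.9])      # design points
Tm = np.array([1.0, 5.0, 20.0])    # convergence times of each run
K = kernel(X, Tm, X, Tm)
print(np.diag(K))  # variances shrink toward s_conv**2 as t grows
```

Both summands are valid (positive semi-definite) kernels, so their sum is too; longer-running simulations simply carry less error variance, which is the qualitative behavior the abstract describes.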

Relevance: 20.00%

Abstract:

BACKGROUND: Robotic-assisted laparoscopic surgery (RALS) is evolving as an important surgical approach in the field of colorectal surgery. We aimed to evaluate the learning curve for RALS procedures involving resections of the rectum and rectosigmoid. METHODS: A series of 50 consecutive RALS procedures were performed between August 2008 and September 2009. Data were entered into a retrospective database and later abstracted for analysis. The surgical procedures included abdominoperineal resection (APR), anterior rectosigmoidectomy (AR), low anterior resection (LAR), and rectopexy (RP). Demographic data and intraoperative parameters including docking time (DT), surgeon console time (SCT), and total operative time (OT) were analyzed. The learning curve was evaluated using the cumulative sum (CUSUM) method. RESULTS: The procedures performed for 50 patients (54% male) included 25 AR (50%), 15 LAR (30%), 6 APR (12%), and 4 RP (8%). The mean age of the patients was 54.4 years, the mean BMI was 27.8 kg/m², and the median American Society of Anesthesiologists (ASA) classification was 2. The series had a mean DT of 14 min, a mean SCT of 115.1 min, and a mean OT of 246.1 min. The DT and SCT accounted for 6.3% and 46.8% of the OT, respectively. The SCT learning curve was analyzed. The CUSUM(SCT) learning curve was best modeled as a parabola, with CUSUM(SCT) in minutes equal to 0.73 × case number² − 31.54 × case number − 107.72 (R = 0.93). The learning curve consisted of three unique phases: phase 1 (the initial 15 cases), phase 2 (the middle 10 cases), and phase 3 (the subsequent cases). Phase 1 represented the initial learning curve, which spanned 15 cases. The phase 2 plateau represented increased competence with the robotic technology. Phase 3 was achieved after 25 cases and represented the mastery phase in which more challenging cases were managed.
CONCLUSIONS: The three phases identified with CUSUM analysis of surgeon console time represented characteristic stages of the learning curve for robotic colorectal procedures. The data suggest that the learning phase was achieved after 15 to 25 cases.
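The CUSUM construction behind this kind of analysis is simple to sketch: accumulate each case's deviation from the overall mean console time, then look for changes of slope. A minimal Python illustration on synthetic case times (the values below are invented for illustration and are not the study's data); the sign convention here accumulates (mean − time), so slower-than-average early cases pull the curve down and faster later cases pull it back up, giving an upward-opening parabola as in the study's fitted model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic surgeon console times (minutes) for 50 consecutive cases,
# built from three hypothetical phases: learning, plateau, mastery.
phase1 = rng.normal(150, 15, 15)   # initial learning: slow cases
phase2 = rng.normal(115, 10, 10)   # plateau: near-average times
phase3 = rng.normal(95, 10, 25)    # mastery: faster than average
sct = np.concatenate([phase1, phase2, phase3])

# CUSUM: running sum of deviations from the overall mean. It always
# returns to zero at the last case by construction.
cusum = np.cumsum(sct.mean() - sct)

# Fit the quadratic model CUSUM(n) = a*n^2 + b*n + c used in the study.
n = np.arange(1, len(sct) + 1)
a, b, c = np.polyfit(n, cusum, 2)
print(a, b, c)
```

Changes of slope in the CUSUM curve, rather than the raw case times, are what make the phase boundaries (here after cases 15 and 25) visible.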