963 results for: Riemann, superfici, genere, curve
Abstract:
Fluid optimization is a major contributor to improved outcomes in patients. Unfortunately, anesthesiologists are often in doubt whether an additional fluid bolus will improve the patient's hemodynamics or not, as excess fluid may even jeopardize the patient's condition. This article discusses physiological concepts of liberal versus restrictive fluid management, followed by a discussion of the respective capabilities of various monitors to predict fluid responsiveness. The parameter difference in pulse pressure (dPP), derived from heart-lung interaction in mechanically ventilated patients, is discussed in detail. The dPP cutoff value of 13% for predicting fluid responsiveness is presented, together with several techniques for assessing dPP. Finally, confounding variables affecting dPP measurements, such as ventilation parameters, pneumoperitoneum, and the use of norepinephrine, are also discussed.
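For a concrete sense of the 13% threshold, the commonly cited definition of dPP over one ventilatory cycle can be sketched as follows (the abstract states only the cutoff; the formula used and the function names below are illustrative, not taken from the article):

```python
# Commonly cited pulse-pressure-variation formula (a sketch; the abstract
# gives only the 13% cutoff, the definition below is the standard one):
#   dPP (%) = 100 * (PPmax - PPmin) / ((PPmax + PPmin) / 2)
def dpp_percent(pp_max, pp_min):
    """Difference in pulse pressure over one ventilatory cycle, in percent."""
    mean_pp = (pp_max + pp_min) / 2.0
    return 100.0 * (pp_max - pp_min) / mean_pp

def fluid_responsive(pp_max, pp_min, cutoff=13.0):
    """Predict fluid responsiveness using the 13% cutoff from the abstract."""
    return dpp_percent(pp_max, pp_min) > cutoff

# Example: pulse pressure swings between 52 and 40 mmHg during one breath.
# dPP = 100 * 12 / 46 = ~26.1%, above the 13% cutoff -> predicted responder.
```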
Abstract:
Re-sampling spatially distributed data organized on regular or irregular grids to a finer or coarser resolution is a common task in data processing. This procedure is known as 'gridding' or 're-binning'. Depending on the quantity the data represent, the gridding algorithm has to meet different requirements. For example, histogrammed physical quantities such as mass or energy have to be re-binned so as to conserve the overall integral. Moreover, if the quantity is positive definite, negative sampling values should be avoided. The gridding process requires a redistribution of the original data set to a user-requested grid according to a distribution function. The distribution function can be determined from the given data by interpolation methods. In general, accurate interpolation of heavily fluctuating data with respect to multiple boundary conditions requires polynomial interpolation functions of second or even higher order. However, this may result in unrealistic deviations (overshoots or undershoots) of the interpolation function from the data. Accordingly, the re-sampled data may over- or underestimate the given data by a significant amount. The gridding algorithm presented in this work was developed to overcome these problems. Instead of a straightforward interpolation of the given data using high-order polynomials, a parametrized Hermitian interpolation curve is used to approximate the integrated data set. A single parameter is determined by which the user can control the behavior of the interpolation function, i.e. the amount of overshoot and undershoot. Furthermore, it is shown how the algorithm can be extended to multidimensional grids. The algorithm was compared to commonly used gridding algorithms based on linear and cubic interpolation functions.
It is shown that such interpolation functions may over- or underestimate the source data by about 10-20%, whereas the new algorithm can be tuned to significantly reduce these interpolation errors. The accuracy of the new algorithm was tested on a series of X-ray CT images (head and neck, lung, pelvis). The new algorithm significantly improves the accuracy of the sampled images in terms of the mean square error and the quality index introduced by Wang and Bovik (2002 IEEE Signal Process. Lett. 9 81-4).
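The core idea of re-binning via the integrated data set can be sketched in a few lines. The version below uses plain linear interpolation of the cumulative integral rather than the article's parametrized Hermitian curve, but it already illustrates the exact integral conservation discussed above (the function name and example grids are illustrative):

```python
import numpy as np

def conservative_rebin(edges_src, values, edges_dst):
    """Re-bin histogrammed data onto new bin edges while conserving the
    total integral: interpolate the cumulative integral F(x) at the target
    edges and difference it.  This is a minimal sketch of the
    'integrate, interpolate, differentiate' idea; the article's method
    replaces the linear pieces with a parametrized Hermitian curve."""
    cum = np.concatenate(([0.0], np.cumsum(values)))   # F at source edges
    cum_dst = np.interp(edges_dst, edges_src, cum)     # F at target edges
    return np.diff(cum_dst)

# Example: re-bin four unit-width bins onto two double-width bins.
src = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
vals = np.array([1.0, 3.0, 2.0, 6.0])
out = conservative_rebin(src, vals, np.array([0.0, 2.0, 4.0]))
# out == [4., 8.]; the total (12) is conserved exactly.
```

Because the target edges coincide with the source range, the difference F(4) - F(0) is reproduced exactly, which is the global integral-conservation property the abstract requires.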
Abstract:
Purpose: Development of an interpolation algorithm for re-sampling spatially distributed CT data with the following features: global and local integral conservation, avoidance of negative interpolation values for positively defined data sets, and the ability to control re-sampling artifacts. Method and Materials: The interpolation can be separated into two steps: first, the discrete CT data have to be continuously distributed by an analytic function that satisfies the boundary conditions. Generally, this function is determined by piecewise interpolation. Instead of using linear or high-order polynomial interpolations, which do not fulfill all the features mentioned above, a special form of Hermitian curve interpolation is used to solve the interpolation problem with respect to the required boundary conditions. A single parameter is determined by which the behavior of the interpolation function is controlled. Second, the interpolated data have to be redistributed with respect to the requested grid. Results: The new algorithm was compared with commonly used interpolation functions based on linear and second-order polynomials. It is demonstrated that these interpolation functions may over- or underestimate the source data by about 10%-20%, while the parameter of the new algorithm can be adjusted to significantly reduce these interpolation errors. Finally, the performance and accuracy of the algorithm were tested by re-gridding a series of X-ray CT images. Conclusion: Inaccurate sampling values may occur due to a lack of integral conservation. Re-sampling algorithms using high-order polynomial interpolation functions may produce significant artifacts in the re-sampled data. Such artifacts can be avoided by using the new algorithm based on Hermitian curve interpolation.
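A minimal reconstruction of the parametrized Hermitian step may help. The sketch below interpolates the cumulative integral with cubic Hermite pieces whose tangents are scaled by a single parameter `s`: `s = 0` gives a monotone interpolant (no over/undershoot, hence no negative bins for positive data), while `s = 1` uses full central-difference tangents, which are smoother but may overshoot. The abstract does not spell out the paper's exact tangent parametrization, so this particular choice is an assumption:

```python
def hermite_rebin(edges, values, new_edges, s=0.0):
    """Conservative re-binning via a cubic Hermite interpolant of the
    cumulative integral F.  The single parameter s in [0, 1] scales the
    central-difference tangents and thereby controls over/undershoot
    (an illustrative reconstruction, not the paper's exact scheme)."""
    n = len(values)                       # len(edges) == n + 1
    F = [0.0]
    for v in values:                      # cumulative integral at source edges
        F.append(F[-1] + v)
    m = []                                # scaled tangents dF/dx at the edges
    for i in range(n + 1):
        if i == 0:
            d = (F[1] - F[0]) / (edges[1] - edges[0])
        elif i == n:
            d = (F[n] - F[n - 1]) / (edges[n] - edges[n - 1])
        else:
            d = (F[i + 1] - F[i - 1]) / (edges[i + 1] - edges[i - 1])
        m.append(s * d)

    def F_at(x):
        i = 0                             # locate interval, clamp to range
        while i < n - 1 and x > edges[i + 1]:
            i += 1
        h = edges[i + 1] - edges[i]
        t = min(max((x - edges[i]) / h, 0.0), 1.0)
        h00 = 2 * t**3 - 3 * t**2 + 1     # Hermite basis polynomials
        h10 = t**3 - 2 * t**2 + t
        h01 = -2 * t**3 + 3 * t**2
        h11 = t**3 - t**2
        return h00 * F[i] + h10 * h * m[i] + h01 * F[i + 1] + h11 * h * m[i + 1]

    Fd = [F_at(x) for x in new_edges]     # F at target edges, then difference
    return [b - a for a, b in zip(Fd, Fd[1:])]
```

Since the interpolant reproduces F exactly at the first and last edges, the total integral is conserved for any choice of `s`; with `s = 0` the interpolant is also monotone, so refining a positive data set never produces negative bins.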
Abstract:
More than 3000 years ago, men began quenching and tempering tools to improve their physical properties. The ancient people found that iron was easier to shape and form in a heated condition. Charcoal was used as the fuel, and when the shaping process was completed, the smiths cooled the piece in the most obvious way, quenching in water. Quite unintentionally, these people stumbled on the process for improving the properties of iron, and the art of blacksmithing began.
Abstract:
BACKGROUND: Robotic-assisted laparoscopic surgery (RALS) is evolving as an important surgical approach in the field of colorectal surgery. We aimed to evaluate the learning curve for RALS procedures involving resections of the rectum and rectosigmoid. METHODS: A series of 50 consecutive RALS procedures were performed between August 2008 and September 2009. Data were entered into a retrospective database and later abstracted for analysis. The surgical procedures included abdominoperineal resection (APR), anterior rectosigmoidectomy (AR), low anterior resection (LAR), and rectopexy (RP). Demographic data and intraoperative parameters including docking time (DT), surgeon console time (SCT), and total operative time (OT) were analyzed. The learning curve was evaluated using the cumulative sum (CUSUM) method. RESULTS: The procedures performed for 50 patients (54% male) included 25 AR (50%), 15 LAR (30%), 6 APR (12%), and 4 RP (8%). The mean age of the patients was 54.4 years, the mean BMI was 27.8 kg/m², and the median American Society of Anesthesiologists (ASA) classification was 2. The series had a mean DT of 14 min, a mean SCT of 115.1 min, and a mean OT of 246.1 min. The DT and SCT accounted for 6.3% and 46.8% of the OT, respectively. The SCT learning curve was analyzed. The CUSUM(SCT) learning curve was best modeled as a parabola, with equation CUSUM(SCT) (in minutes) = 0.73 × (case number)² - 31.54 × (case number) - 107.72 (R = 0.93). The learning curve consisted of three unique phases: phase 1 (the initial 15 cases), phase 2 (the middle 10 cases), and phase 3 (the subsequent cases). Phase 1 represented the initial learning curve, which spanned 15 cases. The phase 2 plateau represented increased competence with the robotic technology. Phase 3 was achieved after 25 cases and represented the mastery phase in which more challenging cases were managed.
CONCLUSIONS: The three phases identified with CUSUM analysis of surgeon console time represented characteristic stages of the learning curve for robotic colorectal procedures. The data suggest that the learning phase was achieved after 15 to 25 cases.
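The CUSUM technique itself is straightforward to sketch: accumulate deviations of each case's console time from a reference value. The series mean is a common choice of reference, though the abstract does not state which reference the authors used, and the case times below are made up for illustration:

```python
def cusum(times, target=None):
    """Cumulative sum of deviations from a target value (here the series
    mean, a common choice; the study's exact reference is not stated in
    the abstract).  A falling curve means cases faster than the target
    (learning gains); a plateau or rise marks a phase transition."""
    if target is None:
        target = sum(times) / len(times)
    out, running = [], 0.0
    for t in times:
        running += t - target
        out.append(running)
    return out

# Illustrative console times (minutes) for six hypothetical cases:
sct = [150, 140, 120, 110, 95, 75]
curve = cusum(sct)
# With the mean as target, the final CUSUM value is always ~0; the shape
# of the curve (here rising then falling) is what reveals the phases.
```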
Abstract:
We study the tuning curve of entangled photons generated by type-0 spontaneous parametric down-conversion in a periodically poled potassium titanyl phosphate crystal. We demonstrate the X-shaped spatiotemporal structure of the spectrum by means of measurements and numerical simulations. Experiments for different pump waists, crystal temperatures, and crystal lengths are in good agreement with numerical simulations.
Abstract:
Palynology provides the opportunity to make inferences about changes in the diversity of terrestrial vegetation over long time scales. The often coarse taxonomic level achievable in pollen analysis, differences in pollen production and dispersal, and the lack of pollen source boundaries hamper the application of diversity indices to palynology. Palynological richness, the number of pollen types at a constant pollen count, is the most robust and widely used diversity indicator for pollen data. However, this index is also influenced by the abundance distribution of pollen types in sediments. In particular, where the index is calculated by rarefaction analysis, information on taxonomic richness at low abundance may be lost. Here we explore the information that can be extracted from the accumulation of taxa over consecutive samples. The log-transformed taxa accumulation curve can be broken up into linear sections with different slope and intercept parameters, each describing the accumulation of new taxa within the section. The break points may indicate changes in the species pool or in the abundance of high versus low pollen producers. Testing this concept on three pollen diagrams from different landscapes, we find that the break points in the taxa accumulation curves provide convenient zones for identifying changes in richness and evenness. The linear regressions over consecutive samples can be used to inter- and extrapolate to low or extremely high pollen counts, indicating evenness and richness in taxonomic composition within these zones. An evenness indicator based on the rank-order abundance is used to assist in the evaluation of the results and the interpretation of the fossil records. Two central European pollen diagrams show major changes in the taxa accumulation curves for the Lateglacial period and the time of human-induced land-use changes, while they do not indicate strong changes in the species pool at the onset of the Holocene.
In contrast, a central Swedish pollen diagram shows comparatively little change, but high richness during the early Holocene forest establishment. Evenness and palynological richness are related for most periods in the three diagrams; however, sections before forest establishment and after forest clearance show high evenness that is not necessarily accompanied by high palynological richness, encouraging efforts to separate the two.
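Rarefaction to a constant pollen count, mentioned above as the basis of palynological richness, follows Hurlbert's standard formula and can be sketched as follows (the taxon counts in the example are hypothetical):

```python
from math import comb

def rarefied_richness(counts, n):
    """Expected number of taxa in a random subsample of n pollen grains
    (Hurlbert rarefaction, the standard way to compute palynological
    richness at a constant pollen count):
        E[S_n] = sum_i (1 - C(N - N_i, n) / C(N, n))
    where N_i is the count of taxon i and N the total count."""
    N = sum(counts)
    if n > N:
        raise ValueError("subsample larger than total count")
    return sum(1 - comb(N - Ni, n) / comb(N, n) for Ni in counts)

# Example: three taxa with counts 50, 30, 20 (N = 100).
# Rarefying to n = N = 100 returns the observed richness (3 taxa);
# rarefying to n = 1 returns 1, since a single grain is one taxon.
```

Because rare taxa contribute little to the expectation at small n, this also makes concrete why information on low-abundance taxa may be lost in rarefaction, as noted above.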
Abstract:
Authored by Christiano Hartmanno Samuele Gatzert