45 results for ACCURATE DOCKING


Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND AND OBJECTIVE: The decision to maintain intensive treatment in cardiac surgical patients with a poor initial outcome is mostly based on individual experience. The risk scoring systems used in cardiac surgery have no prognostic value for individuals. This study aims to assess (a) factors possibly related to poor survival and functional outcomes in cardiac surgery patients requiring prolonged (≥5 days) intensive care unit (ICU) treatment, (b) conditions in which treatment withdrawal might be justified, and (c) the patients' perception of the benefits and drawbacks of long intensive treatments. METHODS: The computerized data prospectively recorded for every patient in the intensive care unit over a 3-year period were reviewed and analyzed (n=1859). Survival and quality of life (QOL) outcomes were determined in all patients who had required ≥5 consecutive days of intensive treatment (n=194; 10.4%). Long-term survivors were interviewed at yearly intervals in a standardized manner, and quality of life was assessed using the Karnofsky dependency score. No interventions or treatments were given, withheld, or withdrawn as part of this study. RESULTS: In-hospital, 1-, and 3-year cumulative survival rates reached 91.3%, 85.6%, and 75.1%, respectively. Quality of life assessed 1 year postoperatively by the Karnofsky score was good in 119 of 165 patients, fair in 32, and poor in 14. Multivariate logistic regression analysis of 19 potential predictors of poor outcome identified dialysis as the sole factor significantly (p=0.027), albeit moderately, reducing long-term survival, and sustained neurological deficit as an inconstant predictor of poor functional outcome (p=0.028). One year postoperatively, 0.63% of patients still recalled severe suffering in the ICU and 20% recalled discomfort. Only 7.7% of patients would definitely refuse redo surgery.
CONCLUSIONS: This study of cardiac surgical patients requiring ≥5 days of intensive treatment did not identify factors unequivocally justifying early treatment limitation in individuals. It found that 1-year mortality and disability rates can be maintained at a low level in this subset of patients, and that severe suffering in the ICU is infrequent.
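The cumulative survival rates reported above are conventionally obtained with a product-limit (Kaplan-Meier) estimator. A minimal sketch follows; the function name and the data in the test are invented for illustration and are not taken from the study.

```python
def km_survival(subjects, horizon):
    """Kaplan-Meier product-limit estimate of S(horizon).

    `subjects` is a list of (time, event) pairs, where event=True marks a
    death at that time and event=False marks censoring (lost to follow-up
    or still alive at last contact).
    """
    subjects = sorted(subjects)
    at_risk = len(subjects)
    surv = 1.0
    i = 0
    while i < len(subjects) and subjects[i][0] <= horizon:
        t = subjects[i][0]
        # count deaths and total subjects leaving the risk set at time t
        j = i
        deaths = 0
        while j < len(subjects) and subjects[j][0] == t:
            deaths += subjects[j][1]
            j += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk   # product-limit step
        at_risk -= (j - i)
        i = j
    return surv
```

Censored subjects shrink the risk set without contributing a survival step, which is what distinguishes this estimator from a naive fraction-alive calculation.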

Relevance:

20.00%

Publisher:

Abstract:

Constructing a 3D surface model from sparse-point data is a nontrivial task. Here, we report an accurate and robust approach for reconstructing a surface model of the proximal femur from sparse-point data and a dense-point distribution model (DPDM). The problem is formulated as a three-stage optimal estimation process. The first stage, affine registration, iteratively estimates a scale and a rigid transformation between the mean surface model of the DPDM and the sparse input points. The estimation results of the first stage are used to establish point correspondences for the second stage, statistical instantiation, which stably instantiates a surface model from the DPDM using a statistical approach. This surface model is then fed to the third stage, kernel-based deformation, which further refines the surface model. Outliers are handled by consistently employing the least trimmed squares (LTS) approach with a roughly estimated outlier rate in all three stages. If an optimal value of the outlier rate is preferred, we propose a hypothesis testing procedure to estimate it automatically. We present our validations using four experiments: (1) a leave-one-out experiment; (2) an experiment evaluating the present approach for handling pathology; (3) an experiment evaluating the present approach for handling outliers; and (4) an experiment reconstructing surface models of seven dry cadaver femurs using clinically relevant data, both without and with added noise. Our validation results demonstrate the robust performance of the present approach in handling outliers, pathology, and noise. An average 95-percentile error of 1.7-2.3 mm was found when the present approach was used to reconstruct surface models of the cadaver femurs from sparse-point data with noise added.
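The least trimmed squares idea used in all three stages can be sketched in a few lines: fit a model repeatedly, but on each pass keep only the h points with the smallest squared residuals, where h reflects the assumed outlier rate. The 1-D location estimate below is an illustrative toy, not the authors' registration code; `lts_location` and its data are invented.

```python
def lts_location(points, outlier_rate, iters=20):
    """Least-trimmed-squares estimate of a 1-D location parameter.

    On each iteration, refit using only the h points with the smallest
    squared residuals, where h excludes the assumed outlier fraction.
    """
    h = max(1, int(round(len(points) * (1.0 - outlier_rate))))
    est = sum(points) / len(points)          # start from the plain mean
    for _ in range(iters):
        by_residual = sorted(points, key=lambda p: (p - est) ** 2)
        kept = by_residual[:h]               # trim the worst residuals
        new_est = sum(kept) / h
        if abs(new_est - est) < 1e-12:       # converged
            break
        est = new_est
    return est
```

With a 20% assumed outlier rate, a single gross outlier is trimmed away and the estimate settles on the inlier cluster, which is the behavior the abstract relies on when registering noisy sparse points.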

Relevance:

20.00%

Publisher:

Abstract:

There is no accepted way of measuring prothrombin time without time loss for patients undergoing major surgery who are at risk of intraoperative dilution and consumption coagulopathy due to bleeding and volume replacement with crystalloids or colloids. Decisions to transfuse fresh frozen plasma and procoagulatory drugs have to rely on clinical judgment in these situations. Point-of-care devices are considerably faster than standard laboratory methods. In this study, we assessed the accuracy of a point-of-care (PoC) device measuring prothrombin time compared with the standard laboratory method. Patients undergoing major surgery and intensive care unit patients were included. PoC prothrombin time was measured by CoaguChek XS Plus (Roche Diagnostics, Switzerland). PoC and reference tests were performed independently and interpreted under blinded conditions. Using a cut-off prothrombin time of 50%, we calculated diagnostic accuracy measures, plotted a receiver operating characteristic (ROC) curve, and tested for equivalence between the two methods. PoC sensitivity and specificity were 95% (95% CI 77%, 100%) and 95% (95% CI 91%, 98%), respectively. The negative likelihood ratio was 0.05 (95% CI 0.01, 0.32). The positive likelihood ratio was 19.57 (95% CI 10.62, 36.06). The area under the ROC curve was 0.988. Equivalence between the two methods was confirmed. The CoaguChek XS Plus is a rapid and highly accurate test compared with the reference test. These findings suggest that PoC testing will be useful for monitoring intraoperative prothrombin time when coagulopathy is suspected. It could lead to a more rational use of expensive and limited blood bank resources.
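The diagnostic accuracy measures quoted above follow directly from a 2x2 confusion table. A minimal sketch, with invented counts chosen only to mirror the reported 95%/95% figures (the study's actual cell counts are not given in the abstract):

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity, specificity and likelihood ratios from a 2x2 table.

    tp/fn: test-positive/negative among truly coagulopathic patients;
    fp/tn: test-positive/negative among patients without coagulopathy.
    """
    sens = tp / (tp + fn)           # true positive rate
    spec = tn / (tn + fp)           # true negative rate
    lr_pos = sens / (1.0 - spec)    # positive likelihood ratio
    lr_neg = (1.0 - sens) / spec    # negative likelihood ratio
    return sens, spec, lr_pos, lr_neg
```

A very small negative likelihood ratio, as reported here (0.05), is what makes a negative PoC result useful for ruling out coagulopathy at the bedside.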

Relevance:

20.00%

Publisher:

Abstract:

Clock synchronization is critical for the operation of a distributed wireless network system. In this paper we investigate a method able to evaluate in real time the synchronization offset between devices down to nanoseconds (as needed for positioning). The method is inspired by signal processing algorithms and relies on fine-grained time information obtained during the reconstruction of the signal at the receiver. Applying the method to a GPS-synchronized system shows that GPS-based synchronization has high accuracy potential but still suffers from short-term clock drift, which limits the achievable localization error.
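One common signal-processing route to such an offset estimate is to cross-correlate a known transmitted sequence against the received samples and read the offset off the correlation peak. The sketch below illustrates only that general idea, not the paper's actual method; the function name, sequences, and sample period are all invented.

```python
def estimate_offset(reference, received, sample_period_ns):
    """Estimate a timing offset (in ns) by sliding a known reference
    sequence over the received samples and taking the lag at which the
    cross-correlation peaks."""
    best_lag, best_score = 0, float("-inf")
    n_lags = len(received) - len(reference) + 1
    for lag in range(n_lags):
        # inner product of the reference with the received window at this lag
        score = sum(r * s for r, s in zip(reference, received[lag:]))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag * sample_period_ns
```

The resolution of this sample-level search is limited by the sample period; sub-sample (nanosecond-scale) resolution of the kind the abstract targets would additionally require interpolating around the correlation peak.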