997 results for Predictive Monitoring
Abstract:
The article provides a method for long-term forecasting of frame alignment losses based on bit-error-rate monitoring for structure-agnostic circuit emulation service over Ethernet in a mobile backhaul network. The method, together with its corresponding algorithm, detects instants of probable frame alignment loss well in advance, giving engineering personnel extra time to take measures aimed at preventing the losses. Moreover, the long-term forecast of frame alignment losses supports decisions about the volume of TDM data encapsulated into a circuit emulation frame, increasing utilization of the emulated circuit. The forecasting method, formalized in the corresponding algorithm, is recognized as cognitive and can act as part of a network predictive monitoring system.
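The abstract describes the method only at a high level. As a purely illustrative sketch of the general idea (not the article's algorithm), the Python snippet below fits a linear trend to log-scaled BER samples and extrapolates the instant at which the BER would cross a loss-probability threshold; the threshold value and the synthetic data are assumptions.

```python
# Hypothetical sketch: extrapolating a BER trend to estimate when frame
# alignment loss becomes probable. Threshold and data are assumptions,
# not values from the article.
import numpy as np

def forecast_loss_instant(times_s, ber_samples, ber_threshold=1e-4):
    """Fit a linear trend to log10(BER) and return the estimated time (s)
    at which BER crosses ber_threshold, or None if BER is not degrading."""
    log_ber = np.log10(ber_samples)
    slope, intercept = np.polyfit(times_s, log_ber, 1)
    if slope <= 0:
        return None  # BER flat or improving; no loss predicted
    t_cross = (np.log10(ber_threshold) - intercept) / slope
    return max(t_cross, times_s[-1])

# Example: BER drifting upward over a 10-minute monitoring window
t = np.arange(0, 600, 60, dtype=float)
ber = 1e-7 * 10 ** (t / 1200)  # synthetic upward drift
print(forecast_loss_instant(t, ber))  # ~3600 s: predicted crossing instant
```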
Abstract:
Service-Oriented Computing (SOC) is a widely accepted paradigm for the development of flexible, distributed and adaptable software systems, in which service compositions perform more complex, higher-level, often cross-organizational tasks using atomic services or other service compositions.
In such systems, Quality of Service (QoS) properties, such as performance, cost, availability or security, are critical for the usability of services and their compositions in concrete applications. Analysis of these properties can become more precise and richer in information if it employs program analysis techniques, such as complexity and sharing analyses, which are able to simultaneously take into account both the control and the data structures, dependencies, and operations in a composition. Computation cost analysis for service compositions can support predictive monitoring and proactive adaptation by automatically inferring computation cost as upper- and lower-bound functions of the value or size of input messages. These cost functions can be used for adaptation by selecting service candidates that minimize the total cost of the composition, based on the actual data passed to them. The cost functions can also be combined with empirically collected infrastructural parameters to produce QoS bound functions of input data that can predict, at the moment of invocation, potential or imminent Service Level Agreement (SLA) violations. In mission-critical compositions, effective and accurate continuous QoS prediction can be achieved by constraint modeling of composition QoS based on its structure, runtime data, and (when available) the results of complexity analysis. This approach can be applied to service orchestrations with centralized flow control, as well as to choreographies with multiple participants and complex stateful interactions. Sharing analysis can support adaptation actions, such as parallelization, fragmentation, and component selection, which are based on functional dependencies and the information content of the composition's messages, internal data, and activities, in the presence of complex control constructs such as loops, branches, and sub-workflows. Both the functional dependencies and the information content (described using user-defined attributes) can be expressed using a first-order logic (Horn clause) representation, and the analysis results can be interpreted as lattice-based conceptual models.
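To make the cost-driven candidate selection concrete, here is a hypothetical Python sketch: each candidate service is described by an upper-bound cost function of input size, and the candidate minimizing the predicted cost for the actual input is chosen. The cost functions are invented stand-ins for the automatically inferred bounds described above.

```python
# Illustrative sketch only: pick the service candidate whose upper-bound
# cost function, evaluated on the actual input size, is minimal. The cost
# functions below are invented examples, not output of the described analysis.
from typing import Callable, Dict

# Upper-bound cost as a function of input message size n (e.g., list length)
candidates: Dict[str, Callable[[int], float]] = {
    "serviceA": lambda n: 5.0 + 0.8 * n,   # linear bound
    "serviceB": lambda n: 0.02 * n * n,    # quadratic bound
    "serviceC": lambda n: 40.0,            # constant bound
}

def select_candidate(input_size: int) -> str:
    """Return the candidate with the smallest predicted upper-bound cost."""
    return min(candidates, key=lambda name: candidates[name](input_size))

print(select_candidate(10))    # small input  -> serviceB (0.02*100 = 2)
print(select_candidate(1000))  # large input  -> serviceC (constant 40)
```

The same bound functions, combined with measured infrastructural parameters, would yield the QoS bounds used for SLA-violation prediction at invocation time.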
Abstract:
Background. Recurrent nerve injury is one of the most important complications of thyroidectomy. During the last decade, nerve monitoring has gained increasing acceptance in several centers as a method to predict and document nerve function at the end of the operation. We evaluated the efficacy of a nerve monitoring system in a series of patients who underwent thyroidectomy and critically analyzed the negative predictive value (NPV) and positive predictive value (PPV) of the method. Methods. NIM System efficacy was prospectively analyzed in 447 patients who underwent thyroidectomy between 2001 and 2008 (366 female/81 male; 420 white/47 nonwhite; 11 to 82 years of age; median, 43 years). There were 421 total thyroidectomies and 26 partial thyroidectomies, for a total of 868 nerves at risk. The gold standard for evaluating inferior laryngeal nerve function was early postoperative videolaryngoscopy, repeated after 4 to 6 months in all patients with abnormal endoscopic findings. Results. At the early evaluation, 858 nerves (98.8%) presented normal videolaryngoscopic features after surgery. Ten paretic/paralyzed nerves (1.2%) were detected (2 unexpected unilateral pareses, 2 unexpected bilateral pareses, 1 unexpected unilateral paralysis, 1 unexpected bilateral paralysis, and 1 expected unilateral paralysis). At the late videolaryngoscopy, only 2 permanent nerve paralyses were noted (0.2%), for an ultimate result of 99.8% functioning nerves. Nerve monitoring showed absent or markedly reduced electrical activity at the end of the operation in 25/868 nerves (2.9%), including all 10 endoscopically compromised nerves, with 15 false-positive results. There were no false-negative results. Therefore, the PPV was 40.0%, and the NPV was 100%. Conclusions. In the present series, nerve monitoring had a very high NPV but a low PPV for the detection of recurrent nerve injury. (C) 2011 Wiley Periodicals, Inc. Head Neck 34: 175-179, 2012
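The reported predictive values follow directly from the counts given in the abstract (25 monitor-positive nerves, of which 10 were truly compromised, and no false negatives among the remaining 843); a quick arithmetic check in Python:

```python
# Worked check of the predictive values, using the counts from the abstract:
# 868 nerves at risk, 25 monitor-positive (10 true, 15 false), no false negatives.
tp, fp, fn = 10, 15, 0
tn = 868 - (tp + fp + fn)  # 843 monitor-negative nerves, all truly intact

ppv = tp / (tp + fp)  # 10 / 25
npv = tn / (tn + fn)  # 843 / 843

print(f"PPV = {ppv:.1%}")  # 40.0%, as reported
print(f"NPV = {npv:.1%}")  # 100.0%, as reported
```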
Abstract:
INTRODUCTION: The incidence of bloodstream infection (BSI) in extracorporeal life support (ECLS) is reported between 0.9 and 19.5%. In January 2006, the Extracorporeal Life Support Organization (ELSO) reported an overall incidence of 8.78%, distributed as follows: respiratory: 6.5% (neonatal), 20.8% (pediatric); cardiac: 8.2% (neonatal) and 12.6% (pediatric). METHOD: At BC Children's Hospital (BCCH), daily surveillance blood cultures (BC) are performed and antibiotic prophylaxis is not routinely recommended. Positive BC (BC+) were reviewed, including resistance profiles, collection time of BC+, time to positivity and mortality. White blood cell count, absolute neutrophil count, immature/total ratio, platelet count, fibrinogen and lactate were analyzed 48, 24 and 0 h prior to BSI. A univariate linear regression analysis was performed. RESULTS: From 1999 to 2005, 89 patients underwent ECLS. After exclusions, 84 patients were reviewed. The attack rate was 22.6% (19 BSI), or 13.1% after exclusion of coagulase-negative staphylococci (n = 8). BSI patients were on ECLS significantly longer (157 h) than the no-BSI group (127 h, 95% CI: 106-148). Six BSI patients died on ECLS (35%; 4 congenital diaphragmatic hernias, 1 hypoplastic left heart syndrome and 1 after tetralogy repair). BCCH survival was 71% on ECLS and 58% at discharge, comparable to previous reports. No patient died primarily because of BSI. No BSI predictor was identified, although lactate may show a decreasing trend before BSI (P = 0.102). CONCLUSION: Compared with the ELSO data, the studied BSI incidence was higher, with comparable mortality. We speculate that our BSI rate is explained by underreporting of "contaminants" in the literature, the use of broad-spectrum antibiotic prophylaxis elsewhere, and a higher yield with daily surveillance BC. We support daily surveillance blood cultures as an alternative to antibiotic prophylaxis in the management of patients on ECLS.
Abstract:
The increased use of vancomycin in hospitals has made monitoring of serum vancomycin levels a standard practice because of possible nephrotoxicity. However, routine monitoring of vancomycin serum concentration has come under criticism, and its cost-effectiveness is in question, because frequent monitoring neither increases efficacy nor decreases nephrotoxicity. The purpose of the present study was to determine the factors that may place patients at increased risk of developing vancomycin-induced nephrotoxicity and for whom monitoring may be most beneficial. From September to December 1992, 752 consecutive inpatients at The University of Texas M. D. Anderson Cancer Center, Houston, were prospectively evaluated for nephrotoxicity in order to describe predictive risk factors for developing vancomycin-related nephrotoxicity. Ninety-five patients (13 percent) developed nephrotoxicity. A total of 299 patients (40 percent) were considered monitored (vancomycin serum levels determined during the course of therapy), and 346 patients (46 percent) were receiving concurrent moderately to highly nephrotoxic drugs. Factors significantly associated with nephrotoxicity in univariate analysis were: gender, baseline serum creatinine greater than 1.5 mg/dl, monitoring, leukemia, concurrent moderately to highly nephrotoxic drugs, and APACHE III scores of 40 or more. Significant factors in the univariate analysis were then entered into a stepwise logistic regression analysis to determine independent predictive risk factors for vancomycin-induced nephrotoxicity. Factors, with their corresponding odds ratios and 95% confidence limits, selected by stepwise logistic regression analysis as predictive of vancomycin-induced nephrotoxicity were: concurrent therapy with moderately to highly nephrotoxic drugs (2.89; 1.76-4.74), APACHE III scores of 40 or more (1.98; 1.16-3.38), and male gender (1.98; 1.04-2.71). Subgroup (monitored and non-monitored) analysis showed that male gender (OR = 1.87; 95% CI = 1.01, 3.45) and moderately to highly nephrotoxic drugs (OR = 4.58; 95% CI = 2.11, 9.94) were significant for nephrotoxicity in monitored patients, whereas only APACHE III score (OR = 2.67; 95% CI = 1.13, 6.29) was significant in non-monitored patients. The conclusion drawn from this study is that not every patient receiving vancomycin therapy needs frequent monitoring of vancomycin serum levels. Such monitoring may be appropriate in patients with one or more of the identified risk factors; low-risk patients need not be subjected to the discomfort and added cost of multiple blood samplings. Prudent selection of patients to monitor may decrease costs to patients and the hospital.
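As a sketch of how such odds ratios are obtained, the snippet below fits a logistic regression with statsmodels on simulated data whose log-odds roughly encode the reported ORs; it is illustrative only and does not use the study's data.

```python
# Minimal sketch of deriving odds ratios from a logistic regression, in the
# spirit of the study's stepwise analysis. All data here are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
nephrotoxic_drugs = rng.integers(0, 2, n)
high_apache = rng.integers(0, 2, n)   # APACHE III score >= 40
male = rng.integers(0, 2, n)

# Simulate the outcome with log-odds roughly matching the reported ORs
logit = (-2.5 + np.log(2.89) * nephrotoxic_drugs
         + np.log(1.98) * high_apache + np.log(1.98) * male)
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

X = sm.add_constant(np.column_stack([nephrotoxic_drugs, high_apache, male]))
res = sm.Logit(y, X).fit(disp=0)
print(np.exp(res.params[1:]))       # odds ratios for the three covariates
print(np.exp(res.conf_int()[1:]))   # their 95% confidence limits
```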
Abstract:
Objectives We studied the relationship between changes in body composition and changes in blood pressure levels. Background The mechanisms underlying the frequently observed progression from pre-hypertension to hypertension are poorly understood. Methods We examined 1,145 subjects from a population-based survey at baseline in 1994/1995 and at follow-up in 2004/2005. First, we studied individuals pre-hypertensive at baseline who, during 10 years of follow-up, either had normalized blood pressure (PreNorm, n = 48), persistently had pre-hypertension (PrePre, n = 134), or showed progression to hypertension (PreHyp, n = 183). In parallel, we studied predictors for changes in blood pressure category in individuals hypertensive at baseline (n = 429). Results After 10 years, the PreHyp group was characterized by a marked increase in body weight (+5.71% [95% confidence interval (CI): 4.60% to 6.83%]) that was largely the result of an increase in fat mass (+17.8% [95% CI: 14.5% to 21.0%]). In the PrePre group, both the increases in body weight (+1.95% [95% CI: 0.68% to 3.22%]) and fat mass (+8.09% [95% CI: 4.42% to 11.7%]) were significantly less pronounced than in the PreHyp group (p < 0.001 for both). The PreNorm group showed no significant change in body weight (-1.55% [95% CI: -3.70% to 0.61%]) and fat mass (+0.20% [95% CI: -6.13% to 6.52%], p < 0.05 for both, vs. the PrePre group). Conclusions After 10 years of follow-up, hypertension developed in 50.1% of individuals with pre-hypertension and only 6.76% went from hypertensive to pre-hypertensive blood pressure levels. An increase in body weight and fat mass was a risk factor for the development of sustained hypertension, whereas a decrease was predictive of a decrease in blood pressure. (J Am Coll Cardiol 2010; 56: 65-76) (C) 2010 by the American College of Cardiology Foundation
Abstract:
The identification of predictors of the progression of chronic Chagas cardiomyopathy (CCC) is essential to ensure adequate patient management. This study examined a non-concurrent cohort of 165 CCC patients between 1985 and 2010 for independent predictors of CCC progression. The outcomes were worsening of the CCC scores and the onset of left ventricular dysfunction assessed by means of echo-Doppler cardiography. Patients were analyzed for social, demographic, epidemiologic, clinical and workup-related variables. A descriptive analysis was conducted, followed by survival curves based on univariate (Kaplan-Meier and Cox's univariate model) and multivariate (Cox regression model) analyses. Patients were followed for two to 20 years (mean: 8.2). Their mean age was 44.8 years (range: 20-77). Between the two study evaluations, there was a statistically significant increase in the PR interval and in the QRS duration, despite a reduction in heart rate (Wilcoxon < 0.01). The predictors of CCC progression in the final regression model were male gender (HR = 2.81), Holter monitoring showing pauses of two seconds or longer (HR = 3.02), increased cardiothoracic ratio (HR = 7.87) and duration of digitalis use (HR = 1.41). Patients with multiple predictive factors require stricter follow-up and treatment.
Abstract:
OBJECTIVE: Risk stratification of patients with nonsustained ventricular tachycardia (NSVT) and chronic chagasic cardiomyopathy (CCC). METHODS: Seventy-eight patients with CCC and NSVT were consecutively and prospectively studied. All patients underwent 24-hour Holter monitoring, radioisotopic ventriculography, left ventricular angiography, and electrophysiologic study with programmed ventricular stimulation. RESULTS: Sustained monomorphic ventricular tachycardia (SMVT) was induced in 25 patients (32%), NSVT in 20 (25.6%) and ventricular fibrillation in 4 (5.1%). In 29 patients (37.2%) no arrhythmia was inducible. During a 55.7-month follow-up, 22 (28.2%) patients died: 16 of sudden death, 2 of nonsudden cardiac death and 4 of noncardiac death. Logistic regression analysis showed that inducibility was the main independent variable predicting the occurrence of subsequent events and cardiac death (odds ratios of 2.56 and 2.17, respectively). The Mantel-Haenszel chi-square test showed that survival probability was significantly lower in the inducible group than in the noninducible group. The percentage of patients free of events was significantly higher in the noninducible group. CONCLUSION: Induction of SMVT during programmed ventricular stimulation was a predictor of arrhythmia occurrence, cardiac death and general mortality in patients with CCC and NSVT.
Abstract:
Background: The importance of measuring blood pressure before morning micturition and in the afternoon, while working, is yet to be established in relation to the accuracy of home blood pressure monitoring (HBPM). Objective: To compare two HBPM protocols, considering 24-hour ambulatory blood pressure monitoring (wakefulness ABPM) as the gold standard and measurements taken before morning micturition (BM) and in the afternoon (AM), for the best diagnosis of systemic arterial hypertension (SAH), and their association with prognostic markers. Methods: After undergoing 24-hour wakefulness ABPM, 158 participants (84 women) were randomized to 3- or 5-day HBPM. Two variations of the 3-day protocol were considered: with measurements taken before morning micturition and in the afternoon (BM+AM), and with post-morning-micturition and evening measurements (PM+EM). All patients underwent echocardiography (for left ventricular hypertrophy, LVH) and urinary albumin measurement (for microalbuminuria, MAU). Results: Kappa statistics for the diagnosis of SAH between wakefulness ABPM and standard 3-day HBPM, 3-day HBPM (BM+AM), 3-day HBPM (PM+EM), and 5-day HBPM were 0.660, 0.638, 0.348 and 0.387, respectively. The sensitivity of (BM+AM) versus (PM+EM) was 82.6% vs. 71%, and the specificity 84.8% vs. 74%. The positive and negative predictive values were 69.1% vs. 40% and 92.2% vs. 91.2%, respectively. The intraclass correlations for the diagnosis of LVH and MAU for (BM+AM) vs. (PM+EM) were 0.782 vs. 0.474 and 0.511 vs. 0.276, respectively. Conclusions: The 3-day HBPM protocol including measurements taken before morning micturition and during work in the afternoon showed the best agreement with the SAH diagnosis and the best association with prognostic markers.
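For readers unfamiliar with the agreement statistics used here, the following sketch computes Cohen's kappa, sensitivity and specificity from a 2x2 diagnostic table; the counts are invented for illustration, since the abstract reports only the derived values.

```python
# Hedged sketch of the agreement statistics used to compare protocols.
# The 2x2 counts below are invented; the abstract reports only the
# resulting kappa, sensitivity and specificity figures.
def kappa_sens_spec(tp, fp, fn, tn):
    n = tp + fp + fn + tn
    po = (tp + tn) / n  # observed agreement between HBPM and ABPM
    # chance agreement: product of marginal "positive" and "negative" rates
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (po - pe) / (1 - pe)
    sens = tp / (tp + fn)   # HBPM-positive among ABPM-positive
    spec = tn / (tn + fp)   # HBPM-negative among ABPM-negative
    return kappa, sens, spec

# Invented counts for a cohort of 158, as in the study
print(kappa_sens_spec(tp=57, fp=10, fn=12, tn=79))
```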
Abstract:
Background: 30-40% of cardiac resynchronization therapy cases do not achieve favorable outcomes. Objective: This study aimed to develop predictive models for the combined endpoint of cardiac death and transplantation (Tx) at different stages of cardiac resynchronization therapy (CRT). Methods: Prospective observational study of 116 patients aged 64.8 ± 11.1 years, 68.1% of whom had functional class (FC) III and 31.9% ambulatory class IV. Clinical, electrocardiographic and echocardiographic variables were assessed using Cox regression and Kaplan-Meier curves. Results: The cardiac mortality/Tx rate was 16.3% during the follow-up period of 34.0 ± 17.9 months. Prior to implantation, right ventricular dysfunction (RVD), ejection fraction < 25% and use of high doses of diuretics (HDD) increased the risk of cardiac death and Tx by 3.9-, 4.8-, and 5.9-fold, respectively. In the first year after CRT, RVD, HDD and hospitalization due to congestive heart failure increased the risk of death with hazard ratios of 3.5, 5.3, and 12.5, respectively. In the second year after CRT, RVD and FC III/IV were significant risk factors for mortality in the multivariate Cox model. The accuracy rates of the models were 84.6% at preimplantation, 93% in the first year after CRT, and 90.5% in the second year after CRT. The models were validated by bootstrapping. Conclusion: We developed predictive models of cardiac death and Tx at different stages of CRT based on the analysis of simple and easily obtainable clinical and echocardiographic variables. The models showed good accuracy and adjustment, were validated internally, and are useful in the selection, monitoring and counseling of patients indicated for CRT.
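A minimal sketch of the modeling approach (Cox regression with bootstrap validation) is shown below using the lifelines library; the dataframe columns are hypothetical stand-ins for the predictors named in the abstract, filled with simulated data.

```python
# Sketch of a Cox model with bootstrap validation, in the spirit of the
# study's approach. Columns are hypothetical stand-ins; data are simulated.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 116
df = pd.DataFrame({
    "rvd": rng.integers(0, 2, n),        # right ventricular dysfunction
    "hdd": rng.integers(0, 2, n),        # high-dose diuretics
    "months": rng.exponential(34, n),    # follow-up time
    "event": rng.integers(0, 2, n),      # cardiac death / transplant
})

cph = CoxPHFitter().fit(df, duration_col="months", event_col="event")
print(cph.hazard_ratios_)

# Bootstrap: refit on resampled data to gauge stability of the hazard ratios
hrs = [CoxPHFitter().fit(df.sample(n, replace=True), "months", "event").hazard_ratios_
       for _ in range(200)]
print(pd.concat(hrs, axis=1).T.quantile([0.025, 0.975]))
```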
Abstract:
The benefit of bevacizumab (Bv) has been shown in different tumors, including colorectal cancer, renal cancer, non-small cell lung cancer and breast cancer. However, to date there is no established test for evaluating the angiogenic status of a patient and monitoring the effects of anti-angiogenic treatments. Tumor angiogenesis is the result of a balance between multiple pro- and anti-angiogenic molecules. There is very little published clinical data exploring the impact of anti-angiogenic therapy on the different angiogenesis-related molecules and the potential role of these molecules as prognostic or predictive factors.
Abstract:
AIM: Hyperglycaemia is now a recognized predictive factor of morbidity and mortality after coronary artery bypass grafting (CABG). For this reason, we aimed to evaluate the postoperative management of glucose control in patients undergoing cardiovascular surgery, and to assess the impact of glucose levels on in-hospital mortality and morbidity. METHODS: This was a retrospective study investigating the association between postoperative blood glucose and outcomes, including death, post-surgical complications, and length of stay in the intensive care unit (ICU) and in hospital. RESULTS: A total of 642 consecutive patients were enrolled into the study after cardiovascular surgery (CABG, carotid endarterectomy and bypass in the lower limbs). Patients' mean age was 68 ± 10 years, and 74% were male. In-hospital mortality was 5% in diabetic patients vs 2% in non-diabetic patients (OR: 1.66, P=0.076). Blood glucose levels in the upper quartile range (≥ 8.8 mmol/L) on postoperative day 1 were independently associated with death (OR: 10.16, P=0.0002), infectious complications (OR: 1.76, P=0.04) and prolonged ICU stay (OR: 3.10, P<0.0001). Patients presenting with three or more hypoglycaemic episodes (< 4.1 mmol/L) had increased rates of mortality (OR: 9.08, P<0.0001) and complications (OR: 8.57, P<0.0001). CONCLUSION: Glucose levels greater than 8.8 mmol/L on postoperative day 1 and three or more hypoglycaemic episodes in the postoperative period were predictive of mortality and morbidity among patients undergoing cardiovascular surgery. This suggests that a multidisciplinary approach may be able to achieve better postoperative blood glucose control.
Abstract:
Background: The imatinib trough plasma concentration (C(min)) correlates with clinical response in cancer patients. Therapeutic drug monitoring (TDM) of plasma C(min) is therefore suggested. In practice, however, blood sampling for TDM is often not performed at trough. The corresponding measurement is thus only remotely informative about C(min) exposure. Objectives: The objectives of this study were to improve the interpretation of randomly measured concentrations by using a Bayesian approach for the prediction of C(min), incorporating correlation between pharmacokinetic parameters, and to compare the predictive performance of this method with alternative approaches, by comparing predictions with actual measured trough levels, and with predictions obtained by a reference method, respectively. Methods: A Bayesian maximum a posteriori (MAP) estimation method accounting for correlation (MAP-ρ) between pharmacokinetic parameters was developed on the basis of a population pharmacokinetic model, which was validated on external data. Thirty-one paired random and trough levels, observed in gastrointestinal stromal tumour patients, were then used for the evaluation of the Bayesian MAP-ρ method: individual C(min) predictions, derived from single random observations, were compared with actual measured trough levels for assessment of predictive performance (accuracy and precision). The method was also compared with alternative approaches: classical Bayesian MAP estimation assuming uncorrelated pharmacokinetic parameters, linear extrapolation along the typical elimination constant of imatinib, and non-linear mixed-effects modelling (NONMEM) first-order conditional estimation (FOCE) with interaction. Predictions of all methods were finally compared with 'best-possible' predictions obtained by a reference method (NONMEM FOCE, using both random and trough observations for individual C(min) prediction). Results: The developed Bayesian MAP-ρ method accounting for correlation between pharmacokinetic parameters allowed unbiased prediction of imatinib C(min) with a precision of ±30.7%. This predictive performance was similar for the alternative methods that were applied. The range of relative prediction errors was, however, smallest for the Bayesian MAP-ρ method and largest for the linear extrapolation method. When compared with the reference method, predictive performance was comparable for all methods. The time interval between random and trough sampling did not influence the precision of Bayesian MAP-ρ predictions. Conclusion: Clinical interpretation of randomly measured imatinib plasma concentrations can be assisted by Bayesian TDM. Classical Bayesian MAP estimation can be applied even without consideration of the correlation between pharmacokinetic parameters. Individual C(min) predictions are expected to vary less through Bayesian TDM than through linear extrapolation. Bayesian TDM could be developed in the future for other targeted anticancer drugs and for the prediction of other pharmacokinetic parameters that have been correlated with clinical outcomes.
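A minimal sketch of Bayesian MAP estimation with correlated pharmacokinetic parameters, in the spirit of the MAP-ρ method: given a single randomly timed concentration, the individual parameters maximize a posterior combining a residual-error term with a multivariate prior, and the trough is then predicted from them. The one-compartment model, population values and correlation below are invented for illustration and are not the published model.

```python
# Illustrative MAP sketch (not the published model): steady-state
# concentration after a dose is simplified to an IV-bolus one-compartment
# form; population values, correlation and error model are assumptions.
import numpy as np
from scipy.optimize import minimize

# Population log-parameters: clearance CL (L/h) and volume V (L)
mu = np.log([14.0, 250.0])
omega = np.array([[0.09, 0.05],   # variances/covariance of log CL, log V
                  [0.05, 0.16]])  # off-diagonal term encodes the correlation
omega_inv = np.linalg.inv(omega)
sigma = 0.20                      # proportional residual error
dose, tau = 400.0, 24.0           # mg, dosing interval (h)

def conc(theta_log, t):
    """Steady-state concentration t hours after a dose."""
    cl, v = np.exp(theta_log)
    ke = cl / v
    return (dose / v) * np.exp(-ke * t) / (1 - np.exp(-ke * tau))

def neg_log_post(theta_log, t_obs, c_obs):
    resid = (c_obs - conc(theta_log, t_obs)) / (sigma * c_obs)
    prior = theta_log - mu
    return 0.5 * resid**2 + 0.5 * prior @ omega_inv @ prior

# A single random sample: 2.1 mg/L drawn 9 h after the last dose
theta_map = minimize(neg_log_post, mu, args=(9.0, 2.1)).x
print("predicted C_min:", conc(theta_map, tau))  # trough before next dose
```

Dropping the off-diagonal terms of omega recovers the classical MAP estimate that assumes uncorrelated parameters, which the study used as one of its comparison methods.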