938 results for RISK PATIENTS
Abstract:
Background Cardiovascular disease (CVD) is partially attributed to traditional cardiovascular risk factors, which can be identified and managed using risk stratification algorithms (Framingham Risk Score, National Cholesterol Education Program, Systematic COronary Risk Evaluation and Reynolds Risk Score). We aimed to (a) identify the proportion of at-risk patients with rheumatoid arthritis (RA) requiring statin therapy according to conventional risk calculators, and (b) assess whether patients at risk were receiving statins. Methods Patients at high CVD risk (excluding patients with established CVD or diabetes) were identified from a cohort of 400 well-characterised patients with RA by applying risk calculators with or without a 1.5 multiplier in specific patient subgroups. Actual statin use versus the number eligible for statins was also calculated. Results The percentage of patients identified as being at risk varied widely depending on the method, from 1.6% (for 20% threshold global CVD risk) to 15.5% (for CVD and cerebrovascular morbidity and mortality), 21.8% (for 10% global CVD risk) and 25.9% (for 5% CVD mortality), with the majority of them (58.1% to 94.8%) not receiving statins. Application of the 1.5 multiplier identified 17% to 78% more at-risk patients. Conclusions Depending on the risk stratification method, 2% to 26% of patients with RA without CVD have sufficiently high risk to require statin therapy, yet most remain untreated. To address this issue, we recommend annual systematic screening using the nationally applicable risk calculator, combined with regular audit of whether treatment targets have been achieved.
Abstract:
Advances in our understanding of pathological mechanisms can inform the identification of various biomarkers for risk stratification, monitoring drug efficacy and toxicity, and enabling careful monitoring of polypharmacy. Biomarkers in the broadest sense refer to 'biological markers', and these can be blood-based (e.g. fibrin D-dimer, von Willebrand factor), urine-based (e.g. thromboxane), or even related to cardiac or cerebral imaging (1). Most biomarkers offer improvements over clinical risk scores in predicting high-risk patients - at least statistically - but usually at the loss of simplicity and practicality for easy application in everyday clinical practice. Given that different biomarkers reflect different aspects of pathophysiology (e.g. inflammation, clotting, collagen turnover), they can nevertheless contribute to a better understanding of underlying disease processes (2). Indeed, many age-related diseases share common modifiable underpinning mechanisms, e.g. inflammation, oxidative stress and visceral adiposity.
Abstract:
This article proposes a Bayesian neural network approach to determine the risk of re-intervention after endovascular aortic aneurysm repair surgery. The aim of the proposed technique is to determine which patients have a high chance of re-intervention (high-risk patients) and which do not (low-risk patients) within 5 years of the surgery. Two censored datasets relating to the clinical conditions of aortic aneurysms were collected from two different vascular centers in the United Kingdom. A Bayesian network was first employed to solve the censoring issue in the datasets. Then, a back-propagation neural network model was built using the uncensored data of the first center to predict re-intervention at the second center and classify the patients into high-risk and low-risk groups. Kaplan-Meier curves were plotted for each group of patients separately to show whether there was a significant difference between the two risk groups. Finally, the logrank test was applied to determine whether the neural network model was capable of predicting and distinguishing between the two risk groups. The results show that the Bayesian network used for uncensoring the data improved the performance of the neural networks that were built for the two centers separately. More importantly, the neural network trained with uncensored data from the first center was able to predict and discriminate between groups at low and high risk of re-intervention within 5 years of endovascular aortic aneurysm surgery at center 2 (p = 0.0037 in the logrank test).
Abstract:
Lifelong surveillance is not cost-effective after endovascular aneurysm repair (EVAR), but is required to detect aortic complications which are fatal if untreated (type 1/3 endoleak, sac expansion, device migration). Aneurysm morphology determines the probability of aortic complications and therefore the need for surveillance, but existing analyses have proven incapable of identifying patients at sufficiently low risk to justify abandoning surveillance. This study aimed to improve the prediction of aortic complications through the application of machine-learning techniques. Patients undergoing EVAR at 2 centres were studied from 2004–2010. Aneurysm morphology had previously been studied to derive the SGVI Score for predicting aortic complications. Bayesian Neural Networks were designed using the same data, to dichotomise patients into groups at low or high risk of aortic complications. Network training was performed only on patients treated at centre 1. External validation was performed by assessing network performance independently of network training, on patients treated at centre 2. Discrimination was assessed by Kaplan-Meier analysis to compare aortic complications in predicted low-risk versus predicted high-risk patients. 761 patients aged 75 ± 7 years underwent EVAR in 2 centres. Mean follow-up was 36 ± 20 months. Neural networks were created incorporating neck angulation/length/diameter/volume; AAA diameter/area/volume/length/tortuosity; and common iliac tortuosity/diameter. A 19-feature network predicted aortic complications with excellent discrimination and external validation (5-year freedom from aortic complications in predicted low-risk vs predicted high-risk patients: 97.9% vs. 63%; p < 0.0001). A Bayesian Neural-Network algorithm can identify patients in whom it may be safe to abandon surveillance after EVAR. This proposal requires prospective study.
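Both EVAR abstracts above compare predicted low- and high-risk groups using Kaplan-Meier freedom-from-complication curves. As an illustrative sketch only (not the authors' code), the product-limit estimator behind those curves can be written in a few lines of standard-library Python:

```python
from collections import Counter

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) estimate of freedom from complications.

    times  -- follow-up time for each patient
    events -- 1 if the complication was observed, 0 if censored
    Returns (time, S(t)) pairs at each time where the curve drops.
    """
    deaths = Counter(t for t, e in zip(times, events) if e)
    n = len(times)                  # number at risk just before time t
    s = 1.0
    curve = []
    for t in sorted(set(times)):
        d = deaths.get(t, 0)
        if d:
            s *= 1 - d / n          # the curve drops only at event times
            curve.append((t, s))
        n -= sum(1 for x in times if x == t)  # events and censored leave the risk set
    return curve
```

With times in months and events marking observed aortic complications, censored patients simply leave the risk set without moving the curve; discrimination between risk groups is then tested with a logrank statistic, as in the abstracts above.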
Abstract:
RATIONALE: Limitations in methods for the rapid diagnosis of hospital-acquired infections often delay initiation of effective antimicrobial therapy. New diagnostic approaches offer potential clinical and cost-related improvements in the management of these infections. OBJECTIVES: We developed a decision modeling framework to assess the potential cost-effectiveness of a rapid biomarker assay to identify hospital-acquired infection in high-risk patients earlier than standard diagnostic testing. METHODS: The framework includes parameters representing rates of infection, rates of delayed appropriate therapy, and impact of delayed therapy on mortality, along with assumptions about diagnostic test characteristics and their impact on delayed therapy and length of stay. Parameter estimates were based on contemporary, published studies and supplemented with data from a four-site, observational, clinical study. Extensive sensitivity analyses were performed. The base-case analysis assumed 17.6% of ventilated patients and 11.2% of nonventilated patients develop hospital-acquired infection and that 28.7% of patients with hospital-acquired infection experience delays in appropriate antibiotic therapy with standard care. We assumed this percentage decreased by 50% (to 14.4%) among patients with true-positive results and increased by 50% (to 43.1%) among patients with false-negative results using a hypothetical biomarker assay. Cost of testing was set at $110/d. MEASUREMENTS AND MAIN RESULTS: In the base-case analysis, among ventilated patients, daily diagnostic testing starting on admission reduced inpatient mortality from 12.3 to 11.9% and increased mean costs by $1,640 per patient, resulting in an incremental cost-effectiveness ratio of $21,389 per life-year saved. Among nonventilated patients, inpatient mortality decreased from 7.3 to 7.1% and costs increased by $1,381 with diagnostic testing. The resulting incremental cost-effectiveness ratio was $42,325 per life-year saved. 
Threshold analyses revealed the probabilities of developing hospital-acquired infection in ventilated and nonventilated patients could be as low as 8.4 and 9.8%, respectively, to maintain incremental cost-effectiveness ratios less than $50,000 per life-year saved. CONCLUSIONS: Development and use of serial diagnostic testing that reduces the proportion of patients with delays in appropriate antibiotic therapy for hospital-acquired infections could reduce inpatient mortality. The model presented here offers a cost-effectiveness framework for future test development.
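The incremental cost-effectiveness ratios quoted above are the standard quotient of incremental cost over incremental effect. A toy recomputation (variable names and the back-calculation are ours, using the base-case figures quoted in the abstract):

```python
def icer(delta_cost, delta_life_years):
    """Incremental cost-effectiveness ratio: extra dollars per life-year saved."""
    return delta_cost / delta_life_years

# Back out the implied life-years gained per patient from the base-case
# figures quoted above (ventilated: +$1,640 at $21,389/life-year saved;
# nonventilated: +$1,381 at $42,325/life-year saved):
implied_ly_vent = 1640 / 21389      # roughly 0.077 life-years per patient
implied_ly_nonvent = 1381 / 42325   # roughly 0.033 life-years per patient
```

The threshold analyses in the abstract vary inputs such as infection probability until the resulting ICER crosses the $50,000-per-life-year willingness-to-pay line.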
Abstract:
PURPOSE: Conventional staging methods are inadequate to identify patients with stage II colon cancer (CC) who are at high risk of recurrence after surgery with curative intent. ColDx is a gene expression, microarray-based assay shown to be independently prognostic for recurrence-free interval (RFI) and overall survival in CC. The objective of this study was to further validate ColDx using formalin-fixed, paraffin-embedded specimens collected as part of the Alliance phase III trial, C9581.
PATIENTS AND METHODS: C9581 evaluated edrecolomab versus observation in patients with stage II CC and reported no survival benefit. Under an initial case-cohort sampling design, a randomly selected subcohort (RS) comprised 514 patients from 901 eligible patients with available tissue. Forty-nine additional patients with recurrence events were included in the analysis. Final analysis comprised 393 patients: 360 RS (58 events) and 33 non-RS events. Risk status was determined for each patient by ColDx. The Self-Prentice method was used to test the association between the resulting ColDx risk score and RFI adjusting for standard prognostic variables.
RESULTS: Fifty-five percent of patients (216 of 393) were classified as high risk. After adjustment for prognostic variables that included mismatch repair (MMR) deficiency, ColDx high-risk patients exhibited significantly worse RFI (multivariable hazard ratio, 2.13; 95% CI, 1.3 to 3.5; P < .01). Age and MMR status were marginally significant. RFI at 5 years for patients classified as high risk was 82% (95% CI, 79% to 85%), compared with 91% (95% CI, 89% to 93%) for patients classified as low risk.
CONCLUSION: ColDx is associated with RFI in the C9581 subsample in the presence of other prognostic factors, including MMR deficiency. ColDx could be incorporated with the traditional clinical markers of risk to refine patient prognosis.
Abstract:
Pancreaticoduodenectomy with or without adjuvant chemotherapy remains the only modality of possible cure in patients with cancer involving the head of the pancreas and the periampullary region. While mortality rates after pancreaticoduodenectomy have improved considerably over the course of the last century, morbidity remains high. Patient selection is of paramount importance in ensuring that major surgery is offered to individuals who will most benefit from a pancreaticoduodenectomy. Moreover, identifying preoperative risk factors provides potential targets for prehabilitation and optimisation of the patient's physiology before undertaking surgery. In addition to this, early identification of patients who are likely to develop postoperative complications allows for better allocation of critical care resources and more aggressive management of high-risk patients. Cardiopulmonary exercise testing is becoming an increasingly popular tool in the preoperative risk assessment of the surgical patient. However, very little work has been done to investigate the role of cardiopulmonary exercise testing in predicting complications after pancreaticoduodenectomy. The impact of jaundice, systemic inflammation and other preoperative clinicopathological characteristics on cardiopulmonary exercise physiology has not been studied in detail before in this cohort of patients. The overall aim of the thesis was to examine the relationships between preoperative clinicopathological characteristics (including cardiopulmonary exercise physiology, obstructive jaundice, body composition and systemic inflammation) and both complications and the post-surgical systemic inflammatory response in patients undergoing pancreaticoduodenectomy.
Chapter 1 reviews the existing literature on preoperative cardiopulmonary exercise testing, the impact of obstructive jaundice, perioperative systemic inflammation and the importance of body composition in determining outcomes in patients undergoing major surgery, with particular reference to pancreatic surgery. Chapter 2 reports on the role of cardiopulmonary exercise testing in predicting postoperative complications after pancreaticoduodenectomy. The results demonstrate that patients with a V̇O2AT of less than 10 ml/kg/min are more likely to develop a postoperative pancreatic fistula and to stay longer in hospital, and are less likely to receive adjuvant therapy. These results emphasise the importance of aerobic fitness in recovering from the operative stress of major surgery without significant morbidity. Cardiopulmonary exercise testing may prove useful in selecting patients for intensive prehabilitation programmes as well as for other optimisation measures to prepare them for major surgery. Chapter 3 evaluates the relationship between cardiopulmonary exercise physiology and other clinicopathological characteristics of the patient. A detailed analysis of cardiopulmonary exercise test parameters in jaundiced versus non-jaundiced patients demonstrates that obstructive jaundice does not impair cardiopulmonary exercise physiology. This further supports emerging evidence in the contemporary literature that jaundiced patients can proceed directly to surgery without preoperative biliary drainage. The results of this study also show an interesting inverse relationship between body mass index and anaerobic threshold, which is analysed in more detail in Chapter 4. Chapter 4 examines the relationship between preoperative cardiopulmonary exercise physiology and body composition in depth. All parameters measured at cardiopulmonary exercise test are compared against body composition and body mass index.
This chapter reports that the current method of reporting V̇O2, both at peak exercise and at the anaerobic threshold, is biased against obese subjects, and advises caution in the interpretation of cardiopulmonary exercise test results in patients with a high BMI. This is particularly important as current evidence in the literature suggests that postoperative outcomes in obese subjects are comparable to those in non-obese subjects, while cardiopulmonary exercise test results are abnormally low in this very same cohort of patients. Chapter 5 analyses the relationship between preoperative clinicopathological characteristics, including systemic inflammation, and the magnitude of the postoperative systemic inflammatory response. Obstructive jaundice appears to have an immunosuppressive effect, while elevated preoperative CRP and hypoalbuminemia appear to have opposite effects: hypoalbuminemia resulted in a lower response, while elevated CRP in the absence of hypoalbuminemia resulted in a greater postoperative systemic inflammatory response. Chapter 6 evaluates the role of the early postoperative systemic inflammatory response in predicting complications after pancreaticoduodenectomy and aims to establish clinically relevant thresholds for C-reactive protein for the prediction of complications. The results of this chapter demonstrate that CRP levels as early as the second postoperative day are associated with complications. While postoperative CRP was useful in the prediction of infective complications, this was the case only in patients who did not develop a postoperative pancreatic fistula. The predictive ability of inflammatory markers for infectious complications was blunted in patients with a pancreatic fistula. Chapter 7 summarises the findings of this thesis, their place in the current literature and future directions.
The results of this thesis add to the current knowledge regarding the complex pathophysiological abnormalities in patients undergoing pancreaticoduodenectomy, with specific emphasis on the interaction between cardiopulmonary exercise physiology, obstructive jaundice, systemic inflammation and postoperative outcomes. The work presented in this thesis lays the foundations for further studies aimed at improving outcomes after pancreaticoduodenectomy through the development of individualised, goal-directed therapies that are initiated well before this morbid yet necessary operation is performed.
Abstract:
Background: Oral anticoagulation (OAC) reduces stroke risk in patients with atrial fibrillation (AF); however, it is still underutilized and sometimes refused by patients. This project was divided into two inter-related studies. Study 1 explored the experiences that influence prescription of OAC by physicians. Study 2 explored the experiences which influence patients' decisions to accept, decline or discontinue OAC. Methods: Semi-structured individual interviews were conducted in both studies. In Study 1, four sub-groups of physicians (n = 16) experienced with OAC in AF were interviewed: consultant cardiologists, consultant general physicians, general practitioners and cardiology registrars. In Study 2, three sub-groups of patients (n = 11) diagnosed with AF were interviewed: those who accepted, those who refused, and those who discontinued warfarin. Results: Study 1: Two over-arching themes emerged from doctors' experiences: (1) communicating information and (2) challenges with OAC prescription for AF. Physicians still adopt a paternalistic approach to decision-making. They should instead motivate patients to take part in treatment discussions, and choices should reflect the patient's needs and concerns. Physician education should focus more on communication skills, individualised care and time-management as these are critical for patient adherence. Continuous OAC education for AF should adopt a multi-disciplinary approach. Further, interpreters should also be educated on medical communication skills. Study 2: Three over-arching themes comprised patients' experiences: (1) the initial consultation, (2) life after the consultation, and (3) patients' reflections. Patient education during the initial consultation was critical in increasing patients' knowledge of OAC. On-going patient education is imperative to maintain adherence. Patients valued physicians' concern for their needs during decision-making.
Patients who had experience of stroke were more receptive to education aimed at stroke risk reduction than at bleeding risk. Patients' perceptions of warfarin are also influenced by the media. Comment: Qualitative research is crucial in exploring barriers to treatment as it provides an excellent insight into patients' experiences of healthcare. A patient-centred approach should be adopted and incorporated into physicians' education. Education and patient involvement in the decision-making process are essential to promote treatment acceptance and long-term adherence.
Abstract:
Early discharge protocols have been proposed for ST-segment elevation myocardial infarction (STEMI) low-risk patients despite the existence of few but significant cardiovascular events during mid-term follow-up. We aimed to identify a subgroup of patients among those considered low-risk in which prognosis would be particularly good. We analyzed 30-day outcomes and long-term follow-up among 1,111 STEMI patients treated with reperfusion therapy. Multivariate analysis identified seven variables as predictors of 30-day outcomes: femoral approach; age > 65; systolic dysfunction; postprocedural TIMI flow < 3; elevated creatinine level > 1.5 mg/dL; stenosis of the left-main coronary artery; and two or higher Killip class (FASTEST). A total of 228 patients (20.5%), defined as very low-risk (VLR), had none of these variables on admission. The VLR group of patients, compared to non-VLR patients, had lower in-hospital (0% vs. 5.9%; p < 0.001) and 30-day mortality (0% vs. 6.25%; p < 0.001). They also presented fewer in-hospital complications (6.6% vs. 39.7%; p < 0.001) and 30-day major adverse events (0.9% vs. 4.5%; p = 0.01). Significant mortality differences during a mean follow-up of 23.8 ± 19.4 months were also observed (2.2% vs. 15.2%; p < 0.001). The first VLR subject died 11 months after hospital discharge. No cardiovascular deaths were identified in this subgroup of patients during follow-up. About a fifth of STEMI patients are at very low risk and can be easily identified. They have an excellent prognosis, suggesting that a 24–48 h in-hospital stay could be a feasible alternative in these patients.
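The FASTEST classification described above reduces to a simple rule: a patient is very low risk when none of the seven predictors is present on admission. A minimal sketch of that rule (the dictionary keys are our own naming, not taken from the paper):

```python
# The seven FASTEST predictors listed in the abstract; key names are
# illustrative stand-ins chosen by us, not the authors' variable names.
FASTEST = (
    "femoral_approach", "age_over_65", "systolic_dysfunction",
    "timi_flow_below_3", "creatinine_over_1_5_mg_dl",
    "left_main_stenosis", "killip_class_2_or_higher",
)

def is_very_low_risk(patient: dict) -> bool:
    """True when the patient has none of the FASTEST risk variables on admission."""
    return not any(patient.get(k, False) for k in FASTEST)
```

For example, a patient with only `age_over_65` set would be excluded from the VLR group, matching the abstract's "none of these variables" criterion.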
Abstract:
BRCA1 and BRCA2 are the most frequently mutated genes in ovarian cancer (OC), crucial both for the identification of cancer predisposition and for therapeutic choices. However, germline variants in other genes could be involved in OC susceptibility. We characterized OC patients to detect mutations in genes other than BRCA1/2 that could be associated with a high risk of developing OC, and that could allow patients to enter the most appropriate treatment and surveillance program. Next-Generation Sequencing analysis with a 94-gene panel was performed on germline DNA of 219 OC patients. We identified 34 pathogenic/likely-pathogenic variants in BRCA1/2 and 38 in 21 other genes. Patients with pathogenic/likely-pathogenic variants in non-BRCA1/2 genes mainly developed OC alone, compared with the other groups, which also developed breast cancer or other tumors (p=0.001). Clinical correlation analysis showed that low-risk patients were significantly associated with platinum sensitivity (p<0.001). Regarding PARP inhibitor (PARPi) response, patients with pathogenic mutations in non-BRCA1/2 genes had significantly worse PFS and OS. Moreover, a statistically significant worse PFS was found for every increase of one thousand platelets before PARPi treatment. To conclude, knowledge about molecular alterations in genes beyond BRCA1/2 in OC could allow for more personalized diagnostic, predictive, prognostic, and therapeutic strategies for OC patients.
Abstract:
Background Many countries have set targets for suicide reduction, and suggested that mental health care providers and general practitioners have a key role to play. Method A systematic review of the literature. Results Among those in the general population who commit suicide, up to 41% may have contact with psychiatric inpatient care in the year prior to death and up to 9% may commit suicide within one day of discharge. The corresponding figures are 11 and 4% for community-based psychiatric care and 83 and 20% for general practitioners. Conclusions Among those who die by suicide, contact with health services is common before death. This is a necessary but not sufficient condition for clinicians to intervene. More work is needed to determine whether these people show characteristic patterns of care and/or particular risk factors which would enable a targeted approach to be developed to assist clinicians in detecting and managing high-risk patients.
Abstract:
The Multicenter Australian Study of Epidural Anesthesia and Analgesia in Major Surgery (The MASTER Trial) was designed to evaluate the possible benefit of epidural block in improving outcome in high-risk patients. The trial began in 1995 and is scheduled to reach the planned sample size of 900 during 2001. This paper describes the trial design and presents data comparing 455 patients randomized in 21 institutions in Australia, Hong Kong, and Malaysia, with 237 patients from the same hospitals who were eligible but not randomized. Nine categories of high-risk patients were defined as entry criteria for the trial. Protocols for ethical review, informed consent, randomization, clinical anesthesia and analgesia, and perioperative management were determined following extensive consultation with anesthesiologists throughout Australia. Clinical and research information was collected in participating hospitals by research staff who may not have been blind to allocation. Decisions about the presence or absence of endpoints were made primarily by a computer algorithm, supplemented by blinded clinical experts. Without unblinding the trial, comparison of eligibility criteria and incidence of endpoints between randomized and nonrandomized patients showed only small differences. We conclude that there is no strong evidence of important demographic or clinical differences between randomized and nonrandomized patients eligible for the MASTER Trial. Thus, the trial results are likely to be broadly generalizable. Control Clin Trials 2000;21:244-256 © Elsevier Science Inc. 2000.
Abstract:
Background Epidural block is widely used to manage major abdominal surgery and postoperative analgesia, but its risks and benefits are uncertain. We compared adverse outcomes in high-risk patients managed for major surgery with epidural block or alternative analgesic regimens with general anaesthesia in a multicentre randomised trial. Methods 915 patients undergoing major abdominal surgery with one of nine defined comorbid states to identify high-risk status were randomly assigned intraoperative epidural anaesthesia and postoperative epidural analgesia for 72 h with general anaesthesia (site of epidural selected to provide optimum block) or control. The primary endpoint was death at 30 days or major postsurgical morbidity. Analysis by intention to treat involved 447 patients assigned epidural and 441 control. Findings 255 patients (57.1%) in the epidural group and 268 (60.7%) in the control group had at least one morbidity endpoint or died (p=0.29). Mortality at 30 days was low in both groups (epidural 23 [5.1%], control 19 [4.3%], p=0.67). Only one of eight categories of morbid endpoints in individual systems (respiratory failure) occurred less frequently in patients managed with epidural techniques (23% vs 30%, p=0.02). Postoperative epidural analgesia was associated with lower pain scores during the first 3 postoperative days. There were no major adverse consequences of epidural-catheter insertion. Interpretation Most adverse morbid outcomes in high-risk patients undergoing major abdominal surgery are not reduced by use of combined epidural and general anaesthesia and postoperative epidural analgesia. However, the improvement in analgesia, reduction in respiratory failure, and the low risk of serious adverse consequences suggest that many high-risk patients undergoing major intra-abdominal surgery will receive substantial benefit from combined general and epidural anaesthesia intraoperatively with continuing postoperative epidural analgesia.
Abstract:
Objective: To determine 30 day mortality, long term survival, and recurrent cardiac events after coronary artery bypass graft (CABG) in a population. Design: Follow up study of patients prospectively entered on to a cardiothoracic surgical database. Record linkages were used to obtain data on readmissions and deaths. Patients: 8910 patients undergoing isolated first CABG between 1980 and 1993 in Western Australia. Main outcome measures: 30 day and long term survival, readmission for cardiac event (acute myocardial infarction, unstable angina, percutaneous transluminal coronary angioplasty or reoperative CABG). Results: There were 3072 deaths to mid 1999. 30 day and long term survival were significantly better in patients treated in the first five years than during the following decade. The age of the patients, proportion of female patients, and number of grafts increased over time. An urgent procedure (odds ratio 3.3), older age (9% per year) and female sex (odds ratio 1.5) were associated with increased risk for 30 day mortality, while age (7% per year) and a recent myocardial infarction (odds ratio 1.16) influenced long term survival. Internal mammary artery grafts were followed by better short and long term survival, though there was an obvious selection bias in favour of younger male patients. Conclusions: This study shows worsening crude mortality at 30 days after CABG from the mid 1980s, associated with the inclusion of higher risk patients. Older age, an acute myocardial infarction in the year before surgery, and the use of saphenous vein grafts only were associated with poorer long term survival and greater risk of a recurrent cardiac event. Female sex predicted recurrent events but not long term survival.
Abstract:
Background: We tested the hypothesis that the universal application of myocardial scanning with single-photon emission computed tomography (SPECT) would result in better risk stratification in renal transplant candidates (RTC) compared with SPECT being restricted to patients who, in addition to renal disease, had other clinical risk factors. Methods: RTCs (n=363) underwent SPECT and clinical risk stratification according to the American Society of Transplantation (AST) algorithm and were followed up until a major adverse cardiovascular event (MACE) or death. Results: Of the 363 patients, 79 patients (22%) had an abnormal SPECT scan and 270 (74%) were classified as high risk. Both methods correctly identified patients with increased probability of MACE. However, clinical stratification performed better (sensitivity and negative predictive value 99% and 99% vs. 25% and 87%, respectively). High-risk patients with an abnormal SPECT scan had a modest increased risk of events (log-rank = 0.03; hazard ratio [HR] = 1.37; 95% confidence interval [95% CI], 1.02-1.82). Eighty-six patients underwent coronary angiography, and coronary artery disease (CAD) was found in 60%. High-risk patients with CAD had an increased incidence of events (log-rank = 0.008; HR=3.85; 95% CI, 1.46-13.22), but in those with an abnormal SPECT scan, the incidence of events was not influenced by CAD (log-rank = 0.23). Forty-six patients died. Clinical stratification, but not SPECT, correlated with the probability of death (log-rank = 0.02; HR=3.25; 95% CI, 1.31-10.82). Conclusion: SPECT should be restricted to high-risk patients. Moreover, in contrast to SPECT, the AST algorithm was also useful for predicting death by any cause in RTCs and for selecting patients for invasive coronary testing.