961 results for critically patients
Abstract:
Background: Appropriate disposition of emergency department (ED) patients with chest pain depends on clinical evaluation of risk. A number of chest pain risk stratification tools have been proposed. The aim of this study was to compare the predictive performance for major adverse cardiac events (MACE) of risk assessment tools from the National Heart Foundation of Australia (HFA), the Goldman risk score and the Thrombolysis in Myocardial Infarction risk score (TIMI RS). Methods: This prospective observational study evaluated ED patients aged ≥30 years with non-traumatic chest pain for which no definitive non-ischemic cause was found. Data collected included demographic and clinical information, investigation findings and occurrence of MACE by 30 days. The outcome of interest was the comparative predictive performance of the risk tools for MACE at 30 days, analyzed using receiver operating characteristic (ROC) curves. Results: Two hundred eighty-one patients were studied; the rate of MACE was 14.1%. The area under the curve (AUC) of the HFA, TIMI RS and Goldman tools for the endpoint of MACE was 0.54, 0.71 and 0.67, respectively, with the difference between the tools in predictive ability for MACE being highly significant [chi2 (3) = 67.21, N = 276, p < 0.0001]. Conclusion: The TIMI RS and Goldman tools performed better than the HFA in this undifferentiated ED chest pain population, but selection of cutoffs balancing sensitivity and specificity was problematic. There is an urgent need for validated risk stratification tools specific to the ED chest pain population.
Abstract:
Bladder cancer is associated with high recurrence and mortality rates due to metastasis. The elucidation of metastasis suppressors may offer therapeutic opportunities if their mechanisms of action can be elucidated and tractably exploited. In this study, we investigated the clinical and functional significance of the transcription factor activating transcription factor 3 (ATF3) in bladder cancer metastasis. Gene expression analysis revealed that decreased ATF3 was associated with bladder cancer progression and reduced survival of patients with bladder cancer. Correspondingly, ATF3 overexpression in highly metastatic bladder cancer cells decreased migration in vitro and experimental metastasis in vivo. Conversely, ATF3 silencing increased the migration of bladder cancer cells with limited metastatic capability in the absence of any effect on proliferation. In keeping with their increased motility, metastatic bladder cancer cells had increased numbers of actin filaments. Moreover, ATF3 expression correlated with expression of the actin filament severing protein gelsolin (GSN). Mechanistic studies revealed that ATF3 upregulated GSN, whereas ATF3 silencing reduced GSN levels, concomitant with alterations in the actin cytoskeleton. We identified six ATF3 regulatory elements in the first intron of the GSN gene confirmed by chromatin immunoprecipitation analysis. Critically, GSN expression reversed the metastatic capacity of bladder cancer cells with diminished levels of ATF3. Taken together, our results indicate that ATF3 suppresses metastasis of bladder cancer cells, at least in part through the upregulation of GSN-mediated actin remodeling. These findings suggest ATF3 coupled with GSN as prognostic markers for bladder cancer metastasis.
Abstract:
Importance Approximately one-third of patients with peripheral artery disease experience intermittent claudication, with consequent loss of quality of life. Objective To determine the efficacy of ramipril for improving walking ability, patient-perceived walking performance, and quality of life in patients with claudication. Design, Setting, and Patients Randomized, double-blind, placebo-controlled trial conducted among 212 patients with peripheral artery disease (mean age, 65.5 [SD, 6.2] years), initiated in May 2008 and completed in August 2011 and conducted at 3 hospitals in Australia. Intervention Patients were randomized to receive 10 mg/d of ramipril (n = 106) or matching placebo (n = 106) for 24 weeks. Main Outcome Measures Maximum and pain-free walking times were recorded during a standard treadmill test. The Walking Impairment Questionnaire (WIQ) and Short-Form 36 Health Survey (SF-36) were used to assess walking ability and quality of life, respectively. Results At 6 months, relative to placebo, ramipril was associated with a 75-second (95% CI, 60-89 seconds) increase in mean pain-free walking time (P < .001) and a 255-second (95% CI, 215-295 seconds) increase in maximum walking time (P < .001). Relative to placebo, ramipril improved the WIQ median distance score by 13.8 (Hodges-Lehmann 95% CI, 12.2-15.5), speed score by 13.3 (95% CI, 11.9-15.2), and stair climbing score by 25.2 (95% CI, 25.1-29.4) (P < .001 for all). The overall SF-36 median Physical Component Summary score improved by 8.2 (Hodges-Lehmann 95% CI, 3.6-11.4; P = .02) in the ramipril group relative to placebo. Ramipril did not affect the overall SF-36 median Mental Component Summary score. Conclusions and Relevance Among patients with intermittent claudication, 24-week treatment with ramipril resulted in significant increases in pain-free and maximum treadmill walking times compared with placebo. 
This was associated with a significant increase in the physical functioning component of the SF-36 score. Trial Registration clinicaltrials.gov Identifier: NCT00681226
Abstract:
Purpose The LUX-Lung 3 study investigated the efficacy of chemotherapy compared with afatinib, a selective, orally bioavailable ErbB family blocker that irreversibly blocks signaling from epidermal growth factor receptor (EGFR/ErbB1), human epidermal growth factor receptor 2 (HER2/ErbB2), and ErbB4 and has wide-spectrum preclinical activity against EGFR mutations. A phase II study of afatinib in EGFR mutation-positive lung adenocarcinoma demonstrated high response rates and progression-free survival (PFS). Patients and Methods In this phase III study, eligible patients with stage IIIB/IV lung adenocarcinoma were screened for EGFR mutations. Mutation-positive patients were stratified by mutation type (exon 19 deletion, L858R, or other) and race (Asian or non-Asian) before two-to-one random assignment to 40 mg afatinib per day or up to six cycles of cisplatin plus pemetrexed chemotherapy at standard doses every 21 days. The primary end point was PFS by independent review. Secondary end points included tumor response, overall survival, adverse events, and patient-reported outcomes (PROs). Results A total of 1,269 patients were screened, and 345 were randomly assigned to treatment. Median PFS was 11.1 months for afatinib and 6.9 months for chemotherapy (hazard ratio [HR], 0.58; 95% CI, 0.43 to 0.78; P = .001). Median PFS among those with exon 19 deletions and L858R EGFR mutations (n = 308) was 13.6 months for afatinib and 6.9 months for chemotherapy (HR, 0.47; 95% CI, 0.34 to 0.65; P = .001). The most common treatment-related adverse events were diarrhea, rash/acne, and stomatitis for afatinib and nausea, fatigue, and decreased appetite for chemotherapy. PROs favored afatinib, with better control of cough, dyspnea, and pain. Conclusion Afatinib is associated with prolongation of PFS when compared with standard doublet chemotherapy in patients with advanced lung adenocarcinoma and EGFR mutations.
Abstract:
Purpose Patient-reported symptoms and health-related quality of life (QoL) benefits were investigated in a randomized, phase III trial of afatinib or cisplatin/pemetrexed. Patients and Methods Three hundred forty-five patients with advanced epidermal growth factor receptor (EGFR) mutation-positive lung adenocarcinoma were randomly assigned 2:1 to afatinib 40 mg per day or up to six cycles of cisplatin/pemetrexed. Lung cancer symptoms and health-related QoL were assessed every 21 days until progression using the European Organisation for Research and Treatment of Cancer Quality of Life Questionnaire C30 and Lung Cancer-13 questionnaires. Analyses of cough, dyspnea, and pain were preplanned, including percentage of patients who improved on therapy, time to deterioration of symptoms, and change in symptoms over time. Results Questionnaire compliance was high. Compared with chemotherapy, afatinib significantly delayed the time to deterioration for cough (hazard ratio [HR], 0.60; 95% CI, 0.41 to 0.87; P = .007) and dyspnea (HR, 0.68; 95% CI, 0.50 to 0.93; P = .015), but not pain (HR, 0.83; 95% CI, 0.62 to 1.10; P = .19). More patients on afatinib (64%) versus chemotherapy (50%) experienced improvements in dyspnea scores (P < .010). Differences in mean scores over time significantly favored afatinib over chemotherapy for cough (P < .001) and dyspnea (P = .001). Afatinib showed significantly better mean scores over time in global health status/QoL (P = .015) and physical (P = .001), role (P = .004), and cognitive (P < .007) functioning compared with chemotherapy. Fatigue and nausea were worse with chemotherapy, whereas diarrhea, dysphagia, and sore mouth were worse with afatinib (all P = .01). Conclusion In patients with lung adenocarcinoma with EGFR mutations, first-line afatinib was associated with better control of cough and dyspnea compared with chemotherapy, although diarrhea, dysphagia, and sore mouth were worse.
Global health status/QoL was also improved over time with afatinib compared with chemotherapy.
Abstract:
In today’s NHS culture, commissioners are increasingly looking to the third sector for innovation and excellence in healthcare delivery. Opportunities for organisations within this sector to form fruitful and lasting partnerships have also grown.
Abstract:
The study examines the illness behaviour of patients with Chronic Fatigue Syndrome (CFS). The Illness Behaviour Questionnaire (IBQ), the twenty-eight-item version of the General Health Questionnaire (GHQ-28), and the Beck Depression Inventory (BDI) were administered to forty patients with a diagnosis of CFS. The results revealed that CFS patients, in comparison with general practice patients, scored significantly higher on the IBQ sub-scales of General Hypochondriasis, t(188) = 5.2, p < 0.001, and Disease Conviction, t(188) = 13.28, p < 0.001, but lower on the Psychological/Somatic sub-scale, t(188) = -5.88, p < 0.001. The CFS and psychiatric patients did not differ significantly on the General Hypochondriasis sub-scale. Results of the GHQ-28 revealed that 66.7% of the CFS patients scored above the cut-off for psychiatric morbidity. In comparison to a previous study of CFS patients [1], the current findings indicate a significantly higher score on General Hypochondriasis. The implications of these findings are discussed.
Abstract:
Purpose The presence of a lymphocytic infiltration in autonomic ganglia and an increased prevalence of autoantibodies and iritis in diabetic patients with autonomic neuropathy suggest a role for autoimmune mechanisms in the development of diabetic autonomic, and perhaps somatic, neuropathy. Corneal Langerhans cells (LCs) are antigen-presenting cells which can be identified in corneal immunologic conditions using in-vivo confocal microscopy. The aim of this study was to assess the presence and density of LCs in Bowman’s layer of the cornea in diabetic patients with varying degrees of neuropathy compared to healthy control subjects. Method 128 diabetic patients aged 58±1 years with differing severity of neuropathy (NDS – 4.7±0.28) and 26 control subjects aged 53±3 years were examined with in-vivo corneal confocal microscopy to quantify the density of LCs. Results LCs were observed more often in diabetic patients (73.8%) than in control subjects (46.1%), P = 0.001. LC density (number/mm2) was also significantly increased in diabetic patients (17.73±1.45) compared to control subjects (6.94±1.58), P = 0.001. There was a significant correlation between the density of LCs and age (r = 0.162, P = 0.047) and severity of neuropathy assessed by NDS (r = −0.202, P = 0.02). Conclusions In-vivo corneal confocal microscopy enables quantification of LCs in Bowman’s layer of the cornea. There is a relationship between LC density and the degree of nerve damage. Corneal confocal microscopy could be a valuable tool to establish the role of immune-mediated corneal nerve damage and provide insights into the pathogenesis of diabetic neuropathy.
Abstract:
Background Sleep disturbances, including insomnia and sleep-disordered breathing, are a common complaint in people with heart failure and impair well-being. Exercise training (ET) improves quality of life in stable heart failure patients. ET also improves sleep quality in healthy older patients, but there are no previous intervention studies in heart failure patients. Aim The aim of this study was to examine the impact of ET on sleep quality in patients recently discharged from hospital with heart failure. Methods This was a sub-study of a multisite randomised controlled trial. Participants with a heart failure hospitalisation were randomised within six weeks of discharge to a 12-week disease management programme including exercise advice (n=52) or to the same programme with twice weekly structured ET (n=54). ET consisted of two one-hour supervised aerobic and resistance training sessions, prescribed and advanced by an exercise specialist. The primary outcome was change in Pittsburgh Sleep Quality Index (PSQI) between randomisation and week 12. Results At randomisation, 45% of participants reported poor sleep (PSQI≥5). PSQI global score improved significantly more in the ET group than the control group (–1.5±3.7 vs 0.4±3.8, p=0.03). Improved sleep quality correlated with improved exercise capacity and reduced depressive symptoms, but not with changes in body mass index or resting heart rate. Conclusion Twelve weeks of twice-weekly supervised ET improved sleep quality in patients recently discharged from hospital with heart failure.
Abstract:
Background and purpose Phosphodiesterases PDE3 and/or PDE4 control ventricular effects of catecholamines in several species, but their relative effects in the failing human ventricle are unknown. We investigated whether the PDE3-selective inhibitor cilostamide (0.3–1 μM) or the PDE4 inhibitor rolipram (1–10 μM) modified the positive inotropic and lusitropic effects of catecholamines in human failing myocardium. Experimental approach Right and left ventricular trabeculae from freshly explanted hearts of 5 non-β-blocker-treated and 15 metoprolol-treated patients with terminal heart failure were paced to contract at 1 Hz. The effects of (-)-noradrenaline, mediated through β1-adrenoceptors (β2-adrenoceptors blocked with ICI118551), and (-)-adrenaline, mediated through β2-adrenoceptors (β1-adrenoceptors blocked with CGP20712A), were assessed in the absence and presence of the PDE inhibitors. Catecholamine potencies were estimated from −log EC50 values. Key results Cilostamide did not significantly potentiate the inotropic effects of the catecholamines in non-β-blocker-treated patients. Cilostamide caused greater potentiation (P=0.037) of the positive inotropic effects of (-)-adrenaline (0.78±0.12 log units) than of (-)-noradrenaline (0.47±0.12 log units) in metoprolol-treated patients. The lusitropic effects of the catecholamines were also potentiated by cilostamide. Rolipram did not affect the inotropic or lusitropic potencies of (-)-noradrenaline or (-)-adrenaline on right and left ventricular trabeculae from metoprolol-treated patients. Conclusions and implications Metoprolol induces control by PDE3 of ventricular effects mediated through both β1- and β2-adrenoceptors, thereby further reducing sympathetic cardiostimulation in patients with terminal heart failure. Concurrent therapy with a PDE3 blocker and metoprolol could conceivably facilitate cardiostimulation evoked by adrenaline through β2-adrenoceptors.
PDE4 does not appear to reduce inotropic and lusitropic effects of catecholamines in failing human ventricle.
Abstract:
Background Prevention strategies are critical to reduce infection rates in total joint arthroplasty (TJA), but evidence-based consensus guidelines on prevention of surgical site infection (SSI) remain heterogeneous and do not necessarily represent this particular patient population. Questions/Purposes What infection prevention measures are recommended by consensus evidence-based guidelines for prevention of periprosthetic joint infection? How do these recommendations compare to expert consensus on infection prevention strategies from orthopedic surgeons at the largest international tertiary referral centers for TJA? Patients and Methods A review of consensus guidelines was undertaken as described by Merollini et al. Four clinical guidelines met the inclusion criteria: those of the Centers for Disease Control and Prevention, the British Orthopedic Association, the National Institute of Clinical Excellence, and the National Health and Medical Research Council (NHMRC). Twenty-eight recommendations from these guidelines were used to create an evidence-based survey of infection prevention strategies that was administered to 28 orthopedic surgeons who were members of the International Society of Orthopedic Centers. The existing consensus guidelines and expert opinion were then compared. Results Strategies recommended in the guidelines, such as prophylactic antibiotics, preoperative skin preparation of patients and staff, and sterile surgical attire, were considered critically or significantly important by the surveyed surgeons. Additional strategies such as ultraclean air/laminar flow, antibiotic cement, wound irrigation, and preoperative blood glucose control were also considered highly important by surveyed surgeons but were not recommended, or not uniformly addressed, in existing guidelines on SSI prevention. Conclusion Current evidence-based guidelines are incomplete, and the evidence should be updated to specifically address the needs of patients undergoing TJA.
An exploratory study of staff nurses' knowledge of delirium in the medical ICU: An Asian perspective
Abstract:
Aim The aim of this study was to establish intensive care unit (ICU) nurses’ knowledge of delirium within an acute tertiary hospital in South East Asia. Background Delirium is a common, life-threatening and often preventable cause of morbidity and mortality among older patients. Undetected and untreated delirium is a catalyst for increased mortality, morbidity and functional decline, and results in an increased requirement for nursing care, healthcare expense and hospital length of stay. However, despite effective assessment tools to identify delirium in the acute setting, ICU nurses remain unable to accurately identify delirium in the critically ill patient, especially hypoactive delirium. Method A purposive sample of 53 staff nurses from a 13-bed medical intensive care unit within an acute tertiary teaching hospital in South East Asia was asked to participate. A 40-item, 5-point Likert scale questionnaire was employed to determine the participants’ knowledge of the signs and symptoms, the risk factors and the negative outcomes of delirium. Results The overall positively answered mean score was 27 (67.3%) out of a possible 40 questions. Mean scores for knowledge of signs and symptoms, risk factors and negative outcomes were 9.52 (63.5%, n = 15), 11.43 (63.5%, n = 17) and 6.0 (75%, n = 8), respectively. Conclusion Whilst the results of this study are similar to others from a western perspective, the ICU nurses in this study demonstrated limited knowledge of the signs and symptoms, risk factors and negative outcomes of delirium in the critically ill patient. The implications of this for practice are important given the outcomes of untreated delirium.
Abstract:
The use of the Sengstaken–Blakemore tube as a life-saving treatment for bleeding oesophageal varices is slowly becoming the least preferred method, possibly due to the potential complications associated with its placement. Nursing practice pertaining to the care of this patient group appears ad hoc and reliant on local knowledge and experience rather than recognised evidence of best practice. Therefore, this paper focuses on the application of Lewin's transitional change theory, used to introduce a change in nursing practice through a guideline to enhance the care of patients with a Sengstaken–Blakemore tube in situ within a general intensive care unit. This approach identified some of the complexities surrounding the change process, including the driving forces that must be harnessed and the restraining forces that must be minimised for the adoption of change to be successful.
Abstract:
AIM The aim of this paper was to review the current discourse in relation to intensive care unit (ICU) delirium. In particular, it will discuss the predisposing and contributory factors associated with delirium's development as well as effects of delirium on patients, staff and family members. BACKGROUND Critically ill patients are at greater risk of developing delirium and, with an ageing population and increased patient acuity permitted by medical advances, delirium is a growing problem in the ICU. However, there is a universal consensus that the definition of ICU delirium needs improvement to aid its recognition and to ensure both hypoalert-hypoactive and hyperalert-hyperactive variants are easily and readily identified. RELEVANCE TO CLINICAL PRACTICE The effects of ICU delirium have cost implications to the National Health Service in terms of prolonged ventilation and length of hospital stay. The causes of delirium can be readily classified as either predisposing or precipitating factors, which are organic in nature and commonly reversible. However, contributory factors also exist to exacerbate delirium and having an awareness of all these factors promises to aid prevention and expedite treatment. This will avoid or limit the host of adverse physiological and psychological consequences that delirium can provoke and directly enhance both patient and staff safety. CONCLUSIONS Routine screening of all patients in the ICU for the presence of delirium is crucial to its successful management. Nurses are on the front line to detect, manage and even prevent delirium.