924 results for Renal artery
Abstract:
BACKGROUND
Renal impairment (RI) is associated with a worse prognosis in patients with coronary artery disease. Clinical and angiographic outcomes of patients undergoing percutaneous coronary intervention (PCI) with drug-eluting stents (DES) in this patient population are not well established.
METHODS
We pooled individual data for 5,011 patients from 3 trials with the exclusive and unrestricted use of DES (SIRTAX - N = 1,012, LEADERS - N = 1,707, RESOLUTE AC - N = 2,292). Angiographic follow-up was available for 1,544 lesions. Outcomes through 2 years were stratified according to glomerular filtration rate (normal renal function: GFR≥90 ml/min; mild RI: 90
Abstract:
INTRODUCTION Anemia and renal impairment are important co-morbidities among patients with coronary artery disease undergoing percutaneous coronary intervention (PCI). Disease progression to eventual death can be understood as the combined effect of baseline characteristics and intermediate outcomes. METHODS Using data from a prospective cohort study, we investigated clinical pathways reflecting the transitions from PCI through intermediate ischemic or hemorrhagic events to all-cause mortality in a multi-state analysis as a function of anemia (hemoglobin concentration <120 g/l and <130 g/l for women and men, respectively) and renal impairment (creatinine clearance <60 ml/min) at baseline. RESULTS Among 6029 patients undergoing PCI, anemia alone, renal impairment alone, or both in combination were observed in 990 (16.4%), 384 (6.4%), and 309 (5.1%) patients, respectively. The most frequent transition was from PCI to death (6.7%, 95% CI 6.1-7.3), followed by ischemic events (4.8%, 95% CI 4.3-5.4) and bleeding (3.4%, 95% CI 3.0-3.9). Among patients with both anemia and renal impairment, the risk of death was increased almost 4-fold compared with the reference group (HR 3.9, 95% CI 2.9-5.4) and roughly doubled compared with patients with either anemia (HR 1.7, 95% CI 1.3-2.2) or renal impairment (HR 2.1, 95% CI 1.5-2.9) alone. Hazard ratios indicated an increased risk of bleeding in all three groups compared with patients with neither anemia nor renal impairment. CONCLUSIONS Applying a multi-state model, we found evidence for a gradient of risk for the composite of bleeding, ischemic events, or death as a function of hemoglobin value and estimated glomerular filtration rate at baseline.
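To make the kind of analysis above concrete, the following sketch fits a standard Cox proportional-hazards model to a single transition (PCI to death) using the baseline definitions quoted in the abstract; the data, column names, and outcome model are invented placeholders, and the full multi-state analysis of the study is not reproduced here.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1000

# Hypothetical baseline table; column names are placeholders, not the cohort's.
df = pd.DataFrame({
    "hb_g_l": rng.normal(135, 15, n),        # hemoglobin, g/l
    "male": rng.integers(0, 2, n),           # 1 = male, 0 = female
    "crcl_ml_min": rng.normal(75, 20, n),    # creatinine clearance, ml/min
})

# Baseline definitions from the abstract: sex-specific hemoglobin cutoffs
# (<130 g/l men, <120 g/l women) and creatinine clearance < 60 ml/min.
df["anemia"] = np.where(df["male"] == 1, df["hb_g_l"] < 130, df["hb_g_l"] < 120).astype(int)
df["renal_impairment"] = (df["crcl_ml_min"] < 60).astype(int)

# Simulated follow-up time and death indicator (placeholder outcome model).
rate = 0.05 * np.exp(0.5 * df["anemia"] + 0.7 * df["renal_impairment"])
df["time_years"] = rng.exponential(1.0 / rate)
df["death"] = (df["time_years"] <= 2.0).astype(int)
df["time_years"] = df["time_years"].clip(upper=2.0)  # administrative censoring at 2 years

# Cox model for the single PCI-to-death transition.
cph = CoxPHFitter()
cph.fit(df[["anemia", "renal_impairment", "time_years", "death"]],
        duration_col="time_years", event_col="death")
cph.print_summary()  # hazard ratios with 95% CIs, analogous to those reported
```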
Abstract:
OBJECTIVE The development of peripheral artery disease is affected by the presence of cardiovascular risk factors. It is unclear whether particular risk factors lead to different clinical stages of peripheral artery disease. The aim of this retrospective cross-sectional study was to assess the association of cardiovascular risk factors with the presence of critical limb ischaemia. METHODS The study cohort was derived from a consecutive registry of patients undergoing endovascular therapy in a tertiary referral centre between January 2000 and April 2014. Patients undergoing first-time endovascular intervention for chronic peripheral artery disease of the lower extremities were included. Univariate and multivariate logistic regression models were used to assess the association of age, sex, diabetes mellitus, hypertension, dyslipidaemia, smoking, and renal insufficiency with critical limb ischaemia vs. intermittent claudication. RESULTS A total of 3406 patients were included in the study (mean age 71.7 ± 11.8 years, 2075 [61%] male). There was a significant association of age (OR 1.67, 95%-CI 1.53-1.82, p < 0.001), male gender (OR 1.23, 95%-CI 1.04-1.47, p = 0.016), diabetes (OR 1.99, 95%-CI 1.68-2.36, p < 0.001), and renal insufficiency (OR 1.62, 95%-CI 1.35-1.96, p < 0.001) with the likelihood of critical limb ischaemia. Smoking was associated with intermittent claudication rather than critical limb ischaemia (OR 0.78, 95%-CI 0.65-0.94, p = 0.010), while hypertension and dyslipidaemia did not show an association with critical limb ischaemia. CONCLUSIONS In peripheral artery disease patients undergoing first-time endovascular treatment, age, male gender, diabetes, and renal insufficiency were the strongest predictors of the presence of critical limb ischaemia.
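As a rough illustration of the type of regression analysis described, the sketch below fits a multivariable logistic regression and reports odds ratios with confidence intervals; the data are simulated and the variable names are illustrative, not those of the registry.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 3000

# Simulated patient-level data; variable names are placeholders only.
df = pd.DataFrame({
    "age_decades": rng.normal(7.2, 1.2, n),   # age expressed in decades
    "male": rng.integers(0, 2, n),
    "diabetes": rng.integers(0, 2, n),
    "renal_insufficiency": rng.integers(0, 2, n),
    "smoking": rng.integers(0, 2, n),
})

# Placeholder outcome: 1 = critical limb ischaemia, 0 = intermittent claudication.
lin = (-4.0 + 0.5 * df["age_decades"] + 0.2 * df["male"] + 0.7 * df["diabetes"]
       + 0.5 * df["renal_insufficiency"] - 0.25 * df["smoking"])
df["cli"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-lin)))

# Multivariable logistic regression; exponentiated coefficients are odds ratios.
X = sm.add_constant(df.drop(columns="cli"))
fit = sm.Logit(df["cli"], X).fit(disp=0)
ci = fit.conf_int()
summary = pd.DataFrame({
    "OR": np.exp(fit.params),
    "CI_low": np.exp(ci[0]),
    "CI_high": np.exp(ci[1]),
    "p": fit.pvalues,
})
print(summary.round(3))
```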
Abstract:
BACKGROUND Cardiac troponin detected by new-generation, highly sensitive assays predicts clinical outcomes among patients with stable coronary artery disease (SCAD) treated medically. The prognostic value of baseline high-sensitivity cardiac troponin T (hs-cTnT) elevation in SCAD patients undergoing elective percutaneous coronary intervention is not well established. This study assessed the association of preprocedural levels of hs-cTnT with 1-year clinical outcomes among SCAD patients undergoing percutaneous coronary intervention. METHODS AND RESULTS Between 2010 and 2014, 6974 consecutive patients were prospectively enrolled in the Bern Percutaneous Coronary Interventions Registry. Among patients with SCAD (n=2029), 527 (26%) had elevated preprocedural hs-cTnT above the upper reference limit of 14 ng/L. The primary end point, mortality within 1 year, occurred in 20 patients (1.4%) with normal hs-cTnT versus 39 patients (7.7%) with elevated baseline hs-cTnT (P<0.001). Patients with elevated hs-cTnT had increased risks of all-cause (hazard ratio 5.73; 95% confidence interval 3.34-9.83; P<0.001) and cardiac mortality (hazard ratio 4.68; 95% confidence interval 2.12-10.31; P<0.001). Preprocedural hs-cTnT elevation remained an independent predictor of 1-year mortality after adjustment for relevant risk factors, including age, sex, and renal failure (adjusted hazard ratio 2.08; 95% confidence interval 1.10-3.92; P=0.024). A graded mortality risk was observed across higher tertiles of elevated preprocedural hs-cTnT, but not among patients with hs-cTnT below the upper reference limit. CONCLUSIONS Preprocedural elevation of hs-cTnT is observed in one fourth of SCAD patients undergoing elective percutaneous coronary intervention. Increased levels of preprocedural hs-cTnT are proportionally related to the risk of death and emerged as an independent predictor of all-cause mortality within 1 year. CLINICAL TRIAL REGISTRATION URL: http://www.clinicaltrials.gov. Unique identifier: NCT02241291.
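A minimal sketch of how a graded risk across hs-cTnT groups might be examined, assuming simulated data and the 14 ng/L upper reference limit quoted above; this is not the study's analysis, which used Cox models adjusted for clinical risk factors.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

rng = np.random.default_rng(2)
n = 1500

# Hypothetical preprocedural hs-cTnT values (ng/L); distribution is a placeholder.
hs_ctnt = np.exp(rng.normal(2.3, 0.8, n))
elevated = hs_ctnt > 14.0                     # upper reference limit from the abstract

# Group 0 = below the reference limit; groups 1-3 = tertiles of elevated values.
groups = np.zeros(n, dtype=int)
tertile_cuts = np.quantile(hs_ctnt[elevated], [1 / 3, 2 / 3])
groups[elevated] = 1 + np.digitize(hs_ctnt[elevated], tertile_cuts)

# Placeholder 1-year survival times with risk increasing across groups.
rate = 0.01 * np.exp(0.6 * groups)
time = np.minimum(rng.exponential(1.0 / rate), 1.0)   # censor at 1 year
death = (time < 1.0).astype(int)

df = pd.DataFrame({"time": time, "death": death, "group": groups})
res = multivariate_logrank_test(df["time"], df["group"], df["death"])
print(f"log-rank test across groups: p = {res.p_value:.3g}")

kmf = KaplanMeierFitter()
for g, sub in df.groupby("group"):
    kmf.fit(sub["time"], sub["death"], label=f"group {g}")
    print(f"group {g}: 1-year survival = {kmf.survival_function_.iloc[-1, 0]:.3f}")
```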
Abstract:
Elevated homocysteine (hyperhomocysteinaemia) in renal patients is a major concern for physicians. Although a causal relationship between homocysteine and cardiovascular disease (CVD) has not been established in either the general population or renal patients, there is considerable evidence that such a relationship exists. Purported mechanisms that may explain this effect include increases in endothelial injury, smooth muscle cell proliferation, low-density lipoprotein oxidation and changes in haemostatic balance. Renal patients have a much greater incidence of hyperhomocysteinaemia, which may be explained by decreases in either the renal or extrarenal metabolism of the compound. We conclude that data from long-term placebo-controlled trials are urgently required to determine whether hyperhomocysteinaemia in renal patients is a cause of CVD events and requires therapeutic targeting.
Abstract:
Hyperhomocysteinemia is a potential risk factor for vascular disease and is associated with endothelial dysfunction, a predictor of adverse cardiovascular events. Renal patients, both those with end-stage renal failure (ESRF) and renal transplant recipients (RTR), exhibit hyperhomocysteinemia and endothelial dysfunction, with increasing evidence of a causative link between the two conditions. The elevated homocysteine appears to be due to altered metabolism in the kidney (intrarenal) and in the uremic circulation (extrarenal). This review discusses 18 supplementation studies conducted in ESRF and 6 in RTR investigating the effects of nutritional therapy to lower homocysteine. The clinical significance of lowering homocysteine in renal patients is discussed, and data on the effects of B vitamin supplementation on cardiovascular outcomes such as endothelial function are presented. Folic acid is the most effective nutritional therapy to lower homocysteine. In ESRF patients, supplementation with folic acid over a wide dose range (2-20 mg/day), either individually or in combination with other B vitamins, will decrease but not normalize homocysteine. In contrast, in RTR similar doses of folic acid normalize homocysteine. Folic acid improves endothelial function in ESRF patients; however, this has yet to be investigated in RTR. Homocysteine-lowering therapy is more effective in ESRF patients than in RTR.
Abstract:
BACKGROUND AND AIMS: Although it has become clear that aneurysmal and occlusive arterial disease represent two distinct etiologic entities, it is still unknown whether the two vascular pathologies differ prognostically. We aimed to assess the long-term vital prognosis of patients with abdominal aortic aneurysmal disease (AAA) or peripheral artery disease (PAD), focusing on possible differences in survival, prognostic risk profiles and causes of death. METHODS: Patients undergoing elective surgery for isolated AAA or PAD between 2003 and 2011 were retrospectively included. Differences in postoperative survival were determined using Kaplan-Meier and Cox regression analysis. Prognostic risk profiles were also established with Cox regression analysis. RESULTS: 429 and 338 patients were included in the AAA and PAD groups, respectively. AAA patients were older (71.7 vs. 63.3 years, p < 0.001), yet overall survival following surgery did not differ (HR: 1.16, 95% CI: 0.87-1.54). Type of vascular disease was associated with neither postoperative cardiovascular nor cancer-related death. However, in comparison with age- and gender-matched general populations, cardiovascular mortality was higher in PAD than in AAA patients (48.3% vs. 17.3%). Survival of AAA and PAD patients was negatively affected by age, history of cancer and renal insufficiency. Additional determinants in the PAD group were diabetes and ischemic heart disease. CONCLUSIONS: Long-term survival after surgery for PAD and AAA is similar. However, overall life expectancy is significantly worse among PAD patients. The contribution of cardiovascular disease to mortality in PAD patients warrants more aggressive secondary prevention to reduce cardiovascular mortality and improve longevity.
Abstract:
Background The accurate measurement of cardiac output (CO) is vital in guiding the treatment of critically ill patients. Invasive or minimally invasive measurement of CO is not without inherent risks to the patient. Skilled Intensive Care Unit (ICU) nursing staff are in an ideal position to assess changes in CO following therapeutic measures. The USCOM (Ultrasonic Cardiac Output Monitor) device is a non-invasive CO monitor whose clinical utility and ease of use require testing. Objectives To compare cardiac output measurement using a non-invasive ultrasonic device (USCOM) operated by a non-echocardiographically trained ICU Registered Nurse (RN) with the conventional pulmonary artery catheter (PAC) using both thermodilution and Fick methods. Design Prospective observational study. Setting and participants Between April 2006 and March 2007, we evaluated 30 spontaneously breathing patients requiring PAC for assessment of heart failure and/or pulmonary hypertension at a tertiary level cardiothoracic hospital. Methods USCOM CO was compared with thermodilution measurements via PAC and CO estimated using a modified Fick equation. The PAC was inserted by a medical officer, and all USCOM measurements were performed by a senior ICU nurse. Mean values, bias and precision, and mean percentage difference between measures were determined to compare methods. The intra-class correlation statistic was also used to assess agreement. The USCOM time to measure was recorded to assess the learning curve for USCOM use performed by an ICU RN, and a line of best fit was used to describe the operator learning curve. Results In 24 of 30 (80%) patients studied, CO measures were obtained. In 6 of 30 (20%) patients, an adequate USCOM signal was not achieved. The mean differences (±standard deviation) between USCOM and PAC, USCOM and Fick, and Fick and PAC CO were small: −0.34 ± 0.52 L/min, −0.33 ± 0.90 L/min and −0.25 ± 0.63 L/min respectively, across a range of outputs from 2.6 L/min to 7.2 L/min. The percent limits of agreement (LOA) were −34.6% to 17.8% for USCOM and PAC, −49.8% to 34.1% for USCOM and Fick, and −36.4% to 23.7% for PAC and Fick. Signal acquisition time decreased on average by 0.6 min per measurement to less than 10 min at the end of the study. Conclusions In 80% of our cohort, USCOM, PAC and Fick measures of CO showed clinically acceptable agreement, and the learning curve for operation of the non-invasive USCOM device by an ICU RN was satisfactorily short. Further work is required in patients receiving positive pressure ventilation.
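The bias, precision, and limits-of-agreement statistics reported here correspond to a Bland-Altman style method comparison; a minimal sketch with made-up paired cardiac output readings is shown below.

```python
import numpy as np

# Made-up paired cardiac output readings (L/min); not data from the study.
uscom = np.array([4.1, 5.2, 3.8, 6.0, 4.7, 5.5, 3.1, 6.8])
pac = np.array([4.5, 5.4, 4.0, 6.5, 5.1, 5.8, 3.0, 7.2])

diff = uscom - pac
pair_mean = (uscom + pac) / 2.0

bias = diff.mean()                          # mean difference (bias)
sd = diff.std(ddof=1)                       # precision (SD of differences)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement

# Percentage difference relative to the pairwise mean, as often reported
# in method-comparison studies.
pct_diff = 100.0 * diff / pair_mean

print(f"bias = {bias:.2f} L/min, LOA = [{loa[0]:.2f}, {loa[1]:.2f}] L/min")
print(f"mean percentage difference = {pct_diff.mean():.1f}%")
```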
Abstract:
End-stage renal failure is a life-threatening condition, often treated with home-based peritoneal dialysis (PD). PD is a demanding regimen, and the patients who practise it must make numerous lifestyle changes and learn complicated biomedical techniques. In our experience, the renal nurses who provide most PD education frequently express concerns that patient compliance with their teaching is poor. These concerns are mirrored in the renal literature. It has been argued that the perceived failure of health professionals to improve compliance rates with PD regimens is because ‘compliance’ itself has never been adequately conceptualized or defined; thus, it is difficult to operationalize and quantify. This paper examines how a group of Australian renal nurses construct patient compliance with PD therapy. These empirical data illuminate how PD compliance operates in one practice setting; how it is characterized by multiple and often competing energies; and how ultimately it might be pointless to try to tame ‘compliance’ through rigid definitions and measurement, or to rigidly enforce it in PD patients. The energies involved are too fractious and might be better spent, as many of the more experienced nurses in this study argue, in augmenting the energies that do work well together to improve patient outcomes.
Abstract:
The high morbidity and mortality associated with atherosclerotic coronary vascular disease (CVD) and its complications are being lessened by increased knowledge of risk factors, effective preventative measures and proven therapeutic interventions. However, significant CVD morbidity remains, and sudden cardiac death continues to be a presenting feature for some subsequently diagnosed with CVD. Coronary vascular disease is also the leading cause of anaesthesia-related complications. Stress electrocardiography/exercise testing is predictive of 10-year risk of CVD events, and the cardiovascular variables used to score this test are monitored peri-operatively. Similar physiological time-series datasets are being subjected to data mining methods for the prediction of medical diagnoses and outcomes. This study aims to find predictors of CVD using anaesthesia time-series data and patient risk factor data. Several pre-processing and predictive data mining methods are applied to these data. Physiological time-series data related to anaesthetic procedures are subjected to pre-processing methods for removal of outliers, calculation of moving averages, as well as data summarisation and data abstraction methods. Feature selection methods of both wrapper and filter types are applied to derived physiological time-series variable sets alone and to the same variables combined with risk factor variables. The ability of these methods to identify subsets of highly correlated but non-redundant variables is assessed. The major dataset is derived from the entire anaesthesia population, and subsets of this population are considered to be at increased anaesthesia risk based on their need for more intensive monitoring (invasive haemodynamic monitoring and additional ECG leads). Because of the unbalanced class distribution in the data, majority-class under-sampling and the Kappa statistic, together with misclassification rate and area under the ROC curve (AUC), are used for evaluation of models generated using different prediction algorithms. The performance of models derived from feature-reduced datasets reveals the filter method, Cfs subset evaluation, to be most consistently effective, although Consistency-derived subsets tended to give slightly increased accuracy at markedly increased complexity. The use of misclassification rate (MR) for model performance evaluation is influenced by class distribution. This could be eliminated by consideration of the AUC or Kappa statistic, as well as by evaluation of subsets with an under-sampled majority class. The noise and outlier removal pre-processing methods produced models with MR ranging from 10.69 to 12.62, with the lowest value being for data from which both outliers and noise were removed (MR 10.69). For the raw time-series dataset, MR is 12.34. Feature selection reduces MR to between 9.8 and 10.16, with time-segmented summary data (dataset F) at 9.8 and raw time-series summary data (dataset A) at 9.92. However, for all datasets based on time-series data alone, model complexity is high. For most pre-processing methods, Cfs could identify a subset of correlated and non-redundant variables from the time-series-only datasets, but models derived from these subsets consist of only one leaf. MR values are consistent with the class distribution in the subset folds evaluated in the n-fold cross-validation method.
For models based on Cfs-selected time-series-derived and risk factor (RF) variables, the MR ranges from 8.83 to 10.36, with dataset RF_A (raw time-series data and RF) at 8.85 and dataset RF_F (time-segmented time-series variables and RF) at 9.09. The models based on counts of outliers and counts of data points outside the normal range (dataset RF_E), and on derived variables based on time series transformed using Symbolic Aggregate Approximation (SAX) with associated time-series pattern cluster membership (dataset RF_G), perform the least well, with MR of 10.25 and 10.36 respectively. For coronary vascular disease prediction, nearest neighbour (NNge) and the support vector machine based method, SMO, have the highest MR (10.1 and 10.28), while logistic regression (LR) and the decision tree (DT) method, J48, have MR of 8.85 and 9.0 respectively. DT rules are the most comprehensible and clinically relevant. The increase in predictive accuracy achieved by adding risk factor variables to time-series-based models is significant. The addition of time-series-derived variables to models based on risk factor variables alone is associated with a trend towards improved performance. Data mining of feature-reduced anaesthesia time-series variables together with risk factor variables can produce compact and moderately accurate models able to predict coronary vascular disease. Decision tree analysis of time-series data combined with risk factor variables yields rules which are more accurate than models based on time-series data alone. The limited additional value provided by electrocardiographic variables compared with use of risk factors alone is consistent with recent suggestions that exercise electrocardiography (exECG) under standardised conditions has limited additional diagnostic value over risk factor analysis and symptom pattern. The pre-processing used in this study had limited effect when time-series variables and risk factor variables were used together as model input. In the absence of risk factor input, the use of time-series variables after outlier removal, and of time-series variables based on physiological values falling outside the accepted normal range, is associated with some improvement in model performance.
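A minimal sketch of the evaluation strategy described above, assuming a scikit-learn workflow: majority-class under-sampling followed by cross-validated decision-tree predictions (a stand-in for J48) scored with misclassification rate, the Kappa statistic, and AUC. The synthetic data are a placeholder for the anaesthesia dataset.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_predict
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import cohen_kappa_score, roc_auc_score

# Synthetic, imbalanced stand-in for the anaesthesia dataset.
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.9, 0.1],
                           random_state=0)

# Majority-class under-sampling to a balanced subset.
rng = np.random.default_rng(0)
majority = np.flatnonzero(y == 0)
minority = np.flatnonzero(y == 1)
keep = np.concatenate([rng.choice(majority, size=minority.size, replace=False),
                       minority])
X_bal, y_bal = X[keep], y[keep]

# Cross-validated decision-tree predictions on the balanced data.
tree = DecisionTreeClassifier(max_depth=4, random_state=0)
pred = cross_val_predict(tree, X_bal, y_bal, cv=10)
prob = cross_val_predict(tree, X_bal, y_bal, cv=10, method="predict_proba")[:, 1]

mr = 100.0 * (pred != y_bal).mean()
print(f"misclassification rate: {mr:.1f}%")
print(f"Kappa: {cohen_kappa_score(y_bal, pred):.2f}")
print(f"AUC:   {roc_auc_score(y_bal, prob):.2f}")
```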
Abstract:
The high levels of end-stage renal disease among Indigenous Australians, particularly in remote areas of the country, are a serious public health concern. The magnitude of the problem is reflected in figures from the Australian and New Zealand Transplant and Dialysis Registry, which show that Indigenous Australians experience end-stage renal disease at a rate almost 9–10 times higher than non-Indigenous Australians. A majority of Indigenous Australians have to relocate to receive appropriate renal dialysis treatment. In some Australian states, renal treatment is based on self-care dialysis, which allows Indigenous Australians to be treated back in their community. Evidence clearly shows that reuniting renal patients with community and family improves overall health and well-being for Indigenous Australians. With the appropriate resources, training, and support, self-care management of renal dialysis treatment is an effective way for Indigenous people with end-stage renal failure to be treated at home. In this context, the study was used to gain insight into, and further understanding of, the impact that end-stage renal disease and renal dialysis treatment have had on the lives of Indigenous community members. The study findings are from 14 individually interviewed people from South East Queensland. Data from the interviews were analysed using a combination of thematic and content analysis. The study methodology was based on qualitative data principles, whereby the Indigenous community members were able to share their experiences and journeys living with end-stage renal disease. Many of the experiences and understandings relate closely to the renal disease pattern and its treatment, with other outside influences, such as social, cultural, and environmental factors, all having an equal impact. Each community member’s experience with end-stage renal disease is unique; some manage with family and medical support, while others try to manage independently. In this study, community members who managed their renal dialysis treatment independently were much more aware of their renal health status. The study provides recommendations towards a model of care, based on self-care and self-determination principles, to improve health and well-being.
Abstract:
Background: Acute coronary syndromes are a major cause of mortality and morbidity. Objectives/Methods: The objective of this evaluation is to review the clinical trials of two new drugs being developed for the treatment of acute coronary syndromes. The first drug is the anticoagulant otamixaban; its trial compared otamixaban with unfractionated heparin plus eptifibatide in acute coronary syndromes. The second drug is the antiplatelet agent ticagrelor; its trial compared ticagrelor with clopidogrel in acute coronary syndromes. Results: In the SEPIA-ACS1 TIMI 42 trial, the primary efficacy endpoint occurred in 6.2% of subjects treated with unfractionated heparin and eptifibatide, and at a significantly lower rate with otamixaban. In the PLATO trial, the primary efficacy endpoint occurred less frequently in the ticagrelor group (9.8%) than in the clopidogrel group (11.7%) at 12 months. Conclusions: Two new drugs for acute coronary syndromes, otamixaban and ticagrelor, have recently been shown to have benefits, compared with current standard regimens, in subjects with this condition undergoing percutaneous interventions.