916 results for CARDIOPULMONARY COMPLICATIONS
Abstract:
Objectives: To identify and appraise the literature concerning nurse-administered procedural sedation and analgesia in the cardiac catheter laboratory (CCL). Design and data sources: An integrative review method was chosen for this study. MEDLINE and CINAHL databases as well as The Cochrane Database of Systematic Reviews and the Joanna Briggs Institute were searched. Nineteen research articles and three clinical guidelines were identified. Results: The authors of each study reported that nurse-administered sedation in the CCL is safe due to the low incidence of complications. However, a higher percentage of deeply sedated patients were reported to experience complications than moderately sedated patients. To confound this issue, one clinical guideline permits deep sedation without an anaesthetist present, while others recommend against it. All clinical guidelines recommend that nurses are educated about sedation concepts. Other findings focus on pain and discomfort and the cost savings of nurse-administered sedation, which are associated with forgoing anaesthetic services. Conclusions: Practice is varied due to limitations in the evidence and inconsistent clinical practice guidelines. Therefore, recommendations for research and practice have been made. Research topics include determining how and in which circumstances capnography can be used in the CCL, discerning the economic impact of sedation-related complications and developing a set of objectives for nursing education about sedation. For practice, if deep sedation is administered without an anaesthetist present, it is essential that nurses are adequately trained and have access to vital equipment such as capnography to monitor ventilation, because deeply sedated patients are more likely to experience complications related to sedation.
These initiatives will go some way to ensuring patients receiving nurse-administered procedural sedation and analgesia for a procedure in the cardiac catheter laboratory are cared for using consistent, safe and evidence-based practices.
Abstract:
Background: Procedural sedation and analgesia (PSA) administered by nurses in the cardiac catheterisation laboratory (CCL) is unlikely to yield serious complications. However, the safety of this practice is dependent on timely identification and treatment of depressed respiratory function. Aim: Describe respiratory monitoring in the CCL. Methods: Retrospective medical record audit of adult patients who underwent a procedure in the CCLs of one private hospital in Brisbane during May and June 2010. An electronic database was used to identify subjects and an audit tool ensured data collection was standardised. Results: Nurses administered PSA during 172/473 (37%) procedures including coronary angiographies, percutaneous coronary interventions, electrophysiology studies, radiofrequency ablations, cardiac pacemakers, implantable cardioverter defibrillators, temporary pacing leads and peripheral vascular interventions. Oxygen saturations were recorded during 160/172 (93%) procedures, respiration rate was recorded during 17/172 (10%) procedures, use of oxygen supplementation was recorded during 40/172 (23%) procedures and 13/172 (7.5%; 95% CI=3.59–11.41%) patients experienced oxygen desaturation. Conclusion: Although oxygen saturation was routinely documented, nurses did not regularly record respiration observations. It is likely that surgical draping and the requirement to minimise radiation exposure interfered with nurses' ability to observe respiration. Capnography could overcome these barriers to respiration assessment because it accurately measures exhaled carbon dioxide and produces an easily interpretable waveform displaying a breath-by-breath account of ventilation, enabling identification of respiratory depression in real time. Results of this audit emphasise the need to ascertain the clinical benefits associated with using capnography to assess ventilation during PSA in the CCL.
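The desaturation figure above (13/172; 7.5% with a 95% CI of 3.59–11.41%) is a simple proportion with a normal-approximation (Wald) interval. A minimal sketch of that calculation, offered as an illustration rather than the audit's actual analysis code:

```python
import math

def wald_ci(events: int, n: int, z: float = 1.96):
    """Point estimate and Wald 95% CI for a proportion."""
    p = events / n
    se = math.sqrt(p * (1 - p) / n)  # standard error of the proportion
    return p, p - z * se, p + z * se

# Counts from the audit: 13 desaturation events among 172 PSA procedures
p, lo, hi = wald_ci(13, 172)
print(f"{p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```

This yields roughly 7.6% (3.6% to 11.5%); the small difference from the published interval likely reflects rounding of the point estimate or a slightly different interval method.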
Abstract:
OBJECTIVES: To examine the prospective association between perception of health during pregnancy and cardiovascular risk factors of mothers 21 years after the index pregnancy. METHODS: Data used were from the Mater University Study of Pregnancy (MUSP), a community-based prospective birth cohort study begun in Brisbane, Australia, in 1983. Logistic regression analyses were conducted. RESULTS: Data were available for 3692 women. Women who perceived themselves as not having a straightforward pregnancy had twice the odds (adjusted OR 2.0, 95% CI 1.1-3.8) of being diagnosed with heart disease 21 years after the index pregnancy as compared to women with a straightforward pregnancy. Apart from that, women who had complications (other than serious pregnancy complications) during the pregnancy were also at 30% increased odds (adjusted OR 1.3, 95% CI 1.0-1.6) of having hypertension 21 years later. CONCLUSIONS: As a whole, our study suggests that pregnant women who perceived that they had complications and did not have a straightforward pregnancy are likely to experience poorer cardiovascular outcomes 21 years after the pregnancy.
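The adjusted odds ratios above come from logistic regression: an OR is the exponential of the model coefficient, and its 95% CI exponentiates the coefficient ± 1.96 standard errors. A hedged sketch with hypothetical inputs (chosen only to land near the reported OR of 2.0; they are not the study's estimates):

```python
import math

# Illustrative only: an adjusted odds ratio is exp(coefficient) from a
# logistic regression, with 95% CI of exp(coefficient +/- 1.96 * SE).
# The coefficient and SE below are hypothetical values for demonstration.
beta, se = 0.693, 0.32  # log-odds for "not a straightforward pregnancy"

or_ = math.exp(beta)
ci = (math.exp(beta - 1.96 * se), math.exp(beta + 1.96 * se))
print(f"OR {or_:.1f} (95% CI {ci[0]:.1f} to {ci[1]:.1f})")
```

With these made-up inputs the sketch produces an OR of about 2.0 with a wide interval, mirroring the shape of the reported estimate.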
Abstract:
Background: Bronchopulmonary dysplasia (BPD) is one of the most common complications after preterm birth and is associated with intrauterine exposure to bacteria. Transforming growth factor-β (TGFβ) is implicated in the development of BPD. Objectives: We hypothesized that different and/or multiple bacterial signals could elicit divergent TGFβ signaling responses in the developing lung. Methods: Time-mated pregnant Merino ewes received an intra-amniotic injection of lipopolysaccharide (LPS) and/or Ureaplasma parvum serovar 3 (UP) at 117 days' and/or 121/122 days' gestational age (GA). Controls received an equivalent injection of saline and/or media. Lambs were euthanized at 124 days' GA (term = 150 days' GA). TGFβ1, TGFβ2, TGFβ3, TGFβ receptor (R)1 and TGFβR2 protein levels, Smad2 phosphorylation and elastin deposition were evaluated in lung tissue. Results: Total TGFβ1 and TGFβ2 decreased by 24 and 51% after combined UP+LPS exposure, whereas total TGFβ1 increased by 31% after 7 days' LPS exposure but not after double exposures. Alveolar expression of TGFβR2 decreased 75% after UP, but remained unaltered after double exposures. Decreased focal elastin deposition after single LPS exposure was prevented by double exposures. Conclusions: TGFβ signaling components and elastin responded differently to intrauterine LPS and UP exposure. Multiple bacterial exposures attenuated TGFβ signaling and normalized elastin deposition.
Abstract:
Introduction: Thoracoscopic anterior instrumented fusion (TASF) is a safe and viable surgical option for corrective stabilisation of progressive adolescent idiopathic scoliosis (AIS) [1-2]. However, there is a paucity of literature examining optimum methods of analgesia following this type of surgery. The aim of this study was to identify: whether local anaesthetic bolus via an intrapleural catheter provides effective analgesia following thoracoscopic scoliosis correction; what pain levels may be expected; and any adverse effects associated with the use of intermittent intrapleural analgesia at our centre. Methods: A subset of the most recent 80 patients from a large single-centre consecutive series of 201 patients (April 2000 to present) who had undergone TASF had their medical records reviewed. Thirty-two patients met the inclusion criteria for the analysis (i.e. pain scores must have been recorded within the hour prior to and within two hours following an intrapleural bolus being given). All patients received an intrapleural catheter inserted during surgery, in addition to patient-controlled opiate analgesia and oral analgesia as required. After surgery, patients received a bolus of 0.25% bupivacaine every four hours via the intrapleural catheter. Visual analogue pain scale scores were recorded before and after the bolus of local anaesthetic, and the quantity and time of day that any other analgesia was taken were also recorded. Results and Discussion: Twenty-eight female and four male patients (mean age 14.5 ± 1.5 years) had a total of 230 boluses of local anaesthetic administered intrapleurally, directly onto the spine, in the 96-hour period following surgery. Pain scores significantly decreased following the administration of a bolus (p<0.0001), with the mean pain score decreasing from 3.66 to 1.83.
The quantity of opiates via patient-controlled analgesia after surgery decreased steadily between successive 24-hour intervals, after an initial increase in the second 24-hour period when patients were mobilised. One intrapleural catheter required early removal at 26 hours post-operatively due to leakage; there were no other complications associated with the intermittent intrapleural analgesia method. Post-operative pain following anterior scoliosis correction was decreased significantly by the administration of regular local anaesthetic boluses and can be reduced to 'mild' levels by combined analgesia regimes. The intermittent intrapleural analgesia method was not associated with any adverse events or complications in the full cohort of 201 patients.
Abstract:
Chlamydia trachomatis continues to be the most commonly reported sexually transmitted bacterial infection in many countries, with more than 100 million new cases estimated annually. These acute infections translate into significant downstream health care costs, particularly for women, where complications can include pelvic inflammatory disease and other disease sequelae such as tubal factor infertility. Despite years of research, the immunological mechanisms responsible for protective immunity versus immunopathology are still not well understood, although it is widely accepted that T cell-driven IFN-γ and Th17 responses are critical for clearing infection. While antibodies are able to neutralize infections in vitro, alone they are not protective, indicating that any successful vaccine will need to elicit both arms of the immune response. In recent years, there has been an expansion in the number and types of antigens that have been evaluated as vaccines, and combined with the new array of mucosal adjuvants, this aspect of chlamydial vaccinology is showing promise. Most recently, the opportunities to develop successful vaccines have been given a significant boost with the development of a genetic transformation system for Chlamydia, as well as the identification of the key role of the chlamydial plasmid in virulence. While still remaining a major challenge, the development of a successful C. trachomatis vaccine is starting to look more likely.
Abstract:
Hand, Foot and Mouth Disease (HFMD), a contagious viral disease that commonly affects infants and children with blisters and flu-like symptoms, is caused by a group of enteroviruses such as Enterovirus 71 (EV71) and coxsackievirus A16 (CA16). However, some cases of HFMD caused by EV71 may further develop into severe neurological complications such as encephalitis and meningitis. The virus is postulated to transmit from one person to another through direct contact with vesicular fluid or droplets from infected individuals, or via the faecal-oral route. To this end, this study utilised a human colorectal adenocarcinoma cell line (HT29) with epithelioid morphology as an in vitro model for the investigation of EV71 replication kinetics. Using qPCR, viral RNA was first detected in HT29 cells as early as 12 h post infection (hpi), while viral protein was first detected at 48 hpi. A significant change in HT29 cells' morphology was also observed after 48 hpi. Furthermore, HT29 cell viability also significantly decreased at 72 hpi. Together, data from this study demonstrated that co-culture of HT29 cells with EV71 is a useful in vitro model to study the pathogenesis of EV71.
Abstract:
Background: In diabetes care, health care professionals need to provide support for their patients. In order to provide good diabetes self-management support for adults with type 2 diabetes in Vietnam, it is important that health care professionals in Vietnam understand the factors influencing diabetes self-management among these people. However, knowledge about factors influencing diabetes self-management among adults with type 2 diabetes in Vietnam is limited. Objectives: This study aimed to investigate factors influencing diabetes self-management among adults with type 2 diabetes in Vietnam. Methodology: A cross-sectional survey with convenience sampling was conducted on 198 adults with type 2 diabetes in Vietnam. Data collection was administered via interviews. Descriptive statistics, simple correlation statistics and structural equation modelling statistics were used for data analysis. Results: Adults with type 2 diabetes in Vietnam had limited diabetes knowledge (Median = 6.0). The majority of the study participants (72.7%) believed that performing diabetes self-management activities was very important or extremely important for controlling their blood glucose levels and for preventing complications from diabetes; about half usually received support from their family and friends (48.5%), and around two thirds rarely received support from their health care providers (68.2%). Many of the participants (41.4%) had limited confidence to perform diabetes management activities. The practices of diabetes self-management were limited among the study population (Mean = 96.7, SD = 19.4). Diabetes knowledge (β = 0.17, p < .001), belief in treatment effectiveness (β = 0.13, p < .01), family and friends' support (β = 0.13, p < .001), health care providers' support (β = 0.27, p < .001) and diabetes management self-efficacy (β = 0.43, p < .001) directly influenced their diabetes self-management.
Diabetes knowledge, and family and friends' support, also indirectly influenced diabetes self-management among these people through their belief in treatment effectiveness and their diabetes management self-efficacy (p < .05). Conclusion: Findings in this study indicated that health care professionals should provide diabetes self-management support for adults with type 2 diabetes in Vietnam. The adapted theory-based model of factors influencing diabetes self-management found in this study could be a useful framework for developing such a support program.
Abstract:
Background: Gestational diabetes mellitus (GDM) is increasing, along with obesity and type 2 diabetes (T2DM), with Aboriginal and Torres Strait Islander people in Australia particularly affected. GDM causes serious complications in pregnancy, birth, and the longer term, for both women and their infants. Women diagnosed with GDM have an eightfold risk of developing T2DM after pregnancy, compared with women who have not had GDM. Indigenous women have an even higher risk, at a younger age, and progress more quickly from GDM to T2DM, compared to non-Indigenous women. If left undetected and untreated, T2DM can lead to heart disease, stroke, renal disease, kidney failure, amputations and blindness. A GDM diagnosis offers a 'window of opportunity' for diabetes health interventions and it is vital that acceptable and effective prevention, treatment, and post-pregnancy care are provided. Low rates of post-pregnancy screening for T2DM are reported among non-Aboriginal women in Australia and among Indigenous women in other countries; however, data for Aboriginal women are scarce. Breastfeeding, a healthy diet, and exercise can also help to prevent T2DM, and together with T2DM screening are recommended elements of 'post-pregnancy care' for women with GDM. This paper describes methods for a data linkage study to investigate rates of post-pregnancy care among women with GDM. Methods/Design: This retrospective cohort includes all women who gave birth at Cairns Base Hospital in Far North Queensland, Australia, from 2004 to 2010, coded as having GDM in the Cairns Base Hospital Clinical Coding system. Data linkage is being conducted with the Queensland Perinatal Data Collection, and three laboratories. Hospital medical records are being reviewed to validate the accuracy of GDM case ascertainment, and gather information on breastfeeding and provision of dietary advice.
Multiple logistic regression is being used to compare post-pregnancy care between Aboriginal and non-Aboriginal women, while adjusting for other factors that may impact on post-pregnancy care. Survival analysis is being used to estimate the rates of progression from GDM to T2DM. Discussion: There are challenges to collecting post-pregnancy data for women with GDM. However, research is urgently needed to ensure adequate post-pregnancy care is provided for women with GDM in Australia.
Abstract:
BACKGROUND: Transcatheter closure of patent foramen ovale (PFO) has rapidly evolved as the preferred management strategy for the prevention of recurrent cerebrovascular events in patients with cryptogenic stroke and presumed paradoxical embolus. There are limited outcome data in patients treated with this therapy, particularly for the newer devices. METHODS: Data from medical records, catheter, and echocardiography databases on 70 PFO procedures performed were collected prospectively. RESULTS: The cohort consisted of 70 patients (mean age 43.6 years, range 19 to 77 years), of whom 51% were male. The indications for closure were cryptogenic cerebrovascular accident (CVA) or transient ischemic attack (TIA) in 64 (91%) patients, peripheral emboli in two (2.8%), cryptogenic ST-elevation myocardial infarction in one (1.4%), refractory migraine in one (1.4%), decompression sickness in one (1.4%), and orthodeoxia in one (1.4%). All patients had demonstrated right-to-left shunting on bubble study. The procedures were guided by intracardiac echocardiography in 53%, transesophageal echocardiography in 39%, and the remainder by transthoracic echo alone. Devices used were the Amplatzer PFO Occluder (AGA Medical) (sizes 18-35 mm) in 49 (70%) and the Premere device (St. Jude Medical) in 21 (30%). In-hospital complications consisted of one significant groin hematoma with skin infection. Echocardiographic follow-up at 6 months revealed that most patients had no or trivial residual shunt (98.6%), while one patient (1.4%) had a mild residual shunt. At a median of 11 months' follow-up (range 1 month to 4.3 years), no patients (0%) experienced further CVA/TIAs or paradoxical embolic events. CONCLUSION: PFO causing presumed paradoxical embolism can be closed percutaneously with a low rate of significant residual shunting and very few complications. Recurrent index events are uncommon at medium-term (up to 4 years) follow-up.
Abstract:
BACKGROUND: US Centers for Disease Control guidelines recommend replacement of peripheral intravenous (IV) catheters no more frequently than every 72 to 96 hours. Routine replacement is thought to reduce the risk of phlebitis and bloodstream infection. Catheter insertion is an unpleasant experience for patients and replacement may be unnecessary if the catheter remains functional and there are no signs of inflammation. Costs associated with routine replacement may be considerable. This is an update of a review first published in 2010. OBJECTIVES: To assess the effects of removing peripheral IV catheters when clinically indicated compared with removing and re-siting the catheter routinely. SEARCH METHODS: For this update the Cochrane Peripheral Vascular Diseases (PVD) Group Trials Search Co-ordinator searched the PVD Specialised Register (December 2012) and CENTRAL (2012, Issue 11). We also searched MEDLINE (last searched October 2012) and clinical trials registries. SELECTION CRITERIA: Randomised controlled trials that compared routine removal of peripheral IV catheters with removal only when clinically indicated in hospitalised or community dwelling patients receiving continuous or intermittent infusions. DATA COLLECTION AND ANALYSIS: Two review authors independently assessed trial quality and extracted data. MAIN RESULTS: Seven trials with a total of 4895 patients were included in the review. Catheter-related bloodstream infection (CRBSI) was assessed in five trials (4806 patients). There was no significant between group difference in the CRBSI rate (clinically-indicated 1/2365; routine change 2/2441). The risk ratio (RR) was 0.61 but the confidence interval (CI) was wide, creating uncertainty around the estimate (95% CI 0.08 to 4.68; P = 0.64). No difference in phlebitis rates was found whether catheters were changed according to clinical indications or routinely (clinically-indicated 186/2365; 3-day change 166/2441; RR 1.14, 95% CI 0.93 to 1.39). 
This result was unaffected by whether infusion through the catheter was continuous or intermittent. We also analysed the data by number of device days and again no differences between groups were observed (RR 1.03, 95% CI 0.84 to 1.27; P = 0.75). One trial assessed all-cause bloodstream infection. There was no difference in this outcome between the two groups (clinically-indicated 4/1593 (0.02%); routine change 9/1690 (0.05%); P = 0.21). Cannulation costs were lower by approximately AUD 7.00 in the clinically-indicated group (mean difference (MD) -6.96, 95% CI -9.05 to -4.86; P ≤ 0.00001). AUTHORS' CONCLUSIONS: The review found no evidence to support changing catheters every 72 to 96 hours. Consequently, healthcare organisations may consider changing to a policy whereby catheters are changed only if clinically indicated. This would provide significant cost savings and would spare patients the unnecessary pain of routine re-sites in the absence of clinical indications. To minimise peripheral catheter-related complications, the insertion site should be inspected at each shift change and the catheter removed if signs of inflammation, infiltration, or blockage are present.
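The CRBSI and phlebitis comparisons in the results above are risk-ratio analyses. As an illustration (not the review's meta-analytic code; the published RR of 1.14 is a pooled estimate, so crude figures differ slightly), a crude risk ratio with a log-scale 95% CI can be computed from the raw phlebitis counts:

```python
import math

def risk_ratio(e1: int, n1: int, e2: int, n2: int, z: float = 1.96):
    """Crude risk ratio and 95% CI (normal approximation on the log scale)."""
    rr = (e1 / n1) / (e2 / n2)
    # Standard error of log(RR) for two independent proportions
    se_log = math.sqrt(1 / e1 - 1 / n1 + 1 / e2 - 1 / n2)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Phlebitis counts from the review: clinically-indicated 186/2365 vs routine 166/2441
rr, lo, hi = risk_ratio(186, 2365, 166, 2441)
print(f"RR {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

The crude result (about 1.16, 0.95 to 1.41) sits close to the pooled RR of 1.14 (0.93 to 1.39) reported in the review.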
Abstract:
Critical-sized osteochondral defects are clinically challenging, with limited treatment options available. By engineering osteochondral grafts using a patient's own cells and osteochondral scaffolds designed to facilitate cartilage and bone regeneration, osteochondral defects may be treated with fewer complications and better long-term clinical outcomes. Scaffolds can influence the development and structure of the engineered tissue, and there is an increased awareness that osteochondral tissue engineering concepts need to take the in vivo complexities into account in order to increase the likelihood of successful osteochondral tissue repair. The developing trend in osteochondral tissue engineering is the utilization of multiphasic scaffolds to recapitulate the multiphasic nature of the native tissue. Cartilage and bone have different structural, mechanical, and biochemical microenvironments. By designing osteochondral scaffolds with tissue-specific architecture, it may be possible to enhance osteochondral repair within a shorter timeframe. While there are promising in vivo outcomes using multiphasic approaches, functional regeneration of osteochondral constructs still remains a challenge. In this review, we provide an overview of in vivo osteochondral repair studies that have taken place in the past three years, and identify areas which need improvement in future studies.
Abstract:
Background Young parenthood continues to be an issue of concern in terms of clinical and psychosocial outcomes for mothers and their babies, with higher rates of medical complications such as preterm labour and hypertensive disease and a higher risk of depression. The aim of this study was to investigate how young age impacts on women's experience of intrapartum care. Methods Secondary analysis of data collected in a population-based survey of women who had recently given birth in Queensland, comparing clinical and interpersonal aspects of the intrapartum maternity care experience for 237 eligible women aged 15–20 years and 6534 aged more than 20 years. Descriptive and multivariate analyses were undertaken. Results In the univariate analysis a number of variables were significantly associated with clinical aspects of labour and birth and perceptions of care: young women were more likely to birth in a public facility, to travel for birth, to live in less economically advantaged areas, to have a normal vaginal birth and to have one carer through labour. They were also less likely to report being treated with respect and kindness and being talked to in a way they could understand. In logistic regression models, after adjustment for parity, other socio-demographic factors and mode of birth, younger mothers were still more likely to birth in a public facility, to travel for birth, and to be more critical about interpersonal aspects of care and the hospital or birth centre environment. Conclusion This study shows how the experience of care during labour and birth is different for young women. Young women reported poorer quality interpersonal care, which may well reflect an inferior care experience and stereotyping by health professionals, indicating a need for more effective staff engagement with young women at this time.
Abstract:
Background The association between temperature and mortality has been examined mainly in North America and Europe. However, less evidence is available in developing countries, especially in Thailand. In this study, we examined the relationship between temperature and mortality in Chiang Mai city, Thailand, during 1999–2008. Method A time series model was used to examine the effects of temperature on cause-specific mortality (non-external, cardiopulmonary, cardiovascular, and respiratory) and age-specific non-external mortality (≤64, 65–74, 75–84, and ≥85 years), while controlling for relative humidity, air pollution, day of the week, season and long-term trend. We used a distributed lag non-linear model to examine the delayed effects of temperature on mortality up to 21 days. Results We found non-linear effects of temperature on all mortality types and age groups. Both hot and cold temperatures resulted in an immediate increase in all mortality types and age groups. Generally, the hot effects on all mortality types and age groups were short-term, while the cold effects lasted longer. The relative risk of non-external mortality associated with cold temperature (19.35°C, 1st percentile of temperature) relative to 24.7°C (25th percentile of temperature) was 1.29 (95% confidence interval (CI): 1.16, 1.44) for lags 0–21. The relative risk of non-external mortality associated with high temperature (31.7°C, 99th percentile of temperature) relative to 28°C (75th percentile of temperature) was 1.11 (95% CI: 1.00, 1.24) for lags 0–21. Conclusion This study indicates that exposure to both hot and cold temperatures was related to increased mortality. Both cold and hot effects occurred immediately, but cold effects lasted longer than hot effects. This study provides useful data for policy makers to better prepare local responses to manage the impact of hot and cold temperatures on population health.
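In a distributed lag model like the one described above, each lag contributes to the log relative risk, and the cumulative effect over lags 0–21 is the exponential of the summed lag-specific contributions for a given temperature contrast. A toy sketch with invented lag coefficients (the study's fitted values are not reported in the abstract):

```python
import math

# Illustrative sketch, not the study's model: each lag l contributes
# beta[l] * delta_t to the log relative risk, and the cumulative RR over
# lags 0-21 is the exponential of the summed contributions.
# The lag coefficients below are made up for demonstration only.
betas = [0.004] * 3 + [0.001] * 19  # log-RR per deg C of cooling, lags 0..21
delta_t = 24.7 - 19.35              # cold contrast used in the study (deg C)

cum_log_rr = sum(b * delta_t for b in betas)
cum_rr = math.exp(cum_log_rr)
print(f"cumulative RR over lags 0-21: {cum_rr:.2f}")
```

The structure (larger immediate coefficients, smaller but persistent later ones) mirrors the abstract's finding that cold effects last longer than hot effects.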
Abstract:
Right heart dysfunction is one of the most serious complications following implantation of a left ventricular assist device (LVAD), often leading to the requirement for short- or long-term support with a right ventricular assist device (RVAD). The inflow cannulation site induces major haemodynamic changes, so there is a need to optimize the site used depending on the patient's condition. Therefore, this study evaluated and compared the haemodynamic influence of right atrial (RAC) and right ventricular (RVC) inflow cannulation sites. An in-vitro, variable heart failure, mock circulation loop was used to compare RAC and RVC in mild and severe biventricular heart failure (BHF) conditions. In the severe BHF condition, higher ventricular ejection fraction (RAC: 13.6%, RVC: 32.7%) and thus improved heart chamber and RVAD washout were observed with RVC, which suggested this strategy might be preferable for long-term support (i.e. bridge to transplant or destination therapy) to reduce the risk of thrombus formation. In the mild BHF condition, higher pulmonary valve flow (RAC: 3.33 L/min, RVC: 1.97 L/min) and lower right ventricular stroke work (RAC: 0.10 W, RVC: 0.13 W) and volumes were recorded with RAC. These results indicate an improved potential for myocardial recovery, so RAC should be chosen in this condition. This in-vitro study suggests that the RVAD inflow cannulation site should be chosen on a patient-specific basis, with a view to the support strategy, to promote myocardial recovery or reduce the risk of long-term complications.