553 results for Patient monitoring
in Queensland University of Technology - ePrints Archive
Abstract:
Background: Although the potential to reduce hospitalisation and mortality in chronic heart failure (CHF) is well reported, the feasibility of receiving healthcare by structured telephone support or telemonitoring is not. Aims: To determine adherence, adaptation and acceptability in relation to a national nurse-coordinated telephone-monitoring CHF management strategy: the Chronic Heart Failure Assistance by Telephone Study (CHAT). Methods: Triangulation of descriptive statistics, feedback surveys and qualitative analysis of clinical notes. The cohort comprised standard care plus intervention (SC + I) participants who completed the first year of the study. Results: 30 GPs (70% rural) randomised to SC + I recruited 79 eligible participants, of whom 60 (76%) completed the full 12-month follow-up period. During this time 3619 calls were made into the CHAT system (mean 45.81, SD 79.26, range 0-369). Overall adherence to the study protocol was 65.8% (95% CI 0.54-0.75; p = 0.001); however, among the 60 participants who completed the 12-month follow-up period, adherence was significantly higher at 92.3% (95% CI 0.82-0.97, p ≤ 0.001). Only 3% of this elderly group (mean age 74.7 ± 9.3 years) were unable to learn or competently use the technology. Participants rated CHAT with a total acceptability rate of 76.45%. Conclusion: This study shows that elderly CHF patients can adapt quickly, find telephone monitoring an acceptable part of their healthcare routine, and are able to maintain good adherence for at least 12 months.
Abstract:
Aim: To develop a set of Australian recommendations for the monitoring and treatment of ankylosing spondylitis (AS) through systematic literature review combined with the opinion of practising rheumatologists. Methods: A set of eight questions, four in each domain of monitoring and treatment, was formulated by voting and the Delphi method. The results of a systematic literature review addressing each question were presented to the 23 participants of the Australian 3E meeting. All participants were clinical rheumatologists experienced in the daily management of AS. Results: After three rounds of breakout sessions to discuss the findings of the literature review, a set of recommendations was finalised through discussion and voting. The category of evidence and strength of recommendation were determined for each proposal. The level of agreement among participants was excellent (mean 84%, range 64-100%). Conclusions: The 12 recommendations developed from evidence and expert opinion provide guidance for the daily management of AS patients. For most recommendations, we found a paucity of supportive evidence in the literature, highlighting the need for additional clinical studies.
Abstract:
Objective: To determine whether remote monitoring (structured telephone support or telemonitoring) without regular clinic or home visits improves outcomes for patients with chronic heart failure. Data sources: 15 electronic databases, hand searches of previous studies, and contact with authors and experts. Data extraction: Two investigators independently screened the results. Review methods: Published randomised controlled trials comparing remote monitoring programmes with usual care in patients with chronic heart failure managed within the community. Results: 14 randomised controlled trials (4264 patients) of remote monitoring met the inclusion criteria: four evaluated telemonitoring, nine evaluated structured telephone support, and one evaluated both. Remote monitoring programmes reduced the rates of admission to hospital for chronic heart failure by 21% (95% confidence interval 11% to 31%) and all-cause mortality by 20% (8% to 31%). Of the six trials evaluating health-related quality of life, three reported significant benefits with remote monitoring; of the four studies examining healthcare costs with structured telephone support, three reported reduced cost and one no effect. Conclusion: Programmes for chronic heart failure that include remote monitoring have a positive effect on clinical outcomes in community-dwelling patients with chronic heart failure.
Abstract:
Background: Knowledge of current trends in nurse-administered procedural sedation and analgesia (PSA) in the cardiac catheterisation laboratory (CCL) may provide important insights into how to improve the safety and effectiveness of this practice. Objective: To characterise current practice as well as education and competency standards regarding nurse-administered PSA in Australian and New Zealand CCLs. Design: A quantitative, cross-sectional, descriptive survey design was used. Methods: Data were collected using a web-based questionnaire on practice, educational standards and protocols related to nurse-administered PSA. Descriptive statistics were used to analyse the data. Results: A sample of 62 nurses, each from a different CCL, completed a questionnaire that focused on PSA practice. Over half of the estimated total number of CCLs in Australia and New Zealand was represented. Nurse-administered PSA was used in 94% (n = 58) of respondents' CCLs. All respondents indicated that benzodiazepines, opioids or a combination of both are used for PSA (n = 58); one respondent indicated that propofol was also used. Twenty per cent (n = 12) indicated that deep sedation is purposefully induced for defibrillation threshold testing and cardioversion without a second medical practitioner present. Sedation monitoring practices vary considerably between institutions. Thirty-one per cent (n = 18) indicated that comprehensive education about PSA is provided, and 45% (n = 26) indicated that nurses who administer PSA should undergo competency assessment. Conclusion: By characterising nurse-administered PSA in Australian and New Zealand CCLs, a baseline for future studies has been established. Areas of particular importance to improve include protocols for patient monitoring and comprehensive PSA education for CCL nurses in Australia and New Zealand.
Abstract:
Aims and objectives: To explore issues and challenges associated with nurse-administered procedural sedation and analgesia in the cardiac catheterisation laboratory from the perspectives of senior nurses. Background: Nurses play an important part in managing sedation because the prescription is usually given verbally, directly from the cardiologist who is performing the procedure, and typically an anaesthetist is not present. Design: A qualitative exploratory design was employed. Methods: Semi-structured interviews were conducted with 23 nurses from 16 cardiac catheterisation laboratories across four states in Australia and in New Zealand. Data analysis followed the guide developed by Braun and Clarke to identify the main themes. Results: Major themes emerged from analysis regarding the lack of access to anaesthetists, the limitations of sedative medications, the barriers to effective patient monitoring and the impact that the increasing complexity of procedures has on patients' sedation requirements. Conclusions: The most critical issue identified in this study is that current guidelines, which are meant to apply regardless of the clinical setting, are not practical for the cardiac catheterisation laboratory due to a lack of access to anaesthetists. Furthermore, this study has demonstrated that nurses hold concerns about the legitimacy of their practice in situations when they are required to perform tasks outside of clinical practice guidelines. To address nurses' concerns, it is proposed that new guidelines could be developed which address the unique circumstances in which sedation is used in the cardiac catheterisation laboratory. Relevance to clinical practice: Nurses need to possess advanced knowledge and skills in monitoring for the adverse effects of sedation. Several challenges impact on nurses' ability to monitor patients during procedural sedation and analgesia. Pre-procedural patient education about what to expect from sedation is essential.
Abstract:
The cardiac catheterisation laboratory (CCL) is a specialised medical radiology facility where both chronic-stable and life-threatening cardiovascular illness is evaluated and treated. Although there are many potential sources of discomfort and distress associated with procedures performed in the CCL, a general anaesthetic is not usually required. For this reason, an anaesthetist is not routinely assigned to the CCL. Instead, to manage pain, discomfort and anxiety during the procedure, nurses administer a combination of sedative and analgesic medications according to direction from the cardiologist performing the procedure. This practice is referred to as nurse-administered procedural sedation and analgesia (PSA). While anecdotal evidence suggested that nurse-administered PSA was commonly used in the CCL, it was clear from the limited information available that current nurse-led PSA administration and monitoring practices varied, and that there was contention around some aspects of practice, including the type of medications suitable for use and the depth of sedation that could be safely induced without an anaesthetist present. The overall aim of the program of research presented in this thesis was to establish an evidence base for nurse-led sedation practices in the CCL context. A sequential mixed methods design was used over three phases.

The objective of the first phase was to appraise the existing evidence for nurse-administered PSA in the CCL. Two studies were conducted. The first study was an integrative review of empirical research studies and clinical practice guidelines focused on nurse-administered PSA in the CCL as well as in other similar procedural settings. This was the first review to systematically appraise the available evidence supporting the use of nurse-administered PSA in the CCL. A major finding was that, overall, nurse-administered PSA in the CCL was generally deemed to be safe. However, it was concluded from the analysis of the studies and guidelines included in the review that the management of sedation in the CCL was impacted by a variety of contextual factors, including local hospital policy, workforce constraints and cardiologists' preferences for the type of sedation used. The second study in the first phase was conducted to identify a sedation scale that could be used to monitor level of sedation during nurse-administered PSA in the CCL. It involved a structured literature review and psychometric analysis of scale properties. Only one scale was found that had been developed specifically for the CCL, and it had not undergone psychometric testing; several weaknesses were identified in its item structure. The other sedation scales identified were developed for the ICU. Although these scales have demonstrated validity and reliability in the ICU, weaknesses in their item structure precluded their use in the CCL. As the findings indicated that no existing sedation scale should be applied to practice in the CCL, recommendations for the development and psychometric testing of a new sedation scale were developed.

The objective of the second phase of the program of research was to explore current practice. Three studies were conducted in this phase using both quantitative and qualitative research methods. The first was a qualitative explorative study of nurses' perceptions of the issues and challenges associated with nurse-administered PSA in the CCL.
Major themes emerged from analysis of the qualitative data regarding the lack of access to anaesthetists, the limitations of sedative medications, the barriers to effective patient monitoring and the impact that the increasing complexity of procedures has on patients' sedation requirements. The second study in Phase Two was a cross-sectional survey of nurse-administered PSA practice in Australian and New Zealand CCLs. This was the first study to quantify the frequency with which nurse-administered PSA was used in the CCL setting and to characterise associated nursing practices. It was found that nearly all CCLs (94%) utilise nurse-administered PSA. Of note, by characterising nurse-administered PSA in Australian and New Zealand CCLs, several strategies to improve practice were identified, such as setting up protocols for patient monitoring and establishing comprehensive PSA education for CCL nurses. The third study in Phase Two was a matched case-control study of risk factors for impaired respiratory function during nurse-administered PSA in the CCL setting. Patients with acute illness were found to be nearly twice as likely to experience impaired respiratory function during nurse-administered PSA (OR = 1.78; 95% CI = 1.19-2.67; p = 0.005). These significant findings can now be used to inform prospective studies investigating the effectiveness of interventions for impaired respiratory function during nurse-administered PSA in the CCL.

The objective of the third and final phase of the program of research was to develop recommendations for practice. To achieve this objective, a synthesis of findings from the previous phases informed a modified Delphi study, which was conducted to develop a set of clinical practice guidelines for nurse-administered PSA in the CCL. The clinical practice guidelines that were developed set current best practice standards for pre-procedural patient assessment and risk screening, as well as the intra- and post-procedural patient monitoring practices that nurses who administer PSA in the CCL should undertake in order to deliver safe, evidence-based and consistent care to the many patients who undergo procedures in this setting.

In summary, the mixed methods approach that was used enabled the research objectives to be comprehensively addressed in an informed, sequential manner and, as a consequence, this thesis has generated a substantial amount of new knowledge to inform and support nurse-led sedation practice in the CCL context. A limitation of the research to note, however, is that the comprehensive appraisal of the evidence, combined with the guideline development process, highlighted numerous deficiencies in the evidence base. As such, rather than being based on high-level evidence, many of the recommendations for practice were produced by consensus. For this reason, further research is required to ascertain which specific practices result in the most optimal patient and health service outcomes. Therefore, along with the necessary guideline implementation and evaluation projects, post-doctoral research is planned to follow up on the research gaps identified, as part of a continuing program of research in this field.
Abstract:
Exercise-based cardiac rehabilitation (CR) is efficacious in reducing mortality and hospital admissions; however, it remains inaccessible to large proportions of the patient population. To remove the attendance barriers associated with hospital- or centre-based CR, home-based CR has been promoted. Delivery of safe and appropriately prescribed exercise in the home was first documented 25 years ago, with the use of fixed land-line telecommunications to monitor the ECG. The advent of miniature ECG sensors, in conjunction with smartphones, now enables CR to be delivered with greater flexibility with regard to location, time and format, while retaining the capacity for real-time patient monitoring. A range of new systems allow other signals, including speed, location, pulse oximetry and respiration, to be monitored, and these may have application in CR. There is compelling evidence that telemonitoring-based CR is an effective alternative to traditional CR practice. The long-standing barrier of access to centre-based CR, combined with new delivery platforms, raises the question of when telemonitoring-based CR could replace conventional approaches as standard practice.
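By way of illustration, the following minimal sketch shows how a smartphone-side client might batch ECG, heart-rate and GPS readings and post them to a monitoring server. It is not drawn from any of the systems cited above; the endpoint URL, payload fields and read_sensors() helper are hypothetical assumptions.

```python
# Minimal, hypothetical sketch of a smartphone-side telemonitoring client.
# The endpoint URL, payload schema and read_sensors() helper are illustrative only.
import json
import time
import urllib.request

MONITOR_URL = "https://example.org/api/v1/sessions/123/samples"  # hypothetical endpoint


def read_sensors():
    """Placeholder for device APIs returning an ECG window, heart rate and GPS fix."""
    return {"ecg_mv": [0.12, 0.10, -0.05], "heart_rate_bpm": 92,
            "lat": -27.47, "lon": 153.03, "speed_kmh": 4.8}


def stream_session(duration_s=60, interval_s=2.0):
    """Post one sensor sample every interval_s seconds for duration_s seconds."""
    end = time.time() + duration_s
    while time.time() < end:
        payload = {"timestamp": time.time(), **read_sensors()}
        request = urllib.request.Request(
            MONITOR_URL,
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        try:
            urllib.request.urlopen(request, timeout=5)
        except OSError:
            pass  # periodic signal loss: drop (or buffer) the sample and carry on
        time.sleep(interval_s)
```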
Abstract:
It has been established that mixed venous oxygen saturation (SvO2) reflects the balance between systemic oxygen delivery and consumption. The literature indicates that it is a valuable clinical indicator and has good prognostic value early in the patient's clinical course. This article aims to establish the usefulness of SvO2 as a clinical indicator. A secondary aim was to determine whether central venous oxygen saturation (ScvO2) and SvO2 are interchangeable. Of particular relevance to cardiac nurses is the link between decreased SvO2 and cardiac failure in patients with myocardial infarction, and with decline in myocardial function, clinical shock and arrhythmias. While absolute values of ScvO2 and SvO2 are not interchangeable, ScvO2 and SvO2 are equivalent in terms of tracking clinical course. Additionally, ScvO2 monitoring is a safer and less costly alternative to SvO2 monitoring. It can be concluded that continuous ScvO2 monitoring should be considered in patients at risk of haemodynamic instability.
Abstract:
The high morbidity and mortality associated with atherosclerotic coronary vascular disease (CVD) and its complications are being lessened by increased knowledge of risk factors, effective preventative measures and proven therapeutic interventions. However, significant CVD morbidity remains, and sudden cardiac death continues to be a presenting feature for some people subsequently diagnosed with CVD. Coronary vascular disease is also the leading cause of anaesthesia-related complications. Stress electrocardiography/exercise testing is predictive of 10-year risk of CVD events, and the cardiovascular variables used to score this test are monitored peri-operatively. Similar physiological time-series datasets are being subjected to data mining methods for the prediction of medical diagnoses and outcomes. This study aims to find predictors of CVD using anaesthesia time-series data and patient risk factor data. Several pre-processing and predictive data mining methods are applied to these data. Physiological time-series data related to anaesthetic procedures are subjected to pre-processing methods for removal of outliers, calculation of moving averages, and data summarisation and abstraction. Feature selection methods of both wrapper and filter types are applied to derived physiological time-series variable sets alone and to the same variables combined with risk factor variables. The ability of these methods to identify subsets of highly correlated but non-redundant variables is assessed. The major dataset is derived from the entire anaesthesia population, and subsets of this population are considered to be at increased anaesthesia risk based on their need for more intensive monitoring (invasive haemodynamic monitoring and additional ECG leads). Because of the unbalanced class distribution in the data, majority-class under-sampling and the Kappa statistic, together with misclassification rate and area under the ROC curve (AUC), are used for evaluation of models generated using different prediction algorithms. The performance of models derived from feature-reduced datasets reveals the filter method, Cfs subset evaluation, to be most consistently effective, although Consistency-derived subsets tended to slightly increase accuracy at the cost of markedly increased complexity. The use of misclassification rate (MR) for model performance evaluation is influenced by class distribution; this could be eliminated by consideration of the AUC or Kappa statistic, as well as by evaluation of subsets with an under-sampled majority class. The noise and outlier removal pre-processing methods produced models with MR ranging from 10.69 to 12.62, with the lowest value being for data from which both outliers and noise were removed (MR 10.69). For the raw time-series dataset, MR is 12.34. Feature selection reduces MR to 9.8-10.16, with time-segmented summary data (dataset F) having an MR of 9.8 and raw time-series summary data (dataset A) 9.92. However, for all datasets based on time-series data alone, model complexity is high. For most pre-processing methods, Cfs could identify a subset of correlated and non-redundant variables from the time-series-only datasets, but models derived from these subsets consist of one leaf only. MR values are consistent with the class distribution in the subset folds evaluated in the n-fold cross-validation method.
For models based on Cfs-selected time-series-derived and risk factor (RF) variables, the MR ranges from 8.83 to 10.36, with dataset RF_A (raw time-series data and RF) at 8.85 and dataset RF_F (time-segmented time-series variables and RF) at 9.09. The models based on counts of outliers and counts of data points outside the normal range (dataset RF_E), and on variables derived from time series transformed using Symbolic Aggregate Approximation (SAX) with associated time-series pattern cluster membership (dataset RF_G), perform the least well, with MR of 10.25 and 10.36 respectively. For coronary vascular disease prediction, nearest neighbour (NNge) and the support vector machine based method, SMO, have the highest MR of 10.1 and 10.28, while logistic regression (LR) and the decision tree (DT) method, J48, have MR of 8.85 and 9.0 respectively. DT rules are the most comprehensible and clinically relevant. The increase in predictive accuracy achieved by adding risk factor variables to time-series-based models is significant, while the addition of time-series-derived variables to models based on risk factor variables alone is associated with a trend to improved performance. Data mining of feature-reduced anaesthesia time-series variables together with risk factor variables can produce compact and moderately accurate models able to predict coronary vascular disease. Decision tree analysis of time-series data combined with risk factor variables yields rules which are more accurate than models based on time-series data alone. The limited additional value provided by electrocardiographic variables, compared with the use of risk factors alone, is consistent with recent suggestions that exercise electrocardiography (exECG) under standardised conditions has limited additional diagnostic value over risk factor analysis and symptom pattern. The pre-processing used in this study had limited effect when time-series variables and risk factor variables are used together as model input. In the absence of risk factor input, the use of time-series variables after outlier removal, and of time-series variables based on physiological values falling outside the accepted normal range, is associated with some improvement in model performance.
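To make the general pattern concrete, the sketch below uses scikit-learn rather than the Weka tools named above, so SelectKBest with mutual information stands in for Cfs subset evaluation and DecisionTreeClassifier for J48; the synthetic unbalanced dataset is a placeholder for summarised time-series plus risk-factor variables. It is an illustrative sketch of the workflow, not the thesis code.

```python
# Minimal sketch: filter-style feature selection + decision tree on an unbalanced,
# synthetic stand-in dataset, evaluated with misclassification rate, kappa and AUC.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.metrics import accuracy_score, cohen_kappa_score, roc_auc_score
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.pipeline import Pipeline
from sklearn.tree import DecisionTreeClassifier

# Placeholder for summarised anaesthesia time-series variables joined with risk factors.
X, y = make_classification(n_samples=2000, n_features=40, n_informative=8,
                           weights=[0.9, 0.1], random_state=0)  # unbalanced classes

model = Pipeline([
    ("select", SelectKBest(mutual_info_classif, k=10)),          # filter feature selection
    ("tree", DecisionTreeClassifier(max_depth=4, class_weight="balanced",
                                    random_state=0)),            # compact, readable rules
])

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
pred = cross_val_predict(model, X, y, cv=cv)
proba = cross_val_predict(model, X, y, cv=cv, method="predict_proba")[:, 1]

print("misclassification rate:", 1 - accuracy_score(y, pred))
print("kappa:", cohen_kappa_score(y, pred))
print("AUC:", roc_auc_score(y, proba))
```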
Abstract:
Monitoring and enhancing patient compliance with peritoneal dialysis (PD) is a recurring and problematic theme in the renal literature. A growing body of literature also argues that a failure to understand the patient's perspective of compliance may be contributing to these problems. The aim of this study was to understand the concept of compliance with PD from the patient's perspective. Using the case study approach recommended by Stake (1995), in-depth interviews were conducted with five consenting patients on PD; the interviews explored the meaning of compliance in the context of PD treatment and the lifestyle regimens recommended by health professionals. Participants also discussed factors that influenced their choices to follow, disregard, or refine these regimens. Results indicate that health professionals acting in alignment with individual patient needs and wishes, and demonstrating an awareness of the constraints under which patients operate and the strengths they bring to their treatment, may be the most significant issues to consider with respect to definitions of PD compliance and the development of related compliance interventions. Aspects of compliance that promoted relative normality were also important to the participants in this study and tended to result in greater concordance with health professionals' advice.
Abstract:
Stem cells have attracted tremendous interest in recent times due to their promise in providing innovative new treatments for a great range of currently debilitating diseases. This is due to their potential ability to regenerate and repair damaged tissue, and hence restore lost body function, in a manner beyond the body's usual healing process. Bone marrow-derived mesenchymal stem cells, or bone marrow stromal cells, are one type of adult stem cell of particular interest. Since they are derived from a living human adult donor, they do not carry the ethical issues associated with the use of human embryonic stem cells. They can also be taken from a patient or other donor with relative ease and then grown readily in the laboratory for clinical application. Despite the attractive properties of bone marrow stromal cells, there is presently no quick and easy way to determine the quality of a sample of such cells. Presently, a sample must be grown for weeks and subjected to various time-consuming assays, under the direction of an expert cell biologist, to determine whether it will be useful. Hence there is a great need for innovative new ways to assess the quality of cell cultures for research and potential clinical application.

The research presented in this thesis investigates the use of computerised image processing and pattern recognition techniques to provide a quicker and simpler method for the quality assessment of bone marrow stromal cell cultures. In particular, the aim of this work was to determine whether it is possible, through the use of image processing and pattern recognition techniques, to predict the growth potential of a culture of human bone marrow stromal cells at early stages, before it is readily apparent to a human observer.

With this aim in mind, a computerised system was developed to classify the quality of bone marrow stromal cell cultures based on phase contrast microscopy images. Our system was trained and tested on mixed images of both healthy and unhealthy bone marrow stromal cell samples taken from three different patients. This system, when presented with 44 previously unseen bone marrow stromal cell culture images, outperformed human experts in the ability to correctly classify healthy and unhealthy cultures. The system correctly classified the health status of an image 88% of the time, compared with an average of 72% for human experts. Extensive training and testing of the system on a set of 139 normal-sized images and 567 smaller image tiles showed an average performance of 86% and 85% correct classifications, respectively. The contributions of this thesis include demonstrating the applicability and potential of computerised image processing and pattern recognition techniques for the quality assessment of bone marrow stromal cell cultures. As part of this system, an image normalisation method has been suggested and a new segmentation algorithm has been developed for locating cell regions of irregularly shaped cells in phase contrast images. Importantly, we have validated the efficacy of both the normalisation and the segmentation method by demonstrating that they quantitatively improve the classification performance of subsequent pattern recognition algorithms in discriminating between cell cultures of differing health status. We have shown that the quality of a culture of bone marrow stromal cells may be assessed without the need to either segment individual cells or use time-lapse imaging.
Finally, we have proposed a set of features that, when extracted from the cell regions of segmented input images, can be used to train current state-of-the-art pattern recognition systems to predict the quality of bone marrow stromal cell cultures earlier and more consistently than human experts.
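As a rough illustration of this kind of pipeline, the sketch below (an assumption-laden stand-in, not the system described in the thesis) normalises a phase-contrast image, crudely segments foreground cell regions by Otsu thresholding, extracts a few simple region and intensity features, and hands them to a generic classifier. The particular normalisation step, thresholding rule and feature set are illustrative choices only.

```python
# Minimal, hypothetical sketch: normalise, segment cell regions, extract features, classify.
import numpy as np
from skimage import exposure, filters, measure
from sklearn.svm import SVC


def extract_features(image: np.ndarray) -> np.ndarray:
    """Return a small feature vector from a 2D grayscale phase-contrast image."""
    norm = exposure.equalize_adapthist(image)           # illustrative normalisation (CLAHE)
    mask = norm > filters.threshold_otsu(norm)          # rough foreground/background split
    labels = measure.label(mask)
    regions = measure.regionprops(labels, intensity_image=norm)
    areas = [r.area for r in regions] or [0]
    return np.array([
        mask.mean(),                                    # fraction of image covered by cells
        len(regions),                                   # number of connected cell regions
        float(np.mean(areas)),                          # mean region size
        float(norm[mask].std()) if mask.any() else 0.0, # intensity variation inside regions
    ])


# Hypothetical usage with expert-labelled training images (1 = healthy, 0 = unhealthy):
# X = np.stack([extract_features(img) for img in training_images])
# clf = SVC(kernel="rbf").fit(X, training_labels)
# prediction = clf.predict(extract_features(new_image).reshape(1, -1))
```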
Abstract:
Precise identification of the time when a change in a hospital outcome has occurred enables clinical experts to search for a potential special cause more effectively. In this paper, we develop change point estimation methods for the survival time of a clinical procedure in the presence of patient mix, within a Bayesian framework. We apply Bayesian hierarchical models to formulate the change point where there exists a step change in the mean survival time of patients who underwent cardiac surgery. The data are right censored since the monitoring is conducted over a limited follow-up period. We capture the effect of risk factors prior to the surgery using a Weibull accelerated failure time regression model. Markov chain Monte Carlo is used to obtain posterior distributions of the change point parameters, including the location and magnitude of changes, together with corresponding probabilistic intervals and inferences. The performance of the Bayesian estimator is investigated through simulations, and the results show that precise estimates can be obtained when it is used in conjunction with risk-adjusted survival time CUSUM control charts for different magnitude scenarios. The proposed estimator performs better when a longer follow-up period (censoring time) is applied. In comparison with the alternative built-in CUSUM estimator, more accurate and precise estimates are obtained by the Bayesian estimator. These advantages are enhanced when the probability quantification, flexibility and generalisability of the Bayesian change point detection model are also considered.
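A minimal sketch of this style of model is shown below, written with PyMC on simulated data rather than the authors' code: risk-adjusted Weibull survival times with right censoring at a fixed follow-up limit, and a discrete change point after which the accelerated-failure-time scale shifts. The priors, the single risk covariate and all simulated values are illustrative assumptions.

```python
# Minimal, hypothetical sketch of a Bayesian change point model for right-censored,
# risk-adjusted survival times (PyMC, simulated data; not the authors' implementation).
import numpy as np
import pymc as pm

rng = np.random.default_rng(1)
n, true_tau, follow_up = 200, 120, 365.0
risk = rng.normal(size=n)                           # a single pre-operative risk score
scale = np.exp(5.0 + 0.4 * risk)                    # baseline AFT scale (days)
scale[true_tau:] *= 0.6                             # step change: shorter survival after tau
t_true = scale * rng.weibull(1.5, size=n)
t_obs = np.minimum(t_true, follow_up)               # right censoring at end of follow-up
censored = t_true > follow_up

with pm.Model() as change_point_model:
    tau = pm.DiscreteUniform("tau", lower=1, upper=n - 1)   # change point location
    beta0 = pm.Normal("beta0", 0.0, 10.0)                   # baseline log-scale
    beta_risk = pm.Normal("beta_risk", 0.0, 10.0)           # risk-factor effect
    delta = pm.Normal("delta", 0.0, 10.0)                   # magnitude of the step change
    alpha = pm.Gamma("alpha", 2.0, 1.0)                     # Weibull shape

    after = pm.math.switch(pm.math.ge(np.arange(n), tau), 1.0, 0.0)
    weibull_scale = pm.math.exp(beta0 + beta_risk * risk + delta * after)

    # Observations at the follow-up limit contribute survival probability, others the density.
    pm.Censored("t", pm.Weibull.dist(alpha=alpha, beta=weibull_scale),
                lower=None, upper=np.where(censored, follow_up, np.inf),
                observed=t_obs)

    idata = pm.sample(1000, tune=1000, chains=2, random_seed=1)

print(idata.posterior["tau"].median())               # posterior estimate of the change point
```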
Abstract:
Background: Despite its efficacy and cost-effectiveness, exercise-based cardiac rehabilitation is undertaken by less than one-third of clinically eligible cardiac patients in every country for which data are available. Reasons for non-participation include the unavailability of hospital-based rehabilitation programs, or excessive travel time and distance. For this reason, there have been calls for the development of more flexible alternatives. Methodology and Principal Findings: We developed a system to enable walking-based cardiac rehabilitation in which the patient's single-lead ECG, heart rate, GPS-based speed and location are transmitted by a programmed smartphone to a secure server for real-time monitoring by a qualified exercise scientist. The feasibility of this approach was evaluated in 134 remotely monitored exercise assessment and exercise sessions in cardiac patients unable to undertake hospital-based rehabilitation. Completion rates, rates of technical problems, detection of ECG changes, pre- and post-intervention six-minute walk test (6MWT), cardiac depression and quality of life (QOL) were the key measures. The system was rated as easy and quick to use. It allowed participants to complete six weeks of exercise-based rehabilitation near their homes, worksites, or when travelling. The majority of sessions were completed without any technical problems, although periodic signal loss in areas of poor coverage was an occasional limitation. Several exercise and post-exercise ECG changes were detected. Participants showed improvements comparable to those reported for hospital-based programs, walking significantly further on the post-intervention 6MWT, 637 m (95% CI: 565–726), than on the pre-test, 524 m (95% CI: 420–655), and reporting significantly reduced levels of cardiac depression and significantly improved physical health-related QOL. Conclusions and Significance: The system provided a feasible and very flexible alternative form of supervised cardiac rehabilitation for those unable to access hospital-based programs, with the potential to address a well-recognised deficiency in health care provision in many countries. Future research should assess its longer-term efficacy, cost-effectiveness and safety in larger samples representing the spectrum of cardiac morbidity and severity.
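To illustrate one piece of the real-time monitoring role described above, the following minimal sketch shows how server-side logic might flag heart-rate readings outside a prescribed training range, or gaps in the incoming stream that suggest signal loss. The Sample format and thresholds are hypothetical assumptions, not details of the published system.

```python
# Minimal, hypothetical sketch of server-side session monitoring rules.
from dataclasses import dataclass
from typing import Iterable, List


@dataclass
class Sample:
    timestamp: float        # seconds since session start
    heart_rate_bpm: int


def flag_session(samples: Iterable[Sample],
                 hr_low: int = 90, hr_high: int = 140,
                 max_gap_s: float = 10.0) -> List[str]:
    """Return human-readable alerts for out-of-range heart rate and signal gaps."""
    alerts: List[str] = []
    previous = None
    for s in samples:
        if previous is not None and s.timestamp - previous.timestamp > max_gap_s:
            alerts.append(f"signal gap of {s.timestamp - previous.timestamp:.0f}s "
                          f"before t={s.timestamp:.0f}s")
        if not hr_low <= s.heart_rate_bpm <= hr_high:
            alerts.append(f"heart rate {s.heart_rate_bpm} bpm outside "
                          f"{hr_low}-{hr_high} bpm at t={s.timestamp:.0f}s")
        previous = s
    return alerts


# Example: flag_session([Sample(0, 95), Sample(30, 150), Sample(60, 120)])
```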