577 results for pre-medical
Abstract:
Medical research represents a substantial departure from conventional medical care. Medical care is patient-orientated, with decisions based on the best interests and/or wishes of the person receiving the care. In contrast, medical research is future-directed: it aims primarily to contribute new knowledge about illness or disease, or about interventions, such as drugs, that affect some human condition. Current State and Territory laws and research ethics guidelines in Australia relating to the review of medical research appropriately acknowledge that the functions of medical care and medical research differ. Before a medical research project commences, the study must be reviewed and approved by a Human Research Ethics Committee (HREC). For medical research involving incompetent adults, some jurisdictions require an additional, independent safeguard by way of tribunal or court approval of medical research protocols. This extra review process reflects the uncertainty of medical research involvement and the difficulties surrogate decision-makers of incompetent adults face in making decisions on behalf of others and in weighing the risks and benefits of research involvement. Parents of children face the same difficulties when making decisions about their child's research involvement. However, unlike the position concerning incompetent adults, there are no similar safeguards under Australian law in relation to the approval of medical research involving children. This column questions why this discrepancy exists, with a view to generating further dialogue on the topic.
Abstract:
We propose a computationally efficient, border-pixel-based watermark embedding scheme for medical images. We treat the border pixels of a medical image as the region of non-interest (RONI), since those pixels are of little or no interest to doctors and medical professionals, irrespective of the image modality. Although the RONI is used for embedding, the proposed scheme still keeps distortion in the embedding region at a minimum by using the optimum number of least significant bit-planes for the border pixels. This not only ensures that a watermarked image remains safe for diagnosis, but also helps minimize the legal and ethical concerns of altering all pixels of a medical image in any manner (e.g., reversible or irreversible). The proposed scheme avoids the need for RONI segmentation, which incurs capacity and computational overheads. The performance of the proposed scheme has been compared with a relevant scheme in terms of embedding capacity, image perceptual quality (measured by SSIM and PSNR), and computational efficiency. Our experimental results show that the proposed scheme is computationally efficient, offers an image-content-independent embedding capacity, and maintains good image quality.
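For illustration only, the sketch below shows one way that LSB embedding restricted to border pixels (the RONI) could be implemented; the border width, number of bit-planes, and all names are assumptions made for this example rather than parameters reported in the abstract.

```python
# Illustrative sketch (not the authors' implementation): embed watermark bits
# into the least significant bit-planes of an image's border pixels (RONI).
import numpy as np

def embed_border_lsb(image, bits, border=2, n_planes=2):
    """Embed a bit sequence into the n_planes LSBs of the border pixels."""
    img = image.copy()
    h, w = img.shape
    # Mark the border strips (top, bottom, left, right) as the embedding region.
    mask = np.zeros((h, w), dtype=bool)
    mask[:border, :] = mask[-border:, :] = True
    mask[:, :border] = mask[:, -border:] = True
    coords = np.argwhere(mask)

    capacity = len(coords) * n_planes
    if len(bits) > capacity:
        raise ValueError(f"watermark of {len(bits)} bits exceeds capacity of {capacity}")

    bit_idx = 0
    for r, c in coords:
        for plane in range(n_planes):
            if bit_idx == len(bits):
                return img
            val = int(img[r, c])
            # Clear the target bit-plane, then write the watermark bit into it.
            img[r, c] = (val & ~(1 << plane)) | (int(bits[bit_idx]) << plane)
            bit_idx += 1
    return img

# Toy usage: embed 64 random bits into a synthetic 8-bit image.
rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)
watermark = rng.integers(0, 2, size=64)
marked = embed_border_lsb(image, watermark)
```

Because only the thin border strips are modified, the diagnostically relevant interior pixels are untouched, which is the property the abstract emphasises.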
Abstract:
In this paper we introduce a novel design for a translational medical research ecosystem. Translational medical research is an emerging field of work that aims to bridge the gap between basic medical science research and clinical research/patient care. We analyze the key challenges of digital ecosystems for translational research, based on real-world scenarios posed by the Lab for Translational Research at Harvard Medical School and the Genomics Research Centre at Griffith University, and show how traditional IT approaches fail to meet these challenges. We then introduce our design for a translational research ecosystem. Several key contributions are made: a novel approach to managing ad hoc research ecosystems is introduced; a new security approach for translational research is proposed, which allows each participating site to retain control over its data and to define its own policies to ensure legal and ethical compliance; and a design for a novel interactive access control framework is presented, which allows users to easily share data while adhering to their organization's policies.
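As a rough, hypothetical illustration of the per-site policy idea described above (not the paper's actual framework or API), a site-controlled sharing decision could be sketched as follows; every class, field and rule here is an assumption made for the example.

```python
# Hypothetical sketch of a per-site data-sharing policy check. The classes,
# fields and policy rules are illustrative assumptions, not the paper's design.
from dataclasses import dataclass, field

@dataclass
class SitePolicy:
    site: str
    allowed_purposes: set = field(default_factory=set)  # e.g. {"translational-research"}
    requires_deidentification: bool = True               # site-level compliance rule

@dataclass
class ShareRequest:
    requesting_site: str
    purpose: str
    data_deidentified: bool

def evaluate(policy: SitePolicy, request: ShareRequest) -> bool:
    """Each site evaluates incoming requests against its own policy, so data
    only leaves the owning site's control when the policy is satisfied."""
    if request.purpose not in policy.allowed_purposes:
        return False
    if policy.requires_deidentification and not request.data_deidentified:
        return False
    return True

policy = SitePolicy(site="GenomicsCentre", allowed_purposes={"translational-research"})
request = ShareRequest("TranslationalResearchLab", "translational-research", True)
print(evaluate(policy, request))  # True
```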
Abstract:
Objectives – To describe the development of an educational workshop to develop procedural skills in undergraduate paramedic students using fresh frozen cadavers, and to report the students' assessment of the program. Methods – A six-hour anatomy-based workshop was developed using fresh frozen cadavers to teach a range of airway and invasive procedural skills to second-year undergraduate paramedic students. An embedded QUAN (qual) methodology will be used to evaluate students' satisfaction, perceptions and the quality of teaching compared with other existing clinical teaching techniques such as high-fidelity simulation. Students will be asked to complete an anonymous validated survey (10 questions formulated on a 5-point Likert scale) and to provide qualitative feedback before and after the six-hour workshop. Results – This is a prospective study planned for September 2013. Low-risk human research ethics approval is being sought. Teaching evaluation results from the inaugural 2012 workshop (undergraduate and postgraduate paramedic students) and interim results for 2013 will be presented. Conclusions – Clinical teaching using fresh frozen cadavers has thus far been used predominantly in the education of medical and surgical trainees. A number of studies have found them to be effective and in some cases superior to traditional high-fidelity simulation teaching strategies. Fresh frozen cadavers are said to provide perfect anatomy, normal tissue consistency and a realistic operative training experience (Lloyd, Maxwell-Armstrong et al. 2011). The authors believe that this study will show that the use of fresh frozen cadavers offers a safe and effective mode of teaching procedural skills to student paramedics that will help bridge the skills gap and increase confidence before students undertake such interventions on living patients. A modified training program may be formulated for general practitioners undertaking Emergency Medicine Advanced Rural Skills.
Abstract:
Liuwei Dihuang Wan (LDW), a classic Chinese medicinal formula, has been used to improve or restore declined functions related to aging and geriatric diseases, such as impaired mobility, vision, hearing, cognition and memory. It has attracted increasing attention as one of the most popular and valuable herbal medicines. However, the systematic analysis of the chemical constituents of LDW is difficult and thus has not been well established. In this paper, a rapid, sensitive and reliable ultra-performance liquid chromatography with electrospray ionization quadrupole time-of-flight high-definition mass spectrometry (UPLC-ESI-Q-TOF-MS) method with automated MetaboLynx analysis in positive and negative ion modes was established to characterize the chemical constituents of LDW. The analysis was performed on a Waters UPLC™ HSS T3 column using a gradient elution system. MS/MS fragmentation behavior was proposed to aid the structural identification of the components. Under the optimized conditions, a total of 50 peaks were tentatively characterized by comparing the retention times and MS data. It is concluded that a rapid and robust platform based on UPLC-ESI-Q-TOF-MS has been successfully developed for globally identifying multiple constituents of traditional Chinese medicine prescriptions. This is the first report on the systematic analysis of the chemical constituents of LDW.
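As a toy illustration of the tentative characterization step (matching retention time and accurate mass against reference data, which the study performed far more rigorously with MetaboLynx), consider the following sketch; the compounds, tolerances and peak values are placeholders, not data from the paper.

```python
# Toy sketch of tentative peak assignment by retention time and accurate mass.
# The reference entries, tolerances and peak values are illustrative placeholders.
reference = [
    {"name": "compound_A", "rt_min": 3.2, "mz": 449.1078},
    {"name": "compound_B", "rt_min": 7.8, "mz": 495.1497},
]

def assign(peak_rt, peak_mz, rt_tol=0.2, ppm_tol=10.0):
    """Return reference entries whose retention time and m/z match the peak."""
    hits = []
    for ref in reference:
        ppm_error = abs(peak_mz - ref["mz"]) / ref["mz"] * 1e6
        if abs(peak_rt - ref["rt_min"]) <= rt_tol and ppm_error <= ppm_tol:
            hits.append((ref["name"], round(ppm_error, 2)))
    return hits

print(assign(3.25, 449.1081))  # [('compound_A', 0.67)]
```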
Abstract:
Balancing the competing interests of autonomy and protection of individuals is an escalating challenge confronting an ageing Australian society. Legal and medical professionals are increasingly being asked to determine whether individuals are legally competent/capable of making their own testamentary and substitute decisions, that is, financial and/or personal/health care decisions. No consistent and transparent competency/capacity assessment paradigm currently exists in Australia. Consequently, assessments are currently being undertaken on an ad hoc basis, which is concerning as Australia's population ages and issues of competency/capacity increase. The absence of nationally accepted competency/capacity assessment guidelines and supporting principles results in legal and medical professionals involved with competency/capacity assessment implementing individual processes tailored to their own abilities. Legal and medical approaches differ both between and within the professions. The terminology used also varies. The legal practitioner is concerned with whether the individual has the legal ability to make the decision. A medical practitioner assesses fluctuations in physical and mental abilities. The problem is that the terms competency and capacity are used interchangeably, resulting in confusion about what is actually being assessed. The terminological and methodological differences subsequently create miscommunication and misunderstanding between the professions. Consequently, it is not necessarily a simple solution for a legal professional to seek the opinion of a medical practitioner when assessing testamentary and/or substitute decision-making competency/capacity. This research investigates the effects of the current inadequate testamentary and substitute decision-making assessment paradigm and whether there is a more satisfactory approach. This exploration is undertaken within a framework of therapeutic jurisprudence, which promotes principles fundamentally important in this context. Empirical research has been undertaken, first, to explore the effects of the current process with practising legal and medical professionals and, second, to determine whether miscommunication and misunderstanding actually exist between the professions, such that they give rise to a tense relationship which is not conducive to satisfactory competency/capacity assessments. The necessity of reviewing the adequacy of the existing competency/capacity assessment methodology in the testamentary and substitute decision-making domain will be demonstrated and recommendations made for the development of a suitable process.
Abstract:
Materials used in engineering always contain imperfections or defects which significantly affect their performance. Based on large-scale molecular dynamics simulations and Euler–Bernoulli beam theory, the influence of different pre-existing surface defects on the bending properties of Ag nanowires (NWs) is studied in this paper. It is found that the nonlinear-elastic deformation, as well as the flexural rigidity of the NW, is insensitive to the different surface defects considered here. On the contrary, an evident decrease of the yield strength is observed due to the existence of defects. In-depth inspection of the deformation process reveals that, at the onset of plastic deformation, dislocation embryos initiate from the locations of surface defects, and the plastic deformation is dominated by the nucleation and propagation of partial dislocations at the considered temperature. In particular, the generation of stair-rod partial dislocations and Lomer–Cottrell locks is observed for both perfect and defected NWs. The generation of these structures impedes early yielding of the NW, which explains why more defects do not necessarily mean a lower critical force.
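As a minimal sketch of the beam-theory step, the flexural rigidity can be extracted from the linear part of a simulated force-deflection curve; the sketch below assumes a doubly clamped beam with a central point load and uses synthetic numbers, which may differ from the loading conditions and data of the study.

```python
# Minimal sketch: extract flexural rigidity EI from a force-deflection curve via
# Euler-Bernoulli beam theory, assuming a doubly clamped beam loaded at mid-span
# (delta = F * L**3 / (192 * EI)). Boundary conditions and data are illustrative.
import numpy as np

L = 30e-9                                                 # nanowire length, m
force = np.array([0.0, 2e-9, 4e-9, 6e-9])                 # applied load, N (synthetic)
deflection = np.array([0.0, 0.55e-9, 1.10e-9, 1.65e-9])   # mid-span deflection, m (synthetic)

# In the linear-elastic regime F = (192 * EI / L**3) * delta, so the slope of
# F versus delta gives 192 * EI / L**3.
slope = np.polyfit(deflection, force, 1)[0]
EI = slope * L**3 / 192.0
print(f"flexural rigidity EI = {EI:.3e} N*m^2")
```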
Abstract:
The Australasian Nutrition Care Day Survey (ANCDS) reported that two in five patients in Australian and New Zealand hospitals consume ≤50% of the offered food. The ANCDS found a significant association between poor food intake and increased in-hospital mortality after controlling for confounders (nutritional status, age, disease type and severity). Evidence for the effectiveness of medical nutrition therapy (MNT) in hospital patients eating poorly is lacking. An exploratory study was conducted in respiratory, neurology and orthopaedic wards of an Australian hospital. At baseline, 24-hour food intake (0%, 25%, 50%, 75%, 100% of offered meals) was evaluated for patients hospitalised for ≥2 days and not under dietetic review. Patients consuming ≤50% of offered meals due to nutrition-impact symptoms were referred to ward dietitians for MNT, with food intake re-evaluated on day 7. 184 patients were observed over four weeks. Sixty-two patients (34%) consumed ≤50% of the offered meals. Simple interventions (feeding/menu assistance, diet texture modifications) improved intake to ≥75% in 30 patients, who did not require further MNT. Of the 32 patients referred for MNT, baseline and day-7 data were available for 20 patients (68±17 years, 65% female, BMI 22±5 kg/m², median energy and protein intake 2250 kJ and 25 g respectively). On day 7, 17 participants (85%) demonstrated significantly higher consumption (4300 kJ, 53 g; p<0.01). Three participants demonstrated no improvement due to ongoing nutrition-impact symptoms. "Percentage food intake" was a quick tool to identify patients in whom simple interventions could enhance intake. MNT was associated with improved dietary intake in hospital patients. Further research is needed to establish a causal relationship.
Abstract:
Crashes on motorways contribute to a significant proportion (40-50%) of non-recurrent motorway congestion. Hence, reducing crashes will help address congestion issues (Meyer, 2008). Crash likelihood estimation studies commonly focus on traffic conditions in a short time window around the time of a crash, while longer-term pre-crash traffic flow trends are neglected. In this paper we show, through data mining techniques, that a relationship between pre-crash traffic flow patterns and crash occurrence on motorways exists, and that this knowledge has the potential to improve the accuracy of existing models and opens the path for new development approaches. The data for the analysis were extracted from records collected between 2007 and 2009 on the Shibuya and Shinjuku lines of the Tokyo Metropolitan Expressway in Japan. The dataset includes a total of 824 rear-end and sideswipe crashes that have been matched with the traffic flow data of the hour prior to each crash using an incident detection algorithm. Traffic flow trends (traffic speed/occupancy time series) revealed that crashes could be clustered with regard to the dominant traffic flow pattern prior to the crash. Using the k-means clustering method allowed the crashes to be clustered based on their flow trends rather than their distance. Four major trends were found in the clustering results. Based on these findings, crash likelihood estimation algorithms can be fine-tuned to the monitored traffic flow conditions with a sliding window of 60 minutes to increase the accuracy of the results and minimize false alarms.
Abstract:
Crashes that occur on motorways contribute to a significant proportion (40-50%) of non-recurrent motorway congestion. Hence, reducing the frequency of crashes assists in addressing congestion issues (Meyer, 2008). Crash likelihood estimation studies commonly focus on traffic conditions in a short time window around the time of a crash, while longer-term pre-crash traffic flow trends are neglected. In this paper we show, through data mining techniques, that a relationship between pre-crash traffic flow patterns and crash occurrence on motorways exists. We compare these patterns with normal traffic trends and show that this knowledge has the potential to improve the accuracy of existing models and opens the path for new development approaches. The data for the analysis were extracted from records collected between 2007 and 2009 on the Shibuya and Shinjuku lines of the Tokyo Metropolitan Expressway in Japan. The dataset includes a total of 824 rear-end and sideswipe crashes that have been matched with the corresponding traffic flow data using an incident detection algorithm. Traffic trends (traffic speed time series) revealed that crashes can be clustered with regard to the dominant traffic patterns prior to the crash. The K-Means clustering method with a Euclidean distance function was used to cluster the crashes. Normal-situation data were then extracted based on the time distribution of crashes and clustered for comparison with the "high risk" clusters. Five major trends were found in the clustering results for both high-risk and normal conditions, and the traffic regimes differed in their speed trends. Based on these findings, crash likelihood estimation models can be fine-tuned to the monitored traffic conditions with a sliding window of 30 minutes to increase the accuracy of the results and minimize false alarms.
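As an illustrative sketch of the clustering step described in these two abstracts (K-Means with a Euclidean distance applied to pre-crash speed time series), the snippet below clusters synthetic speed profiles; the data and the choice of five clusters simply mirror the description above and are not the study's dataset.

```python
# Sketch of the clustering step: K-Means (Euclidean distance) applied to
# pre-crash traffic speed time series. The speed profiles below are synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
n_per_group, n_samples = 100, 12          # e.g. 12 five-minute readings ~ one hour
# Two synthetic regimes: speeds steadily dropping before the crash vs free flow.
dropping = 80 - np.linspace(0, 40, n_samples) + rng.normal(0, 3, (n_per_group, n_samples))
freeflow = 80 + rng.normal(0, 3, (n_per_group, n_samples))
speed_series = np.vstack([dropping, freeflow])            # one row per crash

kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(speed_series)
labels = kmeans.labels_                   # dominant pre-crash trend for each crash
centroids = kmeans.cluster_centers_       # mean speed trend of each cluster
print(np.bincount(labels))                # cluster sizes
```

The same procedure applied to speed series sampled away from crash times would give the "normal" clusters used for comparison in the second study.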
Abstract:
It has been reported that poor nutritional status, in the form of weight loss and resulting body mass index (BMI) changes, is an issue in people with Parkinson's disease (PWP). The symptoms resulting from Parkinson's disease (PD) and the side effects of PD medication have been implicated in the aetiology of nutritional decline. However, the evidence on which these claims are based is, on one hand, contradictory, and on the other, restricted primarily to otherwise healthy PWP. Despite the claims that PWP suffer from poor nutritional status, evidence is lacking to inform nutrition-related care for the management of malnutrition in PWP. The aims of this thesis were to better quantify the extent of poor nutritional status in PWP, determine the important factors differentiating the well-nourished from the malnourished, and evaluate the effectiveness of an individualised nutrition intervention on nutritional status.

Phase DBS: Nutritional status in people with Parkinson's disease scheduled for deep-brain stimulation surgery
The pre-operative rate of malnutrition in a convenience sample of people with Parkinson's disease (PWP) scheduled for deep-brain stimulation (DBS) surgery was determined. Poorly controlled PD symptoms may result in a higher risk of malnutrition in this sub-group of PWP. Fifteen patients (11 male, median age 68.0 (42.0 – 78.0) years, median PD duration 6.75 (0.5 – 24.0) years) participated and data were collected during hospital admission for the DBS surgery. The scored PG-SGA was used to assess nutritional status, anthropometric measures (weight, height, mid-arm circumference, waist circumference, body mass index (BMI)) were taken, and body composition was measured using bioelectrical impedance spectroscopy (BIS). Six (40%) of the participants were malnourished (SGA-B), while 53% reported significant weight loss following diagnosis. BMI was significantly different between SGA-A and SGA-B (25.6 vs 23.0 kg/m², p<.05). There were no differences in any other variables, including PG-SGA score and the presence of non-motor symptoms. The conclusion was that malnutrition in this group is higher than that in other studies reporting malnutrition in PWP, and it is under-recognised. As poorer surgical outcomes are associated with poorer pre-operative nutritional status in other surgeries, it might be beneficial to identify patients at nutritional risk prior to surgery so that appropriate nutrition interventions can be implemented.

Phase I: Nutritional status in community-dwelling adults with Parkinson's disease
The rate of malnutrition in community-dwelling adults (>18 years) with Parkinson's disease was determined. One hundred twenty-five PWP (74 male, median age 70.0 (35.0 – 92.0) years, median PD duration 6.0 (0.0 – 31.0) years) participated. The scored PG-SGA was used to assess nutritional status, and anthropometric measures (weight, height, mid-arm circumference (MAC), calf circumference, waist circumference, body mass index (BMI)) were taken. Nineteen (15%) of the participants were malnourished (SGA-B). All anthropometric indices were significantly different between SGA-A and SGA-B (BMI 25.9 vs 20.0 kg/m²; MAC 29.1 vs 25.5 cm; waist circumference 95.5 vs 82.5 cm; calf circumference 36.5 vs 32.5 cm; all p<.05). The PG-SGA score was also significantly higher in the malnourished (2 vs 8 for SGA-A vs SGA-B, p<.05). The nutrition impact symptoms which differentiated between the well-nourished and the malnourished were no appetite, constipation, diarrhoea, problems swallowing and feeling full quickly.
This study concluded that malnutrition in community-dwelling PWP is higher than that documented in community-dwelling elderly (2 – 11%), yet is likely to be under-recognised. Nutrition impact symptoms play a role in reduced intake. Appropriate screening and referral processes should be established for early detection of those at risk.

Phase I: Nutrition assessment tools in people with Parkinson's disease
There are a number of validated and reliable nutrition screening and assessment tools available for use. None of these tools have been evaluated in PWP. In the sample described above, the use of the World Health Organisation (WHO) cut-off (≤18.5 kg/m²), age-specific BMI cut-offs (≤18.5 kg/m² for under 65 years, ≤23.5 kg/m² for 65 years and older) and the revised Mini-Nutritional Assessment short form (MNA-SF) were evaluated as nutrition screening tools. The PG-SGA (including the SGA classification) and the MNA full form were evaluated as nutrition assessment tools, using the SGA classification as the gold standard. For screening, the MNA-SF performed best, with a sensitivity (Sn) of 94.7% and a specificity (Sp) of 78.3%. For assessment, the PG-SGA with a cut-off score of 4 (Sn 100%, Sp 69.8%) performed better than the MNA (Sn 84.2%, Sp 87.7%). As the MNA has been recommended more for use as a nutrition screening tool, the MNA-SF might be more appropriate and take less time to complete. The PG-SGA might be useful to inform and monitor nutrition interventions.

Phase I: Predictors of poor nutritional status in people with Parkinson's disease
A number of assessments were conducted as part of the Phase I research, including those for the severity of PD motor symptoms, cognitive function, depression, anxiety, non-motor symptoms, constipation, freezing of gait and the ability to carry out activities of daily living. A higher score in all of these assessments indicates greater impairment. In addition, information about medical conditions, medications, age, age at PD diagnosis and living situation was collected. These were compared between those classified as SGA-A and as SGA-B. Regression analysis was used to identify which factors were predictive of malnutrition (SGA-B). Differences between the groups included disease severity (more severe disease in 4% of SGA-A vs 21% of SGA-B, p<.05), activities of daily living score (13 SGA-A vs 18 SGA-B, p<.05), depressive symptom score (8 SGA-A vs 14 SGA-B, p<.05) and gastrointestinal symptoms (4 SGA-A vs 6 SGA-B, p<.05). Significant predictors of malnutrition according to SGA were age at diagnosis (OR 1.09, 95% CI 1.01 – 1.18), amount of dopaminergic medication per kg body weight (mg/kg) (OR 1.17, 95% CI 1.04 – 1.31), more severe motor symptoms (OR 1.10, 95% CI 1.02 – 1.19), less anxiety (OR 0.90, 95% CI 0.82 – 0.98) and more depressive symptoms (OR 1.23, 95% CI 1.07 – 1.41). Significant predictors of a higher PG-SGA score included living alone (β=0.14, 95% CI 0.01 – 0.26), more depressive symptoms (β=0.02, 95% CI 0.01 – 0.02) and more severe motor symptoms (β=0.01, 95% CI 0.01 – 0.02). More severe disease is associated with malnutrition, and this may be compounded by lack of social support.

Phase II: Nutrition intervention
Nineteen of the people identified in Phase I as requiring nutrition support were included in Phase II, in which a nutrition intervention was conducted.
Nine participants were in the standard care group (SC), which received an information sheet only, and the other 10 participants were in the intervention group (INT), which received individualised nutrition information and weekly follow-up. The INT group gained 2.2% of starting body weight over the 12-week intervention period, resulting in significant increases in weight, BMI, mid-arm circumference and waist circumference. The SC group gained 1% of starting weight over the 12 weeks, which did not result in any significant changes in anthropometric indices. Energy and protein intake (18.3 kJ/kg vs 3.8 kJ/kg and 0.3 g/kg vs 0.15 g/kg) increased in both groups. The increase in protein intake was only significant in the SC group. The changes in intake did not differ between the groups. There were no significant changes in any motor or non-motor symptoms or in "off" times or dyskinesias in either group. Aspects of quality of life also improved over the 12 weeks, especially emotional well-being. This thesis makes a significant contribution to the evidence base for the presence of malnutrition in Parkinson's disease, as well as for the identification of those who would potentially benefit from nutrition screening and assessment. The nutrition intervention demonstrated that a traditional high-protein, high-energy approach to the management of malnutrition resulted in improved nutritional status and anthropometric indices, with no effect on the presence of Parkinson's disease symptoms and a positive effect on quality of life.
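For reference, the sensitivity and specificity figures quoted in the Phase I screening-tool evaluation above are computed against the SGA classification used as the gold standard; the sketch below shows that calculation on toy data (the counts are illustrative, not the study's).

```python
# How sensitivity and specificity against an SGA "gold standard" are computed.
# The toy screening results below are illustrative, not the study data.
def sensitivity_specificity(screen_positive, malnourished_sga):
    """Both arguments are per-patient lists of booleans."""
    tp = sum(s and m for s, m in zip(screen_positive, malnourished_sga))
    fn = sum(not s and m for s, m in zip(screen_positive, malnourished_sga))
    tn = sum(not s and not m for s, m in zip(screen_positive, malnourished_sga))
    fp = sum(s and not m for s, m in zip(screen_positive, malnourished_sga))
    return tp / (tp + fn), tn / (tn + fp)

# Toy example: 10 patients, 4 malnourished according to SGA.
screen = [True, True, True, False, True, False, False, False, True, False]
sga    = [True, True, True, True,  False, False, False, False, False, False]
sn, sp = sensitivity_specificity(screen, sga)
print(f"Sn = {sn:.0%}, Sp = {sp:.0%}")   # Sn = 75%, Sp = 67%
```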
Abstract:
Background and aims The Australasian Nutrition Care Day Survey (ANCDS) reported that two in five patients consume ≤50% of the offered food in Australian and New Zealand hospitals. After controlling for confounders (nutritional status, age, disease type and severity), the ANCDS also established an independent association between poor food intake and increased in-hospital mortality. This study aimed to evaluate whether medical nutrition therapy (MNT) could improve dietary intake in hospital patients eating poorly. Methods An exploratory pilot study was conducted in the respiratory, neurology and orthopaedic wards of an Australian hospital. At baseline, percentage food intake (0%, 25%, 50%, 75%, and 100%) was evaluated for each main meal and snack over a 24-hour period in patients hospitalised for ≥2 days and not under dietetic review. Patients consuming ≤50% of offered meals due to nutrition-impact symptoms were referred to ward dietitians for MNT. Food intake was re-evaluated on the seventh day following recruitment (post-MNT). Results 184 patients were observed over four weeks; 32 patients were referred for MNT. Although baseline and post-MNT data for 20 participants (68±17 years, 65% female) indicated a significant increase in median energy and protein intake post-MNT (3600 kJ/day, 40 g/day) versus baseline (2250 kJ/day, 25 g/day) (p<0.05), the increased intake met only 50% of dietary requirements. Persistent nutrition-impact symptoms affected intake. Conclusion In this pilot study, whilst dietary intake improved, it remained inadequate to meet participants' estimated requirements due to ongoing nutrition-impact symptoms. Appropriate medical management and early enteral feeding could be a possible solution for such patients.
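The statement that the improved intake still met only 50% of dietary requirements is a simple ratio of measured intake to estimated requirements; the sketch below illustrates that arithmetic using common rule-of-thumb, weight-based requirement factors, which are assumptions for this example and not the estimation method reported in the study.

```python
# Illustrative arithmetic for "percentage of estimated requirements met".
# The per-kg requirement factors and the 60 kg weight are assumptions for
# illustration; they are not the estimation method or data from the study.
def percent_requirements_met(intake_kj, intake_protein_g, weight_kg,
                             kj_per_kg=120.0, protein_g_per_kg=1.0):
    energy_requirement = kj_per_kg * weight_kg            # kJ/day
    protein_requirement = protein_g_per_kg * weight_kg    # g/day
    return (intake_kj / energy_requirement * 100,
            intake_protein_g / protein_requirement * 100)

energy_pct, protein_pct = percent_requirements_met(3600, 40, 60)
print(f"energy: {energy_pct:.0f}% of requirement, protein: {protein_pct:.0f}%")
# energy: 50% of requirement, protein: 67%
```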
Abstract:
Aim To develop clinical practice guidelines for nurse-administered procedural sedation and analgesia in the cardiac catheterisation laboratory. Background Numerous studies have reported that nurse-administered procedural sedation and analgesia is safe. However, the broad scope of existing guidelines for the administration and monitoring of patients who receive sedation during medical procedures without an anaesthetist present means there is a lack of specific guidance regarding optimal nursing practices for the unique circumstances in which nurse-administered procedural sedation and analgesia is used in the cardiac catheterisation laboratory. Methods A sequential mixed-methods design was utilised. Initial recommendations were produced from three studies conducted by the authors: an integrative review; a qualitative study; and a cross-sectional survey. The recommendations were revised in accordance with responses from a modified Delphi study. The first Delphi round was completed by nine senior cardiac catheterisation laboratory nurses. All but one of the draft recommendations met the pre-determined cut-off point for inclusion. There were a total of 59 responses to the second round. Consensus was reached on all recommendations. Implications for nursing The guidelines that were derived from the Delphi study offer twenty-four recommendations within six domains of nursing practice: Pre-procedural assessment; Pre-procedural patient and family education; Pre-procedural patient comfort; Intra-procedural patient comfort; Intra-procedural patient assessment and monitoring; and Post-procedural patient assessment and monitoring. Conclusion These guidelines provide an important foundation towards the delivery of safe, consistent and evidence-based nursing care for the many patients who receive sedation in the cardiac catheterisation laboratory setting.
Abstract:
Introduction Malnutrition is common among hospitalised patients, with poor follow-up of nutrition support post-discharge. Published studies on the efficacy of ambulatory nutrition support (ANS) for malnourished patients post-discharge are scarce. The aims of this study were to evaluate the rate of dietetics follow-up of malnourished patients post-discharge, before (2008) and after (2010) implementation of a new ANS service, and to evaluate nutritional outcomes post-implementation. Materials and Methods Consecutive samples of 261 (2008) and 163 (2010) adult inpatients referred to dietetics and assessed as malnourished using Subjective Global Assessment (SGA) were enrolled. All subjects received inpatient nutrition intervention and dietetic outpatient clinic follow-up appointments. For the 2010 cohort, ANS was initiated to provide telephone follow-up and home visits for patients who failed to attend the outpatient clinic. Subjective Global Assessment, body weight, quality of life (EQ-5D VAS) and handgrip strength were measured at baseline and five months post-discharge. A paired t-test was used to compare pre- and post-intervention results. Results In 2008, only 15% of patients returned for follow-up with a dietitian within four months post-discharge. After implementation of ANS in 2010, the follow-up rate was 100%. Mean weight improved from 44.0 ± 8.5 kg to 46.3 ± 9.6 kg, EQ-5D VAS from 61.2 ± 19.8 to 71.6 ± 17.4, and handgrip strength from 15.1 ± 7.1 kg force to 17.5 ± 8.5 kg force; p<0.001 for all. Seventy-four percent of patients improved in SGA score. Conclusion Ambulatory nutrition support resulted in significant improvements in the follow-up rate, nutritional status and quality of life of malnourished patients post-discharge.
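A minimal sketch of the paired pre/post comparison mentioned in the methods (a paired t-test on baseline versus five-month measurements) is shown below; the weights are synthetic values loosely modelled on the reported means, not the study data.

```python
# Sketch of the paired pre/post comparison using a paired t-test (scipy.stats.ttest_rel).
# The weights below are synthetic, loosely modelled on the reported means.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
weight_baseline = rng.normal(44.0, 8.5, size=30)               # kg, at discharge
weight_5_months = weight_baseline + rng.normal(2.3, 1.5, 30)   # kg, at follow-up

t_stat, p_value = stats.ttest_rel(weight_5_months, weight_baseline)
print(f"mean change = {np.mean(weight_5_months - weight_baseline):.1f} kg, p = {p_value:.4f}")
```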
Hepatitis C, mental health and equity of access to antiviral therapy: a systematic narrative review
Abstract:
Introduction Access to hepatitis C (hereafter HCV) antiviral therapy has commonly excluded populations with mental health and substance use disorders because they were considered to have contraindications to treatment, particularly due to the neuropsychiatric effects of interferon that can occur in some patients. In this review we examined access to HCV interferon antiviral therapy by populations with mental health and substance use problems in order to identify the evidence and reasons for exclusion. Methods We searched the following major electronic databases for relevant articles: PsycINFO, Medline, CINAHL, Scopus and Google Scholar. The inclusion criteria comprised studies of adults aged 18 years and older, peer-reviewed articles, a date range of 2002–2012 (to include articles since the introduction of pegylated interferon with ribavirin), and English language. We excluded articles about HCV populations with medical co-morbidities, such as hepatitis B (hereafter HBV) and human immunodeficiency virus (hereafter HIV), because the clinical treatment, pathways and psychosocial morbidity differ from populations with HCV alone. We identified 182 articles, of which 13 met the eligibility criteria. Using a systematic narrative review approach, we identified major themes in the literature. Results Three main themes were identified: (1) pre-treatment and preparation for antiviral therapy, (2) adherence and treatment completion, and (3) clinical outcomes. Each of these themes was critically discussed in terms of access by patients with mental health and substance use co-morbidities. The current research evidence clearly demonstrates that people with HCV, mental health and substance use co-morbidities have similar clinical outcomes to those without these co-morbidities. Conclusions While the research evidence is largely supportive of increased access to interferon for people with HCV, mental health and substance use co-morbidities, substantial further work is required to translate evidence into clinical practice. Further, we conclude that the appropriateness of the tertiary health service model of care for interferon management requires reconsideration, and that the potential for increased HCV care in primary health care settings should be explored.