915 results for Risk Identification
Abstract:
High-risk adolescents are the most vulnerable to the negative outcomes of risk-taking behaviour, such as injury. Jessor (1987) theorised that adolescent risk behaviours (e.g. violence, alcohol use) can be predicted by assessing the risk factors (e.g. peer models for violence) and protective factors (e.g. school connectedness) in a young person's life. The aim of this research is to examine the influence of risk factors and protective factors on the proneness of high-risk adolescents to engage in risky behaviour. A total of 2,521 Grade 9 students (13-14 years of age) from 35 schools in Queensland, Australia participated in this study. The findings examine the influence of risk factors and protective factors on self-reported risky behaviour and injury experiences for adolescents categorised as high-risk, thereby providing insight that may be used to target preventive interventions aimed at high-risk adolescents.
Abstract:
In this paper we argue that rationalist ‘predict then act’ approaches to disaster risk management (DRM) policy promote unrealistic public expectations of DRM provisions, the avoidance of decision making by political elites, an over-reliance on technical expertise and engineering solutions to reducing exposure to natural events, and a reactive approach to DRM overall. We propose an alternative incrementalist approach that focuses on managing uncertainties rather than reducing them and building resilience not simply through the reduction of hazard exposure, but also through the ongoing reduction of community vulnerability, the explicit consideration of normative priorities, and more effective community engagement in climate risk debates.
Abstract:
Background Single nucleotide polymorphisms (SNPs) rs429358 (ε4) and rs7412 (ε2), both invoking changes in the amino-acid sequence of the apolipoprotein E (APOE) gene, have previously been tested for association with multiple sclerosis (MS) risk. However, none of these studies was sufficiently powered to detect modest effect sizes at acceptable type-I error rates. As both SNPs are only imperfectly captured on commonly used microarray genotyping platforms, their evaluation in the context of genome-wide association studies has been hindered until recently. Methods We genotyped 12 740 subjects hitherto not studied for their APOE status, imputed raw genotype data from 8739 subjects from five independent genome-wide association study datasets using the most recent high-resolution reference panels, and extracted genotype data for 8265 subjects from previous candidate gene assessments. Results Despite sufficient power to detect associations at genome-wide significance thresholds across a range of ORs, our analyses did not support a role of rs429358 or rs7412 in MS susceptibility. This included meta-analyses of the combined data across 13 913 MS cases and 15 831 controls (OR=0.95, p=0.259, and OR=1.07, p=0.0569, for rs429358 and rs7412, respectively). Conclusion Given the large sample size of our analyses, it is unlikely that the two APOE missense SNPs studied here exert any relevant effects on MS susceptibility.
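The odds ratios and p-values reported above are standard allelic association statistics. As a hedged illustration of how such a figure is computed, the sketch below runs a Wald test on a 2×2 allele-count table; the counts are invented for demonstration and are not the study's data.

```python
# Wald test for an allelic odds ratio from a 2x2 allele-count table.
# The counts passed below are illustrative only, not the study's data.
import math

def odds_ratio_test(a, b, c, d):
    """a,b = risk/ref allele counts in cases; c,d = the same in controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)      # SE of log(OR)
    z = math.log(or_) / se
    p = math.erfc(abs(z) / math.sqrt(2))       # two-sided normal p-value
    return or_, p

or_, p = odds_ratio_test(1200, 8800, 1300, 8700)
print(f"OR={or_:.3f}, p={p:.3f}")
```

A real meta-analysis would additionally pool per-study log-ORs with inverse-variance weights, but the per-table test above is the building block.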
Abstract:
Recent road safety statistics show that the decades-long downward trend in fatalities is stopping and stagnating. Statistics further show that crashes are mostly driven by human error, compared to other factors such as environmental conditions and mechanical defects. Within human error, the dominant source is perceptive errors, which represent about 50% of the total. The next two sources are interpretation and evaluation, which together with perception account for more than 75% of human-error-related crashes. These statistics show that allowing drivers to perceive and understand their environment better, or supplementing them when they are clearly at fault, is a path to better assessment of road risk and, as a consequence, to further decreases in fatalities. To address this problem, currently deployed driving assistance systems combine more and more information from diverse sources (sensors) to enhance the driver's perception of their environment. However, because of inherent limitations in range and field of view, these systems' perception remains largely limited to a small zone of interest around a single vehicle. Such limitations can be overcome by enlarging the zone of interest through a cooperative process. Cooperative Systems (CS), a specific subset of Intelligent Transportation Systems (ITS), aim at compensating for local systems' limitations by combining embedded information technology and intervehicular communication (IVC) technology. With CS, information sources are no longer limited to a single vehicle. From this distribution arises the concept of extended, or augmented, perception. Augmented perception extends an actor's perceptive horizon beyond its "natural" limits by fusing not only information from multiple in-vehicle sensors but also information obtained from remote sensors. The end result of an augmented perception and data fusion chain is known as an augmented map.
It is a repository where any relevant information about objects in the environment, and the environment itself, can be stored in a layered architecture. This thesis aims at demonstrating that augmented perception performs better than non-cooperative approaches, and that it can be used to successfully identify road risk. We found it necessary to evaluate the performance of augmented perception in order to better understand its limitations. Indeed, while many promising results have already been obtained, the feasibility of building an augmented map from exchanged local perception information, and then using this information beneficially for road users, had not yet been thoroughly assessed; nor had the limitations of augmented perception and its underlying technologies. Most notably, many questions remain unanswered as to IVC performance and its ability to deliver appropriate quality of service to support life-critical safety systems. This is especially true as the road environment is a complex, highly variable setting where many sources of imperfection and error exist, not limited to IVC. We first provide a discussion of these limitations and a performance model built to incorporate them, created from empirical data collected on test tracks. Our results are more pessimistic than the existing literature, suggesting that IVC limitations have been underestimated. Then, we develop a new simulation architecture for CS applications. This architecture is used to obtain new results on the safety benefits of a cooperative safety application (EEBL), and then to support further study of augmented perception. We first confirm earlier results in terms of reduced crash numbers, but raise doubts about benefits in terms of crash severity. In the next step, we implement an augmented perception architecture tasked with creating an augmented map.
Our approach aims at providing a generalist architecture that can use many different types of sensors to create the map, and that is not limited to any specific application. The data association problem is tackled with a multiple hypothesis tracking (MHT) approach based on belief theory. Then, augmented and single-vehicle perception are compared in a reference driving scenario for risk assessment, taking into account the IVC limitations obtained earlier; we show their impact on the augmented map's performance. Our results show that augmented perception performs better than non-cooperative approaches, almost tripling the advance warning time before a crash. IVC limitations appear to have no significant effect on this performance, although this might hold only for our specific scenario. Finally, we propose a new approach using augmented perception to identify road risk through a surrogate: near-miss events. A CS-based approach is designed and validated to detect near-miss events, and then compared to a non-cooperative approach based on vehicles equipped with local sensors only. The cooperative approach shows a significant improvement in the number of events that can be detected, especially at higher rates of system deployment.
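The layered augmented-map repository described above can be sketched as a minimal data structure that records, for each tracked object, whether it came from a local sensor or from an IVC message. This is an illustrative reconstruction under assumptions, not the thesis's actual implementation; the class, layer, and field names are all hypothetical.

```python
# Minimal sketch of a layered "augmented map": each layer stores objects
# keyed by id, and each entry records its source ("local" sensor or "ivc"
# remote message). Fusion here is a naive last-write-wins stub; the thesis
# uses an MHT/belief-theory scheme that this sketch does not reproduce.
import time
from dataclasses import dataclass, field

@dataclass
class MapObject:
    obj_id: str
    position: tuple            # (x, y) in a shared map frame
    source: str                # "local" or "ivc"
    timestamp: float = field(default_factory=time.time)

class AugmentedMap:
    def __init__(self):
        self.layers = {"infrastructure": {}, "vehicles": {}, "hazards": {}}

    def update(self, layer, obj):
        self.layers[layer][obj.obj_id] = obj   # last-write-wins fusion stub

    def objects(self, layer):
        return list(self.layers[layer].values())

amap = AugmentedMap()
amap.update("vehicles", MapObject("ego", (0.0, 0.0), "local"))
amap.update("vehicles", MapObject("v42", (85.0, 3.5), "ivc"))  # beyond local sensor range
print(len(amap.objects("vehicles")))
```

The point of the structure is visible in the last update: the remote vehicle would be invisible to a purely local perception system, which is exactly the "extended perceptive horizon" the abstract describes.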
Abstract:
Background: To report the incidence and risk factors for hypotony and estimate the risk of sympathetic ophthalmia following diode laser trans-scleral cyclophotocoagulation (TSCPC). Design: Retrospective study using data from a private tertiary glaucoma clinic and review of the literature. Participants: Seventy eyes of 70 patients with refractory glaucoma who received TSCPC treatment. Methods: Review of the records of consecutive patients who underwent TSCPC by a single ophthalmic surgeon and review of the literature. Main Outcome Measures: Hypotony (including phthisis bulbi), sympathetic ophthalmia. Results: Seven eyes (10%; CI 5-19%) developed hypotony, including 4 eyes that developed phthisis. Higher total energy delivered during TSCPC treatment was associated with an increased risk of hypotony: eyes that developed hypotony received a mean total energy of 192.5 ± 73.2 joules, compared to a mean of 152.9 ± 83.2 joules in hypotony-free cases. The difference in mean energy delivered between the hypotony and non-hypotony groups was 38.53 (95% CI: -27.57 to 104.63). The risk of sympathetic ophthalmia estimated from a review of the published literature and the current series was one in 1512, or 0.07% (CI 0.03%-0.17%). Conclusions: Total laser energy is one of several risk factors that act in a sufficient-component cause model to produce hypotony in an individual patient. The small sample size precluded inference for other individual putative risk factors, but titrating laser energy may help decrease the occurrence of hypotony. The risk of sympathetic ophthalmia calculated from the literature is likely an overestimate caused by publication bias.
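The mean difference and 95% CI reported above follow the usual two-sample formula. The sketch below uses the group means and SDs from the abstract with group sizes of 7 and 63 (derivable from 7 hypotony eyes out of 70) and a normal-approximation interval; the paper likely used a t-quantile, which gives a wider interval than this sketch, so the numbers will not match exactly.

```python
# Normal-approximation 95% CI for a difference in means, of the kind
# reported above. Group sizes (7 vs 63) are inferred from the abstract;
# a small-sample t-based interval would be wider.
import math

def mean_diff_ci(m1, s1, n1, m2, s2, n2, z=1.96):
    diff = m1 - m2
    se = math.sqrt(s1**2 / n1 + s2**2 / n2)    # Welch standard error
    return diff, (diff - z * se, diff + z * se)

diff, (lo, hi) = mean_diff_ci(192.5, 73.2, 7, 152.9, 83.2, 63)
print(f"diff={diff:.1f} J, 95% CI ({lo:.1f}, {hi:.1f})")
```

Note that, as in the paper, the interval crosses zero: the association between higher energy and hypotony is suggestive but not statistically conclusive at this sample size.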
Abstract:
Australian governments face the twin challenges of dealing with extreme weather-related disasters (such as floods and bushfires) and adapting to the impacts of climate change. These challenges are connected, so any response would benefit from a more integrated approach across and between the different levels of government. This report summarises the findings of an NCCARF-funded project that addresses this problem. The project undertook a three-way comparative case study of the 2009 Victorian bushfires, the 2011 Perth Hills bushfires, and the 2011 Brisbane floods. It collected data from the official inquiry reports into each of these events, and conducted new interviews and workshops with key stakeholders. The findings of this project included recommendations that range from the conceptual to the practical. First, it was argued that a reconceptualisation of terms such as ‘community’ and ‘resilience’ was necessary to allow for more tailored responses to varying circumstances. Second, it was suggested that the high level of uncertainty inherent in disaster risk management and climate change adaptation requires a more iterative approach to policymaking and planning. Third, some specific institutional reforms were proposed that included: 1) a new funding mechanism that would encourage collaboration between and across different levels of government, as well as promoting partnerships with business and the community; 2) improving community engagement through new resilience grants run by local councils; 3) embedding climate change researchers within disaster risk management agencies to promote institutional learning; and 4) creating an inter-agency network that encourages collaboration between organisations.
Abstract:
Emergency management and climate change adaptation will increasingly challenge all levels of government because of three main factors. First, Australia is extremely vulnerable to the impacts of climate change, particularly through the increasing frequency, duration and/or intensity of disasters such as floods and bushfires. Second, the system of government that divides powers by function and level can often act as a barrier to a well-integrated response. Third, policymaking processes struggle to cope with such complex inter-jurisdictional issues. This paper discusses these factors and explores the nature of the challenge for Australian governments. Investigations into the 2009 Victorian bushfires, the 2011 Perth Hills bushfires, and the 2011 Brisbane floods offer an indication of the challenges ahead, and it is argued that there is a need to: improve community engagement and communication; refocus attention on resilience; improve interagency communication and collaboration; and develop institutional arrangements that support continual improvement and policy learning. These findings offer an opportunity for improving responses as well as a starting point for integrating disaster risk management and climate change adaptation policies. The paper is based on the preliminary findings of an NCCARF-funded research project, ‘The Right Tool for the Job: Achieving climate change adaptation outcomes through improved disaster management policies, planning and risk management strategies’, involving Griffith University and RMIT. It should be noted from the outset that the purpose of this research project is not to criticise the actions of emergency service workers and volunteers, who do an incredible job under extreme circumstances, often risking their own lives in the process. The aim is simply to offer emergency management agencies the opportunity to step back and rethink their overall approach to the challenge they face in the light of the impacts of climate change.
Abstract:
High-density SNP arrays can be used to identify DNA copy number changes in tumors, such as homozygous deletions of tumor suppressor genes and focal amplifications of oncogenes. Illumina Human CNV370 BeadChip arrays were used to assess the genome for unbalanced chromosomal events occurring in 39 cell lines derived from stage III metastatic melanomas. A number of genes previously recognized to have an important role in the development and progression of melanoma were identified, including homozygous deletions of CDKN2A (13 of 39 samples), CDKN2B (10 of 39), PTEN (3 of 39), PTPRD (3 of 39), TP53 (1 of 39), and amplifications of CCND1 (2 of 39), MITF (2 of 39), MDM2 (1 of 39), and NRAS (1 of 39). In addition, a number of focal homozygous deletions potentially targeting novel melanoma tumor suppressor genes were identified. Because of their likely functional significance for melanoma progression, FAS, CH25H, BMPR1A, ACTA2, and TFG were investigated in a larger cohort of melanomas through sequencing. Nonsynonymous mutations were identified in BMPR1A (1 of 43), ACTA2 (3 of 43), and TFG (5 of 103). A number of potentially important mutation events occurred in TFG, including the identification of a mini mutation "hotspot" at amino acid residue 380 (P380S and P380L) and the presence of multiple mutations in two melanomas. Mutations in TFG may have important clinical relevance for current therapeutic strategies to treat metastatic melanoma.
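Copy-number losses of the kind listed above are commonly derived from probe-level log R ratios (LRR) on SNP arrays. As a hedged sketch of the general idea only, and not the study's actual calling pipeline, a gene-level call can be made by averaging LRR over a gene's probes and comparing against thresholds; the thresholds and values below are illustrative.

```python
# Illustrative gene-level copy-number call from SNP-array log R ratios.
# Thresholds are hypothetical example values, not the study's pipeline.
def call_copy_number(lrr_values, homozygous_cut=-2.0, hemizygous_cut=-0.5):
    mean_lrr = sum(lrr_values) / len(lrr_values)
    if mean_lrr <= homozygous_cut:
        return "homozygous deletion"     # both copies lost (e.g. CDKN2A-like)
    if mean_lrr <= hemizygous_cut:
        return "hemizygous deletion"     # one copy lost
    return "no loss"

print(call_copy_number([-2.4, -2.1, -2.6]))   # deep focal loss
print(call_copy_number([-0.1, 0.05, -0.02]))  # normal diploid signal
```

Real callers (e.g. segmentation-based methods) also use B-allele frequency and smooth across neighbouring probes, which this sketch omits.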
Abstract:
Fatigue/sleepiness is recognised as an important contributory factor in fatal and serious-injury road traffic incidents (RTIs); however, identifying fatigue/sleepiness as a causal factor remains an uncertain science. Within Australia, police officers attending an RTI report the causal factors; one option is fatigue/sleepiness. In some Australian jurisdictions, police incident databases are subject to post hoc analysis using a proxy definition for fatigue/sleepiness. This secondary analysis identifies further RTIs caused by fatigue/sleepiness not initially identified by attending officers. The current study investigates the efficacy of such proxy definitions for attributing fatigue/sleepiness as an RTI causal factor. Over 1600 Australian drivers were surveyed regarding their experience of, and involvement in, fatigue/sleep-related RTIs and near-misses during the past five years. The majority of participants (66.0%) had experienced driving while fatigued/sleepy. Fatigue/sleep-related near-misses were reported by 19.1% of participants, with 2.4% having been involved in a fatigue/sleep-related RTI. Examination of the characteristics of the most recent event (either a near-miss or a crash) found that the largest proportion of incidents (28.0%) occurred when commuting to or from work, followed by social activities (25.1%), holiday travel (19.8%), and work purposes (10.1%). The fatigue/sleep-related RTI and near-miss experience of a representative sample of Australian drivers does not reflect the proxy definitions used for fatigue/sleepiness identification. In particular, RTIs that occur in urban areas and at low speeds may not be identified. While it is important to have a strategy for identifying fatigue/sleepiness-related RTIs, proxy measures appear best suited to identifying specific subsets of such RTIs.
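Proxy definitions of the kind discussed above are typically rule-based filters applied to crash records. The exact criteria vary by jurisdiction; the rule below is a hypothetical illustration of the general shape (night-time window, or single-vehicle crash in a high-speed zone), not the definition used by any Australian police database.

```python
# Hypothetical illustrative proxy rule for flagging fatigue/sleep-related
# incidents in a crash database. Thresholds are examples, not any
# jurisdiction's actual criteria.
def proxy_fatigue(hour, speed_zone_kmh, single_vehicle):
    night = hour >= 22 or hour < 6                        # late-night window
    high_speed_runoff = single_vehicle and speed_zone_kmh >= 80
    return night or high_speed_runoff

# A daytime, low-speed, multi-vehicle urban near-miss is missed by the
# proxy, matching the abstract's concern about under-identification.
print(proxy_fatigue(hour=14, speed_zone_kmh=60, single_vehicle=False))
print(proxy_fatigue(hour=2, speed_zone_kmh=100, single_vehicle=True))
```

This makes the abstract's finding concrete: any incident outside the rule's windows is invisible to the proxy, however fatigued the driver actually was.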
Abstract:
The mechanistic details of the pathogenesis of Chlamydia, an obligate intracellular pathogen of global importance, have eluded scientists due to the scarcity of traditional molecular genetic tools to investigate this organism. Here we report a chemical biology strategy that has uncovered the first essential protease for this organism. Identification and application of a unique CtHtrA inhibitor (JO146) to cultures of Chlamydia resulted in a complete loss of viable elementary body formation. JO146 treatment during the replicative phase of development resulted in a loss of Chlamydia cell morphology, diminishing inclusion size, and ultimately loss of inclusions from the host cells. This completely prevented the formation of viable Chlamydia elementary bodies. In addition to its effect on the human C. trachomatis strain, JO146 inhibited the viability of the mouse strain, Chlamydia muridarum, both in vitro and in vivo. Thus, we report a chemical biology approach to establish an essential role for Chlamydia CtHtrA. The function of CtHtrA appears to be essential for maintenance of cell morphology during the replicative phase, and these findings provide proof of concept that proteases can be targeted for antimicrobial therapy against intracellular pathogens.
Abstract:
The need to address on-road motorcycle safety in Australia is important due to the disproportionately high percentage of riders and pillions killed and injured each year. One approach to preventing motorcycle-related injury is through training and education. However, motorcycle rider training lacks empirical support as an effective road safety countermeasure to reduce crash involvement. Previous reviews have highlighted that risk-taking is a contributing factor in many motorcycle crashes, rather than merely a lack of vehicle-control skills (Haworth & Mulvihill, 2005; Jonah, Dawson & Bragg, 1982; Watson et al., 1996). Hence, though the basic vehicle-handling skills and knowledge of road rules taught in most traditional motorcycle licence training programs may be seen as an essential condition of safe riding, they do not appear to be sufficient in terms of crash reduction. With this in mind, there is considerable scope for improving the focus and content of rider training and education programs. This program of research examined an existing traditional pre-licence motorcycle rider training program and formatively evaluated the addition of a new classroom-based module to address risky riding: the Three Steps to Safer Riding program. The pilot program was delivered in the real-world context of the Q-Ride motorcycle licensing system in the state of Queensland, Australia. Three studies were conducted as part of the program of research: Study 1, a qualitative investigation of delivery practices and student learning needs in an existing rider training course; Study 2, an investigation of the extent to which an existing motorcycle rider training course addressed risky riding attitudes and motives; and Study 3, a formative evaluation of the new program. A literature review as well as the investigation of learning needs for motorcyclists in Study 1 aimed to inform the initial planning and development of the Three Steps to Safer Riding program.
Findings from Study 1 suggested that the training delivery protocols used by the industry partner training organisation were consistent with a learner-centred approach and largely met the learning needs of trainee riders. However, it also found that information from the course needs to be reinforced by on-road experiences for some riders once licensed and that personal meaning for training information was not fully gained until some riding experience had been obtained. While this research informed the planning and development of the new program, a project team of academics and industry experts were responsible for the formulation of the final program. Study 2 and Study 3 were conducted for the purpose of formative evaluation and program refinement. Study 2 served primarily as a trial to test research protocols and data collection methods with the industry partner organisation and, importantly, also served to gather comparison data for the pilot program which was implemented with the same rider training organisation. Findings from Study 2 suggested that the existing training program of the partner organisation generally had a positive (albeit small) effect on safety in terms of influencing attitudes to risk taking, the propensity for thrill seeking, and intentions to engage in future risky riding. However, maintenance of these effects over time and the effects on riding behaviour remain unclear due to a low response rate upon follow-up 24 months after licensing. Study 3 was a formative evaluation of the new pilot program to establish program effects and possible areas for improvement. Study 3a examined the short term effects of the intervention pilot on psychosocial factors underpinning risky riding compared to the effects of the standard traditional training program (examined in Study 2). 
It showed that the course which included the Three Steps to Safer Riding program elicited significantly greater positive attitude change towards road safety than the existing standard licensing course. This effect was found immediately following training, and mean scores for attitudes towards safety were also maintained at the 12 month follow-up. The pilot program also had an immediate effect on other key variables such as risky riding intentions and the propensity for thrill seeking, although not significantly greater than the traditional standard training. A low response rate at the 12 month follow-up unfortunately prevented any firm conclusions being drawn regarding the impact of the pilot program on self-reported risky riding once licensed. Study 3a further showed that the use of intermediate outcomes such as self-reported attitudes and intentions for evaluation purposes provides insights into the mechanisms underpinning risky riding that can be changed by education and training. A multifaceted process evaluation conducted in Study 3b confirmed that the intervention pilot was largely delivered as designed, with course participants also rating most aspects of training delivery highly. The complete program of research contributed to the overall body of knowledge relating to motorcycle rider training, with some potential implications for policy in the area of motorcycle rider licensing. A key finding of the research was that psychosocial influences on risky riding can be shaped by structured education that focuses on awareness raising at a personal level and provides strategies to manage future riding situations. However, the formative evaluation was mainly designed to identify areas of improvement for the Three Steps to Safer Riding program and found several areas of potential refinement to improve future efficacy of the program. This included aspects of program content, program delivery, resource development, and measurement tools. 
The planned future follow-up of program participants' official crash and traffic offence records over time may lend further support for the application of the program within licensing systems. The findings reported in this thesis offer an initial indication that the Three Steps to Safer Riding is a useful resource to accompany skills-based training programs.
Abstract:
BACKGROUND: The prevalence of protein-energy malnutrition in older adults is reported to be as high as 60% and is associated with poor health outcomes. Inadequate feeding assistance and mealtime interruptions may contribute to malnutrition and poor nutritional intake during hospitalisation. Despite being widely implemented in practice in the United Kingdom and increasingly in Australia, there have been few studies examining the impact of strategies such as Protected Mealtimes and dedicated feeding assistant roles on nutritional outcomes of elderly inpatients. AIMS: The aim of this research was to implement and compare three system-level interventions designed to specifically address mealtime barriers and improve energy intakes of medical inpatients aged ≥65 years. This research also aimed to evaluate the sustainability of any changes to mealtime routines six months post-intervention and to gain an understanding of staff perceptions of the post-intervention mealtime experience. METHODS: Three mealtime assistance interventions were implemented in three medical wards at the Royal Brisbane and Women's Hospital: 1) AIN-only: an additional assistant-in-nursing (AIN) with a dedicated nutrition role; 2) PM-only: a multidisciplinary approach to meals, including Protected Mealtimes; and 3) PM+AIN: a combined intervention (AIN plus the multidisciplinary approach to meals). An action research approach was used to carefully design and implement the three interventions in partnership with ward staff and managers. Significant time was spent in consultation with staff throughout the implementation period to facilitate ownership of the interventions and increase the likelihood of successful implementation. A pre-post design was used to compare the implementation and nutritional outcomes of each intervention to a pre-intervention group.
Using the same wards, eligible participants (medical inpatients aged ≥65 years) were recruited to the pre-intervention group between November 2007 and March 2008 and to the intervention groups between January and June 2009. The primary nutritional outcome was daily energy and protein intake, which was determined by visually estimating plate waste at each meal and mid-meal on Day 4 of admission. Energy and protein intakes were compared between the pre- and post-intervention groups. Data were collected on a range of covariates (demographics, nutritional status and known risk factors for poor food intake), which allowed for multivariate analysis of the impact of the interventions on nutritional intake. The provision of mealtime assistance to participants and activities of ward staff (including mealtime interruptions) were observed in the pre-intervention and intervention groups, with staff observations repeated six months post-intervention. Focus groups were conducted with nursing and allied health staff in June 2009 to explore their attitudes and behaviours in response to the three mealtime interventions. These focus group discussions were analysed using thematic analysis. RESULTS: A total of 254 participants were recruited to the study (pre-intervention: n=115, AIN-only: n=58, PM-only: n=39, PM+AIN: n=42). Participants had a mean age of 80 years (SD 8); 40% (n=101) were malnourished on hospital admission, 50% (n=108) had anorexia and 38% (n=97) required some assistance at mealtimes. Occasions of mealtime assistance significantly increased in all interventions (p<0.01). However, no change was seen in mealtime interruptions. No significant difference was seen in mean total energy and protein intake between the pre-intervention and intervention groups.
However, when total kilojoule intake was compared with estimated requirements at the individual level, participants in the intervention groups were more likely to achieve adequate energy intake (OR=3.4, p=0.01), with no difference noted between interventions (p=0.29). Despite small improvements in nutritional adequacy, the majority of participants in the intervention groups (76%, n=103) had inadequate energy intakes to meet their estimated energy requirements. Patients with cognitive impairment or feeding dependency appeared to gain substantial benefit from mealtime assistance interventions. The increase in occasions of mealtime assistance by nursing staff during the intervention period was maintained six months post-intervention. Staff focus groups highlighted the importance of clearly designating and defining mealtime responsibilities in order to provide adequate mealtime care. While the purpose of the dedicated feeding assistant was to increase levels of mealtime assistance, staff indicated that responsibility for mealtime duties may have merely shifted from nursing staff to the assistant. Implementing the multidisciplinary interventions empowered nursing staff to "protect" the mealtime from external interruptions, but further work is required to empower nurses to prioritise mealtime activities within their own work schedules. Staff reported an increase in the profile of nutritional care on all wards, with additional non-nutritional benefits noted, including improved mobility and functional independence, and better identification of swallowing difficulties. IMPLICATIONS: The PhD research provides clinicians with practical strategies to immediately introduce change to deliver better mealtime care in the hospital setting, and, as such, has initiated local and state-wide roll-out of mealtime assistance programs.
Improved nutritional intakes of elderly inpatients were observed; however, given the modest effect size and decreasing lengths of hospital stay, better nutritional outcomes may be achieved by targeting the hospital-to-home transition period. Findings from this study suggest that mealtime assistance interventions for elderly inpatients with cognitive impairment and/or functional dependency show promise.
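The individual-level adequacy check described in this abstract compares each patient's observed intake against an estimated requirement. As a minimal sketch of the idea only: the 105 kJ/kg/day factor below is a generic illustrative value, not the study's actual estimation equation.

```python
# Sketch of an individual-level energy adequacy check. The per-kilogram
# requirement factor is a hypothetical illustrative value, not the
# equation used in the study.
def energy_adequate(intake_kj, weight_kg, req_kj_per_kg=105, threshold=1.0):
    requirement = weight_kg * req_kj_per_kg        # estimated daily need
    return intake_kj / requirement >= threshold

print(energy_adequate(intake_kj=5500, weight_kg=70))  # below requirement
print(energy_adequate(intake_kj=8000, weight_kg=70))  # meets requirement
```

Comparing at the individual level, as here, can reveal improvements that a ward-level mean intake comparison masks, which is consistent with the study finding an effect on adequacy (OR=3.4) despite no change in mean intake.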
Abstract:
Maternally inherited diabetes and deafness (MIDD) is an autosomal dominant inherited syndrome caused by the mitochondrial DNA (mtDNA) nucleotide mutation A3243G. It affects various organs, including the eye, with external ophthalmoparesis, ptosis, and bilateral macular pattern dystrophy [1, 2]. The prevalence of retinal involvement in MIDD is high, with 50% to 85% of patients exhibiting some macular changes [1]. Those changes, however, can vary dramatically between patients and within families based on the percentage of retinal mtDNA mutations, making it difficult to give predictions on an individual's visual prognosis...
Abstract:
Public policymakers are caught in a dilemma: there is a growing list of urgent issues to address, at the same time that public expenditure is being cut. Adding to this dilemma is a system of government designed in the 19th century and competing theories of policymaking dating back to the 1950s. The interlinked problems of disaster risk management and climate change adaptation are cases in point. As the climate changes, there will be more frequent, intense and/or prolonged disasters such as floods and bushfires. Clearly a well-integrated whole-of-government response is needed, but how might this be achieved? Further, how could academic research contribute to resolving this dilemma in a way that would produce something of theoretical interest as well as practical outcomes for policymakers? These are the questions addressed by our research via a comparative analysis of the 2009 Victorian bushfires, the 2011 Perth Hills bushfires, and the 2011 Brisbane floods. Our findings suggest that there is a need to: improve community engagement and communication; refocus attention on resilience; improve interagency communication and collaboration; and develop institutional arrangements that support continual improvement and policy learning. These findings have implications for all areas of public policy theory and practice.
Abstract:
Background Cancer-related malnutrition is associated with increased morbidity, poorer tolerance of treatment, decreased quality of life, increased hospital admissions, and increased health care costs (Isenring et al., 2013). This study's aim was to determine whether a novel, automated screening system was a useful tool for nutrition screening when compared against a full nutrition assessment using the Patient-Generated Subjective Global Assessment (PG-SGA) tool. Methods A single-site, observational, cross-sectional study was conducted in an outpatient oncology day care unit within a Queensland tertiary facility, with three hundred outpatients (51.7% male, mean age 58.6 ± 13.3 years). Eligibility criteria: ≥18 years, receiving anticancer treatment, able to provide written consent. Patients completed the Malnutrition Screening Tool (MST). Nutritional status was assessed using the PG-SGA. Data for the automated screening system were extracted from the pharmacy software program Charm, including body mass index (BMI) and weight records dating back up to six months. Results The prevalence of malnutrition was 17%. Any weight loss over the three to six weeks prior to the most recent weight record, as identified by the automated screening system and compared against PG-SGA-classified malnutrition, yielded 56.52% sensitivity, 35.43% specificity, 13.68% positive predictive value, and 81.82% negative predictive value. An MST score of 2 or greater was a stronger predictor of nutritional risk relative to PG-SGA-classified malnutrition (70.59% sensitivity, 69.48% specificity, 32.14% positive predictive value, 92.02% negative predictive value). Conclusions Both the automated screening system and the MST fell short of the accepted professional standard for sensitivity (80%) or specificity (60%) when compared to the PG-SGA. Although the MST remains the better predictor of malnutrition in this setting, uptake of this tool in the Oncology Day Care Unit remains challenging.
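The four screening metrics reported above all come from a single 2×2 confusion table against the PG-SGA reference. As a sketch, the counts below were chosen to be consistent with the reported MST percentages and prevalence (51 malnourished of 300); they are an inferred reconstruction for illustration, not published raw data.

```python
# Screening metrics from a 2x2 confusion table (screen vs PG-SGA reference).
# Counts are reconstructed to be consistent with the reported MST figures
# (70.59% sens, 69.48% spec, 32.14% PPV, 92.02% NPV), not published data.
def screening_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # malnourished correctly flagged
        "specificity": tn / (tn + fp),   # well-nourished correctly cleared
        "ppv": tp / (tp + fp),           # flagged patients truly malnourished
        "npv": tn / (tn + fn),           # cleared patients truly well-nourished
    }

m = screening_metrics(tp=36, fp=76, fn=15, tn=173)
print({k: round(v, 4) for k, v in m.items()})
```

The low PPV alongside a high NPV is typical when prevalence is modest (17% here): a negative screen is reassuring, but most positive screens still need the full PG-SGA assessment.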