981 results for pro-diversity interventions
Abstract:
For the past two decades the concept of managing individual difference in the workforce has been popular in many Western organizations, with calls to manage this "diversity" for the greater good of the organization and the individuals in it. Paradoxically, there is no agreed definition of this concept, and its description remains unclear and often contested (Jensen, Maznevski & Schneider 2011). Indeed, a range of terms is used, including: diversity at work, managing diversity, diversity management, workplace diversity, productive diversity, and so forth. The foundation of the concept of managing diversity is the idea that an organization's workforce displays a range of "diverse" characteristics. The characteristics that are included under the heading of "diversity" vary.
Abstract:
The inconsistent findings of past board diversity research demand a test of competing linear and curvilinear diversity–performance predictions. This research focuses on board age and gender diversity, and presents a positive linear prediction based on resource dependence theory, a negative linear prediction based on social identity theory, and an inverted U-shaped curvilinear prediction based on the integration of resource dependence theory with social identity theory. The predictions were tested using archival data on 288 large organizations listed on the Australian Securities Exchange, with a 1-year time lag between diversity (age and gender) and performance (employee productivity and return on assets). The results indicate a positive linear relationship between gender diversity and employee productivity, a negative linear relationship between age diversity and return on assets, and an inverted U-shaped curvilinear relationship between age diversity and return on assets. The findings provide additional evidence on the business case for board gender diversity and refine the business case for board age diversity.
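As an aside for readers reproducing this kind of analysis, the sketch below shows one common way to pit the linear and inverted U-shaped predictions against each other with ordinary least squares: the curvilinear hypothesis is supported when the squared diversity term carries a significant negative coefficient. The data frame, column names and synthetic data are hypothetical stand-ins, not the study's data.

```python
# Minimal sketch of a linear-vs-curvilinear test, assuming a pandas
# DataFrame with hypothetical columns 'age_diversity' (e.g. a Blau
# index) and 'roa' (return on assets, lagged one year). Synthetic
# data stands in for the 288 ASX organizations.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "age_diversity": rng.uniform(0, 1, 288),
    "roa": rng.normal(0.05, 0.02, 288),
})

# Linear prediction: a monotonic diversity-performance relationship.
linear = smf.ols("roa ~ age_diversity", data=df).fit()

# Curvilinear prediction: an inverted U appears as a significant
# negative coefficient on the squared term.
curvilinear = smf.ols("roa ~ age_diversity + I(age_diversity ** 2)",
                      data=df).fit()

print(linear.params)
print(curvilinear.params)
```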
Abstract:
Since the pioneering work of Hough in 1902 (1), the term 'delayed onset muscle soreness' (DOMS) has dominated the field of athletic recovery. DOMS typically occurs after exercise-induced muscle damage (EIMD), particularly if the exercise is unaccustomed or involves a large amount of eccentric (muscle-lengthening) contractions. The symptoms of EIMD manifest as a temporary reduction in muscle force, disturbed proprioceptive acuity, increases in inflammatory markers both within the injured muscle and in the blood, as well as increased muscle soreness, stiffness and swelling. The intensity of discomfort and soreness associated with DOMS increases within the first 24 hours, peaks between 24 and 72 hours, then subsides and eventually disappears 5-7 days after the exercise. Consequently, DOMS may interfere with athletic training or competition, and several recovery interventions have been utilised by athletes and coaches in an attempt to offset these negative effects...
Abstract:
In vivo, small molecules act as necessary intermediates in numerous critical metabolic pathways and biological processes associated with many essential biological functions and events. There is growing evidence that MS-based metabolomics is emerging as a powerful tool to facilitate the discovery of functional small molecules that can better our understanding of development, infection, nutrition, disease, toxicity, drug therapeutics, gene modifications and host-pathogen interaction from metabolic perspectives. However, further progress must still be made in MS-based metabolomics because of shortcomings in current technologies and knowledge. This technique-driven review aims to explore the discovery of in vivo functional small molecules facilitated by MS-based metabolomics and to highlight the analytical capabilities and promising applications of this discovery strategy. The biological significance of discovering in vivo functional small molecules in different biological contexts is also interrogated from a metabolic perspective.
Abstract:
BACKGROUND: The prevalence of protein-energy malnutrition in older adults is reported to be as high as 60% and is associated with poor health outcomes. Inadequate feeding assistance and mealtime interruptions may contribute to malnutrition and poor nutritional intake during hospitalisation. Despite such strategies being widely implemented in practice in the United Kingdom, and increasingly in Australia, few studies have examined the impact of Protected Mealtimes and dedicated feeding assistant roles on the nutritional outcomes of elderly inpatients. AIMS: The aim of this research was to implement and compare three system-level interventions designed to specifically address mealtime barriers and improve energy intakes of medical inpatients aged ≥65 years. This research also aimed to evaluate the sustainability of any changes to mealtime routines six months post-intervention and to gain an understanding of staff perceptions of the post-intervention mealtime experience. METHODS: Three mealtime assistance interventions were implemented in three medical wards at the Royal Brisbane and Women's Hospital:
• AIN-only: an additional assistant-in-nursing (AIN) with a dedicated nutrition role;
• PM-only: a multidisciplinary approach to meals, including Protected Mealtimes; and
• PM+AIN: the combined intervention (AIN plus the multidisciplinary approach to meals).
An action research approach was used to carefully design and implement the three interventions in partnership with ward staff and managers. Significant time was spent in consultation with staff throughout the implementation period to facilitate ownership of the interventions and increase the likelihood of successful implementation. A pre-post design was used to compare the implementation and nutritional outcomes of each intervention against a pre-intervention group. Using the same wards, eligible participants (medical inpatients aged ≥65 years) were recruited to the pre-intervention group between November 2007 and March 2008 and to the intervention groups between January and June 2009. The primary nutritional outcome was daily energy and protein intake, determined by visually estimating plate waste at each meal and mid-meal on Day 4 of admission. Energy and protein intakes were compared between the pre- and post-intervention groups. Data were collected on a range of covariates (demographics, nutritional status and known risk factors for poor food intake), which allowed for multivariate analysis of the impact of the interventions on nutritional intake. The provision of mealtime assistance to participants and the activities of ward staff (including mealtime interruptions) were observed in the pre-intervention and intervention groups, with staff observations repeated six months post-intervention. Focus groups were conducted with nursing and allied health staff in June 2009 to explore their attitudes and behaviours in response to the three mealtime interventions; these discussions were analysed using thematic analysis. RESULTS: A total of 254 participants were recruited to the study (pre-intervention: n=115, AIN-only: n=58, PM-only: n=39, PM+AIN: n=42). Participants had a mean age of 80 years (SD 8); 40% (n=101) were malnourished on hospital admission, 50% (n=108) had anorexia and 38% (n=97) required some assistance at mealtimes. Occasions of mealtime assistance significantly increased in all interventions (p<0.01). However, no change was seen in mealtime interruptions.
No significant difference was seen in mean total energy and protein intake between the pre-intervention and intervention groups. However, when total kilojoule intake was compared with estimated requirements at the individual level, participants in the intervention groups were more likely to achieve adequate energy intake (OR=3.4, p=0.01), with no difference noted between interventions (p=0.29). Despite small improvements in nutritional adequacy, the majority of participants in the intervention groups (76%, n=103) had energy intakes inadequate to meet their estimated energy requirements. Patients with cognitive impairment or feeding dependency appeared to gain substantial benefit from mealtime assistance interventions. The increase in occasions of mealtime assistance by nursing staff during the intervention period was maintained six months post-intervention. Staff focus groups highlighted the importance of clearly designating and defining mealtime responsibilities in order to provide adequate mealtime care. While the purpose of the dedicated feeding assistant was to increase levels of mealtime assistance, staff indicated that responsibility for mealtime duties may have merely shifted from nursing staff to the assistant. Implementing the multidisciplinary interventions empowered nursing staff to "protect" the mealtime from external interruptions, but further work is required to empower nurses to prioritise mealtime activities within their own work schedules. Staff reported an increase in the profile of nutritional care on all wards, with additional non-nutritional benefits noted, including improved mobility and functional independence and better identification of swallowing difficulties. IMPLICATIONS: This PhD research provides clinicians with practical strategies to immediately introduce change to deliver better mealtime care in the hospital setting and, as such, has initiated local and state-wide roll-out of mealtime assistance programs. Improved nutritional intakes of elderly inpatients were observed; however, given the modest effect size and reducing lengths of hospital stay, better nutritional outcomes may be achieved by targeting the hospital-to-home transition period. Findings from this study suggest that mealtime assistance interventions show promise for elderly inpatients with cognitive impairment and/or functional dependency.
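For readers interested in the individual-level adequacy analysis reported above (OR=3.4), a minimal sketch of that kind of calculation follows: intake is classified as adequate against each patient's estimated requirement, and a logistic model yields the intervention odds ratio. All column names and values are hypothetical, not the thesis data.

```python
# Sketch of an individual-level energy adequacy analysis, assuming
# hypothetical columns 'kj_intake', 'kj_required' and a binary
# 'intervention' flag; synthetic data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "kj_intake": rng.normal(6500, 1500, 254),
    "kj_required": rng.normal(7500, 800, 254),
    "intervention": rng.integers(0, 2, 254),
})

# Adequate if intake meets the patient's estimated requirement.
df["adequate"] = (df["kj_intake"] >= df["kj_required"]).astype(int)

model = smf.logit("adequate ~ intervention", data=df).fit()
print(np.exp(model.params["intervention"]))  # odds ratio
```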
Abstract:
Traditionally, infectious diseases and under-nutrition have been considered the major health problems in Sri Lanka, with little attention paid to obesity and associated non-communicable diseases (NCDs). However, the recent Sri Lanka Diabetes and Cardiovascular Study (SLDCS) reported epidemic levels of obesity, diabetes and metabolic syndrome. Moreover, obesity-associated NCDs are the leading cause of death in Sri Lanka, and there is an exponential increase in hospitalisation due to NCDs, adversely affecting the development of the country. Despite Sri Lanka having a very high prevalence of NCDs and associated mortality, little is known about the causative factors for this burden. It is widely believed that the global NCD epidemic is associated with recent lifestyle changes, especially dietary factors. In the absence of sufficient data on dietary habits in Sri Lanka, successful interventions to manage these serious health issues would not be possible. In view of the current situation, this dietary survey was undertaken to assess intakes of energy, macro-nutrients and selected other nutrients with respect to socio-demographic characteristics and the nutritional status of Sri Lankan adults, focusing especially on obesity. Another aim of this study was to develop and validate a culturally specific food frequency questionnaire (FFQ) to assess dietary risk factors of NCDs in Sri Lankan adults. Data were collected from a subset of the national SLDCS using a multi-stage, stratified, random sampling procedure (n=500). However, data collection in the SLDCS was affected by the prevailing civil war, which resulted in no data being collected from the Northern and Eastern provinces. To obtain a nationally representative sample, additional subjects (n=100) were later recruited from the two provinces using similar selection criteria. Ethical approval for this study was obtained from the Ethical Review Committee, Faculty of Medicine, University of Colombo, Sri Lanka, and informed consent was obtained from the subjects before data were collected. Dietary data were obtained using the 24-h Dietary Recall (24HDR) method: subjects were asked to recall all foods and beverages consumed over the previous 24-hour period and were probed for the types of foods and food preparation methods. For the FFQ validation study, a 7-day weighed diet record (7-d WDR) was used as the reference method. All foods recorded in the 24HDR were converted into grams, and intakes of energy and nutrients were then analysed using NutriSurvey 2007 (EBISpro, Germany), modified for Sri Lankan food recipes. Socio-demographic details and body weight perception were collected by interviewer-administered questionnaire. BMI was calculated, and overweight (BMI ≥23 kg.m-2), obesity (BMI ≥25 kg.m-2) and abdominal obesity (men: WC ≥90 cm; women: WC ≥80 cm) were categorised according to Asia-Pacific anthropometric cut-offs. SPSS v16 for Windows and Minitab v10 were used for statistical analysis. From a total of 600 eligible subjects, 491 (81.8%) participated, of whom 34.5% (n=169) were males. Subjects were well distributed among different socio-economic parameters. A total of 312 different food items were recorded, and nutritionists grouped similar food items, resulting in a total of 178 items. After performing step-wise multiple regression, 93 foods explained 90% of the variance for total energy intake, carbohydrates, protein, total fat and dietary fibre. Finally, 90 food items and 12 photographs were selected.
Seventy-seven subjects (response rate = 65%) completed the FFQ and the 7-d WDR. Estimated mean (SD) energy intake from the FFQ (1794±398 kcal) and the 7-d WDR (1698±333 kcal) differed significantly (P<0.001), due to a significant overestimation of carbohydrate (~10 g/d, P<0.001) and, to some extent, fat (~5 g/d, NS). Significant positive correlations were found between the FFQ and the 7-d WDR for energy (r=0.39), carbohydrate (r=0.47), protein (r=0.26), fat (r=0.17) and dietary fibre (r=0.32). Bland-Altman graphs indicated fairly good agreement between the methods, with no relationship between bias and average intake for each nutrient examined. The findings from the nutrition survey showed that, on average, Sri Lankan adults consumed over 14 portions of starch/d; moreover, males consumed 5 more portions of cereal than females. Sri Lankan adults consumed on average 3.56 portions of added sugars/d. Mean daily intakes of fruit (0.43 portions) and vegetables (1.73 portions) were well below the minimum dietary recommendations (fruits 2 portions/d; vegetables 3 portions/d); total fruit and vegetable intake was 2.16 portions/d. Daily consumption of meat or alternatives was 1.75 portions, and the sum of meat and pulses was 2.78 portions/d. Starchy foods were consumed by all participants, and over 88% met the minimum daily recommendations. Importantly, nearly 70% of adults exceeded the maximum daily recommendation for starch (11 portions/d), and a considerable proportion, particularly men, consumed larger numbers of starch servings daily: more than 12% of men consumed over 25 starch servings/d. In contrast to their starch consumption, participants reported very low intakes of other food groups. Only 11.6%, 2.1% and 3.5% of adults consumed the minimum daily recommended servings of vegetables, fruits, and fruits and vegetables combined, respectively. Six out of ten adult Sri Lankans sampled did not consume any fruits. Milk and dairy consumption was extremely low; over a third of the population did not consume any dairy products, and less than 1% of adults consumed 2 portions of dairy/d. A quarter of Sri Lankans did not report consumption of meat and pulses. Regarding protein consumption, 36.2% attained the minimum Sri Lankan recommendation for protein, and significantly more men than women achieved the recommendation of ≥3 servings of meat or alternatives daily (men 42.6%, women 32.8%; P<0.05). Over 70% of energy was derived from carbohydrates (male: 72.8±6.4%, female: 73.9±6.7%), followed by fat (male: 19.9±6.1%, female: 18.5±5.7%) and protein (male: 10.6±2.1%, female: 10.9±5.6%). The average intake of dietary fibre was 21.3 g/d for males and 16.3 g/d for females. There were significant differences in nutritional intake related to ethnicity, area of residence, education level and BMI category. Similarly, dietary diversity was significantly associated with several socio-economic parameters among Sri Lankan adults; adults with BMI ≥25 kg.m-2 and abdominally obese adults had the highest diet diversity values. Age-adjusted prevalences (95% confidence intervals) of overweight, obesity and abdominal obesity among Sri Lankan adults were 17.1% (13.8-20.7), 28.8% (24.8-33.1) and 30.8% (26.8-35.2), respectively. Men, compared with women, were less often overweight (14.2% (9.4-20.5) versus 18.5% (14.4-23.3), P=0.03), less often obese (21.0% (14.9-27.7) versus 32.7% (27.6-38.2), P<0.05) and less often abdominally obese (11.9% (7.4-17.8) versus 40.6% (35.1-46.2), P<0.05).
Although the prevalence of obesity has reached epidemic levels, body weight misperception was common among Sri Lankan adults. Two-thirds of overweight males and 44.7% of overweight females considered themselves to be "about right weight". Over one-third of both male and female obese subjects perceived themselves as "about right weight" or "underweight". Nearly 32% of centrally obese men and women perceived their waist circumference to be about right. Of those who perceived themselves as overweight or very overweight (n=154), only 63.6% (n=98) tried to lose weight, and a quarter (n=39) sought advice from professionals. A number of important conclusions can be drawn from this research project. Firstly, the newly developed FFQ is an acceptable tool for assessing the nutrient intake of Sri Lankans and will assist proper categorization of individuals by dietary exposure. Secondly, a substantial proportion of the Sri Lankan population does not consume a varied and balanced diet, which suggests a close association between nutrition-related NCDs in the country and unhealthy eating habits. Moreover, dietary diversity is positively associated with several socio-demographic characteristics and obesity among Sri Lankan adults. Lastly, although obesity is a major health issue among Sri Lankan adults, body weight misperception was common among underweight, healthy weight, overweight, and obese adults in Sri Lanka; over two-thirds of overweight and one-third of obese Sri Lankan adults believed they were in the "right weight" or "underweight" categories.
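As a methodological aside, the Bland-Altman agreement check used in the FFQ validation above can be sketched as follows; the paired energy estimates are synthetic stand-ins generated to resemble the reported means, not the study's data.

```python
# Bland-Altman sketch for two dietary assessment methods. The bias
# line and 95% limits of agreement are plotted against the mean of
# the paired measurements; synthetic data only.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
ffq = rng.normal(1794, 398, 77)       # kcal/d estimated by the FFQ
wdr = ffq - rng.normal(96, 150, 77)   # kcal/d from the 7-d record

mean_intake = (ffq + wdr) / 2
diff = ffq - wdr
loa = 1.96 * diff.std(ddof=1)         # 95% limits of agreement

plt.scatter(mean_intake, diff, s=10)
for y in (diff.mean(), diff.mean() - loa, diff.mean() + loa):
    plt.axhline(y, linestyle="--")
plt.xlabel("Mean of FFQ and 7-d WDR (kcal/d)")
plt.ylabel("FFQ minus 7-d WDR (kcal/d)")
plt.show()
```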
Abstract:
Background: IL-23 is a member of the IL-6 super-family and plays key roles in cancer. Very little is currently known about the role of IL-23 in non-small cell lung cancer (NSCLC). Methods: RT-PCR and chromatin immunoprecipitation (ChIP) were used to examine the levels, epigenetic regulation and effects of various drugs (DNA methyltransferase inhibitors, histone deacetylase inhibitors and gemcitabine) on IL-23 expression in NSCLC cells and macrophages. The effects of recombinant IL-23 protein on cellular proliferation were examined by MTT assay. Statistical analysis consisted of Student's t-test, or one-way analysis of variance (ANOVA) where three or more groups were compared. Results: In a cohort of primary NSCLC tumours, IL-23A expression was significantly elevated in patient tumour samples (p<0.05). IL-23A expression is epigenetically regulated through histone post-translational modifications and DNA CpG methylation. Gemcitabine, a chemotherapy drug indicated for first-line treatment of NSCLC, also induced IL-23A expression. Recombinant IL-23 significantly increased cellular proliferation in NSCLC cell lines. Conclusions: These results may therefore have important implications for treating NSCLC patients with either epigenetic targeted therapies or gemcitabine.
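The statistical rule stated in the methods (t-test for two groups, one-way ANOVA for three or more) can be sketched as below; the expression values are synthetic placeholders, not the study's measurements.

```python
# Sketch of the stated analysis rule: Student's t-test for two
# groups, one-way ANOVA when three or more groups are compared.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
groups = [rng.normal(loc, 1.0, 12) for loc in (0.0, 0.5, 1.2)]

if len(groups) == 2:
    stat, p = stats.ttest_ind(*groups)
else:
    stat, p = stats.f_oneway(*groups)
print(stat, p)
```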
Abstract:
There is a continuing need to improve safety at Railway Level Crossings (RLX), particularly those that do not have gates and lights regulating traffic flow. A number of Intelligent Transport System (ITS) interventions have been proposed to improve drivers' awareness and reduce errors in detecting and responding appropriately at level crossings. However, as with other technologies, successful implementation, and ultimately effectiveness, rests on acceptance of the technology by the end user. In the current research, four focus groups (n=38) were held with drivers in metropolitan and regional locations in Queensland to examine their perceptions of potential in-vehicle and road-based ITS interventions to improve safety at RLX. The findings imply that further development of the ITS interventions, in particular the design and related promotion of the final product, must consider ease of use, usefulness and relative cost.
Abstract:
Indigenous juveniles (those aged 10 to 16 years in Queensland and 10 to 17 years in all other jurisdictions) are over-represented at all stages of the criminal justice system, and their over-representation becomes more pronounced at the most severe end of the system (ie in detention). Recent figures show that Indigenous juveniles are 24 times as likely to be detained in a juvenile correctional facility as non-Indigenous juveniles (Richards & Lyneham 2010). A variety of explanations for this over-representation have been proposed, including:
• lack of access or disparate access to diversionary programs (Allard et al. 2010; Cunneen 2008; Snowball 2008);
• systemic discrimination against Indigenous juveniles (eg police bias against Indigenous juveniles) (Cunneen 2008; Kenny & Lennings 2007);
• inadequate resourcing of Aboriginal legal services (Cunneen & Schwartz 2008); and
• genuinely higher levels of offending by Indigenous juveniles (Kenny & Lennings 2007; Weatherburn et al. 2003).
A range of measures (including diversion and juvenile conferencing programs) has recently been implemented to reduce the over-representation of Indigenous juveniles in detention and to minimise the contact of juveniles with the formal criminal justice system. Diversionary measures can only have a limited impact, however, and reducing offending and reoffending have been identified as critical factors to address if the over-representation of Indigenous juveniles is to be reduced (Allard et al. 2010; Weatherburn et al. 2003). While acknowledging that other measures designed to reduce the over-representation of Indigenous juveniles are important, this paper reviews the evidence on policies and programs that reduce offending by Indigenous juveniles in Australia. Where relevant, research from comparable jurisdictions, such as New Zealand and Canada, is also discussed.
Abstract:
BACKGROUND: Ankle joint equinus, or restricted dorsiflexion range of motion (ROM), has been linked to a range of pathologies of relevance to clinical practitioners. This systematic review and meta-analysis investigated the effects of conservative interventions on ankle joint ROM in healthy individuals and athletic populations. METHODS: Keyword searches of the Embase, Medline, Cochrane and CINAHL databases were performed, with the final search run in August 2013. Studies were eligible for inclusion if they assessed the effect of a non-surgical intervention on ankle joint dorsiflexion in healthy populations. Studies were quality-rated using a standard quality assessment scale. Standardised mean differences (SMDs) and 95% confidence intervals (CIs) were calculated, and results were pooled where study methods were homogeneous. RESULTS: Twenty-three studies met the eligibility criteria, with a total of 734 study participants. The results suggest that there is some evidence to support the efficacy of static stretching alone (SMDs: range 0.70 to 1.69) and static stretching in combination with ultrasound (SMDs: range 0.91 to 0.95), diathermy (SMD 1.12), diathermy and ice (SMD 1.16), heel raise exercises (SMDs: range 0.70 to 0.77), superficial moist heat (SMDs: range 0.65 to 0.84) and warm-up (SMD 0.87) in improving ankle joint dorsiflexion ROM. CONCLUSIONS: Some evidence exists to support the efficacy of stretching alone, and of stretching in combination with other therapies, in increasing ankle joint ROM in healthy individuals. There is a paucity of quality evidence to support the efficacy of other non-surgical interventions; further research in this area is therefore warranted.
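For readers unfamiliar with the effect sizes reported above, the sketch below shows how a standardised mean difference is computed per study and pooled with inverse-variance weights; the three studies and their numbers are hypothetical, not the review's data.

```python
# Sketch of a meta-analytic summary: a standardised mean difference
# (Cohen's d) per study, pooled with fixed-effect inverse-variance
# weights. Inputs are illustrative only.
import numpy as np

def cohens_d(m1, m2, sd1, sd2, n1, n2):
    """SMD of intervention (1) vs control (2) means."""
    pooled_sd = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                        / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

def smd_variance(d, n1, n2):
    """Approximate sampling variance of an SMD."""
    return (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))

# Three hypothetical stretching studies: (mean1, mean2, sd1, sd2, n1, n2)
studies = [(12.0, 8.0, 4.5, 4.8, 20, 20),
           (10.5, 7.0, 4.0, 4.2, 15, 15),
           (11.0, 8.5, 3.8, 4.1, 18, 18)]

d = np.array([cohens_d(*s) for s in studies])
w = 1 / np.array([smd_variance(di, s[4], s[5])
                  for di, s in zip(d, studies)])
pooled = (w * d).sum() / w.sum()
ci = 1.96 / np.sqrt(w.sum())
print(f"pooled SMD {pooled:.2f} (95% CI {pooled-ci:.2f} to {pooled+ci:.2f})")
```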
Abstract:
Background: Post-heart transplant psychological distress may both directly hinder physiological health and indirectly impact clinical outcomes by increasing unhealthy behaviours, such as immunosuppression non-adherence. Reducing psychological distress in heart transplant recipients is therefore vitally important in order to improve not only patients' overall health and well-being but also clinical outcomes, such as morbidity and mortality. Evidence from other populations suggests that non-pharmacological interventions may be an effective strategy. Aim: To appraise the efficacy of non-pharmacological interventions on psychological outcomes after heart transplant. Method: A systematic review was conducted using the Joanna Briggs Institute methodology. Experimental and quasi-experimental studies that involved any non-pharmacological intervention for heart transplant recipients were included, provided that data on psychological outcomes were reported. Multiple electronic databases were searched for published and unpublished studies, and the reference lists of retrieved studies were scrutinised for further primary research. Data were extracted using a standardised data extraction tool. Included studies were assessed by two independent reviewers using standardised critical appraisal instruments. Results: Three studies, involving only 125 heart transplant recipients in total, fulfilled the inclusion and exclusion criteria. Two studies reported on exercise programs; one reported a web-based psychosocial intervention. While psychological outcomes significantly improved from baseline to follow-up for the recipients who received the interventions, between-group comparisons were not reported. The methodological quality of the studies was judged to be poor. Conclusions: Further research is required, as there is insufficient evidence available to draw conclusions for or against the use of non-pharmacological interventions after heart transplant.
Abstract:
Many wild koala populations in Australia continue to experience serious declines due to factors such as disease caused by Chlamydia. This thesis is the first of its kind to investigate the diversity of chlamydial infections in wild koala populations across Australia, and it has made significant progress towards the development of a vaccine for koalas. The findings demonstrate that it is feasible to develop a safe and effective recombinant vaccine against Chlamydia in both disease-free and severely diseased koalas. Most importantly, this study is also the first of its kind to evaluate a multi-component vaccine intended to be effective against the range of Chlamydia pecorum strains circulating in both captive and wild koala populations.
Abstract:
Objectives: Titanium implant surfaces with modified topographies show improved osteogenic properties in vivo; however, the underlying molecular mechanisms remain obscure. This study explored the signaling pathways responsible for the pro-osteogenic properties of micro-roughened (SLA) and chemically/nanostructurally modified (modSLA) titanium surfaces on human alveolar bone-derived osteoprogenitor cells (BCs) in vitro. Materials and methods: The activation of stem cell signaling pathways (TGFβ/BMP, Wnt, FGF, Hedgehog, Notch) was investigated following early exposure (24 and 72 h) of BCs to SLA and modSLA surfaces in the absence of osteogenic cell culture supplements. Results: Key regulatory genes from the TGFβ/BMP (TGFBR2, BMPR2, BMPR1B, ACVR1B, SMAD1, SMAD5), Wnt (Wnt/β-catenin and Wnt/Ca2+: FZD1, FZD3, FZD5, LRP5, NFATC1, NFATC2, NFATC4, PYGO2, LEF1) and Notch (NOTCH1, NOTCH2, NOTCH4, PSEN1, PSEN2, PSENEN) pathways were upregulated on the modified surfaces. These findings correlated with higher expression of the osteogenic markers bone sialoprotein (IBSP) and osteocalcin (BGLAP) and the bone differentiation factors BMP2, BMP6 and GDF15 on the modified surfaces. Conclusions: These findings demonstrate that activation of pro-osteogenic cell signaling pathways by modSLA and SLA surfaces leads to enhanced osteogenic differentiation, as evidenced after 7 and 14 days of culture in osteogenic media, and provide mechanistic insight into the superior osseointegration of the modified surfaces observed in vivo.
Abstract:
Prior to graduation, engineering students are expected to provide evidence of relevant experience in the workplace. This experience is expected to provide opportunities for exposure to the profession and to help students develop confidence, skills and capabilities as emerging professionals. This investigation considers the expectations and challenges in implementing work integrated learning (WIL) programs in different contexts. While this will inform the next iteration of engineering course development at QUT, the issues and interventions described provide useful insights into the options available and into engineering curriculum design more broadly. This comparative analysis across three phases highlights expectations and challenges, including stakeholder responsibilities, expectations and assessment. The study draws on the findings of a 2005 investigation into the purpose and provision of WIL and the findings of a 2012 Faculty review of the current WIL model. The enhancement of WIL through a series of developmental phases highlights the strengths and weaknesses of various models. It is anticipated that this investigation will inform course development decisions on a whole-of-course approach to WIL that improves student engagement and the learning experience. The importance of WIL is not disputed. However, with industry expectations, increasing student numbers and cohort diversity, the ways in which students and industry currently engage in WIL are not sustainable, and more creative, flexible and engaging approaches are needed.
Abstract:
BACKGROUND: Outdoor workers are at high risk of harmful ultraviolet radiation exposure and are identified as an at-risk group for the development of skin cancer. This systematic evidence-based review provides an update to a previous review, published in 2007, about interventions for the prevention of skin cancer in outdoor workers. RESULTS: This review includes interventions published between 2007 and 2012 and presents findings about sun protection behaviours and/or objective measures of skin cancer risk. Six papers met the inclusion criteria and were included in the review. Large studies with extended follow-up times demonstrated the efficacy of educational and multi-component interventions in increasing sun protection, including somewhat higher use of personal protective equipment such as sunscreen. However, there is less evidence for the effectiveness of policy or of specific intervention components. CONCLUSIONS: Further research aimed at improving overall attitudes towards sun protection in outdoor workers is needed to provide an overarching framework.