25 results for training methods taxonomy
Abstract:
Background: To study the differences in ophthalmology resident training between China and the Hong Kong Special Administrative Region (HKSAR).
Methods: Training programs were selected from among the largest and best-known teaching hospitals. Ophthalmology residents were sent an anonymous 48-item questionnaire by mail. Work satisfaction, time allocation between training activities and volume of surgery performed were determined.
Results: 50/75 residents (66.7%) from China and 20/26 (76.9%) from HKSAR completed the survey. Age (28.9 ± 2.5 vs. 30.2 ± 2.9 years, p = 0.15) and number of years in training (3.4 ± 1.6 vs. 2.8 ± 1.5, p = 0.19) were comparable between groups. The number of cataract procedures performed by HKSAR trainees (extra-capsular: median 80.0, quartile range 30.0-100.0; phacoemulsification: median 20.0, quartile range 0.0-100.0) exceeded that of Chinese residents (extra-capsular: median 0, p < 0.0001; phacoemulsification: median 0, p < 0.0001). Chinese trainees spent more time completing medical charts (>50% of time on charts: 62.5% vs. 5.3%, p < 0.0001) and received less supervision (≥90% of training supervised: 4.4% vs. 65%, p < 0.0001). Chinese residents were more likely to feel underpaid (96.0% vs. 31.6%, p < 0.0001) and to hope their children would not practice medicine (69.4% vs. 5.0%, p = 0.0001) compared with HKSAR residents.
Conclusions: In this study, ophthalmology residents in China report strikingly less surgical experience and supervision, and lower satisfaction, than HKSAR residents. The HKSAR model of hands-on resident training might be useful in improving the low cataract surgical rate in China.
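The abstract reports medians with quartile ranges rather than means, and the test behind the quoted p-values is not named; for skewed count data such as per-resident surgical volumes, a nonparametric two-sample comparison like the Mann-Whitney U test is the conventional choice. A minimal sketch in Python, using made-up case counts rather than the study data:

```python
# Illustrative comparison of per-resident surgical case counts between two
# cohorts using a Mann-Whitney U test. The abstract does not name the test
# used; this is the conventional nonparametric choice for comparing medians.
import numpy as np
from scipy import stats

# Made-up case counts per resident (NOT the study data).
hksar_counts = np.array([80, 30, 100, 60, 90, 45, 75])
china_counts = np.array([0, 0, 5, 0, 10, 0, 2])

u_stat, p_value = stats.mannwhitneyu(hksar_counts, china_counts,
                                     alternative="two-sided")
print(f"median HKSAR = {np.median(hksar_counts):.1f}, "
      f"median China = {np.median(china_counts):.1f}")
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")
```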
Abstract:
Background: In 2006, the Buttimer report highlighted the paucity of demographic data on those applying for and entering postgraduate medical education and training (PGMET) in Ireland. Today, concerns that there is an "exodus" of graduates of Irish medical schools are at the forefront of national discussion; however, published data on PGMET remain inadequate.
Aims: The objectives of this study were to collate existing data relating to trainees and training programmes at three stages of training and to examine the career plans of junior trainees.
Methods: Data from application forms for training programmes commencing July 2012 under the Royal College of Physicians of Ireland (n = 870) were integrated with data from other existing sources. Candidates entering basic specialist training (BST) were surveyed with regard to career plans. Descriptive and comparative analysis was performed in SPSS version 18.
Results: Graduates of Irish medical schools made up over 70% of appointees. Over 80% of BST trainees aspired to work as consultants in Ireland, but 92.5% planned to spend time working abroad (response rate 77%). Decisions to leave the Irish system were linked to lifestyle, but also to failure to secure appointment to higher specialist training (HST). Significant numbers of trainees return to Ireland after a period abroad.
Conclusions: The trainee "exodus" is more complex than is often portrayed. The desire to spend time working outside Ireland must be accounted for in workforce planning and configuration of training programmes. Expansion of HST is a potential solution to reduce the numbers of graduates leaving Ireland post-BST.
Abstract:
Aim: The aim of the study is to evaluate factors that enable or constrain the implementation and service delivery of early warning systems and acute care training in practice.
Background: To date there is limited evidence that acute care initiatives (early warning systems, acute care training, outreach) reduce the number of adverse events (cardiac arrest, death, unanticipated Intensive Care admission) through increased recognition and management of deteriorating ward-based patients in hospital [1-3]. The reason posited is that previous research primarily focused on measuring patient outcomes following the implementation of an intervention or programme without considering the social factors (the organisation, the people, external influences) that may have affected the process of implementation and hence the measured end-points. Further research that considers these social processes is required in order to understand why a programme works, or does not work, in particular circumstances [4].
Method: The design is a multiple case study of four general wards in two acute hospitals where Early Warning Systems (EWS) and the Acute Life-threatening Events Recognition and Treatment (ALERT) course have been implemented. Various methods are being used to collect data about individual capacities, interpersonal relationships, and institutional balance and infrastructures in order to understand the intended and unintended process outcomes of implementing EWS and ALERT in practice. This information will be gathered from individual and focus group interviews with key participants (ALERT facilitators, nursing and medical ALERT instructors, ward managers, doctors, ward nurses and health care assistants from each hospital); non-participant observation of ward organisation and structure; audit of patients' EWS charts; and audit of the medical notes of patients who deteriorated during the study period, to ascertain whether ALERT principles were followed.
Discussion and progress to date: This study commenced in January 2007. Ethical approval has been granted and data collection is ongoing, with interviews being conducted with key stakeholders. The findings will provide evidence for policy-makers to make informed decisions regarding the direction of strategic and service planning of acute care services, to improve the level of care provided to acutely ill patients in hospital.
References:
1. Esmonde L, McDonnell A, Ball C, Waskett C, Morgan R, Rashidian A, et al. Investigating the effectiveness of Critical Care Outreach Services: a systematic review. Intensive Care Medicine 2006; 32: 1713-1721.
2. McGaughey J, Alderdice F, Fowler R, Kapila A, Mayhew A, Moutray M. Outreach and Early Warning Systems for the prevention of Intensive Care admission and death of critically ill patients on general hospital wards. Cochrane Database of Systematic Reviews 2007, Issue 3. www.thecochranelibrary.com
3. Winters BD, Pham JC, Hunt EA, Guallar E, Berenholtz S, Pronovost PJ. Rapid Response Systems: a systematic review. Critical Care Medicine 2007; 35(5): 1238-1243.
4. Pawson R, Tilley N. Realistic Evaluation. London: Sage; 1997.
Abstract:
Statement of purpose: The purpose of this concurrent session is to present the main findings and recommendations from a five-year study evaluating the implementation of Early Warning Systems (EWS) and the Acute Life-threatening Events: Recognition and Treatment (ALERT) course in Northern Ireland. The presentation will provide delegates with an understanding of the factors that enable and constrain successful implementation of EWS and ALERT in practice, in order to provide an impetus for change.
Methods: The research design was a multiple case study of four wards in two hospitals in Northern Ireland. It followed the principles of realist evaluation research, which allowed empirical data to be gathered to test and refine RRS programme theory [1]. The stages included identifying the programme theories underpinning EWS and ALERT, generating hypotheses, gathering empirical evidence and refining the programme theories. This approach used a variety of mixed methods, including individual and focus group interviews, observation, and documentary analysis of EWS compliance data and ALERT training records. A within- and across-case comparison facilitated the development of mid-range theories from the research evidence.
Results: The official RRS theories developed from the realist synthesis were critically evaluated and compared with the study findings to develop a mid-range theory explaining what works, for whom, in what circumstances. The findings suggest that clinical experience, established working relationships, flexible implementation of protocols, ongoing experiential learning, empowerment and pre-emptive management are key to the success of EWS and ALERT implementation. Each concept is presented as 'context, mechanism and outcome configurations' to provide an understanding of how the context impacts on individual reasoning or behaviour to produce certain outcomes.
Conclusion: These findings highlight the combination of factors that can improve the implementation and sustainability of EWS and ALERT; in light of this evidence, several recommendations are made to provide policymakers with guidance and direction for future policy development.
References:
1. Pawson R, Tilley N. Realistic Evaluation. London: Sage; 1997.
Type of submission: Concurrent session.
Source of funding: Sandra Ryan Fellowship, funded by the School of Nursing & Midwifery, Queen's University Belfast.
Abstract:
Background: People with intellectual disabilities often present with unique challenges that make it more difficult to meet their palliative care needs.
Aim: To define consensus norms for palliative care of people with intellectual disabilities in Europe.
Design: Delphi study in four rounds: (1) a taskforce of 12 experts from seven European countries drafted the norms, based on available empirical knowledge and regional/national guidelines; (2) using an online survey, 34 experts from 18 European countries evaluated the draft norms, provided feedback and distributed the survey within their professional networks. Criteria for consensus were clearly defined; (3) modifications and recommendations were made by the taskforce; and (4) the European Association for Palliative Care reviewed and approved the final version.
Setting and participants: Taskforce members: identified through international networking strategies. Expert panel: a purposive sample identified through taskforce members' networks.
Results: A total of 80 experts from 15 European countries evaluated 52 items within the following 13 norms: equity of access, communication, recognising the need for palliative care, assessment of total needs, symptom management, end-of-life decision making, involving those who matter, collaboration, support for family/carers, preparing for death, bereavement support, education/training, and developing/managing services. None of the items scored less than 86% agreement, making a further round unnecessary. In light of respondents' comments, several items were modified and one item was deleted.
Conclusion: This White Paper presents the first guidance for clinical practice, policy and research related to palliative care for people with intellectual disabilities based on evidence and European consensus, setting a benchmark for changes in policy and practice.
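For readers unfamiliar with Delphi methodology, the per-item agreement behind the 86% figure is simply the share of panellists whose rating counts as endorsement of the item. A hypothetical sketch follows; the 5-point scale, agreement codes and consensus threshold are illustrative assumptions, not taken from the White Paper:

```python
# Hypothetical per-item Delphi agreement: the percentage of experts whose
# Likert rating counts as agreement. Scale, codes and threshold are assumed
# for illustration only.
def percent_agreement(ratings, agree_codes=(4, 5)):
    """Share of panellists rating the item 'agree' or 'strongly agree'."""
    agreeing = sum(r in agree_codes for r in ratings)
    return 100.0 * agreeing / len(ratings)

# Example: 80 experts rating one norm item on a 5-point scale.
item_ratings = [5] * 50 + [4] * 20 + [3] * 7 + [2] * 3
agreement = percent_agreement(item_ratings)
print(f"agreement = {agreement:.1f}%, "
      f"consensus reached: {agreement >= 75.0}")  # assumed threshold
```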
Abstract:
Purpose: To assess the outcomes of cataract surgery performed by novice surgeons during training in a rural programme.
Design: Retrospective study.
Participants: Three hundred thirty-four patients operated on by two trainees under supervision at rural Chinese county hospitals.
Methods: Two trainees performed surgeries under supervision. Visual acuity, refraction and examinations were carried out 3 months postoperatively.
Main Outcome Measures: Postoperative uncorrected visual acuity, pinhole visual acuity, and causes of visual impairment (postoperative uncorrected visual acuity < 6/18).
Results: Among 518 operated patients, 426 (82.2%) could be contacted and 334 (64.4% of operated patients) completed the examinations. The mean age was 74.1 ± 8.8 years and 62.9% were women. Postoperative uncorrected visual acuity was available in 372 eyes. Among them, uncorrected visual acuity was ≥ 6/18 in 278 eyes (74.7%) and < 6/60 in 60 eyes (16.1%); 323 eyes (86.8%) had pinhole visual acuity ≥ 6/18 and 38 eyes (10.2%) had pinhole visual acuity < 6/60. The main causes of visual impairment were uncorrected refractive error (63.9%) and comorbid eye disease (24.5%). Comorbid eye diseases associated with pinhole visual acuity < 6/60 (n = 23, 6.2%) included glaucoma, other optic nerve atrophy, vitreous haemorrhage and retinal detachment.
Conclusions: The findings suggest that hands-on training remains safe and effective even when not implemented in centralized training centres. Further refinement of the training protocol, provision of postoperative refractive services, and more accurate preoperative intraocular lens calculations can help optimize outcomes.
Abstract:
BACKGROUND: Falls and fall-related injuries are symptomatic of an aging population. This study aimed to design, develop, and deliver a novel method of balance training using an interactive game-based system to promote engagement, including older adults at both high and low risk of experiencing a fall.
STUDY DESIGN: Eighty-two older adults (65 years of age and older) were recruited from sheltered accommodation and local activity groups. Forty volunteers were randomly selected and received 5 weeks of balance game training (5 males, 35 females; mean, 77.18 ± 6.59 years), whereas the remaining control participants recorded levels of physical activity (20 males, 22 females; mean, 76.62 ± 7.28 years). The effect of balance game training was measured on levels of functional balance and balance confidence in individuals with and without quantifiable balance impairments.
RESULTS: Balance game training had a significant effect on levels of functional balance and balance confidence (P < 0.05). This was further demonstrated in participants who were deemed at high risk of falls. The overall pattern of results suggests the training program is effective and suitable for individuals at all levels of ability and may therefore play a role in reducing the risk of falls.
CONCLUSIONS: Commercial hardware can be modified to deliver engaging methods of effective balance assessment and training for the older population.
Abstract:
Major food adulteration and contamination events occur with alarming regularity and are known to be episodic; the question is not if but when another large-scale food safety/integrity incident will occur. Indeed, the challenges of maintaining food security are now internationally recognised. The ever-increasing scale and complexity of food supply networks can make them significantly more vulnerable to fraud and contamination, and potentially dysfunctional. This makes the task of deciding which analytical methods are most suitable for collecting and analysing (bio)chemical data within complex food supply chains, at targeted points of vulnerability, that much more challenging. It is evident that those working within and associated with the food industry are seeking rapid, user-friendly methods to detect food fraud and contamination, and rapid/high-throughput screening methods for the analysis of food in general. In addition to being robust and reproducible, these methods should be portable, ideally as handheld and/or remote sensor devices that can be taken to, or positioned on-line or at-line at, points of vulnerability along complex food supply networks, and should require a minimum amount of background training to acquire information-rich data rapidly (ergo, point-and-shoot). Here we briefly discuss a range of spectrometry- and spectroscopy-based approaches, many of which are commercially available, as well as other methods currently under development. We offer a future perspective on how this growing portfolio of detection methods, together with developments in computational and information sciences such as predictive computing and the Internet of Things, will form systems- and technology-based approaches that significantly reduce the areas of vulnerability to food crime within food supply chains. Food fraud is a problem of systems, and therefore requires systems-level solutions and thinking.
Abstract:
BACKGROUND AND PURPOSE: To assess the impact of a standardized delineation protocol and training interventions on PET/CT-based target volume delineation (TVD) in non-small cell lung cancer (NSCLC) in a multicenter setting.
MATERIAL AND METHODS: Over a one-year period, 11 pairs from nine different countries, each comprising a radiation oncologist and a nuclear medicine physician with limited experience in PET/CT-based TVD for NSCLC, took part in a training program through an International Atomic Energy Agency (IAEA) study (NCT02247713). Teams delineated the gross tumor volume of the primary tumor during and after training interventions, according to a provided delineation protocol. In-house developed software recorded the delineations, allowing visual inspection of strategies and assessment of delineation accuracy.
RESULTS: Following the first training, overall concordance indices for 3 repeated cases increased from 0.57 ± 0.07 to 0.66 ± 0.07. The overall mean surface distance between observer and expert contours decreased from -0.40 ± 0.03 cm to -0.01 ± 0.33 cm. After further training, overall concordance indices for another 3 repeated cases increased further, from 0.64 ± 0.06 to 0.80 ± 0.05 (p = 0.01). Mean surface distances decreased from -0.34 ± 0.16 cm to -0.05 ± 0.20 cm (p = 0.01).
CONCLUSION: Multiple training interventions improve the accuracy of PET/CT-based TVD in NSCLC and reduce interobserver variation.
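The abstract does not define its concordance index; a common choice when comparing delineated volumes is the Jaccard index (intersection over union) of the voxel masks, so the sketch below uses that as an illustrative stand-in rather than the study's actual metric:

```python
# Hypothetical sketch: concordance between two delineations measured as the
# Jaccard index (intersection over union) of their voxel masks. The study's
# exact concordance-index definition is not given in the abstract.
import numpy as np

def jaccard_index(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Overlap of two boolean voxel masks: |A ∩ B| / |A ∪ B|."""
    intersection = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return intersection / union if union else 1.0

# Toy example: two overlapping spherical "tumor" masks on a small 3D grid.
grid = np.indices((50, 50, 50))
sphere_a = ((grid - 24) ** 2).sum(axis=0) < 10 ** 2
sphere_b = ((grid - 27) ** 2).sum(axis=0) < 10 ** 2
print(f"concordance (Jaccard) = {jaccard_index(sphere_a, sphere_b):.2f}")
```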