56 results for training and jobs
Abstract:
Exercising in the heat induces thermoregulatory and other physiological strain that can lead to impairments in endurance exercise capacity. The purpose of this consensus statement is to provide up-to-date recommendations to optimise performance during sporting activities undertaken in hot ambient conditions. The most important intervention one can adopt to reduce physiological strain and optimise performance is to heat acclimatise. Heat acclimatisation should comprise repeated exercise-heat exposures over 1-2 weeks. In addition, athletes should initiate competition and training in a euhydrated state and minimise dehydration during exercise. Following the development of commercial cooling systems (e.g., cooling vests), athletes can implement cooling strategies to facilitate heat loss or increase heat storage capacity before training or competing in the heat. Moreover, event organisers should plan for large shaded areas, along with cooling and rehydration facilities, and schedule events so as to minimise health risks to athletes, especially in mass-participation events and during the first hot days of the year. Following the recent examples of the 2008 Olympics and the 2014 FIFA World Cup, sport governing bodies should consider allowing additional (or longer) recovery periods between and during events, to provide hydration and body-cooling opportunities, when competitions are held in the heat.
Abstract:
Background. The main goals of the European Board of Physical and Rehabilitation Medicine (EBPRM), founded in 1991 as the third speciality board of the Union of European Medical Specialists (UEMS), are to harmonize pre-graduate, post-graduate and continuous medical education in physical and rehabilitation medicine (PRM) all over Europe. The harmonization of curricula of the medical specialities and the assessment of medical specialists has become one of the priorities of the UEMS and its working groups, to which the EBPRM contributes. Action. The EBPRM will continue to promote a specific minimal undergraduate curriculum on PRM, including issues such as disability, participation and handicap, to be taught all over Europe as a basis for general medical practice. The EBPRM will also expand the existing EBPRM postgraduate curriculum into a detailed catalogue of learning objectives. This catalogue will serve as a tool to boost harmonization of the national curricula across Europe as well as to structure the content of the MCQ examination. It would be a big step forward towards harmonization of European PRM specialist training if a substantial number of countries used the certifying MCQ examination of the Board as part of their national assessments for PRM specialists.
Abstract:
Introduction: As part of the MicroArray Quality Control (MAQC)-II project, this analysis examines how the choice of univariate feature-selection methods and classification algorithms may influence the performance of genomic predictors under varying degrees of prediction difficulty represented by three clinically relevant endpoints. Methods: We used gene-expression data from 230 breast cancers (grouped into training and independent validation sets), and we examined 40 predictors (five univariate feature-selection methods combined with eight different classifiers) for each of the three endpoints. Their classification performance was estimated on the training set by using two different resampling methods and compared with the accuracy observed in the independent validation set. Results: A ranking of the three classification problems was obtained, and the performance of 120 models was estimated and assessed on an independent validation set. The bootstrapping estimates were closer to the validation performance than were the cross-validation estimates. The required sample size for each endpoint was estimated, and both gene-level and pathway-level analyses were performed on the obtained models. Conclusions: We showed that genomic predictor accuracy is determined largely by an interplay between sample size and classification difficulty. Variations on univariate feature-selection methods and choice of classification algorithm have only a modest impact on predictor performance, and several statistically equally good predictors can be developed for any given classification problem.
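The resampling comparison described above can be illustrated with a small, self-contained sketch. It is not the MAQC-II pipeline: synthetic two-class data and a nearest-centroid classifier stand in for the gene-expression sets and the 40 predictor combinations, and all function names are illustrative. The point it demonstrates is the workflow of comparing cross-validation and out-of-bag bootstrap estimates on a training set against accuracy on an independent validation set.

```python
import random

random.seed(0)

def make_data(n):
    # Two classes separated along five "gene" dimensions (synthetic stand-in).
    data = []
    for _ in range(n):
        y = random.randint(0, 1)
        x = [random.gauss(1.0 * y, 1.0) for _ in range(5)]
        data.append((x, y))
    return data

def centroid_fit(train):
    # Mean expression profile per class.
    cents = {}
    for label in (0, 1):
        xs = [x for x, y in train if y == label]
        cents[label] = [sum(col) / len(xs) for col in zip(*xs)]
    return cents

def centroid_predict(cents, x):
    # Assign the class with the nearest centroid (squared Euclidean distance).
    return min(cents, key=lambda c: sum((a - b) ** 2 for a, b in zip(x, cents[c])))

def accuracy(train, test):
    cents = centroid_fit(train)
    return sum(centroid_predict(cents, x) == y for x, y in test) / len(test)

def cv_estimate(data, k=5):
    # k-fold cross-validation: average accuracy over held-out folds.
    folds = [data[i::k] for i in range(k)]
    accs = []
    for i in range(k):
        train = [d for j, f in enumerate(folds) if j != i for d in f]
        accs.append(accuracy(train, folds[i]))
    return sum(accs) / k

def bootstrap_estimate(data, n_boot=50):
    # Out-of-bag bootstrap: train on a resample, test on left-out samples.
    accs = []
    for _ in range(n_boot):
        idx = [random.randrange(len(data)) for _ in range(len(data))]
        train = [data[i] for i in idx]
        oob = [d for i, d in enumerate(data) if i not in set(idx)]
        if oob:
            accs.append(accuracy(train, oob))
    return sum(accs) / len(accs)

train_set = make_data(150)   # stands in for the training cohort
valid_set = make_data(100)   # stands in for the independent validation set

print(f"cross-validation estimate: {cv_estimate(train_set):.3f}")
print(f"bootstrap (OOB) estimate:  {bootstrap_estimate(train_set):.3f}")
print(f"independent validation:    {accuracy(train_set, valid_set):.3f}")
```

Comparing the two resampling estimates against the validation accuracy mirrors the study's finding that resampling-based estimates can track, but need not equal, truly independent performance.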
Abstract:
The aim of this exploratory study was to assess the impact of clinicians' defense mechanisms (defined as self-protective psychological mechanisms triggered by the affective load of the encounter with the patient) on adherence to a communication skills training (CST). The population consisted of oncology clinicians (N = 31) who participated in a CST. An interview with simulated cancer patients was recorded prior to and 6 months after the CST. Defenses were measured before and after the CST and correlated with a prototype of an ideally conducted interview based on the criteria of the CST teachers. Clinicians who used more adaptive defense mechanisms showed better adherence to communication skills after CST than clinicians with less adaptive defenses (F(1, 29) = 5.26, p = 0.03, d = 0.42). Improvement in communication skills after CST seems to depend on the clinician's initial level of defenses prior to CST. Implications for practice and training are discussed. Communication has been recognized as a central element of cancer care [1]. Ineffective communication may contribute to patients' confusion, uncertainty, and increased difficulty in asking questions, expressing feelings, and understanding information [2, 3], and may also contribute to clinicians' lack of job satisfaction and emotional burnout [4]. Therefore, communication skills trainings (CST) for oncology clinicians have been widely developed over the last decade. These trainings should increase clinicians' skills in responding to the patient's needs and foster an adequate encounter with the patient, with efficient exchange of information [5]. While CSTs show great diversity with regard to their pedagogic approaches [6, 7], the main elements of CST consist of (1) role play between participants, (2) analysis of videotaped interviews with simulated patients, and (3) interactive case discussion provided by participants.
As recently stated in a consensus paper [8], CSTs need to be taught in small groups (up to 10-12 participants) and to last at least 3 days in order to be effective. Several systematic reviews have evaluated the impact of CST on clinicians' communication skills [9-11]. Effectiveness of CST can be assessed by two main approaches: participant-based and patient-based outcomes. Measures can be self-reported, but, according to Gysels et al. [10], behavioral assessment of patient-physician interviews [12] is the most objective and reliable method for measuring change after training. Based on 22 studies on participants' outcomes, Merckaert et al. [9] reported an increase in communication skills and in participants' satisfaction with training, as well as changes in attitudes and beliefs. The evaluation of CST remains a challenging task, and the variables mediating skills improvement remain unidentified. We therefore recently conducted a study evaluating the impact of CST on clinicians' defenses by comparing the evolution of defenses of clinicians participating in a CST with the defenses of a control group without training [13]. Defenses are unconscious psychological processes that protect against anxiety or distress; they thereby contribute to the individual's adaptation to stress [14]. Perry uses the term "defensive functioning" to indicate the degree of adaptation linked to an individual's use of a range of specific defenses, ranging from low defensive functioning, when he or she tends to use generally less adaptive defenses (such as projection, denial, or acting out), to high defensive functioning, when he or she tends to use generally more adaptive defenses (such as altruism, intellectualization, or introspection) [15, 16]. Although several authors have addressed the emotional difficulties of oncology clinicians when facing patients and their need to preserve themselves [7, 17, 18], no research has yet been conducted on the defenses of clinicians.
For example, repeated use of less adaptive defenses, such as denial, may allow the clinician to avoid or reduce distress, but it also diminishes his or her ability to respond to the patient's emotions, to identify and respond adequately to the patient's needs, and to foster the therapeutic alliance. Results of the above-mentioned study [13] showed two groups of clinicians: one with a higher defensive functioning and one with a lower defensive functioning prior to CST. After the training, a difference in defensive functioning between clinicians who participated in CST and clinicians of the control group was shown only for clinicians with a higher defensive functioning. Some clinicians may therefore be more responsive to CST than others. To further address this issue, the present study aimed to evaluate the relationship between the level of adherence to an "ideally conducted interview", as defined by the teachers of the CST, and the level of the clinician's defensive functioning. We hypothesized that, after CST, clinicians with a higher defensive functioning would show greater adherence to the "ideally conducted interview" than clinicians with a lower defensive functioning.
Abstract:
Despite the limited research on the effects of altitude (or hypoxic) training interventions on team-sport performance, players from all around the world engaged in these sports are now using altitude training more than ever before. In March 2013, an Altitude Training and Team Sports conference was held in Doha, Qatar, to establish a forum of research and practical insights into this rapidly growing field. A round-table meeting in which the panellists engaged in focused discussions concluded this conference. This has resulted in the present position statement, designed to highlight some key issues raised during the debates and to integrate the ideas into a shared conceptual framework. The present signposting document has been developed for use by support teams (coaches, performance scientists, physicians, strength and conditioning staff) and other professionals who have an interest in the practical application of altitude training for team sports. After more than four decades of research, there is still no consensus on the optimal strategies to elicit the best results from altitude training in a team-sport population. However, there are some recommended strategies discussed in this position statement to adopt for improving the acclimatisation process when training/competing at altitude and for potentially enhancing sea-level performance. It is our hope that this information will be intriguing, balanced and, more importantly, stimulating to the point that it promotes constructive discussion and serves as a guide for future research aimed at advancing the burgeoning body of knowledge in the area of altitude training for team sports.
Abstract:
The aim of this study was to investigate the synergistic effects of endurance training and hypoxia on endurance performance in normoxic and hypoxic conditions (approximately 3000 m above sea level) as well as on lactate and glucose metabolism during prolonged exercise. For this purpose, 14 well-trained cyclists performed 12 training sessions in conditions of normobaric hypoxia (HYP group, n = 7) or normoxia (NOR group, n = 7) over 4 weeks. Before and after training, lactate and glucose turnover rates were measured by infusion of exogenous lactate and stable isotope tracers. Endurance performance was assessed during incremental tests performed in normoxia and hypoxia and a 40 km time trial performed in normoxia. After training, performance was similarly and significantly improved in the NOR and HYP groups (training, P < 0.001) in normoxic conditions. No further effect of hypoxic training was found on markers of endurance performance in hypoxia (training x hypoxia interaction, n.s.). In addition, training and hypoxia had no significant effect on lactate turnover rate. In contrast, there was a significant interaction of training and hypoxia (P < 0.05) on glucose metabolism, as follows: plasma insulin and glucose concentrations were significantly increased; glucose metabolic clearance rate was decreased; and the insulin to glucagon ratio was increased after training in the HYP group. In conclusion, our results show that, compared with training in normoxia, training in hypoxia has no further effect on endurance performance in both normoxic and hypoxic conditions or on lactate metabolic clearance rate. Additionally, these findings suggest that training in hypoxia impairs blood glucose regulation in endurance-trained subjects during exercise.
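For context, the glucose metabolic clearance rate reported above is conventionally obtained from the tracer-determined turnover data as follows (this is the standard tracer-kinetics relation, not necessarily the authors' exact computation):

```latex
\mathrm{MCR}_{\mathrm{glc}} \;=\; \frac{R_{d}}{[\mathrm{glucose}]_{p}}
```

where $R_{d}$ is the rate of glucose disappearance and $[\mathrm{glucose}]_{p}$ the plasma glucose concentration. A rise in plasma glucose without a proportional rise in $R_{d}$ therefore lowers the clearance rate, which is consistent with the pattern reported for the hypoxia-trained group.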
Abstract:
Ample evidence indicates that inhibitory control (IC), a key executive component referring to the ability to suppress cognitive or motor processes, relies on a right-lateralized fronto-basal brain network. However, whether and how IC can be improved with training, and the underlying neuroplastic mechanisms, remain largely unresolved. We used functional and structural magnetic resonance imaging to measure the effects of 2 weeks of training with a Go/NoGo task specifically designed to improve frontal top-down IC mechanisms. The training-induced behavioral improvements were accompanied by a decrease in neural activity to inhibition trials within the right pars opercularis and triangularis, and in the left pars orbitalis of the inferior frontal gyri. Analyses of changes in brain anatomy induced by the IC training revealed increases in grey matter volume in the right pars orbitalis and modulations of white matter microstructure in the right pars triangularis. The task-specificity of the training effects was confirmed by an absence of change in neural activity during a control working memory task. Our combined anatomical and functional findings indicate that differential patterns of functional and structural plasticity between and within the inferior frontal gyri enhanced the speed of top-down inhibition processes and, in turn, IC proficiency. The results suggest that training-based interventions might help overcome the anatomical and functional deficits of the inferior frontal gyri manifesting in inhibition-related clinical conditions. More generally, we demonstrate how multimodal neuroimaging investigations of training-induced neuroplasticity can reveal novel anatomo-functional dissociations within frontal executive brain networks. Hum Brain Mapp 36:2527-2543, 2015. © 2015 Wiley Periodicals, Inc.
Abstract:
In recent years, many research groups have taken an interest in how to monitor the training of elite athletes in order to optimise its returns while preserving the athletes' health. One of the cardinal problems of poorly managed athletic training is the overtraining syndrome. The definition of this syndrome proposed by Kreider et al., currently accepted by both the European College of Sport Science and the American College of Sports Medicine, is: "An accumulation of training and/or non-training stress resulting in long-term decrement in performance capacity with or without related physiological and psychological signs and symptoms of maladaptation in which restoration of performance capacity may take several weeks or months." Current recommendations for monitoring training and detecting the overtraining syndrome early advocate, among other things, psychological follow-up using questionnaires (such as the Profile of Mood States (POMS)), monitoring of the training load perceived by the athlete (e.g., with the session rating of perceived exertion (RPE) method according to C. Foster), monitoring of athletes' performances and of the training loads actually completed, and monitoring of health problems (injuries and illnesses). Monitoring blood and hormonal parameters is not recommended, on the one hand for reasons of cost and feasibility, and on the other because the scientific literature has so far been unable to establish clear evidence on the subject.
To date, few studies have followed these parameters rigorously, over a long period, and in a large number of athletes. This is precisely the aim of our study.
Abstract:
Research into the biomechanical manifestation of fatigue during exhaustive runs is increasingly popular, but a better understanding of how spring-mass behaviour adapts during strenuous, self-paced exercise is still needed to develop optimized training and injury-prevention programs. This study investigated continuous changes in running mechanics and spring-mass behaviour during a 5-km run. Twelve competitive triathletes performed a 5-km running time trial (mean performance: 17 min 30 s) on a 200 m indoor track. Vertical and anterior-posterior ground reaction forces were measured every 200 m by a 5-m long force platform system and used to determine spring-mass model characteristics. After a fast start, running velocity progressively decreased (-11.6%; P<0.001) in the middle part of the race before an end spurt in the final 400-600 m. Stride length (-7.4%; P<0.001) and frequency (-4.1%; P=0.001) decreased over the 25 laps, while contact time (+8.9%; P<0.001) and total stride duration (+4.1%; P<0.001) progressively lengthened. Peak vertical forces (-2.0%; P<0.01) and leg compression (-4.3%; P<0.05), but not centre of mass vertical displacement (+3.2%; P>0.05), decreased with time. As a result, vertical stiffness decreased (-6.0%; P<0.001) during the run, whereas changes in leg stiffness were not significant (+1.3%; P>0.05). Spring-mass behaviour thus progressively changes during a 5-km time trial towards deteriorated vertical stiffness, which alters impact and force-production characteristics.
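The stiffness changes reported above follow from the standard spring-mass model definitions (a sketch of the conventional formulas; the authors' exact computation is not reproduced here):

```latex
k_{\mathrm{vert}} \;=\; \frac{F_{\max}}{\Delta y_{c}},
\qquad
k_{\mathrm{leg}} \;=\; \frac{F_{\max}}{\Delta L}
```

where $F_{\max}$ is the peak vertical ground reaction force, $\Delta y_{c}$ the vertical displacement of the centre of mass, and $\Delta L$ the leg compression. With $F_{\max}$ slightly decreased and $\Delta y_{c}$ unchanged or slightly increased, $k_{\mathrm{vert}}$ must fall, whereas the concurrent decrease in $\Delta L$ offsets the force decrease in $k_{\mathrm{leg}}$, which is consistent with the reported drop in vertical stiffness and the non-significant change in leg stiffness.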
Abstract:
Fall prevention in elderly subjects is often based on training and rehabilitation programs that mostly include traditional balance and strength exercises. By applying such conventional interventions to improve gait performance and decrease fall risk, some important factors are neglected, such as the dynamics of gait and the motor learning processes. The EU project "Self Mobility Improvement in the eLderly by counteractING falls" (SMILING project) aimed to improve age-related gait and balance performance by applying unpredictable external perturbations during walking, delivered through motorized shoes that change insole inclination at each stance. This paper describes the shoe-worn inertial module and the gait analysis method needed to control the shoe insole inclination in real time during training, as well as the gait spatio-temporal parameters obtained during long-distance walking before and after the 8-week training program that assessed the efficacy of training with these motorized shoes.
Abstract:
BACKGROUND AND OBJECTIVES: Central nervous system (CNS) stimulants may be used to reduce tiredness and increase alertness, competitiveness, and aggression. They are more likely to be used in competition but may also be used during training to increase the intensity of the training session. There are several potential dangers involved in their misuse in contact sports. This paper reviews the three main CNS stimulants, ephedrine, amfetamine, and cocaine, in relation to misuse in sport. METHODS: Description of the pharmacology, actions, and side effects of amfetamine, cocaine, and ephedrine. RESULTS: CNS stimulants have psychotropic effects that may be perceived to be ergogenic. Some are prescription drugs, such as Ephedra alkaloids, and there are issues regarding their appropriate therapeutic use. Recently, attention has been given to their widespread use by athletes, despite the lack of evidence for any ergogenic or real performance benefit and despite their potentially serious side effects. Recreational drugs, some of which are illegal (cocaine, amfetamines), are commonly used by athletes and may have ergolytic effects. Overall, these drugs are notable for their frequent appearance in anti-doping laboratory statistics and in the media, and for their potentially serious adverse effects. CONCLUSIONS: Doping with CNS stimulants is a real public health problem, and all sports authorities should participate in its prevention. Dissemination of information is essential to prevent doping in sport and to provide alternatives. Adequate training and education in this domain should be introduced.
Abstract:
BACKGROUND: Brief motivational intervention (BMI) has shown promising results in reducing alcohol use in young adults. Knowledge of the mechanisms that predict BMI efficacy could potentially improve treatment effect sizes through data that optimize clinical training and implementation. In particular, little attention has been given to counselor influence on treatment mechanisms. METHODS: We investigated the influence of counselors on BMI efficacy in reducing alcohol use among non-treatment-seeking young men (age 20) screened as hazardous drinkers. Participants were randomly allocated to (i) a group receiving a single BMI from 1 of 18 counselors selected to maximize differences in several of their characteristics (gender, professional status, clinical experience, and motivational interviewing [MI] experience) or (ii) a control group receiving assessment only. Drinking at 3-month follow-up was first compared between the BMI and control groups to assess efficacy. Then, the influence of counselors' characteristics (i.e., gender, professional status, clinical experience, MI experience, BMI attitudes, and expectancies) and within-session behaviors (i.e., measured by the Motivational Interviewing Skill Code) on outcome was tested in regression analyses. RESULTS: There was a significant (p = 0.02) decrease in alcohol use in the BMI group compared to the control group. Counselors who were male, who were more experienced, who had more favorable BMI attitudes and expectancies and higher MI skills, but who, surprisingly, showed fewer MI-consistent behaviors, had significantly better outcomes than the control group, while their counterparts did not. CONCLUSIONS: The current study demonstrated BMI efficacy in reducing alcohol use within a sample of non-treatment-seeking young adult males. Moreover, the BMI effect was related to interindividual differences among counselors, and the results therefore provide recommendations for BMI training and implementation with similar populations.
Abstract:
Summary: Forensic science, both as a source of and as a remedy for errors potentially leading to judicial error, has been studied empirically in this research. A comprehensive literature review, experimental tests on the influence of observational biases in fingermark comparison, and semi-structured interviews with heads of forensic science laboratories/units in Switzerland and abroad were the tools used. For the literature review, some of the areas studied are: the quality of forensic science work in general, the complex interaction between science and law, and specific propositions as to error sources not directly related to the interaction between law and science. A list of potential error sources all the way from the crime scene to the writing of the report has been established as well. For the empirical tests, the ACE-V (Analysis, Comparison, Evaluation, and Verification) process of fingermark comparison was selected as an area of special interest for the study of observational biases, due to its heavy reliance on visual observation and to recent cases of misidentification. Results of the tests performed with forensic science students tend to show that the decision-making stages are the most vulnerable to stimuli inducing observational biases. For the semi-structured interviews, eleven senior forensic scientists answered questions on several subjects, for example on potential and existing error sources in their work, the limitations of what can be done with forensic science, and the possibilities and tools for minimising errors. Training and education to augment the quality of forensic science were discussed, together with possible solutions to minimise the risk of errors in forensic science. In addition, the length of time that samples of physical evidence are kept was determined as well. Results tend to show considerable agreement on most subjects among the international participants.
Their opinions on possible explanations for the occurrence of such problems, and on the relative weight of such errors in the three stages of crime scene, laboratory, and report writing, disagree, however, with opinions widely represented in the existing literature. Through the present research it was therefore possible to obtain a better view of the interaction of forensic science and judicial error, and to propose practical recommendations to minimise their occurrence.
Resumo:
With the free movement of people in the European Union, medical mobility has increased significantly. This is notably the case for disciplines facing a shortage of well-trained staff. Pathology is among those specialties, and the discipline is indeed confronted with a striking increase in mobility among trainees and qualified specialists. The presumption underlying unlimited mobility is that the competencies of medical specialists across the European countries are more or less equal, implying substantial similarity in their postgraduate training programs. To assess whether reality corresponds with this presumption, we conducted a survey of the content and practice requirements of the curricula in the EU and affiliated countries. The results indicate a striking heterogeneity in training program content and practice requirements. To name a few elements: the duration of the training program varied between 4 and 6 years; the number of required autopsies varied between none at all and 300; the number of required biopsies varied between none at all and 15,000. We conclude that harmonization of training outcomes in Europe is a goal that needs to be pursued. It will be difficult to reach through harmonization of the training programs themselves, as these are co-determined by political, cultural, and administrative factors that are difficult to influence. Harmonization might instead be attained by defining the general and specific competencies expected at the end of training and subsequently assessing them through an examination to which all trainees in Europe are subjected.