60 results for Computer Applications, Computer Skills, Project Managers, Training
Abstract:
Computer simulations provide a practical way to address scientific questions that would otherwise be intractable. In evolutionary biology, and in population genetics in particular, the investigation of evolutionary processes frequently involves complex models, making simulations a particularly valuable tool in the field. In this thesis, I explored three questions involving the geographical range expansion of populations, taking advantage of spatially explicit simulations coupled with approximate Bayesian computation (ABC). First, the neutral evolutionary history of the human spread around the world was investigated, leading to a surprisingly simple model: a straightforward diffusion process of migrations from East Africa across a world map with homogeneous landmasses replicated, to a very large extent, the complex patterns observed in real human populations, suggesting a more continuous (as opposed to structured) view of the distribution of modern human genetic diversity, which may serve well as a base model for further studies. Second, the postglacial evolution of the European barn owl (Tyto alba), with the formation of a remarkable color cline, was examined with two rounds of simulations: (i) to determine the background demographic history and (ii) to test the probability that a phenotypic cline like the one observed in natural populations could appear without natural selection. We verified that the modern barn owl population originated from a single Iberian refugium and that its color cline did not form through neutral evolution but required the action of selection. The third and last part of this thesis is a simulation-only study inspired by the barn owl case above. In this chapter, we showed that selection is indeed effective during range expansions and that it leaves a distinctive signature, which can then be used to detect and measure natural selection in range-expanding populations.
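The ABC framework used throughout the thesis can be summarized as rejection sampling: draw parameters from the prior, simulate data, compute summary statistics, and keep the draws whose statistics lie closest to the observed ones. The sketch below is a minimal, generic version of that scheme; the simulate and prior_sampler callables, the Euclidean distance, and the tolerance are illustrative assumptions, not the thesis pipeline.

```python
import numpy as np

def abc_rejection(observed_stats, simulate, prior_sampler,
                  n_draws=100_000, tolerance=0.01):
    """Minimal ABC rejection sampler (illustrative sketch, not the thesis pipeline).

    observed_stats : 1-D array of summary statistics computed from the real data
    simulate       : function(params) -> summary statistics of one simulated dataset
    prior_sampler  : function() -> one parameter vector drawn from the prior
    tolerance      : fraction of draws retained (those closest to the observed data)
    """
    draws, distances = [], []
    for _ in range(n_draws):
        params = prior_sampler()
        sim_stats = np.asarray(simulate(params))
        # Euclidean distance between simulated and observed summary statistics
        distances.append(np.linalg.norm(sim_stats - observed_stats))
        draws.append(params)
    distances = np.asarray(distances)
    cutoff = np.quantile(distances, tolerance)
    # The retained parameter vectors approximate the posterior distribution
    return np.asarray([p for p, d in zip(draws, distances) if d <= cutoff])
```

In practice the summary statistics are usually standardized before computing distances, and the accepted draws are often further adjusted by regression, but the acceptance step above is the core of the method.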
Abstract:
The main objective of WP1 of the ORAMED (Optimization of RAdiation protection for MEDical staff) project is to obtain a set of standardised data on extremity and eye lens doses for staff in interventional radiology (IR) and cardiology (IC) and to optimise staff protection. A coordinated measurement program in different hospitals across Europe will help towards this goal. This study analyses the first results of the measurement campaign performed during IR and IC procedures in 34 European hospitals. The highest doses were found for pacemaker implantations, renal angioplasties and embolisations. The left finger and wrist seem to receive the highest extremity doses, while the highest eye lens doses were measured during embolisations. Finally, it was concluded that it is difficult to find a general correlation between kerma-area product and extremity or eye lens doses.
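The correlation mentioned in the last sentence is typically assessed per procedure type, relating kerma-area product to the dose measured at a given position. A minimal sketch of such an analysis is shown below; the numerical values are invented placeholders, not ORAMED measurements.

```python
import numpy as np
from scipy import stats

# Placeholder per-procedure values: kerma-area product (Gy*cm^2) and
# left-finger dose (mSv). Real values would come from the measurement campaign.
kap = np.array([12.3, 45.1, 8.7, 60.2, 33.4, 21.0, 55.8, 17.9])
finger_dose = np.array([0.08, 0.31, 0.05, 0.52, 0.19, 0.15, 0.28, 0.11])

# Pearson assumes roughly linear association; Spearman is rank-based and
# more robust for skewed dose distributions.
pearson_r, pearson_p = stats.pearsonr(kap, finger_dose)
spearman_rho, spearman_p = stats.spearmanr(kap, finger_dose)
print(f"Pearson r = {pearson_r:.2f} (p = {pearson_p:.3f})")
print(f"Spearman rho = {spearman_rho:.2f} (p = {spearman_p:.3f})")
```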
Abstract:
The aim of this exploratory study was to assess the impact of clinicians' defense mechanisms (defined as self-protective psychological mechanisms triggered by the affective load of the encounter with the patient) on adherence to a communication skills training (CST). The population consisted of oncology clinicians (N = 31) who participated in a CST. An interview with simulated cancer patients was recorded prior to and 6 months after the CST. Defenses were measured before and after CST and correlated with a prototype of an ideally conducted interview based on the criteria of the CST teachers. Clinicians who used more adaptive defense mechanisms showed better adherence to communication skills after CST than clinicians with less adaptive defenses (F(1, 29) = 5.26, p = 0.03, d = 0.42). Improvement in communication skills after CST seems to depend on the clinician's level of defenses prior to CST. Implications for practice and training are discussed.

Communication has been recognized as a central element of cancer care [1]. Ineffective communication may contribute to patients' confusion, uncertainty, and increased difficulty in asking questions, expressing feelings, and understanding information [2, 3], and may also contribute to clinicians' lack of job satisfaction and emotional burnout [4]. Therefore, communication skills trainings (CST) for oncology clinicians have been widely developed over the last decade. These trainings should increase clinicians' skills in responding to the patient's needs and promote an adequate encounter with the patient, with efficient exchange of information [5]. While CSTs show great diversity with regard to their pedagogic approaches [6, 7], the main elements of CST consist of (1) role play between participants, (2) analysis of videotaped interviews with simulated patients, and (3) interactive discussion of cases provided by participants. As recently stated in a consensus paper [8], CSTs need to be taught in small groups (up to 10-12 participants) and last at least 3 days in order to be effective. Several systematic reviews have evaluated the impact of CST on clinicians' communication skills [9-11]. The effectiveness of CST can be assessed by two main approaches: participant-based and patient-based outcomes. Measures can be self-reported but, according to Gysels et al. [10], behavioral assessment of patient-physician interviews [12] is the most objective and reliable method for measuring change after training. Based on 22 studies of participants' outcomes, Merckaert et al. [9] reported an increase in communication skills and in participants' satisfaction with training, as well as changes in attitudes and beliefs.

The evaluation of CST remains a challenging task, and the variables mediating skills improvement remain unidentified. We therefore recently conducted a study evaluating the impact of CST on clinicians' defenses by comparing the evolution of defenses of clinicians participating in a CST with those of a control group without training [13]. Defenses are unconscious psychological processes which protect against anxiety or distress and thereby contribute to the individual's adaptation to stress [14].
Perry uses the term "defensive functioning" to indicate the degree of adaptation linked to an individual's use of a range of specific defenses, ranging from low defensive functioning, when he or she tends to use generally less adaptive defenses (such as projection, denial, or acting out), to high defensive functioning, when he or she tends to use generally more adaptive defenses (such as altruism, intellectualization, or introspection) [15, 16]. Although several authors have addressed the emotional difficulties of oncology clinicians when facing patients and their need to preserve themselves [7, 17, 18], no research has yet been conducted on the defenses of clinicians. For example, repeated use of less adaptive defenses, such as denial, may allow the clinician to avoid or reduce distress, but it also diminishes his or her ability to respond to the patient's emotions, to identify and respond adequately to the patient's needs, and to foster the therapeutic alliance. Results of the above-mentioned study [13] showed two groups of clinicians: one with a higher defensive functioning and one with a lower defensive functioning prior to CST. After the training, a difference in defensive functioning between clinicians who participated in the CST and clinicians of the control group was only shown for clinicians with a higher defensive functioning. Some clinicians may therefore be more responsive to CST than others. To further address this issue, the present study aimed to evaluate the relationship between the level of adherence to an "ideally conducted interview", as defined by the teachers of the CST, and the level of the clinician's defensive functioning. We hypothesized that, after CST, clinicians with a higher defensive functioning would show greater adherence to the "ideally conducted interview" than clinicians with a lower defensive functioning.
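The statistic reported in the abstract (F(1, 29) = 5.26, p = 0.03, d = 0.42) corresponds to a one-way ANOVA on adherence scores with defensive-functioning group as the factor, together with Cohen's d as a standardized effect size. The sketch below reproduces that kind of computation; the group sizes and adherence values are invented placeholders, not the study data.

```python
import numpy as np
from scipy import stats

# Made-up adherence scores (0-100) for illustration only
high_def = np.array([72, 68, 75, 80, 66, 71, 77, 74, 69, 73, 78, 70, 76, 72, 74])
low_def  = np.array([61, 58, 65, 60, 55, 63, 59, 62, 57, 64, 60, 58, 66, 61, 59, 62])

# One-way ANOVA with two groups (equivalent to an independent t-test, F = t^2)
f_stat, p_value = stats.f_oneway(high_def, low_def)

# Cohen's d using the pooled standard deviation
n1, n2 = len(high_def), len(low_def)
pooled_sd = np.sqrt(((n1 - 1) * high_def.var(ddof=1) +
                     (n2 - 1) * low_def.var(ddof=1)) / (n1 + n2 - 2))
cohens_d = (high_def.mean() - low_def.mean()) / pooled_sd

print(f"F(1, {n1 + n2 - 2}) = {f_stat:.2f}, p = {p_value:.3f}, d = {cohens_d:.2f}")
```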
Abstract:
This study assessed medical students' perception of individual vs. group training in breaking bad news (BBN) and explored training needs in BBN. Master-level students (N = 124) were randomised either to group training (GT), in which only one or two students per group conducted a simulated patient (SP) interview that was then discussed collectively with the faculty, or to individual training (IT), in which each student conducted an SP interview that was discussed during individual supervision. Training evaluation was based on questionnaires, and the videotaped interviews were rated using the Roter Interaction Analysis System. Students were globally satisfied with the training. Still, there were noticeable differences between students who performed an interview (GT/IT) and students who only observed interviews (GT). The analysis of the interviews showed significant differences according to scenario and gender. Active involvement through SP interviews seems necessary for students to feel able to reach the training objectives. The evaluation of communication skills, which revealed baseline heterogeneity, supports individualised training.
Abstract:
The present study investigates the short- and long-term outcomes of a computer-assisted cognitive remediation (CACR) program in adolescents with psychosis or at high risk. Thirty-two adolescents participated in a blinded 8-week randomized controlled trial of CACR treatment compared to computer games (CG). Clinical and neuropsychological evaluations were undertaken at baseline, at the end of the program and at 6 months. At the end of the program (n = 28), results indicated that visuospatial abilities (Repeatable Battery for the Assessment of Neuropsychological Status, RBANS; P = .005) improved significantly more in the CACR group than in the CG group. Furthermore, other cognitive functions (RBANS), psychotic symptoms (Positive and Negative Syndrome Scale) and psychosocial functioning (Social and Occupational Functioning Assessment Scale) improved significantly, but at similar rates, in the two groups. At long-term follow-up (n = 22), cognitive abilities did not demonstrate any improvement in the control group, whereas significant long-term improvements in inhibition (Stroop; P = .040) and reasoning (Block Design Test; P = .005) were observed in the CACR group. In addition, symptom severity (Clinical Global Impression) decreased significantly in the control group (P = .046) and marginally in the CACR group (P = .088). To sum up, CACR can be successfully administered in this population. CACR proved to be effective over and above CG for the most intensively trained cognitive ability. Finally, in the long term, enhanced reasoning and inhibition abilities, which are necessary to execute higher-order goals and to adapt behavior to an ever-changing environment, were observed in the adolescents who received CACR.
Abstract:
To evaluate how young physicians in training perceive their patients' cardiovascular risk based on the medical charts and their clinical judgment. Cross-sectional observational study. University outpatient clinic, Lausanne, Switzerland. Two hundred hypertensive patients and 50 non-hypertensive patients with at least one cardiovascular risk factor. Comparison of the absolute 10-year cardiovascular risk, calculated by a computer program based on the Framingham score and adapted for physicians by the WHO/ISH, with the risk perceived by the physicians on clinical assessment. Physicians underestimated the 10-year cardiovascular risk of their patients compared to that calculated with the Framingham score. Concordance between the two methods was 39% for hypertensive patients and 30% for non-hypertensive patients. Underestimation of cardiovascular risk for hypertensive patients was related to the fact that they had a stabilized systolic blood pressure under 140 mm Hg (OR = 2.1 [1.1; 4.1]). These data show that young physicians in training often have an incorrect perception of the cardiovascular risk of their patients, with a tendency to underestimate it. However, the calculated risk could also be slightly overestimated when applying the Framingham Heart Study model to a Swiss population. To implement a systematic evaluation of risk factors in primary care, greater emphasis should be placed on the teaching of cardiovascular risk evaluation and on the implementation of quality improvement programs.
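Risk calculators of the kind used in this study follow the Cox-model form of the Framingham equations: the absolute 10-year risk is 1 - S0(10)^exp(LP - mean LP), where LP is a linear combination of the patient's risk factors and S0(10) is the baseline 10-year survival. The sketch below shows that structure only; the coefficients, population means, and baseline survival are placeholders, not the published Framingham values.

```python
import math

def ten_year_risk(linear_predictor, mean_linear_predictor, baseline_survival):
    """Cox-model style absolute risk: 1 - S0(10)^exp(LP - mean LP)."""
    return 1.0 - baseline_survival ** math.exp(linear_predictor - mean_linear_predictor)

# Placeholder coefficients and population means -- NOT the published Framingham values.
coefficients = {"log_age": 2.0, "log_sbp": 1.8, "log_total_chol": 1.2,
                "log_hdl": -0.9, "smoker": 0.6, "diabetes": 0.5}
population_means = {"log_age": 3.9, "log_sbp": 4.9, "log_total_chol": 5.3,
                    "log_hdl": 3.9, "smoker": 0.35, "diabetes": 0.05}
baseline_survival_10y = 0.90  # placeholder S0(10)

def linear_predictor(profile):
    return sum(coefficients[k] * profile[k] for k in coefficients)

# Hypothetical patient: 55 years old, SBP 150 mm Hg, total cholesterol 6.2 mmol/L,
# HDL 1.1 mmol/L, smoker, no diabetes.
patient = {"log_age": math.log(55), "log_sbp": math.log(150),
           "log_total_chol": math.log(6.2), "log_hdl": math.log(1.1),
           "smoker": 1, "diabetes": 0}

lp = linear_predictor(patient)
mean_lp = sum(coefficients[k] * population_means[k] for k in coefficients)
print(f"Estimated 10-year risk: {ten_year_risk(lp, mean_lp, baseline_survival_10y):.1%}")
```

The study's point that a Framingham-derived risk may be slightly overestimated in a Swiss population corresponds, in this formulation, to the baseline survival and population means being calibrated on the original Framingham cohort rather than on the local one.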
Abstract:
The purpose of this study was to test the hypothesis that athletes with slower oxygen uptake (VO2) kinetics would benefit more, in terms of time spent near VO2max, from an increase in the intensity of intermittent running training (IT). After determination of VO2max, vVO2max (i.e. the minimal velocity associated with VO2max in an incremental test) and the time to exhaustion at vVO2max (Tlim), seven well-trained triathletes performed two IT sessions in random order. The two IT sessions comprised 30-s work intervals at either 100% (IT100%) or 105% (IT105%) of vVO2max, with 30-s recovery intervals at 50% of vVO2max between repeats. The parameters of the VO2 kinetics (td1, tau1, A1, td2, tau2, A2, i.e. the time delay, time constant and amplitude of the primary phase and of the slow component, respectively) during the Tlim test were modelled with two exponential functions. The highest VO2 reached was significantly lower (P < 0.01) in IT100%, run at 19.8 (0.9) km/h [66.2 (4.6) ml/min/kg], than in IT105%, run at 20.8 (1.0) km/h [71.1 (4.9) ml/min/kg], or in the incremental test [71.2 (4.2) ml/min/kg]. The time sustained above 90% of VO2max in IT105% [338 (149) s] was significantly higher (P < 0.05) than in IT100% [168 (131) s]. The average Tlim was 244 (39) s, tau1 was 15.8 (5.9) s and td2 was 96 (13) s. The value of tau1 was correlated with the difference in time spent above 90% of VO2max between IT105% and IT100% (r = 0.91; P < 0.01). In conclusion, athletes with slower VO2 kinetics in a constant-velocity test at vVO2max benefited more from the 5% rise in IT work intensity, exercising for longer above 90% of VO2max when the IT intensity was increased from 100% to 105% of vVO2max.
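The two-exponential description of the VO2 response corresponds to a baseline plus a primary phase and a slow component, each rising exponentially after its own time delay: VO2(t) = VO2base + A1*(1 - exp(-(t - td1)/tau1)) for t >= td1, plus A2*(1 - exp(-(t - td2)/tau2)) for t >= td2. The fitting sketch below uses synthetic data, since the study's breath-by-breath recordings are not available; the numerical values and starting guesses are illustrative only.

```python
import numpy as np
from scipy.optimize import curve_fit

def vo2_double_exponential(t, vo2_base, a1, td1, tau1, a2, td2, tau2):
    """Baseline + primary phase + slow component; each term starts after its time delay."""
    primary = np.where(t >= td1, a1 * (1 - np.exp(-np.maximum(t - td1, 0) / tau1)), 0.0)
    slow = np.where(t >= td2, a2 * (1 - np.exp(-np.maximum(t - td2, 0) / tau2)), 0.0)
    return vo2_base + primary + slow

# Synthetic "breath-by-breath" data for illustration (ml/min/kg, one value per second)
t = np.arange(0, 244, 1.0)
true_params = (10.0, 50.0, 5.0, 16.0, 6.0, 96.0, 60.0)
rng = np.random.default_rng(0)
vo2 = vo2_double_exponential(t, *true_params) + rng.normal(0, 1.5, t.size)

# Fit the seven parameters (VO2_base, A1, td1, tau1, A2, td2, tau2); p0 gives rough starts
p0 = (8.0, 45.0, 3.0, 20.0, 5.0, 90.0, 50.0)
params, _ = curve_fit(vo2_double_exponential, t, vo2, p0=p0, maxfev=10_000)
labels = ("VO2_base", "A1", "td1", "tau1", "A2", "td2", "tau2")
print({k: round(v, 1) for k, v in zip(labels, params)})
```

In this framing, tau1 (the time constant of the primary phase) is the quantity reported as correlating with the extra time spent above 90% of VO2max when the work intensity is raised.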