895 results for Context-based teaching
Abstract:
Synthesis report: The article on which my thesis is based evaluates a new pedagogical approach for learning certain chapters of pathophysiology. The teaching design alternates ex-cathedra lectures with the use of a website containing clinical vignettes. While working through these vignettes, students are invited to order the laboratory tests whose relevance they can justify for the clinical case under study. The novelty of the procedure is that, before giving the lecture, the teacher can consult the statistics of laboratory-test requests and thus orient the lecture toward the points the students have misunderstood. After the lecture, students can consult the complete clinical vignette online, together with explanations. At the end of the whole course, a student evaluation was conducted. The procedure was implemented for two consecutive years, and the article discusses, among other things, the results. We were able to conclude that this innovative teaching method leads students to prepare better for lectures while allowing the teacher to identify more precisely which topics students found difficult and thus to fine-tune the lecture. My thesis work consisted of designing this learning device, building the web application for the clinical vignettes, and deploying it for two consecutive years. I then analyzed the evaluation data and wrote the article, which I submitted to the journal 'Medical Teacher'. After a few corrections and clarifications requested by the reviewers, the article was accepted and published. This work led to a second version of the web application, which is currently used in module 3.1 of the third year at the School of Medicine in Lausanne.

Summary: Since the early days of sexual selection, our understanding of the selective forces acting on males and females during reproduction has increased remarkably. However, despite a long tradition of experimental and theoretical work in this field and relentless effort, numerous questions remain unanswered and many results are conflicting. Moreover, the interface between sexual selection and conservation biology has to date received little attention, despite existing evidence for its importance. In the present thesis, I first used an empirical approach to test various sexual selection hypotheses in a whitefish population of central Switzerland. This particular population is characterized by a high prevalence of gonadal alterations in males. In particular, I challenged the hypothesis that whitefish males displaying peculiar gonadal features are of lower genetic quality than other seemingly normal males. Additionally, I also worked on identifying important determinants of sperm behavior. During the second, theoretical part of my work, which is part of a larger project on the evolution of female mate preferences in harvested fish populations, I developed an individual-based simulation model to estimate how different mate discrimination costs affect the demographic behavior of fish populations and the evolutionary trajectories of female mate preferences. This latter work provided me with some insight into a recently published article addressing the importance of sexual selection for harvesting-induced evolution. I built upon this insight in a short perspective paper.
In parallel, I let some methodological questions drive my thoughts and wrote an essay about possible synergies between the biological, philosophical, and statistical approaches to biological questions.
Abstract:
We present an approach to teaching evidence-based management (EBMgt) that trains future managers to produce local evidence. Local evidence is causally interpretable data, collected on-site in companies to address a specific business problem. Our teaching method is a variant of problem-based learning, a method originally developed to teach evidence-based medicine. Following this method, students learn an evidence-based problem-solving cycle for addressing actual business cases. Executing this cycle, students use and produce scientific evidence through literature searches and the design of local, experimental tests of causal hypotheses. We argue for the value of teaching EBMgt with a focus on producing local evidence, and discuss how and what can be taught. We conclude by outlining our contribution to the literature on teaching EBMgt and by discussing the limitations of our approach.
Abstract:
In recent years, faculty members from different universities around the world have begun to use Wikipedia as a teaching tool. These experiences show, in most cases, very satisfactory results and a substantial improvement in various basic skills, as well as a positive influence on students' motivation. Nevertheless, despite the growing importance of e-learning methodologies based on the use of the Internet in higher education, the use of Wikipedia as a teaching resource remains scarce among university faculty.

Our investigation tries to identify the main factors that determine acceptance of, or resistance to, that use. We approach the decision to use Wikipedia as a teaching tool by analyzing both the individual attributes of faculty members and the characteristics of the environment in which they develop their teaching activity. From a specific survey sent to all faculty of the Universitat Oberta de Catalunya (UOC), a pioneer and leader in online education in Spain, we have tried to infer the influence of these internal and external elements. The questionnaire was designed to measure different constructs: perceived quality of Wikipedia, teaching practices involving Wikipedia, use experience, perceived usefulness, and use of 2.0 tools. Control items were also included to gather information on gender, age, teaching experience, academic rank, and area of expertise.

Our results reveal that academic rank, teaching experience, age, and gender are not decisive factors in explaining the educational use of Wikipedia. Instead, the decision to use it is closely linked to the perception of Wikipedia's quality, the use of other collaborative learning tools, an active attitude towards web 2.0 applications, and connections with the professional non-academic world. Situational context is also very important, since use is higher when faculty members have reference models in their close environment and when they perceive that it is positively valued by their colleagues. Since these attitudes, practices, and cultural norms diverge across scientific disciplines, we have also detected clear differences in the use of Wikipedia among areas of academic expertise. As a consequence, a greater application of Wikipedia, both as a teaching resource and as a driver for teaching innovation, would require much more active institutional policies and some changes in the dominant academic culture among faculty members.
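As a rough illustration of the kind of analysis suggested by this survey design, the Python sketch below fits a logistic regression of Wikipedia use on the measured constructs and control items. The file name and all column names (wiki_use, perceived_quality, web20_use, nonacademic_ties, gender, academic_rank) are hypothetical placeholders, not variables from the actual study.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical survey export; column names stand in for the constructs and
    # control items mentioned in the abstract.
    df = pd.read_csv("uoc_faculty_survey.csv")

    model = smf.logit(
        "wiki_use ~ perceived_quality + web20_use + nonacademic_ties"
        " + age + C(gender) + C(academic_rank)",
        data=df,
    ).fit()
    print(model.summary())  # constructs with significant positive coefficients drive use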
Abstract:
Universities must motivate future professionals so that they are able to apply their experience above and beyond the scientific and technological context. These professionals should also be trained to be aware of the current situation regarding the economy and limited energy resources, and they must be creative, knowledgeable, and committed if they are to rethink the current model.

The Departments of Architectural Technology II and Applied Physics, in collaboration with the Interdisciplinary Centre of Technology, Innovation and Education for Sustainability (CITIES), believed that students could be given the opportunity to specialise in the area of sustainable development by means of their final theses [2]. With this objective in mind, a line of theses called Energy Assessments was created as part of the Plan for Resource Consumption Efficiency (PECR). The line was based on a learning strategy that focused on the student.

The teaching staff observed that, in cognitive terms, after completing their theses the students had improved their knowledge of environmental issues and the associated skills, were more able to solve problems in the area of sustainability, and showed greater concern for this subject matter.
Abstract:
The aim of this Master's thesis was to build a preliminary understanding of the role and measurement of social capital in new technology start-up companies. The main purpose of the study was to find a linking factor between social capital and start-up performance. The empirical data were collected mainly through qualitative thematic interviews and quantitative questionnaires in six case companies included in OKO Venture Capital's investment portfolio. Based on the results of the qualitative interviews, a linking factor between social capital and performance was found, which was subsequently used in the quantitative questionnaire. The results of this thesis showed that, through strategic decision-making, social capital partly affects the performance of start-up companies. From a managerial perspective, however, the more important finding of this study was that social capital can be used to predict the future cash flows of new technology start-up companies.
Abstract:
In this paper we discuss the main privacy issues around mobile business models and envision new solutions that have privacy protection as their main value proposition. We construct a framework to help analyze the situation and assume that a third party is necessary to warrant transactions between mobile users and m-commerce providers. We then use the business model canvas to describe a generic business model pattern for privacy third-party services. This pattern is then illustrated in two different variations of a privacy business model, which we call privacy broker and privacy management software. We conclude by giving examples of each business model and by suggesting further directions for investigation.
Abstract:
Bandura (1986) developed the concept of moral disengagement to explain how individuals can engage in detrimental behavior while experiencing low levels of negative feelings such as guilt. Most of the research on moral disengagement has investigated it as a global concept (e.g., Bandura, Barbaranelli, Caprara, & Pastorelli, 1996; Moore, Detert, Klebe Treviño, Baker, & Mayer, 2012), whereas Bandura (1986, 1990) initially developed eight distinct mechanisms of moral disengagement grouped into four categories representing the various means through which moral disengagement can operate. In our work, we propose to develop measures of this concept based on its categories, namely rightness of actions, rejection of personal responsibility, distortion of negative consequences, and negative perception of the victims, measures that are not specific to a particular area of research. Through these measures, we aim to better understand the cognitive process leading individuals to behave unethically by investigating which category plays a role in explaining unethical behavior depending on the situation individuals are in. To this purpose, we conducted five studies to develop the measures and test their predictive validity. In particular, we assessed the ability of the newly developed measures to predict two types of unethical behavior, namely discriminatory behavior and cheating behavior. Confirmatory factor analyses demonstrated a good fit of the model, and the findings generally supported our predictions.
Abstract:
Introduction. If we are to promote more patient-centred approaches in care delivery, we have to better characterize the situations in which being patient-centred is difficult to achieve. Data from professionals in health and social care are important because they are the people charged with operationalizing patient-centred care (PCC) in their daily practice. However, empirical accounts from frontline care providers are still lacking, and it is important to gather experiences not only from doctors but also from other care providers. Indeed, experiences from different professions can help inform our understanding of patient care, which is expected to be both patient-centred and collaborative.

Methods. This study was based on the following research question: What factors make the provision of PCC difficult to achieve?

Sample and setting. A purposeful sampling technique was used, allowing for a series of choices about the participants and their professional affiliation. Because patient-centredness is the focus, three professions appeared to be of special interest: general internists, nurses, and social workers. The study was undertaken in the General Internal Medicine Division of a teaching hospital located in a North American context.

Data collection. To answer the research question, a methodological approach based on phenomenology was chosen. Accordingly, semi-structured interviews were used, since they generate understanding of the meanings different individuals have of their lived world. Interviews with 8 physicians, 10 nurses, and 10 social workers were eventually conducted.

Data analysis. An inductive thematic analysis was employed to make sense of the interview data.

Results. The thematic analysis identified various types of challenges to PCC. Although most of the challenges were perceived by all three groups of professionals, they were perceived to different degrees across the professions, which likely reflects the scope of practice of each profession. The challenges and their distribution across the professions are illustrated in Table 1. Examples of challenges are provided in Table 2.

Discussion. There is a tension between what is supposed to be done, as stated in the philosophy of patient-centredness, and what is currently done, that is, real life with all its challenges to PCC. According to some participants' accounts, PCC clearly risks becoming a mere illusion for health care professionals on whom too great pressures are imposed.
Abstract:
In the context of the evidence-based practices movement, the emphasis on computing effect sizes and combining them via meta-analysis does not preclude the demonstration of functional relations. For the latter aim, we propose augmenting visual analysis to add consistency to decisions about the existence of a functional relation, without losing sight of the need for a methodological evaluation of which stimuli and which reinforcement or punishment are used to control the behavior. Four options for quantification are reviewed, illustrated, and tested with simulated data. These quantifications involve comparing the projected baseline with the actual treatment measurements, on the basis of either parametric or nonparametric statistics. The simulated data used to test the quantifications include nine data patterns, in terms of the presence and type of effect, comprising ABAB and multiple-baseline designs. Although none of the techniques is completely flawless in terms of detecting a functional relation only when it is present and not when it is absent, an option based on projecting the split-middle trend and considering data variability, as in exploratory data analysis, proves to be the best performer for most data patterns. We suggest that information on whether a functional relation has been demonstrated should be included in meta-analyses. It is also possible to use, as a weight, the inverse of the data-variability measure employed in the quantification assessing the functional relation. We offer easy-to-use code for open-source software implementing some of the quantifications.
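As a rough sketch (not the authors' released code) of the split-middle option described above, the following Python fragment projects a split-middle baseline trend into the treatment phase and counts treatment points falling outside a variability band; using the baseline interquartile range as the band width is an illustrative assumption rather than the authors' exact criterion.

    import numpy as np

    def split_middle_projection(baseline, treatment, band_width=None):
        """Project a split-middle baseline trend into the treatment phase and count
        treatment points outside a variability band (illustrative criterion only)."""
        baseline = np.asarray(baseline, dtype=float)
        treatment = np.asarray(treatment, dtype=float)
        half = len(baseline) // 2
        first, second = baseline[:half], baseline[-half:]
        # Medians of measurement occasions (x) and values (y) in each baseline half
        x1, y1 = np.median(np.arange(half)), np.median(first)
        x2 = np.median(np.arange(len(baseline) - half, len(baseline)))
        y2 = np.median(second)
        slope = (y2 - y1) / (x2 - x1)
        intercept = y1 - slope * x1
        # Projected baseline trend over the treatment occasions
        t_idx = np.arange(len(baseline), len(baseline) + len(treatment))
        projected = intercept + slope * t_idx
        if band_width is None:
            q75, q25 = np.percentile(baseline, [75, 25])
            band_width = q75 - q25          # baseline IQR as an illustrative band
        outside = int(np.sum(np.abs(treatment - projected) > band_width))
        return projected, outside

    proj, n_outside = split_middle_projection(
        baseline=[3, 4, 3, 5, 4, 4], treatment=[7, 8, 9, 8, 10, 9])
    print(proj.round(1), n_outside)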
Abstract:
The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find them difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure and then incrementally extends the program in steps of adding code and proving, after each addition, that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers.

Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be detected efficiently. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that are not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem, with the aid of the tool, into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm.

Tool-supported verification often has little to no presence in computer science (CS) curricula. Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early, practically oriented programming courses. Our hypothesis is that verification could be introduced early in CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
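As a small illustration of the invariant-first workflow described above (not of the Socos tool or its diagram notation), the Python sketch below writes the loop invariant down before the loop body and checks with runtime assertions that each added piece of code preserves it; a verification tool would discharge these conditions by proof rather than by execution.

    # A hypothetical example, not Socos output: the invariant is stated before the
    # loop, and runtime assertions check that every step of code preserves it.
    def int_sqrt(n: int) -> int:
        """Return the integer square root of n (largest r with r*r <= n)."""
        assert n >= 0                                # precondition
        low, high = 0, n + 1
        # Invariant: 0 <= low < high and low*low <= n < high*high
        while high - low > 1:
            assert low * low <= n < high * high      # invariant holds on entry
            mid = (low + high) // 2
            if mid * mid <= n:
                low = mid                            # keeps low*low <= n
            else:
                high = mid                           # keeps n < high*high
            assert low * low <= n < high * high      # invariant preserved
        # Postcondition follows from the invariant and high == low + 1
        assert low * low <= n < (low + 1) ** 2
        return low

    print(int_sqrt(17))  # 4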
Abstract:
The prevailing undergraduate medical training process still favors disconnection and professional distancing from social needs. The Brazilian Ministries of Education and Health, through the National Curriculum Guidelines, the Incentives Program for Changes in the Medical Curriculum (PROMED), and the National Program for Reorientation of Professional Training in Health (PRO-SAÚDE), provided the stimulus for an effective connection between medical institutions and the Unified National Health System (SUS). In accordance with the new paradigm for medical training, the Centro Universitário Serra dos Órgãos (UNIFESO) established a teaching plan in 2005 using active methodologies, specifically problem-based learning (PBL). Research was conducted through semi-structured interviews with third-year undergraduate students at the UNIFESO Medical School. The results were categorized following Bardin's thematic analysis, with the purpose of verifying the students' impressions of the new curriculum. The active methodologies proved to be well accepted by the students, who described them as exciting and as integrating theory and practice in medical education.
Abstract:
The focus of the present work was on 10- to 12-year-old elementary school students' conceptual learning outcomes in science in two specific inquiry-learning environments, laboratory and simulation. The main aim was to examine whether it would be more beneficial to combine rather than contrast simulation and laboratory activities in science teaching. It was argued that the status quo, where laboratories and simulations are seen as alternative or competing methods in science teaching, is hardly an optimal solution to promote students' learning and understanding in various science domains. It was hypothesized that it would make more sense and be more productive to combine laboratories and simulations. Several explanations and examples were provided to back up the hypothesis. In order to test whether learning with the combination of laboratory and simulation activities can result in better conceptual understanding in science than learning with laboratory or simulation activities alone, two experiments were conducted in the domain of electricity. In these experiments students constructed and studied electrical circuits in three different learning environments: laboratory (real circuits), simulation (virtual circuits), and simulation-laboratory combination (real and virtual circuits used simultaneously). In order to measure and compare how these environments affected students' conceptual understanding of circuits, a subject knowledge assessment questionnaire was administered before and after the experimentation. The results of the experiments were presented in four empirical studies. Three of the studies focused on learning outcomes between the conditions and one on learning processes.

Study I analyzed learning outcomes from Experiment I. The aim of the study was to investigate whether it would be more beneficial to combine simulation and laboratory activities than to use them separately in teaching the concepts of simple electricity. Matched trios were created based on the pre-test results of 66 elementary school students and divided randomly into laboratory (real circuits), simulation (virtual circuits), and simulation-laboratory combination (real and virtual circuits simultaneously) conditions. In each condition students had 90 minutes to construct and study various circuits. The results showed that studying electrical circuits in the simulation-laboratory combination environment improved students' conceptual understanding more than studying circuits in the simulation or laboratory environment alone. Although there were no statistical differences between the simulation and laboratory environments, the learning effect was more pronounced in the simulation condition, where the students made clear progress during the intervention, whereas in the laboratory condition students' conceptual understanding remained at an elementary level after the intervention.

Study II analyzed learning outcomes from Experiment II. The aim of the study was to investigate whether and how learning outcomes in simulation and simulation-laboratory combination environments are mediated by implicit (only procedural guidance) and explicit (more structure and guidance for the discovery process) instruction in the context of simple DC circuits. Matched quartets were created based on the pre-test results of 50 elementary school students and divided randomly into simulation implicit (SI), simulation explicit (SE), combination implicit (CI), and combination explicit (CE) conditions.
The results showed that when the students were working with the simulation alone, they were able to gain a significantly greater amount of subject knowledge when they received metacognitive support (explicit instruction; SE) for the discovery process than when they received only procedural guidance (implicit instruction; SI). However, this additional scaffolding was not enough to reach the level of the students in the combination environment (CI and CE). A surprising finding in Study II was that instructional support had a different effect in the combination environment than in the simulation environment. In the combination environment explicit instruction (CE) did not seem to elicit much additional gain in students' understanding of electric circuits compared to implicit instruction (CI). Instead, explicit instruction slowed down the inquiry process substantially in the combination environment.

Study III analyzed, on the basis of video data, the learning processes of the 50 students who participated in Experiment II (cf. Study II above). The focus was on three specific learning processes: cognitive conflicts, self-explanations, and analogical encodings. The aim of the study was to find possible explanations for the success of the combination condition in Experiments I and II. The video data provided clear evidence of the benefits of studying with the real and virtual circuits simultaneously (the combination conditions). Mostly the representations complemented each other, that is, one representation helped students to interpret and understand the outcomes they received from the other representation. However, there were also instances in which analogical encoding took place, that is, situations in which the slightly discrepant results between the representations 'forced' students to focus on those features that could be generalised across the two representations. No statistical differences were found in the amount of experienced cognitive conflicts and self-explanations between the simulation and combination conditions, though for self-explanations there was a nascent trend in favour of the combination. There was also a clear tendency suggesting that explicit guidance increased the amount of self-explanations. Overall, the amount of cognitive conflicts and self-explanations was very low.

The aim of Study IV was twofold: the main aim was to provide an aggregated overview of the learning outcomes of Experiments I and II; the secondary aim was to explore the relationship between the learning environments and students' prior domain knowledge (low and high) in the experiments. Aggregated results of Experiments I and II showed that, on average, 91% of the students in the combination environment scored above the average of the laboratory environment, and 76% of them also scored above the average of the simulation environment. Seventy percent of the students in the simulation environment scored above the average of the laboratory environment. The results further showed that overall students seemed to benefit from combining simulations and laboratories regardless of their level of prior knowledge, that is, students with either low or high prior knowledge who studied circuits in the combination environment outperformed their counterparts who studied in the laboratory or simulation environment alone. The effect seemed to be slightly bigger among the students with low prior knowledge.
However, a more detailed inspection of the results showed that there were considerable differences between the experiments regarding how students with low and high prior knowledge benefitted from the combination: in Experiment I, especially students with low prior knowledge benefitted from the combination compared to those who used only the simulation, whereas in Experiment II, only students with high prior knowledge seemed to benefit from the combination relative to the simulation group. Regarding the differences between the simulation and laboratory groups, the benefits of using a simulation seemed to be slightly higher among students with high prior knowledge. The results of the four empirical studies support the hypothesis concerning the benefits of using simulation along with laboratory activities to promote students' conceptual understanding of electricity. It can be concluded that, when teaching students about electricity, they gain a better understanding when they have an opportunity to use the simulation and the real circuits in parallel than when they have only the real circuits or only a computer simulation available, even when the use of the simulation is supported with explicit instruction. The outcomes of the empirical studies can be considered the first unambiguous evidence of the (additional) benefits of combining laboratory and simulation activities in science education, as compared to learning with laboratories or simulations alone.
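For readers who want a concrete picture of the pre/post comparison described above, the sketch below contrasts conceptual-understanding gain scores (post-test minus pre-test) across the three conditions with a one-way ANOVA. The data are randomly generated placeholders, not the study's results.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Hypothetical gain scores (post-test minus pre-test) for three matched groups
    gains = {
        "laboratory":  rng.normal(1.0, 1.0, 22),
        "simulation":  rng.normal(2.0, 1.0, 22),
        "combination": rng.normal(3.5, 1.0, 22),
    }
    f_stat, p_value = stats.f_oneway(*gains.values())
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
    for condition, g in gains.items():
        print(f"{condition:12s} mean gain = {g.mean():.2f}")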
Abstract:
Over the past decade, organizations worldwide have begun to widely adopt agile software development practices, which offer greater flexibility in the face of frequently changing business requirements, better cost effectiveness due to the minimization of waste, faster time-to-market, and closer collaboration between business and IT. At the same time, IT services continue to be increasingly outsourced to third parties, providing organizations with the ability to focus on their core capabilities as well as to take advantage of better demand scalability, access to specialized skills, and cost benefits. An output-based pricing model, in which customers pay directly for the functionality that was delivered rather than for the effort spent, is quickly becoming a new trend in IT outsourcing, allowing the risk to be transferred away from the customer while offering much better incentives for the supplier to optimize processes and improve efficiency, and consequently producing a true win-win outcome. Despite the widespread adoption of both agile practices and output-based outsourcing, there is little formal research available on how the two can be effectively combined in practice. Moreover, little practical guidance exists on how companies can measure the performance of agile projects delivered in an output-based outsourced environment. This research attempted to shed light on this issue by developing a practical project monitoring framework which may be readily applied by organizations to monitor the performance of agile projects in an output-based outsourcing context, thus taking advantage of the combined benefits of such an arrangement.

Adapted from the action research approach, this research was divided into two cycles, each consisting of the Identification, Analysis, Verification, and Conclusion phases. During Cycle 1, a list of six Key Performance Indicators (KPIs) was proposed and accepted by the professionals in the studied multinational organization; this list formed the core of the proposed framework and answered the first research sub-question of what needs to be measured. In Cycle 2, a more in-depth analysis was provided for each of the suggested KPIs, including the techniques for capturing, calculating, and evaluating the information provided by each KPI. In the course of Cycle 2, the second research sub-question was answered, clarifying how the data for each KPI needed to be measured, interpreted, and acted upon. Consequently, after two incremental research cycles, the primary research question was answered, describing the practical framework that may be used for monitoring the performance of agile IT projects delivered in an output-based outsourcing context. This framework was evaluated by professionals within the context of the studied organization and received positive feedback across all four evaluation criteria set forth in this research: low overhead of data collection, high value of the provided information, ease of understanding of the metric dashboard, and high generalizability of the proposed framework.
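The sketch below gives a minimal, hypothetical picture of what a KPI-based monitoring framework of the kind described above could look like in code, with each KPI carrying a capture step, a target for interpretation, and an evaluation step. The KPI names, thresholds, and data structures are illustrative placeholders only; the abstract does not list the thesis's actual six KPIs.

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class KPI:
        name: str
        capture: Callable[[dict], float]   # how the raw sprint data is measured
        target: float                      # how the captured value is interpreted
        higher_is_better: bool = True

        def evaluate(self, sprint_data: dict) -> str:
            value = self.capture(sprint_data)
            ok = value >= self.target if self.higher_is_better else value <= self.target
            return f"{self.name}: {value:.1f} ({'on track' if ok else 'needs action'})"

    # Illustrative placeholder KPIs for an output-based, agile outsourcing context
    kpis = [
        KPI("Delivered story points per sprint", lambda d: d["points_done"], target=30),
        KPI("Defects escaped to production", lambda d: d["escaped_defects"], target=2,
            higher_is_better=False),
    ]

    sprint = {"points_done": 34, "escaped_defects": 1}
    for kpi in kpis:
        print(kpi.evaluate(sprint))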