838 results for Teaching with geospatial technologies
Abstract:
OBJECTIVE: To develop and compare two new technologies for diagnosing a contiguous gene syndrome, the Williams-Beuren syndrome (WBS). METHODS: The first proposed method, named paralogous sequence quantification (PSQ), is based on the use of paralogous sequences located on different chromosomes and quantification of specific mismatches present at these loci using pyrosequencing technology. The second exploits quantitative real time polymerase chain reaction (QPCR) to assess the relative quantity of an analysed locus. RESULTS: A correct and unambiguous diagnosis was obtained for 100% of the analysed samples with either technique (n = 165 and n = 155, respectively). These methods allowed the identification of two patients with atypical deletions in a cohort of 182 WBS patients. Both patients presented with mild facial anomalies, mild mental retardation with impaired visuospatial cognition, supravalvar aortic stenosis, and normal growth indices. These observations are consistent with the involvement of GTF2IRD1 or GTF2I in some of the WBS facial features. CONCLUSIONS: Both PSQ and QPCR are robust, easy to interpret, and simple to set up. They represent a competitive alternative for the diagnosis of segmental aneuploidies in clinical laboratories. They have advantages over fluorescence in situ hybridisation or microsatellites/SNP genotyping for detecting short segmental aneuploidies as the former is costly and labour intensive while the latter depends on the informativeness of the polymorphisms.
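The QPCR method assesses the relative quantity of the analysed locus; the standard arithmetic behind such assays is the ΔΔCt method. The following is a minimal sketch of that general calculation, assuming roughly 100% amplification efficiency (a doubling per cycle); the function name, sample Ct values and two-copy control are illustrative assumptions, not the authors' exact assay.

```python
# Hedged sketch of qPCR relative quantification via the standard
# delta-delta-Ct method; the values and the ~100% efficiency assumption
# (factor 2 per cycle) are illustrative, not taken from the study.

def relative_copy_number(ct_target_sample, ct_ref_sample,
                         ct_target_control, ct_ref_control,
                         control_copies=2):
    """Estimate the copy number of a target locus in a test sample,
    normalised to a reference locus and a two-copy control sample."""
    ddct = ((ct_target_sample - ct_ref_sample)
            - (ct_target_control - ct_ref_control))
    # One PCR cycle is roughly a doubling, so relative quantity = 2 ** -ddCt.
    return control_copies * 2 ** -ddct

# A heterozygous deletion (one copy instead of two) shifts the target Ct
# up by about one cycle relative to the reference locus:
print(relative_copy_number(26.0, 24.0, 25.0, 24.0))  # ~1.0 copy (deleted)
print(relative_copy_number(25.0, 24.0, 25.0, 24.0))  # ~2.0 copies (normal)
```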
Abstract:
Machine Learning for geospatial data: algorithms, software tools and case studies
The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In a few words, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions to classification, regression and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical space and additional relevant spatially referenced features. They are well suited to implementation as predictive engines in decision support systems, for purposes of environmental data mining ranging from pattern recognition through modeling and prediction to automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest to the geo- and environmental sciences are presented in detail, from the theoretical description of the concepts to the software implementation. The main algorithms and models considered are: the multi-layer perceptron (a workhorse of machine learning), general regression neural networks, probabilistic neural networks, self-organising (Kohonen) maps, Gaussian mixture models, radial basis function networks, and mixture density networks. This set of models covers machine learning tasks such as classification, regression and density estimation. Exploratory data analysis (EDA) is the initial and a very important part of data analysis. In this thesis the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, experimental variography, and machine learning. Experimental variography is a basic tool for the geostatistical analysis of anisotropic spatial correlations which helps to detect the presence of spatial patterns, at least those describable by two-point statistics. A machine learning approach to ESDA is presented through the k-nearest neighbors (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a topical problem: the automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model for this task. The performance of the GRNN is demonstrated on Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters with the following structure: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. These tools were developed over the last 15 years and have been used both in many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and in fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals; soil type and hydro-geological unit classification; decision-oriented mapping with uncertainties; and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were developed as well; the software is user-friendly and easy to use.
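For readers unfamiliar with the GRNN: it is equivalent to Nadaraya-Watson kernel regression, where each prediction is a distance-weighted average of all training values governed by a single smoothing parameter. The sketch below is a minimal NumPy illustration of that idea on 2-D coordinates, with assumed synthetic data and an eyeballed sigma; it is not the Machine Learning Office implementation.

```python
import numpy as np

def grnn_predict(train_xy, train_z, query_xy, sigma):
    """GRNN prediction (Nadaraya-Watson kernel regression): each estimate
    is a Gaussian-weighted average of the training values, controlled by
    a single smoothing parameter sigma."""
    # Squared distances between every query point and every training point.
    d2 = ((query_xy[:, None, :] - train_xy[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))   # Gaussian kernel weights
    return (w @ train_z) / w.sum(axis=1)   # weighted average per query point

# Synthetic example: interpolate a noisy field from 200 scattered samples.
rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 1.0, size=(200, 2))
z = np.sin(3 * xy[:, 0]) * np.cos(3 * xy[:, 1]) + rng.normal(0, 0.05, 200)

grid = np.array([(x, y) for x in np.linspace(0, 1, 5)
                 for y in np.linspace(0, 1, 5)])
z_hat = grnn_predict(xy, z, grid, sigma=0.08)  # sigma set by eye here; in
                                               # practice, cross-validate it
```

The single tunable parameter is what makes the model attractive for automatic mapping: fitting reduces to a one-dimensional cross-validation over sigma, so the whole pipeline can run unattended.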
Abstract:
The Universitat Oberta de Catalunya (Open University of Catalonia, UOC) is an online university that makes extensive use of information and communication technologies to provide education. Ever since its establishment in 1995, the UOC has developed and tested methodologies and technological support services to meet the educational challenges posed by its student community and its teaching and management staff. The know-how it has acquired in doing so is the basis on which it has created the Open Apps platform, which is designed to provide access to open source technical applications, information on successful learning and teaching experiences, resources and other solutions, all in a single environment. Open Apps is an open, online catalogue, the content of which is available to all students for learning purposes, all IT professionals for downloading and all teachers for reusing. To contribute to the transfer of knowledge, experience and technology, each of the platform's apps comes with full documentation, plus information on cases in which it has been used and related tools. It is hoped that such transfer will lead to the growth of an external partner network, and that this, in turn, will result in improvements to the applications and teaching/learning practices, and in greater scope for collaboration. Open Apps is a strategic project that has arisen from the UOC's commitment to the open access movement and to giving knowledge and technology back to society, as well as its firm belief that sustainability depends on communities of interest.
Abstract:
Western countries have spent substantial amounts of money to facilitate the integration of information and communication technologies (ICT) into education, hoping to find a solution to the touchy equation that can be summarized by the famous statement "do more and better with less". Despite these efforts, and notwithstanding the real improvements due to the undeniable betterment of the infrastructure and of the quality of service, this goal is far from reached. Although we think it illusory to expect technology, all by itself, to solve our economic and educational problems, we firmly take the view that it can greatly contribute not only to improving learning conditions but also to rethinking the pedagogical approach; every member of our community could hence take advantage of this opportunity to reflect upon his or her strategy. In this framework, and convinced that integrating ICT into education opens a number of very interesting avenues provided we think teaching "out of the box", we became interested in courseware development positioned at the intersection of didactics and pedagogical sciences, cognitive sciences and computing. Hence, hoping to bring a realistic and simple solution that could help develop, update, integrate and sustain courseware, we got involved in concrete projects. As we gained field experience we noticed that (i) the quality of courseware is still disappointing, among other reasons because the added value that technology can bring is not exploited as much as it could or should be, and (ii) a project requires, besides bringing a useful answer to a real problem, to be efficiently managed and to be "championed". With a pragmatic and practical project management approach in mind, we first looked into the characteristics of open and distance learning. We then analyzed existing methodologies in the hope of being able to use one of them, or a suitable combination, to best fit our needs. In an empirical manner, proceeding by successive iterations and refinements, we defined a simple methodology and contributed to building descriptive "cards" attached to each of its phases to help decision making. We describe the different actors involved in the process, insisting specifically on the pedagogical engineer, viewed as an orchestra conductor, whom we consider critical to the success of our approach. Last but not least, we validated our methodology a posteriori by reviewing four of the projects we participated in, which we consider emblematic of the university reality. We believe that the implementation of our methodology, along with the availability of computerized decision-support cards for project managers, could constitute a great asset and help measure the technologies' real impacts on (i) the evolution of teaching practices, (ii) the organization and (iii) the quality of pedagogical approaches. Our methodology could hence serve as a springboard for putting in place a quality assessment specific to open and distance learning. Research on the impact of technologies on learning adaptability and flexibilization could then rely on adequate metrics that remain to be defined.
Abstract:
Understanding how wikis are used to support collaborative learning is an important concern for researchers and teachers. Adopting a discourse analytic approach, this paper attempts to understand the teaching processes when a wiki is embedded in a science project in primary education to foster collaborative learning. Through studying interaction between the teacher and students, our findings identify ways in which the teacher prompts collaborative learning but also shed light on the difficulties the teacher faces in supporting students' collective collaboration. It is argued that the wiki's technological features supporting collaborative learning can only be realized if teacher talk and pedagogy are aligned with the characteristics of wiki collaborative work: the freedom of students to organize and participate by themselves, the creation of dialogic space, and the promotion of student participation. We argue that a dialogic approach to examining interaction can be used to help design a more effective pedagogic approach to the use of wikis in education, to shift into the Web 2.0 learning paradigm, and to equip learners with the competences they need to participate in knowledge co-construction.
Abstract:
This study pursues two objectives: the first is to analyse the use of ICT by a group of second-year teacher-training (Magisterio) students at the Universidad de Girona; the second is to analyse the legal normative documents that establish the primary education curriculum in Catalonia, in order to observe what role ICT plays in the new educational programmes. The first part was carried out by means of a survey, whose results reveal three distinct findings: first, that a considerable part of the group regards ICT more as a complement to learning than as a form of learning; second, that despite making considerable use of ICT, their knowledge of it is very basic and they use very generic applications; and third, that some of their didactic proposals for the use of ICT are traditional proposals simply adapted to a new instrument, without really seeking the innovation that the incorporation of ICT can bring. In the second part of the article, based on the analysis and interpretation of the legal documents that establish the primary education curriculum, it is observed that assertions about ICT as a complement to content learning coexist in the same document with other formulations that treat ICT as builders of knowledge. Drawing on the students' profile and the state of the legal documents, the article closes with proposals for training future teachers that treat ICT as basic tools for knowledge.
Abstract:
Programming and mathematics are core areas of computer science (CS) and consequently also important parts of CS education. Introductory instruction in these two topics is, however, not without problems. Studies show that CS students find programming difficult to learn and that teaching mathematical topics to CS novices is challenging. One reason for the latter is the disconnection between mathematics and programming found in many CS curricula, which results in students not seeing the relevance of the subject for their studies. In addition, reports indicate that students' mathematical capability and maturity levels are dropping. The challenges faced when teaching mathematics and programming at CS departments can also be traced back to gaps in students' prior education. In Finland the high school curriculum does not include CS as a subject; instead, the focus is on learning to use the computer and its applications as tools. Similarly, many of the mathematics courses emphasize the application of formulas, while logic, formalisms and proofs, which are important in CS, are avoided. Consequently, high school graduates are not well prepared for studies in CS. Motivated by these challenges, the goal of the present work is to describe new approaches to teaching mathematics and programming aimed at addressing these issues. Structured derivations is a logic-based approach to teaching mathematics in which formalisms and justifications are made explicit. The aim is to help students become better at communicating their reasoning using mathematical language and logical notation while becoming more confident with formalisms. The Python programming language was originally designed with education in mind, and has a simple syntax compared to many other popular languages. The aim of using it in instruction is to address algorithms and their implementation in a way that allows the focus to be put on learning algorithmic thinking and programming instead of on learning a complex syntax. Invariant-based programming is a diagrammatic approach to developing programs that are correct by construction. The approach is based on elementary propositional and predicate logic, and makes explicit the underlying mathematical foundations of programming. The aim is also to show how mathematics in general, and logic in particular, can be used to create better programs.
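To make the invariant idea concrete, here is a minimal, hypothetical sketch (not an example from the thesis, which develops programs diagrammatically): a Python loop whose invariant, precondition and postcondition are written out and checked with assertions.

```python
def sum_first(n):
    """Return 0 + 1 + ... + (n - 1), with the loop invariant made explicit
    in the spirit of invariant-based programming.

    Invariant: at the top of each iteration, total == 0 + 1 + ... + (i - 1).
    """
    assert n >= 0, "precondition: n is non-negative"
    total, i = 0, 0
    while i < n:
        assert total == sum(range(i))  # the invariant holds on loop entry
        total += i
        i += 1
    # The invariant plus the negated guard (i == n) yield the postcondition:
    assert total == n * (n - 1) // 2
    return total
```

In invariant-based programming proper, the invariant is written down before the loop body and the code is then derived so that the invariant is provably maintained, rather than merely tested at run time as in this sketch.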
Abstract:
This research responds to a pervasive call for our educational institutions to provide students with literacy skills, and teachers with the instructional supports necessary to facilitate this skill acquisition. Questions were posed to gain information concerning the efficacy of teaching literacy strategies to students with learning difficulties, the impact of this training on their volunteer tutors, and the influence of this experience on these tutors' ensuing instructional practice as teacher candidates in a preservice education program. Study #1 compared students with literacy difficulties who participated in the program against a non-treatment group and found that program participants were superior at reading letter patterns and at comprehending the elements of story grammar. Concurrently, the second study explored the experiences of 19 volunteer tutors and uncovered that they acquired instructional skills as they established a knowledge base in teaching reading and writing, and they affirmed personal goals to become future teachers. Study #3 tracked 6 volunteer tutors into their pre-service year and identified their constructions and beliefs about literacy instruction. These teacher candidates discussed how they had intended to teach reading and writing strategies based on their position that effective teaching of these skills in the primary grades is integral to academic success. The teacher candidates emphasized the need to build rapport with students, and the need to exercise flexibility in lesson plan delivery while including activities to meet the emotional and developmental requirements of students. The teacher candidates entered their pre-service education with an initial cognition set based on the limited teaching context of tutoring. This foundational perception represented their prior knowledge of literacy instruction, a perception that appeared untenable once they were immersed in a regular instructional setting. This disparity provoked some of the teacher candidates to denounce their teacher mentors for not consistently employing literacy strategies and individualized instruction. This critical perspective could have been a demonstration of cognitive dissonance. In the end, when the teacher candidates began to look toward the future and how they would manage the demands of an inclusive classroom, they recognized the differences in the contexts. With an appreciation for the need for balance between prior and present knowledge, the teacher candidates remained committed to implementing their tutoring strategies in future teaching positions. This document highlights the need for teacher candidates with instructional experience prior to teacher education to engage in cognitive negotiations to assimilate newly acquired pedagogies into existing pedagogies.
Abstract:
Stimulus equivalence involves teaching two conditional discriminations that share one stimulus in common and testing all possible conditional discriminations not taught (Saunders & Green, 1999). Despite considerable research in the laboratory, applied studies of stimulus equivalence have been limited (Vause, Martin, Marion, & Sakko, 2005). This study investigated the field-effectiveness of stimulus equivalence in teaching reading skills to children with Autism. Participants were four children with Autism receiving centre-based intensive behavioural intervention (IBI) treatment. Three of the participants, who already matched pictures to their dictated names, demonstrated six to eight more emergent performances after being taught only to match written words to the same names. One participant struggled with the demands of the study and his participation was discontinued. Results suggest that stimulus equivalence provided an effective and efficient teaching strategy for three of the four participants in this study.
Abstract:
The current study examined the effectiveness of a sexual abuse prevention program developed locally for children with intellectual disabilities. The program package included a board game with informational storybooks that were designed to be used in a family setting. Additionally, this research sought to determine if parents could be effective at presenting the sexual abuse prevention materials to their children. A multiple baseline across behaviours design was used with two participants with a diagnosis of autism. Through role play scenarios as well as verbal knowledge tests, it was determined that the program was effective at teaching the participants the skills presented for self protection. It was also determined that the skills learned were generalized to scenarios that were untrained during the game play. Finally, with additional supports, it was determined that parents were able to effectively teach their children the required skills.
Abstract:
Persons with intellectual disabilities (ID) are far more likely to be abused than the general population, but there is little research on teaching people with ID about their rights. The goal of this study was to teach four participants with ID and limited communication abilities about their human rights by training them on specific rights topics. The training program included icebreaker activities, instruction on rights concepts, watching and answering questions about videotaped scenarios of rights restrictions, watching and answering questions about role play scenarios of rights restrictions, and responding to brief, low risk in situ rights restrictions imposed by the researchers. Participant performance did not improve significantly or consistently from baseline to training on the questions asked about the videotaped or the role play scenarios, but two of three participants demonstrated definite improvements in responding to in situ rights restrictions.
Abstract:
Complex social-cognitive deficits are common in individuals diagnosed with high functioning autism and Asperger syndrome. Research on effective and evidence-based social interventions is needed for this population. This study focused specifically on the challenges these individuals face with respect to flexible thinking and related flexible behaviour in social situations. Madrigal and Winner's (2008) Superflex curriculum targets social flexibility; however, at the time of this study no published research had been conducted to determine the effectiveness of this approach. This pilot study sought to examine the impact of the Superflex curriculum, delivered within a 10-week training program, in teaching one individual with high functioning autism how to think and behave flexibly in social situations. Multiple measurement tools were utilized, and analyses within and across the measures revealed inconsistencies, especially with respect to generalization. Although preliminary, this study provides valuable information for subsequent research.
Abstract:
The purpose of this research study was to determine if the instructional model Teaching Games for Understanding (TGfU) would allow for the successful teaching of sport to disengaged female students in Physical Education (PE) classes. An instrumental case study research design was used to determine grade nine female students' experiences with TGfU, the factors of TGfU that facilitated their engagement, and the ways in which these students resisted engaging in TGfU. Data were collected through pre- and post-unit focus groups, participant observation, in-depth interviews, and researcher reflections. Results showed that TGfU increased the participants' engagement in PE physically, mentally, and socially/emotionally. Future researchers could structure an entire study holistically in this way and should examine TGfU's impact on student engagement over the course of an entire semester. Subsequent studies should also examine the presence of disengagement among physically skilled students in PE.
Teaching Adolescents to Think and Act Responsibly Through Narrative Film-making: A Qualitative Study
Abstract:
The current qualitative study examined an adapted version of the psychoeducational program Teaching Adolescents to Think and Act Responsibly: The EQUIP Approach (DiBiase, Gibbs, Potter, & Blount, 2012). The adapted version, referred to as the EQUIP – Narrative Film-making Program, was implemented as a means of character education. The purpose of this study was three-fold: 1) to examine how the EQUIP – Narrative Film-making Program influenced students' thoughts, feelings, and behaviours; 2) to explore the students' and the teacher's perceptions of their experience with the program; and 3) to assess whether or not the integrated EQUIP – Narrative Film-making Program addressed the goals of Ontario's character education initiative. Purposive sampling was used to select one typical Grade 9 Exploring Technologies class, consisting of 15 boys from a Catholic board of education in the southern Ontario region. The EQUIP – Narrative Film-making Program required students to create moral narrative films that first portrayed a set of self-centered cognitive distortions, with follow-up portrayals of behavioural modifications. Questionnaires were administered to the students and the teacher before, during, and after the intervention; the student questionnaires invited responses to a set of cognitive distortion vignettes. In addition, data were collected through student and teacher interviews, and through researcher observation protocol reports. Initially the data were coded according to an a priori set of themes that were further analyzed using emotion and values coding methods. The results indicated that while each student was unique in his thoughts, feelings, and behavioural responses to the cognitive distortion vignettes after completing the EQUIP program, the overall trends showed students had a more positive attitude, with a decreased proclivity for the antisocial behaviour and self-serving cognitive distortions portrayed in the vignettes. Overall, the teacher's and students' learning experiences were mainly positive, and the program met the learning expectations of Ontario's character education initiative. Based on the results of the present study, it is recommended that the EQUIP – Narrative Film-making Program be further evaluated through quantitative research and longitudinal study.
Abstract:
This study investigated instructor perceptions of the motivators and barriers that exist with respect to participation in educational development in the postsecondary context. Eight instructors from a mid-size, research-intensive university in southwestern Ontario participated in semi-structured interviews to explore this issue. Data were analyzed using a qualitative approach. Motivation theory was used as a conceptual framework, referring primarily to the work of Ryan and Deci (2000), Deci and Ryan (1985), and Pink (2009). The identified motivators and barriers spanned all three levels of postsecondary institutions: the micro (i.e., the individual), the meso (i.e., the department or Faculty), and the macro (i.e., the institution). Significant motivators for participation in educational development included the desire to improve one's teaching (micro), feedback from students (meso), and tenure and promotion (macro). Significant barriers to participation included lack of time (micro), the perception that an investment in one's research mattered more than an investment in enhancing teaching (meso), and the impression that quality teaching was not valued by the institution (macro). The study identifies connections between the micro/meso/macro framework and motivation theory, and offers recommendations for practice.