4 results for Foreign language learning and teaching
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
Research on the use of ICT in education (TICE) frequently focuses on the cognitive, the linguistic, or the cultural dimension alone. Most often, empirical studies set out to assess the direct effects of ICT on learners' language performance. Moreover, such research, especially in cognitive psychology, is usually carried out in laboratory settings. The work presented in this thesis therefore situates the use of ICT within an ecological perspective and proposes an integrated approach for analysing actual practices in both language didactics and translation didactics. As for the cognitive aspects, we draw on a concept valued by practitioners: learning strategies. The first four chapters of this thesis are devoted to building the theoretical framework of our research. We first address disciplinary issues, notably the interdisciplinarity of our two fields of reference. We then deal with learning strategies and translation strategies. Third, we set out to define the two competences targeted by our research: written production and translation. Fourth, we examine the changes that ICT has introduced into the teaching and learning practices of these two competences. The fifth chapter presents and analyses the data collected from groups of teachers and students of the French section of the SSLMIT. We first present our corpus, then proceed to the analysis of the data. Finally, after an overall synthesis, we outline didactic and scientific avenues for extending this work.
Abstract:
In the framework of industrial problems, Constrained Optimization is known for its excellent modeling capability and performance, and it stands as one of the most powerful, most explored, and most widely exploited tools for prescriptive tasks. Its applications are countless, ranging from logistics to transportation, packing, production, telecommunications, scheduling, and much more. The main reason behind this success lies in the remarkable effort put forth over the last decades by the OR community to develop realistic models and devise exact or approximate methods for the largest variety of constrained and combinatorial optimization problems, together with the spread of computational power and of easily accessible OR software and resources. On the other hand, technological advances have produced a wealth of data never seen before and increasingly push towards methods able to extract useful knowledge from it; among data-driven methods, Machine Learning techniques appear among the most promising, thanks to their successes in domains such as Image Recognition, Natural Language Processing, and game playing, as well as to the amount of research they attract. The purpose of the present research is to study how Machine Learning and Constrained Optimization can be combined into systems that leverage the strengths of both: this would open the way to exploiting decades of research on resolution techniques for COPs while constructing models able to adapt to and learn from available data. In the first part of this work, we survey the existing techniques and classify them according to the type, method, or scope of the integration; subsequently, we introduce Moving Target, a novel and general algorithm devised to inject knowledge into learning models through constraints. In the last part of the thesis, we present two applications stemming from real-world projects carried out in collaboration with Optit.
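The idea of injecting knowledge into a learning model through constraints can be sketched with a deliberately simplified toy, not the thesis's actual Moving Target formulation: a "learner" step fits the model to the current targets, and a "master" step projects the model's predictions onto the feasible set to produce new, moved targets. The budget constraint, the OLS learner, and the function names below are all illustrative assumptions.

```python
import numpy as np

# Toy sketch of constraint injection by alternating learner/master steps
# (our simplification, not the thesis's exact algorithm). The injected
# constraint: predictions must sum to a fixed budget.

rng = np.random.default_rng(0)
n = 50
X = np.column_stack([rng.normal(size=n), np.ones(n)])   # one feature + bias
y = 2.0 * X[:, 0] + 1.0 + rng.normal(0.0, 0.1, n)       # noisy labels
BUDGET = 10.0                                           # sum constraint

def learner_step(targets):
    """Fit the model (here: ordinary least squares) to the moved targets."""
    w, *_ = np.linalg.lstsq(X, targets, rcond=None)
    return w

def master_step(pred):
    """Project predictions onto {z : sum(z) == BUDGET} (closest point)."""
    return pred + (BUDGET - pred.sum()) / len(pred)

targets = y.copy()
for _ in range(5):                      # a few alternations suffice here
    w = learner_step(targets)
    targets = master_step(X @ w)        # move the target towards feasibility

pred = X @ learner_step(targets)
print(round(pred.sum(), 6))             # predictions now satisfy the constraint
```

Because OLS with a bias term makes residuals sum to zero, refitting to projected targets here satisfies the equality constraint exactly; harder constraints (e.g. combinatorial ones) would make the master step a constrained optimization problem in its own right.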
Abstract:
Creativity seems mysterious; when we experience a creative spark, it is difficult to explain how we got that idea, and we often invoke notions like "inspiration" and "intuition" when we try to explain the phenomenon. The fact that we are clueless about how a creative idea manifests itself does not necessarily imply that a scientific explanation cannot exist. We are unaware of how we perform certain tasks, such as biking or language understanding, yet we have more and more computational techniques that can replicate and hopefully explain such activities. We should understand that every creative act is a fruit of experience, society, and culture. Nothing comes from nothing. Novel ideas are never utterly new; they stem from representations that are already in mind. Creativity involves establishing new relations between pieces of information we already had: thus, the greater the knowledge, the greater the possibility of finding uncommon connections, and the greater the potential to be creative. In this vein, a beneficial approach to a better understanding of creativity must include computational or mechanistic accounts of such inner procedures and of the formation of the knowledge that enables such connections. That is the aim of Computational Creativity: to develop computational systems for emulating and studying creativity. Hence, this dissertation focuses on two related research areas: discussing computational mechanisms to generate creative artifacts, and describing some implicit cognitive processes that can form the basis for creative thoughts.
Abstract:
The development of Next Generation Sequencing has ushered Biology into the Big Data era. The ever-increasing gap between proteins with known sequences and those with a complete functional annotation calls for computational methods for automatic structural and functional annotation. My research focuses on proteins and has so far led to the development of three novel tools, DeepREx, E-SNPs&GO and ISPRED-SEQ, based on Machine and Deep Learning approaches. DeepREx computes the solvent exposure of residues in a protein chain. This problem is relevant for the definition of structural constraints on the possible folding of the protein. DeepREx exploits Long Short-Term Memory layers to capture residue-level interactions between positions distant in the sequence, achieving state-of-the-art performance. With DeepREx, I conducted a large-scale analysis investigating the relationship between the solvent exposure of a residue and its probability of being pathogenic upon mutation. E-SNPs&GO predicts the pathogenicity of a Single Residue Variation. Variations occurring in a protein sequence can have different effects, possibly leading to the onset of diseases. E-SNPs&GO exploits protein embeddings generated by two novel Protein Language Models (PLMs), as well as a new way of representing functional information coming from the Gene Ontology. The method achieves state-of-the-art performance and is extremely time-efficient compared to traditional approaches. ISPRED-SEQ predicts the presence of Protein-Protein Interaction sites in a protein sequence. Knowing how a protein interacts with other molecules is crucial for accurate functional characterization. ISPRED-SEQ exploits a convolutional layer to parse local context after embedding the protein sequence with two novel PLMs, greatly surpassing the current state of the art. All methods are published in international journals and are available as user-friendly web servers.
They have been developed keeping in mind standard guidelines for FAIRness (FAIR: Findable, Accessible, Interoperable, Reusable) and are integrated into the public collection of tools provided by ELIXIR, the European infrastructure for Bioinformatics.
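The three tools share a common pattern: embed each residue of a protein sequence as a vector, then apply a layer that aggregates context to produce one prediction per residue (e.g. exposed/buried, or interaction site). The sketch below is a hypothetical toy of that pattern only — one-hot embeddings in place of the PLM embeddings and an untrained sliding-window filter in place of the actual DeepREx/ISPRED-SEQ architectures; all names and parameters are illustrative.

```python
import numpy as np

# Toy per-residue tagger (illustrative only, not the published tools):
# one-hot residue embedding + a single sliding-window (convolution-like)
# filter producing one probability per residue.

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

def embed(sequence):
    """One-hot residue embedding (the real tools use protein language models)."""
    emb = np.zeros((len(sequence), len(AMINO_ACIDS)))
    for pos, aa in enumerate(sequence):
        emb[pos, AA_INDEX[aa]] = 1.0
    return emb

def window_scores(emb, weights, window=5):
    """One score per residue from a sliding window over the embedded sequence."""
    half = window // 2
    padded = np.pad(emb, ((half, half), (0, 0)))    # zero-pad sequence ends
    return np.array([
        np.sum(padded[pos:pos + window] * weights)  # local-context aggregation
        for pos in range(emb.shape[0])
    ])

rng = np.random.default_rng(0)
weights = rng.normal(size=(5, len(AMINO_ACIDS)))    # untrained filter weights
seq = "MKTAYIAKQR"
scores = window_scores(embed(seq), weights)
probs = 1.0 / (1.0 + np.exp(-scores))               # sigmoid: per-residue prob.
print(len(probs))                                   # one prediction per residue
```

A window captures only local context, which is why DeepREx additionally relies on recurrent (LSTM) layers: those can relate positions that are distant in the sequence but close in the folded structure.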