746 results for Regulation devices and piloting learning
Abstract:
Two important challenges that teachers currently face are the sharing and collaborative authoring of their learning design solutions, such as didactical units and learning materials. On the one hand, there are tools that can be used for the creation of design solutions, and only some of them facilitate co-editing. However, they do not incorporate mechanisms that support the sharing of the designs between teachers. On the other hand, there are tools that serve as repositories of educational resources, but they do not enable the authoring of the designs. In this paper we present LdShake, a web tool whose novelty is the combined support for the social sharing and co-editing of learning design solutions within communities of teachers. Teachers can create and share learning designs with other teachers using different access rights so that they can read, comment on or co-edit the designs. Therefore, each design solution is associated with a group of teachers able to work on its definition, and another group that can only see the design. The tool is generic in that it allows the creation of designs based on any pedagogical approach. However, it can be particularized in instances providing pre-formatted designs structured according to a specific didactic method (such as Problem-Based Learning, PBL). A particularized LdShake instance has been used in the context of Human Biology studies, where teams of teachers are required to work together in the design of PBL solutions. A controlled user study that compares the use of a generic LdShake instance and a Moodle system, configured to enable the creation and sharing of designs, has also been carried out. The combined results of the real and controlled studies show that the social structure, and the commenting, co-editing and publishing features of LdShake provide a useful, effective and usable approach for facilitating teachers' teamwork.
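To make the access model concrete, the following minimal sketch models a learning design shared with read, comment and co-edit rights. The class and method names (LearningDesign, share, can_edit) are illustrative assumptions, not LdShake's actual data model or API.

```python
from dataclasses import dataclass, field
from enum import Enum


class Right(Enum):
    READ = 1      # can only view the design
    COMMENT = 2   # can view and comment
    CO_EDIT = 3   # can view, comment and edit


@dataclass
class LearningDesign:
    """A learning design with per-teacher access rights (illustrative model only)."""
    title: str
    rights: dict = field(default_factory=dict)  # teacher -> Right

    def share(self, teacher: str, right: Right) -> None:
        self.rights[teacher] = right

    def can_edit(self, teacher: str) -> bool:
        return self.rights.get(teacher) == Right.CO_EDIT


# A PBL design co-edited by one teacher and merely visible to another
design = LearningDesign("PBL unit: Human Biology, week 3")
design.share("alice", Right.CO_EDIT)
design.share("bob", Right.COMMENT)
design.share("carol", Right.READ)
assert design.can_edit("alice") and not design.can_edit("carol")
```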
Abstract:
OBJECTIVE: To identify the association between the use of web simulation electrocardiography and the learning approaches, strategies and styles of nursing degree students. METHOD: A descriptive and correlational design with a one-group pretest-posttest measurement was used. The study sample included 246 students enrolled in a Basic and Advanced Cardiac Life Support class of a nursing degree program. RESULTS: No significant differences between genders were found in any dimension of learning styles or approaches to learning. After the introduction of web simulation electrocardiography, significant differences were found in some item scores of learning styles (theorist, p < 0.040; pragmatic, p < 0.010) and of approaches to learning. CONCLUSION: The use of web electrocardiogram (ECG) simulation is associated with the development of active and reflective learning styles, improving motivation and a deep approach to learning in nursing students.
Abstract:
The amygdala is part of a neural network that contributes to the regulation of emotional behaviors. Rodents, especially rats, are used extensively as model organisms to decipher the functions of specific amygdala nuclei, in particular in relation to fear and emotional learning. Analysis of the role of the nonhuman primate amygdala in these functions has lagged work in the rodent but provides evidence for conservation of basic functions across species. Here we provide quantitative information regarding the morphological characteristics of the main amygdala nuclei in rats and monkeys, including neuron and glial cell numbers, neuronal soma size, and individual nuclei volumes. The volumes of the lateral, basal, and accessory basal nuclei were, respectively, 32, 39, and 39 times larger in monkeys than in rats. In contrast, the central and medial nuclei were only 8 and 4 times larger in monkeys than in rats. The numbers of neurons in the lateral, basal, and accessory basal nuclei were 14, 11, and 16 times greater in monkeys than in rats, whereas the numbers of neurons in the central and medial nuclei were only 2.3 and 1.5 times greater in monkeys than in rats. Neuron density was between 2.4 and 3.7 times lower in monkeys than in rats, whereas glial density was only between 1.1 and 1.7 times lower in monkeys than in rats. We compare our data in rats and monkeys with those previously published in humans and discuss the theoretical and functional implications that derive from our quantitative structural findings.
Abstract:
Recent findings in neuroscience suggest that adult brain structure changes in response to environmental alterations and skill learning. Whereas much is known about structural changes after intensive practice for several months, little is known about the effects of single practice sessions on macroscopic brain structure and about progressive (dynamic) morphological alterations relative to improved task proficiency during learning over several weeks. Using T1-weighted and diffusion tensor imaging in humans, we demonstrate significant gray matter volume increases in frontal and parietal brain areas following only two sessions of practice in a complex whole-body balancing task. Gray matter volume increase in the prefrontal cortex correlated positively with subjects' performance improvements during a 6-week learning period. Furthermore, we found that microstructural changes of fractional anisotropy in corresponding white matter regions followed the same temporal dynamic in relation to task performance. The results make clear how marginal alterations in our ever-changing environment affect adult brain structure and elucidate the interrelated reorganization in cortical areas and associated fiber connections in correlation with improvements in task performance.
Abstract:
BACKGROUND: Electrophysiological cardiac devices are increasingly used. The frequency of subclinical infection is unknown. We investigated all explanted devices using sonication, a method for detection of microbial biofilms on foreign bodies. METHODS AND RESULTS: Consecutive patients in whom cardiac pacemakers and implantable cardioverter/defibrillators were removed at our institution between October 2007 and December 2008 were prospectively included. Devices (generator and/or leads) were aseptically removed and sonicated, and the resulting sonication fluid was cultured. In parallel, conventional swabs of the generator pouch were performed. A total of 121 removed devices (68 pacemakers, 53 implantable cardioverter/defibrillators) were included. The reasons for removal were insufficient battery charge (n=102), device upgrading (n=9), device dysfunction (n=4), or infection (n=6). In 115 episodes (95%) without clinical evidence of infection, 44 (38%) grew bacteria in sonication fluid, including Propionibacterium acnes (n=27), coagulase-negative staphylococci (n=11), Gram-positive anaerobic cocci (n=3), Gram-positive anaerobic rods (n=1), Gram-negative rods (n=1), and mixed bacteria (n=1). In 21 of 44 sonication-positive episodes, bacterial counts were significant (≥10 colony-forming units/mL of sonication fluid). In 26 sterilized controls, sonication cultures remained negative in 25 cases (96%). In 112 cases without clinical infection, conventional swab cultures were performed: 30 cultures (27%) were positive, and 18 (60%) were concordant with sonication fluid cultures. Six devices and leads were removed because of infection, growing Staphylococcus aureus, Streptococcus mitis, and coagulase-negative staphylococci in 6 sonication fluid cultures and 4 conventional swab cultures. CONCLUSIONS: Bacteria can colonize cardiac electrophysiological devices without clinical signs of infection.
Abstract:
The occupational health risk involved with handling nanoparticles is the probability that a worker will experience an adverse health effect: this is calculated as a function of the worker's exposure relative to the potential biological hazard of the material. Addressing the risks of nanoparticles therefore requires knowledge of occupational exposure and the release of nanoparticles into the environment, as well as toxicological data. However, information on exposure is currently not systematically collected; this risk assessment therefore lacks quantitative data. This thesis aimed, first, at creating the fundamental data necessary for a quantitative assessment and, second, at evaluating methods to measure occupational nanoparticle exposure. The first goal was to determine what is being used where in Swiss industries. This was followed by an evaluation of the adequacy of existing measurement methods to assess workplace nanoparticle exposure to complex size distributions and concentration gradients. The study was conceived as a series of methodological evaluations aimed at better understanding nanoparticle measurement devices and methods. It focused on inhalation exposure to airborne particles, as respiration is considered to be the most important entrance pathway for nanoparticles into the body in terms of risk. The targeted survey (pilot study) was conducted as a feasibility study for a later nationwide survey on the handling of nanoparticles and the application of specific protective measures in industry. The study consisted of targeted phone interviews with health and safety officers of Swiss companies that were believed to use or produce nanoparticles. This was followed by a representative survey on the level of nanoparticle usage in Switzerland. It was designed based on the results of the pilot study. The study was conducted among a representative selection of clients of the Swiss National Accident Insurance Fund (SUVA), covering about 85% of Swiss production companies. The third part of this thesis focused on the methods to measure nanoparticles. Several pre-studies were conducted to study the limits of commonly used measurement devices in the presence of nanoparticle agglomerates. This focus was chosen because several discussions with users and producers of the measurement devices raised questions about their accuracy when measuring nanoparticle agglomerates and because, at the same time, the two survey studies revealed that such powders are frequently used in industry. The first preparatory experiment focused on the accuracy of the scanning mobility particle sizer (SMPS), which showed an improbable size distribution when measuring powders of nanoparticle agglomerates. Furthermore, the thesis includes a series of smaller experiments that took a closer look at problems encountered with other measurement devices in the presence of nanoparticle agglomerates: condensation particle counters (CPC), the portable aerosol spectrometer (PAS), a device to estimate the aerodynamic diameter, as well as diffusion size classifiers. Some initial feasibility tests for the efficiency of filter-based sampling and subsequent counting of carbon nanotubes (CNT) were conducted last. The pilot study provided a detailed picture of the types and amounts of nanoparticles used and of the knowledge of the health and safety experts in the companies.
Considerable maximal quantities (> 1'000 kg/year per company) of Ag, Al-Ox, Fe-Ox, SiO2, TiO2, and ZnO (mainly first-generation particles) were declared by the contacted Swiss companies. The median quantity of handled nanoparticles, however, was 100 kg/year. The representative survey was conducted by contacting by post a representative selection of 1'626 clients of the Swiss National Accident Insurance Fund (SUVA). It allowed estimation of the number of companies and workers dealing with nanoparticles in Switzerland. The extrapolation from the surveyed companies to all companies of the Swiss production sector suggested that 1'309 workers (95% confidence interval 1'073 to 1'545) of the Swiss production sector are potentially exposed to nanoparticles in 586 companies (145 to 1'027). These numbers correspond to 0.08% (0.06% to 0.09%) of all workers and to 0.6% (0.2% to 1.1%) of companies in the Swiss production sector. A few well-known methods exist to measure airborne concentrations of sub-micrometre-sized particles. However, it was unclear how well the different instruments perform in the presence of the often quite large agglomerates of nanostructured materials. The evaluation of devices and methods therefore focused on nanoparticle agglomerate powders. It allowed the identification of the following potential sources of inaccurate measurements at workplaces with considerably high concentrations of airborne agglomerates: a standard SMPS showed bi-modal particle size distributions when measuring large nanoparticle agglomerates; differences of up to a factor of a thousand were found between diffusion size classifiers and CPC/SMPS; the agreement between CPC/SMPS and the portable aerosol spectrometer (PAS) was much better, but depending on the concentration, size or type of the powders measured, the differences could still reach an order of magnitude; specific difficulties and uncertainties in workplace assessments were identified, since background particles can interact with particles created by a process, which makes accounting for the background concentration difficult; and electric motors produce high numbers of nanoparticles and can confound the measurement of process-related exposure. Conclusion: The surveys showed that nanoparticle applications exist in many industrial sectors in Switzerland and that some companies already use them in high quantities. The representative survey demonstrated a low prevalence of nanoparticle usage in most branches of Swiss industry and led to the conclusion that the introduction of applications using nanoparticles (especially outside industrial chemistry) is only beginning. Even though the number of potentially exposed workers was reportedly rather small, it nevertheless underscores the need for exposure assessments. Understanding exposure and how to measure it correctly is very important because the potential health effects of nanomaterials are not yet fully understood. The evaluation showed that many devices and methods for measuring nanoparticles need to be validated for nanoparticle agglomerates before large exposure assessment studies can begin.
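The survey extrapolation reported above (counts of potentially exposed workers and companies with 95% confidence intervals) can be illustrated with a simple design-based sketch. All numbers below, the sampling fraction and the normal-approximation (Wald) interval are illustrative assumptions; the thesis's actual estimator and stratification are not reproduced here.

```python
import math

# Illustrative numbers only: usable responses, respondents reporting
# nanoparticle use, and an assumed share of the population that was surveyed.
n_surveyed = 947          # hypothetical number of usable responses
n_using = 15              # hypothetical number reporting nanoparticle use
sampling_fraction = 0.04  # hypothetical share of all Swiss production companies

# Estimated prevalence of nanoparticle use among respondents
p_hat = n_using / n_surveyed

# Normal-approximation (Wald) 95% confidence interval for the prevalence
se = math.sqrt(p_hat * (1 - p_hat) / n_surveyed)
ci_low, ci_high = p_hat - 1.96 * se, p_hat + 1.96 * se

# Extrapolation to the full population of companies
population = n_surveyed / sampling_fraction
print(f"prevalence: {p_hat:.2%} (95% CI {ci_low:.2%} to {ci_high:.2%})")
print(f"estimated companies using nanoparticles: "
      f"{p_hat * population:.0f} ({ci_low * population:.0f} to {ci_high * population:.0f})")
```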
Abstract:
The present research reviews the analysis and modeling of Swiss franc interest rate curves (IRC) using unsupervised (SOM, Gaussian mixtures) and supervised (MLP) machine learning algorithms. IRC are considered as objects embedded in different feature spaces: maturities; maturity date; and the parameters of the Nelson-Siegel model (NSM). Analysis of the NSM parameters and their temporal and clustering structures helps to understand the relevance of the model and its potential use for forecasting. The mapping of IRC in a maturity-date feature space is presented and analyzed for visualization and forecasting purposes.
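For reference, the Nelson-Siegel model mentioned above represents a yield curve with four parameters (level, slope, curvature and a decay factor). The sketch below shows the standard NSM yield function and a least-squares fit with scipy; the maturities and yields are made up for illustration and are not the Swiss franc data analysed in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit


def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Standard Nelson-Siegel yield at maturity tau (decay parameter lam > 0)."""
    x = tau / lam
    loading = (1 - np.exp(-x)) / x
    return beta0 + beta1 * loading + beta2 * (loading - np.exp(-x))


# Made-up sample curve: maturities in years and observed yields in percent
maturities = np.array([0.25, 0.5, 1, 2, 5, 10, 20, 30])
yields = np.array([0.9, 1.0, 1.2, 1.5, 2.0, 2.4, 2.7, 2.8])

# Least-squares fit of the four NSM parameters to the observed curve
params, _ = curve_fit(nelson_siegel, maturities, yields, p0=[2.5, -1.5, 0.5, 2.0])
beta0, beta1, beta2, lam = params
print(f"level={beta0:.3f}  slope={beta1:.3f}  curvature={beta2:.3f}  lambda={lam:.3f}")
```

In an analysis like the one described, the fitted (beta0, beta1, beta2, lam) vectors over time would form the NSM parameter feature space fed to the clustering and forecasting models.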
Abstract:
Scientific reporting and communication is a challenging topic for which traditional study programs do not offer structured learning activities on a regular basis. This paper reports on the development and implementation of a web application and associated learning activities intended to raise awareness of reporting and communication issues among students in forensic science and law. The project covers interdisciplinary case studies based on a library of written reports about forensic examinations. Special features of the web framework, in particular a report annotation tool, support the design of various individual and group learning activities that focus on the development of knowledge and competence in dealing with reporting and communication challenges in the students' future areas of professional activity.
Abstract:
Portable (roll-out) stop signs are used at school crossings in over 300 cities in Iowa. Their use conforms to the Code of Iowa, although it is not consistent with the provisions of the Manual on Uniform Traffic Control Devices adopted for nationwide application. A survey indicated that most users in Iowa believe that portable stop signs provide effective protection at school crossings and favor their continued use. Other non-uniform signs that fold or rotate to display a STOP message only during certain hours are used at school crossings in over 60 cities in Iowa. Their use does not conform to either the Code of Iowa or the Manual on Uniform Traffic Control Devices. Users of these devices also tend to favor their continued use. A survey of other states indicated that use of temporary devices similar to those used in Iowa is not generally sanctioned. Some unsanctioned use apparently occurs in several states, however. A different type of portable stop sign for school crossings is authorized and widely used in one state. Portable stop signs similar to those used in Iowa are authorized in another state, although their use is quite limited. A few reports in the literature reviewed for this research discussed the use of portable stop signs. The authors of these reports uniformly recommended against the use of portable or temporary traffic control devices. Various reasons for this recommendation were given, although data to support the recommendation were not offered. As part of this research, field surveys were conducted at 54 locations in 33 communities where temporary stop control devices were in use at school crossings. Research personnel observed the obedience to stop control and measured the vehicular delay incurred. Stopped delay averaged 1.89 seconds per entering vehicle. Only 36.6 percent of the vehicles were observed to come to a complete stop at the study locations controlled by temporary stop control devices. However, this level of obedience does not differ from that observed at intersections controlled by permanent stop signs. Accident experience was compiled for 76 intersections in 33 communities in Iowa where temporary stop signs were used and, for comparative purposes, for 76 comparable intersections having other forms of control or operating without stop control. There were no significant differences in accident experience. An economic analysis of vehicle operating costs, delay costs, and other costs indicated that temporary stop control generated costs only about 12 percent as great as permanent stop control for a street having a school crossing. Midblock pedestrian-actuated signals were shown to be cost-effective in comparison with temporary stop signs under the conditions of use assumed. Such signals could be used effectively at a number of locations where temporary stop signs are being used. The results of this research do not provide a basis for recommending that the use of portable stop signs be prohibited. However, erratic patterns of use of these devices and inadequate designs suggest that improved standards for their use are needed. Accordingly, nine recommendations are presented to enhance the efficiency of vehicular flow at school crossings without causing a decline in the level of pedestrian protection being afforded.
Abstract:
Glucose-dependent insulinotropic polypeptide (GIP) is a key incretin hormone, released from the intestine after a meal and producing a glucose-dependent insulin secretion. The GIP receptor (GIPR) is expressed on pyramidal neurons in the cortex and hippocampus, and GIP is synthesized in a subset of neurons in the brain. However, the role of the GIPR in neuronal signaling is not clear. In this study, we used a mouse strain with GIPR gene deletion (GIPR KO) to elucidate the role of the GIPR in neuronal communication and brain function. Compared with C57BL/6 control mice, GIPR KO mice displayed higher locomotor activity in an open-field task. Impairments of recognition and of spatial learning and memory were found in GIPR KO mice in the object recognition task and a spatial water maze task, respectively. In an object location task, no impairment was found. GIPR KO mice also showed impaired synaptic plasticity in paired-pulse facilitation and a block of long-term potentiation in area CA1 of the hippocampus. Moreover, a large decrease in the number of neuronal progenitor cells was found in the dentate gyrus of transgenic mice, although the number of young neurons was not changed. Together the results suggest that GIP receptors play an important role in cognition, neurotransmission, and cell proliferation.
Abstract:
Individual learning (e.g., trial-and-error) and social learning (e.g., imitation) are alternative ways of acquiring and expressing the appropriate phenotype in an environment. The optimal choice between using individual learning and/or social learning may be dictated by the life-stage or age of an organism. Of special interest is a learning schedule in which social learning precedes individual learning, because such a schedule is apparently a necessary condition for cumulative culture. Assuming two obligatory learning stages per discrete generation, we obtain the evolutionarily stable learning schedules for the three situations where the environment is constant, fluctuates between generations, or fluctuates within generations. During each learning stage, we assume that an organism may target the optimal phenotype in the current environment by individual learning, and/or the mature phenotype of the previous generation by oblique social learning. In the absence of exogenous costs to learning, the evolutionarily stable learning schedules are predicted to be either pure social learning followed by pure individual learning ("bang-bang" control) or pure individual learning at both stages ("flat" control). Moreover, we find for each situation that the evolutionarily stable learning schedule is also the one that optimizes the learned phenotype at equilibrium.
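A toy simulation can make the two candidate schedules concrete. The sketch below assumes a one-dimensional phenotype, a constant environment, individual learning that closes a fixed fraction of the gap to the optimum, and oblique social learning that copies the previous generation's mature phenotype. These simplifications and parameter values are illustrative assumptions rather than the paper's actual model, but they show why the "bang-bang" schedule can accumulate improvements across generations while the "flat" schedule cannot.

```python
OPTIMUM = 1.0   # optimal phenotype in the constant environment
GAIN = 0.5      # fraction of the remaining gap closed by one stage of individual learning
NAIVE = 0.0     # phenotype of a naive individual before any learning


def individual_learning(phenotype: float) -> float:
    """Move a fixed fraction of the way toward the environmental optimum."""
    return phenotype + GAIN * (OPTIMUM - phenotype)


def run(schedule, generations: int = 10) -> float:
    """Mature phenotype after several generations under a two-stage learning schedule."""
    parent = NAIVE
    for _ in range(generations):
        phenotype = NAIVE
        for stage in schedule:
            if stage == "social":
                phenotype = parent          # oblique copying of the previous generation
            else:
                phenotype = individual_learning(phenotype)
        parent = phenotype                  # becomes the model for the next generation
    return parent


# "bang-bang": social learning first, then individual learning.
# "flat"     : individual learning at both stages.
print("bang-bang:", run(["social", "individual"]))      # approaches the optimum over generations
print("flat     :", run(["individual", "individual"]))  # stays at 0.75 in every generation
```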
Abstract:
The orchestration of collaborative learning processes in face-to-face physical settings, such as classrooms, requires teachers to coordinate students, indicating who belongs to each group, which collaboration areas are assigned to each group, and how resources or roles should be distributed within the group. In this paper we present an Orchestration Signal system, composed of wearable Personal Signal devices and an Orchestration Signal manager. Teachers can configure color signals in the manager so that they are transmitted to the wearable devices to indicate different orchestration aspects. In particular, the paper describes how the system has been used to carry out a Jigsaw collaborative learning flow in a classroom where students received signals indicating which documents they should read, in which group they were and in which area of the classroom they were expected to collaborate. The evaluation results show that the proposed system facilitates a dynamic, visual and flexible orchestration.
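A minimal sketch of how an orchestration manager might turn a Jigsaw configuration into per-student colour signals is shown below; the Signal class, the jigsaw_signals helper and the example configuration are illustrative assumptions, not the actual Personal Signal device API.

```python
from dataclasses import dataclass


@dataclass
class Signal:
    """One orchestration signal pushed to a student's wearable device (illustrative)."""
    student: str
    color: str      # group colour shown on the device
    document: str   # which document to read
    area: str       # classroom area where the group collaborates


def jigsaw_signals(groups: dict, documents: dict, areas: dict) -> list:
    """Build one signal per student from a Jigsaw group configuration."""
    signals = []
    for color, students in groups.items():
        for student in students:
            signals.append(Signal(student, color, documents[color], areas[color]))
    return signals


# Hypothetical configuration for one Jigsaw phase
groups = {"red": ["ana", "ben"], "blue": ["chloe", "dan"]}
documents = {"red": "reading-1.pdf", "blue": "reading-2.pdf"}
areas = {"red": "front-left corner", "blue": "back-right corner"}

for signal in jigsaw_signals(groups, documents, areas):
    # In the real system this would be transmitted to the Personal Signal device.
    print(signal)
```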
Abstract:
This paper presents a customizable system used to develop a collaborative multi-user problem solving game. It addresses the increasing demand for appealing informal learning experiences in museum-like settings. The system facilitates remote collaboration by allowing groups of learners to communicate through a videoconferencing system and by allowing them to simultaneously interact through a shared multi-touch interactive surface. A user study with 20 user groups indicates that the game facilitates collaboration between local and remote groups of learners. The videoconference and multi-touch surface acted as communication channels, attracted students' interest, facilitated engagement, and promoted inter- and intra-group collaboration, favoring intra-group collaboration. Our findings suggest that augmenting videoconferencing systems with a shared multi-touch space offers new possibilities and scenarios for remote collaborative environments and collaborative learning.
Abstract:
Machine Learning for geospatial data: algorithms, software tools and case studies. The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence. It is mainly concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In a few words, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions to classification, regression, and probability density modeling problems in high-dimensional geo-feature spaces composed of geographical space and additional relevant spatially referenced features. They are well suited to be implemented as predictive engines in decision support systems, for the purposes of environmental data mining including pattern recognition, modeling and prediction as well as automatic data mapping. They have efficiency competitive with geostatistical models in low-dimensional geographical spaces but are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for geo- and environmental sciences are presented in detail, from the theoretical description of the concepts to the software implementation. The main algorithms and models considered are the following: the multi-layer perceptron (a workhorse of machine learning), general regression neural networks, probabilistic neural networks, self-organising (Kohonen) maps, Gaussian mixture models, radial basis function networks, and mixture density networks. This set of models covers machine learning tasks such as classification, regression, and density estimation. Exploratory data analysis (EDA) is an initial and very important part of data analysis.
In this thesis the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, such as experimental variography, and machine learning. Experimental variography is a basic tool for the geostatistical analysis of anisotropic spatial correlations, which helps to understand the presence of spatial patterns describable by two-point statistics. A machine learning approach to ESDA is presented by applying the k-nearest neighbors (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a currently hot topic, namely the automatic mapping of geospatial data. General regression neural networks (GRNN) are proposed as an efficient model to solve this task. The performance of the GRNN model is demonstrated on Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters and has the following structure: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. The Machine Learning Office tools were developed over the last 15 years and have been used both for many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and for carrying out fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, classification of soil types and hydro-geological units, decision-oriented mapping with uncertainties, and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were developed as well. The software is user-friendly and easy to use.
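As background for the automatic-mapping result above: a GRNN is essentially Nadaraya-Watson kernel regression with a single smoothing parameter (the kernel bandwidth). The sketch below applies it to 2-D spatial interpolation; the coordinates and measured values are synthetic illustrations, not the SIC 2004 data, and in practice the bandwidth would be tuned, for example by cross-validation.

```python
import numpy as np


def grnn_predict(train_xy, train_z, query_xy, sigma=1.0):
    """General Regression Neural Network (Nadaraya-Watson) prediction.

    train_xy : (n, 2) training coordinates
    train_z  : (n,)   training values
    query_xy : (m, 2) prediction coordinates
    sigma    : Gaussian kernel bandwidth (the only free parameter of a GRNN)
    """
    # Squared Euclidean distances between every query and training point
    d2 = ((query_xy[:, None, :] - train_xy[None, :, :]) ** 2).sum(axis=-1)
    # Gaussian kernel weights
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    # Kernel-weighted average of the training values
    return (w @ train_z) / w.sum(axis=1)


# Synthetic monitoring data: coordinates (km) and a measured value at each site
rng = np.random.default_rng(0)
train_xy = rng.uniform(0, 10, size=(50, 2))
train_z = np.sin(train_xy[:, 0]) + 0.1 * rng.normal(size=50)

# Predict on a small regular grid (automatic mapping)
gx, gy = np.meshgrid(np.linspace(0, 10, 5), np.linspace(0, 10, 5))
grid = np.column_stack([gx.ravel(), gy.ravel()])
print(grnn_predict(train_xy, train_z, grid, sigma=0.8).round(2))
```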
Abstract:
Learning object repositories are a basic piece of virtual learning environments used for content management. Nevertheless, learning objects have special characteristics that make traditional solutions for content management ineffective. In particular, browsing and searching for learning objects cannot be based on the typical authoritative meta-data used for describing content, such as author, title or publication date, among others. We propose to build a social layer on top of a learning object repository, providing final users with additional services for describing, rating and curating learning objects from a teaching perspective. All these interactions among users, services and resources can be captured and further analyzed, so both browsing and searching can be personalized according to the user profile and the educational context, helping users to find the most valuable resources for their learning process. In this paper we propose to use reputation schemes and collaborative filtering techniques for improving the user interface of a DSpace-based learning object repository.
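One way to read the last sentence is as standard user-based collaborative filtering over the ratings that teachers give to learning objects. The sketch below, with a made-up ratings matrix and cosine similarity, illustrates the idea; it is not the actual DSpace integration or reputation scheme described in the paper.

```python
import numpy as np

# Made-up ratings matrix: rows = users, columns = learning objects, 0 = not rated
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 0, 0],
    [0, 1, 5, 4],
    [1, 0, 4, 5],
], dtype=float)


def cosine(u, v):
    """Cosine similarity between two rating vectors."""
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)


def recommend(user: int, k: int = 2) -> np.ndarray:
    """Rank unrated learning objects for `user` using the k most similar users."""
    sims = np.array([cosine(ratings[user], ratings[v]) for v in range(len(ratings))])
    sims[user] = -1.0                      # exclude the user themself
    neighbours = np.argsort(sims)[-k:]     # k most similar users
    scores = sims[neighbours] @ ratings[neighbours] / (sims[neighbours].sum() + 1e-12)
    scores[ratings[user] > 0] = -np.inf    # only recommend objects not yet rated
    return np.argsort(scores)[::-1]        # best-scored objects first


print("top recommendation for user 0: learning object", recommend(0)[0])
```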