926 results for Complex learning
Abstract:
Students are now immersed in a textual landscape vastly different from that of many English scholars, one that relies on the “reading” and interpretation of multiple simultaneous channels of information. In response to these new kinds of literate practices, my dissertation adds to the growing body of research on multimodal literacies, narratology in new media, and rhetoric through an examination of the place of video games in English teaching and research. I describe in this dissertation a hybridized theoretical basis for incorporating video games in English classrooms. This framework for textual analysis includes elements from narrative theory in literary study, rhetorical theory, and literacy theory; combined to account for the multiple modalities and complexities of gaming, these elements can provide new insights about those theories and practices across all kinds of media, whether written texts, films, or video games. In creating this framework, I hope to encourage students to view texts from a meta-level perspective, encompassing textual construction, use, and interpretation. To foster meta-level learning in an English course, I use specific theoretical frameworks from the fields of literary studies, narratology, film theory, aural theory, reader-response criticism, game studies, and multiliteracies theory to analyze a particular video game: World of Goo. These theoretical frameworks inform pedagogical practices used in the classroom for textual analysis of multiple media. Examining a video game from these perspectives, I use analytical methods from each, including close reading, explication, textual analysis, and individual elements of multiliteracies theory and pedagogy. In undertaking an in-depth analysis of World of Goo, I demonstrate the possibilities for classroom instruction with a complex blend of theories and pedagogies in English courses.
This blend of theories and practices is meant to foster literacy learning across media, helping students develop metaknowledge of their own literate practices in multiple modes. Finally, I outline a design for a multiliteracies course that would allow English scholars to use video games along with other texts to interrogate texts as systems of information. In doing so, students can hopefully view and transform systems in their own lives as audiences, citizens, and workers.
Abstract:
Many schools do not begin to introduce college students to software engineering until they have had at least one semester of programming. Since software engineering is a large, complex, and abstract subject, it is difficult to construct active learning exercises that build on the students’ elementary knowledge of programming and still teach basic software engineering principles. It is also the case that beginning students typically know how to construct small programs, but they have little experience with the techniques necessary to produce reliable and long-term maintainable modules. I have addressed these two concerns by defining a local standard (Montana Tech Method (MTM) Software Development Standard for Small Modules Template) that step-by-step directs students toward the construction of highly reliable small modules using well known, best-practices software engineering techniques. “Small module” is here defined as a coherent development task that can be unit tested, and can be carried out by a single (or a pair of) software engineer(s) in at most a few weeks. The standard describes the process to be used and also provides a template for the top-level documentation. The instructional module’s sequence of mini-lectures and exercises associated with the use of this (and other) local standards is used throughout the course, which perforce covers more abstract software engineering material using traditional reading and writing assignments. The sequence of mini-lectures and hands-on assignments (many of which are done in small groups) constitutes an instructional module that can be used in any similar software engineering course.
Abstract:
Person-to-stock order picking is highly flexible and requires minimal investment costs in comparison to automated picking solutions. For these reasons, traditional picking is widespread in distribution and production logistics. Due to its typically large proportion of manual activities, picking causes the highest operative personnel costs of all intralogistics processes. The required personnel capacity in picking varies short- and mid-term due to capacity requirement fluctuations. These dynamics are often balanced by employing minimal permanent staff and using seasonal help when needed. The resulting high personnel fluctuation necessitates the frequent training of new pickers, which, in combination with increasingly complex work contents, highlights the importance of learning processes in picking. In industrial settings, learning is often quantified based on diminishing processing time and cost requirements with increasing experience. The best-known industrial learning curve models include those from Wright, de Jong, Baloff, and Crossman, which are typically applied to the learning effects of an entire work crew rather than of individuals. These models have been validated in largely static work environments with homogeneous work contents. Little is known of learning effects in picking systems. Here, work contents are heterogeneous and individual work strategies vary among employees. A mix of temporary and steady employees with varying degrees of experience necessitates the observation of individual learning curves. In this paper, the individual picking performance development of temporary employees is analyzed and compared to that of steady employees in the same working environment.
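Wright's model, the oldest of the learning-curve models named above, can be stated compactly: the time per unit falls by a fixed fraction each time cumulative output doubles. A minimal sketch of the computation; the 80% learning rate and the first-pick time are illustrative assumptions, not values from the study:

```python
import math

def wright_time(t_first, units, learning_rate):
    """Wright's learning-curve model: the time for the x-th unit falls
    to `learning_rate` of its value each time cumulative output doubles.
    t_first is the time required for the first unit."""
    b = math.log(learning_rate) / math.log(2)  # curve exponent (negative)
    return t_first * units ** b

# An 80% learning rate: each doubling of cumulative picks cuts time by 20%.
t1 = 120.0  # hypothetical seconds for a new picker's first pick
print(round(wright_time(t1, 1, 0.8), 1))  # 120.0
print(round(wright_time(t1, 2, 0.8), 1))  # 96.0
print(round(wright_time(t1, 4, 0.8), 1))  # 76.8
```

Fitting the exponent `b` to logged pick times per employee is one way to obtain the individual learning curves the abstract calls for.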
Abstract:
Tuberous sclerosis complex (TSC) is a multisystem, autosomal dominant disorder affecting approximately 1 in 6000 births. Developmental brain abnormalities cause substantial morbidity and mortality and often lead to neurological disease including epilepsy, cognitive disabilities, and autism. TSC is caused by inactivating mutations in either TSC1 or TSC2, whose protein products are known inhibitors of mTORC1, an important kinase regulating translation and cell growth. Nonetheless, neither the pathophysiology of the neurological manifestations of TSC nor the extent of mTORC1 involvement in the development of these lesions is known. Murine models would greatly advance the study of this debilitating disorder. This thesis will describe the generation and characterization of a novel brain-specific mouse model of TSC, Tsc2flox/ko;hGFAP-Cre. In this model, the Tsc2 gene has been removed from most neurons and glia of the cortex and hippocampus by targeted Cre-mediated deletion in radial glial neuroprogenitor cells. The Tsc2flox/ko;hGFAP-Cre mice fail to thrive beginning at postnatal day 8 and die from seizures at around 23 days. Further characterization of these mice demonstrated megalencephaly, enlarged neurons, abnormal neuronal migration, altered progenitor pools, hypomyelination, and astrogliosis. The similarity of these defects to those of TSC patients establishes this mouse as an excellent model for studying the neuropathology of TSC and for testing novel therapies. We further describe the use of this mouse model to assess the therapeutic potential of the macrolide rapamycin, an inhibitor of mTORC1. We demonstrate that rapamycin administered from postnatal day 10 can extend the life of the mutant animals 5-fold. Since TSC is a neurodevelopmental disorder, we also assessed in utero and/or immediate postnatal treatment of the animals with rapamycin.
Amazingly, combined in utero and postnatal rapamycin effected a histologic rescue that was almost indistinguishable from control animals, indicating that dysregulation of mTORC1 plays a large role in TSC neuropathology. In spite of the almost complete histologic rescue, behavioral studies demonstrated that combined treatment resulted in poorer learning and memory than postnatal treatment alone. Postnatally treated animals behaved similarly to treated controls, suggesting that, in humans, the immediate newborn period might be the most opportune developmental timepoint for rapamycin administration.
Abstract:
Spike timing dependent plasticity (STDP) is a phenomenon in which the precise timing of spikes affects the sign and magnitude of changes in synaptic strength. STDP is often interpreted as the comprehensive learning rule for a synapse - the "first law" of synaptic plasticity. This interpretation is made explicit in theoretical models in which the total plasticity produced by complex spike patterns results from a superposition of the effects of all spike pairs. Although such models are appealing for their simplicity, they can fail dramatically. For example, the measured single-spike learning rule between hippocampal CA3 and CA1 pyramidal neurons does not predict the existence of long-term potentiation, one of the best-known forms of synaptic plasticity. Layers of complexity have been added to the basic STDP model to repair predictive failures, but they have been outstripped by experimental data. We propose an alternate first law: neural activity triggers changes in key biochemical intermediates, which act as a more direct trigger of plasticity mechanisms. One particularly successful model uses intracellular calcium as the intermediate and can account for many observed properties of bidirectional plasticity. In this formulation, STDP is not itself the basis for explaining other forms of plasticity, but is instead a consequence of changes in the biochemical intermediate, calcium. Eventually a mechanism-based framework for learning rules should include other messengers, discrete change at individual synapses, spread of plasticity among neighboring synapses, and priming of hidden processes that change a synapse's susceptibility to future change. Mechanism-based models provide a rich framework for the computational representation of synaptic plasticity.
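The superposition model the abstract critiques can be written down in a few lines: each pre/post spike pair contributes an exponentially decaying weight change, and the contributions of all pairs simply add. A minimal sketch with illustrative parameter values (not taken from the source):

```python
import math

# Pair-based STDP: amplitudes and time constant are illustrative.
A_PLUS, A_MINUS = 0.01, 0.012  # potentiation / depression amplitudes
TAU = 20.0                     # decay time constant in ms

def stdp_pair(dt):
    """Weight change for one spike pair; dt = t_post - t_pre in ms.
    dt <= 0 (post before or with pre) is treated as depression here."""
    if dt > 0:   # pre before post -> potentiation
        return A_PLUS * math.exp(-dt / TAU)
    else:        # post before pre -> depression
        return -A_MINUS * math.exp(dt / TAU)

def total_change(pre_spikes, post_spikes):
    """Superposition assumption: sum the contribution of every pair."""
    return sum(stdp_pair(t_post - t_pre)
               for t_pre in pre_spikes for t_post in post_spikes)
```

It is exactly this all-pairs summation that fails against the CA3-CA1 data; the calcium-based alternative replaces `stdp_pair` with a model of the biochemical intermediate.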
Abstract:
This paper applies a policy analysis approach to the question of how to effectively regulate micropollution in a sustainable manner. Micropollution is a complex policy problem characterized by a huge number and diversity of chemical substances, as well as various entry paths into the aquatic environment. It challenges traditional water quality management by calling for new technologies in wastewater treatment and behavioral changes in industry, agriculture and civil society. In light of such challenges, the question arises: how can such a complex phenomenon be regulated to ensure that water quality is maintained in the future? What can we learn from past experiences in water quality regulation? To answer these questions, policy analysis strongly focuses on the design and choice of policy instruments and the mix of such measures. In this paper, we review instruments commonly used in past water quality regulation. We evaluate their ability to respond to the characteristics of a more recent water quality problem, i.e., micropollution, in a sustainable way. In this way, we develop a new framework that integrates both the problem dimension (i.e., causes and effects of a problem) and the sustainability dimension (e.g., long-term, cross-sectoral and multi-level) to assess which policy instruments are best suited to regulate micropollution. We conclude that sustainability criteria help to identify an appropriate instrument mix of end-of-pipe and source-directed measures to reduce aquatic micropollution.
Abstract:
Should a firm stay focused or should it rather adopt a broader strategic perspective? This dissertation summarizes and extends the existing knowledge base on entrepreneurial, market, and learning orientation. Building on multiple theoretical perspectives, empirical evidence from prior studies, as well as on survey and archival data collected in two economic contexts, performance effects from individual orientations, their dimensions and combinations are explored. Results reveal that the three strategic orientations are highly interrelated and that their relationship to firm performance is more complex than previously assumed.
Abstract:
Games that simulate complex realities to be dealt with in teams are an effective tool for fostering interactive learning processes. They link different levels of decision-making in the household, community and societal contexts. Negotiation and harmonisation of different perceptions and interests, be it within or between different households, form the basis of a common strategy for sustainable development.
Abstract:
While sequence learning research models complex phenomena, previous studies have mostly focused on unimodal sequences. The goal of the current experiment is to put implicit sequence learning into a multimodal context: to test whether it can operate across different modalities. We used the Task Sequence Learning paradigm to test whether sequence learning varies across modalities, and whether participants are able to learn multimodal sequences. Our results show that implicit sequence learning is very similar regardless of the source modality. However, the presence of correlated task and response sequences was required for learning to take place. The experiment provides new evidence for implicit sequence learning of abstract conceptual representations. In general, the results suggest that correlated sequences are necessary for implicit sequence learning to occur. Moreover, they show that elements from different modalities can be automatically integrated into one unitary multimodal sequence.
Abstract:
Recent developments in federal policy have prompted the creation of state evaluation frameworks for principals and teachers that hold educators accountable for effective practices and student outcomes. These changes have created a demand for formative evaluation instruments that reflect current accountability pressures and can be used by schools to focus school improvement and leadership development efforts. The Comprehensive Assessment of Leadership for Learning (CALL) is a next-generation, 360-degree online assessment and feedback system that reflects best practices in feedback design. Unique characteristics of CALL include a focus on: leadership distributed throughout the school rather than carried out by an individual leader; assessment of leadership tasks rather than perceptions of leadership practice; the larger, more complex systems of middle and high schools; and transparency of assessment design. This paper describes research contributing to the design and validation of the CALL survey instrument.
Abstract:
This article presents the findings of a non-experimental, observational, correlational, basic field study using mixed data at the micro-sociological level, conducted through surveys. The object of study is to identify kinds of learning, and the unit of analysis was 529 high school students between 16 and 21 years old. Its purpose is to understand the impact of rote, guided, self-directed, and meaningful learning and its degree of achievement, alongside the learning outcomes of a differentiated curriculum based on David Ausubel's thought, associated with the different economic specialties (MINEDUC, 1998) in which the study population is trained. To collect data, the TADA-DO2 test was used, which has a Cronbach's alpha reliability of 0.911. From the results, the null hypothesis of no association can be rejected: there is a significant association (α = 0.05) between the kinds of learning and the learning expected from the differentiated training plan, for both males and females. It is difficult to state that the training of middle-level technicians leads to successful employment.
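The reliability index reported for the TADA-DO2 test is Cronbach's alpha: for k items, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch of the computation; the item scores below are invented purely to illustrate, not data from the study:

```python
def sample_var(xs):
    """Unbiased sample variance."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of scores per test item, aligned across the same
    respondents, so items[i][r] is respondent r's score on item i."""
    k = len(items)
    n = len(items[0])
    totals = [sum(item[r] for item in items) for r in range(n)]
    return k / (k - 1) * (1 - sum(map(sample_var, items)) / sample_var(totals))

# Two perfectly correlated items yield the maximum alpha of 1.0.
print(cronbach_alpha([[1, 2, 3], [1, 2, 3]]))  # 1.0
```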
Abstract:
Good, early fault detection and isolation, together with efficient alarm management and fine-grained sensor validation, are very important in today's complex process plants, especially in terms of safety enhancement and cost reduction. This paper presents a methodology for fault characterization. It is a self-learning approach developed in two phases: an initial learning phase, in which the simulation of process units, without and with different faults, lets the system automatically detect the key variables that characterize each fault; and a second, online phase, in which these key variables are monitored in order to diagnose possible faults. Using this scheme, faults can be diagnosed and isolated at an early stage, before a fault turns into a failure.
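The two-phase scheme can be sketched in a few lines. The selection criterion (ranking variables by their normal-vs-fault mean shift) and the 3-sigma alarm band are illustrative assumptions, since the abstract does not specify them:

```python
import statistics

def learn_key_variables(normal_runs, fault_runs, top_k=2):
    """Offline learning phase: rank variables by how far their mean shifts
    between normal and faulty simulations, in units of normal spread."""
    scores = {}
    for var in normal_runs:
        mu_n = statistics.mean(normal_runs[var])
        sd_n = statistics.stdev(normal_runs[var]) or 1.0  # guard flat signals
        mu_f = statistics.mean(fault_runs[var])
        scores[var] = abs(mu_f - mu_n) / sd_n
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

def monitor(sample, normal_runs, key_vars, n_sigma=3.0):
    """Online phase: flag a possible fault when any key variable leaves
    its +/- n_sigma band around normal operation."""
    alarms = []
    for var in key_vars:
        mu = statistics.mean(normal_runs[var])
        sd = statistics.stdev(normal_runs[var])
        if abs(sample[var] - mu) > n_sigma * sd:
            alarms.append(var)
    return alarms
```

A run would feed simulated normal and faulty traces to `learn_key_variables`, then call `monitor` on each new plant sample to raise early alarms on the selected variables.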