24 results for Transformative Mappings
Abstract:
Modelling architectural information is particularly important because of the acknowledged crucial role of software architecture in raising the level of abstraction during development. In the MDE area, the level of abstraction of models has frequently been related to low-level design concepts. However, model-driven techniques can be further exploited to model software artefacts that take into account the architecture of the system and its changes according to variations in the environment. In this paper, we propose model-driven techniques and dynamic variability as concepts useful for modelling the dynamic fluctuation of the environment and its impact on the architecture. Using the mappings from the models to the implementation, generative techniques allow the (semi-)automatic generation of artefacts, making the process more efficient and promoting software reuse. The automatic generation of configurations and reconfigurations from models provides the basis for safer execution. The architectural perspective offered by the models shifts the focus away from implementation details to the whole view of the system and its runtime change, promoting high-level analysis. © 2009 Springer Berlin Heidelberg.
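As a rough illustration of the general idea described above, and not code taken from the paper, the sketch below shows how a configuration and a reconfiguration artefact might be generated from a tiny variability model driven by an environment context. The component names and the "bandwidth_mbps" property are illustrative assumptions only.

```python
# Minimal sketch (not from the paper): dynamic variability as a small
# variability model plus a generation step that derives configuration and
# reconfiguration artefacts from the current environment context.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Variant:
    name: str                          # architectural variant (hypothetical names)
    condition: Callable[[dict], bool]  # predicate over the environment context

# Variability model: which variant of a "VideoStream" component to bind.
VARIANTS = [
    Variant("VideoStream.HighDef", lambda ctx: ctx["bandwidth_mbps"] >= 10),
    Variant("VideoStream.LowDef",  lambda ctx: ctx["bandwidth_mbps"] < 10),
]

def generate_configuration(context: dict) -> list[str]:
    """Generate the configuration (set of bound variants) for a given context."""
    return [v.name for v in VARIANTS if v.condition(context)]

def generate_reconfiguration(old_ctx: dict, new_ctx: dict) -> dict:
    """A reconfiguration is the difference between two generated configurations."""
    old, new = set(generate_configuration(old_ctx)), set(generate_configuration(new_ctx))
    return {"unbind": sorted(old - new), "bind": sorted(new - old)}

print(generate_configuration({"bandwidth_mbps": 25}))
print(generate_reconfiguration({"bandwidth_mbps": 25}, {"bandwidth_mbps": 4}))
```

Because both the configuration and the reconfiguration are derived mechanically from the model and the context, the same generation step can be re-run whenever the environment changes, which is the sense in which generation supports safer runtime change.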
Abstract:
Purpose – This paper aims to focus on developing critical understanding in human resource management (HRM) students in Aston Business School, UK. The paper reveals that innovative teaching methods encourage deep approaches to study, an indicator of students reaching their own understanding of material and ideas. This improves student employability and satisfies employer needs. Design/methodology/approach – Student responses to two second-year business modules, matched for high student approval ratings, were collected through focus group discussion. One module was taught using enquiry-based learning (EBL) and the story method, whilst the other used traditional teaching methods. Transcripts were analysed and compared using the structure of the ASSIST measure. Findings – Critical understanding and transformative learning can be developed through the innovative teaching methods of EBL and the story method. Research limitations/implications – The limitation is that this is a single case study comparing and contrasting two business modules. The implication is that the study should be replicated and developed in different learning settings, so that there are multiple data sets to confirm the research finding. Practical implications – Future curriculum development, especially in higher education (HE), still needs to encourage students and lecturers to understand more about the nature of knowledge and how to learn. The application of EBL and the story method is described in a module case study – “Strategy for Future Leaders”. Originality/value – This is a systematic and comparative study to improve understanding of how students and lecturers learn and of the context in which the learning takes place.
Abstract:
This paper seeks to advance research and practice related to the role of employers in all stages of the assessment process of work-based learning (WBL) within a tripartite relationship of higher education institution (HEI), student and employer. It proposes a research-informed quality enhancement framework to develop good practice in engaging employers as partners in assessment. The Enhancement Framework comprises three dimensions, each of which includes elements and questions generated by the experiences of WBL students, HEI staff and employers. The three dimensions of the Enhancement Framework are: 1. ‘premises of assessment’ encompassing issues of learning, inclusion, standards and value; 2. ‘practice’, encompassing stages of assessment made up of course design, assessment task, responsibilities, support, grading and feedback; 3. ‘communication of assessment’ with the emphasis on role clarity, language and pathways. With its prompt questions, the Enhancement Framework may be used as a capacity-building tool for promoting, sustaining, benchmarking and evaluating productive dialogue and critical reflection about assessment between WBL partners. The paper concludes by emphasising the need for professional development as well as policy and research development, so that assessment in WBL can more closely correspond to the potentially transformative nature of the learning experience.
Abstract:
There is a strongly rooted assumption that foreign policy is an executive domain and rarely plays a role in the electoral struggle. Germany is something of an exception, and this paper looks at the way foreign policy has provided a site for electoral competition there. Its principal focus is on the two Grand Coalitions, with particular emphasis on the contest between Chancellor Merkel and Foreign Minister Steinmeier to turn foreign policy profile into party advantage. It concludes that this contest was won overwhelmingly by Chancellor Merkel but that, in contrast to the first Grand Coalition, foreign policy was not transformative. © 2010 Association for the Study of German Politics.
Abstract:
Multiple transformative forces target marketing, many of which derive from new technologies that allow us to sample thinking in real time (i.e., brain imaging) or to look at large aggregations of decisions (i.e., big data). There has been an inclination to refer to the intersection of these technologies with the general topic of marketing as “neuromarketing”, yet there has not been a serious effort to frame neuromarketing; that is the goal of this paper. Neuromarketing can be compared to neuroeconomics: neuroeconomics is generally focused on how individuals make “choices” and on how distributions of choices are represented, whereas neuromarketing focuses on how a distribution of choices can be shifted or “influenced”, which can occur at multiple “scales” of behavior (e.g., individual, group, or market/society). Given that influence can affect choice through many cognitive modalities, and not just the valuation of choice options, a science of influence also implies a need to develop a model of cognitive function integrating attention, memory, and reward/aversion function. The paper concludes with a brief description of three domains of neuromarketing application for studying influence, and their caveats.
Abstract:
This thesis investigates transport properties in high-temperature plasmas using symplectic mappings. A formalism is developed for deriving such maps from symplectic integrators. Concrete maps are given and analyzed.
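As a rough illustration of the general idea, and not material drawn from the thesis itself, one step of a symplectic (kick-drift) integrator for a pendulum-like Hamiltonian already defines an area-preserving map; with unit step size it reduces to the Chirikov standard map, whose long orbits are the kind of object used for transport studies. The parameter values and initial conditions below are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not from the thesis): one step of a kick-drift integrator for
# H(q, p) = p**2 / 2 + K * cos(q) defines an area-preserving (symplectic) map;
# with dt = 1 it coincides with the Chirikov standard map.

def symplectic_step(q, p, K=0.9, dt=1.0):
    """One integrator step; the step itself is the symplectic map."""
    p_new = p + dt * K * np.sin(q)            # kick:  p' = p - dt * dH/dq
    q_new = (q + dt * p_new) % (2 * np.pi)    # drift: q' = q + dt * dH/dp at p'
    return q_new, p_new

def orbit(q0, p0, n_steps=5000, **kwargs):
    """Iterate the map; long orbits of this kind feed transport statistics."""
    qs, ps = np.empty(n_steps + 1), np.empty(n_steps + 1)
    qs[0], ps[0] = q0, p0
    for n in range(n_steps):
        qs[n + 1], ps[n + 1] = symplectic_step(qs[n], ps[n], **kwargs)
    return qs, ps

if __name__ == "__main__":
    qs, ps = orbit(1.0, 0.5)
    # Symplecticity check: the Jacobian of one step has determinant 1.
    K, dt, q = 0.9, 1.0, qs[0]
    jac = np.array([[1 + dt**2 * K * np.cos(q), dt],
                    [dt * K * np.cos(q),        1.0]])
    print("det(Jacobian) =", np.linalg.det(jac))   # -> 1.0 (area preserving)
    print("momentum spread over the orbit:", np.var(ps))
```

Because the map is the integrator step itself, it inherits the integrator's symplecticity exactly, which is what makes long iterations meaningful for transport statistics such as the growth of the momentum spread.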
Abstract:
This chapter contributes to the anthology on learning to research - researching to learn because it emphasises a need to design curricula that enable living research, and on-going researcher development, rather than a curriculum that restricts student and staff activities within a marketised approach towards time. In recent decades higher education (HE) has come to be valued for its contribution to the global economy. Referred to as the neo-liberal university, a strong prioritisation has been placed on meeting the needs of industry by providing a better workforce. This perspective emphasises the role of a degree in HE to secure future material affluence, rather than to study as an on-going investment in the self (Molesworth, Nixon & Scullion, 2009: 280). Students are treated primarily as consumers in this model, where through their tuition fees they purchase a product, rather than benefit from the transformative potential university education offers for the whole of life. Given that HE is now measured by the numbers of students it attracts, and later places into well-paid jobs, there is an intense pressure on time, which has led to an approach where the learning experiences of students are broken down into discrete modules. Whilst this provides consistency, students can come to view research processes in a fragmented way within the modular system. Topics are presented chronologically, week by week, and students simply complete a set of tasks to ‘have a degree’, rather than to ‘be learners’ (Molesworth, Nixon & Scullion, 2009: 277) who are living their research in relation to their own past, present and future. The idea of living research in this context is my own adaptation of an approach suggested by C. Wright Mills (1959) in The Sociological Imagination. Mills advises that successful scholars do not split their work from the rest of their lives, but treat scholarship as a choice of how to live, as well as a choice of career. The marketised slant in HE thus creates a tension, firstly, for students who are learning to research. Mills would encourage them to be creative, not instrumental, in their use of time, yet they are journeying through a system that is structured for a swift progression towards a well-paid job, rather than crafted for reflexive inquiry that transforms their understanding throughout life. Many universities are placing a strong focus on discrete skills for student employability, but I suggest that embedding the transformative skills emphasised by Mills empowers students and builds their confidence, helping them make the connections that aid their employability. Secondly, the marketised approach creates a problem for staff designing the curriculum, if students do not easily make links across time over their years of study and whole programmes. By researching to learn, staff can discover new methods to apply in their design of the curriculum, to help students make important and creative connections across their programmes of study.
Abstract:
Background: Synthetic phonics is the widely accepted approach for teaching reading in English: children are taught to sound out the letters in a word and then blend these sounds together. Aims: We compared the impact of two synthetic phonics programmes on early reading. Sample: Children received Letters and Sounds (L&S; 7 schools), which teaches multiple letter-sound mappings, or Early Reading Research (ERR; 10 schools), which teaches only the most consistent mappings plus frequent words by sight. Method: We measured phonological awareness (PA) and reading from school entry to the end of the second (all schools) or third school year (4 ERR, 3 L&S schools). Results: PA was significantly related to all reading measures for the whole sample. However, there was a closer relationship between PA and exception word reading for children receiving the L&S programme. The programmes were equally effective overall, but their impact on reading significantly interacted with school-entry PA: children with poor PA at school entry achieved higher reading attainments under ERR (significant group difference on exception word reading at the end of the first year), whereas children with good PA performed equally well under either programme. Conclusions: The more intensive phonics programme (L&S) heightened the association between PA and exception word reading. Although the programmes were equally effective for most children, results indicate potential benefits of ERR for children with poor PA. We suggest that phonics programmes could be simplified to teach only the most consistent mappings plus frequent words by sight.
Abstract:
The focus of this thesis is the extension of topographic visualisation mappings to allow for the incorporation of uncertainty. Few visualisation algorithms in the literature are capable of mapping uncertain data, and fewer still are able to represent observation uncertainties in the visualisations. As such, modifications are made to NeuroScale, Locally Linear Embedding, Isomap and Laplacian Eigenmaps to incorporate uncertainty in the observation and visualisation spaces. The proposed mappings are called Normally-distributed NeuroScale (N-NS), T-distributed NeuroScale (T-NS), Probabilistic LLE (PLLE), Probabilistic Isomap (PIso) and Probabilistic Weighted Neighbourhood Mapping (PWNM). These algorithms generate a probabilistic visualisation space in which each latent visualised point is transformed to a multivariate Gaussian or T-distribution using a feed-forward RBF network. Two types of uncertainty are then characterised, dependent on the data and the mapping procedure: data-dependent uncertainty is the inherent observation uncertainty, whereas mapping uncertainty is defined by the Fisher Information of a visualised distribution. This indicates how well the data have been interpolated, offering a level of ‘surprise’ for each observation. These new probabilistic mappings are tested on three datasets of vectorial observations and three datasets of real-world time series observations for anomaly detection. In order to visualise the time series data, a method for analysing observed signals and noise distributions, Residual Modelling, is introduced. The performance of the new algorithms on the tested datasets is compared qualitatively with the latent space generated by the Gaussian Process Latent Variable Model (GPLVM). A quantitative comparison using existing evaluation measures from the literature allows the performance of each mapping function to be compared. Finally, the mapping uncertainty measure is combined with NeuroScale to build a deep learning classifier, the Cascading RBF. This new structure is tested on the MNIST dataset, achieving world record performance whilst avoiding the flaws seen in other deep learning machines.
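As a rough, purely illustrative sketch of the kind of quantity the abstract calls mapping uncertainty, and not code from the thesis, the snippet below maps toy observations to per-point 2-D Gaussians with an RBF feature map and summarises each point's Fisher Information; for a Gaussian with known covariance, the Fisher Information with respect to the mean is the precision matrix. All names, weights and data in the sketch are made up.

```python
import numpy as np

# Minimal sketch (not the thesis code): a toy probabilistic visualisation where
# each observation is mapped to a 2-D Gaussian via RBF features, and mapping
# uncertainty is summarised by the Fisher Information (precision) of that Gaussian.

rng = np.random.default_rng(0)

def rbf_features(X, centres, width=1.0):
    """Gaussian RBF activations of observations X with respect to the centres."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

# Toy data: 200 observations in 10-D, 20 RBF centres, random output weights.
X = rng.normal(size=(200, 10))
centres = X[rng.choice(len(X), 20, replace=False)]
W_mean = rng.normal(size=(20, 2))                # RBF features -> 2-D means
W_logvar = rng.normal(scale=0.1, size=(20, 2))   # RBF features -> log-variances

Phi = rbf_features(X, centres)
mu = Phi @ W_mean                    # visualised 2-D means
var = np.exp(Phi @ W_logvar)         # per-point diagonal variances

# Mapping uncertainty summary: trace of the Fisher Information per point.
# With a diagonal covariance, the precision is simply 1 / variance.
fisher_trace = (1.0 / var).sum(axis=1)
print("least certain point:", fisher_trace.argmin(),
      "most certain point:", fisher_trace.argmax())
```

In this toy reading, points with a small Fisher-Information trace have broad visualised distributions, i.e. regions the mapping has interpolated poorly, which corresponds to the ‘surprise’ signal described in the abstract.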