24 results for Transformative Mappings
in Aston University Research Archive
Abstract:
This thesis is a study of the generation of topographic mappings - dimension reducing transformations of data that preserve some element of geometric structure - with feed-forward neural networks. As an alternative to established methods, a transformational variant of Sammon's method is proposed, where the projection is effected by a radial basis function neural network. This approach is related to the statistical field of multidimensional scaling, and from that the concept of a 'subjective metric' is defined, which permits the exploitation of additional prior knowledge concerning the data in the mapping process. This then enables the generation of more appropriate feature spaces for the purposes of enhanced visualisation or subsequent classification. A comparison with established methods for feature extraction is given for data taken from the 1992 Research Assessment Exercise for higher educational institutions in the United Kingdom. This is a difficult high-dimensional dataset, and illustrates well the benefit of the new topographic technique. A generalisation of the proposed model is considered for implementation of the classical multidimensional scaling (MDS) routine. This is related to Oja's principal subspace neural network, whose learning rule is shown to descend the error surface of the proposed MDS model. Some of the technical issues concerning the design and training of topographic neural networks are investigated. It is shown that neural network models can be less sensitive to entrapment in the sub-optimal local minima that badly affect the standard Sammon algorithm, and tend to exhibit good generalisation as a result of implicit weight decay in the training process. It is further argued that for ideal structure retention, the network transformation should be perfectly smooth for all inter-data directions in input space. Finally, there is a critique of optimisation techniques for topographic mappings, and a new training algorithm is proposed. A convergence proof is given, and the method is shown to produce lower-error mappings more rapidly than previous algorithms.
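The transformational variant above minimises Sammon's stress; a minimal NumPy sketch of that objective is given below, with the radial basis function network replaced by an arbitrary candidate projection `Y`, purely for illustration:

```python
import numpy as np

def sammon_stress(X, Y, eps=1e-12):
    """Sammon stress between pairwise distances in the input space X
    and the projected space Y (lower means better structure retention)."""
    n = len(X)
    num, denom = 0.0, 0.0
    for i in range(n):
        for j in range(i + 1, n):
            d_in = np.linalg.norm(X[i] - X[j])    # original distance
            d_out = np.linalg.norm(Y[i] - Y[j])   # projected distance
            denom += d_in
            num += (d_in - d_out) ** 2 / (d_in + eps)
    return num / (denom + eps)

X = np.random.default_rng(0).normal(size=(20, 5))
print(sammon_stress(X, X))        # identity projection -> 0.0
print(sammon_stress(X, X[:, :2])) # lossy 2-D projection -> positive stress
```

A trained network would adjust its weights to drive this quantity down over the training set.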
Abstract:
Detailed transport studies in plasmas require the solution of the time evolution of many different initial positions of test particles in the phase space of the systems to be investigated. To reduce this amount of numerical work, one would like to replace the integration of the time-continuous system with a mapping.
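The kind of replacement described above can be illustrated with the Chirikov standard map, a textbook example of a discrete mapping standing in for continuous-time integration of a kicked test particle; the map below is illustrative only, not the mapping developed in the paper:

```python
import math

def standard_map(theta, p, K, steps):
    """Iterate the Chirikov standard map: one iteration replaces one
    kick period of continuous-time integration of the kicked rotor."""
    for _ in range(steps):
        p = (p + K * math.sin(theta)) % (2 * math.pi)  # momentum kick
        theta = (theta + p) % (2 * math.pi)            # free rotation
    return theta, p

theta, p = standard_map(1.0, 0.5, K=0.9, steps=1000)
print(theta, p)  # trajectory confined to [0, 2*pi) x [0, 2*pi)
```

A thousand map iterations cost far less than integrating the underlying differential equations over the same span, which is precisely the saving the authors are after.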
Abstract:
DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT
Abstract:
A recent novel approach to the visualisation and analysis of datasets, and one which is particularly applicable to those of a high dimension, is discussed in the context of real applications. A feed-forward neural network is utilised to effect a topographic, structure-preserving, dimension-reducing transformation of the data, with an additional facility to incorporate different degrees of associated subjective information. The properties of this transformation are illustrated on synthetic and real datasets, including the 1992 UK Research Assessment Exercise for funding in higher education. The method is compared and contrasted to established techniques for feature extraction, and related to topographic mappings, the Sammon projection and the statistical field of multidimensional scaling.
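One hedged reading of the facility for incorporating subjective information is a blend of the observed geometric distance with a user-supplied prior dissimilarity; the convex combination below is an illustrative sketch, not the paper's exact formulation (`alpha` and `prior_dissim` are assumed names):

```python
import numpy as np

def subjective_distance(xi, xj, prior_dissim, alpha=0.5):
    """Blend the Euclidean distance between two points with a subjective
    prior dissimilarity; alpha = 1 recovers the purely geometric metric."""
    geometric = np.linalg.norm(np.asarray(xi, float) - np.asarray(xj, float))
    return alpha * geometric + (1 - alpha) * prior_dissim

# Two points that are geometrically close but subjectively very different:
d = subjective_distance([0.0, 0.0], [0.1, 0.0], prior_dissim=5.0, alpha=0.5)
print(d)  # 0.5 * 0.1 + 0.5 * 5.0
```

Feeding such blended distances into a structure-preserving projection lets the prior knowledge reshape the resulting feature space.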
Abstract:
The paper examines the impact of the OECD’s Development Assistance Committee (DAC) on the emerging foreign aid policies of the Central and Eastern European (CEE) countries. The Czech Republic, Poland, Slovakia and Slovenia joined the DAC in 2013, and the committee has aimed to socialise them into the norms of the international development system. Generally, however, there is little evidence of impact due to the soft nature of the DAC’s policy recommendations, and the fact that the committee, reacting to the challenges to its legitimacy from non-Western donors, has become much less demanding towards potential members than in the past. The paper, however, argues that one must examine the processes of how the norm and policy recommendations of the DAC are mediated domestically. The case of the Czech Republic’s reforms in its foreign aid policy between 2007 and 2010 shows that domestic actors can use the OECD strategically to build support for their own cause and thus achieve seemingly difficult policy reform.
Abstract:
A conventional neural network approach to regression problems approximates the conditional mean of the output vector. For mappings which are multi-valued this approach breaks down, since the average of two solutions is not necessarily a valid solution. In this article mixture density networks, a principled method to model conditional probability density functions, are applied to retrieving Cartesian wind vector components from satellite scatterometer data. A hybrid mixture density network is implemented to incorporate prior knowledge of the predominantly bimodal function branches. An advantage of a fully probabilistic model is that more sophisticated and principled methods can be used to resolve ambiguities.
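The ambiguity-resolution idea can be sketched with a toy two-component Gaussian mixture over one wind component: reporting the most probable mode, rather than the conditional mean, avoids the invalid averaged solution. All parameters below are invented for illustration:

```python
import math

def gauss(x, mu, sigma):
    """Univariate Gaussian density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def most_probable_mode(x_grid, weights, mus, sigmas):
    """Evaluate a 1-D Gaussian mixture on a grid and return the grid point
    of highest density -- the most probable branch, not the mean."""
    density = [sum(w * gauss(x, m, s) for w, m, s in zip(weights, mus, sigmas))
               for x in x_grid]
    return x_grid[density.index(max(density))]

# Bimodal ambiguity: two near-opposite wind solutions, one more likely.
grid = [i * 0.01 for i in range(-400, 401)]
best = most_probable_mode(grid, weights=[0.7, 0.3], mus=[-2.0, 2.0], sigmas=[0.3, 0.3])
print(best)  # the heavier mode near -2.0; the mean (-0.8) lies between the branches
```

A full mixture density network would emit `weights`, `mus` and `sigmas` as functions of the scatterometer input; the selection step above is then applied to the predicted mixture.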
Abstract:
We consider the problem of illusory or artefactual structure from the visualisation of high-dimensional structureless data. In particular we examine the role of the distance metric in the use of topographic mappings based on the statistical field of multidimensional scaling. We show that the use of a squared Euclidean metric (i.e. the SSTRESS measure) gives rise to an annular structure when the input data is drawn from a high-dimensional isotropic distribution, and we provide a theoretical justification for this observation.
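The annular artefact stems from distance concentration: for isotropic high-dimensional data, all pairwise distances become nearly equal, so an embedding driven by a squared-distance (SSTRESS-style) objective pushes points onto a ring. A quick numerical check of the concentration itself, with arbitrary sample sizes:

```python
import numpy as np

rng = np.random.default_rng(1)

def distance_spread(dim, n=200):
    """Relative spread (std/mean) of pairwise distances for an isotropic
    Gaussian sample in `dim` dimensions; small spread = concentration."""
    X = rng.normal(size=(n, dim))
    # squared pairwise distances via the Gram matrix
    sq = np.sum(X**2, axis=1)
    D2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    d = np.sqrt(np.maximum(D2[np.triu_indices(n, k=1)], 0.0))
    return d.std() / d.mean()

print(distance_spread(2), distance_spread(500))
# the relative spread shrinks markedly as the dimension grows
```

With essentially one shared inter-point distance to reproduce, the low-dimensional configuration that fits best is the annulus the paper analyses.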
Abstract:
This paper, based on the reflections of two academic social scientists, offers a starting point for dialogue about the importance of critical pedagogy within the university today, and about the potentially transformative possibilities of higher education more generally. We first explain how the current context of HE, framed through neoliberal restructuring, is reshaping opportunities for alternative forms of education and knowledge production to emerge. We then consider how insights from both critical pedagogy and popular education inform our work in this climate. Against this backdrop, we consider the effects of our efforts to realise the ideals of critical pedagogy in our teaching to date and ask how we might build more productive links between classroom and activist practices. Finally, we suggest that doing so can help facilitate a more fully articulated reconsideration of the meanings, purposes and practices of HE in contemporary society. This paper also includes responses from two educational developers, Janet Strivens and Ranald Macdonald, with the aim of creating a dialogue on the role of critical pedagogy in higher education.
Abstract:
This paper departs from this point to consider whether and how crisis thinking contributes to practices of affirmative critique and transformative social action in late-capitalist societies. I argue that different deployments of crisis thinking have different ‘affect-effects’ and consequences for ethical and political practice. Some work to mobilize political action through articulating a politics of fear, assuming that people take most responsibility for the future when they fear the alternatives. Other forms of crisis thinking work to heighten critical awareness by disrupting existential certainty, asserting an ‘ethics of ambiguity’ which assumes that the continuous production of uncertain futures is a fundamental part of the human condition (de Beauvoir, 2000). In this paper, I hope to illustrate that the first deployment of crisis thinking can easily justify the closing down of political debate, discouraging radical experimentation and critique for the sake of resolving problems in a timely and decisive way. The second approach to crisis thinking, on the other hand, has greater potential to enable intellectual and political alterity in everyday life—but one that poses considerable challenges for our understandings of and responses to climate change...
Abstract:
The scaling problems which afflict attempts to optimise neural networks (NNs) with genetic algorithms (GAs) are disclosed. A novel GA-NN hybrid is introduced, based on the bumptree, a little-used connectionist model. As well as being computationally efficient, the bumptree is shown to be more amenable to genetic coding than other NN models. A hierarchical genetic coding scheme is developed for the bumptree and shown to have low redundancy, as well as being complete and closed with respect to the search space. When applied to optimising bumptree architectures for classification problems the GA discovers bumptrees which significantly out-perform those constructed using a standard algorithm. The fields of artificial life, control and robotics are identified as likely application areas for the evolutionary optimisation of NNs. An artificial life case-study is presented and discussed. Experiments are reported which show that the GA-bumptree is able to learn simulated pole balancing and car parking tasks using only limited environmental feedback. A simple modification of the fitness function allows the GA-bumptree to learn mappings which are multi-modal, such as robot arm inverse kinematics. The dynamics of the 'geographic speciation' selection model used by the GA-bumptree are investigated empirically and the convergence profile is introduced as an analytical tool. The relationships between the rate of genetic convergence and the phenomena of speciation, genetic drift and punctuated equilibrium are discussed. The importance of genetic linkage to GA design is discussed and two new recombination operators are introduced. The first, linkage mapped crossover (LMX), is shown to be a generalisation of existing crossover operators. LMX provides a new framework for incorporating prior knowledge into GAs. Its adaptive form, ALMX, is shown to be able to infer linkage relationships automatically during genetic search.
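The evolutionary loop behind such a GA-NN hybrid can be sketched generically. The bumptree, the hierarchical coding scheme and LMX are specific to this work, so the sketch below instead uses a plain bit-string genome, tournament selection and standard one-point crossover on the toy 'one-max' problem (maximise the number of ones):

```python
import random

def evolve(genome_len=20, pop_size=30, generations=60, p_mut=0.02, seed=3):
    """Minimal generational GA: tournament selection, one-point crossover
    and bit-flip mutation on the one-max toy problem."""
    rng = random.Random(seed)
    fitness = sum  # fitness of a bit-string genome = number of ones
    pop = [[rng.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():
            return max(rng.sample(pop, 3), key=fitness)
        nxt = []
        while len(nxt) < pop_size:
            a, b = tournament(), tournament()
            cut = rng.randrange(1, genome_len)           # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g ^ (rng.random() < p_mut) for g in child]  # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
print(sum(best))  # fitness of the best evolved genome, out of a maximum of 20
```

In the GA-NN setting the bit string would instead encode a network architecture and the fitness would be its classification or control performance, which is where the scaling problems the thesis addresses arise.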
Abstract:
Throughout the nineteenth century, German classical music production was an aesthetic point of reference for British concert audiences. As a consequence, a sizeable number of German musicians were to be found in Britain as performers, conductors, teachers, musicologists and managers. They acted as agents of intercultural transfer, disseminating performance and organisational practices which had a transformative effect on British musical life. This article moves away from a focus on high-profile visiting artists such as Mendelssohn Bartholdy or Wagner and argues that the extent to which transfer took place can be better assessed by concentrating on the cohort of those artists who remained permanently. Some of these are all but forgotten today, but were household names in Victorian Britain. The case studies have been selected for the range of genres they represent and include Joseph Mainzer (choral singing), Carl Rosa (opera), August Manns, Carl Hallé and Julius Seligmann (orchestral music), and Friedrich Niecks (musicology). On a theoretical level, the concept of ‘intercultural transfer’ is applied in order to determine aspects such as diffusion, adaptation or sustainability of artistic elements within the new cultural context. The approach confirms that ‘national’ cultures do not develop indigenously but always through cross-national interaction.
Abstract:
The mappings from grapheme to phoneme are much less consistent in English than they are for most other languages. Therefore, the differences found between English-speaking dyslexics and controls on sensory measures of temporal processing might be related more to the irregularities of English orthography than to a general deficit affecting reading ability in all languages. However, here we show that poor readers of Norwegian, a language with a relatively regular orthography, are less sensitive than controls to dynamic visual and auditory stimuli. Consistent with results from previous studies of English-readers, detection thresholds for visual motion and auditory frequency modulation (FM) were significantly higher in 19 poor readers of Norwegian compared to 22 control readers of the same age. Over two-thirds (68.4%) of the children identified as poor readers were less sensitive than controls to either or both of the visual coherent motion or auditory 2Hz FM stimuli. © 2003 Elsevier Science (USA). All rights reserved.
Abstract:
The complexity and multifaceted nature of sustainable lifelong learning can be effectively addressed by a broad network of providers working co-operatively and collaboratively. Such a network involving third-sector, public and private bodies must realise the full potential of accredited flexible and blended formal learning, contextual opportunities offered by enablers of informal and non-formal learning and the affordances derived from the various loose and open spaces that can make social learning effective. Such a conception informs the new Lifelong Learning Network Consortium on Sustainable Communities, Urban Regeneration and Environmental Technologies established and led by the Lifelong Learning Centre at Aston University. This paper offers a radical, reflective and political evaluation of its first year in development, arguing that networked learning of this type could prefigure a new model for lifelong learning and sustainable education that renders the city itself a creative medium for transformative learning and sustainability.