47 results for Programming pedagogy
in Aston University Research Archive
Abstract:
This paper, based on the reflections of two academic social scientists, offers a starting point for dialogue about the importance of critical pedagogy within the university today, and about the potentially transformative possibilities of higher education more generally. We first explain how the current context of HE, framed through neoliberal restructuring, is reshaping opportunities for alternative forms of education and knowledge production to emerge. We then consider how insights from both critical pedagogy and popular education inform our work in this climate. Against this backdrop, we consider the effects of our efforts to realise the ideals of critical pedagogy in our teaching to date and ask how we might build more productive links between classroom and activist practices. Finally, we suggest that doing so can help facilitate a more fully articulated reconsideration of the meanings, purposes and practices of HE in contemporary society. This paper also includes responses from two educational developers, Janet Strivens and Ranald Macdonald, with the aim of creating a dialogue on the role of critical pedagogy in higher education.
Abstract:
Contemporary Higher Education Institutions must adapt to address government-funded calls for expansion and widened participation. The adoption of e-learning strategies, such as the use of podcasts, can facilitate flexible learning around the needs and expectations of students. In this article we outline a number of e-learning developments at Aston University collectively referred to as the Virtual Pedagogy Initiative. Each of the strands (podcasts, vodcasts, mobile telephony and the campus-wide remote broadcasts) is described pedagogically as well as technically. Where possible, data highlighting the student response and experience are included. The article begins with the contention that contemporary undergraduates may be qualitatively different and can be considered 'digital natives'.
Abstract:
Logistics distribution network design is one of the major decision problems arising in contemporary supply chain management. The decision involves many quantitative and qualitative factors that may be conflicting in nature. This paper applies an integrated multiple criteria decision making approach to design an optimal distribution network. In the approach, the analytic hierarchy process (AHP) is used first to determine the relative importance weightings or priorities of alternative warehouses with respect to both deliverer oriented and customer oriented criteria. Then, the goal programming (GP) model incorporating the constraints of system, resource, and AHP priority is formulated to select the best set of warehouses without exceeding the limited available resources. In this paper, two commercial packages are used: Expert Choice for determining the AHP priorities of the warehouses, and LINDO for solving the GP model. © 2007 IEEE.
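The two-stage approach described above can be outlined in a minimal sketch: AHP priorities derived from a pairwise comparison matrix using the geometric-mean approximation (rather than Expert Choice), followed by a simple greedy selection standing in for the LINDO goal-programming solve. The comparison matrix, costs and budget below are invented for illustration, not data from the paper.

```python
from math import prod

def ahp_priorities(pairwise):
    """Approximate AHP priority weights from an n x n pairwise
    comparison matrix via the geometric mean of each row, which
    approximates the principal eigenvector."""
    n = len(pairwise)
    geo = [prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(geo)
    return [g / total for g in geo]

def select_warehouses(weights, costs, budget):
    """Stand-in for the goal-programming step: greedily pick the
    highest-priority warehouses that fit within the resource budget."""
    order = sorted(range(len(weights)), key=lambda i: -weights[i])
    chosen, spent = [], 0.0
    for i in order:
        if spent + costs[i] <= budget:
            chosen.append(i)
            spent += costs[i]
    return chosen

# Illustrative comparison of three hypothetical warehouses:
# warehouse 0 is strongly preferred to 1, and 1 to 2.
M = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 3.0],
     [1/5, 1/3, 1.0]]
w = ahp_priorities(M)
```

In the paper the selection is solved exactly as a goal programme with system, resource and AHP-priority constraints; the greedy loop here only conveys the shape of the decision.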
Abstract:
Few today doubt that English Higher Education (HE), like the wider world in which it is located, is in crisis. This is, in part, an economic crisis, as the government response to the current recession seems to be that of introducing the kind of neoliberal ‘shock doctrine’ (Klein 2007) or ‘shock therapy’ (Harvey 2005) that previously resulted in swingeing cuts in public services in Southern nations. Our aim in producing this volume is that these contributions help develop a collective response to the seeming limits of these conditions. We view the strength of these contributions in part as providing palpable evidence of how we and our colleagues are acting with critical hope under current conditions so that we might encourage others to work with us to build, together, more progressive formal and informal education systems that address and seek to redress multiple injustices of the world today.
Abstract:
Finding relevant results for a query submitted to multiple search engines is an important task. This paper formulates the aggregation and ranking of results from multiple search engines as a minimax linear programming model. Beyond this novel application, the study detects the most relevant information among a set of ranked lists of documents retrieved by distinct search engines. Furthermore, two numerical examples are used to illustrate the usefulness of the proposed approach.
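The minimax flavour of such an aggregation can be illustrated without an LP solver: in one dimension, the score that minimises the maximum deviation from each engine's rank of a document is simply the midrange of those ranks. The sketch below uses that simplification; the data format and function names are assumptions for illustration, not the paper's model.

```python
def minimax_score(ranks):
    # The value x minimising max_i |x - r_i| over a set of ranks r_i
    # is their midrange (the one-dimensional Chebyshev centre).
    return (min(ranks) + max(ranks)) / 2.0

def aggregate(rankings):
    """Aggregate ranked lists from several engines.
    rankings: dict mapping engine name -> list of doc ids, best first."""
    docs = {d for lst in rankings.values() for d in lst}
    scores = {}
    for d in docs:
        # 1-based rank of d in each list that contains it.
        ranks = [lst.index(d) + 1 for lst in rankings.values() if d in lst]
        scores[d] = minimax_score(ranks)
    # Lower aggregated score = more relevant; doc id breaks ties.
    return sorted(docs, key=lambda d: (scores[d], d))
```

A document ranked highly by one engine and poorly by another is penalised by its worst placement, which is the intuition the minimax objective captures.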
Abstract:
A graphical process control language has been developed as a means of defining process control software. The user configures a block diagram describing the required control system, from a menu of functional blocks, using a graphics software system with graphics terminal. Additions may be made to the menu of functional blocks, to extend the system capability, and a group of blocks may be defined as a composite block. This latter feature provides for segmentation of the overall system diagram and the repeated use of the same group of blocks within the system. The completed diagram is analyzed by a graphics compiler which generates the programs and data structure to realise the run-time software. The run-time software has been designed as a data-driven system which allows for modifications at the run-time level in both parameters and system configuration. Data structures have been specified to ensure efficient execution and minimal storage requirements in the final control software. Machine independence has been accommodated as far as possible using CORAL 66 as the high level language throughout the entire system; the final run-time code being generated by a CORAL 66 compiler appropriate to the target processor.
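The data-driven run-time described above can be caricatured in a few lines. This is an assumed reconstruction in Python, not the CORAL 66 system: on each scan cycle every configured block reads its input signals and writes its output signal, so both parameters and the block configuration can be altered between cycles without regenerating code.

```python
class Block:
    """A functional block: named output computed from named input signals."""
    def __init__(self, name, fn, inputs):
        self.name, self.fn, self.inputs = name, fn, inputs

class Runtime:
    """Minimal data-driven scan executor for a block diagram."""
    def __init__(self):
        self.signals = {}   # signal name -> current value
        self.blocks = []    # executed in configured order each cycle

    def add(self, block):
        self.blocks.append(block)

    def scan(self):
        # One scan cycle: each block reads its inputs from the signal
        # table and writes its output back, driving downstream blocks.
        for b in self.blocks:
            args = [self.signals.get(s, 0.0) for s in b.inputs]
            self.signals[b.name] = b.fn(*args)
```

A composite block, in this picture, would just be a reusable sub-list of blocks spliced into the configuration; because behaviour lives in the data tables, reconfiguring at run time means editing `blocks` and `signals` between scans.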
The effective use of implicit parallelism through the use of an object-oriented programming language
Abstract:
This thesis explores translating well-written sequential programs in a subset of the Eiffel programming language - without syntactic or semantic extensions - into parallelised programs for execution on a distributed architecture. The main focus is on constructing two object-oriented models: a theoretical self-contained model of concurrency which enables a simplified second model for implementing the compiling process. There is a further presentation of principles that, if followed, maximise the potential levels of parallelism. Model of Concurrency. The concurrency model is designed to be a straightforward target onto which sequential programs can be mapped, thus making them parallel. It aids the compilation process by providing a high level of abstraction, including a useful model of parallel behaviour which enables easy incorporation of message interchange, locking, and synchronization of objects. Further, the model is sufficiently complete that a compiler can be, and has been, practically built. Model of Compilation. The compilation model's structure is based upon an object-oriented view of grammar descriptions and capitalises on both a recursive-descent style of processing and abstract syntax trees to perform the parsing. A composite-object view with an attribute-grammar style of processing is used to extract sufficient semantic information for the parallelisation (i.e. code-generation) phase. Programming Principles. The set of principles presented is based upon information hiding, sharing and containment of objects, and the division of methods on a command/query basis. When these principles are followed, the level of potential parallelism within the presented concurrency model is maximised. Further, these principles arise naturally from good programming practice. Summary. In summary, this thesis shows that it is possible to compile well-written programs, written in a subset of Eiffel, into parallel programs without any syntactic additions or semantic alterations to Eiffel: no parallel primitives are added, and the parallel program is modelled to execute with semantics equivalent to the sequential version. If the programming principles are followed, a parallelised program achieves the maximum level of potential parallelisation within the concurrency model.
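The command/query division the thesis builds on can be shown with a toy class. This is an illustration of the general principle (associated with Eiffel's design tradition), not code from the thesis: queries return a value without changing state, so concurrent queries on the same object are safe, while commands mutate state and must be serialised.

```python
class Account:
    """Toy class separating commands from queries."""
    def __init__(self, balance):
        self._balance = balance

    def balance(self):
        # Query: side-effect free, so any number of callers may
        # execute it in parallel without synchronisation.
        return self._balance

    def deposit(self, amount):
        # Command: mutates state and returns nothing; a concurrency
        # model must lock or serialise calls like this.
        self._balance += amount
```

Because a compiler can classify each method as command or query syntactically (does it assign to object state?), this division gives it exactly the information needed to decide which calls may overlap, which is why following the principle maximises extractable parallelism.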
Abstract:
Software development methodologies are becoming increasingly abstract, progressing from low level assembly and implementation languages such as C and Ada, to component based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model driven approaches emphasise the role of higher level models and notations, and embody a process of automatically deriving lower level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data driven application systems. The combination of empowered data formats and high level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component based software development and game development technologies in order to define novel component technologies for the description of data driven component based applications. The thesis makes explicit contributions to the fields of component based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. 
The thesis also proposes a number of developments in dynamic data driven application systems in order to further empower the role of data in this field.
Abstract:
This thesis addresses the problem of offline identification of salient patterns in genetic programming individuals. It discusses the main issues related to automatic pattern identification systems, namely that these (a) should help in understanding the final solutions of the evolutionary run, (b) should give insight into the course of evolution and (c) should be helpful in optimizing future runs. Moreover, it proposes an algorithm, Extended Pattern Growing Algorithm ([E]PGA) to extract, filter and sort the identified patterns so that these fulfill as many as possible of the following criteria: (a) they are representative for the evolutionary run and/or search space, (b) they are human-friendly and (c) their numbers are within reasonable limits. The results are demonstrated on six problems from different domains.