24 results for Visualization Using Computer Algebra Tools
Abstract:
The Student Experience of E-Learning project (SEEL) was an institutional response to the university’s HEA/JISC Benchmarking exercise (Ryan and Kandler, 2007). The study took a social constructivist approach which recognised the importance of listening to the student voice (JISC 2007) within the University of Greenwich context in order to interpret the student experience of e-learning. Nearly 1000 students responded to an online survey on their approaches to, and their use of, learning technology. The quantitative and qualitative questions covered study patterns, the use of specific online tools within the context of learning and beyond, and students’ attitudes towards using e-learning in their studies. Initially, individual responses to questions were analysed in depth, giving a general indication of the student experience. Further depth was added through a filtering mechanism, beginning with a cross-slicing of individual student responses to produce cameos. Audio logs and individual interviews were drawn from these cameos. Analysis of the cameos is in progress but has already revealed some unexpected results. There was a mismatch between students’ expectations of the university’s use of technology and their experiences and awareness of its possible use in other contexts. Students recognised the importance of social interaction as a vehicle for learning (Vygotsky 1978, Bruner 2006) but expressed polarised views on the use of social networking sites such as Facebook for e-learning. Their experiences in commercial contexts led them to see the university VLE as unimaginative and the tutors’ use of it as lacking in vision. Whereas analysis of the individual questions provided a limited picture, the cameos gave a truer reflection of the students’ lived experiences and identified a gulf between the university’s provision and the students’ expectations of e-learning and their customary use of technology. However, it is recognised that the very nature of an online survey necessarily excludes students who chose not to engage, whether through lack of skills or through disillusionment; this would constitute a separate area for study.
Abstract:
Computer Aided Parallelisation Tools (CAPTools) is a toolkit designed to automate as much as possible of the process of parallelising scalar FORTRAN 77 codes. The toolkit combines a very powerful dependence analysis with user-supplied knowledge to build an extremely comprehensive and accurate dependence graph. The initial version has been targeted at structured mesh computational mechanics codes (e.g. heat transfer, Computational Fluid Dynamics (CFD)), and the associated simple mesh decomposition paradigm is utilised in the automatic code partition, execution control mask generation and communication call insertion. In this, the first of a series of papers [1–3], the authors discuss the parallelisation of a number of case study codes, showing how the various component tools may be used to develop a highly efficient parallel implementation in a few hours or days. The details of the parallelisation of the TEAMKE1 CFD code are described together with the results of three other numerical codes. The resulting parallel implementations are then tested on workstation clusters using PVM and on an i860-based parallel system, showing efficiencies well over 80%.
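To make the decomposition paradigm concrete, the following is a minimal sketch of a 1-D slab partition of a structured-mesh heat-transfer update, with the owner-computes loop limits acting as the execution control mask and halo exchanges standing in for the inserted communication calls. It uses MPI rather than the PVM harness of the paper, and the mesh size and stencil are assumptions for illustration, not the TEAMKE1 code.

```cpp
// Illustrative sketch only: a 1-D slab decomposition of a 2-D heat-transfer
// mesh, the paradigm CAPTools automates. MPI is used here instead of the
// paper's PVM; grid size and stencil are assumptions for the example.
#include <mpi.h>
#include <vector>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);
    int rank, nprocs;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    const int NX = 256, NY = 256;        // global mesh (assumed size,
    const int rows = NX / nprocs;        // assumed divisible by nprocs)
    // Local slab plus one halo row above and below.
    std::vector<double> u((rows + 2) * NY, 0.0), unew = u;

    for (int iter = 0; iter < 100; ++iter) {
        // Communication calls the toolkit would insert: exchange halo rows.
        int up = (rank > 0) ? rank - 1 : MPI_PROC_NULL;
        int dn = (rank < nprocs - 1) ? rank + 1 : MPI_PROC_NULL;
        MPI_Sendrecv(&u[1 * NY], NY, MPI_DOUBLE, up, 0,
                     &u[(rows + 1) * NY], NY, MPI_DOUBLE, dn, 0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        MPI_Sendrecv(&u[rows * NY], NY, MPI_DOUBLE, dn, 1,
                     &u[0], NY, MPI_DOUBLE, up, 1,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        // Execution-control "mask": each process updates only its own rows.
        for (int i = 1; i <= rows; ++i)
            for (int j = 1; j < NY - 1; ++j)
                unew[i * NY + j] = 0.25 * (u[(i - 1) * NY + j] + u[(i + 1) * NY + j]
                                         + u[i * NY + j - 1] + u[i * NY + j + 1]);
        u.swap(unew);
    }
    MPI_Finalize();
    return 0;
}
```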
Abstract:
This work proceeds from the assumption that a European environmental information and communication system (EEICS) is already established. In the context of primary users (land-use planners, conservationists, and environmental researchers) we ask what use may be made of the EEICS for building models and tools that are of use in building decision support systems for the land-use planner. The complex task facing the next generation of environmental and forest modellers is described, and a range of relevant modelling approaches is reviewed. These include visualization and GIS; statistical tabulation and database SQL; and MDA and OLAP methods. The major problem of the noncomparability of definitions and measures of forest area and timber volume is introduced, and the possibility of a model-based solution is considered. The possibility of using an ambitious and challenging biogeochemical modelling approach to understanding and managing European forests sustainably is discussed. It is emphasised that all modern methodological disciplines must be brought to bear, and that a heuristic hybrid modelling approach should be used to ensure that the benefits of practical empirical modelling approaches are utilised in addition to scientifically well-founded, holistic ecosystem and environmental modelling. The data and information system required is likely to end up as a grid-based framework because of the heavy use of computationally intensive model-based facilities.
Abstract:
User-supplied knowledge and interaction is a vital component of a toolkit for producing high quality parallel implementations of scalar FORTRAN numerical code. In this paper we consider the components that such a parallelisation toolkit should possess to provide an effective environment in which to identify, extract and embed relevant user knowledge. We also examine to what extent these facilities are available in leading parallelisation tools; in particular we discuss how these issues have been addressed in the development of the user interface of the Computer Aided Parallelisation Tools (CAPTools). The CAPTools environment has been designed to enable user exploration, interaction and insertion of user knowledge to facilitate the automatic generation of very efficient parallel code. A key issue in the user's interaction is control of the volume of information, so that the user is focused only on what is needed. User control over the level and extent of information revealed at any phase is provided through a wide variety of filters. Another issue is the way in which information is communicated. Dependence analysis and its resulting graphs involve many sophisticated, rather abstract concepts unlikely to be familiar to most users of parallelising tools. As such, considerable effort has been made to communicate with users in terms that they will understand. These features, amongst others, and their use in the parallelisation process are described and their effectiveness discussed.
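As a hypothetical illustration of information filtering (not the CAPTools interface itself), the sketch below stores dependence edges with a few user-meaningful attributes and reports only those passing a user-chosen filter, phrased in plain language rather than graph-theoretic terms:

```cpp
// Hypothetical sketch (not the CAPTools API): a dependence edge list with a
// user-supplied filter, illustrating how a tool can limit the information
// shown to only the dependences the user currently cares about.
#include <functional>
#include <iostream>
#include <string>
#include <vector>

enum class DepKind { Flow, Anti, Output, Control };

struct DepEdge {
    std::string source;      // e.g. "line 340: A(I)"
    std::string sink;        // e.g. "line 355: A(I-1)"
    DepKind     kind;
    bool        loopCarried; // carried by the loop being parallelised?
};

// Show only the edges that pass the user's filter, in plain language.
void report(const std::vector<DepEdge>& graph,
            const std::function<bool(const DepEdge&)>& filter) {
    for (const auto& e : graph)
        if (filter(e))
            std::cout << e.source << " -> " << e.sink
                      << (e.loopCarried ? "  (inhibits parallelisation)\n"
                                        : "  (loop independent)\n");
}

int main() {
    std::vector<DepEdge> graph = {
        {"line 340: A(I)", "line 355: A(I-1)", DepKind::Flow, true},
        {"line 341: B(I)", "line 342: B(I)",   DepKind::Flow, false},
    };
    // Filter: only loop-carried flow dependences, the ones a user must
    // examine or disprove before the loop can run in parallel.
    report(graph, [](const DepEdge& e) {
        return e.loopCarried && e.kind == DepKind::Flow;
    });
    return 0;
}
```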
Abstract:
Computer-based mathematical models describing aircraft fire have a role to play in the design and development of safer aircraft, in the implementation of safer and more rigorous certification criteria, and in post mortem accident investigation. As the costs involved in performing large-scale fire experiments for the next generation of 'Ultra High Capacity Aircraft' (UHCA) are expected to be prohibitively high, the development and use of these modelling tools may become essential if these aircraft are to prove a safe and viable reality. By describing the present capabilities and limitations of aircraft fire models, this paper examines the future development of these models in the areas of large-scale application through parallel computing, combustion modelling and extinguishment modelling.
Abstract:
The shared-memory programming model can be an effective way to achieve parallelism on shared memory parallel computers. Historically, however, the lack of a standard for directive-based programming and limited scalability have affected its take-up. Recent advances in hardware and software technologies have improved both the performance of parallel programs using compiler directives and, with the introduction of OpenMP, their portability. In this study, the Computer Aided Parallelisation Toolkit has been extended to automatically generate OpenMP-based parallel programs with nominal user assistance. We categorize the different loop types and show how efficient directives can be placed using the toolkit's in-depth interprocedural analysis. Examples are taken from the NAS parallel benchmarks and a number of real-world application codes. These demonstrate the great potential of using the toolkit to quickly parallelise serial programs, as well as the good performance achievable on up to 300 processors for hybrid message-passing/directive parallelisations.
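As an illustration of the loop categorisation involved, the sketch below shows two of the simplest categories and the OpenMP directives a generator would emit for them; the loop bodies are invented examples rather than NAS benchmark kernels:

```cpp
// Sketch of two loop categories such a generator must distinguish; the
// loop bodies are assumed examples, not taken from the NAS benchmarks.
#include <vector>

int main() {
    const int n = 1 << 20;
    std::vector<double> a(n, 1.0), b(n, 2.0);

    // Category 1: no loop-carried dependence -> plain parallel worksharing.
    #pragma omp parallel for
    for (int i = 0; i < n; ++i)
        a[i] = 2.0 * b[i];

    // Category 2: scalar accumulated across iterations -> reduction clause,
    // which the analysis must recognise to avoid serialising the loop.
    double sum = 0.0;
    #pragma omp parallel for reduction(+ : sum)
    for (int i = 0; i < n; ++i)
        sum += a[i];

    return sum > 0.0 ? 0 : 1;
}
```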
Abstract:
The parallelization of real-world compute-intensive Fortran application codes is generally not a trivial task. If the time to complete the parallelization is to be significantly reduced, then an environment is needed that will assist the programmer in the various tasks of code parallelization. In this paper the authors present a code parallelization environment in which a number of tools that address the main tasks, such as code parallelization, debugging and optimization, are available. The ParaWise and CAPO parallelization tools are discussed, which enable the near-automatic parallelization of real-world scientific application codes for shared- and distributed-memory parallel systems. As user involvement in the parallelization process can introduce errors, a relative debugging tool (P2d2) is also available and can be used to perform nearly automatic relative debugging of a program that has been parallelized using the tools. A high quality interprocedural dependence analysis as well as user-tool interaction are also highlighted; both are vital to the generation of efficient parallel code and to the optimization of the backtracking and speculation process used in relative debugging. Results for benchmark and real-world application codes that have been parallelized are presented and show the benefits of using the environment.
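The core operation of relative debugging is comparing values captured at corresponding points of the serial and parallel runs and locating the first divergence. The sketch below shows only that comparison step, with invented names and tolerance; P2d2's backtracking and speculation machinery is far more elaborate:

```cpp
// Minimal sketch of the comparison at the heart of relative debugging:
// values captured at the same source point in the serial and the parallel
// run are compared within a tolerance, and the first divergence reported.
// Names and tolerance are assumptions for illustration, not P2d2 code.
#include <cmath>
#include <cstdio>
#include <vector>

bool firstDivergence(const std::vector<double>& serial,
                     const std::vector<double>& parallel,
                     double tol, std::size_t& index) {
    for (std::size_t i = 0; i < serial.size() && i < parallel.size(); ++i) {
        if (std::fabs(serial[i] - parallel[i]) > tol) {
            index = i;
            return true;
        }
    }
    return false;
}

int main() {
    std::vector<double> serial   = {1.0, 2.0, 3.000000};
    std::vector<double> parallel = {1.0, 2.0, 3.000042};  // e.g. a race
    std::size_t i;
    if (firstDivergence(serial, parallel, 1e-6, i))
        std::printf("first divergence at element %zu\n", i);
    return 0;
}
```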
Abstract:
A Concise Intro to Image Processing using C++ presents state-of-the-art image processing methodology, including current industrial practices for image compression, image de-noising methods based on partial differential equations, and newer compression methods such as fractal and wavelet compression. It covers elementary concepts of image processing and related fundamental tools, with coding examples and exercises. With a particular emphasis on illustrating fractal and wavelet compression algorithms, the text also covers image segmentation, object recognition, and morphology. An accompanying CD-ROM contains code for all algorithms.
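As a generic taste of the wavelet compression the book covers (not code from its CD-ROM), the sketch below performs one level of the 1-D Haar transform on an image row and hard-thresholds the small detail coefficients, which is the basic mechanism by which smooth regions compress well:

```cpp
// A single level of the 1-D Haar wavelet transform with hard thresholding,
// the basic building block of wavelet compression. Generic sketch only.
#include <cmath>
#include <cstdio>
#include <vector>

// Split a signal of even length into averages and details (orthonormal Haar).
void haarStep(const std::vector<double>& x,
              std::vector<double>& avg, std::vector<double>& det) {
    const double s = std::sqrt(2.0);
    avg.clear(); det.clear();
    for (std::size_t i = 0; i + 1 < x.size(); i += 2) {
        avg.push_back((x[i] + x[i + 1]) / s);
        det.push_back((x[i] - x[i + 1]) / s);
    }
}

int main() {
    std::vector<double> row = {8, 8, 7, 9, 2, 2, 2, 2};  // one image row
    std::vector<double> avg, det;
    haarStep(row, avg, det);
    // "Compression": zero out small details; smooth regions cost nothing.
    for (double& d : det)
        if (std::fabs(d) < 1.0) d = 0.0;
    for (std::size_t i = 0; i < avg.size(); ++i)
        std::printf("avg %.3f  det %.3f\n", avg[i], det[i]);
    return 0;
}
```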
Abstract:
An innovative methodology has been used for the formulation development of Cyclosporine A (CyA) nanoparticles. In the present study, the static mixer technique, a novel method for producing nanoparticles, was employed. The optimum formulation was calculated by the modified Shepard's method (MSM), an advanced data analysis technique not previously adopted in pharmaceutical applications. Controlled precipitation was achieved by injecting the organic CyA solution rapidly into an aqueous protective solution by means of a static mixer. Furthermore, the computer-based MSM was implemented for data analysis, visualization, and application development. For the optimization studies, the gelatin/lipoid S75 amounts and the organic/aqueous phase ratio were selected as independent variables, while the obtained particle size was the dependent variable. The optimum predicted formulation was characterized by cryo-TEM microscopy, particle size measurements, stability, and in vitro release. The produced nanoparticles contain the drug in an amorphous state with decreased amounts of stabilizing agents. The dissolution rate of the lyophilized powder was significantly enhanced in the first 2 h. MSM proved capable of interpreting the data in detail and of predicting the optimum formulation with high accuracy. The static mixer technique proved capable of producing CyA nanoparticulate formulations.
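For readers unfamiliar with the interpolation family involved, the sketch below implements basic Shepard inverse-distance weighting over two design variables; the modified Shepard's method used in the study additionally fits local nodal functions and limits each data point's radius of influence. The design points and responses are invented for illustration:

```cpp
// Sketch of basic Shepard inverse-distance-weighted interpolation over two
// design variables (stabiliser amount, phase ratio) predicting particle
// size. The *modified* Shepard's method refines this with local nodal
// functions and influence radii; the data values below are invented.
#include <cmath>
#include <cstdio>
#include <vector>

struct Sample { double x, y, f; };  // design point and measured response

double shepard(const std::vector<Sample>& data, double x, double y,
               double power = 2.0) {
    double num = 0.0, den = 0.0;
    for (const auto& s : data) {
        double d2 = (s.x - x) * (s.x - x) + (s.y - y) * (s.y - y);
        if (d2 == 0.0) return s.f;  // exact hit on a data point
        double w = 1.0 / std::pow(d2, power / 2.0);  // w = 1/d^power
        num += w * s.f;
        den += w;
    }
    return num / den;
}

int main() {
    // Invented design points: (stabiliser amount, organic/aqueous ratio)
    // -> particle size in nm.
    std::vector<Sample> runs = {
        {0.2, 0.10, 310}, {0.4, 0.10, 240}, {0.2, 0.25, 280}, {0.4, 0.25, 190},
    };
    std::printf("predicted size: %.1f nm\n", shepard(runs, 0.3, 0.18));
    return 0;
}
```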