11 results for telecommunication and computer networks
in Greenwich Academic Literature Archive - UK
Abstract:
There is concern in the Cross-Channel region of Nord-Pas-de-Calais (France) and Kent (Great Britain) regarding the extent of atmospheric pollution detected in the area from emitted gaseous (VOC, NOx, SO2) and particulate substances. In particular, the air quality of the Cross-Channel or "Trans-Manche" region is strongly affected by the heavily industrialised area of Dunkerque, in addition to transportation sources linked to cross-channel traffic in Kent and Calais, posing threats to the environment and human health. In the framework of the cross-border EU Interreg IIIA activity, the joint Anglo-French project ATTMA has been commissioned to study Aerosol Transport in the Trans-Manche Atmosphere. Using ground monitoring data from UK and French networks, and with the assistance of satellite images, the project aims to determine dispersion patterns and identify the sources responsible for the pollutants. The findings of this study will increase awareness and have a bearing on future air quality policy in the region. Public interest is evident from the presence of local authorities on both sides of the English Channel as collaborators. The research is based on pollution transport simulations using (a) Lagrangian Particle Dispersion (LPD) models and (b) an Eulerian receptor-based model. This paper is concerned with part (a), the LPD models. Lagrangian Particle Dispersion models are often used to numerically simulate the dispersion of a passive tracer in the planetary boundary layer by calculating the Lagrangian trajectories of thousands of notional particles. In this contribution, the project investigated two widely used particle dispersion models: the Hybrid Single Particle Lagrangian Integrated Trajectory (HYSPLIT) model and the FLEXPART model. In both models, forward tracking and inverse (or receptor-based) modes are possible.
Certain distinct pollution episodes have been selected from the monitoring database EXPER/PF and from UK monitoring stations, and their likely trajectories predicted using prevailing weather data. Global meteorological datasets were downloaded from the ECMWF MARS archive. Part of the difficulty in identifying pollution sources arises from the fact that much of the pollution originates outside the monitoring area. For example, heightened particulate concentrations are believed to originate from sand storms in the Sahara or volcanic activity in Iceland or the Caribbean; this work identifies such long-range influences. The output of the simulations shows that there are notable differences between the formulations of FLEXPART and HYSPLIT, although both models used the same meteorological data and source input, suggesting that the identification of the primary emissions during air pollution episodes may be rather uncertain.
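The core idea of an LPD model, advecting thousands of notional particles with the mean wind plus a stochastic turbulent perturbation, can be sketched in a few lines. This is a minimal illustration with invented wind and turbulence parameters; it is not the HYSPLIT or FLEXPART formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters, invented for this sketch
n_particles = 5000
n_steps = 200
dt = 60.0                      # time step [s]
u_mean = np.array([5.0, 1.0])  # mean wind vector [m/s]
sigma = 0.8                    # turbulent velocity fluctuation scale [m/s]

# All particles released from a point source at the origin
pos = np.zeros((n_particles, 2))

for _ in range(n_steps):
    # Deterministic advection by the mean wind plus an uncorrelated
    # random turbulent velocity per particle (a simple random walk)
    turb = rng.normal(0.0, sigma, size=pos.shape)
    pos += (u_mean + turb) * dt

centroid = pos.mean(axis=0)  # plume centre follows the mean wind
spread = pos.std(axis=0)     # plume width grows with travel time
```

Running forward from a known source, as here, predicts where emissions go; the inverse (receptor-based) mode mentioned above instead tracks particles backwards in time from a monitoring station to candidate sources.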
Abstract:
Probe-based scanning microscopes, such as the STM and the AFM, are used to obtain topographical and electronic-structure maps of material surfaces, and to modify their morphologies on nanoscopic scales. They have generated new areas of research in condensed matter physics and materials science. We will review some examples from the fields of experimental nano-mechanics, nano-electronics and nano-magnetism, which now form the basis of the emerging field of nanotechnology. A parallel development has been brought about in the field of computational nano-science, using quantum-mechanical techniques and computer-based numerical modelling, such as the Molecular Dynamics (MD) simulation method. We will report on the simulation of nucleation and growth of nano-phase films on supporting substrates. Furthermore, theoretical modelling of the formation of STM images of metallic clusters on metallic substrates will also be discussed within the non-equilibrium Keldysh Green function method, used to study the effects of coherent tunnelling through different atomic orbitals in a tip-sample geometry.
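At the heart of any MD simulation of the kind mentioned above is a time-stepping integrator. A minimal sketch of the widely used velocity-Verlet scheme, with a single particle in a harmonic well standing in for a real interatomic potential (all parameters are invented for illustration):

```python
# Velocity-Verlet integration of one particle in a harmonic potential.
# k, m and dt are illustrative values, not from any real simulation.
k, m, dt = 1.0, 1.0, 0.01

def force(x):
    return -k * x  # F = -dV/dx for V(x) = k * x**2 / 2

x, v = 1.0, 0.0    # initial position and velocity
f = force(x)
e0 = 0.5 * m * v * v + 0.5 * k * x * x  # initial total energy
energy_drift = 0.0

for _ in range(10000):
    v += 0.5 * (f / m) * dt  # half-step velocity update
    x += v * dt              # full-step position update
    f = force(x)             # force at the new position
    v += 0.5 * (f / m) * dt  # second half-step velocity update
    e = 0.5 * m * v * v + 0.5 * k * x * x
    energy_drift = max(energy_drift, abs(e - e0))
```

The scheme is symplectic, so the total energy stays bounded near its initial value over long runs, which is why MD codes favour it over naive Euler stepping.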
Abstract:
When designing a new passenger ship or modifying an existing design, how do we ensure that the proposed design is safe from an evacuation point of view? In the building and aviation industries, computer-based evacuation models are being used to tackle similar issues. In these industries, the traditional restrictive prescriptive approach to design is making way for performance-based design methodologies using risk assessment and computer simulation. In the maritime industry, ship evacuation models offer the promise of quickly and efficiently bringing these considerations into the design phase, while the ship is "on the drawing board". This paper describes the development of evacuation models with applications to passenger ships and further discusses issues concerning data requirements and validation.
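As a toy illustration of the quantity such models estimate far more carefully, a first-order evacuation time is bounded below both by walking time and by exit flow capacity. All figures here are invented assumptions, not maritime regulations or results from the paper:

```python
def evacuation_time(n_people, max_walk_distance_m,
                    walk_speed_ms=1.2, door_flow_pps=1.0):
    """Crude lower bound on evacuation time [s].

    n_people            -- occupants routed through one exit
    max_walk_distance_m -- longest walking distance to that exit [m]
    walk_speed_ms       -- assumed unimpeded walking speed [m/s]
    door_flow_pps       -- assumed exit throughput [persons/s]
    """
    travel = max_walk_distance_m / walk_speed_ms  # last person's walk
    queueing = n_people / door_flow_pps           # time to pass the door
    return max(travel, queueing)
```

Agent-based evacuation models resolve congestion, counter-flow and individual behaviour that this bound ignores, which is precisely why simulation is needed for performance-based design.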
Abstract:
Based upon relevant literature, this study investigated the assessment policy and practices for the BSc (Hons) Computing Science programme at the University of Greenwich (UOG), contextualising these in terms of broad social and educational purposes. It discusses assessment in general, and then proceeds to give a critical evaluation of the assessment policy and practices at the UOG. Although this is a single case study, because many of the features of the programme are generic to other programmes and institutions, it is of wider value and has further implications. The study was concluded in the summer of 2002. It concludes that, overall, the programme's assessment policy and practices are well considered in terms of broad social and educational purposes, although it identifies and outlines several possible improvements, as well as raising some major issues still to be addressed which go beyond assessment practices.
Abstract:
In the flip-chip assembly process, no-flow underfill materials have a particular advantage over traditional underfills: the application and curing of the former can be undertaken before and during the reflow process. This advantage can be exploited to increase flip-chip manufacturing throughput. However, adopting a no-flow underfill process may introduce reliability issues such as underfill entrapment, delamination at the interfaces between the underfill and other materials, and lower solder joint fatigue life. This paper presents an analysis of the assembly and the reliability of flip-chips with no-flow underfill. The methodology adopted in the work is a combination of experimental and computer-modelling methods. Two types of no-flow underfill materials have been used for the flip chips. The samples have been inspected with X-ray and scanning acoustic microscope inspection systems to find voids and other defects. Eleven samples of each type of underfill material have been subjected to a thermal shock test and the number of cycles to failure for these flip chips has been found. In the computer-modelling part of the work, a comprehensive parametric study has provided details on the relationship between the material properties and reliability, and on how underfill entrapment may affect the thermal–mechanical fatigue life of flip chips with no-flow underfill.
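The kind of fatigue-life relationship probed by such a parametric study can be illustrated with a Coffin-Manson-type model, in which the cyclic strain range imposed on the solder joints by thermal mismatch determines the cycles to failure. The constants below are generic textbook-style values for eutectic solder, not results from this study:

```python
def cycles_to_failure(delta_gamma, eps_f=0.325, c=-0.442):
    """Coffin-Manson form: Nf = 0.5 * (delta_gamma / (2 * eps_f)) ** (1 / c).

    delta_gamma -- cyclic shear strain range per thermal cycle
    eps_f, c    -- fatigue ductility coefficient and exponent
                   (illustrative values, not the paper's data)
    """
    return 0.5 * (delta_gamma / (2.0 * eps_f)) ** (1.0 / c)

# An underfill that halves the joint strain range greatly extends life,
# which is why underfill properties and entrapment matter for reliability
low_strain_life = cycles_to_failure(0.02)
high_strain_life = cycles_to_failure(0.04)
```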
Abstract:
The use of games technology in education is not a new phenomenon. Even back in the days of 286 processors, PCs were used in some schools along with (what now looks like) primitive simulation software to teach a range of different skills and techniques – from basic programming using Logo (the turtle-style car with a pen at the back that could be used to draw on the floor – always a good way of attracting the attention of school kids!) up to quite sophisticated replications of physical problems, such as working out the trajectory of a missile to blow up an enemy's tank. So why are games not more widely used in education (especially in FE and HE)? Can they help to support learners even at this advanced stage in their education? We aim to provide in this article an overview of the use of game technologies in education (almost as a small literature review for interested parties) and then go into more depth on one particular example we aim to introduce from the coming academic year (Sept. 2006) to help with teaching and assessment of one area of our Multimedia curriculum. Of course, we will not be able to fully provide the reader with data on how successful this is, but we will be running a blog (http://themoviesineducation.blogspot.com/) to keep interested parties up to date with the progress of the project and to hopefully help others to set up similar solutions themselves. We will also only consider a small element of the implementation here and cover how such assessment processes could be used in a broader context. The use of a game to aid learning and improve achievement is suggested because traditional methods of engagement are currently failing on some levels. By this it is meant that various parts of the production process we normally cover in our Multimedia degree are becoming difficult to monitor and continually assess.
Abstract:
With the emergence of the "Semantic Web" there has been much discussion about the impact of technologies such as XML and RDF on the way we use the Web for developing e-learning applications and, perhaps more importantly, on how we can personalise these applications. Personalisation of e-learning is viewed by many authors (see amongst others Eklund & Brusilovsky, 1998; Kurzel, Slay, & Hagenus, 2003; Martinez, 2000; Sampson, Karagiannidis, & Kinshuk, 2002; Voigt & Swatman, 2003) as the key challenge for learning technologists. According to Kurzel (2004), the tailoring of e-learning applications can have an impact on content and how it is accessed, the media forms used, the method of instruction employed and the learning styles supported. This paper will report on a research project currently underway at the eCentre at the University of Greenwich which is exploring different approaches and methodologies to create an e-learning platform with personalisation built in. This personalisation is proposed to be set at different levels within the system, starting from being guided by the information that the user inputs into the system, down to the lower level of being set using information inferred by the system's processing engine.
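The two levels of personalisation described, explicit user input at the top and system-inferred information below, amount to a fallback rule. A minimal sketch, with function and field names invented for illustration (they are not from the eCentre system):

```python
def select_media_form(explicit_pref, usage_seconds):
    """Pick a media form for a learner.

    explicit_pref -- media form the user chose in their profile, or None
    usage_seconds -- dict mapping media form -> time the learner has spent
                     on it, standing in for the system's inference engine
    """
    if explicit_pref is not None:
        return explicit_pref  # top level: honour the user's own setting
    # lower level: infer a preference from observed behaviour
    return max(usage_seconds, key=usage_seconds.get)

# Explicit choice wins; otherwise the inferred preference is used
choice = select_media_form(None, {"text": 120, "video": 300})
```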
Abstract:
Clear assessment deadlines and severe penalties for late submission of coursework are a feature of a number of UK universities. This presents a significant challenge for any online upload system. Evidence from a range of different implementations at the School of Computing and Mathematical Sciences at the University of Greenwich over the past few years is examined to assess the impact of a zero-tolerance deadline policy on the way students work and the problems that arise. Suggestions are made on how to minimise any possible negative impact of a zero-tolerance deadline policy on the administration of the system and on staff and students.
Abstract:
The use by students of an e-learning system that enhances traditional learning in a large university computing school where there are clear assessment deadlines and severe penalties for late submission of coursework is examined to assess the impact of changes to the deadline model on the way students use the system and on the results they achieve. It is demonstrated that the grade a student achieves is partly dependent on the time before the deadline when the work is completed - in general, students who submit earlier gain higher grades. Possible reasons for this are explored. Analysis of the data from a range of different implementations of deadline policies is presented. Suggestions are made on how to minimise any possible negative impact of the assessment policy on the student's overall learning.
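The reported relationship, earlier submission tending to go with higher grades, is the kind of pattern a simple correlation over submission logs would reveal. The records below are fabricated purely for illustration; only the direction of the effect comes from the study:

```python
# Hypothetical (hours before deadline at submission, grade) pairs,
# invented for illustration only
records = [(72, 78), (48, 70), (36, 65), (24, 62), (12, 55), (2, 48), (1, 40)]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

hours = [h for h, _ in records]
grades = [g for _, g in records]
r = pearson(hours, grades)  # positive r: earlier submission, higher grade
```

Correlation alone cannot separate the candidate explanations the paper explores, such as stronger students simply starting earlier.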
Abstract:
This paper presents a framework to integrate requirements management and design knowledge reuse. The research approach begins with a literature review in design reuse and requirements management to identify appropriate methods within each domain. A framework is proposed based on the identified requirements. The framework is then demonstrated using a case study example: vacuum pump design. Requirements are presented as a component of the integrated design knowledge framework. The proposed framework enables the application of requirements management as a dynamic process, including capture, analysis and recording of requirements. It takes account of the evolving requirements and the dynamic nature of the interaction between requirements and product structure through the various stages of product development.
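Treating requirements management as a dynamic process of capture, analysis and recording, linked to the evolving product structure, suggests a record that keeps both its component links and its own history. A minimal sketch; the class, field names and the vacuum-pump figures are invented for illustration, not taken from the framework itself:

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    """One captured requirement, linked to product structure."""
    rid: str
    text: str
    components: set = field(default_factory=set)  # linked product components
    history: list = field(default_factory=list)   # superseded versions

    def revise(self, new_text):
        """Record the old wording before replacing it, so evolution
        of the requirement through product development is traceable."""
        self.history.append(self.text)
        self.text = new_text

# Capture, link to the product structure, then let the requirement evolve
req = Requirement("R1", "Pump shall reach 1e-3 mbar")
req.components.add("rotor-assembly")
req.revise("Pump shall reach 1e-4 mbar")
```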