935 results for Computer adventure games.
Abstract:
Tony Mann reviews the book Theory of Games and Economic Behavior by John von Neumann and Oskar Morgenstern (Princeton University Press, 1944).
Abstract:
Computer equipment, once viewed as leading edge, is quickly condemned as obsolete and banished to basement storerooms or rubbish bins. The magpie instincts of some of the academics and technicians at the University of Greenwich, London, preserved some such relics in cluttered offices and garages, to the dismay of colleagues and partners. When the University moved into its new campus in the historic buildings of the Old Royal Naval College in the centre of Greenwich, corridor space in King William Court provided an opportunity to display some of this equipment so that students could see these objects and gain a more vivid appreciation of their subject's history.
Abstract:
The shared-memory programming model can be an effective way to achieve parallelism on shared-memory parallel computers. Historically, however, the lack of a standard for directive-based programming and limited scalability have affected its take-up. Recent advances in hardware and software technologies have improved both the performance of directive-based parallel programs and, with the introduction of OpenMP, their portability. In this study, the Computer Aided Parallelisation Toolkit has been extended to automatically generate OpenMP-based parallel programs with nominal user assistance. We categorise the different loop types and show how efficient directives can be placed using the toolkit's in-depth interprocedural analysis. Examples are taken from the NAS parallel benchmarks and a number of real-world application codes. These demonstrate the great potential of using the toolkit to quickly parallelise serial programs, as well as the good performance achievable on up to 300 processors for hybrid message-passing/directive parallelisations.
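As a hedged illustration of the kind of directives such a toolkit inserts (a minimal hand-written C sketch; the loops and names are invented for the example and are not actual toolkit output):

    #include <stdio.h>
    #include <omp.h>

    #define N 1000000

    static double a[N], b[N];

    int main(void) {
        double sum = 0.0;

        /* Independent iterations: safe to run in parallel directly. */
        #pragma omp parallel for
        for (int i = 0; i < N; i++) {
            a[i] = 0.5 * i;
            b[i] = 2.0 * i;
        }

        /* Reduction loop: each thread accumulates a private partial
           sum, combined when the loop ends. */
        #pragma omp parallel for reduction(+:sum)
        for (int i = 0; i < N; i++)
            sum += a[i] * b[i];

        printf("dot product = %g using up to %d threads\n",
               sum, omp_get_max_threads());
        return 0;
    }

Compiled with, for example, gcc -fopenmp. The interprocedural analysis the abstract refers to is what establishes, in less obvious cases, that a loop carries no cross-iteration dependences before a directive like these is placed.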
Abstract:
Flip-chip assembly, developed in the early 1960s, is now being positioned as a key joining technology to achieve high-density mounting of electronic components onto printed circuit boards for high-volume, low-cost products. Computer models are now being used early in the product design stage to ensure that optimal process conditions are used. These models capture the governing physics taking place during the assembly process and can also predict relevant defects that may occur. This paper describes the application of computational modelling techniques that can predict a range of interacting physical phenomena associated with the manufacturing process. For example, in the flip-chip assembly process we have solder paste deposition, solder joint shape formation, heat transfer, solidification and thermal stress. It illustrates this modelling technology being used as part of a larger UK study aiming to establish a process route for high-volume, low-cost, sub-100-micron-pitch flip-chip assembly.
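For orientation, one of the governing equations such multi-physics models solve is the transient heat conduction equation for the reflow stage (a standard form, assumed here rather than quoted from the paper):

    \[ \rho c_p \frac{\partial T}{\partial t} = \nabla \cdot \left( k \nabla T \right) + Q \]

where \(T\) is temperature, \(\rho\) the density, \(c_p\) the specific heat, \(k\) the thermal conductivity and \(Q\) a volumetric heat source; solidification and thermal stress predictions are then driven by the computed temperature history.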
Abstract:
Recently, research has been carried out to test a novel bumping method which omits the under bump metallurgy forming process by bonding copper columns directly onto the Al pads of the silicon dies. This bumping method could be adopted to simplify the flip-chip manufacturing process, increase productivity and achieve a higher I/O count. This paper describes an investigation of the solder joint reliability of flip-chips based on this new bumping process. Computer modelling methods are used to predict the shape of solder joints and the response of flip-chips to thermal cyclic loading. The accumulated plastic strain energy at the corner solder joints is used as the damage indicator. Models with a range of design parameters have been compared for their reliability. The parameters that have been investigated are the copper column height, radius and solder volume. The ranking of the relative importance of these parameters is given. For most of the results presented in the paper, the solder material has been assumed to be the lead-free 96.5Sn3.5Ag alloy, but some results for 60Sn40Pb solder joints have also been presented.
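For illustration only, damage indicators of this kind are commonly converted to a fatigue life estimate through an energy-based power law (a generic correlation of the Morrow/Darveaux type; the form and constants below are assumptions, not the paper's model):

    \[ N_f = C \, (\Delta W_{\mathrm{acc}})^{-m} \]

where \( \Delta W_{\mathrm{acc}} \) is the plastic strain energy accumulated per thermal cycle at the corner joint and \( C \), \( m \) are solder-dependent constants fitted to test data. Under such a correlation, the design variant with the lowest accumulated energy per cycle ranks as the most reliable, which is how copper column height, radius and solder volume can be compared.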
Abstract:
This paper concerns a preliminary numerical simulation study of the evacuation of the World Trade Center North Tower on 11 September 2001 using the buildingEXODUS evacuation simulation software. The analysis makes use of response time data derived from a study of survivor accounts appearing in the public domain. While exact geometric details of the building were not available for this study, the building geometry was approximated from descriptions available in the public domain. The study attempts to reproduce the events of 11 September 2001 and pursue several ‘what if’ questions concerning the evacuation. In particular, the study explores the likely outcome had a single staircase survived intact from top to bottom.
Abstract:
The anticipated rewards of adaptive approaches will only be fully realised when autonomic algorithms can take configuration and deployment decisions that match and exceed those of human engineers. Such decisions are typically characterised as being based on a foundation of experience and knowledge. In humans, these underpinnings are themselves founded on the ashes of failure, the exuberance of courage and (sometimes) the outrageousness of fortune. In this paper we describe an application framework that will allow the incorporation of similarly risky, error-prone and downright dangerous software artefacts into live systems – without undermining the certainty of correctness at the application level. We achieve this by introducing the notion of application dreaming.
Abstract:
In this paper a methodology for the application of computer simulation to the evacuation certification of aircraft is suggested. The methodology involves the use of computer simulation, historic certification data, component testing and full-scale certification trials. It sets out a protocol for how computer simulation should be undertaken in a certification environment and draws on experience from both the marine and building industries. Along with the suggested protocol, a phased introduction of computer models to certification is proposed. Given the generally sceptical attitude of the aviation community towards any change to certification methodology, the first step would involve the use of computer simulation in conjunction with full-scale testing. The computer model would be used to produce a probability distribution of likely aircraft performance under current certification conditions and, in addition, several other more challenging scenarios could be developed. The combination of full-scale trial, computer simulation (and if necessary component testing) would provide better insight into the actual performance capabilities of the aircraft by generating a performance probability distribution or performance envelope rather than a single datum. Once further confidence in the technique is established, the second step would involve only computer simulation and component testing. This would only be contemplated after sufficient experience and confidence in the use of computer models have been developed. The third step in the adoption of computer simulation for certification would involve the introduction of several scenarios based on, for example, exit availability, informed by accident analysis. The final step would be the introduction of more realistic accident scenarios into the certification process. This would require the continued development of aircraft evacuation modelling technology to include additional behavioural features common in real accident scenarios.
Abstract:
This paper reports on research work undertaken for the European Commission-funded study GMA2/2000/32039 Very Large Transport Aircraft (VLTA) Emergency Requirements Research Evacuation Study (VERRES). A particular focus was on evacuation issues, with a detailed study of evacuation performance using computer models undertaken as part of Work Package 2. This paper describes that work and investigates, using computer simulation, the use of internal stairs during evacuation.
Proposed methodology for the use of computer simulation to enhance aircraft evacuation certification
Abstract:
In this paper a methodology for the application of computer simulation to evacuation certification of aircraft is suggested. This involves the use of computer simulation, historic certification data, component testing, and full-scale certification trials. The methodology sets out a framework for how computer simulation should be undertaken in a certification environment and draws on experience from both the marine and building industries. In addition, a phased introduction of computer models to certification is suggested. This involves as a first step the use of computer simulation in conjunction with full-scale testing. The combination of full-scale trial, computer simulation (and if necessary component testing) provides better insight into aircraft evacuation performance capabilities by generating a performance probability distribution rather than a single datum. Once further confidence in the technique is established, the requirement for the full-scale demonstration could be dropped. The second step in the adoption of computer simulation for certification involves the introduction of several scenarios based on, for example, exit availability, informed by accident analysis. The final step would be the introduction of more realistic accident scenarios. This would require the continued development of aircraft evacuation modeling technology to include additional behavioral features common in real accident scenarios.
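A hedged sketch of the performance-envelope idea (illustrative C code; the exit-availability model, the timing model and the 90 s threshold used as a pass criterion are stand-ins invented for the example, not certification values):

    #include <stdio.h>
    #include <stdlib.h>

    #define TRIALS 10000

    /* Crude per-trial evacuation-time model: a base time that grows as
       fewer exits are usable, plus uniform noise. All numbers invented. */
    static double trial_time(void) {
        int exits = 4 + rand() % 5;                 /* 4..8 usable exits */
        double base = 60.0 * 8.0 / exits;           /* fewer exits -> slower */
        double noise = ((double)rand() / RAND_MAX) * 20.0 - 10.0;
        return base + noise;
    }

    static int cmp(const void *x, const void *y) {
        double a = *(const double *)x, b = *(const double *)y;
        return (a > b) - (a < b);
    }

    int main(void) {
        static double t[TRIALS];
        int exceed = 0;
        srand(42);
        for (int i = 0; i < TRIALS; i++) {
            t[i] = trial_time();
            if (t[i] > 90.0) exceed++;
        }
        qsort(t, TRIALS, sizeof t[0], cmp);
        printf("median %.1f s, 95th percentile %.1f s, P(>90 s) = %.3f\n",
               t[TRIALS / 2], t[(int)(0.95 * TRIALS)],
               (double)exceed / TRIALS);
        return 0;
    }

The output is the point: a distribution and an exceedance probability over many sampled scenarios, rather than the single pass/fail datum produced by one live trial.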
Abstract:
The Guardian newspaper (21st October 2005) informed its readers that: "Stanford University in California is to make its course content available on iTunes...The service, Stanford on iTunes, will provide…downloads of faculty lectures, campus events, performances, book readings, music recorded by Stanford students and even podcasts of Stanford football games". The emergence of Podcasting as a means of sending audio data to users has clearly excited educational technologists around the world. This paper will explore the technologies behind Podcasting and how this could be used to develop and deliver new E-Learning material. The paper refers to the work done to create Podcasts of lectures for University of Greenwich students.
Abstract:
The use of games technology in education is not a new phenomenon. Even back in the days of 286 processors, PCs were used in some schools along with (what looks now like) primitive simulation software to teach a range of different skills and techniques – from basic programming using Logo (the turtle-style car with a pen at the back that could be used to draw on the floor – always a good way of attracting the attention of school kids!) up to quite sophisticated replications of physical problems, such as working out the trajectory of a missile to blow up an enemy's tank. So why are games not more widely used in education (especially in FE and HE)? Can they help to support learners even at this advanced stage in their education? In this article we provide an overview of the use of game technologies in education (almost as a small literature review for interested parties) and then look in more depth at one particular example that we aim to introduce from the coming academic year (Sept. 2006) to help with the teaching and assessment of one area of our Multimedia curriculum. Of course, we will not yet be able to provide the reader with data on how successful this is, but we will be running a blog (http://themoviesineducation.blogspot.com/) to keep interested parties up to date with the progress of the project and, hopefully, to help others set up similar solutions themselves. We also consider only a small element of the implementation here, and cover how such assessment processes could be used in a broader context. The use of a game to aid learning and improve achievement is suggested because traditional methods of engagement are currently failing on some levels. By this we mean that various parts of the production process we normally cover in our Multimedia degree are becoming difficult to monitor and continually assess.
Abstract:
[This abstract is based on the authors' abstract.] Three new standards to be applied when adopting commercial off-the-shelf (COTS) software solutions are discussed. The first standard is for a COTS software life cycle, the second for a software solution user requirements life cycle, and the third is a checklist to help in completing the requirements. The standards are based on recent major COTS software solution implementations.
Abstract:
The results of a finite element computer modelling analysis of a micro-manufactured one-turn magnetic inductor using the software package ANSYS 10.0 are presented. The inductor is designed for a DC-DC converter used in microelectronic devices. It consists of a copper conductor with a rectangular cross-section, plated with an insulation layer and a layer of magnetic core. The analysis has focused on the effects of the frequency and the air gaps on the inductance values and the Joule losses in the core and conductor. It has been found that an inductor with small multiple air gaps has lower losses than an inductor with a single larger gap.
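For orientation, the gap dependence can be read through the textbook magnetic-circuit approximation for a gapped core (a standard estimate assumed here, not taken from the ANSYS model):

    \[ L \approx \frac{N^{2} \mu_{0} A_{c}}{l_{g} + l_{c}/\mu_{r}} \]

where \(N\) is the number of turns (one here), \(A_{c}\) the core cross-sectional area, \(l_{c}\) the magnetic path length, \(\mu_{r}\) the core's relative permeability and \(l_{g}\) the total air-gap length. Splitting the same total \(l_{g}\) into several small gaps leaves this estimate essentially unchanged but reduces the fringing flux at each individual gap, which is consistent with the lower Joule losses reported for multiple small gaps.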
Abstract:
This work explores the impact of response time distributions on high-rise building evacuation. The analysis utilises response times extracted from printed accounts and interviews of evacuees from the WTC North Tower evacuation of 11 September 2001. Evacuation simulations produced using these “real” response time distributions are compared with simulations produced using instant and engineering response time distributions. Results suggest that while typical engineering approximations to the response time distribution may produce reasonable evacuation times for up to 90% of the building population, using this approach may underestimate total evacuation times by as much as 61%. These observations are applicable to situations involving large high-rise buildings in which travel times are generally expected to be greater than response times.
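A hedged sketch of the comparison mechanism (illustrative C code; the lognormal response-time parameters and the travel-time model are invented, not the study's data). Total evacuation time is governed by the slowest occupants, so an instant-response assumption can understate it:

    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    #define OCCUPANTS 5000
    #define TWO_PI 6.28318530717958647692

    /* Standard normal deviate via the Box-Muller transform. */
    static double std_normal(void) {
        double u1 = (rand() + 1.0) / (RAND_MAX + 2.0);
        double u2 = (rand() + 1.0) / (RAND_MAX + 2.0);
        return sqrt(-2.0 * log(u1)) * cos(TWO_PI * u2);
    }

    /* Total evacuation time = the slowest occupant's response plus
       travel time. Parameters are illustrative only. */
    static double evac_time(int instant_response) {
        double worst = 0.0;
        for (int i = 0; i < OCCUPANTS; i++) {
            double response = exp(4.0 + 1.0 * std_normal()); /* lognormal, s */
            double travel = 600.0 + 300.0 * ((double)rand() / RAND_MAX);
            if (instant_response)
                response = 0.0;   /* everyone starts moving immediately */
            double t = response + travel;
            if (t > worst) worst = t;
        }
        return worst;
    }

    int main(void) {
        srand(7);
        double with_dist = evac_time(0);
        srand(7);   /* same random stream, so travel times match */
        double instant = evac_time(1);
        printf("instant response: %.0f s, distributed response: %.0f s\n",
               instant, with_dist);
        return 0;
    }

In a toy run like this the distributed-response total is dominated by a handful of very late starters, which is the effect the abstract quantifies for the WTC data.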