30 results for computer tomography
from Greenwich Academic Literature Archive - UK
Abstract:
Computer Aided Parallelisation Tools (CAPTools) is a toolkit designed to automate as much as possible of the process of parallelising scalar FORTRAN 77 codes. The toolkit combines a very powerful dependence analysis with user-supplied knowledge to build an extremely comprehensive and accurate dependence graph. The initial version has been targeted at structured mesh computational mechanics codes (e.g. heat transfer, Computational Fluid Dynamics (CFD)), and the associated simple mesh decomposition paradigm is utilised in the automatic code partition, execution control mask generation and communication call insertion. In this, the first of a series of papers [1–3], the authors discuss the parallelisation of a number of case study codes, showing how the various component tools may be used to develop a highly efficient parallel implementation in a few hours or days. The details of the parallelisation of the TEAMKE1 CFD code are described together with the results for three other numerical codes. The resulting parallel implementations are then tested on workstation clusters using PVM and on an i860-based parallel system, showing efficiencies well over 80%.
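The efficiency figure quoted above follows from the standard definitions of speedup and parallel efficiency. A minimal sketch of those two quantities (the timings below are illustrative, not taken from the paper):

```python
def speedup(t_serial, t_parallel):
    """Speedup S = T_serial / T_parallel."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_procs):
    """Parallel efficiency E = S / p, i.e. the fraction of ideal linear scaling."""
    return speedup(t_serial, t_parallel) / n_procs

# Illustrative timings: a code taking 100 s serially and 7.8 s on 16 processors.
t1, tp, p = 100.0, 7.8, 16
print(f"speedup = {speedup(t1, tp):.1f}, efficiency = {efficiency(t1, tp, p):.0%}")
```

An efficiency of 80% on 16 processors thus corresponds to a speedup of about 12.8 over the serial run.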
Abstract:
The mathematical simulation of the evacuation process has a wide and largely untapped scope of application within the aircraft industry. The function of the mathematical model is to provide insight into complex behaviour by allowing designers, legislators, and investigators to ask ‘what if’ questions. Such a model, EXODUS, is currently under development, and this paper describes its evolution and potential applications. EXODUS is an egress model designed to simulate the evacuation of large numbers of individuals from an enclosure, such as an aircraft. The model tracks the trajectory of each individual as they make their way out of the enclosure or are overcome by fire hazards, such as heat and toxic gases. The software is expert system-based, the progressive motion and behaviour of each individual being determined by a set of heuristics or rules. EXODUS comprises five core interacting components: (i) the Movement Submodel — controls the physical movement of individual passengers from their current position to the most suitable neighbouring location; (ii) the Behaviour Submodel — determines an individual's response to the current prevailing situation; (iii) the Passenger Submodel — describes an individual as a collection of 22 defining attributes and variables; (iv) the Hazard Submodel — controls the atmospheric and physical environment; and (v) the Toxicity Submodel — determines the effects on an individual exposed to the fire products, heat, and narcotic gases through the Fractional Effective Dose calculations. These components are briefly described and their capabilities and limitations are demonstrated through comparison with experimental data and several hypothetical evacuation scenarios.
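The Fractional Effective Dose idea underlying the Toxicity Submodel can be sketched simply: at each time step the dose of each fire product received so far is expressed as a fraction of the dose taken to cause incapacitation, and incapacitation is assumed when the accumulated fraction reaches 1. The gas and threshold below are illustrative assumptions, not the constants used in EXODUS:

```python
def fractional_effective_dose(exposure, incapacitating_dose, dt):
    """Accumulate FED over a time history of concentrations.

    exposure           -- concentrations (e.g. ppm) sampled every dt seconds
    incapacitating_dose -- concentration * time product assumed to incapacitate
    dt                 -- sampling interval in seconds
    """
    return sum(c * dt / incapacitating_dose for c in exposure)

# Illustrative numbers only: CO concentration doubling every 10 s for a minute,
# against a hypothetical incapacitating dose of 45000 ppm*s.
co_ppm = [200, 400, 800, 1600, 3200, 6400]
fed = fractional_effective_dose(co_ppm, incapacitating_dose=45000.0, dt=10.0)
print(f"FED = {fed:.2f}, incapacitated = {fed >= 1.0}")
```

In a full model the fractions from several narcotic gases and heat are summed together each time step rather than tracked per gas in isolation.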
Abstract:
Computer based analysis of evacuation can be performed using one of three different approaches, namely optimisation, simulation or risk assessment. Furthermore, within each approach different means of representing the enclosure, the population, and the behaviour of the population are possible. The myriad of approaches which are available has led to the development of some 22 different evacuation models. This article attempts to describe each of the modelling approaches adopted and critically review the inherent capabilities of each approach. The review is based on available published literature.
Abstract:
The demands of the process of engineering design, particularly for structural integrity, have exploited computational modelling techniques and software tools for decades. Frequently, the shape of structural components or assemblies is determined to optimise the flow distribution or heat transfer characteristics, and to ensure that the structural performance in service is adequate. From the perspective of computational modelling these activities are typically separated into:
• fluid flow and the associated heat transfer analysis (possibly with chemical reactions), based upon Computational Fluid Dynamics (CFD) technology;
• structural analysis, again possibly with heat transfer, based upon finite element analysis (FEA) techniques.
Abstract:
A review of the atomistic modelling of the behaviour of nano-scale structures and processes via the molecular dynamics (MD) simulation method for a canonical ensemble is presented. Three areas of application in condensed matter physics are considered. We focus on the adhesive and indentation properties of the solid surfaces in nano-contacts, the nucleation and growth of nano-phase metallic and semi-conducting atomic and molecular films on supporting substrates, and the nano- and multi-scale crack propagation properties of metallic lattices. A set of simulations selected from these fields are discussed, together with a brief introduction to the methodology of the MD simulation. The pertinent inter-atomic potentials that model the energetics of the metallic and semi-conducting systems are also given.
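At the core of any MD simulation is the repeated numerical integration of Newton's equations of motion for the interacting atoms. A minimal velocity-Verlet sketch for the one-dimensional separation of two particles under a generic Lennard-Jones potential, in reduced units (this is a textbook illustration, not the metallic or semi-conducting potentials discussed in the review, and it omits the thermostat a canonical ensemble requires):

```python
def lj_force(r, eps=1.0, sigma=1.0):
    """Lennard-Jones force along the separation axis (reduced units)."""
    sr6 = (sigma / r) ** 6
    return 24.0 * eps * (2.0 * sr6 * sr6 - sr6) / r

def velocity_verlet(r, v, dt, mass=1.0, steps=1000):
    """Integrate one reduced coordinate: the separation of two LJ particles."""
    f = lj_force(r)
    for _ in range(steps):
        r += v * dt + 0.5 * f / mass * dt * dt   # position update
        f_new = lj_force(r)
        v += 0.5 * (f + f_new) / mass * dt       # velocity update with new force
        f = f_new
    return r, v

# Start slightly inside the potential minimum (r_min = 2**(1/6), about 1.122)
# and let the pair oscillate about it.
r, v = velocity_verlet(1.10, 0.0, dt=0.001, steps=2000)
```

A production MD code replaces this with a vectorised force loop over all pairs (usually with neighbour lists) and couples the system to a thermostat to sample the canonical ensemble.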
Abstract:
A flip chip component is a silicon chip mounted to a substrate with the active area facing the substrate. This paper presents the results of an investigation into the relationship between a number of important material properties and geometric parameters on the thermal-mechanical fatigue reliability of a standard flip chip design and a flip chip design with the use of microvias. Computer modeling has been used to analyze the mechanical conditions of flip chips under cyclic thermal loading, where the Coffin-Manson empirical relationship has been used to predict the lifetime of the solder interconnects. The material properties and geometry parameters that have been investigated are the Young's modulus, the coefficient of thermal expansion (CTE) of the underfill, the out-of-plane CTE (CTEz) of the substrate, the thickness of the substrate, and the standoff height. When these parameters vary, the predicted lifetimes are calculated and some of the features of the results are explained. By comparing the predicted lifetimes of the two designs and the strain conditions under thermal loading, the local CTE mismatch has been found to be one of the most important factors in defining the reliability of flip chips with microvias.
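The Coffin-Manson relationship maps the plastic strain range computed per thermal cycle to a number of cycles to failure. A minimal sketch of the calculation, with illustrative material constants for eutectic solder rather than the calibrated values used in the paper:

```python
def coffin_manson_life(delta_gamma, eps_f=0.325, c=-0.5):
    """Coffin-Manson estimate of cycles to failure.

    N_f = 0.5 * (delta_gamma / (2 * eps_f)) ** (1 / c)

    delta_gamma -- plastic shear-strain range per thermal cycle (from the FE model)
    eps_f       -- fatigue ductility coefficient (illustrative value)
    c           -- fatigue ductility exponent (illustrative value)
    """
    return 0.5 * (delta_gamma / (2.0 * eps_f)) ** (1.0 / c)

# A 2% plastic strain range per cycle:
print(f"{coffin_manson_life(0.02):.0f} cycles")
```

Because the exponent is negative, even a modest reduction in the strain range (for example through a better-matched underfill CTE) produces a large gain in predicted lifetime.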
Abstract:
When designing a new passenger ship or modifying an existing design, how do we ensure that the proposed design is safe from an evacuation point of view? In the building and aviation industries, computer-based evacuation models are being used to tackle similar issues. In these industries, the traditional restrictive prescriptive approach to design is making way for performance-based design methodologies using risk assessment and computer simulation. In the maritime industry, ship evacuation models offer the promise of quickly and efficiently bringing these considerations into the design phase, while the ship is "on the drawing board". This paper describes the development of evacuation models with applications to passenger ships and further discusses issues concerning data requirements and validation.
Abstract:
Computer equipment, once viewed as leading edge, is quickly condemned as obsolete and banished to basement store rooms or rubbish bins. The magpie instincts of some of the academics and technicians at the University of Greenwich, London, preserved some such relics in cluttered offices and garages to the dismay of colleagues and partners. When the University moved into its new campus in the historic buildings of the Old Royal Naval College in the center of Greenwich, corridor space in King William Court provided an opportunity to display some of this equipment so that students could see these objects and gain a more vivid appreciation of their subject's history.
Abstract:
The shared-memory programming model can be an effective way to achieve parallelism on shared memory parallel computers. Historically however, the lack of a programming standard using directives and the limited scalability have affected its take-up. Recent advances in hardware and software technologies have resulted in improvements to both the performance of parallel programs with compiler directives and the issue of portability with the introduction of OpenMP. In this study, the Computer Aided Parallelisation Toolkit has been extended to automatically generate OpenMP-based parallel programs with nominal user assistance. We categorize the different loop types and show how efficient directives can be placed using the toolkit's in-depth interprocedural analysis. Examples are taken from the NAS parallel benchmarks and a number of real-world application codes. This demonstrates the great potential of using the toolkit to quickly parallelise serial programs as well as the good performance achievable on up to 300 processors for hybrid message passing-directive parallelisations.
Abstract:
Flip-chip assembly, developed in the early 1960s, is now being positioned as a key joining technology to achieve high-density mounting of electronic components on to printed circuit boards for high-volume, low-cost products. Computer models are now being used early within the product design stage to ensure that optimal process conditions are used. These models capture the governing physics taking place during the assembly process and they can also predict relevant defects that may occur. Describes the application of computational modelling techniques that have the ability to predict a range of interacting physical phenomena associated with the manufacturing process. For example, in the flip-chip assembly process we have solder paste deposition, solder joint shape formation, heat transfer, solidification and thermal stress. Illustrates the application of modelling technology being used as part of a larger UK study aiming to establish a process route for high-volume, low-cost, sub-100-micron pitch flip-chip assembly.
Abstract:
Recently, research has been carried out to test a novel bumping method which omits the under bump metallurgy forming process by bonding copper columns directly onto the Al pads of the silicon dies. This bumping method could be adopted to simplify the flip chip manufacturing process, increase the productivity and achieve a higher I/O count. This paper describes an investigation of the solder joint reliability of flip-chips based on this new bumping process. Computer modelling methods are used to predict the shape of solder joints and the response of flip chips to thermal cyclic loading. The accumulated plastic strain energy at the corner solder joints is used as the damage indicator. Models with a range of design parameters have been compared for their reliability. The parameters that have been investigated are the copper column height, radius and solder volume. The ranking of the relative importance of these parameters is given. For most of the results presented in the paper, the solder material has been assumed to be the lead-free 96.5Sn3.5Ag alloy, but some results for 60Sn40Pb solder joints have also been presented.
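Using accumulated plastic strain energy as the damage indicator typically means feeding the plastic work per cycle from the FE model into an energy-based power law of the Morrow/Darveaux type. The functional form and all constants below are illustrative assumptions, not the model or values used in the paper:

```python
def energy_based_life(delta_w, k=1000.0, n=1.5):
    """Cycles to failure from plastic strain energy density per cycle:

    N_f = k * delta_w ** (-n)

    delta_w -- accumulated plastic work per cycle at the corner joint (MPa assumed)
    k, n    -- material constants (illustrative values, not calibrated)
    """
    return k * delta_w ** (-n)

# Comparing two hypothetical designs by the plastic work per cycle
# predicted at the critical corner joint:
for label, dw in [("tall copper column", 0.08), ("short copper column", 0.15)]:
    print(f"{label}: N_f ~ {energy_based_life(dw):.0f} cycles")
```

Because only the ranking of designs is sought, uncalibrated constants still allow a relative comparison: the design with the lower plastic work per cycle is predicted to last longer regardless of k.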
Abstract:
This paper concerns a preliminary numerical simulation study of the evacuation of the World Trade Centre North Tower on 11 September 2001 using the buildingEXODUS evacuation simulation software. The analysis makes use of response time data derived from a study of survivor accounts appearing in the public domain. While exact geometric details of the building were not available for this study, the building geometry was approximated from descriptions available in the public domain. The study attempts to reproduce the events of 11 September 2001 and pursue several ‘what if’ questions concerning the evacuation. In particular, the study explores the likely outcome had a single staircase survived intact from top to bottom.
Abstract:
The anticipated rewards of adaptive approaches will only be fully realised when autonomic algorithms can take configuration and deployment decisions that match and exceed those of human engineers. Such decisions are typically characterised as being based on a foundation of experience and knowledge. In humans, these underpinnings are themselves founded on the ashes of failure, the exuberance of courage and (sometimes) the outrageousness of fortune. In this paper we describe an application framework that will allow the incorporation of similarly risky, error prone and downright dangerous software artefacts into live systems – without undermining the certainty of correctness at application level. We achieve this by introducing the notion of application dreaming.