8 results for computer forensics tools
in Greenwich Academic Literature Archive - UK
Abstract:
Computer Aided Parallelisation Tools (CAPTools) is a toolkit designed to automate as much as possible of the process of parallelising scalar FORTRAN 77 codes. The toolkit combines a very powerful dependence analysis with user-supplied knowledge to build an extremely comprehensive and accurate dependence graph. The initial version has been targeted at structured mesh computational mechanics codes (e.g. heat transfer, Computational Fluid Dynamics (CFD)), and the associated simple mesh decomposition paradigm is utilised in automatic code partitioning, execution control mask generation and communication call insertion. In this, the first of a series of papers [1–3], the authors discuss the parallelisation of a number of case study codes, showing how the various component tools may be used to develop a highly efficient parallel implementation in a few hours or days. The details of the parallelisation of the TEAMKE1 CFD code are described together with the results of three other numerical codes. The resulting parallel implementations are then tested on workstation clusters using PVM and on an i860-based parallel system, showing efficiencies well over 80%.
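To make the partitioning and masking steps concrete, here is a minimal sketch of an owner-computes decomposition on a toy 1-D structured mesh; the function names, the stencil and the decomposition scheme are illustrative assumptions, not taken from CAPTools itself.

```python
# Illustrative sketch only: a toy 1-D structured-mesh decomposition with an
# "owner-computes" execution control mask, loosely in the spirit of the
# partition / mask / communication steps described above. All names and the
# stencil are assumptions, not taken from CAPTools.

def partition_1d(n_cells, n_procs, rank):
    """Return the [low, high) range of mesh cells owned by this process."""
    base, rem = divmod(n_cells, n_procs)
    low = rank * base + min(rank, rem)
    high = low + base + (1 if rank < rem else 0)
    return low, high

def smooth_owned_cells(field, low, high):
    """Execution control mask: only update the cells this process owns."""
    updated = list(field)
    for i in range(max(low, 1), min(high, len(field) - 1)):
        updated[i] = 0.5 * (field[i - 1] + field[i + 1])  # toy stencil
    return updated

# Example: 100 cells split over 4 processes; rank 2 owns cells [50, 75).
low, high = partition_1d(100, 4, rank=2)
print(low, high)  # 50 75
```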
Abstract:
Abstract not available
Abstract:
User-supplied knowledge and interaction are vital components of a toolkit for producing high-quality parallel implementations of scalar FORTRAN numerical code. In this paper we consider the necessary components that such a parallelisation toolkit should possess to provide an effective environment to identify, extract and embed relevant user knowledge. We also examine to what extent these facilities are available in leading parallelisation tools; in particular we discuss how these issues have been addressed in the development of the user interface of the Computer Aided Parallelisation Tools (CAPTools). The CAPTools environment has been designed to enable user exploration, interaction and insertion of user knowledge to facilitate the automatic generation of very efficient parallel code. A key issue in the user's interaction is control of the volume of information, so that the user is focused on only that which is needed. User control over the level and extent of information revealed at any phase is supplied through a wide variety of filters. Another issue is the way in which information is communicated. Dependence analysis and its resulting graphs involve many sophisticated, rather abstract concepts unlikely to be familiar to most users of parallelising tools. As such, considerable effort has been made to communicate with the user in terms that they will understand. These features, amongst others, and their use in the parallelisation process are described and their effectiveness discussed.
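As a rough illustration of the filter-driven interaction described above, the sketch below filters a list of dependence edges by kind and routine; the edge structure and filter criteria are assumptions for illustration and do not reflect the actual CAPTools data model.

```python
# Illustrative sketch only: filtering dependence-graph information so the user
# sees just the edges relevant to the current question. The edge fields and
# filter criteria are assumptions, not the CAPTools data model.

from dataclasses import dataclass

@dataclass
class DepEdge:
    source: str   # e.g. a statement or loop label
    target: str
    kind: str     # "true", "anti" or "output"
    routine: str

def filter_edges(edges, kinds=None, routine=None):
    """Return only the dependence edges matching the user's current filters."""
    result = []
    for e in edges:
        if kinds is not None and e.kind not in kinds:
            continue
        if routine is not None and e.routine != routine:
            continue
        result.append(e)
    return result

# Example: show only true dependences inside one routine.
edges = [DepEdge("S1", "S2", "true", "SOLVE"), DepEdge("S3", "S1", "anti", "SOLVE")]
relevant = filter_edges(edges, kinds={"true"}, routine="SOLVE")
```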
Abstract:
The demands of the process of engineering design, particularly for structural integrity, have exploited computational modelling techniques and software tools for decades. Frequently, the shape of structural components or assemblies is determined to optimise the flow distribution or heat transfer characteristics, and to ensure that the structural performance in service is adequate. From the perspective of computational modelling, these activities are typically separated into:
• fluid flow and the associated heat transfer analysis (possibly with chemical reactions), based upon Computational Fluid Dynamics (CFD) technology
• structural analysis (again possibly with heat transfer), based upon finite element analysis (FEA) techniques.
Abstract:
This work proceeds from the assumption that a European environmental information and communication system (EEICS) is already established. In the context of primary users (land-use planners, conservationists, and environmental researchers), we ask what use may be made of the EEICS for building models and tools that are of use in decision support systems for the land-use planner. The complex task facing the next generation of environmental and forest modellers is described, and a range of relevant modelling approaches are reviewed. These include visualization and GIS; statistical tabulation and database SQL; and MDA and OLAP methods. The major problem of noncomparability of the definitions and measures of forest area and timber volume is introduced, and the possibility of a model-based solution is considered. The possibility of using an ambitious and challenging biogeochemical modelling approach to understanding and managing European forests sustainably is discussed. It is emphasised that all modern methodological disciplines must be brought to bear, and that a heuristic hybrid modelling approach should be used to ensure that the benefits of practical empirical modelling approaches are utilised in addition to the scientifically well-founded and holistic ecosystem and environmental modelling. The data and information system required is likely to end up as a grid-based framework because of the heavy use of computationally intensive model-based facilities.
Abstract:
A comprehensive solution of solidification/melting processes requires the simultaneous representation of free surface fluid flow, heat transfer, phase change, nonlinear solid mechanics and, possibly, electromagnetics together with their interactions, in what is now known as multiphysics simulation. Such simulations are computationally intensive and the implementation of solution strategies for multiphysics calculations must embed their effective parallelization. For some years, together with our collaborators, we have been involved in the development of numerical software tools for multiphysics modeling on parallel cluster systems. This research has involved a combination of algorithmic procedures, parallel strategies and tools, plus the design of a computational modeling software environment and its deployment in a range of real world applications. One output from this research is the three-dimensional parallel multiphysics code, PHYSICA. In this paper we report on an assessment of its parallel scalability on a range of increasingly complex models drawn from actual industrial problems, on three contemporary parallel cluster systems.
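For context, the sketch below computes the standard speedup and parallel-efficiency measures typically used in scalability assessments of this kind; the timings in the example are invented placeholders, not results from the paper.

```python
# Illustrative sketch only: standard speedup and parallel-efficiency measures
# of the kind used in scalability assessments. The timings below are made-up
# placeholders, not results reported in the paper.

def speedup(t_serial, t_parallel):
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_procs):
    return speedup(t_serial, t_parallel) / n_procs

# Hypothetical run times: 1200 s on 1 processor, 170 s on 8 processors.
print(efficiency(1200.0, 170.0, 8))  # ~0.88, i.e. roughly 88% efficiency
```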
Abstract:
Within the building evacuation context, wayfinding describes the process in which an individual located within an arbitrarily complex enclosure attempts to find a path which leads them to relative safety, usually the exterior of the enclosure. Within most evacuation modelling tools, wayfinding is completely ignored; agents are either assigned the shortest-distance path or use a potential field to find the shortest path to the exits. In this paper, a novel wayfinding technique that attempts to represent the manner in which people wayfind within structures is introduced and demonstrated through two examples. The first step is to encode the spatial information of the enclosure in terms of a graph. The second step is to apply search algorithms to the graph to find possible routes to the destination and assign a cost to each route based on the agent's personal route preferences, such as "least time" or "least distance" or a combination of criteria. The third step is route execution and refinement. In this step, the agent moves along the chosen route, reassesses the route at regular intervals and may decide to take an alternative path if it determines that another route is more favourable, e.g. if the initial path is highly congested or blocked due to fire.
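A minimal sketch of the graph-based route selection described above, assuming a Dijkstra-style search and a cost function that blends "least distance" and "least time" preferences; the graph, edge weights and blending factors are illustrative assumptions, not the model used in the paper.

```python
# Illustrative sketch only: weighted shortest-path search over a graph encoding
# of the enclosure, with a per-agent cost function mixing distance and
# estimated travel time. All values below are invented for illustration.

import heapq

def route_cost(edge, w_distance=0.5, w_time=0.5):
    """Blend 'least distance' and 'least time' preferences into one cost."""
    return w_distance * edge["distance"] + w_time * edge["time"]

def find_route(graph, start, goal, w_distance=0.5, w_time=0.5):
    """Dijkstra-style search returning the preferred route as a list of nodes."""
    frontier = [(0.0, start, [start])]
    best = {start: 0.0}
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        for nxt, edge in graph.get(node, {}).items():
            new_cost = cost + route_cost(edge, w_distance, w_time)
            if new_cost < best.get(nxt, float("inf")):
                best[nxt] = new_cost
                heapq.heappush(frontier, (new_cost, nxt, path + [nxt]))
    return None

# Example: a tiny corridor graph with two exits leading outside.
graph = {
    "room":     {"corridor": {"distance": 5.0, "time": 4.0}},
    "corridor": {"exit_A": {"distance": 20.0, "time": 45.0},   # congested route
                 "exit_B": {"distance": 35.0, "time": 25.0}},
    "exit_A":   {"outside": {"distance": 2.0, "time": 2.0}},
    "exit_B":   {"outside": {"distance": 2.0, "time": 2.0}},
}
print(find_route(graph, "room", "outside", w_distance=0.2, w_time=0.8))
# A time-dominated preference picks exit_B; re-running find_route at regular
# intervals with updated edge times models the route reassessment step.
```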