13 results for Forensic Tools
in Greenwich Academic Literature Archive - UK
Abstract:
Computer Aided Parallelisation Tools (CAPTools) is a toolkit designed to automate as much as possible of the process of parallelising scalar FORTRAN 77 codes. The toolkit combines a very powerful dependence analysis with user-supplied knowledge to build an extremely comprehensive and accurate dependence graph. The initial version has been targeted at structured mesh computational mechanics codes (e.g. heat transfer, Computational Fluid Dynamics (CFD)), and the associated simple mesh decomposition paradigm is utilised in the automatic code partition, execution control mask generation and communication call insertion. In this, the first of a series of papers [1–3], the authors discuss the parallelisation of a number of case study codes, showing how the various component tools may be used to develop a highly efficient parallel implementation in a few hours or days. The details of the parallelisation of the TEAMKE1 CFD code are described, together with results for three other numerical codes. The resulting parallel implementations are then tested on workstation clusters using PVM and on an i860-based parallel system, showing efficiencies well over 80%.
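The mesh decomposition and execution control masks mentioned above can be sketched briefly. The following Python fragment is a hypothetical illustration (not CAPTools output) of a 1D block partition and the owner-computes mask that restricts each process to the cells it owns; in the generated code, halo values at the partition boundaries would be refreshed by the inserted communication calls (PVM in the paper's tests).

    # Hypothetical sketch: 1D block decomposition with an execution mask.

    def block_range(n, nproc, rank):
        """Return the [lo, hi) range of the n mesh cells owned by rank."""
        base, rem = divmod(n, nproc)
        lo = rank * base + min(rank, rem)
        hi = lo + base + (1 if rank < rem else 0)
        return lo, hi

    def masked_update(u, nproc, rank):
        """Each process updates only the interior cells it owns."""
        lo, hi = block_range(len(u), nproc, rank)
        for i in range(max(lo, 1), min(hi, len(u) - 1)):
            u[i] = 0.5 * (u[i - 1] + u[i + 1])  # halo needed at lo-1 and hi

    # Emulate four processes sweeping a 12-cell array in turn.
    u = [float(i) for i in range(12)]
    for rank in range(4):
        masked_update(u, nproc=4, rank=rank)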
Abstract:
User-supplied knowledge and interaction is a vital component of a toolkit for producing high quality parallel implementations of scalar FORTRAN numerical code. In this paper we consider the components that such a parallelisation toolkit should possess to provide an effective environment for identifying, extracting and embedding relevant user knowledge. We also examine to what extent these facilities are available in leading parallelisation tools; in particular we discuss how these issues have been addressed in the development of the user interface of the Computer Aided Parallelisation Tools (CAPTools). The CAPTools environment has been designed to enable user exploration, interaction and insertion of user knowledge to facilitate the automatic generation of very efficient parallel code. A key issue in the user's interaction is control of the volume of information, so that the user is focused on only that which is needed. User control over the level and extent of information revealed at any phase is supplied through a wide variety of filters. Another issue is the way in which information is communicated. Dependence analysis and its resulting graphs involve many sophisticated, rather abstract concepts unlikely to be familiar to most users of parallelising tools. As such, considerable effort has been made to communicate with the user in terms that they will understand. These features, amongst others, and their use in the parallelisation process are described and their effectiveness discussed.
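The filtering idea can be made concrete with a small sketch. The fragment below is a hypothetical illustration (not the CAPTools interface) of narrowing a dependence listing to the entries worth raising with the user, such as dependences the analysis assumed but could not prove.

    # Hypothetical sketch: filter a dependence listing down to what the
    # user has asked to see.
    from dataclasses import dataclass

    @dataclass
    class Dep:
        source: str    # statement that writes the value
        sink: str      # statement that reads it
        kind: str      # "true", "anti" or "output"
        assumed: bool  # True if the analysis could not prove the edge

    def filter_deps(deps, kinds=None, assumed_only=False):
        """Keep only dependences matching the current filter settings."""
        return [d for d in deps
                if (kinds is None or d.kind in kinds)
                and (not assumed_only or d.assumed)]

    deps = [Dep("S1", "S2", "true", False), Dep("S3", "S1", "anti", True)]
    print(filter_deps(deps, assumed_only=True))  # only the unproven edges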
Abstract:
The consecutive, partly overlapping emergence of expert systems and then of neural computation methods among intelligent technologies is reflected in the evolving scene of their application to nuclear engineering. This paper provides a bird's eye view of the state of such applications in the domain, along with a review of one task that is perhaps the most important economically: refueling design in nuclear power reactors.
Abstract:
The Production Workstation developed at the University of Greenwich is evaluated as a tool for assisting all those concerned with production. It enables the producer, director and cinematographer to explore the quality of the images obtainable with a wide range of tools. Users are free to explore many possible choices, ranging from 35mm to DV, and to combine them with the many image manipulation tools of the cinematographer. The validation required for the system, concerning the accuracy of the resulting imagery, is explicitly examined. Copyright © 1999 by the Society of Motion Picture and Television Engineers, Inc.
Abstract:
This work proceeds from the assumption that a European environmental information and communication system (EEICS) is already established. In the context of primary users (land-use planners, conservationists and environmental researchers), we ask what use may be made of the EEICS for building models and tools that are of use in building decision support systems for the land-use planner. The complex task facing the next generation of environmental and forest modellers is described, and a range of relevant modelling approaches is reviewed, including visualization and GIS, statistical tabulation and database SQL, MDA and OLAP methods. The major problem of noncomparability of the definitions and measures of forest area and timber volume is introduced, and the possibility of a model-based solution is considered. The possibility of using an ambitious and challenging biogeochemical modelling approach to understanding and managing European forests sustainably is discussed. It is emphasised that all modern methodological disciplines must be brought to bear, and that a heuristic hybrid modelling approach should be used, so that the benefits of practical empirical modelling are obtained in addition to those of scientifically well-founded, holistic ecosystem and environmental modelling. The data and information system required is likely to end up as a grid-based framework because of the heavy use of computationally intensive model-based facilities.
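As a hypothetical illustration of the model-based harmonisation step (the calibration factors and figures below are invented), national forest-area reports made under different definitions could be mapped onto a common reference definition:

    # Hypothetical sketch: harmonise forest-area figures reported under
    # different national definitions onto a common reference definition.

    # Assumed calibration factors (illustrative only): area under the
    # reference definition divided by area under the national definition.
    CALIBRATION = {"defn_A": 1.00, "defn_B": 0.93, "defn_C": 1.07}

    def harmonise(reports):
        """reports: iterable of (country, national_definition, area_kha)."""
        return {country: area * CALIBRATION[defn]
                for country, defn, area in reports}

    reports = [("X", "defn_A", 1200.0), ("Y", "defn_B", 800.0)]
    print(harmonise(reports))  # areas under the common definition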
Abstract:
A comprehensive solution of solidification/melting processes requires the simultaneous representation of free surface fluid flow, heat transfer, phase change, nonlinear solid mechanics and, possibly, electromagnetics, together with their interactions, in what is now known as multiphysics simulation. Such simulations are computationally intensive, and the implementation of solution strategies for multiphysics calculations must embed their effective parallelization. For some years, together with our collaborators, we have been involved in the development of numerical software tools for multiphysics modeling on parallel cluster systems. This research has involved a combination of algorithmic procedures, parallel strategies and tools, plus the design of a computational modeling software environment and its deployment in a range of real-world applications. One output from this research is the three-dimensional parallel multiphysics code, PHYSICA. In this paper we report on an assessment of its parallel scalability on a range of increasingly complex models drawn from actual industrial problems, on three contemporary parallel cluster systems.
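The scalability assessment reduces to a small piece of bookkeeping, sketched below with invented timings (not PHYSICA results): speedup is the single-processor time divided by the p-processor time, and parallel efficiency is speedup divided by p.

    # Sketch of the scalability bookkeeping (illustrative timings only).

    def speedup_and_efficiency(t1, tp, p):
        """Speedup S = t1/tp; parallel efficiency E = S/p."""
        s = t1 / tp
        return s, s / p

    # Hypothetical run times (seconds) on 1, 4, 16 and 32 processors.
    timings = {1: 1000.0, 4: 260.0, 16: 72.0, 32: 41.0}
    for p, tp in timings.items():
        s, e = speedup_and_efficiency(timings[1], tp, p)
        print(f"p={p:3d}  speedup={s:6.2f}  efficiency={e:6.1%}")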
Abstract:
Book review of: Eisen, M.L., Quas, J.A. & Goodman, G.S. (Eds.) (2002). Memory and suggestibility in the forensic interview. Mahwah, NJ: Lawrence Erlbaum Associates. ISBN 0-8058-3080-4.
Abstract:
The main goal of a cell stability MHD model like MHD-Valdis is to help position the busbars around the cell so that the magnetic field they generate inside the cell leads to stable cell operation. Yet, as far as cell stability is concerned, the uniformity of the current density in the metal pad is also extremely important, and it can only be achieved with a correctly sized busbar network. This work compares the use of a detailed ANSYS-based 3D thermo-electric model with that of the versatile 1D part of MHD-Valdis to help design a well-balanced busbar network.
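The current-balancing question can be illustrated in miniature. The sketch below (a hypothetical two-path network with invented resistances, not an MHD-Valdis or ANSYS model) solves a tiny resistor network by nodal analysis and shows how busbar resistance skews the split of the line current between parallel collector paths, which is what careful busbar sizing must correct.

    # Hypothetical sketch: nodal analysis of a two-path busbar network.
    import numpy as np

    # Two nodes, each tied to the return path through a collector
    # resistance, joined by one busbar segment; line current I enters
    # at node 0.
    R_bus, R_col1, R_col2, I = 5e-6, 20e-6, 20e-6, 300e3  # ohms, amperes

    G = np.array([
        [1/R_col1 + 1/R_bus, -1/R_bus],
        [-1/R_bus,           1/R_col2 + 1/R_bus],
    ])
    v = np.linalg.solve(G, np.array([I, 0.0]))  # node voltages
    i1, i2 = v[0] / R_col1, v[1] / R_col2       # collector currents
    print(f"current split: {i1:.0f} A vs {i2:.0f} A")  # uneven -> resize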
Abstract:
Within the building evacuation context, wayfinding describes the process in which an individual located within an arbitrarily complex enclosure attempts to find a path that leads them to relative safety, usually the exterior of the enclosure. Within most evacuation modelling tools, wayfinding is completely ignored: agents are either assigned the shortest-distance path or use a potential field to find the shortest path to the exits. In this paper a novel wayfinding technique that attempts to represent the manner in which people wayfind within structures is introduced and demonstrated through two examples. The first step is to encode the spatial information of the enclosure as a graph. The second step is to apply search algorithms to the graph to find possible routes to the destination and to assign each route a cost based on the agent's personal route preferences, such as "least time", "least distance" or a combination of criteria. The third step is route execution and refinement: the agent moves along the chosen route, reassesses it at regular intervals, and may switch to an alternative path if it determines that another route is more favourable, e.g. because the initial path is highly congested or is blocked by fire.
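The first two steps can be sketched compactly. The fragment below is a hypothetical illustration (the graph and weights are invented, and Dijkstra's algorithm stands in for whatever search the paper uses): the enclosure is encoded as a weighted graph, and the route cost blends the "least distance" and "least time" preferences; re-running with new weights, or with an edge removed, models the reassessment step.

    # Hypothetical sketch: preference-weighted route search on an
    # enclosure graph.
    import heapq

    # node -> [(neighbour, distance_m, travel_time_s), ...]
    graph = {
        "room":     [("corridor", 10.0, 12.0)],
        "corridor": [("lobby", 20.0, 30.0), ("stair", 15.0, 40.0)],
        "stair":    [("exit", 10.0, 20.0)],
        "lobby":    [("exit", 5.0, 6.0)],
        "exit":     [],
    }

    def best_route(start, goal, w_dist=0.5, w_time=0.5):
        """Dijkstra over a weighted blend of the agent's preferences."""
        frontier = [(0.0, start, [start])]
        seen = set()
        while frontier:
            cost, node, path = heapq.heappop(frontier)
            if node == goal:
                return cost, path
            if node in seen:
                continue
            seen.add(node)
            for nxt, dist, time in graph[node]:
                step = w_dist * dist + w_time * time
                heapq.heappush(frontier, (cost + step, nxt, path + [nxt]))
        return None

    print(best_route("room", "exit"))  # change the weights to re-plan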
Abstract:
Major and trace elemental composition provides a powerful basis for forensic comparison of soils, sediments and rocks. However, it is important that the potential 'errors' associated with the procedures are fully understood and quantified, and that standard protocols are applied for sample preparation and analysis. This paper describes such a standard procedure and reports results both for instrumental measurement precision (repeatability) and for overall 'method' precision (reproducibility). Results obtained for certified reference materials and for example soils show that the instrumental measurement precision (defined by the coefficient of variation, CV) for most elements is better than 2-3%. When different solutions were prepared from the same sample powder, and from different sub-sample powders prepared from the same parent sample, the CV increased to c. 5-6% for many elements. The largest variation was found in results for certified reference materials generated from 23 instrument runs over an 18-month period (mean CV = c. 11%). Some elements were more variable than others: W was found to be the most variable, and V, Cr, Co, Cu, Ni and Pb also showed higher than average variability. SiO2, CaO, Al2O3 and Fe2O3, together with Rb, Sr, La, Ce, Nd and Sm, generally showed lower than average variability and therefore provided the most reliable basis for inter-sample comparison. It is recommended that, whenever possible, samples relating to the same investigation be analysed in the same sample run, or at least in sequential runs.
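The precision statistic quoted throughout is straightforward to compute. A minimal sketch, with invented replicate values rather than the paper's data:

    # Sketch: coefficient of variation (CV) of replicate measurements.
    from statistics import mean, stdev

    def cv_percent(values):
        """CV = 100 * sample standard deviation / mean."""
        return 100.0 * stdev(values) / mean(values)

    # Hypothetical replicate Sr concentrations (ppm) from one run.
    replicates = [151.2, 149.8, 150.5, 150.9, 149.6]
    print(f"CV = {cv_percent(replicates):.2f}%")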