15 results for Computer aided network analysis
in Greenwich Academic Literature Archive - UK
Abstract:
Computer Aided Parallelisation Tools (CAPTools) is a toolkit designed to automate as much as possible of the process of parallelising scalar FORTRAN 77 codes. The toolkit combines a very powerful dependence analysis with user-supplied knowledge to build an extremely comprehensive and accurate dependence graph. The initial version has been targeted at structured mesh computational mechanics codes (e.g. heat transfer, Computational Fluid Dynamics (CFD)), and the associated simple mesh decomposition paradigm is utilised in the automatic code partition, execution control mask generation and communication call insertion. In this, the first of a series of papers [1–3], the authors discuss the parallelisation of a number of case study codes, showing how the various component tools may be used to develop a highly efficient parallel implementation in a few hours or days. The details of the parallelisation of the TEAMKE1 CFD code are described together with the results for three other numerical codes. The resulting parallel implementations are then tested on workstation clusters using PVM and on an i860-based parallel system, showing efficiencies well over 80%.
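As a rough illustration of the transformation described above (an illustrative sketch, not code generated by CAPTools; the mesh size, processor count and helper routine are hypothetical), the following C fragment shows a 1-D block partition of a structured-mesh loop, the execution-control masking that restricts each process to the cells it owns, and the point at which a PVM or MPI communication call would be inserted:

```c
/* Illustrative sketch only: the partition/mask/communication pattern that a
 * CAPTools-style parallelisation applies to a structured-mesh loop.
 * N, NP and halo_exchange() are hypothetical, not taken from the paper. */
#include <stdio.h>

#define N   64            /* global mesh size (illustrative)     */
#define NP  4             /* number of processors (illustrative) */

static void halo_exchange(double *u, int lo, int hi)
{
    /* Placeholder: generated code would insert a PVM/MPI call here to
     * exchange u[lo-1] and u[hi+1] with the neighbouring processes. */
    (void)u; (void)lo; (void)hi;
}

int main(void)
{
    double u[N + 2], unew[N + 2];
    for (int i = 0; i < N + 2; i++) u[i] = (double)i;

    for (int rank = 0; rank < NP; rank++) {        /* stand-in for SPMD processes */
        int lo = 1 + rank * (N / NP);              /* block decomposition:        */
        int hi = lo + (N / NP) - 1;                /* local index range of rank   */

        halo_exchange(u, lo, hi);                  /* inserted communication call */

        for (int i = lo; i <= hi; i++)             /* execution-control mask:     */
            unew[i] = 0.5 * (u[i - 1] + u[i + 1]); /* update owned cells only     */
    }
    printf("unew[1] = %g\n", unew[1]);
    return 0;
}
```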
Abstract:
The shared-memory programming model can be an effective way to achieve parallelism on shared memory parallel computers. Historically, however, the lack of a programming standard using directives and the limited scalability have affected its take-up. Recent advances in hardware and software technologies have resulted in improvements both to the performance of parallel programs with compiler directives and, with the introduction of OpenMP, to portability. In this study, the Computer Aided Parallelisation Toolkit has been extended to automatically generate OpenMP-based parallel programs with nominal user assistance. We categorise the different loop types and show how efficient directives can be placed using the toolkit's in-depth interprocedural analysis. Examples are taken from the NAS parallel benchmarks and a number of real-world application codes. These demonstrate the great potential of using the toolkit to quickly parallelise serial programs, as well as the good performance achievable on up to 300 processors for hybrid message-passing/directive parallelisations.
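By way of a hedged example (hand-written for illustration, not output of the toolkit; the loop bodies are invented), two of the loop categories that arise are a fully independent loop, which takes a plain parallel-for directive, and a reduction loop, which additionally needs a reduction clause:

```c
/* Illustrative sketch of OpenMP directive placement; compile with -fopenmp. */
#include <stdio.h>
#include <omp.h>

#define N 1000000

static double a[N], b[N];

int main(void)
{
    double sum = 0.0;

    /* loop with no loop-carried dependence: a plain parallel for suffices */
    #pragma omp parallel for
    for (int i = 0; i < N; i++)
        a[i] = 2.0 * i;

    /* loop with a reduction dependence on sum: needs a reduction clause */
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < N; i++) {
        b[i] = a[i] + 1.0;
        sum += b[i];
    }

    printf("sum = %.1f (max threads: %d)\n", sum, omp_get_max_threads());
    return 0;
}
```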
Abstract:
Network analysis is distinguished from traditional social science by the dyadic nature of the standard data set. Whereas in traditional social science we study monadic attributes of individuals, in network analysis we study dyadic attributes of pairs of individuals. These dyadic attributes (e.g. social relations) may be represented in matrix form by a square 1-mode matrix. In contrast, the data in traditional social science are represented as 2-mode matrices. However, network analysis is not completely divorced from traditional social science, and often has occasion to collect and analyse 2-mode matrices. Furthermore, some of the methods developed in network analysis have uses in analysing non-network data. This paper presents and discusses ways of applying and interpreting traditional network analytic techniques to 2-mode data, as well as developing new techniques. Three areas are covered in detail: displaying 2-mode data as networks, detecting clusters and measuring centrality.
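For concreteness, one standard bridge between 2-mode and 1-mode analysis (a textbook construction, not necessarily the specific techniques developed in this paper) is to project an n-by-m actor-by-event affiliation matrix A onto each of its modes:

\[
(AA^{\mathsf T})_{ij} = \sum_{k=1}^{m} a_{ik}\,a_{jk},
\qquad
(A^{\mathsf T}A)_{kl} = \sum_{i=1}^{n} a_{ik}\,a_{il},
\]

where AA^T counts the events two actors share and A^T A counts the actors two events share; clustering and centrality measures can then be applied to these square 1-mode matrices, or adapted to operate on A directly.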
Abstract:
Abstract not available
Abstract:
Abstract not available
Abstract:
The problems of collaborative engineering design and knowledge management at the conceptual stage in a network of dissimilar enterprises were investigated. This issue in engineering design is a result of the supply chain and virtual enterprise (VE) oriented industry that demands faster time to market and accurate cost/manufacturing analysis from conception. The solution consisted of a decentralised super-peer network architecture to establish and maintain communications between enterprises in a VE. In the solution outlined, the enterprises are able to share knowledge in a common format and nomenclature via a shareable, building-block super-ontology that can be tailored on a project-by-project basis whilst maintaining the common nomenclature of the ‘super-ontology’, thereby eliminating knowledge interpretation issues. The two-tier architecture of the solution ties the peer-level ontologies and the super-ontology together to form a coherent system for both internal and virtual enterprise knowledge management and product development.
Abstract:
User-supplied knowledge and interaction are vital components of a toolkit for producing high quality parallel implementations of scalar FORTRAN numerical code. In this paper we consider the necessary components that such a parallelisation toolkit should possess to provide an effective environment for identifying, extracting and embedding relevant user knowledge. We also examine to what extent these facilities are available in leading parallelisation tools; in particular we discuss how these issues have been addressed in the development of the user interface of the Computer Aided Parallelisation Tools (CAPTools). The CAPTools environment has been designed to enable user exploration, interaction and insertion of user knowledge to facilitate the automatic generation of very efficient parallel code. A key issue in the user's interaction is control of the volume of information, so that the user is focused only on what is needed. User control over the level and extent of information revealed at any phase is provided by a wide variety of filters. Another issue is the way in which information is communicated. Dependence analysis and its resulting graphs involve many sophisticated, rather abstract concepts that are unlikely to be familiar to most users of parallelising tools. As such, considerable effort has been made to communicate with the user in terms that they will understand. These features, amongst others, and their use in the parallelisation process are described and their effectiveness discussed.
Abstract:
The UK government started the UK eUniversities project in order to create a virtual campus for online education provision, competing in a global market. The UKeU (WWW.ukeu.com) claims to "have created a new approach to e-learning" which "opens up a range of exciting opportunities for students, business and industry worldwide" to obtain both postgraduate and undergraduate qualifications. Although there have been many promises about the e-learning revolution using state-of-the-art multimedia technology, closer scrutiny of what is being delivered reveals that many of the e-learning models currently in use are little more than the old text-based computer aided learning running on a global network. As part of the UKeU project a consortium of universities has been involved in developing a two-year foundation degree from 2004. We look at the approach taken by the consortium in developing global e-learning provision and the problems and pitfalls that lie ahead.
Abstract:
This paper presents the results of a research project aimed at evaluating hypermedia assisted learning (HAL) as a mode of course delivery. More specifically, the paper deals with: • developing hypermedia courseware for students studying research methods; and • evaluating hypermedia courseware as a method of delivery against traditional methods. The paper concentrates on pedagogical issues regarding computer aided learning and reports that this research gives tentative indications that hypermedia-based learning, whether delivered via CD-ROM or online, could be as effective as traditional modes of course delivery.
Abstract:
This paper will analyse two of the likely damage mechanisms present in a paper fibre matrix when placed under controlled stress conditions: fibre/fibre bond failure and fibre failure. The failure process associated with each damage mechanism will be presented in detail, focusing on the change in mechanical and acoustic properties of the surrounding fibre structure before and after failure. To present this complex process mathematically, geometrically simple fibre arrangements will be chosen, based on certain assumptions regarding the structure and strength of paper, to model the damage mechanisms. The fibre structures are then formulated in terms of a hybrid vibro-acoustic model based on a coupled mass/spring system and the pressure wave equation. The model will be presented in detail in the paper. The simulation of the simple fibre structures serves two purposes: it highlights the physical and acoustic differences of each damage mechanism before and after failure, and it also shows the differences between the two damage mechanisms when compared with one another. The results of the simulations are given in the form of pressure wave contours, time-frequency graphs and Continuous Wavelet Transform (CWT) diagrams. The analysis of the results leads to criteria by which the two damage mechanisms can be identified. Using these criteria it was possible to verify the results of the simulations against experimental acoustic data. The models developed in this study are of specific practical interest in the paper-making industry, where acoustic sensors may be used to monitor continuous paper production. The same techniques may be adopted more generally to correlate acoustic signals with damage mechanisms in other fibre-based structures.
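In outline (a minimal sketch of the ingredients named above, with notation assumed rather than quoted from the paper), the hybrid vibro-acoustic model couples the equations of motion of a lumped mass/spring representation of the fibre structure to the acoustic pressure wave equation for the surrounding medium:

\[
M\ddot{\mathbf{x}}(t) + K\mathbf{x}(t) = \mathbf{f}(t),
\qquad
\frac{\partial^{2} p}{\partial t^{2}} = c^{2}\,\nabla^{2} p,
\]

where x collects the fibre displacements, M and K are the mass and stiffness matrices, p is the acoustic pressure and c the speed of sound; a bond failure or fibre failure is then represented by a change in the stiffness terms, which alters the radiated pressure signal that the time-frequency and wavelet analyses characterise.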
Abstract:
This paper discusses the reliability of power electronics modules. The approach taken combines numerical modelling techniques with experimentation and accelerated testing to identify failure modes and mechanisms for the power module structure and, most importantly, the root cause of a potential failure. The paper details results for two types of failure: (i) wire bond fatigue and (ii) substrate delamination. Finite element modelling techniques have been used to predict the stress distribution within the module structures. A response surface optimisation approach has been employed to enable the optimal design and parameter sensitivity to be determined. The response surface is then used by a Monte Carlo method to determine the effects of uncertainty in the design.
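As a hedged sketch of how a response surface can feed a Monte Carlo uncertainty study (the quadratic surface, its coefficients and the parameter tolerances below are hypothetical, not results from the paper):

```c
/* Illustrative sketch only: Monte Carlo sampling over a fitted response
 * surface to propagate parameter uncertainty to a predicted stress.      */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

/* hypothetical quadratic response surface: stress as a function of
 * bond-wire diameter d (mm) and substrate thickness t (mm)              */
static double stress(double d, double t)
{
    return 120.0 - 35.0*d - 18.0*t + 6.0*d*d + 4.0*t*t + 2.5*d*t;
}

static double uniform(double lo, double hi)      /* uniform sample in [lo, hi] */
{
    return lo + (hi - lo) * ((double)rand() / RAND_MAX);
}

int main(void)
{
    const int samples = 100000;
    double sum = 0.0, sumsq = 0.0;

    for (int i = 0; i < samples; i++) {
        double d = uniform(0.25, 0.35);          /* assumed tolerance on d */
        double t = uniform(0.60, 0.70);          /* assumed tolerance on t */
        double s = stress(d, t);
        sum   += s;
        sumsq += s * s;
    }

    double mean = sum / samples;
    double sd   = sqrt(sumsq / samples - mean * mean);
    printf("stress: mean = %.2f, std dev = %.3f (units notional)\n", mean, sd);
    return 0;
}
```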
Abstract:
At present the vast majority of Computer-Aided Engineering (CAE) analysis calculations for microelectronic and microsystems technologies are undertaken using software tools that focus on single aspects of the physics taking place. For example, the design engineer may use one code to predict the airflow and thermal behaviour of an electronic package, then another code to predict the stress in solder joints, and then yet another code to predict electromagnetic radiation throughout the system. This focus of mesh-based codes on separate parts of the governing physics is essentially due to the numerical technologies used to solve the partial differential equations, combined with the subsequent heritage structure of the software codes. Using different software tools, each of which requires model building and meshing, leads to a large investment in time, and hence cost, for each of the simulations. During the last ten years there have been significant developments in the modelling community around multi-physics analysis. These developments are being followed by many of the code vendors, who are now providing multi-physics capabilities in their software tools. This paper illustrates current capabilities of multi-physics technology and highlights some of the future challenges.
Abstract:
This paper investigates the thermal design of a light emitting diode (LED) on the board and its packaging. The LED was a 6-lead MultiLED with three chips designed for LCD backlighting and other lighting purposes. A 3D finite element model of this LED was built and thermal analysis was carried out using the multi-physics software package PHYSICA. The modelling results were presented as temperature distributions in each LED, and the predicted junction temperature was used for the thermal resistance calculation. The results for the board structure indicated that (1) removing the foil attach decreased the thermal resistance, and (2) increasing the copper foil thickness reduced the thermal resistance. The results for the package design indicated that the SMT-designed LED with an integrated slug gave lower thermal resistance. Pb-free solder material gave lower thermal resistance and junction temperature when compared with conductive adhesive.
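Assuming the usual definition (the abstract does not state the exact expression used), the thermal resistance follows from the predicted junction temperature as

\[
R_{\mathrm{th}(j\text{-}a)} = \frac{T_j - T_a}{P_d},
\]

where T_j is the predicted junction temperature, T_a the ambient (reference) temperature and P_d the power dissipated in the chip.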
Abstract:
This study presents a CFD analysis constructed around PHYSICA, an open framework for multi-physics computational continuum mechanics modelling, to investigate water movement in unsaturated porous media. The modelling environment is based on a cell-centred finite-volume discretisation technique. A number of test cases are performed in order to validate the correct implementation of Richards' equation for compressible and incompressible fluids. The pressure head form of the equation is used together with the constitutive relationships between pressure, volumetric water content and hydraulic conductivity described by the Haverkamp and van Genuchten models. The flow problems presented are associated with infiltration into initially dry soils with homogeneous or layered geologic settings. Comparison with results for the problems selected from the literature shows good agreement and validates the approach and the implementation.
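For reference, a common statement of the pressure-head form of Richards' equation, together with the van Genuchten retention curve mentioned above (the exact parameterisation used in the paper may differ), is

\[
C(h)\,\frac{\partial h}{\partial t} = \nabla \cdot \bigl[\,K(h)\,\nabla (h + z)\,\bigr],
\qquad
\theta(h) = \theta_r + \frac{\theta_s - \theta_r}{\bigl[\,1 + (\alpha\,\lvert h\rvert)^{n}\,\bigr]^{m}},
\quad m = 1 - \frac{1}{n},
\]

where h is the pressure head, C(h) = dθ/dh the specific moisture capacity, K(h) the unsaturated hydraulic conductivity, z the elevation, and θ_r, θ_s, α and n fitting parameters.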