14 results for Analysis Tools

in Greenwich Academic Literature Archive - UK


Relevance: 60.00%

Abstract:

A parallel time-domain algorithm is described for the time-dependent nonlinear Black-Scholes equation, which may be used to build financial analysis tools that help traders make rapid and systematic evaluations of buy/sell contracts. The algorithm is particularly suitable for problems that do not require fine detail at each intermediate time step, and the method therefore applies well to the present problem.
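
As context for the kind of computation such tools accelerate, here is a minimal sequential sketch of pricing a European call by stepping the (linear) Black-Scholes PDE backwards in time with an explicit finite-difference scheme. The grid sizes, parameter values and the scheme itself are illustrative assumptions; the paper's parallel time-domain machinery and nonlinear terms are not reproduced.

```python
import numpy as np

def bs_explicit_call(S_max=200.0, K=100.0, r=0.05, sigma=0.2, T=1.0,
                     n_space=200, n_time=20000):
    """European call via explicit finite differences on the Black-Scholes
    PDE, stepped in time-to-maturity tau = T - t from the terminal payoff."""
    S = np.linspace(0.0, S_max, n_space + 1)
    dS = S[1] - S[0]
    dt = T / n_time        # must satisfy the explicit stability limit
    V = np.maximum(S - K, 0.0)            # payoff at tau = 0 (i.e. t = T)
    tau = 0.0
    for _ in range(n_time):
        tau += dt
        V_ss = (V[2:] - 2.0 * V[1:-1] + V[:-2]) / dS**2   # second derivative
        V_s = (V[2:] - V[:-2]) / (2.0 * dS)               # first derivative
        V[1:-1] += dt * (0.5 * sigma**2 * S[1:-1]**2 * V_ss
                         + r * S[1:-1] * V_s - r * V[1:-1])
        V[0] = 0.0                               # option worthless at S = 0
        V[-1] = S_max - K * np.exp(-r * tau)     # deep in-the-money asymptote
    return S, V

S, V = bs_explicit_call()
print(V[100])   # value at S = 100 (at the money); ~10.45 analytically
```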

Relevance: 60.00%

Abstract:

Today most IC and board designs are undertaken using two-dimensional graphics tools and rule checks. System-in-Package is driving three-dimensional design concepts, and this is posing a number of challenges for electronic design automation (EDA) software vendors. System-in-Package requires three-dimensional EDA tools and design collaboration systems with appropriate manufacturing and assembly rules for these expanding technologies. Simulation and analysis tools today focus on one aspect of the design requirement, for example thermal, electrical or mechanical. System-in-Package requires analysis and simulation tools that can easily capture complex three-dimensional structures and provide fast, integrated solutions to issues such as thermal management, reliability and electromagnetic interference. This paper discusses some of the challenges faced by the design and analysis community in providing appropriate tools to engineers for System-in-Package design.

Relevance: 60.00%

Abstract:

Numerical modelling technology and software are now being used to underwrite the design of many microelectronic and microsystems components. The demands for greater capability in these analysis tools are increasing dramatically, as the user community faces the challenge of producing reliable products in ever shorter lead times. This leads to the requirement for analysis tools to represent the interactions amongst distinct phenomena and physics at multiple length- and time-scales. Multi-physics and multi-scale technology is now becoming a reality for many code vendors. This chapter discusses the current status of modelling tools that assess the impact of nano-technology on the fabrication, packaging and testing of microsystems. The chapter is broken down into the following sections: Modelling Technologies; Modelling Application to Fabrication; Modelling Application to Assembly/Packaging; and Modelling Applied to Test and Metrology.

Relevance: 60.00%

Abstract:

Future analysis tools that predict the behaviour of electronic components, both during qualification testing and for in-service lifetime assessment, will be very important in predicting product reliability and identifying when to undertake maintenance. This paper discusses some of these techniques and illustrates them with examples; it also considers the future challenges these techniques face.

Relevance: 60.00%

Abstract:

Financial modelling in the area of option pricing involves understanding the correlations between asset movements and buy/sell decisions in order to reduce investment risk. Such activities depend on financial analysis tools being available to traders, with which they can make rapid and systematic evaluations of buy/sell contracts. In turn, analysis tools rely on fast numerical algorithms for the solution of financial mathematical models. There are many financial activities apart from the buying and selling of shares. The main aim of this chapter is to discuss a distributed algorithm for the numerical solution of a European option, in both the linear and the non-linear case. The algorithm is based on the Laplace transform and its numerical inverse, and its scalability is examined. Numerical tests demonstrate the effectiveness of the algorithm for financial analysis. Time-dependent volatility and interest-rate functions are also discussed, and applications to the non-linear Black-Scholes equation, where the volatility and the interest rate are functions of the option value, are included. Some qualitative results on the convergence behaviour of the algorithm are examined. The chapter also examines various computational issues of the Laplace transform method in terms of distributed computing, introduces the idea of a two-level temporal mesh to achieve distributed computation along the temporal axis, and ends with some conclusions.
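
For context, the transformation step described here can be stated in a few lines, using the textbook form of the linear problem (standard Black-Scholes notation; not necessarily the chapter's exact formulation). Writing u(S, tau) for the option value in time-to-maturity tau = T - t, the Laplace transform in tau turns the PDE into a family of independent boundary-value problems, one per transform parameter, which is what makes the distributed computation possible:

```latex
% Black-Scholes in time-to-maturity tau:
%   u_tau = (1/2) sigma^2 S^2 u_SS + r S u_S - r u,   u(S, 0) = payoff(S).
% Laplace transform in tau,
%   \hat{u}(S, s) = \int_0^\infty e^{-s\tau}\, u(S, \tau)\, d\tau,
% maps u_tau to s\hat{u} - u(S, 0), giving, for each parameter s_k:
\frac{1}{2}\sigma^2 S^2\, \hat{u}''(S)
  + r S\, \hat{u}'(S) - (r + s_k)\,\hat{u}(S) = -\,u(S, 0),
\qquad k = 1, \dots, N.
% Each ODE is independent of every other s_k, hence solvable concurrently;
% u(S, \tau) is then recovered by numerical inversion of the transform.
```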

Relevance: 60.00%

Abstract:

Finance is one of the fastest growing areas in modern applied mathematics, with real-world applications. The interest of this branch of applied mathematics is best described by an example involving shares. Shareholders of a company receive dividends, which come from the profit made by the company. The proceeds of the company, once it is taken over or wound up, will also be distributed to shareholders. Shares therefore have a value that reflects the views of investors about the likely dividend payments and capital growth of the company, and this value is quantified by the share price on stock exchanges. Financial modelling thus serves to understand the correlations between asset movements and buy/sell decisions in order to reduce risk. Such activities depend on financial analysis tools being available to traders, with which they can make rapid and systematic evaluations of buy/sell contracts. There are other financial activities, and it is not the intention of this paper to discuss all of them. The main concern of this paper is to propose a parallel algorithm for the numerical solution of a European option. The paper is organised as follows. First, a brief introduction is given to a simple mathematical model for European options and possible numerical schemes for solving it. Second, the Laplace transform is applied to the mathematical model, leading to a set of parametric equations whose solutions may be found concurrently; the numerical inverse Laplace transform is performed by means of an inversion algorithm developed by Stehfest. The scalability of the algorithm in a distributed environment is demonstrated. Third, the performance of the present algorithm is compared with that of a spatial domain decomposition developed for the time-dependent heat equation. Finally, a number of issues are discussed and future work is suggested.
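
A minimal sketch of the Stehfest inversion step named here, checked against a known transform pair. The Laplace-space option solver that would supply F(s) is not reproduced, and N = 12 is an illustrative choice rather than necessarily the paper's.

```python
import math

def stehfest_weights(N):
    """Gaver-Stehfest coefficients V_k for even N."""
    assert N % 2 == 0
    half = N // 2
    V = []
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, half) + 1):
            s += (j**half * math.factorial(2 * j)
                  / (math.factorial(half - j) * math.factorial(j)
                     * math.factorial(j - 1) * math.factorial(k - j)
                     * math.factorial(2 * j - k)))
        V.append((-1) ** (k + half) * s)
    return V

def invert(F, t, N=12):
    """Approximate f(t) from its Laplace transform F(s)."""
    ln2_t = math.log(2.0) / t
    V = stehfest_weights(N)
    # each evaluation F(k * ln2/t) is independent of the others,
    # which is exactly the concurrency the paper exploits
    return ln2_t * sum(Vk * F(k * ln2_t) for k, Vk in enumerate(V, start=1))

# check on the pair F(s) = 1/(s + 1)  <->  f(t) = exp(-t)
print(invert(lambda s: 1.0 / (s + 1.0), t=1.0))   # ~ exp(-1) = 0.3679
```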

Relevance: 30.00%

Abstract:

Computer Aided Parallelisation Tools (CAPTools) is a toolkit designed to automate as much as possible of the process of parallelising scalar FORTRAN 77 codes. The toolkit combines a very powerful dependence analysis with user-supplied knowledge to build an extremely comprehensive and accurate dependence graph. The initial version has been targeted at structured-mesh computational mechanics codes (e.g. heat transfer, Computational Fluid Dynamics (CFD)), and the associated simple mesh-decomposition paradigm is utilised in the automatic code partition, execution control mask generation and communication call insertion. In this, the first of a series of papers [1–3], the authors discuss the parallelisation of a number of case-study codes, showing how the various component tools may be used to develop a highly efficient parallel implementation in a few hours or days. The details of the parallelisation of the TEAMKE1 CFD code are described together with the results of three other numerical codes. The resulting parallel implementations are then tested on workstation clusters using PVM and on an i860-based parallel system, showing efficiencies well over 80%.
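
A serial Python sketch of the structured-mesh decomposition paradigm the toolkit automates: block-partition the mesh, restrict each loop to the locally owned range (the "execution control mask"), and exchange one-cell halos wherever the dependence graph shows a neighbour's data is needed. This is illustrative only; CAPTools itself generates FORTRAN 77 with PVM communication calls, which is not reproduced here.

```python
import numpy as np

NPROC, N = 4, 64
u = np.linspace(0.0, 1.0, N)      # global field (stands in for CFD data)

# 1D block partition: processor p owns indices lo..hi-1
bounds = [(p * N // NPROC, (p + 1) * N // NPROC) for p in range(NPROC)]

def jacobi_sweep(u):
    """One relaxation sweep, executed partition by partition."""
    new = u.copy()
    for lo, hi in bounds:
        # execution control mask: only interior points owned by this partition
        for i in range(max(lo, 1), min(hi, N - 1)):
            # u[lo-1] and u[hi] are the halo values a real parallel run
            # would receive from neighbouring processors before the sweep
            new[i] = 0.5 * (u[i - 1] + u[i + 1])
    return new

u = jacobi_sweep(u)
```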

Relevance: 30.00%

Abstract:

The requirement for a very accurate dependence analysis to underpin software tools that aid the generation of efficient parallel implementations of scalar code is argued. The current status of dependence analysis is shown to be inadequate for the generation of efficient parallel code, causing too many conservative assumptions to be made. This paper summarises the limitations of conventional dependence analysis techniques and then describes a series of extensions which enable the production of a much more accurate dependence graph. The extensions include analysis of symbolic variables; the development of a symbolic inequality disproof algorithm and its exploitation in a symbolic Banerjee inequality test; the use of inference-engine proofs; the exploitation of exact dependence and dependence pre-domination attributes; interprocedural array analysis; conditional variable definition tracing; and integer array tracing and division calculations. Analysis case studies on typical numerical codes show a reduction of up to 50% in the total dependencies estimated by conventional analysis. The techniques described in this paper have been embedded within a suite of tools, CAPTools, which combines analysis with user knowledge to produce efficient parallel implementations of numerical mesh-based codes.
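
As background, here is a minimal sketch of the classical constant-bounds Banerjee inequality test, in its simplest direction-free form, that the symbolic extensions described above generalise. For a write to A[a*i+b] and a read of A[c*i+d] in a loop i = L..U, dependence requires a*i1 + b = c*i2 + d for some i1, i2 in [L, U]; the test disproves dependence when d - b lies outside the attainable range of a*i1 - c*i2. Names and example values are illustrative.

```python
def banerjee_may_depend(a, b, c, d, L, U):
    """Return False only when dependence is provably impossible."""
    def extremes(coef):
        lo, hi = coef * L, coef * U
        return min(lo, hi), max(lo, hi)
    lo_a, hi_a = extremes(a)
    lo_c, hi_c = extremes(c)
    # attainable range of a*i1 - c*i2 over the iteration space
    lo, hi = lo_a - hi_c, hi_a - lo_c
    return lo <= d - b <= hi

# A[2*i] written, A[2*i + 1] read, i = 1..100: the index sets are disjoint,
# but Banerjee alone cannot disprove it (a GCD test would) -> conservative.
print(banerjee_may_depend(2, 0, 2, 1, 1, 100))    # True (conservative)
print(banerjee_may_depend(1, 0, 1, 200, 1, 100))  # False: provably independent
```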

Relevance: 30.00%

Abstract:

User-supplied knowledge and interaction is a vital component of a toolkit for producing high-quality parallel implementations of scalar FORTRAN numerical code. In this paper we consider the components such a parallelisation toolkit should possess to provide an effective environment in which to identify, extract and embed relevant user knowledge. We also examine to what extent these facilities are available in leading parallelisation tools; in particular, we discuss how these issues have been addressed in the development of the user interface of the Computer Aided Parallelisation Tools (CAPTools). The CAPTools environment has been designed to enable user exploration, interaction and insertion of user knowledge to facilitate the automatic generation of very efficient parallel code. A key issue in the user's interaction is control of the volume of information, so that the user is focused only on what is needed; user control over the level and extent of information revealed at any phase is supplied through a wide variety of filters. Another issue is the way in which information is communicated. Dependence analysis and its resulting graphs involve many sophisticated, rather abstract concepts unlikely to be familiar to most users of parallelising tools. As such, considerable effort has been made to communicate with the user in terms that they will understand. These features, amongst others, and their use in the parallelisation process are described and their effectiveness discussed.

Relevance: 30.00%

Abstract:

This paper demonstrates a modelling and design approach that couples computational mechanics techniques with numerical optimisation and statistical models for virtual prototyping and testing in different application areas concerning the reliability of electronic packages. The integrated software modules provide a design engineer in the electronic manufacturing sector with fast design and process solutions by optimising key parameters while taking into account the complexity of certain operational conditions. The integrated modelling framework is obtained by coupling the multi-physics finite element framework PHYSICA with the numerical optimisation tool VisualDOC into a fully automated design tool for the solution of electronic packaging problems. Response Surface Modelling, Design of Experiments statistical tools, and numerical optimisation techniques are demonstrated as part of the modelling framework. Two different problems are solved using the integrated numerical FEM-optimisation tool. First, an example of thermal management of an electronic package on a board is illustrated: the location of the device is optimised to reduce the junction temperature and the stress in the die, subject to a given cooling-air profile and other heat-dissipating active components. In the second example, thermo-mechanical simulations of solder creep deformation are used to predict flip-chip reliability and subsequently to optimise the lifetime of solder interconnects under thermal cycling.
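
A small end-to-end sketch of the Response Surface Modelling loop such a framework automates: sample design points (the DoE), evaluate the expensive model at each, fit a quadratic surface, and optimise on the surrogate. A cheap analytic function stands in here for a PHYSICA finite-element run, and SciPy's optimiser stands in for VisualDOC; all names and values are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def simulation(x, y):
    """Stand-in for an FE run, e.g. junction temperature vs device position."""
    return (x - 0.3) ** 2 + 2.0 * (y - 0.7) ** 2 + 0.1 * x * y + 85.0

# full-factorial DoE over the feasible placement region [0,1] x [0,1]
pts = np.array([(x, y) for x in np.linspace(0, 1, 5)
                       for y in np.linspace(0, 1, 5)])
z = np.array([simulation(x, y) for x, y in pts])

def basis(x, y):
    """Full quadratic response-surface basis in two variables."""
    return [1.0, x, y, x * y, x * x, y * y]

A = np.array([basis(x, y) for x, y in pts])
coef, *_ = np.linalg.lstsq(A, z, rcond=None)   # least-squares surface fit

surrogate = lambda v: float(np.dot(coef, basis(v[0], v[1])))
res = minimize(surrogate, x0=[0.5, 0.5], bounds=[(0, 1), (0, 1)])
print(res.x)   # optimal placement predicted by the response surface
```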

Relevance: 30.00%

Abstract:

Problems in preserving the quality of granular material products are complex and arise from a variety of sources during transport and storage. Whether designing a new plant or, more likely, analysing problems that give rise to product quality degradation in existing operations, the process engineer requires practical measurement and simulation tools and technologies, both to identify the source of such problems and then to design them out. As part of a major research programme on quality in particulate manufacturing, computational models have been developed for segregation in silos, degradation in pneumatic conveyors, and the development of caking during storage, which use, where possible, micro-mechanical relationships to characterise the behaviour of granular materials. The objective of the work presented here is to demonstrate the use of these computational models of unit processes in the analysis of large-scale processes involving the handling of granular materials. This paper presents a set of simulations of a complete large-scale granular materials handling operation, involving the discharge of material from a silo, its transport through a dilute-phase pneumatic conveyor, and its storage in a big bag under varying environmental temperature and humidity conditions. Conclusions are drawn on the capability of the computational models to represent key granular processes, including particle size segregation, degradation, and moisture-migration caking.

Relevance: 30.00%

Abstract:

A design methodology based on numerical modelling, integrated with optimisation techniques and statistical methods, to aid the process control of micro- and nano-electronics manufacturing processes is presented in this paper. The design methodology is demonstrated for a micro-machining process called Focused Ion Beam (FIB) milling. This process has been modelled to help understand how a pre-defined geometry of micro- and nano-structures can be achieved using this technology. The process performance is characterised by Reduced Order Models (ROMs), generated from the results of a mathematical model of the focused ion beam together with Design of Experiments (DoE) methods. Two ion beam sources, argon and gallium, have been used to compare and quantify the process-variable uncertainties that can be observed during the milling process. The evaluation of the process performance takes into account the uncertainties and variations of the process variables and is used to identify their impact on the reliability and quality of the fabricated structure. An optimisation-based design task is then to identify the optimal process conditions, by varying the process variables, so that certain quality objectives and requirements are achieved and the imposed constraints are satisfied. The software tools used and developed to demonstrate the design methodology are also presented.
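
A minimal sketch of propagating process-variable uncertainty through a reduced order model of the milling step. The polynomial ROM, the choice of variables (beam current, dwell time), their assumed Gaussian scatter, and the depth target are all hypothetical stand-ins, not the paper's fitted model.

```python
import numpy as np

rng = np.random.default_rng(0)

def rom_depth(current_pA, dwell_us):
    """Hypothetical DoE-fitted ROM: milled depth (nm) vs process variables."""
    return 0.8 * current_pA * dwell_us - 0.002 * (current_pA * dwell_us) ** 2

# nominal settings with assumed Gaussian process scatter
current = rng.normal(10.0, 0.5, size=100_000)   # beam current, pA
dwell = rng.normal(5.0, 0.2, size=100_000)      # dwell time, microseconds

depth = rom_depth(current, dwell)               # Monte-Carlo propagation
target, tol = 35.0, 2.0                         # assumed spec on depth
yield_frac = np.mean(np.abs(depth - target) < tol)
print(f"mean depth {depth.mean():.1f} nm, sigma {depth.std():.1f} nm, "
      f"yield {yield_frac:.1%}")
```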

Relevance: 30.00%

Abstract:

In this paper, coupled fire and evacuation simulation tools are used to simulate the Station Nightclub fire. This study differs from the analysis conducted by NIST in three key areas: (1) an enhanced flame spread model and (2) a toxicity generation model are used, and (3) the evacuation is coupled to the fire simulation. Predicted early burning locations in the full-scale fire simulation are in line with photographic evidence, and the predicted onset of flashover is similar to that produced by NIST. However, it is suggested that both predictions of the flashover time are approximately 15 s earlier than the actual event. Three evacuation scenarios are then considered, two of which are coupled with the fire simulation. The coupled fire and evacuation simulation suggests that 180 fatalities result from a building population of 460. With a 15 s delay in the fire timeline, the evacuation simulation produces 84 fatalities, in good agreement with the actual number of fatalities. An important observation resulting from this work is that traditional fire engineering ASET/RSET calculations, which do not couple the fire and evacuation simulations, have the potential to be considerably over-optimistic about the level of safety achieved by building designs.
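
A toy numerical illustration of the ASET/RSET sensitivity highlighted here: an occupant escapes only if their required safe egress time (RSET) is below the available safe egress time (ASET) fixed by the fire. The egress-time distribution and both ASET values below are synthetic, chosen only to show how strongly the predicted outcome can swing on a 15 s shift in the fire timeline; they are not calibrated to the Station Nightclub data.

```python
import numpy as np

rng = np.random.default_rng(1)
# synthetic per-occupant required safe egress times for 460 occupants, s
rset = rng.normal(70.0, 20.0, size=460)

def fatalities(aset_s):
    """Occupants whose required egress time exceeds the available time."""
    return int(np.sum(rset > aset_s))

print(fatalities(75.0))   # fire timeline as computed
print(fatalities(90.0))   # fire timeline shifted 15 s later
```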

Relevance: 30.00%

Abstract:

Evacuation models have played an important role in the transition from prescriptive fire safety codes to performance-based ones over the last three decades. Such models have also become useful tools for different tasks within the fire safety engineering field, such as fire risk assessment and fire investigation. However, there are difficulties in using these models. For instance, during evacuation modelling analysis, a common problem faced by fire safety engineers concerns the number of simulations that need to be performed: in other words, which fire designs (i.e., scenarios) should be investigated using the evacuation models? This question becomes more complex when specific issues, such as the optimal positioning of exits within an arbitrary structure, need to be addressed. This paper therefore presents a methodology which combines evacuation models with numerical techniques from the operational research field, such as Design of Experiments (DoE), Response Surface Models (RSM) and numerical optimisation. The methodology presented here is restricted to evacuation modelling analysis; nevertheless, the same concept can be extended to fire modelling analysis.
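
A minimal sketch of the scenario-selection step such a methodology formalises: rather than hand-picking fire designs, draw a small space-filling Design of Experiments over the scenario variables and run the evacuation model only at those points. SciPy's Latin hypercube sampler stands in for whichever DoE scheme is adopted, and the three variables and their ranges are purely illustrative assumptions.

```python
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=0)
unit = sampler.random(n=10)          # 10 scenarios over 3 variables in [0,1)
# scale to engineering ranges: exit position (m), fire growth rate (kW/s^2),
# initial occupancy (persons)
lo, hi = [0.0, 0.01, 100], [30.0, 0.10, 500]
scenarios = qmc.scale(unit, lo, hi)
for s in scenarios:
    # each row would be one evacuation-model run feeding the response surface
    print(f"exit at {s[0]:5.1f} m, alpha {s[1]:.3f}, occupancy {s[2]:4.0f}")
```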