19 results for Time domain simulation tools
in Greenwich Academic Literature Archive - UK
Abstract:
A parallel time-domain algorithm is described for the time-dependent nonlinear Black-Scholes equation, which may be used to build financial analysis tools that help traders make rapid and systematic evaluations of buy/sell contracts. The algorithm is particularly suitable for problems that do not require fine detail at each intermediate time step, and hence the method applies well to the present problem.
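For context, the conventional serial alternative to such a parallel time-domain method is explicit finite-difference time stepping, where each step depends on the previous one. A minimal sketch for the linear, constant-coefficient Black-Scholes case follows (all parameter values are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def bs_call_explicit(S_max=200.0, K=100.0, r=0.05, sigma=0.2, T=1.0,
                     ns=200, nt=20000):
    """March the Black-Scholes PDE backwards from expiry with an explicit
    finite-difference scheme; the strict step-by-step dependence is the
    serial bottleneck a parallel time-domain method is designed to remove."""
    dS, dt = S_max / ns, T / nt
    S = np.linspace(0.0, S_max, ns + 1)
    V = np.maximum(S - K, 0.0)                          # payoff at expiry
    for step in range(nt):
        tau = (step + 1) * dt                           # time to expiry
        delta = (V[2:] - V[:-2]) / (2 * dS)             # dV/dS
        gamma = (V[2:] - 2 * V[1:-1] + V[:-2]) / dS**2  # d2V/dS2
        V[1:-1] += dt * (0.5 * sigma**2 * S[1:-1]**2 * gamma
                         + r * S[1:-1] * delta - r * V[1:-1])
        V[0], V[-1] = 0.0, S_max - K * np.exp(-r * tau)  # boundary values
    return S, V
```

With these parameters the value at the strike comes out close to the analytical Black-Scholes price of roughly 10.45.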
Abstract:
Finance is one of the fastest growing areas in modern applied mathematics, with real-world applications. The interest of this branch of applied mathematics is best described by an example involving shares. Shareholders of a company receive dividends, which come from the profit made by the company. The proceeds of the company, once it is taken over or wound up, will also be distributed to shareholders. Shares therefore have a value that reflects the views of investors about the likely dividend payments and capital growth of the company, and this value is quantified by the share price on stock exchanges. Financial modelling thus serves to understand the correlations between assets and buy/sell movements in order to reduce risk. Such activities depend on financial analysis tools being available to traders, with which they can make rapid and systematic evaluations of buy/sell contracts. There are other financial activities, but it is not the intention of this paper to discuss them all. The main concern of this paper is to propose a parallel algorithm for the numerical solution of a European option. This paper is organised as follows. First, a brief introduction is given to a simple mathematical model for European options and possible numerical schemes for solving it. Second, the Laplace transform is applied to the mathematical model, which leads to a set of parametric equations whose solutions may be found concurrently. The numerical inverse Laplace transform is performed by means of an inversion algorithm developed by Stehfest. The scalability of the algorithm in a distributed environment is demonstrated. Third, the performance of the present algorithm is compared with that of a spatial domain decomposition developed particularly for the time-dependent heat equation. Finally, a number of issues are discussed and future work is suggested.
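The Stehfest inversion algorithm named in the abstract is a short, purely real-valued formula, which is what makes the transformed parametric equations independent and hence parallelisable. A minimal sketch (the function names are ours, and the abstract gives no parameter choices, so N=12 is an assumption):

```python
from math import factorial, log

def stehfest_coeffs(N):
    """Gaver-Stehfest weights V_k for an even number of terms N."""
    V = []
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            s += (j ** (N // 2) * factorial(2 * j)
                  / (factorial(N // 2 - j) * factorial(j)
                     * factorial(j - 1) * factorial(k - j)
                     * factorial(2 * j - k)))
        V.append((-1) ** (k + N // 2) * s)
    return V

def stehfest_invert(F, t, N=12):
    """f(t) ~ (ln 2 / t) * sum_k V_k F(k ln 2 / t). Each evaluation of F
    is independent, so different k (or different t) can run concurrently."""
    ln2 = log(2.0)
    V = stehfest_coeffs(N)
    return ln2 / t * sum(V[k - 1] * F(k * ln2 / t) for k in range(1, N + 1))
```

For example, inverting F(s) = 1/(s+1) at t = 1 recovers e^(-1) to several digits.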
Abstract:
Computational results for the microwave heating of a porous material are presented in this paper. Combined finite difference time domain and finite volume methods were used to solve equations that describe the electromagnetic field and heat and mass transfer in porous media. The coupling between the two schemes is through a change in dielectric properties, which were assumed to be dependent on both temperature and moisture content. The model was able to reflect the evolution of temperature and moisture fields as the moisture in the porous medium evaporates. Moisture movement results from internal pressure gradients produced by the internal heating and phase change.
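The finite difference time domain half of such a coupled scheme reduces, in its simplest form, to the leapfrog Yee update. A one-dimensional vacuum sketch (normalised units, our own illustrative grid sizes):

```python
import numpy as np

# 1D vacuum Yee FDTD in normalised units (c = dx = 1), Courant number 0.5;
# Ez and Hy sit on staggered grids and leapfrog each other in time.
nx, nt, dt = 200, 300, 0.5
Ez, Hy = np.zeros(nx), np.zeros(nx - 1)
for n in range(nt):
    Hy += dt * (Ez[1:] - Ez[:-1])                     # H update from curl E
    Ez[1:-1] += dt * (Hy[1:] - Hy[:-1])               # E update from curl H
    Ez[nx // 2] += np.exp(-((n - 30.0) / 10.0) ** 2)  # soft Gaussian source
```

In the coupled scheme described above, the material's temperature- and moisture-dependent dielectric properties would enter these update coefficients and be refreshed each heat-transfer step.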
Abstract:
Purpose – This paper aims to present an open-ended microwave curing system for microelectronics components and a numerical analysis framework for virtual testing and prototyping of the system, enabling the design of physical prototypes to be optimised and expediting the development process.
Design/methodology/approach – An open-ended microwave oven system able to enhance the cure process for thermosetting polymer materials utilised in microelectronics applications is presented. The system is designed to be mounted on a precision placement machine, enabling curing of individual components on a circuit board. The design of the system allows the heating pattern and heating rate to be carefully controlled, optimising cure rate and cure quality. A multi-physics analysis approach has been adopted to form a numerical model capable of capturing the complex coupling that exists between physical processes. Electromagnetic analysis has been performed using a Yee finite-difference time-domain scheme, while an unstructured finite volume method has been utilised to perform thermophysical analysis. The two solvers are coupled using a sampling-based cross-mapping algorithm.
Findings – The numerical results obtained demonstrate that the numerical model is able to obtain solutions for the distribution of temperature, rate of cure, degree of cure and thermally induced stresses within an idealised polymer load heated by the proposed microwave system.
Research limitations/implications – The work is limited by the absence of experimentally derived material property data and comparative experimental results. However, the model demonstrates that the proposed microwave system would seem to be a feasible method of expediting the cure rate of polymer materials.
Originality/value – The findings of this paper will help to provide an understanding of the behaviour of thermosetting polymer materials during microwave cure processing.
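A sampling-based cross-mapping between a structured FDTD grid and unstructured finite-volume cell centroids can be as simple as locating each centroid's host cell and reading its value. This is a hypothetical nearest-cell sketch, not the paper's actual algorithm:

```python
import numpy as np

def sample_to_centroids(field, origin, spacing, centroids):
    """Sample a structured field (e.g. FDTD power density) at unstructured
    finite-volume cell centroids: find each centroid's host cell by integer
    division, clamp to the grid, then gather the cell values."""
    idx = np.floor((np.asarray(centroids) - origin) / spacing).astype(int)
    idx = np.clip(idx, 0, np.array(field.shape) - 1)  # stay inside the grid
    return field[tuple(idx.T)]
```

A two-way coupling would apply the same idea in reverse to push the updated temperature field back onto the structured grid.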
Abstract:
The newly formed Escape and Evacuation Naval Authority regulates the provision of abandonment equipment and procedures for all Ministry of Defence vessels. As such, it ensures that access routes on board are evaluated early in the design process to maximize their efficiency and to eliminate, as far as possible, any congestion that might occur during escape. This analysis can be undertaken using a computer-based simulation for given escape scenarios which replicates the layout of the vessel and the interactions between each individual and the ship structure. One such software tool that facilitates this type of analysis is maritimeEXODUS. This tool, through large-scale testing and validation, emulates human shipboard behaviour during emergency scenarios; however, it is largely based around the behaviour of civilian passengers and the fixtures and fittings of merchant vessels. Hence there existed a clear requirement to understand the behaviour of well-trained naval personnel, as opposed to civilian passengers, and to be able to model the fixtures and fittings that are exclusive to warships, thus allowing improvements to both maritimeEXODUS and other software products. Human factor trials using the Royal Navy training facilities at Whale Island, Portsmouth were recently undertaken to collect data that improves our understanding of the aforementioned differences. It is hoped that this data will form the basis of a long-term improvement package that will provide global validation of these simulation tools and assist in the development of specific Escape and Evacuation standards for warships. © 2005: Royal Institution of Naval Architects.
Abstract:
A simulation program has been developed to calculate the power-spectral density of thin avalanche photodiodes, which are used in optical networks. The program extends the time-domain analysis of the dead-space multiplication model to compute the autocorrelation function of the APD impulse response. However, the computation requires a large amount of memory space and is very time consuming. We describe our experiences in parallelizing the code using both MPI and OpenMP. Several array partitioning schemes and scheduling policies are implemented and tested. Our results show that the OpenMP code is scalable up to 64 processors on an SGI Origin 2000 machine and has small average errors.
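The autocorrelation kernel at the heart of such a computation is an O(N^2) loop over lags, which is a natural target for array partitioning across MPI ranks or OpenMP threads. A serial sketch plus one plausible partitioning scheme (the abstract does not specify which schemes were tested, so contiguous blocking is our assumption):

```python
import numpy as np

def autocorrelation(x):
    """R[k] = sum_n x[n] x[n+k]; the loop over lags k is the hot spot
    that a parallel version would distribute across workers."""
    N = len(x)
    return np.array([np.dot(x[:N - k], x[k:]) for k in range(N)])

def block_partition(n_lags, n_procs):
    """Contiguous block split of the lag indices, one per worker; the
    leftover lags are spread over the first few workers."""
    base, extra = divmod(n_lags, n_procs)
    bounds, start = [], 0
    for p in range(n_procs):
        size = base + (1 if p < extra else 0)
        bounds.append((start, start + size))
        start += size
    return bounds
```

Note that block partitioning over lags is load-imbalanced (small k means longer dot products), which is exactly why scheduling policy matters in practice.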
Abstract:
Problems in the preservation of the quality of granular material products are complex and arise from a series of sources during transport and storage. In either designing a new plant or, more likely, analysing problems that give rise to product quality degradation in existing operations, practical measurement and simulation tools and technologies are required to support the process engineer. These technologies are required to help both in identifying the source of such problems and then in designing them out. As part of a major research programme on quality in particulate manufacturing, computational models have been developed for segregation in silos, degradation in pneumatic conveyors, and the development of caking during storage, which use, where possible, micro-mechanical relationships to characterize the behaviour of granular materials. The objective of the work presented here is to demonstrate the use of these computational models of the unit processes involved in the analysis of large-scale processes for handling granular materials. This paper presents a set of simulations of a complete large-scale granular materials handling operation, involving the discharge of the materials from a silo, their transport through a dilute-phase pneumatic conveyor, and their storage in a big bag under varying environmental temperature and humidity conditions. Conclusions are drawn on the capability of the computational models to represent key granular processes, including particle size segregation, degradation, and moisture migration caking.
Abstract:
Developing temperature fields in frozen cheese sauce undergoing microwave heating were simulated and measured. Two scenarios were investigated: a centric and an offset placement on the rotating turntable. Numerical modelling was performed using a dedicated electromagnetic Finite Difference Time Domain (FDTD) module that was two-way coupled to the PHYSICA multiphysics package. Two meshes were used: the food material and container were meshed for the heat transfer, and the microwave oven cavity and waveguide were meshed for the microwave field. Power densities obtained on the structured FDTD mesh were mapped onto the unstructured finite volume method mesh for each time-step/turntable position. After heating for each specified time-step, the temperature field was mapped back onto the FDTD mesh and the electromagnetic properties were updated accordingly. Changes in thermal/electric properties associated with the phase transition were fully accounted for, as well as heat losses from product to cavity. Detailed comparisons were carried out for the centric and offset placements, comparing experimental temperature profiles during microwave thawing with those obtained by numerical simulation.
Abstract:
In this paper, coupled fire and evacuation simulation tools are used to simulate the Station Nightclub fire. This study differs from the analysis conducted by NIST in three key areas: (1) an enhanced flame spread model and (2) a toxicity generation model are used, and (3) the evacuation is coupled to the fire simulation. Predicted early burning locations in the full-scale fire simulation are in line with photographic evidence, and the predicted onset of flashover is similar to that produced by NIST. However, it is suggested that both predictions of the flashover time are approximately 15 sec earlier than actually occurred. Three evacuation scenarios are then considered, two of which are coupled with the fire simulation. The coupled fire and evacuation simulation suggests that 180 fatalities result from a building population of 460. With a 15 sec delay in the fire timeline, the evacuation simulation produces 84 fatalities, which is in good agreement with the actual number of fatalities. An important observation resulting from this work is that traditional fire engineering ASET/RSET calculations, which do not couple the fire and evacuation simulations, have the potential to be considerably over-optimistic in terms of the level of safety achieved by building designs.
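The traditional decoupled check the abstract criticises compares Available Safe Egress Time against Required Safe Egress Time computed independently of the fire. A minimal illustrative sketch (the component breakdown and any numbers are our assumptions, not values from the study):

```python
def decoupled_egress_margin(aset_s, detection_s, pre_movement_s, travel_s,
                            safety_factor=1.0):
    """Traditional ASET/RSET check: the design is deemed safe when ASET
    exceeds RSET = safety_factor * (detection + pre-movement + travel).
    Fire and evacuation never interact here, which is precisely the
    assumption the coupled simulation calls into question."""
    rset = safety_factor * (detection_s + pre_movement_s + travel_s)
    return aset_s - rset  # positive margin => 'safe' on paper
```

For example, `decoupled_egress_margin(300, 30, 60, 120)` reports a 90 s margin, yet a coupled simulation could still predict fatalities if the developing fire blocks the routes that the fixed travel time assumed.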
Abstract:
Metals casting is a process governed by the interaction of a range of physical phenomena. Most computational models of this process address only what are conventionally regarded as the primary phenomena: heat conduction and solidification. However, to predict the formation of porosity (a factor of key importance in cast quality) requires the modelling of the interaction of the fluid flow, heat transfer, solidification and the development of stress-deformation in the solidified part of a component. In this paper, a model of the casting process is described which addresses all the main continuum phenomena involved in a coupled manner. The model is solved numerically using novel finite volume unstructured mesh techniques, and then applied to both the prediction of shape deformation (plus the subsequent formation of a gap at the metal-mould interface and its impact on the heat transfer behaviour) and porosity formation in solidifying metal components. Although the porosity prediction model is phenomenologically simplistic, it is based on the interaction of the continuum phenomena and yields good comparisons with available experimental results. This work represents the first of the next generation of casting simulation tools to predict aspects of the structure of cast components.
Abstract:
We report on practical experience using the Oxford BSP Library to parallelize a large electromagnetic code, the British Aerospace finite-difference time-domain code EMMA T:FD3D. The Oxford BSP Library is one of the first realizations of the Bulk Synchronous Parallel computational model to be targeted at numerically intensive scientific (typically Fortran) computing. The BAe EMMA code is one of the first large-scale applications to be parallelized using this library, and it is an important demonstration of the cost effectiveness of the BSP approach. We illustrate how BSP cost-modelling techniques can be used to predict and optimize performance for single-source programs across different parallel platforms. We provide predicted and observed performance figures for an industrial-strength, single-source parallel code for a variety of real parallel architectures: shared memory multiprocessors, workstation clusters and massively parallel platforms.
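The BSP cost model that makes such cross-platform predictions possible charges each superstep for its maximum local work, its h-relation communication at the machine's gap g, and a barrier latency l. A minimal sketch (function names and any numbers are illustrative assumptions):

```python
def superstep_cost(w_max, h_max, g, l):
    """BSP cost of one superstep: max local work w_max, plus an h-relation
    (h_max words in or out of any one processor) at gap g flop-units per
    word, plus barrier synchronisation latency l."""
    return w_max + h_max * g + l

def predicted_runtime(supersteps, g, l, flop_rate):
    """Sum the superstep costs for a program described as (w, h) pairs and
    convert to seconds for a machine characterised by (g, l, flop_rate);
    swapping in another platform's (g, l, flop_rate) re-predicts the same
    single-source program on that platform."""
    flops = sum(superstep_cost(w, h, g, l) for w, h in supersteps)
    return flops / flop_rate
```

This separation of program description (w, h per superstep) from machine description (g, l, flop rate) is what lets one single-source code be tuned for shared-memory machines, clusters and MPPs alike.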
Abstract:
Computational results for the intensive microwave heating of porous materials are presented in this work. A multi-phase porous media model has been developed to predict the heating mechanism. Combined finite difference time-domain and finite volume methods were used to solve equations that describe the electromagnetic field and heat and mass transfer in porous media. The coupling between the two schemes is through a change in dielectric properties, which were assumed to be dependent on both temperature and moisture content. The model was able to reflect the evolution of both temperature and moisture fields as well as energy penetration as the moisture in the porous medium evaporates. Moisture movement results from internal pressure gradients produced by the internal heating and phase change.
Abstract:
Computational results for the microwave heating of a porous material are presented in this paper. Combined finite difference time domain and finite volume methods were used to solve equations that describe the electromagnetic field and heat and mass transfer in porous media. The coupling between the two schemes is through a change in dielectric properties which were assumed to be dependent on both temperature and moisture content. The model was able to reflect the evolution of both temperature and moisture fields as well as energy penetration as the moisture in the porous medium evaporates. Moisture movement results from internal pressure gradients produced by the internal heating and phase change.
Abstract:
An important factor for high-speed optical communication is the availability of ultrafast and low-noise photodetectors. Among the semiconductor photodetectors that are commonly used in today’s long-haul and metro-area fiber-optic systems, avalanche photodiodes (APDs) are often preferred over p-i-n photodiodes due to their internal gain, which significantly improves the receiver sensitivity and alleviates the need for optical pre-amplification. Unfortunately, the random nature of the very process of carrier impact ionization, which generates the gain, is inherently noisy and results in fluctuations not only in the gain but also in the time response. We have recently developed a theory characterizing the autocorrelation function of APDs which incorporates the dead-space effect, an effect that is very significant in thin, high-performance APDs. This research extends the time-domain analysis of the dead-space multiplication model to compute the autocorrelation function of the APD impulse response. However, the computation requires a large amount of memory space and is very time consuming. Here we describe our experiences in parallelizing the code in MPI and OpenMP using CAPTools. Several array partitioning schemes and scheduling policies are implemented and tested. Our results show that the code is scalable up to 64 processors on an SGI Origin 2000 machine and has small average errors.
Abstract:
Evacuation analysis of passenger and commercial shipping can be undertaken using computer-based simulation tools such as maritimeEXODUS. These tools emulate human shipboard behaviour during emergency scenarios; however, they are largely based around the behaviour of civilian passengers and the fixtures and fittings of merchant vessels. If these tools and procedures are to be applied to naval vessels, there is a clear requirement to understand the behaviour of well-trained naval personnel interacting with the fixtures and fittings that are exclusive to warships. Human factor trials using Royal Navy training facilities were recently undertaken to collect data to improve our understanding of the performance of naval personnel in warship environments. The trials were designed and conducted by staff from the Fire Safety Engineering Group (FSEG) of the University of Greenwich on behalf of the Sea Technology Group (STG), Defence Procurement Agency. The trials involved a selection of RN volunteers with sea-going experience in warships, operating and traversing structural components under different angles of heel. This paper describes the trials and some of the collected data.