15 results for reduced order models

in Greenwich Academic Literature Archive - UK


Relevance: 100.00%

Abstract:

This paper discusses a reliability-based optimisation modelling approach demonstrated for the design of a SiP structure integrated by stacking dies one upon the other. In this investigation the focus is on the strategy for handling the uncertainties in the package design inputs and their implementation into the design optimisation modelling framework. The analysis of the thermo-mechanical behaviour of the package is utilised to predict the fatigue lifetime of the lead-free board-level solder interconnects and the warpage of the package under thermal cycling. The SiP characterisation is obtained through the exploitation of Reduced Order Models (ROM) constructed using high-fidelity analysis and Design of Experiments (DoE) methods. The design task is to identify the optimal SiP design specification by varying several package input parameters so that a specified target reliability of the solder joints is achieved and, at the same time, design requirements and package performance criteria are met.
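A small sketch can make the ROM-plus-DoE characterisation concrete: a quadratic response surface fitted by least squares to a handful of DoE sample points. The design-input names, sample values and the fit_quadratic_rom helper below are hypothetical illustrations, not the paper's actual model.

```python
# Minimal sketch: fit a quadratic Reduced Order Model (response surface)
# to DoE samples of a package response (e.g. solder-joint fatigue life).
# All numbers and names are illustrative assumptions, not paper data.
import numpy as np

def quadratic_features(X):
    """Features [1, x1, x2, x1^2, x2^2, x1*x2] for two design inputs."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

def fit_quadratic_rom(X, y):
    """Least-squares fit of the response-surface coefficients."""
    coeffs, *_ = np.linalg.lstsq(quadratic_features(X), y, rcond=None)
    return coeffs

def rom_predict(coeffs, X):
    return quadratic_features(X) @ coeffs

# Hypothetical DoE plan (two normalised design inputs in [-1, 1]) and
# responses that would normally come from high-fidelity FE runs.
X_doe = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1],
                  [0, 0], [-1, 0], [1, 0], [0, -1], [0, 1]], dtype=float)
y_doe = np.array([1200., 1500., 900., 1100., 1300., 1350., 1000., 1150., 1400.])

rom = fit_quadratic_rom(X_doe, y_doe)
print("predicted life at (0.5, -0.5):", rom_predict(rom, np.array([[0.5, -0.5]])))
```

Once fitted, such a surrogate is cheap enough to be called thousands of times inside an optimisation loop, which is what makes the ROM-based design search tractable.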

Relevance: 100.00%

Abstract:

A design methodology based on numerical modelling, integrated with optimisation techniques and statistical methods, to aid the process control of micro- and nano-electronics based manufacturing processes is presented in this paper. The design methodology is demonstrated for a micro-machining process called Focused Ion Beam (FIB). This process has been modelled to help understand how a pre-defined geometry of micro- and nano-structures can be achieved using this technology. The process performance is characterised on the basis of Reduced Order Models (ROM), which are generated using results from a mathematical model of the Focused Ion Beam and Design of Experiments (DoE) methods. Two ion beam sources, Argon and Gallium ions, have been used to compare and quantify the process variable uncertainties that can be observed during the milling process. The evaluation of the process performance takes into account the uncertainties and variations of the process variables and is used to identify their impact on the reliability and quality of the fabricated structure. An optimisation-based design task is to identify the optimal process conditions, by varying the process variables, so that certain quality objectives and requirements are achieved and imposed constraints are satisfied. The software tools used and developed to demonstrate the design methodology are also presented.
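One way to quantify how the process-variable uncertainties mentioned above propagate into the milled geometry is plain Monte Carlo sampling through a cheap surrogate. The surrogate formula, units and Gaussian spreads below are assumptions for illustration only.

```python
# Minimal sketch: Monte Carlo propagation of process-variable uncertainty
# through a hypothetical ROM surrogate of an FIB milling response.
import numpy as np

rng = np.random.default_rng(0)

def rom(beam_current_nA, dwell_time_us):
    """Hypothetical ROM: milled depth (nm) vs. beam current and dwell time."""
    return 3.0 * beam_current_nA * dwell_time_us - 0.1 * beam_current_nA**2

# Assumed nominal settings and 1-sigma process variations.
current = rng.normal(10.0, 0.5, size=100_000)  # nA
dwell = rng.normal(2.0, 0.1, size=100_000)     # microseconds

depth = rom(current, dwell)
print(f"depth mean = {depth.mean():.1f} nm, std = {depth.std():.1f} nm")
print("P(depth outside 45-55 nm):", np.mean((depth < 45) | (depth > 55)))
```

The resulting exceedance probability is exactly the kind of quality metric the abstract refers to when assessing the reliability of the fabricated structure.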

Relevance: 100.00%

Abstract:

This paper presents a design methodology based on numerical modelling, integrated with optimisation techniques and statistical methods, to aid the development of new advanced technologies in the area of micro and nano systems. The design methodology is demonstrated for a micro-machining process called Focused Ion Beam (FIB). This process has been modelled to provide knowledge of how a pre-defined geometry can be achieved through this direct milling process. The geometry characterisation is obtained using Reduced Order Models (ROM), generated from the results of a mathematical model of the Focused Ion Beam, and Design of Experiments (DoE) methods. In this work, the focus is on the design flow methodology, which includes an approach for including process parameter uncertainties in the process optimisation modelling framework. A discussion of the impact of the process parameters, and their variations, on the quality and performance of the fabricated structure is also presented. The design task is to identify the optimal process conditions, by altering the process parameters, so that a certain reliability and confidence of the application is achieved and the imposed constraints are satisfied. The software tools used and developed to demonstrate the design methodology are also presented.
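The closing design task, finding process settings that meet a target while satisfying constraints, can be sketched as a small bounded optimisation over a ROM surrogate. The objective, target depth and bounds below are hypothetical stand-ins for the paper's actual process model.

```python
# Minimal sketch: optimisation-based process design over a hypothetical
# ROM surrogate using scipy; all numbers are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

def rom_depth(x):
    """Hypothetical ROM: milled depth (nm) from (beam current, dwell time)."""
    current, dwell = x
    return 3.0 * current * dwell - 0.1 * current**2

def objective(x):
    """Minimise squared deviation from an assumed 50 nm target depth."""
    return (rom_depth(x) - 50.0) ** 2

# Assumed bounds on the process variables (current in nA, dwell in us).
result = minimize(objective, x0=np.array([8.0, 2.0]),
                  bounds=[(1.0, 20.0), (0.5, 2.5)], method="L-BFGS-B")
print("optimal (current, dwell):", result.x, "-> depth:", rom_depth(result.x))
```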

Relevance: 90.00%

Abstract:

The latest advances in multi-physics modelling, both using high-fidelity techniques and reduced order and behavioural models, will be discussed. Particular focus will be given to the application and validation of these techniques for modelling the fabrication, packaging and subsequent reliability of micro-systems based components. The paper will discuss results from a number of research projects, with particular emphasis on the techniques being developed in a major UK Government funded project - 3D-MINTEGRATION (www.3d-mintegration.com).

Relevance: 80.00%

Abstract:

In this paper, a method for the integration of several numerical analytical techniques that are used in microsystems design and failure analysis is presented. The analytical techniques are categorized into four groups in the discussion: high-fidelity analytical tools, i.e. the finite element (FE) method; fast analytical tools, referring to reduced order modeling (ROM); optimization tools; and probability-based analytical tools. The characteristics of these four tools are investigated, the interactions between them are discussed, and a methodology for their coupling is offered. This methodology consists of three stages, namely reduced order modeling, deterministic optimization and probabilistic optimization. Using this methodology, a case study for the optimization of a solder joint is conducted. It is shown that these analysis techniques interact with and complement one another. Their combined application can fully utilize the advantages of each technique and satisfy various design requirements. The case study shows that the coupling method for the different tools provided by this paper is effective and efficient, and it is highly relevant in the design and reliability analysis of microsystems.
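Of the three stages, probabilistic optimization is the least standard, so a sketch may help: pick a design variable such that a Monte Carlo estimate of the failure probability, evaluated through a cheap ROM, stays below a target. The ROM expression, strain limit and distributions below are assumptions, not the paper's case-study data.

```python
# Minimal sketch of probabilistic optimization: minimise an assumed cost
# subject to a Monte Carlo failure-probability estimate staying below 1%.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
# One fixed noise sample (common random numbers) keeps the objective
# deterministic across optimizer evaluations.
NOISE = rng.normal(0.0, 0.005, size=50_000)  # assumed material scatter

def rom_strain(standoff_mm):
    """Hypothetical ROM: solder-joint strain falls with stand-off height."""
    return 0.04 / standoff_mm + NOISE

def failure_probability(standoff_mm):
    return np.mean(rom_strain(standoff_mm) > 0.09)  # assumed strain limit

def penalised_cost(standoff_mm):
    """Assumed cost grows with stand-off; penalise P(failure) > 1%."""
    return standoff_mm + 1e3 * max(0.0, failure_probability(standoff_mm) - 0.01)

result = minimize_scalar(penalised_cost, bounds=(0.3, 1.5), method="bounded")
print(f"stand-off = {result.x:.3f} mm, Pf = {failure_probability(result.x):.4f}")
```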

Relevance: 80.00%

Abstract:

This paper describes a framework that is being developed for the prediction and analysis of electronics power module reliability, both for qualification testing and for in-service lifetime prediction. A physics-of-failure (PoF) reliability methodology using multi-physics high-fidelity and reduced order computer modelling, as well as numerical optimization techniques, is integrated in a dedicated computer modelling environment to meet the needs of power module designers and manufacturers as well as end-users, for both design and maintenance purposes. An example of lifetime prediction for a power module solder interconnect structure is described. Another example is the lifetime prediction of a power module for a railway traction control application. The paper also discusses a combined physics-of-failure and data-trending prognostic methodology for the health monitoring of power modules.

Relevance: 80.00%

Abstract:

We study information rates of time-varying flat-fading channels (FFC) modeled as finite-state Markov channels (FSMC). FSMCs have two main applications for FFCs: modeling channel error bursts and decoding at the receiver. Our main finding in the first application is that receiver observation noise can more adversely affect higher-order FSMCs than lower-order FSMCs, resulting in lower capacities. This is despite the fact that the underlying higher-order FFC and its corresponding FSMC are more predictable. Numerical analysis shows that at low to medium SNR conditions (SNR ≲ 12 dB) and at medium to fast normalized fading rates (0.01 ≲ fDT ≲ 0.10), FSMC information rates are non-increasing functions of memory order. We conclude that BERs obtained by low-order FSMC modeling can provide optimistic results. To explain the capacity behavior, we present a methodology that enables analytical comparison of FSMC capacities with different memory orders. We establish sufficient conditions that predict higher/lower capacity of a reduced-order FSMC, compared to its original high-order FSMC counterpart. Finally, we investigate the achievable information rates in FSMC-based receivers for FFCs. We observe that high-order FSMC modeling at the receiver side results in a negligible information rate increase for normalized fading rates fDT ≲ 0.01.
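For readers unfamiliar with FSMCs, the classic first-order example for burst errors is the two-state Gilbert-Elliott model; the sketch below simulates one. All probabilities are arbitrary illustrative values, and the sketch only shows the burst-error behaviour, not the capacity analysis of the paper.

```python
# Minimal sketch: a first-order two-state FSMC (Gilbert-Elliott style)
# producing bursty bit errors. All probabilities are illustrative.
import numpy as np

rng = np.random.default_rng(2)

P = np.array([[0.98, 0.02],          # good -> {good, bad} (assumed)
              [0.10, 0.90]])         # bad  -> {good, bad} (assumed)
crossover = np.array([0.001, 0.20])  # BER in good/bad state (assumed)

n, state = 200_000, 0
errors = np.empty(n, dtype=bool)
for t in range(n):
    errors[t] = rng.random() < crossover[state]
    state = 0 if rng.random() < P[state, 0] else 1

print("overall BER:", errors.mean())
# Errors cluster in bursts, so this conditional BER is far above the mean:
print("P(error | previous error):", errors[1:][errors[:-1]].mean())
```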

Relevance: 30.00%

Abstract:

This paper studies two models of two-stage processing with no-wait in process. The first model is the two-machine flow shop, and the other is the assembly model. For both models we consider the problem of minimizing the makespan, provided that the setup and removal times are separated from the processing times. Each of these scheduling problems is reduced to the Traveling Salesman Problem (TSP). We show that, in general, the assembly problem is NP-hard in the strong sense. On the other hand, the two-machine flow shop problem reduces to the Gilmore-Gomory TSP, and is solvable in polynomial time. The same holds for the assembly problem under some reasonable assumptions. Using these and existing results, we provide a complete complexity classification of the relevant two-stage no-wait scheduling models.
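The TSP reduction can be made concrete for the two-machine no-wait flow shop: the idle time forced between consecutive jobs acts as an inter-city distance, so minimising the makespan is a shortest-Hamiltonian-path problem. The brute-force sketch below (setup and removal times omitted for simplicity, and with made-up processing times) illustrates the structure that the Gilmore-Gomory algorithm exploits in polynomial time.

```python
# Minimal sketch: brute-force minimum makespan for the two-machine no-wait
# flow shop. Setup/removal times are omitted; the data is illustrative.
from itertools import permutations

# (a_j, b_j): processing times on machines 1 and 2 for each job (assumed).
jobs = [(3, 6), (5, 2), (4, 4), (2, 7)]

def makespan(seq):
    """No-wait: each job moves to machine 2 immediately after machine 1."""
    start = 0.0
    for i in range(1, len(seq)):
        a_prev, b_prev = jobs[seq[i - 1]]
        a_next = jobs[seq[i]][0]
        # The next job may start on M1 only if M2 is free when it arrives;
        # this max() term is the TSP "distance" between consecutive jobs.
        start += a_prev + max(0.0, b_prev - a_next)
    a_last, b_last = jobs[seq[-1]]
    return start + a_last + b_last

best = min(permutations(range(len(jobs))), key=makespan)
print("best sequence:", best, "makespan:", makespan(best))
```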

Relevance: 30.00%

Abstract:

A discretized series of events is a binary time series that indicates whether or not events of a point process in the line occur in successive intervals. Such data are common in environmental applications. We describe a class of models for them, based on an unobserved continuous-time discrete-state Markov process, which determines the rate of a doubly stochastic Poisson process, from which the binary time series is constructed by discretization. We discuss likelihood inference for these processes and their second-order properties and extend them to multiple series. An application involves modeling the times of exposures to air pollution at a number of receptors in Western Europe.
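A small simulation makes the construction concrete: a two-state continuous-time Markov chain modulates the rate of a Poisson process, and the event times are then discretized into a binary series. All rates and the interval width below are illustrative assumptions.

```python
# Minimal sketch: a two-state Markov-modulated (doubly stochastic) Poisson
# process discretized into a binary time series. Rates are assumptions.
import numpy as np

rng = np.random.default_rng(3)

switch = np.array([0.1, 0.5])  # state exit rates (assumed)
lam = np.array([0.05, 2.0])    # Poisson event rate in each state (assumed)
T, dt = 1000.0, 1.0            # horizon and discretization interval

events, t, state = [], 0.0, 0
while t < T:
    hold = min(rng.exponential(1.0 / switch[state]), T - t)  # sojourn time
    n = rng.poisson(lam[state] * hold)                       # events in sojourn
    events.extend(t + rng.uniform(0.0, hold, size=n))
    t += hold
    state = 1 - state

# Binary series: does interval [k*dt, (k+1)*dt) contain at least one event?
bins = np.zeros(int(T / dt), dtype=bool)
bins[(np.asarray(events) / dt).astype(int)] = True
print("fraction of intervals containing events:", bins.mean())
```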

Relevance: 30.00%

Abstract:

This paper describes how modeling technology has been used in providing fatigue lifetime data for two flip-chip models. Full-scale three-dimensional modeling of flip-chips under cyclic thermal loading has been combined with solder joint stand-off height prediction to analyze the stress and strain conditions in the two models. The Coffin-Manson empirical relationship is employed to predict the fatigue lifetimes of the solder interconnects. In order to help designers in selecting the underfill material and the printed circuit board, the Young's modulus and the coefficient of thermal expansion of the underfill, as well as the thickness of the printed circuit board, are treated as variable parameters. Fatigue lifetimes are therefore calculated over a range of these material and geometry parameters. In this paper we will also describe how the use of micro-via technology may affect fatigue life.
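The Coffin-Manson relationship mentioned above links the cyclic plastic strain range to the number of cycles to failure, N_f = (1/2) * (Δγ / (2ε_f'))^(1/c). A tiny sketch with typical textbook constants for eutectic SnPb solder (illustrative only; the paper's coefficients may differ):

```python
# Minimal sketch: Coffin-Manson fatigue-life estimate for a solder joint.
# eps_f and c are typical textbook values for eutectic SnPb solder, used
# purely as an illustration of the relationship.
def coffin_manson_cycles(delta_gamma, eps_f=0.325, c=-0.5):
    """Cycles to failure: N_f = 0.5 * (delta_gamma / (2 * eps_f))**(1 / c)."""
    return 0.5 * (delta_gamma / (2.0 * eps_f)) ** (1.0 / c)

# Example: assumed cyclic shear-strain ranges per thermal cycle.
for dg in (0.005, 0.01, 0.02):
    print(f"strain range {dg:.3f} -> N_f = {coffin_manson_cycles(dg):,.0f} cycles")
```

Because c is negative, halving the strain range (for example by changing the stand-off height) roughly quadruples the predicted life under these assumed constants.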

Relevance: 30.00%

Abstract:

Computational Fluid Dynamics (CFD) is gradually becoming a powerful and almost essential tool for the design, development and optimization of engineering applications. However, the mathematical modelling of erratic turbulent motion remains the key issue when tackling such flow phenomena. The reliability of CFD analysis depends heavily on the turbulence model employed together with the wall functions implemented. In order to resolve the abrupt changes in the turbulent energy and other parameters at near-wall regions, a particularly fine mesh is necessary, which inevitably increases the computer storage and run-time requirements. Turbulence modelling can be considered to be one of the three key elements in CFD. Precise mathematical theories have evolved for the other two key elements, grid generation and algorithm development. The principal objective of turbulence modelling is to enhance computational procedures of sufficient accuracy to reproduce the main structures of three-dimensional fluid flows. The flow within an electronic system can be characterized as being in a transitional state due to the low velocities and relatively small dimensions encountered. This paper presents simulated CFD results for an investigation into the predictive capability of turbulence models when considering both fluid flow and heat transfer phenomena. A new two-layer hybrid k-ε/k-l turbulence model for electronic application areas is also presented, which has the advantages of being cheap in terms of the computational mesh required and economical with regard to run-time.

Relevance: 30.00%

Abstract:

A practical CFD method is presented in this study to predict the generation of toxic gases in enclosure fires. The model makes use of local combustion conditions to determine the yields of carbon monoxide, carbon dioxide, hydrocarbon, soot and oxygen. The local conditions used in the determination of these species are the local equivalence ratio (LER) and the local temperature. The heat released from combustion is calculated using the volumetric heat source model or the eddy dissipation model (EDM). The model is then used to simulate a range of reduced-scale and full-scale fire experiments. The model predictions for most of the species are shown to be in good agreement with the test results.

Relevance: 30.00%

Abstract:

Financial modelling in the area of option pricing involves understanding the correlations between asset movements and buy/sell activity in order to reduce investment risk. Such activities depend on financial analysis tools being available to the trader, with which rapid and systematic evaluations of buy/sell contracts can be made. In turn, analysis tools rely on fast numerical algorithms for the solution of financial mathematical models. There are many different financial activities apart from buy/sell activities in shares. The main aim of this chapter is to discuss a distributed algorithm for the numerical solution of a European option, considering both linear and non-linear cases. The algorithm is based on the concept of the Laplace transform and its numerical inverse. The scalability of the algorithm is examined, and numerical tests are used to demonstrate its effectiveness for financial analysis. Time-dependent functions for volatility and interest rates are also discussed, and applications of the algorithm to the non-linear Black-Scholes equation, where the volatility and the interest rate are functions of the option value, are included. Some qualitative results on the convergence behaviour of the algorithm are examined. This chapter also examines the various computational issues of the Laplace transformation method in terms of distributed computing. The idea of using a two-level temporal mesh in order to achieve distributed computation along the temporal axis is introduced. Finally, the chapter ends with some conclusions.
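The numerical kernel of such a solver is the inversion of the Laplace transform along the time axis. The chapter's specific inversion scheme is not reproduced here; the Gaver-Stehfest algorithm below is one common, representative choice, verified against F(s) = 1/(s + 1), whose inverse is exp(-t).

```python
# Minimal sketch: Gaver-Stehfest numerical inversion of a Laplace
# transform, shown as a representative kernel for Laplace-transform
# based option-pricing solvers. Verified against a known pair.
import math

def stehfest_weights(N=12):
    """Stehfest coefficients V_k for even N (N=12 is a common choice)."""
    half = N // 2
    V = []
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, half) + 1):
            s += (j**half * math.factorial(2 * j)
                  / (math.factorial(half - j) * math.factorial(j)
                     * math.factorial(j - 1) * math.factorial(k - j)
                     * math.factorial(2 * j - k)))
        V.append((-1) ** (k + half) * s)
    return V

def invert_laplace(F, t, N=12):
    """f(t) ~ (ln 2 / t) * sum_k V_k * F(k * ln 2 / t)."""
    ln2_t = math.log(2.0) / t
    return ln2_t * sum(v * F(k * ln2_t)
                       for k, v in enumerate(stehfest_weights(N), start=1))

for t in (0.5, 1.0, 2.0):
    approx = invert_laplace(lambda s: 1.0 / (s + 1.0), t)
    print(f"t = {t}: stehfest = {approx:.6f}, exact = {math.exp(-t):.6f}")
```

Because each time point t is inverted independently, the evaluations parallelise naturally, which is what makes distributed computation along the temporal axis attractive.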

Relevance: 30.00%

Abstract:

Host-parasitoid models including integrated pest management (IPM) interventions with impulsive effects at both fixed and unfixed times were analyzed with regard to host-eradication, host-parasitoid persistence and host-outbreak solutions. The host-eradication periodic solution with fixed moments is globally stable if the host's intrinsic growth rate is less than the sum of the mean host-killing rate and the mean parasitization rate during the impulsive period. Solutions for all three categories can coexist, with switch-like transitions among their attractors, showing that the dosages and frequencies of insecticide applications and the numbers of parasitoids released are crucial. Periodic solutions also exist for models with unfixed moments, for which the maximum amplitude of the host is less than the economic threshold. The dosages and frequencies of IPM interventions for these solutions are much reduced in comparison with the pest-eradication periodic solution. Our results, which are robust to the inclusion of stochastic effects and hold for a wide range of parameter values, confirm that IPM is more effective than any single control tactic.
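A discrete-time sketch conveys the fixed-moment impulsive idea: a simple Nicholson-Bailey host-parasitoid map (used here only as a stand-in for the paper's continuous-time impulsive models) with a periodic pulse that kills a fraction of hosts and releases extra parasitoids. All parameter values are illustrative assumptions.

```python
# Minimal sketch: Nicholson-Bailey host-parasitoid map with fixed-moment
# IPM pulses (periodic host kill plus parasitoid release). A stand-in for
# the paper's impulsive models; all parameters are illustrative.
import math

r, a = 2.0, 0.05           # host growth rate, searching efficiency (assumed)
kill, release = 0.4, 5.0   # pulse: host kill fraction, parasitoids released
period = 3                 # apply the IPM pulse every 3 generations

H, P = 20.0, 5.0
for gen in range(1, 31):
    esc = math.exp(-a * P)         # fraction of hosts escaping parasitism
    H, P = r * H * esc, H * (1.0 - esc)
    if gen % period == 0:          # fixed-moment IPM intervention
        H *= 1.0 - kill
        P += release
    print(f"gen {gen:2d}: hosts = {H:8.2f}, parasitoids = {P:8.2f}")
```

Varying kill, release and period in this toy map lets one explore the qualitative point of the abstract: the dosage and frequency of interventions determine whether the host is suppressed or breaks out.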

Relevance: 30.00%

Abstract:

Advertising standardisation versus adaptation has been discussed in some detail in the marketing literature. Despite previous attempts, there is still no widely-used decision-making model available that has been accepted by marketing practitioners and academics. This paper examines the development of this important area by reviewing six prominent models in the advertising standardisation/adaptation literature. It shows why there has been a lack of development in the current literature and why it is crucial to address this problem. Important areas for future research are suggested in order to find a solution.