52 results for software component
Abstract:
The powerful general Pacala-Hassell host-parasitoid model for a patchy environment, which allows host density-dependent heterogeneity (HDD) to be distinguished from between-patch, host density-independent heterogeneity (HDI), is reformulated within the class of the generalized linear model (GLM) family. This improves accessibility through the provision of general software within well-known statistical systems, and allows a rich variety of models to be formulated. Covariates such as age class, host density and abiotic factors may be included easily. For the case where there is no HDI, the formulation is a simple GLM. When there is HDI in addition to HDD, the formulation is a hierarchical generalized linear model. Two forms of HDI model are considered, both with between-patch variability: one has binomial variation within patches and one has extra-binomial, overdispersed variation within patches. Examples are given demonstrating parameter estimation with standard errors, and hypothesis testing. For one example given, the extra-binomial component of the HDI heterogeneity in parasitism is itself shown to be strongly density dependent.
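The GLM formulation described above can be illustrated with a minimal sketch: a logit-link binomial GLM fitted by iteratively reweighted least squares (IRLS), regressing the number of hosts parasitized per patch on log host density. The data, coefficient values and function name below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def fit_binomial_glm(X, successes, trials, n_iter=25):
    """Fit a logit-link binomial GLM by Fisher scoring (IRLS)."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta
        p = 1.0 / (1.0 + np.exp(-eta))
        w = trials * p * (1.0 - p)                      # working weights
        z = eta + (successes - trials * p) / np.maximum(w, 1e-12)  # working response
        XtW = X.T * w
        beta = np.linalg.solve(XtW @ X, XtW @ z)        # weighted least squares step
    return beta

# Illustrative data: parasitism probability rises with log host density (HDD).
rng = np.random.default_rng(42)
hosts = rng.integers(5, 51, size=300)                   # hosts per patch
X = np.column_stack([np.ones(hosts.size), np.log(hosts)])
true_beta = np.array([-1.0, 0.5])
p_true = 1.0 / (1.0 + np.exp(-(X @ true_beta)))
parasitized = rng.binomial(hosts, p_true)               # parasitized hosts per patch
beta_hat = fit_binomial_glm(X, parasitized, hosts)
```

With HDI present in addition, the sketch above would gain a patch-level random effect, which is where the hierarchical generalized linear model of the paper comes in.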
Abstract:
Software technology that predicts stress in electronic systems and packages, developed as part of a TCS Programme, is described. The software is closely integrated within a thermal design tool, providing the ability to simulate the coupled effects of airflow, temperature and stress on product performance. This integrated approach to analysis will help decrease the number of design cycles.
Abstract:
When designing a new passenger ship or modifying an existing design, how do we ensure that the proposed design and crew emergency procedures are safe from an evacuation point of view? In the wake of major maritime disasters such as the Herald of Free Enterprise and the Estonia, and in light of the growth in the numbers of high-density, high-speed ferries and large-capacity cruise ships, issues concerned with the evacuation of passengers and crew at sea are receiving renewed interest. In the maritime industry, ship evacuation models offer the promise to quickly and efficiently bring evacuation considerations into the design phase, while the ship is "on the drawing board". maritimeEXODUS, winner of the BCS, CITIS and RINA awards, is such a model. Features such as the ability to realistically simulate human response to fire, the capability to model human performance in heeled orientations, a virtual reality environment that produces realistic visualisations of the modelled scenarios, and an integrated abandonment model make maritimeEXODUS a truly unique tool for assessing the evacuation capabilities of all types of vessels under a variety of conditions. This paper describes the maritimeEXODUS model, the SHEBA facility from which data concerning passenger/crew performance in conditions of heel is derived, and an example application demonstrating the model's use in performing an evacuation analysis for a large passenger ship, partially based on the requirements of MSC circular 1033.
Abstract:
The issues surrounding the collision of projectiles with structures have gained a high profile since the events of 11th September 2001. In such collision problems, the projectile penetrates the structure, so that tracking the interface between one material and another becomes very complex, especially if the projectile is essentially a vessel containing a fluid, e.g. a fuel load. The subsequent combustion, heat transfer, melting and re-solidification processes in the structure render this a very challenging computational modelling problem. The conventional approaches to the analysis of collision processes involve a Lagrangian-Lagrangian contact-driven methodology. This approach suffers from a number of disadvantages in its implementation, most of which are associated with the challenges of the contact analysis component of the calculations. This paper describes a 'two fluid' approach to high-speed impact between solid structures, where the objective is to overcome the problems of penetration and re-meshing. The work has been carried out using the finite volume, unstructured mesh multi-physics code PHYSICA+, where the three-dimensional fluid flow, free surface, heat transfer, combustion, melting and re-solidification algorithms are approximated using cell-centred finite volume, unstructured mesh techniques on a collocated mesh. The basic procedure is illustrated for two cases of Newtonian and non-Newtonian flow, to test several of its component capabilities in the analysis of problems of industrial interest.
Abstract:
Many code generation tools exist to aid developers in carrying out common mappings, such as from Object to XML or from Object to relational database. Such generated code tends to possess a tight binding between the object code and the target mapping, making integration into a broader application tedious or even impossible. In this paper we suggest that XML technologies and the multiple inheritance capabilities of interface-based languages such as Java offer a means to unify such executable specifications, thus building complete, consistent and useful object models declaratively, without sacrificing component flexibility.
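The idea of unifying several mappings over one declarative object model can be sketched in Python, with mixin classes standing in for Java interfaces with shared implementations: a single dataclass field declaration drives both an Object-to-XML and an Object-to-relational mapping. All class and method names here are hypothetical illustrations, not the paper's API.

```python
from dataclasses import dataclass, fields
from xml.etree.ElementTree import Element, tostring

class XmlMapped:
    """Object-to-XML mapping derived from the dataclass field declarations."""
    def to_xml(self):
        root = Element(type(self).__name__)
        for f in fields(self):
            child = Element(f.name)
            child.text = str(getattr(self, f.name))
            root.append(child)
        return tostring(root, encoding="unicode")

class SqlMapped:
    """Object-to-relational mapping derived from the same declarations."""
    def to_insert(self, table):
        names = [f.name for f in fields(self)]
        sql = "INSERT INTO {} ({}) VALUES ({})".format(
            table, ", ".join(names), ", ".join("?" for _ in names))
        return sql, tuple(getattr(self, n) for n in names)

@dataclass
class Customer(XmlMapped, SqlMapped):   # one declaration, two mappings
    id: int
    name: str
```

Here `Customer(1, "Ada").to_xml()` yields `<Customer><id>1</id><name>Ada</name></Customer>`, while `to_insert("customers")` yields a parameterized SQL statement from the same field list, so neither mapping is hard-bound into the object code.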
Abstract:
The two-stage assembly scheduling problem is a model for production processes that involve the assembly of final or intermediate products from basic components. In our model, there are m machines at the first stage that work in parallel, and each produces a component of a job. When all components of a job are ready, an assembly machine at the second stage completes the job by assembling the components. We study problems with the objective of minimizing the makespan, under two different types of batching that occur in some manufacturing environments. For one type, the time to process a batch on a machine is equal to the maximum of the processing times of its operations. For the other type, the batch processing time is defined as the sum of the processing times of its operations, and a setup time is required on a machine before each batch. For both models, we assume a batch availability policy, i.e., the completion times of the operations in a batch are defined to be equal to the batch completion time. We provide a fairly comprehensive complexity classification of the problems under the first type of batching, and we present a heuristic and its worst-case analysis under the second type of batching.
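A minimal sketch of the underlying two-stage model (without batching) may help fix ideas: for a fixed job sequence, each stage-1 machine processes its components in order, a job becomes ready when its last component completes, and the stage-2 assembly machine then processes jobs in sequence. The function name and the data below are illustrative, not from the paper.

```python
def assembly_makespan(components, assembly):
    """Makespan of a fixed job sequence in the two-stage assembly model.

    components[i][j]: processing time of job j's component on stage-1 machine i.
    assembly[j]:      assembly time of job j on the stage-2 machine.
    """
    n = len(assembly)
    ready = [0.0] * n                       # time job j's last component completes
    for machine in components:
        t = 0.0                             # stage-1 machines work in parallel
        for j in range(n):
            t += machine[j]
            ready[j] = max(ready[j], t)
    finish = 0.0
    for j in range(n):
        finish = max(finish, ready[j]) + assembly[j]   # wait for components, then assemble
    return finish
```

For instance, with `components = [[2, 1], [3, 2]]` and `assembly = [2, 1]`, job 0 is ready at time 3 and job 1 at time 5, so assembly completes at time 6. Batch availability would additionally replace each operation's completion time by its batch's completion time.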
Abstract:
The first phase in the design, development and implementation of a comprehensive computational model of a copper stockpile leach process is presented. The model accounts for transport phenomena through the stockpile, reaction kinetics for the important mineral species, oxygen and bacterial effects on the leach reactions, plus heat, energy and acid balances for the overall leach process. The paper describes the formulation of the leach process model and its implementation in PHYSICA+, a computational fluid dynamics (CFD) software environment. The model draws on a number of phenomena to represent the competing physical and chemical features active in the process model. The phenomena are essentially represented by a three-phase (solid-liquid-gas) multi-component transport system; novel algorithms and procedures are required to solve the model equations, including a methodology for dealing with multiple chemical species with different reaction rates in ore represented by multiple particle size fractions. Some initial validation results and application simulations are shown to illustrate the potential of the model.
Abstract:
Predicting the reliability of newly designed products, before manufacture, is obviously highly desirable for many organisations. Understanding the impact of various design variables on reliability allows companies to optimise expenditure and release a package in minimum time. Reliability predictions originated in the early years of the electronics industry. These predictions were based on historical field data which has evolved into industrial databases and specifications such as the famous MIL-HDBK-217 standard, plus numerous others. Unfortunately the accuracy of such techniques is highly questionable especially for newly designed packages. This paper discusses the use of modelling to predict the reliability of high density flip-chip and BGA components. A number of design parameters are investigated at the assembly stage, during testing, and in-service.
Abstract:
The electronics industry and the problems associated with the cooling of microelectronic equipment are developing rapidly. Thermal engineers now find it necessary to consider the complex area of equipment cooling at some level. This continually growing industry also faces heightened pressure from consumers to provide electronic product miniaturization, which in itself increases the demand for accurate thermal management predictions to assure product reliability. Computational fluid dynamics (CFD) is considered a powerful and almost essential tool for the design, development and optimization of engineering applications. CFD is now widely used within the electronics packaging design community to thermally characterize the performance of both the electronic component and system environment. This paper discusses CFD results for a large variety of investigated turbulence models. Comparison against experimental data illustrates the predictive accuracy of currently used models and highlights the growing demand for greater mathematical modelling accuracy with regards to thermal characterization. Also a newly formulated low Reynolds number (i.e. transitional) turbulence model is proposed with emphasis on hybrid techniques.
Abstract:
For sensitive optoelectronic components, traditional soldering techniques cannot be used because of their inherent sensitivity to thermal stresses. One such component is the Optoelectronic Butterfly Package, which houses a laser diode chip aligned to a fibre-optic cable. Even sub-micron misalignment of the fibre optic and laser diode chip can significantly reduce the performance of the device. The high cost of each unit requires that the number of components damaged during the laser soldering process is kept to a minimum. Mathematical modelling is undertaken to better understand the laser soldering process and to optimize operational parameters such as solder paste volume, copper pad dimensions, laser solder times for each joint, laser intensity and absorption coefficient. Validation of the model against experimental data will be completed, and will lead to an optimization of the assembly process through an iterative modelling cycle. This will ultimately reduce costs, improve the process development time and increase consistency in the laser soldering process.
Abstract:
The curing of conductive adhesives and underfills can save considerable time and offer cost benefits for the microsystems and electronics packaging industry. In contrast to conventional ovens, curing by microwave energy generates heat internally within each individual component of an assembly. The rate at which heat is generated is different for each of the components and depends on the material properties as well as the oven power and frequency. This leads to a very complex and transient thermal state, which is extremely difficult to measure experimentally. Conductive adhesives need to be raised to a minimum temperature to initiate the cross-linking of the resin polymers, whilst some advanced packaging materials currently under investigation impose a maximum temperature constraint to avoid damage. Thermal imagery equipment integrated with the microwave oven can offer some information on the thermal state but such data is based on the surface temperatures. This paper describes computational models that can simulate the internal temperatures within each component of an assembly including the critical region between the chip and substrate. The results obtained demonstrate that due to the small mass of adhesive used in the joints, the temperatures reached are highly dependent on the material properties of the adjacent chip and substrate.
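The kind of computation described can be sketched with a one-dimensional explicit finite-difference model of a chip/adhesive/substrate stack in which only the adhesive layer generates heat internally from the microwave field. All material values and dimensions below are illustrative placeholders, not data from the paper.

```python
import numpy as np

nx, dx, dt = 60, 1e-4, 1e-3        # 6 mm stack, 0.1 mm cells, 1 ms time step
alpha = np.full(nx, 1e-7)          # thermal diffusivity, m^2/s (placeholder value)
q = np.zeros(nx)
q[25:35] = 50.0                    # adhesive layer: internal microwave heating, K/s
T = np.full(nx, 20.0)              # ambient start, deg C; ends held at ambient

assert alpha.max() * dt / dx**2 <= 0.5   # explicit-scheme stability condition
for _ in range(2000):              # simulate 2 s of heating
    lap = np.zeros(nx)             # second spatial derivative (interior cells only)
    lap[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
    T = T + dt * (alpha * lap + q)
```

Even this crude sketch reproduces the paper's qualitative point: the peak temperature sits inside the small adhesive region, and its value depends on how fast the adjacent chip and substrate conduct heat away, which surface thermal imagery alone cannot reveal.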
Abstract:
The aim of integrating computational mechanics (FEA and CFD) and optimization tools is to dramatically speed up the design process in different application areas concerning reliability in electronic packaging. Design engineers in the electronics manufacturing sector may use these tools to predict key design parameters and configurations (e.g. material properties, product dimensions, design at PCB level, etc.) that will guarantee the required product performance. In this paper a modeling strategy coupling computational mechanics techniques with numerical optimization is presented and demonstrated with two problems. The integrated modeling framework is obtained by coupling the multi-physics analysis tool PHYSICA with the numerical optimization package VisualDOC into a fully automated design tool for applications in electronic packaging. Thermo-mechanical simulations of solder creep deformations are presented to predict flip-chip reliability and lifetime under thermal cycling. Also a thermal management design based on multi-physics analysis with coupled thermal-flow-stress modeling is discussed. The Response Surface Modeling approach, in conjunction with Design of Experiments statistical tools, is demonstrated and subsequently used by the numerical optimization techniques as part of this modeling framework. Predictions for reliable electronic assemblies are achieved in an efficient and systematic manner.
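The Response Surface Modeling step can be sketched as follows: responses sampled at a full-factorial Design of Experiments are fitted with a quadratic surface by least squares, and the cheap fitted surrogate, rather than the expensive simulation, is then minimized over the design region. The `simulate` function below is a hypothetical stand-in for a PHYSICA run, and all values are illustrative.

```python
import numpy as np

def quad_features(x):
    """Quadratic response-surface basis for two design variables."""
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones(len(x)), x1, x2, x1**2, x2**2, x1 * x2])

def simulate(x):
    """Hypothetical stand-in for a multi-physics run, e.g. solder creep damage."""
    return (x[:, 0] - 0.3) ** 2 + 2.0 * (x[:, 1] + 0.2) ** 2 + 1.0

# Full-factorial Design of Experiments over the normalized design region.
levels = np.linspace(-1.0, 1.0, 5)
doe = np.array([(a, b) for a in levels for b in levels])
responses = simulate(doe)

# Fit the response surface by least squares.
coef, *_ = np.linalg.lstsq(quad_features(doe), responses, rcond=None)

# Minimize the fitted surrogate instead of re-running the simulation.
grid = np.linspace(-1.0, 1.0, 201)
candidates = np.array([(a, b) for a in grid for b in grid])
best = candidates[np.argmin(quad_features(candidates) @ coef)]
```

In a production framework the surrogate would typically be minimized with a gradient-based optimizer rather than a grid search, and each DoE point would be a full coupled thermal-flow-stress run.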
Abstract:
In the flip-chip assembly process, no-flow underfill materials have a particular advantage over traditional underfills, as the application and curing of this type of underfill can be undertaken before and during the reflow process, supporting high-volume throughput. Adopting a no-flow underfill process may result in underfill entrapment between solder and fluid, voiding in the underfill, and possible delamination between the underfill and surrounding surfaces. The magnitude of these phenomena may adversely affect the reliability of the assembly in terms of solder joint thermal fatigue. This paper presents both an experimental and a modeling analysis investigating the reliability of a flip-chip component and how the magnitude of underfill entrapment may affect thermal-mechanical fatigue life.
Abstract:
The domain decomposition method is applied to electronic packaging simulation in this article. The objective is to address the entire simulation process chain, alleviating user interactions where they are heavy and amenable to mechanization, using a component approach to streamline the model simulation process.