Abstract:
A physically open, but electrically shielded, microwave open oven can be produced by virtue of the evanescent fields in a waveguide below cutoff. The below-cutoff heating chamber is fed by a transverse magnetic resonance established in a dielectric-filled section of the waveguide, exploiting continuity of normal electric flux. In order to optimize the fields and the performance of the oven, a thin layer of a dielectric material with higher permittivity is inserted at the interface. Analysis and synthesis of an optimized open oven predicts field enhancement in the heating chamber of up to 9.4 dB. Results from experimental testing on two fabricated prototypes are in agreement with the simulated predictions, and demonstrate up to a tenfold improvement in heating performance. The open-ended oven allows for simultaneous precision alignment, testing, and efficient curing of microelectronic devices, yielding significant productivity gains.
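For context, the behaviour exploited here is the standard exponential decay of fields in a waveguide driven below its cutoff frequency; the following is a textbook relation in generic symbols, not the specific mode or dimensions of the prototypes:

\[ E(z) \propto e^{-\alpha z}, \qquad \alpha = \sqrt{k_c^{2} - k^{2}} = \frac{2\pi}{\lambda}\sqrt{\left(\frac{f_c}{f}\right)^{2} - 1}, \qquad f < f_c , \]

with \( f_c = c/(2a) \) for the TE\(_{10}\) mode of a rectangular guide of width \( a \). Leakage from the open aperture is therefore attenuated exponentially even though the oven is physically open.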
Abstract:
Numerical modelling technology and software are now being used to underwrite the design of many microelectronic and microsystems components. The demands for greater capability of these analysis tools are increasing dramatically, as the user community is faced with the challenge of producing reliable products in ever shorter lead times. This leads to the requirement for analysis tools to represent the interactions amongst distinct phenomena and physics at multiple length and time scales. Multi-physics and multi-scale technology is now becoming a reality with many code vendors. This chapter discusses the current status of modelling tools that assess the impact of nano-technology on the fabrication/packaging and testing of microsystems. The chapter is broken down into sections covering Modelling Technologies, Modelling Applied to Fabrication, Modelling Applied to Assembly/Packaging, and Modelling Applied to Test and Metrology.
Abstract:
A numerical modelling method for the analysis of solder joint damage and crack propagation is described in this paper. The method is based on the disturbed state concept. Under cyclic thermal-mechanical loading conditions, the level of damage that occurs in solder joints is assumed to be a simple monotonic scalar function of the accumulated equivalent plastic strain. The increase of damage leads to crack initiation and propagation. By tracking the evolution of the damage level in solder joints, the crack propagation path and rate can be simulated using the Finite Element Analysis (FEA) method. The discussion focuses on issues in the implementation of the method: techniques for speeding up the simulation and mesh-dependency issues are analysed. As an example of the application of this method, crack propagation in solder joints of power electronics modules under cyclic thermal-mechanical loading conditions has been analysed, and the predicted cracked area size after 3000 loading cycles is consistent with experimental results.
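As a rough illustration of the damage-tracking idea described above (hypothetical parameter values and element count; not the authors' FEA implementation), the sketch below accumulates a scalar damage variable per element from the equivalent plastic strain increment of each thermal cycle and reports the fraction of the joint treated as cracked:

import numpy as np

def update_damage(damage, delta_eps_p, c=0.02):
    # Damage grows monotonically with accumulated equivalent plastic strain.
    return np.clip(damage + c * delta_eps_p, 0.0, 1.0)

def cracked_area_fraction(damage, threshold=0.95):
    # Elements whose damage exceeds the threshold are treated as cracked.
    return float(np.mean(damage >= threshold))

# Example: a synthetic plastic strain increment per thermal cycle, 3000 cycles,
# 100 elements along the joint (all values illustrative only).
damage = np.zeros(100)
for cycle in range(3000):
    delta_eps_p = np.random.uniform(0.0005, 0.002, size=damage.shape)
    damage = update_damage(damage, delta_eps_p)
print("cracked area fraction:", cracked_area_fraction(damage))

In the actual method the strain increments come from the finite element solution at each loading cycle rather than from a random draw.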
Abstract:
In this paper, computer modelling techniques are used to analyse the effects of globtops on the reliability of aluminium wirebonds in power electronics modules under cyclic thermal-mechanical loading conditions. The sensitivity of the wirebond reliability to changes in the geometric and material property parameters of the wirebond globtop is evaluated, and the optimal combination of the Young's modulus and the coefficient of thermal expansion has been predicted.
Abstract:
This paper describes a methodology for deploying flexible dynamic configuration into embedded systems whilst preserving the reliability advantages of static systems. The methodology is based on the concept of decision points (DP), which are strategically placed to achieve fine-grained distribution of self-management logic to meet application-specific requirements. DP logic can be changed easily, and independently of the host component, enabling self-management behaviour to be deferred beyond the point of system deployment. A transparent Dynamic Wrapper mechanism (DW) automatically detects and handles problems arising from the evaluation of self-management logic within each DP, and guarantees that the dynamic aspects of the system collapse down to statically defined default behaviour, preserving safety and correctness despite failures. Dynamic context management contributes to flexibility and removes the need for design-time binding of context providers and consumers, thus facilitating run-time composition and incremental component upgrade.
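A minimal sketch of how a decision point guarded by a dynamic wrapper might look (class and method names are illustrative, not the paper's API): dynamic logic can be installed or replaced after deployment, and any failure during its evaluation collapses to the statically defined default behaviour.

class DecisionPoint:
    def __init__(self, static_default):
        self.static_default = static_default   # statically defined safe behaviour
        self.dynamic_logic = None              # may be replaced after deployment

    def set_logic(self, fn):
        self.dynamic_logic = fn

    def decide(self, context):
        if self.dynamic_logic is None:
            return self.static_default(context)
        try:
            return self.dynamic_logic(context)   # wrapper guards the evaluation
        except Exception:
            return self.static_default(context)  # collapse to static behaviour

# Usage: swap a routing policy at run time without redeploying the host component.
dp = DecisionPoint(static_default=lambda ctx: "store-and-forward")
dp.set_logic(lambda ctx: "low-power" if ctx["battery"] < 0.2 else "normal")
print(dp.decide({"battery": 0.1}))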
Abstract:
This paper describes ways in which emergence engineering principles can be applied to the development of distributed applications. A distributed solution to the graph-colouring problem is used as a vehicle to illustrate some novel techniques. Each node acts autonomously to colour itself based only on its local view of its neighbourhood, following a simple set of carefully tuned rules. Randomness breaks symmetry and thus enhances stability. The algorithm has been developed to enable self-configuration in wireless sensor networks, and to reflect real-world configurations it operates with three-dimensional topologies (reflecting the propagation of radio waves and the placement of sensors in buildings, bridge structures, etc.). The algorithm's performance is evaluated and results are presented. It is shown to be simultaneously highly stable and scalable whilst achieving low convergence times. The use of eavesdropping gives rise to low interaction complexity and high efficiency in terms of communication overheads.
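The following toy sketch illustrates the flavour of such a local rule (it is not the authors' tuned rule set): each node repeatedly re-colours itself using only the colours visible in its neighbourhood, with randomised update order to break symmetry.

import random

def local_colouring(neighbours, colours_available=4, rounds=50, seed=1):
    random.seed(seed)
    colour = {n: random.randrange(colours_available) for n in neighbours}
    for _ in range(rounds):
        nodes = list(neighbours)
        random.shuffle(nodes)                    # randomness breaks symmetry
        for n in nodes:
            taken = {colour[m] for m in neighbours[n]}   # local view only
            if colour[n] in taken:
                free = [c for c in range(colours_available) if c not in taken]
                if free:
                    colour[n] = random.choice(free)
    return colour

# A small graph expressed as an adjacency dict (stand-in for a sensor topology).
g = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2]}
print(local_colouring(g))

A real deployment would run this per node with colours learned by eavesdropping on neighbours' transmissions rather than via a shared dictionary.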
Abstract:
This paper presents an investigation into applying Case-Based Reasoning (CBR) to multiple heterogeneous case bases using agents. The adaptive CBR process and the architecture of the system are presented, and a case study is used to illustrate and evaluate the approach. The process of creating and maintaining the dynamic data structures is discussed. The similarity metrics employed by the system support the optimisation of the collaboration between the agents, which is based on a blackboard architecture. The blackboard architecture is shown to support efficient collaboration between the agents towards an overall CBR solution, while case-based reasoning methods allow the system to adapt and “learn” new collaborative strategies for achieving the aims of the overall CBR problem-solving process.
Abstract:
An aerodynamic sound source extraction from a general flow field is applied to a number of model problems and to a problem of engineering interest. The extraction technique is based on a variable decomposition, which leads to an acoustic correction method, of each of the flow variables into a dominant flow component and a perturbation component. The dominant flow component is obtained with a general-purpose Computational Fluid Dynamics (CFD) code which uses a cell-centred finite volume method to solve the Reynolds-averaged Navier–Stokes equations. The perturbations are calculated from a set of acoustic perturbation equations with source terms extracted from the unsteady CFD solution at each time step via a staggered dispersion-relation-preserving (DRP) finite-difference scheme. Numerical experiments include (1) propagation of a 1-D acoustic pulse without mean flow, (2) propagation of a 2-D acoustic pulse with/without mean flow, (3) reflection of an acoustic pulse from a flat plate with mean flow, and (4) flow-induced noise generated by an unsteady laminar flow past a 2-D cavity. The computational results demonstrate the accuracy of the source extraction technique for the model problems and illustrate its feasibility for more complex aeroacoustic problems.
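Schematically, and using generic symbols rather than the paper's notation, each flow variable q is split as

\[ q(\mathbf{x},t) = \bar{q}(\mathbf{x},t) + q'(\mathbf{x},t), \]

where the dominant component \( \bar{q} \) is taken from the unsteady CFD solution and the perturbation \( q' \) satisfies the acoustic perturbation equations with source terms \( S(\bar{q}) \) evaluated from the CFD field at each time step.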
Abstract:
Image inpainting refers to restoring a damaged image with missing information. The total variation (TV) inpainting model is one such method that simultaneously fills in the missing regions with available information from their surroundings and eliminates noise. The method works well with small, narrow inpainting domains. However, there remains an urgent need to develop fast iterative solvers, as the underlying problem sizes are large. In addition, one needs to tackle the imbalance between inpainting and denoising: when the inpainting regions are thick and large, the inpainting procedure converges slowly, usually requires a significant number of iterations, and inevitably leads to oversmoothing outside the inpainting domain. To overcome these difficulties, we propose a solution of the TV inpainting model based on a nonlinear multigrid algorithm.
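For reference, a commonly used form of the TV inpainting model (the standard formulation, not necessarily the exact functional used in the paper) seeks, for an observed image f on a domain Ω with inpainting region D ⊂ Ω,

\[ \min_{u} \; \int_{\Omega} |\nabla u| \, d\mathbf{x} \; + \; \frac{\lambda}{2} \int_{\Omega \setminus D} (u - f)^{2} \, d\mathbf{x} , \]

so the fidelity term acts only outside D while the TV term propagates edges into the missing region; the nonlinear multigrid solver targets the large nonlinear systems arising from discretising this functional.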
Abstract:
This paper takes forward the discussion on the development of a Framework for e-Learning. It briefly describes how the discussion has progressed from the suggested development of a Framework and the findings of a study investigating the use of Blended Learning, to the application of PESTE factors from Sociology and the proposal of new PESTE factors for educational software and e-Learning. It asks whether the current use of Computer-Mediated Communication (CMC), through the provision of direct, front-line service applications, is leading to the deskilling of professions, and considers the implications for e-Learning.
Abstract:
This paper briefly describes the methodologies employed in the collection and storage of first-hand accounts of evacuation experiences derived from face-to-face interviews with evacuees from the World Trade Center (WTC) Twin Towers complex on 11 September 2001 and the development of the High-rise Evacuation Evaluation Database (HEED). The main focus of the paper is to present an overview of preliminary analysis of data derived from the evacuation of the North Tower.
Abstract:
The use by students of an e-learning system that enhances traditional learning is examined in a large university computing school where there are clear assessment deadlines and severe penalties for late submission of coursework, in order to assess the impact of changes to the deadline model on the way students use the system and on the results they achieve. It is demonstrated that the grade a student achieves is partly dependent on how far before the deadline the work is completed: in general, students who submit earlier gain higher grades. Possible reasons for this are explored. Analysis of the data from a range of different implementations of deadline policies is presented, and suggestions are made on how to minimise any possible negative impact of the assessment policy on the student's overall learning.
Abstract:
This short position paper considers issues in developing a Data Architecture for the Internet of Things (IoT) through the medium of an exemplar project, Domain Expertise Capture in Authoring and Development Environments (DECADE). A brief discussion sets the background for IoT and the development of the distinction between things and computers. The paper makes a strong argument to avoid reinventing the wheel, and instead to reuse approaches to distributed heterogeneous data architectures, together with the lessons learned from that work, and apply them to this situation. DECADE requires an autonomous recording system, local data storage, a semi-autonomous verification model, a sign-off mechanism, and qualitative and quantitative analysis carried out when and where required through a web-service architecture based on ontology and analytic agents, with a self-maintaining ontology model. To develop this, we describe a web-service architecture combining a distributed data warehouse, web services for analysis agents, ontology agents and a verification engine, with a centrally verified outcome database maintained by a certifying body for qualification/professional status.
Abstract:
An analysis of the generic attacks on, and countermeasures for, block-cipher-based message authentication code (MAC) algorithms in sensor applications is undertaken; the conclusions are used in the design of two new MAC constructs, Quicker Block Chaining MAC1 (QBC-MAC1) and Quicker Block Chaining MAC2 (QBC-MAC2). Using software simulation we show that our new constructs reduce CPU instruction clock-cycle usage and energy requirements when benchmarked against the de facto Cipher Block Chaining MAC (CBC-MAC) based construct used in the TinySec security protocol for wireless sensor networks.
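For reference, a sketch of the baseline CBC-MAC construction against which the new constructs are benchmarked (simplified zero-padding, AES-128 standing in for the block cipher; this is not QBC-MAC1/QBC-MAC2, whose definitions appear in the paper itself):

import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def cbc_mac(key: bytes, message: bytes, block: int = 16) -> bytes:
    # Zero-pad to a whole number of blocks (simplified; real deployments must
    # handle padding and message-length attacks explicitly).
    if len(message) % block:
        message += b"\x00" * (block - len(message) % block)
    state = b"\x00" * block                       # zero IV
    for i in range(0, len(message), block):
        blk = bytes(a ^ b for a, b in zip(state, message[i:i + block]))
        enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
        state = enc.update(blk) + enc.finalize()  # chain each block through the cipher
    return state                                  # the final block is the tag

key = os.urandom(16)
print(cbc_mac(key, b"sensor reading: 21.5C").hex())

The well-known weakness of plain CBC-MAC for variable-length messages is part of what motivates the generic-attack analysis summarised above.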