879 results for fidelity encouragement
Abstract:
This paper discusses a reliability-based optimisation modelling approach demonstrated for the design of a SiP structure integrated by stacking dies one upon the other. In this investigation the focus is on the strategy for handling the uncertainties in the package design inputs and their implementation into the design optimisation modelling framework. The analysis of the thermo-mechanical behaviour of the package is utilised to predict the fatigue lifetime of the lead-free board-level solder interconnects and the warpage of the package under thermal cycling. The SiP characterisation is obtained through the exploitation of Reduced Order Models (ROM) constructed using high-fidelity analysis and Design of Experiments (DoE) methods. The design task is to identify the optimal SiP design specification by varying several package input parameters so that a specified target reliability of the solder joints is achieved and, at the same time, the design requirements and package performance criteria are met.
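As a minimal sketch of the ROM-plus-DoE idea described above (illustrative only: the solver stand-in, the two design variables and the quadratic response surface below are hypothetical, not the paper's SiP model), a surrogate can be fitted to a small set of high-fidelity samples and then evaluated cheaply inside the optimisation loop:

    # Illustrative only: a quadratic response-surface ROM fitted over a
    # full-factorial DoE of a stand-in "high-fidelity" fatigue-life function.
    import numpy as np

    def high_fidelity_fatigue_life(x):
        # Hypothetical placeholder for an FE thermo-mechanical analysis;
        # x = (die thickness, solder joint height) in normalised units.
        t, h = x
        return 3000.0 + 800.0 * h - 400.0 * t ** 2 + 150.0 * t * h

    levels = np.linspace(-1.0, 1.0, 5)                 # 5-level full-factorial DoE
    doe = np.array([(t, h) for t in levels for h in levels])
    responses = np.array([high_fidelity_fatigue_life(x) for x in doe])

    def basis(x):                                      # quadratic basis terms
        t, h = x
        return [1.0, t, h, t * t, h * h, t * h]

    A = np.array([basis(x) for x in doe])
    coef, *_ = np.linalg.lstsq(A, responses, rcond=None)

    def rom_fatigue_life(x):                           # cheap surrogate evaluation
        return float(np.dot(basis(x), coef))

    print(rom_fatigue_life((0.2, -0.5)))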
Abstract:
In this paper, a method for integrating several numerical analysis techniques used in microsystems design and failure analysis is presented. The techniques are categorized into four groups: high-fidelity analysis tools, i.e. the finite element (FE) method; fast analysis tools, referring to reduced order modeling (ROM); optimization tools; and probability-based analysis tools. The characteristics of these four groups are examined, the interactions between them are discussed, and a methodology for coupling them is proposed. This methodology consists of three stages, namely reduced order modeling, deterministic optimization and probabilistic optimization. Using this methodology, a case study on the optimization of a solder joint is conducted. It is shown that these analysis techniques are mutually complementary: applying them in combination exploits the advantages of each and satisfies the various design requirements. The case study shows that the coupling of the different tools proposed in this paper is effective and efficient, and highly relevant to the design and reliability analysis of microsystems.
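A minimal sketch of the three-stage coupling described above, under stated assumptions (the surrogate, the bounds, the input scatter and the target life below are all hypothetical; scipy is assumed available):

    # Stage 1: reduced order modeling - assume a ROM for solder-joint fatigue
    # life has already been fitted from high-fidelity DoE samples.
    import numpy as np
    from scipy.optimize import minimize

    def rom_life(x):                       # hypothetical surrogate
        t, h = x
        return 3000.0 + 800.0 * h - 400.0 * t ** 2

    # Stage 2: deterministic optimization on the cheap ROM instead of FE calls.
    res = minimize(lambda x: -rom_life(x), x0=[0.0, 0.0],
                   bounds=[(-1.0, 1.0), (-1.0, 1.0)])
    x_det = res.x

    # Stage 3: probabilistic optimization step (here reduced to a Monte Carlo
    # check): propagate input scatter through the ROM and estimate the
    # probability of meeting a target life at the deterministic optimum.
    rng = np.random.default_rng(0)
    samples = x_det + rng.normal(scale=0.05, size=(10_000, 2))
    lives = np.array([rom_life(x) for x in samples])
    print("P(life >= 3500 cycles) =", float(np.mean(lives >= 3500.0)))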
Abstract:
The latest advances in multi-physics modelling, using both high-fidelity techniques and reduced order and behavioural models, will be discussed. Particular focus will be given to the application and validation of these techniques for modelling the fabrication, packaging and subsequent reliability of microsystems-based components. The paper will discuss results from a number of research projects, with particular emphasis on the techniques being developed in a major UK Government funded project, 3D-MINTEGRATION (www.3d-mintegration.com).
Abstract:
This paper describes a framework being developed for the prediction and analysis of electronics power module reliability, both for qualification testing and for in-service lifetime prediction. A physics of failure (PoF) reliability methodology using multi-physics high-fidelity and reduced order computer modelling, together with numerical optimization techniques, is integrated into a dedicated computer modelling environment to meet the needs of power module designers and manufacturers, as well as end-users, for both design and maintenance purposes. An example of lifetime prediction for a power module solder interconnect structure is described, as is the lifetime prediction of a power module for a railway traction control application. The paper also discusses a combined physics of failure and data trending prognostic methodology for the health monitoring of power modules.
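For background only (the abstract does not say which damage model the framework adopts), PoF lifetime estimates for solder interconnects are commonly based on strain-range fatigue laws of the Coffin-Manson type:

    N_f = \frac{1}{2}\left(\frac{\Delta\gamma}{2\,\varepsilon_f'}\right)^{1/c}

where N_f is the number of cycles to failure, \Delta\gamma the cyclic shear-strain range obtained from the thermo-mechanical analysis, \varepsilon_f' the fatigue ductility coefficient and c the (negative) fatigue ductility exponent.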
Abstract:
Purpose – Are women held back or holding back? Do women choose their jobs/careers or are they structurally or normatively constrained? The purpose of this paper is to shed fresh light on these questions and contribute to an on-going debate that has essentially focused on the extent to which part-time work is women’s choice, the role of structural and organisational constraints and the role of men in excluding women. Design/methodology/approach – The paper uses data from interviews with 80 working women – both full-time and part-time – performing diverse work roles in a range of organisations in the south east of England. Findings – It was found that many women do not make strategic job choices; rather, they often “fall into” jobs that happen to be available to them. Some would not have aspired to their present jobs without male encouragement; many report incidents of male exclusion; and virtually all either know or suspect that they are paid less than comparable men. Those working reduced hours enjoy that facility, yet they are aware that reduced hours and senior roles are seen as incompatible. In short, they recognise both the positive and negative aspects of their jobs, whether they work full or part-time, whether they work in male-dominated or female-dominated occupations, and whatever their position in the organisational hierarchy. Accordingly, the paper argues that the concept of “satisficing”, i.e. a decision which is good enough but not optimal, is a more appropriate way to view women’s working lives than are either choice or constraint theories. Originality/value – There is an ongoing, and often polarised, debate between those who maintain that women choose whether to give preference to work or home/family and others who maintain that women, far from being self-determining actors, are constrained structurally and normatively. Rather than supporting these choice or constraint theories, this paper argues that “satisficing” is a more appropriate and nuanced concept to explain women’s working lives.
Abstract:
Relative Evidential Supports (RES) was developed and justified several years ago as a non-numeric apparatus that allows us to compare evidential supports for alternative conclusions when making a decision. An extension of the RES concept of pairwise balancing and trading-off of evidence, called Graded Relative Evidence (GRE), is reported here. It keeps the basic RES features of simplicity and perspicacity but enriches its modelling fidelity by permitting very modest and intuitive variations in degrees of outweighing (which the essentially binary RES does not). The formal justification is based simply on linkages to RES and to the Dempster-Shafer theory of evidence. The use of the extension is illustrated, and to a small degree further justified empirically, by application to a topical scientific debate referred to here as the Congo Crossover Conjecture. This decision-making instance is chosen because of the wealth of evidence that has been accumulated on both sides of the debate and the range of evidence strengths manifested in it. The conjecture is that AIDS emerged in the late 1950s in the Congo, when a vaccine for polio was allegedly cultivated in the kidneys of chimpanzees, which allowed the infection to cross over to humans from primates. © 2005 Springer.
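As background to the Dempster-Shafer linkage mentioned above (the RES/GRE formalism itself is not reproduced here), two basic probability assignments m_1 and m_2 over a frame of discernment are combined by Dempster's rule:

    (m_1 \oplus m_2)(A) = \frac{1}{1-K}\sum_{B \cap C = A} m_1(B)\,m_2(C), \quad A \neq \emptyset,
    \qquad K = \sum_{B \cap C = \emptyset} m_1(B)\,m_2(C),

where K measures the conflict between the two bodies of evidence.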
Abstract:
An entangled two-mode coherent state is studied within the framework of a 2 × 2-dimensional Hilbert space. An entanglement concentration scheme based on joint Bell-state measurements is worked out. When the entangled coherent state is embedded in a vacuum environment, its entanglement is degraded but not totally lost. It is found that the larger the initial coherent amplitude, the faster the entanglement decreases. We investigate a scheme to teleport a coherent superposition state through a mixed quantum channel and find that the decohered entangled coherent state may become useless for quantum teleportation, as its optimal teleportation fidelity can fall below the classical limit of 2/3.
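For orientation (the precise state and phase convention used in the paper may differ), a commonly studied entangled two-mode coherent state spanning an effective 2 × 2 Hilbert space is

    |\Psi\rangle = N\left(|\alpha\rangle|\alpha\rangle - |-\alpha\rangle|-\alpha\rangle\right),
    \qquad N = \left[2\left(1 - e^{-4|\alpha|^2}\right)\right]^{-1/2},

and the 2/3 benchmark quoted above is the optimal average fidelity achievable for an unknown qubit by purely classical (measure-and-prepare) strategies.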
Abstract:
We propose an optimal strategy for continuous-variable teleportation in a realistic situation. We show that the typical imperfect quantum operation can be described as a combination of an asymmetrically decohered quantum channel and perfect apparatuses for other operations. For the asymmetrically decohered quantum channel, we find some counterintuitive results: teleportation does not necessarily get better as the channel is initially squeezed more. We show that decoherence-assisted measurement and transformation may enhance fidelity for an asymmetrically mixed quantum channel.
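As a point of reference (this is the symmetric, unit-gain benchmark rather than the asymmetric channel analysed in the paper), teleporting coherent states through a two-mode squeezed vacuum channel with squeezing parameter r yields an average fidelity

    F = \frac{1}{1 + e^{-2r}},

which exceeds the classical limit of 1/2 for any r > 0 and approaches 1 only in the limit of infinite squeezing.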
Abstract:
Measures of entanglement, fidelity, and purity are basic yardsticks in quantum-information processing. We propose how to implement these measures using linear devices and homodyne detectors for continuous-variable Gaussian states. In particular, the test of entanglement becomes simple with some prior knowledge that is relevant to current experiments.
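As one concrete example of such a yardstick (stated here in the convention where the vacuum covariance matrix is (1/2)I; the paper's conventions may differ), the purity of an N-mode Gaussian state is fixed entirely by its covariance matrix \sigma:

    \mu = \mathrm{Tr}\,\rho^2 = \frac{1}{2^N \sqrt{\det \sigma}},

so estimating the second moments by homodyne detection suffices to determine the purity without full state tomography.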
Abstract:
It is shown that a linear superposition of two macroscopically distinguishable optical coherent states can be generated using a single-photon source and simple all-optical operations. Weak squeezing of a single photon, beam mixing with an auxiliary coherent state, and photon detection with imperfect threshold detectors are enough to generate a coherent-state superposition in a freely propagating optical field with a large coherent amplitude (alpha > 2) and high fidelity (F > 0.99). In contrast to all previous schemes for generating such a state, our scheme requires neither photon-number-resolving measurements nor Kerr-type nonlinear interactions. Furthermore, it is robust to detection inefficiency and exhibits some resilience to photon production inefficiency.
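For reference (the exact target state and phase in the paper may differ), the even and odd coherent-state superpositions and the fidelity criterion quoted above take the form

    |\mathrm{cat}_\pm\rangle = \left[2\left(1 \pm e^{-2|\alpha|^2}\right)\right]^{-1/2}\left(|\alpha\rangle \pm |-\alpha\rangle\right),
    \qquad F = \langle \mathrm{cat}_\pm|\,\rho_{\mathrm{out}}\,|\mathrm{cat}_\pm\rangle,

so F > 0.99 means the generated field mode is almost indistinguishable from the ideal superposition.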
Abstract:
High-fidelity quantum computation and quantum state transfer are possible in short spin chains. We exploit a system based on a dispersive qubit-boson interaction to mimic XY coupling. In this model, the usually assumed nearest-neighbor coupling is no longer valid: all the qubits are mutually coupled. We analyze the performance of our model for quantum state transfer, showing how preengineered coupling rates allow for nearly optimal state transfer. We address a setup of superconducting qubits coupled to a microstrip cavity in which our analysis may be applied.
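For comparison (this is the textbook nearest-neighbor benchmark, not the all-to-all coupled model of the paper), perfect state transfer in an N-qubit XY chain with only nearest-neighbor couplings,

    H = \sum_{i=1}^{N-1} \frac{J_i}{2}\left(X_i X_{i+1} + Y_i Y_{i+1}\right),

is achieved with the pre-engineered rates J_i \propto \sqrt{i(N-i)}; the model above instead couples all qubits mutually, for which the optimal rates differ.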
Abstract:
We formulate a conclusive teleportation protocol for a system in a d-dimensional Hilbert space utilizing the positive operator-valued measurement. The conclusive teleportation protocol ensures some perfect teleportation events when the channel is only partially entangled, at the expense of lowering the overall average fidelity. We discuss how much information remains in the inconclusive parts of the teleportation.
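To make the setting concrete (notation assumed here, not taken from the paper), the partially entangled channel can be written in Schmidt form as

    |\chi\rangle = \sum_{i=0}^{d-1} c_i\,|i\rangle|i\rangle, \qquad \sum_{i=0}^{d-1} |c_i|^2 = 1,

with the channel maximally entangled only when all c_i = 1/\sqrt{d}; the conclusive events are those measurement outcomes after which the receiver can recover the input state perfectly, while the remaining, inconclusive outcomes drag down the average fidelity.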
Abstract:
This review focuses on the monophyletic group of animal RNA viruses united in the order Nidovirales. The order includes the distantly related coronaviruses, toroviruses, and roniviruses, which possess the largest known RNA genomes (from 26 to 32 kb) and will therefore be called ‘large’ nidoviruses in this review. They are compared with their arterivirus cousins, which also belong to the Nidovirales despite having a much smaller genome (13–16 kb). Common and unique features that have been identified for either large or all nidoviruses are outlined. These include the nidovirus genetic plan and genome diversity, the composition of the replicase machinery and virus particles, virus-specific accessory genes, the mechanisms of RNA and protein synthesis, and the origin and evolution of nidoviruses with small and large genomes. Nidoviruses employ single-stranded, polycistronic RNA genomes of positive polarity that direct the synthesis of the subunits of the replicative complex, including the RNA-dependent RNA polymerase and helicase. Replicase gene expression is under the principal control of a ribosomal frameshifting signal and a chymotrypsin-like protease, which is assisted by one or more papain-like proteases. A nested set of subgenomic RNAs is synthesized to express the 3'-proximal ORFs that encode most conserved structural proteins and, in some large nidoviruses, also diverse accessory proteins that may promote virus adaptation to specific hosts. The replicase machinery includes a set of RNA-processing enzymes some of which are unique for either all or large nidoviruses. The acquisition of these enzymes may have improved the low fidelity of RNA replication to allow genome expansion and give rise to the ancestors of small and, subsequently, large nidoviruses.
Abstract:
Surrogate-based optimization methods provide a means to achieve high-fidelity design optimization at reduced computational cost by using a high-fidelity model in combination with lower-fidelity models that are less expensive to evaluate. This paper presents a provably convergent trust-region model-management methodology for variable-parameterization design models: that is, models for which the design parameters are defined over different spaces. Corrected space mapping is introduced as a method to map between the variable-parameterization design spaces. It is then used with a sequential-quadratic-programming-like trust-region method for two aerospace-related design optimization problems. Results for a wing design problem and a flapping-flight problem show that the method outperforms direct optimization in the high-fidelity space. On the wing design problem, the new method achieves 76% savings in high-fidelity function calls. On a bat-flight design problem, it achieves approximately 45% time savings, although it converges to a different local minimum than the benchmark does.
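A schematic of the trust-region model-management idea (illustrative only: it uses a simple additive correction between two analytic stand-in models sharing one parameterization, whereas the paper's corrected space mapping links models with different parameterizations):

    # Trust-region loop with an additively corrected low-fidelity model.
    import numpy as np
    from scipy.optimize import minimize

    def f_hi(x):   # stand-in "high-fidelity" objective
        return (x[0] - 1.0) ** 2 + 10.0 * (x[1] - x[0] ** 2) ** 2

    def f_lo(x):   # cheaper, biased "low-fidelity" model
        return (x[0] - 1.0) ** 2 + 8.0 * (x[1] - x[0] ** 2) ** 2 + 0.1 * x[0]

    x, radius = np.array([-1.0, 1.0]), 0.5
    for _ in range(30):
        shift = f_hi(x) - f_lo(x)              # match the models at the current point
        model = lambda s: f_lo(x + s) + shift
        step = minimize(model, np.zeros(2),
                        bounds=[(-radius, radius)] * 2).x   # trust-region subproblem
        predicted = f_hi(x) - model(step)
        actual = f_hi(x) - f_hi(x + step)      # one new high-fidelity call per iteration
        rho = actual / predicted if predicted > 0 else 0.0
        if rho > 0.1:                          # accept the step only if it helps
            x = x + step
        radius *= 2.0 if rho > 0.75 else (0.5 if rho < 0.25 else 1.0)
    print(x, f_hi(x))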
Abstract:
We study genuine multipartite entanglement (GME) in a system of n qubits prepared in symmetric Dicke states and subjected to the influences of noise. We provide general, setup-independent expressions for experimentally favorable tools such as fidelity- and collective spin-based entanglement witnesses, as well as entangled-class discriminators and multi-point correlation functions. Besides highlighting the effects of the environment on large qubit registers, we also discuss strategies for the robust detection of GME. Our work provides techniques and results for the experimental communities interested in investigating and characterizing multipartite entangled states by introducing realistic milestones for setup design and associated predictions.
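A standard fidelity-based witness of the kind mentioned above (general form only; the biseparability bound c depends on the number of qubits and the excitation number and is not quoted here) is

    \mathcal{W} = c\,\mathbb{1} - |D_n^k\rangle\langle D_n^k|, \qquad
    \mathrm{Tr}\left(\mathcal{W}\rho\right) < 0 \iff \langle D_n^k|\rho|D_n^k\rangle > c,

where c is the maximal squared overlap between the target Dicke state |D_n^k> and the set of biseparable states, so measuring the fidelity with the target state directly certifies genuine multipartite entanglement.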