2 results for statistical accuracy
Abstract:
The strong mixing of many-electron basis states in excited atoms and ions with open f shells results in very large numbers of complex, chaotic eigenstates that cannot be computed to any degree of accuracy. Describing the processes which involve such states requires the use of a statistical theory. Electron capture into these “compound resonances” leads to electron-ion recombination rates that are orders of magnitude greater than those of direct, radiative recombination and cannot be described by standard theories of dielectronic recombination. Previous statistical theories considered this as a two-electron capture process which populates a pair of single-particle orbitals, followed by “spreading” of the two-electron states into chaotically mixed eigenstates. This method is similar to a configuration-average approach because it neglects potentially important effects of spectator electrons and conservation of total angular momentum. In this work we develop a statistical theory which considers electron capture into “doorway” states with definite angular momentum obtained by the configuration interaction method. We apply this approach to electron recombination with W²⁰⁺, considering 2×10⁶ doorway states. Despite strong effects from the spectator electrons, we find that the results of the earlier theories largely hold. Finally, we extract the fluorescence yield (the probability of photoemission and hence recombination) by comparison with experiment.
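
For context, the fluorescence yield quoted at the end of this abstract is conventionally the radiative branching ratio of the compound resonance; the notation below (ω_f for the yield, Γ_r for the radiative width, Γ_a for the autoionization width) is our own shorthand, not taken from the paper, so read it as an illustrative definition rather than the authors' working expression:

\[ \omega_f = \frac{\Gamma_r}{\Gamma_r + \Gamma_a} \]

A yield close to unity means nearly every captured electron is stabilized by photoemission, so recombination proceeds at essentially the capture rate; a small yield means most resonances autoionize back to the continuum.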
Abstract:
Reliability has emerged as a critical design constraint, especially in memories. Designers go to great lengths to guarantee fault-free operation of the underlying silicon by adopting redundancy-based techniques, which essentially try to detect and correct every single error. However, such techniques come at the cost of large area, power, and performance overheads, which has led many researchers to doubt their efficiency, especially for error-resilient systems where 100% accuracy is not always required. In this paper, we present an alternative method focusing on the confinement of the output error induced by reliability issues. Focusing on memory faults, the proposed method, rather than correcting every single error, exploits the statistical characteristics of the target application and replaces any erroneous data with the best available estimate of that data. To realize the proposed method, a RISC processor is augmented with custom instructions and special-purpose functional units. We apply the method to the enhanced processor by studying the statistical characteristics of the various algorithms involved in a popular multimedia application. Our experimental results show that, in contrast to state-of-the-art fault-tolerance approaches, we are able to reduce the runtime and area overheads by 71.3% and 83.3%, respectively.
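
To make the error-confinement idea concrete, the following is a minimal, hypothetical C sketch of replacing a faulty memory read with the best available statistical estimate instead of correcting it. The fault flag, the history length, and the use of a running mean as the estimate are our own assumptions for illustration; the paper realizes the substitution in hardware via custom instructions and special-purpose functional units, which this software loop only mimics.

#include <stdint.h>
#include <stdio.h>
#include <stdbool.h>

/* Hypothetical sketch: confine errors by substituting a statistical
 * estimate (here a running mean of recent correct values) for any
 * memory word flagged as faulty. Fault detection itself is assumed,
 * e.g. via parity or ECC syndrome bits. */

#define HISTORY 8

typedef struct {
    int32_t recent[HISTORY];  /* last known-good samples        */
    int     count;            /* number of valid history entries */
    int     next;             /* circular write index            */
} estimator_t;

/* Record a value assumed to be fault-free. */
static void estimator_update(estimator_t *e, int32_t value)
{
    e->recent[e->next] = value;
    e->next = (e->next + 1) % HISTORY;
    if (e->count < HISTORY)
        e->count++;
}

/* Best available estimate: mean of the recent correct values. */
static int32_t estimator_guess(const estimator_t *e)
{
    if (e->count == 0)
        return 0;                      /* no history yet: neutral value */
    int64_t sum = 0;
    for (int i = 0; i < e->count; i++)
        sum += e->recent[i];
    return (int32_t)(sum / e->count);
}

/* Read one sample: if the fault flag is raised, confine the error by
 * substituting the estimate instead of the corrupted word. */
static int32_t read_sample(estimator_t *e, int32_t raw, bool faulty)
{
    if (faulty)
        return estimator_guess(e);     /* error-confinement path */
    estimator_update(e, raw);
    return raw;
}

int main(void)
{
    estimator_t e = {0};
    int32_t stream[]   = { 100, 102, 98, 101, 9999, 99 };
    bool    fault_in[] = { false, false, false, false, true, false };

    for (int i = 0; i < 6; i++)
        printf("out[%d] = %d\n", i, (int)read_sample(&e, stream[i], fault_in[i]));
    return 0;
}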