929 results for Lattice theory - Computer programs


Relevance: 30.00%

Abstract:

GeoRef

Relevance: 30.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance: 30.00%

Abstract:

In this paper, we apply the canonical decomposition of two-qubit unitaries to find pulse schemes to control the proposed Kane quantum computer. We explicitly find pulse sequences for the controlled-NOT, swap, square root of swap, and controlled Z rotations. We analyze the speed and fidelity of these gates, both of which compare favorably to existing schemes. The pulse sequences presented in this paper are theoretically faster, have higher fidelity, and are simpler. Any two-qubit gate may be easily found and implemented using similar pulse sequences. Numerical simulation is used to verify the accuracy of each pulse scheme.
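As a rough illustration of the kind of numerical check mentioned in the abstract, the sketch below compares a hypothetical implemented two-qubit gate against the ideal CNOT using the standard average-gate-fidelity formula. The CNOT matrix is standard, but the spurious ZZ error and its strength epsilon are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (not the paper's pulse-level simulation): compare an "implemented"
# two-qubit unitary against the ideal CNOT via the average gate fidelity.
import numpy as np

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def average_gate_fidelity(u_ideal: np.ndarray, u_actual: np.ndarray) -> float:
    """Average gate fidelity (|Tr(U^dag V)|^2 + d) / (d(d+1)), with d = 4 for two qubits."""
    d = u_ideal.shape[0]
    overlap = np.trace(u_ideal.conj().T @ u_actual)
    return (abs(overlap) ** 2 + d) / (d * (d + 1))

# Toy "implemented" gate: the ideal CNOT followed by a small spurious ZZ rotation
# (epsilon is an illustrative error strength, not a value from the paper).
epsilon = 1e-3
zz = np.diag([1.0, -1.0, -1.0, 1.0]).astype(complex)
u_actual = CNOT @ (np.cos(epsilon) * np.eye(4) - 1j * np.sin(epsilon) * zz)

print(f"average gate fidelity: {average_gate_fidelity(CNOT, u_actual):.8f}")
```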

Relevance: 30.00%

Abstract:

A computer model of the mechanical alloying process has been developed to simulate phase formation during the mechanical alloying of Mo and Si elemental powders with a ternary addition of Al, Mg, Ti or Zr. Using the Arrhenius equation, the model balances the formation rates of the competing reactions that are observed during milling. These reactions include the formation of tetragonal C11b MoSi2 (t-MoSi2) by combustion, the formation of the hexagonal C40 MoSi2 polymorph (h-MoSi2), the transformation of the tetragonal to the hexagonal form, and the recovery of t-MoSi2 from h-MoSi2 and deformed t-MoSi2. The ternary additions change the free energy of formation of the associated MoSi2 alloys, i.e. Mo(Si,Al)2, Mo(Mg,Al)2, (Mo,Ti)Si2, (Mo,Zr)Si2 and (Mo,Fe)Si2, respectively. Variation of the energy of formation alone is sufficient for the simulation to accurately model the observed phase formation.
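A minimal sketch of the Arrhenius-rate bookkeeping the abstract describes, assuming made-up pre-exponential factors, activation energies and an effective milling temperature; it simply integrates three competing channels (combustion to t-MoSi2, formation of h-MoSi2, and the t-to-h transformation) and is not the authors' fitted model.

```python
# Illustrative sketch only: competing Arrhenius-type formation rates during milling.
# Pre-exponential factors, activation energies and temperature are placeholders,
# not the fitted values from the paper.
import numpy as np

R = 8.314          # gas constant, J/(mol K)
T = 450.0          # assumed effective milling temperature, K

def arrhenius(A: float, Ea: float, temperature: float) -> float:
    """Arrhenius rate constant k = A * exp(-Ea / (R * T))."""
    return A * np.exp(-Ea / (R * temperature))

# Competing channels (parameters are placeholders).
k_t   = arrhenius(A=1.0e6, Ea=9.0e4, temperature=T)   # elemental powders -> t-MoSi2
k_h   = arrhenius(A=1.0e6, Ea=9.5e4, temperature=T)   # elemental powders -> h-MoSi2
k_t2h = arrhenius(A=1.0e5, Ea=1.0e5, temperature=T)   # t-MoSi2 -> h-MoSi2

reactants, t_phase, h_phase = 1.0, 0.0, 0.0
dt = 1.0                                   # time step (arbitrary units)
for _ in range(20000):
    d_react = -(k_t + k_h) * reactants * dt
    d_t = (k_t * reactants - k_t2h * t_phase) * dt
    d_h = (k_h * reactants + k_t2h * t_phase) * dt
    reactants += d_react
    t_phase += d_t
    h_phase += d_h

print(f"final fractions  reactants={reactants:.3f}  t-MoSi2={t_phase:.3f}  h-MoSi2={h_phase:.3f}")
```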

Relevance: 30.00%

Abstract:

The data structure of an information system can significantly impact the ability of end users to efficiently and effectively retrieve the information they need. This research develops a methodology for evaluating, ex ante, the relative desirability of alternative data structures for end user queries. This research theorizes that the data structure that yields the lowest weighted average complexity for a representative sample of information requests is the most desirable data structure for end user queries. The theory was tested in an experiment that compared queries from two different relational database schemas. As theorized, end users querying the data structure associated with the less complex queries performed better. Complexity was measured using three different Halstead metrics. Each of the three metrics provided excellent predictions of end user performance. This research supplies strong evidence that organizations can use complexity metrics to evaluate, ex ante, the desirability of alternative data structures. Organizations can use these evaluations to enhance the efficient and effective retrieval of information by creating data structures that minimize end user query complexity.
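For readers unfamiliar with Halstead metrics, the sketch below computes three representative ones (volume, difficulty, effort) from operator and operand counts; the SQL tokenisation shown is hypothetical, and the abstract does not state which three metrics the study actually used.

```python
# Hedged sketch: standard Halstead metrics from operator/operand counts of a query.
import math

def halstead(operators: list[str], operands: list[str]) -> dict[str, float]:
    n1, n2 = len(set(operators)), len(set(operands))   # distinct operators / operands
    N1, N2 = len(operators), len(operands)             # total operators / operands
    vocabulary = n1 + n2
    length = N1 + N2
    volume = length * math.log2(vocabulary)
    difficulty = (n1 / 2) * (N2 / n2)
    effort = difficulty * volume
    return {"volume": volume, "difficulty": difficulty, "effort": effort}

# Hypothetical tokenisation of a simple SQL query:
#   SELECT name FROM employee WHERE dept_id = 10
operators = ["SELECT", "FROM", "WHERE", "="]
operands = ["name", "employee", "dept_id", "10"]
print(halstead(operators, operands))
```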

Relevance: 30.00%

Abstract:

This paper presents a new low-complexity multicarrier modulation (MCM) technique based on lattices which achieves a peak-to-average power ratio (PAR) as low as three. The scheme can be viewed as a drop-in replacement for the discrete multitone (DMT) modulation of an asymmetric digital subscriber line modem. We show that the lattice-MCM retains many of the attractive features of sinusoidal-MCM, and does so with lower implementation complexity, O(N), compared with DMT, which requires O(N log N) operations. We also present techniques for narrowband interference rejection and power profiling. Simulation studies confirm that the performance of the lattice-MCM is superior, even compared with recent techniques for PAR reduction in DMT.
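The following sketch computes the quantity being reduced, the PAR of a conventional IFFT-based DMT symbol, as a baseline illustration; the subcarrier count and 4-QAM constellation are assumptions, and the lattice-MCM scheme itself is not implemented here.

```python
# Sketch of the PAR of one conventional DMT symbol built with an N-point IFFT
# (O(N log N) per symbol). Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N = 512                                   # number of subcarriers (illustrative)

# Random 4-QAM symbols on each subcarrier.
qam = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)

# Time-domain DMT symbol; scaled so the average power stays near one.
x = np.fft.ifft(qam) * np.sqrt(N)

par = np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2)
print(f"PAR of one DMT symbol: {par:.2f} ({10 * np.log10(par):.2f} dB)")
```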

Relevance: 30.00%

Abstract:

In this paper we investigate the effect of dephasing on proposed quantum gates for the solid-state Kane quantum computing architecture. Using a simple model of the decoherence, we find that the typical error in a controlled-NOT gate is 8.3×10⁻⁵. We also compute the fidelities of Z, X, swap, and controlled Z operations under a variety of dephasing rates. We show that these numerical results are comparable with the error threshold required for fault-tolerant quantum computation.
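As a toy illustration of how dephasing degrades fidelity (not the paper's Kane-architecture model), the sketch below applies a single-qubit phase-flip channel to a |+> state; the dephasing probabilities are illustrative assumptions.

```python
# Toy model: phase-flip (dephasing) channel on one qubit and the resulting state error.
import numpy as np

def dephase(rho: np.ndarray, p: float) -> np.ndarray:
    """Phase-flip channel: rho -> (1 - p) rho + p Z rho Z."""
    Z = np.diag([1.0, -1.0])
    return (1 - p) * rho + p * (Z @ rho @ Z)

def fidelity(rho: np.ndarray, psi: np.ndarray) -> float:
    """Fidelity <psi| rho |psi> of a mixed state against a pure target state."""
    return float(np.real(psi.conj() @ rho @ psi))

plus = np.array([1.0, 1.0]) / np.sqrt(2)          # |+> is maximally sensitive to dephasing
rho_ideal = np.outer(plus, plus.conj())

for p in (1e-5, 1e-4, 1e-3):                      # illustrative dephasing probabilities
    rho = dephase(rho_ideal, p)
    print(f"p = {p:.0e}  error = {1 - fidelity(rho, plus):.2e}")
```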

Relevance: 30.00%

Abstract:

This paper argues for a more specific formal methodology for the textual analysis of individual game genres. In doing so, it advances a set of formal analytical tools and a theoretical framework for the analysis of turn-based computer strategy games. The analytical tools extend the useful work of Steven Poole, who suggests a Peircian semiotic approach to the study of games as formal systems. The theoretical framework draws upon postmodern cultural theory to analyse and explain the representation of space and the organisation of knowledge in these games. The methodology and theoretical framework are supported by a textual analysis of Civilization II, a significant and influential turn-based computer strategy game. Finally, this paper suggests possibilities for future extensions of this work.

Relevance: 30.00%

Abstract:

Statistical tests of Load-Unload Response Ratio (LURR) signals are carried out in order to verify the statistical robustness of previous studies using the Lattice Solid Model (MORA et al., 2002b). In each case 24 groups of samples with the same macroscopic parameters (tidal perturbation amplitude A, period T and tectonic loading rate k) but different particle arrangements are employed. Results of uni-axial compression experiments show that before the normalized time of catastrophic failure, the ensemble average LURR value rises significantly, in agreement with observations of high LURR prior to large earthquakes. In shearing tests, two parameters are found to control the correlation between earthquake occurrence and tidal stress. One, A/(kT), controls the phase shift between the peak seismicity rate and the peak amplitude of the perturbation stress; as this parameter increases, the phase shift decreases. The other parameter, AT/k, controls the height of the probability density function (Pdf) of the modeled seismicity: as it increases, the Pdf becomes sharper and narrower, indicating strong triggering. Statistical studies of LURR signals in the shearing tests also suggest that, except in strong-triggering cases where LURR cannot be calculated due to poor data in the unloading cycles, larger events are more likely than smaller ones to occur in high-LURR periods, supporting the LURR hypothesis.
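A hedged sketch of the LURR statistic itself, computed on synthetic data rather than the Lattice Solid Model output: LURR is taken here as the ratio of Benioff strain released during loading intervals to that released during unloading intervals of a sinusoidal perturbation. The event catalogue, period and event sizes are all invented for illustration.

```python
# Sketch: LURR = (response during loading) / (response during unloading) of a
# periodic perturbation, using Benioff strain (sqrt of energy) as the response measure.
import numpy as np

rng = np.random.default_rng(1)

T = 12.0                                                  # perturbation period (arbitrary units)
t_events = np.sort(rng.uniform(0.0, 600.0, 400))          # synthetic event times
energies = 10.0 ** rng.uniform(0.0, 2.0, t_events.size)   # synthetic event "sizes"

# Loading when the perturbation stress is increasing: d/dt sin(2 pi t / T) > 0.
loading = np.cos(2 * np.pi * t_events / T) > 0.0

response = np.sqrt(energies)                              # Benioff strain per event
lurr = response[loading].sum() / response[~loading].sum()
print(f"LURR = {lurr:.3f}   (about 1 for uncorrelated events, > 1 expected before failure)")
```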

Relevance: 30.00%

Abstract:

The Lattice Solid Model has been used successfully as a virtual laboratory to simulate the fracturing of rocks, the dynamics of faults, earthquakes and gouge processes. However, results from those simulations show that in order to make the next step towards more realistic experiments it will be necessary to use models containing a significantly larger number of particles than current models, and those simulations will therefore require a greatly increased amount of computational resources. Whereas the computing power provided by single processors can be expected to increase according to Moore's law, i.e., to double every 18-24 months, parallel computers can provide significantly larger computing power today. In order to make this computing power available for the simulation of the microphysics of earthquakes, a parallel version of the Lattice Solid Model has been implemented. Benchmarks using large models with several million particles have shown that the parallel implementation of the Lattice Solid Model can achieve a high parallel efficiency of about 80% for large numbers of processors on different computer architectures.
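A small sketch of the metric quoted above, parallel efficiency E = (T1/Tp)/p, computed from hypothetical wall-clock times; the timings are placeholders, not benchmark data from the paper.

```python
# Parallel efficiency from measured runtimes: E = speedup / p = (T_1 / T_p) / p.
def parallel_efficiency(t_serial: float, t_parallel: float, n_procs: int) -> float:
    speedup = t_serial / t_parallel
    return speedup / n_procs

# Hypothetical wall-clock times (seconds) for a fixed-size model.
runs = {1: 1000.0, 16: 72.0, 64: 19.5, 256: 5.2}
t1 = runs[1]
for p, tp in runs.items():
    print(f"p = {p:4d}   speedup = {t1 / tp:7.1f}   efficiency = {parallel_efficiency(t1, tp, p):.2f}")
```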

Relevance: 30.00%

Abstract:

Density functional theory (DFT) is a powerful approach to electronic structure calculations in extended systems, but currently suffers from inadequate incorporation of long-range dispersion, or van der Waals (VdW), interactions. VdW-corrected DFT is tested for interactions involving molecular hydrogen, graphite, single-walled carbon nanotubes (SWCNTs), and SWCNT bundles. The energy correction, based on an empirical London dispersion term with a damping function at short range, allows a reasonable physisorption energy and equilibrium distance to be obtained for H2 on a model graphite surface. The VdW-corrected DFT calculation for an (8,8) nanotube bundle accurately reproduces the experimental lattice constant. For H2 inside or outside an (8,8) SWCNT, we find the binding energies are respectively higher and lower than that on a graphite surface, correctly predicting the well-known curvature effect. We conclude that the VdW correction is a very effective method for implementing DFT calculations, allowing a reliable description of both short-range chemical bonding and long-range dispersive interactions. The method will find powerful applications in areas of SWCNT research where empirical potential functions either have not been developed, or do not capture the necessary range of both dispersion and bonding interactions.
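A sketch of the general form of such an empirical correction (DFT-D style): a -C6/r^6 London term attenuated by a Fermi-type damping function at short range. The C6, r0 and steepness d values below are placeholders, not the parameters used in this work.

```python
# Damped London dispersion correction added on top of a DFT energy:
# E_disp(r) = -f_damp(r) * C6 / r^6, with f_damp suppressing the term at short range.
import numpy as np

def damped_dispersion(r: np.ndarray, c6: float, r0: float, d: float = 20.0) -> np.ndarray:
    """Pairwise dispersion energy with a Fermi-type damping function."""
    f_damp = 1.0 / (1.0 + np.exp(-d * (r / r0 - 1.0)))
    return -f_damp * c6 / r ** 6

# Illustrative carbon-carbon-like parameters (placeholders, arbitrary units).
r = np.linspace(2.0, 10.0, 5)
print(damped_dispersion(r, c6=30.0, r0=3.0))
```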

Relevance: 30.00%

Abstract:

The adsorption of simple Lennard-Jones fluids in a carbon slit pore of finite length was studied with Canonical Ensemble (NVT) and Gibbs Ensemble Monte Carlo (GEMC) simulations. The Canonical Ensemble was a collection of cubic simulation boxes in which a finite pore resides, while the Gibbs Ensemble was that of the pore space of the finite pore. Argon was used as a model Lennard-Jones fluid, while the adsorbent was modelled as a finite carbon slit pore whose two walls were composed of three graphene layers with carbon atoms arranged in a hexagonal pattern. The Lennard-Jones (LJ) 12-6 potential model was used to compute the interaction energy between two fluid particles, and also between a fluid particle and a carbon atom. Argon adsorption isotherms were obtained at 87.3 K for pore widths of 1.0, 1.5 and 2.0 nm using both Canonical and Gibbs Ensembles. These results were compared with isotherms obtained for corresponding infinite pores using the Grand Canonical Ensemble. The effects of the number of cycles necessary to reach equilibrium, the initial allocation of particles, the displacement step and the simulation box size were particularly investigated in the Monte Carlo simulation with the Canonical Ensemble. Of these parameters, the displacement step had the most significant effect on the performance of the Monte Carlo simulation. The simulation box size was also important, especially at low pressures, at which the size must be sufficiently large to have a statistically acceptable number of particles in the bulk phase. Finally, it was found that the Canonical Ensemble and the Gibbs Ensemble both yielded the same isotherm (within statistical error), although the computation time for GEMC was shorter than that for the Canonical Ensemble simulation. The latter method, however, described the proper interface between the reservoir and the adsorbed phase (and hence the meniscus).
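A minimal sketch of two ingredients named above, the LJ 12-6 pair potential and the Metropolis acceptance rule used for trial displacements in a canonical-ensemble Monte Carlo move; the argon-like epsilon and sigma values are approximate textbook numbers, not necessarily those used in the study.

```python
# LJ 12-6 pair potential and Metropolis acceptance for an NVT displacement move.
import numpy as np

KB = 1.380649e-23        # Boltzmann constant, J/K
EPS = 120.0 * KB         # LJ well depth for argon (approximate, placeholder)
SIGMA = 0.34e-9          # LJ diameter for argon, m (approximate, placeholder)

def lj(r: np.ndarray) -> np.ndarray:
    """u(r) = 4*eps*[(sigma/r)^12 - (sigma/r)^6]."""
    sr6 = (SIGMA / r) ** 6
    return 4.0 * EPS * (sr6 ** 2 - sr6)

def metropolis_accept(delta_u: float, temperature: float, rng: np.random.Generator) -> bool:
    """Accept a trial displacement with probability min(1, exp(-dU / kT))."""
    if delta_u <= 0.0:
        return True
    return rng.random() < np.exp(-delta_u / (KB * temperature))

rng = np.random.default_rng(2)
print(lj(np.array([0.34e-9, 0.38e-9, 0.5e-9])))           # pair energies in joules
print(metropolis_accept(delta_u=1.0e-21, temperature=87.3, rng=rng))
```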