975 results for EFFICIENT SIMULATION
Abstract:
A two-dimensional numerical simulation model of interface states in scanning capacitance microscopy (SCM) measurements of p-n junctions is presented. In the model, amphoteric interface states with two transition energies in the Si band gap are represented as fixed charges to account for their behavior in SCM measurements. The interface states are shown to cause a stretch-out and a parallel shift of the capacitance-voltage characteristics in the depletion and neutral regions of p-n junctions, respectively. This explains the discrepancy between the SCM measurement and simulation near p-n junctions, and thus modeling interface states is crucial for SCM dopant profiling of p-n junctions. (C) 2002 American Institute of Physics.
Abstract:
The splitting method is a simulation technique for the estimation of very small probabilities. In this technique, the sample paths are split into multiple copies, at various stages in the simulation. Of vital importance to the efficiency of the method is the Importance Function (IF). This function governs the placement of the thresholds or surfaces at which the paths are split. We derive a characterisation of the optimal IF and show that for multi-dimensional models the natural choice for the IF is usually not optimal. We also show how nearly optimal splitting surfaces can be derived or simulated using reverse time analysis. Our numerical experiments illustrate that by using the optimal IF, one can obtain a significant improvement in simulation efficiency.
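To make the mechanics concrete, here is a minimal Python sketch of fixed-factor multilevel splitting for a one-dimensional random walk with negative drift. The walk, thresholds, and split factor are illustrative stand-ins (the function names are hypothetical), and the walk's position plays the role of the IF, the natural one-dimensional choice the abstract refers to:

```python
import random

def run_to_threshold(x, thresh, drift=-0.5, sigma=1.0):
    """Evolve a negatively drifted Gaussian random walk from x until it
    either reaches `thresh` (success) or falls below 0 (failure).
    Returns the end state on success, or None on failure."""
    while 0.0 <= x < thresh:
        x += drift + sigma * random.gauss(0.0, 1.0)
    return x if x >= thresh else None

def splitting_estimate(levels, copies, n0=1000):
    """Fixed-splitting estimate of P(walk reaches levels[-1] before 0).
    The importance function is the walk's position; `levels` places the
    splitting thresholds and `copies` is the split factor per survivor."""
    paths = [0.0] * n0
    prob = 1.0
    for lev in levels:
        survivors = [y for x in paths
                     if (y := run_to_threshold(x, lev)) is not None]
        if not survivors:
            return 0.0
        prob *= len(survivors) / len(paths)   # conditional level-crossing fraction
        paths = [s for s in survivors for _ in range(copies)]  # split each survivor
    return prob

print(splitting_estimate(levels=[2.0, 4.0, 6.0], copies=5))
```

The product of the per-stage fractions is algebraically identical to the usual fixed-splitting estimator (final hits divided by n0 times copies raised to the number of splits), so the estimate is unbiased under these assumptions.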
Abstract:
We prove that the simple group L3(5), which has order 372,000, is efficient by providing an efficient presentation for it. This leaves one simple group of order less than one million, S4(4), which has order 979,200, whose efficiency or otherwise remains to be determined.
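As a quick arithmetic check of the two stated orders (not part of the paper), the standard order formulas for PSL(n,q) and PSp(4,q) reproduce both numbers:

```python
from math import gcd, prod

def order_psl(n, q):
    """|PSL(n,q)| = q^(n(n-1)/2) * prod_{i=2..n} (q^i - 1) / gcd(n, q-1)."""
    return q**(n*(n-1)//2) * prod(q**i - 1 for i in range(2, n+1)) // gcd(n, q-1)

def order_psp4(q):
    """|PSp(4,q)| = q^4 (q^2 - 1)(q^4 - 1) / gcd(2, q-1)."""
    return q**4 * (q**2 - 1) * (q**4 - 1) // gcd(2, q - 1)

assert order_psl(3, 5) == 372000   # L3(5), proved efficient in the paper
assert order_psp4(4) == 979200     # S4(4), the remaining open case
```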
Abstract:
We are witnessing enormous growth in biological nitrogen removal from wastewater, which presents specific challenges beyond traditional COD (carbon) removal. One possibility for optimised process design is the use of biomass-supporting media. In this paper, attached growth processes (AGP) are evaluated using dynamic simulations. The advantages of these systems, which have been qualitatively described elsewhere, are validated quantitatively on the basis of a simulation benchmark for activated sludge treatment systems. The benchmark is extended with a biofilm model that allows fast and accurate simulation of the conversion of different substrates in a biofilm. The economic feasibility of the system is then evaluated using the data generated by the benchmark simulations: capital savings due to volume reduction and reduced sludge production are weighed against increased aeration costs, and effluent quality is integrated into the evaluation as well.
Abstract:
This article proposes a more accurate approach to dopant extraction using combined inverse modeling and forward simulation of scanning capacitance microscopy (SCM) measurements on p-n junctions. The approach takes into account the essential physics of minority carrier response to the SCM probe tip in the presence of lateral electric fields due to a p-n junction. The effects of oxide fixed charge and interface state densities in the grown oxide layer on the p-n junction samples were considered in the proposed method. The extracted metallurgical and electrical junctions were compared to the apparent electrical junction obtained from SCM measurements. (C) 2002 American Institute of Physics.
Abstract:
The performance of the Oxford University Gun Tunnel has been estimated using a quasi-one-dimensional simulation of the facility gas dynamics. The modelling of the actual facility area variations so as to adequately simulate both shock reflection and flow discharge processes has been considered in some detail. Test gas stagnation pressure and temperature histories are compared with measurements at two different operating conditions - one with nitrogen and the other with carbon dioxide as the test gas. It is demonstrated that both the simulated pressures and temperatures are typically within 3% of the experimental measurements.
Abstract:
An efficient Lanczos subspace method has been devised for calculating state-to-state reaction probabilities. The method recasts the time-independent wave packet Lippmann-Schwinger equation [Kouri, Chem. Phys. Lett. 203, 166 (1993)] inside a tridiagonal (Lanczos) representation in which the action of the causal Green's operator is effected easily with a QR algorithm. The method is designed to yield all state-to-state reaction probabilities from a given reactant-channel wave packet using a single Lanczos subspace; the spectral properties of the tridiagonal Hamiltonian allow calculations to be undertaken at arbitrary energies within the spectral range of the initial wave packet. The method is applied to the H+O2 system (J=0), and the results indicate the approach is accurate and stable. (C) 2002 American Institute of Physics.
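For readers unfamiliar with the tridiagonal (Lanczos) representation the abstract relies on, here is a minimal NumPy sketch of the Lanczos recursion applied to a random symmetric stand-in for the Hamiltonian; it is not the paper's H+O2 calculation:

```python
import numpy as np

def lanczos(H, v0, m):
    """Build an m-step Lanczos (tridiagonal) representation of H from start
    vector v0.  Returns (alpha, beta, V): diagonal, off-diagonal, and the
    Krylov basis.  No reorthogonalisation, so keep m modest here."""
    n = len(v0)
    V = np.zeros((m, n))
    alpha, beta = np.zeros(m), np.zeros(m - 1)
    V[0] = v0 / np.linalg.norm(v0)
    for j in range(m):
        w = H @ V[j]
        alpha[j] = V[j] @ w
        w -= alpha[j] * V[j]
        if j > 0:
            w -= beta[j - 1] * V[j - 1]
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            V[j + 1] = w / beta[j]
    return alpha, beta, V

# Toy Hamiltonian: a random symmetric matrix standing in for H.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 200))
H = (A + A.T) / 2
a, b, V = lanczos(H, rng.standard_normal(200), m=50)
T = np.diag(a) + np.diag(b, 1) + np.diag(b, -1)
# Extreme eigenvalues of the small tridiagonal T approximate those of H.
print(np.linalg.eigvalsh(T)[[0, -1]], np.linalg.eigvalsh(H)[[0, -1]])
```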
Abstract:
A major limitation in any high-performance digital communication system is the linearity region of the transmitting amplifier. Nonlinearities typically lead to signal clipping. Efficient communication in such conditions requires maintaining a low peak-to-average power ratio (PAR) in the transmitted signal while achieving a high throughput of data. Excessive PAR leads either to frequent clipping or to inadequate resolution in the analog-to-digital or digital-to-analog converters. Currently proposed signaling schemes for future generation wireless communications suffer from a high PAR. This paper presents a new signaling scheme for channels with clipping which achieves a PAR as low as 3. For a given linear range in the transmitter's digital-to-analog converter, this scheme achieves a lower bit-error rate than existing multicarrier schemes, owing to increased separation between constellation points. We present the theoretical basis for this new scheme, approximations for the expected bit-error rate, and simulation results. (C) 2002 Elsevier Science (USA).
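For context, a small sketch of how PAR is computed, using a generic QPSK multicarrier (OFDM-style) signal as the point of comparison; the paper's low-PAR scheme itself is not reproduced here:

```python
import numpy as np

def par(x):
    """Peak-to-average power ratio of a complex baseband signal."""
    p = np.abs(x)**2
    return p.max() / p.mean()

rng = np.random.default_rng(1)
n = 256  # subcarriers
# QPSK symbols on each subcarrier; the time-domain multicarrier signal
# is their inverse FFT.
sym = (rng.choice([-1, 1], n) + 1j * rng.choice([-1, 1], n)) / np.sqrt(2)
x = np.fft.ifft(sym) * np.sqrt(n)
print(f"multicarrier PAR: {par(x):.1f}")  # typically well above the paper's 3
```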
Abstract:
Developments in computer and three dimensional (3D) digitiser technologies have made it possible to keep track of the broad range of data required to simulate an insect moving around or over the highly heterogeneous habitat of a plant's surface. Properties of plant parts vary within a complex canopy architecture, and insect damage can induce further changes that affect an animal's movements, development and likelihood of survival. Models of plant architectural development based on Lindenmayer systems (L-systems) serve as dynamic platforms for simulation of insect movement, providing an explicit model of the developing 3D structure of a plant as well as allowing physiological processes associated with plant growth and responses to damage to be described and simulated. Simple examples of the use of the L-system formalism to model insect movement, operating at different spatial scales, from insects foraging on an individual plant to insects flying around plants in a field, are presented. Such models can be used to explore questions about the consequences of changes in environmental architecture and configuration on host finding, exploitation and its population consequences. In effect this model is a 'virtual ecosystem' laboratory to address local as well as landscape-level questions pertinent to plant-insect interactions, taking plant architecture into account. (C) 2002 Elsevier Science B.V. All rights reserved.
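For orientation, a minimal deterministic L-system rewriting engine in Python; the toy branching rules below are illustrative only and are not the plant models used in the paper:

```python
def lsystem(axiom, rules, steps):
    """Apply the rewriting rules to every symbol in parallel, `steps` times."""
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(c, c) for c in s)
    return s

# Toy branching rule (brackets delimit branches, as in classic
# turtle-interpreted L-systems).
rules = {"A": "F[+A][-A]", "F": "FF"}
print(lsystem("A", rules, 3))
```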
Abstract:
Models of plant architecture allow us to explore how genotype-environment interactions affect the development of plant phenotypes. Such models generate masses of data organised in complex hierarchies. This paper presents a generic system for creating and automatically populating a relational database from data generated by the widely used L-system approach to modelling plant morphogenesis. Techniques from compiler technology are applied to generate attributes (new fields) in the database, simplifying query development for the recursively structured branching relationship. Use of biological terminology in an interactive query builder helps make the system biologist-friendly. (C) 2002 Elsevier Science Ireland Ltd. All rights reserved.
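One plausible way to store and query such a recursively structured branching relationship is a self-referencing table walked with a recursive query; the sqlite3 schema below is an illustrative sketch, not the paper's system:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    -- Each plant module points at its parent, giving the branching tree.
    CREATE TABLE module (
        id     INTEGER PRIMARY KEY,
        parent INTEGER REFERENCES module(id),
        organ  TEXT
    );
    INSERT INTO module VALUES
        (1, NULL, 'stem'), (2, 1, 'internode'),
        (3, 2, 'leaf'),    (4, 2, 'branch'), (5, 4, 'leaf');
""")
# Recursive CTE: all modules borne, directly or indirectly, by module 1.
rows = con.execute("""
    WITH RECURSIVE sub(id, organ) AS (
        SELECT id, organ FROM module WHERE id = 1
        UNION ALL
        SELECT m.id, m.organ FROM module m JOIN sub ON m.parent = sub.id
    )
    SELECT id, organ FROM sub;
""").fetchall()
print(rows)
```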
Abstract:
CULTURE is an Artificial Life simulation that aims to provide primary school children with opportunities to become actively engaged in the high-order thinking processes of problem solving and critical thinking. A preliminary evaluation of CULTURE has found that it offers the freedom for children to take part in process-oriented learning experiences. Through providing children with opportunities to make inferences, validate results, explain discoveries and analyse situations, CULTURE encourages the development of high-order thinking skills. The evaluation found that CULTURE allows users to autonomously explore the important scientific concepts of life and living, and energy and change within a software environment that children find enjoyable and easy to use.
Abstract:
This paper presents results on the simulation of the solid state sintering of copper wires using Monte Carlo techniques based on elements of lattice theory and cellular automata. The initial structure is superimposed onto a triangular, two-dimensional lattice, where each lattice site corresponds to either an atom or vacancy. The number of vacancies varies with the simulation temperature, while a cluster of vacancies is a pore. To simulate sintering, lattice sites are picked at random and reoriented in terms of an atomistic model governing mass transport. The probability that an atom has sufficient energy to jump to a vacant lattice site is related to the jump frequency, and hence the diffusion coefficient, while the probability that an atomic jump will be accepted is related to the change in energy of the system as a result of the jump, as determined by the change in the number of nearest neighbours. The jump frequency is also used to relate model time, measured in Monte Carlo Steps, to the actual sintering time. The model incorporates bulk, grain boundary and surface diffusion terms and includes vacancy annihilation on the grain boundaries. The predictions of the model were found to be consistent with experimental data, both in terms of the microstructural evolution and in terms of the sintering time. (C) 2002 Elsevier Science B.V. All rights reserved.
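A minimal sketch of the kind of Metropolis-style acceptance step described, where the energy change of an atom-vacancy swap is counted from nearest-neighbour bonds; the function names, parameters, and 1-D neighbour map are toy stand-ins for the paper's triangular lattice:

```python
import math, random

def delta_neighbours(lattice, a, v, nbrs):
    """Change in occupied nearest-neighbour bonds if the atom at site `a`
    swaps with the vacancy at site `v` (1 = atom, 0 = vacancy)."""
    before = sum(lattice[n] for n in nbrs[a] if n != v)
    after = sum(lattice[n] for n in nbrs[v] if n != a)
    return after - before

def try_jump(lattice, a, v, nbrs, bond_energy=0.3, kT=1.0):
    """Metropolis acceptance: always accept bond-gaining jumps; accept
    bond-losing jumps with Boltzmann probability exp(-dE/kT)."""
    dE = -bond_energy * delta_neighbours(lattice, a, v, nbrs)
    if dE <= 0 or random.random() < math.exp(-dE / kT):
        lattice[a], lattice[v] = 0, 1   # swap atom and vacancy
        return True
    return False

# Toy 1-D ring standing in for the triangular lattice:
N = 10
nbrs = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}
lattice = [1] * N
lattice[4] = 0                          # one vacancy at site 4
print(try_jump(lattice, a=3, v=4, nbrs=nbrs))
```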
Abstract:
Multi-environment trials (METs) used to evaluate breeding lines vary in the number of years that they sample. We used a cropping systems model to simulate the target population of environments (TPE) for 6 locations over 108 years for 54 'near-isolines' of sorghum in north-eastern Australia. For a single reference genotype, each of 547 trials was clustered into 1 of 3 'drought environment types' (DETs) based on a seasonal water stress index. Within sequential METs of 2 years' duration, the frequencies of these drought patterns often differed substantially from those derived for the entire TPE. This was reflected in variation in the mean yield of the reference genotype. For the TPE and for 2-year METs, restricted maximum likelihood methods were used to estimate components of genotypic and genotype-by-environment variance. These also varied substantially, although not in direct correlation with the frequency of occurrence of different DETs over a 2-year period. Combined analysis over different numbers of seasons demonstrated the expected improvement in the correlation between MET estimates of genotype performance and the overall genotype averages as the number of seasons in the MET was increased.
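A toy simulation (with made-up variance components, not the paper's data) of why MET estimates of genotype performance track the overall averages better as seasons are added:

```python
import numpy as np

rng = np.random.default_rng(2)
n_gen, n_seasons = 54, 108
sigma_g, sigma_ge = 1.0, 2.0                      # illustrative components
g = rng.normal(0, sigma_g, n_gen)                 # genotype effects
ge = rng.normal(0, sigma_ge, (n_gen, n_seasons))  # G x E deviations
yields = g[:, None] + ge                          # yield per genotype/season
overall = yields.mean(axis=1)                     # whole-TPE average
for k in (2, 5, 10, 30):
    met = yields[:, :k].mean(axis=1)              # k-season MET mean
    print(f"{k:3d} seasons: r = {np.corrcoef(met, overall)[0, 1]:.2f}")
```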
Abstract:
This paper presents a new approach to the LU decomposition method for the simulation of stationary and ergodic random fields. The approach overcomes the size limitations of LU and is suitable for simulations of any size. It can also facilitate fast updating of generated realizations with new data, when appropriate, without repeating the full simulation process. Based on a novel column partitioning of the L matrix, expressed in terms of successive conditional covariance matrices, the approach demonstrates that LU simulation is equivalent to the successive solution of kriging residual estimates plus random terms. Consequently, it can be used for the LU decomposition of matrices of any size. The approach is termed conditional simulation by successive residuals because, at each step, a small set (group) of random variables is simulated with an LU decomposition of a matrix of updated conditional covariances of residuals. The simulated group is then used to estimate residuals without the need to solve large systems of equations.
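For context, a small sketch of conventional LU (here Cholesky) simulation of a Gaussian random field, the step whose size limitation the paper's column partitioning removes; the exponential covariance and grid are toy choices:

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0.0, 10.0, 200)                 # 1-D simulation grid
C = np.exp(-np.abs(x[:, None] - x[None, :]))    # exponential covariance model
L = np.linalg.cholesky(C)                       # C = L L^T
z = L @ rng.standard_normal(len(x))             # one unconditional realization
# The paper's approach instead partitions the columns of L so that z can
# be built group by group (kriging residual estimate plus random term)
# rather than factoring the full covariance matrix at once.
print(z[:5])
```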