57 results for Microscopic simulation models
Abstract:
A workshop on the computational fluid dynamics (CFD) prediction of shock boundary-layer interactions (SBLIs) was held at the 48th AIAA Aerospace Sciences Meeting. As part of the workshop, numerous CFD analysts submitted solutions for four experimentally measured SBLIs. This paper describes the assessment of the CFD predictions. The assessment includes an uncertainty analysis of the experimental data, the definition of an error metric, and the application of that metric to the CFD solutions. The CFD solutions provided very similar levels of error, and in general it was difficult to discern clear trends in the data. For the Reynolds-Averaged Navier-Stokes (RANS) methods, the choice of turbulence model appeared to be the largest factor in solution accuracy. Large-eddy simulation methods produced error levels similar to RANS methods but provided superior predictions of normal stresses.
Abstract:
The effects of initial soil fabric on the behavior of granular soils are investigated using Distinct Element Method (DEM) numerical simulation. Soil specimens are represented by an assembly of non-uniformly sized spheres with different initial contact normal distributions. Isotropically consolidated triaxial compression loading and extension unloading in both undrained and drained conditions are simulated for vertically and horizontally sheared specimens. The numerical simulation results are compared qualitatively with published experimental data, and the effects of initial soil fabric on the resulting soil behavior are discussed, including the effects of specimen reconstitution methods, the effects of large preshearing, and anisotropic characteristics in undrained and drained conditions. The effects of initial soil fabric and mode of shearing on the quasi-steady-state line are also investigated. The numerical simulation results systematically explain that the observed experimental behaviors of granular soils are due principally to the initial soil fabric. This outcome provides insight into the observed phenomena from a microscopic viewpoint. © 2011 Elsevier Ltd.
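The core DEM mechanism behind this kind of study, spheres exchanging contact forces that are integrated explicitly in time, can be sketched as follows. This is a minimal, self-contained illustration with made-up parameters and a linear spring-dashpot contact law; it is not the simulation code used in the paper.

```python
# Minimal DEM sketch: spheres interacting through a linear spring-dashpot
# normal contact, integrated explicitly. Parameters are illustrative only.
import numpy as np

def dem_step(pos, vel, radius, mass, k=1e5, c=50.0, dt=1e-5):
    """One explicit integration step for a small 2-D assembly of spheres."""
    n = len(pos)
    force = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            branch = pos[j] - pos[i]
            dist = np.linalg.norm(branch)
            overlap = radius[i] + radius[j] - dist
            if overlap > 0.0:               # contact: repulsive spring + damping
                normal = branch / dist
                rel_vn = np.dot(vel[j] - vel[i], normal)
                fn = (k * overlap - c * rel_vn) * normal
                force[i] -= fn
                force[j] += fn
    vel = vel + dt * force / mass[:, None]
    pos = pos + dt * vel
    return pos, vel

# Two spheres pushed into slight initial overlap and moving toward each other.
pos = np.array([[0.0, 0.0], [0.021, 0.0]])
vel = np.array([[0.1, 0.0], [-0.1, 0.0]])
radius = np.array([0.01, 0.012])
mass = np.array([1e-3, 1.2e-3])
for _ in range(2000):
    pos, vel = dem_step(pos, vel, radius, mass)
print("final positions:", pos)
```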
Abstract:
In this paper, we employ a Lagrangian, particle-based CFD method, Smoothed Particle Hydrodynamics (SPH), to study solitary wave motion and its impact on coastal structures. Two-dimensional weakly compressible and incompressible SPH models were applied to simulate waves impacting a seawall and a schematic coastal house. The results confirmed the accuracy of both models for predicting the wave surface profiles. The incompressible SPH model performed better than the weakly compressible SPH model in predicting the pressure field and impact loadings on coastal structures. The results are in qualitative agreement with experimental results. Copyright © 2011 by the International Society of Offshore and Polar Engineers (ISOPE).
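As a small aside, the weakly compressible SPH ingredients mentioned above, a kernel-summation density and an equation-of-state pressure, can be illustrated with the sketch below. The 1-D particle column, kernel choice, and constants are assumptions for illustration only, not the 2-D solvers compared in the paper.

```python
# Minimal weakly compressible SPH sketch: particle densities from a kernel
# summation and pressures from a Tait-type equation of state (1-D, illustrative).
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 1-D cubic spline smoothing kernel with support 2h."""
    q = np.abs(r) / h
    sigma = 2.0 / (3.0 * h)
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

def density_and_pressure(x, mass, h, rho0=1000.0, c0=30.0, gamma=7.0):
    """Summation density and weakly compressible (Tait) pressure per particle."""
    r = x[:, None] - x[None, :]
    rho = np.sum(mass[None, :] * cubic_spline_kernel(r, h), axis=1)
    b = rho0 * c0**2 / gamma
    p = b * ((rho / rho0)**gamma - 1.0)
    return rho, p

# A uniform 1-D column of "water" particles with spacing dx.
dx = 0.01
x = np.arange(0.0, 1.0, dx)
mass = np.full_like(x, 1000.0 * dx)      # mass per unit cross-section
rho, p = density_and_pressure(x, mass, h=1.3 * dx)
print("mean density:", rho.mean(), "max |pressure|:", np.abs(p).max())
```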
Abstract:
Multi-objective Genetic Algorithms have become a popular choice to aid in optimising the size of a whole hybrid power train. Within these optimisation processes, other optimisation techniques for the control strategy are implemented. This optimisation within an optimisation requires many simulations to be run, so reducing the computational cost is highly desirable. This paper presents an optimisation framework consisting of a series hybrid optimisation algorithm, in which a global search using low-fidelity models is followed by a local search using high-fidelity models to refine the results. The effectiveness of the hybrid optimisation algorithm is demonstrated with the optimisation of a submarine propulsion system. © 2011 EPE Association - European Power Electr.
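The global-then-local structure of such a series hybrid optimisation can be sketched in a few lines; the low- and high-fidelity objective functions below are hypothetical placeholders, not the submarine propulsion models used in the paper.

```python
# Minimal sketch of a series hybrid (global-then-local) optimisation,
# assuming placeholder low- and high-fidelity objective functions.
import numpy as np
from scipy.optimize import differential_evolution, minimize

def low_fidelity_cost(x):
    # Cheap surrogate of the system cost (hypothetical).
    return (x[0] - 3.0) ** 2 + (x[1] - 1.5) ** 2

def high_fidelity_cost(x):
    # More expensive, more detailed model (hypothetical small correction term).
    return low_fidelity_cost(x) + 0.1 * np.sin(5.0 * x[0]) * np.cos(5.0 * x[1])

bounds = [(0.0, 10.0), (0.0, 5.0)]

# Stage 1: global search on the cheap model.
coarse = differential_evolution(low_fidelity_cost, bounds, seed=0)

# Stage 2: local refinement on the expensive model, started from the global result.
refined = minimize(high_fidelity_cost, coarse.x, method="Nelder-Mead")

print("global estimate:", coarse.x, "refined optimum:", refined.x)
```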
Abstract:
This article investigates how to use UK probabilistic climate-change projections (UKCP09) in rigorous building energy analysis. Two office buildings (deep plan and shallow plan) are used as case studies to demonstrate the application of UKCP09. Three different methods for reducing the computational demands are explored: statistical reduction (Finkelstein-Schafer [F-S] statistics), simplification using degree-day theory, and the use of metamodels. The first method, which is based on an established technique, can be used as a reference because it provides the most accurate information. However, weather files must be chosen automatically on the basis of the F-S statistic, using a programming language, because thousands of weather files created by the UKCP09 weather generator need to be processed. A combination of the second method (degree-day theory) and the third method (metamodels) requires only a relatively small number of simulation runs, but still provides valuable information for subsequent uncertainty and sensitivity analyses. The article also demonstrates how grid computing can be used to speed up the calculation of many independent EnergyPlus models by harnessing the processing power of idle desktop computers. © 2011 International Building Performance Simulation Association (IBPSA).
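For readers unfamiliar with the first method, the Finkelstein-Schafer statistic essentially ranks candidate weather series by how closely their empirical cumulative distribution matches the long-term distribution. The sketch below illustrates this selection step under simplified assumptions (one weather variable, synthetic data, hypothetical function names); it is not the authors' processing pipeline.

```python
# Minimal sketch of Finkelstein-Schafer (F-S) selection of a representative
# weather series. Each candidate is reduced to a daily series of one variable
# (e.g. dry-bulb temperature); data and names are illustrative only.
import numpy as np

def empirical_cdf(sample, grid):
    """Empirical CDF of `sample` evaluated on `grid`."""
    sample = np.sort(sample)
    return np.searchsorted(sample, grid, side="right") / len(sample)

def fs_statistic(candidate, long_term):
    """Mean absolute difference between candidate and long-term CDFs."""
    grid = np.sort(long_term)
    return np.mean(np.abs(empirical_cdf(candidate, grid)
                          - empirical_cdf(long_term, grid)))

def pick_representative(candidates):
    """Index of the candidate series closest to the pooled long-term CDF."""
    long_term = np.concatenate(candidates)
    scores = [fs_statistic(c, long_term) for c in candidates]
    return int(np.argmin(scores))

# Example: 30 synthetic yearly daily-temperature series from a weather generator.
rng = np.random.default_rng(0)
years = [rng.normal(10 + rng.normal(0, 1), 6, size=365) for _ in range(30)]
print("representative year index:", pick_representative(years))
```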
Abstract:
Quality control is considered from the simulator's perspective through comparative simulation of an ultra energy-efficient building with EE4-DOE2.1E and EnergyPlus. The University of Calgary's Leadership in Energy and Environmental Design (LEED) Platinum Child Development Centre (CDC), with a 66% certified energy cost reduction rating, was the case-study building. A Natural Resources Canada incentive program required use of the EE4 interface with the DOE2.1E simulation engine for energy modelling. As DOE2.1E lacks specific features to simulate advanced systems such as the radiant cooling in the CDC, an EnergyPlus model was developed to further evaluate these features. The EE4-DOE2.1E model was used for quality control during development of the base EnergyPlus model, and simulation results were compared. Advanced energy systems then added to the EnergyPlus model generated only small differences in estimated total annual energy use. The comparative simulation process helped identify the main input errors in the draft EnergyPlus model. The comparative use of less complex simulation programs is recommended for quality control when producing more complex models. © 2009 International Building Performance Simulation Association (IBPSA).
Abstract:
Iteration is unavoidable in the design process and should be incorporated when planning and managing projects in order to minimize surprises and reduce schedule distortions. However, planning and managing iteration is challenging because the relationships between its causes and effects are complex. Most approaches that use mathematical models to analyze the impact of iteration on the design process focus on a relatively small number of its causes and effects. Therefore, insights derived from these analytical models may not be robust under a broader consideration of potential influencing factors. In this article, we synthesize an explanatory framework that describes the network of causes and effects of iteration identified in the literature, and introduce an analytic approach that combines task network modeling with System Dynamics simulation. Our approach models the network of causes and effects of iteration alongside the process architecture that is required to analyze the impact of iteration on design process performance. We show how this allows managers to assess the impact of changes to the process architecture and to management levers that influence iterative behavior, accounting for the fact that these changes can occur simultaneously and can accumulate in non-linear ways. We also discuss how the insights resulting from this analysis can be visualized for easier consumption by project participants who are not familiar with simulation methods. Copyright © 2010 by ASME.
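To make the System Dynamics side of this approach concrete, the sketch below simulates a textbook-style rework cycle in which a fraction of completed work later resurfaces as rework. All parameter values are illustrative assumptions; this is not the calibrated model combined with the task network in the article.

```python
# Minimal sketch of a discrete-time System Dynamics "rework cycle".
# Stocks: work to do, work done, undiscovered rework. Parameters are invented.
def simulate_rework(total_work=100.0, productivity=5.0, error_rate=0.3,
                    discovery_rate=0.2, dt=1.0, horizon=200):
    work_to_do, work_done, undiscovered_rework = total_work, 0.0, 0.0
    history = []
    for step in range(horizon):
        completion = min(productivity * dt, work_to_do)
        flawed = error_rate * completion            # errors that surface later
        discovered = discovery_rate * undiscovered_rework * dt
        work_to_do += discovered - completion
        work_done += completion - flawed
        undiscovered_rework += flawed - discovered
        history.append((step * dt, work_done))
        if work_to_do < 1e-6 and undiscovered_rework < 1e-6:
            break
    return history

time, completed = simulate_rework()[-1]
print(f"after {time:.0f} time units, {completed:.1f} of 100 work units are complete")
```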
Abstract:
This paper presents the development of a new building physics and energy supply systems simulation platform. It has been adapted from both existing commercial models and empirical works, but is designed to provide expedient, exhaustive simulation of all salient types of energy- and carbon-reducing retrofit options. These options may include any combination of behavioural measures, building fabric and equipment upgrades, improved HVAC control strategies, or novel low-carbon energy supply technologies. We provide a methodological description of the proposed model, followed by two illustrative case studies in which the tool is used to investigate retrofit options for a mixed-use office building and a primary school in the UK. It is not the intention of this paper, nor would it be feasible, to provide a complete engineering decomposition of the proposed model describing all calculation processes in detail. Instead, this paper concentrates on the particular engineering aspects of the model that depart from conventional practice. © 2011 Elsevier Ltd.
Abstract:
In the context of collaborative product development, new requirements need to be accommodated for Virtual Prototyping Simulation (VPS), such as distributed processing and the integration of models created using different tools or languages. Existing solutions focus mainly on the implementation of distributed processing, but this paper explores the issues of combining different models (some of which may be proprietary) developed in different software environments. In this paper, we discuss several approaches for developing VPS and suggest how it can best be integrated into the design process. An approach is developed to improve collaborative work in VPS development by combining disparate computational models. Specifically, a system framework is proposed that separates system-level modeling from the computational infrastructure. The implementation of a simple prototype demonstrates that such a paradigm is viable and thus provides a new means for distributed VPS development. © 2009 by ASME.
Abstract:
Elderly and disabled people can benefit greatly from advances in modern electronic devices, which can help them engage more fully with the world. However, existing design practices often isolate elderly or disabled users by considering them as users with special needs. This article presents a simulator that can reflect problems faced by elderly and disabled users while they use computers, televisions, and similar electronic devices. The simulator embodies both the internal state of an application and the perceptual, cognitive, and motor processes of its user. It can help interface designers understand, visualize, and measure the effect of impairment on interaction with an interface. Initially, a brief survey of different user modeling techniques is presented, and the existing models are classified into different categories. In the context of existing modeling approaches, the work on user modeling is presented for people with a wide range of abilities. A few applications of the simulator, which show that its predictions are accurate enough to inform design choices, are also discussed, along with the implications and limitations of the work. © 2012 Copyright Taylor and Francis Group, LLC.
Abstract:
This paper presents the steps and the challenges in implementing analytical, physics-based models of the insulated gate bipolar transistor (IGBT) and the PIN diode in hardware, and more specifically in field programmable gate arrays (FPGAs). The models can be utilised in hardware co-simulation of complex power electronic converters and entire power systems in order to reduce the simulation time without compromising the accuracy of results. Such co-simulation allows reliable prediction of the system's performance as well as accurate investigation of the power devices' behaviour during operation. Ultimately, this will allow application-specific optimisation of the devices' structure and circuit topologies, as well as enhancement of the control and/or protection schemes.
Abstract:
Simulation of materials at the atomistic level is an important tool in studying microscopic structure and processes. The atomic interactions necessary for the simulation are correctly described by Quantum Mechanics. However, the computational resources required to solve the quantum mechanical equations limit the use of Quantum Mechanics to at most a few hundred atoms and only to a small fraction of the available configurational space. This thesis presents the results of my research on the development of a new interatomic potential generation scheme, which we refer to as Gaussian Approximation Potentials. In our framework, the quantum mechanical potential energy surface is interpolated between a set of predetermined values at different points in atomic configurational space by a non-linear, non-parametric regression method, the Gaussian Process. To perform the fitting, we represent the atomic environments by the bispectrum, which is invariant to permutations of the atoms in the neighbourhood and to global rotations. The result is a general scheme that allows one to generate interatomic potentials based on arbitrary quantum mechanical data. We built a series of Gaussian Approximation Potentials using data obtained from Density Functional Theory and tested the capabilities of the method. We showed that our models reproduce the quantum mechanical potential energy surface remarkably well for the group IV semiconductors, iron, and gallium nitride. Our potentials, while maintaining quantum mechanical accuracy, are several orders of magnitude faster than quantum mechanical methods.
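The central idea, regressing reference energies against an invariant descriptor of the atomic environment with a Gaussian Process, can be sketched as follows. The toy descriptor stands in for the bispectrum and the "reference energies" are synthetic, so this is only an illustration of the fitting step, not the GAP implementation.

```python
# Minimal sketch of Gaussian Process regression of energies against a
# descriptor of the atomic environment. Descriptors and energies are synthetic.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Toy descriptors (a handful of rotation-invariant features per environment).
X_train = rng.uniform(-1.0, 1.0, size=(200, 4))
# Synthetic reference energies standing in for DFT values.
y_train = np.sum(np.sin(2.0 * X_train), axis=1) + 0.01 * rng.normal(size=200)

# Fit a GP: the kernel interpolates the potential energy surface between the
# reference points; the white-noise term absorbs the fitting tolerance.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5) + WhiteKernel(1e-4),
                              normalize_y=True).fit(X_train, y_train)

X_test = rng.uniform(-1.0, 1.0, size=(5, 4))
energy, std = gp.predict(X_test, return_std=True)
print("predicted energies:", energy, "predictive std:", std)
```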
Abstract:
A recent trend in spoken dialogue research is the use of reinforcement learning to train dialogue systems in a simulated environment. Past researchers have shown that the types of errors that are simulated can have a significant effect on simulated dialogue performance. Since modern systems typically receive an N-best list of possible user utterances, it is important to be able to simulate a full N-best list of hypotheses. This paper presents a new method for simulating such errors based on logistic regression, as well as a new method for simulating the structure of N-best lists of semantics and their probabilities, based on the Dirichlet distribution. Off-line evaluations show that the new Dirichlet model results in a much closer match to the receiver operating characteristics (ROC) of the live data. Experiments also show that the logistic model gives confusions that are closer to the types of confusion observed in live situations. The hope is that these new error models will be able to improve the resulting performance of trained dialogue systems. © 2012 IEEE.
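A minimal sketch of the two simulation ingredients described above, a Dirichlet draw for the shape of the N-best confidence scores and a logistic model for whether the top hypothesis is correct, is given below; the concentration parameters and logistic weights are invented for illustration, not the values estimated in the paper.

```python
# Minimal sketch of simulating N-best confidence scores with a Dirichlet
# distribution, plus a logistic model for top-hypothesis correctness.
# Parameter values are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def simulate_nbest(n=5, concentration=(4.0, 1.5, 1.0, 0.8, 0.5)):
    """Draw a normalised, sorted confidence vector for an N-best list."""
    scores = rng.dirichlet(concentration[:n])
    return np.sort(scores)[::-1]

def top_hypothesis_correct(top_confidence, w=6.0, b=-3.0):
    """Logistic model: sample correctness given the top confidence score."""
    p = 1.0 / (1.0 + np.exp(-(w * top_confidence + b)))
    return rng.random() < p

for _ in range(3):
    nbest = simulate_nbest()
    print(nbest, "top correct:", top_hypothesis_correct(nbest[0]))
```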