928 results for Automobiles - Dynamics - Computer simulation
Abstract:
In our research we investigate the output accuracy of discrete-event simulation models and agent-based simulation models when studying human-centric complex systems. In this paper we focus on human reactive behaviour, which can be implemented in both modelling approaches using standard methods. As a case study we have chosen the retail sector, in particular the operations of the fitting room in the womenswear department of a large UK department store. In our case study we looked at ways of determining the efficiency of implementing new management policies for the fitting room operation by modelling the reactive behaviour of staff and customers of the department. First, we carried out a validation experiment in which we compared the results from our models to the performance of the real system. This experiment also allowed us to establish differences in output accuracy between the two modelling methods. In a second step, a multi-scenario experiment was carried out to study the behaviour of the models when they are used for the purpose of operational improvement. Overall, we have found that for our case study both discrete-event simulation and agent-based simulation have the same potential to support the investigation into the efficiency of implementing new management policies.
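As an illustration of the discrete-event side of the comparison, here is a minimal sketch of a fitting-room queue in Python using the SimPy library; the number of rooms, arrival rate, and fitting time are illustrative assumptions, not the study's calibrated values.

```python
import random
import simpy

def customer(env, fitting_rooms, wait_times):
    # A customer arrives, queues for a fitting room, and tries garments on.
    arrive = env.now
    with fitting_rooms.request() as req:
        yield req
        wait_times.append(env.now - arrive)
        yield env.timeout(random.expovariate(1 / 5.0))  # ~5 min in the room

def arrivals(env, fitting_rooms, wait_times):
    while True:
        yield env.timeout(random.expovariate(1 / 2.0))  # ~2 min between arrivals
        env.process(customer(env, fitting_rooms, wait_times))

random.seed(42)
env = simpy.Environment()
fitting_rooms = simpy.Resource(env, capacity=4)  # assumed 4 fitting rooms
wait_times = []
env.process(arrivals(env, fitting_rooms, wait_times))
env.run(until=480)  # one 8-hour trading day, in minutes
print(f"customers served: {len(wait_times)}, "
      f"mean wait: {sum(wait_times) / len(wait_times):.1f} min")
```

Changing the `capacity` argument is the natural lever for testing a new management policy, such as opening an extra room at peak times.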
Abstract:
This work focuses on the creation and application of dynamic simulation software for studying the hard metal structure (WC-Co). The hardware platform used to increase GPU capacity was a GeForce 9600 GT together with the PhysX chip, originally created to make games more realistic. The software simulates the three-dimensional carbide structure inside a cubic box, where tungsten carbide (WC) grains are modelled as triangular prisms and truncated triangular prisms. The program proved effective in validation tests, ranging from calculations of parameter measures to its capacity to increase the number of particles simulated dynamically. It was possible to investigate both the mean values and the distributions of the stereological parameters used to characterize the carbide structure through cutting planes. Based on the cutting planes of the analysed structures, we investigated the linear intercepts, the area intercepts, and the perimeter of the sections of the intercepted grains, as well as the binder phase of the structure, by calculating the mean value and distribution of the free path. Since the literature shows almost consensually that the distribution of linear intercepts is lognormal, this suggests that the grain size distribution is also lognormal. Thus, a routine was added to the program which made possible a more detailed investigation of this issue. We have observed that it is possible, for certain values of the parameters that define the shape and size of the prismatic grains, to obtain distributions of the linear intercepts that approach the lognormal shape. Across the simulations carried out, we have observed that the distribution curves of the linear and area intercepts, as well as of the perimeter of the sections, are consistent with static computer simulation studies of these parameters.
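To make the lognormal claim concrete, the sketch below fits a lognormal distribution to a set of linear-intercept lengths and tests the fit; the intercepts are drawn synthetically here, standing in for the values measured on the simulated cutting planes.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic linear-intercept lengths (µm), standing in for measured ones.
intercepts = rng.lognormal(mean=0.5, sigma=0.4, size=5000)

# Fit a lognormal with location fixed at zero, as is usual for size data.
shape, loc, scale = stats.lognorm.fit(intercepts, floc=0)
print(f"fitted sigma = {shape:.3f}, median intercept = {scale:.3f} µm")

# Goodness of fit: Kolmogorov-Smirnov test against the fitted distribution.
ks = stats.kstest(intercepts, "lognorm", args=(shape, loc, scale))
print(f"KS statistic = {ks.statistic:.4f}, p-value = {ks.pvalue:.3f}")
```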
Abstract:
Natural ventilation is an efficient bioclimatic strategy, one that provides thermal comfort, healthiness, and cooling to the building. However, disregard for environmental quality, the uncertainties involved in the phenomenon, and the popularization of artificial climate systems serve as an excuse for those who neglect the benefits of passive cooling. Unfamiliarity with the concept may be lessened if ventilation is considered at every step of the design, especially in the initial phase, in which decisions have a great impact on the construction process. The tools available to quantify the impact of design decisions consist basically of air-change rate calculations or computational fluid dynamics (CFD) simulations, both somewhat removed from design practice and ill-suited to parametric studies. Thus, we chose to verify, through computer simulation, the representativeness of the results of a simplified air-change rate calculation method, as well as to make it more compatible with the questions relevant to the first phases of the design process. The case object is a model resulting from the recommendations of the Código de Obras de Natal/RN, adjusted according to NBR 15220. The study has shown the complexity of incorporating a CFD tool into the process and the need for a method capable of generating data at a rate compatible with the flow of ideas that are generated and discarded during design development. At the end of our study, we discuss the concessions necessary to carry out the simulations, the applicability and limitations of both the tools used and the method adopted, and the representativeness of the results obtained.
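For reference, the simplified wind-driven calculation the text contrasts with CFD reduces to a one-line estimate: flow through an opening, Q = Cd · A · v, converted to air changes per hour. The sketch below uses illustrative values for the discharge coefficient, opening area, wind speed, and room volume.

```python
# Simplified wind-driven ventilation estimate (illustrative values).
Cd = 0.6           # discharge coefficient of a sharp-edged opening (typical)
A = 0.5            # free opening area, m^2
v = 1.0            # wind speed at the opening, m/s
room_volume = 45.0 # m^3

Q = Cd * A * v               # volumetric flow, m^3/s
ach = Q * 3600 / room_volume # air changes per hour
print(f"Q = {Q:.2f} m^3/s -> {ach:.0f} air changes per hour")
```

A calculation this cheap can be re-run for every sketch variant, which is exactly the pace the design process demands.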
Abstract:
The building envelope is the principal means of interaction between the indoors and the environment, with direct influence on the thermal and energy performance of the building. By intervening in the envelope, with the proposal of specific architectural elements, it is possible to promote the use of passive conditioning strategies such as natural ventilation. Cross ventilation is recommended by NBR 15220-3 as the main bioclimatic strategy for the hot and humid climate of Natal/RN, offering, among other benefits, thermal comfort to occupants. The analysis tools for natural ventilation, on the other hand, cover a variety of techniques, from simplified calculation methods to computational fluid dynamics, whose limitations are discussed in several papers, but without detailing the problems encountered. In this context, the present study aims to evaluate the potential of wind catchers, envelope elements used to increase natural ventilation in buildings, through simplified CFD simulation. Moreover, it seeks to quantify the limitations encountered during the analysis. The procedure adopted to evaluate the implementation and efficiency of these elements was CFD (Computational Fluid Dynamics) simulation with the software DesignBuilder CFD. A base case was defined, to which wind catchers with various settings were added, in order to compare them with each other and assess the differences in the air flows and air speeds obtained. Initially, sensitivity tests were carried out to gain familiarity with the software and observe simulation patterns, recording the settings used and the simulation time for each case. The results show the limitations encountered during the simulation process, as well as an overview of the efficiency and potential of wind catchers: increased ventilation with the use of catchers, differences in air flow patterns, and significant increases in indoor air speeds, besides changes due to the different element geometries. We consider that the software used can help designers during preliminary analysis in the early stages of design.
Abstract:
Oil production and exploration techniques have evolved in recent decades in order to increase fluid flows and optimize how the required equipment is used. The Electric Submersible Pumping (ESP) lift method is based on an electric downhole motor that drives a centrifugal pump, transporting the fluids to the surface. Electric Submersible Pumping has been gaining ground among artificial lift methods due to its ability to handle large liquid flow rates in both onshore and offshore environments. The performance of a well equipped with an ESP system is intrinsically related to the operation of the centrifugal pump, whose function is to convert the motor power into head. In the present work, a computer model has been developed to analyze the three-dimensional flow in a centrifugal pump used in Electric Submersible Pumping. Using the commercial program ANSYS® CFX®, initially with water as the working fluid, the geometry and simulation parameters were defined in order to approximate the flow inside the channels of the pump impeller and diffuser. Three different geometry conditions were initially tested to determine which was most suitable for the problem. After choosing the most appropriate geometry, three mesh conditions were analyzed and the computed values were compared to the experimental head curve provided by the manufacturer. The results approached the experimental curve, and the simulation time and model convergence were satisfactory considering that the studied problem involves numerical analysis. After the tests with water, oil was used in the simulations, and the results were compared to a methodology used in the petroleum industry to correct for viscosity. In general, for the models with water and oil, the single-phase results were coherent with the experimental curves, and the three-dimensional computer models constitute a preliminary step toward the analysis of two-phase flow inside the channels of centrifugal pumps used in ESP systems.
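The comparison against the manufacturer's curve rests on converting the CFD pressure rise across a stage into head, H = Δp/(ρg). A minimal sketch, with illustrative numbers rather than the study's values:

```python
# Convert a CFD stage pressure rise into head, the quantity plotted on
# the manufacturer's characteristic curve. Values are illustrative.
rho_water = 998.0  # kg/m^3 at ~20 °C
g = 9.81           # m/s^2

def head_m(p_in_pa: float, p_out_pa: float, rho: float) -> float:
    """Stage head in metres of the pumped fluid column."""
    return (p_out_pa - p_in_pa) / (rho * g)

# A 50 kPa rise in water corresponds to roughly 5.1 m of head.
print(f"head = {head_m(101325.0, 151325.0, rho_water):.2f} m")
```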
MINING AND VERIFICATION OF TEMPORAL EVENTS WITH APPLICATIONS IN COMPUTER MICRO-ARCHITECTURE RESEARCH
Abstract:
Computer simulation programs are essential tools for scientists and engineers to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and ridden with unknowns that will be discovered by developers in an iterative process. To manage such complexity, advanced verification techniques for continually matching the intended model to the implemented model are necessary. Therefore, the main goal of this research work is to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is First-Order Logic Constraint Specification Language (FOLCSL) that enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part consists of mining the temporal flow of events using a newly developed representation called State Flow Temporal Analysis Graph (SFTAG). While the first part seeks an assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and hence enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms. This work improves the computer architecture research and verification processes as shown by the case studies and experiments that have been conducted.
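A minimal sketch of the trace-checking idea in the first part of the framework: an invariant is checked against the event trace emitted by a simulator. The event format and the invariant below are hypothetical stand-ins, not FOLCSL syntax.

```python
# Hypothetical invariant: every REQUEST event is eventually followed by a
# matching GRANT for the same transaction id. The checker scans the trace
# once and reports the first violation it finds.
def check_request_grant(trace):
    pending = set()
    for i, (kind, txid) in enumerate(trace):
        if kind == "REQUEST":
            pending.add(txid)
        elif kind == "GRANT":
            if txid not in pending:
                return f"violation at event {i}: GRANT {txid} without REQUEST"
            pending.discard(txid)
    return f"violation: unanswered REQUESTs {pending}" if pending else "ok"

trace = [("REQUEST", 1), ("REQUEST", 2), ("GRANT", 1), ("GRANT", 2)]
print(check_request_grant(trace))  # -> ok
```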
Abstract:
Management theories have been based, almost without exception, on the foundations and models of classical science (particularly the models of Newtonian physics). However, organizations today face a globalized world, flooded with information (and not necessarily knowledge), hyperconnected, dynamic, and loaded with uncertainty, so many of these theories may show limitations for organizations. And perhaps not because of their structure, logic, or scope, but because of the lack of criteria justifying their application. In many cases, organizations still rely on intuition, assumptions, and half-truths in decision making. This picture highlights two facts: on the one hand, the need to seek a method for understanding the situation of each organization in order to support decision making; on the other, the need to strengthen intuition with non-traditional models and techniques (usually originating from, or inspired by, engineering). This work seeks to anticipate the pillars of a possible method to support decision making through the simulation of computational models, drawing on the possible interactions between model-based management, computational organization science, and emergent engineering.
Abstract:
This study evaluated the mandibular biomechanics of the posterior dentition based on experimental and computational analyses. The analyses were performed on a model of a human mandible, built in epoxy resin for photoelastic analysis and by computer-aided design for finite element analysis. To standardize the evaluation, specific areas were defined on the lateral surface of the mandibular body. The photoelastic analysis was configured with a vertical load on the first upper molar and fixed support at the ramus of the mandible; the same configuration was used in the computer simulation. Force magnitudes of 50, 100, 150, and 200 N were applied to evaluate the bone stress. The stress results presented similar distributions in both analyses, with the most intense stress at the retromolar area, the oblique line, and the alveolar process at the molar level. This study demonstrated the similarity of the experimental and computational results and, thus, the importance of biomechanical characterization of the morphology of the posterior dentition.
Abstract:
PURPOSE: To develop a computer simulation of ablation for producing customized contact lenses to correct high-order aberrations. METHODS: Using real data from a patient with keratoconus, measured with a wavefront aberrometer with a Hartmann-Shack sensor, we determined the contact lens thicknesses that compensate for these aberrations, as well as the number of pulses required to ablate the lenses specifically for this patient. RESULTS: The correction maps are presented and the pulse counts were calculated, using beams 0.5 mm wide and an ablation depth of 0.3 µm. CONCLUSIONS: The simulated results were promising, but still need to be improved so that the "real" ablation system can achieve the desired precision.
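The pulse-count calculation is simple arithmetic: with a fixed ablation depth of 0.3 µm per pulse, the number of pulses at each lens point is the required thickness correction divided by that depth. A sketch with a synthetic thickness map standing in for the wavefront-derived one:

```python
import numpy as np

DEPTH_PER_PULSE_UM = 0.3  # stated ablation depth per pulse

# Hypothetical thickness-correction map (µm) over a 64x64 lens grid,
# standing in for the map derived from the Hartmann-Shack measurements.
rng = np.random.default_rng(1)
thickness_um = rng.uniform(0.0, 12.0, size=(64, 64))

# Pulses needed at each point, rounded up to whole pulses.
pulses = np.ceil(thickness_um / DEPTH_PER_PULSE_UM).astype(int)
print(f"total pulses: {pulses.sum()}, max at one point: {pulses.max()}")
```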
Abstract:
This article describes the design, implementation, and experiences with AcMus, an open and integrated software platform for room acoustics research, which comprises tools for measurement, analysis, and simulation of rooms for music listening and production. Through use of affordable hardware, such as laptops, consumer audio interfaces and microphones, the software allows evaluation of relevant acoustical parameters with stable and consistent results, thus providing valuable information in the diagnosis of acoustical problems, as well as the possibility of simulating modifications in the room through analytical models. The system is open-source and based on a flexible and extensible Java plug-in framework, allowing for cross-platform portability, accessibility and experimentation, thus fostering collaboration of users, developers and researchers in the field of room acoustics.
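As an example of the analytical models such a platform evaluates, the sketch below computes the Sabine reverberation-time estimate RT60 = 0.161·V/A; the room volume and absorption coefficients are illustrative, and the function is not part of the AcMus API.

```python
# Sabine's reverberation-time estimate, a standard room-acoustics parameter.
def rt60_sabine(volume_m3: float, surfaces) -> float:
    """surfaces: iterable of (area_m2, absorption_coefficient) pairs."""
    A = sum(area * alpha for area, alpha in surfaces)  # total absorption, sabins
    return 0.161 * volume_m3 / A

room = [(60.0, 0.05),  # walls: painted concrete (illustrative coefficients)
        (20.0, 0.30),  # ceiling: acoustic tiles
        (20.0, 0.10)]  # floor: wood
print(f"RT60 = {rt60_sabine(50.0, room):.2f} s")
```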
Abstract:
Enzymes are extremely efficient catalysts. Here, some of the mechanisms proposed to explain this catalytic power are compared to quantitative experimental results and computer simulations. The influence of the enzymatic environment on species along the reaction coordinate is analysed. The concepts of transition-state stabilisation and reactant destabilisation are confronted. The divided site model and near-attack conformation hypotheses are also discussed. Molecular interactions such as covalent catalysis, general acid-base catalysis, electrostatics, entropic effects, steric hindrance, and quantum and dynamical effects are analysed as sources of catalysis. Reaction mechanisms, in particular the one catalysed by protein tyrosine phosphatases, illustrate these concepts.
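The "catalytic power" being explained can be quantified with transition-state theory: by the Eyring equation, k = (kB·T/h)·exp(-ΔG‡/RT), so lowering the activation free energy by ΔΔG‡ multiplies the rate by exp(ΔΔG‡/RT). A short numerical illustration, with an assumed barrier reduction:

```python
import math

R = 8.314   # gas constant, J/(mol K)
T = 298.15  # temperature, K

def rate_enhancement(ddG_kj_per_mol: float) -> float:
    """Rate ratio k_cat/k_uncat for an activation barrier lowered by ddG."""
    return math.exp(ddG_kj_per_mol * 1e3 / (R * T))

# An illustrative barrier reduction of 40 kJ/mol gives roughly 1e7-fold.
print(f"{rate_enhancement(40.0):.2e}")
```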
Abstract:
Shallow subsurface layers of gold nanoclusters were formed in polymethylmethacrylate (PMMA) polymer by very low energy (49 eV) gold ion implantation. The ion implantation process was modeled by computer simulation and accurately predicted the layer depth and width. Transmission electron microscopy (TEM) was used to image the buried layer and individual nanoclusters; the layer width was ~6-8 nm and the cluster diameter was ~5-6 nm. Surface plasmon resonance (SPR) absorption effects were observed by UV-visible spectroscopy. The TEM and SPR results were related to prior measurements of electrical conductivity of Au-doped PMMA, and excellent consistency was found with a model of electrical conductivity in which either at low implantation dose the individual nanoclusters are separated and do not physically touch each other, or at higher implantation dose the nanoclusters touch each other to form a random resistor network (percolation model). © 2009 American Vacuum Society. [DOI: 10.1116/1.3231449]
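The percolation picture invoked for the higher-dose regime can be illustrated with a toy site-percolation model: clusters occupy lattice sites with probability p, and conduction requires an occupied cluster spanning the sample. The sketch below is illustrative only, unrelated to the paper's measured data.

```python
import numpy as np
from scipy import ndimage

def percolates(p: float, n: int = 200, seed: int = 0) -> bool:
    """True if an occupied cluster spans the lattice from left to right."""
    rng = np.random.default_rng(seed)
    occupied = rng.random((n, n)) < p
    labels, _ = ndimage.label(occupied)          # 4-connected clusters
    left, right = set(labels[:, 0]), set(labels[:, -1])
    return bool((left & right) - {0})            # same cluster on both edges

# The square-lattice site-percolation threshold is ~0.593.
for p in (0.50, 0.55, 0.60, 0.65):
    print(f"p = {p:.2f}: spanning cluster = {percolates(p)}")
```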
Abstract:
PMMA (polymethylmethacrylate) was ion implanted with gold at very low energy and over a range of different doses using a filtered cathodic arc metal plasma system. A nanometer scale conducting layer was formed, fully buried below the polymer surface at low implantation dose, and evolving to include a gold surface layer as the dose was increased. Depth profiles of the implanted material were calculated using the Dynamic TRIM computer simulation program. The electrical conductivity of the gold-implanted PMMA was measured in situ as a function of dose. Samples formed at a number of different doses were subsequently characterized by Rutherford backscattering spectrometry, and test patterns were formed on the polymer by electron beam lithography. Lithographic patterns were imaged by atomic force microscopy and demonstrated that the contrast properties of the lithography were well maintained in the surface-modified PMMA.
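As a rough stand-in for what a range calculation like Dynamic TRIM produces, an as-implanted profile is often approximated by a Gaussian centred at the projected range Rp with straggle ΔRp. The values below are illustrative assumptions, not TRIM output for 49 eV gold in PMMA.

```python
import numpy as np

dose = 1e16   # implanted dose, ions/cm^2 (illustrative)
Rp = 5.0e-7   # projected range, cm (5 nm, assumed)
dRp = 2.0e-7  # range straggling, cm (2 nm, assumed)

# Gaussian depth profile: N(x) = dose/(dRp*sqrt(2*pi)) * exp(-(x-Rp)^2/(2*dRp^2))
x = np.linspace(0.0, 2.0e-6, 400)  # depth axis, cm
N = dose / (dRp * np.sqrt(2.0 * np.pi)) * np.exp(-(x - Rp) ** 2 / (2.0 * dRp ** 2))
print(f"peak ~ {N.max():.2e} ions/cm^3 at depth {x[N.argmax()] * 1e7:.1f} nm")
```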
Abstract:
We study a stochastic lattice model describing the dynamics of coexistence of two interacting biological species. The model comprises the local processes of birth, death, and diffusion of individuals of each species and is grounded on predator-prey interactions. Species coexistence can be of two types: with self-sustained coupled time oscillations of the population densities, and without oscillations. We perform numerical simulations of the model on a square lattice and analyze the temporal behavior of each species by computing time correlation functions as well as spectral densities. This analysis provides an appropriate characterization of the different types of coexistence and is also used to examine linked population cycles in nature and in experiments.
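The two diagnostics named in the abstract, time correlation functions and spectral densities, can be computed directly from a population time series; the sketch below applies them to a synthetic noisy cycle rather than to the lattice model's output.

```python
import numpy as np

# Synthetic population series: a noisy cycle with period 100 time steps.
rng = np.random.default_rng(0)
t = np.arange(4096)
x = np.sin(2 * np.pi * t / 100) + 0.5 * rng.standard_normal(t.size)
x = x - x.mean()

# Normalized autocorrelation function (lags 0 .. n-1).
acf = np.correlate(x, x, mode="full")[x.size - 1:] / (x.var() * x.size)

# Power spectral density via the FFT (periodogram).
psd = np.abs(np.fft.rfft(x)) ** 2 / x.size
freqs = np.fft.rfftfreq(x.size)

print(f"ACF at lag 100: {acf[100]:.2f}")  # positive for an oscillatory series
print(f"dominant frequency: {freqs[psd[1:].argmax() + 1]:.4f} (expected 0.01)")
```

An oscillatory coexistence shows up as a slowly decaying, periodic ACF and a sharp spectral peak; the non-oscillatory type gives a rapidly decaying ACF and a flat spectrum.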
Abstract:
This paper presents a framework for building medical training applications using virtual reality, together with a tool that helps instantiate the framework's classes. The main purpose is to ease the building of virtual reality applications for medical training, considering systems that simulate biopsy exams and provide deformation, collision detection, and stereoscopy functionalities. Instantiating the classes allows quick implementation of tools for this purpose, reducing errors and keeping costs low through the use of open-source tools. With the instantiation tool, the process of building applications is fast and easy, so computer programmers can obtain an initial application and adapt it to their needs. The tool allows the user to include, delete, and edit parameters of the chosen functionalities, as well as to store these parameters for future use. To verify the efficiency of the framework, some case studies are presented.
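A minimal sketch of the instantiation idea: functionalities are registered as plugins and created from stored parameters. All class and parameter names here are hypothetical, not the framework's actual API.

```python
import json

REGISTRY = {}

def plugin(name):
    """Decorator that registers a functionality class under a name."""
    def register(cls):
        REGISTRY[name] = cls
        return cls
    return register

@plugin("collision")
class CollisionDetection:
    def __init__(self, margin=0.01):
        self.margin = margin  # hypothetical collision margin parameter

@plugin("deformation")
class Deformation:
    def __init__(self, stiffness=50.0):
        self.stiffness = stiffness  # hypothetical tissue stiffness parameter

def instantiate(config_json: str):
    """Build the chosen functionalities from stored (JSON) parameters."""
    config = json.loads(config_json)
    return {name: REGISTRY[name](**params) for name, params in config.items()}

app = instantiate('{"collision": {"margin": 0.02}, "deformation": {}}')
print(app)
```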