925 results for Stochastic processes - Computer simulation
Abstract:
We apply Agent-Based Modeling and Simulation (ABMS) to investigate a set of problems in a retail context. Specifically, we are working to understand the relationship between human resource management practices and retail productivity. Although we are working within a relatively novel and complex domain, it is clear that intelligent agents offer potential for developing organizational capabilities in the future. Our multi-disciplinary research team has worked with a UK department store to collect data and capture perceptions about operations from actors within departments. Based on this case study work, we have built a simulator that we present in this paper. We then use the simulator to gather empirical evidence regarding two specific management practices: empowerment and employee development.
Abstract:
In this dissertation I draw a connection between quantum adiabatic optimization, spectral graph theory, heat diffusion, and sub-stochastic processes through the operators that govern these processes and their associated spectra. In particular, we study Hamiltonians which have recently become known as "stoquastic" or, equivalently, the generators of sub-stochastic processes. The operators corresponding to these Hamiltonians are of interest in all of the settings mentioned above. I predominantly explore the connection between the spectral gap of an operator, or the difference between the two lowest energies of that operator, and certain equilibrium behavior. In the context of adiabatic optimization, this corresponds to the likelihood of solving the optimization problem of interest. I will provide an instance of an optimization problem that is easy to solve classically, but leaves open the possibility of being difficult adiabatically. Aside from this concrete example, the work in this dissertation is predominantly mathematical and we focus on bounding the spectral gap. Our primary tool for doing this is spectral graph theory, which provides the most natural approach to this task by simply considering Dirichlet eigenvalues of subgraphs of host graphs. I will derive tight bounds for the gap of one-dimensional, hypercube, and general convex subgraphs. The techniques used will also adapt methods recently used by Andrews and Clutterbuck to prove the long-standing "Fundamental Gap Conjecture".
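As a toy illustration of the Dirichlet-eigenvalue technique described in this abstract (a generic numerical sketch, not code from the dissertation), the spectral gap of a one-dimensional path subgraph can be computed from the Dirichlet Laplacian and checked against its well-known closed form:

```python
import numpy as np

def dirichlet_gap(n):
    """Spectral gap (difference of the two lowest Dirichlet eigenvalues)
    of the path graph Laplacian on n interior vertices."""
    # Dirichlet Laplacian of a path: tridiagonal matrix with 2 on the
    # diagonal and -1 on the off-diagonals (boundary rows removed).
    L = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    evals = np.linalg.eigvalsh(L)  # sorted ascending
    return evals[1] - evals[0]

def dirichlet_gap_exact(n):
    # Closed form: the eigenvalues are 2 - 2*cos(k*pi/(n+1)), k = 1..n,
    # so the gap is the difference of the k = 2 and k = 1 values.
    return 2.0 * (np.cos(np.pi / (n + 1)) - np.cos(2 * np.pi / (n + 1)))
```

The numerical gap agrees with the closed form and shrinks as the path grows, consistent with the one-dimensional bounds the abstract mentions.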
Abstract:
In our research we investigate the output accuracy of discrete-event simulation models and agent-based simulation models when studying human-centric complex systems. In this paper we focus on human reactive behaviour, since in both modelling approaches it is possible to implement human reactive behaviour in the model using standard methods. As a case study we have chosen the retail sector, in particular the operations of the fitting room in the womenswear department of a large UK department store. In our case study we looked at ways of determining the efficiency of implementing new management policies for the fitting room operation by modelling the reactive behaviour of the department's staff and customers. First, we carried out a validation experiment in which we compared the results from our models to the performance of the real system. This experiment also allowed us to establish differences in output accuracy between the two modelling methods. In a second step, a multi-scenario experiment was carried out to study the behaviour of the models when they are used for the purpose of operational improvement. Overall we found that, for our case study example, both discrete-event simulation and agent-based simulation have the same potential to support the investigation into the efficiency of implementing new management policies.
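For readers unfamiliar with the modelling style, a single-attendant queue in the spirit of a fitting-room operation can be simulated with a few lines of event-by-event logic. This is an illustrative sketch with assumed exponential arrival and service rates, not the study's simulator:

```python
import random

def fitting_room_wait(arrival_rate, service_rate, n_customers=1000, seed=1):
    """Average customer wait in a single-attendant fitting-room queue,
    advanced one arrival event at a time (Lindley recursion)."""
    rng = random.Random(seed)
    t = 0.0               # current arrival time
    server_free_at = 0.0  # time the attendant next becomes free
    total_wait = 0.0
    for _ in range(n_customers):
        t += rng.expovariate(arrival_rate)   # next customer arrives
        start = max(t, server_free_at)       # wait if the attendant is busy
        total_wait += start - t
        server_free_at = start + rng.expovariate(service_rate)
    return total_wait / n_customers
```

Pushing the arrival rate toward the service rate sharply increases waiting times, the kind of policy-relevant output that both modelling approaches in the study produce.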
Abstract:
A primary goal of this dissertation is to understand the links between mathematical models that describe crystal surfaces at three fundamental length scales: The scale of individual atoms, the scale of collections of atoms forming crystal defects, and macroscopic scale. Characterizing connections between different classes of models is a critical task for gaining insight into the physics they describe, a long-standing objective in applied analysis, and also highly relevant in engineering applications. The key concept I use in each problem addressed in this thesis is coarse graining, which is a strategy for connecting fine representations or models with coarser representations. Often this idea is invoked to reduce a large discrete system to an appropriate continuum description, e.g. individual particles are represented by a continuous density. While there is no general theory of coarse graining, one closely related mathematical approach is asymptotic analysis, i.e. the description of limiting behavior as some parameter becomes very large or very small. In the case of crystalline solids, it is natural to consider cases where the number of particles is large or where the lattice spacing is small. Limits such as these often make explicit the nature of links between models capturing different scales, and, once established, provide a means of improving our understanding, or the models themselves. Finding appropriate variables whose limits illustrate the important connections between models is no easy task, however. This is one area where computer simulation is extremely helpful, as it allows us to see the results of complex dynamics and gather clues regarding the roles of different physical quantities. On the other hand, connections between models enable the development of novel multiscale computational schemes, so understanding can assist computation and vice versa. Some of these ideas are demonstrated in this thesis. 
The important outcomes of this thesis include: (1) a systematic derivation of the step-flow model of Burton, Cabrera, and Frank, with corrections, from atomistic solid-on-solid-type models in 1+1 dimensions; (2) the inclusion of an atomistically motivated transport mechanism in an island dynamics model allowing for a more detailed account of mound evolution; and (3) the development of a hybrid discrete-continuum scheme for simulating the relaxation of a faceted crystal mound. Central to all of these modeling and simulation efforts is the presence of steps composed of individual layers of atoms on vicinal crystal surfaces. Consequently, a recurring theme in this research is the observation that mesoscale defects play a crucial role in crystal morphological evolution.
Abstract:
This thesis aims to describe and demonstrate a concept developed to facilitate the use of thermal simulation tools during the building design process. Despite the impact of architectural elements on the performance of buildings, some influential decisions are frequently based solely on qualitative information. Even though such design support is adequate for most decisions, the designer will eventually have doubts concerning the performance of some design decisions. These situations require some kind of additional knowledge to be properly approached. The concept of designerly ways of simulating focuses on the formulation and solution of design dilemmas, which are doubts about the design that cannot be fully understood nor solved without using quantitative information. The concept intends to combine the power of analysis from computer simulation tools with the capacity of synthesis from architects. Three types of simulation tools are considered: solar analysis, thermal/energy simulation, and CFD. Design dilemmas are formulated and framed according to the architect's reflection process about performance aspects. Throughout the thesis, the problem is investigated in three fields: professional, technical, and theoretical. This approach to distinct parts of the problem aimed to i) characterize different professional categories with regard to their design practice and use of tools, ii) review previous research on the use of simulation tools, and iii) draw analogies between the proposed concept and concepts developed or described in previous works on design theory. The proposed concept was tested in eight design dilemmas extracted from three case studies in the Netherlands. The three investigated processes are houses designed by Dutch architectural firms. Relevant information and criteria for each case study were obtained through interviews and conversations with the architects involved.
The practical application, despite its success in the research context, allowed the identification of some limitations on the applicability of the concept, concerning the architects' need for technical knowledge and the current stage of evolution of simulation tools.
Abstract:
This volume presents a collection of papers covering applications from a wide range of systems with infinitely many degrees of freedom studied using techniques from stochastic and infinite dimensional analysis, e.g. Feynman path integrals, the statistical mechanics of polymer chains, complex networks, and quantum field theory. Systems of infinitely many degrees of freedom create their particular mathematical challenges which have been addressed by different mathematical theories, namely in the theories of stochastic processes, Malliavin calculus, and especially white noise analysis. These proceedings are inspired by a conference held on the occasion of Prof. Ludwig Streit’s 75th birthday and celebrate his pioneering and ongoing work in these fields.
Abstract:
The toe-to-heel air injection (THAI™) method is an enhanced oil recovery process that integrates in-situ combustion with technological advances in drilling horizontal wells. The method uses horizontal wells as oil producers while keeping vertical wells for air injection. Since this process has not yet been applied in Brazil, these new technologies must be evaluated against local conditions; this study therefore performed a parametric study of in-situ combustion with oil production through horizontal wells, using a semi-synthetic reservoir with characteristics of a Brazilian Northeast basin. The simulations were performed in the commercial simulator STARS (Steam, Thermal, and Advanced Processes Reservoir Simulator) from CMG (Computer Modelling Group). The following operating parameters were analyzed: air rate, producer well configuration, and oxygen concentration. A sensitivity study on cumulative oil production (Np) was performed using the technique of experimental design, with a mixed two- and three-level model (3²×2²), for a total of 36 runs. A techno-economic estimate was also made for each fluid model. The results showed that the injection rate was the most influential parameter on oil recovery for both models studied, that the best well arrangement depends on the fluid model, and that higher oxygen concentration favors oil recovery. The process can be profitable depending on the air rate.
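The mixed 3²×2² experimental design mentioned above can be enumerated directly. The factor names and level labels below are illustrative assumptions; only the run count (9 × 4 = 36) comes from the abstract:

```python
from itertools import product

# Two factors at three levels and two factors at two levels.
# Names and levels are hypothetical, for illustration only.
three_level = {
    "air_rate": ["low", "mid", "high"],
    "oxygen_concentration": ["21%", "30%", "40%"],
}
two_level = {
    "producer_config": ["A", "B"],
    "fluid_model": ["light", "heavy"],
}

factors = {**three_level, **two_level}
# Cartesian product of all level lists gives every run of the design.
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(runs))  # 3 * 3 * 2 * 2 = 36 runs
```

Each entry of `runs` is one simulation configuration, which is how a full factorial sensitivity study is typically driven.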
Abstract:
Master's dissertation, Universidade de Brasília, Instituto de Ciências Exatas, Departamento de Ciência da Computação, 2016.
Abstract:
Management theories have been based, almost without exception, on the foundations and models of classical science (particularly the models of Newtonian physics). Today, however, organizations face a globalized world that is flooded with information (and not necessarily knowledge), hyperconnected, dynamic, and laden with uncertainty, so many of these theories may prove limited for organizations; perhaps not because of their structure, logic, or scope, but because of the lack of criteria justifying their application. In many cases, organizations still rely on intuition, assumptions, and half-truths when making decisions. This picture highlights two facts: on the one hand, the need for a method that makes it possible to understand each organization's situation in order to support decision-making; on the other, the need to strengthen intuition with non-traditional models and techniques (usually originating from, or inspired by, engineering). This work seeks to anticipate the pillars of a possible method for supporting decision-making through the simulation of computational models, drawing on the possible interactions between model-based management, computational organization science, and emergent engineering.
Abstract:
This study evaluated mandibular biomechanics in the posterior dentition based on experimental and computational analyses. The analyses were performed on a model of a human mandible, built in epoxy resin for photoelastic analysis and in computer-aided design for finite element analysis. To standardize the evaluation, specific areas were defined on the lateral surface of the mandibular body. The photoelastic analysis was configured with a vertical load on the first upper molar and a fixed support at the ramus of the mandible; the same configuration was used in the computer simulation. Force magnitudes of 50, 100, 150, and 200 N were applied to evaluate the bone stress. The stress results showed a similar distribution in both analyses, with the most intense stress at the retromolar area, the oblique line, and the alveolar process at the molar level. This study demonstrated the similarity of results between the experimental and computational analyses and thus the importance of biomechanical characterization of the morphology of the posterior dentition.
Abstract:
OBJECTIVE: To develop a computer simulation of ablation for producing customized contact lenses to correct high-order aberrations. METHODS: Using real data from a keratoconus patient, measured on a wavefront aberrometer with a Hartmann-Shack sensor, we determined the contact lens thicknesses that compensate for these aberrations, as well as the number of pulses needed to ablate the lenses specifically for this patient. RESULTS: The correction maps are presented, and the numbers of pulses were calculated using beams 0.5 mm wide and an ablation depth of 0.3 µm. CONCLUSIONS: The simulated results were promising, but they still need refinement before a real ablation system can achieve the desired precision.
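A rough sketch of the per-point pulse-count arithmetic implied by a fixed 0.3 µm ablation depth per pulse (the function and its neglect of beam overlap are simplifications for illustration, not the paper's algorithm):

```python
import math

PULSE_DEPTH_UM = 0.3  # ablation depth per pulse, from the abstract

def pulses_needed(thickness_um):
    """Pulses required to remove a given lens thickness at one point,
    assuming each pulse ablates a fixed depth. The real 0.5 mm wide
    beam overlaps neighbouring points, which this sketch ignores."""
    n = thickness_um / PULSE_DEPTH_UM
    return math.ceil(round(n, 9))  # round guards against float noise

print(pulses_needed(3.0))  # a 3 um correction needs 10 pulses
```

Applying this over a grid of required thicknesses, one per lens location, would yield a pulse map analogous to the correction maps the paper presents.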
Abstract:
This article describes the design, implementation, and experiences with AcMus, an open and integrated software platform for room acoustics research, which comprises tools for measurement, analysis, and simulation of rooms for music listening and production. Through use of affordable hardware, such as laptops, consumer audio interfaces and microphones, the software allows evaluation of relevant acoustical parameters with stable and consistent results, thus providing valuable information in the diagnosis of acoustical problems, as well as the possibility of simulating modifications in the room through analytical models. The system is open-source and based on a flexible and extensible Java plug-in framework, allowing for cross-platform portability, accessibility and experimentation, thus fostering collaboration of users, developers and researchers in the field of room acoustics.
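One of the standard room-acoustics parameters a platform like AcMus evaluates is reverberation time, which Sabine's classic formula estimates from room volume and surface absorption. The sketch below is a generic textbook calculation, not AcMus code:

```python
def sabine_rt60(volume_m3, absorption_areas):
    """Reverberation time (seconds) via Sabine's formula
    RT60 = 0.161 * V / A, where A is the total absorption in sabins.

    absorption_areas: list of (surface_area_m2, absorption_coefficient).
    """
    A = sum(s * alpha for s, alpha in absorption_areas)  # total absorption
    return 0.161 * volume_m3 / A

# A hypothetical 100 m^3 listening room: 60 m^2 of walls/ceiling
# (alpha = 0.1) and 20 m^2 of carpet (alpha = 0.4).
rt = sabine_rt60(100, [(60, 0.1), (20, 0.4)])
```

Adding absorptive material lowers the predicted RT60, which is the kind of "simulated modification" the article describes for diagnosing acoustical problems analytically.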
Abstract:
Enzymes are extremely efficient catalysts. Here, some of the mechanisms proposed to explain this catalytic power will be compared with quantitative experimental results and computer simulations. The influence of the enzymatic environment on species along the reaction coordinate will be analysed. The concepts of transition state stabilisation and reactant destabilisation will be confronted. The divided site model and near-attack conformation hypotheses will also be discussed. Molecular interactions such as covalent catalysis, general acid-base catalysis, electrostatics, entropic effects, steric hindrance, and quantum and dynamical effects will also be analysed as sources of catalysis. Reaction mechanisms, in particular that catalysed by protein tyrosine phosphatases, illustrate these concepts.
Abstract:
Shallow subsurface layers of gold nanoclusters were formed in polymethylmethacrylate (PMMA) polymer by very low energy (49 eV) gold ion implantation. The ion implantation process was modeled by computer simulation, which accurately predicted the layer depth and width. Transmission electron microscopy (TEM) was used to image the buried layer and individual nanoclusters; the layer width was approximately 6-8 nm and the cluster diameter approximately 5-6 nm. Surface plasmon resonance (SPR) absorption effects were observed by UV-visible spectroscopy. The TEM and SPR results were related to prior measurements of the electrical conductivity of Au-doped PMMA, and excellent consistency was found with a model of electrical conductivity in which, at low implantation dose, the individual nanoclusters are separated and do not physically touch, while at higher implantation dose the nanoclusters touch each other to form a random resistor network (percolation model). (C) 2009 American Vacuum Society. [DOI: 10.1116/1.3231449]
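The percolation picture behind the conductivity model can be illustrated with a small site-percolation simulation, in which the occupation probability stands in for implantation dose and a left-to-right path of touching sites stands in for a conducting cluster network. This is a generic sketch, not the paper's model:

```python
import random

def spans(grid):
    """True if occupied sites connect the left edge to the right edge
    (4-neighbour connectivity), i.e. a conducting path exists."""
    n = len(grid)
    frontier = [(r, 0) for r in range(n) if grid[r][0]]
    seen = set(frontier)
    while frontier:
        r, c = frontier.pop()
        if c == n - 1:
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < n and 0 <= nc < n and grid[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                frontier.append((nr, nc))
    return False

def spanning_probability(p, n=20, trials=200, seed=0):
    """Fraction of random n x n grids (each site occupied with
    probability p) that conduct across the sample."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        grid = [[rng.random() < p for _ in range(n)] for _ in range(n)]
        hits += spans(grid)
    return hits / trials
```

Below the square-lattice site-percolation threshold (about 0.59) spanning is rare, and above it spanning becomes near certain, mirroring the abrupt onset of conductivity as dose increases.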
Abstract:
PMMA (polymethylmethacrylate) was ion implanted with gold at very low energy and over a range of different doses using a filtered cathodic arc metal plasma system. A nanometer scale conducting layer was formed, fully buried below the polymer surface at low implantation dose, and evolving to include a gold surface layer as the dose was increased. Depth profiles of the implanted material were calculated using the Dynamic TRIM computer simulation program. The electrical conductivity of the gold-implanted PMMA was measured in situ as a function of dose. Samples formed at a number of different doses were subsequently characterized by Rutherford backscattering spectrometry, and test patterns were formed on the polymer by electron beam lithography. Lithographic patterns were imaged by atomic force microscopy and demonstrated that the contrast properties of the lithography were well maintained in the surface-modified PMMA.