930 results for Heat -- Transmission -- Computer simulation
Abstract:
The nature of Discrete-Event Simulation (DES) and the use of DES in organisations are changing. Two important developments are the use of Visual Interactive Modelling systems and the use of DES in Business Process Management (BPM) projects. Survey research is presented showing that, despite these developments, usage of DES remains relatively low owing to a lack of knowledge of the benefits of the technique. This paper considers two factors that could lead to greater achievement and appreciation of the full benefit of DES and thus to greater usage. Firstly, when using DES to investigate social systems, a 'soft' approach, both in the process of undertaking a simulation project and in the interpretation of the findings, may generate more knowledge from the DES intervention and thus increase its benefit to businesses. Secondly, in order to assess the full range of outcomes of DES, the technique could be considered from the perspective of an information processing tool within the organisation. This allows outcomes to be considered under the three modes of organisational information use, sense making, knowledge creating and decision making, which relate to the theoretical areas of knowledge management, organisational learning and decision making respectively. The association of DES with these popular techniques could further increase its usage in business.
Abstract:
The following thesis presents results obtained from both numerical simulation and laboratory experimentation (both of which were carried out by the author). When data is propagated along an optical transmission line, timing irregularities such as timing jitter and phase wander can occur. Traditionally these timing problems would have been corrected by converting the optical signal into the electrical domain and compensating for the timing irregularity before converting the signal back into the optical domain. However, this thesis poses a potential solution to the problem that remains completely in the optical domain, eliminating the need for electronics. This is desirable because optical processing not only reduces the latency that its electronic counterparts introduce, it also holds the possibility of an increase in overall speed. A scheme was proposed which utilises the principle of wavelength conversion to dynamically convert timing irregularities (timing jitter and phase wander) into a change in wavelength; this occurs on a bit-by-bit level, so timing jitter and phase wander can be compensated for simultaneously. This was achieved by optically sampling a linearly chirped, locally generated clock source (the sampling function was performed by a nonlinear optical loop mirror). The data, now with each bit or code word having a unique wavelength, is then propagated through a dispersion compensation module. The dispersion compensation effectively re-aligns the data in time, and thus the timing irregularities are removed. The principle of operation was tested using computer simulation before being re-tested in a laboratory environment. A second stage was added to the device to create 3R regeneration. The second stage simply converts the timing-suppressed data back into a single wavelength. By controlling the relative timing displacement between stage one and stage two, the wavelength that is finally produced can be controlled.
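The compensation principle described in this abstract can be sketched numerically. In this toy model (every parameter is invented for illustration, not taken from the thesis), a linearly chirped clock maps each bit's timing offset to a wavelength offset, and a dispersion module whose dispersion-times-length product is chosen as the negative reciprocal of the chirp rate cancels that offset again:

```python
# Toy model of the jitter-to-wavelength compensation scheme; all numbers illustrative.
C = 0.5e-3         # chirp rate of the sampled clock, nm per ps (assumed)
DL = -1.0 / C      # dispersion x length of the compensation module, ps per nm

lambda0 = 1550.0                          # nominal wavelength, nm
jitter_ps = [-3.2, 0.7, 1.9, -0.4, 2.5]   # per-bit timing offsets (made up)

residual = []
for dt in jitter_ps:
    wl = lambda0 + C * dt        # sampling the chirped clock: time -> wavelength
    shift = DL * (wl - lambda0)  # group-delay shift in the dispersion module
    residual.append(dt + shift)  # timing error left after compensation

print(residual)  # all essentially zero: the jitter has been mapped out
```

Because the shift is linear in wavelength, the same arithmetic removes slow phase wander and fast jitter alike, which mirrors the bit-by-bit claim in the abstract.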
Abstract:
The research is concerned with the application of the computer simulation technique to study the performance of reinforced concrete columns in a fire environment. The effect of three different concrete constitutive models, incorporated in the computer simulation, on the structural response of reinforced concrete columns exposed to fire is investigated. The material models differed mainly in the formulation of the mechanical properties of concrete. The results from the simulation clearly illustrate that a more realistic response of a reinforced concrete column exposed to fire is given by a constitutive model with transient creep or an appropriate strain effect. The relative effect of the three concrete material models is assessed through a parametric study, carried out using the results from a series of analyses on columns heated on three sides, which produces substantial thermal gradients. Three different loading conditions were used on the column: axial loading, and eccentric loading inducing moments either in the same sense or in the opposite sense to those induced by the thermal gradient. An axially loaded column heated on four sides was also considered. The computer modelling technique adopted separated the thermal and structural responses into two distinct computer programs. A finite element heat transfer analysis was used to determine the thermal response of the reinforced concrete columns when exposed to the ISO 834 furnace environment. The temperature distribution histories obtained were then used in conjunction with a structural response program. The effect of the occurrence of spalling on the structural behaviour of reinforced concrete columns is also investigated. There is general recognition of the potential problems of spalling, but no real investigation into what effect spalling has on the fire resistance of reinforced concrete members.
In an attempt to address the situation, a method has been developed to model concrete columns exposed to fire which incorporates the effect of spalling. A total of 224 computer simulations were undertaken by varying the amounts of concrete lost during a specified period of exposure to fire. An array of six percentages of spalling was chosen for one range of simulations, while a two-stage progressive spalling regime was used for a second range. The quantification of the reduction in fire resistance of the columns against the amount of spalling, the heating and loading patterns, and the time at which the concrete spalls appears to indicate that it is the amount of spalling which is the most significant variable in the reduction of fire resistance.
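The ISO 834 furnace environment referenced in this abstract follows a standard temperature-time curve, T = 20 + 345 log10(8t + 1), with T in degrees Celsius and t in minutes. A minimal sketch of the gas-temperature boundary condition such heat-transfer analyses start from:

```python
import math

def iso834_temperature(t_min):
    """Gas temperature (deg C) of the ISO 834 standard fire after t_min minutes."""
    return 20.0 + 345.0 * math.log10(8.0 * t_min + 1.0)

# Classic reference points of the standard curve
for t in (30, 60, 120):
    print(t, "min ->", round(iso834_temperature(t)), "deg C")
```

The curve rises steeply at first (roughly 840 deg C already at 30 minutes), which is why thermal gradients across a column heated on three sides become severe so quickly.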
Abstract:
This paper presents a discrete event simulation study to examine tenancy service performance in a shopping centre. The study aims to provide an understanding of how informal management mechanisms could enhance existing ERP systems. The research shows the potential benefits of combining the traditional strengths of ERP, namely better performance in terms of efficiency, with the ability to react flexibly to customers' requests. © 2012 SIMULATION COUNCILS, INC.
Abstract:
The UK Police Force is required to operate communications centres under increased funding constraints. Staff represent the main cost in operating the facility and the key issue for the efficient deployment of staff, in this case call handler staff, is to try to ensure sufficient staff are available to make a timely response to customer calls when the timing of individual calls is difficult to predict. A discrete-event simulation study is presented of an investigation of a new shift pattern for call handler staff that aims to improve operational efficiency. The communications centre can be considered a specialised case of a call centre but an important issue for Police Force management is the particularly stressful nature of the work staff are involved with when responding to emergency calls. Thus decisions regarding changes to the shift system were made in the context of both attempting to improve efficiency by matching staff supply with customer demand, but also ensuring a reasonable workload pattern for staff over time.
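The abstract's core staffing trade-off, ensuring enough call handlers are on shift when call timing is unpredictable, is often first approximated analytically with the Erlang C queueing formula before a full discrete-event study. This is a sketch of that standard formula with made-up call volumes, not the paper's simulation model:

```python
import math

def erlang_c(a, n):
    """Probability an arriving call must wait (M/M/n queue, offered load a erlangs)."""
    if n <= a:
        return 1.0                      # unstable: everyone waits
    top = (a ** n / math.factorial(n)) * (n / (n - a))
    bottom = sum(a ** k / math.factorial(k) for k in range(n)) + top
    return top / bottom

def agents_needed(calls_per_hour, handle_time_min, max_wait_prob):
    """Smallest number of call handlers keeping P(wait) below max_wait_prob."""
    a = calls_per_hour * handle_time_min / 60.0   # offered load in erlangs
    n = max(1, math.ceil(a))
    while erlang_c(a, n) > max_wait_prob:
        n += 1
    return n

# Hypothetical demand: 120 calls/hour, 5-minute handling, at most 20% of calls waiting
print(agents_needed(120, 5, 0.2))
```

Repeating this calculation per time band of the day gives the demand profile a shift pattern then has to cover, which is where the workload-fairness considerations in the study come in.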
Abstract:
Computer simulation has been used to study the structure and dynamics of methane in hydrated sodium montmorillonite clays under conditions encountered in sedimentary basins. Systems containing approximately one, two, three and four molecular layers of water have followed gradients of 150 bar km⁻¹ and 30 K km⁻¹, to a maximum burial depth of 6 km (900 bar and 460 K). Methane is coordinated to approximately 19 oxygen atoms, of which typically 6 are provided by the clay surface. Only in the three- and four-layer hydrates is methane able to leave the clay surface. Diffusion depends strongly on the porosity (water content) and burial depth: self-diffusion coefficients are in the range 0.12 × 10⁻⁹ m² s⁻¹ for water and 0.04 × 10⁻⁹ m² s⁻¹ < D < 8.64 × 10⁻⁹ m² s⁻¹ for methane. Bearing in mind that porosity decreases with burial depth, it is estimated that maximum diffusion occurs at around 3 km. This is in good agreement with the known location of methane reservoirs in sedimentary basins.
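Self-diffusion coefficients like those quoted above are conventionally extracted from simulation trajectories via the Einstein relation, D = ⟨|r(t) - r(0)|²⟩ / (6t) in three dimensions. A minimal sketch of that estimator, using a synthetic random walk in place of real molecular-dynamics data (step size and time units are arbitrary):

```python
import random

def estimate_diffusion(n_steps, dt, step_sigma, n_walkers=500, seed=1):
    """Estimate D from synthetic 3-D random walks via D = MSD / (6 t)."""
    random.seed(seed)
    msd = 0.0
    for _ in range(n_walkers):
        x = y = z = 0.0
        for _ in range(n_steps):
            x += random.gauss(0.0, step_sigma)
            y += random.gauss(0.0, step_sigma)
            z += random.gauss(0.0, step_sigma)
        msd += x * x + y * y + z * z      # squared displacement at time n_steps*dt
    msd /= n_walkers                      # mean squared displacement
    return msd / (6.0 * n_steps * dt)

# For Gaussian steps the exact answer is sigma^2 / (2 dt) = 0.5 here
print(estimate_diffusion(200, 1.0, 1.0))
```

In a real clay simulation the walkers would be methane molecules and the MSD would be averaged over time origins, but the arithmetic of the estimator is the same.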
Abstract:
Swallowable capsule endoscopy is used for non-invasive diagnosis of some gastrointestinal (GI) organs. However, control over the position of the capsule is a major unresolved issue. This study presents a design for steering the capsule based on magnetic levitation. The levitation is stabilized with the aid of a computer-aided feedback control system and diamagnetism. The peristaltic and gravitational forces to be overcome were calculated. A levitation setup was built to analyze the feasibility of using Hall-effect sensors to locate the capsule in vivo. The CAD software Maxwell 3D (Ansoft, Pittsburgh, PA) was used to determine the dimensions of the resistive electromagnets required for levitation, and the feasibility of building them was examined. A comparison based on design complexity was made between supine and upright positioning of the patient.
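Static magnetic levitation is open-loop unstable, which is why the design above leans on feedback (and diamagnetism) for stabilization. The idea can be sketched with a linearized one-axis toy model and a discrete PD controller; every mass, stiffness, and gain below is invented for illustration, not taken from the study:

```python
# Linearized one-axis toy model: m * z'' = k_unstable * z + u.
# Without feedback the capsule diverges; a PD law stabilizes it.
m = 0.004           # capsule mass, kg (assumed)
k_unstable = 2.0    # destabilizing magnetic stiffness, N/m (assumed)
kp, kd = 8.0, 0.2   # PD gains, hand-tuned for this toy model
dt = 1e-3           # controller time step, s

z, v = 0.002, 0.0   # start 2 mm away from the set point
for _ in range(5000):                 # 5 s of simulated time
    u = -kp * z - kd * v              # feedback force from sensed position
    a = (k_unstable * z + u) / m      # net acceleration
    v += a * dt                       # explicit Euler integration
    z += v * dt

print(abs(z))  # the capsule has settled back at the set point
```

In the study the "sensed position" would come from the Hall-effect sensors, and the control force from the resistive electromagnets sized in Maxwell 3D.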
Abstract:
We apply Agent-Based Modeling and Simulation (ABMS) to investigate a set of problems in a retail context. Specifically, we are working to understand the relationship between human resource management practices and retail productivity. Despite the fact that we are working within a relatively novel and complex domain, it is clear that intelligent agents do offer potential for developing organizational capabilities in the future. Our multi-disciplinary research team has worked with a UK department store to collect data and capture perceptions about operations from actors within departments. Based on this case study work, we have built a simulator that we present in this paper. We then use the simulator to gather empirical evidence regarding two specific management practices: empowerment and employee development.
Abstract:
In our research we investigate the output accuracy of discrete-event simulation models and agent-based simulation models when studying human-centric complex systems. In this paper we focus on human reactive behaviour, as it is possible in both modelling approaches to implement human reactive behaviour in the model using standard methods. As a case study we have chosen the retail sector, and in particular the operations of the fitting room in the womenswear department of a large UK department store. In our case study we looked at ways of determining the efficiency of implementing new management policies for the fitting room operation through modelling the reactive behaviour of staff and customers of the department. First, we carried out a validation experiment in which we compared the results from our models to the performance of the real system. This experiment also allowed us to establish differences in output accuracy between the two modelling methods. In a second step, a multi-scenario experiment was carried out to study the behaviour of the models when they are used for the purpose of operational improvement. Overall we have found that, for our case study example, both discrete-event simulation and agent-based simulation have the same potential to support the investigation into the efficiency of implementing new management policies.
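A fitting-room operation of the kind studied here reduces, in its simplest form, to a FIFO queue feeding a fixed number of cubicles. A minimal discrete-event sketch of that baseline (arrival and try-on rates are hypothetical, and this is deliberately far simpler than the authors' reactive-behaviour models):

```python
import heapq
import random

def simulate_fitting_room(n_customers, n_cubicles, mean_interarrival, mean_try_on, seed=7):
    """FIFO queue feeding n_cubicles; returns the mean customer wait."""
    random.seed(seed)
    free_at = [0.0] * n_cubicles        # time each cubicle next becomes free
    heapq.heapify(free_at)
    t = total_wait = 0.0
    for _ in range(n_customers):
        t += random.expovariate(1.0 / mean_interarrival)   # next arrival time
        start = max(t, heapq.heappop(free_at))             # first free cubicle
        total_wait += start - t                            # time spent queueing
        heapq.heappush(free_at, start + random.expovariate(1.0 / mean_try_on))
    return total_wait / n_customers

# 4 cubicles, one arrival per minute, 3-minute try-ons (75% utilization)
print(simulate_fitting_room(20000, 4, 1.0, 3.0))
```

For these rates, M/M/4 queueing theory predicts a mean wait of roughly 1.5 minutes, which the simulation reproduces; the paper's models add the staff and customer reactive behaviour that such a baseline omits.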
Abstract:
A primary goal of this dissertation is to understand the links between mathematical models that describe crystal surfaces at three fundamental length scales: The scale of individual atoms, the scale of collections of atoms forming crystal defects, and macroscopic scale. Characterizing connections between different classes of models is a critical task for gaining insight into the physics they describe, a long-standing objective in applied analysis, and also highly relevant in engineering applications. The key concept I use in each problem addressed in this thesis is coarse graining, which is a strategy for connecting fine representations or models with coarser representations. Often this idea is invoked to reduce a large discrete system to an appropriate continuum description, e.g. individual particles are represented by a continuous density. While there is no general theory of coarse graining, one closely related mathematical approach is asymptotic analysis, i.e. the description of limiting behavior as some parameter becomes very large or very small. In the case of crystalline solids, it is natural to consider cases where the number of particles is large or where the lattice spacing is small. Limits such as these often make explicit the nature of links between models capturing different scales, and, once established, provide a means of improving our understanding, or the models themselves. Finding appropriate variables whose limits illustrate the important connections between models is no easy task, however. This is one area where computer simulation is extremely helpful, as it allows us to see the results of complex dynamics and gather clues regarding the roles of different physical quantities. On the other hand, connections between models enable the development of novel multiscale computational schemes, so understanding can assist computation and vice versa. Some of these ideas are demonstrated in this thesis. 
The important outcomes of this thesis include: (1) a systematic derivation of the step-flow model of Burton, Cabrera, and Frank, with corrections, from an atomistic solid-on-solid-type model in 1+1 dimensions; (2) the inclusion of an atomistically motivated transport mechanism in an island dynamics model, allowing for a more detailed account of mound evolution; and (3) the development of a hybrid discrete-continuum scheme for simulating the relaxation of a faceted crystal mound. Central to all of these modeling and simulation efforts is the presence of steps, composed of individual layers of atoms, on vicinal crystal surfaces. Consequently, a recurring theme in this research is the observation that mesoscale defects play a crucial role in crystal morphological evolution.
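The atomistic solid-on-solid picture referred to in outcome (1) can be illustrated with a toy 1+1-dimensional Monte Carlo: column heights h[i] evolve by single-atom hops between neighbouring columns, with a step energy proportional to the total height variation. This sketch (all parameters invented) only demonstrates the class of model, not the thesis's derivation:

```python
import math
import random

def relax_mound(n=64, moves=40000, beta=2.0, seed=3):
    """Metropolis dynamics for a periodic 1+1D solid-on-solid surface with
    step energy E = sum_i |h[i+1] - h[i]| and atom hops between neighbours."""
    random.seed(seed)
    h = [min(i, n - i) for i in range(n)]                    # faceted mound
    energy = lambda: sum(abs(h[(i + 1) % n] - h[i]) for i in range(n))
    e_start = e = energy()
    for _ in range(moves):
        i = random.randrange(n)
        j = (i + random.choice((-1, 1))) % n
        h[i] -= 1
        h[j] += 1                                            # propose an atom hop
        e_new = energy()
        if e_new <= e or random.random() < math.exp(-beta * (e_new - e)):
            e = e_new                                        # accept the hop
        else:
            h[i] += 1
            h[j] -= 1                                        # reject: undo it
    return e_start, e, h

e0, e1, h = relax_mound()
print(e0, "->", e1)  # the step energy relaxes well below its initial value
```

Note that the hops conserve the total number of atoms, so the mound flattens only by moving mass sideways, the same surface-transport constraint that makes faceted-mound relaxation a natural target for the hybrid discrete-continuum scheme in outcome (3).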
MINING AND VERIFICATION OF TEMPORAL EVENTS WITH APPLICATIONS IN COMPUTER MICRO-ARCHITECTURE RESEARCH
Abstract:
Computer simulation programs are essential tools for scientists and engineers to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and ridden with unknowns that will be discovered by developers in an iterative process. To manage such complexity, advanced verification techniques for continually matching the intended model to the implemented model are necessary. Therefore, the main goal of this research work is to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is First-Order Logic Constraint Specification Language (FOLCSL) that enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part consists of mining the temporal flow of events using a newly developed representation called State Flow Temporal Analysis Graph (SFTAG). While the first part seeks an assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and hence enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms. 
This work improves the computer architecture research and verification processes as shown by the case studies and experiments that have been conducted.
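The first part of the framework, checking that model invariants hold over a simulator's event trace, can be illustrated with a much-simplified stand-in for what FOLCSL synthesizes (the event names and the "eventually followed by" invariant below are invented for illustration):

```python
def check_eventually(trace, trigger, response):
    """True iff every (trigger, id) event is later followed by (response, id)."""
    pending = set()
    for name, ident in trace:
        if name == trigger:
            pending.add(ident)        # obligation opened
        elif name == response:
            pending.discard(ident)    # obligation discharged
    return not pending                # leftovers never got their response

# Invented event trace from a hypothetical micro-architecture simulator
trace = [("fetch", 1), ("fetch", 2), ("commit", 1),
         ("fetch", 3), ("commit", 3), ("commit", 2)]
print(check_eventually(trace, "fetch", "commit"))  # True: invariant holds
```

A real FOLCSL-generated checker would be synthesized from a first-order specification rather than hand-written, but the runtime pattern is the same: stream the trace once and flag any obligation left unmet.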