83 results for Systems Simulation


Relevance:

30.00%

Publisher:

Abstract:

Since much knowledge is tacit, eliciting knowledge is a common bottleneck during the development of knowledge-based systems. Visual interactive simulation (VIS) has been proposed as a means for eliciting experts’ decision-making by getting them to interact with a visual simulation of the real system in which they work. In order to explore the effectiveness and efficiency of VIS based knowledge elicitation, an experiment has been carried out with decision-makers in a Ford Motor Company engine assembly plant. The model properties under investigation were the level of visual representation (2-dimensional, 2½-dimensional and 3-dimensional) and the model parameter settings (unadjusted and adjusted to represent more uncommon and extreme situations). The conclusion from the experiment is that using a 2-dimensional representation with adjusted parameter settings provides the better simulation-based means for eliciting knowledge, at least for the case modelled.

Relevance:

30.00%

Publisher:

Abstract:

The investigation of insulation debris generation, transport and sedimentation becomes important with regard to reactor safety research for PWR and BWR, when considering the long-term behavior of emergency core cooling systems during all types of loss of coolant accidents (LOCA). The insulation debris released near the break during a LOCA incident consists of a mixture of disparate particle populations that vary in size, shape, consistency and other properties. Some fractions of the released insulation debris can be transported into the reactor sump, where it may perturb or impinge on the emergency core cooling systems. Open questions of generic interest are the sedimentation of the insulation debris in a water pool, its possible re-suspension and transport in the sump water flow, and the particle load on strainers and the corresponding pressure drop. A joint research project on these questions is being performed in cooperation between the University of Applied Sciences Zittau/Görlitz and the Forschungszentrum Dresden-Rossendorf. The project deals with the experimental investigation of particle transport phenomena in coolant flow and the development of CFD models for its description. While the experiments are performed at the University of Applied Sciences Zittau/Görlitz, the theoretical modeling efforts are concentrated at Forschungszentrum Dresden-Rossendorf. In the presentation, the basic concepts for CFD modeling are described and feasibility studies, including the conceptual design of the experiments, are presented.

Relevance:

30.00%

Publisher:

Abstract:

The investigation of insulation debris generation, transport and sedimentation becomes important with regard to reactor safety research for PWR and BWR, when considering the long-term behavior of emergency core cooling systems during all types of loss of coolant accidents (LOCA). The insulation debris released near the break during a LOCA incident consists of a mixture of disparate particle populations that vary in size, shape, consistency and other properties. Some fractions of the released insulation debris can be transported into the reactor sump, where it may perturb or impinge on the emergency core cooling systems. Open questions of generic interest are the sedimentation of the insulation debris in a water pool, its possible re-suspension and transport in the sump water flow, and the particle load on strainers and the corresponding pressure drop. A joint research project on these questions is being performed in cooperation between the University of Applied Sciences Zittau/Görlitz and the Forschungszentrum Dresden-Rossendorf. The project deals with the experimental investigation of particle transport phenomena in coolant flow and the development of CFD models for its description. While the experiments are performed at the University of Applied Sciences Zittau/Görlitz, the theoretical modeling efforts are concentrated at Forschungszentrum Dresden-Rossendorf. In the current paper, the basic concepts for CFD modeling are described and feasibility studies, including the conceptual design of the experiments, are presented. Copyright © 2008 by ASME.

Relevance:

30.00%

Publisher:

Abstract:

The computer systems of today are characterised by data and program control that are distributed functionally and geographically across a network. A major issue of concern in this environment is the operating system activity of resource management for the different processors in the network. To ensure equity in load distribution and improved system performance, load balancing is often undertaken. The research conducted in this field so far has been primarily concerned with a small set of algorithms operating on tightly-coupled distributed systems. More recent studies have investigated the performance of such algorithms in loosely-coupled architectures, but using a small set of processors. This thesis describes a simulation model developed to study the behaviour and general performance characteristics of a range of dynamic load balancing algorithms. Further, the scalability of these algorithms is discussed and a range of regionalised load balancing algorithms is developed. In particular, we examine the impact of network diameter and delay on the performance of such algorithms across a range of system workloads. The results suggest that simple dynamic policies are scalable but lack the load stability of more complex global average algorithms.
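
The distinction drawn above between simple dynamic policies and global average algorithms can be illustrated with a brief sketch. The following Python fragment contrasts a local threshold rule with a rule that compares the local queue length against the system-wide average; the threshold and tolerance values are invented for illustration and are not the thesis's actual parameters.

    import random

    # Hypothetical illustration of two dynamic load-balancing policies of the kind
    # compared in the thesis: a simple threshold policy and a global-average policy.

    def threshold_policy(local_load, threshold=5):
        """Simple dynamic policy: offload only when the local queue exceeds a fixed threshold."""
        return local_load > threshold

    def global_average_policy(local_load, all_loads, tolerance=1.0):
        """Global-average policy: offload when the local queue exceeds the system-wide mean."""
        average = sum(all_loads) / len(all_loads)
        return local_load > average + tolerance

    # Toy example: ten processors with random queue lengths.
    loads = [random.randint(0, 10) for _ in range(10)]
    for node, load in enumerate(loads):
        print(node, load,
              "threshold->migrate" if threshold_policy(load) else "threshold->keep",
              "average->migrate" if global_average_policy(load, loads) else "average->keep")

The threshold rule needs only local state, which is why such policies scale, whereas the global-average rule requires every node's load and therefore more communication as the network grows.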

Relevance:

30.00%

Publisher:

Abstract:

Cellular manufacturing is widely acknowledged as one of the key approaches to achieving world-class performance in batch manufacturing operations. The design of cellular manufacturing systems (CMS) is therefore crucial in determining a company's competitiveness. This thesis postulated that, in order to be effective, the design of CMS should not only be systematic but also systemic. A systemic design uses the concepts of the body of work known as the 'systems approach' to ensure that a truly effective CMS is defined. The thesis examined the systems approach and created a systemic framework against which existing approaches to the design of CMS were evaluated. The most promising of these, Manufacturing Systems Engineering (MSE), was further investigated using a series of cross-sectional case-studies. Although, in practice, MSE proved to be less than systemic, it appeared to produce significant benefits. This seemed to suggest that CMS design did not need to be systemic to be effective. However, further longitudinal case-studies showed that the benefits claimed were at an operational level, not at a business level, and also that the performance of the whole system had not been evaluated. The deficiencies identified in the existing approaches to designing CMS were then addressed by the development of a novel CMS design methodology that fully utilised systems concepts. A key aspect of the methodology was the use of the Whole Business Simulator (WBS), a modelling and simulation tool that enabled the evaluation of CMS at operational and business levels. The most contentious aspects of the methodology were tested on a significant and complex case-study. The results of the exercise indicated that the systemic methodology was feasible.

Relevance:

30.00%

Publisher:

Abstract:

The research described in this thesis investigates three issues related to the use of expert systems for decision making in organizations: the effectiveness of ESs when used in different roles, to replace a human decision maker or to advise a human decision maker; the users' behaviour and opinions towards using an expert advisory system; and the possibility of organization-wide deployment of expert systems and the role of an ES at different organizational levels. The research was based on the development of expert systems within a business game environment, a simulation of a manufacturing company. This was chosen to give more control over the 'experiments' than would be possible in a real organization. An expert system (EXGAME) was developed, based on a structure derived from Anthony's three levels of decision making, to manage the simulated company in the business game itself with little user intervention. On the basis of EXGAME, an expert advisory system (ADGAME) was built to help game players to make better decisions in managing the game company. EXGAME and ADGAME are thus two expert systems in the same domain performing different roles; it was found that ADGAME had, in places, to be different from EXGAME, not simply an extension of it. EXGAME was tested several times against human rivals and was evaluated by measuring its performance. ADGAME was also tested by different users and was assessed by measuring the users' performance and analysing their opinions towards it as a helpful decision making aid. The results showed that an expert system was able to replace a human at the operational level, but had difficulty at the strategic level. They also showed the success of organization-wide deployment of expert systems in this simulated company.

Relevance:

30.00%

Publisher:

Abstract:

The main theme of this project is the study of neural networks for the control of uncertain and non-linear systems. This involves the control of continuous time, discrete time, hybrid and stochastic systems with input, state or output constraints, while ensuring good performance. A great part of this project is devoted to bridging several mathematical and engineering approaches in order to tackle complex but very common non-linear control problems. The objectives are: 1. to design and develop procedures for neural network enhanced self-tuning adaptive non-linear control systems; 2. to design, as a general procedure, a neural network generalised minimum variance self-tuning controller for non-linear dynamic plants (integration of neural network mapping with generalised minimum variance self-tuning controller strategies); 3. to develop a software package to evaluate control system performance using Matlab, Simulink and the Neural Network toolbox. An adaptive control algorithm utilising a recurrent network as a model of a partially unknown non-linear plant with unmeasurable state is proposed. It appears that structured recurrent neural networks can provide conveniently parameterised dynamic models for many non-linear systems for use in adaptive control. Properties of static neural networks, which enabled successful design of stable adaptive control in the state feedback case, are also identified. A survey of the existing results is presented which puts them in a systematic framework, showing their relation to classical self-tuning adaptive control and the application of neural control to SISO/MIMO systems. Simulation results demonstrate that the self-tuning design methods may be practically applicable to a reasonably large class of unknown linear and non-linear dynamic control systems.
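
A minimal sketch may help make the identification step concrete. The Python fragment below trains a small neural network online to learn a one-step-ahead model of an unknown non-linear plant, the kind of model a self-tuning controller would then exploit; the toy plant, network size and learning rate are assumptions for illustration and are not the controllers developed in the thesis.

    import numpy as np

    rng = np.random.default_rng(0)

    def plant(y, u):
        # Toy unknown non-linear plant: y(k+1) = f(y(k), u(k)); illustrative only.
        return 0.6 * np.sin(y) + 0.3 * u + 0.1 * u ** 2

    # One hidden layer with tanh units, trained by online gradient descent.
    n_hidden = 10
    W1 = rng.normal(scale=0.5, size=(n_hidden, 2))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=n_hidden)
    lr = 0.05

    y = 0.0
    for k in range(5000):
        u = rng.uniform(-1.0, 1.0)            # excitation input
        x = np.array([y, u])                  # network input [y(k), u(k)]
        h = np.tanh(W1 @ x + b1)
        y_hat = W2 @ h                        # predicted y(k+1)
        y_next = plant(y, u)                  # measured plant output
        err = y_hat - y_next
        # Back-propagate the one-step prediction error.
        W2 -= lr * err * h
        dh = err * W2 * (1.0 - h ** 2)
        W1 -= lr * np.outer(dh, x)
        b1 -= lr * dh
        y = y_next

    print("final squared prediction error:", err ** 2)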

Relevance:

30.00%

Publisher:

Abstract:

Computerised production control developments have concentrated on Manufacturing Resources Planning (MRP II) systems. The literature suggests, however, that despite the massive investment in hardware, software and management education, successful implementation of such systems in manufacturing industries has proved difficult. This thesis reviews the development of production planning and control systems and, in particular, investigates the causes of failures in implementing MRP/MRP II systems in industrial environments, arguing that the centralised and top-down planning structure, as well as the routine operational methodology of such systems, is inherently prone to failure. The thesis reviews the control benefits of cellular manufacturing systems but concludes that in more dynamic manufacturing environments, techniques such as Kanban are inappropriate. The basic shortcomings of MRP II systems are highlighted and a new enhanced operational methodology based on distributed planning and control principles is introduced. Distributed Manufacturing Resources Planning (DMRP) was developed as a capacity sensitive production planning and control solution for cellular manufacturing environments. The system utilises cell-based, independently operated MRP II systems, integrated into a plant-wide control system through a Local Area Network. The potential benefits of adopting the system in industrial environments are discussed and the results of computer simulation experiments to compare the performance of the DMRP system against conventional MRP II systems are presented. The DMRP methodology is shown to offer significant potential advantages, which include ease of implementation, cost effectiveness, capacity sensitivity, shorter manufacturing lead times, lower work-in-progress levels and improved customer service.
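
To make the cell-based planning idea concrete, the following Python sketch shows the netting step an MRP-style calculation performs for one item in one cell; in a DMRP-like arrangement each cell would run such a calculation independently and exchange the results over the plant network. The figures and field names are illustrative assumptions, not details of the DMRP system itself.

    # Per-cell netting step for one item over a planning horizon (illustrative only).
    def net_requirements(gross, on_hand, scheduled_receipts):
        """Return planned order quantities per period for one cell/item."""
        planned = []
        available = on_hand
        for period, demand in enumerate(gross):
            available += scheduled_receipts[period]
            net = demand - available
            if net > 0:
                planned.append(net)      # release a planned order to cover the shortfall
                available = 0
            else:
                planned.append(0)
                available -= demand
        return planned

    # Example: one item over four periods.
    print(net_requirements(gross=[20, 30, 10, 40],
                           on_hand=25,
                           scheduled_receipts=[0, 10, 0, 0]))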

Relevance:

30.00%

Publisher:

Abstract:

This thesis introduces and develops a novel real-time predictive maintenance system to estimate the machine system parameters using the motion current signature. Recently, motion current signature analysis has been addressed as an alternative to the use of sensors for monitoring internal faults of a motor. A maintenance system based upon the analysis of the motion current signature avoids the need for the implementation and maintenance of expensive motion sensing technology. By developing nonlinear dynamical analysis for the motion current signature, the research described in this thesis implements a novel real-time predictive maintenance system for current and future manufacturing machine systems. A crucial concept underpinning this project is that the motion current signature contains information relating to the machine system parameters and that this information can be extracted using nonlinear mapping techniques, such as neural networks. Towards this end, a proof of concept procedure is performed, which substantiates this concept. A simulation model, TuneLearn, is developed to generate the large amount of training data required by the neural network approach. Statistical validation and verification of the model is performed to ascertain confidence in the simulated motion current signature. The validation experiments conclude that, although the simulation model generates a good macro-dynamical mapping of the motion current signature, it fails to accurately map the micro-dynamical structure, due to the lack of knowledge regarding higher-order and nonlinear factors such as backlash and compliance. The failure of the simulation model to capture the micro-dynamical structure suggests the presence of nonlinearity in the motion current signature. This motivated the use of surrogate data testing for nonlinearity in the motion current signature. The results confirm the presence of nonlinearity in the motion current signature, thereby motivating the use of nonlinear techniques for further analysis. Outcomes of the experiments show that nonlinear noise reduction combined with the linear reverse algorithm offers precise machine system parameter estimation using the motion current signature for the implementation of the real-time predictive maintenance system. Finally, a linear reverse algorithm, BJEST, is developed and applied to the motion current signature to estimate the machine system parameters.
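
Surrogate data testing, mentioned above, has a standard general form that can be sketched briefly. The Python fragment below builds phase-randomised surrogates that preserve the power spectrum of a series and checks whether a nonlinear statistic of the original stands out from the surrogate distribution; the synthetic signal and the discriminating statistic are illustrative assumptions, not the thesis's actual motion current data or test.

    import numpy as np

    rng = np.random.default_rng(1)

    def phase_randomised_surrogate(x):
        """Surrogate with the same power spectrum but randomised Fourier phases."""
        spectrum = np.fft.rfft(x)
        phases = rng.uniform(0, 2 * np.pi, size=spectrum.size)
        phases[0] = 0.0                      # keep the mean unchanged
        phases[-1] = 0.0                     # keep the Nyquist component real
        return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=x.size)

    def statistic(x):
        """Simple nonlinear discriminating statistic: time-reversal asymmetry."""
        return np.mean((x[1:] - x[:-1]) ** 3)

    # Toy 'signal': a chaotic logistic map, standing in for a motion current trace.
    x = np.empty(2048)
    x[0] = 0.4
    for k in range(1, x.size):
        x[k] = 3.8 * x[k - 1] * (1 - x[k - 1])

    observed = statistic(x)
    surrogates = [statistic(phase_randomised_surrogate(x)) for _ in range(99)]
    rank = sum(observed > s for s in surrogates)
    print("observed statistic rank among surrogates:", rank, "/ 99")

An extreme rank (near 0 or 99) indicates that the statistic of the original series lies outside the surrogate distribution, which is the evidence for nonlinearity this class of test provides.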

Relevance:

30.00%

Publisher:

Abstract:

This study is concerned with several proposals concerning multiprocessor systems and with the various possible methods of evaluating such proposals. After a discussion of the advantages and disadvantages of several performance evaluation tools, the author decides that simulation is the only tool powerful enough to develop a model which would be of practical use in the design, comparison and extension of systems. The main aims of the simulation package developed as part of this study are cost effectiveness, ease of use and generality. The methodology on which the simulation package is based is described in detail. The fundamental principles are that model design should reflect actual systems design, that measuring procedures should be carried out alongside design, that models should be well documented and easily adaptable, and that models should be dynamic. The simulation package itself is modular, and in this way reflects current design trends. This approach also aids documentation and ensures that the model is easily adaptable. It contains a skeleton structure and a library of segments which can be added to, or directly swapped with, segments of the skeleton structure to form a model which fits a user's requirements. The study also contains the results of some experimental work carried out using the model, the first part of which tests the model's capabilities by simulating a large operating system, the ICL George 3 system; the second part deals with general questions and some of the many proposals concerning multiprocessor systems.
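
The skeleton-plus-segments structure described above might be pictured with a short object-oriented sketch. In the Python fragment below, the class names and segment behaviours are invented purely for illustration and do not reflect the package's actual implementation; the point is only that a fixed driver calls whatever segments have been installed.

    class Segment:
        """Base class for a swappable model segment."""
        def run(self, clock):
            raise NotImplementedError

    class CpuSegment(Segment):
        def run(self, clock):
            return f"{clock}: CPU segment executed a job slice"

    class DiskSegment(Segment):
        def run(self, clock):
            return f"{clock}: disk segment served an I/O request"

    class Skeleton:
        """Fixed driver that steps the clock and calls the installed segments."""
        def __init__(self, segments):
            self.segments = segments
        def simulate(self, steps):
            for clock in range(steps):
                for segment in self.segments:
                    print(segment.run(clock))

    # Swap segments freely to form a model that fits the user's requirements.
    Skeleton([CpuSegment(), DiskSegment()]).simulate(2)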

Relevance:

30.00%

Publisher:

Abstract:

An investigation is carried out into the design of a small local computer network for eventual implementation on the University of Aston campus. Microprocessors are investigated as a possible choice for use as a node controller, for reasons of cost and reliability. Since the network will be local, high speed lines of megabit order are proposed. After an introduction to several well known networks, various aspects of networks are discussed, including packet switching, the functions of a node and host-node protocol. Chapter three develops the network philosophy with an introduction to microprocessors. Various organisations of microprocessors into multicomputer and multiprocessor systems are discussed, together with methods of achieving reliable computing. Chapter four presents the simulation model and its implementation as a computer program. The major modelling effort is to study the behaviour of messages queueing for access to the network and the message delay experienced on the network. Use is made of spectral analysis to determine the sampling frequency, while Exponentially Weighted Moving Averages are used for data smoothing.
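
The two analysis techniques named at the end of the abstract are standard and easy to sketch. The Python fragment below computes a periodogram (spectral analysis) of a message-delay series and smooths the same series with an exponentially weighted moving average, s_k = a*x_k + (1 - a)*s_{k-1}; the synthetic delay series and smoothing constant are assumptions for illustration.

    import numpy as np

    rng = np.random.default_rng(2)
    t = np.arange(512)
    # Synthetic message-delay series with a periodic component plus noise.
    delays = 5.0 + np.sin(2 * np.pi * t / 32) + rng.normal(scale=0.5, size=t.size)

    # Spectral analysis: periodogram of the (mean-removed) delay series.
    power = np.abs(np.fft.rfft(delays - delays.mean())) ** 2
    dominant_bin = int(np.argmax(power[1:]) + 1)
    print("dominant period (samples):", t.size / dominant_bin)

    # Exponentially weighted moving average smoothing.
    def ewma(x, alpha=0.2):
        s = np.empty_like(x)
        s[0] = x[0]
        for k in range(1, x.size):
            s[k] = alpha * x[k] + (1 - alpha) * s[k - 1]
        return s

    print("last smoothed delay:", ewma(delays)[-1])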

Relevance:

30.00%

Publisher:

Abstract:

Manufacturing firms are driven by competitive pressures to continually improve the effectiveness and efficiency of their organisations. For this reason, manufacturing engineers often implement changes to existing processes, or design new production facilities, with the expectation of making further gains in manufacturing system performance. This thesis relates to how the likely outcome of this type of decision should be predicted prior to its implementation. The thesis argues that since manufacturing systems must also interact with many other parts of an organisation, the expected performance improvements can often be significantly hampered by constraints that arise elsewhere in the business. As a result, decision-makers should attempt to predict just how well a proposed design will perform when these other factors, or 'support departments', are taken into consideration. However, the thesis also demonstrates that, in practice, where quantitative analysis is used to evaluate design decisions, the analysis model invariably ignores the potential impact of support functions on a system's overall performance. A more comprehensive modelling approach is therefore required. A study of how various business functions interact establishes that, to properly represent the kind of delays that give rise to support department constraints, a model should portray the dynamic and stochastic behaviour of entities in both the manufacturing and non-manufacturing aspects of a business. This implies that computer simulation be used to model design decisions, but current simulation software does not provide a sufficient range of functionality to enable the behaviour of all of these entities to be represented in this way. The main objective of the research has therefore been the development of a new simulator that will overcome the limitations of existing software and so enable decision-makers to conduct a more holistic evaluation of design decisions. It is argued that the application of object-oriented techniques offers a potentially better way of fulfilling both the functional and ease-of-use requirements of the new simulator. An object-oriented analysis and design of the system, called WBS/Office, is therefore presented, which extends to modelling a firm's administrative and other support activities in the context of the manufacturing system design process. A particularly novel feature of the design is the ability for decision-makers to model how a firm's specific information and document processing requirements might hamper shop-floor performance. The simulator is primarily intended for modelling make-to-order batch manufacturing systems, and the thesis presents example models, created using a working version of WBS/Office, that demonstrate the feasibility of using the system to analyse manufacturing system designs in this way.
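
The claim that support-department delays should appear in the same model as the shop floor can be illustrated with a much-simplified discrete-event sketch. In the Python fragment below the entities, activity names and times are invented for illustration and are not taken from WBS/Office; the point is only that a paperwork step scheduled in the same event list directly lengthens job lead times.

    import heapq

    events = []  # (time, sequence, action, job) priority queue
    seq = 0

    def schedule(time, action, job):
        global seq
        heapq.heappush(events, (time, seq, action, job))
        seq += 1

    OFFICE_TIME = 4.0      # time for the support department to raise works documentation
    MACHINE_TIME = 2.5     # shop-floor processing time per job
    completion = {}

    for job in range(3):
        schedule(job * 1.0, "order_received", job)

    while events:
        now, _, action, job = heapq.heappop(events)
        if action == "order_received":
            schedule(now + OFFICE_TIME, "paperwork_done", job)   # support-department step
        elif action == "paperwork_done":
            schedule(now + MACHINE_TIME, "job_finished", job)    # shop-floor step
        elif action == "job_finished":
            completion[job] = now

    print("job completion times:", completion)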

Relevance:

30.00%

Publisher:

Abstract:

Particulate solids are complex redundant systems which consist of discrete particles. The interactions between the particles are complex and have been the subject of many theoretical and experimental investigations. Investigations of particulate material have been restricted by the lack of quantitative information on the mechanisms occurring within an assembly. Laboratory experimentation is limited as information on the internal behaviour can only be inferred from measurements on the assembly boundary, or through the use of intrusive measuring devices. In addition, comparisons between test data are uncertain due to the difficulty in reproducing exact replicas of physical systems. Nevertheless, theoretical and technological advances require more detailed material information. Numerical simulation, however, affords access to information on every particle, and hence the micro-mechanical behaviour within an assembly, and can replicate desired systems. To use a computer program to numerically simulate material behaviour accurately, it is necessary to incorporate realistic interaction laws. This research programme used the finite difference simulation program 'BALL', developed by Cundall (1971), which employed linear spring force-displacement laws, and it was thus necessary to incorporate more realistic interaction laws. The research programme was therefore primarily concerned with the implementation of the normal force-displacement law of Hertz (1882) and the tangential force-displacement laws of Mindlin and Deresiewicz (1953). Within this thesis, the contact mechanics theories employed in the program are developed and the adaptations which were necessary to incorporate these laws are detailed. Verification of the new contact force-displacement laws was achieved by simulating a quasi-static oblique contact and a single particle oblique impact. Applications of the program to the simulation of large assemblies of particles are given, and the problems in undertaking quasi-static shear tests, along with the results from two successful shear tests, are described.
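
The normal law of Hertz (1882) referred to above has a compact closed form, F_n = (4/3) E* sqrt(R*) d^(3/2) for an overlap d, which the sketch below evaluates for two elastic spheres. The material values are illustrative assumptions; the tangential Mindlin and Deresiewicz (1953) law, which depends on the loading history through partial slip, is not reproduced here.

    import math

    def hertz_normal_force(delta, R1, R2, E1, E2, nu1, nu2):
        """Normal contact force F_n = (4/3) * E_eff * sqrt(R_eff) * delta**1.5."""
        if delta <= 0.0:
            return 0.0                       # no overlap, no contact force
        R_eff = 1.0 / (1.0 / R1 + 1.0 / R2)  # effective radius of the pair
        E_eff = 1.0 / ((1 - nu1 ** 2) / E1 + (1 - nu2 ** 2) / E2)  # effective modulus
        return (4.0 / 3.0) * E_eff * math.sqrt(R_eff) * delta ** 1.5

    # Two 5 mm glass-like spheres with a 1 micron overlap (illustrative values).
    print(hertz_normal_force(delta=1e-6, R1=5e-3, R2=5e-3,
                             E1=70e9, E2=70e9, nu1=0.22, nu2=0.22))

Unlike the linear spring originally used in BALL, this force grows with the 3/2 power of the overlap, so the contact stiffens as the particles are pressed together.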

Relevance:

30.00%

Publisher:

Abstract:

Investment in transport infrastructure can be highly sensitive to uncertainty. The scale and lead time of strategic transport programmes are such that they require continuing policy support and accurate forecasting. Delay, cost escalation and abandonment of projects often result if these conditions are not present. In Part One the physical characteristics of infrastructure are identified as a major constraint on planning processes. The extent to which strategies and techniques acknowledge these constraints is examined. A simple simulation model is developed to evaluate the effects on system development of variations in the scale and lead time of investments. In Part Two, two case studies of strategic infrastructure investment are analysed. The absence of a policy consensus for airport location was an important factor in the delayed resolution of the Third London Airport issue. In London itself, the traffic and environmental effects of major highway investment ultimately resulted in the abandonment of plans to construct urban motorways. In both cases, the infrastructure implications of alternative strategies are reviewed with reference to the problems of uncertainty. In conclusion, the scale of infrastructure investment is considered the most important of the constraints on the processes of transport planning. Adequate appraisal of such constraints may best be achieved by evaluation more closely aligned to policy objectives.

Relevance:

30.00%

Publisher:

Abstract:

Atomistic Molecular Dynamics provides powerful and flexible tools for the prediction and analysis of molecular and macromolecular systems. Specifically, it provides a means by which we can measure theoretically what cannot be measured experimentally: the dynamic time-evolution of complex systems comprising atoms and molecules. It is particularly suitable for the simulation and analysis of the otherwise inaccessible details of MHC-peptide interaction and, on a larger scale, the simulation of the immune synapse. Progress has been relatively tentative, yet the emergence of truly high-performance computing and the development of coarse-grained simulation now offer the hope of accurately predicting thermodynamic parameters and of running larger, longer simulations comprising not merely a handful of proteins but thousands of protein molecules and the cellular-scale structures they form. We exemplify this within the context of immunoinformatics.
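
The time-stepping loop at the heart of an atomistic molecular dynamics code is compact enough to sketch. The Python fragment below integrates a single particle in a one-dimensional Lennard-Jones potential with the velocity-Verlet scheme, in reduced units with unit mass; production simulations of MHC-peptide complexes or the immune synapse use the same scheme with full force fields and vastly more particles. The time step and starting point are illustrative assumptions.

    def lj_force(r):
        """Force from the Lennard-Jones potential U(r) = 4(r^-12 - r^-6)."""
        return 24.0 * (2.0 / r ** 13 - 1.0 / r ** 7)

    dt = 0.002
    r, v = 1.3, 0.0                      # position and velocity (reduced units)
    f = lj_force(r)
    for step in range(10000):
        r += v * dt + 0.5 * f * dt ** 2  # position update
        f_new = lj_force(r)
        v += 0.5 * (f + f_new) * dt      # velocity update uses old and new forces
        f = f_new

    print("final position:", r)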