39 results for Emerging Modelling Paradigms and Model Coupling


Relevance: 100.00%

Abstract:

Liquid-liquid extraction has long been known as a unit operation that plays an important role in industry. The process is well known for its complexity and sensitivity to operating conditions. This thesis presents an attempt to explore the dynamics and control of this process using a systematic approach and state-of-the-art control system design techniques. The process was first studied experimentally under carefully selected operating conditions that resemble the ranges employed in practice under stable and efficient operation. Data were collected at steady-state conditions using adequate sampling techniques for the dispersed and continuous phases, as well as during column transients, with the aid of a computer-based online data-logging system and online concentration analysis. A stagewise single-stage backflow model was improved to mimic the dynamic operation of the column. The developed model accounts for the variation in hydrodynamics, mass transfer, and physical properties throughout the length of the column. End effects were treated by adding stages at the column entrances. Two parameters were incorporated in the model, namely a mass-transfer weight factor, to correct for the assumption of no mass transfer in the settling zones at each stage, and backmixing coefficients, to handle the axial dispersion phenomena encountered in the course of column operation. The parameters were estimated by minimizing the differences between the experimental and model-predicted concentration profiles at steady-state conditions using a non-linear optimisation technique. The estimated values were then correlated as functions of the operating parameters and incorporated in the model equations. The model equations comprise a stiff differential-algebraic system, which was solved using the GEAR ODE solver. The calculated concentration profiles were compared to those measured experimentally, and very good agreement between the two was achieved, within a relative error of ±2.5%. The developed rigorous dynamic model of the extraction column was used to derive linear time-invariant reduced-order models that relate the input variables (agitator speed, solvent feed flowrate and concentration, feed concentration and flowrate) to the output variables (raffinate concentration and extract concentration) using the asymptotic method of system identification. The reduced-order models were shown to be accurate in capturing the dynamic behaviour of the process, with a maximum modelling prediction error of 1%. The simplicity and accuracy of the derived reduced-order models allow for control system design and analysis of such complicated processes. The extraction column is a typical multivariable process, with agitator speed and solvent feed flowrate considered as manipulated variables, raffinate concentration and extract concentration as controlled variables, and feed concentration and feed flowrate as disturbance variables. The control system design of the extraction process was tackled as a multi-loop decentralised SISO (Single Input Single Output) problem as well as a centralised MIMO (Multi-Input Multi-Output) problem, using both conventional and model-based control techniques such as IMC (Internal Model Control) and MPC (Model Predictive Control). The control performance of each scheme was studied in terms of stability, speed of response, sensitivity to modelling errors (robustness), setpoint tracking capability and load rejection.
For decentralised control, multiple loops were assigned to pair each manipulated variable with a controlled variable according to the interaction analysis and other pairing criteria such as the relative gain array (RGA) and singular value decomposition (SVD). The rotor speed-raffinate concentration and solvent flowrate-extract concentration loops showed weak interaction. Multivariable MPC showed more effective performance than the conventional techniques since it accounts for loop interactions, time delays, and constraints on the input and output variables.
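As an illustration of the parameter-estimation step described in this abstract, the following is a minimal sketch, not the thesis code, of fitting a backmixing coefficient and a mass-transfer factor in a simplified stagewise backflow model to a measured steady-state concentration profile; the stage equations, stage count, equilibrium value and "measured" data are all illustrative assumptions.

```python
"""Hypothetical sketch: estimating a backmixing coefficient and a mass-transfer
factor of a stagewise backflow model by matching a steady-state profile.
Stage count, rate expression, and the 'measured' profile are illustrative only."""
import numpy as np
from scipy.optimize import least_squares

N = 10            # number of stages (illustrative)
x_in = 1.0        # solute concentration entering stage 1 (continuous phase)
x_eq = 0.1        # assumed constant equilibrium concentration (simplification)

def steady_profile(alpha, k):
    """Solve the tridiagonal steady-state balances of a backflow cascade.
    Interior stage i: (1+alpha)x[i-1] + alpha*x[i+1] - (1+2*alpha+k)*x[i] = -k*x_eq."""
    A = np.zeros((N, N))
    b = np.full(N, -k * x_eq)
    for i in range(N):
        A[i, i] = -(1.0 + 2.0 * alpha + k)
        if i > 0:
            A[i, i - 1] = 1.0 + alpha
        if i < N - 1:
            A[i, i + 1] = alpha
    b[0] -= (1.0 + alpha) * x_in   # feed boundary: known inlet concentration
    A[-1, -1] += alpha             # outlet boundary: no backflow from downstream
    return np.linalg.solve(A, b)

# 'Measured' profile: synthetic data standing in for the experimental samples.
rng = np.random.default_rng(0)
x_meas = steady_profile(0.35, 0.8) + rng.normal(0.0, 0.005, N)

def residuals(p):
    alpha, k = p
    return steady_profile(alpha, k) - x_meas

fit = least_squares(residuals, x0=[0.1, 0.5],
                    bounds=([1e-3, 1e-3], [2.0, 5.0]))
print("estimated backmixing coefficient:", fit.x[0])
print("estimated mass-transfer factor:  ", fit.x[1])
```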

Relevance: 100.00%

Abstract:

This research has been undertaken to determine the extent to which successful multi-organisational enterprise strategy relies on the correct type of Enterprise Resource Planning (ERP) information system being used. However, there appears to be a dearth of research on the strategic alignment between ERP systems development and multi-organisational enterprise governance: guidelines and frameworks to assist practitioners in making decisions about multi-organisational collaboration supported by different types of ERP systems are still missing from both theoretical and empirical perspectives. This motivates the present research, which investigates ERP systems development and emerging practices in the management of multi-organisational enterprises (i.e. parts of companies working with parts of other companies to deliver complex product-service systems) and identifies how different ERP systems fit into different multi-organisational enterprise structures, in order to achieve sustainable competitive success. An empirical inductive study was conducted using a Grounded Theory-based methodological approach, drawing on successful manufacturing and service companies in the UK and China. This involved an initial pre-study literature review and data collection via 48 semi-structured interviews with 8 companies delivering complex products and services across organisational boundaries whilst adopting ERP systems to support their collaborative business strategies: 4 cases cover the printing, semiconductor manufacturing, and parcel distribution industries in the UK, and 4 cases cover the crane manufacturing, concrete production, and banking industries in China. The interviews were used to form a set of 29 tentative propositions, which were then validated via a questionnaire that received 116 responses from 16 companies. The research has resulted in the consolidation of the validated propositions into a novel concept referred to as the ‘Dynamic Enterprise Reference Grid for ERP’ (DERG-ERP), which draws on multiple theoretical perspectives. The core of the DERG-ERP concept is a contingency management framework which indicates that different multi-organisational enterprise paradigms and the supporting ERP information systems are not the result of different strategies, but are best considered part of a strategic continuum with the same overall business purpose of multi-organisational cooperation. At different times and circumstances in a partnership lifecycle, firms may prefer particular multi-organisational enterprise structures and different types of ERP systems to satisfy their business requirements. Thus the DERG-ERP concept helps decision makers to select, manage and co-develop the most appropriate multi-organisational enterprise strategy and its corresponding ERP systems by drawing on core competence, expected competitiveness, and information systems strategic capabilities as the main contingency factors. Specifically, this research suggests that traditional ERP(I) systems are associated with the Vertically Integrated Enterprise (VIE), whilst ERPII systems can be correlated with Extended Enterprise (EE) requirements and ERPIII systems can best support the operations of the Virtual Enterprise (VE). The contribution of this thesis is threefold.
Firstly, this work addresses a gap in the extant literature concerning the best fit between ERP system types and multi-organisational enterprise structure types, and proposes a new contingency framework, the DERG-ERP, which can be used to explain how and why enterprise managers need to change and adapt their ERP information systems in response to changing business and operational requirements. Secondly, with respect to a priori theoretical models, the new DERG-ERP furthers multi-organisational enterprise management thinking by incorporating information systems strategy, rather than focusing purely on the strategic, structural, and operational aspects of enterprise design and management. At the same time, the DERG-ERP makes theoretical contributions to the current IS Strategy Formulation Model, which does not explicitly address multi-organisational enterprise governance. Thirdly, this research clarifies and emphasises the concept of future ERP systems (referred to as ERPIII), which are inadequately covered in the extant literature. The novel DERG-ERP concept and its elements have also been applied to the 8 empirical cases to serve as a practical guide for ERP vendors, information systems managers, and operations managers hoping to grow and sustain their competitive advantage with respect to effective enterprise strategy, enterprise structures, and ERP systems use; this is referred to in the thesis as the “enterprisation of operations”.

Relevance: 100.00%

Abstract:

Softeam has over 20 years of experience providing UML-based modelling solutions, such as its Modelio modelling tool, and its Constellation enterprise model management and collaboration environment. Due to the increasing number and size of the models used by Softeam’s clients, Softeam joined the MONDO FP7 EU research project, which worked on solutions for these scalability challenges and produced the Hawk model indexer among other results. This paper presents the technical details and several case studies on the integration of Hawk into Softeam’s toolset. The first case study measured the performance of Hawk’s Modelio support using varying amounts of memory for the Neo4j backend. In another case study, Hawk was integrated into Constellation to provide scalable global querying of model repositories. Finally, the combination of Hawk and the Epsilon Generation Language was compared against Modelio for document generation: for the largest model, Hawk was two orders of magnitude faster.

Relevance: 100.00%

Abstract:

This paper reports preliminary progress on a principled approach to modelling nonstationary phenomena using neural networks. We are concerned with both parameter estimation and model-order (complexity) estimation. The basic methodology assumes a Bayesian foundation; however, to allow the construction of pragmatic models, successive approximations have to be made to permit computational tractability. The lowest order corresponds to the (Extended) Kalman filter approach to parameter estimation, which has already been applied to neural networks. We illustrate some of the deficiencies of the existing approaches and discuss our preliminary generalisations by considering the application to nonstationary time series.
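As a pointer to the lowest-order approximation mentioned above, the following is a minimal sketch of Extended Kalman filter weight estimation for a small neural network tracking a drifting signal; the network size, noise covariances and synthetic data are illustrative assumptions rather than details from the paper.

```python
"""Hypothetical sketch: Extended Kalman filter estimation of the weights of a
tiny one-hidden-layer network tracking a drifting (nonstationary) signal.
Network size, noise covariances, and the synthetic data are illustrative only."""
import numpy as np

rng = np.random.default_rng(1)
H = 5                                  # hidden units (assumption)
n = 3 * H + 1                          # parameters: W1, b1, w2, b2 for scalar input

def predict(theta, x):
    W1, b1, w2, b2 = theta[:H], theta[H:2*H], theta[2*H:3*H], theta[-1]
    h = np.tanh(W1 * x + b1)
    return float(w2 @ h + b2), h

def jacobian(theta, x, h):
    """Row vector d y / d theta for the network above."""
    w2 = theta[2*H:3*H]
    dh = 1.0 - h**2                    # derivative of tanh
    return np.concatenate([w2 * dh * x, w2 * dh, h, [1.0]])

# EKF with a random-walk model on the weights: theta_t = theta_{t-1} + q_t.
theta = rng.normal(0.0, 0.3, n)        # initial weight estimate
P = np.eye(n) * 1.0                    # initial weight covariance
Q = np.eye(n) * 1e-4                   # random-walk (drift) covariance
R = 0.05**2                            # observation noise variance

for t in range(2000):
    x = rng.uniform(-1.0, 1.0)
    # Nonstationary target: a sine map whose phase drifts slowly with time.
    y = np.sin(2.0 * np.pi * x + 0.002 * t) + rng.normal(0.0, 0.05)

    P = P + Q                          # time update (random-walk drift)
    y_hat, h = predict(theta, x)
    Hrow = jacobian(theta, x, h)
    S = Hrow @ P @ Hrow + R            # innovation variance (scalar)
    K = (P @ Hrow) / S                 # Kalman gain
    theta = theta + K * (y - y_hat)
    P = P - np.outer(K, Hrow @ P)

print("final one-step prediction error:", abs(y - y_hat))
```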

Relevance: 100.00%

Abstract:

The rapid developments in computer technology have resulted in the widespread use of discrete event dynamic systems (DEDSs). This type of system is complex because it exhibits properties such as concurrency, conflict and non-determinism. It is therefore important to model and analyse such systems before implementation to ensure safe, deadlock-free and optimal operation. This thesis investigates current modelling techniques and describes Petri net theory in more detail. It reviews top-down, bottom-up and hybrid Petri net synthesis techniques that are used to model large systems, and introduces an object-oriented methodology to enable the modelling of larger and more complex systems. Designs obtained by this methodology are modular, easy to understand and allow re-use of designs. Control is the next logical step in the design process. This thesis reviews recent developments in the control of DEDSs and investigates the use of Petri nets in the design of supervisory controllers. The scheduling of exclusive use of resources is investigated, an efficient Petri-net-based scheduling algorithm is designed, and a re-configurable controller is proposed. To enable the analysis and control of large and complex DEDSs, an object-oriented C++ software toolkit was developed and used to implement a Petri net analysis tool together with Petri net scheduling and control algorithms. Finally, the methodology was applied to two industrial DEDSs: a prototype can-sorting machine developed by Eurotherm Controls Ltd., and a semiconductor testing plant belonging to SGS Thomson Microelectronics Ltd.
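To make the Petri net terminology concrete, the following is a minimal sketch of a place/transition net with its enabling test and firing rule, shown on a toy mutual-exclusion example (two processes sharing one resource); it is an illustration only, not a model or code from the thesis.

```python
"""Hypothetical sketch: a minimal place/transition Petri net with enabling and
firing rules, shown on a toy mutual-exclusion net (one shared resource)."""

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = {}                 # name -> (input arcs, output arcs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking[p] >= w for p, w in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, w in inputs.items():
            self.marking[p] -= w
        for p, w in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + w

# Two idle processes (places P1, P2) compete for one resource token in R.
net = PetriNet({"P1": 1, "P2": 1, "R": 1, "C1": 0, "C2": 0})
net.add_transition("acquire1", {"P1": 1, "R": 1}, {"C1": 1})
net.add_transition("release1", {"C1": 1}, {"P1": 1, "R": 1})
net.add_transition("acquire2", {"P2": 1, "R": 1}, {"C2": 1})
net.add_transition("release2", {"C2": 1}, {"P2": 1, "R": 1})

net.fire("acquire1")
print(net.enabled("acquire2"))   # False: the single resource enforces exclusion
net.fire("release1")
print(net.enabled("acquire2"))   # True: the resource token has been returned
```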

Relevance: 100.00%

Abstract:

Open-loop operation of the stepping motor exploits the inherent advantages of the machine. For near-optimum operation in this mode, however, an accurate system model is required to facilitate controller design. Such a model must be comprehensive and take account of the non-linearities inherent in the system. The result is a complex formulation which can be made manageable with a computational aid. A digital simulation of a hybrid-type stepping motor and its associated drive circuit is proposed. The simulation is based upon a block-diagram model which includes reasonable approximations to the major non-linearities, and it is shown to yield accurate performance predictions. The determination of the transfer functions is based upon consideration of the physical processes involved rather than upon direct input-output measurements. The effects of eddy currents, saturation, hysteresis, drive circuit characteristics and the non-linear torque-displacement characteristic are considered, and methods of determining transfer functions which take account of these effects are offered. The static torque-displacement characteristic is considered in detail and a model is proposed which predicts static torque for any combination of phase currents and shaft position. Methods of predicting the characteristic directly from machine geometry are investigated. Drive circuit design for high-efficiency operation is considered and a model of a bipolar, bilevel circuit is proposed. The transfers between stator voltage and stator current and between stator current and air-gap flux are complicated by the effects of eddy currents, saturation and hysteresis. Frequency response methods, combined with average inductance measurements, are shown to yield reasonable transfer functions. The modelling procedure and subsequent digital simulation are concluded to be a powerful method of non-linear analysis.
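As an illustration of the kind of static torque-displacement relation such a model captures, the sketch below uses the common sinusoidal approximation for a two-phase hybrid stepping motor; the torque constant, rotor tooth count, detent term and currents are assumed values, not data from the thesis.

```python
"""Hypothetical sketch: static torque of a two-phase hybrid stepping motor under
the usual sinusoidal approximation, for arbitrary phase currents and position.
Torque constant, tooth count, and detent term are illustrative assumptions."""
import numpy as np

K_T = 0.5      # torque constant, N*m/A (assumed)
N_R = 50       # rotor tooth count (typical 1.8-degree hybrid motor)
K_D = 0.02     # detent (cogging) torque amplitude, N*m (assumed)

def static_torque(theta, i_a, i_b):
    """Torque (N*m) at shaft angle theta (rad) for phase currents i_a, i_b (A)."""
    return (-K_T * i_a * np.sin(N_R * theta)
            + K_T * i_b * np.cos(N_R * theta)
            - K_D * np.sin(4.0 * N_R * theta))   # small detent component

# With both phases energised equally, the stable equilibrium sits roughly mid-way
# between the two single-phase detent positions (a half-step position).
theta = np.linspace(0.0, 2.0 * np.pi / N_R, 200)   # one electrical cycle
tau = static_torque(theta, i_a=1.0, i_b=1.0)
theta_eq = theta[np.argmin(np.abs(tau))]
print(f"near-zero-torque (equilibrium) angle: {theta_eq:.4f} rad")
```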

Relevance: 100.00%

Abstract:

The recent explosive growth in advanced manufacturing technology (AMT) and the continued development of sophisticated information technologies (IT) are expected to have a profound effect on the way we design and operate manufacturing businesses. Furthermore, the escalating capital requirements associated with these developments have significantly increased the level of risk associated with initial design, ongoing development and operation. This dissertation has examined the integration of two key sub-elements of the Computer Integrated Manufacturing (CIM) system, namely the manufacturing facility and the production control system. This research has concentrated on the interactions between production control (MRP) and an AMT-based production facility. The disappointing performance of such systems has been discussed in the context of a number of potential technological and performance incompatibilities between these two elements. It was argued that the design and selection of operating policies for both is the key to successful integration. Furthermore, policy decisions are shown to play an important role in matching the performance of the total system to the demands of the marketplace. It is demonstrated that a holistic approach to policy design must be adopted if successful integration is to be achieved, and that the complexity of the issues resulting from such an approach requires the formulation of a structured design methodology. Such a methodology was subsequently developed and discussed. It combines a first-principles approach to the behaviour of system elements with the specification of a detailed holistic model for use in the policy design environment. The methodology aims to make full use of the 'low inertia' characteristics of AMT, whilst adopting a JIT configuration of MRP and re-coupling the total system to market demands. This dissertation discusses the application of the methodology to an industrial case study and the subsequent design of operational policies. A novel approach to production control resulted, a central feature of which was a move toward reduced manual intervention in the MRP processing and scheduling logic, with increased human involvement and motivation in the management of work-flow on the shopfloor. Experimental results indicated that significant performance advantages would result from the adoption of the recommended policy set.

Relevance: 100.00%

Abstract:

Practising engineers frequently seek to understand what the effects of various manufacturing strategies will be on the performance of their production facilities. In this situation a computer model can help to provide insight and form predictions about future manufacturing system performance. Various types of modelling method exist, and each provides models with distinct characteristics. This paper presents a review of popular modelling techniques and, based on the results of a structured experimental study, summarises their capabilities to support the evaluation of manufacturing strategies.

Relevance: 100.00%

Abstract:

Web-based distributed modelling architectures are gaining increasing recognition as potentially useful tools to build holistic environmental models, combining individual components in complex workflows. However, existing web-based modelling frameworks currently offer no support for managing uncertainty. On the other hand, the rich array of modelling frameworks and simulation tools which support uncertainty propagation in complex and chained models typically lack the benefits of web-based solutions such as ready publication, discoverability and easy access. In this article we describe the developments within the UncertWeb project that are designed to provide uncertainty support in the context of the proposed ‘Model Web’. We give an overview of uncertainty in modelling, review uncertainty management in existing modelling frameworks and consider the semantic and interoperability issues raised by integrated modelling. We describe the scope and architecture required to support uncertainty management as developed in UncertWeb. This includes tools which support elicitation, aggregation/disaggregation, visualisation and uncertainty/sensitivity analysis. We conclude by highlighting areas that require further research and development in UncertWeb, such as model calibration and inference within complex environmental models.
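To make uncertainty propagation through chained models concrete, the following is a minimal Monte Carlo sketch in which an uncertain input is pushed through two coupled toy models; the model equations, distributions and sample size are illustrative assumptions unrelated to the actual UncertWeb services.

```python
"""Hypothetical sketch: Monte Carlo propagation of input uncertainty through a
chain of two coupled toy models (e.g. rainfall -> runoff -> flood depth).
The model equations and distributions are illustrative assumptions only."""
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

def runoff_model(rainfall_mm):
    """Toy model 1: runoff as a nonlinear function of rainfall."""
    return 0.6 * np.maximum(rainfall_mm - 5.0, 0.0) ** 1.1

def flood_model(runoff):
    """Toy model 2: flood depth driven by runoff, with its own model error."""
    structural_error = rng.normal(0.0, 0.02, size=runoff.shape)
    return 0.05 * np.sqrt(runoff) + structural_error

# Uncertain input: rainfall described by a lognormal distribution.
rainfall = rng.lognormal(mean=np.log(20.0), sigma=0.3, size=N)

# Chain the models: the output sample of one becomes the input sample of the
# next, so input and model uncertainty are carried through the whole workflow.
depth = flood_model(runoff_model(rainfall))

print(f"mean flood depth: {depth.mean():.3f} m")
print(f"95% interval:     {np.percentile(depth, 2.5):.3f} - "
      f"{np.percentile(depth, 97.5):.3f} m")
```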

Relevance: 100.00%

Abstract:

We discuss aggregation of data from neuropsychological patients and the process of evaluating models using data from a series of patients. We argue that aggregation can be misleading, but that not aggregating can also result in information loss. The basis for combining data needs to be theoretically defined, and the particular method of aggregation depends on the theoretical question and the characteristics of the data. We present examples, often drawn from our own research, to illustrate these points. We also argue that statistical models and formal methods of model selection are a useful way to test theoretical accounts using data from several patients in multiple-case studies or case series. Statistical models can often measure fit in a way that explicitly captures what a theory allows; the parameter values that result from model fitting often measure theoretically important dimensions and can lead to more constrained theories or new predictions; and model selection allows the strength of evidence for models to be quantified without forcing this into the artificial binary choice that characterizes hypothesis-testing methods. Methods that aggregate and then formally model patient data, however, are not automatically preferred to other methods. Which method is preferred depends on the question to be addressed, the characteristics of the data, and practical issues such as the availability of suitable patients, but case series, multiple-case studies, single-case studies, statistical models, and process models should be complementary methods when guided by theory development.
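As a purely hypothetical illustration of formal model selection over a case series, the sketch below fits two candidate models to each patient's data separately and compares them with AIC rather than forcing a binary accept/reject decision; the patients, models and data are invented for illustration.

```python
"""Hypothetical sketch: comparing two candidate models across a small case
series with AIC, fitting each patient separately instead of pooling blindly.
Patients, models, and data are invented purely for illustration."""
import numpy as np

rng = np.random.default_rng(7)

def aic_from_rss(rss, n, k):
    """Gaussian-error AIC up to an additive constant: n*ln(RSS/n) + 2k."""
    return n * np.log(rss / n) + 2 * k

def fit_polynomial(x, y, degree):
    coeffs = np.polyfit(x, y, degree)
    rss = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
    return aic_from_rss(rss, len(x), degree + 1)

# A small case series: each 'patient' has scores at 8 difficulty levels.
x = np.arange(8, dtype=float)
patients = {
    "patient_A": 10 - 0.8 * x + rng.normal(0, 0.4, 8),                 # linear decline
    "patient_B": 10 - 0.1 * x - 0.12 * x**2 + rng.normal(0, 0.4, 8),   # curvilinear
    "patient_C": 9.5 - 0.7 * x + rng.normal(0, 0.4, 8),
}

for name, y in patients.items():
    aic_linear = fit_polynomial(x, y, degree=1)
    aic_quadratic = fit_polynomial(x, y, degree=2)
    preferred = "linear" if aic_linear < aic_quadratic else "quadratic"
    print(f"{name}: AIC(linear)={aic_linear:.1f}, "
          f"AIC(quadratic)={aic_quadratic:.1f} -> prefer {preferred}")
```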

Relevance: 100.00%

Abstract:

It is by now a stylized fact that the importance of foreign direct investment for developing countries and emerging markets arises from the impact that the presence of multinational corporations (MNCs) in the host country has on the productivity of local firms, by way of technology diffusion and competition. There is also general agreement that the extent of technology transfer by an MNC to a developing-country affiliate depends on the extent of its control over the local affiliate and that, in turn, the extent of this control depends on the MNC's mode of entry into the host country. However, the existing literature is based on the experience of developed countries and as such does not contribute to the literature on development economics. This article addresses this lacuna using unique firm-level data from South Africa and Egypt. Our results indicate that the determinants of entry mode choice not only differ between developed and developing countries, but also among developing countries. They also bring into question the role of MNCs in fostering productivity growth in developing countries.

Relevance: 100.00%

Abstract:

Cell-wall components (cellulose, hemicellulose (oat spelt xylan) and lignin (Organosolv)) and model compounds (levoglucosan, an intermediate product of cellulose decomposition, and chlorogenic acid, which is structurally similar to lignin polymer units) have been investigated to probe in detail the influence of potassium on their pyrolysis behaviour as well as their uncatalysed decomposition reactions. Cellulose and lignin were pretreated with hydrochloric acid to remove salts and metals, and the demineralised samples were impregnated with 1% potassium as potassium acetate. Levoglucosan, xylan and chlorogenic acid were mixed with CH3COOK to introduce 1% K. Characterisation was performed using thermogravimetric analysis (TGA) and differential thermal analysis (DTA). In addition to the TGA pyrolysis, pyrolysis-gas chromatography-mass spectrometry (Py-GC-MS) analysis was introduced to examine the reaction products. Potassium-catalysed pyrolysis has a huge influence on the char formation stage and increases the char yields considerably (from 7.7% for raw cellulose to 27.7% for potassium-impregnated cellulose; from 5.7% for raw levoglucosan to 20.8% for levoglucosan with CH3COOK added). Major changes in the pyrolytic decomposition pathways were observed for cellulose, levoglucosan and chlorogenic acid. The results for cellulose and levoglucosan are consistent with a base-catalysed route in the presence of the potassium salt which promotes complete decomposition of the glucosidic units by a heterolytic mechanism and favours their direct depolymerization and fragmentation to low-molecular-weight components (e.g. acetic acid, formic acid, glyoxal, hydroxyacetaldehyde and acetol). Base-catalysed polymerization reactions increase the char yield. The effect of potassium on lignin pyrolysis is very significant: the temperature of maximum conversion shifts to lower temperature by 70 K, and catalysed polymerization reactions increase the char yield from 37% to 51%. A similar trend is observed for the model compound, chlorogenic acid. The addition of potassium does not produce a dramatic change in the tar product distribution, although its addition to chlorogenic acid promoted the generation of cyclohexane and phenol derivatives. Postulated thermal decomposition schemes for chlorogenic acid are presented.

Relevance: 100.00%

Abstract:

Engineering adaptive software is an increasingly complex task. Here, we demonstrate Genie, a tool that supports the modelling, generation, and operation of highly reconfigurable, component-based systems. We showcase how Genie is used in two case-studies: i) the development and operation of an adaptive flood warning system, and ii) a service discovery application. In this context, adaptation is enabled by the Gridkit reflective middleware platform.

Relevance: 100.00%

Abstract:

The human accommodation system has been extensively examined for over a century, with a particular focus on trying to understand the mechanisms that lead to the loss of accommodative ability with age (presbyopia). The accommodative process, along with the potential causes of presbyopia, is disputed, hindering efforts to develop methods of restoring accommodation in the presbyopic eye. One method that can be used to provide insight into this complex area is Finite Element Analysis (FEA). The effectiveness of FEA in modelling the accommodative process has been illustrated by a number of accommodative FEA models developed to date. However, these previous models have had limitations, principally due to the variation in data on the geometry of the accommodative components, combined with sparse measurements of their material properties. Despite advances in the available data, continued oversimplification has occurred in the modelling of the crystalline lens structure and the zonular fibres that surround the lens. A new accommodation model was proposed by the author that aims to eliminate these limitations. A novel representation of the zonular structure was developed, combined with updated lens and capsule modelling methods. The model has been designed to be adaptable, so that accommodation systems of a range of different ages can be modelled, allowing the age-related changes that occur to be simulated. The new modelling methods were validated by comparing the changes induced within the model to available in vivo data, leading to the definition of three models of different ages. These were used in an extended sensitivity study on age-related changes, in which individual parameters were altered to investigate their effect on the accommodative process. The material properties were found to have the largest impact on the decline in accommodative ability, in particular compared to changes in ciliary body movement or zonular structure. Novel data on the importance of capsule stiffness and thickness were also established. The new model detailed within this thesis provides further insight into the accommodation mechanism, as well as a foundation for future, more detailed investigations into accommodation, presbyopia and accommodative restoration techniques.
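The one-at-a-time style of sensitivity study described above can be summarised in a short sketch; the parameters, perturbation range and surrogate response function below are invented placeholders rather than values from the thesis model.

```python
"""Hypothetical sketch: a one-at-a-time (OAT) sensitivity study of the kind
described above, where individual parameters are perturbed around a baseline
and the change in a model output is recorded. The surrogate 'accommodation
amplitude' function and parameter set are invented for illustration."""

# Baseline parameter set (illustrative placeholders, arbitrary units).
baseline = {
    "lens_stiffness": 1.0,
    "capsule_thickness": 1.0,
    "capsule_stiffness": 1.0,
    "ciliary_displacement": 1.0,
}

def accommodation_amplitude(p):
    """Surrogate response standing in for a full FEA solve (pure illustration)."""
    return (2.5 * p["ciliary_displacement"] * p["capsule_stiffness"] ** 0.5
            * p["capsule_thickness"] ** 0.3 / p["lens_stiffness"])

ref = accommodation_amplitude(baseline)

# Perturb each parameter by +/-20% with the others held at baseline.
for name in baseline:
    effects = []
    for factor in (0.8, 1.2):
        p = dict(baseline)
        p[name] = baseline[name] * factor
        effects.append(accommodation_amplitude(p) - ref)
    span = max(effects) - min(effects)
    print(f"{name:22s} output span for +/-20% change: {span:.3f}")
```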

Relevance: 100.00%

Abstract:

Self-awareness and self-expression are promising architectural concepts for equipping embedded systems to meet the dedicated application scenarios and constraints of the avionics and space-flight industry. Typically, these systems operate in largely undefined environments and are not reachable for long periods after deployment, or in some cases never again. This paper introduces a reference architecture as well as a novel modelling and simulation environment for self-aware and self-expressive systems, with transaction-level modelling and simulation, detailed modelling capabilities for hardware aspects, precise process chronology execution and fine timing resolution. Furthermore, industrially relevant system sizes with several self-aware and self-expressive nodes can be handled by the modelling and simulation environment.
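To give a flavour of event-ordered, transaction-level simulation of several self-aware nodes, the following is a minimal discrete-event sketch in which each node observes its own backlog and adapts its service rate; the node model, timing values and adaptation rule are invented for illustration and are unrelated to the actual reference architecture.

```python
"""Hypothetical sketch: a tiny discrete-event simulation of several 'self-aware'
nodes that monitor their own queue length and adapt their service rate.
Event model, timing, and adaptation rule are invented for illustration."""
import heapq
import random

random.seed(3)
events = []        # heap of (time, sequence, callback) processed chronologically
seq = 0

def schedule(t, callback):
    global seq
    heapq.heappush(events, (t, seq, callback))
    seq += 1

class Node:
    """A 'self-aware' node: it observes its own backlog and adapts its speed."""
    def __init__(self, name, service_time):
        self.name = name
        self.service_time = service_time   # seconds per transaction
        self.queue = 0

    def on_arrival(self, t):
        self.queue += 1
        schedule(t + self.service_time, self.on_done)

    def on_done(self, t):
        self.queue -= 1
        # Self-expression: if the backlog grows, adapt by reducing the service time.
        if self.queue > 3 and self.service_time > 0.05:
            self.service_time *= 0.8
            print(f"{t:7.3f}s  {self.name} adapts service time to "
                  f"{self.service_time:.3f}s (queue={self.queue})")

nodes = [Node("node-A", 0.20), Node("node-B", 0.25)]

# Transactions arrive at random nodes over the first 10 simulated seconds.
t = 0.0
while t < 10.0:
    t += random.expovariate(20.0)          # mean inter-arrival time 0.05 s
    schedule(t, random.choice(nodes).on_arrival)

# Event loop: strict chronological ordering of all node transactions.
while events:
    t_now, _, callback = heapq.heappop(events)
    callback(t_now)

for n in nodes:
    print(f"final service time of {n.name}: {n.service_time:.3f}s")
```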