79 results for pumping system design
Abstract:
The verification and validation of engineering designs are of primary importance as they directly influence production performance and ultimately define product functionality and customer perception. Research into verification and validation is wide-ranging, spanning tools employed during the digital design phase to methods deployed for prototype verification and validation. This paper reviews the standard definitions of verification and validation in the context of engineering design and then provides a coherent analysis and classification of these activities from preliminary design, through design in the digital domain, to the physical verification and validation of products and processes. The scope of the paper includes aspects of system design and demonstrates how complex products are validated in the context of their lifecycle. Industrial requirements are highlighted and research trends and priorities identified. © 2010 CIRP.
Abstract:
Software architecture plays an essential role in the high-level description of a system design, where structure and communication are emphasized. Despite its importance in the software engineering process, the lack of formal description and automated verification hinders the development of good software architecture models. In this paper, we present an approach to support the rigorous design and verification of software architecture models using Semantic Web technology. We view software architecture models as ontology representations, where their structures and communication constraints are captured by the Web Ontology Language (OWL) and the Semantic Web Rule Language (SWRL). Specific configurations of the design are represented as concrete instances of the ontology, to which their structures and dynamic behaviors must conform. Furthermore, ontology reasoning tools can be applied to perform various automated verification tasks on the design to ensure correctness, such as consistency checking, style recognition, and behavioral inference.
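As a rough illustration of this kind of approach, the sketch below encodes a tiny architecture ontology and runs a consistency check. It assumes the owlready2 Python library (with a Java reasoner available); the class names, property names and ontology IRI are invented for illustration and are not taken from the paper.

```python
# A minimal sketch of ontology-based architecture checking, assuming the
# owlready2 library; all class, property and ontology names are illustrative.
from owlready2 import (Thing, ObjectProperty, get_ontology, sync_reasoner,
                       OwlReadyInconsistentOntologyError)

onto = get_ontology("http://example.org/arch.owl")  # hypothetical IRI

with onto:
    class Component(Thing): pass          # architectural building block
    class Connector(Thing): pass          # communication element
    class connectsTo(ObjectProperty):     # structural constraint:
        domain = [Connector]              # connectors attach
        range = [Component]               # only to components

    # A concrete configuration is modelled as instances of the ontology.
    db = Component("database")
    rpc = Connector("rpc_link")
    rpc.connectsTo = [db]

# Consistency checking: a DL reasoner (HermiT via owlready2, requires Java)
# verifies that the configuration conforms to the declared constraints.
try:
    sync_reasoner()
    print("configuration is consistent")
except OwlReadyInconsistentOntologyError:
    print("configuration violates the architecture constraints")
```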
Abstract:
Wireless Sensor Network (WSN) systems have become increasingly popular in modern life. They have been widely used in many areas, such as smart homes and buildings, context-aware devices, and military applications. Despite this increasing usage, there is a lack of formal description and automated verification for WSN system design. In this paper, we present an approach to support the rigorous verification of WSN modeling using Semantic Web technology. We use the Web Ontology Language (OWL) and the Semantic Web Rule Language (SWRL) to define a meta-ontology for the modeling of WSN systems. Furthermore, we apply ontology reasoners to perform automated verification on customized WSN models and their instances. We demonstrate and evaluate our approach through a Light Control System (LCS) case study.
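To make the SWRL side of such an approach concrete, the sketch below attaches one rule to a toy WSN meta-ontology and classifies an instance with a reasoner. It assumes the owlready2 Python library with Pellet available; every class, property, threshold and instance name is a hypothetical stand-in, not the ontology used in the paper.

```python
# Toy WSN meta-ontology with one SWRL rule, assuming the owlready2 library.
# All names and the threshold are hypothetical; Pellet (Java) is required.
from owlready2 import (Thing, ObjectProperty, DataProperty, Imp,
                       get_ontology, sync_reasoner_pellet)

onto = get_ontology("http://example.org/wsn.owl")  # hypothetical IRI

with onto:
    class SensorNode(Thing): pass
    class LightController(Thing): pass
    class OverloadedNode(SensorNode): pass          # inferred, not asserted
    class controls(ObjectProperty):
        domain = [SensorNode]; range = [LightController]
    class controllerCount(DataProperty):
        domain = [SensorNode]; range = [int]

    # SWRL rule: a node driving more than 8 controllers is overloaded.
    rule = Imp()
    rule.set_as_rule("SensorNode(?n), controllerCount(?n, ?c), "
                     "greaterThan(?c, 8) -> OverloadedNode(?n)")

    node = SensorNode("node_1")
    node.controllerCount = [12]

# Run the Pellet reasoner so the rule is evaluated over the instances.
sync_reasoner_pellet(infer_property_values=True,
                     infer_data_property_values=True)
print(OverloadedNode in node.is_a)   # True if the rule fired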
Abstract:
The amplification of demand variation up a supply chain, widely termed 'the Bullwhip Effect', is disruptive, costly and something that supply chain management generally seeks to minimise. It was originally attributed to poor system design: deficiencies in policies, organisation structure and delays in material and information flow all lead to sub-optimal reorder point calculation. It has since been attributed to exogenous random factors such as uncertainties in demand, supply and distribution lead time, but these causes are not exclusive, as subsequent academic and operational studies have shown that orders and/or inventories can exhibit significant variability even if customer demand and lead time are deterministic. This widening range of possible causes of dynamic behaviour indicates that our understanding of the phenomenon is far from complete. One possible, yet previously unexplored, factor that may influence dynamic behaviour in supply chains is the application and operation of supply chain performance measures. Organisations monitoring and responding to their adopted key performance metrics will make operational changes, and this action may influence the level of dynamics within the supply chain, possibly degrading the performance of the very system they were intended to measure. To explore this, a plausible abstraction of the operational responses to the Supply Chain Council's SCOR® (Supply Chain Operations Reference) model was incorporated into a classic Beer Game distribution representation, using the dynamic discrete event simulation software Simul8. During the simulation the five SCOR Supply Chain Performance Attributes (Reliability, Responsiveness, Flexibility, Cost and Utilisation) were continuously monitored and compared to established targets. Operational adjustments to the reorder point, transportation modes and production capacity (where appropriate) were made for three independent supply chain roles, and the degree of dynamic behaviour in the supply chain was measured using the ratio of the standard deviation of upstream demand to the standard deviation of downstream demand. Factors employed to build the detailed model include variable retail demand, order transmission, transportation delays, production delays, capacity constraints, demand multipliers and demand averaging periods. Five dimensions of supply chain performance were monitored independently in three autonomous supply chain roles and operational settings adjusted accordingly. The uniqueness of this research stems from the application of the five SCOR performance attributes with modelled operational responses in a dynamic discrete event simulation model. The project makes its primary contribution to knowledge by measuring the impact of applying a representative performance measurement system on supply chain dynamics.
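As an aside, the bullwhip metric used here, the ratio of the standard deviation of upstream orders to that of downstream demand, can be illustrated with a toy order-up-to simulation in Python. The demand process, replenishment policy and parameters below are invented for illustration and are unrelated to the SCOR/Simul8 model in the thesis.

```python
# Toy illustration of the bullwhip metric: the ratio of the standard
# deviation of upstream orders to that of downstream demand.
# The order-up-to policy and all parameters are illustrative assumptions.
import random
import statistics

random.seed(1)

horizon = 500
demand = [random.gauss(100, 10) for _ in range(horizon)]   # retail demand

inventory, pipeline = 200.0, 0.0
lead_time, review_window = 2, 5
orders = []

for t, d in enumerate(demand):
    inventory -= d
    # Forecast demand as a moving average over the review window.
    recent = demand[max(0, t - review_window + 1): t + 1]
    forecast = sum(recent) / len(recent)
    # Order up to expected demand over the lead time plus a safety margin.
    target = forecast * (lead_time + 1) + 20
    order = max(0.0, target - inventory - pipeline)
    orders.append(order)
    pipeline += order
    if t >= lead_time:                     # delayed replenishment arrives
        arrived = orders[t - lead_time]
        inventory += arrived
        pipeline -= arrived

bullwhip = statistics.stdev(orders) / statistics.stdev(demand)
print(f"bullwhip ratio (orders vs demand): {bullwhip:.2f}")
```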
Abstract:
The application of any e-Solution promises significant returns. In particular, using internet technologies both within enterprises and across the supply (value) chain provides real opportunity, not only for operational improvement but also for innovative strategic positioning. However, significant questions obscure potential investment: it is not clear how any value will actually be created nor, importantly, how this value will be shared across the value chain. This paper describes a programme of research that is developing an enterprise simulator to provide a more fundamental understanding of the impact of e-Solutions across operational supply chains, in terms of both standard operational and financial measures of performance. An efficient supply chain reduces the total costs of operations by sharing accurate real-time information and coordinating inter-organizational business processes. This form of electronic link between organizations is known as business-to-business (B2B) e-Business. The financial measures go beyond simple cost calculations to real bottom-line performance by modelling the financial transactions that business processes generate. The paper shows how this enterprise simulator allows a complete supply chain to be modelled in this way across four key applications: control system design, virtual enterprises, pan-supply-chain performance metrics and a supporting e-Supply-chain design methodology.
Abstract:
The design and synthesis of safe, efficient non-viral vectors for gene delivery has attracted significant attention in recent years, due primarily to the severe side-effect profile reported with the use of their viral counterparts. Previous experiments have revealed that the strong interaction between the carriers and the nucleic acid may well hinder the release of the gene from the complex in the cytosol, adversely affecting transfection efficiency. However, incorporating reducible disulfide bonds within the delivery systems themselves, which are then cleaved in the glutathione-rich intracellular environment, may help in solving this puzzle. This review focuses on recent developments in these reducible carriers. The biological rationale and approaches to the synthesis of reducible vectors are discussed in detail. The in vitro and in vivo evaluations of reducible carriers are also summarized, and it is evident that they offer a promising approach to non-viral gene delivery system design.
Abstract:
This thesis presents improvements to optical transmission systems through the use of optical solitons as a digital transmission format, both theoretically and experimentally. An introduction to the main concepts and to the impairments that optical fibre imposes on pulse transmission is included, before introducing the concept of solitons in optically amplified communications and the problems of soliton system design. The theoretical work studies two fibre dispersion profiling schemes and a soliton launch improvement. The first provides superior pulse transmission by optimally tailoring the fibre dispersion to better follow the power, and hence nonlinearity, decay, thus allowing soliton transmission for longer amplifier spacings and shorter pulse widths than normally possible. The second profiling scheme examines the use of dispersion compensating fibre in the context of soliton transmission over existing, standard fibre systems. The limits for solitons in uncompensated standard fibre are assessed, before the potential benefits of dispersion compensating fibre included as part of each amplifier are shown. The third theoretical investigation provides a simple improvement to the propagation of solitons in a highly perturbed system. By introducing a section of fibre of the correct length prior to the first system amplifier span, the soliton shape can be better coupled into the system, thus providing an improved "average soliton" propagation model. The experimental work covers two areas. An important issue for soliton systems is pulse sources: three potential lasers are studied, two ring laser configurations and one semiconductor device with external pulse shaping. The second area studies soliton transmission using a recirculating loop, reviewing the advantages and drawbacks of such an experiment in system testing and design. One particular example of employing the recirculating loop is also examined, using a novel method of pulse shape stabilisation over long distances with low jitter. The future for nonlinear optical communications is considered with the thesis conclusions.
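For reference, pulse propagation in the systems described above is conventionally modelled by the nonlinear Schrödinger equation; the form below is the standard textbook expression and fundamental-soliton condition, not a result specific to this thesis.

```latex
% Standard NLSE for the pulse envelope A(z,t): group-velocity dispersion
% (beta_2), loss (alpha) and Kerr nonlinearity (gamma). The fundamental
% soliton arises when the nonlinearity balances the dispersion.
\[
  \frac{\partial A}{\partial z}
  + \frac{i\beta_2}{2}\,\frac{\partial^2 A}{\partial t^2}
  + \frac{\alpha}{2}\,A
  = i\gamma\,\lvert A\rvert^2 A,
  \qquad
  A(0,t) = \sqrt{P_0}\,\operatorname{sech}\!\left(\frac{t}{T_0}\right),
  \quad
  P_0 = \frac{\lvert\beta_2\rvert}{\gamma T_0^{2}}.
\]
```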
Abstract:
A major application of computers has been to control physical processes, in which the computer is embedded within some larger physical process and is required to control concurrent physical processes. The main difficulty with these systems is their event-driven characteristics, which complicate their modelling and analysis. Although a number of researchers in the process systems community have approached the problems of modelling and analysing such systems, there is still a lack of standardised software development formalisms for system (controller) development, particularly at the early stages of the system design cycle. This research forms part of a larger research programme concerned with the development of real-time process-control systems in which software is used to control concurrent physical processes. The general objective of the research in this thesis is to investigate the use of formal techniques in the analysis of such systems at the early stages of their development, with a particular bias towards application to high-speed machinery. Specifically, the research aims to generate a standardised software development formalism for real-time process-control systems, particularly for software controller synthesis. In this research, a graphical modelling formalism called Sequential Function Chart (SFC), a variant of Grafcet, is examined. SFC, which is defined in the international standard IEC 1131 as a graphical description language, has been used widely in industry and has achieved an acceptable level of maturity and acceptance. A comparative study between SFC and Petri nets is presented in this thesis. To overcome identified inaccuracies in SFC, a formal definition of the firing rules for SFC is given. To provide a framework in which SFC models can be analysed formally, an extended time-related Petri net model for SFC is proposed and the transformation method is defined. The SFC notation lacks a systematic way of synthesising system models from real-world systems. Thus a standardised approach to the development of real-time process-control systems is required, such that the system (software) functional requirements can be identified, captured and analysed. A rule-based approach and a method called the system behaviour driven method (SBDM) are proposed as a development formalism for real-time process-control systems.
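The kind of firing rule formalised for SFC and Petri nets can be illustrated with a minimal marked Petri-net sketch in Python. The data structures and the example net are invented for illustration; they are not the thesis's formal SFC-to-Petri-net transformation.

```python
# Minimal marked Petri net: a transition is enabled when every input place
# holds at least the required tokens, and firing moves tokens from inputs
# to outputs. This mirrors the kind of firing rule formalised for SFC.
from dataclasses import dataclass, field

@dataclass
class Transition:
    name: str
    inputs: dict = field(default_factory=dict)    # place -> tokens required
    outputs: dict = field(default_factory=dict)   # place -> tokens produced

def enabled(marking: dict, t: Transition) -> bool:
    return all(marking.get(p, 0) >= n for p, n in t.inputs.items())

def fire(marking: dict, t: Transition) -> dict:
    if not enabled(marking, t):
        raise ValueError(f"transition {t.name} is not enabled")
    new = dict(marking)
    for p, n in t.inputs.items():
        new[p] -= n
    for p, n in t.outputs.items():
        new[p] = new.get(p, 0) + n
    return new

# Example: a step 'idle' activating a step 'running' when a start event fires.
start = Transition("start", inputs={"idle": 1, "start_event": 1},
                   outputs={"running": 1})
marking = {"idle": 1, "start_event": 1, "running": 0}
print(fire(marking, start))   # {'idle': 0, 'start_event': 0, 'running': 1}
```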
Abstract:
This thesis presents a theoretical investigation of the application of advanced modulation formats in high-speed fibre lightwave systems. The first part of the work focuses on the numerical optimisation of dense wavelength division multiplexing (DWDM) system design. We employ advanced spectral-domain filtering techniques and carrier pulse reshaping. We then apply these optimisation methods to investigate the spectral and temporal domain characteristics of advanced modulation formats in fibre-optic telecommunication systems. Next, we investigate numerical methods used in detecting and measuring the system performance of advanced modulation formats. We then numerically study the combination of return-to-zero differential phase-shift keying (RZ-DPSK) with advanced photonic devices. Finally, we analyse the dispersion management of Nx40 Gbit/s RZ-DPSK transmission applied to a commercial terrestrial lightwave system.
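Numerical propagation in such DWDM/RZ-DPSK studies is commonly performed with the split-step Fourier method. The sketch below shows the idea with illustrative fibre parameters; it is a generic example, not the simulation code used in the thesis.

```python
# Minimal split-step Fourier propagation of a pulse envelope through fibre,
# alternating a linear (dispersion + loss) step in the frequency domain with
# a nonlinear (Kerr) step in the time domain. Parameters are illustrative.
import numpy as np

def split_step(A0, dt, dz, n_steps, beta2=-21e-27, gamma=1.3e-3, alpha=0.0):
    """Propagate envelope A0 (sqrt(W)) over n_steps segments of length dz (m)."""
    A = A0.astype(complex)
    w = 2 * np.pi * np.fft.fftfreq(A.size, d=dt)            # angular frequency
    lin = np.exp((1j * beta2 / 2 * w**2 - alpha / 2) * dz)  # dispersion + loss
    for _ in range(n_steps):
        A = np.fft.ifft(np.fft.fft(A) * lin)                # linear step
        A = A * np.exp(1j * gamma * np.abs(A)**2 * dz)      # nonlinear step
    return A

# Example: a 10 ps Gaussian pulse over 50 km of fibre in 1 km steps.
t = np.arange(-512, 512) * 0.1e-12                          # 0.1 ps time grid
A0 = np.sqrt(1e-3) * np.exp(-t**2 / (2 * (10e-12)**2))      # ~1 mW peak power
A = split_step(A0, dt=0.1e-12, dz=1e3, n_steps=50)
print(f"peak power after 50 km: {np.max(np.abs(A)**2) * 1e3:.3f} mW")
```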
Abstract:
Liquid-liquid extraction has long been known as a unit operation that plays an important role in industry. The process is well known for its complexity and sensitivity to operating conditions. This thesis presents an attempt to explore the dynamics and control of this process using a systematic approach and state-of-the-art control system design techniques. The process was first studied experimentally under carefully selected operating conditions, which resemble the ranges employed practically under stable and efficient conditions. Data were collected at steady-state conditions using adequate sampling techniques for the dispersed and continuous phases, as well as during the transients of the column, with the aid of a computer-based online data logging system and online concentration analysis. A stagewise single-stage backflow model was improved to mimic the dynamic operation of the column. The developed model accounts for the variation in hydrodynamics, mass transfer and physical properties throughout the length of the column. End effects were treated by the addition of stages at the column entrances. Two parameters were incorporated in the model: a mass transfer weight factor, to correct for the assumption of no mass transfer in the settling zones at each stage, and backmixing coefficients, to handle the axial dispersion phenomena encountered in the course of column operation. The parameters were estimated by minimising the differences between the experimental and the model-predicted concentration profiles at steady-state conditions using a non-linear optimisation technique. The estimated values were then correlated as functions of the operating parameters and were incorporated in the model equations. The model equations comprise a stiff differential-algebraic system, which was solved using the GEAR ODE solver. The calculated concentration profiles were compared to those experimentally measured, and very good agreement between the two profiles was achieved, within a relative error of ±2.5%. The developed rigorous dynamic model of the extraction column was used to derive linear time-invariant reduced-order models that relate the input variables (agitator speed, solvent feed flowrate and concentration, feed concentration and flowrate) to the output variables (raffinate concentration and extract concentration) using the asymptotic method of system identification. The reduced-order models were shown to be accurate in capturing the dynamic behaviour of the process, with a maximum modelling prediction error of 1%. The simplicity and accuracy of the derived reduced-order models allow for control system design and analysis of such complicated processes. The extraction column is a typical multivariable process, with the agitator speed and solvent feed flowrate considered as manipulated variables, the raffinate concentration and extract concentration as controlled variables, and the feed concentration and feed flowrate as disturbance variables. The control system design of the extraction process was tackled as a multi-loop decentralised SISO (Single Input Single Output) system as well as a centralised MIMO (Multi-Input Multi-Output) system, using both conventional and model-based control techniques such as IMC (Internal Model Control) and MPC (Model Predictive Control). The control performance of each control scheme was studied in terms of stability, speed of response, sensitivity to modelling errors (robustness), setpoint tracking capabilities and load rejection.
For decentralised control, multiple loops were assigned to pair each manipulated variable with each controlled variable according to the interaction analysis and other pairing criteria such as the relative gain array (RGA) and singular value decomposition (SVD). The loops pairing rotor speed with raffinate concentration and solvent flowrate with extract concentration showed weak interaction. Multivariable MPC showed more effective performance than the other, conventional techniques, since it accounts for loop interactions, time delays, and input-output variable constraints.
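The RGA-based pairing analysis mentioned above amounts to a short computation, sketched below. The 2x2 steady-state gain matrix is an invented example, not the identified extraction-column model.

```python
# Relative Gain Array (RGA) for loop pairing: the elementwise product of the
# gain matrix G with the transpose of its inverse. The gain values below are
# invented for illustration only.
import numpy as np

def rga(G: np.ndarray) -> np.ndarray:
    """Elementwise product of G with the transpose of its inverse."""
    return G * np.linalg.inv(G).T

# Hypothetical gains: rows = (raffinate conc., extract conc.),
# columns = (rotor speed, solvent flowrate).
G = np.array([[1.8, 0.4],
              [0.5, 2.2]])
print(rga(G))
# Diagonal elements close to 1 support pairing rotor speed with raffinate
# concentration and solvent flowrate with extract concentration.
```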
Abstract:
The separation performance of a semicontinuous counter-current chromatographic refiner (SCCR7), consisting of twelve 5.4 cm i.d. x 75 cm long columns packed with calcium-charged cross-linked polystyrene resin (KORELA VO7C), was optimised. An industrial barley syrup was used containing 42% fructose, 52% glucose and 6% maltose and oligosaccharides. The effects of temperature, flow rates and concentration on the distribution coefficients were evaluated and quantified by deriving general relationships. The effects of flow rates, feed composition and concentration on the separation performance of the SCCR7 were identified, and general relationships between them and the switch time, which was found to be the controlling parameter, were developed. Fructose-rich (FRP) and glucose-rich (GRP) product purities of 99.9% were obtained at 18.6% w/v feed concentration. When a 66% w/v feed concentration was used and a product splitting technique was employed, the throughput was 32.1 kg sugar solids/m3 resin/hr. The GRP contained less than 4.5% fructose, the FRP was over 95% pure, and the respective concentrations were 22.56 and 11.29% w/v. Over 94% of the glucose and 95.78% of the fructose in the feed were recovered in the GRP and FRP respectively. By recycling the dilute product split fractions, the GRP and FRP concentrations were increased to 25.4 and 12.96% w/v; the FRP was 90.2% pure and the GRP contained 6.69% w/v fructose. A theoretical link between batch and semicontinuous chromatographic equipment has been determined. A computer simulation was developed that successfully predicted the purging concentration profiles at 'pseudo-equilibrium', as well as certain system design parameters. A further important aspect of the work has been to study the behaviour of chromatographic bioreactor-separators. Such batch systems, of 5.4 cm i.d. and lengths varying between 30 and 230 cm, were used to investigate the effect of scaling up on the conversion of sucrose into dextran and fructose in the presence of the dextransucrase enzyme. Conversions of over 80% were achieved at 4 hr sucrose residence times. The crude dextransucrase was purified using centrifugation, ultrafiltration and cross-flow microfiltration techniques. Better enzyme stability was obtained by first separating the non-solid impurities using cross-flow microfiltration, and then removing the cells from the enzyme immediately before use by continuous centrifugation.
Abstract:
This thesis addresses the viability of automatic speech recognition for control room systems; with careful system design, automatic speech recognition (ASR) devices can be a useful means of human-computer interaction for specific types of task. These tasks can be defined as complex verbal activities, such as command and control, and can be paired with spatial tasks, such as monitoring, without detriment. It is suggested that ASR use be confined to routine plant operation, as opposed to critical incidents, due to possible problems of stress on the operators' speech. It is proposed that using ASR will require operators to adapt a commonly used skill to cater for a novel use of speech. Before using the ASR device, new operators will require some form of training. It is shown that a demonstration by an experienced user of the device can lead to better performance than instructions. Thus, a relatively cheap and very efficient form of operator training can be supplied by demonstration from experienced ASR operators. From a series of studies into speech-based interaction with computers, it is concluded that the interaction should be designed to capitalise upon the tendency of operators to use short, succinct, task-specific styles of speech. From studies comparing different types of feedback, it is concluded that operators should be given screen-based feedback, rather than auditory feedback, for control room operation. Feedback will take two forms: the use of the ASR device will require recognition feedback, which is best supplied using text, while the performance of a process control task will require task feedback integrated into the mimic display. This latter feedback can be either textual or symbolic, but it is suggested that symbolic feedback will be more beneficial. Related to both interaction style and feedback is the issue of handling recognition errors. These should be corrected by simple command repetition practices, rather than by error-handling dialogues. This method of error correction is held to be non-intrusive to primary command and control operations. The thesis also addresses some of the problems of user error in ASR use, and provides a number of recommendations for its reduction.
Abstract:
Many manufacturing companies have long endured the problems associated with the presence of 'islands of automation'. Due to rapid computerisation, 'islands' such as Computer-Aided Design (CAD), Computer-Aided Manufacturing (CAM), Flexible Manufacturing Systems (FMS) and Material Requirements Planning (MRP) have emerged and, with a lack of co-ordination, often lead to inefficient performance of the overall system. The main objective of Computer-Integrated Manufacturing (CIM) technology is to form a cohesive network between these islands. Unfortunately, a commonly used approach, the centralised system approach, has imposed major technical constraints and design complications on development strategies. As a consequence, small companies have experienced difficulties in participating in CIM technology. The research described in this thesis has aimed to examine alternative approaches to CIM system design. Through research and experimentation, the cellular system approach, which has existed in the form of manufacturing layouts, has been found to simplify the complexity of an integrated manufacturing system, leading to better control and far higher system flexibility. Based on the cellular principle, some central management functions have also been distributed to smaller cells within the system. This concept is known, specifically, as distributed planning and control. Through the development of an embryo cellular CIM system, the influence of both the cellular principle and the distribution methodology has been evaluated. Based on the evidence obtained, it has been concluded that the distributed planning and control methodology can greatly enhance cellular features within an integrated system. Both the cellular system approach and the distributed control concept will therefore make significant contributions to the design of future CIM systems, particularly systems designed with respect to small company requirements.
Abstract:
In response to increasing international competitiveness, many manufacturing businesses are rethinking their management strategies and philosophies towards achieving a computer-integrated environment. The explosive growth in Advanced Manufacturing Technology (AMT) has resulted in the formation of functional "Islands of Automation" such as Computer Aided Design (CAD), Computer Aided Manufacturing (CAM), Computer Aided Process Planning (CAPP) and Manufacturing Resources Planning (MRPII). This has resulted in an environment with focussed areas of excellence but poor overall efficiency, co-ordination and control. The main role of Computer Integrated Manufacturing (CIM) is to integrate these islands of automation and develop a totally integrated and controlled environment. However, the various perceptions of CIM, although developing, remain focussed on a very narrow integration scope and have consequently resulted in merely linked islands of automation with little improvement in overall co-ordination and control. The research described in this thesis develops and examines a more holistic view of CIM, based on the integration of various business elements. One particular business element, namely control, has been shown to have a multi-faceted and underpinning relationship with the CIM philosophy. This relationship impacts various CIM system design aspects, including the CIM business analysis and modelling technique, the specification of systems integration requirements, the CIM system architectural form and the degree of business redesign. The research findings show that fundamental changes to CIM system design are required; these are incorporated in a generic CIM design methodology. The effect and influence of this holistic view of CIM on a manufacturing business has been evaluated through various industrial case study applications. Based on the evidence obtained, it has been concluded that this holistic, control-based approach to CIM can provide a greatly improved means of achieving a totally integrated and controlled business environment. This generic CIM methodology will therefore make a significant contribution to the planning, modelling, design and development of future CIM systems.
Abstract:
The high capital cost of robots prohibits their economic application. One method of making their application more economic is to increase their operating speed. This can be done in a number of ways, e.g. redesigning the robot geometry, improving the actuators and improving the control system design. In this thesis the control system design is considered. The literature review identifies two aspects of robot control system design that have not been addressed in any great detail by previous researchers: how significant are the coupling terms in the dynamic equations of the robot, and what is the effect of the coupling terms on the performance of a number of typical independent axis control schemes? The work in this thesis addresses these two questions in detail. A program was designed to automatically calculate the path and trajectory and to calculate the significance of the coupling terms in an example application of a robot manipulator tracking a part on a moving conveyor. The inertial and velocity coupling terms were shown to be significant when the manipulator was considered to be directly driven. A simulation of the robot manipulator following the planned trajectory was established in order to assess the performance of the independent axis control strategies. The inertial coupling was shown to reinforce the control torque at the corner points of the trajectory, where there was an abrupt demand for acceleration in each axis but of opposite sign. This reduced the tracking error; however, this effect was not controllable. A second effect was due to the velocity coupling terms: at high trajectory speeds it was shown, by means of a root locus analysis, that the velocity coupling terms caused the system to become unstable.
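For context, the coupling terms discussed above appear explicitly in the standard rigid-body manipulator dynamics; the equation below is the textbook form, not a result specific to this thesis.

```latex
% Standard rigid-body manipulator dynamics: the off-diagonal entries of M(q)
% are the inertial coupling terms, and C(q,\dot{q})\dot{q} collects the
% velocity (Coriolis/centrifugal) coupling terms discussed above.
\[
  \tau \;=\; M(q)\,\ddot{q} \;+\; C(q,\dot{q})\,\dot{q} \;+\; g(q).
\]
% An independent-axis controller for joint i acts only on M_{ii}(q)\ddot{q}_i,
% treating the coupling terms as disturbances.
```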