942 results for Machine-tools - numerical control
Abstract:
The rapid developments in computer technology have resulted in a widespread use of discrete event dynamic systems (DEDSs). This type of system is complex because it exhibits properties such as concurrency, conflict and non-determinism. It is therefore important to model and analyse such systems before implementation to ensure safe, deadlock-free and optimal operation. This thesis investigates current modelling techniques and describes Petri net theory in more detail. It reviews top-down, bottom-up and hybrid Petri net synthesis techniques that are used to model large systems and introduces an object-oriented methodology to enable modelling of larger and more complex systems. Designs obtained by this methodology are modular, easy to understand and allow re-use of designs. Control is the next logical step in the design process. This thesis reviews recent developments in the control of DEDSs and investigates the use of Petri nets in the design of supervisory controllers. The scheduling of exclusive use of resources is investigated, an efficient Petri net based scheduling algorithm is designed, and a re-configurable controller is proposed. To enable the analysis and control of large and complex DEDSs, an object-oriented C++ software tool kit was developed and used to implement a Petri net analysis tool and Petri net scheduling and control algorithms. Finally, the methodology was applied to two industrial DEDSs: a prototype can sorting machine developed by Eurotherm Controls Ltd., and a semiconductor testing plant belonging to SGS Thomson Microelectronics Ltd.
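The thesis's analysis toolkit was written in C++; as a rough illustration of the place/transition semantics such tools build on, here is a minimal Python sketch. The net below (two machines sharing one resource) is an invented example, not one of the industrial case studies.

```python
# Minimal place/transition Petri net sketch: markings are dicts of
# place -> token count; transitions carry weighted input and output arcs.

class PetriNet:
    def __init__(self, transitions):
        # transitions: {name: (input_arcs, output_arcs)} with arc weights
        self.transitions = transitions

    def enabled(self, marking):
        return [t for t, (ins, _) in self.transitions.items()
                if all(marking.get(p, 0) >= w for p, w in ins.items())]

    def fire(self, marking, t):
        ins, outs = self.transitions[t]
        m = dict(marking)
        for p, w in ins.items():
            m[p] -= w
        for p, w in outs.items():
            m[p] = m.get(p, 0) + w
        return m

    def reachable(self, m0, limit=10000):
        def key(m):  # ignore places holding zero tokens
            return tuple(sorted((p, n) for p, n in m.items() if n))
        seen, frontier = {key(m0)}, [m0]
        while frontier and len(seen) < limit:
            m = frontier.pop()
            for t in self.enabled(m):
                m2 = self.fire(m, t)
                if key(m2) not in seen:
                    seen.add(key(m2))
                    frontier.append(m2)
        return seen

# Two machines competing for one shared resource (mutual exclusion).
net = PetriNet({
    "start1": ({"idle1": 1, "res": 1}, {"busy1": 1}),
    "end1":   ({"busy1": 1},           {"idle1": 1, "res": 1}),
    "start2": ({"idle2": 1, "res": 1}, {"busy2": 1}),
    "end2":   ({"busy2": 1},           {"idle2": 1, "res": 1}),
})
# 3 reachable markings: only one machine can hold the resource at a time.
print(len(net.reachable({"idle1": 1, "idle2": 1, "res": 1})))
```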
Abstract:
Open-loop operation of the stepping motor exploits the inherent advantages of the machine. For near optimum operation in this mode, however, an accurate system model is required to facilitate controller design. Such a model must be comprehensive and take account of the non-linearities inherent in the system. The result is a complex formulation which can be made manageable with a computational aid. A digital simulation of a hybrid type stepping motor and its associated drive circuit is proposed. The simulation is based upon a block diagram model which includes reasonable approximations to the major non-linearities, and is shown to yield accurate performance predictions. The determination of the transfer functions is based upon consideration of the physical processes involved rather than upon direct input-output measurements. The effects of eddy currents, saturation, hysteresis, drive circuit characteristics and non-linear torque-displacement characteristics are considered, and methods of determining transfer functions which take account of these effects are offered. The static torque-displacement characteristic is considered in detail and a model is proposed which predicts static torque for any combination of phase currents and shaft position. Methods of predicting the characteristic directly from machine geometry are investigated. Drive circuit design for high efficiency operation is considered and a model of a bipolar, bilevel circuit is proposed. The transfers between stator voltage and stator current and between stator current and air gap flux are complicated by the effects of eddy currents, saturation and hysteresis. Frequency response methods, combined with average inductance measurements, are shown to yield reasonable transfer functions. The modelling procedure and subsequent digital simulation are concluded to be a powerful method of non-linear analysis.
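As a hedged illustration of what such a digital simulation involves, the sketch below integrates a much-simplified open-loop stepping motor model: a sinusoidal static torque-displacement characteristic with viscous friction, driven by a constant-rate step command. All parameter values are hypothetical, and the non-linearities treated in the thesis (eddy currents, saturation, hysteresis, drive circuit dynamics) are omitted.

```python
# Much-simplified open-loop hybrid stepping motor simulation (hypothetical
# parameters; sinusoidal torque-displacement characteristic only).
import math

J, B = 2e-5, 1e-4               # rotor inertia (kg m^2), viscous friction (N m s/rad)
T_pk, Nr = 0.3, 50              # peak static torque (N m), rotor tooth number
step_angle = math.pi / (2 * Nr) # full-step angle of a 2-phase hybrid motor (1.8 deg)

def simulate(step_rate_hz, t_end=0.2, dt=1e-5):
    theta, omega, cmd = 0.0, 0.0, 0.0
    next_step, t = 0.0, 0.0
    while t < t_end:
        if t >= next_step:                        # advance the commanded position
            cmd += step_angle
            next_step += 1.0 / step_rate_hz
        torque = -T_pk * math.sin(Nr * (theta - cmd))  # restoring torque toward cmd
        omega += dt * (torque - B * omega) / J         # semi-implicit Euler update
        theta += dt * omega
        t += dt
    return theta, cmd

theta, cmd = simulate(500.0)
print(f"final rotor angle {theta:.4f} rad vs commanded {cmd:.4f} rad")
```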
Abstract:
The thesis describes an investigation into methods for the specification, design and implementation of computer control systems for flexible manufacturing machines comprising multiple, independent, electromechanically-driven mechanisms. An analysis is made of the elements of conventional mechanically-coupled machines so that the operational functions of these elements may be identified. This analysis is used to define the scope of requirements necessary to specify the format, function and operation of a flexible, independently driven mechanism (IDM) machine. A discussion of how this type of machine can accommodate the modern manufacturing needs of high speed and flexibility is presented. A sequential method of capturing requirements for such machines is detailed, based on a hierarchical partitioning of machine requirements from product to independent drive mechanism. A classification of mechanisms using notations, including data flow diagrams and Petri-nets, is described which supports capture and allows validation of requirements. A generic design for a modular IDM machine controller is derived, based upon the hierarchy of control identified in these machines. A two-mechanism experimental machine is detailed which is used to demonstrate the application of the specification, design and implementation techniques. A computer controller prototype and a fully flexible implementation for the IDM machine, based on Petri-net models described using the concurrent programming language Occam, are detailed. The ability of this modular computer controller to support flexible, safe and fault-tolerant operation of the two intermittent motion, discrete-synchronisation independent drive mechanisms is presented. The application of the machine development methodology to industrial projects is established.
Abstract:
This work examines prosody modelling for the Standard Yorùbá (SY) language in the context of computer text-to-speech synthesis applications. The thesis of this research is that it is possible to develop a practical prosody model by using appropriate computational tools and techniques which combine acoustic data with an encoding of the phonological and phonetic knowledge provided by experts. Our prosody model is conceptualised around a modular, holistic framework. The framework is implemented using the Relational Tree (R-Tree) technique (Ehrich and Foith, 1976). The R-Tree is a sophisticated data structure that provides a multi-dimensional description of a waveform. A Skeletal Tree (S-Tree) is first generated using algorithms based on the tone phonological rules of SY. Subsequent steps update the S-Tree by computing the numerical values of the prosody dimensions. To implement the intonation dimension, fuzzy control rules were developed based on data from native speakers of Yorùbá. The Classification And Regression Tree (CART) and Fuzzy Decision Tree (FDT) techniques were tested for modelling the duration dimension, and the FDT was selected on the basis of its better performance. An important feature of our R-Tree framework is its flexibility, in that it facilitates the independent implementation of the different dimensions of prosody, i.e. duration and intonation, using different techniques and their subsequent integration. Our approach provides a flexible and extensible model that can also be used to implement, study and explain the theory behind aspects of the phenomena observed in speech prosody.
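By way of illustration of the duration-dimension modelling, the sketch below fits a CART-style regression tree to synthetic syllable data using scikit-learn's DecisionTreeRegressor; the feature set and data are invented, and the thesis's preferred Fuzzy Decision Tree is not shown.

```python
# Hedged sketch: CART-style syllable duration prediction with scikit-learn's
# DecisionTreeRegressor. Features and synthetic data are illustrative only,
# not the thesis's Yorùbá corpus or its FDT model.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n = 500
# Features: tone class (0=low, 1=mid, 2=high), position in phrase (0..1),
# syllable structure (0=V, 1=CV), speaking-rate factor.
X = np.column_stack([
    rng.integers(0, 3, n),
    rng.random(n),
    rng.integers(0, 2, n),
    rng.normal(1.0, 0.1, n),
])
# Synthetic "true" duration in ms: longer phrase-finally and for CV syllables.
y = (120 + 60 * X[:, 1] + 25 * X[:, 2] - 10 * X[:, 0] + rng.normal(0, 8, n)) * X[:, 3]

cart = DecisionTreeRegressor(max_depth=4, min_samples_leaf=20).fit(X, y)
print("predicted duration (ms):", round(float(cart.predict([[2, 0.9, 1, 1.0]])[0]), 1))
```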
Abstract:
The concept of a task is fundamental to the discipline of ergonomics. Approaches to the analysis of tasks began in the early 1900s. These approaches have evolved and developed to the present day, when there is a vast array of methods available. Some of these methods are specific to particular contexts or applications, while others are more general. However, whilst many of these analyses allow tasks to be examined in detail, they do not act as tools to aid the design process or the designer. The present thesis examines the use of task analysis in a process control context, and in particular its use to specify operator information and display requirements in such systems. The first part of the thesis examines the theoretical aspects of task analysis and presents a review of the methods, issues and concepts relating to it. A review of over 80 methods of task analysis was carried out to form a basis for the development of a task analysis method to specify operator information requirements in industrial process control contexts. Of the methods reviewed, Hierarchical Task Analysis was selected to provide such a basis and was developed to meet the criteria outlined for such a method. The second section outlines the practical application and evolution of the developed task analysis method. Four case studies were used to examine the method in an empirical context; they represent a range of plant contexts and types: complex and simple, batch and continuous, and high-risk and low-risk processes. The theoretical and empirical issues are drawn together, and a method is developed to provide a task analysis technique to specify operator information requirements and the first stages of a tool to aid the design of VDU displays for process control.
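As a rough sketch of how a Hierarchical Task Analysis might be represented and walked to extract operator information requirements, the following uses an invented two-subtask example (not one of the four case studies, and not the thesis's actual notation).

```python
# Minimal Hierarchical Task Analysis (HTA) sketch: tasks form a tree, each
# parent carries a "plan" stating how subtasks are ordered. The example
# content is hypothetical.
from dataclasses import dataclass, field

@dataclass
class Task:
    goal: str
    info_needed: list = field(default_factory=list)  # operator information requirements
    plan: str = ""                                    # how subtasks are sequenced
    subtasks: list = field(default_factory=list)

def information_requirements(task, depth=0):
    """Walk the HTA tree and list the information each (sub)task requires."""
    yield depth, task.goal, task.info_needed
    for sub in task.subtasks:
        yield from information_requirements(sub, depth + 1)

hta = Task(
    goal="Start up batch reactor",
    plan="Do 1, then 2; repeat 2 until temperature is stable",
    subtasks=[
        Task("1. Charge reactor", info_needed=["vessel level", "valve states"]),
        Task("2. Bring to temperature", info_needed=["reactor temperature", "heater power"]),
    ],
)

for depth, goal, info in information_requirements(hta):
    print("  " * depth + f"{goal}: {', '.join(info) or '-'}")
```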
Abstract:
The main objective of the work presented in this thesis is to investigate the two sides of the flute, the face and the heel, of a twist drill. The flute face was designed to yield straight diametral lips which could be extended to eliminate the chisel edge, so that a single cutting edge is obtained. Since drill rigidity and space for chip conveyance have to be a compromise, a theoretical expression is deduced which enables optimum chip disposal capacity to be described in terms of drill parameters; this expression is used to describe the flute heel side. Another main objective is to study the effect on drill performance of changing the conventional drill flute. Drills were manufactured according to the new flute design, and tests were run to compare the performance of a conventional-flute drill with the non-conventional design put forward. The results showed that a 50% reduction in thrust force and approximately an 18% reduction in torque were attained for the new design. The flank wear was measured at the outer corner and found to be less for the new design than for the conventional one in the majority of cases. Hole quality (roundness, size and roughness) was also considered as a further aspect of drill performance. Improvement in hole quality is shown to arise under certain cutting conditions; accordingly, it might be possible to use a hole produced in one pass of the new drill where previously a drilled and reamed hole would have been required. A subsidiary objective is to design the form milling cutter that should be employed for milling the foregoing special flute from the drill blank, allowing for the interference effect; a mathematical analysis in conjunction with computing techniques is used. To control the grinding parameters, a prototype drill grinder was designed and built upon the framework of an existing Cincinnati cutter grinder. The design and build of the new grinder is based on a computer aided drill point geometry analysis. In addition to the conical grinding concept, the new grinder is also used to produce a spherical point, utilizing a computer aided drill point geometry analysis.
Abstract:
The advent of the harmonic neutralised shunt Converter Compensator as a practical means of reactive power compensation in power transmission systems has cleared the ground for wider application of this type of equipment. An experimental 24-pulse voltage sourced converter has been successfully applied in controlling the terminal power factor of a 1.5 kW, 240 V three phase cage rotor induction motor, whose winding has been used in place of the usual phase shifting transformers. To achieve this, modifications have been made to the conventional stator winding of the induction machine. These include an unconventional phase spread and the facilitation of compensator connections to selected tapping points between stator coils, giving a three phase winding with a twelve phase connection to the twenty four pulse converter. Theoretical and experimental assessments of the impact of these modifications and of the attachment of the compensator have shown that there is a slight reduction in the torque developed at a given slip and in the combined system efficiency. There is also an increase in the noise level, a further consequence of the harmonics. The stator leakage inductance gave inadequate coupling reactance between the converter and the effective voltage source, necessitating the use of external inductors in each of the twelve phases. The terminal power factor is fully controllable when the induction machine is used either as a motor or as a generator.
Abstract:
A re-examination of fundamental concepts and a formal structuring of the waveform analysis problem is presented in Part I. For example, the nature of frequency is examined, and a novel alternative to the classical methods of detection is proposed and implemented which has the advantages of speed and independence from amplitude. Waveform analysis provides the link between Parts I and II. Part II is devoted to Human Factors and the Adaptive Task Technique. The historical, technical and intellectual development of the technique is traced in a review which examines the evidence of its advantages relative to non-adaptive, fixed task methods of training, skill assessment and man-machine optimisation. A second review examines research evidence on the effect of vibration on manual control ability. Findings are presented in terms of percentage increment or decrement in performance relative to performance without vibration in the range 0-0.6 RMS 'g'. Primary task performance was found to vary by as much as 90% between tasks at the same RMS 'g'; differences in task difficulty accounted for this difference. Within tasks, vibration-added difficulty accounted for the effects of vibration intensity. Secondary tasks were found to be largely insensitive to vibration, except secondaries which involved fine manual adjustment of minor controls. Three experiments are reported next in which an adaptive technique was used to measure the percentage task difficulty added by vertical random and sinusoidal vibration to a 'Critical Compensatory Tracking' task. At vibration intensities between 0 and 0.09 RMS 'g' it was found that random vibration added (24.5 x RMS 'g')/7.4 x 100% to the difficulty of the control task. An equivalence relationship between random and sinusoidal vibration effects was established based upon added task difficulty. Waveform analyses applied to the experimental data served to validate phase plane analysis and uncovered the development of a control strategy and possibly a vibration isolation strategy. The submission ends with an appraisal of the subjects mentioned in the thesis title.
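A hedged worked example of the quoted empirical relationship, interpreted exactly as written and applied only within the stated intensity range:

```python
# Worked example of the abstract's empirical relationship: added task
# difficulty (%) = (24.5 * RMS 'g') / 7.4 * 100, taken exactly as quoted and
# valid only over the stated 0-0.09 RMS 'g' range.
def added_difficulty_percent(rms_g: float) -> float:
    if not 0.0 <= rms_g <= 0.09:
        raise ValueError("expression was fitted only for 0-0.09 RMS 'g'")
    return (24.5 * rms_g) / 7.4 * 100.0

for g in (0.03, 0.06, 0.09):
    print(f"RMS 'g' = {g:.2f} -> ~{added_difficulty_percent(g):.1f}% added difficulty")
```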
Abstract:
This thesis deals with the problems associated with the planning and control of production, with particular reference to a small aluminium die casting company. The main problem areas were identified as: (a) a need to be able to forecast the customers' demands upon the company's facilities; (b) a need to produce a manufacturing programme in which the output of the foundry (or die casting section) was balanced with the available capacity in the machine shop; (c) the need to ensure that the resultant system enabled the company's operating budget to have a reasonable chance of being achieved. At the commencement of the research work the major customers were members of the automobile industry and had their own systems of forecasting, from which they issued manufacturing schedules to their component suppliers. The errors in the forecasts were analysed and their distributions noted. Using these distributions, the customer's forecast could be modified so that the final demand could be met with a known degree of confidence. Before a manufacturing programme could be developed, the actual manufacturing system had to be reviewed, and it was found that, as with many small companies, there was a remarkable lack of formal control and written data. Relevant data with regard to the components and the manufacturing process therefore had to be collected and analysed. The foundry process was fixed, but the secondary machining operations were analysed by a technique similar to Component Flow Analysis and, as a result, the machines were arranged in a series of flow lines. A system of manual production control was proposed and, for comparison, a local computer bureau was approached and a system proposed incorporating the production of additional management information. These systems are compared, their relative merits discussed, and a proposal made for implementation.
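As a hedged illustration of the forecast-modification idea, the sketch below inflates a customer schedule to a chosen confidence level assuming approximately normal forecast errors; the error statistics are invented, not the thesis's measured distributions.

```python
# Hedged sketch: inflate a customer's forecast so final demand is covered with
# a chosen confidence level, assuming forecast errors (actual - forecast) are
# approximately normal. The error statistics here are invented examples.
from statistics import NormalDist

def adjusted_forecast(forecast_qty, err_mean, err_std, confidence=0.95):
    """Quantity to plan for so that P(actual demand <= plan) >= confidence."""
    z = NormalDist().inv_cdf(confidence)
    return forecast_qty + err_mean + z * err_std

# Example: customer schedules 10,000 castings; historical errors average +200
# units with a standard deviation of 600 units.
plan = adjusted_forecast(10_000, err_mean=200, err_std=600, confidence=0.95)
print(f"plan for about {plan:.0f} castings")   # roughly 11,190
```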
Abstract:
Physically based distributed models of catchment hydrology are likely to be made available as engineering tools in the near future. Although these models are based on theoretically acceptable equations of continuity, there are still limitations in the present modelling strategy. Of interest to this thesis are the current modelling assumptions made concerning the effects of soil spatial variability, including formations producing distinct zones of preferential flow. The thesis contains a review of current physically based modelling strategies and a field based assessment of soil spatial variability. In order to investigate the effects of soil nonuniformity, a fully three dimensional model of variably saturated flow in porous media is developed. The model is based on a Galerkin finite element approximation to the Richards equation. Access to a vector processor permits numerical solutions on grids containing several thousand node points. The model is applied to a single hillslope segment under various degrees of soil spatial variability. Such variability is introduced by generating random fields of saturated hydraulic conductivity using the turning bands method. Similar experiments are performed under conditions of preferred soil moisture movement. The results show that the influence of soil variability on subsurface flow may be less significant than suggested in the literature, due to the integrating effects of three dimensional flow. Under conditions of widespread infiltration excess runoff, the results indicate a greater significance of soil nonuniformity. The recognition of zones of preferential flow is also shown to be an important factor in accurate rainfall-runoff modelling. Using the results of various fields of soil variability, experiments are carried out to assess the validity of the commonly used concept of 'effective parameters'. The results of these experiments suggest that such a concept may be valid in modelling subsurface flow. However, the effective parameter is observed to be event dependent when the dominating mechanism is infiltration excess runoff.
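As a simplified stand-in for the turning bands generation used in the thesis, the sketch below draws a log-normally distributed, exponentially correlated saturated hydraulic conductivity field on a small grid by Cholesky factorisation of the covariance matrix; the geostatistical parameters are invented.

```python
# Hedged stand-in for turning-bands fields: a log-normal saturated hydraulic
# conductivity field with exponential spatial correlation, generated by
# Cholesky factorisation on a small 2-D grid. Parameters are illustrative.
import numpy as np

def log_normal_K_field(nx, ny, dx, mean_lnK, sigma_lnK, corr_len, seed=0):
    xs, ys = np.meshgrid(np.arange(nx) * dx, np.arange(ny) * dx)
    pts = np.column_stack([xs.ravel(), ys.ravel()])
    dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    cov = sigma_lnK**2 * np.exp(-dist / corr_len)          # exponential covariance
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(pts))) # jitter for stability
    z = np.random.default_rng(seed).standard_normal(len(pts))
    lnK = mean_lnK + L @ z
    return np.exp(lnK).reshape(ny, nx)                     # K in m/s

K = log_normal_K_field(nx=20, ny=20, dx=2.0, mean_lnK=np.log(1e-5),
                       sigma_lnK=1.0, corr_len=10.0)
print(f"K ranges from {K.min():.2e} to {K.max():.2e} m/s")
```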
Abstract:
Firstly, we numerically model a practical 20 Gb/s undersea configuration employing the Return-to-Zero Differential Phase Shift Keying (RZ-DPSK) data format. The modelling is performed using the Split-Step Fourier Method to solve the Generalised Nonlinear Schrödinger Equation. We optimise the dispersion map and per-channel launch power of these channels and investigate how the choice of pre/post compensation can influence the performance. After obtaining these optimal configurations, we investigate the Bit Error Rate estimation of these systems and we see that estimation based on Gaussian statistics of the electrical current is appropriate for systems of this type, indicating quasi-linear behaviour. The introduction of narrower pulses due to the deployment of quasi-linear transmission decreases the tolerance to chromatic dispersion and intra-channel nonlinearity. We use tools from mathematical statistics to study the behaviour of these channels in order to develop new methods to estimate the Bit Error Rate. In the final section, we consider the estimation of Eye Closure Penalty, a popular measure of signal distortion. Using a numerical example and assuming the symmetry of eye closure, we see that the Eye Closure Penalty can be estimated simply using Gaussian statistics. We also see that the statistics of the logical ones dominate the statistics of signal distortion in the case of Return-to-Zero On-Off Keying configurations.
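A minimal hedged sketch of the split-step Fourier method for the scalar nonlinear Schrödinger equation (dispersion and Kerr nonlinearity only; no loss, amplification, noise, or the actual undersea dispersion map; parameters are illustrative, not the thesis's 20 Gb/s RZ-DPSK configuration):

```python
# Symmetric split-step Fourier method for the scalar NLSE
#   dA/dz = -i*(beta2/2)*d^2A/dt^2 + i*gamma*|A|^2*A
# with illustrative fibre parameters and a single Gaussian test pulse.
import numpy as np

def ssfm(A, dt, dz, n_steps, beta2, gamma):
    omega = 2 * np.pi * np.fft.fftfreq(A.size, d=dt)
    half_disp = np.exp(0.5j * beta2 * omega**2 * dz / 2)   # half-step dispersion
    for _ in range(n_steps):
        A = np.fft.ifft(half_disp * np.fft.fft(A))          # D/2
        A = A * np.exp(1j * gamma * np.abs(A)**2 * dz)       # full nonlinear step
        A = np.fft.ifft(half_disp * np.fft.fft(A))          # D/2
    return A

t = np.linspace(-200e-12, 200e-12, 2048)                    # time grid (s)
A0 = np.sqrt(1e-3) * np.exp(-(t / 10e-12)**2)               # ~1 mW peak Gaussian pulse
A = ssfm(A0, dt=t[1] - t[0], dz=1e3, n_steps=20,            # 20 km in 1 km steps
         beta2=-21e-27, gamma=1.3e-3)                       # s^2/m, 1/(W m)
print(f"peak power in: {np.abs(A0).max()**2 * 1e3:.2f} mW, "
      f"out: {np.abs(A).max()**2 * 1e3:.2f} mW")
```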
Abstract:
The unmitigated transmission of undesirable vibration can cause problems by way of human discomfort, machinery and equipment failure, and degradation of the quality of a manufacturing process. When identifiable transmission paths are discernible, vibrations from the source can be isolated from the rest of the system, preventing or minimising these problems. The approach proposed here for vibration isolation is active force cancellation at points close to the vibration source. It uses force feedback for multiple-input, multiple-output (MIMO) control at the mounting locations. This is particularly attractive for rigid mounting of a machine on a relatively flexible base where machine alignment and motions are to be restricted. The force transfer function matrix is used as a disturbance rejection performance specification for the design of MIMO controllers. For a machine soft-mounted via flexible isolators, a model for this matrix has been derived. Under certain conditions, a simple multiplicative uncertainty model is obtained that shows the amount of perturbation a flexible base has on the machine-isolator-rigid base transmissibility matrix. Such a model is well suited to robust control design paradigms. A different model is derived for the machine on hard mounts without the flexible isolators. With this model, the level of force transmitted from a machine to a final mounting structure can be determined using measurements of the machine running on another mounting structure, the two mounting structures having dissimilar dynamic characteristics. Experiments have verified the usefulness of the expression, and the model compares well with other methods in the literature; the disadvantage lies in the large amount of data that has to be collected. Active force cancellation is demonstrated on an experimental rig using an AC industrial motor hard-mounted onto a relatively flexible structure. The force transfer function matrix, determined from measurements, is used to design H∞ and Static Output Feedback controllers. Both types of controller are stable and robust to modelling errors within the identified frequency range. They reduce the RMS of the transmitted force by between 30% and 80% at all mounting locations for the machine running at 1340 rpm. At the rated speed of 1440 rpm only the static gain controller is able to provide a 30-55% reduction at all locations; the H∞ controllers could only give a small reduction at one mount location. This is due in part to deficiencies of the model used in the design: higher frequency dynamics were ignored. This can be resolved by the use of a higher order model, which in turn results in a high order controller. A low order static gain controller, with some tuning, performs better, but it lacks the analytical framework for analysis and design.
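As a scalar stand-in for the force transfer function matrix idea, the sketch below computes the force transmissibility of a single-axis machine-isolator model over a rigid base and the multiplicative perturbation introduced by a one-degree-of-freedom flexible base; all parameter values are invented and the MIMO, measurement-based aspects of the thesis are not represented.

```python
# Hedged scalar illustration: force transmissibility of a machine on an
# isolator over a rigid base, and the multiplicative perturbation a flexible
# base introduces. All parameters are invented.
import numpy as np

m, k, c = 50.0, 2.0e5, 200.0            # machine mass, isolator stiffness/damping
mb, kb, cb = 200.0, 5.0e6, 1.0e3        # flexible base mass, stiffness, damping

def T_rigid(w):
    """|F_transmitted / F_excitation| for the machine on a rigid base."""
    s = 1j * w
    return (c * s + k) / (m * s**2 + c * s + k)

def T_flexible(w):
    """Same machine and isolator, mounted on a 1-DOF flexible base."""
    s = 1j * w
    Zb = mb * s**2 + cb * s + kb        # base dynamic stiffness
    # Base coordinate eliminated from the coupled 2-DOF equations of motion.
    return (c * s + k) * Zb / ((m * s**2 + c * s + k) * Zb + m * s**2 * (c * s + k))

w = np.linspace(1.0, 500.0, 5000)       # rad/s, spans both resonances
delta = np.abs(T_flexible(w) / T_rigid(w) - 1.0)   # multiplicative uncertainty
print(f"worst-case multiplicative perturbation |delta| = {delta.max():.2f}")
```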
Abstract:
Orthogonal frequency division multiplexing (OFDM) is becoming a fundamental technology in future generation wireless communications. Call admission control is an effective mechanism to guarantee resilient, efficient, quality-of-service (QoS) services in wireless mobile networks. In this paper, we present several call admission control algorithms for OFDM-based wireless multiservice networks. Call connection requests are differentiated into narrow-band calls and wide-band calls. For either class of calls, the traffic process is characterized as a batch arrival process, since each call may request multiple subcarriers to satisfy its QoS requirement. The batch size is a random variable following a probability mass function (PMF) with a realistic maximum value. In addition, the service times for wide-band and narrow-band calls are different. Following this, we perform a tele-traffic queueing analysis for OFDM-based wireless multiservice networks, and formulae for the significant performance metrics, call blocking probability and bandwidth utilization, are developed. Numerical investigations are presented to demonstrate the interaction between key parameters and performance metrics, and the performance tradeoff among different call admission control algorithms is discussed. Moreover, the analytical model has been validated by simulation. The methodology as well as the results provides an efficient tool for planning next-generation OFDM-based broadband wireless access systems.
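The paper's model involves batch arrivals and class-dependent service times; as a simpler hedged illustration of multi-class blocking on shared subcarriers, the sketch below applies the standard Kaufman-Roberts recursion for a two-class multi-rate loss system with Poisson (non-batch) arrivals and invented loads.

```python
# Hedged illustration: blocking probabilities for narrow-band and wide-band
# calls sharing C subcarriers, via the classic Kaufman-Roberts recursion for
# multi-rate Erlang loss systems. This is a simplification of the paper's
# batch-arrival model; the numbers are invented.

def kaufman_roberts(C, classes):
    """classes: list of (subcarriers_per_call b_k, offered load a_k in Erlangs)."""
    q = [0.0] * (C + 1)
    q[0] = 1.0
    for n in range(1, C + 1):
        q[n] = sum(a * b * q[n - b] for b, a in classes if n >= b) / n
    total = sum(q)
    p = [x / total for x in q]                      # occupancy distribution
    return {b: sum(p[C - b + 1:]) for b, _ in classes}   # per-class blocking

# Example: 32 subcarriers; narrow-band calls need 1 subcarrier (10 Erlangs),
# wide-band calls need 4 subcarriers (3 Erlangs).
for b, pb in kaufman_roberts(32, [(1, 10.0), (4, 3.0)]).items():
    print(f"calls needing {b} subcarrier(s): blocking probability {pb:.4f}")
```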
Abstract:
Control of spatiotemporal chaos is achieved in the catalytic oxidation of CO on Pt(110) by localized modification of the kinetic properties of the surface chemical reaction. In the experiment, a small temperature heterogeneity is created on the surface by a focused laser beam. This heterogeneity constitutes a pacemaker and starts to emit target waves. These waves slowly entrain the medium and suppress the spatiotemporal chaos that is present in the absence of control. We compare this experimental result with a numerical study of the Krischer-Eiswirth-Ertl model for CO oxidation on Pt(110). We confirm the experimental findings and identify regimes where complete and partial control are possible.
Abstract:
Successful commercialization of a technology such as Fiber Bragg Gratings requires the ability to manufacture devices repeatably, quickly and at low cost. Although the first report of photorefractive gratings was in 1978, it was not until 1993, when phase mask fabrication was demonstrated, that this became feasible. More recently, draw tower fabrication on a production level and grating writing through the polymer jacket have been realized; both are important developments since they preserve the intrinsic strength of the fiber. Potentially the most significant recent development has been femtosecond laser inscription of gratings. Although not yet a commercial technology, it provides the means of writing multiple gratings in the optical core, providing directional sensing capability in a single fiber. Femtosecond processing can also be used to machine the fiber to produce micron-scale slots and holes, enhancing the interaction between the light in the core and the surrounding medium. © 2011 Bentham Science Publishers Ltd. All rights reserved.