51 results for Computer control systems


Relevance: 90.00%

Abstract:

There is increasing use of software to control safety-critical plants across a wide range of applications. The importance of ensuring the correct operation of such potentially hazardous systems places a premium on verifying the system against a suitably secure specification. However, verification is often made more complex by the concurrency and real-time considerations inherent in many applications. One response is the use of formal methods for the specification and verification of safety-critical control systems. These provide a mathematical representation of a system which permits reasoning about its properties. This thesis investigates the use of the formal method Communicating Sequential Processes (CSP) for the verification of a safety-critical control application. CSP is a discrete-event-based process algebra with a compositional axiomatic semantics that supports verification by formal proof. The application is an industrial case study concerning the concurrent control of a real-time high-speed mechanism. The case study shows that the axiomatic verification method employed is complex: it requires the user to have a relatively comprehensive understanding of both the proof system and the application. Through a series of observations, the thesis notes that CSP has the scope to support a more procedural approach to verification in the form of testing. The thesis investigates the technique of testing and proposes the method of Ideal Test Sets. By exploiting the underlying structure of the CSP semantic model, it is shown that for certain processes and specifications the obligation of verification can be reduced to that of testing the specification over a finite subset of the behaviours of the process.
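The core idea of reducing verification to testing over a finite subset of behaviours can be illustrated with a small, hypothetical sketch (this is not the thesis's formal Ideal Test Sets construction): enumerate the traces of a finite-state process up to a bounded length, then test a trace-based safety specification over that finite set.

```python
def traces_up_to(transitions, start, depth):
    """Enumerate all traces of a finite-state process up to a given length."""
    frontier = [((), start)]
    result = set()
    for _ in range(depth):
        next_frontier = []
        for trace, state in frontier:
            for event, target in transitions.get(state, []):
                new_trace = trace + (event,)
                result.add(new_trace)
                next_frontier.append((new_trace, target))
        frontier = next_frontier
    return result

def satisfies(spec, traces):
    """Test a specification (a predicate on traces) over a finite test set."""
    return all(spec(t) for t in traces)

# Toy process: a mechanism that must 'arm' before it may 'fire'.
transitions = {
    "idle":  [("arm", "armed")],
    "armed": [("fire", "idle"), ("disarm", "idle")],
}
# Safety specification: the process never fires more often than it arms.
spec = lambda t: t.count("fire") <= t.count("arm")
print(satisfies(spec, traces_up_to(transitions, "idle", 6)))  # True
```

For genuinely finite-state processes and prefix-closed safety specifications, a bounded enumeration like this can be exhaustive, which is the flavour of guarantee the Ideal Test Sets method seeks to establish formally.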

Relevance: 90.00%

Abstract:

A detailed investigation has been undertaken into the field-induced electron emission (FIEE) mechanism that occurs at microscopically localised 'sites' on uncoated and dielectric-coated metallic electrodes. These processes have been investigated using two dedicated experimental systems developed for this study. The first is a novel combined photo/field emission microscope, which employs a UV source to stimulate photo-electrons from the sample surface in order to generate a topographical image. This system utilises an electrostatic lens column to provide identical optical properties under the different operating conditions required for purely topographical and combined photo/field imaging, and has been demonstrated to have a resolution approaching 1 µm. Emission images obtained from carbon emission sites using this system reveal that emission may occur from the edge triple junction or from the bulk of the carbon particle. An existing UHV electron spectrometer has been extensively rebuilt to incorporate a computer control and data acquisition system, improved sample handling and manipulation, and a specimen heating stage. Details are given of a comprehensive study into the effects of sample heating on the emission process under conditions of both bulk and transient heating. Similar studies were also performed under conditions of both zero and high applied field. These show that the properties of emission sites are strongly temperature- and field-dependent, indicating that the emission process is 'non-metallic' in nature. The results have been shown to be consistent with an existing hot-electron emission model.

Relevance: 90.00%

Abstract:

This thesis examines the ways that libraries have employed computers to assist with housekeeping operations. It considers the relevance of such applications to company libraries in the construction industry, and describes more specifically the development of an integrated cataloguing and loan system. A review of the main features in the development of computerised ordering, cataloguing and circulation control systems shows that fully integrated packages are beginning to be completed, and that some libraries are introducing second-generation programs. Cataloguing is the most common activity to be computerised, at both national and company level. Results from a sample of libraries in the construction industry suggest that the only computerised housekeeping system is at Taylor Woodrow. Most of the firms have access to an in-house computer, and some of the libraries, particularly those in firms of consulting engineers, might benefit from computerisation, but the librarians hold differing attitudes towards the computer. A detailed study of the library at Taylor Woodrow resulted in a feasibility report covering all areas of its activities; one of its main suggestions was the possible use of a computerised loans and cataloguing system. An integrated system covering these two areas was programmed in Fortran and implemented. This new system provides certain benefits and saves staff time, but at the cost of time on the computer. Some improvements could be made by reprogramming; even so, it provides a general system for small technical libraries. A general equation comparing costs for manual and computerised operations is progressively simplified to a form where the annual saving from the computerised system is expressed in terms of staff and computer costs and the size of the library. This equation gives any library an indication of the saving or extra cost which would result from using the computerised system.
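The thesis's actual cost equation is not reproduced in the abstract, but its general shape can be sketched as follows. This is a hypothetical illustration: the parameter names and per-item time figures are invented for the example, and the saving is simply staff time released by the computerised system minus the computer time it consumes, both scaled by library size.

```python
def annual_saving(items, staff_rate, computer_rate,
                  manual_hours_per_item=0.05,
                  computer_hours_per_item=0.01,
                  cpu_hours_per_item=0.002):
    """Hypothetical cost comparison: staff time saved by computerisation,
    minus the cost of computer time, as a function of library size (items)."""
    staff_saved = items * (manual_hours_per_item - computer_hours_per_item) * staff_rate
    computer_cost = items * cpu_hours_per_item * computer_rate
    return staff_saved - computer_cost

# A library of 10,000 items, staff at 10/hour, computer time at 50/hour:
print(annual_saving(10000, 10, 50))  # 3000.0
```

The break-even point, where the saving is zero, falls where staff cost released exactly equals computer cost incurred; a small library with expensive computer time can come out worse off, which matches the abstract's "saving or extra cost" framing.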

Relevance: 90.00%

Abstract:

In recent years the topic of risk management has moved up the agenda of both government and industry, and private sector initiatives to improve risk and internal control systems have been mirrored by similar promptings for change in the public sector. Both regulators and practitioners now view risk management as an integral part of the process of corporate governance, and an aid to the achievement of strategic objectives. The paper uses case study material on the risk management control system at Birmingham City Council to extend existing theory by developing a contingency theory for the public sector. The case demonstrates that whilst the structure of the control system fits a generic model, the operational details indicate that controls are contingent upon three core variables: central government policies, information and communication technology, and organisational size. All three contingent variables are suitable for testing the theory across the broader public sector arena.

Relevance: 90.00%

Abstract:

A sizeable amount of the testing in eye care requires either the identification of targets such as letters to assess functional vision, or the subjective evaluation of imagery by an examiner. Computers can render a variety of different targets on their monitors and can be used to store and analyse ophthalmic images. However, existing computing hardware tends to be large, screen resolutions are often too low, and objective assessments of ophthalmic images are unreliable. Recent advances in mobile computing hardware and computer-vision systems can be used to enhance clinical testing in optometry. High-resolution touch screens embedded in mobile devices can render targets at a wide variety of distances and can be used to record and respond to patient responses, automating testing methods. This has opened up new opportunities in computerised near vision testing. Equally, new image processing techniques can be used to increase the validity and reliability of objective computer vision systems. Three novel apps for assessing reading speed, contrast sensitivity and amplitude of accommodation were created by the author to demonstrate the potential of mobile computing to enhance clinical measurement. The reading speed app could present sentences effectively, control illumination and automate the testing procedure for reading speed assessment. Meanwhile the contrast sensitivity app made use of a bit stealing technique and a swept frequency target to rapidly assess a patient's full contrast sensitivity function at both near and far distances. Finally, customised electronic hardware was created and interfaced to an app on a smartphone device to allow free-space amplitude of accommodation measurement. A new geometrical model of the tear film and a ray tracing simulation of a Placido disc topographer were produced to provide insights on the effect of tear film breakdown on ophthalmic images.
Furthermore, a new computer vision system, which used a novel eyelash segmentation technique, was created to demonstrate the potential of computer vision systems for the clinical assessment of tear stability. Studies undertaken by the author to assess the validity and repeatability of the novel apps found that their repeatability was comparable to, or better than, existing clinical methods for reading speed and contrast sensitivity assessment. Furthermore, the apps offered reduced examination times in comparison to their paper-based equivalents. The reading speed and amplitude of accommodation apps correlated highly with existing methods of assessment, supporting their validity. There remain questions over the validity of using a swept frequency sine-wave target to assess patients' contrast sensitivity functions, as no clinical test provides the range of spatial frequencies and contrasts, nor equivalent assessment at distance and near. A validation study of the new computer vision system found that the author's tear metric correlated better with existing subjective measures of tear film stability than those of a competing computer-vision system. However, repeatability was poor in comparison to the subjective measures due to eyelash interference. The new mobile apps, computer vision system, and studies outlined in this thesis provide further insight into the potential of applying mobile and image processing technology to enhance clinical testing by eye care professionals.
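The "bit stealing" idea mentioned above can be sketched briefly. This is a hedged illustration of the general technique (in the style of Tyler's pseudo-grey method), not the app's actual implementation: because the red, green and blue channels contribute different amounts to luminance, perturbing them independently by one digital step yields luminance levels between the display's native 8-bit grey steps, allowing lower contrasts to be rendered than 256 grey levels permit.

```python
# Assumed relative luminance contributions of the channels (Rec. 709 weights).
W_R, W_G, W_B = 0.2126, 0.7152, 0.0722

def pseudo_grey_levels(base):
    """All luminance levels reachable by +0/+1 perturbations of R, G and B
    around a native grey level, sorted from darkest to brightest."""
    levels = set()
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                r, g, b = base + dr, base + dg, base + db
                levels.add(W_R * r + W_G * g + W_B * b)
    return sorted(levels)

# Eight distinct luminance sub-steps between two adjacent native grey levels:
print(len(pseudo_grey_levels(128)))  # 8
```

In practice the perturbations are small enough to be chromatically invisible at typical viewing distances, which is why the technique suits low-contrast targets such as a swept frequency grating.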

Relevance: 90.00%

Abstract:

In nonlinear and stochastic control problems, learning an efficient feed-forward controller is not amenable to conventional neurocontrol methods. For these approaches, estimating and then incorporating uncertainty in the controller and feed-forward models can produce more robust control results. Here, we introduce a novel inversion-based neurocontroller for solving control problems involving uncertain nonlinear systems, which can also compensate for multi-valued systems. The approach uses recent developments in neural networks, especially in the context of modelling statistical distributions, which are applied to forward and inverse plant models. Provided that certain conditions are met, an estimate of the intrinsic uncertainty in the outputs of neural networks can be obtained using the statistical properties of the networks. More generally, multi-component distributions can be modelled by the mixture density network. Based on importance sampling from these distributions, a novel robust inverse control approach is obtained. This importance sampling provides a structured and principled approach to constraining the complexity of the search space for the ideal control law. The developed methodology circumvents the dynamic programming problem by using the predicted neural network uncertainty to localise the possible control solutions to consider. A nonlinear multi-variable system with different delays between the input-output pairs is used to demonstrate the successful application of the developed control algorithm. The proposed method is suitable for redundant control systems and allows us to model strongly non-Gaussian distributions of the control signal as well as processes with hysteresis. © 2004 Elsevier Ltd. All rights reserved.
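The sampling step at the heart of this approach can be sketched with a toy example. This is a hedged illustration, not the paper's method: a hand-specified Gaussian mixture stands in for the trained mixture density network over the inverse mapping, the plant y = u² stands in for the forward model (it is multi-valued in the inverse, since each y > 0 has two pre-images), and candidate controls drawn from the mixture are weighted by how closely the forward model's prediction matches the target output.

```python
import random

random.seed(0)

def forward(u):
    """Toy multi-valued plant: y = u**2, so each y > 0 has two inverses."""
    return u * u

def sample_mixture(components):
    """Draw one control from a Gaussian mixture of (weight, mean, stddev)."""
    r, acc = random.random(), 0.0
    for pi, mu, sigma in components:
        acc += pi
        if r <= acc:
            return random.gauss(mu, sigma)
    return random.gauss(components[-1][1], components[-1][2])

def select_control(target, components, n_samples=500):
    """Sampling-based inverse control: draw candidates from the mixture over
    the inverse mapping, keep the one whose forward prediction is closest
    to the target output."""
    candidates = [sample_mixture(components) for _ in range(n_samples)]
    return min(candidates, key=lambda u: (forward(u) - target) ** 2)

# Two mixture components, one per branch of the inverse of y = u**2.
components = [(0.5, -2.0, 0.3), (0.5, 2.0, 0.3)]
u = select_control(4.0, components)
print(abs(forward(u) - 4.0) < 0.1)  # True: the chosen control hits the target
```

Note how the mixture localises the search to the two plausible inverse branches rather than sampling the whole control space, which is the sense in which the paper's importance sampling constrains the search for the control law.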