44 results for Automatic Control Theory
in Aston University Research Archive
Abstract:
This thesis describes the investigation of an adaptive method of attenuation control for digital speech signals in an analogue-digital environment and its effects on the transmission performance of a national telecommunication network. The first part gives the design of a digital automatic gain control, able to operate upon a P.C.M. signal in its companded form, whose operation is based upon counting the peaks of the digital speech signal above certain threshold levels. A study was made of a digital automatic gain control (d.a.g.c.) in open-loop and closed-loop configurations. The former was adopted as the means of carrying out the automatic control of attenuation, and was simulated and tested both objectively and subjectively. The final part is the assessment of the effects on telephone connections of a d.a.g.c. that introduces gains of 6 dB or 12 dB. This work used a Telephone Connection Assessment Model developed at The University of Aston in Birmingham. The subjective tests showed that the d.a.g.c. gives an advantage to listeners when the speech level is very low; the benefit is not great when speech is only a little quieter than preferred. The assessment showed that, when a standard British Telecom earphone is used, insertion of gain is desirable if the speech voltage across the earphone terminals is below an upper limit of -38 dBV. People commented upon the presence of an adaptive-like effect during the tests. This could be the reason why they voted against the insertion of gain at levels only a little quieter than preferred, when they may otherwise have judged it to be desirable. A telephone connection with a d.a.g.c. included has a degree of difficulty less than half that of one without it, and its Excellent-plus-Good score is 10-30% greater.
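The peak-counting principle behind such a d.a.g.c. can be sketched roughly as follows. The frame length, thresholds and two-step gain choice are illustrative assumptions only, and the sketch works on linear sample values, whereas the thesis's device operates directly on companded P.C.M. words:

```python
# Illustrative sketch of an open-loop peak-counting AGC.
# Thresholds, frame length and gain steps are hypothetical values,
# not the design figures from the thesis.
import numpy as np

def dagc_gain_db(frame, thresholds=(0.05, 0.2), max_gain_db=12):
    """Choose an inserted gain from peak counts in one speech frame.

    Counts samples above each threshold; a quiet frame (no peaks above
    the upper threshold) receives gain, a loud one receives none.
    """
    peaks_low = np.sum(np.abs(frame) > thresholds[0])
    peaks_high = np.sum(np.abs(frame) > thresholds[1])
    if peaks_high > 0:
        return 0                   # speech already at a comfortable level
    if peaks_low > 0:
        return max_gain_db // 2    # moderately quiet: insert 6 dB
    return max_gain_db             # very quiet: insert 12 dB

quiet = 0.01 * np.sin(np.linspace(0, 20 * np.pi, 800))
loud = 0.5 * np.sin(np.linspace(0, 20 * np.pi, 800))
print(dagc_gain_db(quiet), dagc_gain_db(loud))  # 12 0
```

Being open loop, the gain decision depends only on the incoming signal's peak counts, not on the controller's own output, which matches the configuration the thesis adopted.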
Abstract:
DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT
Abstract:
The authors use social control theory to develop a conceptual model that addresses the effectiveness of regulatory agencies’ (e.g., Food and Drug Administration, Occupational Safety and Health Administration) field-level efforts to obtain conformance with product safety laws. Central to the model are the control processes agencies use when monitoring organizations and enforcing the safety rules. These approaches can be labeled formal control (e.g., rigid enforcement) and informal control (e.g., social instruction). The theoretical framework identifies an important antecedent of control and the relative effectiveness of control’s alternative forms in gaining compliance and reducing opportunism. Furthermore, the model predicts that the regulated firms’ level of agreement with the safety rules moderates the relationships between control and firm responses. A local health department’s administration of state food safety regulations provides the empirical context for testing the hypotheses. The results from a survey of 173 restaurants largely support the proposed model. The study findings inform a discussion of effective methods of administering product safety laws.
Abstract:
This work reports the development of a mathematical model and distributed, multivariable computer control for a pilot-plant double-effect climbing-film evaporator. A distributed-parameter model of the plant has been developed and the time-domain model transformed into the Laplace domain. The model has been further transformed into an integral domain conforming to an algebraic ring of polynomials, to eliminate the transcendental terms which arise in the Laplace domain due to the distributed nature of the plant model. This has made possible the application of linear control theories to a set of linear partial differential equations. The models obtained have tracked the experimental results of the plant well. A distributed-computer network has been interfaced with the plant to implement digital controllers in a hierarchical structure. A modern multivariable Wiener-Hopf controller has been applied to the plant model. The application has revealed a limiting condition: the plant matrix should be positive-definite along the infinite frequency axis. A new multivariable control theory has emerged from this study which avoids the above limitation. The controller has the structure of the modern Wiener-Hopf controller, but with a unique feature enabling a designer to specify the closed-loop poles in advance and to shape the sensitivity matrix as required. In this way, the method treats directly the interaction problems found in chemical processes, with good tracking and regulation performance. The ability of analytical design methods to determine once and for all whether a given set of specifications can be met is one of their chief advantages over conventional trial-and-error design procedures. However, one disadvantage that offsets these advantages to some degree is the relatively complicated algebra that must be employed in working out all but the simplest problems.
Mathematical algorithms and computer software have been developed to treat some of the mathematical operations defined over the integral domain, such as matrix fraction description, spectral factorization, the Bezout identity, and the general manipulation of polynomial matrices. Hence, the design problems of Wiener-Hopf type of controllers and other similar algebraic design methods can be easily solved.
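As a minimal illustration of one such operation, scalar spectral factorization can be carried out numerically by splitting the roots of the symmetric polynomial about the unit circle. The example polynomial below is an assumption for illustration; the thesis treats the far harder polynomial-matrix case:

```python
# Scalar spectral factorization sketch: given the symmetric Laurent
# polynomial b(z) = a(z) a(1/z), recover the stable factor a(z).
# Example b corresponds to the assumed factor a(z) = 1 + 0.5 z^-1.
import numpy as np

def spectral_factor(b):
    """b: coefficients of z^n ... z^0 ... z^-n (palindromic).
    Returns the stable factor's coefficients of z^0 ... z^-n."""
    roots = np.roots(b)
    stable = roots[np.abs(roots) < 1]      # keep roots inside the unit circle
    a = np.poly(stable)                    # monic polynomial with those roots
    g = np.sqrt(b[0] / (a[0] * a[-1]))     # rescale to match b's z^n coefficient
    return g * a

b = np.array([0.5, 1.25, 0.5])             # = a(z) a(1/z) for a(z) = 1 + 0.5 z^-1
print(np.round(spectral_factor(b), 6))     # → [1.  0.5]
```

The root-splitting approach shown here is a standard numerical shortcut; the algebraic methods developed in the thesis work instead over the integral domain of polynomial matrices.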
Abstract:
The thesis deals with the background, development and description of a mathematical stock control methodology for use within an oil and chemical blending company, where demand and replenishment lead-times are generally non-stationary. The stock control model proper relies on, as input, adaptive forecasts of demand determined for an economical forecast/replenishment period precalculated on an individual stock-item basis. The control procedure is principally that of the continuous-review, reorder-level type, where the reorder level and reorder quantity 'float', that is, each changes in accordance with changes in demand. Two versions of the Methodology are presented: a cost-minimisation version and a service-level version. Realising the importance of demand forecasts, four recognised variations of the Trigg and Leach adaptive forecasting routine are examined. A fifth variation, developed here, is proposed as part of the stock control methodology. The results of testing the cost-minimisation version of the Methodology with historical data, by means of a computerised simulation, are presented together with a description of the simulation used. In addition, the performance of the Methodology compares favourably with a rule-of-thumb approach considered by the Company as an interim solution for reducing stock levels. The contribution of the work to the field of scientific stock control is felt to be significant for the following reasons: (1) The Methodology is designed specifically for use with non-stationary demand and for this reason alone appears to be unique. (2) The Methodology is unique in its approach, and the cost-minimisation version is shown to work successfully with the demand data presented. (3) The Methodology and the thesis as a whole fill an important gap between complex mathematical stock control theory and practical application.
A brief description of a computerised order processing/stock monitoring system, designed and implemented as a pre-requisite for the Methodology's practical operation, is presented as an appendix.
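The basic Trigg and Leach adaptive routine on which these forecasting variations build can be sketched as follows: the smoothing constant is reset each period to the absolute tracking signal, so the forecast reacts quickly when non-stationary demand shifts. The value of gamma and the demand series are illustrative assumptions:

```python
# Sketch of basic Trigg & Leach adaptive exponential smoothing.
# gamma = 0.2 and the demand series are illustrative, not values
# from the thesis.
def trigg_leach(demand, gamma=0.2):
    f = demand[0]                  # initial forecast
    e_s, mad = 0.0, 1e-9           # smoothed error, smoothed absolute error
    forecasts = []
    for d in demand:
        forecasts.append(f)
        err = d - f
        e_s = gamma * err + (1 - gamma) * e_s
        mad = gamma * abs(err) + (1 - gamma) * mad
        alpha = abs(e_s) / mad     # tracking signal used as adaptive alpha
        f = f + alpha * err
    return forecasts

# A step change in demand: the adaptive forecast closes the gap quickly.
series = [100] * 5 + [150] * 10
print(round(trigg_leach(series)[-1]))   # 150
```

In a floating reorder-level system such forecasts would then drive the recomputation of the reorder level and reorder quantity each review period.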
Abstract:
An optimal stochastic controller pushes the closed-loop behavior as close as possible to the desired one. The fully probabilistic design (FPD) uses a probabilistic description of the desired closed loop and minimizes the Kullback-Leibler divergence of the closed-loop description from the desired one. Practical exploitation of fully probabilistic design control theory continues to be hindered by the computational complexities involved in numerically solving the associated stochastic dynamic programming problem: in particular, very hard multivariate integration and approximate interpolation of the multivariate functions involved. This paper proposes a new fully probabilistic control algorithm that uses adaptive critic methods to circumvent the need for explicitly evaluating the optimal value function, thereby dramatically reducing computational requirements. This is the main contribution of this short paper.
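A toy illustration of the FPD idea: choose the feedback gain whose closed-loop predictive density is closest, in Kullback-Leibler divergence, to a desired one. The scalar linear-Gaussian system and its numbers below are assumptions for illustration; the paper's contribution is precisely avoiding this kind of brute-force evaluation via adaptive critics:

```python
# Toy FPD sketch: scalar system x' = a x + b u + N(0, sigma^2),
# desired closed loop x' ~ N(0, sigma^2), control u = -k x.
# All numbers are illustrative assumptions.
import numpy as np

a, b, sigma = 0.9, 0.5, 1.0

def kl_to_desired(k, x=1.0):
    """KL( N((a - b k) x, sigma^2) || N(0, sigma^2) ) at one state x."""
    mean = (a - b * k) * x
    return mean**2 / (2 * sigma**2)

# Brute-force search over a gain grid, the kind of evaluation the
# adaptive-critic approach is designed to avoid in the general case.
gains = np.linspace(-2, 4, 601)
best = gains[np.argmin([kl_to_desired(k) for k in gains])]
print(round(best, 2))   # 1.8, i.e. k = a / b, which zeroes the mismatch
```

With equal variances the divergence reduces to the squared mean mismatch, so the minimiser is the gain that makes the closed-loop mean match the desired zero mean.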
Abstract:
An ultrasonic thermometer has been developed for high-temperature measurement over a wide temperature range. It is particularly suitable for measuring nuclear fuel rod centerline temperatures in advanced liquid-metal and high-flux nuclear reactors. The thermometer, which was designed to determine fuel temperature up to the fuel melting point, utilizes the temperature dependence of the ultrasonic propagation velocity (related to the elastic modulus) in a thin rod sensor as the temperature-transducing mechanism. A pulse excitation technique has been used, in which the mechanical resonator at the remote end of the acoustic line is made to vibrate. Its natural frequency is proportional to the ultrasonic velocity in the material. This is measured by the electronic instrumentation and enables a frequency-temperature or period-temperature calibration to be obtained. A completely digital automatic instrument has been designed, constructed and tested to track the resonance frequency of the temperature sensors. It operates smoothly over a frequency range of about 30%, more than the maximum working range of most probe materials. The control uses the basic property of a resonator that the stored energy decays exponentially at the natural frequency of the resonator. The operation of the electronic system is based on a digital multichannel transmitter that is capable of operating with a predefined number of cycles in the burst. This overcomes a basic defect in the previous design, where the analogue time-delayed circuits failed to hold synchronization and hence automatic control could be lost. Development of a particular type of temperature probe, small enough to fit into a standard 2 mm reactor tube, has made the ultrasonic thermometer a practicable device for measuring fuel temperature. The bulkiness of previous probes has been overcome: the new design consists of a tuning fork, integral with a 1 mm line, while maintaining a frequency of no more than 100 kHz.
A magnetostrictive rod, acoustically matched to the probe, is used to launch and receive the acoustic oscillations. This requires a magnetic bias, and the previously used bulky magnets have been replaced by a direct-current coil. The probe is supported by terminating the launcher with a short, heavy isolating rod which can be secured to the reactor structure. This support, the bias and launching coil, and the launcher are made up into a single compact unit. On the materials side, an extensive study of a wide range of refractory materials identified molybdenum, iridium, rhenium and tungsten as satisfactory for a number of applications, but mostly exhibiting to some degree a calibration drift with thermal cycling. When attention was directed to ceramic materials, sapphire (single-crystal alumina) was found to have numerous advantages, particularly in respect of stability of calibration, which remained within ±2°C after many cycles to 1800°C. Tungsten and thoriated tungsten (W - 2% ThO2) were also found to be quite satisfactory to 1600°C, the specification for a Euratom application.
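The resonance-tracking principle can be sketched in simulation: after each excitation burst the resonator's stored energy decays as a damped sinusoid at its natural frequency, which a period (zero-crossing) count recovers. The signal parameters below are illustrative assumptions, not the instrument's values:

```python
# Ring-down frequency estimation sketch. Sample rate, natural
# frequency and decay constant are assumed for illustration.
import numpy as np

fs = 1_000_000                        # sample rate, Hz
f0, tau = 80_000.0, 0.5e-3            # natural frequency, decay constant
t = np.arange(0, 2e-3, 1 / fs)
ringdown = np.exp(-t / tau) * np.sin(2 * np.pi * f0 * t)

# Count positive-going zero crossings to estimate the period, and
# hence the natural frequency the tracker would lock on to.
crossings = np.where((ringdown[:-1] < 0) & (ringdown[1:] >= 0))[0]
f_est = (len(crossings) - 1) / (t[crossings[-1]] - t[crossings[0]])
print(round(f_est))                   # close to 80000 Hz
```

A calibration table or fitted curve would then map the tracked frequency (or period) to temperature, as the frequency-temperature calibration described above.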
Abstract:
Rotating fluidised beds offer the potential for high-intensity combustion, large turndown and an extended range of fluidising velocity due to the imposition of an artificial gravitational field. Low thermal capacity should also allow rapid response to load changes. This thesis describes investigations of the validity of these potential virtues. Experiments, at atmospheric pressure, were conducted in flow-visualisation rigs and a combustor designed to accommodate a distributor 200 mm in diameter and 80 mm in axial length. Ancillary experiments were conducted in a 6" diameter conventional fluidised bed. The investigations encompassed assessment of: fluidisation and elutriation; coal feed requirements; start-up and steady-state combustion using premixed propane and air; transition from propane to coal combustion; and mechanical design. Assessments were made of an elutriation model and some effects of particle size on the combustion of premixed fuel gas and air. The findings were: (a) More reliable start-up and control methods must be developed. Combustion of premixed propane and air led to severe mechanical and operating problems, and manual control of coal combustion was inadequate. (b) Design criteria must encompass pressure loss, mechanical strength and high-temperature resistance. The flow characteristics of ancillaries and the distributor must be matched. (c) Fluidisation of a range of particle sizes was investigated. New correlations for minimum fluidisation and fully supported velocities are proposed. Some effects on elutriation of particle size and of the distance between the bed surface and the exhaust port have been identified. A conic distributor did not aid initial bed distribution; furthermore, airflow instability was encountered with this distributor shape, so future use of conic distributors is not recommended. Axial solids mixing was found to be poor. A coal feeder was developed which produced uniform fuel distribution throughout the bed.
The report concludes that small scale inhibits development of mechanical design and exploration of performance. Future research requires larger combustors and automatic control.
Abstract:
The Models@run.time (MRT) workshop series offers a discussion forum for the rising need to leverage modeling techniques for the software of the future. The main goals are to explore the benefits of models@run.time and to foster collaboration and cross-fertilization between different research communities, such as model-driven engineering (e.g., MODELS), the self-adaptive/autonomous systems communities (e.g., SEAMS and ICAC), the control theory community and the artificial intelligence community. © 2012 Authors.
Abstract:
Binocular combination for first-order (luminance-defined) stimuli has been widely studied, but we know rather little about this binocular process for spatial modulations of contrast (second-order stimuli). We used phase-matching and amplitude-matching tasks to assess binocular combination of second-order phase and modulation depth simultaneously. With fixed modulation in one eye, we found that binocularly perceived phase was shifted, and perceived amplitude increased almost linearly, as modulation depth in the other eye increased. At larger disparities, the phase shift was larger and the amplitude change was smaller. The degree of interocular correlation of the carriers had no influence. These results can be explained by an initial extraction of the contrast envelopes before binocular combination (consistent with the lack of dependence on carrier correlation), followed by a weighted linear summation of second-order modulations in which the weights (gains) for each eye are driven by the first-order carrier contrasts, as previously found for first-order binocular combination. Perceived modulation depth fell markedly with increasing phase disparity, unlike previous findings that perceived first-order contrast was almost independent of phase disparity. We present a simple revision to a widely used interocular gain-control theory that unifies first- and second-order binocular summation with a single principle, contrast-weighted summation, and we further elaborate the model for first-order combination. Conclusion: Second-order combination is controlled by first-order contrast.
Abstract:
This paper describes a model designed to recommend solutions to an organisation's e-business needs. It is designed to produce objective results based on perceived characteristics, unbiased by prejudice on the part of the person using the model. The model also includes a way of encapsulating the potential management concerns that may change for good or ill the likely relevance and probability of success of such solutions. The model has been tested on 13 case studies in small, medium and large organizations. © IFAC.
Abstract:
This thesis addresses the viability of automatic speech recognition for control room systems; with careful system design, automatic speech recognition (ASR) devices can be a useful means of human-computer interaction for specific types of task. These tasks can be defined as complex verbal activities, such as command and control, and can be paired with spatial tasks, such as monitoring, without detriment. It is suggested that ASR use be confined to routine plant operation, as opposed to critical incidents, due to possible problems of stress on the operators' speech. It is proposed that using ASR will require operators to adapt a commonly used skill to cater for a novel use of speech. Before using the ASR device, new operators will require some form of training. It is shown that a demonstration by an experienced user of the device can lead to better performance than instructions alone. Thus, a relatively cheap and very efficient form of operator training can be supplied by demonstration by experienced ASR operators. From a series of studies into speech-based interaction with computers, it is concluded that the interaction should be designed to capitalise upon the tendency of operators to use short, succinct, task-specific styles of speech. From studies comparing different types of feedback, it is concluded that operators should be given screen-based feedback, rather than auditory feedback, for control room operation. Feedback will take two forms: the use of the ASR device will require recognition feedback, which is best supplied using text, while the performance of a process control task will require task feedback integrated into the mimic display. This latter feedback can be either textual or symbolic, but it is suggested that symbolic feedback will be more beneficial. Related to both interaction style and feedback is the issue of handling recognition errors. These should be corrected by simple command-repetition practices, rather than by error-handling dialogues.
This method of error correction is held to be non-intrusive to primary command and control operations. This thesis also addresses some of the problems of user error in ASR use, and provides a number of recommendations for its reduction.
Abstract:
Over the past decade, several experienced Operational Researchers have advanced the view that the theoretical aspects of model building have raced ahead of the ability of people to use them. Consequently, the impact of Operational Research on commercial organisations and the public sector is limited, and many systems fail to achieve their anticipated benefits in full. The primary objective of this study is to examine a complex interactive Stock Control system, and identify the reasons for the differences between the theoretical expectations and the operational performance. The methodology used is to hypothesise all the possible factors which could cause a divergence between theory and practice, and to evaluate numerically the effect each of these factors has on two main control indices - Service Level and Average Stock Value. Both analytical and empirical methods are used, and simulation is employed extensively. The factors are divided into two main categories for analysis - theoretical imperfections in the model, and the usage of the system by Buyers. No evidence could be found in the literature of any previous attempts to place the differences between theory and practice in a system in quantitative perspective nor, more specifically, to study the effects of Buyer/computer interaction in a Stock Control system. The study reveals that, in general, the human factors influencing performance are of a much higher order of magnitude than the theoretical factors, thus providing objective evidence to support the original premise. The most important finding is that, by judicious intervention into an automatic stock control algorithm, it is possible for Buyers to produce results which not only attain but surpass the algorithmic predictions. However, the complexity and behavioural recalcitrance of these systems are such that an innately numerate, enquiring type of Buyer needs to be inducted to realise the performance potential of the overall man/computer system.
Abstract:
In recent years the topic of risk management has moved up the agenda of both government and industry, and private sector initiatives to improve risk and internal control systems have been mirrored by similar promptings for change in the public sector. Both regulators and practitioners now view risk management as an integral part of the process of corporate governance, and an aid to the achievement of strategic objectives. The paper uses case study material on the risk management control system at Birmingham City Council to extend existing theory by developing a contingency theory for the public sector. The case demonstrates that whilst the structure of the control system fits a generic model, the operational details indicate that controls are contingent upon three core variables—central government policies, information and communication technology and organisational size. All three contingent variables are suitable for testing the theory across the broader public sector arena.
Abstract:
The medial pFC (mPFC) is frequently reported to play a central role in Theory of Mind (ToM). However, the contribution of this large cortical region in ToM is not well understood. Combining a novel behavioral task with fMRI, we sought to demonstrate functional divisions between dorsal and rostral mPFC. All conditions of the task required the representation of mental states (beliefs and desires). The level of demands on cognitive control (high vs. low) and the nature of the demands on reasoning (deductive vs. abductive) were varied orthogonally between conditions. Activation in dorsal mPFC was modulated by the need for control, whereas rostral mPFC was modulated by reasoning demands. These findings fit with previously suggested domain-general functions for different parts of mPFC and suggest that these functions are recruited selectively in the service of ToM.