951 results for Field Oriented Control
Abstract:
This paper presents results from the first use of neural networks for the real-time feedback control of high temperature plasmas in a Tokamak fusion experiment. The Tokamak is currently the principal experimental device for research into the magnetic confinement approach to controlled fusion. In the Tokamak, hydrogen plasmas, at temperatures of up to 100 million K, are confined by strong magnetic fields. Accurate control of the position and shape of the plasma boundary requires real-time feedback control of the magnetic field structure on a time-scale of a few tens of microseconds. Software simulations have demonstrated that a neural network approach can give significantly better performance than the linear technique currently used on most Tokamak experiments. The practical application of the neural network approach requires high-speed hardware, for which a fully parallel implementation of the multi-layer perceptron, using a hybrid of digital and analogue technology, has been developed.
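The multi-layer perceptron at the heart of this approach can be sketched in a few lines. The layer sizes, the tanh activation, and the 16-signal/4-parameter mapping below are illustrative assumptions, not the experiment's actual network.

```python
import numpy as np

def mlp_forward(x, weights, biases):
    """Forward pass of a multi-layer perceptron: each hidden layer is
    an affine transform followed by a tanh nonlinearity; the final
    output layer is linear."""
    a = x
    for W, b in zip(weights[:-1], biases[:-1]):
        a = np.tanh(W @ a + b)
    return weights[-1] @ a + biases[-1]

# Illustrative network: 16 magnetic diagnostic signals in,
# 4 boundary-shape parameters out (sizes are assumptions).
rng = np.random.default_rng(0)
sizes = [16, 32, 4]
weights = [rng.standard_normal((m, n)) * 0.1 for n, m in zip(sizes, sizes[1:])]
biases = [np.zeros(m) for m in sizes[1:]]
y = mlp_forward(rng.standard_normal(16), weights, biases)
print(y.shape)  # (4,)
```

In a hardware implementation of this kind, each layer's matrix-vector product is what the fully parallel analogue/digital circuitry would evaluate.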
Abstract:
A new challenge in the field of molecular magnetism is the design of optically and thermally switchable solid state magnetic materials for which various kinds of application may be feasible. Our research activities involve preparative methods, the study of the physical properties and associated mechanisms, as well as the exploration of further possibilities. Particular focus is on heterobimetallic Prussian Blue analogs, such as RbMn[Fe(CN)6], in which the interplay between the two different adjacent metal ions is crucial for the observation of photo-induced phenomena. Our studies revealed that modification of the preparative conditions leads to differences in structural features that allowed tuning of the magnetic and electron transfer properties of RbxMn[Fe(CN)6]y.zH2O.
Abstract:
This paper draws attention to the fact that traditional Data Envelopment Analysis (DEA) models do not provide the closest possible targets (or peers) to inefficient units, and presents a procedure to obtain such targets. It focuses on non-oriented efficiency measures (which assume that production units are able to control, and thus change, inputs and outputs simultaneously) both measured in relation to a Free Disposal Hull (FDH) technology and in relation to a convex technology. The approaches developed for finding close targets are applied to a sample of Portuguese bank branches.
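For the FDH case, the closest-target idea can be illustrated directly: among the observed units that dominate an inefficient unit (inputs no worse, outputs no better), pick the one at minimum distance. This is a toy sketch under assumed conventions (Euclidean distance, inputs in the first half of each row), not the paper's actual procedure.

```python
import numpy as np

def fdh_closest_target(unit, units):
    """Among observed units that dominate `unit` (inputs <= and
    outputs >=, componentwise), return the closest in Euclidean
    distance, or None if `unit` is undominated (FDH-efficient).
    Each row of `units` is (inputs..., outputs...); we assume the
    first half of the columns are inputs, the second half outputs."""
    k = unit.size // 2
    best, best_d = None, np.inf
    for u in units:
        dominates = np.all(u[:k] <= unit[:k]) and np.all(u[k:] >= unit[k:])
        if dominates and not np.array_equal(u, unit):
            d = np.linalg.norm(u - unit)
            if d < best_d:
                best, best_d = u, d
    return best

# Toy data: one input, one output per unit.
units = np.array([[2.0, 4.0], [3.0, 5.0], [5.0, 3.0]])
target = fdh_closest_target(np.array([5.0, 3.0]), units)
print(target)  # [3. 5.]
```

Both [2, 4] and [3, 5] dominate the inefficient unit [5, 3]; the closest-target criterion selects [3, 5] because it requires the smaller change in inputs and outputs.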
Abstract:
Over the last ten years our understanding of early spatial vision has improved enormously. The long-standing model of probability summation amongst multiple independent mechanisms with static output nonlinearities responsible for masking is obsolete. It has been replaced by a much more complex network of additive, suppressive, and facilitatory interactions and nonlinearities across eyes, area, spatial frequency, and orientation that extend well beyond the classical receptive field (CRF). A review of a substantial body of psychophysical work performed by ourselves (20 papers), and others, leads us to the following tentative account of the processing path for signal contrast. The first suppression stage is monocular, isotropic, non-adaptable, accelerates with RMS contrast, most potent for low spatial and high temporal frequencies, and extends slightly beyond the CRF. Second and third stages of suppression are difficult to disentangle but are possibly pre- and post-binocular summation, and involve components that are scale invariant, isotropic, anisotropic, chromatic, achromatic, adaptable, interocular, substantially larger than the CRF, and saturated by contrast. The monocular excitatory pathways begin with half-wave rectification, followed by a preliminary stage of half-binocular summation, a square-law transducer, full binocular summation, pooling over phase, cross-mechanism facilitatory interactions, additive noise, linear summation over area, and a slightly uncertain decision-maker. The purpose of each of these interactions is far from clear, but the system benefits from area and binocular summation of weak contrast signals as well as area and ocularity invariances above threshold (a herd of zebras doesn't change its contrast when it increases in number or when you close one eye). One of many remaining challenges is to determine the stage or stages of spatial tuning in the excitatory pathway.
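The front end of the excitatory pathway described above can be caricatured as a short chain of simple operations. This is only a schematic sketch of three of the named stages (half-wave rectification, square-law transduction, binocular summation); the contrast values and the omission of the remaining stages are illustrative simplifications.

```python
import numpy as np

def excitatory_response(left, right):
    """Caricature of the early excitatory contrast pathway:
    half-wave rectification -> square-law transducer (per eye),
    followed by binocular summation of the two monocular signals."""
    def monocular(c):
        r = np.maximum(c, 0.0)   # half-wave rectification
        return r ** 2            # square-law transducer
    return monocular(left) + monocular(right)  # binocular summation

# Binocular summation of weak signals: the same contrast shown to
# both eyes produces a larger response than to one eye alone.
print(excitatory_response(0.1, 0.1) > excitatory_response(0.1, 0.0))  # True
```

Even this toy chain reproduces one qualitative property from the account above: weak contrast signals benefit from binocular summation.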
Abstract:
The question of how to develop leaders so that they are more effective in a variety of situations, roles and levels has inspired a voluminous amount of research. While leader development programs such as executive coaching and 360-degree feedback have been widely practiced to meet this demand within organisations, the research in this area has only scratched the surface. Drawing from the past literature and leadership practices, the current research conceptualised self-regulation as a metacompetency that would assist leaders to further develop the specific competencies needed to perform effectively in their leadership role, leading to an increased rating of leader effectiveness and to enhanced group performance. To test this conceptualisation, a longitudinal field experimental study was conducted across ten months with a pre-test and two post-test intervention design and a matched control group. This longitudinal field experiment compared the difference in leader and team performance after receiving a self-regulation intervention that was delivered by an executive coach. Leaders in the experimental group also received feedback reports from 360-degree feedback at each stage. Participants were 40 leaders, 155 followers and 8 supervisors. Leaders' performance was measured using a multi-source perceptual measure of leader performance and objective measures of team financial and assessment performance. Analyses using repeated-measures ANCOVA on the pre-test and two post-test responses showed significant differences in leader and team performance between the experimental and control groups. Furthermore, leader competencies mediated the relationship between self-regulation and performance. The implications of these findings for the theory and practice of leadership development training programs and the impact on organisational performance are discussed.
Abstract:
In this contribution, certain aspects of the nonlinear dynamics of magnetic field lines are reviewed. First, the basic facts (known from literature) concerning the Hamiltonian structure are briefly summarized. The paper then concentrates on the following subjects: (i) Transition from the continuous description to discrete maps; (ii) Characteristics of incomplete chaos; (iii) Control of chaos. The presentation is concluded by some remarks on the motion of particles in stochastic magnetic fields.
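The transition from a continuous field-line description to a discrete map, subject (i) above, is commonly illustrated with the Chirikov standard map, a prototypical area-preserving map for Hamiltonian chaos of this kind; the stochasticity parameter and initial condition below are arbitrary choices for illustration.

```python
import math

def standard_map(theta, p, K, steps):
    """Iterate the Chirikov standard map, a discrete area-preserving
    map often used as a prototype for magnetic field-line chaos:
        p'     = p + K * sin(theta)   (mod 2*pi)
        theta' = theta + p'           (mod 2*pi)
    K is the stochasticity parameter; chaos becomes widespread for
    K above roughly 1."""
    two_pi = 2.0 * math.pi
    for _ in range(steps):
        p = (p + K * math.sin(theta)) % two_pi
        theta = (theta + p) % two_pi
    return theta, p

theta, p = standard_map(1.0, 0.5, K=0.9716, steps=1000)
print(0.0 <= theta < 2 * math.pi and 0.0 <= p < 2 * math.pi)  # True
```

Here theta plays the role of a poloidal angle and p of a radial-like action coordinate; one iteration corresponds to one toroidal transit of the field line.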
Abstract:
The authors use social control theory to develop a conceptual model that addresses the effectiveness of regulatory agencies’ (e.g., Food and Drug Administration, Occupational Safety and Health Administration) field-level efforts to obtain conformance with product safety laws. Central to the model are the control processes agencies use when monitoring organizations and enforcing the safety rules. These approaches can be labeled formal control (e.g., rigid enforcement) and informal control (e.g., social instruction). The theoretical framework identifies an important antecedent of control and the relative effectiveness of control’s alternative forms in gaining compliance and reducing opportunism. Furthermore, the model predicts that the regulated firms’ level of agreement with the safety rules moderates the relationships between control and firm responses. A local health department’s administration of state food safety regulations provides the empirical context for testing the hypotheses. The results from a survey of 173 restaurants largely support the proposed model. The study findings inform a discussion of effective methods of administering product safety laws.
Abstract:
The rapid developments in computer technology have resulted in a widespread use of discrete event dynamic systems (DEDSs). This type of system is complex because it exhibits properties such as concurrency, conflict and non-determinism. It is therefore important to model and analyse such systems before implementation to ensure safe, deadlock free and optimal operation. This thesis investigates current modelling techniques and describes Petri net theory in more detail. It reviews top down, bottom up and hybrid Petri net synthesis techniques that are used to model large systems and introduces an object-oriented methodology to enable modelling of larger and more complex systems. Designs obtained by this methodology are modular, easy to understand and allow re-use of designs. Control is the next logical step in the design process. This thesis reviews recent developments in the control of DEDSs and investigates the use of Petri nets in the design of supervisory controllers. The scheduling of exclusive use of resources is investigated, an efficient Petri net based scheduling algorithm is designed and a re-configurable controller is proposed. To enable the analysis and control of large and complex DEDSs, an object oriented C++ software tool kit was developed and used to implement a Petri net analysis tool, Petri net scheduling and control algorithms. Finally, the methodology was applied to two industrial DEDSs: a prototype can sorting machine developed by Eurotherm Controls Ltd., and a semiconductor testing plant belonging to SGS Thomson Microelectronics Ltd.
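The Petri net firing rule that such modelling builds on can be sketched minimally: a transition is enabled when every input place holds enough tokens, and firing consumes tokens along input arcs and produces them along output arcs. The place names and arc weights below are toy assumptions, not taken from the thesis's case studies.

```python
def enabled(marking, pre):
    """A transition is enabled if each of its input places holds at
    least as many tokens as the corresponding input arc weight."""
    return all(marking.get(p, 0) >= w for p, w in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume tokens along input arcs, produce
    tokens along output arcs. Returns the new marking."""
    if not enabled(marking, pre):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, w in pre.items():
        m[p] = m.get(p, 0) - w
    for p, w in post.items():
        m[p] = m.get(p, 0) + w
    return m

# Toy net: a machine takes a part and a free slot, yielding a busy slot.
m0 = {"parts": 2, "free": 1, "busy": 0}
m1 = fire(m0, pre={"parts": 1, "free": 1}, post={"busy": 1})
print(m1)  # {'parts': 1, 'free': 0, 'busy': 1}
```

Conflict, one of the DEDS properties named above, appears in this setting when two transitions are simultaneously enabled but share tokens, so firing one disables the other.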
Abstract:
A detailed investigation has been undertaken into the field induced electron emission (FIEE) mechanism that occurs at microscopically localised `sites' on uncoated and dielectric coated metallic electrodes. These processes have been investigated using two dedicated experimental systems that were developed for this study. The first is a novel combined photo/field emission microscope, which employs a UV source to stimulate photo-electrons from the sample surface in order to generate a topographical image. This system utilises an electrostatic lens column to provide identical optical properties under the different operating conditions required for purely topographical and combined photo/field imaging. The system has been demonstrated to have a resolution approaching 1 µm. Emission images have been obtained from carbon emission sites using this system to reveal that emission may occur from the edge triple junction or from the bulk of the carbon particle. An existing UHV electron spectrometer has been extensively rebuilt to incorporate a computer control and data acquisition system, improved sample handling and manipulation and a specimen heating stage. Details are given of a comprehensive study into the effects of sample heating on the emission process under conditions of both bulk and transient heating. Similar studies were also performed under conditions of both zero and high applied field. These show that the properties of emission sites are strongly temperature and field dependent thus indicating that the emission process is `non-metallic' in nature. The results have been shown to be consistent with an existing hot electron emission model.
Abstract:
Three lichen species were wetted with distilled water at different frequencies during August 1973 to July 1974. The radial growth rates of Parmelia glabratula ssp. fuliginosa and Physcia orbicularis thalli declined with increased wetting, while the radial growth rate of Parmelia conspersa thalli increased with wetting frequency until ten experimental wettings per month but at fifteen wettings per month fell to a value near to the control. In the summer months, wetting resulted in a decline in the radial growth of P. glabratula ssp. fuliginosa compared with the control but had little influence on the growth of P. conspersa and Physcia orbicularis. In the winter months, wetting had no significant influence on the radial growth of Parmelia glabratula ssp. fuliginosa, while the radial growth of P. conspersa increased and Physcia orbicularis declined compared with controls. These results are interpreted physiologically and in relation to the aspect distribution of the three lichens on rock surfaces.
Abstract:
We present a mean field theory of code-division multiple access (CDMA) systems with error-control coding. On the basis of the relation between the free energy and mutual information, we obtain an analytical expression of the maximum spectral efficiency of the coded CDMA system, from which a mean field description of the coded CDMA system is provided in terms of a bank of scalar Gaussian channels whose variances in general vary at different code symbol positions. Regular low-density parity-check (LDPC)-coded CDMA systems are also discussed as an example of the coded CDMA systems.
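The bank-of-scalar-Gaussian-channels picture lends itself to a short numerical sketch: the total mutual information is the sum of per-position Gaussian channel capacities, with noise variances that may differ across code symbol positions. The unit signal power and the variance values below are arbitrary illustrations, not quantities from the paper.

```python
import math

def gaussian_channel_bits(signal_power, noise_var):
    """Capacity of a scalar Gaussian channel in bits per symbol:
    0.5 * log2(1 + SNR)."""
    return 0.5 * math.log2(1.0 + signal_power / noise_var)

# Mean-field picture: one scalar Gaussian channel per code symbol
# position, with position-dependent noise variance (values assumed).
noise_vars = [0.5, 0.8, 1.2]
total_bits = sum(gaussian_channel_bits(1.0, v) for v in noise_vars)
print(round(total_bits, 3))
```

The decoupling is the point: once the mean-field analysis reduces the multiuser CDMA channel to independent scalar channels, quantities like spectral efficiency follow from elementary per-channel formulas of this kind.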
Abstract:
The present study describes a pragmatic approach to the implementation of production planning and scheduling techniques in foundries of all types and looks at the use of `state-of-the-art' management control and information systems. Following a review of systems for the classification of manufacturing companies, a definitive statement is made which highlights the important differences between foundries (i.e. `component makers') and other manufacturing companies (i.e. `component buyers'). An investigation of the manual procedures which are used to plan and control the manufacture of components reveals the inherent problems facing foundry production management staff, which suggests the unsuitability of many manufacturing techniques which have been applied to general engineering companies. From the literature it was discovered that computer-assisted systems are required which are primarily `information-based' rather than `decision based', whilst the availability of low-cost computers and `packaged-software' has enabled foundries to `get their feet wet' without the financial penalties which characterized many of the early attempts at computer-assistance (i.e. pre-1980). Moreover, no evidence of a single methodology for foundry scheduling emerged from the review. A philosophy for the development of a CAPM system is presented, which details the essential information requirements and puts forward proposals for the subsequent interactions between types of information and the sub-system of CAPM which they support. The work developed was oriented specifically at the functions of production planning and scheduling and introduces the concept of `manual interaction' for effective scheduling. The techniques developed were designed to use the information which is readily available in foundries and were found to be practically successful following the implementation of the techniques into a wide variety of foundries. 
The limitations of the techniques developed are subsequently discussed within the wider issues which form a CAPM system, prior to a presentation of the conclusions which can be drawn from the study.
Abstract:
The thesis deals with the background, development and description of a mathematical stock control methodology for use within an oil and chemical blending company, where demand and replenishment lead-times are generally non-stationary. The stock control model proper relies on, as input, adaptive forecasts of demand determined for an economical forecast/replenishment period precalculated on an individual stock-item basis. The control procedure is principally that of the continuous review, reorder level type, where the reorder level and reorder quantity 'float', that is, each changes in accordance with changes in demand. Two versions of the Methodology are presented; a cost minimisation version and a service level version. Realising the importance of demand forecasts, four recognised variations of the Trigg and Leach adaptive forecasting routine are examined. A fifth variation, developed, is proposed as part of the stock control methodology. The results of testing the cost minimisation version of the Methodology with historical data, by means of a computerised simulation, are presented together with a description of the simulation used. The performance of the Methodology is in addition compared favourably to a rule-of-thumb approach considered by the Company as an interim solution for reducing stock levels. The contribution of the work to the field of scientific stock control is felt to be significant for the following reasons: (1) The Methodology is designed specifically for use with non-stationary demand and for this reason alone appears to be unique. (2) The Methodology is unique in its approach and the cost-minimisation version is shown to work successfully with the demand data presented. (3) The Methodology and the thesis as a whole fill an important gap between complex mathematical stock control theory and practical application.
A brief description of a computerised order processing/stock monitoring system, designed and implemented as a pre-requisite for the Methodology's practical operation, is presented as an appendix.
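The Trigg and Leach adaptive forecasting routine referred to above sets the exponential smoothing constant to the absolute tracking signal, so forecasts respond faster when demand is non-stationary. This is the standard textbook form of the method; the error-smoothing constant (0.2) and the demand series are assumed values for illustration.

```python
def trigg_leach(demands, gamma=0.2, f0=None):
    """Adaptive exponential smoothing after Trigg and Leach: the
    smoothing constant alpha is |tracking signal|, the ratio of the
    smoothed error to the smoothed absolute error, so alpha rises
    towards 1 when errors are persistently one-sided (i.e. when the
    demand level has shifted). Returns one forecast per period."""
    f = demands[0] if f0 is None else f0
    e_s, e_abs = 0.0, 1e-9  # smoothed error, smoothed |error|
    forecasts = []
    for d in demands:
        forecasts.append(f)
        e = d - f
        e_s = gamma * e + (1 - gamma) * e_s
        e_abs = gamma * abs(e) + (1 - gamma) * e_abs
        alpha = abs(e_s) / e_abs  # tracking signal in [0, 1]
        f = f + alpha * e
    return forecasts

# A step change in demand: the adaptive alpha lets the forecast
# catch up within a period or two after the jump.
fc = trigg_leach([10, 10, 10, 20, 20, 20, 20])
print(all(9 <= x <= 21 for x in fc))  # True
```

With stationary demand the tracking signal stays near zero and the forecast barely moves; after the step it jumps towards the new level, which is the behaviour that makes the routine suitable for the non-stationary demand described above.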