938 results for Control theory
Abstract:
Flow control in computer communication systems is generally a multi-layered structure, consisting of several mechanisms operating independently at different levels. Evaluation of the performance of networks in which different flow control mechanisms act simultaneously is an important area of research, and is examined in depth in this thesis. This thesis presents the modelling of a finite resource computer communication network equipped with three levels of flow control, based on closed queueing network theory. The flow control mechanisms considered are: end-to-end control of virtual circuits, network access control of external messages at the entry nodes, and hop level control between nodes. The model is solved by a heuristic technique, based on an equivalent reduced network and heuristic extensions to the mean value analysis algorithm. The method has significant computational advantages and overcomes the limitations of exact methods. It can be used to solve large network models with finite buffers and many virtual circuits. The model and its heuristic solution are validated by simulation. The interaction between the three levels of flow control is investigated. A queueing model is developed for the admission delay on virtual circuits with end-to-end control, in which messages arrive from independent Poisson sources. The selection of the optimum window limit is considered. Several advanced network access schemes are postulated to improve the performance of the network as well as that of selected traffic streams, and numerical results are presented. A model for the dynamic control of input traffic is developed. Based on Markov decision theory, an optimal control policy is formulated. Numerical results are given, and throughput-delay performance is shown to be better with dynamic control than with static control.
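The heuristic solution described above extends the exact mean value analysis (MVA) recursion for closed queueing networks. The following is an illustrative single-class sketch of the exact algorithm only; the thesis's model, with multiple virtual-circuit chains and finite buffers, is considerably more involved:

```python
def mva(demands, n_customers):
    """Exact single-class mean value analysis for a closed queueing
    network of queueing stations; demands[k] is the mean service
    demand at station k. Requires n_customers >= 1."""
    q = [0.0] * len(demands)          # mean queue lengths with 0 customers
    for n in range(1, n_customers + 1):
        # Arrival theorem: an arriving customer sees the network
        # as if it held one fewer customer.
        r = [d * (1.0 + qk) for d, qk in zip(demands, q)]  # residence times
        x = n / sum(r)                                     # system throughput
        q = [x * rk for rk in r]                           # Little's law
    return x, q
```

The recursion is exact but its cost grows with the population and the number of chains, which is why heuristic extensions of the kind the thesis develops are attractive for large networks.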
Abstract:
The rapid developments in computer technology have resulted in a widespread use of discrete event dynamic systems (DEDSs). This type of system is complex because it exhibits properties such as concurrency, conflict and non-determinism. It is therefore important to model and analyse such systems before implementation to ensure safe, deadlock-free and optimal operation. This thesis investigates current modelling techniques and describes Petri net theory in more detail. It reviews top down, bottom up and hybrid Petri net synthesis techniques that are used to model large systems, and introduces an object oriented methodology to enable modelling of larger and more complex systems. Designs obtained by this methodology are modular, easy to understand and allow re-use of designs. Control is the next logical step in the design process. This thesis reviews recent developments in the control of DEDSs and investigates the use of Petri nets in the design of supervisory controllers. The scheduling of exclusive use of resources is investigated, an efficient Petri net based scheduling algorithm is designed, and a re-configurable controller is proposed. To enable the analysis and control of large and complex DEDSs, an object oriented C++ software tool kit was developed and used to implement a Petri net analysis tool, Petri net scheduling and control algorithms. Finally, the methodology was applied to two industrial DEDSs: a prototype can sorting machine developed by Eurotherm Controls Ltd., and a semiconductor testing plant belonging to SGS Thomson Microelectronics Ltd.
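The token-game firing rule at the core of any Petri net analysis tool can be sketched in a few lines (a hypothetical Python miniature for illustration; the toolkit described in the abstract is an object oriented C++ library):

```python
def enabled(marking, pre):
    """A transition is enabled when every input place holds at least
    as many tokens as its arc weight requires."""
    return all(marking.get(p, 0) >= w for p, w in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume tokens from input places, deposit
    tokens in output places. Returns a new marking (dict)."""
    if not enabled(marking, pre):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, w in pre.items():
        m[p] -= w
    for p, w in post.items():
        m[p] = m.get(p, 0) + w
    return m
```

For example, a transition that grants a shared resource consumes the single token in a "free" place; once fired, any competing transition is disabled, which is exactly the exclusive-resource behaviour a Petri net based scheduler exploits.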
Abstract:
Distributed digital control systems provide alternatives to conventional, centralised digital control systems. Typically, a modern distributed control system will comprise a multi-processor or network of processors, a communications network, an associated set of sensors and actuators, and the systems and applications software. This thesis addresses the problem of how to design robust decentralised control systems, such as those used to control event-driven, real-time processes in time-critical environments. Emphasis is placed on studying the dynamical behaviour of a system and identifying ways of partitioning the system so that it may be controlled in a distributed manner. A structural partitioning technique is adopted which makes use of natural physical sub-processes in the system, which are then mapped into the software processes that control the system. However, communications are required between the processes because of the disjoint nature of the distributed (i.e. partitioned) state of the physical system. The structural partitioning technique, and recent developments in the theory of potential controllability and observability of a system, are the basis for the design of controllers. In particular, the method is used to derive a decentralised estimate of the state vector for a continuous-time system. The work is also extended to derive a distributed estimate for a discrete-time system. Emphasis is also given to the role of communications in the distributed control of processes and to the partitioning technique necessary to design distributed and decentralised systems with resilient structures. A method is presented for the systematic identification of the communications necessary for distributed control. It is also shown that the structural partitions can be used directly in the design of software fault tolerant concurrent controllers.
In particular, the structural partition can be used to identify the boundary of the conversation which can be used to protect a specific part of the system. In addition, for certain classes of system, the partitions can be used to identify processes which may be dynamically reconfigured in the event of a fault. These methods should be of use in the design of robust distributed systems.
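The decentralised estimators derived in the thesis build on standard observer theory. As a point of reference only (this is not the thesis's partitioned estimator), a single Luenberger observer update for a discrete-time system with a scalar output can be sketched as:

```python
def matvec(M, v):
    """Multiply matrix M (list of rows) by vector v."""
    return [sum(a * x for a, x in zip(row, v)) for row in M]

def observer_step(A, C, L, x_hat, y):
    """One update of a discrete-time Luenberger observer:
    x_hat[k+1] = A x_hat[k] + L (y[k] - C x_hat[k]),
    where y is the measured scalar output and L the observer gain."""
    innovation = y - sum(c * x for c, x in zip(C, x_hat))
    Ax = matvec(A, x_hat)
    return [ax + l * innovation for ax, l in zip(Ax, L)]
```

In a distributed design, each partition runs an estimator of this shape over its own sub-state and exchanges innovation-related terms with neighbouring partitions over the communications network.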
Abstract:
We present a mean field theory of code-division multiple access (CDMA) systems with error-control coding. On the basis of the relation between the free energy and mutual information, we obtain an analytical expression of the maximum spectral efficiency of the coded CDMA system, from which a mean field description of the coded CDMA system is provided in terms of a bank of scalar Gaussian channels whose variances in general vary at different code symbol positions. Regular low-density parity-check (LDPC)-coded CDMA systems are also discussed as an example of the coded CDMA systems.
Abstract:
The aim of this research was to improve the quantitative support to project planning and control principally through the use of more accurate forecasting for which new techniques were developed. This study arose from the observation that in most cases construction project forecasts were based on a methodology (c.1980) which relied on the DHSS cumulative cubic cost model and network based risk analysis (PERT). The former of these, in particular, imposes severe limitations which this study overcomes. Three areas of study were identified, namely growth curve forecasting, risk analysis and the interface of these quantitative techniques with project management. These fields have been used as a basis for the research programme. In order to give a sound basis for the research, industrial support was sought. This resulted in both the acquisition of cost profiles for a large number of projects and the opportunity to validate practical implementation. The outcome of this research project was deemed successful both in theory and practice. The new forecasting theory was shown to give major reductions in projection errors. The integration of the new predictive and risk analysis technologies with management principles, allowed the development of a viable software management aid which fills an acknowledged gap in current technology.
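The cumulative cubic cost model referred to above constrains project spend to an S-shaped cubic in normalised time. A generic least-squares sketch of fitting such a curve is shown below; note this is not the DHSS parameterisation itself (which is not reproduced here), only a cubic with the same end constraints y(0)=0 and y(1)=1:

```python
def fit_cubic_s_curve(xs, ys):
    """Fit y = a*x + b*x^2 + (1 - a - b)*x^3 to normalised
    (time fraction, cumulative cost fraction) data by ordinary
    least squares on the two free parameters a and b."""
    f1 = [x - x**3 for x in xs]           # basis function for a
    f2 = [x**2 - x**3 for x in xs]        # basis function for b
    t = [y - x**3 for x, y in zip(xs, ys)]
    # 2x2 normal equations, solved in closed form.
    s11 = sum(u * u for u in f1)
    s12 = sum(u * v for u, v in zip(f1, f2))
    s22 = sum(v * v for v in f2)
    t1 = sum(u * w for u, w in zip(f1, t))
    t2 = sum(v * w for v, w in zip(f2, t))
    det = s11 * s22 - s12 * s12
    a = (t1 * s22 - t2 * s12) / det
    b = (t2 * s11 - t1 * s12) / det
    return a, b
```

Once fitted to early cost returns, the curve extrapolates the remaining spend profile, which is the kind of projection whose errors the new forecasting theory was shown to reduce.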
Abstract:
Over the past decade, several experienced Operational Researchers have advanced the view that the theoretical aspects of model building have raced ahead of the ability of people to use them. Consequently, the impact of Operational Research on commercial organisations and the public sector is limited, and many systems fail to achieve their anticipated benefits in full. The primary objective of this study is to examine a complex interactive Stock Control system, and identify the reasons for the differences between the theoretical expectations and the operational performance. The methodology used is to hypothesise all the possible factors which could cause a divergence between theory and practice, and to evaluate numerically the effect each of these factors has on two main control indices - Service Level and Average Stock Value. Both analytical and empirical methods are used, and simulation is employed extensively. The factors are divided into two main categories for analysis - theoretical imperfections in the model, and the usage of the system by Buyers. No evidence could be found in the literature of any previous attempts to place the differences between theory and practice in a system in quantitative perspective nor, more specifically, to study the effects of Buyer/computer interaction in a Stock Control system. The study reveals that, in general, the human factors influencing performance are of a much higher order of magnitude than the theoretical factors, thus providing objective evidence to support the original premise. The most important finding is that, by judicious intervention into an automatic stock control algorithm, it is possible for Buyers to produce results which not only attain but surpass the algorithmic predictions. However, the complexity and behavioural recalcitrance of these systems are such that an innately numerate, enquiring type of Buyer needs to be inducted to realise the performance potential of the overall man/computer system.
Abstract:
Prior to the development of a production standard control system for ML Aviation's plan-symmetric remotely piloted helicopter system, SPRITE, optimum solutions to technical requirements had yet to be found for some aspects of the work. This thesis describes an industrial project where solutions to real problems have been provided within strict timescale constraints. Use has been made of published material wherever appropriate; new solutions have been contributed where none existed previously. A lack of clearly defined user requirements from potential Remotely Piloted Air Vehicle (RPAV) system users is identified. A simulation package is defined to enable the RPAV designer to progress with air vehicle and control system design, development and evaluation studies, and to assist the user to investigate his applications. The theoretical basis of this simulation package is developed, including Co-axial Contra-rotating Twin Rotor (CCTR), six degrees of freedom motion, fuselage aerodynamics, and sensor and control system models. A compatible system of equations is derived for modelling a miniature plan-symmetric helicopter. Rigorous searches revealed a lack of CCTR models, based on closed form expressions to obviate integration along the rotor blade, for stabilisation and navigation studies through simulation. An economic CCTR simulation model is developed and validated by comparison with published work and practical tests. Confusion in published work between attitude and Euler angles is clarified. The implementation of the theory into a high integrity software package is discussed. Use is made of a novel technique basing the dynamic adjustment of the integration time step size on error assessment. Simulation output is presented for studies of control system stability verification, cross coupling of motion between control channels, and air vehicle response to demands and horizontal wind gusts.
Keywords: Contra-Rotating Twin Rotor; Flight Control System; Remotely Piloted Plan-Symmetric Helicopter; Simulation; Six Degrees of Freedom Motion
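The novel technique of adjusting the integration time step from an error assessment presumably resembles standard step-doubling step-size control. A minimal forward-Euler sketch under that assumption (not the thesis's actual scheme) is:

```python
def adaptive_euler(f, t, y, t_end, tol=1e-5, h=0.1):
    """Integrate y' = f(t, y) by forward Euler with a step-doubling
    error estimate: compare one full step against two half steps,
    then grow or shrink the step size h accordingly."""
    while t < t_end - 1e-12:
        h = min(h, t_end - t)
        y_full = y + h * f(t, y)                       # one full step
        y_half = y + 0.5 * h * f(t, y)                 # two half steps
        y_two = y_half + 0.5 * h * f(t + 0.5 * h, y_half)
        err = abs(y_two - y_full)                      # local error estimate
        if err <= tol:
            t, y = t + h, y_two                        # accept the finer result
            if err < 0.25 * tol:
                h *= 2.0                               # error small: lengthen step
        else:
            h *= 0.5                                   # reject and retry shorter
    return y
```

In a flight simulation this lets the integrator take long steps in quiescent flight and automatically shorten them during rapid transients such as gust responses.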
Abstract:
Meta-analysis was used to quantify how well the Theories of Reasoned Action and Planned Behaviour have predicted intentions to attend screening programmes and actual attendance behaviour. Systematic literature searches identified 33 studies that were included in the review. Across the studies as a whole, attitudes had a large-sized relationship with intention, while subjective norms and perceived behavioural control (PBC) possessed medium-sized relationships with intention. Intention had a medium-sized relationship with attendance, whereas the PBC-attendance relationship was small sized. Due to heterogeneity in results between studies, moderator analyses were conducted. The moderator variables were (a) type of screening test, (b) location of recruitment, (c) screening cost and (d) invitation to screen. All moderators affected theory of planned behaviour relationships. Suggestions for future research emerging from these results include targeting attitudes to promote intention to screen, a greater use of implementation intentions in screening information and examining the credibility of different screening providers.
Abstract:
Speed's theory makes two predictions for the development of analogical reasoning. Firstly, young children should not be able to reason analogically due to an undeveloped PFC neural network. Secondly, category knowledge enables the reinforcement of structural features over surface features, and thus the development of sophisticated analogical reasoning. We outline existing studies that support these predictions and highlight some critical remaining issues. Specifically, we argue that the development of inhibition must be directly compared alongside the development of reasoning strategies in order to support Speed's account. © 2010 Psychology Press.
Abstract:
This paper draws upon activity theory to analyse an empirical investigation of the micro practices of strategy in three UK universities. Activity theory provides a framework of four interactive components from which strategy emerges: the collective structures of the organization; the primary actors, in this research conceptualized as the top management team (TMT); the practical activities in which they interact; and the strategic practices through which interaction is conducted. Using this framework, the paper focuses specifically on the formal strategic practices involved in direction setting, resource allocation, and monitoring and control. These strategic practices are associated with continuity of strategic activity in one case study but are involved in the reinterpretation and change of strategic activity in the other two cases. We model this finding into activity theory-based typologies of the cases that illustrate the way that practices either distribute shared interpretations or mediate between contested interpretations of strategic activity. The typologies explain the relationships between strategic practices and continuity and change of strategy as practice. The paper concludes by linking activity theory to wider change literatures to illustrate its potential as an integrative methodological framework for examining the subjective and emergent processes through which strategic activity is constructed. © Blackwell Publishing Ltd 2003.
Abstract:
Electronic channel affiliates are important online intermediaries between customers and host retailers. However, no work has studied how online retailers control online intermediaries. By conducting an exploratory content analysis of 85 online contracts between online retailers and their online intermediaries, and categorizing the governing mechanisms used, insights into the unique aspects of the control of online intermediaries are presented. Findings regarding incentives, monitoring, and enforcement are presented. Additionally, testable research propositions are presented to guide further theory development, drawing on contract theory, resource dependence theory and agency theory. Managerial implications are discussed. © 2012 Elsevier Inc.
Abstract:
Background: Stroke prevention in atrial fibrillation (AF), most commonly with warfarin, requires maintenance of a narrow therapeutic target (INR 2.0 to 3.0) and is often poorly controlled in practice. Poor patient understanding of AF and its treatment may undermine patients' willingness to adhere to recommendations. Method: A theory-driven intervention, developed using patient interviews and focus groups, consisting of a one-off group session (1-6 patients) utilising an 'expert-patient' focussed DVD, educational booklet, self-monitoring diary and worksheet, was compared in a randomised controlled trial (ISRCTN93952605) against usual care, with patient postal follow-ups at 1, 2, 6, and 12 months. Ninety-seven warfarin-naïve AF patients were randomised to intervention (n=46, mean age (SD) 72.0 (8.2), 67.4% men) or usual care (n=51, mean age (SD) 73.7 (8.1), 62.7% men), stratified by age, sex, and recruitment centre. The primary endpoint was time within therapeutic range (TTR); secondary endpoints included knowledge, quality of life, anxiety/depression, beliefs about medication, and illness perceptions. Main findings: Intervention patients had significantly higher TTR than usual care at 6 months (76.2% vs. 71.3%; p=0.035); at 12 months these differences were not significant (76.0% vs. 70.0%; p=0.44). Knowledge increased significantly across time (F(3, 47) = 6.4; p<0.01), but there were no differences between groups (F(1, 47) = 3.3; p=0.07). At 6 months, knowledge scores predicted TTR (r=0.245; p=0.04). Patients' scores on subscales representing their perception of the general harm and overuse of medication, as well as the perceived necessity of their AF-specific medications, predicted TTR at 6 and 12 months. Conclusions: A theory-driven educational intervention significantly improves TTR in AF patients initiating warfarin during the first 6 months.
Adverse clinical outcomes may potentially be reduced by improving patients’ understanding of the necessity of warfarin and reducing their perception of treatment harm. Improving education provision for AF patients is essential to ensure efficacious and safe treatment.
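Time in therapeutic range is conventionally estimated by Rosendaal's linear interpolation method; the abstract does not state which computation the trial used, but a standard sketch is:

```python
def time_in_therapeutic_range(days, inrs, low=2.0, high=3.0):
    """Rosendaal-style linear interpolation estimate of TTR:
    interpolate INR linearly between consecutive tests and return
    the percentage of follow-up time spent inside [low, high]."""
    in_range = 0.0
    total = float(days[-1] - days[0])
    for (d0, i0), (d1, i1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        span = d1 - d0
        if i0 == i1:
            # Constant INR over the interval: all or nothing.
            in_range += span if low <= i0 <= high else 0.0
            continue
        lo, hi = min(i0, i1), max(i0, i1)
        # Fraction of the interpolated segment lying inside the range.
        overlap = max(0.0, min(hi, high) - max(lo, low))
        in_range += span * overlap / (hi - lo)
    return 100.0 * in_range / total
```

For instance, a patient whose INR rises linearly from 1.5 to 3.5 over ten days spends half of that interval inside the 2.0 to 3.0 target, giving a TTR of 50%.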
Abstract:
The purpose of this paper is to theorise the changes surrounding the introduction of a management control innovation, total quality management (TQM) techniques, within Telecom Fiji Limited. Using institutional theory and drawing on empirical evidence from multiple sources including interviews, discussions and documents, the paper explicates the institutionalization of these TQM practices. The focus of the paper is the micro-processes and practice changes around TQM implementation, rather than the influence of the macro-level structures that are often linked with institutional theory. The change agents used Quality Action Teams and the National Quality Council to introduce new TQM routines. The present study extends the scope of institutional analysis by explaining how institutional contradictions create space for institutional entrepreneurs, who in turn modify existing routines or introduce new routines in fluid organizational environments which also exhibit evidence of resistance.
Abstract:
This article assesses the impact of education reform and the new public management (NPM) on the discretion of school teachers. The focal point of the study is Michael Lipsky's theory of discretion which casts public service professionals and others involved in service delivery as 'street-level bureaucrats' because their high degree of discretionary rule-making power enabled them to effectively make policy as well as implement it. The article considers the relationship between education reform and the NPM and focuses on the increased emphasis on skills-based teaching and changes in management and leadership in schools. The literature and survey of teachers demonstrate that discretion in the workplace has been eroded to such an extent due to a high degree of central regulation and local accountability as to question the applicability of Lipsky's model. The findings are based on the literature and a small survey undertaken by the author. © 2007 BELMAS.
Abstract:
Projects that are exposed to uncertain environments can be effectively controlled with the application of risk analysis during the planning stage. The Analytic Hierarchy Process, a multiattribute decision-making technique, can be used to analyse and assess project risks which are objective or subjective in nature. Among other advantages, the process logically integrates the various elements in the planning process. The results from risk analysis and activity analysis are then used to develop a logical contingency allowance for the project through the application of probability theory. The contingency allowance is created in two parts: (a) a technical contingency, and (b) a management contingency. This provides a basis for decision making in a changing project environment. Effective control of the project is made possible by the limitation of the changes within the monetary contingency allowance for the work package concerned, and the utilization of the contingency through proper appropriation. The whole methodology is applied to a pipeline-laying project in India, and its effectiveness in project control is demonstrated.
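The Analytic Hierarchy Process step reduces pairwise risk judgements to priority weights. A common approximation is the row geometric mean method, sketched below (the paper's own computation may use the principal eigenvector instead):

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a reciprocal pairwise
    comparison matrix (pairwise[i][j] = importance of i over j)
    using the row geometric mean, normalised to sum to 1."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]
```

The resulting weights for the identified risk elements can then feed the probabilistic calculation of the technical and management contingency allowances.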