917 results for Automatic Control Theory


Relevance:

30.00%

Publisher:

Abstract:

Background: Voluntary limb movements are associated with involuntary and automatic postural adjustments of the trunk muscles. These postural adjustments occur prior to movement and prevent unwanted perturbation of the trunk. In low back pain, postural adjustments of the trunk muscles are altered such that the deep trunk muscles are consistently delayed and the superficial trunk muscles are sometimes augmented. This alteration of postural adjustments may reflect disruption of normal postural control imparted by reduced central nervous system resources available during pain, so-called pain interference, or reflect adoption of an alternate postural adjustment strategy. Methods: We aimed to clarify this by recording electromyographic activity of the upper (obliquus externus) and lower (transversus abdominis/obliquus internus) abdominal muscles during voluntary arm movements that were coupled with painful cutaneous stimulation at the low back. If the effect of pain on postural adjustments is caused by pain interference, it should be greatest at the onset of the stimulus, should habituate with repeated exposure, and should be absent immediately when the threat of pain is removed. Sixteen patients performed 30 forward movements of the right arm in response to a visual cue (control). Seventy trials were then conducted in which arm movement was coupled with pain (pain trials), and a further 70 trials were then conducted without the pain stimulus (no pain trials). Results: There was a gradual and increasing delay of transversus abdominis/obliquus internus electromyographic onset and augmentation of obliquus externus during the pain trials, both of which gradually returned to control values during the no pain trials.
Conclusion: The results suggest that altered postural adjustments of the trunk muscles during pain are not caused by pain interference but are likely to reflect development and adoption of an alternate postural adjustment strategy, which may serve to limit the amplitude and velocity of trunk excursion caused by arm movement.

Relevance:

30.00%

Publisher:

Abstract:

The application of nonlocal density functional theory (NLDFT) to determine the pore size distribution (PSD) of activated carbons using a nongraphitized carbon black, instead of graphitized thermal carbon black, as a reference system is explored. We show that in this case nitrogen and argon adsorption isotherms in activated carbons are precisely correlated by the theory, and such an excellent correlation would not be possible if the pore wall surface were assumed to be identical to that of graphitized carbon black. This suggests that the pore wall surfaces of activated carbon are closer to those of amorphous solids because of defects of the crystalline lattice, finite pore length, the presence of active centers, etc. Application of the NLDFT adapted to amorphous solids resulted in a quantitative description of N-2 and Ar adsorption isotherms on nongraphitized carbon black BP280 at their respective boiling points. In the present paper we determined solid-fluid potentials from experimental adsorption isotherms on nongraphitized carbon black and subsequently used those potentials to model adsorption in slit pores and generate a corresponding set of local isotherms, which we used to determine the PSD functions of different activated carbons. (c) 2005 Elsevier Ltd. All rights reserved.
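
Determining a PSD from local isotherms amounts to inverting the generalized adsorption integral N(p) = Σ_i f(H_i) ρ(p, H_i) against the kernel of local isotherms. A minimal sketch in Python, where the hypothetical Langmuir-type kernel `local_isotherm` merely stands in for the tabulated NLDFT local isotherms, and the inversion is posed as non-negative least squares:

```python
# Sketch: recovering a pore-size distribution f(H) from an adsorption
# isotherm by inverting N(p) = sum_i f(H_i) * rho(p, H_i).
# The local isotherms below are illustrative Langmuir forms whose
# affinity grows in narrower pores; real NLDFT kernels are tabulated.
def local_isotherm(p, H):
    K = 50.0 / H  # hypothetical pore-width-dependent affinity
    return K * p / (1.0 + K * p)

def predicted_uptake(f, widths, p):
    return sum(fi * local_isotherm(p, H) for fi, H in zip(f, widths))

def fit_psd(pressures, uptake, widths, iters=5000, lr=0.01):
    """Projected-gradient least squares with a non-negativity constraint."""
    f = [0.1] * len(widths)
    for _ in range(iters):
        grad = [0.0] * len(f)
        for p, g in zip(pressures, uptake):
            r = predicted_uptake(f, widths, p) - g
            for j, H in enumerate(widths):
                grad[j] += 2.0 * r * local_isotherm(p, H)
        # gradient step, then project back onto f >= 0
        f = [max(0.0, fj - lr * gj) for fj, gj in zip(f, grad)]
    return f
```

In practice the inversion is ill-posed and is usually regularized; the projection onto non-negative values here is only the simplest stabilizing constraint.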

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a review of the modelling and control of biological nutrient removal (BNR) activated sludge processes for wastewater treatment using distributed parameter models described by partial differential equations (PDEs). Numerical methods for solution of the BNR-activated sludge process dynamics are reviewed; these include the method of lines, global orthogonal collocation, and orthogonal collocation on finite elements. Fundamental techniques and conceptual advances of the distributed parameter approach to the dynamics and control of activated sludge processes are briefly described. A critical analysis of the advantages of the distributed parameter approach over the conventional modelling strategy in this paper shows that the activated sludge process is more adequately described by the former, and the method is recommended for application in the wastewater industry. (c) 2006 Elsevier Ltd. All rights reserved.
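
The method of lines reduces such PDE models to a system of ODEs by discretizing only the spatial derivatives. A minimal sketch in Python, using a generic 1-D advection-diffusion balance as a stand-in for the full BNR reaction terms (all numbers illustrative):

```python
# Sketch: method of lines for dC/dt = -v dC/dx + D d2C/dx2,
# a simplified stand-in for a distributed-parameter reactor balance.
# Space is discretized (upwind advection, central diffusion); the
# resulting ODEs are advanced here by one explicit Euler step.
def method_of_lines_step(C, dx, dt, v=1.0, D=0.01, C_in=1.0):
    n = len(C)
    dCdt = [0.0] * n
    for i in range(n):
        left = C[i - 1] if i > 0 else C_in        # inlet boundary
        right = C[i + 1] if i < n - 1 else C[i]   # zero-gradient outlet
        adv = -v * (C[i] - left) / dx
        diff = D * (right - 2.0 * C[i] + left) / dx ** 2
        dCdt[i] = adv + diff
    return [C[i] + dt * dCdt[i] for i in range(n)]
```

The explicit step is only stable for small enough `dt` (CFL and diffusion limits); the collocation methods reviewed in the paper trade this simplicity for accuracy with far fewer spatial nodes.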

Relevance:

30.00%

Publisher:

Abstract:

Through a prospective study of 70 youths staying at homeless-youth shelters, the authors tested the utility of I. Ajzen's (1991) theory of planned behavior (TPB), by comparing the constructs of self-efficacy with perceived behavioral control (PBC), in predicting people's rule-following behavior during shelter stays. They performed the 1st wave of data collection through a questionnaire assessing the standard TPB components of attitudes, subjective norms, PBC, and behavioral intentions in relation to following the set rules at youth shelters. Further, they distinguished between items assessing PBC (or perceived control) and those reflecting self-efficacy (or perceived difficulty). At the completion of each youth's stay at the shelter, shelter staff rated the rule adherence for that participant. Regression analyses revealed some support for the TPB in that subjective norm was a significant predictor of intentions. However, self-efficacy emerged as the strongest predictor of intentions and was the only significant predictor of rule-following behavior. Thus, the results of the present study indicate the possibility that self-efficacy is integral to predicting rule adherence within this context and reaffirm the importance of incorporating notions of people's perceived ease or difficulty in performing actions in models of attitude-behavior prediction.
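
The predictor comparisons in such analyses rest on correlating questionnaire scales with intentions and behaviour before entering them into regressions. As a minimal illustration in Python (with made-up data; the study itself used multiple regression), a zero-order Pearson correlation can be computed as:

```python
from math import sqrt

def pearson_r(x, y):
    """Zero-order Pearson correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```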

Relevance:

30.00%

Publisher:

Relevance:

30.00%

Publisher:

Abstract:

Recent world events aside, downward trends in donating behaviour in Australia have increased the need for research into the factors that inhibit and encourage charitable giving. A revised Theory of Planned Behaviour (TPB) model was used to determine the influence of attitudes, norms (injunctive, descriptive, and moral norms), perceived behavioural control (PBC), and past behaviour (PB) on intentions to donate money to charities and community service organisations. Respondents (N=186) completed a questionnaire assessing the constructs of the revised TPB model. Four weeks later, self-reported donating behaviour was assessed (n=65). Results showed support for the revised TPB model. Attitudes, PBC, injunctive norms, moral norms, and PB all predicted donating intentions. Descriptive norms did not predict intentions. Intention was the only significant predictor of self-reported behaviour four weeks later, with neither PBC nor PB having a direct effect on behaviour. Theoretical and applied implications of the results are discussed.

Relevance:

30.00%

Publisher:

Abstract:

Well-understood methods exist for developing programs from given specifications. A formal method identifies proof obligations at each development step: if all such proof obligations are discharged, a precisely defined class of errors can be excluded from the final program. For a class of closed systems, such methods offer a gold standard against which less formal approaches can be measured. For open systems (those which interact with the physical world), the task of obtaining the program specification can be as challenging as the task of deriving the program. And when a system of this class must tolerate certain kinds of unreliability in the physical world, it is still more challenging to reach confidence that the specification obtained is adequate. We argue that widening the notion of software development to include specifying the behaviour of the relevant parts of the physical world gives a way to derive the specification of a control system and also to record precisely the assumptions being made about the world outside the computer.

Relevance:

30.00%

Publisher:

Abstract:

DeVilliers and DeVilliers (2000, 2005) propose that deaf and hearing children acquire a theory of mind (or the understanding that human behaviour is the product of psychological states like true and false beliefs) as a consequence of their linguistic mastery of a rule of syntax. Specifically, they argue that the syntactic rule for sentential complementation with verbs of speech (e.g., “say”) precedes syntactic mastery of complementation for cognition (e.g., “think”) and both of these developmentally precede and promote conceptual mastery of a theory of mind (ToM), as indexed via success on standard false belief tests. The present study examined this proposition in groups of primary-school-aged deaf children and hearing preschoolers who took false belief tests and a modified memory for complements test that included control questions. Guttman scaling techniques indicated no support either for the prediction that syntactic skill precedes ToM understanding or for the earlier emergence of complementation for “say” than for “think”. Methodological issues and implications for deaf children's ToM development are discussed.

Relevance:

30.00%

Publisher:

Abstract:

A sieve plate distillation column has been constructed and interfaced to a minicomputer with the necessary instrumentation for dynamic, estimation and control studies, with special bearing on low-cost and noise-free instrumentation. A dynamic simulation of the column with a binary liquid system has been compiled using deterministic models that include fluid dynamics via Brambilla's equation for tray liquid holdup calculations. The simulation predictions have been tested experimentally under steady-state and transient conditions. The simulator's predictions of the tray temperatures have shown reasonably close agreement with the measured values under steady-state conditions and in the face of a step change in the feed rate. A method of extending linear filtering theory to highly nonlinear systems with very nonlinear measurement functional relationships has been proposed and tested by simulation on binary distillation. The simulation results have proved that the proposed methodology can overcome the typical instability problems associated with the Kalman filters. Three extended Kalman filters have been formulated and tested by simulation. The filters have been used to refine a much simplified model sequentially and to estimate parameters such as the unmeasured feed composition using information from the column simulation. It is first assumed that corrupted tray composition measurements are made available to the filter; corrupted tray temperature measurements are then accessed instead. The simulation results have demonstrated the powerful capability of the Kalman filters to overcome the typical hardware problems associated with the operation of on-line analyzers in relation to distillation dynamics and control by, in effect, replacing them. A method of implementing estimator-aided feedforward (EAFF) control schemes has been proposed and tested by simulation on binary distillation.
The results have shown that the EAFF scheme provides much better control and energy conservation than the conventional feedback temperature control in the face of a sustained step change in the feed rate or multiple changes in the feed rate, composition and temperature. Further extensions of this work are recommended as regards simulation, estimation and EAFF control.
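
The Kalman filters described above follow the standard predict/update cycle. A minimal scalar (one-state) linear version in Python, hedged as a simplification of the extended filters formulated in the thesis, with illustrative noise covariances:

```python
def kalman_filter(measurements, a=1.0, h=1.0, q=1e-4, r=0.04, x0=0.0, p0=1.0):
    """Scalar Kalman filter: state model x' = a*x, measurement z = h*x.
    q and r are process and measurement noise variances (illustrative)."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # predict
        x = a * x
        p = a * p * a + q
        # update with the measurement innovation
        k = p * h / (h * p * h + r)
        x = x + k * (z - h * x)
        p = (1.0 - k * h) * p
        estimates.append(x)
    return estimates
```

The extended form replaces `a` and `h` with Jacobians of the nonlinear state and measurement functions re-evaluated at each step, which is where the instability problems the thesis addresses arise.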

Relevance:

30.00%

Publisher:

Abstract:

Control and governance theories recognize that exchange partners are subject to two general forms of control, the unilateral authority of one firm and bilateral expectations extending from their social bond. In this way, a supplier both exerts unilateral, authority-based controls and is subject to socially-based, bilateral controls as it attempts to manage its brand successfully through reseller channels. Such control is being challenged by suppliers’ growing relative dependence on increasingly dominant resellers in many industries. Yet the impact of supplier relative dependence on the efficacy of control-based governance in the supplier’s channel is not well understood. To address this gap, we specify and test a control model moderated by relative dependence involving the conceptualization and measurement of governance at the level of specific control processes: incenting, monitoring, and enforcing. Our empirical findings show relative dependence undercuts the effectiveness of certain unilateral and bilateral control processes while enhancing the effectiveness of others, largely supporting our dual suppositions that each control process operates through a specialized behavioral mechanism and that these underlying mechanisms are differentially impacted by relative dependence. We offer implications of these findings for managers and identify our contributions to channel theory and research.

Relevance:

30.00%

Publisher:

Abstract:

The automatic interpolation of environmental monitoring network data, such as air quality or radiation levels, in a real-time setting poses a number of practical and theoretical questions. Among the problems found are (i) dealing with and communicating the uncertainty of predictions, (ii) automatic (hyper)parameter estimation, (iii) monitoring network heterogeneity, (iv) dealing with outlying extremes, and (v) quality control. In this paper we discuss these issues in light of the spatial interpolation comparison exercise held in 2004.
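
Among automatic interpolation methods, inverse-distance weighting is the simplest parameter-light baseline; a minimal sketch in Python (illustrative only; exercises of this kind also compare kriging-type methods, which additionally deliver the prediction uncertainty discussed in point (i)):

```python
def idw(stations, values, x, y, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from monitoring
    stations given as (sx, sy) coordinates with observed values."""
    num = den = 0.0
    for (sx, sy), v in zip(stations, values):
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return v  # exact interpolator at a station location
        w = d2 ** (-power / 2.0)  # weight = 1 / distance**power
        num += w * v
        den += w
    return num / den
```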

Relevance:

30.00%

Publisher:

Abstract:

One of the major problems associated with communication via a loudspeaking telephone (LST) is that, using analogue processing, duplex transmission is limited to low-loss lines and produces a low acoustic output. An architecture for an instrument has been developed and tested which uses digital signal processing to provide duplex transmission between a LST and a telephone handset over most of the B.T. network. Digital adaptive filters are used in the duplex LST to cancel coupling between the loudspeaker and microphone, and across the transmit-to-receive paths of the 2-to-4-wire converter. Normal movement of a person in the acoustic path causes a loss of stability by increasing the level of coupling from the loudspeaker to the microphone, since there is a lag associated with the adaptive filters learning about a non-stationary path. Control of the loop stability and of the level of sidetone heard by the handset user is by a microprocessor, which continually monitors the system and regulates the gain. The result is a system which offers the best compromise available based on a set of measured parameters. A theory has been developed which gives the loop stability requirements based on the error between the parameters of the filter and those of the unknown path. The programme to develop a low-cost adaptive filter for the LST produced a unique architecture which has a number of features not available in any similar system. These include automatic compensation for the rate of adaptation over a 36 dB range of output level, 4 rates of adaptation (with a maximum of 465 dB/s), plus the ability to cascade up to 4 filters without loss of performance. A complete theory has been developed to determine the adaptation which can be achieved using finite-precision arithmetic. This enabled the development of an architecture which distributed the normalisation required to achieve the optimum rate of adaptation over the useful input range.
Comparison of theory and measurement for the adaptive filter shows very close agreement. A single experimental LST was built and tested on connections to handset telephones over the BT network. The LST demonstrated that duplex transmission was feasible using signal processing and produced a more comfortable means of communication between people than methods employing deep voice-switching to regulate the local-loop gain. It is not, however, a panacea at the current level of processing power, and attention must be directed toward the physical acoustic isolation between the loudspeaker and microphone.

Relevance:

30.00%

Publisher:

Abstract:

Flow control in computer communication systems is generally a multi-layered structure, consisting of several mechanisms operating independently at different levels. Evaluation of the performance of networks in which different flow control mechanisms act simultaneously is an important area of research, and is examined in depth in this thesis. This thesis presents the modelling of a finite-resource computer communication network equipped with three levels of flow control, based on closed queueing network theory. The flow control mechanisms considered are: end-to-end control of virtual circuits, network access control of external messages at the entry nodes, and hop-level control between nodes. The model is solved by a heuristic technique, based on an equivalent reduced network and heuristic extensions to the mean value analysis algorithm. The method has significant computational advantages and overcomes the limitations of the exact methods. It can be used to solve large network models with finite buffers and many virtual circuits. The model and its heuristic solution are validated by simulation. The interaction between the three levels of flow control is investigated. A queueing model is developed for the admission delay on virtual circuits with end-to-end control, in which messages arrive from independent Poisson sources. The selection of the optimum window limit is considered. Several advanced network access schemes are postulated to improve the network performance as well as that of selected traffic streams, and numerical results are presented. A model for the dynamic control of input traffic is developed. Based on Markov decision theory, an optimal control policy is formulated. Numerical results are given, and throughput-delay performance is shown to be better with dynamic control than with static control.
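
The mean value analysis (MVA) algorithm underlying the heuristic solution computes throughput and queue lengths recursively over the customer population via the arrival theorem and Little's law. A minimal exact-MVA sketch for a single-class, product-form network of single-server queues (Python; the thesis extends this heuristically to finite buffers and multiple control levels):

```python
def mva(service_demands, n_customers):
    """Exact MVA for a closed product-form network of single-server
    queues; returns system throughput and mean queue lengths."""
    k = len(service_demands)
    q = [0.0] * k       # mean queue lengths at population n - 1
    x = 0.0
    for n in range(1, n_customers + 1):
        # arrival theorem: residence time seen by an arriving customer
        r = [d * (1.0 + q[i]) for i, d in enumerate(service_demands)]
        x = n / sum(r)              # system throughput
        q = [x * ri for ri in r]    # Little's law per station
    return x, q
```

For a single station with demand 1.0, the bottleneck saturates and throughput stays at 1.0 while the queue absorbs all customers, which is the behaviour window-based flow control exploits.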

Relevance:

30.00%

Publisher:

Abstract:

The rapid developments in computer technology have resulted in a widespread use of discrete event dynamic systems (DEDSs). This type of system is complex because it exhibits properties such as concurrency, conflict and non-determinism. It is therefore important to model and analyse such systems before implementation to ensure safe, deadlock-free and optimal operation. This thesis investigates current modelling techniques and describes Petri net theory in more detail. It reviews top-down, bottom-up and hybrid Petri net synthesis techniques that are used to model large systems and introduces an object-oriented methodology to enable modelling of larger and more complex systems. Designs obtained by this methodology are modular, easy to understand and allow re-use of designs. Control is the next logical step in the design process. This thesis reviews recent developments in the control of DEDSs and investigates the use of Petri nets in the design of supervisory controllers. The scheduling of exclusive use of resources is investigated, an efficient Petri net based scheduling algorithm is designed, and a re-configurable controller is proposed. To enable the analysis and control of large and complex DEDSs, an object-oriented C++ software tool kit was developed and used to implement a Petri net analysis tool and Petri net scheduling and control algorithms. Finally, the methodology was applied to two industrial DEDSs: a prototype can sorting machine developed by Eurotherm Controls Ltd., and a semiconductor testing plant belonging to SGS Thomson Microelectronics Ltd.
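
A Petri net executes by firing enabled transitions, consuming tokens from input places and producing them in output places; this is what makes concurrency and conflict explicit in the model. A minimal sketch in Python (markings as place-to-token-count dicts; the resource-allocation transition is illustrative, and the thesis toolkit was in C++):

```python
def enabled(marking, pre):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Fire one transition: consume pre-set tokens, produce post-set tokens."""
    assert enabled(marking, pre), "transition not enabled"
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m
```

Exclusive use of a resource is modelled by a place holding a single token: once one transition consumes it, competing transitions are disabled until the token is returned.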

Relevance:

30.00%

Publisher:

Abstract:

Distributed digital control systems provide alternatives to conventional, centralised digital control systems. Typically, a modern distributed control system will comprise a multi-processor or network of processors, a communications network, an associated set of sensors and actuators, and the systems and applications software. This thesis addresses the problem of how to design robust decentralised control systems, such as those used to control event-driven, real-time processes in time-critical environments. Emphasis is placed on studying the dynamical behaviour of a system and identifying ways of partitioning the system so that it may be controlled in a distributed manner. A structural partitioning technique is adopted which makes use of natural physical sub-processes in the system, which are then mapped into the software processes to control the system. However, communications are required between the processes because of the disjoint nature of the distributed (i.e. partitioned) state of the physical system. The structural partitioning technique, and recent developments in the theory of potential controllability and observability of a system, are the basis for the design of controllers. In particular, the method is used to derive a decentralised estimate of the state vector for a continuous-time system. The work is also extended to derive a distributed estimate for a discrete-time system. Emphasis is also given to the role of communications in the distributed control of processes and to the partitioning technique necessary to design distributed and decentralised systems with resilient structures. A method is presented for the systematic identification of the communications necessary for distributed control. It is also shown that the structural partitions can be used directly in the design of software fault-tolerant concurrent controllers.
In particular, the structural partition can be used to identify the boundary of the conversation which can be used to protect a specific part of the system. In addition, for certain classes of system, the partitions can be used to identify processes which may be dynamically reconfigured in the event of a fault. These methods should be of use in the design of robust distributed systems.
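
The state estimates discussed above build on observer theory: an observer runs a copy of the system model and corrects it with the measured-output innovation. A minimal centralised Luenberger observer sketch for a two-state discrete-time system in Python (matrices illustrative; the thesis derives the partitioned, decentralised form from the structural partitions):

```python
def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def observer_step(xhat, u, y, A, B, C, L_gain):
    """One discrete-time Luenberger observer update:
       xhat' = A xhat + B u + L (y - C xhat)."""
    innov = y - sum(c * xi for c, xi in zip(C, xhat))  # output innovation
    Ax = matvec(A, xhat)
    return [Ax[i] + B[i] * u + L_gain[i] * innov for i in range(len(xhat))]
```

The estimation error obeys e' = (A - LC) e, so choosing L to make A - LC stable drives the estimate to the true state regardless of the input; observability of (A, C) is what guarantees such an L exists.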