Abstract:
Viscosity represents a key indicator of product quality in polymer extrusion but has traditionally been difficult to measure in-process in real time. An innovative, yet simple, solution to this problem is proposed by a Prediction-Feedback observer mechanism. A `Prediction' model based on the operating conditions generates an open-loop estimate of the melt viscosity; this estimate is used as an input to a second, `Feedback' model to predict the pressure of the system. The pressure value is compared to the actual measured melt pressure and the error is used to correct the viscosity estimate. The Prediction model captures the relationship between the operating conditions and the resulting melt viscosity and as such describes the specific material behavior. The Feedback model, on the other hand, describes the fundamental physical relationship between viscosity and extruder pressure and is a function of the machine geometry. The resulting system yields viscosity estimates within 1% error, shows excellent disturbance rejection properties and can be directly applied to model-based control. This is of major significance to achieving higher quality and reducing waste and set-up times in the polymer extrusion industry.
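As a toy illustration of the observer loop described above, the following Python sketch uses hypothetical Prediction and Feedback models; the power-law and linear forms, and every coefficient, are invented for illustration and are not taken from the paper:

```python
import math

K_GEOMETRY = 0.002  # assumed machine-geometry constant of the Feedback model

def predict_viscosity(screw_speed, barrel_temp):
    """'Prediction' model: open-loop viscosity estimate from operating
    conditions (invented power-law / exponential-temperature form)."""
    return 5000.0 * screw_speed ** -0.4 * math.exp(-0.01 * (barrel_temp - 200.0))

def predict_pressure(viscosity, screw_speed):
    """'Feedback' model: melt pressure as a function of viscosity and the
    machine geometry (invented linear relation)."""
    return K_GEOMETRY * viscosity * screw_speed

def observer_step(eta_hat, p_measured, screw_speed, gain=0.5):
    """Compare predicted and measured pressure; feed the error back to
    correct the viscosity estimate."""
    error = p_measured - predict_pressure(eta_hat, screw_speed)
    return eta_hat + gain * error / (K_GEOMETRY * screw_speed)
```

Because the correction gain rescales the pressure error back into viscosity units, the estimate converges geometrically to the value consistent with the measured pressure.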
Abstract:
Cascade control is one of the routinely used control strategies in industrial processes because it can dramatically improve the performance of single-loop control, reducing both the maximum deviation and the integral error of the disturbance response. Currently, many control performance assessment methods for cascade control loops are developed based on the assumption that all the disturbances follow a Gaussian distribution. In practice, however, several disturbance sources act on the manipulated variable, or the upstream process exhibits nonlinear behavior, so this assumption fails. In this paper, a general and effective index for the performance assessment of cascade control systems subject to disturbances of unknown distribution is proposed. As in minimum variance control (MVC) design, the output variances of the primary and the secondary loops are decomposed into a cascade-invariant and a cascade-dependent term, but the ARMA model for the cascade control loop is estimated based on minimum entropy, instead of minimum mean squared error, to accommodate non-Gaussian disturbances. Unlike the MVC index, an innovative control performance index is given based on information theory and the minimum entropy criterion. The index is informative and in agreement with the expected control knowledge. To demonstrate the wide applicability and effectiveness of the minimum entropy cascade control index, a simulation problem and a cascade control case from an oil refinery are presented. A comparison with MVC-based cascade control is also included.
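A minimal sketch of the entropy-based ranking idea: estimate the entropy of each loop's control error and compare it to a benchmark. The histogram estimator of differential entropy and the exponential-of-entropy index below are chosen here purely for illustration; the paper's estimator is built on an ARMA model of the loop:

```python
import math

def entropy_estimate(errors, bins=30):
    """Histogram estimate of the differential entropy (in nats) of a
    control-error series; a crude stand-in for a model-based estimator."""
    lo, hi = min(errors), max(errors)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for e in errors:
        counts[min(int((e - lo) / width), bins - 1)] += 1
    n = len(errors)
    h = 0.0
    for c in counts:
        if c:
            p = c / n
            h -= p * math.log(p)
    return h + math.log(width)   # discrete entropy + bin-width correction

def entropy_index(h_benchmark, h_actual):
    """Performance index in (0, 1]: 1 means the loop attains the
    minimum-entropy benchmark (index form assumed for illustration)."""
    return math.exp(h_benchmark) / math.exp(h_actual)
```

A loop with a tighter (lower-entropy) error distribution scores closer to 1, mirroring how the minimum-variance index rewards lower output variance.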
Abstract:
Inherently error-resilient applications in areas such as signal processing, machine learning and data analytics provide opportunities for relaxing reliability requirements, and thereby reducing the overhead incurred by conventional error correction schemes. In this paper, we exploit the tolerable imprecision of such applications by designing an energy-efficient fault-mitigation scheme for unreliable data memories to meet a target yield. The proposed approach uses a bit-shuffling mechanism to isolate faults into bit locations with lower significance. This skews the bit-error distribution towards the low-order bits, substantially limiting the output error magnitude. By controlling the granularity of the shuffling, the proposed technique enables trading off quality for power, area, and timing overhead. Compared to error-correction codes, this can reduce the overhead by as much as 83% in read power, 77% in read access time, and 89% in area, when applied to various data mining applications in 28nm process technology.
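The fault-isolation idea can be sketched as a bit permutation that assigns the least significant logical bits to the known faulty cells; the mapping and the stuck-at fault model below are illustrative simplifications, not the paper's hardware scheme:

```python
def shuffle_map(width, faulty):
    """Logical bit index -> physical cell index, placing the least
    significant logical bits into the faulty cells."""
    faulty = set(faulty)
    good = [i for i in range(width) if i not in faulty]
    order = sorted(faulty) + good          # faulty cells get logical LSBs
    return {logical: physical for logical, physical in enumerate(order)}

def write_read(value, width, faulty, stuck=0):
    """Write a word through the shuffle into a memory whose faulty cells
    are stuck at `stuck`, then read it back through the inverse shuffle."""
    m = shuffle_map(width, faulty)
    phys = [0] * width
    for logical in range(width):
        phys[m[logical]] = (value >> logical) & 1
    for f in faulty:                        # faults corrupt physical cells
        phys[f] = stuck
    out = 0
    for logical in range(width):
        out |= phys[m[logical]] << logical
    return out
```

With k faulty cells the absolute output error is bounded by 2^k - 1, whereas without shuffling the same faults could corrupt high-order bits and produce a much larger error.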
Abstract:
Older adults use a different muscle strategy to cope with postural instability, in which they ‘co-contract’ the muscles around the ankle joint. It has been suggested that this is a compensatory response to age-related proprioceptive decline; however, this view has never been assessed directly. The current study investigated the association between proprioceptive acuity and muscle co-contraction in older adults. We compared muscle activity, by recording surface EMG from the bilateral tibialis anterior and gastrocnemius medialis muscles, in young (aged 18-34) and older adults (aged 65-82) during postural assessment on a fixed and sway-referenced surface at age-equivalent levels of sway. We performed correlations between muscle activity and proprioceptive acuity, which was assessed using an active contralateral matching task. Despite successfully inducing similar levels of sway in the two age groups, older adults still showed higher muscle co-contraction. A stepwise regression analysis showed that proprioceptive acuity measured using variable error was the best predictor of muscle co-contraction in older adults. However, contrary to suggestions from previous research, proprioceptive error and muscle co-contraction were negatively correlated in older adults, suggesting that better proprioceptive acuity predicts more co-contraction. Overall, these results suggest that although muscle co-contraction may be an age-specific strategy used by older adults, it does not compensate for age-related proprioceptive deficits.
Abstract:
Wearable devices performing advanced bio-signal analysis algorithms aim to foster a revolution in healthcare provision for chronic cardiac diseases. In this context, energy efficiency is of paramount importance, as long-term monitoring must be ensured while relying on a tiny power source. Operating at a scaled supply voltage, just above the threshold voltage, effectively helps in saving substantial energy, but it makes circuits, and especially memories, more prone to errors, threatening the correct execution of algorithms. The use of error detection and correction codes may help to protect the entire memory content; however, it incurs large area and energy overheads which may not be compatible with the tight energy budgets of wearable systems. To cope with this challenge, in this paper we propose to limit the overhead of traditional schemes by selectively detecting and correcting errors only in data that highly impact the end-to-end quality of service of ultra-low-power wearable electrocardiogram (ECG) devices. This partitioning protects either the significant words or the significant bits of each data element, according to the application characteristics (statistical properties of the data in the application buffers) and their impact in determining the output. The proposed heterogeneous error protection scheme, applied to real ECG signals, allows substantial energy savings (11% in wearable devices) compared to state-of-the-art approaches, like ECC, in which the whole memory is protected against errors. At the same time, it results in negligible output quality degradation in the evaluated power spectrum analysis application of ECG signals.
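One way to realize "significant-bit" protection can be sketched with triple modular redundancy on the top bits only; the paper's scheme uses selective detection/correction codes, so TMR here is merely an illustrative stand-in, and the 8-bit width and 4-bit split are assumptions:

```python
def protect(sample, k=4, width=8):
    """Store the k most significant bits three times (TMR stand-in for a
    selective ECC); the low-order bits are stored once, unprotected."""
    hi = sample >> (width - k)
    lo = sample & ((1 << (width - k)) - 1)
    return [hi, hi, hi, lo]

def recover(stored, k=4, width=8):
    """Reassemble the sample, correcting single-copy upsets in the
    protected high bits by bitwise majority vote."""
    h0, h1, h2, lo = stored
    hi = (h0 & h1) | (h1 & h2) | (h0 & h2)   # bitwise 2-of-3 majority
    return (hi << (width - k)) | lo
```

A fault in one protected copy is corrected outright, while a fault in the unprotected low bits perturbs the sample by at most 2^(width-k) - 1, bounding the output quality loss.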
Abstract:
The fractional calculus of variations and fractional optimal control are generalizations of the corresponding classical theories, that allow problem modeling and formulations with arbitrary order derivatives and integrals. Because of the lack of analytic methods to solve such fractional problems, numerical techniques are developed. Here, we mainly investigate the approximation of fractional operators by means of series of integer-order derivatives and generalized finite differences. We give upper bounds for the error of proposed approximations and study their efficiency. Direct and indirect methods in solving fractional variational problems are studied in detail. Furthermore, optimality conditions are discussed for different types of unconstrained and constrained variational problems and for fractional optimal control problems. The introduced numerical methods are employed to solve some illustrative examples.
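The Grünwald-Letnikov scheme is a standard generalized finite-difference approximation of the kind discussed above; a minimal implementation, first-order accurate in the step size h:

```python
def gl_fractional_derivative(f, alpha, t, h=1e-3):
    """Grünwald-Letnikov approximation of the order-alpha derivative of f
    at t: h**-alpha * sum_k (-1)**k * C(alpha, k) * f(t - k*h),
    with the binomial coefficients built by recurrence."""
    n = int(round(t / h))
    total, w = 0.0, 1.0                 # w_0 = (-1)**0 * C(alpha, 0) = 1
    for k in range(n + 1):
        total += w * f(t - k * h)
        w *= (k - alpha) / (k + 1)      # w_{k+1} from w_k
    return total / h ** alpha
```

For integer alpha the coefficients terminate and the scheme reduces to the usual backward difference, which makes it a convenient sanity check.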
Abstract:
Multilayer perceptrons (MLPs) (1) are the most common artificial neural networks, employed in a large field of applications. In control and signal processing applications, MLPs are mainly used as nonlinear mapping approximators. The most common training algorithm used with MLPs is the error back-propagation (BP) algorithm (1).
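A compact sketch of BP for a one-hidden-layer MLP with squared error, with gradients derived by the chain rule; the network shape and the tanh/linear layer choice are illustrative assumptions:

```python
import math

def mlp_forward(x, W1, b1, W2, b2):
    """One-hidden-layer MLP: tanh hidden units, linear scalar output."""
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    y = sum(w * hi for w, hi in zip(W2, h)) + b2
    return h, y

def bp_gradients(x, target, W1, b1, W2, b2):
    """Error back-propagation for E = 0.5 * (y - target)**2."""
    h, y = mlp_forward(x, W1, b1, W2, b2)
    dy = y - target                                   # dE/dy
    dW2 = [dy * hi for hi in h]
    db2 = dy
    dh = [dy * w * (1 - hi * hi) for w, hi in zip(W2, h)]  # through tanh
    dW1 = [[d * xi for xi in x] for d in dh]
    db1 = dh
    return dW1, db1, dW2, db2
```

A finite-difference check of any single weight against the analytic gradient is the standard way to validate such an implementation.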
Abstract:
This paper presents a new rate-control algorithm for live video streaming over wireless IP networks, which is based on selective frame discarding. In the proposed mechanism excess 'P' frames are dropped from the output queue at the sender using a congestion estimate based on packet loss statistics obtained from RTCP feedback and from the Data Link (DL) layer. The performance of the algorithm is evaluated through computer simulation. This paper also presents a characterisation of packet losses owing to transmission errors and congestion, which can help in choosing appropriate strategies to maximise the video quality experienced by the end user. Copyright © 2007 Inderscience Enterprises Ltd.
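The selective-discard idea can be sketched as follows; the every-other-P drop policy and the 5% loss threshold are illustrative placeholders, not the paper's exact congestion rule derived from RTCP and DL-layer statistics:

```python
def drop_excess_p_frames(queue, loss_fraction, threshold=0.05):
    """When the congestion estimate (here a packet-loss fraction, as would
    be derived from RTCP receiver reports and DL-layer feedback) exceeds
    the threshold, drop excess 'P' frames from the sender's output queue.
    'I' frames, which other frames depend on, are never discarded."""
    if loss_fraction <= threshold:
        return list(queue)
    kept, p_count = [], 0
    for frame in queue:
        if frame == "P":
            p_count += 1
            if p_count % 2 == 1:        # simple policy: drop every other P
                continue
        kept.append(frame)
    return kept
```

Dropping whole P frames at the sender, rather than letting the network drop arbitrary packets, keeps the remaining bitstream decodable.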
Abstract:
Dragonflies show unique flight performance, superior to that of most other insect species and birds. They are equipped with two pairs of independently controlled wings, granting unmatched flying performance and robustness. This paper presents an adaptive scheme for controlling a nonlinear model inspired by a dragonfly-like robot. A hybrid adaptive (HA) law is proposed that adjusts the parameters by analyzing the tracking error. At the current stage of the project, the development of dynamics-based computational simulation models is considered essential for testing control strategies and algorithms, parts of the system (such as different wing configurations or the tail), as well as the complete system. The performance analysis proves the superiority of the HA law over the direct adaptive (DA) method in terms of faster and improved tracking and parameter convergence.
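As a stand-in for the adaptive laws compared in the abstract (the paper's hybrid law is not reproduced here), a minimal direct-gradient, MIT-rule-style adaptation of a feedforward gain driven by the tracking error looks like:

```python
def adapt_gain(reference, plant_gain, gamma=0.05, steps=300):
    """Adapt a feedforward gain theta so that u = theta * r makes the toy
    static plant y = plant_gain * u track the reference r. The update is
    a gradient step on 0.5 * e**2; plant and law are illustrative only."""
    theta = 0.0
    for _ in range(steps):
        u = theta * reference
        y = plant_gain * u
        e = y - reference                         # tracking error
        theta -= gamma * e * plant_gain * reference   # d(0.5 e^2)/d(theta)
    return theta
```

For a stable step size the gain converges to the inverse plant gain, driving the tracking error to zero; hybrid laws add extra terms to speed up exactly this convergence.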
Abstract:
Whereas the role of the anterior cingulate cortex (ACC) in cognitive control has received considerable attention, much less work has been done on the role of the ACC in autonomic regulation. Its connections through the vagus nerve to the sinoatrial node of the heart are thought to exert modulatory control over cardiovascular arousal. Therefore, ACC is not only responsible for the implementation of cognitive control, but also for the dynamic regulation of cardiovascular activity that characterizes healthy heart rate and adaptive behaviour. However, cognitive control and autonomic regulation are rarely examined together. Moreover, those studies that have examined the role of phasic vagal cardiac control in conjunction with cognitive performance have produced mixed results, finding relations for specific age groups and types of tasks but not consistently. So, while autonomic regulatory control appears to support effective cognitive performance under some conditions, it is not presently clear just what factors contribute to these relations. The goal of the present study was, therefore, to examine the relations between autonomic arousal, neural responsivity, and cognitive performance in the context of a task that required ACC support. Participants completed a primary inhibitory control task with a working memory load embedded. Pre-test cardiovascular measures were obtained, and on-task ERPs associated with response control (N2/P3) and error-related processes (ERN/Pe) were analyzed. Results indicated that response inhibition was unrelated to phasic vagal cardiac control, as indexed by respiratory sinus arrhythmia (RSA). However, higher resting RSA was associated with larger ERN amplitude for the highest working memory load condition. This finding suggests that those individuals with greater autonomic regulatory control exhibited more robust ACC error-related responses on the most challenging task condition.
On the other hand, exploratory analyses with rate pressure product (RPP), a measure of sympathetic arousal, indicated that higher pre-test RPP (i.e., more sympathetic influence) was associated with more errors on "catch" NoGo trials, i.e., NoGo trials that immediately followed other NoGo trials and consequently required enhanced response control. Higher pre-test RPP was also associated with smaller amplitude ERNs for all three working memory loads and smaller amplitude P3s for the low and medium working memory load conditions. Thus, higher pre-test sympathetic arousal was associated with poorer performance on more demanding "catch" NoGo trials and less robust ACC-related electrocortical responses. The findings from the present study highlight the interdependence of electrocortical and cardiovascular processes. While higher pre-test parasympathetic control seemed to relate to more robust ACC error-related responses, higher pre-test sympathetic arousal resulted in poorer inhibitory control performance and smaller ACC-generated electrocortical responses. Furthermore, these results provide a base from which to explore the relation between ACC and neuro/cardiac responses in older adults who may display greater variance due to the vulnerability of these systems to the normal aging process.
Abstract:
In studies of cognitive processing, the allocation of attention has been consistently linked to subtle, phasic adjustments in autonomic control. Both autonomic control of heart rate and control of the allocation of attention are known to decline with age. It is not known, however, whether characteristic individual differences in autonomic control and the ability to control attention are closely linked. To test this, a measure of parasympathetic function, vagal tone (VT), was computed from cardiac recordings from older and younger adults taken before and during performance of two attention-demanding tasks - the Eriksen visual flanker task and the source memory task. Both tasks elicited event-related potentials (ERPs) that accompany errors, i.e., error-related negativities (ERNs) and error positivities (Pe's). The ERN is a negative deflection in the ERP signal, time-locked to responses made on incorrect trials, likely generated in the anterior cingulate. It is followed immediately by the Pe, a broad, positive deflection which may reflect conscious awareness of having committed an error. Age-attenuation of ERN amplitude has previously been found in paradigms with simple stimulus-response mappings, such as the flanker task, but has rarely been examined in more complex, conceptual tasks. Until now, there have been no reports of its being investigated in a source monitoring task. Age-attenuation of the ERN component was observed in both tasks. Results also indicated that the ERNs generated in these two tasks were generally comparable for young adults. For older adults, however, the ERN from the source monitoring task was not only shallower, but incorporated more frontal processing, apparently reflecting task demands. The error positivities elicited by the two tasks were not comparable, however, and age-attenuation of the Pe was seen only in the more perceptual flanker task.
For younger adults, it was Pe scalp topography that seemed to reflect task demands, being maximal over central parietal areas in the flanker task, but over very frontal areas in the source monitoring task. With respect to vagal tone, in the flanker task, neither the number of errors nor ERP amplitudes were predicted by baseline or on-task vagal tone measures. However, in the more difficult source memory task, lower VT was marginally associated with greater numbers of source memory errors in the older group. Thus, for older adults, relatively low levels of parasympathetic control over cardiac response coincided with poorer source memory discrimination. In both groups, lower levels of baseline VT were associated with larger amplitude ERNs, and smaller amplitude Pe's. Thus, low VT was associated in a conceptual task with a greater "emergency response" to errors, and at the same time, reduced awareness of having made them. The efficiency of an individual's complex cognitive processing was therefore associated with the flexibility of parasympathetic control of heart rate, in response to a cognitively challenging task.
Abstract:
This thesis tested a model of neurovisceral integration (Thayer & Lane, 2001) wherein parasympathetic autonomic regulation is considered to play a central role in cognitive control. We asked whether respiratory sinus arrhythmia (RSA), a parasympathetic index, and cardiac workload (rate pressure product, RPP) would influence cognition and whether this would change with age. Cognitive control was measured behaviourally and electrophysiologically through the error-related negativity (ERN) and error positivity (Pe). The ERN and Pe are thought to be generated by the anterior cingulate cortex (ACC), a region involved in regulating cognitive and autonomic control and susceptible to age-related change. In Study 1, older and younger adults completed a working memory Go/NoGo task. Although RSA did not relate to performance, higher pre-task RPP was associated with poorer NoGo performance among older adults. Relations between ERN/Pe and accuracy were indirect and more evident in younger adults. Thus, Study 1 supported the link between cognition and autonomic activity, specifically, cardiac workload in older adults. In Study 2, we included younger adults and manipulated a Stroop task to clarify conditions under which associations between RSA and performance will likely emerge. We varied task parameters to allow for proactive versus reactive strategies, and motivation was increased via financial incentive. Pre-task RSA predicted accuracy when response contingencies required maintenance of a specific item in memory. Thus, RSA was most relevant when performance required proactive control, a metabolically costly strategy that would presumably be more reliant on autonomic flexibility. In Study 3, we included older adults and examined RSA and proactive control in an additive factors framework. We maintained the incentive and measured fitness. Higher pre-task RSA among older adults was associated with greater accuracy when proactive control was needed most. 
Conversely, performance of young women was consistently associated with fitness. Relations between ERN/Pe and accuracy were modest; however, isolating ACC activity via independent component analysis allowed for more associations with accuracy to emerge in younger adults. Thus, performance in both groups appeared to be differentially dependent on RSA and ACC activation. Altogether, these data are consistent with a neurovisceral integration model in the context of cognitive control.
Abstract:
Embedded systems are usually designed for a single or a specified set of tasks. This specificity means the system design as well as its hardware/software development can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals. This could significantly augment software quality and is still a challenging field.
This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, making it more effective at early detection of software bugs that are otherwise hard to detect, using static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, and thus improve both the debugging process and the quality of the code.
Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate/out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae and their compliance is tested individually in all possible execution paths of the application programs. An incorrect sequence of machine-code patterns is identified using slicing techniques on the control flow graph generated from the machine code.
An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on optimum data allocation to banked memory, resulting in a minimum number of bank-switching instructions in embedded system software.
A relation matrix and a state transition diagram, formed for the active memory bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler/assembler, applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code patterns, which drastically reduces state-space creation, contributing to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture oriented, and hence the feasibility study was conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards the correct use of difficult microcontroller features in developing embedded systems.
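The redundant bank-switching detection can be sketched as a scan that tracks the active-bank state, in the spirit of the state transition diagram described above; the `BANKSEL`-style mnemonic is illustrative of PIC16-family assembly, not the dissertation's exact representation:

```python
def find_redundant_bank_selects(instructions):
    """Scan a machine-code listing and flag bank-select instructions that
    re-select the already-active bank: they cause no state transition and
    are therefore removable. Instruction format is a simplification."""
    bank, redundant = None, []
    for idx, ins in enumerate(instructions):
        if ins.startswith("BANKSEL"):
            target = ins.split()[1]
            if target == bank:
                redundant.append(idx)   # no bank change: redundant code
            bank = target
    return redundant
```

A compiler pass (or post-link optimizer) can delete the flagged instructions, which is exactly the saving the bank-allocation algorithm aims to maximize.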
Abstract:
A study of the introduction of automatic control in the production of a pharmaceutical company, specifically on six machine lines. Beyond monitoring, the aim is to improve production while detecting any error or anomaly that occurs in these machines.
Abstract:
Two wavelet-based control variable transform schemes are described and are used to model some important features of forecast error statistics for use in variational data assimilation. The first is a conventional wavelet scheme and the other is an approximation of it. Their ability to capture the position and scale-dependent aspects of covariance structures is tested in a two-dimensional latitude-height context. This is done by comparing the covariance structures implied by the wavelet schemes with those found from the explicit forecast error covariance matrix, and with a non-wavelet-based covariance scheme used currently in an operational assimilation scheme. Qualitatively, the wavelet-based schemes show potential at modeling forecast error statistics well without giving preference to either position or scale-dependent aspects. The degree of spectral representation can be controlled by changing the number of spectral bands in the schemes, and the least number of bands that achieves adequate results is found for the model domain used. Evidence is found of a trade-off between the localization of features in positional and spectral spaces when the number of bands is changed. By examining implied covariance diagnostics, the wavelet-based schemes are found, on the whole, to give results that are closer to diagnostics found from the explicit matrix than from the non-wavelet scheme. Even though the nature of the covariances has the right qualities in spectral space, variances are found to be too low at some wavenumbers and vertical correlation length scales are found to be too long at most scales. The wavelet schemes are found to be good at resolving variations in position and scale-dependent horizontal length scales, although the length scales reproduced are usually too short. The second of the wavelet-based schemes is often found to be better than the first in some important respects, but, unlike the first, it has no exact inverse transform.
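The core of a wavelet-based covariance model can be sketched in a toy four-point domain: assume the wavelet coefficients are mutually uncorrelated with prescribed variances, so the implied covariance is B = W D W^T for an orthonormal Haar synthesis matrix W. This minimal construction is for illustration only; the schemes above are two-dimensional and band-structured:

```python
def haar_matrix():
    """Orthonormal 4x4 Haar synthesis matrix (columns are the scaling
    function, one coarse wavelet and two fine, position-localized wavelets)."""
    s = 0.5
    r = 2 ** -0.5
    return [
        [s,  s,  r,  0],
        [s,  s, -r,  0],
        [s, -s,  0,  r],
        [s, -s,  0, -r],
    ]

def implied_covariance(coeff_variances):
    """B = W D W^T: the covariance implied by assuming uncorrelated wavelet
    coefficients with the given variances (a control variable transform)."""
    W = haar_matrix()
    n = len(W)
    return [[sum(W[i][k] * coeff_variances[k] * W[j][k] for k in range(n))
             for j in range(n)] for i in range(n)]
```

Because the fine wavelets are position-localized, unequal variances across them yield correlations that vary with position as well as scale, which is precisely the property the schemes are designed to capture.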