158 results for Stable Autoregressive Models
Abstract:
This paper concerns the development of a stable model predictive controller (MPC) to be integrated with real-time optimization (RTO) in the control structure of a process system with stable and integrating outputs. The real-time process optimizer produces optimal targets for the system inputs and outputs that should be dynamically implemented by the MPC controller. This paper is based on a previous work (Comput. Chem. Eng. 2005, 29, 1089) where a nominally stable MPC was proposed for systems with the conventional control approach, where only the outputs have set points. This work is also based on the work of Gonzalez et al. (J. Process Control 2009, 19, 110), where the zone control of stable systems is studied. The new control law is obtained by defining an extended control objective that includes input targets and zone control of the outputs. Additional decision variables are also defined to increase the set of feasible solutions to the control problem. The hard constraints resulting from the cancellation of the integrating modes at the end of the control horizon are softened, and the resulting control problem is made feasible for a large class of unknown disturbances and changes of the optimizing targets. The methods are illustrated with the simulated application of the proposed approaches to a distillation column of the oil refining industry.
Abstract:
This work presents an alternative way to formulate the stable Model Predictive Control (MPC) optimization problem that allows the enlargement of the domain of attraction while preserving the controller performance. Based on the dual MPC that uses the null local controller, the inclusion of an appropriate set of slacked terminal constraints into the control problem is proposed. As a result, the domain of attraction is unlimited for the stable modes of the system, and the largest possible for the non-stable modes. Although this controller does not achieve local optimality, simulations show that the input and output performances may be comparable to the ones obtained with the dual MPC that uses the LQR as a local controller. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
Several MPC applications implement a control strategy in which some of the system outputs are controlled within specified ranges or zones, rather than at fixed set points [J.M. Maciejowski, Predictive Control with Constraints, Prentice Hall, New Jersey, 2002]. This means that these outputs will be treated as controlled variables only when the predicted future values lie outside the boundary of their corresponding zones. The zone control is usually implemented by selecting an appropriate weighting matrix for the output error in the control cost function. When an output prediction is inside its zone, the corresponding weight is zeroed, so that the controller ignores this output. When the output prediction lies outside the zone, the error weight is made equal to a specified value and the distance between the output prediction and the boundary of the zone is minimized. The main problem of this approach, as far as closed-loop stability is concerned, is that each time an output is switched from the status of non-controlled to the status of controlled, or vice versa, a different linear controller is activated. Thus, throughout the continuous operation of the process, the control system keeps switching from one controller to another. Even if a stabilizing control law is developed for each of the control configurations, switching among stable controllers does not necessarily produce a stable closed-loop system. Here, a stable MPC is developed for the zone control of open-loop stable systems. Focusing on the practical application of the proposed controller, it is assumed that in the control structure of the process system there is an upper optimization layer that defines optimal targets for the system inputs. The performance of the proposed strategy is illustrated by simulation of a subsystem of an industrial FCC system. (C) 2008 Elsevier Ltd. All rights reserved.
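The weight-switching logic described in this abstract can be sketched in a few lines; the function and variable names below are illustrative, not taken from the paper:

```python
def zone_error(y_pred, lo, hi):
    """Distance from a predicted output to its control zone.

    Inside the zone the error is zero, so the controller ignores this
    output; outside, the distance to the nearest zone boundary is what
    the MPC cost penalizes.
    """
    if y_pred < lo:
        return lo - y_pred
    if y_pred > hi:
        return y_pred - hi
    return 0.0

def zone_cost(predictions, zones, weight=1.0):
    """Quadratic zone-control term of an MPC objective (illustrative)."""
    return sum(weight * zone_error(y, lo, hi) ** 2
               for y, (lo, hi) in zip(predictions, zones))
```

The switching the abstract warns about is visible here: as a prediction crosses a zone boundary, the effective output weight jumps between zero and `weight`, changing the active linear controller.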
Abstract:
In order to model the synchronization of brain signals, a three-node fully-connected network is presented. The nodes are considered to be voltage-controlled oscillator neurons (VCON), allowing one to conjecture about how the whole process depends on synaptic gains, free-running frequencies and delays. The VCON, represented by phase-locked loops (PLL), are fully connected and, as a consequence, an asymptotically stable synchronous state appears. Here, an expression for the synchronous state frequency is derived and the parameter dependence of its stability is discussed. Numerical simulations are performed, providing conditions for the use of the derived formulae. The model differential equations are hard to treat analytically, but some simplifying assumptions combined with simulations provide an alternative formulation for the long-term behavior of the fully-connected VCON network. Regarding this kind of network as a model for brain frequency signal processing, with each PLL representing a neuron (VCON), conditions for their synchronization are proposed, considering the different bands of brain activity signals and relating them to synaptic gains, delays and free-running frequencies. For the delta waves, the synchronous state depends strongly on the delays. However, for alpha, beta and theta waves, the free-running individual frequencies determine the synchronous state. (C) 2011 Elsevier B.V. All rights reserved.
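The long-term behavior described above can be illustrated with a Kuramoto-type phase model. This is a simplified stand-in for the paper's VCON/PLL dynamics (no delays, illustrative gains and frequencies), showing that a fully-connected trio of oscillators locks to a common frequency:

```python
import math

def simulate_plls(omega, gain, dt=1e-3, steps=20000):
    """Euler-integrate a fully connected network of phase oscillators.

    d(phi_i)/dt = omega_i + gain * sum_j sin(phi_j - phi_i)

    This Kuramoto-type coupling is only a stand-in for the paper's
    VCON/PLL phase dynamics, which also include delays.
    """
    n = len(omega)
    phi = [0.0] * n
    for _ in range(steps):
        dphi = [omega[i] + gain * sum(math.sin(phi[j] - phi[i])
                                      for j in range(n))
                for i in range(n)]
        phi = [p + dt * d for p, d in zip(phi, dphi)]
    return dphi  # instantaneous frequencies; near-equal once synchronized

# Three nodes with free-running frequencies spread around 10 rad/s.
freqs = simulate_plls([9.5, 10.0, 10.5], gain=2.0)
```

With symmetric coupling the synchronous frequency is the mean of the free-running frequencies, consistent with the abstract's claim that for some bands the individual frequencies determine the synchronous state.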
Abstract:
The ideal conditions for the operation of tandem cold mills are connected to a set of references generated by models and used by dynamic regulators. Aiming at the optimization of the friction and yield stress coefficients, an adaptation algorithm is proposed in this paper. Experimental results obtained from an industrial cold rolling mill are presented. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
This letter addresses the optimization and complexity reduction of switch-reconfigured antennas. A new optimization technique based on graph models is investigated. This technique is used to minimize the redundancy in a reconfigurable antenna structure and reduce its complexity. A graph modeling rule for switch-reconfigured antennas is proposed, and examples are presented.
Abstract:
Distribution of timing signals is an essential factor for the development of digital systems for telecommunication networks, integrated circuits and manufacturing automation. Originally, this distribution was implemented by using the master-slave architecture, with a precise master clock generator sending signals to phase-locked loops (PLL) working as slave oscillators. Nowadays, wireless networks with dynamical connectivity and the increase in size and operation frequency of integrated circuits suggest that the distribution of clock signals could be more efficient if mutually connected architectures were used. Here, mutually connected PLL networks are studied and conditions for the existence of synchronous states are analytically derived, depending on individual node parameters and network connectivity, considering that the nodes are nonlinear oscillators with nonlinear coupling conditions. An expression for the network synchronisation frequency is obtained. The lock-in range and the transmission error bounds are analysed, providing hints for the design of this kind of clock distribution system.
Abstract:
Eight different models to represent the effect of friction in control valves are presented: four models based on physical principles and four empirical ones. The physical models, both static and dynamic, have the same structure. The models are implemented in Simulink/Matlab (R) and compared, using different friction coefficients and input signals. Three of the models were able to reproduce the stick-slip phenomenon and passed all the tests, which were applied following ISA standards. (C) 2008 Elsevier Ltd. All rights reserved.
Abstract:
This paper presents two strategies for the upgrade of set-up generation systems for tandem cold mills. Even though these mills have been modernized mainly to meet quality requirements, their upgrades may also aim to replace pre-calculated reference tables. In this case, the Bryant and Osborn mill model without an adaptive technique is proposed. As a more demanding modernization, the Bland and Ford model including adaptation is recommended, although it requires more complex computational hardware. Advantages and disadvantages of these two systems are compared and discussed, and experimental results obtained from an industrial cold mill are shown.
Abstract:
Computer viruses are an important risk to computational systems, endangering both corporations of all sizes and personal computers used for domestic applications. Here, classical epidemiological models for disease propagation are adapted to computer networks and, by using simple systems identification techniques, a model called SAIC (Susceptible, Antidotal, Infectious, Contaminated) is developed. Real data about computer viruses are used to validate the model. (C) 2008 Elsevier Ltd. All rights reserved.
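A minimal sketch of a four-compartment model in the spirit of SAIC is given below. The flow terms and rates are assumed purely for illustration of the classical epidemiological template the paper adapts; they are not the paper's actual equations:

```python
def step(S, A, I, C, beta, alpha, delta, dt):
    """One Euler step of an illustrative 4-compartment model.

    Assumed flows (not from the paper):
      S -> I at rate beta*S*I  (susceptible machines get infected)
      S -> A at rate alpha*S   (machines receive antidotal software)
      I -> C at rate delta*I   (infected machines become contaminated)
    """
    dS = -beta * S * I - alpha * S
    dA = alpha * S
    dI = beta * S * I - delta * I
    dC = delta * I
    return S + dt * dS, A + dt * dA, I + dt * dI, C + dt * dC

# Fractions of the network in each compartment; start nearly all susceptible.
state = (0.99, 0.0, 0.01, 0.0)
for _ in range(10000):
    state = step(*state, beta=0.5, alpha=0.01, delta=0.1, dt=0.01)
```

Because every outflow appears as an inflow elsewhere, the population fractions always sum to one, which is the basic sanity check for any compartmental model of this kind.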
Abstract:
In this paper, we compare three residuals to assess departures from the error assumptions as well as to detect outlying observations in log-Burr XII regression models with censored observations. These residuals can also be used for the log-logistic regression model, which is a special case of the log-Burr XII regression model. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed and the empirical distribution of each residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to the modified martingale-type residual in log-Burr XII regression models with censored data.
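For orientation, the standard martingale and deviance residual forms for censored survival data can be sketched as below; the paper's modified martingale-type residual for the log-Burr XII model may differ in detail, so treat this as the generic construction only:

```python
import math

def martingale_residual(t, delta, survival):
    """Martingale-type residual for one (possibly censored) observation.

    r_M = delta + log S(t), where delta is 1 for an observed failure,
    0 for a right-censored time, and `survival` is the fitted survival
    function (any callable returning S(t)).
    """
    return delta + math.log(survival(t))

def deviance_residual(t, delta, survival):
    """Deviance residual built from the martingale residual:
    r_D = sign(r_M) * sqrt(-2 * [r_M + delta * log(delta - r_M)]).
    """
    rm = martingale_residual(t, delta, survival)
    sign = 1.0 if rm >= 0 else -1.0
    inner = -2.0 * (rm + (delta * math.log(delta - rm) if delta else 0.0))
    return sign * math.sqrt(inner)
```

Plotting such residuals against a standard normal reference is exactly the kind of empirical comparison the simulation studies in the abstract perform.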
Abstract:
Mixed models have become important in analyzing the results of experiments, particularly those that require more complicated models (e.g., those that involve longitudinal data). This article describes a method for deriving the terms in a mixed model. Our approach extends an earlier method by Brien and Bailey to explicitly identify terms for which autocorrelation and smooth trend arising from longitudinal observations need to be incorporated in the model. At the same time we retain the principle that the model used should include, at least, all the terms that are justified by the randomization. This is done by dividing the factors into sets, called tiers, based on the randomization and determining the crossing and nesting relationships between factors. The method is applied to formulate mixed models for a wide range of examples. We also describe the mixed model analysis of data from a three-phase experiment to investigate the effect of time of refinement on Eucalyptus pulp from four different sources. Cubic smoothing splines are used to describe differences in the trend over time and unstructured covariance matrices between times are found to be necessary.
Abstract:
In this paper, we present various diagnostic methods for polyhazard models. Polyhazard models are a flexible family for fitting lifetime data. Their main advantage over single-hazard models, such as the Weibull and the log-logistic models, is to include a large amount of nonmonotone hazard shapes, such as bathtub and multimodal curves. Some influence methods, such as the local influence and the total local influence of an individual, are derived, analyzed and discussed. A discussion of the computation of the likelihood displacement as well as the normal curvature in the local influence method is presented. Finally, an example with real data is given for illustration.
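The defining feature of a polyhazard model, that the overall hazard is the sum of component hazards, can be sketched directly; the component families and parameters below are illustrative:

```python
import math

def weibull_hazard(t, shape, scale):
    """Weibull hazard: h(t) = (shape/scale) * (t/scale)**(shape - 1)."""
    return (shape / scale) * (t / scale) ** (shape - 1)

def loglogistic_hazard(t, shape, scale):
    """Log-logistic hazard:
    h(t) = (shape/t) * x / (1 + x), with x = (t/scale)**shape.
    """
    x = (t / scale) ** shape
    return (shape / t) * x / (1 + x)

def polyhazard(t, hazards):
    """A polyhazard rate is the sum of its component hazard rates.

    Mixing a decreasing component (e.g. Weibull with shape < 1) and an
    increasing one (shape > 1) yields the bathtub shapes the paper
    mentions; other mixes can produce multimodal hazards.
    """
    return sum(h(t) for h in hazards)
```

For example, summing a Weibull hazard with shape 0.5 and one with shape 3 gives a hazard that is high for small t, dips, and rises again, i.e. a bathtub curve.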
Abstract:
This paper proposes a regression model considering the modified Weibull distribution. This distribution can be used to model bathtub-shaped failure rate functions. Assuming censored data, we consider maximum likelihood and jackknife estimators for the parameters of the model. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes, and we also present some ways to perform global influence analysis. In addition, for different parameter settings, sample sizes and censoring percentages, various simulations are performed, and the empirical distribution of the modified deviance residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to a martingale-type residual in log-modified Weibull regression models with censored data. Finally, we analyze a real data set under log-modified Weibull regression models. A diagnostic analysis and a model checking based on the modified deviance residual are performed to select appropriate models. (C) 2008 Elsevier B.V. All rights reserved.
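One common "modified Weibull" form (the Lai-Xie-Murthy three-parameter version) makes the bathtub claim concrete: its hazard is bathtub-shaped when the shape parameter is below one and the exponential rate is positive. The sketch below assumes that parameterization, which may not match the paper's exactly:

```python
import math

def modified_weibull_survival(t, a, b, lam):
    """Survival function S(t) = exp(-a * t**b * exp(lam * t))
    (Lai-Xie-Murthy modified Weibull; a > 0, b > 0, lam >= 0)."""
    return math.exp(-a * t ** b * math.exp(lam * t))

def modified_weibull_hazard(t, a, b, lam):
    """Hazard h(t) = a * (b + lam * t) * t**(b - 1) * exp(lam * t).

    For b < 1 and lam > 0 this decreases at small t and increases at
    large t, i.e. a bathtub-shaped failure rate.
    """
    return a * (b + lam * t) * t ** (b - 1) * math.exp(lam * t)
```

Taking logs of lifetimes from this distribution yields the log-modified Weibull regression setting the abstract works in.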
Abstract:
The zero-inflated negative binomial model is used to account for overdispersion detected in data that are initially analyzed under the zero-inflated Poisson model. A frequentist analysis, a jackknife estimator and a non-parametric bootstrap for parameter estimation of zero-inflated negative binomial regression models are considered. In addition, an EM-type algorithm is developed for performing maximum likelihood estimation. Then, the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes, as well as some ways to perform global influence analysis, are derived. In order to study departures from the error assumption as well as the presence of outliers, residual analysis based on the standardized Pearson residuals is discussed. The relevance of the approach is illustrated with a real data set, where it is shown that zero-inflated negative binomial regression models seem to fit the data better than the Poisson counterpart. (C) 2010 Elsevier B.V. All rights reserved.
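The zero-inflated negative binomial probability mass function itself can be written down directly. This is a generic sketch of the distribution the abstract fits (mean/dispersion parameterization assumed), not the paper's estimation code:

```python
import math

def nb_pmf(y, mu, k):
    """Negative binomial pmf with mean mu and dispersion k
    (variance mu + mu**2 / k); computed on the log scale for stability.
    """
    log_p = (math.lgamma(y + k) - math.lgamma(k) - math.lgamma(y + 1)
             + k * math.log(k / (k + mu)) + y * math.log(mu / (k + mu)))
    return math.exp(log_p)

def zinb_pmf(y, pi, mu, k):
    """Zero-inflated negative binomial: an extra point mass pi at zero.

    P(Y=0) = pi + (1 - pi) * NB(0); P(Y=y) = (1 - pi) * NB(y) for y > 0.
    """
    base = nb_pmf(y, mu, k)
    return pi + (1 - pi) * base if y == 0 else (1 - pi) * base
```

The extra mass at zero is what lets the model absorb the excess zeros that make the plain Poisson (and even the plain negative binomial) fit poorly on such data.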