183 results for plant models
Abstract:
The ideal conditions for the operation of tandem cold mills are connected to a set of references generated by models and used by dynamic regulators. Aiming at the optimization of the friction and yield stress coefficients, an adaptation algorithm is proposed in this paper. Experimental results obtained from an industrial cold rolling mill are presented. (C) 2008 Elsevier B.V. All rights reserved.
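As an illustration only (the paper's actual adaptation law is not reproduced here), the sketch below shows one common pattern: a friction coefficient is corrected from the ratio of measured to model-predicted roll force, with exponential smoothing and physical bounds. Function names, gains and limits are hypothetical placeholders.

```python
# Hypothetical sketch: nudging a friction coefficient so that the mill model's
# predicted roll force tracks the measured force (exponentially smoothed ratio).
def adapt_friction(mu, force_measured, force_predicted, gain=0.1,
                   mu_min=0.02, mu_max=0.15):
    """Return an updated friction coefficient for the set-up model."""
    correction = force_measured / force_predicted    # >1 if the model under-predicts
    mu_new = mu * ((1.0 - gain) + gain * correction)
    return min(max(mu_new, mu_min), mu_max)          # keep mu physically plausible

# Example: the model under-predicts the force by 5%, so mu is nudged upward.
print(adapt_friction(0.07, force_measured=10.5e6, force_predicted=10.0e6))
```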
Abstract:
This letter addresses the optimization and complexity reduction of switch-reconfigured antennas. A new optimization technique based on graph models is investigated. This technique is used to minimize the redundancy in a reconfigurable antenna structure and reduce its complexity. A graph modeling rule for switch-reconfigured antennas is proposed, and examples are presented.
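A minimal sketch of the general idea, under assumptions not taken from the letter: antenna parts are modeled as graph vertices and switches as edges, and edges beyond a spanning forest (i.e., edges that only close cycles) are counted as a rough proxy for redundant switches.

```python
# Illustrative graph model: antenna parts are vertices, switches are edges.
# Edges beyond a spanning forest are counted as candidate redundant switches.
def count_redundant_switches(num_parts, switches):
    parent = list(range(num_parts))

    def find(x):                     # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    redundant = 0
    for a, b in switches:
        ra, rb = find(a), find(b)
        if ra == rb:
            redundant += 1           # this switch only closes a cycle
        else:
            parent[ra] = rb
    return redundant

# 4 antenna parts, 5 switches: a spanning tree needs only 3 of them.
print(count_redundant_switches(4, [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]))  # -> 2
```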
Abstract:
Distribution of timing signals is an essential factor for the development of digital systems for telecommunication networks, integrated circuits and manufacturing automation. Originally, this distribution was implemented by using the master-slave architecture with a precise master clock generator sending signals to phase-locked loops (PLL) working as slave oscillators. Nowadays, wireless networks with dynamical connectivity and the increase in size and operation frequency of the integrated circuits suggest that the distribution of clock signals could be more efficient if mutually connected architectures were used. Here, mutually connected PLL networks are studied and conditions for the existence of synchronous states are analytically derived, depending on individual node parameters and network connectivity, considering that the nodes are nonlinear oscillators with nonlinear coupling. An expression for the network synchronisation frequency is obtained. The lock-in range and the transmission error bounds are analysed, providing hints for the design of this kind of clock distribution system.
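As a rough numerical companion (the paper's results are analytic and use full nonlinear PLL node models), the sketch below simulates a Kuramoto-type network of mutually coupled phase oscillators on a ring and checks that the locked frequency approaches the mean natural frequency. Topology, coupling gain and node model are simplifying assumptions.

```python
import math, random

# Simplified stand-in for a mutually connected PLL network: N phase oscillators
# with sinusoidal (nonlinear) coupling over a ring topology.
random.seed(0)
N, K, dt, steps = 6, 2.0, 1e-3, 20000
omega = [2 * math.pi * (1.0 + 0.05 * random.uniform(-1, 1)) for _ in range(N)]
phi = [random.uniform(0, 2 * math.pi) for _ in range(N)]
neighbors = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}

for _ in range(steps):
    dphi = [omega[i] + K * sum(math.sin(phi[j] - phi[i]) for j in neighbors[i])
            for i in range(N)]
    phi = [phi[i] + dt * dphi[i] for i in range(N)]

# After the transient, all instantaneous frequencies should be close to the
# mean natural frequency, i.e. the network synchronization frequency.
print([round(d / (2 * math.pi), 4) for d in dphi])
print("mean natural frequency [Hz]:", round(sum(omega) / N / (2 * math.pi), 4))
```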
Abstract:
Eight different models to represent the effect of friction in control valves are presented: four models based on physical principles and four empirical ones. The physical models, both static and dynamic, have the same structure. The models are implemented in Simulink/Matlab (R) and compared, using different friction coefficients and input signals. Three of the models were able to reproduce the stick-slip phenomenon and passed all the tests, which were applied following ISA standards. (C) 2008 Elsevier Ltd. All rights reserved.
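For intuition only, a minimal Karnopp-style stick-slip sketch (not one of the eight models compared in the paper): the stem sticks while the net force stays below the static friction level and otherwise slips against Coulomb plus viscous friction. All coefficients are arbitrary placeholders.

```python
import math

# Minimal Karnopp-style stick-slip sketch for a valve stem (illustrative only).
m, Fs, Fc, Fv = 1.6, 400.0, 300.0, 600.0   # mass, static, Coulomb, viscous terms
dt, v_eps = 1e-3, 1e-4                     # time step, "stuck" velocity band
x, v = 0.0, 0.0

def friction(v, F_applied):
    if abs(v) < v_eps:                     # stick phase: friction balances input
        if abs(F_applied) <= Fs:
            return -F_applied
        return -math.copysign(Fs, F_applied)
    return -(math.copysign(Fc, v) + Fv * v)   # slip phase: Coulomb + viscous

for k in range(20000):
    F_applied = 500.0 * math.sin(2 * math.pi * 0.1 * k * dt)   # slow sinusoidal input
    a = (F_applied + friction(v, F_applied)) / m
    v += a * dt
    if abs(v) < v_eps and abs(F_applied) <= Fs:
        v = 0.0                            # re-stick
    x += v * dt

print("final stem position:", round(x, 4))
```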
Abstract:
This paper presents two strategies for the upgrade of set-up generation systems for tandem cold mills. Even though these mills have been modernized mainly due to quality requirements, their upgrades may also be made with the intention of replacing pre-calculated reference tables. In this case, the Bryant and Osborn mill model without an adaptive technique is proposed. As a more demanding modernization, the Bland and Ford model including adaptation is recommended, although it requires more complex computational hardware. The advantages and disadvantages of these two systems are compared and discussed, and experimental results obtained from an industrial cold mill are shown.
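A schematic contrast of the two strategies, with purely illustrative placeholder numbers (the Bryant and Osborn and Bland and Ford models themselves are not reproduced): a table-based set-up returns pre-calculated references, while a model-based set-up evaluates a force model online and, in the second strategy, applies an adaptation factor learned from measurements.

```python
# Illustrative contrast between the two upgrade strategies (not actual mill models).
PRECALC_TABLE = {(2.0, 1.5): 9.8e6, (2.0, 1.2): 11.4e6}   # (entry, exit gauge) -> force

def setup_from_table(entry, exit):
    return PRECALC_TABLE[(entry, exit)]          # strategy 1: pre-calculated reference

def roll_force_model(entry, exit, mu):
    # Placeholder force model: grows with reduction and friction coefficient.
    reduction = (entry - exit) / entry
    return 6.0e6 * (1.0 + 2.0 * reduction) * (1.0 + 2.0 * mu)

def setup_from_model(entry, exit, mu, adaptation=1.0):
    force = roll_force_model(entry, exit, mu)    # online model evaluation
    return adaptation * force                    # strategy 2 adds an adaptation factor

print(setup_from_table(2.0, 1.5))
print(setup_from_model(2.0, 1.5, mu=0.07, adaptation=1.03))
```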
Abstract:
This paper presents the design and implementation of an embedded soft sensor, i.e., a generic and autonomous hardware module, which can be applied to many complex plants, wherein a certain variable cannot be directly measured. It is implemented based on a fuzzy identification algorithm called "Limited Rules", employed to model continuous nonlinear processes. The fuzzy model has a Takagi-Sugeno-Kang structure and the premise parameters are defined based on the Fuzzy C-Means (FCM) clustering algorithm. The firmware contains the soft sensor and runs online, estimating the target variable from other available variables. Tests have been performed using a simulated pH neutralization plant. The results of the embedded soft sensor have been considered satisfactory. A complete embedded inferential control system is also presented, including a soft sensor and a PID controller. (c) 2007, ISA. Published by Elsevier Ltd. All rights reserved.
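A minimal sketch of first-order Takagi-Sugeno-Kang inference with Gaussian premise memberships centered at cluster centers of the kind FCM would provide; the "Limited Rules" structure itself is not reproduced and all numbers are placeholders.

```python
import math

# Minimal TSK fuzzy inference sketch: premise memberships are Gaussians whose
# centers would come from FCM clustering; consequents are first-order (linear).
centers = [(3.0, 0.5), (7.0, 1.5)]                 # placeholder FCM cluster centers
sigmas  = [(1.5, 0.4), (1.5, 0.4)]                 # placeholder spreads
consequents = [(0.8, 0.2, 1.0), (0.3, 0.9, 2.0)]   # (a1, a2, bias) per rule

def tsk_predict(x1, x2):
    weights, outputs = [], []
    for (c1, c2), (s1, s2), (a1, a2, b) in zip(centers, sigmas, consequents):
        w = math.exp(-0.5 * ((x1 - c1) / s1) ** 2) * \
            math.exp(-0.5 * ((x2 - c2) / s2) ** 2)   # rule firing strength
        weights.append(w)
        outputs.append(a1 * x1 + a2 * x2 + b)        # linear consequent
    return sum(w * y for w, y in zip(weights, outputs)) / sum(weights)

print(tsk_predict(4.0, 0.8))   # inferred value of the unmeasured variable
```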
Abstract:
This experiment examined the effects of the discharge of water treatment plant (WTP) sludge into the following three types of wastewater treatment systems: a pilot-scale upflow anaerobic sludge blanket (UASB) reactor, a pilot-scale activated sludge system, and a full-scale activated sludge sequencing batch reactor (SBR). The UASB reactor received 50 mg of suspended solids (SS) of WTP sludge per liter of wastewater in the first phase, and, in the second phase, it received 75 mg SS/L. The pilot-scale activated sludge system received 25 and 50 mg SS/L in the first and second phases, respectively. The full-scale WWTP (SBR) received approximately 74 mg SS/L. The results of the experiments showed that, despite some negative effects on nitrification, there were positive effects on phosphorus removal, and, furthermore, there was the addition of solids in all systems. Water Environ. Res., 82, 392 (2010).
Abstract:
A thermodynamic information system for the diagnosis and prognosis of an existing power plant was developed. The system is based on an analytic approach that reports the current thermodynamic condition of all cycle components, as well as the improvement that can be obtained in the cycle performance by the elimination of the discovered anomalies. The effects induced by component anomalies and repairs on the efficiency of other components, which have proven to be one of the main drawbacks in diagnosis and prognosis analyses, are taken into consideration owing to the use of performance curves and corrected performance curves together with the thermodynamic data collected from the distributed control system. The approach used to develop the system is explained, the system implementation in a real gas turbine cogeneration combined cycle is described and the results are discussed. (C) 2011 Elsevier Ltd. All rights reserved.
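The core bookkeeping can be pictured as comparing each component's measured efficiency with the value expected from its performance curve after an ambient correction, flagging deviations beyond a tolerance as anomalies. The curve, correction and threshold below are illustrative assumptions, not the system's actual models.

```python
# Illustrative anomaly check against a corrected performance curve.
def expected_efficiency(load_fraction, ambient_temp_c):
    base = 0.82 + 0.10 * load_fraction - 0.04 * load_fraction ** 2  # performance curve
    correction = 1.0 - 0.001 * (ambient_temp_c - 15.0)              # ambient correction
    return base * correction

def diagnose(component, measured_eff, load_fraction, ambient_temp_c, tol=0.01):
    expected = expected_efficiency(load_fraction, ambient_temp_c)
    gap = expected - measured_eff
    return {"component": component, "expected": round(expected, 4),
            "measured": measured_eff, "recoverable_gap": round(max(gap, 0.0), 4),
            "status": "anomaly" if gap > tol else "normal"}

print(diagnose("gas_turbine_compressor", 0.855, load_fraction=0.9, ambient_temp_c=25.0))
```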
Abstract:
Computer viruses are an important risk to computational systems, endangering both corporations of all sizes and personal computers used for domestic applications. Here, classical epidemiological models for disease propagation are adapted to computer networks and, by using simple system identification techniques, a model called SAIC (Susceptible, Antidotal, Infectious, Contaminated) is developed. Real data about computer viruses are used to validate the model. (c) 2008 Elsevier Ltd. All rights reserved.
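Since the paper's exact SAIC equations are not reproduced here, the sketch below integrates a four-compartment system with an assumed SIR-like transition structure, only to show how such a model is simulated; the rates, transitions and initial fractions are placeholders.

```python
# Assumed SAIC-like compartment structure (illustration only, not the paper's equations):
#   S -> I : infection by contact with infectious machines      (beta * S * I)
#   S -> A : installation of an antidote / antivirus            (alpha * S)
#   I -> A : disinfection of infectious machines                (gamma * I)
#   I -> C : infectious machines becoming contaminated/disabled (delta * I)
#   C -> S : contaminated machines restored without antidote    (rho * C)
beta, alpha, gamma, delta, rho = 0.6, 0.02, 0.1, 0.05, 0.03
S, A, I, C = 0.95, 0.02, 0.02, 0.01          # fractions of the network
dt = 0.01
for _ in range(int(100 / dt)):               # 100 time units, explicit Euler
    dS = -beta * S * I - alpha * S + rho * C
    dA = alpha * S + gamma * I
    dI = beta * S * I - gamma * I - delta * I
    dC = delta * I - rho * C
    S, A, I, C = S + dt * dS, A + dt * dA, I + dt * dI, C + dt * dC

print(f"S={S:.3f}  A={A:.3f}  I={I:.3f}  C={C:.3f}  total={S+A+I+C:.3f}")
```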
Abstract:
In this paper, we compare three residuals to assess departures from the error assumptions as well as to detect outlying observations in log-Burr XII regression models with censored observations. These residuals can also be used for the log-logistic regression model, which is a special case of the log-Burr XII regression model. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed and the empirical distribution of each residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to the modified martingale-type residual in log-Burr XII regression models with censored data.
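For orientation, the generic martingale and deviance-type residuals used in parametric survival regression are sketched below; the paper's modified martingale-type residual for log-Burr XII models may differ in detail, and the fitted survival function used here is a placeholder.

```python
import math

# Generic martingale / deviance-type residuals for censored survival data:
#   r_M = delta + log S_hat(t)   (equivalently delta - Lambda_hat(t))
#   r_D = sign(r_M) * sqrt(-2 * (r_M + delta * log(delta - r_M)))
def residuals(t, delta, log_surv):
    """t: observed time, delta: 1 = failure / 0 = censored,
    log_surv: fitted log-survival function log S_hat(t)."""
    r_m = delta + log_surv(t)
    inner = -2.0 * (r_m + (delta * math.log(delta - r_m) if delta == 1 else 0.0))
    r_d = math.copysign(math.sqrt(max(inner, 0.0)), r_m)
    return r_m, r_d

# Placeholder fitted model: unit-rate exponential, log S(t) = -t.
print(residuals(0.4, 1, lambda t: -t))   # failure observed at t = 0.4
print(residuals(2.0, 0, lambda t: -t))   # censored at t = 2.0
```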
Abstract:
Mixed models have become important in analyzing the results of experiments, particularly those that require more complicated models (e.g., those that involve longitudinal data). This article describes a method for deriving the terms in a mixed model. Our approach extends an earlier method by Brien and Bailey to explicitly identify terms for which autocorrelation and smooth trend arising from longitudinal observations need to be incorporated in the model. At the same time we retain the principle that the model used should include, at least, all the terms that are justified by the randomization. This is done by dividing the factors into sets, called tiers, based on the randomization and determining the crossing and nesting relationships between factors. The method is applied to formulate mixed models for a wide range of examples. We also describe the mixed model analysis of data from a three-phase experiment to investigate the effect of time of refinement on Eucalyptus pulp from four different sources. Cubic smoothing splines are used to describe differences in the trend over time and unstructured covariance matrices between times are found to be necessary.
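As a generic reminder of the kind of model such a derivation produces (the specific terms for the Eucalyptus pulp experiment are not reproduced), a mixed model with a smooth longitudinal trend can be written as:

```latex
% Generic mixed model with a smooth trend over time (illustrative form)
\[
  \mathbf{y} = \mathbf{X}\boldsymbol{\beta} + f(t) + \mathbf{Z}\mathbf{u} + \boldsymbol{\varepsilon},
  \qquad
  \mathbf{u} \sim N(\mathbf{0}, \mathbf{G}), \quad
  \boldsymbol{\varepsilon} \sim N(\mathbf{0}, \mathbf{R}),
\]
% where f(t) is a cubic smoothing spline in time, Zu collects the
% randomization-based (tier-derived) random terms, and R may carry an
% autocorrelation or unstructured covariance structure between repeated measurements.
```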
Abstract:
In this paper, we present various diagnostic methods for polyhazard models. Polyhazard models are a flexible family for fitting lifetime data. Their main advantage over single hazard models, such as the Weibull and the log-logistic models, is that they accommodate a large variety of nonmonotone hazard shapes, such as bathtub and multimodal curves. Some influence methods, such as the local influence and total local influence of an individual, are derived, analyzed and discussed. A discussion of the computation of the likelihood displacement, as well as the normal curvature in the local influence method, is presented. Finally, an example with real data is given for illustration.
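For reference, the polyhazard construction treats the lifetime as the minimum of latent component lifetimes, so component hazards add and survival functions multiply:

```latex
% Polyhazard model with m latent causes
\[
  h(t) = \sum_{j=1}^{m} h_j(t),
  \qquad
  S(t) = \prod_{j=1}^{m} S_j(t),
  \qquad
  f(t) = h(t)\, S(t),
\]
% where each h_j may come from a different family (e.g. Weibull, log-logistic),
% yielding nonmonotone overall hazard shapes such as bathtub and multimodal curves.
```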
Abstract:
This paper proposes a regression model considering the modified Weibull distribution. This distribution can be used to model bathtub-shaped failure rate functions. Assuming censored data, we consider maximum likelihood and jackknife estimators for the parameters of the model. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes, and we also present some ways to perform global influence analysis. In addition, for different parameter settings, sample sizes and censoring percentages, various simulations are performed and the empirical distribution of the modified deviance residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to a martingale-type residual in log-modified Weibull regression models with censored data. Finally, we analyze a real data set under log-modified Weibull regression models. A diagnostic analysis and a model checking based on the modified deviance residual are performed to select appropriate models. (c) 2008 Elsevier B.V. All rights reserved.
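One common parameterization of the modified Weibull distribution, assumed here for illustration (the paper may use a variant), together with the hazard it implies:

```latex
% Modified Weibull distribution (one common parameterization)
\[
  F(t) = 1 - \exp\!\left(-a\, t^{\gamma} e^{\lambda t}\right),
  \qquad
  h(t) = a\, t^{\gamma - 1}\,(\gamma + \lambda t)\, e^{\lambda t},
  \qquad t > 0,\; a > 0,\; \gamma \ge 0,\; \lambda \ge 0,
\]
% which reduces to the Weibull distribution when lambda = 0 and allows a
% bathtub-shaped failure rate for 0 < gamma < 1 with lambda > 0.
```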
Abstract:
The zero-inflated negative binomial model is used to account for overdispersion detected in data that are initially analyzed under the zero-inflated Poisson model. A frequentist analysis, a jackknife estimator and a non-parametric bootstrap for parameter estimation of zero-inflated negative binomial regression models are considered. In addition, an EM-type algorithm is developed for performing maximum likelihood estimation. Then, the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes and some ways to perform global influence analysis are derived. In order to study departures from the error assumption as well as the presence of outliers, residual analysis based on the standardized Pearson residuals is discussed. The relevance of the approach is illustrated with a real data set, where it is shown that zero-inflated negative binomial regression models seem to fit the data better than the Poisson counterpart. (C) 2010 Elsevier B.V. All rights reserved.
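A minimal sketch of the quantities involved, under an assumed NB2 parameterization with dispersion alpha: the zero-inflated negative binomial probability mass function and the standardized Pearson residual built from the ZINB mean and variance.

```python
import math

# Zero-inflated negative binomial (NB2 parameterization) and Pearson residuals.
def nb_pmf(y, mu, alpha):
    k = 1.0 / alpha
    logp = (math.lgamma(y + k) - math.lgamma(k) - math.lgamma(y + 1)
            + k * math.log(k / (k + mu)) + y * math.log(mu / (k + mu)))
    return math.exp(logp)

def zinb_pmf(y, mu, alpha, pi):
    base = (1.0 - pi) * nb_pmf(y, mu, alpha)
    return pi + base if y == 0 else base          # extra mass only at zero

def pearson_residual(y, mu, alpha, pi):
    mean = (1.0 - pi) * mu                         # ZINB mean
    var = (1.0 - pi) * mu * (1.0 + mu * (pi + alpha))   # ZINB variance
    return (y - mean) / math.sqrt(var)

print(zinb_pmf(0, mu=2.0, alpha=0.5, pi=0.3))     # inflated probability of zero
print(pearson_residual(5, mu=2.0, alpha=0.5, pi=0.3))
```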
Abstract:
In this study, regression models are evaluated for grouped survival data when the effect of censoring time is considered in the model and the regression structure is modeled through four link functions. The methodology for grouped survival data is based on life tables, and the times are grouped in k intervals so that ties are eliminated. Thus, the data modeling is performed by considering the discrete models of lifetime regression. The model parameters are estimated by using the maximum likelihood and jackknife methods. To detect influential observations in the proposed models, diagnostic measures based on case deletion, termed global influence, and influence measures based on small perturbations in the data or in the model, referred to as local influence, are used. In addition to those measures, the total local influence estimate is also employed. Various simulation studies are performed to compare the performance of the four link functions of the regression models for grouped survival data under different parameter settings, sample sizes and numbers of intervals. Finally, a data set is analyzed by using the proposed regression models. (C) 2010 Elsevier B.V. All rights reserved.
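As an illustration of how the link function shapes the interval (discrete-time) hazard, the sketch below maps a linear predictor to a conditional failure probability through four commonly used links; these are assumptions and not necessarily the four links studied in the paper.

```python
import math
from statistics import NormalDist

# Discrete-time hazard p_k = g^{-1}(eta) for interval k, under different links.
def inverse_link(eta, link):
    if link == "logit":
        return 1.0 / (1.0 + math.exp(-eta))
    if link == "cloglog":                       # complementary log-log
        return 1.0 - math.exp(-math.exp(eta))
    if link == "probit":
        return NormalDist().cdf(eta)
    if link == "loglog":
        return math.exp(-math.exp(-eta))
    raise ValueError(link)

eta = -0.5                                      # gamma_k + x' beta for one interval
for link in ("logit", "cloglog", "probit", "loglog"):
    print(link, round(inverse_link(eta, link), 4))
```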