932 results for Predictive controllers
Abstract:
Model Predictive Control (MPC) is a control method that solves, in real time, an optimal control problem over a finite horizon. The finiteness of the horizon is both the reason for MPC's success and its main limitation. In operational water resources management, MPC has indeed been successfully employed for controlling systems with a relatively short memory, such as canals, where the horizon length is not an issue. For reservoirs, which generally have a longer memory, MPC applications are presently limited to short-term management only. Short-term reservoir management can effectively deal with fast processes, such as floods, but it cannot look sufficiently far ahead to handle long-term issues, such as droughts. To overcome this limitation, we propose an Infinite Horizon MPC (IH-MPC) solution that is particularly suitable for reservoir management. We propose to structure the input signal by means of orthogonal basis functions, thereby reducing the optimization argument to a finite number of variables and making the control problem solvable in a reasonable time. We applied this solution to the management of the Manantali Reservoir. Manantali is an annual-storage reservoir located in Mali, on the Senegal River, affecting the water systems of Mali, Senegal, and Mauritania. The long-term horizon offered by IH-MPC is necessary to cope with the strongly seasonal climate of the region.
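A minimal sketch of the key idea: parameterize the release trajectory over a very long horizon with a few orthogonal (Laguerre) basis functions, so the optimization runs over a handful of coefficients instead of one decision per time step. The single-reservoir mass balance, the basis parameters, and all numbers below are illustrative assumptions, not the Manantali model or the paper's formulation.

```python
# Sketch: long-horizon MPC with the input structured by Laguerre basis functions.
import numpy as np
from numpy.polynomial import laguerre
from scipy.optimize import minimize

dt = 1.0                      # time step [day]
H = 365                       # long horizon [days]
t = np.arange(H) * dt

def laguerre_basis(n_funcs, a, t):
    """Continuous-time Laguerre functions l_k(t) = sqrt(2a) e^{-a t} L_k(2 a t)."""
    return np.vstack([np.sqrt(2 * a) * np.exp(-a * t) *
                      laguerre.lagval(2 * a * t, np.eye(n_funcs)[k])
                      for k in range(n_funcs)])

Phi = laguerre_basis(n_funcs=4, a=0.03, t=t)     # (4, H): 4 variables instead of 365

inflow = 80.0 + 60.0 * np.sin(2 * np.pi * t / 365.0)   # assumed seasonal inflow [m3/s]
s0, s_target = 5_000.0, 6_000.0                        # storage [hm3], illustrative

def simulate(release):
    """Very simple mass balance: storage driven by inflow minus release."""
    return s0 + np.cumsum((inflow - release) * dt * 0.0864)   # m3/s over a day -> hm3

def cost(theta):
    release = np.clip(60.0 + Phi.T @ theta, 0.0, None)   # structured input signal
    s = simulate(release)
    return np.mean((s - s_target) ** 2) + 1e-2 * np.mean(np.diff(release) ** 2)

res = minimize(cost, x0=np.zeros(Phi.shape[0]), method="Nelder-Mead")
print("optimal basis coefficients:", res.x)
```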
Abstract:
This work compares the forecasting efficiency of different types of methodologies applied to Brazilian consumer inflation (IPCA). We compare forecasting models using disaggregated and aggregated data over a horizon of up to twelve months ahead. The disaggregated models were estimated by SARIMA and have different levels of disaggregation. The aggregated models were estimated by time-series techniques such as SARIMA, state-space structural models, and Markov-switching models. Forecasting accuracy is compared through the model selection procedure known as the Model Confidence Set and through the Diebold-Mariano procedure. We found evidence of forecast accuracy gains in models using more disaggregated data.
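A hedged sketch of the comparison step: fit two aggregate SARIMA specifications to a monthly series and compare their 12-step-ahead forecast errors with a simple Diebold-Mariano statistic. The synthetic series, the model orders, and the squared-error loss are assumptions for illustration, not the paper's specifications or the IPCA data.

```python
import numpy as np
from scipy import stats
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
n = 144
y = 0.4 + 0.3 * np.sin(2 * np.pi * np.arange(n) / 12) + rng.normal(0, 0.2, n)
train, test = y[:-12], y[-12:]

f1 = SARIMAX(train, order=(1, 0, 1), seasonal_order=(1, 0, 0, 12)).fit(disp=False).forecast(12)
f2 = SARIMAX(train, order=(2, 0, 0), seasonal_order=(0, 0, 1, 12)).fit(disp=False).forecast(12)

def diebold_mariano(e1, e2):
    """DM statistic for squared-error loss, ignoring autocorrelation corrections."""
    d = e1 ** 2 - e2 ** 2
    dm = d.mean() / np.sqrt(d.var(ddof=1) / len(d))
    return dm, 2 * (1 - stats.norm.cdf(abs(dm)))

dm, p = diebold_mariano(test - f1, test - f2)
print(f"DM statistic = {dm:.2f}, p-value = {p:.3f}")
```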
Abstract:
The predictive control technique has gained, in recent years, a growing number of adherents because of the ease of tuning its parameters, the extension of its concepts to multi-input/multi-output (MIMO) systems, the fact that nonlinear process models can be linearised around an operating point and then used directly in the controller, and, mainly, for being the only methodology that can take into account, during controller design, the limitations on the control signals and the process output. The time-varying weighting generalized predictive control (TGPC) studied in this work is one more alternative among the several existing predictive controllers. It is characterized as a modification of generalized predictive control (GPC) in which a reference model, computed according to design parameters previously established by the designer, is used together with a new criterion function whose minimization yields the best parameters for the controller. A genetic algorithm technique is used to minimize the proposed criterion function, and the robustness of the TGPC is demonstrated through the application of performance, stability, and robustness criteria. To compare the results achieved with the TGPC controller, GPC and proportional-integral-derivative (PID) controllers are used, with all techniques applied to stable, unstable, and non-minimum-phase plants. The simulated examples are carried out in MATLAB. It is verified that the modifications implemented in the TGPC demonstrate the efficiency of this algorithm.
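A hedged sketch of the optimization step: tuning controller parameters by evolutionary minimization of a weighted tracking/control-effort criterion, in the spirit of using a genetic algorithm to minimize a predictive-control cost function. The first-order plant, the PI control law, and the weights are illustrative and are not the TGPC criterion; scipy's differential_evolution stands in for the genetic algorithm.

```python
import numpy as np
from scipy.optimize import differential_evolution

a, b = 0.9, 0.1          # discrete first-order plant: y[k+1] = a*y[k] + b*u[k]
N, r = 100, 1.0          # simulation length and setpoint

def criterion(params, lam=0.05):
    kp, ki = params
    y = u_prev = integ = 0.0
    cost = 0.0
    for _ in range(N):
        e = r - y
        integ += e
        u = np.clip(kp * e + ki * integ, -5.0, 5.0)    # actuator limits
        cost += e ** 2 + lam * (u - u_prev) ** 2        # tracking + control moves
        y = a * y + b * u
        u_prev = u
    return cost

result = differential_evolution(criterion, bounds=[(0.0, 10.0), (0.0, 2.0)], seed=1)
print("tuned gains (kp, ki):", result.x, "criterion:", result.fun)
```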
Abstract:
The aim of this study was to develop a laboratory method for evaluating the time response of electronically controlled spray equipment using Programmable Logic Controllers (PLCs). For that purpose, a PLC-controlled digital drive inverter was set up to drive an asynchronous electric motor linked to a centrifugal pump on an experimental sprayer equipped with electronic flow control. The PLC was operated via RS232 serial communication from a PC. A user program was written to control the motor by adjusting the following system variables, all related to the motor speed: time stopped, ramp-up and ramp-down times, time running at a given constant speed, and the ramp-down time to stop the motor. This set-up was used in conjunction with a data acquisition system to perform laboratory tests with an electronically controlled sprayer. The time response for pressure stabilization was measured while changing the pump speed by +/-20%. The results showed, as an example, that for a 0.2 s ramp time when increasing the motor speed, an AgLogix Flow Control system (Midwest Technologies Inc.) took 22 s on average to readjust the pressure; when decreasing the motor speed, the time response dropped to 8 s. The general results also showed that this kind of methodology could make it easier to define standards for testing electronically controlled application equipment.
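A hedged sketch of the data-analysis side: estimating the pressure stabilization time from a logged trace after a step change in pump speed, similar in spirit to the laboratory measurements described. The sampling rate, the tolerance band, and the synthetic pressure trace are assumptions for illustration only.

```python
import numpy as np

fs = 10.0                                   # assumed sampling frequency [Hz]
t = np.arange(0, 40, 1 / fs)
pressure = 3.0 - 0.6 * np.exp(-t / 6.0)     # synthetic response to a speed step [bar]

def stabilization_time(t, p, tol=0.02):
    """Time after which the signal stays within +/- tol of its final value."""
    p_final = p[-int(2 * fs):].mean()        # final value taken from the last 2 s
    outside = np.abs(p - p_final) > tol * abs(p_final)
    return t[outside][-1] if outside.any() else 0.0

print(f"pressure stabilization time: {stabilization_time(t, pressure):.1f} s")
```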
Abstract:
In most cases, the cost of a control system increases with its complexity. The proportional (P) controller is the simplest and most intuitive structure for the implementation of linear control systems. The difficulty of finding the stability range of feedback systems with P controllers using the Routh-Hurwitz criterion increases with the order of the plant. For high-order plants, the stability range cannot be easily obtained from the investigation of the coefficient signs in the first column of the Routh array. A direct method for determining the stability range is presented. The method is easy to understand and to compute, and it offers students a better comprehension of this subject. A program in the MATLAB language based on the proposed method, design examples, and class assessments are provided to support the pedagogical aspects. The method and the program enable the user to specify a decay rate and also extend to proportional-integral (PI), proportional-derivative (PD), and proportional-integral-derivative (PID) controllers.
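A hedged sketch of the same question answered by brute force: sweep the proportional gain K and check whether all closed-loop poles lie to the left of a chosen decay rate. This root sweep is only a stand-in illustration, not the direct method proposed in the work, and the example plant is arbitrary. For this plant the Routh-Hurwitz criterion gives 0 < K < 10, which the sweep should recover approximately.

```python
import numpy as np

num = np.array([1.0])                        # plant numerator
den = np.array([1.0, 6.0, 11.0, 6.0, 0.0])   # plant denominator: s(s+1)(s+2)(s+3)
sigma = 0.0                                   # required decay rate (0 = plain stability)

def stable_for(K):
    # closed-loop characteristic polynomial: den(s) + K*num(s)
    poly = den.copy()
    poly[-len(num):] += K * num
    return np.all(np.real(np.roots(poly)) < -sigma)

gains = np.linspace(0.01, 100.0, 10000)
ok = np.array([stable_for(K) for K in gains])
if ok.any():
    print(f"stabilizing K range (approx.): {gains[ok].min():.2f} to {gains[ok].max():.2f}")
else:
    print("no stabilizing proportional gain found in the sweep")
```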
Abstract:
Sometimes it is inconvenient or expensive to open the loop of a system to insert lag controllers, for instance when this system is an open-loop system. A new controller structure is presented in which the loop is not opened and which allows lag controllers to be designed as in the case where the loop can be opened. This result can be used by educators in undergraduate courses dealing with classical control system theory, because it allows a better comprehension of the concept of lag compensation and provides a new method for its design and implementation. An example illustrates the application of the proposed method.
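For background only, a hedged sketch of a conventional series lag compensator of the kind discussed; the paper's contribution (a structure that avoids opening the loop) is not reproduced here, and the pole/zero values are illustrative assumptions.

```python
import numpy as np
from scipy import signal

z, p, K = 0.1, 0.01, 1.0                    # lag compensator: zero well above the pole
lag = signal.TransferFunction([K, K * z], [1.0, p])   # C(s) = K (s + z) / (s + p)

w = np.logspace(-4, 2, 500)
w, mag, phase = signal.bode(lag, w)
print(f"low-frequency gain boost: {mag[0] - mag[-1]:.1f} dB "
      "(raises DC gain, hence reduces steady-state error, with little phase cost near crossover)")
```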
Abstract:
Relaxed conditions for the stability of nonlinear continuous-time systems described by fuzzy models are presented. A theoretical analysis shows that the proposed method provides better, or at least the same, results as the methods presented in the literature. Digital simulations exemplify this fact. This result is also used for the design of fuzzy regulators. The nonlinear systems are represented by the fuzzy models proposed by Takagi and Sugeno. The stability analysis and the controller design are described by LMIs (Linear Matrix Inequalities), which can be solved efficiently using convex programming techniques.
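A hedged sketch of the classical (non-relaxed) LMI condition for a Takagi-Sugeno model: search for a common quadratic Lyapunov matrix P > 0 with Ai'P + P Ai < 0 for every local linear model, solved by convex programming. The two local models below are arbitrary illustrations; the paper's relaxed conditions are not reproduced.

```python
import numpy as np
import cvxpy as cp

A1 = np.array([[-2.0, 1.0], [0.5, -1.0]])   # local model 1 (assumed)
A2 = np.array([[-1.5, 0.3], [1.0, -2.5]])   # local model 2 (assumed)

n = 2
P = cp.Variable((n, n), symmetric=True)
eps = 1e-3
constraints = [P >> eps * np.eye(n)]
for A in (A1, A2):
    M = A.T @ P + P @ A
    constraints.append(0.5 * (M + M.T) << -eps * np.eye(n))   # symmetrised for the solver

problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve()
print("status:", problem.status)
print("P =\n", P.value)
```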
Abstract:
In this paper we use the Hermite-Biehler theorem to establish results on the design of proportional-integral-derivative (PID) controllers for a class of time-delay systems. Using the property of interlacing at high frequencies of the class of systems considered, together with linear programming, we obtain the set of all stabilizing PID controllers. As far as we know, previous results on the synthesis of PID controllers rely on the solution of transcendental equations. This paper also extends previous results on the synthesis of proportional controllers for a class of delay systems of retarded type to a larger class of delay systems.
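A hedged, brute-force illustration of what "the set of all stabilizing PID controllers" means for a delay plant: for a fixed kp, grid over (ki, kd) and test closed-loop stability using a Pade approximation of the dead time and a root check. This is only a stand-in; the paper's Hermite-Biehler/linear programming construction is not reproduced, and the plant is arbitrary.

```python
import numpy as np

K, T, L = 1.0, 1.0, 0.5          # assumed plant: K e^{-Ls} / (T s + 1)
kp = 1.0                          # fixed proportional gain

def is_stabilizing(kp, ki, kd):
    # second-order Pade approximation of the dead time e^{-Ls}
    d_num = np.array([L**2 / 12, -L / 2, 1.0])
    d_den = np.array([L**2 / 12,  L / 2, 1.0])
    g_num = K * d_num                          # plant numerator
    g_den = np.polymul([T, 1.0], d_den)        # plant denominator (Ts + 1) times Pade
    c_num = np.array([kd, kp, ki])             # PID: (kd s^2 + kp s + ki) / s
    c_den = np.array([1.0, 0.0])
    char = np.polyadd(np.polymul(g_den, c_den), np.polymul(g_num, c_num))
    return np.all(np.real(np.roots(char)) < 0)

ki_grid, kd_grid = np.linspace(0.0, 3.0, 61), np.linspace(0.0, 3.0, 61)
stable = [(ki, kd) for ki in ki_grid for kd in kd_grid if is_stabilizing(kp, ki, kd)]
print(f"{len(stable)} of {len(ki_grid) * len(kd_grid)} (ki, kd) grid points are stabilizing for kp = {kp}")
```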
Abstract:
Background: Oral Squamous Cell Carcinoma (OSCC) is a major cause of cancer death worldwide, which is mainly due to recurrence leading to treatment failure and patient death. The histological status of surgical margins is a currently available assessment of recurrence risk in OSCC; however, histological status does not predict recurrence, even in patients with histologically negative margins. Therefore, molecular analysis of histologically normal resection margins and the corresponding OSCC may aid in identifying a gene signature predictive of recurrence. Methods: We used a meta-analysis of 199 samples (OSCCs and normal oral tissues) from five public microarray datasets, in addition to our microarray analysis of 96 OSCCs and histologically normal margins from 24 patients, to train a gene signature for recurrence. Validation was performed by quantitative real-time PCR using 136 samples from an independent cohort of 30 patients. Results: We identified 138 significantly over-expressed genes (> 2-fold, false discovery rate of 0.01) in OSCC. By penalized likelihood Cox regression, we identified a 4-gene signature with prognostic value for recurrence in our training set. This signature comprised the invasion-related genes MMP1, COL4A1, P4HA2, and THBS2. Overexpression of this 4-gene signature in histologically normal margins was associated with recurrence in our training cohort (p = 0.0003, logrank test) and in our independent validation cohort (p = 0.04, HR = 6.8, logrank test). Conclusion: Gene expression alterations occur in histologically normal margins in OSCC. Over-expression of the 4-gene signature in histologically normal surgical margins was validated and highly predictive of recurrence in an independent patient cohort. Our findings may be applied to develop a molecular test, which would be clinically useful to help predict which patients are at a higher risk of local recurrence.
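A hedged sketch of the modelling step: fitting a penalized Cox model that relates margin expression of candidate genes to time-to-recurrence, in the spirit of the penalized-likelihood Cox regression used to derive the 4-gene signature. The synthetic data frame, the penalty value, and the censoring scheme are assumptions for illustration; only the four gene names come from the abstract, and this is not the authors' pipeline or data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 60
df = pd.DataFrame({
    "MMP1":   rng.normal(size=n),
    "COL4A1": rng.normal(size=n),
    "P4HA2":  rng.normal(size=n),
    "THBS2":  rng.normal(size=n),
})
risk = 0.8 * df["MMP1"] + 0.5 * df["THBS2"]              # synthetic association
df["time"] = rng.exponential(scale=np.exp(-risk) * 24)    # months to recurrence
df["recurrence"] = (df["time"] < 36).astype(int)          # administrative censoring
df.loc[df["recurrence"] == 0, "time"] = 36

cph = CoxPHFitter(penalizer=0.1)                          # ridge-type penalty
cph.fit(df, duration_col="time", event_col="recurrence")
cph.print_summary()
```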
Abstract:
Background: The identification of patterns of inappropriate antimicrobial prescribing in hospitals contributes to the improvement of antimicrobial stewardship programs (ASP). Methods: We conducted a cross-sectional study to identify predictors of inappropriateness in requests for parenteral antimicrobials (RPAs) in a teaching hospital with 285 beds. We reviewed 25% of the RPAs issued for therapeutic purposes from the year 2005. Appropriateness was evaluated according to current guidelines for antimicrobial therapy. We assessed predictors of inappropriateness through univariate and multivariate models. RPAs classified as 'appropriate' or 'probably appropriate' were selected as controls. Case groups comprised inappropriate RPAs, either in general or for specific errors. Results: Nine hundred and sixty-three RPAs were evaluated, 34.6% of which were considered inappropriate. In the multivariate analysis, general predictors of inappropriateness were: prescription on weekends/holidays (odds ratio (OR) 1.67, 95% confidence interval (CI) 1.20-2.28, p = 0.002), patient in the intensive care unit (OR 1.57, 95% CI 1.11-2.23, p = 0.01), peritoneal infection (OR 2.15, 95% CI 1.27-3.65, p = 0.004), urinary tract infection (OR 1.89, 95% CI 1.25-2.87, p = 0.01), combination therapy with 2 or more antimicrobials (OR 1.72, 95% CI 1.15-2.57, p = 0.008), and prescriptions including penicillins (OR 2.12, 95% CI 1.39-3.25, p = 0.001) or first-generation cephalosporins (OR 1.74, 95% CI 1.01-3.00, p = 0.048). Previous consultation with an infectious diseases (ID) specialist had a protective effect against inappropriate prescription (OR 0.34, 95% CI 0.24-0.50, p < 0.001). Factors independently associated with specific prescription errors varied. However, consultation with an ID specialist was protective against both unnecessary antimicrobial use (OR 0.04, 95% CI 0.01-0.26, p = 0.001) and requests for agents with an insufficient antimicrobial spectrum (OR 0.14, 95% CI 0.03-0.30, p = 0.01). Conclusions: Our results demonstrate the importance of previous consultation with an ID specialist in assuring the quality of prescriptions. They also highlight prescription patterns that should be addressed by ASP policies.
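A hedged sketch of how adjusted odds ratios of this kind are typically obtained: a multivariate logistic model whose exponentiated coefficients and confidence limits give the ORs and 95% CIs. The synthetic data, the variable names, and the formula below are illustrative assumptions, not the study's dataset or final model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 963
df = pd.DataFrame({
    "weekend":    rng.integers(0, 2, n),
    "icu":        rng.integers(0, 2, n),
    "id_consult": rng.integers(0, 2, n),
})
logit_p = -1.0 + 0.5 * df["weekend"] + 0.45 * df["icu"] - 1.1 * df["id_consult"]
df["inappropriate"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("inappropriate ~ weekend + icu + id_consult", data=df).fit(disp=False)
odds_ratios = pd.DataFrame({
    "OR":       np.exp(model.params),
    "CI 2.5%":  np.exp(model.conf_int()[0]),
    "CI 97.5%": np.exp(model.conf_int()[1]),
})
print(odds_ratios.round(2))
```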
Abstract:
Background and aims: Staphylococcus epidermidis and other coagulase-negative staphylococci (CoNS) are the most common agents of continuous ambulatory peritoneal dialysis (CAPD) peritonitis. Episodes caused by Staphylococcus aureus evolve with a high method failure rate, while CoNS peritonitis is generally benign. The purpose of this study was to compare episodes of peritonitis caused by CoNS species and S. aureus in order to evaluate the microbiological and host factors that affect outcome. Material and methods: Microbiological and clinical data were retrospectively studied for 86 new episodes of peritonitis caused by staphylococcal species between January 1996 and December 2000 in a university dialysis center. The influence of microbiological and host factors (age, sex, diabetes, use of vancomycin, exchange system, and treatment time on CAPD) was analyzed with a logistic regression model. The clinical outcome was classified into two results (resolution and non-resolution). Results: The odds of peritonitis resolution were not influenced by host factors. Oxacillin susceptibility was present in 30 of 35 S. aureus isolates and 22 of 51 CoNS isolates (p = 0.001). Resolution occurred in 32 of 52 (61.5%) episodes caused by oxacillin-susceptible isolates and in 20 of 34 (58.8%) episodes caused by oxacillin-resistant isolates (p = 0.9713). Of the 35 cases caused by S. aureus, 17 (48.6%) resolved, while 40 of the 51 CoNS episodes (78.4%) resolved. Resolution odds were 7.1 times higher for S. epidermidis than for S. aureus (p = 0.0278), while other CoNS had 7.6 times higher resolution odds than S. epidermidis cases (p = 0.052). Episodes caused by S. haemolyticus had resolution odds similar to S. epidermidis (p = 0.859). Conclusions: S. aureus etiology is an independent factor associated with peritonitis non-resolution in CAPD, while S. epidermidis and S. haemolyticus have a lower resolution rate than other CoNS. Possibly the aggressive nature of these agents, particularly S. aureus, can be explained by their recognized pathogenic factors rather than by antibiotic resistance.
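A hedged, crude illustration using the raw counts reported in the abstract (17/35 S. aureus episodes resolved vs. 40/51 CoNS episodes resolved): an unadjusted 2x2 odds ratio with Fisher's exact test. The paper's reported estimates come from a multivariate logistic regression, which this simple calculation does not reproduce.

```python
from scipy.stats import fisher_exact

#                 resolved  not resolved
table = [[17, 35 - 17],    # S. aureus episodes
         [40, 51 - 40]]    # CoNS episodes

odds_ratio, p_value = fisher_exact(table)
print(f"unadjusted odds ratio of resolution, S. aureus vs CoNS: {odds_ratio:.2f}, p = {p_value:.3f}")
```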
Abstract:
A novel strategy to handle the divergences typical of perturbative calculations is implemented for the Nambu-Jona-Lasinio model, and its phenomenological consequences are investigated. The central idea of the method is to avoid the critical step involved in the regularization process, namely, the explicit evaluation of divergent integrals. This goal is achieved by assuming a regularization distribution in an implicit way and making use, in intermediary steps, only of very general properties of such a regularization. The finite parts are separated from the divergent ones and integrated free from the effects of the regularization. The divergent parts are organized in terms of standard objects, which are independent of the (arbitrary) momenta running in the internal lines of loop graphs. Through the analysis of symmetry relations, a set of properties for the divergent objects is identified, which we denominate consistency relations, reducing the number of divergent objects to only a few. The calculational strategy eliminates unphysical dependence on the arbitrary choices for the routing of internal momenta, leading to ambiguity-free and symmetry-preserving physical amplitudes. We show that the imposition of scale properties on the basic divergent objects leads to a critical condition for the constituent quark mass such that the remaining arbitrariness is removed. The model becomes predictive in the sense that its phenomenological consequences do not depend on possible choices made in intermediary steps. Numerical results are obtained for physical quantities at the one-loop level for the pion and sigma masses and the pion-quark and sigma-quark coupling constants.
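As a hedged illustration of the separation step, the following standard algebraic identity is often used in implicit-regularization schemes to push the external momentum p out of the divergent part of a loop integrand; the notation (k the loop momentum, m the quark mass) is assumed here and the identity is generic, not a quotation of the paper's expressions.

```latex
\[
  \frac{1}{(k+p)^2 - m^2}
  \;=\;
  \frac{1}{k^2 - m^2}
  \;-\;
  \frac{p^2 + 2\,p\cdot k}{\left(k^2 - m^2\right)\left[(k+p)^2 - m^2\right]} .
\]
% Iterating this identity isolates divergent integrals that depend only on k
% (the "standard objects"), while the p-dependent remainders are finite and can
% be integrated without reference to the regularization.
```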
Abstract:
The aim of this article was to evaluate the use of fuzzy logic to estimate the possibility of neonatal death. A computational model was developed based on fuzzy set theory, with birth weight, gestational age, Apgar score, and reported history of stillbirth as input variables. Mamdani's inference method was employed, and the output variable was the risk of neonatal death. Twenty-four rules were created according to the input variables, and the model was validated using a real database from a Brazilian city. Accuracy was estimated by the ROC curve, and the risks were compared using Student's t test. MATLAB 6.5 was used to build the model. The mean risks were lower for the infants who survived (p < 0.001). The accuracy of the model was 0.90. The highest accuracy was obtained with a possibility of risk equal to or lower than 25% (sensitivity = 0.70, specificity = 0.98, negative predictive value = 0.99, and positive predictive value = 0.22). The model showed good accuracy and negative predictive value and can be used in general hospitals.
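A hedged, tiny sketch of a Mamdani inference step in the spirit of the model described. The real model uses four inputs and 24 rules built in MATLAB; the two inputs, the membership functions, and the two rules below are illustrative assumptions only.

```python
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function; a == b or b == c gives a shoulder."""
    x = np.asarray(x, dtype=float)
    left = np.ones_like(x) if a == b else (x - a) / (b - a)
    right = np.ones_like(x) if b == c else (c - x) / (c - b)
    return np.clip(np.minimum(left, right), 0.0, 1.0)

risk_universe = np.linspace(0, 100, 1001)          # possibility of death [%]
risk_low = trimf(risk_universe, 0, 0, 40)
risk_high = trimf(risk_universe, 40, 100, 100)

def neonatal_risk(weight_g, apgar5):
    # input fuzzification (illustrative fuzzy sets)
    weight_low = trimf(weight_g, 0, 500, 2500)
    weight_ok = trimf(weight_g, 2000, 3500, 5000)
    apgar_low = trimf(apgar5, 0, 0, 6)
    apgar_normal = trimf(apgar5, 5, 10, 10)

    # two illustrative Mamdani rules (min for AND and for implication)
    r1 = min(weight_low, apgar_low)        # low weight AND low Apgar -> high risk
    r2 = min(weight_ok, apgar_normal)      # adequate weight AND normal Apgar -> low risk

    aggregated = np.maximum(np.minimum(r1, risk_high), np.minimum(r2, risk_low))
    if aggregated.sum() == 0:
        return 50.0                        # no rule fired: neutral output
    return float((risk_universe * aggregated).sum() / aggregated.sum())  # centroid

print(f"estimated risk for 1200 g, Apgar 3: {neonatal_risk(1200, 3):.0f}%")
print(f"estimated risk for 3400 g, Apgar 9: {neonatal_risk(3400, 9):.0f}%")
```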