930 results for Input-output data
Abstract:
In this paper, a new model-based proportional–integral–derivative (PID) tuning and control approach is introduced for Hammerstein systems identified from observational input/output data. The nonlinear static function in the Hammerstein system is modelled using a B-spline neural network. The control signal is composed of a PID controller together with a correction term. Both the parameters of the PID controller and the correction term are optimized by minimizing the multistep-ahead prediction errors. In order to update the control signal, the multistep-ahead predictions of the Hammerstein system based on B-spline neural networks and the associated Jacobian matrix are calculated using the de Boor algorithms, including both the functional and derivative recursions. Numerical examples are utilized to demonstrate the efficacy of the proposed approach.
Abstract:
A new PID tuning and control approach is introduced for Hammerstein systems based on input/output data. A B-spline neural network is used to model the nonlinear static function in the Hammerstein system. The control signal is composed of a PID controller together with a correction term. In order to update the control signal, the multistep-ahead predictions of the Hammerstein system based on the B-spline neural networks and the associated Jacobian matrix are calculated using the de Boor algorithms, including both the functional and derivative recursions. A numerical example is utilized to demonstrate the efficacy of the proposed approach.
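The de Boor functional recursion that these two abstracts rely on for evaluating the B-spline network can be sketched as follows. This is a minimal illustration of the Cox-de Boor recursion only; the knot vector, spline order, and evaluation point are hypothetical, and the derivative recursion and Jacobian computation of the papers are not reproduced here.

```python
def bspline_basis(i, k, t, x):
    """Cox-de Boor functional recursion: value at x of the i-th
    B-spline basis function of order k (degree k-1) on knot vector t."""
    if k == 1:
        return 1.0 if t[i] <= x < t[i + 1] else 0.0
    left = right = 0.0
    if t[i + k - 1] > t[i]:
        left = (x - t[i]) / (t[i + k - 1] - t[i]) * bspline_basis(i, k - 1, t, x)
    if t[i + k] > t[i + 1]:
        right = (t[i + k] - x) / (t[i + k] - t[i + 1]) * bspline_basis(i + 1, k - 1, t, x)
    return left + right

# Hypothetical uniform knot vector; order-3 (quadratic) basis functions.
t = [0, 1, 2, 3, 4, 5, 6]
vals = [bspline_basis(i, 3, t, 2.5) for i in range(4)]
```

Inside the valid interval the nonzero basis functions form a partition of unity, which is what makes the B-spline network's static nonlinearity well conditioned for identification.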
Abstract:
A novel technique for selecting the poles of orthonormal basis functions (OBF) in Volterra models of any order is presented. It is well known that the usually large number of parameters required to describe the Volterra kernels can be significantly reduced by representing each kernel using an appropriate basis of orthonormal functions. Such a representation results in the so-called OBF Volterra model, which has a Wiener structure consisting of linear dynamics generated by the orthonormal basis followed by a nonlinear static mapping given by the Volterra polynomial series. Aiming at optimizing the poles that fully parameterize the orthonormal bases, the exact gradients of the outputs of the orthonormal filters with respect to their poles are computed analytically using a backpropagation-through-time technique. The expressions for the Kautz basis and for generalized orthonormal bases of functions (GOBF) are derived; those for the Laguerre basis follow straightforwardly as a particular case. The main innovation here is that the dynamic nature of the OBF filters is fully considered in the gradient computations. These gradients provide exact search directions for optimizing the poles of a given orthonormal basis. Such search directions can, in turn, be used as part of an optimization procedure to locate the minimum of a cost function that takes into account the estimation error of the system output. The Levenberg-Marquardt algorithm is adopted here as the optimization procedure. Unlike previous related work, the proposed approach relies solely on input-output data measured from the system to be modeled, i.e., no information about the Volterra kernels is required. Examples are presented to illustrate the application of this approach to the modeling of dynamic systems, including a real magnetic levitation system with nonlinear oscillatory behavior.
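For the Laguerre basis, mentioned above as a particular case, the orthonormal filter bank is a first-order low-pass section followed by a chain of identical all-pass sections, all parameterized by a single pole. A minimal sketch (the pole value and bank size are illustrative assumptions, not values from the paper):

```python
import numpy as np

def laguerre_bank(u, pole, n_filters):
    """Outputs of a discrete Laguerre orthonormal filter bank driven by u.
    First section: sqrt(1-a^2)/(1 - a q^-1); each further section is the
    all-pass (q^-1 - a)/(1 - a q^-1), so all filters share the single pole a."""
    a = pole
    T = len(u)
    L = np.zeros((n_filters, T))
    gain = np.sqrt(1.0 - a * a)
    for t in range(T):
        L[0, t] = a * (L[0, t - 1] if t else 0.0) + gain * u[t]
        for k in range(1, n_filters):
            prev_out = L[k, t - 1] if t else 0.0
            x_prev = L[k - 1, t - 1] if t else 0.0
            L[k, t] = a * prev_out + x_prev - a * L[k - 1, t]
    return L

# Impulse responses of the bank are orthonormal: Gram matrix is (near) identity.
h = laguerre_bank(np.r_[1.0, np.zeros(499)], pole=0.6, n_filters=4)
gram = h @ h.T
```

Differentiating this recursion with respect to the pole is what the paper's backpropagation-through-time computation makes exact, yielding the search directions fed to Levenberg-Marquardt.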
Abstract:
Economic growth is the increase in the inflation-adjusted market value of the goods and services produced by an economy over time. Total output is the quantity of goods or services produced in a given time period within a country. Sweden was affected by two crises during the period 2000-2010: the dot-com bubble and the financial crisis. How did these two crises affect economic growth? Changes in domestic output can be separated into four parts: changes in intermediate demand, final domestic demand, export demand and import substitution. The main purpose of this article is to analyze economic growth during the period 2000-2010, with focus on the dot-com bubble at the beginning of the period (2000-2005) and the financial crisis at the end of the period (2005-2010). The methodology used is the structural decomposition method. The investigation shows that the main contribution to the increase in Swedish total domestic output in both 2000-2005 and 2005-2010 was the effect of domestic demand. In 2005-2010, the financial crisis weakened the export effect. The output of the primary sector went from a negative change to a positive one, explained mainly by strong export expansion. In the secondary sector, exports had the greatest effect in 2000-2005, while domestic demand and the import ratio had a greater effect during the financial crisis period. Lastly, in the tertiary sector, domestic demand mainly explains the output growth over the whole period 2000-2010.
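The structural decomposition idea can be illustrated with a small Leontief model. This sketch uses a hypothetical two-sector economy and a simplified two-term (polar) decomposition of the output change; the article's four-way split into intermediate demand, final domestic demand, export demand and import substitution refines these terms further.

```python
import numpy as np

# Hypothetical 2-sector economy: A is the technical-coefficient matrix,
# f the final-demand vector, for base year 0 and end year 1.
A0 = np.array([[0.2, 0.3], [0.1, 0.4]])
A1 = np.array([[0.25, 0.3], [0.1, 0.35]])
f0 = np.array([100.0, 80.0])
f1 = np.array([120.0, 90.0])

L0 = np.linalg.inv(np.eye(2) - A0)   # Leontief inverse, base year
L1 = np.linalg.inv(np.eye(2) - A1)   # Leontief inverse, end year

x0, x1 = L0 @ f0, L1 @ f1            # gross outputs x = (I - A)^-1 f
dx = x1 - x0

# One polar decomposition of the output change:
# technology (intermediate-demand) effect + final-demand effect.
tech_effect = (L1 - L0) @ f0
demand_effect = L1 @ (f1 - f0)
```

The identity dx = tech_effect + demand_effect is exact by construction; splitting f further into domestic final demand and exports, and A-related terms into import substitution, yields the four-part decomposition used in the article.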
Abstract:
Existing distributed hydrologic models are too complex and computationally demanding to use as a rapid-forecasting policy-decision tool, or even as a classroom educational tool. In addition, platform dependence, specific input/output data structures, and non-dynamic data interaction with pluggable software components inside the existing proprietary frameworks restrict these models to specialized user groups. RWater is a web-based hydrologic analysis and modeling framework that utilizes the widely used R software within the HUBzero cyberinfrastructure of Purdue University. RWater is designed as an integrated framework for distributed hydrologic simulation, along with subsequent parameter optimization and visualization schemes. RWater provides a platform-independent web-based interface, flexible data integration capacity, grid-based simulations, and user extensibility. RWater uses RStudio to simulate hydrologic processes on raster-based data obtained through conventional GIS pre-processing. The program integrates the Shuffled Complex Evolution (SCE) algorithm for parameter optimization. Moreover, RWater enables users to produce descriptive statistics and visualizations of the outputs at different temporal resolutions. The applicability of RWater is demonstrated by applying it to two watersheds in Indiana for multiple rainfall events.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
This work proposes a new methodology for fault location in transmission lines (TL). The methodology combines the harmonic decomposition of a line's leakage current with an Artificial Neural Network (ANN) capable of distinguishing patterns of normal operating conditions from patterns of fault situations in a TL. A Pi model was used that can absorb real three-phase voltage and current data and adjust the values of R, L and C according to environmental changes. In this model, faults were generated at every tower with different capacitance values. The output provided by the model is the harmonic decomposition of the leakage current of the line section considered. The model's input and output data were used to train the developed ANN. Real voltage and current data were acquired through power-quality parameter analyzers installed at the ends of a TL section, Guamá-Utinga, belonging to Centrais Elétricas do Norte do Brasil (ELETRONORTE). The constructive parameters were calculated using the matrix method and refined using the Finite Element Method (FEM). The ANN was developed with the Matlab software. The Resilient Backpropagation algorithm, which showed good performance, was used for training. The ANN was trained with two training data sets to analyze possible differences between the outputs provided by the two groups. In both cases it produced satisfactory results, enabling fault location in the line section considered.
Abstract:
When we try to analyze and control a system whose model was obtained based only on input/output data, accuracy of the model is essential. On the other hand, to make the procedure practical, the modeling stage must be computationally efficient. In this regard, this paper presents the application of the extended Kalman filter to the parametric adaptation of a fuzzy model.
Application of the Extended Kalman filter to fuzzy modeling: Algorithms and practical implementation
Abstract:
The modeling phase is fundamental both in the analysis of a dynamic system and in the design of a control system. This phase is even more critical when it must be carried out on-line and the only information about the system comes from input/output data. This paper presents adaptation algorithms for fuzzy systems based on the extended Kalman filter, which allow accurate models to be obtained without sacrificing the computational efficiency that characterizes the Kalman filter, and which allow implementation on-line with the process.
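A minimal sketch of the idea common to these two abstracts: treat the adjustable parameters of the fuzzy model as the state of an extended Kalman filter and run one measurement update per input/output sample. The regressor vector, noise levels, and the linear-in-parameters assumption below are illustrative, not taken from the papers.

```python
import numpy as np

def ekf_step(theta, P, phi, y, R=1.0, Q=1e-6):
    """One EKF update with the model parameters theta as the state.
    phi is the Jacobian of the model output w.r.t. theta; for a
    Takagi-Sugeno model with linear consequents it is the vector of
    membership-weighted regressors, so this update is exact."""
    P = P + Q * np.eye(len(theta))        # random-walk prediction step
    y_hat = float(phi @ theta)            # predicted output
    S = float(phi @ P @ phi) + R          # innovation variance
    K = P @ phi / S                       # Kalman gain
    theta = theta + K * (y - y_hat)       # parameter correction
    P = P - np.outer(K, P @ phi)          # covariance update
    return theta, P

# Recover known parameters from noiseless input/output data.
rng = np.random.default_rng(0)
theta_true = np.array([2.0, -1.0])
theta, P = np.zeros(2), 10.0 * np.eye(2)
for _ in range(300):
    phi = rng.normal(size=2)
    theta, P = ekf_step(theta, P, phi, phi @ theta_true)
```

Each update costs only a few matrix-vector products in the number of parameters, which is the computational-efficiency property the abstracts emphasize for on-line use.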
Abstract:
If we classify variables in a program into various security levels, then a secure information flow analysis aims to verify statically that information in a program can flow only in ways consistent with the specified security levels. One well-studied approach is to formulate the rules of the secure information flow analysis as a type system. A major trend of recent research focuses on how to accommodate various sophisticated modern language features. However, this approach often leads to overly complicated and restrictive type systems, making them unfit for practical use. Also, problems essential to practical use, such as type inference and error reporting, have received little attention. This dissertation identified and solved major theoretical and practical hurdles to the application of secure information flow. We adopted a minimalist approach to designing our language to ensure a simple, lenient type system. We started out with a small, simple imperative language and only added features that we deemed most important for practical use. One language feature we addressed is arrays. Due to the various leaking channels associated with array operations, arrays have received complicated and restrictive typing rules in other secure languages. We presented a novel approach for lenient array operations, which leads to simple and lenient typing of arrays. Type inference is necessary because a user is usually only concerned with the security types of the input/output variables of a program and would like all types for auxiliary variables to be inferred automatically. We presented a type inference algorithm B and proved its soundness and completeness. Moreover, algorithm B stays close to the program and the type system and therefore facilitates informative error reporting generated in a cascading fashion. Algorithm B and error reporting have been implemented and tested.
Lastly, we presented a novel framework for developing applications that ensure user information privacy. In this framework, core computations are defined as code modules that involve input/output data from multiple parties. Secure flow policies are refined incrementally based on feedback from type checking and inference. Core computations interact with code modules from the involved parties only through well-defined interfaces. All code modules are digitally signed to ensure their authenticity and integrity.
Abstract:
This dissertation comprises three individual chapters in an effort to examine different explanatory variables that affect firm performance. Chapter Two proposes an additional determinant of firm survival. Based on a detailed examination of firm survival in the British automobile industry between 1895 and 1970, we conclude that a firm's selection of submarket (defined by quality level) influenced survival. In contrast to findings for the US automobile industry, there is no evidence of first-mover advantage in the market as a whole. However, we do find evidence of first-mover advantage after conditioning on submarket choice. Chapter Three examines the effects of product line expansion on firm performance in terms of survival time. Based on a detailed examination of firm survival time in the British automobile industry between 1895 and 1970, we find that diversification exerts a positive effect on firm survival. Furthermore, our findings support the literature with respect to the impacts of submarket types, pre-entry experience, and timing of entry on firm survival time. Chapter Four examines corporate diversification in U.S. manufacturing and service firms. We develop measures of how related a firm's diverse activities are, using input-output data and the NAICS classification to construct indexes of "vertical relatedness" and "complementarity". Strong relationships between these two measures are found. We utilize profitability and excess value as measures of firm performance. Econometric analysis reveals that there is no relationship between the degree of relatedness of diversification and firm performance for the study period.
Abstract:
Master's dissertation in Bioinformatics
Abstract:
The telemetry data processing operations for a given mission are pre-defined by the onboard telemetry configuration; the mission trajectory and overall telemetry methodology have stabilized lately for ISRO vehicles. The telemetry data processing problem is reduced through hierarchical problem reduction, whereby the sequencing of operations evolves as the control task and the operations on data as the function tasks. Each function task's input, output and execution criteria are captured in tables, which are examined by the control task, which then schedules the function task when its criteria are met.
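The control-task/function-task split described above can be sketched as a table-driven dispatcher: each function task's inputs, output and execution criterion sit in a table, and the control task scans the table and runs a task only when its criterion holds. The task names, data, and criteria here are hypothetical illustrations, not ISRO software.

```python
# Hypothetical table-driven scheduler: the control task walks the table
# and dispatches a function task only when its execution criterion holds.
data = {"raw_frames": [10, 12, 9], "calibrated": None}

def calibrate(frames):
    return [f * 0.5 for f in frames]           # placeholder data operation

task_table = [
    {   # one function task: inputs, output and execution criterion
        "inputs": ["raw_frames"],
        "output": "calibrated",
        "criterion": lambda d: d["raw_frames"] and d["calibrated"] is None,
        "run": lambda d: calibrate(d["raw_frames"]),
    },
]

def control_task(table, d):
    """Sequencing logic: examine each table entry, schedule when ready."""
    for entry in table:
        if entry["criterion"](d):
            d[entry["output"]] = entry["run"](d)

control_task(task_table, data)
```

Keeping the criteria in data rather than code is what lets the same control task serve different missions by swapping tables.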
Abstract:
Following the empirical turn in educational science initiated by the international comparative school assessments, attention has shifted increasingly from the input of school teaching and learning to results (output) and effects (outcomes). The core question is now: what actually comes out of schooling and classroom instruction in the end? A fundamental prerequisite for outcome-oriented governance of school instruction is the formulation of educational standards. This article analyzes in detail how educational standards can be linked with competence models and concrete tasks in teaching the subject "Politics & Economics". Against the background of educational-theoretical ideas following Immanuel Kant, the literacy concept of the PISA study and Karl Mannheim's "documentary method" are applied.
Abstract:
Stock markets employ specialized traders, market-makers, designed to provide liquidity and volume to the market by constantly standing ready to both buy and sell. In this paper, we demonstrate a novel method for modeling the market as a dynamic system, together with a reinforcement learning algorithm that learns profitable market-making strategies when run on this model. We model the order flow, the sequence of buys and sells for a particular stock, as an input-output hidden Markov model fit to historical data. When combined with the dynamics of the order book, this creates a highly non-linear and difficult dynamic system. Our reinforcement learning algorithm, based on likelihood ratios, is run on this partially observable environment. We demonstrate learning results for two separate real stocks.