955 results for NON-LINEAR MODELS
Abstract:
We present a new version (> 2.0) of the hglm package for fitting hierarchical generalized linear models (HGLMs) with spatially correlated random effects. CAR() and SAR() families for conditional and simultaneous autoregressive random effects were implemented. Eigen decomposition of the matrix describing the spatial structure (e.g., the neighborhood matrix) was used to transform the CAR/SAR random effects into independent, but heteroscedastic, Gaussian random effects. A linear predictor is fitted for the random-effect variance to estimate the parameters in the CAR and SAR models. This gives a computationally efficient algorithm for moderately sized problems.
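The eigen-decomposition idea described above can be illustrated with a minimal numpy sketch. The CAR covariance form, the symmetric neighborhood matrix D, and the parameter names (rho, tau) below are assumptions for illustration, not the package's internals:

```python
import numpy as np

def car_to_independent(D, rho, tau):
    """Illustrative transform: a CAR effect with precision (I - rho*D)/tau
    becomes independent heteroscedastic effects in the eigenbasis of D."""
    # Eigen decomposition of the (symmetric) neighborhood matrix D
    eigval, V = np.linalg.eigh(D)
    # In the rotated basis V.T @ u, the components are independent with
    # variances tau / (1 - rho*lambda_i), provided 1 - rho*lambda_i > 0.
    variances = tau / (1.0 - rho * eigval)
    return V, variances

# Toy 4-site neighborhood matrix (first-order neighbors on a line)
D = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
V, var = car_to_independent(D, rho=0.3, tau=1.0)
print(var)  # heteroscedastic variances of the transformed effects
```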
Abstract:
Whether human capital increases or decreases wage uncertainty is an open question from an empirical standpoint. Yet, most policy prescriptions regarding human capital formation are based on models that impose riskiness on this type of investment. We slightly deviate from the rest of the literature by allowing for non-linear income taxes in a two-period model. This enables us to derive prescriptions that are robust to the risk characteristics of human capital: savings should be discouraged, human capital investments encouraged, and both types of investment driven to an efficient level from an aggregate perspective. These prescriptions are also robust to what choices are observed, even though the policy instruments used to implement them are not.
Abstract:
Recent studies indicate that several strategies implemented by hedge funds generate returns with non-linear characteristics. Following the suggestions in the paper by Agarwal and Naik (2004), this work shows that a number of hedge funds within the Brazilian investment fund industry deliver returns that resemble those of a strategy of call and put options on the Bovespa market index. Starting from a factor model, we introduce an index referenced on option returns so that this factor can explain the non-linear character of fund returns better than the traditional risk factors.
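The kind of factor regression described above can be sketched as follows; the series, coefficients, and the way the option factor is built below are purely illustrative, not the paper's data or specification:

```python
import numpy as np

# Hypothetical return series: fund excess returns, a market factor, and an
# option-strategy factor (returns of a call-like payoff on the market index).
T = 120
rng = np.random.default_rng(0)
r_mkt = rng.normal(0.01, 0.05, T)
r_opt = np.maximum(r_mkt - 0.02, 0) - 0.01      # payoff-like, non-linear in r_mkt
r_fund = 0.002 + 0.4 * r_mkt + 0.6 * r_opt + rng.normal(0, 0.01, T)

# OLS of fund returns on a constant, the market factor and the option factor
X = np.column_stack([np.ones(T), r_mkt, r_opt])
beta, *_ = np.linalg.lstsq(X, r_fund, rcond=None)
print("alpha, beta_mkt, beta_opt:", beta)
```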
Abstract:
In this study, we verify the existence of predictability in the Brazilian equity market. Unlike other studies along the same lines, which evaluate the original series of each stock, we evaluate synthetic series created on the basis of linear models of stocks. Following Burgess (1999), we use the "stepwise regression" model to build a model for each stock. We then use the variance ratio profile together with a Monte Carlo simulation to select models with potential predictability. Unlike Burgess (1999), we carry out White's (2000) Reality Check in order to verify the existence of positive returns in the out-of-sample period. We use the strategies proposed by Sullivan, Timmermann & White (1999) and Hsu & Kuan (2005), amounting to 26,410 simulated strategies. Finally, using the bootstrap methodology with 1,000 simulations, we find strong evidence of predictability in the models, even when transaction costs are included.
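The variance ratio profile mentioned above compares the variance of q-period returns with q times the variance of one-period returns; under a random walk the ratio stays near one for every q. A minimal sketch, with the return series and horizons below purely illustrative:

```python
import numpy as np

def variance_ratio(returns, q):
    """VR(q) = Var(sum of q consecutive returns) / (q * Var(one-period returns))."""
    returns = np.asarray(returns, dtype=float)
    q_period = np.convolve(returns, np.ones(q), mode="valid")  # rolling q-period sums
    return q_period.var(ddof=1) / (q * returns.var(ddof=1))

rng = np.random.default_rng(1)
r = rng.normal(0, 0.01, 1000)            # a random-walk-like return series
profile = [variance_ratio(r, q) for q in range(2, 11)]
print(np.round(profile, 3))              # values near 1 indicate no predictability
```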
Abstract:
The objective of this article is to study (understand and forecast) spot metal price levels and changes at monthly, quarterly, and annual horizons. The data consist of metal-commodity prices at a monthly frequency from 1957 to 2012, from the International Financial Statistics of the IMF, for individual metal series. We will also employ the (relatively large) list of co-variates used in Welch and Goyal (2008) and in Hong and Yogo (2009), which are available for download. Regarding short- and long-run comovement, we will apply the techniques and the tests proposed in the common-feature literature to build parsimonious VARs, which possibly entail quasi-structural relationships between different commodity prices and/or between a given commodity price and its potential demand determinants. These parsimonious VARs will later be used as forecasting models to be combined to yield optimal forecasts of metal-commodity prices. Regarding out-of-sample forecasts, we will use a variety of models (linear and non-linear, single equation and multivariate) and a variety of co-variates to forecast the returns and prices of metal commodities. With the forecasts of a large number of models (N large) and a large number of time periods (T large), we will apply the techniques put forth by the common-feature literature on forecast combinations. The main contribution of this paper is to understand the short-run dynamics of metal prices. We show theoretically that there must be a positive correlation between metal-price variation and industrial-production variation if metal supply is held fixed in the short run and demand is optimally chosen, taking into account optimal production for the industrial sector. This is simply a consequence of the derived-demand model for cost-minimizing firms. Our empirical evidence fully supports this theoretical result, with overwhelming evidence that cycles in metal prices are synchronized with those in industrial production. This evidence is stronger for the global economy but holds as well, to a lesser degree, for the U.S. economy. Regarding forecasting, we show that models incorporating (short-run) common-cycle restrictions perform better than unrestricted models, with an important role for industrial production as a predictor of metal-price variation. Still, in most cases, forecast combination techniques outperform individual models.
Abstract:
The objective of this article is to study (understand and forecast) spot metal price levels and changes at monthly, quarterly, and annual frequencies. The data consist of metal-commodity prices at monthly and quarterly frequencies from 1957 to 2012, extracted from the IFS, and annual data for 1900-2010 provided by the U.S. Geological Survey (USGS). We also employ the (relatively large) list of co-variates used in Welch and Goyal (2008) and in Hong and Yogo (2009). We investigate short- and long-run comovement by applying the techniques and the tests proposed in the common-feature literature. One of the main contributions of this paper is to understand the short-run dynamics of metal prices. We show theoretically that there must be a positive correlation between metal-price variation and industrial-production variation if metal supply is held fixed in the short run and demand is optimally chosen, taking into account optimal production for the industrial sector. This is simply a consequence of the derived-demand model for cost-minimizing firms. Our empirical evidence fully supports this theoretical result, with overwhelming evidence that cycles in metal prices are synchronized with those in industrial production. This evidence is stronger for the global economy but holds as well, to a lesser degree, for the U.S. economy. Regarding out-of-sample forecasts, our main contribution is to show the benefits of forecast-combination techniques, which outperform individual-model forecasts, including the random-walk model. We use a variety of models (linear and non-linear, single equation and multivariate) and a variety of co-variates and functional forms to forecast the returns and prices of metal commodities. Using a large number of models (N large) and a large number of time periods (T large), we apply the techniques put forth by the common-feature literature on forecast combinations. Empirically, we show that models incorporating (short-run) common-cycle restrictions perform better than unrestricted models, with an important role for industrial production as a predictor of metal-price variation.
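As a rough illustration of the forecast-combination step highlighted above (not the paper's actual procedure), individual model forecasts can be pooled with equal weights or with weights estimated by regressing realized values on the forecasts; all series below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)
T, N = 200, 5                          # T periods, N competing models (illustrative)
y = rng.normal(0, 1, T)                # realized metal-price changes (synthetic)
forecasts = y[:, None] + rng.normal(0, [0.8, 1.0, 1.2, 1.5, 2.0], (T, N))

equal_weight = forecasts.mean(axis=1)

# Granger-Ramanathan-style weights: regress realizations on the forecasts
w, *_ = np.linalg.lstsq(forecasts, y, rcond=None)
combined = forecasts @ w

mse = lambda f: np.mean((y - f) ** 2)
print("best individual:", min(mse(forecasts[:, i]) for i in range(N)))
print("equal weights:  ", mse(equal_weight))
print("OLS weights:    ", mse(combined))
```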
Abstract:
The increasing complexity of the financial market has been pointed out by Rajan (2005), Gorton (2008), and Haldane and May (2011) as one of the main factors responsible for the rise in systemic risk that culminated in the 2007/08 financial crisis. The Bank for International Settlements (2013) addresses the question of complexity in the context of banking regulation and discusses the comparability of capital adequacy across banks and across jurisdictions. However, definitions of the concepts of complexity and of complex adaptive systems are absent from the main discussions. This article clarifies some concepts related to Complexity theories, how this phenomenon emerges, and how the concepts can be applied to the financial market. Two tools that can be used in the context of complex adaptive systems are discussed, Agent Based Models (ABMs) and entropy, and compared with traditional tools. We conclude that, even though the complexity research agenda leaves gaps, it certainly contributes to the economic research agenda for understanding the mechanisms that trigger systemic risk, and it adds tools that make it possible to model heterogeneous, interacting agents in a way that allows emergent phenomena to arise in the system. Research hypotheses are suggested for further investigation.
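Entropy, one of the two tools mentioned above, can serve as a summary measure of how dispersed (and hence how hard to predict) a distribution of outcomes is. A minimal illustration on a discretized return distribution; the data and bin choice are arbitrary:

```python
import numpy as np

def shannon_entropy(samples, bin_edges):
    """Shannon entropy (in bits) of the empirical distribution over fixed bins."""
    counts, _ = np.histogram(samples, bins=bin_edges)
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log2(p)).sum()

rng = np.random.default_rng(3)
edges = np.linspace(-0.2, 0.2, 81)       # common bins so entropies are comparable
calm = rng.normal(0, 0.01, 5000)         # low-dispersion returns
crisis = rng.normal(0, 0.05, 5000)       # high-dispersion returns
print(shannon_entropy(calm, edges), shannon_entropy(crisis, edges))
# the more dispersed series yields higher entropy
```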
Abstract:
The first essay, "Determinants of Credit Expansion in Brazil", analyzes the determinants of credit using an extensive bank-level panel dataset. The Brazilian economy experienced a major boost in leverage in the first decade of the 2000s as a result of a set of factors ranging from macroeconomic stability to the abundant liquidity in international financial markets before 2008, together with a set of deliberate decisions taken by President Lula's administration to expand credit, boost consumption, and gain political support from the lower social strata. As relevant conclusions of our investigation, we verify that credit expansion relied on the reduction of the monetary policy rate, that international financial markets are an important source of funds, and that payroll-guaranteed credit and investment-grade status affected credit supply positively. We were not able to confirm the importance of financial inclusion efforts. The importance of financial-sector soundness indicators for credit conditions should not be underestimated. These results raise questions about the sustainability of this expansion process and about financial stability in the future. The second essay, "Public Credit, Monetary Policy and Financial Stability", discusses the role of public credit. The supply of public credit in Brazil successfully served to relaunch the economy after the Lehman Brothers demise. It was later transformed into a driver for economic growth as well as a regulation device to force private banks to reduce interest rates. We argue that the use of public funds to finance economic growth has three important drawbacks: it generates inflation, induces higher loan rates, and may induce financial instability. An additional effect is the prevention of market credit solutions. This study contributes to the understanding of the costs and benefits of credit as a fiscal policy tool. The third essay, "Bayesian Forecasting of Interest Rates: Do Priors Matter?", discusses the choice of priors when forecasting short-term interest rates. Central banks that commit to an inflation-targeting monetary regime are bound to respond to spikes in inflation expectations and to a widening output gap in a clear and transparent way by abiding by a Taylor rule. There are various reports of central banks being more responsive to inflationary than to deflationary shocks, making the monetary policy response non-linear. Besides that, there is no guarantee that coefficients remain stable over time. Central banks may switch to a dual-target regime that considers deviations of both inflation and the output gap. The estimation of a Taylor rule may therefore have to consider a non-linear model with time-varying parameters. This paper uses Bayesian forecasting methods to predict short-term interest rates. We take two different approaches: from a theoretical perspective we focus on an augmented version of the Taylor rule and include the Real Exchange Rate, the Credit-to-GDP and the Net Public Debt-to-GDP ratios. We also take an "atheoretic" approach based on the Expectations Theory of the Term Structure to model short-term interest rates. The selection of priors is particularly relevant for predictive accuracy; yet, ideally, forecasting models should require as little a priori expert insight as possible. We present recent developments in prior selection; in particular, we propose the use of hierarchical hyper-g priors for better forecasting in a framework that can be easily extended to other key macroeconomic indicators.
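One way to write an augmented Taylor rule of the kind sketched in the third essay, purely as an illustration of the type of specification involved (the coefficient names, smoothing term, and exact functional form below are not taken from the thesis):

\[
i_t = \rho\, i_{t-1} + (1-\rho)\Big[ r^{*} + \pi_t + \phi_{\pi}\big(\pi^{e}_t - \pi^{*}\big) + \phi_{y}\,\tilde{y}_t + \phi_{q}\, q_t + \phi_{c}\,\tfrac{Credit_t}{GDP_t} + \phi_{b}\,\tfrac{NetDebt_t}{GDP_t} \Big] + \varepsilon_t ,
\]

where \(i_t\) is the short-term policy rate, \(\pi^{e}_t - \pi^{*}\) is the deviation of expected inflation from target, \(\tilde{y}_t\) is the output gap, and \(q_t\) is the real exchange rate; time-varying parameters would let the \(\phi\) coefficients drift over time, and asymmetric responses to inflationary versus deflationary shocks would make the rule non-linear.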
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Genetic analysis of visual appraisal scores in cattle with Bayesian threshold and linear models
Abstract:
The objective of this work was to compare the estimates of genetic parameters obtained from single-trait and two-trait Bayesian analyses, under linear and threshold animal models, for categorical morphological traits of Nellore cattle. Data on muscling, physical structure, and conformation were collected between 2000 and 2005 on 3,864 animals from 13 farms participating in the Programa Nelore Brasil. Single-trait and two-trait Bayesian analyses were carried out under threshold and linear models. In general, the threshold and linear models were efficient in estimating genetic parameters for visual scores in single-trait Bayesian analyses. In the two-trait analyses, it was observed that, when continuous and categorical data were used, the threshold model yielded genetic correlation estimates of greater magnitude than those of the linear model, and that, when categorical data were used, the heritability estimates were similar. The advantage of the linear model was the shorter processing time of the analyses. In the genetic evaluation of animals for visual scores, the use of the threshold or the linear model did not affect the ranking of the animals by predicted breeding values, which indicates that both models can be used in breeding programs.
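The threshold model referred to above links each observed category to an underlying continuous liability; a generic sketch of that link in standard notation (not the exact specification of this study):

\[
\boldsymbol{\ell} = \mathbf{X}\boldsymbol{\beta} + \mathbf{Z}\mathbf{u} + \mathbf{e}, \qquad
y_i = j \;\Longleftrightarrow\; t_{j-1} < \ell_i \le t_j ,
\]

where \(\boldsymbol{\ell}\) is the unobserved liability, \(\mathbf{u}\) are the random genetic effects, and \(t_0 < t_1 < \dots < t_J\) are the thresholds that cut the liability scale into the observed visual-score categories; the linear model instead treats the scores \(y_i\) directly as the response variable.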
Abstract:
Several mobile robots show non-linear behavior, mainly due to friction phenomena between the mechanical parts of the robot or between the robot and the ground. Linear models are efficient in some cases, but it is necessary to take the robot's non-linearity into consideration when precise displacement and positioning are desired. In this work, a parametric model identification procedure for a mobile robot with differential drive that considers the dead-zone in the robot's actuators is proposed. The method consists in dividing the system into Hammerstein systems and then using the key-term separation principle to express the input-output relations in terms of the parameters of both the linear and non-linear blocks. The parameters are then simultaneously estimated through a recursive least squares algorithm. The results show that it is possible to identify the dead-zone thresholds together with the linear parameters.
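The recursive least squares step mentioned above can be sketched as follows; this is a generic RLS update applied to an illustrative ARX regressor, not the thesis's dead-zone parameterization (which uses key-term separation to build the regressor vector):

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=1.0):
    """One recursive least squares step: regressor phi, measurement y,
    current estimate theta, covariance P, forgetting factor lam."""
    phi = phi.reshape(-1, 1)
    k = P @ phi / (lam + phi.T @ P @ phi)          # gain vector
    theta = theta + (k * (y - phi.T @ theta)).ravel()
    P = (P - k @ phi.T @ P) / lam
    return theta, P

# Illustrative identification of y(k) = a*y(k-1) + b*u(k-1)
rng = np.random.default_rng(4)
a_true, b_true = 0.8, 0.5
u = rng.normal(size=500)
y = np.zeros(501)
for k in range(1, 501):
    y[k] = a_true * y[k - 1] + b_true * u[k - 1] + rng.normal(0, 0.01)

theta, P = np.zeros(2), 1000 * np.eye(2)
for k in range(1, 501):
    phi = np.array([y[k - 1], u[k - 1]])
    theta, P = rls_update(theta, P, phi, y[k])
print(theta)   # estimates should approach [0.8, 0.5]
```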
Abstract:
The present work presents the study and implementation of an adaptive compensated bilinear generalized predictive controller. This work uses conventional predictive control techniques and includes adaptive control techniques for better results. In order to solve control problems frequently found in the chemical industry, bilinear models are considered to represent the dynamics of the studied systems. Bilinear models are simpler than general nonlinear models, yet they can represent the intrinsic non-linearities of industrial processes. A quasilinear approximation of the model at each time step is used to allow the application of the equations of the generalized predictive controller (GPC). Such linearization, however, generates a prediction error, which is minimized through a compensation term. This term is implemented in an adaptive form, due to the nonlinear relationship between the input signal and the prediction error. Simulation results show the efficiency of the adaptive bilinear predictive controller in comparison with the conventional one.
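A first-order example of the bilinear structure and its quasilinear treatment, in generic notation, just to illustrate the kind of model the abstract refers to:

\[
y(k+1) = a\,y(k) + b\,u(k) + d\,y(k)\,u(k) = a\,y(k) + \big(b + d\,y(k)\big)\,u(k),
\]

so at each time step the model can be handled as a linear one with a time-varying input gain \(\tilde{b}(k) = b + d\,y(k)\). Over a multi-step prediction horizon the future outputs inside \(\tilde{b}\) are not yet known, which is one source of the prediction error that the adaptive compensation term is meant to correct.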
Abstract:
This work presents a modelling and identification method for a wheeled mobile robot, including the actuator dynamics. Instead of the classic modelling approach, where the robot position coordinates (x,y) are used as state variables (resulting in a non-linear model), the proposed discrete model is based on the travelled distance increment Delta_l. Thus, the resulting model is linear and time invariant, and it can be identified through classical methods such as Recursive Least Squares. This approach has a problem: Delta_l cannot be directly measured. In this paper, this problem is solved using an estimate of Delta_l based on a second-order polynomial approximation. Experimental data were collected and the proposed method was used to identify the model of a real robot.
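To see why the change of variables removes the structural non-linearity, consider the standard discrete unicycle kinematics (a generic illustration, not the exact model of the paper):

\[
x(k+1) = x(k) + \Delta l(k)\cos\theta(k), \qquad
y(k+1) = y(k) + \Delta l(k)\sin\theta(k),
\]

which is non-linear in the pose \((x, y, \theta)\), whereas a model written directly for the increments, e.g. a difference-equation relation between \(\Delta l(k)\) (and the heading increment) and the motor inputs, contains no trigonometric terms and can therefore be linear and time invariant.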
Abstract:
Slugging is a well-known phenomenon in multiphase flow, which may cause problems such as vibration in pipelines and high liquid level in the separator. It can be classified according to the place of its occurrence. The most severe kind, known as slugging in the riser, occurs in the vertical pipe that feeds the platform. Also known as severe slugging, it is capable of causing severe pressure fluctuations in the flow of the process, excessive vibration, flooding in separator tanks, limited production, and non-scheduled production stops, among other negative aspects that motivated this work. A feasible way to deal with this problem is to design an effective method for removing or attenuating the oscillations, that is, a controller. According to the literature, a conventional PID controller did not produce good results due to the high degree of nonlinearity of the process, fuelling the development of advanced control techniques. Among these, the model predictive controller (MPC) stands out: the control action results from the solution of an optimization problem, it is robust, and it can incorporate physical and/or safety constraints. The objective of this work is to apply a non-conventional non-linear model predictive control technique to severe slugging, in which the amount of liquid mass in the riser is controlled by the production valve and, indirectly, the oscillation of flow and pressure is suppressed, while seeking environmental and economic benefits. The proposed strategy is based on linear approximations of the model and on repeatedly solving a quadratic optimization problem, providing solutions that improve at each iteration. When this algorithm converges, the predicted values of the process variables are the same as those obtained with the original nonlinear model, ensuring that the constraints are satisfied along the prediction horizon. A mathematical model recently published in the literature, capable of representing characteristics of severe slugging in a real oil well, is used both for simulation and for the design of the proposed controller, whose performance is compared to a linear MPC.
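A rough numpy sketch of the successive-linearization idea described above, on a toy scalar system and without constraints; the riser model, the constraint handling, and the tuning used in the thesis are not reproduced here:

```python
import numpy as np

def f(x, u):
    """Toy nonlinear plant (illustrative only, not the riser model)."""
    return 0.9 * x + 0.5 * u - 0.1 * x * abs(x)

def slmpc_step(x0, U, x_ref, lam=0.1, iters=10, eps=1e-8):
    """One MPC step: repeatedly linearize around the predicted trajectory and
    solve a least-squares (unconstrained QP) problem for the input correction."""
    N = len(U)
    for _ in range(iters):
        # Nominal trajectory under the current input guess
        X = np.empty(N + 1); X[0] = x0
        for k in range(N):
            X[k + 1] = f(X[k], U[k])
        # Local linearization A_k, B_k by finite differences
        h = 1e-5
        A = np.array([(f(X[k] + h, U[k]) - f(X[k] - h, U[k])) / (2 * h) for k in range(N)])
        B = np.array([(f(X[k], U[k] + h) - f(X[k], U[k] - h)) / (2 * h) for k in range(N)])
        # Sensitivity S[j, i] = d x_{j+1} / d u_i along the nominal trajectory
        S = np.zeros((N, N))
        for i in range(N):
            S[i, i] = B[i]
            for j in range(i + 1, N):
                S[j, i] = S[j - 1, i] * A[j]
        # Quadratic cost: tracking error plus input penalty, solved as least squares
        M = np.vstack([S, np.sqrt(lam) * np.eye(N)])
        r = np.concatenate([x_ref - X[1:], -np.sqrt(lam) * U])
        dU, *_ = np.linalg.lstsq(M, r, rcond=None)
        U = U + dU
        if np.linalg.norm(dU) < eps:           # converged: linearized and nonlinear
            break                              # predictions now coincide
    return U

# Drive the toy system towards a set-point of 1.0
x, U = 0.0, np.zeros(15)
for t in range(30):
    U = slmpc_step(x, U, x_ref=np.ones(15))
    x = f(x, U[0])                             # apply the first input, then repeat
    U = np.append(U[1:], U[-1])                # warm-start the next horizon
print(round(x, 3))                             # should settle near the set-point
```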