792 results for Time-varying Risk
Abstract:
This article aims to test the hypothesis of contagion between the financial market indices of the United States and those of Brazil, Japan and England for the period 2000 to 2009. Time-varying copulas were used to capture the impact of the subprime crisis on the dependence between markets. The implemented model was an ARMA(1,0) st-ARCH(1,2) for the marginal distributions, with Normal and symmetrized Joe-Clayton (SJC) copulas for the joint distribution. The results allow us to conclude that, for both the Gaussian copula and the SJC copula, there is evidence of contagion between the American market and the Brazilian market. For the other two markets, London and Japan, the evidence of contagion with the American market was not sufficiently clear under either copula.
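As a purely illustrative companion (not the time-varying copula specification estimated in the article), the sketch below computes a Gaussian-copula dependence parameter in rolling windows after a rank-based probability integral transform; the simulated series and the 250-observation window are assumptions.

import numpy as np
from scipy.stats import norm, rankdata

def gaussian_copula_corr(u, v):
    # Moment fit of the Gaussian copula parameter on PIT-transformed data
    z1, z2 = norm.ppf(u), norm.ppf(v)
    return np.corrcoef(z1, z2)[0, 1]

def rolling_dependence(x, y, window=250):
    # Rolling-window copula correlation between two return series
    out = np.full(len(x), np.nan)
    for t in range(window, len(x)):
        u = rankdata(x[t - window:t]) / (window + 1)  # empirical CDF as PIT
        v = rankdata(y[t - window:t]) / (window + 1)
        out[t] = gaussian_copula_corr(u, v)
    return out

# Simulated returns standing in for the US and Brazilian indices
rng = np.random.default_rng(0)
us = rng.standard_normal(2000)
br = 0.5 * us + rng.standard_normal(2000)
rho_path = rolling_dependence(us, br)

A rise in the rolling dependence parameter during the crisis window, relative to the calm period, is the kind of shift that contagion tests look for.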
Abstract:
The first essay, "Determinants of Credit Expansion in Brazil", analyzes the determinants of credit using an extensive bank-level panel dataset. The Brazilian economy experienced a major boost in leverage in the first decade of the 2000s as a result of a set of factors ranging from macroeconomic stability and abundant liquidity in international financial markets before 2008 to a set of deliberate decisions taken by President Lula to expand credit, boost consumption and gain political support from the lower social strata. Our main conclusions are that credit expansion relied on the reduction of the monetary policy rate, that international financial markets are an important source of funds, and that payroll-guaranteed credit and investment-grade status positively affected credit supply. We were not able to confirm the importance of financial inclusion efforts. The importance of financial-sector soundness indicators for credit conditions should not be underestimated. These results raise questions about the sustainability of this expansion process and about financial stability in the future.

The second essay, "Public Credit, Monetary Policy and Financial Stability", discusses the role of public credit. The supply of public credit in Brazil successfully served to relaunch the economy after the Lehman Brothers demise. It was later transformed into a driver of economic growth as well as a regulatory device to force private banks to reduce interest rates. We argue that the use of public funds to finance economic growth has three important drawbacks: it generates inflation, induces higher loan rates and may induce financial instability. An additional effect is that it prevents market-based credit solutions from emerging. This study contributes to the understanding of the costs and benefits of credit as a fiscal policy tool.

The third essay, "Bayesian Forecasting of Interest Rates: Do Priors Matter?", discusses the choice of priors when forecasting short-term interest rates. Central banks committed to an inflation-targeting monetary regime are bound to respond to spikes in inflation expectations and to a widening output gap in a clear and transparent way by abiding by a Taylor rule. There are various reports of central banks being more responsive to inflationary than to deflationary shocks, rendering the monetary policy response non-linear. Moreover, there is no guarantee that the coefficients remain stable over time. Central banks may also switch to a dual-target regime that considers deviations of both inflation and the output gap. The estimation of a Taylor rule may therefore require a non-linear model with time-varying parameters. This paper uses Bayesian forecasting methods to predict short-term interest rates. We take two different approaches: from a theoretical perspective, we focus on an augmented version of the Taylor rule that includes the real exchange rate, the credit-to-GDP ratio and the net public debt-to-GDP ratio; we also take an "atheoretic" approach based on the expectations theory of the term structure to model short-term interest rates. The selection of priors is particularly relevant for predictive accuracy, yet, ideally, forecasting models should require as little a priori expert insight as possible. We present recent developments in prior selection; in particular, we propose the use of hierarchical hyper-g priors for better forecasting in a framework that can be easily extended to other key macroeconomic indicators.
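The hierarchical hyper-g priors proposed in the third essay place a prior on g itself; as a simpler hedged illustration, the sketch below shows the fixed-g special case (Zellner's g-prior), under which the posterior mean is a shrunken OLS fit. The augmented Taylor-rule regressors and all numbers are illustrative assumptions.

import numpy as np

def g_prior_posterior_mean(X, y, g=100.0):
    # Zellner g-prior, beta ~ N(0, g * sigma^2 * (X'X)^{-1}):
    # the posterior mean shrinks OLS by the factor g / (1 + g)
    beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
    return (g / (1.0 + g)) * beta_ols

# Hypothetical regressors: inflation gap, output gap, real exchange rate,
# credit-to-GDP and net public debt-to-GDP ratios
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
y = X @ np.array([1.5, 0.5, 0.2, 0.1, -0.1]) + 0.3 * rng.standard_normal(200)
print(g_prior_posterior_mean(X, y))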
Abstract:
In this article we use factor models to describe a certain class of covariance structures for financial time series models. More specifically, we concentrate on situations where the factor variances are modeled by a multivariate stochastic volatility structure. We build on previous work by allowing the factor loadings, in the factor model structure, to have a time-varying structure and to capture changes in asset weights over time, motivated by applications with multiple time series of daily exchange rates. We explore and discuss potential extensions of the models presented here in the prediction area. This discussion leads to open issues on real-time implementation and natural model comparisons.
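A minimal simulation sketch of the model class described, assuming one common factor with AR(1) log-volatility and random-walk loadings; all parameter values are illustrative, not the paper's estimates.

import numpy as np

rng = np.random.default_rng(2)
T, p = 1000, 3  # time points, number of series (e.g., daily exchange rates)

# Log-variance of the common factor follows an AR(1): a basic SV process
h = np.zeros(T)
for t in range(1, T):
    h[t] = 0.95 * h[t - 1] + 0.2 * rng.standard_normal()
factor = np.exp(h / 2) * rng.standard_normal(T)

# Time-varying loadings evolve as slow random walks
loadings = 1.0 + np.cumsum(0.01 * rng.standard_normal((T, p)), axis=0)

# Observed series: loadings * factor + idiosyncratic noise (std 0.5)
y = loadings * factor[:, None] + 0.5 * rng.standard_normal((T, p))

# Implied time-varying covariance at the last time point:
# Sigma_t = exp(h_t) * loadings_t loadings_t' + 0.25 * I
Sigma_T = np.exp(h[-1]) * np.outer(loadings[-1], loadings[-1]) + 0.25 * np.eye(p)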
Abstract:
The past decade has witnessed a series of (well accepted and defined) financial crisis periods in the world economy. Most of these events are country specific and eventually spread across neighboring countries, with the concept of vicinity extrapolating geographic maps and entering contagion maps. Unfortunately, what contagion represents and how to measure it are still unanswered questions. In this article we measure the transmission of shocks by cross-market correlation coefficients, following Forbes and Rigobon's (2000) notion of shift-contagion. Our main contribution relies upon the use of traditional factor model techniques combined with stochastic volatility models to study the dependence among Latin American stock price indexes and the North American index. More specifically, we concentrate on situations where the factor variances are modeled by a multivariate stochastic volatility structure. From a theoretical perspective, we improve currently available methodology by allowing the factor loadings, in the factor model structure, to have a time-varying structure and to capture changes in the series' weights over time. By doing this, we believe that changes and interventions experienced by those five countries are well accommodated by our models, which learn and adapt reasonably fast to those economic and idiosyncratic shocks. We empirically show that the time-varying covariance structure can be modeled by one or two common factors and that some sort of contagion is present in most of the series' covariances during periods of economic instability, or crisis. Open issues on real-time implementation and natural model comparisons are thoroughly discussed.
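The abstract's factor-SV machinery does not fit in a short sketch, but the Forbes and Rigobon shift-contagion notion it starts from has a compact form: the crisis-window correlation is corrected for the mechanical rise caused by higher source-market variance. The window choices and simulated data below are assumptions.

import numpy as np

def forbes_rigobon_adjusted_corr(src_crisis, dst_crisis, src_calm):
    # Heteroskedasticity-adjusted correlation:
    #   rho* = rho / sqrt(1 + delta * (1 - rho^2)),
    # delta = relative increase in the source market's variance
    rho = np.corrcoef(src_crisis, dst_crisis)[0, 1]
    delta = np.var(src_crisis) / np.var(src_calm) - 1.0
    return rho / np.sqrt(1.0 + delta * (1.0 - rho ** 2))

# Simulated calm and crisis windows for a source and a destination index
rng = np.random.default_rng(3)
src_calm = rng.standard_normal(500)
src_crisis = 3.0 * rng.standard_normal(250)
dst_crisis = 0.4 * src_crisis + rng.standard_normal(250)
print(forbes_rigobon_adjusted_corr(src_crisis, dst_crisis, src_calm))

Contagion in the shift sense requires the adjusted correlation, not merely the raw one, to rise significantly during the crisis window.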
Abstract:
This paper investigates the relationship between consumer demand and corporate performance in several consumer industries in the UK, using two independent datasets. It uses data on consumer expenditures and the retail price index to estimate Almost Ideal Demand Systems on micro-data and compute time-varying price elasticities of demand for disaggregated commodity groups. Then, it matches the product definitions to the Standard Industry Classification and uses the estimated elasticities to investigate the impact of consumer behaviour on firm-level profitability equations. The time-varying household characteristics are ideal instruments for the demand effects in the firms' supply equation. The paper concludes that demand elasticities have a significant and tangible impact on the profitability of UK firms and that this impact can shed some light on the relationship between market structure and economic performance.
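As a hedged sketch of how price elasticities are read off an estimated Almost Ideal Demand System (one common linear approximation, not necessarily the exact formula the paper uses), the uncompensated elasticities follow from the estimated gamma and beta coefficients and the budget shares:

import numpy as np

def laids_price_elasticities(gamma, beta, w):
    # Linear-approximate AIDS uncompensated elasticities:
    #   e_ij = -delta_ij + (gamma_ij - beta_i * w_j) / w_i
    n = len(w)
    e = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            e[i, j] = -(i == j) + (gamma[i, j] - beta[i] * w[j]) / w[i]
    return e

# Illustrative two-good system: estimated coefficients and budget shares
gamma = np.array([[0.05, -0.05], [-0.05, 0.05]])
beta = np.array([0.02, -0.02])
w = np.array([0.6, 0.4])
print(laids_price_elasticities(gamma, beta, w))

Re-estimating on successive household samples yields the time-varying elasticities that are then fed into the firm-level profitability equations.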
Abstract:
This work evaluates empirically the Taylor rule for the US and Brazil using the Kalman filter and Markov-switching regimes. We show that the parameters of the rule change significantly with variations in both output and output gap proxies, considering hidden variables and states. Such conclusions call naturally for robust optimal monetary rules. We also show that Brazil and the US have very contrasting parameters, first because Brazil presents a time-varying intercept, and second because of the rigidity of the parameters of the Brazilian Taylor rule, regardless of the output gap proxy, data frequency or sample. Finally, we show that the long-run inflation parameter of the US Taylor rule is less than one in many periods, contrasting strongly with Orphanides (forthcoming) and Clarida, Galí and Gertler (2000); the same happens with Brazilian monthly data.
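A minimal sketch of the Kalman-filtering idea for a Taylor rule with random-walk coefficients; the noise variances and the simulated data are assumptions, and the paper's Markov-switching component is not reproduced.

import numpy as np

def kalman_tvp_regression(y, X, q=1e-4, r=0.1):
    # State:       beta_t = beta_{t-1} + w_t,   w_t ~ N(0, q*I)
    # Measurement: y_t = x_t' beta_t + e_t,     e_t ~ N(0, r)
    T, k = X.shape
    beta, P = np.zeros(k), np.eye(k)
    path = np.zeros((T, k))
    for t in range(T):
        x = X[t]
        P = P + q * np.eye(k)          # predict state covariance
        S = x @ P @ x + r              # innovation variance
        K = P @ x / S                  # Kalman gain
        beta = beta + K * (y[t] - x @ beta)
        P = P - np.outer(K, x @ P)
        path[t] = beta
    return path

# Hypothetical Taylor-rule data: intercept, inflation and output gap
rng = np.random.default_rng(4)
X = np.column_stack([np.ones(300), rng.standard_normal((300, 2))])
y = X @ np.array([2.0, 1.5, 0.5]) + 0.2 * rng.standard_normal(300)
paths = kalman_tvp_regression(y, X)

The filtered path of the inflation coefficient is what gets compared against the long-run value of one.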
Abstract:
In this paper, we show that widely used stationarity tests, such as the KPSS test, have power close to size in the presence of time-varying unconditional variance. We propose a new test as a complement to the existing tests. Monte Carlo experiments show that the proposed test possesses the following characteristics: (i) in the presence of a unit root or a structural change in the mean, the proposed test is as powerful as the KPSS and other tests; (ii) in the presence of a changing variance, the traditional tests perform badly whereas the proposed test has high power compared to the existing tests; (iii) the proposed test has the same size as traditional stationarity tests under the null hypothesis of covariance stationarity. An application to daily observations of the return on the US Dollar/Euro exchange rate reveals the existence of instability in the unconditional variance when the entire sample is considered, but stability is found in sub-samples.
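For reference, the standard level-stationarity KPSS statistic that the abstract benchmarks against can be computed directly; the proposed new test is not reproduced here, and the bandwidth rule and data are assumptions.

import numpy as np

def kpss_stat(y, lags=None):
    # eta = T^{-2} * sum_t S_t^2 / s2, with S_t the partial sums of the
    # demeaned series and s2 a Bartlett-kernel long-run variance estimate
    y = np.asarray(y, dtype=float)
    T = len(y)
    e = y - y.mean()
    S = np.cumsum(e)
    if lags is None:
        lags = int(np.floor(4 * (T / 100.0) ** 0.25))  # common rule of thumb
    s2 = e @ e / T
    for l in range(1, lags + 1):
        w = 1.0 - l / (lags + 1.0)
        s2 += 2.0 * w * (e[l:] @ e[:-l]) / T
    return S @ S / (T ** 2 * s2)

# A series with constant mean but a variance break: the case where the
# abstract reports that traditional tests perform badly
rng = np.random.default_rng(5)
x = rng.standard_normal(500) * np.where(np.arange(500) < 250, 1.0, 3.0)
print(kpss_stat(x))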
Abstract:
This work analyzes the relationship between monetary policy and inflation persistence in the recent period, after the introduction of the inflation-targeting regime in Brazil. Using a simplified New Keynesian model, the degree of persistence of the inflation gap is modeled as a function of the weights of the monetary policy rule. The temporal evolution of the Taylor rule is confronted with the estimated inflation-gap persistence curve, showing that changes in the conduct of monetary policy lead to changes in the level of inflation persistence in the economy. An adaptation of the model, with a Taylor rule that incorporates output-gap expectations, reaches the same results with greater precision.
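The structural mapping from policy-rule weights to persistence is specific to the paper's New Keynesian model; as a purely illustrative companion, the sketch below measures inflation-gap persistence as a rolling AR(1) coefficient (the window length is an assumption).

import numpy as np

def rolling_ar1(x, window=48):
    # Rolling AR(1) coefficient of the demeaned series as a
    # simple persistence measure
    out = np.full(len(x), np.nan)
    for t in range(window, len(x)):
        seg = x[t - window:t] - x[t - window:t].mean()
        y0, y1 = seg[:-1], seg[1:]
        out[t] = (y0 @ y1) / (y0 @ y0)
    return out

# Simulated inflation gap whose persistence falls halfway through the sample
rng = np.random.default_rng(6)
x = np.zeros(400)
for t in range(1, 400):
    phi = 0.9 if t < 200 else 0.4
    x[t] = phi * x[t - 1] + rng.standard_normal()
persistence = rolling_ar1(x)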
Abstract:
Composite resins have been subjected to structural modifications aiming at improved optical and mechanical properties. The present study consisted of an in vitro evaluation of the staining behavior of two nanohybrid resins (NH1 and NH2), a nanoparticulated resin (NP) and a microhybrid resin (MH). Samples of these materials were prepared and immersed in commonly ingested drinks, i.e., coffee, red wine and acai berry, for periods of time varying from 1 to 60 days. Cylindrical samples of each resin were shaped using a metallic die and polymerized for 30 s on both the bottom and the top of each disk. All samples were polished and immersed in the staining solutions. After 24 hours, three samples of each resin immersed in each solution were removed and placed in a spectrophotometer for analysis. To that end, the samples were previously diluted in HCl at 50%. Tukey tests were carried out in the statistical analysis of the results. The results revealed a clear difference in the staining behavior of each material. The nanoparticulated resin did not show better color stability than the microhybrid resin. Moreover, all resins stained with time. The degree of staining decreased in the sequence nanoparticulated, microhybrid, nanohybrid NH2 and NH1. Wine was the most aggressive drink, followed by coffee and acai berry. SEM and image analysis revealed significant porosity on the surface of the MH resin and relatively large pores on an NP sample. The NH2 resin was characterized by a homogeneous dispersion of particles and limited porosity. Finally, the NH1 resin displayed the lowest porosity level. The results revealed that staining is likely related to the concentration of inorganic particles and to surface porosity.
Abstract:
The predictive control technique has gained, in recent years, a greater number of adherents by reason of the ease of adjustment of its parameters, the extension of its concepts to multi-input/multi-output (MIMO) systems, the fact that nonlinear process models can be linearized around an operating point and thus used directly in the controller, and, mainly, for being the only methodology that can take into consideration, during controller design, the limitations of the control signals and of the process output. The time-varying weighting generalized predictive control (TGPC) studied in this work is one more alternative among the several existing predictive controllers. It is a modification of the generalized predictive control (GPC) in which a reference model, calculated in accordance with design parameters previously established by the designer, is used together with a new criterion function that, when minimized, yields the best parameters for the controller. Genetic algorithms are used to minimize the proposed criterion function, and the robustness of the TGPC is demonstrated through the application of performance, stability and robustness criteria. To compare the results achieved by the TGPC controller, GPC and proportional-integral-derivative (PID) controllers are used, with all techniques applied to stable, unstable and non-minimum-phase plants. The simulated examples are carried out with the use of the MATLAB tool. It is verified that the modifications implemented in the TGPC demonstrate the efficiency of this algorithm.
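The TGPC criterion function itself is not reproduced here; the sketch below shows the generic genetic-algorithm minimization step the abstract relies on, with a toy quadratic cost standing in for the controller-tuning criterion. All GA settings are illustrative assumptions.

import numpy as np

def genetic_minimize(cost, bounds, pop=40, gens=100, seed=0):
    # Real-coded GA: tournament selection, blend crossover,
    # Gaussian mutation and elitism over box-bounded parameters
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    x = rng.uniform(lo, hi, size=(pop, len(lo)))
    best, best_f = x[0].copy(), np.inf
    for _ in range(gens):
        f = np.array([cost(ind) for ind in x])
        if f.min() < best_f:
            best_f, best = f.min(), x[f.argmin()].copy()
        i, j = rng.integers(0, pop, (2, pop))
        parents = np.where((f[i] < f[j])[:, None], x[i], x[j])  # tournaments
        a = rng.random((pop, len(lo)))
        kids = a * parents + (1 - a) * parents[rng.permutation(pop)]
        kids += 0.05 * (hi - lo) * rng.standard_normal(kids.shape)
        x = np.clip(kids, lo, hi)
        x[0] = best  # elitism: keep the best solution found so far
    return best, best_f

# Toy quadratic standing in for the TGPC criterion function
cost = lambda p: (p[0] - 1.2) ** 2 + (p[1] + 0.4) ** 2
print(genetic_minimize(cost, [(-5, 5), (-5, 5)]))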
Abstract:
Postsurgical hypertension may occur as a complication in cardiac patients. To decrease the chance of complications it is necessary to reduce elevated blood pressure as soon as possible. Continuous infusion of vasodilator drugs, such as sodium nitroprusside (Nipride), quickly lowers the blood pressure in most patients. However, each patient has a different sensitivity to the infusion of Nipride. The parameters and the time delays of the system are initially unknown. Moreover, the parameters of the transfer function associated with a particular patient are time-varying. The objective of this study is to develop a procedure for blood pressure control in the presence of parameter uncertainty and considerable time delays. To that end, a multi-model methodology was developed, in which a predictive controller is designed a priori for each model. An adaptive mechanism is then needed to decide which controller should be dominant for a given plant.
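A minimal sketch of the multi-model switching idea, assuming a toy bank of first-order gain models for drug sensitivity: the controller whose model best explains the recent input/output record becomes dominant. All names and numbers are illustrative.

import numpy as np

def select_controller(u_hist, y_hist, model_bank):
    # Pick the model index with the smallest squared prediction error
    # over the recent input/output history
    errors = [np.sum((y_hist - predict(u_hist)) ** 2)
              for predict in model_bank]
    return int(np.argmin(errors))

# Hypothetical bank of gain models spanning the range of sensitivities
gains = [0.5, 1.0, 2.0]
model_bank = [lambda u, k=k: k * u for k in gains]
u = np.array([1.0, 1.0, 1.0])     # recent infusion rates
y = 1.9 * u                       # plant behaves like the most sensitive model
print(select_controller(u, y, model_bank))  # -> index 2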
Abstract:
Complex network analysis is a powerful tool for the research of complex systems such as brain networks. This work aims to describe the topological changes in neural functional connectivity networks of the neocortex and hippocampus during slow-wave sleep (SWS) in animals submitted to exposure to a novel experience. Slow-wave sleep is an important sleep stage in which reverberations of the electrical activity patterns of wakefulness occur, playing a fundamental role in memory consolidation. Despite its importance, there is a lack of studies characterizing the topological dynamics of functional connectivity networks during this sleep stage, and no studies describing the topological modifications that novelty exposure induces in these networks. We observed that several topological properties were modified after novelty exposure and that these modifications persist for a long time. Most of these novelty-induced changes in topological properties are related to fault tolerance.
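A hedged sketch of the kind of pipeline such studies use: threshold a correlation matrix into a functional connectivity graph and summarize its topology. The threshold, the metrics chosen and the random data are assumptions, not the thesis's protocol.

import networkx as nx
import numpy as np

def topology_summary(corr, threshold=0.5):
    # Build a graph by thresholding absolute correlations,
    # then report common topological properties
    adj = (np.abs(corr) > threshold) & ~np.eye(len(corr), dtype=bool)
    g = nx.from_numpy_array(adj.astype(int))
    return {
        "mean_degree": 2 * g.number_of_edges() / g.number_of_nodes(),
        "clustering": nx.average_clustering(g),
        "components": nx.number_connected_components(g),
    }

# Illustrative symmetric matrix standing in for channel-pair correlations
rng = np.random.default_rng(7)
A = rng.uniform(-1, 1, (10, 10))
corr = (A + A.T) / 2
np.fill_diagonal(corr, 1.0)
print(topology_summary(corr))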
Abstract:
Wavelet coding has emerged as an alternative coding technique to mitigate the fading effects of wireless channels. This work evaluates the performance of wavelet coding, in terms of bit error probability, over time-varying, frequency-selective multipath Rayleigh fading channels. The adopted propagation model follows the COST 207 norm, the main international standards reference for GSM, UMTS and EDGE applications. The results show wavelet coding's efficiency against the intersymbol interference that characterizes these communication scenarios. The robustness of the presented technique enables its usage in different environments, bringing it one step closer to application in practical wireless communication systems.
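The wavelet-coded, frequency-selective COST 207 setup is beyond a short sketch; as a much simpler baseline for the same performance metric, the code below Monte Carlo estimates the BER of uncoded BPSK over flat Rayleigh fading with coherent detection. Everything here is an illustrative assumption, not the work's simulation.

import numpy as np

def bpsk_rayleigh_ber(snr_db, n_bits=200_000, seed=0):
    # Monte Carlo BER of uncoded BPSK over flat Rayleigh fading,
    # assuming perfect channel knowledge at the receiver
    rng = np.random.default_rng(seed)
    bits = rng.integers(0, 2, n_bits)
    s = 2.0 * bits - 1.0
    h = (rng.standard_normal(n_bits)
         + 1j * rng.standard_normal(n_bits)) / np.sqrt(2)
    sigma = np.sqrt(1.0 / (2.0 * 10 ** (snr_db / 10.0)))
    n = sigma * (rng.standard_normal(n_bits)
                 + 1j * rng.standard_normal(n_bits))
    r = h * s + n
    detected = (np.real(np.conj(h) * r) > 0).astype(int)
    return np.mean(detected != bits)

print(bpsk_rayleigh_ber(10))  # ~2.3e-2 at 10 dB, near the analytic value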
Abstract:
Navigation based on visual feedback for robots working in a closed environment can be obtained by installing a camera on each robot (local vision system). However, this solution requires a camera and local processing capacity for each robot. When possible, a global vision system is a cheaper solution for this problem. In this case, one camera or a small number of cameras covering the whole workspace can be shared by the entire team of robots, saving the cost of a large number of cameras and of the associated processing hardware needed in a local vision system. This work presents the implementation and experimental results of a global vision system for mobile mini-robots, using robot soccer as the test platform. The proposed vision system consists of a camera, a frame grabber and a computer (PC) for image processing. The PC is responsible for the team motion control, based on the visual feedback, sending commands to the robots through a radio link. In order for the system to be able to unequivocally recognize each robot, each one has a label on its top consisting of two colored circles. Image processing algorithms were developed for the efficient computation, in real time, of the position of all objects (robots and ball) and the orientation of the robots. A major problem encountered was labeling the color of each colored point of the image, in real time, under time-varying illumination conditions. To overcome this problem, an automatic camera calibration based on the K-means clustering algorithm was implemented. This method guarantees that similar pixels are clustered around a unique color class. The experimental results obtained show that the position and orientation of each robot can be obtained with a precision of a few millimeters. The updating of position and orientation was attained in real time, analyzing 30 frames per second.
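A minimal sketch of the K-means color calibration step, assuming Lloyd's algorithm over RGB pixel samples; the cluster count, iteration budget and pixel data are illustrative, not the system's actual calibration routine.

import numpy as np

def kmeans_colors(pixels, k=4, iters=20, seed=0):
    # Lloyd's k-means over RGB pixels: the returned centers act as the
    # color classes used to label points; recalibrating periodically
    # tracks gradual illumination changes
    rng = np.random.default_rng(seed)
    centers = pixels[rng.choice(len(pixels), k, replace=False)].astype(float)
    for _ in range(iters):
        d = np.linalg.norm(pixels[:, None, :] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = pixels[labels == c].mean(axis=0)
    return centers, labels

# Illustrative pixel cloud: two noisy color blobs
rng = np.random.default_rng(8)
blob = lambda rgb: rgb + 10 * rng.standard_normal((200, 3))
pixels = np.vstack([blob(np.array([200.0, 50.0, 50.0])),
                    blob(np.array([50.0, 50.0, 200.0]))])
centers, labels = kmeans_colors(pixels, k=2)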
Abstract:
Ionospheric scintillations are caused by time-varying electron density irregularities in the ionosphere, occurring more often at equatorial and high latitudes. This paper focuses exclusively on experiments undertaken in Europe, at geographic latitudes between ~50°N and ~80°N, where a network of GPS receivers capable of monitoring Total Electron Content and ionospheric scintillation parameters was deployed. The widely used ionospheric scintillation indices S4 and σφ represent a practical measure of the intensity of amplitude and phase scintillation affecting GNSS receivers. However, they do not provide sufficient information regarding the actual tracking errors that degrade GNSS receiver performance. Suitable receiver tracking models, sensitive to ionospheric scintillation, allow the computation of the variance of the output error of the receiver PLL (Phase Locked Loop) and DLL (Delay Locked Loop), which expresses the quality of the range measurements used by the receiver to calculate user position. The ability of such models to incorporate phase and amplitude scintillation effects into the variance of these tracking errors underpins our proposed method of applying relative weights to measurements from different satellites. That gives the least squares stochastic model used for position computation a more realistic representation, vis-a-vis the otherwise 'equal weights' model. For pseudorange processing, relative weights were computed so that a 'scintillation-mitigated' solution could be performed and compared to the (non-mitigated) 'equal weights' solution. An improvement of between 17 and 38% in height accuracy was achieved when an epoch-by-epoch differential solution was computed over baselines ranging from 1 to 750 km. The method was then compared with alternative approaches that can be used to improve the least squares stochastic model, such as weighting according to satellite elevation angle and by the inverse of the square of the standard deviation of the code/carrier divergence (σCCDiv). The influence of multipath effects on the proposed mitigation approach is also discussed. With the use of high-rate scintillation data in addition to the scintillation indices, a carrier-phase-based mitigated solution was also implemented and compared with the conventional solution. During a period of high phase scintillation it was observed that problems related to ambiguity resolution can be reduced by the use of the proposed mitigated solution.
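The weighting idea reduces to ordinary weighted least squares with per-satellite measurement variances; the sketch below shows that pattern, with an illustrative geometry matrix and variances standing in for the PLL/DLL tracking-model outputs (all numbers are assumptions).

import numpy as np

def weighted_least_squares(A, y, sigma2):
    # W = diag(1/sigma2): scintillation-affected satellites get larger
    # variances and hence smaller weights in the position solution
    W = np.diag(1.0 / np.asarray(sigma2))
    N = A.T @ W @ A
    return np.linalg.solve(N, A.T @ W @ y)

# Illustrative geometry matrix (line-of-sight rows plus a clock column)
# and pseudorange residuals for five satellites
A = np.array([[0.3, 0.4, 0.87, 1.0],
              [-0.5, 0.1, 0.86, 1.0],
              [0.7, -0.2, 0.69, 1.0],
              [0.1, 0.9, 0.42, 1.0],
              [-0.6, -0.6, 0.53, 1.0]])
y = np.array([1.2, -0.4, 0.8, 0.3, -1.1])
sigma2 = np.array([1.0, 1.0, 4.0, 1.0, 9.0])  # larger = scintillation-affected
print(weighted_least_squares(A, y, sigma2))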