923 results for Vector controls
Abstract:
This work concerns forecasting with vector nonlinear time series models when the errors are correlated. Point forecasts are obtained numerically using bootstrap methods and illustrated with two examples. The evaluation concentrates on tests of forecast equality and forecast encompassing. Nonlinear impulse responses are further considered and summarized graphically by highest density regions. Finally, two macroeconomic data sets are used to illustrate our work. The forecasts from either the linear or the nonlinear model can contribute useful information that is absent from the forecasts of the other model.
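As a hedged illustration of the bootstrap point-forecasting idea described above, the Python sketch below iterates a hypothetical fitted nonlinear autoregression forward, resampling in-sample residuals as future shocks and averaging the simulated paths. The conditional-mean function g and all parameter values are illustrative stand-ins, not the models estimated in this work.

    import numpy as np

    rng = np.random.default_rng(0)

    def g(y_prev):
        # Hypothetical fitted nonlinear conditional mean (logistic STAR-like).
        return 0.5 * y_prev + 0.4 * y_prev / (1.0 + np.exp(-2.0 * y_prev))

    def bootstrap_forecast(y_last, residuals, horizon, n_paths=5000):
        # Multi-step point forecast: average many simulated future paths.
        # Shocks are resampled i.i.d. here; the correlated errors of the
        # abstract would call for a block/dependent bootstrap instead.
        paths = np.empty((n_paths, horizon))
        for b in range(n_paths):
            y = y_last
            for h in range(horizon):
                y = g(y) + rng.choice(residuals)  # iterate the model forward
                paths[b, h] = y
        return paths.mean(axis=0)                 # point forecast per horizon

    # Usage with made-up residuals; in practice these come from the fit.
    resid = rng.normal(scale=0.3, size=200)
    print(bootstrap_forecast(y_last=0.8, residuals=resid, horizon=4))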
Abstract:
This thesis consists of four manuscripts in the area of nonlinear time series econometrics, on topics of testing, modeling, and forecasting nonlinear common features. The aim of the thesis is to develop new econometric contributions for hypothesis testing and forecasting in these areas. Both stationary and nonstationary time series are considered, and a definition of common features is proposed in a way appropriate to each class. Based on this definition, a vector nonlinear time series model with common features is set up for testing for common features; once well specified, the proposed models can be used for forecasting as well. The first paper addresses a testing procedure for nonstationary time series. A class of nonlinear cointegration, smooth-transition (ST) cointegration, is examined; ST cointegration nests the previously developed linear and threshold cointegration. An F-type test for ST cointegration is derived for the case where the transition variables are stationary rather than nonstationary. The latter render the test nonstandard, while the former keep it standard. This has important implications for empirical work: it is crucial to distinguish between the cases with stationary and nonstationary transition variables so that the correct test is used. The second and fourth papers develop testing approaches for stationary time series. In particular, the vector ST autoregressive (VSTAR) model is extended to allow for common nonlinear features (CNFs); these two papers propose a modeling procedure and derive tests for the presence of CNFs. Building on the model specification and testing contributions above, the third paper considers forecasting with vector nonlinear time series models and extends the procedures available for univariate nonlinear models. The VSTAR model with CNFs and the ST cointegration model from the earlier papers are worked out in detail and thereafter illustrated with two corresponding macroeconomic data sets.
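The smooth-transition mechanism at the heart of ST cointegration and the VSTAR model can be made concrete with a small sketch. Below, a logistic transition function moves the conditional mean smoothly between two autoregressive regimes as a transition variable s crosses a location c; the smoothness gamma, location c, and regime coefficients are illustrative values, not estimates from the thesis.

    import numpy as np

    def G(s, gamma=5.0, c=0.0):
        # Logistic transition: close to 0 in one regime, close to 1 in the
        # other, changing smoothly as s crosses the location parameter c.
        return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

    def star_mean(y_prev, s, phi1=0.2, phi2=0.8):
        # Conditional mean blends two AR(1) regimes with weight G(s).
        w = G(s)
        return (1.0 - w) * phi1 * y_prev + w * phi2 * y_prev

    # Deep in regime 1 (s << c) versus deep in regime 2 (s >> c).
    print(star_mean(1.0, s=-3.0), star_mean(1.0, s=3.0))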
Abstract:
Recent investigations of various quantum-gravity theories have revealed a variety of possible mechanisms that lead to Lorentz violation. One of the more elegant of these mechanisms is spontaneous Lorentz symmetry breaking (SLSB), in which a vector or tensor field acquires a nonzero vacuum expectation value. As a consequence of this symmetry breaking, massless Nambu-Goldstone modes appear with properties similar to the photon in electromagnetism. This thesis considers the most general class of vector field theories that exhibit spontaneous Lorentz violation, known as bumblebee models, and examines their candidacy as potential alternative explanations of electromagnetism, offering the possibility that Einstein-Maxwell theory could emerge as a result of SLSB rather than of local U(1) gauge invariance. To this end we employ Dirac's Hamiltonian constraint analysis to examine the constraint structures and degrees of freedom of three candidate bumblebee models, each with a different potential function, and compare the results to those of electromagnetism. We find that none of the models shares the constraint structure of electromagnetism, and that the number of degrees of freedom of each model exceeds that of electromagnetism by at least two, pointing to the possible existence of massive modes or propagating ghost modes in the bumblebee theories.
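For orientation, the flat-spacetime bumblebee Lagrangian typically analyzed in this literature has the generic form below; the potential V and the sign convention on b^2 (timelike versus spacelike vacuum value) are the standard textbook choices rather than details taken from this thesis:

    \mathcal{L}_B = -\tfrac{1}{4} B_{\mu\nu} B^{\mu\nu} - V\!\left(B_\mu B^\mu \pm b^2\right),
    \qquad B_{\mu\nu} = \partial_\mu B_\nu - \partial_\nu B_\mu .

The potential is minimized at B_\mu B^\mu = \mp b^2, so the field acquires the nonzero vacuum expectation value that spontaneously breaks Lorentz symmetry; the choice of V is precisely what distinguishes the three candidate models compared in the analysis.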
Abstract:
Classical electromagnetism predicts two massless propagating modes, the two polarizations of the photon. If the Lorentz symmetry of classical electromagnetism is spontaneously broken, the resulting theory still has two massless Nambu-Goldstone modes resembling the photon. If the Lorentz symmetry is broken by a bumblebee potential that allows for excitations out of the minimum, massive modes arise as well. Furthermore, in curved spacetime such massive modes are created through a process other than the usual Higgs mechanism, because the bumblebee potential depends on both the vector field and the metric tensor. It is also found that these massive modes do not propagate, owing to the extra constraints.
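Two standard potential choices make this distinction concrete; these are the generic forms used in the bumblebee literature and may differ in detail from the ones studied here. Writing X = B_\mu B^\mu \pm b^2:

    V_\lambda = \lambda\, X \quad \text{(Lagrange-multiplier form: the field } \lambda \text{ enforces } X = 0\text{, confining } B_\mu \text{ to the minimum, so no massive mode arises)}

    V_\kappa = \tfrac{\kappa}{2}\, X^2 \quad \text{(smooth form: quadratic excitations out of the minimum } X = 0 \text{ appear as massive modes)}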
Abstract:
Recently, two international standards organizations, ISO and OGC, have carried out standardization work for GIS. Current standardization work for providing interoperability among GIS databases focuses on the design of open interfaces, but it has not considered procedures and methods for designing river geospatial data, so river geospatial data end up with a model of their own. When data are shared through open interfaces among heterogeneous GIS databases, differences between models result in the loss of information. In this study, a plan was proposed both to respond to these changes in the information environment and to provide a future Smart River-based river information service, by assessing the current state of the river geospatial data model and improving and redesigning the database. Primary and foreign keys, which distinguish attribute information and entity linkages, were redefined to increase usability. The attribute tables and the entity relationship diagram were redefined to redesign the linkages among tables from the perspective of a river standard database. In addition, this study sought to expand the current supplier-oriented operating system into a demand-oriented one by establishing efficient management of river-related information and a utilization system capable of adapting to changes in the river management paradigm.
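As a hedged illustration of the primary-key/foreign-key linkage the study describes, the sketch below uses Python's built-in sqlite3 module; the table and column names are hypothetical, not the study's actual river-standard schema.

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("PRAGMA foreign_keys = ON")  # have SQLite enforce the linkage
    con.execute("""
        CREATE TABLE river (
            river_id   TEXT PRIMARY KEY,  -- uniquely identifies each river
            river_name TEXT NOT NULL
        )""")
    con.execute("""
        CREATE TABLE levee (
            levee_id TEXT PRIMARY KEY,
            river_id TEXT NOT NULL REFERENCES river(river_id)  -- entity linkage
        )""")
    con.execute("INSERT INTO river VALUES ('R001', 'Han River')")
    con.execute("INSERT INTO levee VALUES ('L001', 'R001')")
    # Joining through the shared key relates attribute tables without loss.
    print(con.execute("""SELECT levee_id, river_name
                         FROM levee JOIN river USING (river_id)""").fetchall())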
Abstract:
Abstract not available
Abstract:
This dissertation focuses on the relationship between the accumulation of technological competences and the underlying learning processes. This relationship is examined at Invensys Appliance Controls Ltda., Vacaria/RS unit, over the period from 1981 to 2000. The firm is a metalworking company that supplies the white-goods industry. In other words, the dissertation examines the implications of the underlying technological learning processes for the accumulation of competences in three technological functions: process and production organization, products, and equipment. The learning processes are examined in light of four characteristics: variety, intensity, functioning, and interaction, using analytical frameworks available in the literature. Based on an individual case study, this study found that the manner and speed of technological competence accumulation in the firm are associated with the various processes used to acquire technological knowledge and convert it into organizational knowledge. Moreover, the mere presence of these processes in the firm did not guarantee a positive effect on the accumulation of technological competences. By applying a framework from the literature to an industry different from those of previous studies, this dissertation suggests that an organized, continuous, and integrated effort to generate and disseminate knowledge throughout the firm is needed for the accumulation of technological capability to be accelerated.
Abstract:
Capital controls are in vogue again because emerging countries have reintroduced these measures in recent years in the face of abundant inflows of international capital. Policymakers argue that such measures protect their economies in the event of a "sudden stop" of these flows. It will be shown that capital controls appear to make emerging market economies (EMEs) more resistant to a financial crisis (for example, the fall in economic activity following a crisis is smaller when controls are stronger). However, capital controls also appear to leave EMEs more prone to a crisis. Policymakers should therefore be cautious in weighing the risks and benefits of applying capital control measures.
Abstract:
Capital controls are again in vogue as a number of emerging markets have reintroduced these measures in recent years in response to a “flood” of international capital. Policymakers use these tools to buttress their economies against the “sudden stop” risk that accompanies international capital flows. Using a panel VAR model, we show that capital controls appear to make emerging market economies (EMEs) more resistant to financial crises by showing that lower post-crisis output loss is correlated with stronger capital controls. However, EMEs that employ capital controls seem to be more crisis-prone. Thus, policymakers should carefully evaluate whether the benefits of capital controls outweigh their costs.
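A hedged sketch of the panel VAR idea: demean each country's series to absorb fixed effects, stack the panels, and estimate the VAR(1) coefficient matrix by least squares. The data below are synthetic; the paper's actual variables (output loss, capital control indices, crisis indicators) and estimator are richer than this pooled version.

    import numpy as np

    rng = np.random.default_rng(1)
    N, T, K = 20, 40, 2                      # countries, periods, variables
    A_true = np.array([[0.5, 0.1],
                       [0.0, 0.4]])

    # Simulate a stable panel VAR(1) with country fixed effects.
    Y = np.zeros((N, T, K))
    for i in range(N):
        fe = rng.normal(size=K)
        for t in range(1, T):
            Y[i, t] = fe + Y[i, t - 1] @ A_true.T + rng.normal(scale=0.5, size=K)

    # Within-demean per country, then pool lags and outcomes across the panel.
    Yd = Y - Y.mean(axis=1, keepdims=True)
    X = Yd[:, :-1].reshape(-1, K)            # stacked y_{i,t-1}
    Z = Yd[:, 1:].reshape(-1, K)             # stacked y_{i,t}
    B, *_ = np.linalg.lstsq(X, Z, rcond=None)
    print(B.T)                               # estimate of A_true

(Within-demeaning with lagged regressors induces Nickell bias in short panels; GMM-style estimators are the usual remedy, omitted here for brevity.)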
Abstract:
Brazil has demonstrated resilience in the face of the recent economic crises and has auspicious development potential projected for the coming decades, which, together with the globalization process, provides important opportunities for our people. We have gradually established ourselves as one of the leading nations in the world and have become a reference on questions of economic equilibrium, development, energy, agriculture, and the environment. This international recognition favors the exchange of experiences with other cultures, governments, and organizations, bringing with it the possibility of stimulating a dynamic process of development and innovation.
Abstract:
The real exchange rate is an important macroeconomic price that affects economic activity, interest rates, domestic prices, and trade and investment flows, among other variables. Methodologies have been developed in the empirical exchange rate misalignment literature to evaluate whether a real effective exchange rate is overvalued or undervalued, and there is a vast body of work on the determinants of the long-run real exchange rate and on empirical strategies to implement the equilibrium norms obtained from theoretical models. This study contributes to this literature by showing that the misalignment can be calculated from a mixed-frequency cointegrated vector error correction framework. An empirical exercise using United States real exchange rate data is performed. The results suggest that the model with mixed-frequency data is preferred to the models with same-frequency variables.
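A minimal sketch of the misalignment calculation in a standard (single-frequency) cointegrated VECM, using statsmodels; the series below are synthetic stand-ins for the real exchange rate and a fundamental, and the paper's mixed-frequency extension is not reproduced.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.vector_ar.vecm import VECM

    rng = np.random.default_rng(2)
    T = 200
    fund = np.cumsum(rng.normal(size=T))               # I(1) fundamental
    rer = 0.8 * fund + rng.normal(scale=0.5, size=T)   # cointegrated with it
    data = pd.DataFrame({"rer": rer, "fund": fund})

    res = VECM(data, k_ar_diff=1, coint_rank=1).fit()
    # The error-correction term beta' y_t measures the deviation of the
    # levels from the estimated long-run relation: the misalignment.
    misalignment = data.values @ res.beta
    print(misalignment[-5:].ravel())                   # most recent estimates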
Abstract:
The synthetic control (SC) method has recently been proposed as an alternative method to estimate treatment effects in comparative case studies. Abadie et al. [2010] and Abadie et al. [2015] argue that one of the advantages of the SC method is that it imposes a data-driven process to select the comparison units, providing more transparency and less discretionary power to the researcher. However, an important limitation of the SC method is that it does not provide clear guidance on the choice of predictor variables used to estimate the SC weights. We show that this lack of specific guidance gives the researcher significant opportunities to search for specifications with statistically significant results, undermining one of the main advantages of the method. Considering six alternative specifications commonly used in SC applications, we calculate in Monte Carlo simulations the probability of finding a statistically significant result at the 5% level in at least one specification. We find that this probability can be as high as 13% (23% for a 10% significance test) when there are 12 pre-intervention periods, and that it decays slowly with the number of pre-intervention periods: with 230 pre-intervention periods, it is still around 10% (18% for a 10% significance test). The specification that uses the average of the pre-treatment outcome values to estimate the weights performs particularly badly in our simulations, but the specification-searching problem remains relevant even when this specification is excluded. We also show that the problem is relevant in simulations with real datasets that look at placebo interventions in the Current Population Survey (CPS). To mitigate the problem, we propose a criterion to select among different SC specifications based on the prediction error of each specification in placebo estimations.
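A hedged sketch of why searching over specifications inflates false positives: suppose each of S = 6 specifications yields a placebo test that is individually valid at the 5% level, but the tests are only partially correlated across specifications. The probability that at least one specification rejects then exceeds 5%, in the spirit of the 13% figure above; the correlation value is illustrative, not calibrated to the paper's SC simulations.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(3)
    S, n_sim, rho = 6, 100_000, 0.7          # specs, simulations, correlation
    cov = rho * np.ones((S, S)) + (1.0 - rho) * np.eye(S)

    # Correlated standard-normal test statistics; each marginal test is exact.
    z = rng.multivariate_normal(np.zeros(S), cov, size=n_sim)
    pvals = 2.0 * (1.0 - norm.cdf(np.abs(z)))
    reject_any = (pvals.min(axis=1) < 0.05).mean()
    print(f"P(at least one of {S} specs significant at 5%): {reject_any:.3f}")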