877 results for Stationary


Relevance:

10.00%

Publisher:

Abstract:

This paper studies a smooth-transition (ST) type of cointegration. The proposed ST cointegration allows for a regime-switching structure in a cointegrated system. It nests the linear cointegration developed by Engle and Granger (1987) and the threshold cointegration studied by Balke and Fomby (1997). We develop F-type tests of linear cointegration against ST cointegration in ST-type cointegrating regression models with or without time trends. The null asymptotic distributions of the tests are derived with stationary transition variables in the ST cointegrating regression models. The tests are shown to have nonstandard limiting distributions, expressed in terms of standard Brownian motion, when the regressors are pure random walks, but standard asymptotic distributions when the regressors contain random walks with nonzero drift. Finite-sample distributions of the tests are studied by Monte Carlo simulation. The small-sample results indicate that the F-type tests have greater power when the system contains ST cointegration than when it is linearly cointegrated. The testing procedures are illustrated with an empirical example based on purchasing power parity (PPP) data (monthly US dollar and Italian lira series and the dollar-lira exchange rate from 1973:01 to 1989:10). It is found that there is no linear cointegration in the system, but that ST-type cointegration does exist in the PPP data.
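
For orientation, a smooth-transition cointegrating regression of the kind studied here can be written generically as follows (an illustrative form assuming a logistic transition function; the paper's exact specification, e.g. the treatment of time trends, may differ):

    y_t = x_t' \beta + x_t' \theta \, G(s_t; \gamma, c) + u_t,
    \qquad
    G(s_t; \gamma, c) = \bigl(1 + \exp\{-\gamma (s_t - c)\}\bigr)^{-1}, \quad \gamma > 0,

where s_t is a stationary transition variable. With \theta = 0 the model reduces to the linear cointegration of Engle and Granger (1987), while a very large \gamma makes G approach a step function, approximating the threshold cointegration of Balke and Fomby (1997).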

Relevance:

10.00%

Publisher:

Abstract:

This thesis consists of four manuscripts in the area of nonlinear time series econometrics, on topics of testing, modeling and forecasting nonlinear common features. The aim of the thesis is to develop new econometric contributions for hypothesis testing and forecasting in this area. Both stationary and nonstationary time series are considered, and a definition of common features appropriate to each class is proposed. Based on this definition, a vector nonlinear time series model with common features is set up for testing for common features; once well specified, the proposed models can also be used for forecasting. The first paper addresses a testing procedure for nonstationary time series. A class of nonlinear cointegration, smooth-transition (ST) cointegration, is examined; it nests the previously developed linear and threshold cointegration. An F-type test for ST cointegration is derived when stationary, rather than nonstationary, transition variables are imposed. The latter would render the test standard, while the former make it nonstandard. The second and fourth papers develop testing approaches for stationary time series. In particular, the vector smooth-transition autoregressive (VSTAR) model is extended to allow for common nonlinear features (CNFs), and these two papers propose a modeling procedure and derive tests for the presence of CNFs. Building on model specification via the testing contributions above, the third paper considers forecasting with vector nonlinear time series models, extending procedures available for univariate nonlinear models. The VSTAR model with CNFs and the ST cointegration model from the preceding papers are worked through in detail and then illustrated on two corresponding macroeconomic data sets.

Relevance:

10.00%

Publisher:

Abstract:

We study the photoassociation of Bose-Einstein condensed atoms into molecules using an optical cavity field. The driven cavity field introduces a dynamical degree of freedom into the photoassociation process, whose role in determining the stationary behavior has not previously been considered. The semiclassical stationary solutions for the atoms and molecules, as well as for the intracavity field, are found, and their stability and scaling properties are determined in terms of experimentally controllable parameters, including the cavity driving amplitude and the nonlinear interactions between atoms and molecules. For weak cavity driving, we find that a bifurcation in the atom and molecule numbers occurs, signaling a transition from a stable steady state to nonlinear Rabi oscillations. For a strongly driven cavity, the atom and molecule numbers exhibit bistability.

Relevance:

10.00%

Publisher:

Abstract:

Background: Voice processing in real time is challenging. A drawback of previous work on Hypokinetic Dysarthria (HKD) recognition is the requirement of controlled settings in a laboratory environment. A personal digital assistant (PDA) has been developed for home assessment of Parkinson's disease (PD) patients. The PDA offers sound-processing capabilities, which allow a module for recognition and quantification of HKD to be developed. Objective: To compose an algorithm for assessment of PD speech severity in the home environment based on a review synthesis. Methods: A two-tier review methodology is utilized. The first tier focuses on real-time problems in speech detection. In the second tier, acoustic features that are robust to medication changes in levodopa-responsive patients are investigated for HKD recognition. Keywords such as "Hypokinetic Dysarthria" and "speech recognition in real time" were used in the search engines. IEEE Xplore produced the most useful hits compared with Google Scholar, ELIN, EBRARY, PubMed and LIBRIS. Results: Vowel and consonant formants are the acoustic parameters most relevant to reflecting PD medication changes. Since the relevant speech segments (consonants and vowels) contain only a minority of the speech energy, intelligibility can be improved by amplifying the voice signal using amplitude compression. Pause detection and peak-to-average power ratio calculations for voice segmentation produce rich voice features in real time. Voice segmentation can be further enhanced by including the zero-crossing rate (ZCR): consonants have a high ZCR, whereas vowels have a low ZCR. The wavelet transform is found promising for voice analysis, since it represents non-stationary voice signals in terms of scale and translation parameters; in this way, voice intelligibility in the waveform can be analyzed in each time frame. Conclusions: This review evaluated HKD recognition algorithms in order to develop a tool for PD speech home assessment using modern mobile technology. An algorithm that tackles real-time constraints in HKD recognition, based on the review synthesis, is proposed. We suggest that speech features may be further processed using wavelet transforms and fed to a neural network for detection and quantification of speech anomalies related to PD. Based on this model, patients' speech can be automatically categorized according to UPDRS speech ratings.
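
As a rough sketch of the segmentation features discussed above (frame energy for pause detection, the peak-to-average power ratio, and the zero-crossing rate), the following Python example shows one possible implementation; the frame length, thresholds and NumPy-based code are illustrative assumptions, not the algorithm proposed in the review:

    import numpy as np

    def frame_features(signal, fs, frame_ms=30):
        """Per-frame average power, peak-to-average power ratio (PAPR) and
        zero-crossing rate (ZCR) for a mono voice signal (illustrative only)."""
        signal = np.asarray(signal, dtype=float)
        n = int(fs * frame_ms / 1000)                      # samples per frame
        feats = []
        for start in range(0, len(signal) - n + 1, n):
            frame = signal[start:start + n]
            power = np.mean(frame ** 2) + 1e-12            # average frame power
            papr = np.max(frame ** 2) / power              # peak-to-average power ratio
            zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2.0  # zero-crossing rate
            feats.append((power, papr, zcr))
        return np.array(feats)

    def segment(feats, energy_thr=1e-4, zcr_thr=0.25):
        """Label each frame as 'pause' (low energy), 'consonant' (high ZCR)
        or 'vowel' (voiced, low ZCR). Thresholds are hypothetical."""
        labels = []
        for power, _papr, zcr in feats:
            if power < energy_thr:
                labels.append("pause")
            elif zcr > zcr_thr:
                labels.append("consonant")                 # consonants: high ZCR
            else:
                labels.append("vowel")                     # vowels: low ZCR
        return labels

Frames labelled as vowels or consonants could then be handed to the wavelet-based analysis and, eventually, to a classifier such as the neural network suggested in the conclusions.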

Relevance:

10.00%

Publisher:

Abstract:

In this study, the monitoring results from a prototype installation of a recently developed solar combisystem have been evaluated. The system, which uses a water-jacketed pellet stove as auxiliary heater, was installed in a single-family house in Borlänge, Sweden. To allow an evaluation under realistic conditions, the system was monitored for a period of one year. The measurements show that it is important for the pellet stove to have a sufficient buffer store volume to minimize cycling. They also show that the stove delivers a lower share of the produced heat to the water loop than was measured under stationary conditions. The solar system works as expected, covering the heat demand during the summer and part of the heat demand during spring and autumn. Potential for optimization exists in the parasitic electricity demand: the system consumes 680 kWh per year for pumps, valves and controllers, which is more than 4% of the total primary heating energy demand.
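
A quick arithmetic check of the reported parasitic share (assuming the 4% figure is taken with respect to the same annual total):

    680\ \mathrm{kWh} / 0.04 = 17\,000\ \mathrm{kWh},

so a parasitic consumption of 680 kWh amounting to more than 4% implies that the total primary heating energy demand is below roughly 17 MWh per year.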

Relevance:

10.00%

Publisher:

Abstract:

Gaseous and particulate emissions from a residential pellet boiler and a pellet stove were measured over a realistic six-day operation sequence and during steady-state operation. The aim is to characterize the emissions during each phase in order to identify when the major part of the emissions occurs, so that emission-reduction measures can be targeted where the savings are highest. The characterized emissions comprise carbon monoxide (CO), nitrogen oxide (NO), total organic carbon (TOC) and particulate matter (PM2.5). In this study, emissions were characterized by mass concentration, and emissions during the start-up and stop phases were also presented as accumulated mass. The influence of the start-up and stop phases on the emissions, and average emission factors for the boiler and the stove, were analysed using the measured data from the six-day test. The shares of start-up and stop emissions are significant for CO and TOC, contributing 95% and 89%, respectively, for the 20 kW boiler and 82% and 89%, respectively, for the 12 kW stove. NO and particle emissions are shown to dominate during stationary operation.
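
To make the reported shares concrete, the sketch below shows how a start/stop share and an average emission factor can be computed from accumulated masses; every number (and the fuel-energy basis) is a hypothetical placeholder, not a value from the study:

    # Illustrative share/emission-factor calculation; all numbers below are
    # hypothetical placeholders, not measured values from the study.
    accumulated_co_mg = {"start": 1800.0, "stop": 600.0, "stationary": 150.0}  # CO, mg
    fuel_energy_mj = 900.0            # assumed fuel energy input over the sequence, MJ

    total_mg = sum(accumulated_co_mg.values())
    start_stop_share = (accumulated_co_mg["start"] + accumulated_co_mg["stop"]) / total_mg
    emission_factor = total_mg / fuel_energy_mj               # average factor, mg/MJ

    print(f"start/stop share of CO: {start_stop_share:.0%}")  # ~94 % with these numbers
    print(f"average CO emission factor: {emission_factor:.1f} mg/MJ")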

Relevance:

10.00%

Publisher:

Abstract:

Developing successful navigation and mapping strategies is an essential part of autonomous robot research. However, hardware limitations often make for inaccurate systems. This project investigates efficient alternatives for mapping an environment, by first creating a mobile robot and then applying machine learning to the robot and its controlling systems to increase the robustness of the overall system. My mapping system consists of a semi-autonomous robot drone in communication with a stationary Linux computer system, with learning systems running on both the robot and the more powerful Linux machine. The first stage of this project was devoted to designing and building an inexpensive robot. Drawing on my prior experience from independent studies in robotics, I designed a small mobile robot well suited to simple navigation and mapping research. Once the major components of the robot base were designed, I began to implement the design. This involved physically constructing the base of the robot, as well as researching and acquiring components such as sensors. Implementing the more complex sensors became a time-consuming task, involving much research and assistance from a variety of sources. A concurrent stage of the project involved researching and experimenting with different types of machine learning systems. I eventually settled on neural networks as the machine learning system to incorporate into the project. Neural nets can be thought of as structures of interconnected nodes through which information filters. The type of neural net I chose requires a known data set that is used to train the net to produce the desired output. Neural nets are particularly well suited for use with robotic systems, as they can handle cases that lie at the extreme edges of the training set, such as those produced by "noisy" sensor data. Through experimenting with available neural net code, I became familiar with the code and its function, and modified it to be more generic and reusable for multiple applications of neural nets.
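
As a generic illustration of the kind of supervised ("known data set") neural network described above, the following minimal NumPy sketch trains a small feedforward net on toy two-sensor data; it is not the neural-net code used in the project, and the data, layer size and learning rate are arbitrary assumptions:

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy "sensor" data: two readings per sample, one binary target (hypothetical).
    X = rng.uniform(-1.0, 1.0, size=(200, 2))
    y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)   # XOR-like pattern

    # One hidden layer of 8 sigmoid units, trained by plain batch gradient descent.
    W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
    W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    lr = 0.5
    for _ in range(5000):
        h = sigmoid(X @ W1 + b1)                 # hidden-layer activations
        out = sigmoid(h @ W2 + b2)               # network output
        d_out = (out - y) * out * (1 - out)      # squared-error gradient at the output
        d_h = (d_out @ W2.T) * h * (1 - h)       # gradient backpropagated to the hidden layer
        W2 -= lr * (h.T @ d_out) / len(X); b2 -= lr * d_out.mean(axis=0)
        W1 -= lr * (X.T @ d_h) / len(X);   b1 -= lr * d_h.mean(axis=0)

    print("training accuracy:", ((out > 0.5) == y).mean())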

Relevance:

10.00%

Publisher:

Abstract:

Hydrogen diffusion in pure iron and in DIN 90MnV8 steel was studied using the electrochemical permeation method. In these materials the apparent hydrogen diffusion coefficient was observed to decrease over the course of the permeation transients, a behaviour interpreted as being caused by the presence of reversible traps. For permeation transients obtained between two steady states of diffusion, it was derived that the change in the concentration of hydrogen held in reversible traps is given by

    \Delta C_a = N_a \ln\!\left[ \frac{(1+\alpha)^{1/2}}{(1+\beta)^{1/2}} \right],
    \qquad \alpha = \frac{k C_0}{p}, \qquad \beta = \frac{k C_1}{p},

where N_a is the trap density, k and p are the capture and release parameters of a trap, and C_0 and C_1 are the initial and final hydrogen concentrations. An average hydrogen diffusion coefficient of 6.4 × 10⁻⁵ cm²/s was measured in annealed iron; deforming the iron by 50% cold work reduces this value by roughly one order of magnitude. In DIN 90MnV8 steel the hydrogen diffusion coefficient was found to increase with increasing tempering temperature after quenching, reaching a maximum of 7.0 × 10⁻⁶ cm²/s in the spheroidized steel. The temperature dependence of the diffusion coefficient in spheroidized DIN 90MnV8 steel can be expressed as

    D = 1.95\,(\pm 0.49)\, \exp\!\left( -\frac{30.43\ \mathrm{kJ/mol}}{RT} \right)\ \mathrm{cm^2/s},

and, in the same steel quenched and tempered to 55 HRC hardness, as

    D = 0.29\,(\pm 0.05)\, \exp\!\left( -\frac{28.47\ \mathrm{kJ/mol}}{RT} \right)\ \mathrm{cm^2/s}.

A new fracture-mechanics methodology was introduced, which allows the critical hydrogen concentration required to make a stationary crack advance to be determined. In DIN 90MnV8 steel, the critical hydrogen concentration was observed to decrease with increasing hardness of the steel and to be lower when a higher value of K_I is applied. The fracture mode exhibited by this steel proved to be independent of the hydrogen concentration and of the value of K_I, depending solely on its hardness.
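
As an illustrative evaluation of the Arrhenius fit for the spheroidized steel (assuming room temperature, T ≈ 298 K, and R = 8.314 J mol⁻¹ K⁻¹, values not stated in the abstract):

    D(298\ \mathrm{K}) \approx 1.95\, \exp\!\left( -\frac{30\,430\ \mathrm{J/mol}}{8.314\ \mathrm{J\,mol^{-1}\,K^{-1}} \times 298\ \mathrm{K}} \right)
    \approx 1.95\, e^{-12.3} \approx 9 \times 10^{-6}\ \mathrm{cm^2/s},

which is of the same order of magnitude as the 7.0 × 10⁻⁶ cm²/s maximum measured for the spheroidized steel.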

Relevance:

10.00%

Publisher:

Abstract:

This work proposes a system for measuring torque in rotating devices that uses electrical-resistance strain gauges bonded directly to the components of the mechanical arrangement under analysis. A set of electronic circuits was specially developed to sense the small strains that occur in the rotating devices. The system operates without electromechanical contact between the stationary and the rotating parts. To this end, a design and construction methodology was also developed for the rotary transformers used to transfer the power that supplies the electronic circuits attached to the instrumented mechanical element. It was also necessary to use a frequency-modulated transmitter for the electrical signal proportional to the measured torque. A comparative analysis of the results obtained with existing systems and those achieved with the technique proposed in this work demonstrates its applicability in a variety of practical situations.
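
For context, a standard relation from strain-gauge torque metrology (background knowledge, not stated in the abstract, and subject to the actual system's calibration) links the strain measured by gauges bonded at ±45° to the axis of a solid shaft of diameter d and shear modulus G to the transmitted torque:

    T = \frac{\pi d^3}{16}\,\tau = \frac{\pi d^3 G}{8}\,\varepsilon_{45},
    \qquad \text{since } \tau = G\gamma \text{ and } \gamma = 2\,\varepsilon_{45},

where \varepsilon_{45} is the normal strain read by the ±45° gauges and \tau the surface shear stress.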

Relevance:

10.00%

Publisher:

Abstract:

The objective of this work is the analysis of concrete gravity dams from the construction phase until they are fully in service. The construction phase is analysed first; there, the fundamental problem is the thermal stresses arising from the heat of hydration. The finite element method is employed to solve the heat-transfer and stress problems. The influence of layered construction is introduced by redefining the finite element mesh immediately after the placement of each concrete lift. Special attention is given to the problem of cracking in plain concrete structures. Some commonly used models are presented and their efficiency is discussed. Smeared-crack models have been preferred, owing to the several drawbacks of discrete formulations. These models, however, yield results that depend on the finite element mesh, and some additional consideration must be made to correct these distortions. This problem is usually addressed by adopting a reduced tensile strength defined as a function of the fracture energy of the material. In this work it is shown that this procedure is not satisfactory, and a new formulation for the analysis of large concrete structures is proposed. The stress analysis during the construction stage of the dam is carried out with a viscoelastic constitutive model with aging for the concrete. Because of the aging, the stiffness matrix of the structure varies in time and would have to be redefined and factorized at every instant. This entails a large computational effort, especially when the dam is built in many lifts. To avoid this drawback, an iterative procedure is adopted that allows the stiffness matrix to be redefined at only a few reference ages. In a second stage of the analysis, the dam is subjected to the hydrostatic pressure and to a seismic excitation. The dynamic analysis is performed considering the motion of the coupled dam-reservoir-foundation system. The earthquake is treated as a non-stationary stochastic process, and the safety of the structure is assessed with respect to the main failure modes.
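
The thermal part of the construction-phase problem is governed by the standard transient heat-conduction equation with an internal heat-of-hydration source (the generic form assumed here for orientation; the specific hydration-heat law and boundary conditions are those defined in the work itself):

    \rho c\,\frac{\partial T}{\partial t} = \nabla\cdot(k\,\nabla T) + \dot{Q}_h(t),

where \rho c is the volumetric heat capacity of the concrete, k its thermal conductivity and \dot{Q}_h the rate of heat release due to cement hydration; this field equation is the one discretized by the finite element method, with the mesh redefined as each new lift is placed.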

Relevance:

10.00%

Publisher:

Abstract:

In this work we analyse stochastic processes with polynomial (also called hyperbolic) decay of the autocorrelation function. Our study focuses on the class of ARFIMA processes and on processes obtained from iterations of the Manneville-Pomeau map. The main objectives are to compare several estimation methods for the fractional parameter of the ARFIMA process, in both the stationary and the nonstationary settings, and, in addition, to obtain similar results for the parameter of the Manneville-Pomeau process. Among the various estimation methods for the parameters of these two processes, we highlight the one based on wavelet theory, as it showed the best performance.
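
For reference, the ARFIMA(p, d, q) process referred to above is usually written as (standard definition, stated here for clarity):

    \Phi(B)\,(1 - B)^{d}\, X_t = \Theta(B)\,\varepsilon_t, \qquad \varepsilon_t \sim \text{white noise},

where B is the backshift operator and d the fractional (long-memory) parameter. For 0 < d < 1/2 the process is stationary with hyperbolically decaying autocorrelations, \rho(k) \sim C\, k^{2d-1} as k \to \infty, while d \ge 1/2 corresponds to the nonstationary case; the same polynomial decay of the autocorrelation function characterizes the processes derived from the Manneville-Pomeau map.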

Relevance:

10.00%

Publisher:

Abstract:

The presence of a deterministic or stochastic trend in U.S. GDP has been a continuing debate in the macroeconomics literature. Ben-David and Papell (1995) found evidence in favor of trend stationarity using the secular sample of Maddison (1995). More recently, Murray and Nelson (2000) correctly criticized this finding, arguing that the Maddison data are plagued with additive outliers (AOs), which bias inference towards stationarity. Hence, they propose to set the secular sample aside and conduct inference using a more homogeneous but shorter post-WWII sample. In this paper we revisit the Maddison data by employing a test that is robust against AOs. Our results suggest that U.S. GDP can be modeled as a trend-stationary process.
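
The competing specifications at issue can be summarized generically as (standard forms, stated here for clarity):

    \text{trend stationary:}\quad y_t = \mu + \beta t + u_t, \quad u_t\ \text{stationary};
    \qquad \text{unit root:}\quad \Delta y_t = \beta + u_t;
    \qquad \text{AO-contaminated observation:}\quad z_t = y_t + \delta\,\mathbf{1}\{t = \tau\}.

As the abstract notes, additive outliers of the last form distort standard inference on the first two specifications, which is what motivates the AO-robust test employed here.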

Relevance:

10.00%

Publisher:

Abstract:

Several empirical studies in the literature have documented the existence of a positive correlation between income inequality and unemployment. I provide a theoretical framework under which this correlation can be better understood. The analysis is based on a dynamic model of job search under uncertainty. I start by proving the uniqueness of a stationary distribution of wages in the economy. Drawing on this distribution, I provide a general expression for the Gini coefficient of income inequality. The expression has the advantage of not requiring a particular specification of the distribution of wage offers. Next, I show how the Gini coefficient varies as a function of the parameters of the model, and how it can be expected to be positively correlated with the rate of unemployment. Two examples are offered. The first, of a technical nature, shows that convergence of the measures implied by the underlying Markov process can fail in some cases. The second provides a quantitative assessment of the model and of the mechanism linking unemployment and inequality.
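
One standard way to write the Gini coefficient of a nonnegative wage distribution with c.d.f. F and mean \mu (a textbook identity, not necessarily the paper's own expression, which is derived from the stationary wage distribution of the search model) is

    G = 1 - \frac{1}{\mu}\int_0^{\infty} \bigl(1 - F(w)\bigr)^{2}\, dw,
    \qquad \mu = \int_0^{\infty} \bigl(1 - F(w)\bigr)\, dw,

which, like the expression in the paper, does not require a particular parametric specification of the wage-offer distribution.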

Relevance:

10.00%

Publisher:

Abstract:

Using national accounts data for the revenue-GDP and expenditure-GDP ratios from 1947 to 1992, we examine two central issues in public finance. First, was the path of public debt sustainable during this period? Second, if debt is sustainable, how has the government historically balanced the budget after shocks to either revenues or expenditures? The results show that (i) the public deficit is stationary (bounded asymptotic variance), with the budget in Brazil being balanced almost entirely through changes in taxes, regardless of the cause of the initial imbalance; expenditures are weakly exogenous, but tax revenues are not; (ii) a rational Brazilian consumer can behave in a way consistent with Ricardian equivalence; and (iii) seigniorage revenues are critical to restoring intertemporal budget equilibrium, since, when they are excluded from total revenues, debt is not sustainable in the econometric tests.
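
The sustainability notion being tested can be stated through the standard intertemporal budget constraint (a generic one-period identity with a constant interest rate r, shown here for orientation; the paper's tests are conducted on revenue-GDP and expenditure-GDP ratios, but the logic is the same):

    B_t = (1 + r)\, B_{t-1} + G_t - T_t,
    \qquad
    B_t = \sum_{j=1}^{\infty} \frac{E_t\,[\,T_{t+j} - G_{t+j}\,]}{(1+r)^{j}}
    \quad \text{provided} \quad
    \lim_{n \to \infty} \frac{E_t\,[\,B_{t+n}\,]}{(1+r)^{n}} = 0,

i.e. debt must be backed by expected future primary surpluses under a no-Ponzi (transversality) condition; stationarity of the deficit, as found here, is consistent with this condition.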

Relevance:

10.00%

Publisher:

Abstract:

In this paper we investigate fiscal sustainability using a quantile autoregression (QAR) model. We propose a novel methodology to separate periods of nonstationarity from stationary ones, which allows us to identify trajectories of public debt that are compatible with fiscal sustainability. We use such trajectories to construct a debt ceiling, that is, the largest value of public debt that does not jeopardize long-run fiscal sustainability. We produce out-of-sample forecasts of this ceiling and show how it could be used by policy makers interested in keeping public debt on a sustainable path. We illustrate the applicability of our results using Brazilian data.
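
A first-order quantile autoregression of the type used can be written as (a generic QAR(1) form in the spirit of Koenker and Xiao's quantile autoregression; the paper's exact specification may include further lags):

    Q_{y_t}(\tau \mid y_{t-1}) = \alpha_0(\tau) + \alpha_1(\tau)\, y_{t-1}, \qquad \tau \in (0, 1),

so that the persistence coefficient \alpha_1(\tau) may differ across quantiles: ranges of \tau with \alpha_1(\tau) \ge 1 correspond to locally nonstationary (unit-root or explosive) behaviour of the debt series, while \alpha_1(\tau) < 1 corresponds to locally stationary behaviour, which is what allows sustainable and unsustainable stretches of the debt trajectory to be separated.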