10 results for WORKING MEMORY
in the digital repository of Fundação Getúlio Vargas - FGV
Abstract:
Chambers (1998) explores the interaction between long memory and aggregation. For continuous-time processes, he takes the aliasing effect into account when studying temporal aggregation. For discrete-time processes, however, he seems to fail to do so. This note gives the spectral density function of temporally aggregated long memory discrete-time processes in light of the aliasing effect. The results are different from those in Chambers (1998) and are supported by a small simulation exercise. As a result, the order of integration may not be invariant to temporal aggregation, specifically if d is negative and the aggregation is of the stock type.
Abstract:
This paper investigates the presence of long memory in financial time series using four test statistics: V/S, KPSS, KS and modified R/S. There has been a large amount of study of long memory behavior in economic and financial time series. However, there is still no consensus. We argue in this paper that spurious short memory may be found due to the incorrect use of a data-dependent bandwidth for estimating the long-run variance. We propose a partially adaptive lag truncation procedure that is robust against the presence of long memory under the alternative hypothesis and revisit several economic and financial time series using the proposed bandwidth choice. Our results indicate the existence of spurious short memory in real exchange rates when Andrews' formula is employed, but long memory is detected when the proposed lag truncation procedure is used. Using stock market data, we also find short memory in returns and long memory in volatility.
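The bandwidth issue the abstract raises enters the modified R/S statistic through the long-run variance in its denominator. A minimal sketch of Lo's modified R/S statistic with a Bartlett-weighted long-run variance at lag truncation q (function name and interface are illustrative, not the paper's code; q = 0 recovers the classical R/S):

```python
import numpy as np

def modified_rs(x, q):
    """Lo's modified R/S statistic. The long-run variance uses
    Bartlett weights up to lag truncation q; the abstract argues
    that a data-driven choice of q can yield spurious results."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    dev = x - x.mean()
    s = np.cumsum(dev)
    rng = s.max() - s.min()              # range of demeaned partial sums
    lrv = np.mean(dev * dev)             # lag-0 autocovariance
    for j in range(1, q + 1):
        w = 1.0 - j / (q + 1.0)          # Bartlett weight
        lrv += 2.0 * w * np.mean(dev[j:] * dev[:-j])
    return rng / (np.sqrt(lrv) * np.sqrt(n))
```

For an alternating series of +1/-1 the partial sums have range 1 and unit variance, so with q = 0 and n = 100 the statistic is 1/10.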
Abstract:
This paper investigates the relationship between memory and the essentiality of money. We consider a random matching economy with a large finite population in which commitment is not possible and memory is limited in the sense that only a fraction m ∈ (0, 1) of the population has publicly observable histories. We show that no matter how limited memory is, there exists a social norm that achieves the first best regardless of the population size. In other words, money can fail to be essential irrespective of the amount of memory in the economy. This suggests that the emphasis on limited memory as a fundamental friction for money to be essential deserves a deeper examination.
Abstract:
A well-established fact in monetary theory is that a key ingredient for the essentiality of money is its role as a form of memory. In this paper we study a notion of memory that includes information about an agent's past actions and trading opportunities but, in contrast to Kocherlakota (1998), does not include information about the past actions and trading opportunities of an agent's past partners. We first show that the first-best can be achieved with memory even if it only includes information about an agent's very recent past. Thus, money can fail to be essential even if memory is minimal. We then establish, more interestingly, that if information about trading opportunities is not part of an agent's record, then money can be better than memory. This shows that the societal benefit of money lies not only in being a record of past actions, but also in being a record of past trading opportunities, a fact that has been overlooked by the monetary literature.
Abstract:
This paper derives the spectral density function of aggregated long memory processes in light of the aliasing effect. The results are different from previous analyses in the literature, and a small simulation exercise provides evidence in our favour. The main result points to flow aggregates from long memory processes being less biased than stock aggregates, although both retain the degree of long memory. This result is illustrated with the daily US Dollar/French Franc exchange rate series.
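For reference, the frequency-folding (aliasing) relation that such a derivation builds on is a textbook identity, not the paper's specific result: if Y_t = X_{mt} is a stock (skip-sampled) aggregate of a process with spectral density f_X, then

```latex
f_Y(\lambda) \;=\; \frac{1}{m}\sum_{j=0}^{m-1} f_X\!\left(\frac{\lambda + 2\pi j}{m}\right),
\qquad \lambda \in [-\pi,\pi].
```

For a flow aggregate, f_X is first multiplied by the squared gain of the m-period summation filter, sin²(mω/2)/sin²(ω/2), before the folding is applied.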
Abstract:
We study constrained efficient aggregate risk sharing and its consequence for the behavior of macro-aggregates in a dynamic Mirrlees's (1971) setting. Privately observed idiosyncratic productivity shocks are assumed to be independent of i.i.d. publicly observed aggregate shocks. Yet, private allocations display memory with respect to past aggregate shocks, when idiosyncratic shocks are also i.i.d. Under a mild restriction on the nature of optimal allocations, the result extends to more persistent idiosyncratic shocks, for all but the limit at which idiosyncratic risk disappears, and the model collapses to a pure heterogeneity repeated Mirrlees economy identical to Werning [2007]. When preferences are iso-elastic we show that an allocation is memoryless only if it displays a strong form of separability with respect to aggregate shocks. Separability characterizes the pure heterogeneity limit as well as the general case with log preferences. With less than full persistence and risk aversion different from unity, both memory and non-separability characterize optimal allocations. Exploiting the fact that non-separability is associated with state-varying labor wedges, we apply a business cycle accounting procedure (e.g. Chari et al. [2007]) to the aggregate data generated by the model. We show that, whenever risk aversion is greater than one, our model produces efficient counter-cyclical labor wedges.
Abstract:
Convex combinations of long memory estimates using the same data observed at different sampling rates can decrease the standard deviation of the estimates, at the cost of inducing a slight bias. The convex combination of such estimates requires a preliminary correction for the bias observed at lower sampling rates, reported by Souza and Smith (2002). Through Monte Carlo simulations, we investigate the bias and the standard deviation of the combined estimates, as well as the root mean squared error (RMSE), which takes both into account. Comparing standard methods with their combined versions, the latter achieve lower RMSE for the two semi-parametric estimators under study (by about 30% on average for ARFIMA(0,d,0) series).
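As a sketch of why combining estimates can lower the RMSE, consider the minimum-variance convex combination of two unbiased estimates of d. The function names are hypothetical, and this omits the preliminary bias correction that the paper applies to the lower-sampling-rate estimate:

```python
def optimal_weight(var1, var2, cov12=0.0):
    """Weight w* minimizing the variance of w*d1 + (1-w)*d2
    for two unbiased estimates with variances var1, var2
    and covariance cov12."""
    return (var2 - cov12) / (var1 + var2 - 2.0 * cov12)

def combined_variance(var1, var2, cov12=0.0):
    """Variance of the optimally weighted convex combination."""
    w = optimal_weight(var1, var2, cov12)
    return w**2 * var1 + (1.0 - w)**2 * var2 + 2.0 * w * (1.0 - w) * cov12
```

With two uncorrelated estimates of equal variance 1, the optimal weight is 1/2 and the combined variance drops to 1/2; the combination is never worse than the better single estimate.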
Abstract:
This paper studies the electricity hourly load demand in the area covered by a utility situated in the southeast of Brazil. We propose a stochastic model which employs generalized long memory (by means of Gegenbauer processes) to model the seasonal behavior of the load. The model is proposed for sectional data, that is, each hour's load is studied separately as a single series. This approach avoids modeling the intricate intra-day pattern (load profile) displayed by the load, which varies throughout days of the week and seasons. The forecasting performance of the model is compared with a SARIMA benchmark using the years of 1999 and 2000 as the out-of-sample period. The model clearly outperforms the benchmark. We conclude that the series display generalized long memory.
Abstract:
This paper reinterprets results of Ohanissian et al. (2003) to show the asymptotic equivalence of temporally aggregating series and using less bandwidth in estimating long memory by Geweke and Porter-Hudak's (1983) estimator, provided that the same number of periodogram ordinates is used in both cases. This equivalence is in the sense that their joint distribution is asymptotically normal with common mean, common variance and unit correlation. Furthermore, I prove that the same applies to the estimator of Robinson (1995). Monte Carlo simulations show that this asymptotic equivalence is a good approximation in finite samples. Moreover, a real example with the daily US Dollar/French Franc exchange rate series is provided.
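The Geweke and Porter-Hudak estimator at the center of this equivalence is a log-periodogram regression. A minimal sketch, with the function name and the default bandwidth m = √n as illustrative assumptions rather than the paper's choices:

```python
import numpy as np

def gph_estimate(x, m=None):
    """GPH log-periodogram regression: regress log I(lambda_j) on
    log(4 sin^2(lambda_j / 2)) over the first m Fourier frequencies;
    minus the slope estimates the long memory parameter d."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if m is None:
        m = int(np.sqrt(n))                        # common rule-of-thumb bandwidth
    lam = 2.0 * np.pi * np.arange(1, m + 1) / n    # Fourier frequencies
    dft = np.fft.fft(x - x.mean())
    periodogram = np.abs(dft[1:m + 1]) ** 2 / (2.0 * np.pi * n)
    regressor = np.log(4.0 * np.sin(lam / 2.0) ** 2)
    slope = np.polyfit(regressor, np.log(periodogram), 1)[0]
    return -slope
```

The equivalence discussed in the abstract compares running such a regression on an aggregated series against running it on the original series with a smaller m, keeping the number of periodogram ordinates equal.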