9 results for US macroeconomic variables
in the Aston University Research Archive
Abstract:
This article tests whether macroeconomic variables and market sentiment influence the size of momentum profits. It finds that although returns to the winner and loser portfolios are influenced by a range of macroeconomic and market-wide variables, momentum profits are influenced only by the scale of portfolio outflows. Thus, when investors are sending their capital elsewhere, the reduced funds at home dampen the profitability of the momentum trading strategy. It also finds that when the market closes below its opening level in the previous six months, momentum profits are higher, which might be a reflection of mean reversion in the market. © 2004 Taylor and Francis Ltd.
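A minimal sketch of the kind of winner-minus-loser spread such a study examines is given below. It is purely illustrative and not the article's own procedure: the column names (stock, date, ret), the decile breakpoints and the one-month holding period are assumptions, and the resulting spread would then be related to macroeconomic, sentiment and fund-flow variables.

```python
# Illustrative sketch (not the article's code): a 6-month formation,
# winner-minus-loser momentum spread from a long panel of monthly returns.
# Column names and decile breakpoints are assumptions.
import pandas as pd

def momentum_spread(returns: pd.DataFrame, formation: int = 6) -> pd.Series:
    """Monthly winner-minus-loser return from a panel with columns
    ['stock', 'date', 'ret'] (one row per stock-month)."""
    panel = returns.pivot(index="date", columns="stock", values="ret").sort_index()
    # Cumulative return over the formation window, lagged one month to avoid look-ahead
    formation_ret = (1 + panel).rolling(formation).apply(lambda r: r.prod() - 1).shift(1)
    ranks = formation_ret.rank(axis=1, pct=True)
    winners = panel.where(ranks >= 0.9)   # top decile of past performers
    losers = panel.where(ranks <= 0.1)    # bottom decile
    return (winners.mean(axis=1) - losers.mean(axis=1)).dropna()
```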
Abstract:
We examine the efficiency of multivariate macroeconomic forecasts by estimating a vector autoregressive model on the forecast revisions of four variables (GDP, inflation, unemployment and wages). Using a data set of professional forecasts for the G7 countries, we find evidence of cross-series revision dynamics. Specifically, forecast revisions are conditionally correlated with the lagged forecast revisions of other macroeconomic variables, and the sign of the correlation is as predicted by conventional economic theory. This indicates that forecasters are slow to incorporate news across variables. We show that this finding can be explained by forecast underreaction.
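As a rough illustration of the approach (not the authors' estimation code), the sketch below fits a small VAR to revision series; the DataFrame of GDP, inflation, unemployment and wage revisions is a placeholder. Under forecast efficiency the off-diagonal lag coefficients should be jointly insignificant, so significant cross-series coefficients signal the slow incorporation of news.

```python
# Minimal sketch, assuming a DataFrame `revisions` of forecast revisions with
# columns ['gdp', 'inflation', 'unemployment', 'wages']; not the authors' code.
import pandas as pd
from statsmodels.tsa.api import VAR

def fit_revision_var(revisions: pd.DataFrame, lags: int = 1):
    """Estimate a VAR(lags) on forecast revisions and report the lag coefficients."""
    result = VAR(revisions.dropna()).fit(lags)
    print(result.summary())   # cross-series lag coefficients and standard errors
    return result
```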
Abstract:
Divisia money is a monetary aggregate that gives each component asset an assigned weight. We use an evolutionary neural network to calculate new Divisia weights for each component, utilising Bank of England monetary data for the U.K. We propose a new monetary aggregate using our newly derived weights to carry out quantitative inflation prediction. The results show that this new monetary aggregate has better inflation-forecasting performance than the traditionally constructed Bank of England Divisia money. This result is important for monetary policymakers, as improved construction of monetary aggregates will yield tighter relationships between key macroeconomic variables and, ultimately, greater macroeconomic control. Research is ongoing to establish the extent of the increased information content and parameter stability of this new monetary aggregate.
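For reference, the conventional Törnqvist–Theil Divisia construction, the textbook benchmark against which the neural-network-derived weights are compared, can be written as follows; this is the standard formulation, not the paper's evolutionary weighting scheme.

```latex
% Conventional Törnqvist–Theil Divisia aggregate (textbook benchmark, not the
% paper's evolutionary-neural-network weights). m_{it} is the real balance of
% component i, r_{it} its own rate of return, and R_t the benchmark rate.
\[
  s_{it} = \frac{(R_t - r_{it})\, m_{it}}{\sum_{j=1}^{n} (R_t - r_{jt})\, m_{jt}},
  \qquad
  \bar{s}_{it} = \tfrac{1}{2}\bigl(s_{it} + s_{i,t-1}\bigr),
\]
\[
  \Delta \ln M_t^{D} = \sum_{i=1}^{n} \bar{s}_{it}\,
    \bigl(\ln m_{it} - \ln m_{i,t-1}\bigr).
\]
% The expenditure shares s_{it} are the component weights that the paper
% replaces with weights learned by the evolutionary neural network.
```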
Abstract:
The purpose of this study is to provide a comparative analysis of the efficiency of Islamic and conventional banks in Gulf Cooperation Council (GCC) countries. In this study, we explain the inefficiencies obtained by introducing firm-specific as well as macroeconomic variables. Our findings indicate that over the eight years of the study, conventional banks largely outperform Islamic banks, with an average technical efficiency score of 81% compared to 95.57%. However, since 2008 the efficiency of conventional banks has been on a downward trend, while the efficiency of their Islamic counterparts has been on an upward trend since 2009. This indicates that Islamic banks succeeded in maintaining their level of efficiency during the subprime crisis period. Finally, for the whole sample, the analysis demonstrates a strong link between macroeconomic indicators and efficiency for GCC banks. Surprisingly, we have not found any significant relationship in the case of Islamic banks.
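Technical efficiency scores of this kind are typically obtained from a DEA linear program. The sketch below shows a generic input-oriented CCR efficiency score as an illustration of the technique; it is not the study's own estimator, the choice of inputs and outputs is left open, and the second-stage regression on firm-specific and macroeconomic variables is omitted.

```python
# Illustrative input-oriented CCR DEA efficiency score (generic technique,
# not the study's own code). X: (m inputs x n banks), Y: (s outputs x n banks).
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X: np.ndarray, Y: np.ndarray, unit: int) -> float:
    """Solve  min theta  s.t.  X @ lam <= theta * x0,  Y @ lam >= y0,  lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    x0, y0 = X[:, unit], Y[:, unit]
    c = np.concatenate(([1.0], np.zeros(n)))          # decision vector: [theta, lam]
    A_inputs = np.hstack((-x0.reshape(-1, 1), X))     # X lam - theta x0 <= 0
    A_outputs = np.hstack((np.zeros((s, 1)), -Y))     # -Y lam <= -y0
    res = linprog(c,
                  A_ub=np.vstack((A_inputs, A_outputs)),
                  b_ub=np.concatenate((np.zeros(m), -y0)),
                  bounds=[(None, None)] + [(0, None)] * n)
    return float(res.x[0])                            # theta in (0, 1]
```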
Abstract:
In this study, we developed a DEA-based performance measurement methodology that is consistent with performance assessment frameworks such as the Balanced Scorecard. The methodology takes into account the direct or inverse relationships that may exist among the dimensions of performance in order to construct appropriate production frontiers. The production frontiers we obtained are deemed appropriate as they consist solely of firms with desirable levels for all dimensions of performance, levels at least equal to the critical values set by decision makers. The properties and advantages of our methodology over competing methodologies are presented through an application to a real-world case study of retail firms operating in the US. A comparative analysis between the new methodology and existing methodologies explains why the existing approaches fail to define appropriate production frontiers when directly or inversely related dimensions of performance are present, and why they fail to express the interrelationships between those dimensions.
Abstract:
Most of the common techniques for estimating conditional probability densities are inappropriate for applications involving periodic variables. In this paper we introduce three novel techniques for tackling such problems, and investigate their performance using synthetic data. We then apply these techniques to the problem of extracting the distribution of wind vector directions from radar scatterometer data gathered by a remote-sensing satellite.
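A standard way to place a probability density on a periodic variable such as wind direction is a mixture of von Mises (circular normal) components; the sketch below evaluates such a mixture as an illustration of the kind of circular density involved. It shows only the unconditional form; the techniques in the paper make the mixture parameters functions of the inputs (the scatterometer measurements).

```python
# Illustrative von Mises mixture density on the circle (unconditional form only;
# the paper's techniques condition the parameters on the input data).
import numpy as np
from scipy.special import i0   # modified Bessel function of the first kind, order 0

def von_mises_mixture_pdf(theta, weights, means, kappas):
    """Density at angle(s) theta (radians) of a K-component von Mises mixture;
    weights sum to one, means are mean directions, kappas are concentrations."""
    theta = np.asarray(theta, dtype=float)[..., None]         # broadcast over components
    comp = np.exp(np.asarray(kappas) * np.cos(theta - np.asarray(means)))
    comp /= 2.0 * np.pi * i0(np.asarray(kappas))
    return comp @ np.asarray(weights, dtype=float)
```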
Abstract:
Interpolated data are an important part of environmental information exchange, as many variables can only be measured at discrete sampling locations. Spatial interpolation is a complex operation that has traditionally required expert treatment, making automation a serious challenge. This paper presents a few lessons learnt from INTAMAP, a project that is developing an interoperable web processing service (WPS) for the automatic interpolation of environmental data using advanced geostatistics, adopting a Service Oriented Architecture (SOA). The “rainbow box” approach we followed provides access to the functionality at a whole range of different levels. We show here how the integration of open standards, open source software and powerful statistical processing capabilities allows us to automate a complex process while offering users a level of access and control that best suits their requirements. This facilitates benchmarking exercises as well as the regular reporting of environmental information without requiring remote users to have specialized skills in geostatistics.
Abstract:
We uncover high persistence in credit spread series that can obscure the relationship between the theoretical determinants of credit risk and observed credit spreads. We use a Markov-switching model, which also captures the stability (low-frequency changes) of credit ratings, to show why credit spreads may continue to respond to past levels of credit risk even though the state of the economy has changed. A bivariate model of credit spreads and either macroeconomic activity or equity market volatility detects large and significant correlations that are consistent with theory but have not been observed in previous studies. © 2010 Nova Science Publishers, Inc. All rights reserved.
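A minimal sketch of a two-regime Markov-switching regression of this kind is shown below, assuming spread and volatility are aligned pandas Series; the exact specification used in the study (regressors, number of regimes, switching variance) may differ.

```python
# Minimal sketch, not the study's exact specification: a 2-state
# Markov-switching regression of credit spreads on equity market volatility.
import statsmodels.api as sm

def fit_switching_model(spread, volatility):
    """Regime-dependent intercept, slope and variance let the spread respond
    differently to the credit-risk proxy in the two states."""
    model = sm.tsa.MarkovRegression(spread, k_regimes=2,
                                    exog=volatility,
                                    switching_variance=True)
    result = model.fit()
    print(result.summary())
    return result
```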
Abstract:
Contingent protection has grown to become an important trade-restricting device. In the European Union, protection instruments such as antidumping are used extensively. This paper analyses whether macroeconomic pressures may help to explain the variations in the intensity of antidumping protectionism in the EU. The empirical analysis uses count data models, applying various specification tests to derive the most appropriate specification. Our results suggest that filing activity is inversely related to macroeconomic conditions. Moreover, they confirm existing evidence for the US suggesting that domestic macroeconomic pressures are a more important determinant of contingent protection policy than external pressures.
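The count-data approach can be illustrated as follows: regressing the number of antidumping filings per period on macroeconomic conditions with Poisson and negative binomial models, with overdispersion guiding the choice between them. The variable names below are placeholders rather than the paper's actual specification.

```python
# Illustrative count-data models for antidumping filings (placeholder variables,
# not the paper's specification): Poisson and negative binomial regressions.
import statsmodels.api as sm

def fit_count_models(filings, macro_conditions):
    """filings: non-negative integer counts per period;
    macro_conditions: DataFrame of regressors (e.g. GDP growth, exchange rate)."""
    X = sm.add_constant(macro_conditions)
    poisson = sm.Poisson(filings, X).fit()
    negbin = sm.NegativeBinomial(filings, X).fit()
    # Overdispersion favours the negative binomial specification over the Poisson.
    print(poisson.summary())
    print(negbin.summary())
    return poisson, negbin
```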