989 results for Empirical processes
Abstract:
When speech is degraded, word report is higher for semantically coherent sentences (e.g., her new skirt was made of denim) than for anomalous sentences (e.g., her good slope was done in carrot). Such increased intelligibility is often described as resulting from "top-down" processes, reflecting an assumption that higher-level (semantic) neural processes support lower-level (perceptual) mechanisms. We used time-resolved sparse fMRI to test for top-down neural mechanisms, measuring activity while participants heard coherent and anomalous sentences presented in speech envelope/spectrum noise at varying signal-to-noise ratios (SNR). The timing of BOLD responses to more intelligible speech provides evidence of hierarchical organization, with earlier responses in peri-auditory regions of the posterior superior temporal gyrus than in more distant temporal and frontal regions. Despite Sentence content × SNR interactions in the superior temporal gyrus, prefrontal regions respond after auditory/perceptual regions. Although we cannot rule out top-down effects, this pattern is more compatible with a purely feedforward or bottom-up account, in which the results of lower-level perceptual processing are passed to inferior frontal regions. Behavioral and neural evidence that sentence content influences perception of degraded speech does not necessarily imply "top-down" neural processes.
Abstract:
Spatial econometrics has been criticized by some economists because some model specifications have been driven by data-analytic considerations rather than having a firm foundation in economic theory. In particular, this applies to the so-called W matrix, which is integral to the structure of endogenous and exogenous spatial lags and to spatial error processes, and which is almost the sine qua non of spatial econometrics. Moreover, it has been suggested that the significance of a spatially lagged dependent variable involving W may be misleading, since it may simply be picking up the effects of omitted spatially dependent variables, incorrectly suggesting the existence of a spillover mechanism. In this paper we review the theoretical and empirical rationale for network dependence and spatial externalities as embodied in spatially lagged variables, arguing that failing to acknowledge their presence at least leads to biased inference, can be a cause of inconsistent estimation, and leads to an incorrect understanding of true causal processes.
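To make the role of W concrete, here is a minimal sketch, using synthetic data assumed for illustration only, of how a row-standardized weights matrix produces the spatially lagged variable Wy that enters specifications such as the spatial autoregressive model y = ρWy + Xβ + ε; the contiguity pattern and values are not taken from the paper.

```python
# Minimal sketch (synthetic data): building a row-standardized W and the
# spatial lag Wy. The contiguity pattern below is an assumed toy example.
import numpy as np

# Binary contiguity for four regions (1 = neighbours).
C = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)

W = C / C.sum(axis=1, keepdims=True)  # row-standardize: each row sums to 1
y = np.array([2.0, 3.5, 1.0, 4.0])    # some regional outcome

Wy = W @ y  # spatial lag: weighted average of neighbouring outcomes
print(Wy)   # e.g. region 1's lag is the mean of regions 2 and 3
```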
Abstract:
This paper estimates individual wage equations in order to test two rival non-nested theories of economic agglomeration, namely New Economic Geography (NEG), as represented by the NEG wage equation, and urban economic (UE) theory, in which wages relate to employment density. The paper makes an original contribution by being, apparently, the first empirical paper to examine the agglomeration processes associated with contemporary theory using micro-level data, highlighting the role of gender and other individual-level characteristics. For male respondents, there is no significant evidence that wage levels are an outcome of the mechanisms suggested by NEG or UE theory, but this is not the case for female respondents. We speculate on the reasons for the gender difference.
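For orientation, the two rival specifications can be stated in stylized form; the notation below is a common textbook rendering assumed for exposition, not the exact equations estimated in the paper.

```latex
% Stylized UE specification: log wages rise with employment density.
\ln w_{ir} = \alpha + \beta \ln(\mathrm{density}_r) + \gamma' x_i + \varepsilon_{ir}

% Stylized NEG wage equation: wages depend on market access
% (income Y_s, trade costs T_{rs}, price indices P_s, elasticity \sigma).
\ln w_r = \kappa + \frac{1}{\sigma} \ln\!\left( \sum_s Y_s \, T_{rs}^{\,1-\sigma} P_s^{\,\sigma-1} \right)
```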
Abstract:
The efficient markets hypothesis implies that arbitrage opportunities in markets such as those for foreign exchange (FX) would be, at most, short-lived. The present paper surveys the fragmented nature of FX markets, revealing that information in these markets is also likely to be fragmented. The “quant” workforce in the hedge fund featured in Robert Harris's novel The Fear Index would have little or no reason to exist in an EMH world. The four-currency combinatorial analysis of arbitrage sequences contained in Cross, Kozyakin, O’Callaghan, Pokrovskii and Pokrovskiy (2012) is then considered. Their results suggest that arbitrage processes, rather than being self-extinguishing, tend to be periodic in nature. This helps explain why arbitrage dealing tends to be endemic in FX markets.
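As a rough illustration of what a combinatorial scan for arbitrage sequences involves, the sketch below enumerates four-currency cycles and flags any whose gross return exceeds one; the quoted rates are made up for the example and ignore bid-ask spreads and transaction costs, so this is not the cited paper's method, only a simple frictionless analogue.

```python
# Enumerate four-currency arbitrage cycles over illustrative mid-rates.
# Rates are assumed values, frictionless, with no bid-ask spread.
from itertools import permutations

rates = {
    ("USD", "EUR"): 0.92,  ("EUR", "USD"): 1.087,
    ("USD", "GBP"): 0.79,  ("GBP", "USD"): 1.266,
    ("USD", "JPY"): 151.0, ("JPY", "USD"): 0.00662,
    ("EUR", "GBP"): 0.858, ("GBP", "EUR"): 1.166,
    ("EUR", "JPY"): 164.1, ("JPY", "EUR"): 0.0061,
    ("GBP", "JPY"): 191.2, ("JPY", "GBP"): 0.00523,
}

def cycle_return(cycle):
    """Gross return from converting one unit around the cycle and back."""
    value = 1.0
    for a, b in zip(cycle, cycle[1:] + cycle[:1]):
        value *= rates[(a, b)]
    return value

for perm in permutations(["EUR", "GBP", "JPY"]):
    cycle = ["USD", *perm]
    r = cycle_return(cycle)
    flag = "arbitrage" if r > 1.0 else "no arbitrage"
    print(" -> ".join(cycle + ["USD"]), f"gross return {r:.4f} ({flag})")
```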
Abstract:
Report for the scientific sojourn carried out at the Department of Structure and Constituents of Matter during 2007. The main focus of the work was on phenomena related to nano-electromechanical processes that take place at the cellular level. Additionally, independent work has been performed on charge and energy transfer in biomolecules, energy transfer in coupled spin systems, and the electrodynamics of nonlinear metamaterials.
Abstract:
This paper investigates the conduct of monetary and fiscal policy in the post-ERM period in the UK. Using a simple DSGE New Keynesian model of non-cooperative monetary and fiscal policy interactions under fiscal intra-period leadership, we demonstrate that past policy in the UK is better explained by optimal policy under discretion than under commitment. We estimate the policy objectives of both policy makers and demonstrate that fiscal policy plays an important role in identifying the monetary policy regime.
Abstract:
Genuine Savings has emerged as a widely used indicator of sustainable development. In this paper, we use long-term data stretching back to 1870 to undertake empirical tests of the relationship between Genuine Savings (GS) and future well-being for three countries: Britain, the USA and Germany. Our tests are based on an underlying theoretical relationship between GS and changes in the present value of future consumption. Based on both single-country and panel results, we find evidence supporting the existence of a cointegrating (long-run equilibrium) relationship between GS and future well-being, and fail to reject the basic theoretical result on the relationship between these two macroeconomic variables. This provides some support for the GS measure of weak sustainability. We also show the effects of modelling shocks such as World War Two and the Great Depression.
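The kind of long-run test described above can be sketched as an Engle-Granger cointegration test; the series names and synthetic data below are placeholders assumed for illustration, not the paper's actual data or procedure.

```python
# Sketch of an Engle-Granger cointegration test between Genuine Savings and
# the present value of future consumption changes. Data are synthetic and
# the variable names are assumptions for illustration only.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(0)
years = pd.RangeIndex(1870, 2001)
trend = rng.normal(size=len(years)).cumsum()          # shared stochastic trend
gs = pd.Series(trend + rng.normal(size=len(years)), index=years)
pv_dcons = pd.Series(0.8 * trend + rng.normal(size=len(years)), index=years)

# Null hypothesis: no cointegration between the two series.
t_stat, p_value, _ = coint(gs, pv_dcons)
print(f"Engle-Granger t-statistic: {t_stat:.2f}, p-value: {p_value:.3f}")
```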
Abstract:
We consider, both theoretically and empirically, how different organization modes are aligned to govern the efficient solving of technological problems. The data set is a sample from the Chinese consumer electronics industry. Following mainly the problem-solving perspective (PSP) within the knowledge-based view (KBV), we develop and test several PSP and KBV hypotheses, in conjunction with competing transaction cost economics (TCE) alternatives, in an examination of the determinants of the R&D organization mode. The results show that a firm’s existing knowledge base is the single most important explanatory variable. Problem complexity and decomposability are also found to be important, consistent with the theoretical predictions of the PSP, but it is suggested that these two dimensions need to be treated as separate variables. TCE hypotheses also receive some support, but the estimation results seem more supportive of the PSP and the KBV than of the TCE.
Abstract:
Modern macroeconomic theory utilises optimal control techniques to model the maximisation of individual well-being using a lifetime utility function. Agents face choices over current and future consumption (with resultant implied savings decisions), seeking to maximise the present value of current plus future well-being. However, such inter-temporal welfare-maximising assumptions remain empirically untested. In the work presented here we test whether welfare was in fact maximised historically in the US between 1870 and 2000, and find empirical support for the optimising basis of growth theory, but only once a comprehensive view of what constitutes a country’s wealth or capital is taken into account.
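The inter-temporal problem the abstract alludes to is usually written as a Ramsey-style optimal control problem; the statement below is the standard textbook form, offered only as orientation and not as the paper's exact specification.

```latex
% Canonical lifetime-utility maximisation (textbook form, not the paper's model):
% choose a consumption path c_t to maximise discounted utility subject to
% capital accumulation, with discount rate \rho and depreciation \delta.
\max_{\{c_t\}} \int_0^{\infty} e^{-\rho t}\, U(c_t)\, dt
\quad \text{subject to} \quad
\dot{k}_t = f(k_t) - c_t - \delta k_t, \qquad k_0 \text{ given.}
```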
Abstract:
We estimate a New Keynesian DSGE model for the Euro area under alternative descriptions of monetary policy (discretion, commitment or a simple rule) after allowing for Markov switching in policy maker preferences and shock volatilities. This reveals that there have been several changes in Euro area policy making, with a strengthening of the anti-inflation stance in the early years of the ERM, which was then lost around the time of German reunification and only recovered following the turmoil in the ERM in 1992. The ECB does not appear to have been as conservative as aggregate Euro area policy was under Bundesbank leadership, and its response to the financial crisis has been muted. The estimates also suggest that the most appropriate description of policy is that of discretion, with no evidence of commitment in the Euro area. As a result, although both ‘good luck’ and ‘good policy’ played a role in the moderation of inflation and output volatility in the Euro area, the welfare gains would have been substantially higher had policy makers been able to commit. We consider a range of delegation schemes as devices to improve upon the discretionary outcome, and conclude that price level targeting would have achieved welfare levels close to those attained under commitment, even after accounting for the existence of the Zero Lower Bound on nominal interest rates.
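To clarify the delegation idea, one stylized way to contrast an inflation-targeting mandate with the price-level-targeting scheme discussed above is via the period loss functions below; this is a generic formulation assumed for exposition, not the objective function estimated in the paper.

```latex
% Stylized period loss functions (generic, for exposition only):
% inflation targeting penalises inflation \pi_t, price-level targeting
% penalises deviations of the price level p_t from a target path p_t^*;
% x_t is the output gap and \lambda a relative weight.
L_t^{\mathrm{IT}}  = \pi_t^2 + \lambda x_t^2
\qquad\text{versus}\qquad
L_t^{\mathrm{PLT}} = (p_t - p_t^{*})^2 + \lambda x_t^2
```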
Abstract:
This paper examines the antecedents and innovation consequences of the methods firms adopt in organizing their search strategies. From a theoretical perspective, organizational search is described using a typology that shows how firms implement exploration and exploitation search activities that span their organizational boundaries. This typology includes three models of implementation: ambidextrous, specialized, and diversified implementation. From an empirical perspective, the paper examines the performance consequences when applying these models, and compares their capacity to produce complementarities. Additionally, since firms' choices in matters of organizational search are viewed as endogenous variables, the paper examines the drivers affecting them and identifies the importance of firms' absorptive capacity and diversified technological opportunities in determining these choices. The empirical design of the paper draws on new data for manufacturing firms in Spain, surveyed between 2003 and 2006.