49 results for INFLATION
Abstract:
In order to examine metacognitive accuracy (i.e., the relationship between metacognitive judgment and memory performance), researchers often rely on by-participant analysis, in which metacognitive accuracy (e.g., resolution, as measured by the gamma coefficient or signal detection measures) is computed for each participant and the computed values are entered into group-level statistical tests such as the t-test. In the current work, we argue that by-participant analysis, regardless of the accuracy measure used, produces a substantial inflation of Type-1 error rates when a random item effect is present. We propose a mixed-effects model as a way to address the issue effectively, and our simulation studies examining Type-1 error rates indeed showed superior performance of the mixed-effects model analysis compared with the conventional by-participant analysis. We also present real-data applications to illustrate further strengths of the mixed-effects model analysis. Our findings imply that caution is needed when using by-participant analysis, and we recommend the mixed-effects model analysis instead.
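As a point of reference, here is a minimal sketch of the by-participant pipeline this abstract critiques, on fabricated data (the simulation design and all variable names are my own, not the authors'): a Goodman-Kruskal gamma per participant, then a one-sample t-test across participants.

```python
import numpy as np
from scipy import stats

def gk_gamma(judgments, accuracy):
    """Goodman-Kruskal gamma from paired ratings: (C - D) / (C + D)."""
    c = d = 0
    n = len(judgments)
    for i in range(n):
        for j in range(i + 1, n):
            s = (judgments[i] - judgments[j]) * (accuracy[i] - accuracy[j])
            if s > 0:
                c += 1
            elif s < 0:
                d += 1
    return (c - d) / (c + d) if (c + d) else np.nan

rng = np.random.default_rng(0)
# 30 simulated participants x 40 items: confidence judgments and recall accuracy
gammas = []
for _ in range(30):
    conf = rng.integers(1, 7, size=40)                    # 6-point confidence scale
    acc = (conf + rng.normal(0, 3, 40) > 4).astype(int)   # correlated accuracy
    gammas.append(gk_gamma(conf, acc))

# Group-level test on per-participant gammas -- the step the abstract argues
# inflates Type-1 error once items contribute random variance it ignores.
print(stats.ttest_1samp([g for g in gammas if not np.isnan(g)], 0.0))
```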
Abstract:
Survey respondents who make point predictions and histogram forecasts of macro-variables reveal both how uncertain they believe the future to be, ex ante, and how accurate they turn out to be, ex post. Macroeconomic forecasters tend to be overconfident at horizons of a year or more, but overestimate (i.e., are underconfident about) the uncertainty surrounding their predictions at short horizons. Ex ante uncertainty remains high relative to the ex post measure as the forecast horizon shortens. There is little evidence of a link between individuals' ex post forecast accuracy and their ex ante subjective assessments.
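The comparison the abstract describes can be made concrete with an over/underconfidence ratio by horizon. This toy sketch uses invented numbers, not the survey data: a ratio above 1 means subjective (ex ante) uncertainty exceeds realized (ex post) error, i.e., underconfidence; below 1 means overconfidence.

```python
import numpy as np

rng = np.random.default_rng(1)
for h in [3, 6, 12, 24]:                          # forecast horizons in months
    ex_ante_sd = 0.4 + 0.02 * h                   # stdev implied by histogram forecasts
    errors = rng.normal(0, 0.05 * h, size=200)    # realized point-forecast errors
    ex_post_rmse = np.sqrt(np.mean(errors ** 2))
    print(f"h={h:2d}m  ex-ante sd={ex_ante_sd:.2f}  "
          f"ex-post rmse={ex_post_rmse:.2f}  ratio={ex_ante_sd / ex_post_rmse:.2f}")
```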
Abstract:
In recent years an increasing number of papers have employed meta-analysis to integrate the effect sizes of a researcher's own series of studies within a single paper ("internal meta-analysis"). Although this approach has the obvious advantage of yielding narrower confidence intervals, we show that it can inadvertently inflate false-positive rates if researchers are motivated to use internal meta-analysis in order to obtain a significant overall effect. Specifically, if one decides whether to stop or to run a further replication experiment depending on the significance of the internal meta-analysis, false-positive rates increase beyond the nominal level. We conducted a set of Monte-Carlo simulations to demonstrate our argument, and provide a literature review to gauge awareness and prevalence of this issue. Furthermore, we offer several recommendations for using internal meta-analysis to judge statistical significance.
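A minimal Monte-Carlo sketch of the optional-stopping mechanism described above (my own toy setup, not the authors' simulation code): under a true null, run up to k studies, pool after each with a fixed-effect meta-analysis, and stop as soon as the pooled z-test is significant.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, k, reps, alpha = 50, 5, 5000, 0.05
false_positives = 0
for _ in range(reps):
    effects, ses = [], []
    for _ in range(k):
        x = rng.normal(0.0, 1.0, n)              # true effect is exactly zero
        effects.append(x.mean())
        ses.append(x.std(ddof=1) / np.sqrt(n))
        w = 1 / np.array(ses) ** 2               # inverse-variance weights
        z = (w @ np.array(effects)) / np.sqrt(w.sum())
        if abs(z) > stats.norm.ppf(1 - alpha / 2):
            false_positives += 1
            break                                 # stop once "significant"
print("false-positive rate:", false_positives / reps)   # exceeds the nominal 0.05
```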
Abstract:
What explains the cross-national variation in inflation rates in developed countries? Previous literature has emphasised the role of ideas and institutions, and to a lesser extent interest groups, while leaving the role of electoral politics comparatively unexplored. This paper seeks to redress this neglect by focusing on one case where electoral politics matters for inflation: the share of a country's population above 65 years old. I argue that countries with a larger share of elderly people have lower inflation because older people are both more inflation averse and politically powerful, forcing governments to pursue lower inflation. I test my argument in three steps. First, logistic regression analysis of survey data confirms that older people are more inflation averse. Second, panel data regression analysis of party manifesto data reveals that European countries with larger elderly populations have more economically orthodox political parties. Third, time-series cross-section regression analyses demonstrate that the share of the elderly is negatively correlated with inflation both in a sample of 21 advanced OECD economies and in a larger sample of 175 countries. Ageing may therefore push governments to adopt a low-inflation regime.
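A hedged sketch of the kind of time-series cross-section regression used in the third step, on fabricated data (the column names and effect sizes are mine, not the paper's dataset): country and year fixed effects via dummies, with standard errors clustered by country.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
rows = []
for c in [f"c{i}" for i in range(21)]:
    base = rng.uniform(8, 20)                        # country's elderly share, %
    for t in range(1980, 2010):
        share65 = base + 0.1 * (t - 1980) + rng.normal(0, 0.5)
        infl = 6.0 - 0.25 * share65 + rng.normal(0, 1.5)
        rows.append((c, t, share65, infl))
df = pd.DataFrame(rows, columns=["country", "year", "share65", "inflation"])

# A negative share65 coefficient would mirror the paper's finding.
fit = smf.ols("inflation ~ share65 + C(country) + C(year)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["country"]})
print(fit.params["share65"], fit.pvalues["share65"])
```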
Abstract:
While over-dispersion in capture–recapture studies is well known to lead to poor estimation of population size, current diagnostic tools for detecting heterogeneity have not been developed specifically for capture–recapture studies. To address this, a simple and efficient test for over-dispersion in zero-truncated count data is developed and evaluated. The proposed method generalizes an over-dispersion test previously suggested for un-truncated count data and may also be used to test for residual over-dispersion in zero-inflated data. Simulations suggest that the asymptotic distribution of the test statistic is standard normal and that this approximation is also reasonable for small sample sizes. The method is also shown to be more efficient than an existing test for over-dispersion adapted to the capture–recapture setting. Studies with zero-truncated and zero-inflated count data are used to illustrate the test procedures.
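For orientation, here is one classic form of the un-truncated over-dispersion test that the paper generalizes: a score-type statistic for Var(Y) = mu(1 + a*mu) against a = 0, asymptotically standard normal under the Poisson null. This is the baseline test, not the zero-truncated version developed in the paper.

```python
import numpy as np
from scipy import stats

def overdispersion_z(y):
    """Score statistic for over-dispersion in an iid Poisson sample."""
    y = np.asarray(y, dtype=float)
    mu = y.mean()
    t = np.sum((y - mu) ** 2 - y) / np.sqrt(2 * len(y) * mu ** 2)
    return t, 1 - stats.norm.cdf(t)                 # one-sided p-value

rng = np.random.default_rng(4)
print(overdispersion_z(rng.poisson(3.0, 500)))               # null: z near 0
print(overdispersion_z(rng.negative_binomial(2, 0.4, 500)))  # over-dispersed: large z
```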
Abstract:
Lava domes comprise core, carapace, and clastic talus components. They can grow endogenously, by inflation of a core, and/or exogenously, with the extrusion of shear-bounded lobes and whaleback lobes at the surface. Internal structure is paramount in determining the extent to which lava dome growth evolves stably or, conversely, the propensity for collapse. The more core lava that exists within a dome, in both relative and absolute terms, the more explosive energy is available, both for large pyroclastic flows following collapse and in particular for lateral blast events following very rapid removal of lateral support to the dome. Knowledge of the location of the core lava within the dome is also relevant for hazard assessment. A spreading toe, or lobe of core lava, over a talus substrate may be relatively unstable and likely to accelerate to more violent activity during the early phases of a retrogressive collapse. Soufrière Hills Volcano (SHV), Montserrat has been erupting since 1995 and has produced numerous lava domes that have undergone repeated collapse events. We consider one continuous dome-growth period, from August 2005 to May 2006, which culminated in a dome collapse on 20th May 2006. The collapse lasted 3 h, removing the whole dome plus dome remnants from a previous growth period in an unusually violent and rapid event. We use an axisymmetric computational finite element method model for the growth and evolution of a lava dome. Our model comprises evolving core, carapace, and talus components based on axisymmetric endogenous dome growth, which permits us to model the interface between talus and core. Despite explicitly modelling only axisymmetric endogenous growth, our core–talus model simulates many of the observed growth characteristics of the 2005–2006 SHV lava dome well. Further, our simulations can replicate large-scale exogenous characteristics once a considerable volume of talus has accumulated around the lower flanks of the dome. Model results suggest that dome core can override talus within a growing dome, potentially generating a region of significant weakness and a potential locus for collapse initiation.
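The paper's FEM model is far beyond a short sketch, but the core/talus bookkeeping it rests on can be illustrated with a deliberately simple volume balance (everything here is invented: the extrusion rate, the fixed talus-shedding fraction, and the duration). The tracked quantity, the relative core fraction, is what the abstract ties to explosive potential on collapse.

```python
dt, days = 1.0, 270                  # roughly Aug 2005 - May 2006
q = 5e4                              # assumed extrusion rate, m^3/day
f_talus = 0.3                        # assumed fraction of new lava shed as talus
core = talus = 0.0
for _ in range(int(days / dt)):
    core += (1 - f_talus) * q * dt   # endogenous core inflation
    talus += f_talus * q * dt        # clastic apron accumulation
print(f"core fraction = {core / (core + talus):.2f}, "
      f"total volume = {core + talus:.2e} m^3")
```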
Abstract:
Analyses of high-density single-nucleotide polymorphism (SNP) data, such as genetic mapping and linkage disequilibrium (LD) studies, require phase-known haplotypes to allow for the correlation between tightly linked loci. However, current SNP genotyping technology cannot determine phase, which must be inferred statistically. In this paper, we present a new Bayesian Markov chain Monte Carlo (MCMC) algorithm for population haplotype frequency estimation, particularly in the context of LD assessment. The novel feature of the method is the incorporation of a log-linear prior model for population haplotype frequencies. We present simulations which suggest that (1) the log-linear prior model is more appropriate than the standard coalescent process in the presence of recombination (>0.02 cM between adjacent loci), and (2) there is substantial inflation in measures of LD obtained by a "two-stage" approach that treats the "best" haplotype configuration as correct, without regard to uncertainty in the recombination process.
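The paper's method is a Bayesian MCMC sampler with a log-linear prior; as a compact point of reference, this is the classic EM estimator of haplotype frequencies for two biallelic loci (in the spirit of Excoffier-Slatkin, not the paper's algorithm). With two SNPs coded as 0/1/2 copies of the '1' allele, the double heterozygote is the only phase-ambiguous genotype.

```python
import numpy as np

def em_haplotypes(g1, g2, iters=100):
    """Return frequencies of haplotypes 00, 01, 10, 11."""
    f = np.full(4, 0.25)                       # initial haplotype frequencies
    n = len(g1)
    for _ in range(iters):
        counts = np.zeros(4)
        for a, b in zip(g1, g2):
            if a == 1 and b == 1:              # ambiguous: {00,11} or {01,10}
                p_cis, p_trans = f[0] * f[3], f[1] * f[2]
                w = p_cis / (p_cis + p_trans)
                counts[[0, 3]] += w
                counts[[1, 2]] += 1 - w
            else:                              # phase fully determined
                h1 = (1 if a >= 1 else 0) * 2 + (1 if b >= 1 else 0)
                h2 = (1 if a == 2 else 0) * 2 + (1 if b == 2 else 0)
                counts[h1] += 1
                counts[h2] += 1
        f = counts / (2 * n)
    return f

rng = np.random.default_rng(5)
true_f = np.array([0.4, 0.1, 0.1, 0.4])        # strong LD: 00 and 11 common
h = rng.choice(4, size=(400, 2), p=true_f)     # two haplotypes per individual
g1, g2 = (h // 2).sum(axis=1), (h % 2).sum(axis=1)
print(em_haplotypes(g1, g2).round(3))          # should recover true_f approximately
```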
Abstract:
The rheological properties of dough and gluten are important for the end-use quality of flour, but there is a lack of knowledge of the relationships between fundamental and empirical tests and how they relate to flour composition and gluten quality. Dough and gluten from six breadmaking wheat qualities were subjected to a range of rheological tests. Fundamental (small-deformation) rheological characterizations (dynamic oscillatory shear and creep recovery) were performed on gluten to avoid the nonlinear influence of the starch component, whereas large-deformation tests were conducted on both dough and gluten. A number of variables derived from the various curves were subjected to a principal component analysis (PCA) to obtain an overview of their interrelationships. The first component represented variability in protein quality, associated with elasticity and tenacity in large deformation (large positive loadings for resistance to extension and initial slope of dough and gluten extension curves recorded by the SMS/Kieffer dough and gluten extensibility rig, and for the tenacity and strain hardening index of dough measured by the Dobraszczyk/Roberts dough inflation system), with the elastic character of the hydrated gluten proteins (large positive loading for elastic modulus [G'], large negative loadings for tan delta and steady-state compliance [J_e^0]), with the presence of high molecular weight glutenin subunits (HMW-GS) 5+10 vs. 2+12, and with a size distribution of glutenin polymers shifted toward the high-end range. The second principal component was associated with flour protein content. Certain rheological data were influenced by protein content in addition to protein quality (area under dough extension curves and dough inflation curves [W]). The approach made it possible to bridge the gap between fundamental rheological properties, empirical measurements of physical properties, protein composition, and size distribution. The interpretation of this study gives indications of the molecular basis for differences in breadmaking performance.
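A sketch of the PCA step on a stand-in matrix of rheological measurements (the column names paraphrase variables from the abstract and the data are fabricated around a latent "protein quality" factor, so the loading pattern mimics the one described).

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
cols = ["Rmax_ext", "init_slope", "tenacity", "strain_hard_idx",
        "G_prime", "tan_delta", "Je0", "protein_content"]
quality = rng.normal(size=24)                       # latent protein-quality factor
X = pd.DataFrame(
    np.outer(quality, [1, .9, .8, .8, .7, -.6, -.6, 0]) +
    rng.normal(0, .5, (24, len(cols))), columns=cols)

pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(X))
loadings = pd.DataFrame(pca.components_.T, index=cols, columns=["PC1", "PC2"])
print(loadings.round(2))          # PC1: elasticity/tenacity block, as in the text
print(pca.explained_variance_ratio_)
```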
Abstract:
The relationships between wheat protein quality and the baking properties of 20 flour samples were studied for two breadmaking processes: a hearth bread test and the Chorleywood Bread Process (CBP). The strain hardening index obtained from dough inflation measurements, the proportion of unextractable polymeric protein, and mixing properties were among the variables found to be good indicators of protein quality and suitable for predicting the potential baking quality of wheat flours. By partial least squares regression, flour and dough test variables were able to account for 71-93% of the variation in crumb texture, form ratio, and volume of hearth loaves made using optimal mixing and fixed proving times. These protein quality variables were, however, not related to the volume of loaves produced by the CBP using mixing to constant work input and proving to constant height. On the other hand, variation in the crumb texture of CBP loaves (54-55%) could be explained by protein quality. The results underline that the choice of baking procedure and loaf characteristics is vital in assessing the protein quality of flours.
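A sketch of the partial least squares step on fabricated data standing in for the 20 flour samples (the predictors and the response are illustrative; the real variables included strain hardening index, unextractable polymeric protein, and mixing properties).

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
X = rng.normal(size=(20, 6))                  # stand-in flour/dough test variables
y = X @ np.array([1.2, .8, .5, 0, 0, .3]) + rng.normal(0, .5, 20)  # loaf volume

pls = PLSRegression(n_components=2)
r2 = cross_val_score(pls, X, y, cv=5, scoring="r2")
print("cross-validated R^2:", r2.mean().round(2))   # cf. the 71-93% range above
```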
Abstract:
Three large-deformation rheological tests, the Kieffer dough extensibility system, the D/R dough inflation system, and the 2 g mixograph test, were carried out on doughs made from a large number of winter wheat lines and cultivars grown in Poland. These lines and cultivars represented a broad spread in baking performance, in order to assess the tests' suitability as predictors of baking volume. The parameters most closely associated with baking volume were strain hardening index, bubble failure strain, and mixograph bandwidth at 10 min. Simple correlations with baking volume indicate that bubble failure strain and strain hardening index give the highest correlations, whilst best subsets regression, which selects the best combination of parameters, gave increased correlations, with R^2 = 0.865 for dough inflation parameters, R^2 = 0.842 for Kieffer parameters, and R^2 = 0.760 for mixograph parameters.
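A minimal best-subsets search of the kind mentioned above, on illustrative data (the predictor names echo the abstract; the data and the adjusted-R^2 selection criterion are my assumptions).

```python
import itertools
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
names = ["strain_hard_idx", "bubble_fail_strain", "mixo_bandwidth_10min",
         "Rmax_ext", "extensibility"]
X = rng.normal(size=(60, len(names)))
y = X[:, 0] + 0.8 * X[:, 1] + 0.4 * X[:, 2] + rng.normal(0, 0.7, 60)  # baking volume

# Exhaustively fit OLS on every non-empty predictor subset; keep the best.
best = max(
    (subset for k in range(1, len(names) + 1)
     for subset in itertools.combinations(range(len(names)), k)),
    key=lambda s: sm.OLS(y, sm.add_constant(X[:, list(s)])).fit().rsquared_adj)
print("best subset:", [names[i] for i in best])
```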
Abstract:
This paper demonstrates that recent influential contributions to monetary policy imply an emerging consensus whereby neither rigid rules nor complete discretion is found optimal. Instead, middle-ground monetary regimes based on rules (operative under 'normal' circumstances) to anchor inflation expectations over the long run, but designed with enough flexibility to mitigate the short-run effects of shocks (with communicated discretion temporarily overriding these rules in 'exceptional' circumstances), are gaining support in theoretical models and in policy formulation and implementation. The opposition of 'rules versus discretion' has thus reappeared as the synthesis of 'rules cum discretion', in essence as inflation-forecast targeting. But such a synthesis is not without major theoretical problems, as we argue in this contribution. Furthermore, very recent real-world events have made it obvious that the inflation targeting strategy of monetary policy, which rests upon the new consensus paradigm in modern macroeconomics, is at best a 'fair weather' model. In today's turbulent economic climate of highly unstable inflation, deep financial crisis, and worldwide, abrupt economic slowdown, this approach needs serious rethinking, to say the least, if not abandoning altogether.
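The 'rules' pole of this debate is conventionally illustrated by the Taylor (1993) rule; the sketch below simply evaluates that textbook formula (the coefficients are Taylor's original ones, not anything proposed in the paper).

```python
def taylor_rule(inflation, output_gap, r_star=2.0, pi_star=2.0):
    """Nominal policy rate (%) implied by Taylor's 1993 coefficients."""
    return r_star + inflation + 0.5 * (inflation - pi_star) + 0.5 * output_gap

# Inflation at 4% with a -1% output gap implies a 6.5% policy rate under the rule.
print(taylor_rule(inflation=4.0, output_gap=-1.0))
```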
Abstract:
This paper develops and tests formulas for representing playing strength at chess by the quality of moves played, rather than by the results of games. Intrinsic quality is estimated via evaluations given by computer chess programs run to high depth, ideally so that their playing strength is sufficiently far ahead of the best human players as to be a 'relatively omniscient' guide. Several formulas, each having intrinsic skill parameters s for 'sensitivity' and c for 'consistency', are argued theoretically and tested by regression on large sets of tournament games played by humans of varying strength as measured by the internationally standard Elo rating system. This establishes a correspondence between Elo rating and the parameters. A smooth correspondence is shown between statistical results and the century points on the Elo scale, and ratings are shown to have stayed quite constant over time. That is, there has been little or no 'rating inflation'. The theory and empirical results are transferable to other rational-choice settings in which the alternatives have well-defined utilities, but in which complexity and bounded information constrain the perception of the utility values.
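A simplified stand-in for the move-quality models described above (not necessarily the paper's exact formula): the probability of choosing a move falls off with its evaluation loss delta via exp(-(delta/s)**c), so s and c play the 'sensitivity' and 'consistency' roles the abstract assigns them, with larger s giving a flatter, weaker-player distribution.

```python
import numpy as np

def move_probs(deltas, s, c):
    """deltas: centipawn losses vs the engine's best move (best move has 0)."""
    w = np.exp(-(np.asarray(deltas, float) / s) ** c)
    return w / w.sum()

deltas = [0, 20, 50, 120]                 # candidate moves' evaluation losses
print(move_probs(deltas, s=40, c=2.0))    # stronger player: peaked on best move
print(move_probs(deltas, s=200, c=1.2))   # weaker player: flatter distribution
```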
Abstract:
Recent research documents the importance of uncertainty in determining macroeconomic outcomes, but little is known about the transmission of uncertainty across such outcomes. This paper examines the response of uncertainty about inflation and output growth to shocks, documenting statistically significant size and sign biases as well as spillover effects. Uncertainty about inflation is a determinant of output uncertainty, whereas higher growth volatility tends to raise inflation volatility. Both inflation and growth volatility respond asymmetrically to positive and negative shocks: negative growth and inflation shocks lead to higher and more persistent uncertainty than shocks of equal magnitude but opposite sign.
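Sign asymmetry of this kind is commonly captured by an asymmetric GARCH term; this is a univariate sketch with the `arch` package on simulated data, not the authors' bivariate inflation-growth specification (the `o=1` argument adds the asymmetry term to a standard GARCH(1,1)).

```python
import numpy as np
from arch import arch_model

rng = np.random.default_rng(9)
y = rng.standard_t(8, 1000) * 0.5          # stand-in for inflation surprises
am = arch_model(y, vol="GARCH", p=1, o=1, q=1)   # GJR-GARCH(1,1,1)
res = am.fit(disp="off")
print(res.params)    # a nonzero asymmetry coefficient indicates sign bias
```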
Abstract:
Following the attack on the World Trade Center on 9/11, the volatility of daily returns on the US stock market rose sharply. This increase in volatility may reflect fundamental changes in the economic determinants of prices, such as expected earnings, interest rates, real growth, and inflation. Alternatively, it may simply reflect increased uncertainty in the financial markets. This study therefore sets out to determine whether the attack on the World Trade Center on 9/11 had a fundamental or a purely financial impact on US real estate returns. To do this, we compare pre- and post-9/11 returns for a number of US REIT indexes using an approach suggested by French and Roll (1986), as extended by Tuluca et al. (2003). In general, we find no evidence that 9/11 had a fundamental effect on REIT returns. In other words, the attack had only a financial effect on REIT returns, and that effect was transitory.
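A sketch of a pre/post volatility comparison in the spirit of the variance-comparison approach cited above, on simulated return series (the study itself used daily US REIT index returns around 9/11, and the French-Roll design involves further decompositions not shown here).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
pre = rng.normal(0, 0.008, 250)      # daily returns, year before the event
post = rng.normal(0, 0.013, 250)     # higher volatility afterwards
print("variance ratio:", round(post.var(ddof=1) / pre.var(ddof=1), 2))
print(stats.levene(pre, post))       # robust test of equal variances
```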
Abstract:
This paper presents evidence for several features of the population of chess players and the distribution of their performances, measured in terms of Elo ratings and by computer analysis of moves. Evidence that ratings have remained stable since the inception of the Elo system in the 1970s is given in several forms: by showing that the population of strong players fits a simple logistic-curve model without inflation, by plotting players' average error against the FIDE category of tournaments over time, and by showing that skill parameters from a model employing computer analysis keep a nearly constant relation to Elo rating across that time. The distribution of the model's Intrinsic Performance Ratings can hence be used to compare populations that have limited interaction, such as players in a national chess federation versus FIDE, and to ascertain relative drift in their respective rating systems.
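A sketch of fitting the logistic growth model mentioned above to a count of strong players over time (the counts and parameter values are fabricated; the paper fits actual FIDE population data).

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic curve: carrying capacity K, growth rate r, midpoint year t0."""
    return K / (1 + np.exp(-r * (t - t0)))

years = np.arange(1971, 2011)
true = logistic(years, K=1400, r=0.18, t0=1995)
counts = true + np.random.default_rng(11).normal(0, 30, years.size)

params, _ = curve_fit(logistic, years, counts, p0=(1000, 0.1, 1990))
print(dict(zip(["K", "r", "t0"], params.round(2))))
```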