15 results for Conditional autoregressive random effects model
in Aston University Research Archive
Abstract:
There is an alternative model of the one-way ANOVA, the 'random effects' model or 'nested' design, in which the objective is not to test specific effects but to estimate the degree of variation of a particular measurement and to compare different sources of variation that influence the measurement in space and/or time. The most important statistics from a random effects model are the components of variance, which estimate the variance associated with each of the sources of variation influencing a measurement. The nested design is particularly useful in preliminary experiments designed to estimate different sources of variation and in planning appropriate sampling strategies.
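As an illustration only (the data below are invented and not from the statnote), a minimal sketch of how the components of variance for a one-way random effects design can be estimated with the classical ANOVA (method-of-moments) estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented example: k groups (e.g. sampling sites), n replicate measurements per group.
k, n = 10, 4
group_effects = rng.normal(0.0, 2.0, size=k)                        # between-group variation
data = group_effects[:, None] + rng.normal(0.0, 1.0, size=(k, n))   # within-group noise

group_means = data.mean(axis=1)
grand_mean = data.mean()

# One-way ANOVA mean squares.
ms_between = n * np.sum((group_means - grand_mean) ** 2) / (k - 1)
ms_within = np.sum((data - group_means[:, None]) ** 2) / (k * (n - 1))

# Method-of-moments components of variance for the random effects model.
var_within = ms_within
var_between = max((ms_between - ms_within) / n, 0.0)   # truncated at zero if negative

print(f"within-group variance component:  {var_within:.2f}")
print(f"between-group variance component: {var_between:.2f}")
```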
Abstract:
In Statnote 9, we described a one-way analysis of variance (ANOVA) 'random effects' model in which the objective was to estimate the degree of variation of a particular measurement and to compare different sources of variation in space and time. The illustrative scenario involved the role of computer keyboards in a University communal computer laboratory as a possible source of microbial contamination of the hands. The study estimated the aerobic colony count of ten selected keyboards with samples taken from two keys per keyboard determined at 9am and 5pm. This type of design is often referred to as a 'nested' or 'hierarchical' design, and the ANOVA estimated the degree of variation: (1) between keyboards, (2) between keys within a keyboard, and (3) between sample times within a key. An alternative to this design is a 'fixed effects' model in which the objective is not to measure sources of variation per se but to estimate differences between specific groups or treatments, which are regarded as 'fixed' or discrete effects. This statnote describes two scenarios utilizing this type of analysis: (1) measuring the degree of bacterial contamination on 2p coins collected from three types of business property, viz., a butcher's shop, a sandwich shop, and a newsagent, and (2) assessing the effectiveness of drugs in the treatment of a fungal eye infection.
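For contrast with the random effects design, a minimal sketch of the fixed effects comparison described in scenario (1), using invented colony counts for the three property types; scipy is assumed to be available and the numbers have no relation to the study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Invented aerobic colony counts for 2p coins from three types of business property.
butcher   = rng.normal(50, 10, size=8)
sandwich  = rng.normal(42, 10, size=8)
newsagent = rng.normal(35, 10, size=8)

# Fixed effects one-way ANOVA: do the group means differ?
f_stat, p_value = stats.f_oneway(butcher, sandwich, newsagent)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```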
Abstract:
Molecular transport in phase space is crucial for chemical reactions because it defines how pre-reactive molecular configurations are found during the time evolution of the system. Using Molecular Dynamics (MD) simulated atomistic trajectories, we test the assumption of normal diffusion in phase space for bulk water at ambient conditions by checking the equivalence of the transport to the random walk model. Contrary to common expectations, we have found that some statistical features of the transport in phase space differ from those of normal diffusion models. This implies a non-random character of the path search process by the reacting complexes in water solutions. Our further numerical experiments show that a long period of non-stationarity in the transition probabilities of the segments of molecular trajectories can account for the observed non-uniform filling of the phase space. Surprisingly, the characteristic periods of this non-stationarity amount to hundreds of nanoseconds, a much longer time scale than the typical lifetime of known liquid water molecular structures (several picoseconds).
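One standard diagnostic behind a comparison with the random walk model is the scaling of the mean squared displacement (MSD), which grows linearly in time under normal diffusion. A toy sketch on a simulated unbiased random walk (not the MD trajectories analysed in the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy 3-D random walk: many independent walkers, many steps.
n_walkers, n_steps = 2000, 500
steps = rng.normal(0.0, 1.0, size=(n_walkers, n_steps, 3))
positions = np.cumsum(steps, axis=1)

# Mean squared displacement as a function of elapsed time (step count).
msd = np.mean(np.sum(positions ** 2, axis=2), axis=0)

# Under normal diffusion MSD ~ t, so the log-log slope should be close to 1.
lags = np.arange(1, n_steps + 1)
slope = np.polyfit(np.log(lags), np.log(msd), 1)[0]
print(f"MSD scaling exponent ~ {slope:.2f} (1.0 expected for normal diffusion)")
```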
Abstract:
BACKGROUND: The behavioral and psychological symptoms related to dementia (BPSD) are difficult to manage and are associated with adverse patient outcomes. OBJECTIVE: To systematically analyze the data on memantine in the treatment of BPSD. METHODS: We searched MEDLINE, EMBASE, Pharm-line, the Cochrane Centre Collaboration, www.clinicaltrials.gov, www.controlled-trials.com, and PsycINFO (1966-July 2007). We contacted manufacturers and scrutinized the reference sections of articles identified in our search for further references, including conference proceedings. Two researchers (IM and CF) independently reviewed all studies identified by the search strategy. We included 6 randomized, parallel-group, double-blind studies that rated BPSD with the Neuropsychiatric Inventory (NPI) in our meta-analysis. Patients had probable Alzheimer's disease and received treatment with memantine for at least one month. Overall efficacy of memantine on the NPI was established with a t-test for the average difference between means across studies, using a random effects model. RESULTS: Five of the 6 studies identified had NPI outcome data. In these 5 studies, 868 patients were treated with memantine and 882 patients were treated with placebo. Patients on memantine improved by 1.99 on the NPI scale (95% CI -0.08 to -3.91; p = 0.041) compared with the placebo group. CONCLUSIONS: Initial data appear to indicate that memantine decreases NPI scores and may have a role in managing BPSD. However, there are a number of limitations with the current data; the effect size was relatively small, and whether memantine produces significant clinical benefit is not clear.
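A minimal sketch of random effects pooling of per-study mean differences with the DerSimonian-Laird estimate of between-study variance; the effect sizes and standard errors below are invented and are not the trial data summarised above:

```python
import numpy as np

# Invented per-study mean differences (e.g. on the NPI) and their standard errors.
effects = np.array([-2.5, -1.2, -3.1, -0.8, -2.0])
se = np.array([1.1, 0.9, 1.4, 1.0, 1.2])

w_fixed = 1.0 / se ** 2                                    # inverse-variance weights
fixed_mean = np.sum(w_fixed * effects) / np.sum(w_fixed)

# DerSimonian-Laird estimate of the between-study variance tau^2.
q = np.sum(w_fixed * (effects - fixed_mean) ** 2)
df = len(effects) - 1
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max((q - df) / c, 0.0)

# Random effects pooled estimate and 95% confidence interval.
w_re = 1.0 / (se ** 2 + tau2)
pooled = np.sum(w_re * effects) / np.sum(w_re)
pooled_se = np.sqrt(1.0 / np.sum(w_re))
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled difference = {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f}), tau^2 = {tau2:.2f}")
```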
Abstract:
Assessing factors that predict new product success (NPS) holds critical importance for companies, as research shows that despite considerable new product investment, success rates are generally below 25%. Over the decades, meta-analytical attempts have been made to summarize empirical findings on NPS factors. However, market environment changes such as increased global competition, as well as methodological advancements in meta-analytical research, present a timely opportunity to augment their results. Hence, a key objective of this research is to provide an updated and extended meta-analytic investigation of the factors affecting NPS. Using Henard and Szymanski's meta-analysis as the most comprehensive recent summary of empirical findings, this study updates their findings by analyzing articles published from 1999 through 2011, the period following the original meta-analysis. Based on 233 empirical studies (from 204 manuscripts) on NPS, with a total of 2,618 effect sizes, this study also takes advantage of more recent methodological developments by re-calculating the effects of the meta-analysis employing a random effects model. The study's scope broadens by including overlooked but important additional variables, notably “country culture,” and discusses substantive differences between the updated meta-analysis and its predecessor. Results reveal generally weaker effect sizes than those reported by Henard and Szymanski in 2001, and provide evolutionary evidence of decreased effects of common success factors over time. Moreover, culture emerges as an important moderating factor, weakening effect sizes for individualistic countries and strengthening effects for risk-averse countries, highlighting the importance of further investigating culture's role in product innovation studies, and of tracking changes of success factors of product innovations. Finally, a sharp increase since 1999 in studies investigating product and process characteristics identifies a significant shift in research interest in new product development success factors. The finding that the importance of success factors generally declines over time calls for new theoretical approaches to better capture the nature of new product development (NPD) success factors. One might speculate that the potential to create competitive advantages through an understanding of NPD success factors is reduced as knowledge of these factors becomes more widespread among managers. Results also imply that managers attempting to improve success rates of NPDs need to consider national culture as this factor exhibits a strong moderating effect: Working in varied cultural contexts will result in differing antecedents of successful new product ventures.
Abstract:
This paper provides the most fully comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. We use non-linear, artificial intelligence techniques, namely, recurrent neural networks, evolution strategies and kernel methods in our forecasting experiment. In the experiment, these three methodologies compete to find the best fitting US inflation forecasting models and are then compared to forecasts from a naive random walk model. The best models were non-linear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. There is evidence in the literature that evolutionary methods can be used to evolve kernels; hence, our future work should combine the evolutionary and kernel methods to get the benefits of both.
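A toy sketch of the kind of horse race described: a kernel (ridge) autoregressive model against a naive random walk forecast on a simulated series; scikit-learn is assumed, and the paper's recurrent neural network and evolution strategy specifications are not reproduced:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(3)

# Simulated inflation-like series (purely illustrative).
n = 300
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.7 * y[t - 1] + 0.1 * np.sin(t / 10) + rng.normal(0.0, 0.2)

# Lagged design matrix for a nonlinear autoregressive model with p lags.
p = 4
X = np.column_stack([y[p - i - 1:n - i - 1] for i in range(p)])
target = y[p:]

split = 200
model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.5)
model.fit(X[:split], target[:split])

kernel_pred = model.predict(X[split:])
rw_pred = y[p + split - 1:n - 1]          # naive random walk: y_hat(t+1) = y(t)

mse_kernel = np.mean((kernel_pred - target[split:]) ** 2)
mse_rw = np.mean((rw_pred - target[split:]) ** 2)
print(f"kernel AR MSE: {mse_kernel:.4f}   random walk MSE: {mse_rw:.4f}")
```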
Abstract:
One of the central explanations of the recent Asian Crisis has been the problem of moral hazard as the source of over-investment and excessive external borrowing. There is, however, rather limited firm-level empirical evidence to characterise inefficient use of internal and external finances. Using a large firm-level panel dataset from four badly affected Asian countries, this paper compares the rates of return to various internal and external funds among firms with low and high debt financing (relative to equity), and among financially constrained and other firms. Selectivity-corrected estimates obtained from a random effects panel data model do suggest evidence of significantly lower rates of return to long-term debt, even among firms relying more on debt relative to equity in our sample. There is also evidence that average effective interest rates often significantly exceeded the average returns to long-term debt in the sample countries in the pre-crisis period. © 2006 Elsevier Inc. All rights reserved.
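A minimal sketch of a random intercept (random effects) panel regression on an invented firm-level panel, using statsmodels; the selectivity correction used in the paper is not reproduced:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)

# Invented firm-level panel: 50 firms observed over 6 years.
n_firms, n_years = 50, 6
firm = np.repeat(np.arange(n_firms), n_years)
firm_effect = np.repeat(rng.normal(0.0, 0.5, n_firms), n_years)
leverage = rng.normal(1.0, 0.3, n_firms * n_years)       # debt relative to equity
returns = 0.08 - 0.03 * leverage + firm_effect + rng.normal(0.0, 0.2, n_firms * n_years)

df = pd.DataFrame({"firm": firm, "returns": returns, "leverage": leverage})

# A random intercept for each firm approximates a random effects panel model.
model = smf.mixedlm("returns ~ leverage", df, groups=df["firm"])
result = model.fit()
print(result.summary())
```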
Abstract:
This paper provides the most fully comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two non-linear techniques, namely, recurrent neural networks and kernel recursive least squares regression - techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite memory predictor. The two methodologies compete to find the best fitting US inflation forecasting models and are then compared to forecasts from a naive random walk model. The best models were non-linear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation.
Abstract:
Background - Agitation in Alzheimer's disease (AD) is common and associated with poor patient quality of life and carer distress. The best evidence-based pharmacological treatments are antipsychotics, which have limited benefits with increased morbidity and mortality. There are no memantine trials in clinically significant agitation, but post-hoc analyses in other populations found reduced agitation. We tested the primary hypothesis, that memantine is superior to placebo for clinically significant agitation, in patients with moderate-to-severe AD. Methods and Findings - We recruited 153 participants with AD and clinically significant agitation from care homes or hospitals for a double-blind randomised controlled trial, and 149 people started the trial of memantine versus placebo. The primary outcome was the 6-week Cohen-Mansfield Agitation Inventory (CMAI) score, analysed with a mixed-model autoregressive analysis. Secondary outcomes were: 12-week CMAI; and 6- and 12-week Neuropsychiatric Inventory (NPI), Clinical Global Impression of Change (CGI-C), Standardised Mini-Mental State Examination, and Severe Impairment Battery. Using a mixed effects model, we found no significant differences in the primary outcome, 6-week CMAI, between memantine and placebo (memantine lower by 3.0; -8.3 to 2.2, p = 0.26), nor in 12-week CMAI, CGI-C, or adverse events at 6 or 12 weeks. The NPI mean difference favoured memantine at weeks 6 (-6.9; -12.2 to -1.6; p = 0.012) and 12 (-9.6; -15.0 to -4.3; p = 0.0005). Memantine was significantly better than placebo for cognition. The main study limitation is that it remains to be determined whether memantine has a role in milder agitation in AD. Conclusions - Memantine did not improve significant agitation in people with moderate-to-severe AD. Future studies are urgently needed to test other pharmacological candidates in this group, and memantine for neuropsychiatric symptoms.
Abstract:
This paper compares the experience of forecasting the UK government bond yield curve before and after the dramatic lowering of short-term interest rates from October 2008. Out-of-sample forecasts for 1, 6 and 12 months are generated from each of a dynamic Nelson-Siegel model, autoregressive models for both yields and the principal components extracted from those yields, a slope regression and a random walk model. At short forecasting horizons, there is little difference in the performance of the models both prior to and after 2008. However, for medium- to longer-term horizons, the slope regression provided the best forecasts prior to 2008, while the recent experience of near-zero short interest rates coincides with a period of forecasting superiority for the autoregressive and dynamic Nelson-Siegel models. © 2014 John Wiley & Sons, Ltd.
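A minimal sketch of the Nelson-Siegel step: for a fixed decay parameter, the level, slope and curvature factors can be extracted from a yield curve cross-section by ordinary least squares (in a dynamic Nelson-Siegel setup these factors would then be forecast, e.g. with autoregressions). The yields and the decay value below are invented for illustration:

```python
import numpy as np

# Nelson-Siegel loadings for a fixed decay parameter lam (maturities in months).
def ns_loadings(maturities, lam=0.0609):
    m = np.asarray(maturities, dtype=float)
    x = lam * m
    slope = (1 - np.exp(-x)) / x
    curvature = slope - np.exp(-x)
    return np.column_stack([np.ones_like(m), slope, curvature])

# Invented cross-section of government bond yields (percent) at these maturities.
maturities = np.array([3, 6, 12, 24, 36, 60, 84, 120])
yields = np.array([0.5, 0.6, 0.8, 1.1, 1.4, 1.9, 2.2, 2.5])

X = ns_loadings(maturities)
factors, *_ = np.linalg.lstsq(X, yields, rcond=None)   # level, slope, curvature
fitted = X @ factors

print("level, slope, curvature:", np.round(factors, 3))
print("max fitting error (bp):", np.round(1e2 * np.max(np.abs(fitted - yields)), 1))
```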
Abstract:
This paper provides the most fully comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two nonlinear techniques, namely, recurrent neural networks and kernel recursive least squares regression - techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite memory predictor. The two methodologies compete to find the best fitting US inflation forecasting models and are then compared to forecasts from a naive random walk model. The best models were nonlinear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. Beyond its economic findings, our study is in the tradition of physicists' long-standing interest in the interconnections among statistical mechanics, neural networks, and related nonparametric statistical methods, and suggests potential avenues of extension for such studies. © 2010 Elsevier B.V. All rights reserved.
Abstract:
The dynamical evolution of dislocations in plastically deformed metals is controlled both by deterministic factors arising out of applied loads and by stochastic effects appearing due to fluctuations of internal stress. Such stochastic dislocation processes and the associated spatially inhomogeneous modes lead to randomness in the observed deformation structure. Previous studies have analyzed the role of randomness in such textural evolution, but none of these models has considered the impact of a finite decay time of the stochastic perturbations on the overall dynamics of the system (all previous models assumed instantaneous relaxation, which is unphysical). The present article bridges this knowledge gap by introducing a colored noise, in the form of an Ornstein-Uhlenbeck noise, into the analysis of the class of linear and nonlinear Wiener and Ornstein-Uhlenbeck processes that these structural dislocation dynamics can be mapped onto. Based on an analysis of the relevant Fokker-Planck model, our results show that linear Wiener processes remain unaffected by the second time scale in the problem, but all nonlinear processes, both of Wiener type and of Ornstein-Uhlenbeck type, scale as a function of the noise decay time τ. The results are expected to have ramifications for existing experimental observations and to inspire new numerical and laboratory tests that give further insight into the competition between deterministic and random effects in modeling plastically deformed samples.
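A toy Euler-Maruyama sketch of Ornstein-Uhlenbeck (colored) noise with decay time tau, the ingredient the abstract introduces into the dislocation dynamics; the parameters are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(5)

# Ornstein-Uhlenbeck (colored) noise: d(eta) = -(eta / tau) dt + sigma dW.
tau, sigma = 2.0, 1.0
dt, n_steps = 0.01, 20000

eta = np.zeros(n_steps)
for i in range(1, n_steps):
    eta[i] = eta[i - 1] - (eta[i - 1] / tau) * dt + sigma * np.sqrt(dt) * rng.normal()

# The stationary variance should approach sigma^2 * tau / 2.
print(f"sample variance (after burn-in): {eta[5000:].var():.3f}")
print(f"theoretical value sigma^2*tau/2: {sigma**2 * tau / 2:.3f}")
```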
Abstract:
The Dirichlet process mixture model (DPMM) is a ubiquitous, flexible Bayesian nonparametric statistical model. However, full probabilistic inference in this model is analytically intractable, so that computationally intensive techniques such as Gibbs sampling are required. As a result, DPMM-based methods, which have considerable potential, are restricted to applications in which computational resources and time for inference are plentiful. For example, they would not be practical for digital signal processing on embedded hardware, where computational resources are at a serious premium. Here, we develop a simplified yet statistically rigorous approximate maximum a-posteriori (MAP) inference algorithm for DPMMs. This algorithm is as simple as DP-means clustering and solves the MAP problem as well as Gibbs sampling does, while requiring only a fraction of the computational effort. (For freely available code that implements the MAP-DP algorithm for Gaussian mixtures see http://www.maxlittle.net/.) Unlike related small variance asymptotics (SVA), our method is non-degenerate and so inherits the “rich get richer” property of the Dirichlet process. It also retains a non-degenerate closed-form likelihood which enables out-of-sample calculations and the use of standard tools such as cross-validation. We illustrate the benefits of our algorithm on a range of examples and contrast it to variational, SVA and sampling approaches from both a computational complexity perspective as well as in terms of clustering performance. We demonstrate the wide applicability of our approach by presenting an approximate MAP inference method for the infinite hidden Markov model, whose performance contrasts favorably with a recently proposed hybrid SVA approach. Similarly, we show how our algorithm can be applied to a semiparametric mixed-effects regression model where the random effects distribution is modelled using an infinite mixture model, as used in longitudinal progression modelling in population health science. Finally, we propose directions for future research on approximate MAP inference in Bayesian nonparametrics.
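The abstract positions MAP-DP relative to DP-means; a minimal DP-means-style sketch (the hard-assignment algorithm of Kulis and Jordan, not the MAP-DP algorithm itself) on invented 2-D data is shown below:

```python
import numpy as np

def dp_means(X, lam, n_iter=20):
    """DP-means: k-means-like hard assignment that opens a new cluster
    whenever the nearest centre is further than lam (squared distance)."""
    centres = [X.mean(axis=0)]
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        # Sequential assignment step: later points may join newly opened clusters.
        for i, x in enumerate(X):
            d = [np.sum((x - c) ** 2) for c in centres]
            if min(d) > lam:
                centres.append(x.copy())
                labels[i] = len(centres) - 1
            else:
                labels[i] = int(np.argmin(d))
        # Update step: recompute the mean of each non-empty cluster and re-index labels.
        keep = sorted(set(labels.tolist()))
        centres = [X[labels == k].mean(axis=0) for k in keep]
        remap = {old: new for new, old in enumerate(keep)}
        labels = np.array([remap[l] for l in labels.tolist()])
    return np.array(centres), labels

rng = np.random.default_rng(6)
X = np.vstack([rng.normal(loc, 0.3, size=(100, 2)) for loc in ([0, 0], [3, 3], [0, 4])])
centres, labels = dp_means(X, lam=2.0)
print("clusters found:", len(centres))
```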
Abstract:
Numerous studies find that monetary models of exchange rates cannot beat a random walk model. Such a finding, however, is not surprising given that such models are built upon money demand functions and traditional money demand functions appear to have broken down in many developed countries. In this article, we investigate whether using a more stable underlying money demand function results in improvements in forecasts of monetary models of exchange rates. More specifically, we use a sweep-adjusted measure of US monetary aggregate M1 which has been shown to have a more stable money demand function than the official M1 measure. The results suggest that the monetary models of exchange rates contain information about future movements of exchange rates, but the success of such models depends on the stability of money demand functions and the specifications of the models.
Abstract:
This paper examines the source country determinants of FDI into Japan. The paper highlights certain methodological and theoretical weaknesses in the previous literature and offers some explanations for hitherto ambiguous results. Specifically, the paper highlights the importance of panel data analysis, and the identification of fixed effects in the analysis rather than simply pooling the data. Indeed, we argue that many of the results reported elsewhere are a feature of this mis-specification. To this end, pooled, fixed effects and random effects estimates are compared. The results suggest that FDI into Japan is inversely related to trade flows, such that trade and FDI are substitutes. Moreover, the results also suggest that FDI increases with home country political and economic stability. The paper also shows that previously reported results, regarding the importance of exchange rates, relative borrowing costs and labour costs in explaining FDI flows, are sensitive to the econometric specification and estimation approach. The paper also discusses the importance of these results within a policy context. In recent years Japan has sought to attract FDI, though many firms still complain of barriers to inward investment penetration in Japan. The results show that cultural and geographic distance are only of marginal importance in explaining FDI, and that the results are consistent with the market-seeking explanation of FDI. As such, the attitude to risk in the source country is strongly related to the size of FDI flows to Japan. © 2007 The Authors Journal compilation © 2007 Blackwell Publishing Ltd.
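A minimal sketch of the pooled versus fixed effects versus random effects comparison on an invented country-year panel, assuming the linearmodels package is available; it is not the paper's specification and the variable names are placeholders:

```python
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS, PooledOLS, RandomEffects

rng = np.random.default_rng(7)

# Invented source-country panel: 30 countries observed over 10 years.
n_countries, n_years = 30, 10
idx = pd.MultiIndex.from_product(
    [range(n_countries), range(2000, 2000 + n_years)], names=["country", "year"]
)
country_effect = np.repeat(rng.normal(0.0, 1.0, n_countries), n_years)
trade = rng.normal(5.0, 1.0, n_countries * n_years)
stability = rng.normal(0.0, 1.0, n_countries * n_years)
fdi = (2.0 - 0.5 * trade + 0.8 * stability + country_effect
       + rng.normal(0.0, 0.5, n_countries * n_years))

df = pd.DataFrame({"fdi": fdi, "trade": trade, "stability": stability}, index=idx)
df["const"] = 1.0
exog = df[["const", "trade", "stability"]]

# Compare pooled, fixed effects and random effects estimates of the same equation.
pooled = PooledOLS(df["fdi"], exog).fit()
fixed  = PanelOLS(df["fdi"], exog, entity_effects=True, drop_absorbed=True).fit()
random = RandomEffects(df["fdi"], exog).fit()

print(pooled.params, fixed.params, random.params, sep="\n\n")
```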