979 results for Forecast error variance


Relevance:

80.00%

Publisher:

Abstract:

This study examines rationality and momentum in forecasts of rental, capital value and total returns for the real estate investment market in the United Kingdom. To investigate whether forecasters are affected by the general economic conditions prevailing at the time of the forecast, we incorporate Gross Domestic Product (GDP) and the Default Spread (DS) into the analysis. The empirical findings show high levels of momentum in the forecasts, with highly persistent forecast errors. The results also indicate that forecasters are affected by adverse conditions: forecast errors tend to be larger when the property market is underperforming, and smaller when it is performing well.
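A minimal sketch of the kind of rationality and persistence tests this involves: a Mincer-Zarnowitz regression of actuals on forecasts and an AR(1) regression on forecast errors. The file and column names are hypothetical, not from the study.

```python
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("uk_property_forecasts.csv")  # hypothetical input
df["error"] = df["actual_total_return"] - df["forecast_total_return"]

# Rationality: regress actual on forecast; an unbiased, rational forecast
# implies an intercept of 0 and a slope of 1.
mz = sm.OLS(df["actual_total_return"],
            sm.add_constant(df["forecast_total_return"])).fit()
print(mz.summary())

# Momentum: an AR(1) coefficient on the lagged error near zero is
# consistent with rationality; a large coefficient signals persistence.
ar1 = sm.OLS(df["error"].iloc[1:].values,
             sm.add_constant(df["error"].shift(1).dropna().values)).fit()
print("AR(1) coefficient on lagged forecast error:", ar1.params[1])
```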

Relevance:

80.00%

Publisher:

Abstract:

Extreme variability of the winter- and spring-time stratospheric polar vortex has been shown to affect extratropical tropospheric weather. Reducing stratospheric forecast error may therefore be one way to improve the skill of tropospheric weather forecasts. This review examines the basis for this idea. A range of studies of different stratospheric extreme vortex events shows that they can be skilfully forecast beyond five days, and in some cases into the sub-seasonal range (0-30 days). Separate studies show that typical errors in forecasting a stratospheric extreme vortex event can alter tropospheric forecast skill by 5-7% in the extratropics on sub-seasonal timescales. Understanding what limits stratospheric predictability is therefore of significant interest to operational forecasting centres. Both limitations in forecasting tropospheric planetary waves and stratospheric model biases have been shown to be important in this context.

Relevance:

80.00%

Publisher:

Abstract:

Satellite-based (e.g., Synthetic Aperture Radar [SAR]) water level observations (WLOs) of the floodplain can be sequentially assimilated into a hydrodynamic model to decrease forecast uncertainty. This has the potential to keep the forecast on track, thereby providing an Earth Observation (EO) based flood forecast system. However, the operational applicability of such a system to floods that develop over river networks requires further testing. One promising family of techniques for assimilation in this field is the ensemble Kalman filter (EnKF). These filters use a limited-size ensemble representation of the forecast error covariance matrix, which tends to develop spurious correlations as the forecast-assimilation cycle proceeds; this is a further complication when dealing with floods in urban areas or at river junctions in rural environments. Here we evaluate the assimilation of WLOs obtained from a sequence of real SAR overpasses (the X-band COSMO-SkyMed constellation) in a case study. We show that a direct application of a global Ensemble Transform Kalman Filter (ETKF) suffers from filter divergence caused by spurious correlations. In contrast, spatial filter localization substantially moderates the development of spurious correlations in the forecast error covariance matrix, directly improving the forecast and also making it possible to benefit from simultaneous online estimation and correction of inflow errors. Additionally, we propose and evaluate a novel along-network metric for filter localization, which is physically meaningful for the flood-over-a-network problem. Using this metric, we further evaluate the simultaneous estimation of channel friction and spatially-variable channel bathymetry, for which the filter converges to sensible values. Results also indicate that friction is a second-order effect in flood inundation models applied to gradually varied flow in large rivers. The study is not conclusive on whether, in an operational situation, the simultaneous estimation of friction and bathymetry improves the current forecast. Overall, the results indicate the feasibility of stand-alone EO-based operational flood forecasting.
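A minimal sketch of the localization idea discussed above, assuming a Gaspari-Cohn taper applied as an elementwise (Schur) product to the ensemble covariance. All sizes and names are illustrative; in the along-network variant, `dist` would hold distances measured along the river network rather than a straight-line metric.

```python
import numpy as np

def gaspari_cohn(r):
    """Gaspari-Cohn fifth-order taper; r = distance / localization radius."""
    r = np.abs(r)
    taper = np.zeros_like(r)
    m = r <= 1.0
    taper[m] = (((-0.25 * r[m] + 0.5) * r[m] + 0.625) * r[m]
                - 5.0 / 3.0) * r[m] ** 2 + 1.0
    m = (r > 1.0) & (r <= 2.0)
    taper[m] = ((((r[m] / 12.0 - 0.5) * r[m] + 0.625) * r[m] + 5.0 / 3.0) * r[m]
                - 5.0) * r[m] + 4.0 - 2.0 / (3.0 * r[m])
    return taper

rng = np.random.default_rng(0)
n_state, n_ens = 200, 20                  # small ensemble -> spurious correlations
X = rng.standard_normal((n_state, n_ens))
Xp = X - X.mean(axis=1, keepdims=True)
P = Xp @ Xp.T / (n_ens - 1)               # raw ensemble forecast error covariance

coords = np.arange(n_state, dtype=float)  # 1-D stand-in for channel chainage
dist = np.abs(coords[:, None] - coords[None, :])
P_loc = P * gaspari_cohn(dist / 30.0)     # taper suppresses long-range noise
```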

Relevance:

80.00%

Publisher:

Abstract:

Four-dimensional variational data assimilation (4DVAR) assimilates observations through the minimisation of a least-squares objective function constrained by the model flow. We refer to 4DVAR as strong-constraint 4DVAR (sc4DVAR) in this thesis, as it assumes the model is perfect. Relaxing this assumption gives rise to weak-constraint 4DVAR (wc4DVAR), leading to a different minimisation problem with more degrees of freedom. We consider two wc4DVAR formulations in this thesis: the model error formulation and the state estimation formulation. The 4DVAR objective function is traditionally minimised using gradient-based iterative methods; the principal method used in numerical weather prediction today is the Gauss-Newton approach. This method introduces a linearised `inner-loop' objective function which, upon convergence, updates the solution of the non-linear `outer-loop' objective function. This requires many evaluations of the objective function and its gradient, which emphasises the importance of the Hessian. The eigenvalues and eigenvectors of the Hessian provide insight into the degree of convexity of the objective function and indicate the difficulty one may encounter in iteratively solving 4DVAR. The condition number of the Hessian is an appropriate measure of the sensitivity of the problem to input data, and can also indicate the rate of convergence and solution accuracy of the minimisation algorithm. This thesis investigates the sensitivity of the solution process for both wc4DVAR objective functions to the internal assimilation parameters composing the problem. We gain insight into these sensitivities by bounding the condition numbers of the Hessians of both objective functions. We also precondition the model error objective function and show improved convergence. Using the bounds, we show that the sensitivities of both formulations are related to the balance of error variances, the assimilation window length and the correlation length-scales. We demonstrate this further through numerical experiments on the condition number, and through data assimilation experiments using linear and non-linear chaotic toy models.
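As a toy illustration of the conditioning question studied in the thesis, the sketch below builds the Gauss-Newton Hessian of a strong-constraint problem, S = B^{-1} + H^T R^{-1} H, computes its condition number, and shows the effect of the standard B^{1/2} (control variable transform) preconditioning. All sizes and covariance choices are illustrative, not those used in the thesis.

```python
import numpy as np

n, p = 50, 20
sigma_b, sigma_o, L = 1.0, 0.5, 5.0

# Background error covariance with exponential correlations of length-scale L.
idx = np.arange(n)
B = sigma_b**2 * np.exp(-np.abs(idx[:, None] - idx[None, :]) / L)
R = sigma_o**2 * np.eye(p)                          # uncorrelated observation errors

H = np.zeros((p, n))                                # observe every (n // p)-th point
H[np.arange(p), np.arange(p) * (n // p)] = 1.0

S = np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H   # Gauss-Newton (sc4DVAR) Hessian
print("condition number:", np.linalg.cond(S))

# Preconditioning with B^{1/2}: the transformed Hessian
# I + B^{1/2} H^T R^{-1} H B^{1/2} is typically far better conditioned.
w, V = np.linalg.eigh(B)
B_half = V @ np.diag(np.sqrt(w)) @ V.T
S_pre = np.eye(n) + B_half @ H.T @ np.linalg.inv(R) @ H @ B_half
print("preconditioned condition number:", np.linalg.cond(S_pre))
```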

Relevance:

80.00%

Publisher:

Abstract:

Background: There are few validated measures of organizational context, and none that we located are parsimonious and address modifiable characteristics of context. The Alberta Context Tool (ACT) was developed to meet this need. The instrument assesses 8 dimensions of context, which comprise 10 concepts. The purpose of this paper is to report evidence to further the validity argument for the ACT. The specific objectives are to: (1) examine the extent to which the 10 ACT concepts discriminate between patient care units, and (2) identify variables that significantly contribute to between-unit variation for each of the 10 concepts.

Methods: 859 professional nurses (844 valid responses) working in medical, surgical and critical care units of 8 Canadian pediatric hospitals completed the ACT. A random intercept, fixed effects hierarchical linear modeling (HLM) strategy was used to quantify and explain variance in the 10 ACT concepts, to establish the ACT's ability to discriminate between units. We ran 40 models (a series of 4 models for each of the 10 concepts) in which we systematically assessed the unique contribution (i.e., error variance reduction) of different variables to between-unit variation. First, we constructed a null model to quantify the overall variance in each of the concepts. Then we controlled for the contribution of individual-level variables (Model 1). In Model 2, we assessed the contribution of practice specialty (medical, surgical, critical care) to variation, since specialty was central to the construction of the study's sampling frame. Finally, we assessed the contribution of additional unit-level variables (Model 3).
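A minimal sketch of this variance-partitioning strategy for a single ACT concept, assuming a random-intercept model fitted with statsmodels; the data file, concept, and variable names are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("act_responses.csv")  # hypothetical: one row per nurse

# Null model: quantifies between-unit variance for one ACT concept.
m0 = smf.mixedlm("leadership ~ 1", df, groups=df["unit"]).fit()
tau0 = float(m0.cov_re.iloc[0, 0])     # between-unit (random intercept) variance

# Intraclass correlation from the null model: the share of total variance
# attributable to units, i.e., evidence of discrimination between units.
icc = tau0 / (tau0 + m0.scale)
print(f"ICC: {icc:.3f}")

# Model 2: add practice specialty as a predictor.
m2 = smf.mixedlm("leadership ~ C(specialty)", df, groups=df["unit"]).fit()
tau2 = float(m2.cov_re.iloc[0, 0])

# Percent of between-unit variance explained by specialty.
print(f"variance explained: {100 * (tau0 - tau2) / tau0:.2f}%")
```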

Results: The null model (unadjusted baseline HLM model) established that there was significant variation between units in each of the 10 ACT concepts (i.e., discrimination between units). When we controlled for individual characteristics, significant variation in the 10 concepts remained. Assessment of the contribution of specialty to between-unit variation enabled us to explain more variance (1.19% to 16.73%) in 6 of the 10 ACT concepts. Finally, when we assessed the unique contribution of the unit level variables available to us, we were able to explain additional variance (15.91% to 73.25%) in 7 of the 10 ACT concepts.

Conclusion: The findings reported here represent the third published argument for the validity of the ACT and add to the evidence supporting its use to discriminate between patient care units on all 10 contextual factors. We found evidence of relationships between a variety of individual- and unit-level variables that explained much of the between-unit variation for each of the 10 ACT concepts. Future research will examine the relationships between the ACT's contextual factors and research utilization by nurses, and ultimately the relationships between context, research utilization, and patient outcomes.

Relevance:

80.00%

Publisher:

Abstract:

Objective: This study investigated the relationship between motor performance and social-communicative impairment in children with ADHD-combined type (ADHD-CT). Method: An upper-limb Fitts' aiming task was used as a measure of motor performance and the Social Responsiveness Scale as a measure of social-communicative/autistic impairment in two groups: ADHD-CT (n = 11) and typically developing (TD) controls (n = 10). Results: Children with ADHD-CT displayed greater variability in their movements, reflected in increased error variance over repeated aiming trials compared with TD controls. Motor performance variability was associated with social-communicative deficits in the ADHD-CT group but not in the TD group. Conclusion: Social-communicative impairments further complicate the clinical picture of ADHD-CT. Further research is therefore warranted to ascertain whether a particular pattern of motor disturbance in children with ADHD-CT may be clinically useful in identifying and assessing children with a more complex ADHD presentation.

Relevance:

80.00%

Publisher:

Abstract:

Diverse strain types of methicillin-resistant Staphylococcus aureus (MRSA) cause infections in community settings worldwide. To examine heterogeneity of spread within households and to identify common risk factors for household transmission across settings, primary data from studies conducted in New York (USA), Breda (The Netherlands), and Melbourne (Australia) were pooled. Following MRSA infection of the index patient, household members completed questionnaires and provided nasal swabs. Swabs positive for S. aureus were genotyped by spa sequencing. Poisson regression with robust error variance was used to estimate prevalence odds ratios for transmission of the clinical isolate to non-index household members. Great diversity of strain types existed across studies. Despite differences between studies, the index patient being colonized with the clinical isolate at the home visit (P < 0·01) and the percent of household members aged <18 years (P < 0·01) were independently associated with transmission. Targeted decolonization strategies could be used across geographical settings to limit household MRSA transmission.
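A minimal sketch of Poisson regression with robust error variance for a binary transmission outcome (the "modified Poisson" approach), assuming statsmodels; the data file and covariate names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("household_members.csv")  # hypothetical pooled data

model = smf.glm(
    "transmitted ~ index_colonized + pct_under18",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="HC0")  # robust sandwich variance corrects the Poisson misfit
                       # for a binary outcome

# Exponentiated coefficients are interpretable as prevalence ratios.
print(np.exp(model.params))
print(model.summary())
```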

Relevance:

80.00%

Publisher:

Abstract:

This study demonstrates, for the first time, how Bayesian hierarchical modeling can be applied to yield novel insights into the long-term temporal dynamics of subjective well-being (SWB). Several models were proposed and examined using Bayesian methods. The models were assessed using a sample of Australian adults (n = 1081) who provided annual SWB scores on between 5 and 10 occasions. The best-fitting models involved a probit transformation, allowed the error variance to vary across participants, and did not include a lag parameter. Including random linear and quadratic effects resulted in only a small improvement over the intercept-only model. Examination of individual-level fits suggested that most participants were stable, with a small subset exhibiting patterns of systematic change.
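A minimal sketch of one model class examined in the study: an intercept-only hierarchical model in which the error variance varies across participants, written here in PyMC with simulated data. The probit transformation of the raw scores is assumed to have been applied to `y` beforehand, and all names and priors are illustrative.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(1)
n_people, n_obs = 50, 8
pid = np.repeat(np.arange(n_people), n_obs)          # participant index per score
true_mu = rng.normal(0.0, 1.0, n_people)
true_sd = np.exp(rng.normal(-0.5, 0.3, n_people))
y = rng.normal(true_mu[pid], true_sd[pid])           # simulated transformed SWB

with pm.Model() as swb_model:
    mu = pm.Normal("mu", 0.0, 2.0, shape=n_people)          # person-level means
    log_sd = pm.Normal("log_sd", 0.0, 1.0, shape=n_people)  # person-specific
    sd = pm.Deterministic("sd", pm.math.exp(log_sd))        # error SDs
    pm.Normal("y_obs", mu=mu[pid], sigma=sd[pid], observed=y)
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=1)
```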

Relevance:

80.00%

Publisher:

Abstract:

This paper uses prediction intervals (PIs) to capture the uncertainty associated with wind power generation in power systems. Since the forecasting errors cannot be appropriately modeled using probability distribution functions, we employ a powerful nonparametric approach, the lower upper bound estimation (LUBE) method, to construct the PIs. The proposed method uses a new framework based on a combination of PIs to overcome the performance instability of the neural networks (NNs) used in LUBE. A new fuzzy-based cost function is also proposed, with the purpose of allowing more freedom and flexibility in adjusting the NN parameters used to construct the PIs. In comparison with other cost functions in the literature, this new formulation allows decision-makers to express their preferences for satisfying the PI coverage probability and the PI normalized average width individually. As the optimization tool, a bat algorithm with a new modification is introduced to solve the problem. The feasibility and satisfactory performance of the proposed method are demonstrated using datasets taken from different wind farms in Australia.
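The two interval-quality measures named above can be computed directly; the sketch below shows PI coverage probability (PICP) and PI normalized average width (PINAW) on illustrative arrays standing in for NN outputs.

```python
import numpy as np

def picp(y, lower, upper):
    """Fraction of targets falling inside their prediction intervals."""
    return np.mean((y >= lower) & (y <= upper))

def pinaw(y, lower, upper):
    """Average interval width, normalized by the observed target range."""
    return np.mean(upper - lower) / (y.max() - y.min())

y = np.array([3.1, 2.8, 4.0, 3.5])   # observed wind power (illustrative)
lo = np.array([2.5, 2.0, 3.2, 3.0])  # NN lower-bound outputs
hi = np.array([3.6, 3.0, 4.5, 4.1])  # NN upper-bound outputs
print(f"PICP = {picp(y, lo, hi):.2f}, PINAW = {pinaw(y, lo, hi):.2f}")
```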

Relevance:

80.00%

Publisher:

Abstract:

I start by presenting an explicit solution to Taylor's (2001) model, in order to illustrate the link between the target interest rate and the overnight interest rate prevailing in the economy. Next, I use vector autoregressions to shed some light on the evolution of key macroeconomic variables after the Central Bank of Brazil increases the target interest rate by 1%. Point estimates show a four-year accumulated output loss ranging from 0.04% (whole sample, 1980:1-2004:2, quarterly data) to 0.25% (post-Real data only), with a first-year peak output response of between 0.04% and 1.0%, respectively. Prices decline between 2% and 4% over a 4-year horizon. The accumulated output response is found to be between 3.5 and 6 times larger after the Real Plan than when the whole sample is considered. The 95% confidence bands obtained using the bias-corrected bootstrap always include the null output response when the whole sample is used, but not when the data are restricted to the post-Real period. Innovations to interest rates explain between 4.9% (whole sample) and 9.2% (post-Real sample) of the forecast error variance of GDP.
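A minimal sketch of the VAR exercise described above, using statsmodels to fit a small system and read off the forecast error variance decomposition of GDP with respect to interest-rate innovations; the data file and variable names are hypothetical.

```python
import pandas as pd
from statsmodels.tsa.api import VAR

df = pd.read_csv("brazil_macro.csv", index_col=0, parse_dates=True)
data = df[["gdp", "prices", "selic"]]  # hypothetical quarterly series

res = VAR(data).fit(maxlags=4, ic="aic")

# Forecast error variance decomposition at a 16-quarter (4-year) horizon:
# the "selic" column of the "gdp" table gives the share of GDP forecast
# error variance attributable to interest-rate innovations.
fevd = res.fevd(16)
fevd.summary()

# Impulse responses of output to a rate shock over the same horizon.
irf = res.irf(16)
irf.plot(impulse="selic", response="gdp")
```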

Relevance:

80.00%

Publisher:

Abstract:

It is well known that cointegration between the levels of two variables (labeled Yt and yt in this paper) is a necessary condition for assessing the empirical validity of a present-value model (PV and PVM, respectively, hereafter) linking them. The work on cointegration has been so prevalent that it is often overlooked that another necessary condition for the PVM to hold is that the forecast error entailed by the model is orthogonal to the past. This result rests on the use of rational expectations in forecasting future values of variables in the PVM. If this condition fails, the present-value equation will not be valid, since it will contain an additional term capturing the (non-zero) conditional expected value of future error terms. Our article makes a few novel contributions, two of which stand out. First, in testing for PVMs, we advise splitting the restrictions implied by PV relationships into orthogonality conditions (or reduced-rank restrictions) before testing additional restrictions on the values of parameters. We show that PV relationships entail a weak-form common feature relationship, as in Hecq, Palm, and Urbain (2006) and Athanasopoulos, Guillén, Issler and Vahid (2011), and also a polynomial serial-correlation common feature relationship, as in Cubadda and Hecq (2001); these represent restrictions on dynamic models which allow several tests for the existence of PV relationships to be used. Because these relationships occur mostly with financial data, we propose tests based on generalized method of moments (GMM) estimates, where it is straightforward to construct tests that are robust in the presence of heteroskedasticity. We also propose a robust Wald test for the presence of reduced-rank models. Their performance is evaluated in a Monte Carlo exercise. Second, in the context of asset pricing, we propose applying a permanent-transitory (PT) decomposition based on Beveridge and Nelson (1981), which focuses on extracting the long-run component of asset prices, a key concept in modern financial theory, as discussed in Alvarez and Jermann (2005), Hansen and Scheinkman (2009), and Nieuwerburgh, Lustig and Verdelhan (2010). Here again we exploit results developed in the common cycle literature to easily extract permanent and transitory components under both long- and short-run restrictions. The techniques discussed herein are applied to long-span annual data on long- and short-term interest rates and on prices and dividends for the U.S. economy. In both applications we do not reject the existence of a common cyclical feature vector linking the two series. Extracting the long-run component shows the usefulness of our approach and highlights the presence of asset-pricing bubbles.
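A minimal sketch of the orthogonality condition at the heart of the argument: under rational expectations the present-value forecast error should be unpredictable from past information, which can be checked by regressing it on lagged variables with heteroskedasticity-robust standard errors and a joint Wald test. Series names are hypothetical, and this simple regression stands in for the GMM-based tests proposed in the paper.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pv_data.csv")  # hypothetical: e.g., prices and dividends
df["lag_error"] = df["pv_error"].shift(1)
df["lag_dy"] = df["dividend_yield"].shift(1)

ols = smf.ols("pv_error ~ lag_error + lag_dy",
              data=df.dropna()).fit(cov_type="HC0")

# Joint Wald test that all lagged regressors have zero coefficients;
# rejection is evidence against the present-value model.
print(ols.wald_test("lag_error = 0, lag_dy = 0", use_f=True))
```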

Relevance:

80.00%

Publisher:

Abstract:

The broader objective of this study can be articulated in the following specific aims: to measure consumers' attitudes towards brands displayed through product placement, and to examine the recall, recognition and purchase intentions that product placements generate in consumers. In addition, the study examines the differences and similarities between the behavior of Brazilian and American consumers under the influence of product placements. The study targeted consumer audiences in Brazil and the U.S. A range of modeling setups was used to align the study instruments and hypotheses with the research objectives. The study focused on the following hypothesized models. H1: Consumers/participants who viewed the brands/products in the movie have higher brand/product recall than consumers/participants who did not. H2: Consumers/participants from the U.S. are better able to recognize and recall brands/products appearing in the background of the movie than those from Brazil. H3: Consumers/participants from the U.S. are more accepting of product placements than their counterparts in Brazil. H4: There are discernible similarities in brand attitudes and purchase intentions between consumers/participants from the U.S. and Brazil, despite their different countries of origin. Cronbach's alpha coefficient was used to ensure the reliability of the survey instruments. Hypotheses were tested using structural equation modeling (SEM). Convergent and discriminant validity were assessed with confirmatory factor analysis (CFA) rather than exploratory factor analysis (EFA) or principal component analysis (PCA), supplemented by chi-square and t tests. Only hypothesis H3 was rejected. The t tests provided insight into significant differences among specific subgroups. In the SEM testing, the error variance for product-placement attitudes was negative for both groups (a Heywood case), which was handled by constraining the offending estimates. Both quantitative and qualitative approaches were used, with closed-ended questionnaires and interviews, respectively, collecting the primary data; results were also tabulated. It can be concluded that product placement effects differ markedly between the U.S. and Brazil, based on the range of factors examined in the study, although there are elements of convergence, probably driven by convergence in technology. For product placement to become more competitive within promotional marketing, researchers will need to extend their focus beyond the traditional variables and build knowledge of marketplace factors, that is, the marketability of product placement technologies and strategies.
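A minimal sketch of the reliability check mentioned above: Cronbach's alpha computed directly from its definition for a set of questionnaire items; the data file and item names are hypothetical.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

survey = pd.read_csv("placement_attitudes.csv")  # hypothetical responses
print(cronbach_alpha(survey[["att1", "att2", "att3", "att4"]]))
```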

Relevance:

80.00%

Publisher:

Abstract:

It is well known that cointegration between the levels of two variables (e.g. prices and dividends) is a necessary condition for assessing the empirical validity of a present-value model (PVM) linking them. The work on cointegration, namely on long-run co-movements, has been so prevalent that it is often overlooked that another necessary condition for the PVM to hold is that the forecast error entailed by the model is orthogonal to the past. This amounts to investigating whether short-run co-movements stemming from common cyclical feature restrictions are also present in such a system. In this paper we test for the presence of such co-movements in long- and short-term interest rates and in prices and dividends for the U.S. economy. We focus on the potential improvement in forecasting accuracy when imposing these two types of restrictions derived from economic theory.

Relevance:

80.00%

Publisher:

Abstract:

In this paper, we propose a reduced-form econometric model, estimated by ordinary least squares (OLS) and based on macroeconomic variables, to explain the quarterly returns of the IBRX-100 stock index between 2001 and 2015. We also test the predictive efficiency of the model and conclude that the forecast error estimated in a rolling window, with OLS re-estimation at each round and an auxiliary VAR used to project the regressors, is significantly smaller than the forecast error associated with the random walk hypothesis at the one-quarter-ahead forecast horizon.
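A minimal sketch of the rolling-window evaluation described above, re-estimating the OLS model each quarter and comparing its one-step-ahead RMSE with a random-walk benchmark. The data and column names are hypothetical, and the auxiliary VAR used to project the regressors is replaced here by their realized values for simplicity.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("ibrx_macro.csv")  # hypothetical quarterly returns + macro vars
y = df["ibrx_return"]
X = sm.add_constant(df[["gdp_growth", "selic", "fx_change"]])

window = 40                          # ~10 years of quarterly observations
errors_ols, errors_rw = [], []
for t in range(window, len(df)):
    fit = sm.OLS(y.iloc[:t], X.iloc[:t]).fit()   # re-estimate every round
    pred = float(np.asarray(fit.predict(X.iloc[[t]]))[0])
    errors_ols.append(y.iloc[t] - pred)
    errors_rw.append(y.iloc[t] - 0.0)            # random walk in prices implies
                                                 # an expected return of zero
rmse = lambda e: float(np.sqrt(np.mean(np.square(e))))
print(f"OLS RMSE: {rmse(errors_ols):.4f}  RW RMSE: {rmse(errors_rw):.4f}")
```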

Relevance:

80.00%

Publisher:

Abstract:

The objective of this study is to analyze the relationship between market analysts' forecast errors for the profitability of companies listed on BM&FBOVESPA S.A. (Bovespa) and the disclosure requirements of the International Financial Reporting Standards (IFRS). This was done by regressing analysts' forecast errors using panel data methodology for 2010, the year IFRS was adopted in Brazil, and, complementarily, for 2012 as a benchmark for these data. The forecast error of companies listed on the Bovespa was computed from forecast and realized profitability data (earnings per share) available in the I/B/E/S Earnings Consensus Information database, provided through the Thomson ONE Investment Banking platform, and in Economática Pro®, respectively. The results indicate a negative relationship between forecast error and compliance with IFRS disclosure requirements: the higher the quality of the disclosed information, the smaller the analysts' forecast error. These results therefore support the view that the degree of compliance with accounting standards is as important as, or more important than, the standards themselves. Additionally, we find that the forecast error of a listed company is unchanged when the company is overseen by a regulatory agency. Finally, these results suggest the importance of improving mechanisms for auditing firms' compliance with disclosure requirements, such as penalties for non-compliance (enforcement), corporate governance structures, and internal and external audits.
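A minimal sketch of the panel exercise described above, assuming the linearmodels package: a pooled panel regression of analyst forecast error on an IFRS disclosure-compliance score with firm-clustered standard errors. The data file, error definition, scaling and variable names are all hypothetical.

```python
import pandas as pd
from linearmodels.panel import PooledOLS

df = pd.read_csv("bovespa_panel.csv")  # hypothetical firm-year panel
# Absolute EPS forecast error, scaled by price (a common normalization).
df["error"] = (df["actual_eps"] - df["forecast_eps"]).abs() / df["price"]
df = df.set_index(["firm", "year"])    # entity-time index for panel methods

res = PooledOLS.from_formula(
    "error ~ 1 + ifrs_compliance + size + regulated", data=df
).fit(cov_type="clustered", cluster_entity=True)  # firm-clustered errors
print(res.summary)
```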