984 results for Model-specification


Relevance:

60.00%

Publisher:

Abstract:

The use of hedonic models to estimate the effects of various factors on house prices is well established. This paper examines a number of international hedonic house price models that seek to quantify the effect of infrastructure charges on new house prices. This work is an important input to the housing affordability debate: many governments in high-growth areas operate user-pays infrastructure charging policies in tandem with housing affordability objectives, yet there is no empirical evidence on the impact of one on the other. This research finds little consistency between existing models and the data sets utilised; specification appears to depend upon data availability rather than sound theoretical grounding, which may lead to a lack of external validity.
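
A minimal sketch of the kind of hedonic specification these studies estimate: the log of the sale price regressed on dwelling attributes plus an infrastructure-charge variable. All variable names and the simulated data below are illustrative, not taken from any of the reviewed models.

import numpy as np

rng = np.random.default_rng(0)
n = 500

# Illustrative covariates: floor area, lot size, age, and the per-dwelling
# infrastructure charge levied on the new house (all simulated).
area = rng.normal(180, 40, n)          # m^2
lot = rng.normal(600, 150, n)          # m^2
age = rng.uniform(0, 5, n)             # years
charge = rng.normal(25_000, 8_000, n)  # $ infrastructure charge

log_price = (12.0 + 0.0030 * area + 0.0004 * lot - 0.01 * age
             + 0.000004 * charge + rng.normal(0, 0.08, n))

# Hedonic model: log(P) = b0 + b1*area + b2*lot + b3*age + b4*charge + e,
# estimated by ordinary least squares.
X = np.column_stack([np.ones(n), area, lot, age, charge])
beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)

# b4 is the approximate proportional price effect of one extra dollar of charge.
print("estimated coefficients:", np.round(beta, 7))
print(f"implied % price effect of +$10,000 in charges: {beta[4] * 10_000 * 100:.2f}%")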

Relevance:

60.00%

Publisher:

Abstract:

In this article we introduce and evaluate testing procedures for specifying the number k of nearest neighbours in the weights matrix of spatial econometric models. The spatial J-test is used for specification search. Two testing procedures are suggested: an increasing neighbours testing procedure and a decreasing neighbours testing procedure. Simulations show that the increasing neighbours testing procedure can be used in large samples to determine k. The decreasing neighbours testing procedure is found to have low power and is not recommended for use in practice. An empirical example involving house price data shows how to use the testing procedures with real data.
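
A sketch of the object at issue, a row-standardised k-nearest-neighbour spatial weights matrix, together with the skeleton of an increasing-neighbours search over k. The paper's spatial J-test is not implemented here; as a stand-in diagnostic the loop reports Moran's I of OLS residuals, so the search below is illustrative only.

import numpy as np

def knn_weights(coords, k):
    # Row-standardised k-nearest-neighbour spatial weights matrix.
    n = coords.shape[0]
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)            # a unit is not its own neighbour
    W = np.zeros((n, n))
    for i in range(n):
        W[i, np.argsort(d[i])[:k]] = 1.0 / k
    return W

def morans_i(W, e):
    # Moran's I of a residual vector e under weights W (stand-in diagnostic).
    z = e - e.mean()
    return (len(e) / W.sum()) * (z @ W @ z) / (z @ z)

rng = np.random.default_rng(1)
n = 200
coords = rng.uniform(0, 10, (n, 2))        # house locations
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 0.5]) + rng.normal(size=n)
e = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

# Increasing-neighbours search: start from a small k and move up, examining a
# residual diagnostic at each step (the paper applies the spatial J-test here).
for k in range(2, 11):
    print(f"k = {k:2d}  Moran's I of OLS residuals = {morans_i(knn_weights(coords, k), e):+.4f}")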

Relevance:

60.00%

Publisher:

Abstract:

Quantifying scientific uncertainty when setting total allowable catch limits for fish stocks is a major challenge, but it is a requirement in the United States since changes to national fisheries legislation. Multiple sources of error are readily identifiable, including estimation error, model specification error, forecast error, and errors associated with the definition and estimation of reference points. Our focus here, however, is to quantify the influence of estimation error and model specification error on assessment outcomes. These are fundamental sources of uncertainty in developing scientific advice concerning appropriate catch levels, and although a study of these two factors may not be exhaustive, it is feasible with available information. For data-rich stock assessments conducted on the U.S. west coast we report approximate coefficients of variation in terminal biomass estimates based on inversion of the assessment model's Hessian matrix (i.e., the asymptotic standard error). To summarize variation "among" stock assessments, as a proxy for model specification error, we characterize variation among multiple historical assessments of the same stock. Results indicate that for 17 groundfish and coastal pelagic species, the mean coefficient of variation of terminal biomass is 18%. In contrast, the coefficient of variation ascribable to model specification error (i.e., pooled among-assessment variation) is 37%. We show that if a precautionary probability of overfishing equal to 0.40 is adopted by managers, and only model specification error is considered, a 9% reduction in the overfishing catch level is indicated.
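
The closing arithmetic can be reproduced with the usual lognormal P* buffer calculation: with an among-assessment CV of 0.37 and a precautionary overfishing probability of 0.40, the catch multiplier comes out near 0.91, i.e. roughly a 9% reduction. The sketch below shows that standard calculation under a lognormal error assumption; it is not code from the study itself.

from math import sqrt, log, exp
from scipy.stats import norm

cv_within = 0.18   # mean CV of terminal biomass from Hessian-based SEs
cv_among = 0.37    # pooled among-assessment CV (proxy for specification error)
p_star = 0.40      # precautionary probability of overfishing chosen by managers

def buffer(cv, p_star):
    # Catch multiplier under a lognormal error assumption: exp(z_{P*} * sigma).
    sigma = sqrt(log(1.0 + cv ** 2))   # lognormal sigma implied by the CV
    return exp(norm.ppf(p_star) * sigma)

for label, cv in [("estimation error only", cv_within),
                  ("specification error only", cv_among)]:
    b = buffer(cv, p_star)
    print(f"{label:26s} CV={cv:.2f}  multiplier={b:.3f}  reduction={(1 - b) * 100:.1f}%")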

Relevance:

60.00%

Publisher:

Abstract:

We discuss a general approach to dynamic sparsity modeling in multivariate time series analysis. Time-varying parameters are linked to latent processes that are thresholded to induce zero values adaptively, providing natural mechanisms for dynamic variable inclusion/selection. We discuss Bayesian model specification, analysis and prediction in dynamic regressions, time-varying vector autoregressions, and multivariate volatility models using latent thresholding. Application to a topical macroeconomic time series problem illustrates some of the benefits of the approach in terms of statistical and economic interpretations as well as improved predictions. Supplementary materials for this article are available online. © 2013 Copyright Taylor and Francis Group, LLC.
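
The core latent-thresholding device can be illustrated in a few lines: a time-varying coefficient follows a latent AR(1) process and is set exactly to zero whenever its latent value falls inside a threshold band, giving dynamic variable inclusion/exclusion. This is a toy simulation of that mechanism, not the paper's full Bayesian analysis, and all parameter values are made up.

import numpy as np

rng = np.random.default_rng(7)
T, phi, mu, s, d = 300, 0.98, 0.3, 0.05, 0.25   # d is the threshold

# Latent AR(1) process b_t and its thresholded version beta_t.
b = np.empty(T)
b[0] = mu
for t in range(1, T):
    b[t] = mu + phi * (b[t - 1] - mu) + rng.normal(0, s)

beta = b * (np.abs(b) >= d)   # beta_t = b_t * 1(|b_t| >= d): exactly zero when small

# A dynamic regression y_t = beta_t * x_t + eps_t using the thresholded path.
x = rng.normal(size=T)
y = beta * x + rng.normal(0, 0.5, T)

print("share of time points with the predictor excluded:",
      float(np.mean(beta == 0).round(3)))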

Relevance:

60.00%

Publisher:

Abstract:

Using conjoint choice experiments, we surveyed 473 Swiss homeowners about their preferences for energy efficiency home renovations. We find that homeowners are responsive to the upfront costs of the renovation projects, government-offered rebates, savings in energy expenses, the time horizon over which such savings would be realized, and thermal comfort improvement. The implicit discount rate is low, ranging from 1.5 to 3%, depending on model specification. This is consistent with Hassett and Metcalf (1993) and Metcalf and Rosenthal (1995), and with the fact that our scenarios contain no uncertainty. Respondents who feel completely uncertain about future energy prices are more likely to select the status quo (no renovations) in any given choice task and weight the costs of the investments more heavily than the financial gains (subsidies and savings on the energy bills). Renovations are more likely when respondents believe that climate change considerations are important determinants of home renovations. Copyright © 2013 by the IAEE. All rights reserved.
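
One common way to back an implicit discount rate out of such choice-model estimates is to equate the marginal disutility of a franc of upfront cost with the marginal utility of a franc of annual savings received over the stated horizon, and solve the annuity equation for r. The coefficient values below are purely illustrative, and the paper's exact procedure may differ.

from scipy.optimize import brentq

# Illustrative (not estimated) conjoint coefficients:
b_cost = -0.8      # marginal utility per 1,000 CHF of upfront renovation cost
b_savings = 12.0   # marginal utility per 1,000 CHF of *annual* energy savings
horizon = 20       # years over which savings are said to accrue

# Willingness to pay today for 1 CHF of savings per year over the horizon:
wtp_per_annual_chf = b_savings / (-b_cost)

# The implicit discount rate r solves  sum_{t=1..T} (1+r)^-t = WTP.
annuity = lambda r, T: sum((1 + r) ** -t for t in range(1, T + 1))
r = brentq(lambda r: annuity(r, horizon) - wtp_per_annual_chf, 1e-6, 1.0)
print(f"implied discount rate: {r * 100:.2f}% per year")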

Relevance:

60.00%

Publisher:

Abstract:

Doctoral thesis, Fisheries Science and Technology, Faculdade de Ciências do Mar e do Ambiente, Universidade do Algarve, 2005.

Relevance:

60.00%

Publisher:

Abstract:

Background and objectives. In 1995, the Canadian government enacted Bill C-68, making the registration of all firearms mandatory and strengthening checks on prospective owners. In the absence of credible scientific evidence, the potential of this law to prevent homicides is currently being questioned. While overcoming the potential biases found in earlier evaluations, the objective of this thesis is to evaluate the effect of Bill C-68 on homicides in Quebec between 1974 and 2006. Methodology. The effect of Bill C-68 is evaluated using an extreme bounds analysis. The immediate and gradual effects of the law are assessed with 372 equations. Briefly, these are interrupted time-series analyses in which all combinations of independent variables are considered in order to avoid the biases associated with an arbitrary model specification. Results. The introduction of Bill C-68 is associated with a gradual decline in homicides committed with long guns (rifles and shotguns), with no tactical displacement observed. Homicides committed with restricted or prohibited firearms appear to be influenced by different factors. Conclusion. The results suggest that firearm control is an effective measure for preventing homicides. The absence of tactical displacement also suggests that the firearm is an important facilitator and that not all homicides are premeditated. Further studies are nevertheless needed to clearly identify the mechanisms of the law responsible for the decline in homicides.
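
A minimal sketch of the extreme-bounds logic described here: an interrupted time-series regression with a post-law indicator is re-estimated under every combination of a set of candidate control variables, and the range of the law coefficient across specifications is reported. The simulated data, the four control names, and the effect sizes are illustrative assumptions, not the thesis's 372-equation setup.

import numpy as np
from itertools import combinations

rng = np.random.default_rng(3)
years = np.arange(1974, 2007)
post_law = (years >= 1995).astype(float)          # Bill C-68 indicator
trend = (years - years.min()).astype(float)

# Illustrative candidate controls (unemployment, alcohol, young-male share...).
controls = {name: rng.normal(size=len(years)) for name in
            ["unemployment", "alcohol", "young_males", "divorce_rate"]}

homicides = 3.0 - 0.4 * post_law + 0.01 * trend + rng.normal(0, 0.3, len(years))

law_coefs = []
names = list(controls)
for r in range(len(names) + 1):
    for subset in combinations(names, r):          # every control combination
        X = np.column_stack([np.ones(len(years)), trend, post_law]
                            + [controls[c] for c in subset])
        beta = np.linalg.lstsq(X, homicides, rcond=None)[0]
        law_coefs.append(beta[2])                  # coefficient on post_law

print(f"{len(law_coefs)} specifications estimated")
print(f"extreme bounds of the law effect: [{min(law_coefs):.3f}, {max(law_coefs):.3f}]")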

Relevance:

60.00%

Publisher:

Abstract:

Pharmacogenetic trials investigate the effect of genotype on treatment response. When there are two or more treatment groups and two or more genetic groups, investigation of gene-treatment interactions is of key interest. However, calculation of the power to detect such interactions is complicated because this depends not only on the treatment effect size within each genetic group, but also on the number of genetic groups, the size of each genetic group, and the type of genetic effect that is both present and tested for. The scale chosen to measure the magnitude of an interaction can also be problematic, especially for the binary case. Elston et al. proposed a test for detecting the presence of gene-treatment interactions for binary responses, and gave appropriate power calculations. This paper shows how the same approach can also be used for normally distributed responses. We also propose a method for analysing and performing sample size calculations based on a generalized linear model (GLM) approach. The power of the Elston et al. and GLM approaches is compared for the binary and normal cases using several illustrative examples. While more sensitive to errors in model specification than the Elston et al. approach, the GLM approach is much more flexible and in many cases more powerful. Copyright © 2005 John Wiley & Sons, Ltd.
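
A sketch of what a simulation-based power calculation for a gene-by-treatment interaction can look like in the normal-response case under a linear-model (GLM) formulation. The 2x2 design, effect sizes, and group sizes below are assumptions for illustration, not the designs or the analytic power formulas studied in the paper.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)

def interaction_power(n_per_cell, delta, sigma=1.0, nsim=500, alpha=0.05):
    # Power to detect a treatment-by-genotype interaction of size delta
    # in a 2x2 design with a normally distributed response.
    hits = 0
    for _ in range(nsim):
        trt = np.repeat([0, 1, 0, 1], n_per_cell)
        geno = np.repeat([0, 0, 1, 1], n_per_cell)
        # The treatment effect differs by genotype; delta is the interaction.
        y = 0.5 * trt + delta * trt * geno + rng.normal(0, sigma, 4 * n_per_cell)
        df = pd.DataFrame({"y": y, "trt": trt, "geno": geno})
        fit = smf.ols("y ~ trt * geno", data=df).fit()
        hits += fit.pvalues["trt:geno"] < alpha
    return hits / nsim

print("power, n=50 per cell, delta=0.6:", interaction_power(50, 0.6))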

Relevance:

60.00%

Publisher:

Abstract:

We consider the issue of performing residual and local influence analyses in beta regression models with varying dispersion, which are useful for modelling random variables that assume values in the standard unit interval. In such models, both the mean and the dispersion depend upon independent variables. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes. An application using real data is presented and discussed.
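
A minimal maximum-likelihood sketch of the model class in question: a beta regression in which both the mean (logit link) and the precision/dispersion (log link) depend on covariates. The local-influence diagnostics that are the paper's contribution are not reproduced here, and the data and parameter values are simulated for illustration.

import numpy as np
from scipy.optimize import minimize
from scipy.special import expit, gammaln

rng = np.random.default_rng(5)
n = 400
x = rng.normal(size=n)
z = rng.normal(size=n)

mu = expit(0.2 + 0.8 * x)            # mean, logit link
phi = np.exp(2.0 + 0.5 * z)          # precision, log link
y = rng.beta(mu * phi, (1 - mu) * phi)

def negloglik(theta):
    b0, b1, g0, g1 = theta
    m = expit(b0 + b1 * x)
    p = np.exp(g0 + g1 * z)
    a, b = m * p, (1 - m) * p
    # Beta log-density: log G(phi) - log G(a) - log G(b) + (a-1)log y + (b-1)log(1-y)
    ll = (gammaln(p) - gammaln(a) - gammaln(b)
          + (a - 1) * np.log(y) + (b - 1) * np.log(1 - y))
    return -ll.sum()

fit = minimize(negloglik, x0=np.zeros(4), method="BFGS")
print("estimated (b0, b1, g0, g1):", np.round(fit.x, 3))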

Relevance:

60.00%

Publisher:

Abstract:

This thesis consists of four manuscripts in the area of nonlinear time series econometrics, on topics of testing, modeling and forecasting nonlinear common features. The aim of the thesis is to develop new econometric contributions for hypothesis testing and forecasting in this area. Both stationary and nonstationary time series are considered. A definition of common features is proposed in a way appropriate to each class. Based on this definition, a vector nonlinear time series model with common features is set up for testing for common features. Once well specified, the proposed models are also available for forecasting. The first paper addresses a testing procedure for nonstationary time series. A class of nonlinear cointegration, smooth-transition (ST) cointegration, is examined. ST cointegration nests the previously developed linear and threshold cointegration. An F-type test for examining ST cointegration is derived when stationary transition variables are imposed rather than nonstationary ones. The latter make the test standard, while the former make it nonstandard. This has important implications for empirical work: it is crucial to distinguish between the cases with stationary and nonstationary transition variables so that the correct test can be used. The second and fourth papers develop testing approaches for stationary time series. In particular, the vector ST autoregressive (VSTAR) model is extended to allow for common nonlinear features (CNFs). These two papers propose a modeling procedure and derive tests for the presence of CNFs. Drawing on the model specification and testing contributions above, the third paper considers forecasting with vector nonlinear time series models and extends the procedures available for univariate nonlinear models. The VSTAR model with CNFs and the ST cointegration model from the previous papers are treated in detail and illustrated with two corresponding macroeconomic data sets.
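
The smooth-transition building block used throughout (in both the ST cointegration and VSTAR settings) is the logistic transition function G(s; gamma, c), which moves the model continuously between two regimes as the transition variable s passes the location c. A minimal sketch of that function and of a single two-regime smooth-transition regression, with illustrative parameter values only:

import numpy as np

def G(s, gamma, c):
    # Logistic smooth-transition function: near 0 in one regime, near 1 in the other.
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

rng = np.random.default_rng(11)
T = 300
s = rng.normal(size=T)                     # stationary transition variable
x = rng.normal(size=T)

# Two-regime smooth-transition regression:
# y_t = (a1 + b1*x_t) * (1 - G(s_t)) + (a2 + b2*x_t) * G(s_t) + eps_t
a1, b1, a2, b2, gamma, c = 0.0, 0.5, 1.0, -0.5, 4.0, 0.0
g = G(s, gamma, c)
y = (a1 + b1 * x) * (1 - g) + (a2 + b2 * x) * g + rng.normal(0, 0.2, T)

# gamma controls the smoothness: a large gamma approaches a threshold model,
# while gamma -> 0 collapses the two regimes into a single linear model.
print("share of observations effectively in regime 2 (G > 0.5):",
      float(np.mean(g > 0.5)))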

Relevance:

60.00%

Publisher:

Abstract:

The thesis looks at the macroeconomic impact of foreign aid. It is especially concerned with aid's impact on the public sector of less developed countries (LDCs). Since the overwhelming majority of aid is directed to the public sector of LDCs, one can only understand the broader macroeconomic impact of aid if one first understands its impact on this sector. To this end, the thesis econometrically estimates "fiscal response" models of aid. These models, in essence, attempt to shed light on public sector fiscal behaviour in the presence of aid inflows, being especially concerned with the way aid is used to finance various categories of expenditure. The underlying concern is the extent to which aid is "fungible", that is, whether it finances consumption expenditure and reductions in taxation revenue in LDCs. A number of alternative models are derived from a utility maximisation framework. These alternatives reflect different assumptions regarding the behaviour of LDC public sectors and relate to the endogeneity of aid, whether or not recurrent expenditure is financed from domestic borrowing, and the determination of domestic borrowing. The original frameworks of earlier studies are extended in a number of ways, including the use of a public sector utility function which is fully consistent with utility-maximising behaviour. Estimates of these models' parameters are obtained using both time-series and cross-section data, dating from the 1960s, for Bangladesh, India, Pakistan and the Philippines. Both structural and reduced-form equations are estimated. Results suggest that foreign aid is indeed fungible, albeit at different levels. Moreover, the overall impact of aid on public sector investment, consumption, domestic borrowing and taxation varies between countries. Generally speaking, aid leads to increases in investment and consumption expenditure, but reduces taxation and domestic borrowing. Comparative analysis does, however, show that these results are highly sensitive to alternative behavioural assumptions and, therefore, model specification.

Relevance:

60.00%

Publisher:

Abstract:

Learning preference models from human-generated data is an important task in modern information processing systems. Its popular setting consists of simple input ratings, assigned numerical values to indicate their relevancy with respect to a specific query. Since ratings are often specified within a small range, several objects may have the same rating, thus creating ties among objects for a given query. Dealing with this phenomenon presents the general problem of modelling preferences in the presence of ties while remaining query-specific. To this end, we present in this paper a novel approach that constructs probabilistic models directly on the collection of objects, exploiting the combinatorial structure induced by the ties among them. The proposed probabilistic setting allows exploration of a super-exponential combinatorial state-space with unknown numbers of partitions and unknown order among them. Learning and inference in such a large state-space are challenging, yet we present efficient algorithms to perform these tasks. Our approach exploits discrete choice theory, imposing a generative process such that the finite set of objects is partitioned into subsets in a stagewise procedure, thus reducing the state-space at each stage significantly. Efficient Markov chain Monte Carlo algorithms are then presented for the proposed models. We demonstrate that the model can potentially be trained in a large-scale setting of hundreds of thousands of objects using an ordinary computer. In fact, in some special cases with appropriate model specification, our models can be learned in linear time. We evaluate the models on two application areas: (i) document ranking with data from the Yahoo! challenge and (ii) collaborative filtering with movie data. We demonstrate that the models are competitive against the state of the art.

Relevance:

60.00%

Publisher:

Abstract:

This paper investigates the impact of FDI on the productivity of Portuguese manufacturing sectors. Model specification is improved by considering the choice of the most appropriate interval of the technological gap for spillovers diffusion. We also allow for sectoral variation in the coefficients of the spillover effect; idiosyncratic sectoral factors are identified by means of a fixed effects model. Inter-sectoral positive spillover effects are examined. Significant spillovers require a proper technological differential between foreign and domestic producers and favourable sectoral characteristics. They may occur in modern industries in which the foreign firms have a clear, but not too sharp, edge on the domestic ones. Agglomeration effects are also one pertinent specific influence.

Relevance:

60.00%

Publisher:

Abstract:

This paper investigates the impact of foreign direct investment on the productivity performance of domestic firms in Portugal. The data comprise nine manufacturing sectors for the period 1992-95. Relative to previous studies, model specification is improved by taking several aspects into consideration: the influence of the "technological gap" on spillover diffusion and the choice of its most appropriate interval; sectoral variation in the coefficients of the spillover effect; identification of constant, idiosyncratic sectoral factors by means of a fixed effects model; and the search for inter-sectoral positive spillover effects. The relationship between domestic firms' productivity and foreign presence is positive only if a proper technology differential between the foreign and domestic producers exists and the sectoral characteristics are favourable. In broad terms, spillover diffusion is associated with modern industries in which the foreign-owned establishments have a clear, but not too sharp, edge on the domestic ones. Besides this, other specific sectoral influences can be pertinent, agglomerative location factors being one example.
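
A sketch of the type of specification described: a sector fixed-effects regression of domestic-firm productivity on foreign presence, with the spillover term switched on only inside a chosen technological-gap interval. All variable names, the gap window, and the simulated panel are assumptions for illustration, not the paper's data or estimates.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(9)
rows = []
for s in range(9):                              # nine manufacturing sectors
    for t in range(1992, 1996):
        foreign_share = rng.uniform(0, 0.6)     # foreign presence in the sector
        gap = rng.uniform(0.5, 3.0)             # foreign/domestic productivity gap
        # Spillovers operate only within a moderate technological-gap window.
        in_window = 1.0 if 1.0 < gap < 2.0 else 0.0
        prod = 1.0 + 0.1 * s + 0.8 * foreign_share * in_window + rng.normal(0, 0.1)
        rows.append(dict(sector=s, year=t, productivity=prod,
                         spill=foreign_share * in_window))

df = pd.DataFrame(rows)

# Sector fixed effects absorb constant idiosyncratic sectoral factors;
# the coefficient on `spill` is the spillover effect inside the gap window.
fit = smf.ols("productivity ~ spill + C(sector)", data=df).fit()
print("spillover coefficient inside the gap window:", round(fit.params["spill"], 3))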

Relevance:

60.00%

Publisher:

Abstract:

In this work, we propose a reduced-form econometric model specification, estimated by ordinary least squares (OLS) and based on macroeconomic variables, with the aim of explaining the quarterly returns of the IBRX-100 stock index between 2001 and 2015. We also test the predictive performance of the model and conclude that the forecast error estimated on a rolling window, with OLS re-estimation at each round and an auxiliary VAR used to project the regressors, is significantly lower than the forecast error associated with the Random Walk hypothesis for the one-quarter-ahead forecast horizon.
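
A sketch of the forecasting exercise described: at each step OLS is re-estimated on a rolling window, the regressors for the next quarter are projected with an auxiliary VAR, and the resulting one-step-ahead errors are compared with those of a random-walk forecast. Simulated series stand in for the IBRX-100 returns and the macro variables; window length and variable names are illustrative.

import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(2)
T, window = 120, 60

# Simulated macro regressors and a quarterly return series that depends on them.
macro = pd.DataFrame(rng.normal(size=(T, 2)), columns=["rate", "activity"])
ret = 0.01 + 0.3 * macro["rate"] - 0.2 * macro["activity"] + rng.normal(0, 0.05, T)

err_model, err_rw = [], []
for t in range(window, T):
    # 1) Re-estimate OLS on the rolling window ending at t-1.
    Xw = np.column_stack([np.ones(window), macro.iloc[t - window:t].values])
    beta = np.linalg.lstsq(Xw, ret.iloc[t - window:t].values, rcond=None)[0]
    # 2) Project the next quarter's regressors with an auxiliary VAR(1).
    var_fit = VAR(macro.iloc[t - window:t]).fit(1)
    x_next = var_fit.forecast(macro.iloc[t - window:t].values[-1:], steps=1)[0]
    # 3) One-step-ahead forecasts: model vs random walk (last observed return).
    f_model = beta[0] + beta[1:] @ x_next
    err_model.append(ret.iloc[t] - f_model)
    err_rw.append(ret.iloc[t] - ret.iloc[t - 1])

print("RMSE model:", np.sqrt(np.mean(np.square(err_model))).round(4))
print("RMSE random walk:", np.sqrt(np.mean(np.square(err_rw))).round(4))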