961 results for 140300 ECONOMETRICS
Abstract:
Published as an article in: Studies in Nonlinear Dynamics & Econometrics, 2004, vol. 8, issue 3, article 6.
Abstract:
This paper analyzes the trend processes implied by two standard growth models using simple econometrics. The first is the basic neoclassical growth model, which postulates a deterministic trend for output. The second is the Uzawa-Lucas model, which postulates a stochastic trend for output. The aim is to understand how the different trend processes for output assumed by these two models determine each model's ability to explain the observed trend processes of other macroeconomic variables, such as consumption and investment. The results show that both models reproduce the output trend process. Moreover, the basic growth model properly captures the consumption trend process but fails to characterize the investment trend process; the reverse is true for the Uzawa-Lucas model.
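As a minimal, illustrative sketch (not the paper's actual procedure), the deterministic-versus-stochastic distinction for the output trend can be checked with an Augmented Dickey-Fuller test; the series below is simulated and the variable name `log_gdp` is hypothetical.

```python
# Distinguishing a deterministic (trend-stationary) output trend from a
# stochastic (unit-root) one with an Augmented Dickey-Fuller test.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
t = np.arange(200)
# Simulated stand-in for log output: deterministic trend plus stationary noise
log_gdp = pd.Series(0.02 * t + rng.normal(scale=0.05, size=t.size))

# ADF regression with constant and linear trend ("ct"): rejecting the unit-root
# null favours a deterministic trend (neoclassical case); failing to reject is
# consistent with a stochastic trend (Uzawa-Lucas case).
stat, pvalue = adfuller(log_gdp, regression="ct")[:2]
print(f"ADF statistic = {stat:.3f}, p-value = {pvalue:.3f}")
```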
Abstract:
Published as an article in: Studies in Nonlinear Dynamics & Econometrics, 2004, vol. 8, issue 1, pages 5.
Abstract:
This is an interdisciplinary undergraduate final project (Trabajo Fin de Grado) that combines innovation and entrepreneurship with econometrics. The theoretical part synthesizes the literature on innovation and entrepreneurship, the types and measures of innovation, and the factors that influence innovation both positively and negatively, and explains why innovation matters for firms. The innovation databases available on the Internet are also reviewed. The empirical part analyzes the impact of Gross Domestic Product on total entrepreneurial activity, and on necessity- and opportunity-driven entrepreneurship, across 67 countries worldwide. The countries are divided into three groups according to the main driver of their economy: traditional factors of production, efficiency-enhancing factors, or business sophistication and innovation. The influence of GDP and other macroeconomic variables, such as the level of higher education, public and private investment in R&D and innovation, and the number of PCT patents, on innovative entrepreneurship is also analyzed. The analysis draws on data from the Global Entrepreneurship Monitor (GEM) and Innovation Union Scoreboard (IUS) reports for 2012.
Abstract:
This is an interdisciplinary undergraduate final project (Trabajo Fin de Grado) that combines crowdfunding with econometrics. The theoretical part synthesizes the literature on crowdfunding, its types and how it works, as well as the factors that influence, positively and negatively, whether projects funded through this practice succeed. The databases on crowdfunding platforms available on the Internet are also reviewed, and the search carried out to assemble a suitable sample is described. The empirical part analyzes the impact of the chosen factors on the probability that a project succeeds, using econometric models not studied previously, which are explained and analyzed in the methodological part of the work. The results help to identify the characteristics that projects published on crowdfunding platforms need in order to obtain the requested funding. The analysis uses data gathered from several crowdfunding platforms between 2013 and 2015.
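As a rough illustration of the kind of binary-outcome model typically used for project success (the thesis's actual specification and data are not reproduced here), a logit with hypothetical regressors might look as follows.

```python
# Logistic regression for the probability that a crowdfunding project reaches its
# funding goal. Variables (log_goal, duration, has_video) and data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "log_goal": rng.normal(8, 1, n),       # log of the funding goal
    "duration": rng.integers(10, 60, n),   # campaign length in days
    "has_video": rng.integers(0, 2, n),    # 1 if the pitch includes a video
})
# Simulated outcome: success more likely for modest goals and pitches with video
linpred = -0.8 * (df["log_goal"] - 8) - 0.02 * df["duration"] + 1.0 * df["has_video"]
df["success"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-linpred))).astype(int)

X = sm.add_constant(df[["log_goal", "duration", "has_video"]])
fit = sm.Logit(df["success"], X).fit(disp=0)
print(fit.summary())
print(fit.get_margeff().summary())  # average marginal effects on success probability
```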
Abstract:
Following the creation of the EEC, and after decades of ideological questioning, division into economic zones, and well-documented weaknesses, the conclusion is that the European Union must redefine its existence by addressing structural problems such as asymmetric shocks. For the Spanish case, it is also shown that external demand has a greater influence on the export balance than the exchange rate does, without underestimating the rebalancing effect of the exchange rate and its capacity to generate growth as a palliative in adverse conditions.
Abstract:
Quantile regression refers to the process of estimating the quantiles of a conditional distribution and has many important applications within econometrics and data mining, among other domains. In this paper, we show how to estimate these conditional quantile functions within a Bayes risk minimization framework using a Gaussian process prior. The resulting non-parametric probabilistic model is easy to implement and allows non-crossing quantile functions to be enforced. Moreover, it can directly be used in combination with tools and extensions of standard Gaussian Processes such as principled hyperparameter estimation, sparsification, and quantile regression with input-dependent noise rates. No existing approach enjoys all of these desirable properties. Experiments on benchmark datasets show that our method is competitive with state-of-the-art approaches. © 2009 IEEE.
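For intuition only, the check (pinball) loss that defines conditional quantiles can be minimised directly; the sketch below fits linear quantile curves by that criterion and does not reproduce the paper's Gaussian-process, non-crossing formulation.

```python
# Conditional quantile estimation by minimising the pinball (check) loss for a
# linear specification; data are simulated with input-dependent noise.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n = 400
x = rng.uniform(0, 10, n)
y = 1.0 + 0.5 * x + rng.normal(scale=0.2 + 0.1 * x)  # heteroskedastic noise

def pinball_loss(beta, tau):
    resid = y - (beta[0] + beta[1] * x)
    return np.mean(np.maximum(tau * resid, (tau - 1) * resid))

for tau in (0.1, 0.5, 0.9):
    fit = minimize(pinball_loss, x0=np.zeros(2), args=(tau,), method="Nelder-Mead")
    print(f"tau={tau}: intercept={fit.x[0]:.2f}, slope={fit.x[1]:.2f}")
```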
Abstract:
Alexander, N.; Rhodes, M.; and Myers, H. (2007). International market selection: measuring actions instead of intentions. Journal of Services Marketing, 21(6), pp. 424-434. RAE2008
Abstract:
This paper analyses the asymptotic properties of nonlinear least squares estimators of the long run parameters in a bivariate unbalanced cointegration framework. Unbalanced cointegration refers to the situation where the integration orders of the observables are different, but their corresponding balanced versions (with equal integration orders after filtering) are cointegrated in the usual sense. Within this setting, the long run linkage between the observables is driven by both the cointegrating parameter and the difference between the integration orders of the observables, which we consider to be unknown. Our results reveal three noticeable features. First, superconsistent (faster than √n-consistent) estimators of the difference between memory parameters are achievable. Next, the joint limiting distribution of the estimators of both parameters is singular, and, finally, a modified version of the "Type II" fractional Brownian motion arises in the limiting theory. A Monte Carlo experiment and the discussion of an economic example are included.
Abstract:
We exploit the distributional information contained in high-frequency intraday data in constructing a simple conditional moment estimator for stochastic volatility diffusions. The estimator is based on the analytical solutions of the first two conditional moments for the latent integrated volatility, the realization of which is effectively approximated by the sum of the squared high-frequency increments of the process. Our simulation evidence indicates that the resulting GMM estimator is highly reliable and accurate. Our empirical implementation based on high-frequency five-minute foreign exchange returns suggests the presence of multiple latent stochastic volatility factors and possible jumps. © 2002 Elsevier Science B.V. All rights reserved.
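The core measurement step, approximating integrated volatility by the sum of squared high-frequency returns (realised variance), can be sketched as below; the price path is simulated and the paper's GMM moment conditions are not reproduced.

```python
# Realised variance: the sum of squared intraday returns approximates the latent
# integrated volatility for each day. Data are simulated five-minute FX returns.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
n_days, n_intraday = 20, 288               # 288 five-minute intervals per 24h FX day
daily_vol = 0.08 / np.sqrt(252)            # hypothetical daily volatility level
returns = rng.normal(scale=daily_vol / np.sqrt(n_intraday), size=(n_days, n_intraday))

realised_var = (returns ** 2).sum(axis=1)  # one realised-variance observation per day
print(pd.Series(realised_var).describe())
# The first two sample moments of realised variance would feed the GMM criterion.
print("mean RV:", realised_var.mean(), "var RV:", realised_var.var(ddof=1))
```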
Abstract:
Recent empirical findings suggest that the long-run dependence in U.S. stock market volatility is best described by a slowly mean-reverting fractionally integrated process. The present study complements this existing time-series-based evidence by comparing the risk-neutralized option pricing distributions from various ARCH-type formulations. Utilizing a panel data set consisting of newly created exchange traded long-term equity anticipation securities, or LEAPS, on the Standard and Poor's 500 stock market index with maturity times ranging up to three years, we find that the degree of mean reversion in the volatility process implicit in these prices is best described by a Fractionally Integrated EGARCH (FIEGARCH) model. © 1999 Elsevier Science S.A. All rights reserved.
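The notion of fractionally integrated, slowly mean-reverting volatility can be illustrated with a simple Geweke-Porter-Hudak (GPH) log-periodogram estimate of the long-memory parameter d for a volatility proxy; this is only an illustration on simulated returns, not the paper's option-implied FIEGARCH estimation.

```python
# GPH log-periodogram estimate of the fractional integration order d of a
# volatility proxy (demeaned absolute returns). Returns here are simulated.
import numpy as np

rng = np.random.default_rng(4)
returns = rng.standard_t(df=5, size=4000) * 0.01       # placeholder for index returns
vol_proxy = np.abs(returns) - np.abs(returns).mean()

n = vol_proxy.size
m = int(n ** 0.6)                                      # number of low frequencies used
freqs = 2 * np.pi * np.arange(1, m + 1) / n
periodogram = np.abs(np.fft.fft(vol_proxy)[1:m + 1]) ** 2 / (2 * np.pi * n)

# Regress log I(lambda_j) on -2*log(2*sin(lambda_j/2)); the slope estimates d.
x = -2 * np.log(2 * np.sin(freqs / 2))
d_hat = np.polyfit(x, np.log(periodogram), 1)[0]
print(f"estimated fractional integration order d = {d_hat:.2f}")
```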
Abstract:
This paper uses dynamic impulse response analysis to investigate the interrelationships among stock price volatility, trading volume, and the leverage effect. Dynamic impulse response analysis is a technique for analyzing the multi-step-ahead characteristics of a nonparametric estimate of the one-step conditional density of a strictly stationary process. The technique is the generalization to a nonlinear process of Sims-style impulse response analysis for linear models. In this paper, we refine the technique and apply it to a long panel of daily observations on the price and trading volume of four stocks actively traded on the NYSE: Boeing, Coca-Cola, IBM, and MMM.
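The flavour of a nonlinear impulse response, the difference between multi-step-ahead conditional expectations with and without an initial shock, can be conveyed with a Monte Carlo sketch on a toy GARCH(1,1) volatility model; the paper's nonparametric conditional-density machinery is not reproduced.

```python
# Nonlinear impulse response by simulation: average conditional-variance paths
# with and without an initial return shock, under a toy GARCH(1,1) model.
import numpy as np

rng = np.random.default_rng(5)
omega, alpha, beta = 0.05, 0.10, 0.85               # hypothetical GARCH(1,1) parameters

def mean_variance_path(h0, shock, horizon, n_paths=20000):
    """Monte Carlo average of the conditional variance after an initial shock."""
    h = np.full(n_paths, omega + alpha * shock ** 2 + beta * h0)
    path = np.empty(horizon)
    for t in range(horizon):
        path[t] = h.mean()
        eps = rng.normal(size=n_paths) * np.sqrt(h)
        h = omega + alpha * eps ** 2 + beta * h
    return path

baseline = mean_variance_path(h0=1.0, shock=0.0, horizon=20)
shocked = mean_variance_path(h0=1.0, shock=3.0, horizon=20)
print(np.round(shocked - baseline, 4))              # volatility impulse response profile
```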
Abstract:
Empirical modeling of high-frequency currency market data reveals substantial evidence for nonnormality, stochastic volatility, and other nonlinearities. This paper investigates whether an equilibrium monetary model can account for nonlinearities in weekly data. The model incorporates time-nonseparable preferences and a transaction cost technology. Simulated sample paths are generated using Marcet's parameterized expectations procedure. The paper also develops a new method for estimation of structural economic models. The method forces the model to match (under a GMM criterion) the score function of a nonparametric estimate of the conditional density of observed data. The estimation uses weekly U.S.-German currency market data, 1975-90. © 1995.
Abstract:
This paper considers forecasting the conditional mean and variance from a single-equation dynamic model with autocorrelated disturbances following an ARMA process, and innovations with time-dependent conditional heteroskedasticity as represented by a linear GARCH process. Expressions for the minimum MSE predictor and the conditional MSE are presented. We also derive the formula for all the theoretical moments of the prediction error distribution from a general dynamic model with GARCH(1, 1) innovations. These results are then used in the construction of ex ante prediction confidence intervals by means of the Cornish-Fisher asymptotic expansion. An empirical example relating to the uncertainty of the expected depreciation of foreign exchange rates illustrates the usefulness of the results. © 1992.
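A sketch of the overall recipe, multi-step mean and variance forecasts from an AR(1)-GARCH(1,1) fit combined with a Cornish-Fisher quantile adjustment, is given below; the paper derives the prediction-error moments analytically, whereas here skewness and excess kurtosis are crudely taken from standardised residuals and the data are simulated.

```python
# Ex ante prediction intervals from an AR(1)-GARCH(1,1) model, with the Gaussian
# quantile adjusted by a Cornish-Fisher expansion in skewness and excess kurtosis.
import numpy as np
import pandas as pd
from scipy import stats
from arch import arch_model

rng = np.random.default_rng(6)
returns = pd.Series(rng.standard_t(df=6, size=2000) * 0.8)   # placeholder data

res = arch_model(returns, mean="AR", lags=1, vol="GARCH", p=1, q=1).fit(disp="off")
fc = res.forecast(horizon=10)
mean_h = fc.mean.iloc[-1].to_numpy()        # multi-step-ahead mean forecasts
var_h = fc.variance.iloc[-1].to_numpy()     # multi-step-ahead variance forecasts

std_resid = (res.resid / res.conditional_volatility).dropna()
s, k = stats.skew(std_resid), stats.kurtosis(std_resid)      # skewness, excess kurtosis

def cornish_fisher(z, skew, exkurt):
    """Standardised quantile adjusted for skewness and excess kurtosis."""
    return (z + (z ** 2 - 1) * skew / 6
              + (z ** 3 - 3 * z) * exkurt / 24
              - (2 * z ** 3 - 5 * z) * skew ** 2 / 36)

z = stats.norm.ppf(0.975)
upper = mean_h + cornish_fisher(z, s, k) * np.sqrt(var_h)
lower = mean_h + cornish_fisher(-z, s, k) * np.sqrt(var_h)
print(np.column_stack([lower, upper]))      # 95% ex ante prediction intervals
```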