906 results for Classical measurement error model


Abstract:

This paper derives the second-order biases of maximum likelihood estimates from a multivariate normal model where the mean vector and the covariance matrix have parameters in common. We show that the second-order bias can always be obtained by means of ordinary weighted least-squares regressions. We conduct simulation studies which indicate that the bias-correction scheme yields nearly unbiased estimators.
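
As an illustration of the kind of correction the paper studies, here is a minimal Monte Carlo sketch for a special case where the second-order bias is known in closed form (the MLE of a normal variance); the paper's general WLS-based scheme is not reproduced here.

```python
import numpy as np

# Minimal sketch: second-order bias correction for the MLE of the variance
# of a normal sample. The MLE s2 = mean((x - xbar)^2) has bias -sigma^2/n
# to second order, so the corrected estimator subtracts a plug-in estimate
# of that bias.
rng = np.random.default_rng(0)
n, sigma2, reps = 20, 4.0, 20000

mle, corrected = [], []
for _ in range(reps):
    x = rng.normal(0.0, np.sqrt(sigma2), n)
    s2 = np.mean((x - x.mean()) ** 2)      # MLE of sigma^2
    bias_hat = -s2 / n                     # plug-in estimate of the bias
    mle.append(s2)
    corrected.append(s2 - bias_hat)        # bias-corrected estimator

print("true sigma^2   :", sigma2)
print("MLE mean       :", np.mean(mle))        # biased down by ~sigma^2/n
print("bias-corrected :", np.mean(corrected))  # nearly unbiased
```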

Abstract:

Objectives: Studies of the viscoelastic properties of the vocal folds are normally performed with rheometers that use parallel plates whose interplate space is usually assigned a fixed value. In tissues subject to variation of thickness between samples, fixed gaps could result in different compressions, compromising the comparison among them. We performed an experimental study to determine whether different compressions can lead to different results in measurements of dynamic viscosity (DV) of vocal fold samples. Methods: We measured the DV of vocal fold samples from 10 cadaver larynges under 3 different compression levels, corresponding to 0.2, 0.5, and 10 N, on an 8-mm-diameter parallel-plate rheometer. Results: The DV varied directly with compression. We observed statistically significant differences between the results at 0.2 and 10 N (p = 0.0396) and at 0.5 and 10 N (p = 0.0442). Conclusions: The study demonstrated that the level of compression influences the DV measurement and suggests that a defined compression level should be used in rheometric studies of biological tissues.
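
For readers unfamiliar with oscillatory rheometry, the sketch below applies the standard small-amplitude oscillatory-shear relations to compute dynamic viscosity; the input values (stress amplitude, strain amplitude, phase lag, frequency) are illustrative, not data from the study.

```python
import numpy as np

# Standard small-amplitude oscillatory-shear relations used by
# parallel-plate rheometers. Inputs are illustrative placeholders.
tau0 = 50.0             # stress amplitude [Pa]
gamma0 = 0.01           # strain amplitude [-]
delta = np.deg2rad(30)  # phase lag between stress and strain [rad]
omega = 2 * np.pi * 1.0 # angular frequency [rad/s]

G_star = tau0 / gamma0              # complex modulus magnitude [Pa]
G_storage = G_star * np.cos(delta)  # elastic (storage) modulus G'
G_loss = G_star * np.sin(delta)     # viscous (loss) modulus G''
eta_dyn = G_loss / omega            # dynamic viscosity eta' [Pa.s]

print(f"G' = {G_storage:.1f} Pa, G'' = {G_loss:.1f} Pa, eta' = {eta_dyn:.2f} Pa.s")
```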

Abstract:

Objective: Levodopa in the presence of decarboxylase inhibitors follows two-compartment kinetics, and its effect is typically modelled using sigmoid Emax models. Pharmacokinetic modelling of the absorption phase of oral administration is problematic because of irregular gastric emptying. The purpose of this work was to identify and estimate a population pharmacokinetic-pharmacodynamic model for duodenal infusion of levodopa/carbidopa (Duodopa®) that can be used for in numero simulation of treatment strategies. Methods: The modelling involved pooling data from two studies and fixing some parameters to values found in the literature (Chan et al. J Pharmacokinet Pharmacodyn. 2005 Aug;32(3-4):307-31). The first study involved 12 patients on 3 occasions and is described in Nyholm et al. Clinical Neuropharmacology 2003;26:156-63. The second study, PEDAL, involved 3 patients on 2 occasions. A bolus dose (normal morning dose plus 50%) was given after a washout during the night. Plasma samples and motor ratings (clinical assessment of motor function from video recordings on a treatment response scale between -3 and 3, where -3 represents severe parkinsonism and 3 represents severe dyskinesia) were repeatedly collected until the clinical effect was back at baseline. At this point, the usual infusion rate was started and sampling continued for another two hours. Different structural absorption models and effect models were evaluated using the value of the objective function in the NONMEM package. Population mean parameter values, standard errors of estimates (SE) and, where possible, interindividual/interoccasion variability (IIV/IOV) were estimated. Results: Our results indicate that Duodopa absorption can be modelled with an absorption compartment with an added bioavailability fraction and a lag time. The most successful effect model was of sigmoid Emax type with a steep Hill coefficient and an effect-compartment delay. Estimated parameter values are presented in the table. Conclusions: The absorption and effect models were reasonably successful in fitting observed data and can be used in simulation experiments.
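
A minimal sketch of the effect-model structure named above, a sigmoid Emax model driven by an effect compartment; the parameter values are hypothetical, chosen only so the effect runs on the -3 to +3 rating scale used in the study.

```python
import numpy as np

# Sigmoid Emax model driven by an effect compartment:
#   dCe/dt = ke0 * (Cp - Ce)
#   E(Ce)  = E0 + Emax * Ce^gamma / (EC50^gamma + Ce^gamma)
# All parameter values below are hypothetical placeholders.
E0, Emax, EC50, gamma, ke0 = -3.0, 6.0, 1.5, 8.0, 0.5  # steep Hill coefficient

def simulate(Cp, dt=0.1):
    """Euler integration of the effect compartment for a plasma profile Cp."""
    Ce = 0.0
    effects = []
    for c in Cp:
        Ce += dt * ke0 * (c - Ce)
        effects.append(E0 + Emax * Ce**gamma / (EC50**gamma + Ce**gamma))
    return np.array(effects)

t = np.arange(0, 10, 0.1)
Cp = 3.0 * np.exp(-0.3 * t)   # toy mono-exponential plasma profile
E = simulate(Cp)
print(E.min(), E.max())       # effect stays on the -3 ... +3 rating scale
```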

Abstract:

This paper presents techniques of likelihood prediction for generalized linear mixed models. The method of likelihood prediction is explained through a series of examples, from a classical one to more complicated ones. The examples show, in simple cases, that likelihood prediction (LP) coincides with known best frequentist practice, such as the best linear unbiased predictor. The paper outlines a way to deal with covariate uncertainty while producing predictive inference. Using a Poisson errors-in-variables generalized linear model, we show that in complicated cases LP produces better results than already known methods.
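
A minimal sketch of likelihood prediction in the simplest setting alluded to above: the profile predictive likelihood of a future observation from an iid normal sample, L_p(z) = max over mu of the joint likelihood of the data and z, which is maximized at the sample mean, the classical best predictor.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Profile predictive likelihood for a future value z from an iid normal
# sample with known variance: maximize the joint likelihood of (y, z)
# over the unknown mean mu.
rng = np.random.default_rng(1)
sigma = 1.0
y = rng.normal(2.0, sigma, 30)

def profile_pred_loglik(z):
    def negjll(mu):
        resid = np.concatenate([y - mu, [z - mu]])
        return 0.5 * np.sum(resid**2) / sigma**2
    return -minimize_scalar(negjll).fun

zs = np.linspace(0, 4, 81)
lp = np.array([profile_pred_loglik(z) for z in zs])
print("LP maximizer:", zs[lp.argmax()], "sample mean:", y.mean())
# The LP is maximized at the sample mean -- the classical best predictor.
```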

Abstract:

This project constructs a structural model of the United States economy. The task is tackled in two separate ways: first using econometric methods, and then using a neural network, both with a structure that mimics the structure of the U.S. economy. The structural model tracks the performance of U.S. GDP rather well in a dynamic simulation, with an average error of just over 1 percent. The neural network performed well, but suffered from some theoretical as well as implementation issues.
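
To clarify what the quoted "average error" in a dynamic simulation means, the toy sketch below estimates an AR(1) on a synthetic series, feeds the model's own predictions back as lagged inputs, and reports the mean absolute percent error; this is not the project's model.

```python
import numpy as np

# Dynamic (closed-loop) simulation: after estimation, the model's own
# predictions are fed back as lagged inputs, and accuracy is summarized
# as mean absolute percent error (MAPE). Toy AR(1), not the project's model.
rng = np.random.default_rng(2)
y = [100.0]
for _ in range(80):                     # synthetic "GDP-like" series
    y.append(1.5 + 0.99 * y[-1] + rng.normal(0, 0.5))
y = np.array(y)

b, a = np.polyfit(y[:-1], y[1:], 1)     # OLS fit of y_t = a + b * y_{t-1}

sim = [y[0]]                            # start from the first observation
for _ in range(len(y) - 1):
    sim.append(a + b * sim[-1])         # feed fitted values back in
sim = np.array(sim)

mape = np.mean(np.abs((y - sim) / y)) * 100
print(f"average dynamic-simulation error: {mape:.2f}%")
```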

Abstract:

The Short-term Water Information and Forecasting Tools (SWIFT) is a suite of tools for flood and short-term streamflow forecasting, consisting of a collection of hydrologic model components and utilities. Catchments are modelled using conceptual subareas and a node-link structure for channel routing. The tools comprise modules for calibration, model state updating, output error correction, ensemble runs and data assimilation. Given the combinatorial nature of the modelling experiments and the sub-daily time steps typically used for simulations, the volume of model configurations and time series data is substantial, and its management is not trivial. SWIFT is currently used mostly for research purposes but has also been used operationally, with intersecting but significantly different requirements. Early versions of SWIFT used mostly ad hoc text files handled via Fortran code, with limited use of netCDF for time series data. The configuration and data handling modules have since been redesigned. The model configuration now follows a design where the data model is decoupled from the on-disk persistence mechanism. For research purposes the preferred on-disk format is JSON, to leverage numerous software libraries in a variety of languages, while retaining the legacy option of custom tab-separated text formats where that is the preferred access arrangement for the researcher. By decoupling the data model from data persistence, it is much easier to interchangeably use, for instance, relational databases to provide stricter provenance and audit trail capabilities in an operational flood forecasting context. For the time series data, given the volume and required throughput, text-based formats are usually inadequate. A schema derived from CF conventions has been designed to handle time series efficiently for SWIFT.
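
A minimal sketch of the design described above, with the configuration data model (a plain dictionary) decoupled from interchangeable persistence backends; class and method names are illustrative, not SWIFT's API.

```python
import json
from abc import ABC, abstractmethod

# Decoupling the configuration data model from on-disk persistence.
# Names are illustrative, not SWIFT's API.
class ConfigStore(ABC):
    @abstractmethod
    def save(self, config: dict, target: str) -> None: ...
    @abstractmethod
    def load(self, target: str) -> dict: ...

class JsonStore(ConfigStore):
    """Preferred research backend: plain JSON files."""
    def save(self, config, target):
        with open(target, "w") as f:
            json.dump(config, f, indent=2)
    def load(self, target):
        with open(target) as f:
            return json.load(f)

class TsvStore(ConfigStore):
    """Legacy option: flat tab-separated key/value text (values as strings)."""
    def save(self, config, target):
        with open(target, "w") as f:
            for k, v in config.items():
                f.write(f"{k}\t{v}\n")
    def load(self, target):
        with open(target) as f:
            return dict(line.rstrip("\n").split("\t", 1) for line in f)

# Modelling code only ever sees the dict; swapping JSON for a relational
# database (stricter provenance, audit trails) means adding another store.
config = {"subareas": 14, "routing": "muskingum", "timestep_s": 3600}
JsonStore().save(config, "catchment.json")
print(JsonStore().load("catchment.json"))
```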

Abstract:

This article addresses the welfare and macroeconomic effects of fiscal policy in a framework where the government chooses tax rates and the distribution of revenues between consumption and investment. We construct and simulate a model where public consumption affects individuals' utility and public capital is an argument of the production function. The simulations suggest that by simply reallocating expenditures from consumption to investment, the government can increase the equilibrium levels of capital stock, hours worked, output and labor productivity. Furthermore, we show that the magnitude and direction of the long-run impact of fiscal policy depend on the size of the elasticity of output to public capital. If this parameter is high enough, it may be the case that the capital stock, within limits, increases with tax rates.
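
A toy steady-state computation of the last claim, under stated assumptions that are not the paper's model: Cobb-Douglas output with a public input financed by a flat tax, and Solow-style private accumulation.

```python
import numpy as np

# Toy model: y = k^alpha * g^theta with public input g = tau * y financed
# by a flat tax, and private accumulation k' = s*(1-tau)*y + (1-delta)*k.
# Illustrates that steady-state capital can rise with the tax rate when the
# output elasticity of public capital (theta) is high enough.
alpha, s, delta = 0.3, 0.2, 0.05

def steady_state_k(tau, theta, k0=1.0, iters=5000):
    k = k0
    for _ in range(iters):
        y = (k**alpha * tau**theta) ** (1.0 / (1.0 - theta))  # solve y = k^a (tau y)^t
        k = s * (1.0 - tau) * y + (1.0 - delta) * k
    return k

for theta in (0.05, 0.4):
    ks = {tau: round(steady_state_k(tau, theta), 2) for tau in (0.1, 0.2, 0.3)}
    print(f"theta={theta}:", ks)
# With low theta, k falls as tau rises; with high theta it rises over this range.
```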

Abstract:

The increasing availability of social statistics in Latin America opens new possibilities in terms of accountability and incentive mechanisms for policy makers. This paper addresses these issues within the institutional context of the Brazilian educational system. We build a theoretical model based on the theory of incentives to analyze the role of the recently launched Basic Education Development Index (Ideb) in the provision of incentives at the sub-national level. The first result is to demonstrate that an education target system has the potential to improve the allocation of resources to education through conditional transfers to municipalities and schools. Second, we analyze the local government's decision about how to allocate its education budget when seeking to accomplish the different objectives contemplated by the index, which involves the interaction between its two components, average proficiency and the passing rate. We also discuss policy issues concerning the implementation of the synthetic education index in the light of this model, arguing that there is room for improving the Ideb methodology itself. In addition, we analyze the desirable properties of an ideal education index and argue in favor of an ex-post relative learning evaluation system for different municipalities (schools) based on the value added across different grades.
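
A stylized sketch of the allocation problem analyzed above: a fixed budget split between raising proficiency and raising the passing rate, with the index taken as the product of its two components; the response functions are hypothetical, not calibrated to the Ideb.

```python
import numpy as np

# Stylized local allocation problem: a fixed budget B is split between
# raising average proficiency N and the passing rate P, and the index is
# their product. Response functions are hypothetical, not Ideb calibration.
B = 1.0
def proficiency(x): return 5.0 + 3.0 * np.sqrt(x)          # N on a 0-10 scale
def pass_rate(x):   return min(1.0, 0.7 + 0.25 * np.sqrt(x))

shares = np.linspace(0, 1, 101)
index = [proficiency(sh * B) * pass_rate((1 - sh) * B) for sh in shares]
best = shares[int(np.argmax(index))]
print(f"index-maximizing share spent on proficiency: {best:.2f}")
# Diminishing returns in both components pull the optimum to an interior
# split; changing either response function shifts it, which is the channel
# through which the index steers local spending.
```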

Abstract:

It is well known that cointegration between the levels of two variables (e.g. prices and dividends) is a necessary condition for the empirical validity of a present-value model (PVM) linking them. The work on cointegration, namely on long-run co-movements, has been so prevalent that it is often overlooked that another necessary condition for the PVM to hold is that the forecast error entailed by the model be orthogonal to the past. This amounts to investigating whether short-run co-movements stemming from common cyclical feature restrictions are also present in such a system. In this paper we test for the presence of such co-movements in long- and short-term interest rates and in prices and dividends for the U.S. economy. We focus on the potential improvement in forecasting accuracy when imposing these two types of restrictions coming from economic theory.
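
The two necessary conditions can be stated compactly in a standard log-linear form (a Campbell-Shiller-style statement; the notation is ours, not the paper's):

```latex
% Log-linear PVM with discount factor \delta = 1/(1+r):
y_t \;=\; \theta(1-\delta)\sum_{i=0}^{\infty}\delta^{i}\,E_t\,x_{t+i}
% (1) Long run: if x_t is I(1), the spread is stationary (cointegration),
s_t \;\equiv\; y_t - \theta x_t \;\sim\; I(0)
% (2) Short run: the model's one-step error is orthogonal to the past,
E_t\!\left[\,s_{t+1} - \tfrac{1}{\delta}\,s_t + \theta\,\Delta x_{t+1}\right] \;=\; 0
```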

Abstract:

This paper has two original contributions. First, we show that the present-value model (PVM hereafter), which has wide application in macroeconomics and finance, entails common cyclical feature restrictions in the dynamics of the vector error-correction representation (Vahid and Engle, 1993); something already investigated in the VECM context by Johansen and Swensen (1999, 2011) but not discussed before with this new emphasis. We also provide the present-value reduced-rank constraints to be tested within the log-linear model. Our second contribution relates to forecasting time series that are subject to those long- and short-run reduced-rank restrictions. The reason why appropriate common cyclical feature restrictions might improve forecasting is that they provide natural exclusion restrictions, preventing the estimation of useless parameters that would otherwise increase the forecast variance with no expected reduction in bias. We applied the techniques discussed in this paper to data known to be subject to present-value restrictions, i.e. the online series maintained and updated by Shiller. We focus on three different data sets. The first includes the levels of interest rates with long and short maturities, the second includes the levels of real price and dividend for the S&P composite index, and the third includes the logarithmic transformation of prices and dividends. Our exhaustive investigation of several different multivariate models reveals that better forecasts can be achieved when these restrictions are imposed. Moreover, imposing short-run restrictions produces forecast winners 70% of the time for the target variables of PVMs and 63.33% of the time when all variables in the system are considered.
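
A minimal sketch, on simulated data, of imposing the long-run reduced-rank (cointegration) restriction when forecasting; the short-run common cyclical feature restrictions studied in the paper are not available out of the box in statsmodels.

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import VECM

# Simulate a cointegrated pair sharing one stochastic trend, then fit a
# VECM with coint_rank=1, which imposes the long-run reduced-rank
# restriction before forecasting.
rng = np.random.default_rng(3)
T = 400
common = np.cumsum(rng.normal(size=T))          # shared stochastic trend
y1 = common + rng.normal(scale=0.5, size=T)     # e.g. log price
y2 = common + rng.normal(scale=0.5, size=T)     # e.g. log dividend
data = np.column_stack([y1, y2])

res = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="co").fit()
print("beta (cointegrating vector):\n", res.beta)   # roughly (1, -1)
print("8-step forecasts:\n", res.predict(steps=8))
```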

Abstract:

The onset of the financial crisis in 2008 and of the European sovereign crisis in 2010 renewed the interest of macroeconomists in the role played by credit in business cycle fluctuations. The purpose of the present work is to present empirical evidence on the monetary policy transmission mechanism in Brazil, with a special eye on the role played by the credit channel, using different econometric techniques. It comprises three articles. The first presents a review of the literature on financial frictions, with a focus on the overlaps between credit activity and monetary policy. It highlights how sharp disruptions in the financial markets spurred central banks in developed and emerging nations to deploy a broad set of non-conventional tools to overcome the damage to financial intermediation. A chapter is dedicated to the challenges faced by policymakers in emerging markets, and in Brazil in particular, in a highly integrated global capital market. The second article investigates the implications of the credit channel of the monetary policy transmission mechanism in the case of Brazil, using a structural FAVAR (SFAVAR) approach. The term "structural" comes from the estimation strategy, which generates factors that have a clear economic interpretation. The results show that unexpected shocks in the proxies for the external finance premium and the credit volume produce large and persistent fluctuations in inflation and economic activity, accounting for more than 30% of the forecast error variance of the latter over a three-year horizon. Counterfactual simulations demonstrate that the credit channel amplified the economic contraction in Brazil during the acute phase of the global financial crisis in the last quarter of 2008, and thus gave an important impulse to the recovery period that followed. In the third article, I make use of Bayesian estimation of a classical neo-Keynesian DSGE model, incorporating the financial accelerator channel developed by Bernanke, Gertler and Gilchrist (1999). The results present evidence in line with that seen in the previous article: disturbances in the external finance premium, represented here by credit spreads, trigger significant responses in aggregate demand and inflation, and monetary policy shocks are amplified by the financial accelerator mechanism.

Keywords: Macroeconomics, Monetary Policy, Credit Channel, Financial Accelerator, FAVAR, DSGE, Bayesian Econometrics
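
A sketch of the variance-decomposition exercise behind the quoted 30% figure, using a small VAR on simulated data; the article itself uses a structural FAVAR on a much larger information set.

```python
import numpy as np
from statsmodels.tsa.api import VAR

# Forecast error variance decomposition (FEVD) from a small VAR on
# simulated data, where a credit shock feeds into activity and inflation.
rng = np.random.default_rng(4)
T = 300
credit = rng.normal(size=T)
activity = np.zeros(T)
inflation = np.zeros(T)
for t in range(1, T):
    activity[t] = 0.6 * activity[t-1] + 0.4 * credit[t-1] + rng.normal(scale=0.5)
    inflation[t] = 0.5 * inflation[t-1] + 0.2 * activity[t-1] + rng.normal(scale=0.5)

data = np.column_stack([credit, activity, inflation])
res = VAR(data).fit(2)
fevd = res.fevd(12)   # shares of 12-step-ahead forecast error variance
fevd.summary()        # how much of activity's variance the credit shock explains
```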

Abstract:

Lucas (2000) estimates that the US welfare costs of inflation are around 1% of GDP. This measurement is consistent with a specific distorting channel in terms of the Bailey triangle under the demand schedule for the monetary base (outside money): the displacement of resources from the production of consumption goods to household transaction time à la Baumol. Here, we also consider several new types of distortions in the manufacturing and banking industries. Our new evidence shows that both banks and firms demand special occupational employments to avoid the inflation tax. We define the concept of "the float labor": the occupational employments that are affected by inflation rates. More administrative workers are hired relative to the blue-collar workers producing consumption goods. This new phenomenon makes the manufacturing industry more roundabout. To take into account this new stylized fact and others, we redo at the same time both "The Model 5: A Banking Sector - 2" formulated by Lucas (1993) and "The Competitive Banking System" proposed by Yoshino (1993). This modelling allows us to better characterize the new types of misallocations. We find that the maximum value of the resources wasted by the US economy occurred in the years 1980-81, after the second oil shock. For these years, we estimate the excess resources allocated to every specific distorting channel: i) the US commercial banks spent additional resources of around 2% of GDP; ii) for the purpose of the firm floating time, between 2.4% and 4.1% of GDP were used; and iii) for household transaction time, between 3.1% and 4.5% of GDP were allocated. The Bailey triangle under the demand schedule for the monetary base represented around 1% of GDP, which is consistent with Lucas (2000). We estimate that the US total welfare costs of inflation were around 10% of GDP in terms of consumption goods foregone. The big difference between our results and Lucas (2000) is mainly due to the Harberger triangle in the market for loans (inside money), which forms part of the household transaction time, of the firm float labor and of the distortion in the banking industry. This triangle arises due to the widening interest rate spread in the presence of a distorting inflation tax under a fractional reserve system. The Harberger triangle can represent 80% of the total welfare costs of inflation, while the remaining percentage is split almost equally between the Bailey triangle and the resources used for bank services. Finally, we formulate several theorems in terms of the optimal non-neutral monetary policy so as to compare with classical monetary theory.
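
The Bailey triangle invoked above has a convenient closed form under log-log money demand, which the sketch below checks numerically; the demand parameters are illustrative, not the paper's estimates.

```python
import numpy as np
from scipy.integrate import quad

# Bailey-triangle welfare measure: the area under the inverse money-demand
# schedule between the real balances held at interest rate i and those held
# as i -> 0. With log-log demand m(i) = A * i**(-eta), Lucas (2000) obtains
# w(i) = A * eta / (1 - eta) * i**(1 - eta).
A, eta = 0.05, 0.5            # illustrative demand parameters, not estimates
i = 0.14                      # nominal interest rate, e.g. around 1980-81

def inverse_demand(m):        # interest rate as a function of real balances
    return (m / A) ** (-1.0 / eta)

m_i = A * i ** (-eta)         # balances held at rate i
m_0 = A * (1e-8) ** (-eta)    # ~ balances held as i -> 0+
numeric, _ = quad(inverse_demand, m_i, m_0)
closed_form = A * eta / (1 - eta) * i ** (1 - eta)
print(f"Bailey triangle: numeric ~ {numeric:.4f}, closed form = {closed_form:.4f}")
```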

Abstract:

The objective of this study is to propose the implementation of a statistical model for computing volatility that is not widespread in the Brazilian literature, the local scale model (LSM), presenting its advantages and disadvantages relative to the models commonly used for risk measurement. The parameters are estimated from daily Ibovespa quotes from January 2009 to December 2014, and the empirical accuracy of the models is assessed through out-of-sample tests, comparing the VaR estimates obtained for January to December 2014. Explanatory variables were introduced in an attempt to improve the models; the American counterpart of the Ibovespa, the Dow Jones index, was chosen for having exhibited properties such as high correlation, Granger causality and a significant log-likelihood ratio. One of the innovations of the local scale model is that it does not use the variance directly, but rather its reciprocal, called the "precision" of the series, which follows a kind of multiplicative random walk. The LSM captured all the stylized facts of the financial series, and the results favored its use; the model is therefore an efficient and parsimonious specification alternative for estimating and forecasting volatility, insofar as it has only one parameter to be estimated, which represents a paradigm shift relative to conditional heteroskedasticity models.
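
A minimal sketch of the out-of-sample check described above, counting breaches of a one-day 1% VaR against its nominal coverage; a rolling normal approximation stands in here for the local scale model and the competitors compared in the study.

```python
import numpy as np

# Out-of-sample VaR backtest: count how often realized returns breach the
# one-day VaR and compare with the nominal coverage. The VaR below comes
# from a rolling-window normal approximation on toy data.
rng = np.random.default_rng(5)
returns = rng.standard_t(df=5, size=1500) * 0.01   # toy fat-tailed daily returns
window, alpha = 250, 0.01                          # 1% one-day VaR

z = 2.326  # ~ standard normal 99% quantile
breaches, n_oos = 0, 0
for t in range(window, len(returns)):
    var_t = -z * returns[t - window:t].std()       # VaR as a (negative) return
    if returns[t] < var_t:
        breaches += 1
    n_oos += 1

print(f"observed breach rate: {breaches / n_oos:.3%} (nominal {alpha:.0%})")
# Fat-tailed returns typically breach a normal VaR more often than 1%.
```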

Abstract:

This study aims to contribute to the forecasting literature on stock returns in emerging markets. We use Autometrics to select relevant predictors among macroeconomic, microeconomic and technical variables. We develop predictive models for the Brazilian market premium, measured as the excess return over the Selic interest rate, and for Itaú SA, Itaú-Unibanco and Bradesco stock returns. We find that, for the market premium, an ADL with error correction is able to outperform the benchmarks in terms of economic performance. For individual stock returns, there is a trade-off between the statistical properties and the out-of-sample performance of the model.
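
A minimal sketch of an ADL rewritten in error-correction form, the specification found to work for the market premium; the variables are simulated placeholders, not the paper's predictors, and the long-run coefficient is fixed here rather than estimated in a first stage.

```python
import numpy as np
import statsmodels.api as sm

# ADL model in error-correction form:
#   dy_t = c + b0*dx_t + g*(y_{t-1} - k*x_{t-1}) + e_t
# Variables are simulated placeholders, not the paper's predictors.
rng = np.random.default_rng(6)
T = 300
x = np.cumsum(rng.normal(size=T))              # a persistent predictor
y = 0.8 * x + rng.normal(scale=0.5, size=T)    # equilibrium-related target

dy, dx = np.diff(y), np.diff(x)
ecm = (y - 0.8 * x)[:-1]                       # lagged equilibrium error (k fixed)
X = sm.add_constant(np.column_stack([dx, ecm]))
res = sm.OLS(dy, X).fit()
print(res.params)   # const, short-run effect of dx, error-correction speed
```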

Abstract:

We develop an affine jump diffusion (AJD) model with the jump-risk premium determined by both idiosyncratic and systematic sources of risk. While we maintain the classical affine setting of the model, we add a finite set of new state variables that affect the paths of the primitive, under both the actual and the risk-neutral measure, by being related to the primitive's jump process. These new variables are assumed to be common to all the primitives. We present simulations to ensure that the model generates the volatility smile and compute the "discounted conditional characteristic function" transform that permits the pricing of a wide range of derivatives.
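
A minimal Euler-scheme sketch of a jump diffusion with the jump intensity driven by an extra mean-reverting state variable, echoing the role of the new state variables described above; all parameters are illustrative, and the paper's model and measure changes are richer.

```python
import numpy as np

# Euler simulation of a jump diffusion whose jump intensity is linear in an
# extra state variable z_t (an affine intensity specification). Parameters
# are illustrative placeholders.
rng = np.random.default_rng(7)
T, n = 1.0, 1000
dt = T / n
mu, sigma = 0.05, 0.2              # diffusion drift and volatility
kappa, zbar, lam0 = 2.0, 1.0, 0.5  # state mean reversion and base intensity
jump_mu, jump_sig = -0.05, 0.1     # lognormal jump sizes

s, z = np.empty(n + 1), np.empty(n + 1)
s[0], z[0] = 100.0, 1.0
for t in range(n):
    intensity = lam0 * z[t]                   # state-dependent jump rate
    jump = 0.0
    if rng.random() < intensity * dt:         # at most one jump per step
        jump = np.exp(rng.normal(jump_mu, jump_sig)) - 1.0
    s[t + 1] = s[t] * (1 + mu * dt + sigma * np.sqrt(dt) * rng.normal() + jump)
    z[t + 1] = z[t] + kappa * (zbar - z[t]) * dt + 0.1 * np.sqrt(dt) * rng.normal()
    z[t + 1] = max(z[t + 1], 0.0)             # keep the intensity nonnegative

print("terminal price:", s[-1])
```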