942 results for equilibrium asset pricing models with latent variables
Abstract:
This paper studies a smooth-transition (ST) type of cointegration. The proposed ST cointegration allows for a regime-switching structure in a cointegrated system. It nests the linear cointegration developed by Engle and Granger (1987) and the threshold cointegration studied by Balke and Fomby (1997). We develop F-type tests of linear cointegration against ST cointegration in ST-type cointegrating regression models with or without time trends. The null asymptotic distributions of the tests are derived with stationary transition variables in the ST cointegrating regression models. It is shown that our tests have nonstandard limiting distributions expressed in terms of standard Brownian motion when the regressors are pure random walks, but standard asymptotic distributions when the regressors contain random walks with nonzero drift. Finite-sample distributions of the tests are studied by Monte Carlo simulations. The small-sample results indicate that our F-type tests have better power when the system contains ST cointegration than when it is linearly cointegrated. An empirical example applies the testing procedures to purchasing power parity (PPP) data (monthly US dollar and Italian lira series and the dollar-lira exchange rate from 1973:01 to 1989:10). We find no linear cointegration in the system, but ST-type cointegration is present in the PPP data.
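For orientation, smooth-transition cointegrating regressions of the kind described here are commonly written with a logistic transition function; the display below is a generic sketch with illustrative notation, not the paper's own specification:

```latex
% Generic ST cointegrating regression with a logistic transition in s_t
y_t = x_t'\beta_1 + \bigl(x_t'\beta_2\bigr)\,G(s_t;\gamma,c) + u_t,
\qquad
G(s_t;\gamma,c) = \bigl[1 + \exp\{-\gamma\,(s_t - c)\}\bigr]^{-1}, \quad \gamma > 0.
```

Linear cointegration corresponds to the null that the nonlinear component vanishes (e.g. H0: beta_2 = 0), which is the kind of restriction the F-type tests examine.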
Abstract:
In today's society it is increasingly important for companies to retain their existing customers, as competition keeps getting tougher. As a result, companies try to take measures to nurture their customer relationships. This problem is also highly relevant in the IT industry, where agile work in IT projects is common. Our collaboration partner has seen a growing need to measure service quality in a recurring way within IT projects, in order to capture relevant variables that extend beyond the requirements specification. To measure the success of this way of working, they want to be able to measure a Customer Satisfaction Index (Nöjd Kund Index, NKI) so that IT projects can be compared internally within the company. Since earlier research has shown a lack of models that combine measurement of service quality with NKI, relevant literature was studied; it shows that the SERVQUAL model is established for measuring service quality and that the American Customer Satisfaction Index (ACSI) model is established for measuring customer satisfaction. This formed the basis for the problem statement and purpose of the thesis. The purpose of the thesis is to create an extended model for measuring NKI, in order to compare IT projects internally, together with recurring measurement of service quality within IT projects. The model was developed using the Design and Creation research strategy, with interviews conducted to capture requirements for the extended model. The result is an extended model, based on the models mentioned above, with a recurring approach to measuring service quality within IT projects and with NKI measurement for comparing IT projects internally within the company. The resulting model was then verified through further interviews with respondents who have solid experience from the customer side of IT projects. From these interviews it could be concluded that the model can be considered applicable in practice to IT projects.
Abstract:
We present the hglm package for fitting hierarchical generalized linear models. It can be used for linear mixed models and generalized linear mixed models with random effects, supporting a variety of link functions and distributions for both the outcomes and the random effects. Fixed effects can also be fitted in the dispersion part of the model.
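As a rough Python analogue (hglm itself is an R package, so this is a substitute illustration rather than the package's API), a linear mixed model with a random intercept can be fitted with statsmodels on synthetic data:

```python
# Minimal sketch: linear mixed model with a random intercept per group,
# a conceptual analogue of the simplest models hglm fits (hglm additionally
# supports non-Gaussian outcomes/random effects and a dispersion model).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_groups, n_per = 20, 10
group = np.repeat(np.arange(n_groups), n_per)
x = rng.normal(size=n_groups * n_per)
u = rng.normal(scale=0.5, size=n_groups)[group]      # group-level random intercepts
y = 1.0 + 2.0 * x + u + rng.normal(size=n_groups * n_per)

df = pd.DataFrame({"y": y, "x": x, "group": group})
fit = smf.mixedlm("y ~ x", data=df, groups=df["group"]).fit()
print(fit.summary())
```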
Abstract:
We hypothesise that differences in people's attitudes and personality traits lead them to attribute varying importance to environmental considerations, safety, comfort, convenience and flexibility. Differences in personality traits can be revealed not only in individuals' choice of transport, but also in other actions of their everyday lives, such as how much they recycle and whether they take precautions or avoid dangerous pursuits. Conditioning on a set of exogenous individual characteristics, we use indicators of attitudes and personality traits to form latent variables for inclusion in an otherwise standard discrete mode choice model. With a sample of Swedish commuters, we find that attitudes towards flexibility and comfort, as well as being pro-environmentally inclined, influence the individual's choice of mode. Although modal time and cost remain important, it follows that there are ways other than economic incentives to attract individuals to the public transport modes that are desirable from society's perspective. Our results should provide useful information to policy-makers and transportation planners developing sustainable transportation systems.
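Models of this kind are usually written as an integrated choice and latent variable structure; the display below is a generic sketch with notation of our own choosing, not the authors' exact specification:

```latex
% Latent trait z_i^* built from exogenous characteristics w_i,
% measured through attitude/personality indicators I_i,
% and entering an otherwise standard random-utility mode choice model.
z_i^* = \alpha' w_i + \eta_i, \qquad
I_i = \lambda\, z_i^* + \nu_i, \qquad
U_{ij} = \beta' x_{ij} + \gamma_j\, z_i^* + \varepsilon_{ij},
```

with individual i choosing mode j when U_{ij} exceeds the utility of every other available mode.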
Abstract:
Background: The sensitivity to microenvironmental changes varies among animals and may be under genetic control. It is essential to take this element into account when aiming at breeding robust farm animals. Here, linear mixed models with genetic effects in the residual variance part of the model can be used. Such models have previously been fitted using EM and MCMC algorithms. Results: We propose the use of double hierarchical generalized linear models (DHGLM), where the squared residuals are assumed to be gamma distributed and the residual variance is fitted using a generalized linear model. The algorithm iterates between two sets of mixed model equations, one on the level of observations and one on the level of variances. The method was validated using simulations and also by re-analyzing a data set on pig litter size that was previously analyzed using a Bayesian approach. The pig litter size data contained 10,060 records from 4,149 sows. The DHGLM was implemented using the ASReml software and the algorithm converged within three minutes on a Linux server. The estimates were similar to those previously obtained using Bayesian methodology, especially the variance components in the residual variance part of the model. Conclusions: We have shown that variance components in the residual variance part of a linear mixed model can be estimated using a DHGLM approach. The method enables analyses of animal models with large numbers of observations. An important future development of the DHGLM methodology is to include the genetic correlation between the random effects in the mean and residual variance parts of the model as a parameter of the DHGLM.
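A stripped-down sketch of the interconnected estimation idea (ignoring the random genetic effects, the gamma GLM and the hat-value corrections that the actual DHGLM/ASReml implementation uses) alternates between a weighted fit for the mean and a log-linear fit for the residual variance; all data below are synthetic:

```python
# Simplified sketch of the DHGLM-style iteration:
#   1) fit the mean model by weighted least squares with weights 1/sigma2_i,
#   2) regress log squared residuals on variance-model covariates to update sigma2_i,
#   3) repeat until the coefficients stabilize.
# The real method uses two sets of mixed model equations with random effects
# in both parts; this only conveys the core alternating idea.
import numpy as np

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # mean-model design
Z = np.column_stack([np.ones(n), rng.normal(size=n)])   # variance-model design
sigma2_true = np.exp(Z @ np.array([-0.5, 0.8]))
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n) * np.sqrt(sigma2_true)

sigma2 = np.full(n, y.var())
for _ in range(20):
    w = 1.0 / sigma2
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))   # weighted LS mean fit
    r2 = (y - X @ beta) ** 2
    delta = np.linalg.lstsq(Z, np.log(r2 + 1e-12), rcond=None)[0]   # crude variance-model fit
    sigma2 = np.exp(Z @ delta)

print("mean coefficients:", beta)
print("variance-model coefficients:", delta)
```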
Abstract:
Generalized linear mixed models are flexible tools for modeling non-normal data and are useful for accommodating overdispersion in Poisson regression models with random effects. Their main difficulty resides in parameter estimation, because there is no analytic solution for the maximization of the marginal likelihood. Many methods have been proposed for this purpose and many of them are implemented in software packages. The purpose of this study is to compare the performance of three different statistical principles (marginal likelihood, extended likelihood, and Bayesian analysis) via simulation studies. Real data on contact wrestling are used for illustration.
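As a concrete member of the model class being compared, a Poisson regression with a random intercept to absorb overdispersion can be written as follows (generic textbook notation, not taken from the paper):

```latex
y_{ij} \mid u_i \sim \mathrm{Poisson}(\mu_{ij}), \qquad
\log \mu_{ij} = x_{ij}'\beta + u_i, \qquad
u_i \sim \mathcal{N}(0, \sigma_u^2).
```

The marginal likelihood requires integrating out the u_i, which has no closed form here; that is the step the competing estimation principles handle differently.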
Abstract:
AIMS/HYPOTHESIS: Soluble tumor necrosis factor receptors 1 and 2 (sTNFR1 and sTNFR2) contribute to experimental diabetic kidney disease, a condition with substantially increased cardiovascular risk when present in patients. Therefore, we aimed to explore the levels of sTNFRs, and their association with prevalent kidney disease, incident cardiovascular disease, and risk of mortality independently of baseline kidney function and microalbuminuria in a cohort of patients with type 2 diabetes. In pre-defined secondary analyses we also investigated whether the sTNFRs predict adverse outcomes in the absence of diabetic kidney disease. METHODS: The CARDIPP study, a cohort study of 607 diabetes patients [mean age 61 years, 44 % women, 45 cardiovascular events (fatal/non-fatal myocardial infarction or stroke) and 44 deaths during follow-up (mean 7.6 years)], was used. RESULTS: Higher sTNFR1 and sTNFR2 were associated with higher odds of prevalent kidney disease [odds ratio (OR) per standard deviation (SD) increase 1.60, 95 % confidence interval (CI) 1.32-1.93, p < 0.001 and OR 1.54, 95 % CI 1.21-1.97, p = 0.001, respectively]. In Cox regression models adjusting for age, sex, glomerular filtration rate and urinary albumin/creatinine ratio, higher sTNFR1 and sTNFR2 predicted incident cardiovascular events [hazard ratio (HR) per SD increase 1.66, 95 % CI 1.29-2.174, p < 0.001 and HR 1.47, 95 % CI 1.13-1.91, p = 0.004, respectively]. Results were similar in separate models with adjustments for inflammatory markers, HbA1c, or established cardiovascular risk factors, or when participants with diabetic kidney disease at baseline were excluded (p < 0.01 for all). Both sTNFRs were associated with mortality. CONCLUSIONS/INTERPRETATIONS: Higher circulating sTNFR1 and sTNFR2 are associated with diabetic kidney disease, and predict incident cardiovascular disease and mortality independently of microalbuminuria and kidney function, even in those without kidney disease. Our findings support the clinical utility of sTNFRs as prognostic markers in type 2 diabetes.
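To illustrate the general form of such a Cox model with a per-SD hazard ratio, here is a minimal lifelines sketch using a built-in example dataset as a stand-in; nothing below comes from the CARDIPP data, and the standardized column merely plays the role of a biomarker:

```python
# Minimal sketch: Cox proportional hazards regression where a predictor is
# standardized, so exp(coef) reads as the hazard ratio per one-SD increase.
# The rossi dataset shipped with lifelines is used only as a stand-in;
# 'prio_sd' plays the role of a standardized biomarker, 'age' and 'fin'
# the role of adjustment covariates.
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()
df["prio_sd"] = (df["prio"] - df["prio"].mean()) / df["prio"].std()

cph = CoxPHFitter()
cph.fit(df[["week", "arrest", "prio_sd", "age", "fin"]],
        duration_col="week", event_col="arrest")
cph.print_summary()   # exp(coef) for prio_sd is the HR per one-SD increase
```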
Abstract:
Due to the increase in water demand and hydropower energy, it is becoming more important to operate hydraulic structures efficiently while sustaining multiple demands. In particular, companies, governmental agencies and consulting offices require effective, practical, integrated tools and decision-support frameworks to operate reservoirs, cascades of run-of-river plants and related elements such as canals, by merging hydrological and reservoir simulation/optimization models with various numerical weather predictions and radar and satellite data. Model performance is highly dependent on the streamflow forecast, the related uncertainty and how that uncertainty is considered in decision making. While deterministic weather predictions and their corresponding streamflow forecasts restrict the manager to single deterministic trajectories, probabilistic forecasts can be a key solution by including uncertainty in flow forecast scenarios for dam operation. The objective of this study is to compare deterministic and probabilistic streamflow forecasts on a previously developed basin/reservoir model for short-term reservoir management. The study is applied to the Yuvacık Reservoir and its upstream basin, which is the main water supply of Kocaeli City, located in the northwestern part of Turkey. The reservoir is a typical example with its limited capacity, downstream channel restrictions and high snowmelt potential. Mesoscale Model 5 and Ensemble Prediction System data are used as the main inputs, and the flow forecasts are produced for the year 2012 using HEC-HMS. A hydrometeorological rule-based reservoir simulation model is built with HEC-ResSim and integrated with the forecasts. Since the EPS-based hydrological model produces a large number of equally probable scenarios, it indicates how uncertainty spreads into the future. Compared with the deterministic approach, it thus provides the operator with risk ranges in terms of spillway discharges and reservoir levels. The framework is fully data-driven, applicable and useful to the profession, and the knowledge can be transferred to other similar reservoir systems.
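As a toy illustration of how equally probable ensemble scenarios translate into risk ranges (the actual workflow runs MM5/EPS forecasts through HEC-HMS and HEC-ResSim; the numbers below are entirely synthetic):

```python
# Toy sketch: given simulated reservoir levels for many equally probable
# ensemble scenarios, report percentile-based risk ranges, which a single
# deterministic trajectory cannot provide. All values are synthetic.
import numpy as np

rng = np.random.default_rng(42)
n_scenarios, horizon_hours = 50, 120
levels = 100.0 + np.cumsum(rng.normal(0.0, 0.05, size=(n_scenarios, horizon_hours)), axis=1)

p10, p50, p90 = np.percentile(levels, [10, 50, 90], axis=0)
print("end-of-horizon level: P10=%.2f  P50=%.2f  P90=%.2f" % (p10[-1], p50[-1], p90[-1]))
```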
Abstract:
This master's dissertation in economics was motivated by a complex question that is widely studied in today's political economy literature: the ways in which political campaigns affect voting in an election. The study models the Brazilian electoral market for federal deputies and senators. Using a linear model, we conclude that campaign spending is a decisive factor in the election of a candidate for federal deputy. After recognizing that the variable measuring campaign spending suffers from measurement error (due, for example, to the well-known "caixa dois", off-the-books campaign funds) and is also endogenous, since candidates with better chances of winning votes attract more sources of financing, the model was estimated by instrumental variables. For senators, using linear models and models with a binary response variable, we also find that campaign spending matters, although to a lesser extent; a more important factor in the Senate race appears to be a prior perception of the candidate's quality.
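A bare-bones sketch of the two-stage least squares (instrumental variables) idea used here, with purely synthetic data and made-up variable names (votes, spending, instrument):

```python
# Two-stage least squares by hand:
#   stage 1: regress the endogenous regressor (campaign spending) on the instruments,
#   stage 2: regress votes on the fitted spending from stage 1.
# Synthetic data; in the dissertation the instruments address measurement error
# and endogeneity of reported campaign spending.
import numpy as np

rng = np.random.default_rng(7)
n = 1000
z = rng.normal(size=n)                      # instrument
e = rng.normal(size=n)                      # confounder driving endogeneity
spending = 1.0 + 0.8 * z + 0.5 * e + rng.normal(size=n)
votes = 2.0 + 1.5 * spending + 2.0 * e + rng.normal(size=n)

Z = np.column_stack([np.ones(n), z])
X = np.column_stack([np.ones(n), spending])

gamma = np.linalg.lstsq(Z, spending, rcond=None)[0]        # stage 1
X_hat = np.column_stack([np.ones(n), Z @ gamma])
beta_iv = np.linalg.lstsq(X_hat, votes, rcond=None)[0]     # stage 2
beta_ols = np.linalg.lstsq(X, votes, rcond=None)[0]
print("OLS estimate (biased):", beta_ols[1], " IV estimate:", beta_iv[1])
```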
Abstract:
We argue that it is possible to adapt the approach of imposing restrictions on available plans through finitely effective debt constraints, introduced by Levine and Zame (1996), to encompass models with default and collateral. Along this line, we introduce, in the setting of Araujo, Páscoa and Torres-Martínez (2002) and Páscoa and Seghir (2008), the concept of almost finite-time solvency. We show that the conditions imposed in these two papers to rule out Ponzi schemes implicitly restrict actions to be almost finite-time solvent. We define the notion of equilibrium with almost finite-time solvency and look for sufficient conditions for its existence. Under a mild assumption on default penalties, namely that agents are myopic with respect to default penalties, we prove that existence is guaranteed (and Ponzi schemes are ruled out) when actions are restricted to be almost finite-time solvent. The proof is very simple and intuitive. In particular, the main existence results in Araujo et al. (2002) and Páscoa and Seghir (2008) are simple corollaries of our existence result.
Abstract:
We prove the existence of a competitive equilibrium for exchange economies with a measure space of agents and for which the commodity space is ℓp, 1 < p < +∞. A vector x = (x_n) in ℓp may be interpreted as a security which promises to deliver x_n units of numeraire at state (or date) n. Under assumptions imposing uniform bounds on marginal rates of substitution, positive results on core-Walras equivalence were established in Rustichini–Yannelis [21] and Podczeck [20]. In this paper we prove that under similar assumptions on marginal rates of substitution, the set of competitive equilibria (and thus the core) is non-empty.
Abstract:
In this study, we verify the existence of predictability in the Brazilian equity market. Unlike other studies in the same vein, which evaluate the original series of each stock, we evaluate synthetic series created on the basis of linear models of stocks. Following Burgess (1999), we use the "stepwise regression" approach to form the model for each stock. We then use the variance ratio profile together with a Monte Carlo simulation to select models with potential predictability. Unlike Burgess (1999), we carry out White's (2000) Reality Check in order to verify the existence of positive returns in the out-of-sample period. We use the strategies proposed by Sullivan, Timmermann & White (1999) and Hsu & Kuan (2005), amounting to 26,410 simulated strategies. Finally, using the bootstrap methodology with 1,000 simulations, we find strong evidence of predictability in the models, even after including transaction costs.
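For reference, the variance ratio statistic underlying the "variance ratio profile" compares the variance of q-period returns with q times the variance of one-period returns; a minimal sketch on synthetic returns:

```python
# Variance ratio VR(q) = Var(q-period return) / (q * Var(1-period return)).
# Values near 1 are consistent with a random walk; systematic deviations
# across q (the "variance ratio profile") point to potential predictability.
import numpy as np

def variance_ratio(returns, q):
    q_period = np.convolve(returns, np.ones(q), mode="valid")  # overlapping q-period sums
    return q_period.var(ddof=1) / (q * returns.var(ddof=1))

rng = np.random.default_rng(3)
r = rng.normal(scale=0.01, size=2000)          # synthetic one-period returns
print([round(variance_ratio(r, q), 3) for q in (2, 5, 10, 20)])
```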
Abstract:
Verdelhan (2009) shows that, if one wishes to explain the behavior of the risk premium in foreign bond markets using the external habit formation model proposed by Campbell and Cochrane (1999), the equilibrium risk-free rate must be specified as procyclical. We show that this specification is only possible under implausible calibration parameters. Moreover, during calibration, for most reasonable parameters the price-consumption ratio diverges. However, adopting the suggestion proposed by Verdelhan (2009), namely fixing the sensitivity function λ(s_t) at its steady-state value during calibration and releasing it only when simulating the data, so as to guarantee procyclical risk-free rates, we are able to find a finite, well-behaved value for the equilibrium price-consumption ratio and to replicate the forward premium anomaly. Setting aside possible inconsistencies of this procedure, under procyclical risk-free rates, as suggested by Wachter (2006), the model generates real yield curves that are decreasing in maturity regardless of the state of the economy, a result at odds with the related literature and with actual yield data.
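For reference, in the standard Campbell and Cochrane (1999) formulation the surplus-consumption dynamics and the sensitivity function referred to above are as follows (the standard statement of that model, reproduced only as context and not taken from the dissertation):

```latex
% Log surplus-consumption dynamics and sensitivity function in
% Campbell and Cochrane (1999); \bar{S} is the steady-state surplus ratio.
s_{t+1} = (1-\phi)\,\bar{s} + \phi\, s_t + \lambda(s_t)\,(\Delta c_{t+1} - g),
\qquad
\lambda(s_t) =
\begin{cases}
\dfrac{1}{\bar{S}}\sqrt{1 - 2(s_t - \bar{s})} - 1, & s_t \le s_{\max},\\[4pt]
0, & \text{otherwise},
\end{cases}
```

so fixing the sensitivity function at its steady-state value amounts to replacing λ(s_t) by λ(s̄) = 1/S̄ − 1 during calibration.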
Abstract:
Two experiments and a sampling survey were analyzed in the context of spatial data. The experiments were laid out in randomized complete block designs. In experiment one (EXP 1), eight white clover cultivars were evaluated and the variables Total Dry Matter (MST) and Grass Dry Matter (MSGRAM) were studied; in experiment two (EXP 2), 20 cultivars of forage species were evaluated and the variable Establishment Percentage (%IMPL) was studied. The variables were analyzed in a mixed-model framework, with spatial variability modeled through exponential, spherical and Gaussian semivariograms. On average, the Coefficient of Variation (CV) of the cultivar means decreased by 19% and 14%, and the standard errors of the proposed orthogonal contrasts decreased by 24.6% and 33.3%, for MST and MSGRAM respectively. In the sampling survey, spatial association was studied for Aristida laevis (Nees) Kunth, Paspalum notatum Fl and Desmodium incanum DC, sampled along a fixed transect of contiguous quadrats at four sampling-unit sizes (0.1 x 0.1 m; 0.1 x 0.3 m; 0.1 x 0.5 m; and 0.1 x 1.0 m). For Aristida laevis (Nees) Kunth and Paspalum notatum Fl, the semivariograms fitted well at the smaller sampling-unit sizes, with the fit deteriorating as the sampling unit grew larger. Desmodium incanum DC showed the opposite behavior, with semivariograms fitting better at the larger sampling-unit sizes.
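For context, the three semivariogram families mentioned have the standard forms below, with nugget c_0, partial sill c_1 and range parameter a (generic geostatistics notation, not taken from the thesis):

```latex
\gamma_{\mathrm{exp}}(h) = c_0 + c_1\bigl(1 - e^{-h/a}\bigr), \qquad
\gamma_{\mathrm{gau}}(h) = c_0 + c_1\bigl(1 - e^{-(h/a)^2}\bigr), \qquad
\gamma_{\mathrm{sph}}(h) =
\begin{cases}
c_0 + c_1\left(\tfrac{3h}{2a} - \tfrac{h^3}{2a^3}\right), & 0 < h \le a,\\[4pt]
c_0 + c_1, & h > a.
\end{cases}
```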
Abstract:
The aim of this work is to analyze the empirical characteristics of a series of high-frequency returns for one of the most heavily traded assets on the São Paulo Stock Exchange. We are interested in modeling the conditional volatility of these returns, testing in particular for the presence of long memory, among other phenomena that characterize this type of data. Our investigation reveals that, in addition to long memory, there is strong intraday seasonality, but we find no evidence of one stylized fact of stock returns, the leverage effect. We use models capable of capturing long memory in the conditional variance of the deseasonalized returns, with results superior to those of traditional short-memory models and with important implications for option pricing and market risk.
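Long-memory conditional-variance models of the kind referred to are typically built on the fractional differencing operator; as a generic reminder (not the dissertation's exact specification):

```latex
(1-L)^d = \sum_{k=0}^{\infty} \pi_k L^k, \qquad
\pi_k = \frac{\Gamma(k-d)}{\Gamma(k+1)\,\Gamma(-d)},
```

so for 0 < d < 1 the weights π_k decay hyperbolically rather than geometrically, which is what long-memory volatility specifications (FIGARCH-type models, for example) exploit to capture slowly decaying persistence in the conditional variance.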