942 results for equilibrium asset pricing models with latent variables
Abstract:
In recent years, VAR models have become the main econometric tool for testing whether a relationship can exist between variables and for evaluating the effects of economic policies. This thesis studies three different identification approaches starting from reduced-form VAR models (including the sample period, the set of endogenous variables, and the deterministic terms). In the case of VAR models, we use the Granger causality test to verify the ability of one variable to predict another; in the case of cointegration, we use VECM models to jointly estimate the long-run and short-run coefficients; and in the case of small data sets and overfitting problems, we use Bayesian VAR models with impulse response functions and variance decomposition to analyse the effect of shocks on macroeconomic variables. To this end, the empirical studies are carried out using specific time-series data and formulating different hypotheses. Three VAR models were used: first, to study monetary policy decisions and discriminate among the various post-Keynesian theories of monetary policy, in particular the so-called "solvency rule" (Brancaccio and Fontana 2013, 2015) and the nominal GDP rule in the euro area (paper 1); second, to extend the evidence on the money endogeneity hypothesis by evaluating the effects of bank securitization on the monetary policy transmission mechanism in the United States (paper 2); third, to evaluate the effects of ageing on health expenditure in Italy in terms of economic policy implications (paper 3). The thesis is introduced by chapter 1, which outlines the context, motivation and purpose of this research, while the structure and summary, as well as the main results, are described in the remaining chapters. Chapter 2 examines, using a first-difference VAR model with quarterly euro-area data, whether monetary policy decisions can be interpreted in terms of a "monetary policy rule", with specific reference to the so-called "nominal GDP targeting rule" (McCallum 1988; Hall and Mankiw 1994; Woodford 2012). The results highlight a causal relationship running from the gap between nominal GDP growth and target GDP growth to changes in the three-month market interest rate. The same analysis does not appear to confirm the existence of a significant causal relationship in the reverse direction, from changes in the market interest rate to the gap between nominal GDP growth and target GDP growth. Similar results were obtained when the market interest rate was replaced by the ECB refinancing rate. This confirmation of only one of the two directions of causality does not support an interpretation of monetary policy based on the nominal GDP targeting rule, and it raises more general doubts about the applicability of the Taylor rule, and of all conventional monetary policy rules, to the case in question. The results instead appear more in line with other possible approaches, such as those based on certain post-Keynesian and Marxist analyses of monetary theory, and more specifically the so-called "solvency rule" (Brancaccio and Fontana 2013, 2015).
These lines of research dispute the simplistic thesis that the scope of monetary policy is the stabilization of inflation, real GDP or nominal income around a "natural equilibrium" level. Rather, they suggest that central banks actually pursue a more complex goal, namely the regulation of the financial system, with particular reference to the relationships between creditors and debtors and the related solvency of economic units. Chapter 3 analyses loan supply, considering the endogeneity of money arising from banks' securitization activity over the period 1999-2012. Although much of the literature investigates the endogeneity of the money supply, this approach has rarely been adopted to investigate money endogeneity in the short and long run with a study of the United States during the two main crises: the bursting of the dot-com bubble (1998-1999) and the sub-prime mortgage crisis (2008-2009). In particular, the effects of financial innovation on the lending channel are considered using the securitization-adjusted loan series, in order to test whether the US banking system is encouraged to seek cheaper sources of funding, such as securitization, in the event of restrictive monetary policy (Altunbas et al., 2009). The analysis is based on the M1 and M2 monetary aggregates. Using VECM models, we examine a long-run relationship between the variables in levels and assess the effects of the money supply by analysing how much monetary policy affects short-run deviations from the long-run relationship. The results show that securitization affects the impact of loans on M1 and M2. This implies that the money supply is endogenous, confirming the structuralist approach and highlighting that economic agents are motivated to increase securitization as advance cover against monetary policy shocks. Chapter 4 investigates the relationship between per capita health expenditure, per capita GDP, the ageing index and life expectancy in Italy over the period 1990-2013, using Bayesian VAR models and annual data drawn from the OECD and Eurostat databases. The impulse response functions and variance decomposition highlight a positive relationship: from per capita GDP to per capita health expenditure, from life expectancy to health expenditure, and from the ageing index to per capita health expenditure. The impact of ageing on health expenditure is more significant than that of the other variables. Overall, our results suggest that the disabilities closely connected with ageing may be the main driver of health expenditure in the short-to-medium run. Good healthcare management helps improve patient well-being without increasing total health expenditure. However, policies that improve the health status of older people may be needed to lower the per capita demand for health and social services.
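As an editorial illustration of the Granger-causality step that chapter 2 relies on, here is a minimal sketch using statsmodels on invented stand-in series (gap for the nominal-GDP growth gap and d_rate for interest-rate changes are hypothetical names; the thesis's actual data and specification are not reproduced here).

```python
# Sketch of a bidirectional Granger-causality test within a first-difference VAR,
# on synthetic data; assumptions of this illustration, not the thesis's code.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n = 120  # a hypothetical quarterly sample

# Hypothetical stand-ins: gap = nominal-GDP growth minus target growth,
# d_rate = change in the three-month market interest rate.
gap = rng.normal(size=n).cumsum() * 0.1
d_rate = 0.3 * np.roll(gap, 1) + rng.normal(scale=0.5, size=n)
data = pd.DataFrame({"gap": gap, "d_rate": d_rate}).diff().dropna()  # VAR in first differences

model = VAR(data)
lag = max(1, model.select_order(maxlags=8).aic)  # lag length chosen by AIC
res = model.fit(lag)

# Test both directions: gap -> d_rate and d_rate -> gap
print(res.test_causality("d_rate", ["gap"], kind="f").summary())
print(res.test_causality("gap", ["d_rate"], kind="f").summary())
```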
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
Researchers often develop and test conceptual models containing formative variables. In many cases, these formative variables are specified as being endogenous. This article provides a clarification of formative variable theory, distinguishing between the formative latent variable and the formative composite variable. When an endogenous latent variable relies on formative indicators for measurement, empirical studies can say nothing about the relationship between exogenous variables and the endogenous formative latent variable: conclusions can only be drawn regarding the exogenous variables' relationships with a composite variable. The authors also show the dangers associated with developing theory about antecedents to endogenous formative variables at the (aggregate) formative latent variable level. Modeling relationships with endogenous formative variables at the (disaggregate) indicator level informs richer theory development, and encourages more precise empirical testing. When antecedents' relationships with endogenous formative variables are modeled at the formative latent variable level rather than the formative indicator level, theory construction can verge on the superficial, and empirical findings can be ambiguous in substantive meaning.
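Written out in standard SEM notation (assumed here, not taken from the article), the distinction the authors draw is between a formative latent variable, which retains a disturbance term, and a composite, which is an exact weighted sum of its indicators:

```latex
% Standard formative specifications; notation assumed, not the article's own.
\begin{align}
  \eta &= \gamma_1 x_1 + \gamma_2 x_2 + \dots + \gamma_q x_q + \zeta
         && \text{(formative latent variable, disturbance } \zeta \neq 0\text{)} \\
  C    &= w_1 x_1 + w_2 x_2 + \dots + w_q x_q
         && \text{(formative composite, an exact weighted sum)}
\end{align}
```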
Abstract:
In machine learning, the Gaussian process latent variable model (GP-LVM) has been extensively applied in the field of unsupervised dimensionality reduction. When some supervised information, e.g., pairwise constraints or labels of the data, is available, the traditional GP-LVM cannot directly utilize such supervised information to improve the performance of dimensionality reduction. In this case, it is necessary to modify the traditional GP-LVM to make it capable of handling supervised or semi-supervised learning tasks. For this purpose, we propose a new semi-supervised GP-LVM framework under pairwise constraints. By transferring the pairwise constraints in the observed space to the latent space, constrained prior information on the latent variables can be obtained. Under this constrained prior, the latent variables are optimized by the maximum a posteriori (MAP) algorithm. The effectiveness of the proposed algorithm is demonstrated with experiments on a variety of data sets. © 2010 Elsevier B.V.
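A minimal sketch of the constrained-prior idea, in assumed notation rather than the paper's own: the pairwise constraints become a log-prior over latent positions that is added to the GP-LVM log-likelihood before MAP optimization. The names constrained_log_prior and gp_log_likelihood are hypothetical, and a hinge penalty stands in for whatever bounded cannot-link term the paper uses.

```python
# Illustrative only: pairwise constraints as a log-prior over latent positions.
import numpy as np

def constrained_log_prior(X, must_link, cannot_link, lam=1.0, margin=1.0):
    """X: (n, q) latent positions; constraints are lists of index pairs."""
    logp = 0.0
    for i, j in must_link:            # must-link pairs should sit close together
        logp -= lam * np.sum((X[i] - X[j]) ** 2)
    for i, j in cannot_link:          # cannot-link pairs are penalized only while
        dist = np.linalg.norm(X[i] - X[j])  # they are closer than the margin
        logp -= lam * max(0.0, margin - dist) ** 2
    return logp

def log_posterior(X, gp_log_likelihood, must_link, cannot_link, lam=1.0):
    # MAP objective = GP-LVM data log-likelihood + constrained log-prior;
    # gp_log_likelihood is assumed to come from a separate GP-LVM implementation.
    return gp_log_likelihood(X) + constrained_log_prior(X, must_link, cannot_link, lam)
```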
Abstract:
In the finance literature, many economic theories and models have been proposed to explain and estimate the relationship between risk and return. Assuming risk averseness and rational behavior on the part of the investor, models are developed that are supposed to help in forming efficient portfolios, which either maximize the expected rate of return for a given level of risk or minimize risk for a given rate of return. One of the models most used to form these efficient portfolios is Sharpe's Capital Asset Pricing Model (CAPM). In the development of this model it is assumed that investors have homogeneous expectations about the future probability distribution of the rates of return; that is, every investor assumes the same values of the parameters of the probability distribution. Likewise, homogeneity of financial volatility is commonly assumed, where volatility is taken as investment risk and is usually measured by the variance of the rates of return. Typically the square root of the variance is used to define financial volatility; furthermore, it is often assumed that the data generating process is made of independent and identically distributed random variables, which again implies that financial volatility is measured from homogeneous time series with stationary parameters. In this dissertation, we investigate the assumptions of homogeneity of market agents and provide evidence of heterogeneity in market participants' information, objectives, and expectations about the parameters of the probability distribution of prices, as given by the differences in the empirical distributions corresponding to different time scales, which in this study are associated with different classes of investors. We also demonstrate that statistical properties of the underlying data generating processes, including the volatility in the rates of return, are quite heterogeneous. In other words, we provide empirical evidence against the traditional views about homogeneity using non-parametric wavelet analysis on trading data. The results show heterogeneity of financial volatility at different time scales, and time scale is one of the most important aspects in which trading behavior differs. In fact, we conclude that heterogeneity, as posited by the Heterogeneous Markets Hypothesis, is the norm and not the exception.
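As a reading aid, a minimal sketch (hypothetical data and PyWavelets, not the dissertation's code) of the scale-by-scale volatility measurement that this kind of wavelet evidence rests on:

```python
# Scale-by-scale volatility via a discrete wavelet transform on synthetic returns;
# heterogeneous variance across scales is the kind of pattern the study reports.
import numpy as np
import pywt

rng = np.random.default_rng(1)
returns = rng.standard_t(df=4, size=4096) * 0.01  # hypothetical high-frequency returns

coeffs = pywt.wavedec(returns, "db4", level=6)    # [approx, detail_6, ..., detail_1]
for level, d in enumerate(coeffs[1:][::-1], start=1):
    # Variance of level-j detail coefficients approximates the wavelet
    # variance at the j-th dyadic scale (finest scale first).
    print(f"scale level {level}: wavelet variance ≈ {np.var(d):.2e}")
```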
Abstract:
The problem of social diffusion has animated sociological thinking on topics ranging from the spread of an idea, an innovation, or a disease to the foundations of collective behavior and political polarization. While network diffusion has been a productive metaphor, the reality of diffusion processes is often muddier. Ideas and innovations diffuse differently from diseases, but, with a few exceptions, the diffusion of ideas and innovations has been modeled under the same assumptions as the diffusion of disease. In this dissertation, I develop two new diffusion models for "socially meaningful" contagions that address two of the most significant problems with current diffusion models: (1) that contagions can only spread along observed ties, and (2) that contagions do not change as they spread between people. I augment insights from these statistical and simulation models with an analysis of an empirical case of diffusion: the use of enterprise collaboration software in a large technology company. I focus the empirical study on when people abandon innovations, a crucial and understudied aspect of the diffusion of innovations. Using timestamped posts, I analyze in close detail when people abandon the software.
To address the first problem, I suggest a latent space diffusion model. Rather than treating ties as stable conduits for information, the latent space diffusion model treats ties as random draws from an underlying social space, and simulates diffusion over the social space. Theoretically, the social space model integrates both actor ties and attributes simultaneously in a single social plane, while incorporating schemas into diffusion processes gives an explicit form to the reciprocal influences that cognition and social environment have on each other. Practically, the latent space diffusion model produces statistically consistent diffusion estimates where using the network alone does not, and the diffusion with schemas model shows that introducing some cognitive processing into diffusion processes changes the rate and ultimate distribution of the spreading information. To address the second problem, I suggest a diffusion model with schemas. Rather than treating information as though it is spread without changes, the schema diffusion model allows people to modify information they receive to fit an underlying mental model of the information before they pass the information to others. Combining the latent space models with a schema notion for actors improves our models for social diffusion both theoretically and practically.
The empirical case study focuses on how the changing value of an innovation, introduced by the innovation's network externalities, influences when people abandon it. In it, I find that people are least likely to abandon an innovation when other people in their neighborhood currently use the software as well. The effect is particularly pronounced for supervisors' current use and for the number of supervisory team members who currently use the software. This case study not only points to an important process in the diffusion of innovation, but also suggests a new approach -- computerized collaboration systems -- to collecting and analyzing data on organizational processes.
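A minimal simulation sketch of the latent-space idea described above, under assumptions of my own (Gaussian latent positions, exponential distance decay); the dissertation's actual model is richer:

```python
# Latent-space diffusion: ties are random draws from an underlying social space,
# and contagion spreads over proximity in that space rather than a fixed network.
import numpy as np

rng = np.random.default_rng(2)
n, dim = 200, 2
positions = rng.normal(size=(n, dim))        # latent social-space coordinates

def tie_prob(i, j, scale=1.0):
    # Closer actors are more likely to interact on any given step
    return np.exp(-scale * np.linalg.norm(positions[i] - positions[j]))

infected = np.zeros(n, dtype=bool)
infected[0] = True                           # a single initial adopter
for _ in range(50):                          # simulate 50 diffusion steps
    for i in np.flatnonzero(infected):
        j = rng.integers(n)                  # candidate contact, redrawn each step
        if rng.random() < tie_prob(i, j):
            infected[j] = True
print(f"{infected.sum()} of {n} actors reached")
```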
Abstract:
When studying a biological regulatory network, it is usual to use boolean network models. In these models, boolean variables represent the behavior of each component of the biological system. Taking into account that the size of these state transition models grows exponentially with the number of components considered, it becomes important to have tools to minimize such models. In this paper, we relate bisimulations, which are relations used in the study of automata (general state transition models), with attractors, which are an important feature of biological boolean models. Hence, we support the idea that bisimulations can be important tools in the study of some of the main features of boolean network models. We also discuss the differences between using this approach and other well-known methodologies for studying this kind of system, and we illustrate it with some examples.
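A minimal sketch of the objects being related, on a toy three-component network of my own invention (not one of the paper's examples): a boolean network's state transition map, and its attractors found by following each trajectory until a state repeats.

```python
# Toy boolean network and attractor detection by trajectory following.
from itertools import product

def step(state):
    # Update rules for a hypothetical 3-component network (a, b, c)
    a, b, c = state
    return (b and c, not a, a or b)

def attractor_from(state):
    seen = []
    while state not in seen:                 # iterate until a state repeats
        seen.append(state)
        state = step(state)
    return tuple(seen[seen.index(state):])   # the cycle reached = an attractor

def canonical(cycle):
    # Rotate so the cycle starts at its smallest state, deduplicating rotations
    k = cycle.index(min(cycle))
    return cycle[k:] + cycle[:k]

attractors = {canonical(attractor_from(s)) for s in product((False, True), repeat=3)}
for att in sorted(attractors):
    print(att)
```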
Abstract:
Master's dissertation in Economic and Business Sciences, Unidade de Ciências Económicas e Empresariais, Univ. do Algarve, 1996
Abstract:
In this study, we examine the relationship between good corporate governance practices and the creation of value/performance of credit unions from 2010 to 2012. The objective was to create and validate a corporate governance index for credit unions, and then to analyse the relationship between good governance practices and the creation of value/performance. The research question is: do good corporate governance practices provide value creation for credit unions? The research started by creating indices from factor analysis to identify latent dependent variables related to value creation and performance; next, indices were created from principal component analysis to construct independent latent variables related to corporate governance. Finally, using panel-data regression models, the influence of the variables and indices related to corporate governance on the indices of value creation and performance was verified. Based on the research, it became evident that the Corporate Governance Index (IGC) is mainly impacted by Executive Management, accounting for 40.31% of the IGC value, followed by the Representation and Participation dimension, with 34.07% of the IGC value. The contribution for academics was the creation of the Corporate Governance Index (IGC) applied to credit unions. As for the contribution to the credit union system, the highlight was the effectiveness of the mechanisms for economic-financial and asset management adopted by BACEN, credit unions and OCEMG.
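A minimal sketch (hypothetical indicators and scikit-learn, not the study's data or exact procedure) of building an index from principal components along the lines the methodology describes:

```python
# Index construction from the first principal component of standardized indicators.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
# Hypothetical governance indicators for 100 credit unions
# (e.g., board size, disclosure score, member participation, audit quality)
indicators = rng.normal(size=(100, 4))

z = StandardScaler().fit_transform(indicators)   # standardize before PCA
pca = PCA(n_components=1)
scores = pca.fit_transform(z).ravel()
index = (scores - scores.min()) / (scores.max() - scores.min())  # rescale to [0, 1]
print("explained variance share:", pca.explained_variance_ratio_[0])
```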
Abstract:
This article presents an analysis of the capital structure decisions of the company Merck Sharp & Dome S.A.S from the perspective of behavioral finance, comparing the methods currently used by the selected company with traditional finance theory in order to evaluate theoretical and actual performance. Incorporating behavioral elements into the study makes it possible to delve deeper into corporate decisions in a context closer to current research advances in behavioral finance, which leads the analysis in this article to focus on the identification and understanding of the overconfidence and status quo biases, and above all on their implications for financing decisions. According to traditional theory, the capital structuring process is guided by costs, but this case study showed that in practice the cost-decision relationship comes second, after the risk-decision relationship, in the capital structuring process.
Abstract:
This paper studies a portfolio choice problem in which the pricing rule may incorporate transaction costs and the risk measure is coherent and expectation bounded. We prove the necessity of dealing with pricing rules such that there exists an essentially bounded stochastic discount factor, which must also be bounded from below by a strictly positive value. Otherwise, good deals will be available to traders, i.e., depending on the selected risk measure, investors can build portfolios whose (risk, return) pairs are as close as desired to (−∞, +∞) or (0, +∞). This pathological property still holds for vector risk measures (i.e., if we minimize a vector-valued function whose components are risk measures). It is worth pointing out that essentially bounded stochastic discount factors are not usual in the financial literature. In particular, the most famous frictionless, complete and arbitrage-free pricing models imply the existence of good deals for every coherent and expectation-bounded (scalar or vector) measure of risk, and the incorporation of transaction costs does not remedy this problem.
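In standard notation (assumed here, not copied from the paper), the boundedness requirement on the stochastic discount factor and the good-deal pathology can be written as:

```latex
% Linear pricing rule with stochastic discount factor m; notation assumed.
\[
  \pi(X) = \mathbb{E}[\,m X\,],
  \qquad 0 < \operatorname{ess\,inf} m
  \quad\text{and}\quad \operatorname{ess\,sup} m < \infty ;
\]
otherwise, for a coherent and expectation-bounded risk measure $\rho$, there exist
portfolio sequences $X_k$ with $(\rho(X_k), \mathbb{E}[X_k]) \to (-\infty, +\infty)$.
```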
Abstract:
Objective: The purpose of this study was to investigate effects of different manual techniques on cervical ranges of motion and pressure pain sensitivity in subjects with latent trigger point of the upper trapezius muscle. Methods: One hundred seventeen volunteers, with a unilateral latent trigger point on upper trapezius due to computer work, were randomly divided into 5 groups: ischemic compression (IC) group (n = 24); passive stretching group (n = 23); muscle energy technique group (n = 23); and 2 control groups, wait-and-see group (n = 25) and placebo group (n = 22). Cervical spine range of movement was measured using a cervical range of motion instrument as well as pressure pain sensitivity by means of an algometer and a visual analog scale. Outcomes were assessed pretreatment, immediately, and 24 hours after the intervention and 1 week later by a blind researcher. A 4 × 5 mixed repeated-measures analysis of variance was used to examine the effects of the intervention and Cohen d coefficient was used. Results: A group-by-time interaction was detected in all variables (P < .01), except contralateral rotation. The immediate effect sizes of the contralateral flexion, ipsilateral rotation, and pressure pain threshold were large for 3 experimental groups. Nevertheless, after 24 hours and 1 week, only IC group maintained the effect size. Conclusions: Manual techniques on upper trapezius with latent trigger point seemed to improve the cervical range of motion and the pressure pain sensitivity. These effects persist after 1 week in the IC group. (J Manipulative Physiol Ther 2013;xx:1-10)
Abstract:
The basic motivation of this work was the integration of biophysical models within the interval constraints framework for decision support. Comparing the major features of biophysical models with the expressive power of the existing interval constraints framework, it was clear that the most important inadequacy was related to the representation of differential equations. System dynamics is often modelled through differential equations, but there was no way of expressing a differential equation as a constraint and integrating it within the constraints framework. Consequently, the goal of this work is focused on the integration of ordinary differential equations within the interval constraints framework, which for this purpose is extended with the new formalism of Constraint Satisfaction Differential Problems. Such a framework allows the specification of ordinary differential equations, together with related information, by means of constraints, and provides efficient propagation techniques for pruning the domains of their variables. This enables the integration of all such information in a single constraint whose variables may subsequently be used in other constraints of the model. The specific method used for pruning its variable domains can then be combined with the pruning methods associated with the other constraints in an overall propagation algorithm for reducing the bounds of all model variables. The application of the constraint propagation algorithm for pruning the variable domains, that is, the enforcement of local consistency, turned out to be insufficient to support decisions in practical problems that include differential equations: the domain pruning achieved is not, in general, sufficient to allow safe decisions, mainly because of the non-linearity of the differential equations. Consequently, as a complementary goal, this work proposes a new strong consistency criterion, Global Hull-consistency, particularly suited to decision support with differential models, which presents an adequate trade-off between domain pruning and computational effort. Several alternative algorithms are proposed for enforcing Global Hull-consistency and, given their complexity, an effort was made to provide implementations able to supply anytime pruning results. Since the consistency criterion depends on the existence of canonical solutions, a local search approach is proposed that can be integrated with constraint propagation in continuous domains and, in particular, with the enforcing algorithms, to anticipate the finding of canonical solutions. The last goal of this work is the validation of the approach as an important contribution to the integration of biophysical models within decision support. Consequently, a prototype application integrating all the proposed extensions to the interval constraints framework is developed and used for solving problems in different biophysical domains.
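A heavily simplified sketch of the underlying idea, far short of the thesis's Constraint Satisfaction Differential Problems formalism: interval arithmetic can enclose an ODE trajectory, and such enclosures are what let a differential equation act as a domain-pruning constraint. The Interval class and Euler enclosure below are illustrative only, with no rigorous remainder term.

```python
# Interval Euler enclosure of an ODE trajectory, as a toy domain-pruning device.
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float
    def __add__(self, o): return Interval(self.lo + o.lo, self.hi + o.hi)
    def scale(self, k):   # multiply by a non-negative scalar
        return Interval(self.lo * k, self.hi * k)

def enclose(y0: Interval, f, t_end: float, steps: int = 1000) -> Interval:
    """Interval Euler enclosure for y' = f(y); f must map intervals to intervals."""
    h = t_end / steps
    y = y0
    for _ in range(steps):
        y = y + f(y).scale(h)   # one interval Euler step (a sketch, no error bound)
    return y

# y' = -y with y(0) in [0.9, 1.1]; note that negating flips the endpoints.
box = enclose(Interval(0.9, 1.1), lambda y: Interval(-y.hi, -y.lo), t_end=1.0)
print(f"y(1) is enclosed in [{box.lo:.3f}, {box.hi:.3f}]")
```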
Abstract:
Prepared for presentation at the Portuguese Finance Network International Conference 2014, Vilamoura, Portugal, June 18-20
Abstract:
"It is a widely accepted fact that the consumption-based capital asset pricing model (CCAPM) fails to provide a good explanation of many important features of the behaviour of financial market returns in a large range of countries over a long period of time. However, within a representative consumer/investor model, it is hard to see how the basic structure of the consumption based model can be safely abandoned." [introdução]