100 results for Future Value


Relevance:

20.00%

Publisher:

Abstract:

This thesis focuses on theoretical asset pricing models and their empirical applications. I aim to investigate the following noteworthy problems: i) whether the relationship between asset prices and investors' propensities to gamble and to fear disaster is time varying, ii) whether the conflicting evidence on firm- and market-level skewness can be explained by downside risk, iii) whether costly learning drives liquidity risk. Empirical tests support the above assumptions and provide novel findings in asset pricing, investment decisions, and firms' funding liquidity. The first chapter considers a partial equilibrium model where investors have heterogeneous propensities to gamble and to fear disaster. Skewness preference represents the desire to gamble, while kurtosis aversion represents fear of extreme returns. Using US data from 1988 to 2012, my model demonstrates that in bad times, risk aversion is higher, more people fear disaster, and fewer people gamble than in good times. This leads to a new empirical finding: gambling preference has a greater impact on asset prices during market downturns than during booms. The second chapter consists of two essays. The first essay introduces a formula based on the conditional CAPM for decomposing market skewness. We find that major upward and downward market movements can be well predicted by the asymmetric comovement of betas, which is characterized by an indicator called "Systematic Downside Risk" (SDR). We find that SDR can effectively forecast future stock market movements, and we obtain out-of-sample R-squareds (compared with a strategy using the historical mean) of more than 2.27% with monthly data. The second essay reconciles a well-known empirical fact: aggregating positively skewed firm returns leads to a negatively skewed market return. We reconcile this fact through firms' greater response to negative market news than to positive market news.
We also propose several market return predictors, such as downside idiosyncratic skewness. The third chapter studies funding liquidity risk in a general equilibrium model featuring two agents: an entrepreneur and an external investor. Only the investor needs to acquire information to estimate the unobservable fundamentals driving economic output. The novelty is that information acquisition is more costly in bad times than in good times, i.e. a counter-cyclical information cost, as supported by previous empirical evidence. We then show that liquidity risks are principally driven by costly learning.

Résumé (translated from French): This thesis presents theoretical asset pricing models and their empirical applications. My objective is to study the following problems: whether the relationship between asset prices and investors' propensities to gamble and to fear disaster varies over time; whether the conflicting evidence on firm- and market-level skewness can be explained by downside risk; and whether costly learning drives liquidity risk. Empirical tests support these hypotheses and provide new findings on asset pricing, investment decisions, and firms' funding liquidity. The first chapter examines an equilibrium model in which investors have heterogeneous propensities to gamble and to fear disaster. Skewness preference represents the desire to gamble, while kurtosis aversion represents the fear of disaster. Using US data from 1988 to 2012, my model shows that in bad times risk aversion is higher, more people fear disaster, and fewer people gamble than in good times. This leads to a new empirical finding: gambling preference has a greater impact on asset prices during market downturns than during booms. Exploiting this relationship alone generates an annual excess return of 7.74% that is not explained by popular factor models. The second chapter consists of two essays. The first essay introduces a formula based on the conditional CAPM for decomposing market skewness. We find that major upward and downward market movements can be predicted by the comovement of betas; an indicator called Systematic Downside Risk (SDR) is constructed to characterize the asymmetry in these comovements. We find that SDR forecasts future stock market movements effectively, and we obtain out-of-sample R-squareds (compared with a strategy using the historical mean) of more than 2.272% with monthly data. An investor timing the market with SDR would have obtained a sizeable ratio gain of 0.206. The second essay reconciles a well-known empirical fact about firm- and market-level skewness: aggregating positively skewed firm returns leads to a negatively skewed market return. We decompose market-return skewness at the firm level and reconcile this fact through firms' greater response to negative market news than to positive market news. This decomposition reveals several effective market-return predictors, such as volatility-weighted idiosyncratic skewness and downside idiosyncratic skewness. The third chapter provides a new theoretical foundation for time-varying liquidity problems in an incomplete-market environment. We propose a general equilibrium model with two agents: an entrepreneur and an external investor. Only the investor needs to learn the firm's true state, so payoff-relevant information is costly. The novelty is that information acquisition costs more in bad times than in good times, as supported by previous empirical evidence. When a recession begins, costly learning raises liquidity premia, producing liquidity dry-ups, as also supported by previous empirical evidence.
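The out-of-sample R-squared quoted in this abstract measures forecast accuracy relative to a historical-mean benchmark. A minimal sketch of that statistic, using illustrative numbers rather than the thesis data (all names are mine, not the author's):

```python
import numpy as np

def out_of_sample_r2(actual, forecast, benchmark):
    """Out-of-sample R^2: 1 - SSE(model forecast) / SSE(benchmark forecast).
    Positive values mean the model beats the benchmark (here, the
    historical mean) out of sample."""
    actual, forecast, benchmark = map(np.asarray, (actual, forecast, benchmark))
    sse_model = np.sum((actual - forecast) ** 2)
    sse_bench = np.sum((actual - benchmark) ** 2)
    return 1.0 - sse_model / sse_bench

# Illustrative monthly returns (not the data used in the thesis)
actual = np.array([0.010, -0.020, 0.030, 0.005, -0.010])
forecast = np.array([0.008, -0.015, 0.025, 0.004, -0.008])
benchmark = np.full_like(actual, actual.mean())  # historical-mean forecast

r2 = out_of_sample_r2(actual, forecast, benchmark)
```

A value of, say, 0.0227 corresponds to the "more than 2.27%" figure reported above.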

Relevance:

20.00%

Publisher:

Abstract:

Attrition in longitudinal studies can lead to biased results. The study is motivated by the unexpected observation that alcohol consumption decreased despite increased availability, which may be due to sample attrition of heavy drinkers. Several imputation methods have been proposed, but they have rarely been compared in longitudinal studies of alcohol consumption. Imputing consumption-level measurements is computationally challenging because alcohol consumption is a semi-continuous variable (a dichotomous drinking status and a continuous volume among drinkers) and the data in the continuous part are non-normal. Data come from a longitudinal study in Denmark with four waves (2003-2006) and 1771 individuals at baseline. Five techniques for handling missing data are compared: last value carried forward (LVCF) as a single imputation method, and hot-deck, Heckman modelling, multivariate imputation by chained equations (MICE), and a Bayesian approach as multiple imputation methods. Predictive mean matching was used to account for non-normality: instead of imputing regression estimates, "real" observed values from similar cases are imputed. The methods were also compared on a simulated dataset. The simulation showed that the Bayesian approach yielded the least biased imputation estimates. The finding of no increase in consumption levels despite higher availability remained unaltered. Copyright (C) 2011 John Wiley & Sons, Ltd.
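Predictive mean matching, as used in this study, borrows an observed value from a donor whose predicted mean is close to that of the incomplete case, so imputations always stay on the support of the observed data (useful for non-normal, semi-continuous outcomes). A minimal sketch with a simple linear predictor; the function and variable names are illustrative, not those of the study:

```python
import numpy as np

def pmm_impute(x_obs, y_obs, x_mis, k=3, rng=None):
    """Predictive mean matching: fit y ~ x on complete cases, then for each
    missing case draw an observed y from the k donors whose predicted means
    are closest, so every imputed value is a 'real' observed value."""
    rng = np.random.default_rng(rng)
    # Fit a simple linear regression on the complete cases
    slope, intercept = np.polyfit(x_obs, y_obs, 1)
    pred_obs = intercept + slope * np.asarray(x_obs)
    pred_mis = intercept + slope * np.asarray(x_mis)
    imputed = []
    for p in pred_mis:
        donors = np.argsort(np.abs(pred_obs - p))[:k]  # k nearest predicted means
        imputed.append(y_obs[rng.choice(donors)])      # borrow an observed value
    return np.array(imputed)

# Illustrative complete and incomplete cases (not the Danish study data)
x_obs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y_obs = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
imputed = pmm_impute(x_obs, y_obs, x_mis=np.array([2.5, 4.5]), k=2, rng=0)
```

Because donors contribute observed values rather than regression predictions, the non-normal shape of the consumption distribution is preserved.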

Relevance:

20.00%

Publisher:

Abstract:

Current levels of endangerment and historical trends of species and habitats are the main criteria used to direct conservation efforts globally. Estimates of future declines, which might indicate different priorities than past declines, have been limited by the lack of appropriate data and models. Given that much of conservation is about anticipating and responding to future threats, our inability to look forward at a global scale has been a major constraint on effective action. Here, we assess the geography and extent of projected future changes in suitable habitat for terrestrial mammals within their present ranges. We used a global earth-system model, IMAGE, coupled with fine-scale habitat suitability models and parametrized according to four global scenarios of human development. We identified the countries most affected by 2050 under each scenario, assuming that no conservation actions beyond those described in the scenarios take place. We found that, with some exceptions, most of the countries with the largest predicted losses of suitable habitat for mammals are in Africa and the Americas. African and North American countries were also predicted to host the most species with large proportional global declines. Most of the countries we identified as future hotspots of terrestrial mammal loss have little or no overlap with the present global conservation priorities, thus confirming the need for forward-looking analyses in conservation priority setting. The expected growth in human populations and consumption in hotspots of future mammal loss means that local conservation actions such as protected areas might not be sufficient to mitigate losses. Other policies, directed towards the root causes of biodiversity loss, are required, both in Africa and other parts of the world.

Relevance:

20.00%

Publisher:

Abstract:

X-ray microtomography has become a new tool in the earth sciences for obtaining non-destructive 3D image data from geological objects in which variations in mineralogy, chemical composition and/or porosity create sufficient x-ray density contrasts. We present here first, preliminary results of an application to the external and internal morphology of Permian to Recent Larger Foraminifera. We use a SkyScan-1072 high-resolution desktop micro-CT system. The system has a conical x-ray source with a spot size of about 5 µm that runs at 20-100 kV and 0-250 µA, resulting in a maximal resolution of 5 µm. X-ray transmission images are captured by a scintillator coupled via fibre optics to a 1024x1024 pixel 12-bit CCD. The object is placed between the x-ray source and the scintillator on a stub that rotates 360° around its vertical axis in steps as small as 0.24°. Sample size is limited to 2 cm because of the x-ray absorption of geologic material. The transmission images are back-projected using a Feldkamp algorithm into a vertical stack of up to 1000 1Kx1K images that represent horizontal cuts of the object. This calculation takes two to several hours on a double-processor 2.4 GHz PC. The stack of images (.bmp) can be visualized with any 3D-imaging software and used to produce cuts of Larger Foraminifera. Among other applications, the 3D-imaging software furnished by SkyScan can produce 3D models by defining a threshold density value to distinguish "solid" from "void". Several models with variable threshold values and colors can be imbricated, rotated and cut together. The best results were obtained with microfossils devoid of chamber-filling cements (Permian, Eocene, Recent). However, even slight differences in cement mineralogy/composition can result in surprisingly good x-ray density contrasts. X-ray microtomography may develop into a powerful tool for larger microfossils with a complex internal structure, because it is non-destructive, requires no preparation of the specimens, and produces a true 3D image data set. We will use these data sets in the future to produce cuts in any direction and compare them with arbitrary cuts of complex microfossils in thin sections. Many groups of benthic and planktonic foraminifera may become easier to determine in thin section in this way.
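The "solid vs. void" models described above reduce, at their core, to thresholding the reconstructed voxel densities. A minimal sketch of that step on an illustrative array (this is not SkyScan's actual software, just the underlying idea):

```python
import numpy as np

def solid_mask(stack, threshold):
    """Binarize a reconstructed tomogram: voxels whose x-ray density is at
    or above the threshold are classified as solid, the rest as void."""
    return np.asarray(stack) >= threshold

def solid_fraction(stack, threshold):
    """Fraction of the volume classified as solid at a given threshold."""
    return solid_mask(stack, threshold).mean()

# Illustrative 8-bit volume; a real SkyScan stack is ~1000 slices of 1024x1024
rng = np.random.default_rng(0)
stack = rng.integers(0, 256, size=(4, 8, 8))
```

Raising the threshold can only shrink the solid phase, which is why imbricating several models at different thresholds, as described above, is informative.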

Relevance:

20.00%

Publisher:

Abstract:

Analyzing the relationship between the baseline value and subsequent change of a continuous variable is a frequent matter of inquiry in cohort studies. These analyses are surprisingly complex, particularly if only two waves of data are available. It is unclear to non-biostatisticians where the complexity of this analysis lies and which statistical method is adequate. With the help of simulated longitudinal data on body mass index in children, we review statistical methods for analyzing the association between the baseline value and subsequent change, assuming linear growth with time. Key issues in such analyses are mathematical coupling, measurement error, variability of change between individuals, and regression to the mean. Ideally, one should rely on multiple repeated measurements at different times; a linear random-effects model is a standard approach if more than two waves of data are available. If only two waves are available, our simulations show that Blomqvist's method - which consists in adjusting the estimated regression coefficient of observed change on baseline value for the measurement error variance - provides accurate estimates. The adequacy of methods to assess the relationship between the baseline value and subsequent change depends on the number of data waves, the availability of information on measurement error, and the variability of change between individuals.
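The attenuation being corrected here can be written out: with observed baseline X = x + e (independent error of variance σ²ₑ) and reliability λ = Var(x)/Var(X), the observed slope b of change on baseline satisfies b = βλ − (1 − λ), so β = (b + 1 − λ)/λ. A hedged sketch of that correction; this is the generic measurement-error adjustment implied by the abstract, and the exact estimator in Blomqvist's paper may differ in details such as how the variances are estimated:

```python
import numpy as np

def corrected_change_slope(baseline, change, error_var):
    """Measurement-error correction for the slope of change on baseline.
    Assumes observed baseline = true baseline + independent error with
    variance error_var. With reliability lam = true_var / observed_var,
    the observed slope b relates to the true slope beta via
    b = beta * lam - (1 - lam), hence beta = (b + 1 - lam) / lam."""
    baseline = np.asarray(baseline, dtype=float)
    change = np.asarray(change, dtype=float)
    obs_var = np.var(baseline, ddof=1)
    lam = (obs_var - error_var) / obs_var          # reliability ratio
    b_obs = np.polyfit(baseline, change, 1)[0]     # observed (attenuated) slope
    return (b_obs + 1.0 - lam) / lam

# Illustrative two-wave data (not the simulated BMI data of the study)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
d = np.array([0.5, 0.2, 0.9, 0.1, 0.7, 0.3])
beta_hat = corrected_change_slope(x, d, error_var=0.0)
```

With error_var = 0 the reliability is 1 and the correction leaves the observed slope unchanged, which is a useful sanity check.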

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: To review the mechanisms underlying the metabolic syndrome, or syndrome X, in humans, and to delineate dietary and environmental strategies for its prevention. DESIGN: Review of selected papers from the literature. RESULTS: Hyperinsulinemia and insulin resistance play a key role in the development of the metabolic syndrome. Strategies aimed at reducing insulin resistance may be effective in improving the metabolic syndrome. These include low saturated fat intake, consumption of low-glycemic-index foods, physical exercise, and prevention of obesity. CONCLUSIONS: Future research, in particular on the genetic basis of the metabolic syndrome and the interorgan interactions responsible for insulin resistance, is needed to improve therapeutic strategies for the metabolic syndrome.

Relevance:

20.00%

Publisher:

Abstract:

A noticeable increase in mean temperature has already been observed in Switzerland, and summer temperatures up to 4.8 K warmer are expected by 2090. This article reviews the observed impacts of climate change on biodiversity and considers some perspectives for the future at the national level. The following impacts are already evident for all considered taxonomic groups: elevation shifts of distribution toward mountain summits, spread of thermophilous species, colonisation by new species from warmer areas, and phenological shifts. Additionally, in the driest areas, increasing droughts are affecting tree survival, and fish species are suffering from warm temperatures in lowland regions. These observations are coherent with model projections, and future changes will probably follow the current trends. These changes will likely cause extinctions for alpine species (competition, loss of habitat) and lowland species (temperature or drought stress). In the highly urbanised Swiss landscape, the strong fragmentation of natural ecosystems will hinder the dispersal of many species towards mountains. Moreover, disruptions in species interactions caused by differing migration rates or phenological shifts are likely to have consequences for biodiversity. Conversely, the inertia of ecosystems (species longevity, restricted dispersal) and the local persistence of populations will probably result in lower extinction rates than expected under some models, at least in the 21st century. It is thus very difficult to estimate the impact of climate change in terms of species extinctions. A greater recognition by society of the intrinsic value of biodiversity and of its importance for our existence will be essential to put effective mitigation measures in place and to safeguard a maximum number of native species.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND AND PURPOSE: The posterior circulation Acute Stroke Prognosis Early CT Score (pc-ASPECTS) quantifies the extent of early ischemic changes in the posterior circulation with a 10-point grading system. We hypothesized that pc-ASPECTS applied to CT angiography source images predicts functional outcome of patients in the Basilar Artery International Cooperation Study (BASICS). METHODS: BASICS was a prospective, observational registry of consecutive patients with acute symptomatic basilar artery occlusion. Functional outcome was assessed at 1 month. We applied pc-ASPECTS to the CT angiography source images of patients with CT angiography for confirmation of basilar artery occlusion. We calculated unadjusted and adjusted risk ratios (RRs) for pc-ASPECTS dichotomized at ≥8 versus <8. The primary outcome measure was favorable outcome (modified Rankin Scale scores 0-3). Secondary outcome measures were mortality and functional independence (modified Rankin Scale scores 0-2). RESULTS: Of 158 patients included, 78 had a pc-ASPECTS≥8 on CT angiography source images. Patients with a pc-ASPECTS≥8 more often had a favorable outcome than patients with a pc-ASPECTS<8 (crude RR, 1.7; 95% CI, 0.98-3.0). After adjustment for age, baseline National Institutes of Health Stroke Scale score, and thrombolysis, pc-ASPECTS≥8 was not related to favorable outcome (RR, 1.3; 95% CI, 0.8-2.2), but it was related to reduced mortality (RR, 0.7; 95% CI, 0.5-0.98) and functional independence (RR, 2.0; 95% CI, 1.1-3.8). In post hoc analysis, pc-ASPECTS dichotomized at ≥6 versus <6 predicted a favorable outcome (adjusted RR, 3.1; 95% CI, 1.2-7.5). CONCLUSIONS: pc-ASPECTS on CT angiography source images independently predicted death and functional independence at 1 month in the CT angiography subgroup of patients in the BASICS registry.
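The crude risk ratios quoted above follow from a 2×2 table of outcome by dichotomized pc-ASPECTS. A minimal sketch with the standard log-scale (Katz) confidence interval; the counts below are purely illustrative, since the abstract does not report the cell counts:

```python
import math

def risk_ratio_ci(a, n1, c, n2, z=1.96):
    """Unadjusted risk ratio (a/n1) / (c/n2) for exposed vs. unexposed
    groups, with a Katz log-scale confidence interval."""
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Illustrative counts only (not the BASICS data):
# 40 favorable outcomes among 78 patients with pc-ASPECTS>=8,
# 20 among 80 patients with pc-ASPECTS<8
rr, lo, hi = risk_ratio_ci(40, 78, 20, 80)
```

An interval whose lower bound falls below 1, as with the crude RR of 1.7 (95% CI, 0.98-3.0) above, is not statistically significant at the 5% level.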