869 results for Risk model


Relevance:

30.00%

Publisher:

Abstract:

We use CEX repeated cross-section data on consumption and income to evaluate the nature of increased income inequality in the 1980s and 90s. We decompose unexpected changes in family income into transitory and permanent, and idiosyncratic and aggregate components, and estimate the contribution of each component to total inequality. The model we use is a linearized incomplete markets model, enriched to incorporate risk-sharing while maintaining tractability. Our estimates suggest that taking risk sharing into account is important for the model fit; that the increase in inequality in the 1980s was mainly permanent; and that inequality is driven almost entirely by idiosyncratic income risk. In addition, we find no evidence for cyclical behavior of consumption risk, casting doubt on Constantinides and Duffie's (1995) explanation for the equity premium puzzle.
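For concreteness, a standard permanent/transitory specification consistent with this kind of decomposition (the notation here is illustrative, not necessarily the authors' exact model) is

\[ y_{it} = p_{it} + \varepsilon_{it}, \qquad p_{it} = p_{i,t-1} + \eta_{it}, \qquad \eta_{it} = \eta_t + \tilde{\eta}_{it}, \quad \varepsilon_{it} = \varepsilon_t + \tilde{\varepsilon}_{it}, \]

where $\eta$ and $\varepsilon$ are the permanent and transitory shocks, each split into an aggregate component ($\eta_t$, $\varepsilon_t$) and an idiosyncratic component ($\tilde{\eta}_{it}$, $\tilde{\varepsilon}_{it}$); total income inequality then decomposes into the variances of these four pieces.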

Relevance:

30.00%

Publisher:

Abstract:

Was the increase in income inequality in the US due to permanent shocks or merely to an increase in the variance of transitory shocks? The implications for consumption and welfare depend crucially on the answer to this question. We use CEX repeated cross-section data on consumption and income to decompose idiosyncratic changes in income into predictable life-cycle changes, transitory shocks and permanent shocks, and estimate the contribution of each to total inequality. Our model fits the joint evolution of consumption and income inequality well and delivers two main results. First, we find that permanent changes in income explain all of the increase in inequality in the 1980s and 90s. Second, we reconcile this finding with the fact that consumption inequality did not increase much over this period. Our results support the view that many permanent changes in income are predictable for consumers, even if they look unpredictable to the econometrician, consistent with models of heterogeneous income profiles.
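The closing remark about heterogeneous income profiles refers to specifications in which part of what the econometrician labels a permanent shock is in fact a predictable individual-specific trend; one common way to write this (again illustrative, not the authors' exact model) is

\[ y_{it} = Z_{it}'\beta + \alpha_i + \gamma_i t + p_{it} + \varepsilon_{it}, \qquad p_{it} = p_{i,t-1} + \eta_{it}, \]

where $Z_{it}'\beta$ captures predictable life-cycle variation, $\alpha_i + \gamma_i t$ is a household-specific profile known to the consumer, and $\eta_{it}$, $\varepsilon_{it}$ are the permanent and transitory shocks.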

Relevance:

30.00%

Publisher:

Abstract:

Background: Evidence for a better performance of different highly atherogenic versus traditional lipid parameters for coronary heart disease (CHD) risk prediction is conflicting. We investigated the association between the ratios of small dense low-density lipoprotein (LDL)/apolipoprotein A-I, apolipoprotein B/apolipoprotein A-I and total cholesterol/HDL-cholesterol and CHD events in patients on combination antiretroviral therapy (cART).

Methods: Case-control study nested within the Swiss HIV Cohort Study: for each cART-treated patient with a first coronary event between April 1, 2000 and July 31, 2008 (case) we selected four control patients who (1) were without coronary events until the date of the event of the index case, (2) had a plasma sample within ±30 days of the sample date of the respective case, (3) received cART and (4) were matched for age, gender and smoking status. Lipoproteins were measured by ultracentrifugation. Conditional logistic regression models were used to estimate the independent associations between the different lipid ratios and the occurrence of coronary events.

Results: In total, 98 cases (19 fatal myocardial infarctions [MI] and 79 non-fatal coronary events [53 definite MIs, 15 possible MIs and 11 coronary angioplasties or bypasses]) were matched with 392 controls. Cases were more often injecting drug users, less likely to be virologically suppressed and more often on abacavir-containing regimens. In separate multivariable models adjusting for total cholesterol, triglycerides, HDL-cholesterol, systolic blood pressure, abdominal obesity, diabetes and family history of CHD, small dense-LDL and apolipoprotein B were each statistically significantly associated with CHD events (for 1 mg/dl increase: odds ratio [OR] 1.05, 95% CI 1.00-1.11 and 1.15, 95% CI 1.01-1.31, respectively), but the ratios of small dense-LDL/apolipoprotein A-I (OR 1.26, 95% CI 0.95-1.67), apolipoprotein B/apolipoprotein A-I (OR 1.02, 95% CI 0.97-1.07) and HDL-cholesterol/total cholesterol (OR 0.99, 95% CI 0.98-1.00) were not. Following adjustment for HIV-related and cART variables these associations were weakened in each model: apolipoprotein B (OR 1.27, 95% CI 1.00-1.30), small dense-LDL (OR 1.04, 95% CI 0.99-1.20), small dense-LDL/apolipoprotein A-I (OR 1.17, 95% CI 0.87-1.58), apolipoprotein B/apolipoprotein A-I (OR 1.02, 95% CI 0.97-1.07) and total cholesterol/HDL-cholesterol (OR 0.99, 95% CI 0.99-1.00).

Conclusions: In patients receiving cART, small dense-LDL and apolipoprotein B showed the strongest associations with CHD events in models controlling for traditional CHD risk factors including total cholesterol and triglycerides. Adding the small dense-LDL/apolipoprotein A-I, apolipoprotein B/apolipoprotein A-I and total cholesterol/HDL-cholesterol ratios did not further improve models of lipid parameters and associations of increased risk for CHD events.
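For reference, in a 1:4 matched design of this kind conditional logistic regression maximizes, over matched sets $s$ (one case plus four controls), the conditional likelihood

\[ L(\beta) = \prod_{s} \frac{\exp(x_{s,\mathrm{case}}'\beta)}{\sum_{j \in s} \exp(x_{sj}'\beta)}, \]

so that $\exp(\beta_k)$ is the odds ratio per one-unit increase in covariate $k$; for example, the reported OR of 1.05 per 1 mg/dl of small dense-LDL compounds to roughly $1.05^{10} \approx 1.63$ per 10 mg/dl. This is the standard estimator, stated here for orientation rather than as the study's exact specification.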

Relevance:

30.00%

Publisher:

Abstract:

The emphasis on integrated care implies new incentives that promote coordination between levels of care. Considering a population as a whole, the resource allocation system has to adapt to this environment. This research aims to design a model that allows for morbidity-related prospective and concurrent capitation payment. The model can be applied in publicly funded health systems and managed competition settings.

Methods: We analyze the application of hybrid risk adjustment versus either prospective or concurrent risk adjustment formulae in the context of funding total health expenditures for the population of an integrated healthcare delivery organization in Catalonia during the years 2004 and 2005.

Results: The hybrid model reimburses integrated care organizations while avoiding excessive risk transfer and maximizing incentives for efficiency in provision. At the same time, it eliminates incentives for risk selection for a specific set of high-risk individuals through the use of concurrent reimbursement, in order to assure a proper classification of patients.

Conclusion: Prospective risk adjustment is used to transfer the financial risk to the health provider and therefore provide incentives for efficiency. Within the context of a National Health System, such transfer of financial risk is illusory, and the government has to cover the deficits. Hybrid risk adjustment is useful to provide the right combination of incentives for efficiency and an appropriate level of risk transfer for integrated care organizations.
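As a rough illustration of the hybrid idea described above (prospective payment for most of the population, concurrent payment for a flagged set of high-risk individuals), the following sketch may help; the function, the scores and the high-risk rule are hypothetical placeholders, not the paper's actual formula.

```python
# Hypothetical sketch of a hybrid risk-adjusted capitation payment.
# Names, rates and the high-risk rule are illustrative only.

def hybrid_capitation(population, prospective_rate, concurrent_rate, high_risk_ids):
    """Return the total payment to an integrated care organization.

    population       : list of dicts with 'id', 'prospective_score', 'concurrent_score'
    prospective_rate : money per unit of prospective (ex ante, morbidity-based) risk score
    concurrent_rate  : money per unit of concurrent (realized morbidity) risk score
    high_risk_ids    : individuals reimbursed concurrently to blunt risk-selection incentives
    """
    total = 0.0
    for person in population:
        if person["id"] in high_risk_ids:
            # Concurrent adjustment: payment tracks realized morbidity, so the
            # organization bears little financial risk for these individuals.
            total += concurrent_rate * person["concurrent_score"]
        else:
            # Prospective adjustment: payment is fixed ex ante, transferring
            # financial risk (and hence efficiency incentives) to the provider.
            total += prospective_rate * person["prospective_score"]
    return total


# Toy usage
pop = [
    {"id": 1, "prospective_score": 1.0, "concurrent_score": 1.1},
    {"id": 2, "prospective_score": 3.5, "concurrent_score": 6.0},  # flagged as high risk
]
print(hybrid_capitation(pop, prospective_rate=900.0, concurrent_rate=900.0, high_risk_ids={2}))
```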

Relevance:

30.00%

Publisher:

Abstract:

In this paper we address a problem arising in risk management; namely, the study of price variations of different contingent claims in the Black-Scholes model due to anticipating future events. The method we propose to use is an extension of the classical Vega index, i.e. the price derivative with respect to the constant volatility, in the sense that we perturb the volatility in different directions. This directional derivative, which we denote the local Vega index, will serve as the main object in the paper, and one of our purposes is to relate it to the classical Vega index. We show that for all contingent claims studied in this paper the local Vega index can be expressed as a weighted average of the perturbation in volatility. In the particular case where the interest rate and the volatility are constant and the perturbation is deterministic, the local Vega index is an average of this perturbation multiplied by the classical Vega index. We also study the well-known goal problem of maximizing the probability of a perfect hedge and show that the speed of convergence is in fact dependent on the local Vega index.
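For orientation, the classical Vega of a European call in the Black-Scholes model with constant volatility $\sigma$ is $\partial C / \partial \sigma = S\,\varphi(d_1)\sqrt{T}$, with $\varphi$ the standard normal density. The local Vega index discussed here is, loosely, the directional derivative of the price with respect to a perturbation $h$ of the volatility,

\[ \mathcal{V}_h \;=\; \lim_{\epsilon \to 0} \frac{P(\sigma + \epsilon h) - P(\sigma)}{\epsilon}, \]

which, when the interest rate and $\sigma$ are constant and $h$ is deterministic, reduces to an average of $h$ multiplied by the classical Vega. This display is a schematic restatement of the abstract, not the paper's precise definition.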

Relevance:

30.00%

Publisher:

Abstract:

We analyze risk sharing and fiscal spending in a two-region model with complete markets. Fiscal policy determines tax rates for each state of nature. When fiscal policy is decentralized, it can be used to affect prices of securities. To manipulate prices to their benefit, regions choose pro-cyclical fiscal spending. This leads to incomplete risk sharing, despite the existence of complete markets and the absence of aggregate risk. When a fiscal union centralizes fiscal policy, securities prices can no longer be manipulated and complete risk sharing ensues. If regions are homogeneous, median income residents of both regions prefer the fiscal union. If they are heterogeneous, the median resident of the rich region prefers the decentralized setting.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES: The purpose of this study was to evaluate the association between inflammation and heart failure (HF) risk in older adults. BACKGROUND: Inflammation is associated with HF risk factors and also directly affects myocardial function. METHODS: The association of baseline serum concentrations of interleukin (IL)-6, tumor necrosis factor-alpha, and C-reactive protein (CRP) with incident HF was assessed with Cox models among 2,610 older persons without prevalent HF enrolled in the Health ABC (Health, Aging, and Body Composition) study (age 73.6 +/- 2.9 years; 48.3% men; 59.6% white). RESULTS: During follow-up (median 9.4 years), HF developed in 311 (11.9%) participants. In models controlling for clinical characteristics, ankle-arm index, and incident coronary heart disease, doubling of IL-6, tumor necrosis factor-alpha, and CRP concentrations was associated with a 29% (95% confidence interval: 13% to 47%; p < 0.001), 46% (95% confidence interval: 17% to 84%; p = 0.001), and 9% (95% confidence interval: -1% to 24%; p = 0.087) increase in HF risk, respectively. In models including all 3 markers, IL-6 and tumor necrosis factor-alpha, but not CRP, remained significant. These associations were similar across sex and race and persisted in models accounting for death as a competing event. Post-HF ejection fraction was available in 239 (76.8%) cases; inflammatory markers had a stronger association with HF with preserved ejection fraction. Repeat IL-6 and CRP determinations at 1-year follow-up did not provide incremental information. Addition of IL-6 to the clinical Health ABC HF model improved model discrimination (C index from 0.717 to 0.734; p = 0.001) and fit (decreased Bayes information criterion by 17.8; p < 0.001). CONCLUSIONS: Inflammatory markers are associated with HF risk among older adults and may improve HF risk stratification.
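The "per doubling" estimates correspond to entering the markers on a $\log_2$ scale in a Cox proportional hazards model; schematically (not the study's exact covariate list),

\[ \lambda(t \mid x) \;=\; \lambda_0(t)\, \exp\!\big( \beta_1 \log_2 \mathrm{IL\text{-}6} + \beta_2 \log_2 \mathrm{TNF}\alpha + \beta_3 \log_2 \mathrm{CRP} + \gamma' z \big), \]

so a doubling of IL-6 multiplies the hazard by $e^{\beta_1}$ (here about 1.29, i.e. the reported 29% increase), with $z$ collecting the clinical adjustment covariates.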

Relevance:

30.00%

Publisher:

Abstract:

We derive an international asset pricing model that assumes local investors have preferences of the type "keeping up with the Joneses." In an international setting investors compare their current wealth with that of their peers who live in the same country. In the process of inferring the country's average wealth, investors incorporate information from the domestic market portfolio. In equilibrium, this gives rise to a multifactor CAPM where, together with the world market price of risk, there exist country-specific prices of risk associated with deviations from the country's average wealth level. The model performs significantly better, in terms of explaining the cross-section of returns, than the international CAPM. Moreover, the results are robust, both for conditional and unconditional tests, to the inclusion of currency risk, macroeconomic sources of risk and the Fama and French HML factor.
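In reduced form, the equilibrium pricing relation described is a two-factor model of the schematic type

\[ E[R_i] - R_f \;=\; \beta_{i,w}\,\lambda_w + \beta_{i,c}\,\lambda_c, \]

where $\lambda_w$ is the world market price of risk and $\lambda_c$ a country-specific price of risk attached to deviations from the country's average wealth; this display only summarizes the abstract, and the exact factor construction is the paper's.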

Relevance:

30.00%

Publisher:

Abstract:

We model systemic risk in an interbank market. Banks face liquidity needs as consumers are uncertain about where they need to consume. Interbank credit lines allow banks to cope with these liquidity shocks while reducing the cost of maintaining reserves. However, the interbank market exposes the system to a coordination failure (gridlock equilibrium) even if all banks are solvent. When one bank is insolvent, the stability of the banking system is affected in various ways depending on the patterns of payments across locations. We investigate the ability of the banking industry to withstand the insolvency of one bank and whether the closure of one bank generates a chain reaction in the rest of the system. We analyze the coordinating role of the Central Bank in preventing systemic repercussions in the payments system, and we examine the justification of the too-big-to-fail policy.

Relevance:

30.00%

Publisher:

Abstract:

We study a situation in which an auctioneer wishes to sell an object to one of N risk-neutral bidders with heterogeneous preferences. The auctioneer does not know the bidders' preferences but has private information about the characteristics of the object, and must decide how much information to reveal prior to the auction. We show that the auctioneer has incentives to release less information than would be efficient and that the amount of information released increases with the level of competition (as measured by the number of bidders). Furthermore, in a perfectly competitive market the auctioneer would provide the efficient level of information.

Relevance:

30.00%

Publisher:

Abstract:

Given $n$ independent replicates of a jointly distributed pair $(X,Y)\in {\cal R}^d \times {\cal R}$, we wish to select from a fixed sequence of model classes ${\cal F}_1, {\cal F}_2, \ldots$ a deterministic prediction rule $f: {\cal R}^d \to {\cal R}$ whose risk is small. We investigate the possibility of empirically assessing the {\em complexity} of each model class, that is, the actual difficulty of the estimation problem within each class. The estimated complexities are in turn used to define an adaptive model selection procedure, which is based on complexity penalized empirical risk. The available data are divided into two parts. The first is used to form an empirical cover of each model class, and the second is used to select a candidate rule from each cover based on empirical risk. The covering radii are determined empirically to optimize a tight upper bound on the estimation error. An estimate is chosen from the list of candidates in order to minimize the sum of class complexity and empirical risk. A distinguishing feature of the approach is that the complexity of each model class is assessed empirically, based on the size of its empirical cover. Finite sample performance bounds are established for the estimates, and these bounds are applied to several non-parametric estimation problems. The estimates are shown to achieve a favorable tradeoff between approximation and estimation error, and to perform as well as if the distribution-dependent complexities of the model classes were known beforehand. In addition, it is shown that the estimate can be consistent, and even possess near optimal rates of convergence, when each model class has an infinite VC or pseudo dimension. For regression estimation with squared loss we modify our estimate to achieve a faster rate of convergence.
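In outline, the selection procedure described above can be sketched as follows; the helper callables (cover construction, empirical risk, complexity) are passed in as hypothetical placeholders for the paper's constructions, so this is a structural sketch, not the authors' algorithm.

```python
# Structural sketch of complexity-penalized model selection with empirically
# assessed class complexity. The helper callables are hypothetical placeholders.

def select_rule(model_classes, data, build_empirical_cover, empirical_risk, complexity_of):
    """Pick one prediction rule from a sequence of model classes.

    model_classes         : iterable of candidate classes F_1, F_2, ...
    data                  : list of (x, y) pairs
    build_empirical_cover : (model_class, sample) -> finite list of representative rules
    empirical_risk        : (rule, sample) -> average loss of the rule on the sample
    complexity_of         : cover -> empirical complexity penalty for the class
    """
    half = len(data) // 2
    cover_sample, selection_sample = data[:half], data[half:]

    candidates = []
    for F_k in model_classes:
        # 1) First half: form an empirical cover of the class (covering radius
        #    chosen to optimize a bound on the estimation error).
        cover_k = build_empirical_cover(F_k, cover_sample)

        # 2) Second half: pick the empirically best rule within the cover.
        f_k = min(cover_k, key=lambda f: empirical_risk(f, selection_sample))

        # 3) The size of the cover determines the empirical complexity of the class.
        candidates.append((f_k, empirical_risk(f_k, selection_sample), complexity_of(cover_k)))

    # 4) Final choice: minimize empirical risk plus class complexity.
    best_rule, _, _ = min(candidates, key=lambda c: c[1] + c[2])
    return best_rule
```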

Relevance:

30.00%

Publisher:

Abstract:

A number of existing studies have concluded that risk sharing allocations supported by competitive, incomplete markets equilibria are quantitatively close to first-best. Equilibrium asset prices in these models have been difficult to distinguish from those associated with a complete markets model, the counterfactual features of which have been widely documented. This paper asks if life cycle considerations, in conjunction with persistent idiosyncratic shocks which become more volatile during aggregate downturns, can reconcile the quantitative properties of the competitive asset pricing framework with those of observed asset returns. We begin by arguing that data from the Panel Study of Income Dynamics support the plausibility of such a shock process. Our estimates suggest a high degree of persistence as well as a substantial increase in idiosyncratic conditional volatility coincident with periods of low growth in U.S. GNP. When these factors are incorporated in a stationary overlapping generations framework, the implications for the returns on risky assets are substantial. Plausible parameterizations of our economy are able to generate Sharpe ratios which match those observed in U.S. data. Our economy cannot, however, account for the level of variability of stock returns, owing in large part to the specification of its production technology.
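Schematically, the idiosyncratic shock process this points to is a persistent process with countercyclical conditional variance, e.g.

\[ \log y_{it} = z_{it} + \varepsilon_{it}, \qquad z_{it} = \rho\, z_{i,t-1} + \eta_{it}, \qquad \eta_{it} \sim N\big(0, \sigma^2_{s_t}\big), \]

with $\rho$ close to one and $\sigma^2_{s_t}$ larger when the aggregate state $s_t$ is a period of low GNP growth; the Sharpe ratios referred to are $E[R - R_f]/\sigma(R - R_f)$. This is a generic restatement of the abstract, not the authors' exact parameterization.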

Relevance:

30.00%

Publisher:

Abstract:

In this paper I develop a general equilibrium model with risk averse entrepreneurial firms and with public firms. The model predicts that an increase in uncertainty reduces the propensity of entrepreneurial firms to innovate, while it does not affect the propensity of public firms to innovate. Furthermore, it predicts that the negative effect of uncertainty on innovation is stronger for the less diversified entrepreneurial firms, and is stronger in the absence of financing frictions in the economy. In the second part of the paper I test these predictions on a dataset of small and medium Italian manufacturing firms.

Relevance:

30.00%

Publisher:

Abstract:

Coordination games arise very often in studies of industrial organization and international trade. This type of game has multiple strict equilibria, and therefore the identification of testable predictions is very difficult. We study a vertical product differentiation model with two asymmetric players choosing first qualities and then prices. This game has two equilibria for some parameter values. However, we apply the risk dominance criterion suggested by Harsanyi and Selten and show that it always selects the equilibrium where the leader is the firm having some initial advantage. We then perform an experimental analysis to test whether the risk dominance prediction is supported by the behaviour of laboratory agents. We show that the probability that the risk dominance prediction is right depends crucially on the degree of asymmetry of the game. The stronger the asymmetries, the higher the predictive power of the risk dominance criterion.
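For reference, the Harsanyi-Selten criterion in a 2x2 game with two strict equilibria $(A,A)$ and $(B,B)$ says that $(A,A)$ risk-dominates $(B,B)$ when the product of the players' deviation losses is larger at $(A,A)$:

\[ \big(u_1(A,A) - u_1(B,A)\big)\big(u_2(A,A) - u_2(A,B)\big) \;>\; \big(u_1(B,B) - u_1(A,B)\big)\big(u_2(B,B) - u_2(B,A)\big). \]

This is the general definition; the paper applies it to the reduced quality-choice game, where it selects the equilibrium in which the initially advantaged firm leads.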

Relevance:

30.00%

Publisher:

Abstract:

We study model selection strategies based on penalized empirical loss minimization. We point out a tight relationship between error estimation and data-based complexity penalization: any good error estimate may be converted into a data-based penalty function and the performance of the estimate is governed by the quality of the error estimate. We consider several penalty functions, involving error estimates on independent test data, empirical {\sc vc} dimension, empirical {\sc vc} entropy, and margin-based quantities. We also consider the maximal difference between the error on the first half of the training data and the second half, and the expected maximal discrepancy, a closely related capacity estimate that can be calculated by Monte Carlo integration. Maximal discrepancy penalty functions are appealing for pattern classification problems, since their computation is equivalent to empirical risk minimization over the training data with some labels flipped.
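For a class ${\cal F}$, loss $\ell$ and a sample of size $n$ split into halves, the maximal discrepancy referred to here is (up to the exact normalization used in the paper)

\[ \hat{D}_n({\cal F}) \;=\; \sup_{f \in {\cal F}} \left( \frac{2}{n} \sum_{i=1}^{n/2} \ell\big(f(X_i), Y_i\big) \;-\; \frac{2}{n} \sum_{i=n/2+1}^{n} \ell\big(f(X_i), Y_i\big) \right), \]

and for the 0-1 loss computing this supremum amounts to empirical risk minimization over the full sample with the labels of one half flipped, which is why the penalty is convenient for pattern classification.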