76 results for Shannon’s measure of uncertainty
Abstract:
Our task in this paper is to analyze the organization of trading in the era of quantitative finance. To do so, we conduct an ethnography of arbitrage, the trading strategy that best exemplifies finance in the wake of the quantitative revolution. In contrast to value and momentum investing, we argue, arbitrage involves an art of association: the construction of equivalence (comparability) of properties across different assets. In place of essential or relational characteristics, the peculiar valuation that takes place in arbitrage is based on an operation that makes something the measure of something else, associating securities with one another. The process of recognizing opportunities and the practices of making novel associations are shaped by the specific socio-spatial and socio-technical configurations of the trading room. Calculation is distributed across persons and instruments as the trading room organizes interaction among diverse principles of valuation.
Abstract:
This paper empirically analyzes the volatility of consumption-based stochastic discount factors as a measure of implicit economic fears by studying its relationship with future economic and stock market cycles. Time-varying economic fears seem to be well captured by the volatility of stochastic discount factors. In particular, the volatility of a recursive utility-based stochastic discount factor with contemporaneous consumption growth explains between 9 and 34 percent of future changes in industrial production at short and long horizons, respectively. This volatility also explains ex-ante uncertainty and risk aversion. However, future stock market cycles are better explained by a similar stochastic discount factor with long-run consumption growth. This specification of the stochastic discount factor presents higher volatility and lower pricing errors than the specification with contemporaneous consumption growth.
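For reference, the recursive-utility (Epstein-Zin) stochastic discount factor that specifications of this kind build on can be written in its textbook form as follows; the paper's exact specification and notation may differ:

```latex
M_{t+1} \;=\; \beta^{\theta}
  \left(\frac{C_{t+1}}{C_t}\right)^{-\theta/\psi}
  R_{w,t+1}^{\,\theta-1},
\qquad
\theta \equiv \frac{1-\gamma}{1-1/\psi},
```

where $\gamma$ is relative risk aversion, $\psi$ the elasticity of intertemporal substitution, and $R_{w,t+1}$ the return on the wealth (consumption) portfolio.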
Abstract:
This paper shows how recently developed regression-based methods for the decomposition of health inequality can be extended to incorporate heterogeneity in the responses of health to the explanatory variables. We illustrate our method with an application to the GHQ measure of psychological well-being taken from the British Household Panel Survey. The results suggest that there is an important degree of heterogeneity in the association of health with the explanatory variables across birth cohorts and genders, which, in turn, accounts for a substantial percentage of the inequality in observed health.
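A regression-based decomposition of a health concentration index, of the kind extended here to heterogeneous responses, conventionally takes the following form (standard notation from this literature; the paper's contribution is to let the coefficients vary across groups such as cohorts and genders):

```latex
C \;=\; \sum_{k} \left(\frac{\beta_k \,\bar{x}_k}{\mu}\right) C_k
      \;+\; \frac{GC_{\varepsilon}}{\mu},
```

where $\mu$ is mean health, $\beta_k$ and $\bar{x}_k$ are the regression coefficient and mean of covariate $k$, $C_k$ is that covariate's concentration index with respect to income, and $GC_{\varepsilon}$ is the generalized concentration index of the residual.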
Abstract:
Purpose - There has been much research on manufacturing flexibility, but supply chain flexibility is still an under-investigated area. This paper focuses on supply flexibility, the aspects of flexibility related to the upstream supply chain. Our purpose is to investigate why and how firms increase supply flexibility.
Methodology/Approach - An exploratory multiple case study was conducted. We analyzed seven Spanish manufacturers from different sectors (automotive, apparel, electronics and electrical equipment).
Findings - The results show that there are some major reasons why firms need supply flexibility (manufacturing schedule fluctuations, JIT purchasing, manufacturing slack capacity, low level of parts commonality, demand volatility, demand seasonality and forecast accuracy), and that companies increase this type of flexibility by implementing two main strategies: increasing suppliers' responsiveness capability and flexible sourcing. The results also suggest that the supply flexibility strategy selected depends on two factors: supplier searching and switching costs, and the type of uncertainty (mix, volume or delivery).
Research limitations - This paper has some limitations common to all case studies, such as the subjectivity of the analysis and the questionable generalizability of the results (since the sample of firms is not statistically significant).
Implications - Our study contributes to the existing literature by empirically investigating the main reasons why companies need to increase supply flexibility and how they do so, and by suggesting some factors that could influence the selection of a particular supply flexibility strategy.
Abstract:
This paper presents a method for the measurement of changes in health inequality and income-related health inequality over time in a population. For pure health inequality (as measured by the Gini coefficient) and income-related health inequality (as measured by the concentration index), we show how measures derived from longitudinal data can be related to cross-section Gini and concentration indices that have typically been reported in the literature to date, along with measures of health mobility inspired by the literature on income mobility. We also show how these measures of mobility can be usefully decomposed into the contributions of different covariates. We apply these methods to investigate the degree of income-related mobility in the GHQ measure of psychological well-being in the first nine waves of the British Household Panel Survey (BHPS). This reveals that dynamics increase the absolute value of the concentration index of GHQ on income by 10%.
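As a minimal sketch, the cross-section concentration index that these longitudinal measures extend can be computed from the standard covariance formula C = 2 cov(h, r) / mean(h), where r is the fractional income rank; the function and variable names below are ours:

```python
import numpy as np

def concentration_index(health, income):
    """Cross-section concentration index of `health` with respect to
    `income`, via C = 2 * cov(h, r) / mean(h), where r is the
    fractional rank of each individual in the income distribution."""
    h = np.asarray(health, dtype=float)
    n = len(h)
    # Fractional income rank: (i - 0.5) / n for the i-th poorest individual.
    order = np.argsort(income)
    rank = np.empty(n)
    rank[order] = (np.arange(1, n + 1) - 0.5) / n
    # bias=True gives the population covariance.
    return 2.0 * np.cov(h, rank, bias=True)[0, 1] / h.mean()
```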
Abstract:
A biplot, which is the multivariate generalization of the two-variable scatterplot, can be used to visualize the results of many multivariate techniques, especially those that are based on the singular value decomposition. We consider data sets consisting of continuous-scale measurements, their fuzzy coding and the biplots that visualize them, using a fuzzy version of multiple correspondence analysis. Of special interest is the way quality of fit of the biplot is measured, since it is well-known that regular (i.e., crisp) multiple correspondence analysis seriously under-estimates this measure. We show how the results of fuzzy multiple correspondence analysis can be defuzzified to obtain estimated values of the original data, and prove that this implies an orthogonal decomposition of variance. This permits a measure of fit to be calculated in the familiar form of a percentage of explained variance, which is directly comparable to the corresponding fit measure used in principal component analysis of the original data. The approach is motivated initially by its application to a simulated data set, showing how the fuzzy approach can lead to diagnosing nonlinear relationships, and finally it is applied to a real set of meteorological data.
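As an illustrative sketch of the fuzzy-coding step, the snippet below applies triangular three-category coding to a continuous variable, so that each observation becomes a row of membership degrees summing to one. The three-category scheme and the hinge points (minimum, median, maximum) are our assumptions; the paper's own coding may differ:

```python
import numpy as np

def fuzzy_code(x, low=None, mid=None, high=None):
    """Triangular fuzzy coding of a continuous variable into three
    categories (low, medium, high); each row is non-negative and sums to 1."""
    x = np.asarray(x, dtype=float)
    low = x.min() if low is None else low
    mid = np.median(x) if mid is None else mid
    high = x.max() if high is None else high
    z = np.zeros((len(x), 3))
    left = x <= mid
    # Membership falls linearly from 1 at a hinge to 0 at the next hinge.
    z[left, 0] = (mid - x[left]) / (mid - low)
    z[left, 1] = 1.0 - z[left, 0]
    z[~left, 2] = (x[~left] - mid) / (high - mid)
    z[~left, 1] = 1.0 - z[~left, 2]
    return z
```

The coded matrix can then be analyzed with multiple correspondence analysis; defuzzifying the reconstruction, as the paper describes, recovers estimates of the original values and a percentage-of-variance fit measure.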
Abstract:
This paper proposes an argument that explains incumbency advantage without resorting to the collective irresponsibility of legislatures. For that purpose, we exploit the informational value of incumbency: incumbency gives voters information about governing politicians that is not available for challengers. Because there are many reasons for high reelection rates other than incumbency status, we propose a measure of incumbency advantage that improves on the use of pure reelection success. We also study the relationship between incumbency advantage and ideological and selection biases. An important implication of our analysis is that the literature linking incumbency and legislature irresponsibility most likely provides an overestimation of the latter.
Abstract:
Self-reported home values are widely used as a measure of housing wealth by researchers employing a variety of data sets and studying a number of different individual and household level decisions. The accuracy of this measure is an open empirical question, and requires some type of market assessment of the values reported. In this research, we study the predictive power of self-reported housing wealth for estimating sales prices using the Health and Retirement Study. We find that homeowners, on average, overestimate the value of their properties by between 5% and 10%. More importantly, we are the first to document a strong correlation between accuracy and the economic conditions at the time of the purchase of the property (measured by the prevailing interest rate, the growth of household income, and the growth of median housing prices). While most individuals overestimate the value of their properties, those who bought during more difficult economic times tend to be more accurate, and in some cases even underestimate the value of their house. These results establish a surprisingly strong and long-lived, possibly permanent, effect of the initial conditions surrounding the purchase of a property on how individuals value it. This cyclicality of the overestimation of house prices may help explain the difficulties currently faced by many homeowners, who expected large appreciation in home values to rescue them if increases in interest rates jeopardized their ability to live up to their financial commitments.
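A minimal sketch of the headline accuracy measure follows; the paper's estimator additionally conditions on purchase-time economic conditions, and the names below are ours:

```python
import numpy as np

def mean_overestimation_pct(self_reported, sale_price):
    """Average percentage by which self-reported home values exceed
    subsequently realized sale prices (positive = overestimation)."""
    r = np.asarray(self_reported, dtype=float)
    s = np.asarray(sale_price, dtype=float)
    return 100.0 * np.mean((r - s) / s)
```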
Abstract:
Age data frequently display excess frequencies at round or attractive ages, such as even numbers and multiples of five. This phenomenon of age heaping has been viewed as a problem in previous research, especially in demography and epidemiology. We see it as an opportunity and propose its use as a measure of human capital that can yield comparable estimates across a wide range of historical contexts. A simulation study yields methodological guidelines for measuring and interpreting differences in age heaping, while analysis of contemporary and historical datasets demonstrates the existence of a robust correlation between age heaping and literacy at both the individual and aggregate level. To illustrate the method, we generate estimates of human capital in Europe over the very long run, which support the hypothesis of a major increase in human capital preceding the industrial revolution.
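The abstract does not spell out its index, but the classic measure of age heaping in this literature is the Whipple index, sketched below for the conventional 23-62 age range; the paper may use a transformation of it:

```python
import numpy as np

def whipple_index(ages):
    """Whipple index of age heaping: reported ages ending in 0 or 5,
    relative to the one-fifth expected under no heaping, for ages 23-62.
    100 = no heaping; 500 = every reported age ends in 0 or 5."""
    a = np.asarray(ages)
    a = a[(a >= 23) & (a <= 62)]
    heaped = np.isin(a % 10, (0, 5)).sum()
    return 100.0 * heaped / (len(a) / 5.0)
```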
Abstract:
We find that trade and domestic market size are robust determinants of economic growth over the 1960-1996 period when trade openness is measured as the US dollar value of imports and exports relative to GDP in PPP US$ ('real openness'). When trade openness is measured as the US dollar value of imports and exports relative to GDP in exchange rate US$ ('nominal openness'), however, trade and the size of domestic markets are often non-robust determinants of growth. We argue that real openness is the more appropriate measure of trade and that our empirical results should be seen as evidence in favor of the extent-of-the-market hypothesis.
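In symbols (our notation), the two openness measures contrasted here are:

```latex
\text{real openness} \;=\; \frac{\text{imports} + \text{exports (US\$)}}{\text{GDP (PPP US\$)}},
\qquad
\text{nominal openness} \;=\; \frac{\text{imports} + \text{exports (US\$)}}{\text{GDP (exchange-rate US\$)}}.
```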
Abstract:
This paper presents a general equilibrium model of money demand where the velocity of money changes in response to endogenous fluctuations in the interest rate. The parameter space can be divided into two subsets: one where velocity is constant and equal to one, as in cash-in-advance models, and another where velocity fluctuates, as in Baumol (1952). Despite its simplicity in terms of parameters to calibrate, the model performs surprisingly well. In particular, it approximates the variability of money velocity observed in the U.S. over the post-war period. The model is then used to analyze the welfare costs of inflation under uncertainty. This application quantifies the errors that arise from computing the costs of inflation with deterministic models. It turns out that the size of this difference is small, at least for the levels of uncertainty estimated for the U.S. economy.
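For reference, the velocity being matched is the one defined by the quantity equation (standard notation; in the model it fluctuates endogenously with the nominal interest rate):

```latex
V_t \;=\; \frac{P_t\, Y_t}{M_t}.
```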
Abstract:
In this article we examine the potential effect of market structure on hospital technical efficiency as a measure of performance, controlling for ownership and regulation. This study is relevant for evaluating the potential effects of recommended and initiated deregulation policies intended to promote market reforms in the context of a European National Health Service. Our goal was reached through three main empirical stages. Firstly, using patient origin data from hospitals in the region of Catalonia in 1990, we estimated geographic hospital markets through the Elzinga-Hogarty approach, based on patient flows. We then measured the market level of concentration using the Herfindahl-Hirschman index. Secondly, technical and scale efficiency scores for each hospital were obtained by specifying a Data Envelopment Analysis. According to the data, nearly two-thirds of the hospitals operate under the production frontier, with an average efficiency score of 0.841. Finally, the determinants of the efficiency scores were investigated using a censored regression model. Special attention was paid to testing the hypothesis that there is an efficiency improvement in more competitive markets. The results suggest that the number of competitors in the market contributes positively to technical efficiency, and there is some evidence that differences in efficiency scores are attributable to several environmental factors such as ownership, market structure and regulation effects.
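As a minimal sketch of the concentration measure used in the first stage, the Herfindahl-Hirschman index is the sum of squared market shares; here the shares would be hospital shares within the patient-flow-defined markets, and the names below are ours:

```python
def herfindahl_hirschman(shares):
    """Herfindahl-Hirschman index: the sum of squared market shares.
    Equals 1/n for n equal-sized firms and 1.0 for a monopoly."""
    total = sum(shares)
    return sum((s / total) ** 2 for s in shares)
```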
Abstract:
In this paper we present a simple theory-based measure of the variations in aggregate economic efficiency: the gap between the marginal product of labor and the household's consumption/leisure tradeoff. We show that this indicator corresponds to the inverse of the markup of price over social marginal cost, and give some evidence in support of this interpretation. We then show that, under some auxiliary assumptions, our gap variable may be used to measure the efficiency costs of business fluctuations. We find that the latter costs are modest on average. However, to the extent that the flexible price equilibrium is distorted, the gross efficiency losses from recessions and gains from booms may be large. Indeed, we find that the major recessions involved large efficiency losses. These results hold for reasonable parameterizations of the Frisch elasticity of labor supply, the coefficient of relative risk aversion, and steady state distortions.
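In logs, the efficiency gap described here can be written as follows (a sketch under a standard specification of preferences; $\sigma$ denotes relative risk aversion and $\varphi$ the inverse Frisch elasticity of labor supply):

```latex
gap_t \;\equiv\; mrs_t - mpn_t \;=\; -\mu_t,
\qquad
mrs_t = \sigma\, c_t + \varphi\, n_t,
```

where $mpn_t$ is the log marginal product of labor, $mrs_t$ the household's log marginal rate of substitution between consumption and leisure, and $\mu_t$ the log markup of price over social marginal cost; the level of the indicator is thus the inverse of the markup, and the gap is zero in an undistorted flexible-price equilibrium.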
Abstract:
We report a scaling law that governs both the elastic and frictional properties of a wide variety of living cell types, over a wide range of time scales and under a variety of biological interventions. This scaling identifies these cells as soft glassy materials existing close to a glass transition, and implies that cytoskeletal proteins may regulate cell mechanical properties mainly by modulating the effective noise temperature of the matrix. The practical implication is that the effective noise temperature is an easily quantified measure of the ability of the cytoskeleton to deform, flow, and reorganize.
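In simplified form, the scaling law in question has the weak power-law structure of soft glassy rheology (a schematic statement, not the paper's full expression; $x$ is the effective noise temperature, with $x = 1$ corresponding to a purely elastic solid at the glass transition):

```latex
G'(\omega) \;\propto\; G''(\omega) \;\propto\; \omega^{\,x-1}.
```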
Abstract:
Naive scale invariance is not a true property of natural images. Natural monochrome images possess a much richer geometrical structure, which is particularly well described in terms of multiscaling relations. This means that the pixels of a given image can be decomposed into sets, the fractal components of the image, with well-defined scaling exponents [Turiel and Parga, Neural Comput. 12, 763 (2000)]. Here it is shown that hyperspectral representations of natural scenes also exhibit multiscaling properties, displaying the same kind of behavior. A precise measure of the informational relevance of the fractal components is also given, and it is shown that there are important differences between the intrinsically redundant red-green-blue system and the decorrelated one defined in Ruderman, Cronin, and Chiao [J. Opt. Soc. Am. A 15, 2036 (1998)].
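Schematically, the multiscaling property referred to here means that the moments of a local contrast measure $\varepsilon_r$ at scale $r$ obey power laws with a nonlinear spectrum of exponents (notation follows the multifractal literature; a single linear $\tau_p$ would correspond to naive scale invariance):

```latex
\langle \varepsilon_r^{\,p} \rangle \;\propto\; r^{\tau_p},
\qquad \tau_p \ \text{nonlinear in } p.
```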