Abstract:
This paper empirically analyzes the volatility of consumption-based stochastic discount factors as a measure of implicit economic fears by studying its relationship with future economic and stock market cycles. Time-varying economic fears seem to be well captured by the volatility of stochastic discount factors. In particular, the volatility of the recursive utility-based stochastic discount factor with contemporaneous growth explains between 9 and 34 percent of future changes in industrial production at short and long horizons, respectively. It also explains ex-ante uncertainty and risk aversion. However, future stock market cycles are better explained by a similar stochastic discount factor with long-run consumption growth. This specification of the stochastic discount factor presents higher volatility and lower pricing errors than the specification with contemporaneous consumption growth.
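As an illustrative sketch of the quantity studied above, the volatility of a power-utility consumption-based stochastic discount factor can be computed from a consumption growth series. The discount factor, risk-aversion parameter, and simulated data below are assumptions for the example, not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated quarterly log consumption growth (hypothetical numbers):
# mean 0.5%, standard deviation 1%.
dc = rng.normal(0.005, 0.01, size=400)

beta, gamma = 0.99, 10.0  # discount factor and risk aversion (assumed)

# Power-utility stochastic discount factor with contemporaneous growth:
# M_{t+1} = beta * (C_{t+1}/C_t)^(-gamma)
M = beta * np.exp(-gamma * dc)

# The volatility of M is the "economic fear" proxy the paper studies.
print(f"E[M] = {M.mean():.4f}, sigma(M) = {M.std(ddof=1):.4f}")
```

A higher risk-aversion parameter mechanically raises the volatility of M, which is why the long-run-growth specification discussed above can deliver more volatile discount factors.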
Abstract:
This paper explores three aspects of strategic uncertainty: its relation to risk, the predictability of behavior, and players' subjective beliefs. In a laboratory experiment we measure subjects' certainty equivalents for three coordination games and one lottery. Behavior in coordination games is related to risk aversion, experience seeking, and age. From the distribution of certainty equivalents we estimate probabilities for successful coordination in a wide range of games. For many games, success of coordination is predictable with a reasonable error rate. The best response to observed behavior is close to the global-game solution. Comparing choices in coordination games with revealed risk aversion, we estimate subjective probabilities for successful coordination. In games with a low coordination requirement, most subjects underestimate the probability of success. In games with a high coordination requirement, most subjects overestimate this probability. Estimating probabilistic decision models, we show that the quality of predictions can be improved when individual characteristics are taken into account. Subjects' behavior is consistent with probabilistic beliefs about the aggregate outcome, but inconsistent with probabilistic beliefs about individual behavior.
Abstract:
As companies and shareholders begin to note the potential repercussions of intangible assets upon business results, the inability of the traditional financial statement model to reflect these new ways of creating business value has become evident. Companies have widely adopted new management tools to cover this gap. However, there are few prior studies measuring in a quantifiable manner the level of productivity left unexplained by the financial statements. In this study, we measure the effect of intangible assets on productivity using data from Spanish firms selected randomly by size and sector over a ten-year period, from 1995 to 2004. Through a sample of more than 10,000 Spanish firms, we analyse to what extent labour productivity can be explained by physical capital deepening, by quantified intangible capital deepening, and by the firm's economic efficiency (or total factor productivity, TFP). Our results confirm the hypothesis that the weight of TFP has increased during the period studied, especially in those firms that have experienced a significant rise in quantified intangible capital, evidencing important complementary effects between capital investment and intangible resources in the explanation of productivity growth. These results differ significantly across economic sectors and firm sizes.
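The decomposition described above can be illustrated with a one-line growth-accounting sketch. The capital share and growth rates below are invented for illustration, not the paper's estimates:

```python
# Growth accounting: labour productivity growth is split into a capital
# deepening contribution and a residual TFP contribution.
alpha = 0.35   # assumed capital share
g_lp = 0.030   # labour productivity growth (hypothetical)
g_kl = 0.040   # physical + intangible capital deepening (hypothetical)

# TFP contribution is the part of productivity growth not explained by
# capital deepening.
g_tfp = g_lp - alpha * g_kl
print(f"TFP contribution: {g_tfp:.3f}")  # 0.030 - 0.35*0.040 = 0.016
```

A rising TFP residual in firms with fast intangible accumulation is the pattern the abstract reports.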
Abstract:
Although the histogram is the most widely used density estimator, it is well known that the appearance of a constructed histogram for a given bin width can change markedly for different choices of anchor position. In this paper we construct a stability index $G$ that assesses the potential changes in the appearance of histograms for a given data set and bin width as the anchor position changes. If a particular bin width choice leads to an unstable appearance, the arbitrary choice of any one anchor position is dangerous, and a different bin width should be considered. The index is based on the statistical roughness of the histogram estimate. We show via Monte Carlo simulation that densities with more structure are more likely to lead to histograms with unstable appearance. In addition, ignoring the precision to which the data values are provided when choosing the bin width leads to instability. We provide several real data examples to illustrate the properties of $G$. Applications to other binned density estimators are also discussed.
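The anchor-position effect that motivates the index $G$ can be reproduced in a few lines. This is a sketch with simulated bimodal data, not the paper's index itself:

```python
import numpy as np

rng = np.random.default_rng(1)
# A bimodal sample: densities with more structure are the ones the paper
# finds most prone to anchor instability.
x = np.concatenate([rng.normal(0.0, 1.0, 500), rng.normal(5.0, 1.0, 500)])

def hist_counts(data, width, anchor):
    """Bin counts for a fixed bin width and a given anchor position."""
    edges = np.arange(anchor, data.max() + width, width)
    counts, _ = np.histogram(data, bins=edges)
    return counts

h = 1.0
c0 = hist_counts(x, h, anchor=x.min())          # one anchor choice
c1 = hist_counts(x, h, anchor=x.min() - h / 2)  # shifted by half a bin
print(c0)
print(c1)  # same data, same bin width, noticeably different shape
```

Comparing the roughness of such competing histograms across anchors is the intuition behind the stability index.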
Abstract:
This paper estimates a translog stochastic frontier production function for a panel of 150 mixed Catalan farms over the period 1989-1993, in order to measure and explain variation in technical inefficiency scores with a one-stage approach. The model uses gross value added as the aggregate output measure. Total employment, fixed capital, current assets, specific costs and overhead costs are introduced into the model as inputs. Stochastic frontier estimates are compared with those obtained using a linear programming method with a two-stage approach. The translog stochastic frontier specification appears to be an appropriate representation of the data: technical change was rejected and the technical inefficiency effects were statistically significant. The mean technical efficiency in the period analyzed was estimated to be 64.0%. Farm inefficiency levels were found to be significantly (at the 5% level) and positively correlated with the number of economic size units.
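A sketch of the translog specification on simulated data. The paper fits a stochastic frontier by maximum likelihood with five inputs; the two-input OLS fit and exponential inefficiency draws below are simplifying assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 150  # one observation per farm, for illustration

lx1 = rng.normal(0, 0.5, n)  # log labour (hypothetical)
lx2 = rng.normal(0, 0.5, n)  # log fixed capital (hypothetical)

# Translog technology with a one-sided inefficiency term u >= 0:
# ln y = b0 + b1*lx1 + b2*lx2 + b11*lx1^2 + b22*lx2^2 + b12*lx1*lx2 - u + v
u = rng.exponential(0.3, n)  # technical inefficiency
v = rng.normal(0, 0.1, n)    # statistical noise
ly = 1.0 + 0.6*lx1 + 0.3*lx2 + 0.05*lx1**2 + 0.05*lx2**2 - 0.1*lx1*lx2 - u + v

# OLS on the translog terms (a stochastic frontier would instead fit the
# composed error u + v by maximum likelihood).
X = np.column_stack([np.ones(n), lx1, lx2, lx1**2, lx2**2, lx1*lx2])
coef, *_ = np.linalg.lstsq(X, ly, rcond=None)

# Corrected-OLS-style technical efficiency proxy, in (0, 1].
resid = ly - X @ coef
te = np.exp(resid - resid.max())
print(f"mean technical efficiency: {te.mean():.2f}")
```

The quadratic and cross terms are what distinguish the translog from a plain Cobb-Douglas frontier.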
Abstract:
Subcompositional coherence is a fundamental property of Aitchison's approach to compositional data analysis, and is the principal justification for using ratios of components. We maintain, however, that lack of subcompositional coherence, that is, incoherence, can be measured in an attempt to evaluate whether any given technique is close enough, for all practical purposes, to being subcompositionally coherent. This opens up the field to alternative methods, which might be better suited to cope with problems such as data zeros and outliers while being only slightly incoherent. The measure that we propose is based on the distance measure between components. We show that two-part subcompositions, which appear to be the most sensitive to subcompositional incoherence, can be used to establish a distance matrix that can be directly compared with the pairwise distances in the full composition. The closeness of these two matrices can be quantified using a stress measure that is common in multidimensional scaling, providing a measure of subcompositional incoherence. The approach is illustrated using power-transformed correspondence analysis, which has already been shown to converge to log-ratio analysis as the power transform tends to zero.
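A rough sketch of the stress idea, assuming a deliberately non-log-ratio distance between components so that some incoherence appears; the paper's actual distance measure may differ:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 50, 5
X = rng.dirichlet(np.ones(p) * 5, size=n)  # compositions, rows sum to 1

def pair_dist(Z, j, k):
    """Distance between components j and k: the Euclidean distance between
    closed columns (a non-log-ratio choice, hence potentially incoherent)."""
    return np.linalg.norm(Z[:, j] - Z[:, k])

pairs = [(j, k) for j in range(p) for k in range(j + 1, p)]

# Pairwise component distances in the full composition ...
d_full = np.array([pair_dist(X, j, k) for j, k in pairs])

# ... and in each re-closed two-part subcomposition {j, k}.
d_sub = []
for j, k in pairs:
    S = X[:, [j, k]]
    S = S / S.sum(axis=1, keepdims=True)  # close the subcomposition
    d_sub.append(pair_dist(S, 0, 1))
d_sub = np.array(d_sub)

# MDS-style stress: zero would mean perfect subcompositional coherence.
stress = np.sqrt(((d_full - d_sub) ** 2).sum() / (d_sub ** 2).sum())
print(f"incoherence stress: {stress:.3f}")
```

For a genuinely log-ratio-based distance, the two distance vectors would coincide and the stress would be zero.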
Abstract:
Correspondence analysis is introduced in the brand association literature as an alternative tool to measure dominance, for the particular case of free-choice data. The method is also used to analyse differences, or asymmetries, between brand-attribute associations where attributes are associated with evoked brands, and brand-attribute associations where brands are associated with the attributes. An application to a sample of deodorants is used to illustrate the proposed methodology.
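A minimal correspondence analysis sketch via the SVD of standardised residuals; the brand-attribute counts below are invented for illustration:

```python
import numpy as np

# Hypothetical brand x attribute association counts (free-choice data):
# rows = brands, columns = attributes.
N = np.array([[20.0,  5.0, 10.0],
              [ 4.0, 18.0,  6.0],
              [ 8.0,  7.0, 15.0]])

P = N / N.sum()
r = P.sum(axis=1)  # row masses
c = P.sum(axis=0)  # column masses

# Standardised residuals, whose SVD drives correspondence analysis.
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
U, sv, Vt = np.linalg.svd(S, full_matrices=False)

# Principal coordinates of brands (rows) and attributes (columns).
F = (U * sv) / np.sqrt(r)[:, None]
G = (Vt.T * sv) / np.sqrt(c)[:, None]
print(np.round(F[:, :2], 3))  # brand map, first two dimensions
print(np.round(G[:, :2], 3))  # attribute map
```

Plotting brands and attributes in the same low-dimensional map is what makes the association asymmetries visible.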
Abstract:
Over recent years, both governments and international aid organizations have been devoting large amounts of resources to simplifying the procedures for setting up and formalizing firms. Many of these actions have focused on reducing the initial costs of setting up the firm, disregarding the more important role of business registers as a source of reliable information for judges, government departments and, above all, other firms. This reliable information is essential for reducing transaction costs in future dealings with all sorts of economic agents, both public and private. The priorities of reform policies should therefore be thoroughly reviewed, stressing the value of the legal institutions rather than trivializing them as is often the case.
Abstract:
Diffuse flow velocimetry (DFV) is introduced as a new, noninvasive, optical technique for measuring the velocity of diffuse hydrothermal flow. The technique uses images of a motionless, random medium (e.g., rocks) obtained through the lens of a moving refraction index anomaly (e.g., a hot upwelling). The method works in two stages. First, the changes in apparent background deformation are calculated using particle image velocimetry (PIV). The deformation vectors are determined by a cross correlation of pixel intensities across consecutive images. Second, the 2-D velocity field is calculated by cross correlating the deformation vectors between consecutive PIV calculations. The accuracy of the method is tested with laboratory and numerical experiments of a laminar, axisymmetric plume in fluids with both constant and temperature-dependent viscosity. Results show that average RMS errors are ∼5%–7%, and the method is most accurate in regions of pervasive apparent background deformation, as is commonly encountered in regions of diffuse hydrothermal flow. The method is applied to a 25 s video sequence of diffuse flow from a small fracture captured during the Bathyluck’09 cruise to the Lucky Strike hydrothermal field (September 2009). The velocities of the ∼10°C–15°C effluent reach ∼5.5 cm/s, in strong agreement with previous measurements of diffuse flow. DFV is found to be most accurate for approximately 2-D flows where background objects have a small spatial scale, such as sand or gravel.
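The cross-correlation step underlying each stage can be sketched with a synthetic shifted image. The real method adds interrogation windows and a second correlation pass over the deformation vectors; this sketch only shows how a displacement is recovered from pixel intensities:

```python
import numpy as np

rng = np.random.default_rng(4)

# Random "background" image (the motionless rocks); a second frame is the
# same image shifted, standing in for apparent deformation by the plume.
frame0 = rng.random((64, 64))
shift = (3, -2)  # true (row, col) displacement
frame1 = np.roll(frame0, shift, axis=(0, 1))

def cross_corr_shift(a, b):
    """Displacement of b relative to a via FFT cross correlation of
    pixel intensities (the core of a PIV step)."""
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    idx = np.unravel_index(corr.argmax(), corr.shape)
    # Map the correlation peak to a signed shift.
    return tuple((i + s // 2) % s - s // 2 for i, s in zip(idx, a.shape))

print(cross_corr_shift(frame0, frame1))  # recovers (3, -2)
```

The abstract's observation that small-scale backgrounds (sand, gravel) work best follows from this step: sharp correlation peaks require fine random texture.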
Abstract:
Various experimental procedures aimed at measuring individual risk aversion involve a list of pairs of alternative prospects. We first study the widely used method of Holt and Laury (2002), for which we find that the removal of some items from the lists yields a systematic decrease in risk aversion. This bias is quite distinct from other confounds that have been previously observed in the use of the Holt and Laury method. It may be related to empirical phenomena and theoretical developments where better prospects increase risk aversion. Nevertheless, we also find that the more recent elicitation method due to Abdellaoui et al. (2011), also based on lists, does not display any statistically significant bias when the corresponding items of the list are removed. Our results suggest that methods other than the popular Holt and Laury one may be preferable for the measurement of risk aversion.
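A sketch of the Holt and Laury (2002) list under CRRA utility, showing how the row at which a subject switches to the risky lottery reveals risk aversion. The payoffs are the published ones; the CRRA specification is a standard modelling assumption, not from this paper:

```python
import numpy as np

def crra(x, r):
    """CRRA utility; r = 1 gives log utility."""
    return np.log(x) if r == 1 else (x ** (1 - r)) / (1 - r)

# Holt-Laury list: in row k, both lotteries pay the high prize with
# probability k/10.
A = (2.00, 1.60)  # "safe" lottery (high, low payoff in dollars)
B = (3.85, 0.10)  # "risky" lottery (high, low payoff in dollars)

def switch_row(r, rows=range(1, 11)):
    """First row at which a CRRA-r agent prefers the risky lottery B."""
    for k in rows:
        p = k / 10
        eu_a = p * crra(A[0], r) + (1 - p) * crra(A[1], r)
        eu_b = p * crra(B[0], r) + (1 - p) * crra(B[1], r)
        if eu_b > eu_a:
            return k
    return None

print(switch_row(0.0))  # risk neutral: switches at row 5
print(switch_row(0.7))  # risk averse: switches later
```

Removing rows from such a list changes the choice menu, which is exactly where the bias reported above can enter.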
Abstract:
This paper explores the possibility of using data from social bookmarking services to measure the use of information by academic researchers. Social bookmarking data can be used to augment participative methods (e.g. interviews and surveys) and other, non-participative methods (e.g. citation analysis and transaction logs) to measure the use of scholarly information. We use BibSonomy, a free resource-sharing system, as a case study. Results show that published journal articles are by far the most popular type of source bookmarked, followed by conference proceedings and books. Commercial journal publisher platforms are the most popular type of information resource bookmarked, followed by websites, records in databases and digital repositories. Usage of open access information resources is low in comparison with toll access journals. In the case of open access repositories, there is a marked preference for the use of subject-based repositories over institutional repositories. The results are consistent with those observed in related studies based on surveys and citation analysis, confirming the possible use of bookmarking data in studies of information behaviour in academic settings. The main advantages of using social bookmarking data are that it is an unobtrusive approach, it captures the reading habits of researchers who are not necessarily authors, and data are readily available. The main limitation is that a significant amount of human resources is required to clean and standardize the data.
Abstract:
The objective of this study is to analyse the technical or productive efficiency of the refuse collection services in 75 municipalities located in the Spanish region of Catalonia. The analysis has been carried out using various techniques: first a deterministic parametric frontier, then a stochastic parametric frontier, and finally various non-parametric approaches (DEA and FDH). The results naturally differ according to the technique used to approach the frontier. Nevertheless, they appear robust, at least with regard to the ordinal concordance among the efficiency indices obtained by the different approaches, as demonstrated by the statistical tests used. Finally, we have attempted to identify any relation between efficiency and the (public or private) mode of managing the services. No significant relation was found between the type of management and the efficiency indices.
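One of the non-parametric techniques mentioned, DEA, reduces to a small linear program per unit. A minimal input-oriented CCR sketch with an invented four-unit data set (scipy is assumed available; the paper's data and model details differ):

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: one input (collection cost) and one output (tonnes collected)
# for four hypothetical municipalities.
x = np.array([2.0, 3.0, 4.0, 5.0])  # inputs
y = np.array([1.0, 3.0, 3.5, 4.0])  # outputs

def dea_ccr(x, y, j0):
    """Input-oriented CCR efficiency of unit j0 (1.0 = on the frontier)."""
    n = len(x)
    c = np.r_[1.0, np.zeros(n)]  # minimise theta over (theta, lambdas)
    # inputs:  sum_j lam_j * x_j - theta * x_{j0} <= 0
    A_in = np.r_[-x[j0], x].reshape(1, -1)
    # outputs: -sum_j lam_j * y_j <= -y_{j0}
    A_out = np.r_[0.0, -y].reshape(1, -1)
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=[0.0, -y[j0]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

effs = [round(dea_ccr(x, y, j), 3) for j in range(len(x))]
print(effs)  # -> [0.5, 1.0, 0.875, 0.8]
```

FDH would replace the convex combination of peers with a single dominating unit, which is the main difference between the two non-parametric approaches named above.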
Abstract:
This paper estimates a model of airline competition for the Spanish air transport market. I test the explanatory power of alternative oligopoly models with capacity constraints. In addition, I analyse the degree of density economies. Results show that Spanish airlines' conduct follows a price-leadership scheme, so that it is less competitive than the Cournot solution. I also find evidence that thin routes can be considered natural monopolies.
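The conduct ordering above (price leadership less competitive than Cournot) can be illustrated with a conduct-parameter sketch of the Lerner index; the elasticity and number of carriers are assumed for illustration:

```python
# Conduct-parameter view of a symmetric oligopoly: the price-cost margin
# satisfies (P - MC)/P = theta/eps, where theta indexes conduct.
def lerner(theta, eps):
    """Lerner index implied by conduct parameter theta and elasticity eps."""
    return theta / eps

eps, n = 1.5, 2               # assumed demand elasticity and carriers
print(lerner(0.0, eps))       # Bertrand: marginal-cost pricing
print(lerner(1.0 / n, eps))   # Cournot with two symmetric airlines
print(lerner(1.0, eps))       # joint-profit maximisation upper bound
```

Estimated conduct between the Cournot and collusive values is what the abstract summarises as a price-leadership scheme.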