81 results for Corporate Value
Abstract:
This paper presents an application of the Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism (MuSIASEM) approach to the estimation of quantities of Gross Value Added (GVA) referring to economic entities defined at different scales of study. The method first estimates benchmark values of the pace of GVA generation per hour of labour across economic sectors. These values are estimated as intensive variables (e.g. €/hour) by dividing the sectoral GVA of the country (expressed in € per year) by the hours of paid work in that same sector per year. This assessment is obtained using data from national statistics (top-down information referring to the national level). Then, the approach uses bottom-up information (the number of hours of paid work in the various economic sectors of an economic entity, e.g. a city or a province, operating within the country) to estimate the amount of GVA produced by that entity. This estimate is obtained by multiplying the number of hours of work in each sector of the economic entity by the benchmark value of GVA generation per hour of work of that particular sector (national average). The method is applied and tested on two different socio-economic systems: (i) Catalonia (considered level n) and Barcelona (considered level n-1); and (ii) the region of Lima (considered level n) and the Lima Metropolitan Area (considered level n-1). In both cases, the GVA per year of the local economic entity (Barcelona and the Lima Metropolitan Area) is estimated and the resulting value is compared with GVA data provided by statistical offices. The empirical analysis seems to validate the approach, even though the case of the Lima Metropolitan Area indicates a need for additional care when estimating GVA in primary sectors (agriculture and mining).
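The two-step logic described above reduces to a weighted sum. The following minimal Python sketch only illustrates that arithmetic (national benchmarks in €/hour, then multiplication by local hours); the sector names and figures are entirely hypothetical and do not come from the paper or any statistical office.

```python
# Minimal sketch of the two-step estimate described above.
# All sector names and figures are hypothetical placeholders.

# Step 1: national benchmarks (top-down) -- pace of GVA generation per hour of paid work.
national_gva = {"agriculture": 5.0e9, "industry": 40.0e9, "services": 120.0e9}   # EUR/year
national_hours = {"agriculture": 0.4e9, "industry": 1.6e9, "services": 4.0e9}    # hours/year
benchmark = {s: national_gva[s] / national_hours[s] for s in national_gva}       # EUR/hour

# Step 2: local estimate (bottom-up) -- hours of paid work in the local entity.
local_hours = {"agriculture": 0.01e9, "industry": 0.20e9, "services": 0.90e9}    # hours/year
local_gva_estimate = sum(local_hours[s] * benchmark[s] for s in local_hours)     # EUR/year

print(f"Estimated local GVA: {local_gva_estimate:.3e} EUR/year")
```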
Abstract:
A method to estimate an extreme quantile that requires no distributional assumptions is presented. The approach is based on transformed kernel estimation of the cumulative distribution function (cdf). The proposed method consists of a double-transformation kernel estimation. We derive optimal bandwidth selection methods that have a direct expression for the smoothing parameter. The bandwidth adapts to the given quantile level. The procedure is useful for large data sets and improves quantile estimation relative to other methods for heavy-tailed distributions. Implementation is straightforward and R programs are available.
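For intuition only, here is a simplified Python sketch of estimating a high quantile by inverting a kernel-smoothed cdf. It uses a plain Gaussian kernel with a rule-of-thumb bandwidth and no transformation, so it is not the double-transformation estimator or the bandwidth rule of the paper (whose R programs are referenced above); the function names and the toy heavy-tailed sample are illustrative assumptions.

```python
# Simplified illustration: invert an untransformed kernel estimate of the cdf
# to approximate an extreme quantile. Not the paper's double-transformation method.
import numpy as np
from scipy.stats import norm

def kernel_cdf(x, data, h):
    """Smoothed empirical cdf: average of Gaussian cdfs centred at the data points."""
    return norm.cdf((x - data[:, None]) / h).mean(axis=0)

def kernel_quantile(data, p, h=None):
    """Numerically invert the kernel cdf on a grid to approximate the p-quantile."""
    data = np.asarray(data, dtype=float)
    if h is None:
        h = 1.06 * data.std(ddof=1) * len(data) ** (-1 / 5)   # crude rule-of-thumb bandwidth
    grid = np.linspace(data.min() - 3 * h, data.max() + 3 * h, 2000)
    return np.interp(p, kernel_cdf(grid, data, h), grid)

rng = np.random.default_rng(0)
sample = rng.pareto(2.5, size=5000) + 1.0      # heavy-tailed toy data
print(kernel_quantile(sample, 0.995))          # estimated extreme quantile
```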
Abstract:
A strategic transformation of Europe's energy system has yet to be made possible, but just as important as binding long-term targets for renewable energy sources (RES) and GHG reductions are strong, binding energy-efficiency targets, not only for 2020 but also for 2030, 2040 and 2050, as these would help to anchor the growing share of renewables in total energy consumption and to reduce Europe's total GHG emissions in general, and those of the energy sector in particular, which remains one of the largest greenhouse gas emitters of all sectors. The recast Directive, planned for 2011/12, should be a good window of opportunity to finally establish binding energy-efficiency targets, the only pillar still missing in the strongly interdependent EU energy and climate strategy based on GHG reduction and energy efficiency.
Abstract:
We propose a model of investment, duration, and exit strategies for start-ups backed by venture capital (VC) funds that accounts for the high level of uncertainty, the asymmetry of information between insiders and outsiders, and the discount rate. Our analysis predicts that start-ups backed by corporate VC funds remain for a longer period of time before exiting and receive larger investment amounts than those financed by independent VC funds. Although a longer duration leads to a higher likelihood of an exit through an acquisition, a larger investment increases the probability of an IPO exit. These predictions find strong empirical support.
Abstract:
This paper investigates the role of learning by private agents and the central bank (two-sided learning) in a New Keynesian framework in which both sides of the economy have asymmetric and imperfect knowledge about the true data generating process. We assume that all agents employ the data that they observe (which may be distinct for different sets of agents) to form beliefs about unknown aspects of the true model of the economy, use their beliefs to decide on actions, and revise these beliefs through a statistical learning algorithm as new information becomes available. We study the short-run dynamics of our model and derive its policy recommendations, particularly with respect to central bank communications. We demonstrate that two-sided learning can generate substantial increases in volatility and persistence, and alter the behavior of the variables in the model in a significant way. Our simulations do not converge to a symmetric rational expectations equilibrium and we highlight one source that invalidates the convergence results of Marcet and Sargent (1989). Finally, we identify a novel aspect of central bank communication in models of learning: communication can be harmful if the central bank's model is substantially mis-specified.
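The abstract only names "a statistical learning algorithm"; in this literature beliefs are typically revised by recursive least squares on a perceived law of motion. The Python sketch below is a generic illustration of that kind of updating rule, not the paper's model: the regression, the gain sequence, and the toy data-generating process are all assumptions.

```python
# Generic belief-updating rule of the recursive-least-squares type used in
# adaptive-learning models; the paper's specific model is not reproduced here.
import numpy as np

def rls_update(beta, R, x, y, gain):
    """One RLS step: revise coefficient beliefs `beta` and second-moment
    matrix `R` after observing regressors `x` and outcome `y`."""
    R = R + gain * (np.outer(x, x) - R)
    beta = beta + gain * np.linalg.solve(R, x * (y - x @ beta))
    return beta, R

# Toy run: an agent learns the coefficients of y_t = a + b*x_t + noise.
rng = np.random.default_rng(1)
beta, R = np.zeros(2), np.eye(2)
for t in range(1, 2001):
    x = np.array([1.0, rng.normal()])
    y = 0.5 + 1.5 * x[1] + 0.1 * rng.normal()
    beta, R = rls_update(beta, R, x, y, gain=1.0 / t)   # decreasing gain
print(beta)   # should approach [0.5, 1.5]
```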
Abstract:
The spectral efficiency achievable with joint processing of pilot and data symbol observations is compared with that achievable through the conventional (separate) approach of first estimating the channel on the basis of the pilot symbols alone, and subsequently detecting the data symbols. Studied on the basis of a mutual information lower bound, joint processing is found to provide a non-negligible advantage relative to separate processing, particularly for fast fading. It is shown that, regardless of the fading rate, only a very small number of pilot symbols (at most one per transmit antenna and per channel coherence interval) should be transmitted if joint processing is allowed.
Abstract:
The aim of this paper is to examine the pros and cons of book and fair value accounting from the perspective of the theory of banking. We consider the implications of the two accounting methods in an overlapping generations environment. As observed by Allen and Gale (1997), in an overlapping generations model, banks have a role as intergenerational connectors as they allow for intertemporal smoothing. Our main result is that when dividends depend on profits, book value ex ante dominates fair value, as it provides better intertemporal smoothing. This is in contrast with the standard view that fair value yields a better allocation as it reflects the real opportunity cost of assets. Banking regulation plays an important role by providing the right incentives for banks to smooth intertemporal consumption, whereas market discipline improves intratemporal efficiency.
Abstract:
Our task in this paper is to analyze the organization of trading in the era of quantitative finance. To do so, we conduct an ethnography of arbitrage, the trading strategy that best exemplifies finance in the wake of the quantitative revolution. In contrast to value and momentum investing, we argue, arbitrage involves an art of association: the construction of equivalence (comparability) of properties across different assets. In place of essential or relational characteristics, the peculiar valuation that takes place in arbitrage is based on an operation that makes something the measure of something else, associating securities with each other. The process of recognizing opportunities and the practices of making novel associations are shaped by the specific socio-spatial and socio-technical configurations of the trading room. Calculation is distributed across persons and instruments as the trading room organizes interaction among diverse principles of valuation.
Weak and Strong Altruism in Trait Groups: Reproductive Suicide, Personal Fitness, and Expected Value
Abstract:
A simple variant of trait group selection, employing predators as the mechanism underlying group selection, supports contingent reproductive suicide as altruism (i.e., behavior lowering personal fitness while augmenting that of another) without kin assortment. The contingent suicidal type may either saturate the population or be polymorphic with a type avoiding suicide, depending on parameters. In addition to contingent suicide, this randomly assorting morph may also exhibit continuously expressed strong altruism (sensu Wilson 1979) usually thought restricted to kin selection. The model will not, however, support a sterile worker caste as such, where sterility occurs before life history events associated with effective altruism; reproductive suicide must remain fundamentally contingent (facultative sensu West Eberhard 1987; Myles 1988) under random assortment. The continuously expressed strong altruism supported by the model may be reinterpreted as probability of arbitrarily committing reproductive suicide, without benefit for another; such arbitrary suicide (a "load" on "adaptive" suicide) is viable only under a more restricted parameter space relative to the necessarily concomitant adaptive contingent suicide.
Abstract:
In this paper we argue that socially responsible policies have a positive impact on a firm's brand equity in the short term as well as in the long term. Moreover, once we distinguish between different stakeholders, we posit that secondary stakeholders such as the community are even more important than primary stakeholders (customers, shareholders, workers and suppliers) in generating brand equity. Policies aimed at satisfying community interests act as a mechanism to reinforce trust, which gives further credibility to socially responsible policies with other stakeholders. The result is a decrease in conflicts among stakeholders and greater stakeholder willingness to provide intangible resources that enhance brand equity. We provide support for our theoretical contentions using a panel data set composed of 57 firms from 10 countries (the US, Japan, South Korea, France, the UK, Italy, Germany, Finland, Switzerland and the Netherlands) for the period 2002 to 2007. We use detailed information on brand equity obtained from Interbrand and on corporate social responsibility (CSR) provided by the SiRi Global Profile database, as compiled by the Sustainable Investment Research International Company (SiRi).
Abstract:
We argue that when stakeholder protection is left to the voluntary initiative of managers, concessions to social activists and pressure groups can turn into a self-entrenchment strategy for incumbent CEOs. Stakeholders other than shareholders thus benefit from corporate governance rules putting managers under a tough replacement threat. We show that a minimal amount of formal stakeholder protection, or the introduction of explicit covenants protecting stakeholder rights in the firm charter, may deprive CEOs of the alliance with powerful social activists, thus increasing managerial turnover and shareholder value. These results rationalize a recent trend whereby well-known social activists like Friends of the Earth and active shareholders like CalPERS are showing growing support for each other's agendas.
Abstract:
There are many situations in which individuals have a choice of whether or not to observe eventual outcomes. In these instances, individuals often prefer to remain ignorant. These contexts are outside the scope of analysis of the standard von Neumann-Morgenstern (vNM) expected utility model, which does not distinguish between lotteries for which the agent sees the final outcome and those for which he does not. I develop a simple model that admits preferences for making an observation or for remaining in doubt. I then use this model to analyze the connection between preferences of this nature and risk attitude. This framework accommodates a wide array of behavioral patterns that violate the vNM model, and that may not seem related, prima facie. For instance, it admits self-handicapping, in which an agent chooses to impair his own performance. It also accommodates a status quo bias without having recourse to framing effects, or to an explicit definition of reference points. In a political economy context, voters have strict incentives to shield themselves from information. In settings with other-regarding preferences, this model predicts observed behavior that seems inconsistent with either altruism or self-interested behavior.
Abstract:
In this paper we offer the first large-sample evidence on the availability and usage of credit lines in U.S. public corporations and use it to re-examine the existing findings on corporate liquidity. We show that the availability of credit lines is widespread and that average undrawn credit is of the same order of magnitude as cash holdings. We test the trade-off theory of liquidity according to which firms target an optimum level of liquidity, computed as the sum of cash and undrawn credit lines. We provide support for the existence of a liquidity target, but also show that the reasons why firms hold cash and credit lines are very different. While the precautionary motive explains cash holdings well, the optimum level of credit lines appears to be driven by the restrictions imposed by the credit line itself, in terms of stated purpose and covenants. In support of these findings, credit line drawdowns are associated with capital expenditures, acquisitions, and working capital.