886 results for Heterogeneous firms trade model
Abstract:
This paper introduces a new rationale for the existence of “Directors’ and Officers’” (D&O) insurance. We use a model with volatile stock markets in which shareholders design compensation schemes that incentivize managers to stimulate short-term increases in stock prices that do not maximize long-run stock market value. We show that D&O insurance provides a convenient instrument for the initial shareholders of a company to take advantage of differences in beliefs between insiders and outsiders in capital markets. The empirical results support the idea that both the insurance coverage and the premium are higher in the presence of new shareholders and volatile markets. The results are robust across various empirical model specifications.
Abstract:
Patients suffering from cystic fibrosis (CF) show thick secretions, mucus plugging and bronchiectasis in bronchial and alveolar ducts. This results in substantial structural changes of the airway morphology and heterogeneous ventilation. Disease progression and treatment effects are monitored by so-called gas washout tests, in which the change in concentration of an inert gas is measured over a single breath or multiple breaths. The test result, derived from the profile of the measured concentration, is a marker for the severity of the ventilation inhomogeneity, which is strongly affected by the airway morphology. However, it is hard to attribute the underlying obstructions to specific parts of the airways, especially if they occur in the lung periphery. To support the analysis of lung function tests (e.g. multi-breath washout), we developed a numerical model of the entire airway tree, coupling a lumped parameter model for the lung ventilation with a 4th-order accurate finite difference model of a 1D advection-diffusion equation for the transport of an inert gas. The boundary conditions for the flow problem comprise the pressure and flow profile at the mouth, which are typically known from clinical washout tests. The natural asymmetry of the lung morphology is approximated by a generic, fractal, asymmetric branching scheme, which we applied to the conducting airways. A conducting airway ends when its dimension falls below a predefined limit; a model acinus is then connected to each terminal airway. The morphology of an acinus unit comprises a network of expandable cells, and a regional, linear constitutive law describes the pressure-volume relation between the pleural gap and the acinus. The cyclic expansion (breathing) of each acinus unit depends on the resistance of the feeding airway and on the flow resistance and stiffness of the cells themselves. Special care was taken in the development of a conservative numerical scheme for the gas transport across bifurcations, handling spatially and temporally varying advective and diffusive fluxes over a wide range of scales. Implicit time integration was applied to account for the numerical stiffness resulting from the discretized transport equation. Local or regional modifications of the airway dimensions, resistance or tissue stiffness are introduced to mimic the pathological airway restrictions typical for CF. This leads to a more heterogeneous ventilation of the model lung; as a result, the concentration in some distal parts of the lung model remains elevated for a longer duration. The inert gas concentration at the mouth towards the end of expiration is thus composed of gas from regions with very different washout efficiency, which results in a steeper slope of the corresponding part of the washout profile.
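To make the transport step concrete, the following is a minimal sketch (not the authors' code) of an implicit update of the 1D advection-diffusion equation on a single uniform airway segment. It uses backward Euler in time with second-order central differences in space, whereas the model above employs a 4th-order accurate scheme with a conservative treatment of bifurcations across the full airway tree; all parameter values are hypothetical.

```python
import numpy as np

# Sketch: implicit (backward Euler) step of  dc/dt + u dc/dx = D d2c/dx2
# for the inert-gas concentration c(x) along one airway segment.
def step_advection_diffusion(c, u, D, dx, dt):
    """One implicit step; fixed concentration at the inlet, zero gradient at the outlet."""
    n = len(c)
    adv = u * dt / (2 * dx)      # central advection weight
    dif = D * dt / dx**2         # diffusion weight
    A = np.zeros((n, n))
    for i in range(1, n - 1):
        A[i, i - 1] = -adv - dif
        A[i, i] = 1 + 2 * dif
        A[i, i + 1] = adv - dif
    A[0, 0] = 1.0                        # Dirichlet inlet (mouth)
    A[-1, -2], A[-1, -1] = -1.0, 1.0     # zero-gradient outlet
    b = c.copy()
    b[-1] = 0.0
    return np.linalg.solve(A, b)

c = np.zeros(50)      # initial concentration along the segment
c[0] = 1.0            # tracer wash-in at the mouth
for _ in range(200):  # hypothetical u, D, dx, dt
    c = step_advection_diffusion(c, u=0.05, D=1e-3, dx=0.002, dt=0.01)
```

The implicit treatment mirrors the abstract's point about numerical stiffness: the time step is not constrained by the fine distal airways, where diffusion dominates.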
Abstract:
Large-scale tectonic processes introduce a range of crustal lithologies into the Earth's mantle. These lithologies have been implicated as sources of compositional heterogeneity in mantle-derived magmas. The model being explored here assumes the presence of widely dispersed fragments of residual eclogite (derived from recycled oceanic crust), stretched and stirred by convection in the mantle. Here we show with an experimental study that these residual eclogites continuously melt during upwelling of such heterogeneous mantle and we characterize the melting reactions and compositional changes in the residue minerals. The chemical exchange between these partial melts and more refractory peridotite leads to a variably metasomatised mantle. Re-melting of these metasomatised peridotite lithologies at given pressures and temperatures results in diverse melt compositions, which may contribute to the observed heterogeneity of oceanic basalt suites. We also show that heterogeneous upwelling mantle is subject to diverse local freezing, hybridization and carbonate-carbon-silicate redox reactions along a mantle adiabat.
Abstract:
After major volcanic eruptions, the enhanced aerosol causes ozone changes due to greater heterogeneous chemistry on the particle surfaces (HET-AER) and from dynamical effects related to the radiative heating of the lower stratosphere (RAD-DYN). We carry out a series of experiments with an atmosphere–ocean–chemistry–climate model to assess how these two processes change stratospheric ozone and Northern Hemispheric (NH) polar vortex dynamics. Ensemble simulations are performed under present-day and preindustrial conditions, and with aerosol forcings representative of different eruption strengths, to investigate changes in the response behaviour. We show that the halogen component of the HET-AER effect dominates under present-day conditions, with a global reduction of ozone (−21 DU for the strongest eruption) particularly at high latitudes, whereas the HET-AER effect increases stratospheric ozone due to N₂O₅ hydrolysis in a preindustrial atmosphere (maximum anomalies +4 DU). The halogen-induced ozone changes in the present-day atmosphere offset part of the strengthening of the NH polar vortex during mid-winter (reduction of up to −16 m s⁻¹ in January) and slightly amplify the dynamical changes in the polar stratosphere in late winter (+11 m s⁻¹ in March). The RAD-DYN mechanism leads to positive column ozone anomalies, which are reduced in a present-day atmosphere by amplified polar ozone depletion (maximum anomalies +12 and +18 DU for present day and preindustrial, respectively). For preindustrial conditions, the ozone response is consequently dominated by RAD-DYN processes, while under present-day conditions HET-AER effects dominate. The dynamical response of the stratosphere is dominated by the RAD-DYN mechanism, showing an intensification of the NH polar vortex in winter (up to +10 m s⁻¹ in January). Ozone changes due to the RAD-DYN mechanism slightly reduce the response of the polar vortex after the eruption under present-day conditions.
Abstract:
Open innovation is increasingly being adopted in business and describes a situation in which firms exchange ideas and knowledge with external participants, such as customers, suppliers, partner firms, and universities. This article extends the concept of open innovation with a push model of open innovation: knowledge is voluntarily created outside a firm by individuals and organisations who proceed to push knowledge into a firm’s open innovation project. For empirical analysis, we examine source code and newsgroup data on the Eclipse Development Platform. We find that outsiders invest as much in the firm’s project as the founding firm itself. Based on the insights from Eclipse, we develop four propositions: ‘preemptive generosity’ of a firm, ‘continuous commitment’, ‘adaptive governance structure’, and ‘low entry barrier’ are contexts that enable the push model of open innovation.
Abstract:
When firms contribute to open source projects, they in fact invest in public goods that may be used by everyone, even by their competitors. This seemingly paradoxical behavior can be explained by the model of private-collective innovation, in which private investors participate in collective action. Previous literature has shown that companies benefit through the production process, which provides them with unique incentives such as learning and reputation effects. By contributing to open source projects, firms are able to build a network of external individuals and organizations participating in the creation and development of the software. As this doctoral dissertation shows, firm-sponsored communities involve the formation of interorganizational relationships that may eventually become a source of sustained competitive advantage. However, managing a largely independent open source community is a challenging balancing act between exerting control to appropriate the value created and remaining open in order to gain and preserve credibility and motivate external contributions. This dissertation, consisting of an introductory chapter and three separate research papers, therefore analyzes the characteristics of firm-driven open source communities, identifies the reasons why and the mechanisms by which companies facilitate the creation of such networks, and shows how firms can benefit most from their communities.
Abstract:
Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories’ efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one’s bargaining power, incentives that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker’s quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker’s type. The cost could be pure time cost from delaying agreement or the cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time cost and show that communication can play a similar role. The simple fact that a worker states that he is of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good.
Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality, and the quality of the good is known only to the seller. Indeed, without the possibility of making repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When repeated offers are allowed, however, both types of goods trade with probability one in equilibrium. We provide an experimental test of these predictions. Buyers gather information about sellers through specific price offers, and rates of trade are high, in line with the model’s qualitative predictions. We also observe a persistent over-delay before trade occurs, which reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information. In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to depend crucially on (i) the degree to which players can renegotiate and gradually build up agreements and (ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
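For reference, a partition function in the sense used above (my gloss, not the thesis's wording) assigns a worth to every coalition embedded in every coalition structure:

$$v(S,\pi) \in \mathbb{R} \qquad \text{for each partition } \pi \text{ of the player set } N \text{ and each coalition } S \in \pi.$$

Because the worth of $S$ may depend on how the players in $N \setminus S$ are organized, the formulation captures arbitrary externalities; a characteristic function $v(S)$, by contrast, ignores $\pi$ and thus rules them out.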
Abstract:
We develop a monopolistic competition model with nonhomothetic factor input bundles, where increasing quality requires increasing use of skilled workers. As a result, more skill-abundant countries export higher-quality, higher-priced goods. Using a multicountry dataset, we test and confirm the findings in Schott (2004) of a positive effect of skill abundance on unit values, identified with US data. We extend the core model with per-unit trade costs, leading to the Washington-apples effect that goods shipped over larger distances are of higher quality. The combination of high-quality goods being relatively skill-intensive with the Washington-apples effect implies that countries at a larger distance from their trading partners display a higher skill premium. Simulating our model, we find that a doubling of a country's distance to all its trading partners raises its skill premium by about 2.3 percent.
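The Washington-apples (Alchian-Allen) effect invoked above follows from simple arithmetic. With a per-unit (rather than iceberg) trade cost $t$ added to producer prices $p_H > p_L$, the relative delivered price of the high-quality good is

$$\frac{p_H + t}{p_L + t},$$

which falls toward 1 as $t$, and hence distance, grows, tilting demand at distant destinations toward the higher-quality good. With hypothetical prices $p_H = 2$ and $p_L = 1$, for instance, the relative price is 2 at $t = 0$ but only 1.5 at $t = 1$.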
Abstract:
A wide variety of spatial data collection efforts are ongoing throughout local, state and federal agencies, private firms and non-profit organizations. Each effort is established for a different purpose, but organizations and individuals often collect and maintain the same or similar information. The United States federal government has undertaken many initiatives, such as the National Spatial Data Infrastructure, the National Map and Geospatial One-Stop, to reduce duplicative spatial data collection and promote the coordinated use, sharing, and dissemination of spatial data nationwide. A key premise in most of these initiatives is that no national government will be able to gather and maintain more than a small percentage of the geographic data that users want and need. Thus, national initiatives typically depend on the cooperation of those already gathering spatial data, and those using GIS to meet specific needs, to help construct and maintain these spatial data infrastructures and geo-libraries for their nations (Onsrud 2001). Some of the impediments to widespread spatial data sharing are well known from directly asking GIS data producers why they are not currently involved in creating datasets in common or compatible formats, documenting their datasets in a standardized metadata format, or making their datasets more readily available to others through Data Clearinghouses or geo-libraries. The research described in this thesis addresses the impediments to wide-scale spatial data sharing faced by GIS data producers and explores a new conceptual data-sharing approach, the Public Commons for Geospatial Data, that supports user-friendly metadata creation, open access licenses, archival services and documentation of the parent lineage of the contributors and value-adders of digital spatial data sets.
Abstract:
The paper develops a growth model in an overlapping generations framework of a financially repressed small open economy and analyzes the effects of financial liberalization. The following observations are made: an increase (decrease) in the interest rate (reserve requirements) reduces (increases) the steady-state stock of capital and the trade balance, but improves (deteriorates) the level of foreign exchange reserves. However, financial liberalization, in any form, is always welfare-improving. The paper thus advocates that financial liberalization policies be oriented towards the reduction of reserve requirements rather than interest rate deregulation, provided foreign reserve holdings are not at a critical level.
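One textbook way to see the reserve-requirement channel (an illustration under simplifying assumptions, not necessarily the paper's exact setup): if banks must hold a fraction $\theta$ of deposits as unremunerated reserves and lend the remainder at the loan rate $r_\ell$, competition drives the deposit rate to

$$r_d = (1 - \theta)\, r_\ell,$$

so reserve requirements act as an implicit tax on intermediation. Lowering $\theta$ then raises the return to savers, and with it saving and the steady-state capital stock, without touching the controlled interest rate itself.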
Abstract:
In the last two decades, trade liberalization under the GATT/WTO has been partly offset by an increase in antidumping protection. Economists have argued convincingly that this is partly due to the inclusion of sales below cost in the definition of dumping during the GATT Tokyo Round. The introduction of the cost-based dumping definition gives regulating authorities greater latitude to choose protection according to their liking. This paper investigates the domestic government's antidumping duty choice in an asymmetric information framework in which the foreign firm's cost is observed by the domestic firm, but not by the government. To induce truthful revelation, the government can design a tariff schedule contingent on the firms' cost reports, accompanied by a threat to collect additional information for report verification (i.e., auditing) and, in case misreporting is detected, to set penalty duties. We show that, depending on the concrete assumptions, the domestic government may not only be able to extract the true cost information but may also succeed in implementing the full-information, governmental welfare-maximizing duty. In this case, the antidumping framework within the GATT/WTO not only offers the means to pursue strategic trade policy disguised as fair trade policy, but also helps overcome the informational problems of correctly determining the optimal strategic trade policy.
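The report-contingent duty scheme with auditing lends itself to a small numerical check. The sketch below, with a reduced-form profit function, a per-unit penalty duty, and numbers that are all hypothetical rather than taken from the paper, verifies that a given duty schedule, audit probability and penalty make truthful cost reporting optimal for every type:

```python
# Hypothetical check of incentive compatibility for a report-contingent
# antidumping duty schedule backed by random audits and penalty duties.

def profit(cost, duty, price=10.0, quantity=1.0):
    """Reduced-form export profit under a per-unit duty (illustrative)."""
    return (price - cost - duty) * quantity

def truthful_reporting_is_optimal(costs, duties, audit_prob, penalty):
    """True if no cost type gains by misreporting, given audits and penalties."""
    for true_cost in costs:
        honest = profit(true_cost, duties[true_cost])
        for report in costs:
            if report == true_cost:
                continue
            # an audit reveals the true cost and triggers the penalty duty
            lie = (1 - audit_prob) * profit(true_cost, duties[report]) \
                + audit_prob * profit(true_cost, duties[report] + penalty)
            if lie > honest:
                return False
    return True

costs = [2.0, 4.0]                 # low- and high-cost foreign types
duties = {2.0: 3.0, 4.0: 1.0}      # higher duty on the low-cost "dumper"
print(truthful_reporting_is_optimal(costs, duties, audit_prob=0.5, penalty=5.0))
```

With these values the low-cost type would save 2.0 in duties by claiming high costs, but the expected penalty under a 50% audit probability more than offsets the saving, so truthful revelation obtains.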
Abstract:
Labor market imperfections are commonly believed to be a major reason for imposing trade impediments. In this paper, I introduce labor market rigidities that are prevalent in continental European countries into the well-known protection for sale model proposed by Grossman and Helpman (1994). I show that, contrary to commonly held views, imperfections in the labor market do not necessarily increase equilibrium trade protection. A testable equilibrium trade protection equation is also derived. The findings of this paper are hence particularly relevant for empirical tests of trade policy determinants in economies with more regulated labor markets.
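For context, the benchmark protection for sale model delivers the well-known equilibrium protection formula

$$\frac{t_i}{1+t_i} = \frac{I_i - \alpha_L}{a + \alpha_L} \cdot \frac{z_i}{e_i},$$

where $t_i$ is the ad valorem protection of sector $i$, $I_i$ indicates whether the sector is organized as a lobby, $\alpha_L$ is the fraction of the population owning specific factors, $a$ is the government's weight on aggregate welfare, $z_i$ is the ratio of domestic output to imports, and $e_i$ is the import demand elasticity. The testable equation derived in this paper can presumably be read as a modification of this benchmark by labor market rigidities.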
Abstract:
We develop a portfolio balance model with real capital accumulation. The introduction of real capital as an asset as well as a good produced and demanded by firms enriches extant portfolio balance models of exchange rate determination. We show that expansionary monetary policy causes exchange rate overshooting, not once, but twice; the secondary repercussion comes through the reaction of firms to changed asset prices and the firms' decisions to invest in real capital. The model sheds further light on the volatility of real and nominal exchange rates, and it suggests that changes in corporate sector profitability may affect exchange rates through portfolio diversification in corporate securities.
Abstract:
The consumption capital asset pricing model is the standard economic model used to capture stock market behavior. However, empirical tests have pointed to its inability to account quantitatively for the high average rate of return and volatility of stocks over time for plausible parameter values. Recent research has suggested that the consumption of stockholders is more strongly correlated with the performance of the stock market than the consumption of non-stockholders. We model two types of agents: non-stockholders with standard preferences and stockholders with preferences that incorporate elements of the prospect theory developed by Kahneman and Tversky (1979). In addition to consumption, stockholders explicitly consider fluctuations in their financial wealth when making decisions. Data from the Panel Study of Income Dynamics are used to calibrate the labor income processes of the two types of agents. Each agent faces idiosyncratic shocks to his labor income as well as aggregate shocks to the per-share dividend, but markets are incomplete and agents cannot hedge consumption risks completely. In addition, consumers face both borrowing and short-sale constraints. Our results show that in equilibrium agents hold different portfolios. Our model is able to generate a time-varying risk premium of about 5.5% while maintaining a low risk-free rate, thus suggesting a plausible explanation for the equity premium puzzle reported by Mehra and Prescott (1985).
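As a point of reference, below is a minimal sketch of the Kahneman-Tversky style value function that such wealth-sensitive preferences build on; the parameter values follow Tversky and Kahneman's later (1992) estimates and are purely illustrative, as the dissertation's calibration may differ.

```python
# Prospect-theory value of a wealth fluctuation x relative to a reference
# point: concave over gains, convex and steeper over losses (loss aversion).

def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    if x >= 0:
        return x ** alpha            # diminishing sensitivity to gains
    return -lam * (-x) ** beta       # losses loom larger by factor lam

print(prospect_value(100.0))   # ~57.5
print(prospect_value(-100.0))  # ~-129.5: a loss hurts more than an equal gain pleases
```

This asymmetry over wealth fluctuations is the key ingredient through which such models can generate a sizable equity premium while keeping the risk-free rate low.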
Abstract:
A credit-rationing model similar to that of Stiglitz and Weiss [1981] is combined with the information externality model of Lang and Nakamura [1993] to examine the properties of mortgage markets characterized by both adverse selection and information externalities. In a credit-rationing model, additional information increases lenders' ability to distinguish risks, which leads to an increased supply of credit. According to Lang and Nakamura, a larger supply of credit leads to additional market activity and, therefore, greater information. The combination of these two propositions leads to a general equilibrium model, whose properties this paper describes. The paper provides another sufficient condition under which credit rationing falls with information: external information improves the accuracy of equity-risk assessments of properties, which reduces credit rationing. Contrary to intuition, this increased accuracy raises the mortgage interest rate. This clarifies the trade-offs associated with reduced credit rationing and the quality of the applicant pool.
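The Stiglitz-Weiss channel underlying the credit-rationing block can be illustrated in a few lines: as the contractual loan rate rises, safer borrowers exit the applicant pool, so the lender's expected return is not monotone in the rate. The numbers below are hypothetical and the payoff structure is a bare-bones simplification, not the paper's model:

```python
# Adverse selection in lending: each borrower type is (success prob, payoff).
# A type applies only if its expected limited-liability profit p*(y - (1+r))
# is positive.

def lender_expected_return(r, borrowers):
    """Average gross return per unit loan, zero recovery on default,
    equal shares of each active type in the applicant pool."""
    active = [(p, y) for (p, y) in borrowers if p * (y - (1 + r)) > 0]
    if not active:
        return 0.0
    return sum(p * (1 + r) for p, _ in active) / len(active)

borrowers = [(0.95, 1.3), (0.50, 2.5)]   # safe type, risky type
for r in (0.10, 0.25, 0.40):
    print(r, round(lender_expected_return(r, borrowers), 3))
# 0.10 -> 0.798, 0.25 -> 0.906, 0.40 -> 0.7: once the safe type exits
# (r > 0.30), the expected return drops, so rationing at a lower rate can
# dominate raising the rate to clear the market.
```

Information that sharpens the lender's risk assessments, the Lang-Nakamura externality described in the abstract, relaxes exactly this constraint, which is why it expands the supply of credit.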